Now in open beta. Free forever for local use
Live Protection Active

Stop Leaking Code. Start Mutating It.

Pretense mutates your code before it reaches any LLM. Real identifiers never leave your machine.

Works with Claude, ChatGPT, and Gemini · SOC 2-ready audit log · No telemetry without consent
SOC 2 · HIPAA Ready · PCI-DSS · GDPR · ITAR Aware
pretense terminal
PROTECTED
mutations: 3 · secrets blocked: 1
localhost:9339

Trusted by 50+ engineering teams across fintech, healthcare, and enterprise SaaS

2,847,203 mutations protected this month

Traction

Pre-seed milestones

Targeting $750K raise - 31% toward key benchmarks

Overall fundraise readiness: 31%
Design Partners
3 / 10

Early enterprise teams (healthcare, fintech, defense)

GitHub Stars
234 / 1,000

Developer credibility signal - pre-seed benchmark

Secrets Blocked
12,033 / 50,000

Real credentials intercepted before reaching LLM APIs

Companies Protected
47 / 100

Teams running Pretense in daily AI coding workflow

Metrics update live from production; the secrets-blocked count is fetched from /api/stats.

Enterprise-grade security standards

SOC 2 Type II In Progress
HIPAA Ready
GDPR Compliant
Zero Bytes to LLM
MIT Licensed

90-second setup

Protecting your AI calls in 4 commands
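A hedged sketch of what that flow could look like, using only the CLI verbs the pricing section lists (scan, protect, restore) and the localhost:9339 address from the terminal demo. The install channel and exact commands are assumptions and may differ:

```
npm install -g pretense   # assumed install channel
pretense scan ./src       # find proprietary identifiers and secrets
pretense protect          # start the local mutation proxy (the demo shows localhost:9339)
pretense restore          # map synthetic names in AI responses back to real ones
```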

29M
secrets leaked to AI in 2025
GitGuardian 2026
40%
higher leak rate with Copilot
CUHK / ACM FSE 2024
$400M
IP at risk per incident
Samsung / WSJ 2023
2 wks
avg Nightfall DLP deploy vs 30s for Pretense
Vendor comparison

This is not hypothetical

Real incidents. Real losses. Preventable with Pretense.

Samsung · March 2023
$400M IP at risk

Engineers pasted proprietary chip schematics into ChatGPT, exposing them to OpenAI's training data.

Goldman Sachs · 2023
AI ban issued

Banned ChatGPT use after risk of client data exposure. No protection means no AI.

Copilot Study · MIT 2024
6.4% leak rate

1 in 16 Copilot suggestions contained verbatim training data from private repos.

"Pretense is the missing security layer for every team using AI coding tools. We deployed it across 200 engineers in a morning."

Head of Security, Series B FinTech

"Our HIPAA compliance team was blocking AI tool adoption. Pretense unblocked us. Clinical data never leaves our VPC."

CTO, Digital Health Startup

"The mutation approach is genius. LLMs get enough context to help but none of our actual IP."

Staff Engineer, Enterprise SaaS

ROI Calculator

How much IP is your team risking?

Every hour your engineers use AI coding tools without protection is potential IP exposure. Calculate your risk vs. Pretense cost.

Team size: 25 devs (range: 1–500)
AI hours/week: 10h (range: 1h–40h)
Average salary: $150K (range: $60K–$400K)

IP Exposure Risk / yr

$281K

Without Pretense

Pretense Pro Cost / yr

$9K

$29/seat/month · 25 devs

Your ROI

32x

97% IP exposure reduction

IP Exposure Risk = team size × AI hrs/wk × 52 × hourly rate (annual salary ÷ 2,080 hrs) × 30% exposure coefficient · Conservative estimate based on industry breach-cost data
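The formula is easy to sanity-check. A minimal TypeScript sketch, assuming a 2,080-hour work year to convert salary to an hourly rate (the function name is ours, not Pretense's):

```typescript
// Sketch of the ROI calculator's stated formula. The 30% exposure
// coefficient and 2,080-hour work year are the calculator's assumptions.
function ipExposureRisk(
  teamSize: number,
  aiHoursPerWeek: number,
  annualSalary: number,
  exposureCoefficient: number = 0.3
): number {
  const hourlyRate = annualSalary / 2080; // 52 weeks x 40 hours
  return teamSize * aiHoursPerWeek * 52 * hourlyRate * exposureCoefficient;
}

// Calculator defaults: 25 devs, 10 h/week, $150K average salary
const risk = ipExposureRisk(25, 10, 150000); // $281,250, shown as "$281K"
const pretenseCost = 29 * 25 * 12;           // $29/seat/month x 25 devs = $8,700 ("$9K")
const roi = risk / pretenseCost;             // ~32x
```

With the calculator's defaults this reproduces the $281K risk, $9K cost, and 32x ROI figures shown above.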

How It Works

Mutation, not redaction

Redaction removes context and breaks AI output. Mutation preserves semantic structure while making your identifiers unrecognizable to any model or observer.

Proprietary mutation algorithm. Zero data retained by any LLM provider.
01

Scan

Pretense scans your code for proprietary identifiers: function names, class names, variables, and secrets.

02

Mutate

Each identifier maps deterministically to a synthetic name (getUserToken becomes _fn4a2b). Secrets are blocked at the edge.

03

Send

Mutated code goes to the LLM. Claude or ChatGPT sees only synthetic names. Your intellectual property never leaves your machine.

04

Reverse

The AI response is intercepted and every synthetic identifier is swapped back to your real name. Perfect, usable code lands in your editor.
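The four steps can be sketched end to end. This is a toy illustration, not Pretense's proprietary engine: the `_fn`/`_v` naming echoes the mutation preview, and the caller supplies the identifier list that a real scanner would discover:

```typescript
// Toy scan/mutate/reverse round trip. The naming scheme and the
// hand-supplied identifier list are illustrative assumptions;
// Pretense's actual mutation algorithm is proprietary.
class MutationMap {
  private forward: Record<string, string> = {};
  private reverse: Record<string, string> = {};
  private counter = 0;

  // Deterministic: the same identifier always maps to the same synthetic name.
  private synthetic(identifier: string, kind: "fn" | "v"): string {
    if (!(identifier in this.forward)) {
      const name = `_${kind}${this.counter++}`;
      this.forward[identifier] = name;
      this.reverse[name] = identifier;
    }
    return this.forward[identifier];
  }

  // Step 02: swap real identifiers for synthetic ones before sending.
  mutate(code: string, identifiers: Array<[string, "fn" | "v"]>): string {
    let out = code;
    for (const [id, kind] of identifiers) {
      out = out.replace(new RegExp("\\b" + id + "\\b", "g"), this.synthetic(id, kind));
    }
    return out;
  }

  // Step 04: swap synthetic names in the LLM response back to real ones.
  restore(code: string): string {
    let out = code;
    for (const syn of Object.keys(this.reverse)) {
      out = out.replace(new RegExp("\\b" + syn + "\\b", "g"), this.reverse[syn]);
    }
    return out;
  }
}

const map = new MutationMap();
const sent = map.mutate("getUserToken(userId)", [
  ["getUserToken", "fn"],
  ["userId", "v"],
]); // "_fn0(_v1)"; real names never leave the machine
const restored = map.restore(sent); // "getUserToken(userId)"
```

Determinism is what makes step 04 possible: because every occurrence of an identifier gets the same synthetic name, the reverse map restores the LLM's answer verbatim.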

Live mutation preview - identifiers transforming in real-time

Before: sent to Claude API (UNSAFE)
async function getUserToken(userId: string) {
  const payload = await verifyJwtClaims(userId);
  const apiKey = process.env.ANTHROPIC_API_KEY;
  return AuthService.createSession(payload, apiKey);
}
After: Pretense-mutated (SAFE)
async function _fn4a2b(_v9k1m: string) {
  const _v2j8n = await _fn8c3d(_v9k1m);
  const _v1e9f = process.env.BLOCKED_SECRET;
  return _cls5b7a.createSession(_v2j8n, _v1e9f);
}
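The BLOCKED_SECRET substitution in the preview can be sketched as a pattern scan at the proxy edge. The two patterns here (an Anthropic-style key prefix and AWS access key IDs) are illustrative examples, not Pretense's actual detection ruleset:

```typescript
// Illustrative edge blocking: anything matching a known credential
// pattern is replaced before the request leaves the machine.
// These two patterns are examples, not Pretense's real ruleset.
const SECRET_PATTERNS: RegExp[] = [
  /sk-ant-[A-Za-z0-9_-]{10,}/g, // Anthropic-style API keys
  /AKIA[0-9A-Z]{16}/g,          // AWS access key IDs
];

function blockSecrets(code: string): { safe: string; blocked: number } {
  let blocked = 0;
  let safe = code;
  for (const pattern of SECRET_PATTERNS) {
    safe = safe.replace(pattern, () => {
      blocked += 1;
      return "BLOCKED_SECRET";
    });
  }
  return { safe, blocked };
}

const result = blockSecrets('const key = "sk-ant-abc123def456";');
// result.safe === 'const key = "BLOCKED_SECRET";', result.blocked === 1
```

Unlike identifier mutation, this substitution is intentionally not reversible: a blocked credential should never round-trip back into the editor.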

Live Demo

Watch Pretense protect code in real-time

Scroll through to see mutation happen live. Raw proprietary code is mutated into fully synthetic output and restored, all in under 150ms.

pretense.ai/demo

Scroll to advance the demo

Use Cases

Built for teams where security isn't optional

Every industry has different compliance requirements. Pretense adapts to each.

AI-assisted clinical code without PHI leaks

The Problem

HIPAA §164.312 requires encryption in transit for any PHI. Pasting patient record schemas, SSN functions, or ICD-10 lookups into Claude or Copilot is a direct HIPAA violation, even in dev environments.

How Pretense Fixes It

Pretense mutates all patient identifiers (patientId, ssn, diagnosisCode) before they reach the LLM. Your EHR logic stays private. Your HIPAA audit log stays clean.

$1.9M
avg HIPAA fine per violation
100%
PHI identifiers mutated
0
bytes of real PHI sent to LLM
Before: sent to LLM ⚠️
// Your code (NEVER sent to LLM)
async function getPatientRecord(ssn: string, patientId: number) {
  return db.query('SELECT * FROM patient_records WHERE ssn = $1', [ssn]);
}
↓ Pretense mutation
After: safe to send ✓
// What Pretense sends to Claude/GPT-4
async function _fn4a2b(_v7c1f: string, _v3d8a: number) {
  return _v9e2c.query('SELECT * FROM _cls1b4f WHERE _v7c1f = $1', [_v7c1f]);
}

See exactly how Pretense protects your specific stack

Built for regulated industries

SOC 2 Type II · HIPAA · GDPR · PCI-DSS · ISO 27001

Comparison

Pretense vs. the alternatives

Feature | Pretense | Nightfall DLP | Manual Review
Protection method | Smart mutation | Redaction | None
LLM context preserved | Yes (semantic structure intact) | No (broken context) | N/A
Deployment time | 30 seconds | 2 weeks | Ongoing
Works locally (offline) | Yes | No (cloud SaaS) | Yes
Multi-provider (Claude + GPT + Gemini) | Yes | Limited | No
Starting price (50 seats) | $1,450/mo (Pro) | $8,000–15,000/mo | $0 + dev hours

Quick IP Risk Estimate

How much code is your team exposing?

Team size (1–500 devs) · AI hours/week (1h–40h)

Proprietary lines sent per week: 3,200

Estimated annual IP risk: $200K

Nightfall costs $5,000/month minimum. Pretense Enterprise: $99/seat - same compliance, 50x less.
Transparent Pricing

Start free. Enterprise when you need it.

Developer tier is free forever. Limited daily mutations. Upgrade for unlimited use, team features, and compliance exports.

Developer

$0 · forever free

For individual developers exploring AI security

  • Up to 500 mutations/day
  • CLI (scan, protect, restore)
  • All 5 language scanners
  • Local JSON audit log
  • Community Slack support
  • No team dashboard
  • No compliance exports
  • No SSO or SIEM
Install Free (MIT)
Most Popular

Pro

$23/seat/month

billed annually at $278/seat/year

For teams shipping AI-assisted code daily

  • Unlimited mutations
  • Team dashboard + analytics
  • SOC 2/HIPAA report export
  • VS Code + Cursor extension
  • Slack/Teams/PagerDuty alerts
  • 90-day audit log retention
  • Priority email support
Start 14-Day Free Trial

Enterprise

$79/seat/month

billed annually at $950/seat/year

For orgs requiring compliance, SSO, and SIEM at Nightfall parity

  • Everything in Pro
  • SSO/SAML/SCIM (BoxyHQ)
  • On-prem deployment option
  • Custom mutation rule engine
  • SIEM (Splunk/Sentinel/Elastic)
  • AI Agent Governance dashboard
  • 7-year audit retention
  • Dedicated CSM + SLA guarantee
  • Custom contract + volume pricing
Book Enterprise Demo

8 Enterprise spots open this quarter

Why pay Nightfall $5,000/month?

Nightfall is cloud-based and redacts (breaking LLM context). Pretense is local-first and mutates (preserving it). An Enterprise team of 50 runs $47,500/year with Pretense ($950/seat/year) vs a $60,000/year minimum with Nightfall.

Full comparison →

Trusted by 1,200+ developers at 47 companies

50x less
vs Nightfall
30 sec
Setup time
MIT
Open source core
Never
Data leaves network

Use Cases

Protection for every team

From solo founders to Fortune 500 security teams. One proxy, zero workflow disruption.

🔒

Before your next AI PR review

Every snippet sent to Claude or Copilot is mutated before it leaves. Auth logic, payment code, and database schemas stay on your machine.

See how →
🏥

HIPAA-compliant AI coding

PHI identifiers and clinical terminology are mutated before reaching any LLM. Full HIPAA audit trail ships on day one.

View compliance →
🏢

Enterprise Claude Code deployment

Route all Claude Code traffic through Pretense. One config protects your entire engineering org with no per-developer setup.

See pricing →
Trusted by engineers at Series A through Fortune 500

Start protecting your code in 30 seconds

Open source core. No credit card. Works with every LLM API. Deploy before your next AI-assisted PR.

MIT open source core · No telemetry without consent · SOC 2-compliant audit trail · HIPAA ready