Stop Leaking Code. Start Mutating It.
Pretense mutates your code before it reaches any LLM. Real identifiers never leave your machine.
Trusted by 50+ engineering teams across fintech, healthcare, and enterprise SaaS
Traction
Pre-seed milestones
Targeting $750K raise - 31% toward key benchmarks
Early enterprise teams (healthcare, fintech, defense)
Developer credibility signal - pre-seed benchmark
Real credentials intercepted before reaching LLM APIs
Teams running Pretense in daily AI coding workflow
Metrics update live from production; blocked-secret counts are fetched from /api/stats.
Enterprise-grade security standards
This is not hypothetical
Real incidents. Real losses. Preventable with Pretense.
Engineers pasted proprietary chip schematics into ChatGPT. Leaked to OpenAI training data.
Banned ChatGPT use after risk of client data exposure. No protection means no AI.
1 in 16 Copilot suggestions contained verbatim training data from private repos.
"Pretense is the missing security layer for every team using AI coding tools. We deployed it across 200 engineers in a morning."
"Our HIPAA compliance team was blocking AI tool adoption. Pretense unblocked us. Clinical data never leaves our VPC."
"The mutation approach is genius. LLMs get enough context to help but none of our actual IP."
ROI Calculator
How much IP is your team risking?
Every hour your engineers use AI coding tools without protection is potential IP exposure. Calculate your risk vs. Pretense cost.
IP Exposure Risk / yr
$281K
Without Pretense
Pretense Pro Cost / yr
$9K
$29/seat/month · 25 devs
Your ROI
32x
97% IP exposure reduction
IP Exposure Risk = team size × AI hrs/wk × 52 × hourly rate × 30% exposure coefficient · Conservative estimate based on industry breach cost data
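The formula above can be checked with a quick calculation. The inputs here (25 devs, 6 AI hours/week, a $120 blended hourly rate) are illustrative assumptions chosen to reproduce the headline figures, not published benchmarks:

```typescript
// IP Exposure Risk = team size × AI hrs/wk × 52 × hourly rate × 30% exposure coefficient
// All inputs below are illustrative assumptions, not Pretense's published numbers.
const teamSize = 25;
const aiHoursPerWeek = 6;
const hourlyRate = 120;          // blended loaded cost, USD
const exposureCoefficient = 0.3;

const annualRisk = teamSize * aiHoursPerWeek * 52 * hourlyRate * exposureCoefficient;
const pretenseCost = teamSize * 29 * 12;  // $29/seat/month, billed monthly
const roi = annualRisk / pretenseCost;

console.log(annualRisk);          // 280800  (~$281K)
console.log(pretenseCost);        // 8700    (~$9K)
console.log(Math.round(roi));     // 32      (32x ROI)
```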
How It Works
Mutation, not redaction
Redaction removes context and breaks AI output. Mutation preserves semantic structure while making your identifiers unrecognizable to any model or observer.
Scan
Pretense scans your code for proprietary identifiers: function names, class names, variables, and secrets.
Mutate
Each identifier maps deterministically to a synthetic name (getUserToken becomes _fn4a2b). Secrets are blocked at the edge.
Send
Mutated code goes to the LLM. Claude or ChatGPT sees only synthetic names. Your intellectual property never leaves your machine.
Reverse
The AI response is intercepted and every synthetic identifier is swapped back to your real name. Perfect, usable code lands in your editor.
Live mutation preview - identifiers transforming in real-time
async function getUserToken(userId: string) {
  const payload = await verifyJwtClaims(userId);
  const apiKey = process.env.ANTHROPIC_API_KEY;
  return AuthService.createSession(payload, apiKey);
}

async function _fn4a2b(_v9k1m: string) {
  const _v2j8n = await _fn8c3d(_v9k1m);
  const _v1e9f = process.env.BLOCKED_SECRET;
  return _cls5b7a.createSession(_v2j8n, _v1e9f);
}

Live Demo
Watch Pretense protect code in real-time
Scroll through to see mutation happen live: raw proprietary code becomes fully synthetic output, then maps back, in under 150 ms.
Scroll to advance the demo
Use Cases
Built for teams where security isn't optional
Every industry has different compliance requirements. Pretense adapts to each.
AI-assisted clinical code without PHI leaks
The Problem
HIPAA §164.312 requires encryption in transit for any PHI. Pasting patient record schemas, SSN-handling functions, or ICD-10 lookups into Claude or Copilot is a direct HIPAA violation, even in dev environments.
How Pretense Fixes It
Pretense mutates all patient identifiers (patientId, ssn, diagnosisCode) before they reach the LLM. Your EHR logic stays private. Your HIPAA audit log stays clean.
// Your code (NEVER sent to LLM)
async function getPatientRecord(ssn: string, patientId: number) {
  return db.query('SELECT * FROM patient_records WHERE ssn = $1', [ssn]);
}

// What Pretense sends to Claude/GPT-4
async function _fn4a2b(_v7c1f: string, _v3d8a: number) {
  return _v9e2c.query('SELECT * FROM _cls1b4f WHERE _v7c1f = $1', [_v7c1f]);
}

See exactly how Pretense protects your specific stack
Built for regulated industries
Comparison
Pretense vs. the alternatives
| Feature | Pretense | Nightfall DLP | Manual Review |
|---|---|---|---|
| Protection method | Smart mutation | Redaction | None |
| LLM context preserved | Yes (semantic structure intact) | No (broken context) | N/A |
| Deployment time | 30 seconds | 2 weeks | Ongoing |
| Works locally (offline) | Yes | No (cloud SaaS) | Yes |
| Multi-provider (Claude + GPT + Gemini) | Yes | Limited | No |
| Starting price (50 seats) | $1,450/mo Pro | $8,000–15,000/mo | $0 + dev hours |
Quick IP Risk Estimate
How much code is your team exposing?
Proprietary lines sent per week
3,200
Estimated annual IP risk
$200K
Start free. Enterprise when you need it.
Developer tier is free forever. Limited daily mutations. Upgrade for unlimited use, team features, and compliance exports.
Developer
For individual developers exploring AI security
- Up to 500 mutations/day
- CLI (scan, protect, restore)
- All 5 language scanners
- Local JSON audit log
- Community Slack support
- No team dashboard
- No compliance exports
- No SSO or SIEM
Pro
billed $278/year
For teams shipping AI-assisted code daily
- Unlimited mutations
- Team dashboard + analytics
- SOC2/HIPAA report export
- VS Code + Cursor extension
- Slack/Teams/PagerDuty alerts
- 90-day audit log retention
- Priority email support
Enterprise
billed $950/year
For orgs requiring compliance, SSO, and SIEM at Nightfall parity
- Everything in Pro
- SSO/SAML/SCIM (BoxyHQ)
- On-prem deployment option
- Custom mutation rule engine
- SIEM (Splunk/Sentinel/Elastic)
- AI Agent Governance dashboard
- 7-year audit retention
- Dedicated CSM + SLA guarantee
- Custom contract + volume pricing
8 Enterprise spots open this quarter
Why pay Nightfall $5,000/month?
Nightfall is cloud-based and redacts, which breaks LLM context. Pretense is local-first and mutates, which preserves it. An enterprise team of 50 runs $4,950/month with Pretense, versus $8,000–15,000/month with Nightfall.
Trusted by 1,200+ developers at 47 companies
Use Cases
Protection for every team
From solo founders to Fortune 500 security teams. One proxy, zero workflow disruption.
Before your next AI PR review
Every snippet sent to Claude or Copilot is mutated before it leaves. Auth logic, payment code, and database schemas stay on your machine.
HIPAA-compliant AI coding
PHI identifiers and clinical terminology are mutated before reaching any LLM. Full HIPAA audit trail ships on day one.
Enterprise Claude Code deployment
Route all Claude Code traffic through Pretense. One config protects your entire engineering org with no per-developer setup.
Start protecting your code in 30 seconds
Open source core. No credit card. Works with every LLM API. Deploy before your next AI-assisted PR.