Anonymize sensitive data before sharing with AI

Privatiser redacts IPs, API keys, passwords, PII, and cloud identifiers from your text. Paste the anonymized version into ChatGPT, Claude, or Gemini — then restore the real data in the AI's response.

Everything runs locally. Nothing leaves your machine.

What it catches

🔑

Secrets

API keys (AWS, OpenAI, GitHub, Slack), JWTs, bearer tokens, PEM keys, connection strings, hex secrets, SSH keys
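
Privatiser's actual detection rules aren't listed here, but a minimal sketch of this kind of matching (illustrative patterns only) looks like:

import re

# Illustrative patterns only, not Privatiser's real rule set: AWS access key IDs
# start with "AKIA", classic GitHub personal access tokens with "ghp_".
SECRET_PATTERNS = {
    "AWS_KEY": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "GITHUB_TOKEN": re.compile(r"\bghp_[0-9A-Za-z]{36}\b"),
}

def find_secrets(text):
    return [(label, m.group()) for label, pattern in SECRET_PATTERNS.items()
            for m in pattern.finditer(text)]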

🌐

Network

IPv4 addresses (with CIDR), domain names, email addresses, MAC addresses, internal URLs with ports
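
For illustration (again, not the tool's exact rules), an IPv4 address with an optional CIDR suffix can be caught by a single pattern:

import re

# Simplified, hypothetical pattern: matches 10.0.4.17 and 192.168.0.0/24,
# without validating octet ranges the way a production rule would.
IPV4_CIDR = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}(?:/\d{1,2})?\b")

print(IPV4_CIDR.findall("host 10.0.4.17 routes via 192.168.0.0/24"))
# ['10.0.4.17', '192.168.0.0/24']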

👤

PII

US/UK phone numbers, credit cards (Luhn-validated), Social Security numbers, passport numbers, IBANs
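
"Luhn-validated" means a digit sequence is only treated as a card number if it passes the standard Luhn checksum, which filters out arbitrary digit runs. The check itself is a few lines of Python:

def luhn_valid(number: str) -> bool:
    # Luhn checksum: double every second digit from the right,
    # subtract 9 from any result over 9, and require a total divisible by 10.
    digits = [int(d) for d in number if d.isdigit()]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4111 1111 1111 1111"))  # True  (a well-known test card number)
print(luhn_valid("4111 1111 1111 1112"))  # False (checksum fails)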

☁️

Cloud

AWS account IDs, ARNs (structure preserved), S3 buckets, Azure subscription IDs, GCP project IDs
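
"Structure preserved" means an ARN keeps its arn:partition:service:region:account:resource shape so the AI can still reason about it; only the identifying fields are swapped out. A rough sketch (the placeholder name is made up, not Privatiser's actual format):

def redact_arn(arn: str) -> str:
    # ARNs look like arn:partition:service:region:account-id:resource.
    # Keep the shape; replace only the account ID here (a real rule would
    # also pseudonymize resource names).
    parts = arn.split(":", 5)
    parts[4] = "ACCOUNT_1"  # hypothetical placeholder
    return ":".join(parts)

print(redact_arn("arn:aws:iam::123456789012:role/deploy-role"))
# arn:aws:iam::ACCOUNT_1:role/deploy-role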

🔀

Reversible

Every redaction has a mapping. Paste the AI response back and restore all original values with one click.
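
Conceptually, restoring is just applying the saved mapping in reverse to whatever the AI sends back (the pseudonym format below is illustrative, not necessarily what Privatiser emits):

# The mapping pairs each pseudonym with the original value it replaced.
mapping = {
    "IP_ADDRESS_1": "10.0.4.17",
    "DOMAIN_1": "billing.internal.example.com",
}

def restore(ai_response: str, mapping: dict) -> str:
    # Swap every pseudonym back for its original value.
    for pseudonym, original in mapping.items():
        ai_response = ai_response.replace(pseudonym, original)
    return ai_response

print(restore("Point DOMAIN_1 at IP_ADDRESS_1 and reload nginx.", mapping))
# Point billing.internal.example.com at 10.0.4.17 and reload nginx.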

🔒

100% Local

No servers, no telemetry, no data collection. The anonymization engine runs entirely in your browser or terminal.

How it works

1

Paste your text

Paste a config file, log output, code snippet, or .env file into the input box.

2

Anonymize

Privatiser detects and replaces sensitive values with consistent pseudonyms (sketched below, after these steps). A mapping is saved so you can reverse it.

3

Share with AI

Copy the anonymized text and paste it into ChatGPT, Claude, Gemini, or any AI tool. The structure is preserved so the AI can still help.

4

Restore

Paste the AI's response into the Deanonymize tab with the mapping to get back the real values.
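
To make steps 2 and 4 concrete, here is a toy version of consistent pseudonym assignment: the same original value always gets the same placeholder, and the mapping needed for the restore step falls out as a by-product. This is a sketch of the idea, not Privatiser's implementation.

import re

def anonymize(text: str):
    # Toy detector: IPv4 addresses only; the real tool covers many more types.
    pattern = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
    mapping = {}  # pseudonym -> original, saved so the restore step can reverse it

    def replace(match):
        original = match.group()
        for pseudonym, value in mapping.items():
            if value == original:
                return pseudonym  # same value, same pseudonym, every time
        pseudonym = f"IP_ADDRESS_{len(mapping) + 1}"
        mapping[pseudonym] = original
        return pseudonym

    return pattern.sub(replace, text), mapping

anonymized, mapping = anonymize("10.0.4.17 cannot reach 10.0.9.3; retrying 10.0.4.17")
print(anonymized)  # IP_ADDRESS_1 cannot reach IP_ADDRESS_2; retrying IP_ADDRESS_1
print(mapping)     # {'IP_ADDRESS_1': '10.0.4.17', 'IP_ADDRESS_2': '10.0.9.3'}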

Install

Browser Extension

Auto-anonymizes when you paste into AI chat sites. Auto-restores when you copy the response.

CLI / Python

Pipe files through the CLI, use as a Python library, or run the local web UI.

pip install privatiser
# anonymize a Terraform config read from stdin
cat config.tf | privatiser anonymize -m mapping.json

# anonymize a .env file
privatiser anonymize .env --env -m mapping.json

# restore the original values in an AI response
privatiser deanonymize response.txt -m mapping.json

# launch the local web UI
privatiser ui
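
For library use, the exact Python API isn't shown on this page, so the names below are assumptions modeled on the CLI verbs; check the package's own documentation:

import privatiser  # hypothetical API: verify function names and signatures against the docs

safe_text, mapping = privatiser.anonymize("db host is 10.0.4.17")  # assumed signature
ai_response = "Try pinging IP_ADDRESS_1 first."                    # stands in for the AI's reply
print(privatiser.deanonymize(ai_response, mapping))                # assumed signature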