Jailbreak Risk Scorer
Score an input prompt 0-10 for jailbreak risk — flags common prompt-injection patterns and DAN-style attempts.
Updated April 2026
Paste a prompt to get a heuristic jailbreak risk score based on known attack patterns. This is a keyword check—not a substitute for a real moderation model.
Example result:
Risk score: 10/10
Band: High
Flagged terms (5): ignore previous, system prompt, pretend you, dan, no restrictions
Heuristic only. Real jailbreak detection requires a fine-tuned classifier, semantic analysis, and context about the target system.
What it does
Heuristic score for jailbreak and prompt-injection risk — fast smoke test before sending to a model.
Runs entirely in your browser — no upload, no account, no watermark. For more tools in this category see the full tools index.
How to use it
- Paste the prompt you want to check.
- Review the score, the band, and the flagged terms.
- Harden your system prompt or input filtering accordingly.
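The keyword heuristic behind a tool like this can be sketched in a few lines of TypeScript. The pattern list, per-hit weight, and band thresholds below are illustrative assumptions, not the tool's actual rules:

```typescript
// Illustrative pattern list — a real deployment would maintain a much
// larger, regularly updated set of jailbreak/injection markers.
const PATTERNS: string[] = [
  "ignore previous",
  "system prompt",
  "pretend you",
  "dan",
  "no restrictions",
];

interface RiskReport {
  score: number; // 0-10
  band: "Low" | "Medium" | "High";
  flagged: string[];
}

// Escape regex metacharacters so patterns are matched literally.
function escapeRegExp(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

function scoreJailbreakRisk(prompt: string): RiskReport {
  // Word-boundary matching so "dan" flags "DAN" but not "dangerous".
  const flagged = PATTERNS.filter((p) =>
    new RegExp(`\\b${escapeRegExp(p)}\\b`, "i").test(prompt)
  );
  // Assumed weighting: two points per hit, capped at 10.
  const score = Math.min(10, flagged.length * 2);
  const band = score >= 7 ? "High" : score >= 4 ? "Medium" : "Low";
  return { score, band, flagged };
}
```

Because this is a pure substring/regex check, trivially obfuscated attacks ("1gn0re prev1ous") sail through — which is exactly why the page hedges it as a smoke test rather than a moderation model.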