Free Tool Arena


System prompt


Updated May 2026 · 4 min read

Definition

A system prompt is the persistent instruction sent to an LLM before any user messages. It defines the AI's role, style, behavior, and constraints. Most providers cache it, so investing in a good one is cheap.

What it means

Every modern LLM API supports a separate system role for setup instructions. The best system prompts include: a role and domain ("You are a senior engineer specializing in TypeScript"), an audience ("the user is a mid-level developer"), a style ("concise, no preamble"), must/never lists, and one or two examples of desired output. A typical length is 200-2,000 tokens. Anthropic, Gemini, and OpenAI cache system tokens at roughly 10% of the normal input price, so keep the prompt stable across requests to stay cached.
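The structure above can be sketched as a small helper that assembles a system prompt and pairs it with each user message. This is an illustrative OpenAI-style chat payload; exact field names vary by provider, and the prompt text is a made-up example.

```python
# A stable system prompt built from the ingredients listed above:
# role + domain, audience, style, and must/never rules.
SYSTEM_PROMPT = "\n".join([
    # Role + domain
    "You are a senior engineer specializing in TypeScript.",
    # Audience
    "The user is a mid-level developer.",
    # Style
    "Be concise; no preamble.",
    # Must / never lists
    "Always include type annotations in code samples.",
    "Never suggest deprecated APIs.",
])

def build_messages(user_text: str) -> list[dict]:
    """Pair the stable system prompt with a fresh user message."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_text},
    ]

messages = build_messages("How do I narrow a union type?")
```

Keeping `SYSTEM_PROMPT` as a module-level constant (rather than rebuilding it per request) is what makes provider-side caching effective: the cached prefix must be byte-identical across calls.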


Why it matters

A good system prompt yields consistent output without repeating instructions in every message, making it the highest-leverage prompt-engineering investment. With prompt caching, you pay roughly 10% of the full input price for system tokens after the first request, which makes longer, more detailed system prompts genuinely affordable.
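The cost math behind that claim can be worked through in a few lines. This is a simplified sketch: the 10% cached-read rate is illustrative, real discounts and cache-write surcharges vary by provider, and the $3-per-million price is a made-up example.

```python
def cached_input_cost(system_tokens: int, requests: int,
                      price_per_token: float,
                      cache_discount: float = 0.10) -> float:
    """Total cost of the system prompt across many requests when
    cached reads are billed at ~10% of the normal input price."""
    first_request = system_tokens * price_per_token  # full price once
    cached_reads = (system_tokens * price_per_token
                    * cache_discount * (requests - 1))
    return first_request + cached_reads

# A 2,000-token system prompt over 100 requests at $3 per million
# input tokens:
full_price = 2000 * 100 * 3e-6                 # $0.60 uncached
with_cache = cached_input_cost(2000, 100, 3e-6)  # ~$0.065
```

At that scale the system prompt costs about a tenth of what it would uncached, which is why a detailed 2,000-token prompt is cheap to run.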


Frequently asked questions

How long should it be?

200-2,000 tokens covers most use cases. Custom GPT instructions are capped at 8,000 tokens. Beyond that, returns diminish quickly and you risk model drift.
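A quick way to check where a draft prompt falls in that range is a character-based estimate. This is a crude heuristic (roughly 4 characters per token for English prose); use the provider's own tokenizer for exact counts.

```python
def rough_token_count(text: str) -> int:
    """Estimate token count at ~4 characters per token.

    Only a sanity check against the 200-2,000-token guideline;
    real tokenizers will differ, especially for code or non-English
    text.
    """
    return max(1, len(text) // 4)

draft = "You are a senior engineer specializing in TypeScript. " * 20
estimate = rough_token_count(draft)
in_range = 200 <= estimate <= 2000
```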

Should I include examples?

Yes — 1-3 examples of ideal Q&A pairs anchor the style better than abstract instructions alone.
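One way to wire those examples in is to append formatted Q&A pairs to the end of the system prompt. The pairs below are hypothetical, and the `Q:`/`A:` formatting is just one convention; any consistent layout works.

```python
# Hypothetical Q&A pairs that demonstrate the desired answer style.
EXAMPLES = [
    ("How do I type a React prop?",
     "Use an interface: interface Props { title: string }"),
    ("Should I use any?",
     "Avoid it; prefer unknown and narrow with type guards."),
]

def with_examples(base_prompt: str,
                  examples: list[tuple[str, str]]) -> str:
    """Append formatted Q&A examples to a base system prompt."""
    parts = [base_prompt, "", "Examples of ideal answers:"]
    for question, answer in examples:
        parts += [f"Q: {question}", f"A: {answer}", ""]
    return "\n".join(parts).rstrip()

prompt = with_examples("You are a concise TypeScript assistant.",
                       EXAMPLES)
```

Because the examples live inside the system prompt, they are cached along with the rest of it and cost the discounted rate on every request after the first.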
