Free Tool Arena


How to Use Continue.dev

Installing Continue in VS Code/JetBrains, config.yaml, choosing models, custom context providers, slash commands.

Updated April 2026 · 6 min read

Continue is an open-source AI coding assistant that runs inside VS Code and JetBrains. It is the closest thing to a drop-in GitHub Copilot alternative: you own the model choice, you own the config, and you can even run fully local.


What Continue actually is

Continue ships four primitives: inline autocomplete, a chat sidebar, a quick-edit command, and an agent mode that can run tools. Unlike Copilot, you point it at whatever model you want — Claude, GPT, Gemini, DeepSeek, a local Ollama model, or a self-hosted vLLM endpoint — and you version-control the config.

Installing

Install Continue from the VS Code Marketplace or the JetBrains plugin repository. On first launch it opens a config UI; you can also edit ~/.continue/config.yaml directly, which is what most power users end up doing. For autocomplete, a small fast model (Qwen-Coder-7B or similar on Ollama) feels better than a frontier model — latency matters more than IQ for that slot.

A first session

# ~/.continue/config.yaml
models:
  - name: Claude Sonnet
    provider: anthropic
    model: claude-sonnet-4-5
    apiKey: ${ANTHROPIC_API_KEY}
  - name: Local Autocomplete
    provider: ollama
    model: qwen2.5-coder:7b
    roles: [autocomplete]

Reload the window. Hit the chat shortcut, select a block of code, ask “why does this allocate twice?” and you will get an answer with the right file context attached. Use @ in the chat input to pin specific files, docs, or the terminal output into the prompt.
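Those @ mentions are backed by context providers declared in config.yaml. A minimal sketch — the provider names below are drawn from Continue's built-in set, so treat the exact list as something to verify against your installed version:

```yaml
# ~/.continue/config.yaml (excerpt) — enable @ context providers
context:
  - provider: file      # @-mention individual files
  - provider: docs      # @-mention indexed documentation sites
  - provider: terminal  # pull recent terminal output into the prompt
  - provider: diff      # attach the current git diff
```

With these in place, typing @ in the chat input surfaces each provider as a source you can pin into the conversation.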

Configuration that matters

The config.yaml is the entire product surface. Define multiple models with roles (chat, edit, autocomplete, apply, embed, rerank) so you can use Sonnet for chat and a cheap local model for inline completions. Add docs entries pointing at library docs you care about — Continue will index them and let you @ reference them in prompts, which is the killer feature for unfamiliar frameworks.
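A sketch of that split, extending the config from the first session — Sonnet handles chat and edits, the local model handles completions, and a docs entry points Continue at a library to index (the FastAPI URL is just an illustrative example; swap in whatever docs you actually use):

```yaml
# ~/.continue/config.yaml (excerpt) — role split plus an indexed docs source
models:
  - name: Claude Sonnet
    provider: anthropic
    model: claude-sonnet-4-5
    apiKey: ${ANTHROPIC_API_KEY}
    roles: [chat, edit, apply]   # frontier model where quality matters
  - name: Local Autocomplete
    provider: ollama
    model: qwen2.5-coder:7b
    roles: [autocomplete]        # small fast model where latency matters

docs:
  - name: FastAPI
    startUrl: https://fastapi.tiangolo.com/
```

Once indexed, @FastAPI in the chat input pulls relevant pages into the prompt alongside your code.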

Team workflow

Continue supports team configs via the Hub: one person defines the model setup, everyone else pulls it. This is how you avoid the Copilot problem where half the team is on a different model with no way to standardize. Commit a repo-level .continue/ and you can ship prompt templates that encode house style.
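A repo-level prompt template might look like the following. This is a hypothetical sketch: the `.continue/prompts/` location and the name/description header fields are assumptions based on Continue's prompt-file feature, so check them against the current docs before committing:

```yaml
# .continue/prompts/review.prompt  (hypothetical example)
name: house-review
description: Review selected code against our house style
---
Review the selected code for:
- error handling that silently swallows exceptions
- missing docstrings on public functions
- naming that deviates from our conventions
```

Because the file lives in the repo, every teammate who pulls gets the same slash-command-style prompt with zero local setup.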

When Continue shines

Teams that want Copilot ergonomics but need to self-host, comply with data residency rules, or simply refuse to be locked to one vendor. It is also great for solo devs who want to mix a local autocomplete model with a cloud chat model to keep costs flat.

When not to use it

If you want a fully autonomous agent that plans multi-step changes, Continue’s agent mode works but Cline and Cursor feel more polished for that flow. And the autocomplete is only as good as the model you wire up — plugging a frontier chat model into the autocomplete role makes typing miserable, so always use a small fast model there.

