Open-source AI code assistant with deep IDE integration. Extensible context providers, pluggable model backends, slash commands, autocomplete, and agentic workflows across VS Code and JetBrains.
Continue is an open-source AI code assistant that connects any LLM to any IDE. It provides chat, autocomplete, inline editing, and agentic code generation -- all running inside VS Code, JetBrains, or the CLI with zero vendor lock-in.
The architecture follows a core + extensions pattern: a shared TypeScript core handles all AI logic (model routing, context retrieval, indexing, tool execution), while thin IDE-specific extensions implement the IDE interface to bridge editor APIs. A React-based GUI (webview) provides the chat/agent UI, communicating with the extension host via a typed message protocol.
| Capability | Location | Description |
|---|---|---|
| Chat & Agent | core/ | Conversational code assistance with plan-based autonomous execution. Streams responses from any configured model with full context awareness. |
| Autocomplete | core/autocomplete/ | Real-time code suggestions using fill-in-the-middle (FIM) models. Debounced, token-budget-aware, with inline ghost text rendering. |
| Edit | core/edit/ | Select code and describe changes in natural language. The model generates a diff that is applied in place with a review UI. |
| Context providers | core/context/ | 31+ built-in context providers surface files, codebase search, docs, git history, terminals, databases, and web content to the model. |
| Tools | core/tools/ | Agentic tool execution for file editing, terminal commands, web search, and MCP server integration. Policy-controlled access. |
| Indexing | core/indexing/ | Four-layer indexing pipeline: code snippets (tree-sitter), full-text search (SQLite FTS5), chunk embeddings, and vector search (LanceDB). |

The system is split into four layers: the IDE extension (VS Code / JetBrains / CLI), the core engine (shared TypeScript), the GUI webview (React + Redux), and external model providers. The extension implements the IDE interface, the GUI communicates via a typed webview protocol, and the core orchestrates everything.
The core/ package is the brain of Continue. It is a pure TypeScript library with no IDE dependencies, making it portable across VS Code, JetBrains, and CLI. It defines the key abstractions (ILLM, IContextProvider, IDE) and orchestrates all AI interactions.
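To make the portability claim concrete, here is a minimal sketch of what implementing the IDE abstraction could look like. The method names and signatures below are illustrative assumptions, not the actual 40+-method API; the in-memory class stands in for a real editor adapter.

```typescript
// Illustrative slice of the IDE abstraction. Real method names and
// signatures in core/ may differ; this is a sketch, not the actual API.
interface IDE {
  readFile(filepath: string): Promise<string>;
  writeFile(filepath: string, contents: string): Promise<void>;
  getWorkspaceDirs(): Promise<string[]>;
  runCommand(command: string): Promise<void>;
}

// Toy in-memory implementation, e.g. for unit tests or a headless target.
class InMemoryIde implements IDE {
  private files = new Map<string, string>();

  async readFile(filepath: string): Promise<string> {
    return this.files.get(filepath) ?? "";
  }
  async writeFile(filepath: string, contents: string): Promise<void> {
    this.files.set(filepath, contents);
  }
  async getWorkspaceDirs(): Promise<string[]> {
    return ["/workspace"];
  }
  async runCommand(_command: string): Promise<void> {
    // no-op here; a real extension would shell out via the editor's API
  }
}
```

Because the core only ever talks to this interface, adding a new editor target means writing one adapter class rather than porting any AI logic.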
Key architectural decisions:

- The IDE interface defines 40+ methods for file I/O, navigation, subprocess execution, git operations, and workspace access. Each extension implements this interface.
- The protocol/ module defines typed messages for all GUI-to-core interactions, ensuring type safety across the webview boundary.

Context providers are the mechanism by which Continue surfaces relevant information to the LLM. Users invoke them with @ mentions in chat (e.g., @file, @codebase, @docs). Each provider implements the IContextProvider interface with two key methods: getContextItems() to fetch content and loadSubmenuItems() for dynamic option lists.
The retrieval subsystem (context/retrieval/) powers the @codebase provider, combining vector similarity search from the indexing pipeline with full-text search for hybrid retrieval. The @docs provider crawls and indexes documentation sites, storing embeddings for semantic search.
The MCP Context Provider bridges the Model Context Protocol, allowing any MCP server to surface context items. The HTTP Context Provider enables fetching context from arbitrary REST endpoints.
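A custom provider built on the two methods named above might look like the following sketch. The ContextItem shape and constructor options are assumptions; only getContextItems() and loadSubmenuItems() come from the description above.

```typescript
// Assumed minimal shape of a context item; the real type carries more fields.
interface ContextItem {
  name: string;
  description: string;
  content: string;
}

interface IContextProvider {
  title: string; // the @ mention, e.g. "docs" for @docs
  getContextItems(query: string): Promise<ContextItem[]>;
  loadSubmenuItems(): Promise<string[]>;
}

// Toy provider that surfaces entries from a fixed map of documents.
class StaticDocsProvider implements IContextProvider {
  title = "docs";
  constructor(private docs: Record<string, string>) {}

  // Fetch the content for a selected submenu entry.
  async getContextItems(query: string): Promise<ContextItem[]> {
    const content = this.docs[query];
    return content
      ? [{ name: query, description: `Docs for ${query}`, content }]
      : [];
  }

  // Populate the dropdown shown after typing the @ mention.
  async loadSubmenuItems(): Promise<string[]> {
    return Object.keys(this.docs);
  }
}
```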
Continue's LLM abstraction layer decouples AI capabilities from specific providers. The ILLM interface defines methods for chat, completion, FIM, embeddings, and reranking. Each provider implements this interface, and models are assigned to specific roles in the config.
| Role | Interface Method | Description |
|---|---|---|
| Chat | streamChat() | Powers conversational interactions, agent mode, and code explanation |
| Edit | streamChat() | Handles complex code transformations and refactoring tasks |
| Apply | streamChat() | Executes targeted, surgical code modifications |
| Autocomplete | streamFim() | Real-time fill-in-the-middle code suggestions |
| Embedding | embed() | Transforms code into vectors for semantic search and indexing |
| Reranker | rerank() | Re-orders search results by semantic relevance |
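The role methods in the table above can be sketched as a single interface. The signatures and the echo backend below are illustrative assumptions; the real ILLM carries many more options (streaming control, cancellation, model settings).

```typescript
interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

// Assumed minimal signatures for the four role methods from the table.
interface ILLM {
  streamChat(messages: ChatMessage[]): AsyncGenerator<string>;
  streamFim(prefix: string, suffix: string): AsyncGenerator<string>;
  embed(chunks: string[]): Promise<number[][]>;
  rerank(query: string, chunks: string[]): Promise<number[]>;
}

// Trivial backend showing the contract; not a real provider.
class EchoLLM implements ILLM {
  async *streamChat(messages: ChatMessage[]): AsyncGenerator<string> {
    yield `echo: ${messages[messages.length - 1].content}`;
  }
  async *streamFim(prefix: string, suffix: string): AsyncGenerator<string> {
    yield `/* fill between ${prefix.length}- and ${suffix.length}-char context */`;
  }
  async embed(chunks: string[]): Promise<number[][]> {
    return chunks.map((c) => [c.length]); // trivial 1-dimensional "embedding"
  }
  async rerank(query: string, chunks: string[]): Promise<number[]> {
    return chunks.map((c) => (c.includes(query) ? 1 : 0)); // crude relevance
  }
}
```

Because every provider implements the same contract, swapping a hosted model for a local one is a config change, not a code change.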
Slash commands are user-invoked actions triggered by typing / in the chat input. They extend the assistant's capabilities beyond conversation. Tools are model-invoked actions that enable agentic behavior -- the LLM decides when and how to use them during execution.
The tool system uses a policy layer to control which tools the model can invoke. Policies can be set per-tool to allow, deny, or require confirmation. The MCP bridge maps external MCP server tools into Continue's tool namespace, enabling seamless integration with any MCP-compatible service.
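A policy gate like the one described can be sketched as follows. The type names and the default-to-confirmation behavior are illustrative assumptions, not Continue's actual API.

```typescript
// Per-tool policy: run freely, block, or require user confirmation.
type ToolPolicy = "allow" | "deny" | "ask";

interface Tool {
  name: string;
  run(args: Record<string, string>): Promise<string>;
}

class PolicyGate {
  constructor(private policies: Record<string, ToolPolicy>) {}

  // Resolve the policy before a model-invoked tool actually runs.
  decide(toolName: string): ToolPolicy {
    return this.policies[toolName] ?? "ask"; // unknown tools need confirmation
  }

  async invoke(
    tool: Tool,
    args: Record<string, string>,
    userConfirmed = false,
  ): Promise<string> {
    const policy = this.decide(tool.name);
    if (policy === "deny") {
      throw new Error(`tool ${tool.name} is denied by policy`);
    }
    if (policy === "ask" && !userConfirmed) {
      throw new Error(`tool ${tool.name} requires confirmation`);
    }
    return tool.run(args);
  }
}
```

Mapping MCP server tools into the same namespace means they pass through this gate like any built-in tool.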
Continue indexes the entire codebase to power @codebase search, autocomplete context, and agent retrieval. The pipeline uses a content-addressed tagging system so switching branches only re-indexes changed files.
| Index | Source | Storage | Use Case |
|---|---|---|---|
| CodeSnippetsIndex | Tree-sitter AST queries | SQLite | Function/class lookup, symbol navigation |
| FullTextSearchCodebaseIndex | Raw file content | SQLite FTS5 | Keyword search, grep-like queries |
| ChunkCodebaseIndex | Recursive code chunking | References | Embedding input preparation |
| LanceDbIndex | Chunk embeddings | LanceDB (vector) | Semantic similarity search |
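The branch-switching optimization rests on content addressing: a file is re-indexed only if its content hash changed since the last run. The sketch below illustrates that skip logic; the hash function and bookkeeping are simplified stand-ins for the real pipeline.

```typescript
// Simplified content hash (stand-in for a real cryptographic digest).
function contentHash(contents: string): string {
  let h = 0;
  for (const ch of contents) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  return h.toString(16);
}

class IncrementalIndex {
  private seen = new Map<string, string>(); // path -> last indexed hash
  public indexedCount = 0;

  update(path: string, contents: string): void {
    const hash = contentHash(contents);
    if (this.seen.get(path) === hash) return; // unchanged: skip all layers
    this.seen.set(path, hash);
    this.indexedCount++; // stand-in for snippet/FTS/embedding/vector work
  }
}
```

Switching to a branch where most files are identical therefore touches only the handful of paths whose hashes differ.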
Continue supports three IDE targets through the shared IDE interface. Each extension is a thin adapter that translates IDE-specific APIs into the common interface, allowing the core engine to remain completely IDE-agnostic.
Implements IDE via VsCodeIde.ts. Uses VS Code's webview API for the GUI panel. Registers commands, IntelliSense providers, diff viewers, and terminal listeners. Communication flows through webviewProtocol.ts.
Written in Kotlin (extensions/intellij/), following JetBrains platform conventions. Uses JCEF (Java Chromium Embedded Framework) for the webview. Communicates with the TypeScript core through a proxy layer. Hooks into the IDE lifecycle via activities and services.

The CLI runs Continue outside any IDE. It supports both an interactive TUI mode for terminal-based chat and a headless mode for CI/CD pipelines and automated checks. Powered by the cn CLI tool.
The GUI is a React application rendered inside the IDE's webview panel. It uses Redux for global state management and React Context for localized state. The same codebase is shared across VS Code and JetBrains (via JCEF).
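A typed webview protocol of the kind described can be sketched like this. The message names and payload shapes are hypothetical; the point is that one protocol type maps each message to its payload and response types, so handlers and requests are checked at compile time on both sides of the boundary.

```typescript
// Hypothetical protocol map: message name -> { payload, response } types.
interface ToCoreProtocol {
  "chat/send": { payload: { message: string }; response: { reply: string } };
  "config/get": { payload: undefined; response: { modelTitle: string } };
}

type Handler<K extends keyof ToCoreProtocol> = (
  payload: ToCoreProtocol[K]["payload"],
) => ToCoreProtocol[K]["response"];

class Messenger {
  private handlers = new Map<string, (p: unknown) => unknown>();

  // Register a handler; the payload type is inferred from the message name.
  on<K extends keyof ToCoreProtocol>(type: K, handler: Handler<K>): void {
    this.handlers.set(type, handler as unknown as (p: unknown) => unknown);
  }

  // Dispatch a message; passing the wrong payload fails to compile.
  request<K extends keyof ToCoreProtocol>(
    type: K,
    payload: ToCoreProtocol[K]["payload"],
  ): ToCoreProtocol[K]["response"] {
    const h = this.handlers.get(type);
    if (!h) throw new Error(`no handler for ${type}`);
    return h(payload) as ToCoreProtocol[K]["response"];
  }
}
```

In the real system the transport differs per IDE (VS Code webview messaging vs. the JCEF bridge), but the protocol types are shared, which is what keeps the single React codebase portable.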
Continue uses a layered YAML configuration system. The primary config file lives at ~/.continue/config.yaml (global) or .continue/config.yaml (workspace). Configuration covers model selection, context providers, rules, MCP servers, and tool permissions.
Configuration supports environment variable references for secrets (e.g., $ANTHROPIC_API_KEY), workspace-level overrides that merge with global settings, and Mission Control -- a web interface for managing configurations across teams.
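An illustrative config putting these pieces together might look like the sketch below. Field names follow the features described above (models with roles, env-var secrets, context providers), but the exact schema varies between versions, and the model ids shown are placeholders.

```yaml
# Illustrative ~/.continue/config.yaml -- schema and model ids are placeholders.
name: my-assistant

models:
  - name: Hosted chat model
    provider: anthropic
    model: claude-sonnet          # placeholder model id
    apiKey: $ANTHROPIC_API_KEY    # env-var reference keeps the secret out of the file
    roles: [chat, edit, apply]
  - name: Local FIM model
    provider: ollama
    model: qwen2.5-coder          # placeholder model id
    roles: [autocomplete]

context:
  - provider: codebase
  - provider: docs
```

A workspace-level .continue/config.yaml with the same structure would merge over these global settings.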
This diagram traces a complete interaction from user input to streamed response, showing how all subsystems coordinate.