VT Code is a Rust-based terminal coding agent with a modular architecture that supports multiple LLM providers. It is built for developers who demand precision, security, and efficiency in their coding workflows.
Install globally with Cargo:

```shell
cargo install vtcode
```

Alternatively, with Homebrew:

```shell
brew install vinhnx/tap/vtcode
```

Then simply run `vtcode` to get started:

```shell
vtcode
```
You can also download pre-built binaries from GitHub Releases. Available for:

- macOS: Apple Silicon (`aarch64-apple-darwin`) and Intel (`x86_64-apple-darwin`)
- Linux: x86_64 and ARM64 architectures
- Windows: x86_64 architecture

Each archive contains the executable; extract it and rename it to `vtcode` if needed.
Set your API key for your preferred provider:

```shell
export GEMINI_API_KEY="your_key_here"
# or
export OPENAI_API_KEY="your_key_here"
# or
export ANTHROPIC_API_KEY="your_key_here"
# or
export OPENROUTER_API_KEY="your_key_here"
```
Alternatively, create a `.env` file in your project directory:

```shell
# .env file
GEMINI_API_KEY=your_gemini_key_here
OPENAI_API_KEY=your_openai_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
OPENROUTER_API_KEY=your_openrouter_key_here
```
**Automatic API key inference:** VT Code automatically uses the correct environment variable based on the `provider` setting in `vtcode.toml`:

- `provider = "openai"` → `OPENAI_API_KEY`
- `provider = "anthropic"` → `ANTHROPIC_API_KEY`
- `provider = "gemini"` → `GEMINI_API_KEY`
- `provider = "deepseek"` → `DEEPSEEK_API_KEY`
- `provider = "openrouter"` → `OPENROUTER_API_KEY`
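The inference described above amounts to a simple provider-to-variable lookup. As a minimal sketch (the function name and signature here are illustrative assumptions, not VT Code's actual internals):

```rust
// Hypothetical sketch of the provider → env-var mapping described above.
// `api_key_var` is an illustrative name, not part of VT Code's API.
fn api_key_var(provider: &str) -> Option<&'static str> {
    match provider {
        "openai" => Some("OPENAI_API_KEY"),
        "anthropic" => Some("ANTHROPIC_API_KEY"),
        "gemini" => Some("GEMINI_API_KEY"),
        "deepseek" => Some("DEEPSEEK_API_KEY"),
        "openrouter" => Some("OPENROUTER_API_KEY"),
        _ => None,
    }
}

fn main() {
    // With `provider = "gemini"` in vtcode.toml, GEMINI_API_KEY is consulted.
    assert_eq!(api_key_var("gemini"), Some("GEMINI_API_KEY"));
    assert_eq!(api_key_var("unknown"), None);
}
```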
VT Code supports advanced configuration via `vtcode.toml`. See Configuration for details.
OpenRouter support unlocks any hosted model by ID, including the latest Grok and Qwen3 coding releases.

```shell
vtcode --provider openrouter --model x-ai/grok-code-fast-1 chat
```

Or persist the configuration in `vtcode.toml`:

```toml
[agent]
provider = "openrouter"
default_model = "qwen/qwen3-coder"
```
Custom model IDs are accepted as long as they match your OpenRouter account access. Streaming and tool-calling work out of the box using the OpenAI-compatible Responses API.
**Multi-Provider AI Support**
- Gemini, OpenAI, Anthropic, OpenRouter, and DeepSeek integration
- Automatic provider selection and failover
- Cost optimization with safety controls
**Advanced Code Intelligence**
- Tree-sitter parsing for 6+ languages (Rust, Python, JavaScript, TypeScript, Go, Java)
- Semantic code analysis and pattern recognition
- Intelligent refactoring and optimization suggestions
**Enterprise Security**
- Workspace boundary enforcement
- Configurable command allowlists
- Human-in-the-loop controls for safety
- Comprehensive audit logging
**Modular Architecture**
- Trait-based tool system for extensibility
- Multi-mode execution (terminal, PTY, streaming)
- Intelligent caching and performance optimization
- Plugin architecture for custom tools
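A trait-based tool system like the one described above can be sketched roughly as follows; the `Tool` trait name and method signatures are assumptions for illustration, not VT Code's actual API:

```rust
// Hypothetical sketch of a trait-based, extensible tool system.
// Names and signatures are illustrative assumptions, not VT Code's API.
trait Tool {
    fn name(&self) -> &str;
    fn run(&self, input: &str) -> Result<String, String>;
}

// A toy tool: echoes its input back.
struct Echo;

impl Tool for Echo {
    fn name(&self) -> &str {
        "echo"
    }
    fn run(&self, input: &str) -> Result<String, String> {
        Ok(input.to_string())
    }
}

// Dispatch over boxed trait objects: a new tool plugs in by implementing
// `Tool`, with no changes to the dispatch code itself.
fn dispatch(tools: &[Box<dyn Tool>], name: &str, input: &str) -> Option<Result<String, String>> {
    tools.iter().find(|t| t.name() == name).map(|t| t.run(input))
}

fn main() {
    let tools: Vec<Box<dyn Tool>> = vec![Box::new(Echo)];
    assert_eq!(dispatch(&tools, "echo", "hi"), Some(Ok("hi".to_string())));
    assert!(dispatch(&tools, "missing", "hi").is_none());
}
```

Plugins for custom tools would follow the same pattern: implement the trait, then register a boxed instance with the dispatcher.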
- Getting Started - Installation and basic usage
- Configuration - Advanced configuration options
- Architecture - Technical architecture details
- Advanced Features - Safety controls and debug mode
- API Reference - Complete API documentation
- Contributing - Development guidelines
This project is licensed under the MIT License - see LICENSE for details.