Secure Visual Studio 2022/2026 extension with local and private AI agent support.
- Local First: No code leaves your machine by default.
- Private AI: Native support for local LLMs (Ollama, LM Studio, vLLM).
- Control: You define the endpoint and API keys. No telemetry.
- Autonomous Agent: Can read files, execute terminal commands, perform Git operations, and apply code changes.
- Integrated Chat: Blazor WebAssembly UI running directly inside Visual Studio.
- Task Management: Built-in system to track and plan development tasks.
- Build Integration: Can trigger builds and analyze compilation errors.
Option 1: Manual
- Download the VSIX from Releases.
- Install the VSIX.

Option 2: Marketplace
- Install directly from the Visual Studio Marketplace.
Option 1: Local (Recommended)
Use LM Studio or Ollama.
- Endpoint: http://localhost:11434/api (Ollama default)
- Model: llama3, mistral, codellama, zai-org/glm-4.7-flash
- Key: leave empty

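As a quick sanity check of a local setup, you can post a chat request to Ollama yourself. A minimal sketch, assuming Ollama's native /api/chat route and an already-pulled llama3 model (substitute whatever model you use):

```python
import json

# Request body for Ollama's native chat route (assumed default endpoint
# http://localhost:11434/api/chat). "llama3" is an example model name.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Say hello."}],
    "stream": False,  # ask for one complete JSON response, not a token stream
}

# POST this body to the endpoint, e.g. with urllib.request or curl.
print(json.dumps(payload))
```

If the server is running, the response contains a `message` object with the model's reply; an empty or refused connection usually means Ollama is not listening on the default port.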
Option 2: Remote / Self-Hosted
- Endpoint: URL of your OpenAI-compatible provider.
- Key: your API key (stored securely).
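For a remote provider, the request follows the OpenAI-compatible chat shape, with the key sent as a bearer token. A sketch with placeholder endpoint, key, and model names (none of these values are real defaults of the extension):

```python
import json
import os

# Placeholder values; the real endpoint and key come from your provider.
endpoint = os.environ.get("LLM_ENDPOINT", "https://api.example.com/v1/chat/completions")
api_key = os.environ.get("LLM_API_KEY", "sk-placeholder")

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}",  # standard OpenAI-compatible auth header
}
payload = {
    "model": "mistral",  # example model name
    "messages": [{"role": "user", "content": "Summarize this diff."}],
}

print(json.dumps(payload))
```

Reading the key from an environment variable, as sketched here, keeps it out of source control; the extension itself stores the key securely on your machine.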

| Category | Tools |
|---|---|
| Files | Read, Create, Search, Apply Diff |
| Project | Build, Get Errors, Run Tests, Inspect Structure |
| Git | Status, Log, Diff, Branch Info |
| System | Execute Shell Commands, Fetch URLs |

| Part | Description |
|---|---|
| Extension | VS SDK (.NET Framework 4.8) handles system operations. |
| UI | Blazor WebAssembly (.NET 10) hosted in WebView2. |
| Bridge | JSON-RPC communication between UI and VS Host. |
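To illustrate the bridge, here is what a JSON-RPC 2.0 exchange between the UI and the VS host could look like. The method name `readFile` and its parameters are hypothetical, chosen for illustration, not the extension's documented contract:

```python
import json

# Illustrative JSON-RPC 2.0 request/response pair. "readFile" and its
# params are made-up names, not the extension's actual API surface.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "readFile",
    "params": {"path": "src/Program.cs"},
}
response = {
    "jsonrpc": "2.0",
    "id": 1,  # a JSON-RPC response must echo the request id
    "result": {"content": "class Program { }"},
}

print(json.dumps([request, response]))
```

Matching request and response by `id` is what lets the WebView2-hosted UI issue several calls concurrently and pair each reply with the call that produced it.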
- Visual Studio: 2022 (17.14+) or 2026 (18.0+)
- Runtimes: .NET Framework 4.8, .NET 10 SDK
See LICENSE.