Quick Start
Get up and running with Ollama Code Review in minutes.
Requirements
Before installing, ensure you meet the following requirements:
- VS Code: version 1.102.0 or later (download the latest version).
- Git: installed and available in your system's PATH.
1. Installation
Install the Ollama Code Review extension from the VS Code Marketplace.
2. Interactive Setup Guide
On first install, the extension will automatically show an interactive Setup Guide. This guide walks you through:
- Choosing a Provider: Select between local Ollama or various cloud providers.
- Configuration: Enter API keys or pull recommended local models.
- First Review: Run your first code review to see the extension in action.
You can reopen this guide at any time via the Command Palette: Ollama Code Review: Open Setup Guide.
3. Configure a Provider
Depending on your preference, follow the setup for either local or cloud models:
Option A: Local Ollama (Privacy-Focused)
- Install Ollama.
- Open your terminal and pull a recommended model:
ollama pull qwen2.5-coder:7b
- The extension will auto-detect your local Ollama instance.
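The Option A steps can be condensed into a short shell session. This is a sketch, not part of the extension itself: it assumes Ollama's default port (11434) and uses the model tag recommended above; the server check against the /api/tags endpoint is a convenient way to confirm the instance the extension will auto-detect.

```shell
# Sketch of the local setup from Option A. Assumes Ollama's default
# port 11434 (override with OLLAMA_HOST if you changed it).
MODEL="qwen2.5-coder:7b"
HOST="${OLLAMA_HOST:-http://localhost:11434}"

if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"                      # download the model locally
  curl -sf "$HOST/api/tags" >/dev/null \
    && echo "Ollama server reachable at $HOST"
else
  echo "Install Ollama first: https://ollama.com"
fi
```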
Option B: Cloud Models (Highest Reasoning)
- Obtain an API key from your preferred provider (e.g., Google AI Studio for Gemini, Anthropic for Claude).
- Open VS Code Settings (Cmd+, on Mac, Ctrl+, on Windows/Linux) and search for Ollama Code Review.
- Enter your API key in the corresponding field (e.g., Gemini Api Key).
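If you prefer editing settings.json directly, the entry looks roughly like the sketch below. The key name is a guess derived from the "Gemini Api Key" label shown in the Settings UI; confirm the exact identifier there before copying.

```jsonc
{
  // Hypothetical key name based on the "Gemini Api Key" settings label;
  // verify the exact identifier in the extension's Settings UI.
  "ollamaCodeReview.geminiApiKey": "YOUR_GEMINI_API_KEY"
}
```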
4. Your First Review
- Stage some changes in your Git repository.
- Open the Source Control panel.
- Click the Ollama: Review Staged Changes button (chat icon) in the title bar.
- Wait a few seconds for the AI feedback to appear in the "Ollama Code Review" panel!
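Step 1 above (staging changes) can be sketched in a throwaway repository; the file name and contents here are purely illustrative:

```shell
# Create a scratch repo and stage one change, mirroring step 1 of
# "Your First Review".
dir="$(mktemp -d)"
cd "$dir"
git init -q
echo 'console.log("hello");' > app.js
git add app.js                  # stage the change for review
git diff --cached --name-only   # confirm what the review will cover: app.js
```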
Useful Commands
Press Cmd+Shift+P (Mac) or Ctrl+Shift+P (Windows/Linux) and type Ollama to see all available commands:
- Ollama: Review Staged Changes
- Ollama: Generate Commit Message
- Ollama: Suggestion (right-click on selected code)
- Ollama Code Review: Select AI Model