VS Code with Ollama

VS Code includes built-in AI chat through GitHub Copilot Chat, and Ollama models can be used directly in the Copilot Chat model picker.

Prerequisites

VS Code requires you to be signed in to GitHub to use its model selector, even for custom models. This doesn't require a paid GitHub Copilot plan; GitHub Copilot Free enables model selection for custom models.

Quick setup

ollama launch vscode
Recommended models will be shown after running the command. See the latest models at ollama.com. Make sure Local is selected at the bottom of the Copilot Chat panel to use your Ollama models.
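Copilot Chat's Local option talks to the Ollama server on your machine. If your models don't appear, a quick way to check that the server is running is to query its version endpoint on the default port, 11434 (a minimal sketch; adjust the host and port if you changed them):

```shell
# Check whether the local Ollama server is reachable on its default port.
if curl -fs http://localhost:11434/api/version >/dev/null 2>&1; then
  echo "ollama server is up"
else
  echo "ollama server is down; start it with: ollama serve"
fi
```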

Run directly with a model

ollama launch vscode --model qwen3.5:cloud
Cloud models are also available at ollama.com.
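Before pointing VS Code at a cloud model, you can sanity-check access from the terminal with a one-off prompt. This is a sketch using the model tag from the command above; substitute any cloud model available to your account:

```shell
# Verify the cloud model is usable from this machine before wiring it into VS Code.
# qwen3.5:cloud is the example tag from above; any cloud model tag works.
if ! command -v ollama >/dev/null 2>&1; then
  echo "ollama CLI not installed; get it from ollama.com"
elif ollama run qwen3.5:cloud "Reply with the single word: ready"; then
  echo "cloud model reachable"
else
  echo "could not reach the cloud model; check your sign-in and the model tag"
fi
```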

Manual setup

To configure Ollama manually without ollama launch:
  1. Open the Copilot Chat sidebar from the top right corner
  2. Click the settings gear icon to bring up the Language Models window
  3. Click Add Models and select Ollama to load all your Ollama models into VS Code
  4. Click the Unhide button in the model picker to show your Ollama models
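The manual steps assume the ollama CLI is installed and at least one model has been pulled, since the picker lists only models Ollama already knows about. A small preflight sketch (qwen3 here is just an example model name):

```shell
# Preflight for the manual setup: confirm the ollama CLI is installed
# and at least one local model exists to appear in VS Code's model picker.
if ! command -v ollama >/dev/null 2>&1; then
  echo "ollama CLI not found; install it from ollama.com"
elif [ -z "$(ollama list 2>/dev/null | tail -n +2)" ]; then
  echo "no local models; pull one first, e.g.: ollama pull qwen3"
else
  echo "ready: models available for VS Code"
fi
```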