
How MCP Servers Turn VS Code Into an AI Development Powerhouse

Learn how Model Context Protocol integration enables GitHub Copilot to query databases, manage repositories, and orchestrate multi-step development tasks autonomously
Jens Churchill, CTO - IBM Champion
March 18, 2026

Visual Studio Code has transformed from a lightweight code editor into a fully agentic development environment. At the heart of this transformation is support for the Model Context Protocol (MCP) - an open standard that lets AI models connect to external tools and services through a unified interface. With MCP support now baked into GitHub Copilot inside VS Code, developers can extend their AI assistant with virtually unlimited contextual capabilities: reading databases, querying APIs, browsing file systems, managing GitHub repositories, and much more - all from within their editor.

 

This article explains what MCP servers are, how they work in VS Code, how to set them up, and which AI models you can pair with them.

 

What Is the Model Context Protocol?

MCP is an open standard developed by Anthropic that defines how applications share context with large language models. Think of it as a universal adapter - instead of requiring custom integrations between every AI model and every tool, MCP standardizes the interface. An MCP server exposes a set of "tools" that an AI can invoke, such as reading a file, querying a database, calling a web API, or interacting with services like GitHub or Slack.

 

In VS Code, MCP servers are one of three ways to extend Copilot Chat with tools, alongside built-in tools and extension-contributed tools. Once a server is configured, its tools become available in Copilot's Agent Mode and can be invoked automatically based on context or explicitly referenced in prompts.

 

"With MCP support now baked into GitHub Copilot inside VS Code, developers can extend their AI assistant with virtually unlimited contextual capabilities: reading databases, querying APIs, browsing file systems, managing GitHub repositories, and much more - all from within their editor."

 

How VS Code Supports MCP Servers

MCP support in VS Code arrived as part of GitHub Copilot's evolution toward agentic workflows. Starting with version 1.99, VS Code supports both local MCP servers running on your own machine and remote MCP servers hosted elsewhere.

 

Agent Mode: The Gateway to MCP

MCP tools are used through Agent Mode in Copilot Chat. Unlike the simpler Ask or Edit modes, Agent Mode allows Copilot to independently plan and execute multi-step tasks - running terminal commands, editing files, calling MCP tools, and even self-correcting errors along the way. To activate it, simply select "Agent" from the mode dropdown in the Copilot Chat panel.

 

Once in Agent Mode, Copilot can automatically invoke MCP tools based on the task at hand. For example, if the GitHub MCP server is installed, asking "list my open pull requests" will automatically trigger the right tool without any special syntax. You can also reference tools explicitly by typing # followed by the tool name.

 

Configuring Local MCP Servers

Local MCP servers run as processes on your own machine and communicate with VS Code via stdio (standard input/output). This makes them ideal for tools that need access to your local environment, private data, or internal systems.

 

Configuration is done through an mcp.json file. For workspace-specific servers, this lives at .vscode/mcp.json and can be committed to source control to share the setup with your team. User-level servers are configured globally in your VS Code profile. A typical local server configuration looks like this:

 

{
  "servers": {
    "myDatabaseServer": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/my-mcp-server/index.js"],
      "env": {
        "DB_CONNECTION_STRING": "${input:dbConnection}"
      }
    }
  }
}

 

Using ${input:...} variables keeps sensitive credentials out of the config file - VS Code will prompt you for the value when the server starts. This is particularly important for database passwords, API keys, and tokens.
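The prompt itself is declared in an inputs section of the same mcp.json, where the id must match the ${input:...} reference. A minimal sketch, reusing the dbConnection example from above:

```json
{
  "inputs": [
    {
      "id": "dbConnection",
      "type": "promptString",
      "description": "Database connection string",
      "password": true
    }
  ],
  "servers": {
    "myDatabaseServer": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/my-mcp-server/index.js"],
      "env": {
        "DB_CONNECTION_STRING": "${input:dbConnection}"
      }
    }
  }
}
```

Setting "password": true masks the value as you type it and prevents it from being stored in plain text.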

 

The MCP Server Gallery

VS Code includes an MCP server gallery directly in the Extensions view. By typing @mcp in the extension search bar, you get a curated list of servers from the GitHub MCP Registry. You can install servers for your user profile or for a specific workspace, and manage them (start, stop, view logs, uninstall) directly from the Extensions panel or via the Command Palette.

 

"MCP is an open standard that defines how applications share context with large language models. Think of it as a universal adapter - instead of requiring custom integrations between every AI model and every tool, MCP standardizes the interface."

 

Remote MCP Servers

VS Code also supports remote MCP servers using Server-Sent Events (SSE). These are hosted elsewhere - on a cloud service, a team server, or a vendor's infrastructure - and are useful when you don't want to run a process locally or when access needs to be centrally managed. Organizations on Copilot Business or Enterprise plans can enforce policies about which MCP servers team members are allowed to use.
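A remote server entry in mcp.json only needs a type and a URL rather than a local command. A sketch, with a placeholder endpoint:

```json
{
  "servers": {
    "teamApiServer": {
      "type": "sse",
      "url": "https://mcp.example.com/sse"
    }
  }
}
```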

 

Debugging MCP Servers

For developers building their own MCP servers, VS Code offers a dedicated debug mode for Node.js and Python servers. This makes it practical to develop, test, and iterate on custom MCP servers inside the same environment where you'll ultimately use them.

 

Toolsets

To avoid clutter when many tools are available, VS Code supports toolsets - named groups of related MCP tools that can be enabled or disabled together. This makes it easy, for example, to activate your "database tools" only when working on data-heavy tasks, keeping the tool picker manageable.
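Toolsets are defined in a small JSON file (reachable via the Command Palette). The exact schema may vary between VS Code versions, and the toolset and tool names below are purely illustrative:

```json
{
  "databaseTools": {
    "tools": ["queryDatabase", "listTables", "describeSchema"],
    "description": "Read-only database inspection tools",
    "icon": "database"
  }
}
```

Once defined, the whole group can be toggled in the tool picker as a single entry.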

 

Practical MCP Use Cases in VS Code

The combination of Agent Mode and MCP servers opens up a wide range of workflows that were previously impossible without custom integrations:

 

Repository and project management. The official GitHub MCP server lets Copilot search repositories, create or comment on issues, open pull requests, and inspect CI/CD workflow runs - all through natural language. Asking "create a GitHub issue for the bug we just found" is enough to trigger the full flow.

 

Database queries. A local database MCP server can expose read-only query tools, letting Copilot understand your actual schema and data when generating queries or debugging SQL. You can instruct Copilot via .github/copilot-instructions.md to always prefer the read-only MCP tool over generating raw SQL.
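Such an instruction can be as short as a few lines in .github/copilot-instructions.md. A sketch, where the tool name queryDatabase is a placeholder for whatever your server actually exposes:

```markdown
When a task involves inspecting or querying the database,
always use the read-only `queryDatabase` MCP tool instead of
generating raw SQL for the user to run manually.
Never suggest statements that modify data (INSERT, UPDATE, DELETE).
```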

 

Infrastructure management. For DevOps work, MCP servers that connect to Kubernetes, Terraform state, or cloud provider APIs give Copilot live context about running infrastructure - enabling it to help diagnose issues, suggest changes, and generate Infrastructure as Code with real configuration details.

 

Custom internal tools. Teams can build private MCP servers in any language (the protocol uses JSON over stdio or HTTP) that expose internal APIs, documentation, or datasets. These can be shared via the new enterprise private MCP registry, making them discoverable only within the organization.
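At the wire level, a stdio MCP server exchanges JSON-RPC 2.0 messages: the client calls tools/list to discover tools and tools/call to invoke one. The sketch below shows just that request/response shape in plain Python with no dependencies; the tool name, its fake data, and the surrounding server are all illustrative, and a real server would also implement the initialize handshake and use one of the official SDKs:

```python
import json
import sys

# Hypothetical internal tool; name, schema, and data are illustrative only.
TOOLS = [{
    "name": "lookupEmployee",
    "description": "Look up an employee record by id in the internal directory",
    "inputSchema": {
        "type": "object",
        "properties": {"id": {"type": "string"}},
        "required": ["id"],
    },
}]

FAKE_DIRECTORY = {"e42": {"name": "Ada", "team": "Platform"}}

def handle_request(req: dict) -> dict:
    """Answer a single JSON-RPC 2.0 request for the two core tool methods."""
    method = req.get("method")
    if method == "tools/list":
        result = {"tools": TOOLS}
    elif method == "tools/call":
        args = req["params"]["arguments"]
        record = FAKE_DIRECTORY.get(args["id"], "not found")
        # Tool results are returned as a list of typed content blocks.
        result = {"content": [{"type": "text", "text": json.dumps(record)}]}
    else:
        return {"jsonrpc": "2.0", "id": req.get("id"),
                "error": {"code": -32601, "message": f"unknown method: {method}"}}
    return {"jsonrpc": "2.0", "id": req.get("id"), "result": result}

def main() -> None:
    # stdio transport: one JSON-RPC message per line in, one reply per line out.
    for line in sys.stdin:
        if line.strip():
            print(json.dumps(handle_request(json.loads(line))), flush=True)
```

Calling main() turns this into a loop VS Code can drive over stdin/stdout once the script is registered in mcp.json.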

 

AI Models Available in VS Code with MCP Support

One of the most powerful aspects of VS Code's Copilot integration is that you're not locked into a single AI model. You can choose from a range of models - and the choice matters, because different models handle MCP tool use, reasoning, and code generation with different strengths.

 

Note that in Agent Mode (required for MCP tool use), the model list is filtered to models with strong tool-calling support; not every model in the Copilot lineup handles agentic workflows equally well.

 

Models Available Through GitHub Copilot

Note: this list is constantly expanding and changing (Opus 4, for instance, is currently at point release 4.6), but VS Code and GitHub Copilot are generally quick to surface popular new models in the interface.

 

  • OpenAI GPT-4.1 is the current default model for inline suggestions and a strong general-purpose choice for everyday development work. It offers a good balance of speed and quality.
  • OpenAI GPT-5 and GPT-5 mini are the most capable models available from OpenAI. GPT-5 is well-suited for complex, multi-step agentic tasks - exactly the kind of work that benefits from MCP tool use. GPT-5 mini trades some capability for lower latency and reduced premium request consumption.
  • OpenAI o3-mini and o4-mini are reasoning-focused models, strong for tasks involving logic, mathematical reasoning, or step-by-step problem solving. They tend to be faster and cheaper than the flagship models.
  • OpenAI GPT-4o retains multimodal support (images, screenshots, UI diagrams) and is a solid option when visual context matters.
  • Anthropic Claude Sonnet 4 and Claude Opus 4 are Anthropic's flagship models and among the best available in Copilot for agentic workflows. Claude models are especially well regarded for long-context reasoning, code understanding, and careful multi-step planning - all of which are important when orchestrating MCP tool calls. Claude Opus 4 is the heavier, more thorough option; Claude Sonnet 4 is faster and more cost-efficient. VS Code's Auto mode currently selects between Claude Sonnet 4, GPT-5, and GPT-5 mini by default.
  • Google Gemini 2.5 Pro is Google's strongest Copilot model and performs particularly well on data transformation tasks and multimodal inputs. Attaching logs, CSVs, or charts and asking Copilot to analyze them is a natural fit for Gemini 2.5 Pro.
  • Google Gemini 2.0 Flash prioritizes low latency and is a good fit for quick tasks where response time matters more than depth of reasoning.

 

Auto Model Selection

VS Code introduced an Auto option in the model picker, which automatically routes requests to the best available model based on availability, load, and task context. For users who don't want to think about model selection, Auto is a practical default - it still uses premium models like Claude Sonnet 4 and GPT-5, but manages switching transparently and helps avoid rate limit issues.

 

Bring Your Own Key (BYOK)

For users who want to go beyond the built-in models, VS Code supports adding models via their own API keys. This means you can connect directly to Anthropic's API, OpenAI, Azure OpenAI, Mistral, Ollama (for local models like Phi-4 or Llama), or any other compatible provider. BYOK is available on individual Copilot plans but not on Business or Enterprise plans, which are managed centrally.

 

This is particularly valuable for teams that want to use a fine-tuned internal model, experiment with newer releases not yet integrated into Copilot, or run models fully locally without any data leaving the machine.

 

"VS Code's MCP integration represents a meaningful shift in how developers interact with AI in their daily work. By connecting AI models to real tools and data through a standardized protocol, Copilot moves from a code-suggestion engine to a true development partner."

 

Choosing the Right Model for MCP-Heavy Workflows

When your workflow involves heavy use of MCP tools - querying databases, calling APIs, managing infrastructure - a few principles help guide model choice:

 

For tasks requiring careful multi-step reasoning across tools, Claude Opus 4 or Claude Sonnet 4 tend to be the strongest performers. Their long-context handling and careful planning make them well-suited for orchestrating multiple tool calls in sequence.

 

For rapid iterative development where you're invoking a single MCP tool at a time and primarily want fast code generation, GPT-4.1 or GPT-5 mini strikes a good balance between capability and speed.

 

For analytical tasks - especially when attaching logs, database exports, or structured data as context - Gemini 2.5 Pro is worth considering.

 

If you're uncertain, start with Auto and let VS Code optimize the selection for you. You can always select a specific model for a particular session.

 

Getting Started

To start using local MCP servers in VS Code:

 

  1. Install VS Code 1.99 or later and ensure GitHub Copilot is set up with an active subscription (including the free tier).

  2. Open the Extensions view (Ctrl+Shift+X / Cmd+Shift+X) and search @mcp to browse the server gallery.

  3. Install a server (for example, the GitHub MCP server) in your workspace or user profile.

  4. A .vscode/mcp.json file will be created or updated with the server configuration.

  5. Click the "Start" button in the mcp.json file to start the server.

  6. Open Copilot Chat and switch to Agent mode.

  7. Click the Tools icon to see available MCP tools and toggle them on or off as needed.

 

For a custom local server, create your mcp.json manually with the stdio configuration pointing to your server's executable, then restart the server from the Command Palette using MCP: List Servers.
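For a custom Python stdio server, the entry might look like this (the server name and path are placeholders):

```json
{
  "servers": {
    "internalDirectory": {
      "type": "stdio",
      "command": "python",
      "args": ["/path/to/my_mcp_server.py"]
    }
  }
}
```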

 

Conclusion

VS Code's MCP integration represents a meaningful shift in how developers interact with AI in their daily work. By connecting AI models to real tools and data through a standardized protocol, Copilot moves from a code-suggestion engine to a true development partner capable of understanding and acting across your entire development environment.

 

With a growing selection of AI models from Anthropic, OpenAI, and Google - each with distinct strengths - and the flexibility to bring your own models via API keys or run them locally, VS Code gives developers both the power and the control to build the AI-augmented workflow that suits them best. The combination of Agent Mode, MCP servers, and multi-model choice is, for many developers, the most capable coding environment available today.
