# Claude Code MCP Configuration Guide

This document explains how to configure MCP (Model Context Protocol) servers for Claude Code, covering both the CLI and the VS Code extension.
## The Two Config Files

Claude Code uses two separate configuration files for MCP servers. They must be kept in sync manually.
| File | Used By | Notes |
|---|---|---|
| `~/.claude.json` | Claude CLI (`claude` command) | Requires `"type": "stdio"` in each server |
| `~/.claude/settings.json` | VS Code extension | Simpler format, supports `"disabled": true` |
**Important:** Changes to one file do NOT automatically sync to the other!
## File Locations (Windows)

```text
C:\Users\<username>\.claude.json          # CLI config
C:\Users\<username>\.claude\settings.json # VS Code extension config
```
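Because the two files must be kept in sync by hand, a small helper script can automate the copy. The sketch below is an illustration, not an official tool: it assumes both files already exist at the paths above, and it copies the `mcpServers` map from the VS Code settings file into the CLI config, adding the `"type": "stdio"` field the CLI requires and dropping any `"disabled"` entries (which the CLI config does not support).

```python
import json
from pathlib import Path

def sync_servers(settings_path, cli_path):
    """Copy mcpServers from the VS Code settings file into the CLI config.

    Adds the "type": "stdio" field the CLI requires and omits servers
    marked "disabled": true, since the CLI config has no disabled flag.
    Other CLI settings (numStartups, installMethod, ...) are preserved.
    """
    settings = json.loads(Path(settings_path).read_text())
    cli = json.loads(Path(cli_path).read_text())

    synced = {}
    for name, server in settings.get("mcpServers", {}).items():
        if server.get("disabled"):
            continue  # CLI has no "disabled" flag; omit the server instead
        entry = {k: v for k, v in server.items() if k != "disabled"}
        entry["type"] = "stdio"  # REQUIRED for the CLI
        synced[name] = entry

    cli["mcpServers"] = synced
    Path(cli_path).write_text(json.dumps(cli, indent=2))
```

Running it after every edit to `settings.json` makes the VS Code file the single source of truth; the CLI file is regenerated rather than edited by hand.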
## Config Format Differences

### VS Code Extension Format (`~/.claude/settings.json`)

```jsonc
{
  "mcpServers": {
    "server-name": {
      "command": "path/to/executable",
      "args": ["arg1", "arg2"],
      "env": {
        "ENV_VAR": "value"
      },
      "disabled": true // Optional - disable without removing
    }
  }
}
```
### CLI Format (`~/.claude.json`)

The CLI config is a larger file with many settings. The `mcpServers` section is nested within it:

```jsonc
{
  "numStartups": 14,
  "installMethod": "global",
  // ... other settings ...
  "mcpServers": {
    "server-name": {
      "type": "stdio", // REQUIRED for CLI
      "command": "path/to/executable",
      "args": ["arg1", "arg2"],
      "env": {
        "ENV_VAR": "value"
      }
    }
  }
  // ... more settings ...
}
```
**Key difference:** the CLI format requires `"type": "stdio"` in each server definition.
## Common MCP Server Examples

### Memory (Knowledge Graph)

```jsonc
// VS Code format
"memory": {
  "command": "D:\\nodejs\\npx.cmd",
  "args": ["-y", "@modelcontextprotocol/server-memory"]
}
```

```jsonc
// CLI format
"memory": {
  "type": "stdio",
  "command": "D:\\nodejs\\npx.cmd",
  "args": ["-y", "@modelcontextprotocol/server-memory"],
  "env": {}
}
```
### Filesystem

```jsonc
// VS Code format
"filesystem": {
  "command": "d:\\nodejs\\node.exe",
  "args": [
    "c:\\Users\\<user>\\AppData\\Roaming\\npm\\node_modules\\@modelcontextprotocol\\server-filesystem\\dist\\index.js",
    "d:\\path\\to\\project"
  ]
}
```

```jsonc
// CLI format
"filesystem": {
  "type": "stdio",
  "command": "d:\\nodejs\\node.exe",
  "args": [
    "c:\\Users\\<user>\\AppData\\Roaming\\npm\\node_modules\\@modelcontextprotocol\\server-filesystem\\dist\\index.js",
    "d:\\path\\to\\project"
  ],
  "env": {}
}
```
### Podman/Docker

```jsonc
// VS Code format
"podman": {
  "command": "D:\\nodejs\\npx.cmd",
  "args": ["-y", "podman-mcp-server@latest"],
  "env": {
    "DOCKER_HOST": "npipe:////./pipe/podman-machine-default"
  }
}
```
### Gitea

```jsonc
// VS Code format
"gitea-myserver": {
  "command": "d:\\gitea-mcp\\gitea-mcp.exe",
  "args": ["run", "-t", "stdio"],
  "env": {
    "GITEA_HOST": "https://gitea.example.com",
    "GITEA_ACCESS_TOKEN": "your-token-here"
  }
}
```
### Redis

```jsonc
// VS Code format
"redis": {
  "command": "D:\\nodejs\\npx.cmd",
  "args": ["-y", "@modelcontextprotocol/server-redis", "redis://localhost:6379"]
}
```
### Bugsink (Error Tracking)

**Important:** Bugsink has a different API than Sentry. Use `bugsink-mcp`, NOT `sentry-selfhosted-mcp`.

**Note:** The `bugsink-mcp` npm package is NOT published. You must clone and build from source:

```shell
# Clone and build bugsink-mcp
git clone https://github.com/j-shelfwood/bugsink-mcp.git d:\gitea\bugsink-mcp
cd d:\gitea\bugsink-mcp
npm install
npm run build
```

```jsonc
// VS Code format (using locally built version)
"bugsink": {
  "command": "d:\\nodejs\\node.exe",
  "args": ["d:\\gitea\\bugsink-mcp\\dist\\index.js"],
  "env": {
    "BUGSINK_URL": "https://bugsink.example.com",
    "BUGSINK_TOKEN": "your-api-token"
  }
}
```

```jsonc
// CLI format
"bugsink": {
  "type": "stdio",
  "command": "d:\\nodejs\\node.exe",
  "args": ["d:\\gitea\\bugsink-mcp\\dist\\index.js"],
  "env": {
    "BUGSINK_URL": "https://bugsink.example.com",
    "BUGSINK_TOKEN": "your-api-token"
  }
}
```

- GitHub: https://github.com/j-shelfwood/bugsink-mcp
- Get the token from the Bugsink UI: Settings > API Tokens
- Do NOT use `npx` - the package is not on npm
### Sentry (Cloud or Self-hosted)

For actual Sentry instances (not Bugsink), use:

```jsonc
"sentry": {
  "command": "D:\\nodejs\\npx.cmd",
  "args": ["-y", "@sentry/mcp-server"],
  "env": {
    "SENTRY_AUTH_TOKEN": "your-sentry-token"
  }
}
```
## Troubleshooting

### Server Not Loading

1. **Check both config files** - Make sure the server is defined in both `~/.claude.json` AND `~/.claude/settings.json`
2. **Verify server order** - Servers load sequentially, so a broken or slow server can block the ones after it. Put important servers first.
3. **Check for timeout** - Each server has 30 seconds to connect. Slow `npx` downloads can cause timeouts.
4. **Fully restart VS Code** - A window reload is not enough; close all VS Code windows and reopen.
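The first check above is easy to script. The sketch below is a hypothetical diagnostic helper (the `check_server` name and report format are illustrative, not part of Claude Code); it assumes the standard `mcpServers` layout shown earlier in this guide.

```python
import json
from pathlib import Path

def check_server(name, settings_path, cli_path):
    """Return a list of problems found for one MCP server across both configs."""
    problems = []
    settings = json.loads(Path(settings_path).read_text()).get("mcpServers", {})
    cli = json.loads(Path(cli_path).read_text()).get("mcpServers", {})

    if name not in settings:
        problems.append(f"{name}: missing from settings.json")
    if name not in cli:
        problems.append(f"{name}: missing from .claude.json")
    elif cli[name].get("type") != "stdio":
        # The CLI silently needs this field on every server definition
        problems.append(f'{name}: CLI entry lacks "type": "stdio"')
    return problems
```

An empty return value means the server is defined in both files and the CLI entry has the required `"type": "stdio"` field.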
### Verifying Configuration

For CLI:

```shell
claude mcp list
```

For VS Code:

1. Open VS Code
2. View → Output
3. Select "Claude" from the dropdown
4. Look for MCP server connection logs
### Common Errors

| Error | Cause | Solution |
|---|---|---|
| `Connection timed out after 30000ms` | Server took too long to start | Move the server earlier in the config, or use pre-installed packages instead of `npx` |
| `npm error 404 Not Found` | Package doesn't exist | Check the package name spelling |
| `The system cannot find the path` | Wrong executable path | Verify the command path exists |
| `Connection closed` | Server crashed on startup | Check server logs, verify environment variables |
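"The system cannot find the path" errors can be caught before launch by scanning a config for absolute command paths that do not exist. This is a sketch under the same `mcpServers` layout used throughout this guide; bare command names (a plain `npx`, say) are skipped because those are resolved via `PATH` rather than as literal file paths.

```python
import json
from pathlib import Path

def find_missing_commands(config_path):
    """Return the names of servers whose "command" is an absolute path
    that does not exist on disk."""
    servers = json.loads(Path(config_path).read_text()).get("mcpServers", {})
    missing = []
    for name, server in servers.items():
        cmd = Path(server.get("command", ""))
        # Only literal paths can be validated; PATH-resolved names cannot
        if cmd.is_absolute() and not cmd.exists():
            missing.append(name)
    return missing
```

Note that this only validates the `command` field; a script path passed in `args` (as in the filesystem example above) would need the same check applied to each argument.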
### Disabling Problem Servers

In `~/.claude/settings.json`, add `"disabled": true`:

```jsonc
"problem-server": {
  "command": "...",
  "args": ["..."],
  "disabled": true
}
```

**Note:** The CLI config (`~/.claude.json`) does not support the `disabled` flag. You must remove the server entirely from that file.
## Adding a New MCP Server

1. Install or clone the MCP server (if not using `npx`)
2. Add it to the VS Code config (`~/.claude/settings.json`):

   ```jsonc
   "new-server": {
     "command": "path/to/command",
     "args": ["arg1", "arg2"],
     "env": { "VAR": "value" }
   }
   ```

3. Add it to the CLI config (`~/.claude.json`) - find the `mcpServers` section:

   ```jsonc
   "new-server": {
     "type": "stdio",
     "command": "path/to/command",
     "args": ["arg1", "arg2"],
     "env": { "VAR": "value" }
   }
   ```

4. Fully restart VS Code
5. Verify with `claude mcp list`
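Steps 2 and 3 can be combined into one helper so the two files never drift. A sketch (the `add_server` name is mine, and it assumes both config files already exist):

```python
import json
from pathlib import Path

def add_server(name, command, args, settings_path, cli_path, env=None):
    """Write one MCP server definition into both config files at once.

    The CLI copy gets the extra "type": "stdio" field it requires;
    everything else in each file is left untouched.
    """
    entry = {"command": command, "args": args, "env": env or {}}
    for path, extra in ((settings_path, {}), (cli_path, {"type": "stdio"})):
        config = json.loads(Path(path).read_text())
        config.setdefault("mcpServers", {})[name] = {**extra, **entry}
        Path(path).write_text(json.dumps(config, indent=2))
```

After running it, a full VS Code restart and `claude mcp list` are still needed, exactly as in steps 4 and 5.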
## Quick Reference: Available MCP Servers

| Server | Package/Repo | Purpose |
|---|---|---|
| memory | `@modelcontextprotocol/server-memory` | Knowledge graph persistence |
| filesystem | `@modelcontextprotocol/server-filesystem` | File system access |
| redis | `@modelcontextprotocol/server-redis` | Redis cache inspection |
| postgres | `@modelcontextprotocol/server-postgres` | PostgreSQL queries |
| sequential-thinking | `@modelcontextprotocol/server-sequential-thinking` | Step-by-step reasoning |
| podman | `podman-mcp-server` | Container management |
| gitea | `gitea-mcp` (binary) | Gitea API access |
| bugsink | `j-shelfwood/bugsink-mcp` (build from source) | Error tracking for Bugsink |
| sentry | `@sentry/mcp-server` | Error tracking for Sentry |
| playwright | `@anthropics/mcp-server-playwright` | Browser automation |
## Best Practices

- **Keep configs in sync** - When you change one file, update the other
- **Order servers by importance** - Put essential servers (memory, filesystem) first
- **Disable instead of delete** - Use `"disabled": true` in `settings.json` to troubleshoot
- **Use node.exe directly** - For faster startup, install packages globally and use `node.exe` instead of `npx`
- **Store sensitive data in memory** - Use the memory MCP to store API tokens and config for future sessions
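For the "use node.exe directly" tip, the entry script of a globally installed npm package on Windows usually sits under the npm global prefix, as the filesystem example earlier shows. A sketch for building that path - both the default `%APPDATA%\npm` prefix and the `dist/index.js` entry point are assumptions, so check the package's actual `bin`/`main` fields:

```python
import os
from pathlib import Path

def global_entry_point(package, entry="dist/index.js"):
    """Guess the entry script of a globally installed npm package on Windows.

    Assumes the default npm global prefix (%APPDATA%\\npm); the entry-point
    file varies per package, so verify it exists before using it as an
    MCP "args" value.
    """
    prefix = Path(os.environ.get("APPDATA", "")) / "npm"
    return prefix / "node_modules" / package / entry
```

The returned path is what goes into `args` while `node.exe` itself goes into `command`, skipping the `npx` download-and-resolve step at every startup.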
## Future: MCP Launchpad

Project: https://github.com/kenneth-liao/mcp-launchpad

MCP Launchpad is a CLI tool that wraps multiple MCP servers into a single interface. Worth revisiting when:

- Windows support is stable (currently experimental)
- It is available as an MCP server itself (currently Bash-based)

Why it's interesting:
| Benefit | Description |
|---|---|
| Single config file | No more syncing ~/.claude.json and ~/.claude/settings.json |
| Project-level configs | Drop mcp.json in any project for instant MCP setup |
| Context window savings | One MCP server in context instead of 10+, reducing token usage |
| Persistent daemon | Keeps server connections alive for faster repeated calls |
| Tool search | Find tools across all servers with mcpl search |
Current limitations:
- Experimental Windows support
- Requires Python 3.13+ and uv
- Claude calls tools via Bash instead of native MCP integration
- Different mental model (runtime discovery vs startup loading)
## Future: Graphiti (Advanced Knowledge Graph)

Project: https://github.com/getzep/graphiti

Graphiti provides temporal-aware knowledge graphs - it tracks not just facts, but when they became true or outdated. It is much more powerful than the simple memory MCP, but requires significant infrastructure.

Ideal setup: run on a Linux server and connect via HTTP from Windows:

```jsonc
// Windows client config (settings.json)
"graphiti": {
  "type": "sse",
  "url": "http://linux-server:8000/mcp/"
}
```

Linux server setup:

```shell
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server
docker compose up -d  # Starts FalkorDB + MCP server on port 8000
```
Requirements:
- Docker on Linux server
- OpenAI API key (for embeddings)
- Port 8000 open on LAN
Benefits of remote deployment:
- Heavy lifting (Neo4j/FalkorDB + embeddings) offloaded to Linux
- Always-on server, Windows connects/disconnects freely
- Multiple machines can share the same knowledge graph
- Avoids Windows Docker/WSL2 complexity
Last updated: January 2026