MCP (Model Context Protocol): Standardizing AI Context
Every AI tool reinvents context management. Anthropic’s Model Context Protocol (MCP) proposes a standard: one protocol for connecting AI models to data sources, tools, and systems.
The Problem
Today’s AI integrations are fragmented:
- ChatGPT → custom plugin system
- Claude → custom tool format
- Gemini → function-calling API
- Local LLMs → assorted frameworks
Each integration is custom. Building a tool that works across AI systems means implementing it multiple times.
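The duplication is concrete: the same tool must be re-described in each platform's schema. A rough sketch of the same hypothetical `search_docs` tool declared twice — once in an OpenAI-style function schema, once in an Anthropic-style tool schema (field names follow each vendor's tool-calling format; the tool itself is illustrative):

```json
{
  "openai_function": {
    "name": "search_docs",
    "description": "Search company documents",
    "parameters": {
      "type": "object",
      "properties": { "query": { "type": "string" } },
      "required": ["query"]
    }
  },
  "anthropic_tool": {
    "name": "search_docs",
    "description": "Search company documents",
    "input_schema": {
      "type": "object",
      "properties": { "query": { "type": "string" } },
      "required": ["query"]
    }
  }
}
```

Nearly identical payloads, maintained separately — and that's before Gemini and local frameworks add their own variants.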
What is MCP?
MCP is an open protocol defining how AI applications connect to external systems:
```
AI Application ←→ MCP Protocol ←→ MCP Server (Your System)
```
Key components:
- MCP Hosts: AI applications (Claude Desktop, IDEs, etc.)
- MCP Servers: Connectors to data sources and tools
- MCP Protocol: The communication standard
How It Works
Server Definition
```python
# mcp_server.py
from mcp import Server, Resource

server = Server("my-company-data")

@server.resource("documents/{doc_id}")
async def get_document(doc_id: str) -> Resource:
    """Fetch a document from the company database."""
    doc = await database.get_document(doc_id)  # `database` is your async data layer
    return Resource(
        uri=f"documents/{doc_id}",
        content=doc.content,
        mime_type="text/markdown",
    )

@server.tool("search_docs")
async def search_documents(query: str, limit: int = 10) -> list[Resource]:
    """Search company documents."""
    results = await database.search(query, limit=limit)
    return [Resource(uri=f"documents/{r.id}", content=r.summary) for r in results]
```
Client Connection
```python
# In an AI application
import asyncio
from mcp import Client

async def main():
    async with Client("my-company-data") as client:
        # List available resources
        resources = await client.list_resources()

        # Read a specific resource
        doc = await client.read_resource("documents/quarterly-report")

        # Use a tool
        results = await client.call_tool("search_docs", {"query": "revenue growth"})

asyncio.run(main())
```
MCP Architecture
```
┌──────────────────────────────────────────────────────────────┐
│                         MCP Host                             │
│        (Claude Desktop, IDE, Custom Application)             │
│                                                              │
│  ┌─────────────────────────────────────────────────────────┐ │
│  │                      MCP Client                         │ │
│  └───────────────┬─────────────────────────────────────────┘ │
└──────────────────┼───────────────────────────────────────────┘
                   │  MCP Protocol (JSON-RPC over stdio/SSE)
                   │
┌──────────────────┼───────────────────────────────────────────┐
│  ┌───────────────▼─────────────────────────────────────────┐ │
│  │                      MCP Server                         │ │
│  │  • Resources (data access)                              │ │
│  │  • Tools (actions)                                      │ │
│  │  • Prompts (templates)                                  │ │
│  └─────────────────────────────────────────────────────────┘ │
│                      Your System                             │
│      (Database, API, File System, External Service)          │
└──────────────────────────────────────────────────────────────┘
```
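On the wire, the protocol is ordinary JSON-RPC 2.0. A tool invocation like the `search_docs` call above looks roughly like this (shape per the MCP specification's `tools/call` method; exact fields may vary by protocol revision):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "revenue growth" }
  }
}
```

The server replies with a matching-`id` JSON-RPC response carrying the tool's result. That framing is what lets any MCP host talk to any MCP server.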
Core Concepts
Resources
Read-only data that AI can access:
```python
@server.resource("database/{table}")
async def get_table_data(table: str) -> Resource:
    # Check the table name against an allowlist; never interpolate
    # untrusted input into SQL directly.
    if table not in ALLOWED_TABLES:
        raise ValueError(f"Unknown table: {table}")
    data = await db.query(f"SELECT * FROM {table} LIMIT 100")
    return Resource(uri=f"database/{table}", content=json.dumps(data))
```
Tools
Actions with side effects:
```python
@server.tool("create_ticket")
async def create_ticket(title: str, description: str, priority: str) -> dict:
    """Create a support ticket in our ticketing system."""
    ticket = await jira.create_issue(title, description, priority)
    return {"ticket_id": ticket.key, "url": ticket.url}
```
Prompts
Reusable prompt templates:
````python
@server.prompt("code_review")
async def code_review_prompt(language: str, code: str) -> str:
    return f"""Review this {language} code for:
- Security vulnerabilities
- Performance issues
- Best practices

Code:
```{language}
{code}
```"""
````
Real-World Use Cases
IDE Integration
```python
# MCP server for your codebase
@server.resource("files/{path}")
async def get_file(path: str) -> Resource:
    with open(path) as f:  # close the file handle promptly
        content = f.read()
    return Resource(uri=f"files/{path}", content=content)

@server.tool("run_tests")
async def run_tests(test_path: str) -> dict:
    # text=True so stdout comes back as a string, not bytes
    result = subprocess.run(["pytest", test_path], capture_output=True, text=True)
    return {"output": result.stdout, "success": result.returncode == 0}
```
Knowledge Base Access
```python
# MCP server for company docs
@server.tool("search_confluence")
async def search_confluence(query: str) -> list[Resource]:
    results = await confluence.search(query)
    return [Resource(uri=r.url, content=r.body) for r in results]
```
API Automation
```python
# MCP server for your APIs
@server.tool("deploy")
async def deploy_to_staging(service: str, version: str) -> dict:
    result = await kubernetes.deploy(service, version, env="staging")
    return {"status": result.status, "url": result.url}
```
Adoption
MCP is gaining traction:
- Claude Desktop: Native MCP support
- Continue (IDE extension): MCP integration
- Cursor: MCP support
- Open source: Growing ecosystem of servers
Building an MCP Server
Quick Start
```shell
pip install mcp
```

```python
# minimal_server.py
from mcp import Server

server = Server("my-server")

@server.tool("hello")
async def hello(name: str) -> str:
    return f"Hello, {name}!"

if __name__ == "__main__":
    server.run()
```
Configuration
In Claude Desktop (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "python",
      "args": ["/path/to/my_server.py"]
    }
  }
}
```
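The same file can register several servers side by side. For instance, adding the official filesystem reference server (published as `@modelcontextprotocol/server-filesystem` on npm; the directory path here is a placeholder):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "python",
      "args": ["/path/to/my_server.py"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```

The host launches each server as a subprocess and speaks the protocol to it over stdio.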
Best Practices
Security
```python
# Validate all inputs
@server.tool("query_db")
async def query_db(query: str) -> dict:
    # Never execute raw SQL from the model
    if not is_safe_query(query):
        raise ValueError("Invalid query")
    ...
```
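The `is_safe_query` check above is left undefined. One conservative sketch — a single read-only `SELECT` with no statement chaining or comments; a real implementation would use a proper SQL parser or, better, parameterized query templates:

```python
import re

# Allow only a single SELECT statement; reject chaining and comments.
# This is a sketch, not a complete SQL sanitizer.
_SAFE_QUERY = re.compile(r"^\s*SELECT\s+[^;]+$", re.IGNORECASE)
_FORBIDDEN = ("--", "/*", ";")

def is_safe_query(query: str) -> bool:
    if any(token in query for token in _FORBIDDEN):
        return False
    return bool(_SAFE_QUERY.match(query))
```

Denylists like `_FORBIDDEN` are easy to bypass, which is why allowlisting the statement shape (and ideally the tables it may touch) is the safer default.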
Error Handling
```python
@server.tool("fetch_data")
async def fetch_data(source: str) -> Resource:
    try:
        data = await external_api.get(source)
        return Resource(uri=source, content=data)
    except ExternalAPIError as e:
        raise MCPError(f"Failed to fetch: {e}") from e
```
Documentation
```python
@server.tool("complex_operation")
async def complex_operation(
    param1: str,       # Description of param1
    param2: int = 10,  # Optional with default
) -> dict:
    """
    Perform a complex operation.

    This tool does X, Y, and Z. Use it when you need to...

    Returns:
        dict: Contains 'result' and 'metadata' keys
    """
    ...
```
The Bigger Picture
MCP isn’t just about connecting AI to tools—it’s about creating an ecosystem where:
- Tools work across AI platforms
- Context flows seamlessly
- Developers build once, run everywhere
Like HTTP standardized web communication, MCP could standardize AI context.
Final Thoughts
Standards take time to mature. MCP is early but promising.
If you’re building AI integrations, consider MCP. Build your servers with the protocol. Contribute to the ecosystem.
The AI equivalent of REST is emerging.
One protocol to connect them all.