Getting Started with the MCP Python SDK
1. What is MCP?
The Model Context Protocol (MCP) lets developers build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it as a web API, but one designed specifically for LLM interactions. MCP servers can:
- Expose data through Resources (think of these roughly as GET endpoints; they load information into the LLM's context)
- Provide functionality through Tools (somewhat like POST endpoints; they execute code or otherwise produce side effects)
- Define interaction patterns through Prompts (reusable templates for LLM interactions)
- And more
2. MCP Python SDK Overview
MCP allows applications to provide context for LLMs in a standardized way, separating the concern of providing context from the actual LLM interaction. The official Python SDK implements the full MCP specification, making it easy to:
- Build MCP clients that can connect to any MCP server
- Create MCP servers that expose Resources, Prompts, and Tools
- Use standard transports such as stdio and SSE
- Handle all MCP protocol messages and lifecycle events
3. Quick Start
3.1. Installation
pip install "mcp[cli]"
3.2. Example
The following MCP server exposes a calculator tool and some data:
# server.py
from mcp.server.fastmcp import FastMCP
# Create an MCP server
mcp = FastMCP("Demo")
# Add an addition tool
@mcp.tool()
def add(a: int, b: int) -> int:
"""Add two numbers"""
return a + b
# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
"""Get a personalized greeting"""
return f"Hello, {name}!"
You can test and debug the server with the MCP Inspector (uv must be installed first):
mcp dev server.py
3.2.1. Direct Execution
For advanced scenarios such as custom deployments:
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("My App")
if __name__ == "__main__":
mcp.run()
Run it with:
python server.py
# or
mcp run server.py
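By default, mcp.run() communicates with the client over the stdio transport. The following is a minimal sketch of selecting SSE instead (so the server can be mounted as shown in section 4.7); the transport parameter and the "sse" option are assumptions based on SDK versions that also provide sse_app:
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")

if __name__ == "__main__":
    # Assumption: run() accepts transport="stdio" (default) or transport="sse" in this SDK version
    mcp.run(transport="sse")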
4. Core Concepts
4.1. Server
The FastMCP server is the core interface to the MCP protocol. It handles connection management, protocol compliance, and message routing:
# Add lifespan support for startup/shutdown with strong typing
from contextlib import asynccontextmanager
from collections.abc import AsyncIterator
from dataclasses import dataclass
from fake_database import Database # Replace with your actual DB type
from mcp.server.fastmcp import Context, FastMCP
# Create a named server
mcp = FastMCP("My App")
# Specify dependencies for deployment and development
mcp = FastMCP("My App", dependencies=["pandas", "numpy"])
@dataclass
class AppContext:
db: Database
@asynccontextmanager
async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
"""Manage application lifecycle with type-safe context"""
# Initialize on startup
db = await Database.connect()
try:
yield AppContext(db=db)
finally:
# Cleanup on shutdown
await db.disconnect()
# Pass lifespan to server
mcp = FastMCP("My App", lifespan=app_lifespan)
# Access type-safe lifespan context in tools
@mcp.tool()
def query_db(ctx: Context) -> str:
"""Tool that uses initialized resources"""
db = ctx.request_context.lifespan_context.db
return db.query()
4.2. Resources
Resources are how you expose data to LLMs. They are similar to GET endpoints in a REST API: they provide data but should not perform significant computation or have side effects.
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("My App")
@mcp.resource("config://app")
def get_config() -> str:
"""Static configuration data"""
return "App configuration here"
@mcp.resource("users://{user_id}/profile")
def get_user_profile(user_id: str) -> str:
"""Dynamic user data"""
return f"Profile data for user {user_id}"
4.3. Tools
Tools let LLMs take actions through your MCP server. Unlike resources, tools are expected to perform computation and have side effects.
import httpx
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("My App")
@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
"""Calculate BMI given weight in kg and height in meters"""
return weight_kg / (height_m**2)
@mcp.tool()
async def fetch_weather(city: str) -> str:
"""Fetch current weather for a city"""
async with httpx.AsyncClient() as client:
response = await client.get(f"https://api.weather.com/{city}")
return response.text
4.4. Prompts
Prompts are reusable templates that help LLMs interact with your MCP server effectively:
from mcp.server.fastmcp import FastMCP
from mcp.server.fastmcp.prompts import base
mcp = FastMCP("My App")
@mcp.prompt()
def review_code(code: str) -> str:
return f"Please review this code:\n\n{code}"
@mcp.prompt()
def debug_error(error: str) -> list[base.Message]:
return [
base.UserMessage("I'm seeing this error:"),
base.UserMessage(error),
base.AssistantMessage("I'll help debug that. What have you tried so far?"),
]
4.5. Images
FastMCP provides an Image class that handles image data automatically:
import io

from mcp.server.fastmcp import FastMCP, Image
from PIL import Image as PILImage
mcp = FastMCP("My App")
@mcp.tool()
def create_thumbnail(image_path: str) -> Image:
"""Create a thumbnail from an image"""
    img = PILImage.open(image_path)
    img.thumbnail((100, 100))
    # Encode the thumbnail as PNG bytes; img.tobytes() would return raw pixel data, not a PNG
    buffer = io.BytesIO()
    img.save(buffer, format="PNG")
    return Image(data=buffer.getvalue(), format="png")
4.6. Context
The Context object gives your tools and resources access to MCP capabilities:
from mcp.server.fastmcp import FastMCP, Context
mcp = FastMCP("My App")
@mcp.tool()
async def long_task(files: list[str], ctx: Context) -> str:
"""Process multiple files with progress tracking"""
for i, file in enumerate(files):
ctx.info(f"Processing {file}")
await ctx.report_progress(i, len(files))
data, mime_type = await ctx.read_resource(f"file://{file}")
return "Processing complete"
4.7. Mounting to an Existing ASGI Server
You can mount the SSE server onto an existing ASGI server using the sse_app method. This lets you integrate the SSE server with other ASGI applications.
from starlette.applications import Starlette
from starlette.routing import Mount, Host
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("My App")
# Mount the SSE server to the existing ASGI server
app = Starlette(
routes=[
Mount('/', app=mcp.sse_app()),
]
)
# or dynamically mount as host
app.router.routes.append(Host('mcp.acme.corp', app=mcp.sse_app()))
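The mounted Starlette app is an ordinary ASGI application, so any ASGI server can serve it. A minimal sketch using uvicorn (the module name server and the host/port values are illustrative assumptions):
# run.py
import uvicorn

from server import app  # hypothetical module containing the Starlette app defined above

if __name__ == "__main__":
    uvicorn.run(app, host="127.0.0.1", port=8000)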
5. Examples
5.1. Echo Server
A simple server demonstrating resources, tools, and prompts:
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("Echo")
@mcp.resource("echo://{message}")
def echo_resource(message: str) -> str:
"""Echo a message as a resource"""
return f"Resource echo: {message}"
@mcp.tool()
def echo_tool(message: str) -> str:
"""Echo a message as a tool"""
return f"Tool echo: {message}"
@mcp.prompt()
def echo_prompt(message: str) -> str:
"""Create an echo prompt"""
return f"Please process this message: {message}"
5.2. SQLite Explorer
A more complex example showing database integration:
import sqlite3
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("SQLite Explorer")
@mcp.resource("schema://main")
def get_schema() -> str:
"""Provide the database schema as a resource"""
conn = sqlite3.connect("database.db")
schema = conn.execute("SELECT sql FROM sqlite_master WHERE type='table'").fetchall()
return "\n".join(sql[0] for sql in schema if sql[0])
@mcp.tool()
def query_data(sql: str) -> str:
"""Execute SQL queries safely"""
conn = sqlite3.connect("database.db")
try:
result = conn.execute(sql).fetchall()
return "\n".join(str(row) for row in result)
except Exception as e:
return f"Error: {str(e)}"
6. Advanced Usage
6.1. Low-Level Server
For more control, you can use the low-level server implementation directly. This gives you full access to the protocol and lets you customize every aspect of your server, including lifecycle management through the lifespan API:
from contextlib import asynccontextmanager
from collections.abc import AsyncIterator
from fake_database import Database # Replace with your actual DB type
from mcp.server import Server
@asynccontextmanager
async def server_lifespan(server: Server) -> AsyncIterator[dict]:
"""Manage server startup and shutdown lifecycle."""
# Initialize resources on startup
db = await Database.connect()
try:
yield {"db": db}
finally:
# Clean up on shutdown
await db.disconnect()
# Pass lifespan to server
server = Server("example-server", lifespan=server_lifespan)
# Access lifespan context in handlers
@server.call_tool()
async def query_db(name: str, arguments: dict) -> list:
ctx = server.request_context
db = ctx.lifespan_context["db"]
return await db.query(arguments["query"])
The lifespan API provides:
- A way to initialize resources when the server starts and clean them up when it stops
- Access to the initialized resources in request handlers via the request context
- Type-safe context passing between the lifespan function and request handlers
import mcp.server.stdio
import mcp.types as types
from mcp.server.lowlevel import NotificationOptions, Server
from mcp.server.models import InitializationOptions
# Create a server instance
server = Server("example-server")
@server.list_prompts()
async def handle_list_prompts() -> list[types.Prompt]:
return [
types.Prompt(
name="example-prompt",
description="An example prompt template",
arguments=[
types.PromptArgument(
name="arg1", description="Example argument", required=True
)
],
)
]
@server.get_prompt()
async def handle_get_prompt(
name: str, arguments: dict[str, str] | None
) -> types.GetPromptResult:
if name != "example-prompt":
raise ValueError(f"Unknown prompt: {name}")
return types.GetPromptResult(
description="Example prompt",
messages=[
types.PromptMessage(
role="user",
content=types.TextContent(type="text", text="Example prompt text"),
)
],
)
async def run():
async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
await server.run(
read_stream,
write_stream,
InitializationOptions(
server_name="example",
server_version="0.1.0",
capabilities=server.get_capabilities(
notification_options=NotificationOptions(),
experimental_capabilities={},
),
),
)
if __name__ == "__main__":
import asyncio
asyncio.run(run())
6.2. Writing MCP Clients
The SDK provides a high-level client interface for connecting to MCP servers:
from mcp import ClientSession, StdioServerParameters, types
from mcp.client.stdio import stdio_client
# Create server parameters for stdio connection
server_params = StdioServerParameters(
command="python", # Executable
args=["example_server.py"], # Optional command line arguments
env=None, # Optional environment variables
)
# Optional: create a sampling callback
async def handle_sampling_message(
message: types.CreateMessageRequestParams,
) -> types.CreateMessageResult:
return types.CreateMessageResult(
role="assistant",
content=types.TextContent(
type="text",
text="Hello, world! from model",
),
model="gpt-3.5-turbo",
stopReason="endTurn",
)
async def run():
async with stdio_client(server_params) as (read, write):
async with ClientSession(
read, write, sampling_callback=handle_sampling_message
) as session:
# Initialize the connection
await session.initialize()
# List available prompts
prompts = await session.list_prompts()
# Get a prompt
prompt = await session.get_prompt(
"example-prompt", arguments={"arg1": "value"}
)
# List available resources
resources = await session.list_resources()
# List available tools
tools = await session.list_tools()
# Read a resource
content, mime_type = await session.read_resource("file://some/path")
# Call a tool
result = await session.call_tool("tool-name", arguments={"arg1": "value"})
if __name__ == "__main__":
import asyncio
asyncio.run(run())
6.3. MCP Primitives
The MCP protocol defines three core primitives that servers can implement:
| Primitive | Control | Description | Example |
| --- | --- | --- | --- |
| Prompts | User-controlled | Interactive templates invoked by user choice | Slash commands, menu options |
| Resources | Application-controlled | Contextual data managed by the client application | File contents, API responses |
| Tools | Model-controlled | Functions exposed to the LLM to take actions | API calls, data updates |
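On the client side, each primitive corresponds to a distinct set of ClientSession calls. A minimal sketch, assuming an already-initialized session like the one in section 6.2 (the prompt name, resource URI, and tool name are illustrative):
from mcp import ClientSession

async def use_primitives(session: ClientSession) -> None:
    # Prompts: user-controlled templates, fetched explicitly on the user's behalf
    prompt = await session.get_prompt("example-prompt", arguments={"arg1": "value"})
    # Resources: application-controlled context, read by URI
    content, mime_type = await session.read_resource("config://app")
    # Tools: model-controlled functions the LLM can invoke to take actions
    result = await session.call_tool("add", arguments={"a": 1, "b": 2})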
6.4. Server Capabilities
MCP servers declare their capabilities during initialization:
| Capability | Feature Flags | Description |
| --- | --- | --- |
| prompts | listChanged | Prompt template management |
| resources | subscribe, listChanged | Resource exposure and updates |
| tools | listChanged | Tool discovery and execution |
| logging | - | Server logging configuration |
| completion | - | Argument completion suggestions |
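With the low-level server, you can inspect what gets declared by reusing the get_capabilities() call from section 6.1. A hedged sketch; the assumption here is that a capability is advertised only once a matching handler has been registered:
import mcp.types as types
from mcp.server.lowlevel import NotificationOptions, Server

server = Server("capabilities-demo")

@server.list_prompts()
async def handle_list_prompts() -> list[types.Prompt]:
    return []

# Assumption: registering the prompts handler above is what makes the
# prompts capability appear in the declared ServerCapabilities.
caps = server.get_capabilities(
    notification_options=NotificationOptions(),
    experimental_capabilities={},
)
print(caps)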