
Python+MCP+LLM

2025/11/12 8:06:09 · Source: https://blog.csdn.net/2301_78450837/article/details/148191594

A first look at the server and the client

MCP Server

from fastmcp import FastMCP
import requests
import json

mcp = FastMCP()

def post_json(url: str, payload: dict):
    try:
        response = requests.post(url, json=payload)
        return json.dumps(response.json(), ensure_ascii=False, indent=2)
    except Exception as e:
        return f"❌ Request failed: {e}"

@mcp.tool(description="Get the weather for a city")
def get_weather(city: str):
    return f"The weather in {city} is fine"

@mcp.tool(description="Read a local text file; the input is a file path")
def read_text_tool(file_path: str) -> str:
    return post_json("http://localhost:8000/read_text", {"file_path": file_path})

@mcp.tool(description="Read the text content of a PDF file; the input is a file path")
def read_pdf_tool(file_path: str) -> str:
    return post_json("http://localhost:8000/read_pdf", {"file_path": file_path})

@mcp.tool(description="Preview the first page of a PDF as an image; the input is a file path")
def preview_pdf_tool(file_path: str, max_pages: int = 1) -> str:
    return post_json("http://localhost:8000/preview_pdf",
                     {"file_path": file_path, "max_pages": max_pages})

if __name__ == '__main__':
    mcp.run()
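The three file tools above forward requests to a helper service on http://localhost:8000 that the post never shows. A minimal stdlib-only sketch of what its `/read_text` endpoint might look like (the handler class and the `{"text": ...}` response shape are assumptions, not taken from the original):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

class ReadTextHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/read_text":
            self.send_error(404)
            return
        # The MCP tool posts a JSON body like {"file_path": "..."}
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        try:
            text = Path(payload["file_path"]).read_text(encoding="utf-8")
            body = json.dumps({"text": text}, ensure_ascii=False).encode("utf-8")
            self.send_response(200)
        except OSError as e:
            body = json.dumps({"error": str(e)}).encode("utf-8")
            self.send_response(400)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Port 0 lets the OS pick a free port for testing;
# the MCP tools above expect the service on port 8000.
server = HTTPServer(("127.0.0.1", 0), ReadTextHandler)
# server.serve_forever()  # uncomment to run the service
```

The `/read_pdf` and `/preview_pdf` endpoints would follow the same pattern, with a PDF library doing the extraction.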

MCP Client

import asyncio
from fastmcp import Client
from server import mcp

async def run():
    client = Client(mcp)
    async with client:
        tools = await client.list_tools()
        tool = tools[0]
        result = await client.call_tool(tool.name, {"city": "nanchang"})
        print(result)

if __name__ == '__main__':
    asyncio.run(run())

Going further

Using a locally deployed Ollama instance to call the tools wrapped by MCP

import asyncio
from typing import Dict, List

from fastmcp import Client
from openai import OpenAI

from server import mcp

class UserClient:
    def __init__(self, model="qwen3:0.6b"):
        self.model = model
        self.mcp_client = Client(mcp)
        # Ollama exposes an OpenAI-compatible endpoint; the api_key is unused
        self.openai_client = OpenAI(base_url="http://127.0.0.1:11434/v1",
                                    api_key="None")
        self.message = [{
            "role": "system",
            "content": "You are an AI assistant; use the available tools to answer the user's questions",
        }]
        self.tools = []

    async def prepare_tools(self):
        tools = await self.mcp_client.list_tools()
        # Convert MCP tool metadata into the OpenAI function-calling format;
        # the JSON schema belongs under the "parameters" key
        tools = [{
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.inputSchema,
            },
        } for tool in tools]
        return tools

    async def chat(self, messages: List[Dict]):
        async with self.mcp_client:
            if not self.tools:
                self.tools = await self.prepare_tools()
            response = self.openai_client.chat.completions.create(
                model=self.model,
                messages=messages,
                tools=self.tools,
            )
            return response.choices[0].message

    async def loop(self):
        while True:
            question = input("user:")
            self.message.append({"role": "user", "content": question})
            response_message = await self.chat(self.message)
            print("AI:", response_message.content)

async def main():
    user_client = UserClient()
    reply = await user_client.chat(
        [{"role": "user", "content": "What is the weather like in Qingdao today?"}])
    print(reply)

if __name__ == '__main__':
    asyncio.run(main())
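The `chat` method above stops at the first completion. When the model decides to use a tool, that completion carries a `tool_calls` entry whose arguments arrive as a JSON string; the call has to be dispatched back to MCP and its result appended as a `role="tool"` message before asking the model again. A stdlib-only sketch of those two steps (the function names here are illustrative, not part of fastmcp or the OpenAI client):

```python
import json

def tool_call_to_mcp_args(tool_call: dict):
    """Split an OpenAI-style tool_call into the MCP tool name and its
    arguments (the API delivers arguments as a JSON string)."""
    fn = tool_call["function"]
    return fn["name"], json.loads(fn["arguments"])

def tool_result_message(tool_call: dict, result: str) -> dict:
    """Build the role='tool' message that is appended to the conversation
    so the model can read the tool's output on the next turn."""
    return {"role": "tool",
            "tool_call_id": tool_call["id"],
            "content": result}

# A hand-written tool_call mimicking what the model would emit:
call = {"id": "call_0",
        "function": {"name": "get_weather",
                     "arguments": '{"city": "Qingdao"}'}}
name, args = tool_call_to_mcp_args(call)
# In the real loop: result = await self.mcp_client.call_tool(name, args),
# then append tool_result_message(call, str(result)) and call chat again.
```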
