
🚀 MCP Integration Now Available via FastMCP

We're excited to announce support for MCP (Model Context Protocol) integration through FastMCP!

This new feature allows LLM providers, including OpenAI, Gemini, DeepSeek, and Anthropic, to connect to external MCP servers using tool calling, enabling more dynamic, configurable interactions between your LLMs and external systems.
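To illustrate the tool-calling flow that the MCP integration builds on, here is a minimal stdlib-only sketch: the model emits a structured tool call, and the client routes it to a registered tool and returns the result. The names `TOOL_REGISTRY` and `dispatch_tool_call` are illustrative, not part of aicore's or FastMCP's API; in the real integration, the tools would live on an external MCP server.

```python
import json

# Hypothetical registry standing in for tools exposed by an MCP server.
TOOL_REGISTRY = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def dispatch_tool_call(call_json: str) -> str:
    """Parse an LLM-emitted tool call and route it to the matching tool."""
    call = json.loads(call_json)
    tool = TOOL_REGISTRY[call["name"]]
    return tool(**call["arguments"])

# The LLM emits a structured tool call; the client executes it and feeds
# the result back into the conversation as a tool message.
result = dispatch_tool_call(
    '{"name": "get_weather", "arguments": {"city": "Lisbon"}}'
)
print(result)  # Sunny in Lisbon
```

With MCP, the registry is replaced by tools discovered from the server at runtime, which is what makes the interactions configurable rather than hard-coded.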

Check out the new module in aicore/llm/mcp, updated configuration options in LlmConfig, and an example integration in async_llm_call_with_mcp.py.

Released under the MIT License.