## Overview

MCP has taken off as the standard protocol for AI integrations, and it's difficult to justify not supporting it. However, this popularity will be short-lived. Some of it stems from misconceptions about what MCP uniquely accomplishes, but most of it comes from how easy it is to add an MCP server. For a brief period, adding an MCP server looked like a cheap way to draw attention to a project, which is why so many projects have added support.

## What is MCP?

MCP claims to solve the "NxM problem": with N agents and M toolsets, users would otherwise need a bespoke connector for every combination.

### The NxM problem

A common misconception is that MCP is required for function calling. It's not. With tool-calling models, a list of available tools is sent to the LLM alongside each request. If the LLM decides to call a tool, it responds with the tool's name and JSON-formatted arguments instead of ordinary text:
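To make that round trip concrete, here is a minimal sketch using OpenAI's Python SDK and the Chat Completions endpoint. The `set_meeting` tool and its parameter schema are invented for illustration (the name is borrowed from the Gemini example below); any tool-calling model works the same way.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The application declares the tools it is willing to execute.
tools = [{
    "type": "function",
    "function": {
        "name": "set_meeting",  # hypothetical tool for this example
        "description": "Schedule a meeting at a given time.",
        "parameters": {
            "type": "object",
            "properties": {
                "time": {"type": "string", "description": "ISO 8601 start time"},
                "topic": {"type": "string"},
            },
            "required": ["time", "topic"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Book a 3pm sync about the launch."}],
    tools=tools,
)

# If the model chose to call a tool, it returns the tool's name plus
# JSON-encoded arguments instead of ordinary assistant text.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name)                   # "set_meeting"
    print(json.loads(call.function.arguments))  # {"time": ..., "topic": ...}
```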
The application is responsible for providing the tool schemas, parsing the returned arguments, and executing the calls. The problem arises when users want to reuse toolsets across different agents, since each vendor's API is slightly different. For example, Gemini's API expects tools as `functionDeclarations` nested inside a `tools` array:

```bash
curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent" \
  -d '{
    "contents": [...],
    "tools": [
      {
        "functionDeclarations": [
          {
            "name": "set_meeting",
            "description": "...",
            ...
```

In OpenAI's API, tool schemas live in a flat `tools` array with `type: "function"`:

```bash
curl -X POST https://api.openai.com/v1/responses \
  -d '{
    "model": "gpt-4o",
    "input": [...],
    "tools": [
      {
        "type": "function",
        "name": "get_weather",
        ...
```

This is the "NxM" problem. In theory, users must build N × M connectors. In practice, the differences are minor (same semantics, slightly different JSON shape), and frameworks like LangChain, LiteLLM, and SmolAgents already abstract them away. Crucially, these options execute tool calls in the same runtime as the agent.
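As a sketch of that abstraction, here is what the same toolset looks like through LiteLLM, which accepts OpenAI-style (Chat Completions) tool schemas and rewrites them into each vendor's wire format; the model strings and the `set_meeting` schema are the same illustrative examples as above:

```python
import litellm

# One OpenAI-style tool schema, declared once.
tools = [{
    "type": "function",
    "function": {
        "name": "set_meeting",  # same hypothetical tool as above
        "description": "Schedule a meeting at a given time.",
        "parameters": {
            "type": "object",
            "properties": {"time": {"type": "string"}, "topic": {"type": "string"}},
            "required": ["time", "topic"],
        },
    },
}]

messages = [{"role": "user", "content": "Book a 3pm sync about the launch."}]

# The identical schema is sent to both providers; LiteLLM translates the
# request and normalizes each response back into the OpenAI shape.
# (API keys are read from OPENAI_API_KEY / GEMINI_API_KEY.)
for model in ["gpt-4o", "gemini/gemini-2.5-flash"]:
    response = litellm.completion(model=model, messages=messages, tools=tools)
    for call in response.choices[0].message.tool_calls or []:
        # The tool call would execute here, in the application's own
        # process, i.e. the same runtime as the agent.
        print(model, call.function.name, call.function.arguments)
```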
### How MCP addresses it

...