
MCP and the innovation paradox: Why open standards will save AI from itself




Bigger models will not drive the next wave of AI innovation. The real disruption is quieter: standardization.

Launched by Anthropic in November 2024, the Model Context Protocol (MCP) standardizes how AI applications interact with the world beyond their training data. Much as HTTP and REST standardized how web applications connect to services, MCP standardizes how AI models connect to tools.
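To make the analogy concrete, here is a minimal sketch of the idea behind that standardization: a server describes its tools with names, descriptions and JSON-Schema inputs, and every tool is invoked through the same calling convention. The shapes below are simplified stand-ins for the protocol's actual messages, and the `create_ticket` tool is purely illustrative:

```python
import json

def list_tools():
    # A server advertises its tools with machine-readable metadata,
    # so any client or model can discover them uniformly.
    return [
        {
            "name": "create_ticket",
            "description": "Create a Jira-style ticket",
            "inputSchema": {
                "type": "object",
                "properties": {"title": {"type": "string"}},
                "required": ["title"],
            },
        }
    ]

# Illustrative implementations, keyed by tool name.
TOOL_IMPLS = {
    "create_ticket": lambda args: {"id": "TCK-1", "title": args["title"]},
}

def call_tool(name, arguments):
    # One uniform invocation path for every tool, regardless of which
    # service sits behind it.
    return TOOL_IMPLS[name](arguments)

print(json.dumps(call_tool("create_ticket", {"title": "Fix login bug"})))
```

The point is not the three functions themselves but the contract: because discovery and invocation are uniform, any client can drive any server without bespoke glue code.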

You have probably read a dozen articles explaining what MCP is. What most of them miss is the boring (and powerful) part: MCP is a standard. Standards don't just organize technology; they create growth flywheels. Adopt them early, and you ride the wave. Ignore them, and you fall behind. This article explains why MCP matters now, what challenges it introduces, and how it is already reshaping the ecosystem.

How MCP moves us from chaos to context

Meet Lily, a product manager at a cloud infrastructure company. She juggles projects across half a dozen tools: Jira, Figma, GitHub, Slack, Gmail and Confluence. Like many of us, she is drowning in updates.

By 2024, Lily had seen how good large language models (LLMs) had become at synthesizing information. She spotted an opportunity: if she could feed all of her team's tools into a model, she could automate updates, draft communications and answer questions on demand. But every model had its own custom way of connecting to services. Each integration pulled her deeper into a single vendor's platform. When she needed to pull in Gong transcripts, that meant building yet another bespoke connection, making it even harder to switch to a better LLM later.

Then Anthropic launched MCP: an open protocol for standardizing how context flows to LLMs. MCP quickly picked up support from OpenAI, AWS, Azure, Microsoft Copilot Studio and, soon, Google. Official SDKs are available for Python, TypeScript, Java, C#, Rust, Kotlin and Swift. Community SDKs for Go and other languages followed. Adoption was fast.

Today, Lily runs everything through Claude, connected to her work apps via a local MCP server. Status reports draft themselves. Leadership updates are a single prompt away. As new models appear, she can swap them in without losing any of her integrations. When she writes code on the side, she uses Cursor with a model from OpenAI and the same MCP server she uses in Claude. Her IDE already understands the product she is building. MCP made this easy.

The power and implications of a standard

Lily's story shows a simple truth: nobody likes using fragmented tools. No user wants to be locked into a vendor. And no company wants to rewrite integrations every time it changes models. People want the freedom to use the best tools. MCP delivers that.

Now, consider the implications of a standard taking hold.

First, SaaS providers without strong public APIs are vulnerable to obsolescence. MCP tools depend on those APIs, and customers will demand support for their AI applications. With a de facto standard emerging, there are no excuses.

Second, AI application development cycles will speed up dramatically. Developers no longer have to write custom code just to test simple AI applications. Instead, they can plug MCP servers into readily available MCP clients, such as Claude Desktop, Cursor and Windsurf.
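For a sense of how little glue this requires, clients such as Claude Desktop wire servers in through a small JSON config file. The snippet below sketches that format; the `github` entry (one of the reference servers, launched via `npx`) is illustrative, and exact package names and file locations may have changed since writing:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    }
  }
}
```

Restart the client and the server's tools appear in the model's tool list, with no custom integration code written.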

Third, switching costs collapse. Because integrations are decoupled from specific models, organizations can migrate from Claude to OpenAI to Gemini, or blend models, without rebuilding infrastructure. Future LLM providers will benefit from an existing ecosystem around MCP, letting them focus on better price-performance.
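That decoupling can be sketched in a few lines. The classes below are illustrative stand-ins, not real provider SDKs; the point is that the tool layer never changes when the model backend does:

```python
class MCPToolLayer:
    """Stands in for integrations exposed over MCP; unchanged across models."""
    def tools(self):
        return ["jira.create_ticket", "slack.post_message"]

# Hypothetical model backends sharing one interface.
class ClaudeBackend:
    def complete(self, prompt, tools):
        return f"claude: {len(tools)} tools available"

class GeminiBackend:
    def complete(self, prompt, tools):
        return f"gemini: {len(tools)} tools available"

def run(backend, tool_layer, prompt):
    # Same tool layer, any backend: switching models is a one-line change.
    return backend.complete(prompt, tool_layer.tools())

layer = MCPToolLayer()
run(ClaudeBackend(), layer, "draft a status update")  # migrate by swapping
run(GeminiBackend(), layer, "draft a status update")  # the backend only
```

In a real stack the backend would be an LLM API client and the tool layer a set of MCP servers, but the shape of the argument is the same: the integrations are an asset you keep, not a cost you repay per model.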

Navigating the challenges of MCP

Every standard introduces new friction points or leaves existing ones unresolved. MCP is no exception.

Trust is critical: Dozens of MCP registries have appeared, offering thousands of community-maintained servers. But if you don't control the server, or trust the party that does, you risk leaking secrets to an unknown third party. If you are a SaaS company, provide official servers. If you are a developer, look for official servers.

Quality is variable: APIs evolve, and poorly maintained MCP servers can easily fall out of sync. LLMs rely on high-quality metadata to decide which tools to use. There is still no authoritative MCP registry, which reinforces the need for official servers from trusted parties. If you are a SaaS company, maintain your servers as your APIs evolve. If you are a developer, look for official servers.

Big MCP servers raise costs and lower utility: Bundling too many tools into a single server drives up costs through token consumption and overwhelms models with too much choice. LLMs are easily confused when they have access to too many tools. It's the worst of both worlds. Smaller, task-focused servers will matter. Keep this in mind as you build and distribute servers.

Authorization and identity problems persist: These problems existed before MCP, and they still exist with MCP. Imagine Lily gave Claude the ability to send emails, with a well-intentioned instruction such as: "Quickly send Chris a status update." Instead of emailing her boss, Chris, the LLM emails everyone named Chris in her contact list to make sure Chris gets the message. Humans will need to stay in the loop for high-judgment actions.
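One common mitigation is to gate high-stakes tool calls behind an explicit human confirmation. The sketch below assumes a client-side dispatcher; the tool names and the `HIGH_STAKES` set are illustrative, not part of MCP itself:

```python
# Tools whose effects are hard to undo get routed through a human check.
# (Which tools qualify is a policy decision made by the client, not MCP.)
HIGH_STAKES = {"send_email", "delete_repo"}

def execute_tool(name, arguments, confirm):
    """Run a tool call, pausing high-stakes actions for human approval.

    `confirm` is a callback that shows the user the pending call and
    returns True only if they approve it.
    """
    if name in HIGH_STAKES and not confirm(name, arguments):
        return {"status": "rejected", "tool": name}
    return {"status": "executed", "tool": name, "args": arguments}

# In a real client, `confirm` would open a prompt; here it auto-declines.
result = execute_tool(
    "send_email",
    {"to": "chris@example.com", "body": "status update"},
    confirm=lambda name, args: False,
)
```

In Lily's scenario, a gate like this would have surfaced the full recipient list before any email left her account.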

Looking ahead

MCP is not hype. It is a fundamental shift in the infrastructure of AI applications.

And, like every well-adopted standard before it, MCP is creating a self-reinforcing flywheel: every new server, every new integration, every new application compounds the momentum.

New tools, platforms and registries are already emerging to simplify building, testing, deploying and discovering MCP servers. As the ecosystem evolves, AI applications will offer simple interfaces for plugging into new capabilities. Teams that adopt the protocol will ship products faster, with better integration stories. Companies offering public APIs and official MCP servers can be part of that integration story. Late adopters will have to fight for relevance.

Noah Schwartz is a product manager for Postman.

