I agree with the article, and I love how the author is (mis-)using MCP. I just want to rephrase what the accident actually is.
The accident isn't that we somehow got a protocol to do things we couldn't do before. As other comments point out, MCP (the specification) isn't anything new or interesting.
No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned.
I don't know how long it'll last, but I sure appreciate it.
stavros · 6m ago
I haven't seen an app that didn't have an API create one via MCP. The only MCP servers I've seen were for things that I could already access programmatically.
visarga · 3h ago
The main benefit is not that it made interoperability fashionable, or that it makes things easy to interconnect. It is the LLM itself, if it knows how to wield tools. It's like you build a backend and the front end is not your job anymore; AI does it.
In my experience Claude and Gemini can take over tool use and all we need to do is tell them the goal. This is huge; before, we always had to specify the steps to achieve anything on a computer. Writing a fixed program to deal with a dynamic process is hard, while an LLM can adapt on the fly.
freeone3000 · 1h ago
The issue holding us back was never that we had to write a frontend — it was the data locked behind proprietary databases and interfaces. Gated behind API keys and bot checks and captchas and scraper protection. And now we can have an MCP integrator for IFTTT and have back the web we were promised, at least for a while.
whitten · 15m ago
what is IFTTT?
lazyasciiart · 3m ago
If This Then That - a Zapier-type glue provider.
sshine · 5h ago
Hype, certainly.
But the way I see it, AI agents created incentives for interoperability. Who needs an API when everyone is job secure via being a slow desktop user?
Well, your new personal assistant who charges by the watt-hour NEEDS it. Just as the CEO will personally drive to get pizzas for that hackathon because that's practically free labor, everyone now wants everything connected.
For those of us who rode the API wave before integrating became hand-wavey, it sure feels like the world caught up.
I hope it will last, but I don’t know either.
mh- · 4h ago
Unfortunately, I think we're equally likely to see shortsighted lock-in attempts like this [0] one from Slack.
I tried to find a rebuttal to this article from Slack, but couldn't. I'm on a flight with slow wifi though. If someone from Slack wants to chime in that'd be swell, too.
I've made the argument to CFOs multiple times over the years why we should continue to pay for Slack instead of just using Teams, but y'all are really making that harder and harder.
[0]: https://www.reuters.com/business/salesforce-blocks-ai-rivals...
Sounds sort of like an innovator's dilemma response. New technology appears and the response is gatekeeping and building walls rather than adaptation.
dgacmu · 3h ago
I'm happier each day that we went with Zulip.
ebiester · 4h ago
It's going to take more people willing to move away from slack for those purposes.
As it is, I'm going to propose that we move more key conversations outside of slack so that we can take advantage of feeding it into ai. It's a small jump from that to looking for alternatives.
mellosouls · 3h ago
> the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned
Perhaps but we see current hypes like Cursor only using MCP one way; you can feed into Cursor (eg. browser tools), but not out (eg. conversation history, context etc).
I love Cursor, but this "not giving back" mentality, originally reflected in its closed-source forking of VS Code, leaves an unpleasant taste in the mouth, and I believe it will ultimately see it lose developer credibility.
Lock-in still seems to be locked in.
talos_ · 2h ago
The VSCode extension Continue provides similar capabilities and gives you full access to your interaction traces (local database and JSON traces)
adregan · 4h ago
How ironic, given the number of APIs that were locking down access in response to AI training!
Though the general API lockdown was started long before that, and like you, I’m skeptical that this new wave of open access will last if the promise doesn’t live up to the hype.
TimTheTinker · 3h ago
MCP is supposed to grant "agency" (whatever that means), not merely expose curated data and functionality.
In practice, the distinction is little more than the difference between different HTTP verbs, but I think there is a real difference in what people are intending to enable when creating an MCP server vs. standard APIs.
adregan · 3h ago
Might be another reflection of McLuhan's "the medium is the message" in that APIs are built with the intended interface in mind.
To this point, GUIs; going forward, AI agents. While the intention rhymes, the meaning of these systems diverges.
notatoad · 1h ago
i don't think it's ironic at all. the AI boom exposed the value of data. there's two inevitable consequences when the value of something goes up: the people who were previously giving it away for free start charging for it, and the people who weren't previously selling it at all start selling it.
the APIs that used to be free and now aren't were just slightly ahead of the game, all these new MCP servers aren't going to be free either.
Animats · 2h ago
> Want spell check? MCP server.
> Want it to order coffee when you complete 10 tasks? MCP server.
With a trip through an LLM for each trivial request? A paid trip?
With high overhead and costs?
notatoad · 2h ago
the whole point of the article is that it doesn't need to be an LLM, MCP is just a standard way to expose tools to things that use tools. LLMs can use tools, but so can humans.
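To make that concrete, here is a minimal sketch of a plain script, with no LLM anywhere, discovering and calling tools via the official `mcp` Python SDK; the server command and the `add_numbers` tool are invented for illustration:

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Launch a local MCP server over stdio (hypothetical script).
        server = StdioServerParameters(command="python", args=["server.py"])
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()  # discover what the server offers
                print([t.name for t in tools.tools])
                # Call a tool directly; no model is involved anywhere.
                result = await session.call_tool("add_numbers", {"a": 2, "b": 40})
                print(result.content)

    asyncio.run(main())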
therein · 1h ago
So the whole point of the article is that an API is an API and anything can call an API?
lazyasciiart · 2m ago
Maybe phrasing it this way will be the lightbulb moment for everyone who hasn’t got that yet.
conradev · 2h ago
AI agents didn't only make adversarial interoperability hype, they've also made it inevitable! From here all the way until they're probing hardware to port Linux and write drivers.
bitwize · 4h ago
Remember Web 2.0? Remember the semantic web? Remember folksonomies? Mash-ups? The end of information silos? The democratizing power of HTTP APIs? Anyone? Anyone?
apgwoz · 4h ago
I think we found a new backronym for MCP: Mashup Context Protocol.
(The mashup hype was incredible, btw. Some of the most ridiculous web contraptions ever.)
kasey_junk · 4h ago
Yes. Pieces of all of those things surround us now. And where we are wrt lock-in and interop is far beyond where we were when each of those fads happened.
MCP is a fad, it's not long-term tech. But I'm betting shoveling data at LLM agents isn't. The benefits are too high for companies to allow vendors to lock the data away from them.
fragmede · 37m ago
I know we're all just soaked by a wave of hype right now but I think MCP will go the way of other "but it works" tech, like zip files, RSS and shell scripts.
karaterobot · 4h ago
I don't understand your point. Some of those things were buzzwords, some were impossible dreams, some changed the way the web works completely. Are you just saying that the future is unknown?
klabb3 · 3h ago
No. What they are saying is best said with a quote from Battlestar Galactica:
> All of this has happened before, and all of this will happen again.
”It” here being the boom and inevitable bust of interop and open API access between products, vendors and so on. As a millennial, my flame of hope was lit during the API explosion of Web 2.0. If you’re older, your dreams were probably crushed already by something earlier. If you’re younger, and you’re genuinely excited about MCP for the potential explosion in interop, hit me up for a bulk discount on napkins.
bwfan123 · 3h ago
And then there are "architecture astronauts" dreaming of an entire internet of MCP-speaking devices - an "internet of agents" if you will. That would then require a separate DNS, SMTP, BGP, etc. for that internet.
fragmede · 28m ago
I'm older and would like a discount please. The "this time it's different" energy comes from the assumption that, since a human can interact with the system and vision models can drive a GUI, who cares if there's an actual API: just have the AI interact with the system as if it were coming in as a human.
potatolicious · 2h ago
I take their point to be that the underlying incentives haven't changed. The same forces and incentives that scuttled those things are likely to scuttle this as well.
I actually disagree with the OP in this sub-thread:
> "No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned."
I don't think that's happened at all. I think some interoperability will be here to stay - but those are overwhelmingly the products where interoperability was already the norm. The enterprise SaaS that your company is paying for will support their MCP servers. But they also probably already support various other plugin interfaces.
And they're not doing this because of hype or new-fangledness, but because their incentives are aligned with interoperability. If their SaaS plugs into [some other thing], it increases their sales. In fact the lowering of integration effort is all upside for them.
Where this is going to run into a brick wall (and I'd argue: already has to some degree) is that closed platforms that aren't incentivized to be interoperable still won't be. I don't think we've really moved the needle on that yet. Uber Eats is not champing at the bit to build the MCP server that orders your dinner.
And there are a lot of really good reasons for this. In a previous job I worked on a popular voice assistant that integrated with numerous third-party services. There has always been vehement pushback to voice assistant integration (the ur-agent and to some degree still the holy grail) because it necessarily entails the service declaring near-total surrender about the user experience. An "Uber Eats MCP" is one that Uber has comparatively little control over the UX of, and has poor ability to constrain poor customer experiences. They are right to doubt this stuff.
I also take some minor issue with the blog: the problem with MCP as the "everything API" is that you can't really take the "AI" part out of it. MCP tools are not guaranteed to communicate in structured formats! Instead of getting an HTTP 401 you will get a natural language string like "You cannot access this content because the author hasn't shared it with you."
That's not useful without the presence of a NL-capable component in your system. It's not parseable!
Also importantly, MCP inputs and outputs are intentionally not versioned nor encouraged to be stable. Devs are encouraged to alter their input and output formats to make them more accessible to LLMs. So your MCP interface can and likely will change without notice. None of this makes for good API for systems that aren't self-adaptive to that sort of thing (i.e., LLMs).
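To illustrate that contrast with made-up values: a classic API failure is machine-checkable, while an MCP tool result is a list of content blocks that may be nothing but prose.

    # Illustrative shapes only, not from any real server.
    http_error = 401  # machine-checkable: a client can branch on it and re-authenticate

    mcp_result = {    # an MCP tool result: content blocks, typically free text
        "content": [
            {"type": "text",
             "text": "You cannot access this content because the author "
                     "hasn't shared it with you."}
        ]
    }
    # Branching on mcp_result means parsing prose, which is easy for an LLM
    # and brittle for everything else.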
https://www.palantir.com/docs/foundry/ontology/overview
> made interoperability hype, and vendor lock-in old-fashioned
I always imagined software could be written with a core that does the work and the UI would be interchangeable. I like that the current LLM hype is causing it to happen.
iLoveOncall · 5h ago
> I don't know how long it'll last
I'm just baffled no software vendor has already come up with a subscription to access the API via MCP.
I mean, obviously paid API access is nothing new, but "paid MCP access for our enterprise users" is surely in the pipeline everywhere, after which the openness will die down.
adamesque · 3h ago
I think for enterprise it’s going to become part of the subscription you’re already paying for, not a new line item. And then prices will simply rise.
Optionality will kill adoption, and these things are absolutely things you HAVE to be able to play with to discover the value (because it’s a new and very weird kind of tool that doesn’t work like existing tools)
Bjartr · 4h ago
And I expect there'll eventually be a way for an AI to pay for an MCP use microtransaction style.
Heck, if AIs are at some point given enough autonomy to simply be given a task and a budget, there'll be efforts to try to trick AIs into thinking paying is the best way to get their work done! Ads (and scams) for AIs to fall for!
pininja · 3h ago
Mapbox is just a small step away from that with their MCP server wrapping their pay-by-use API. I wouldn’t be surprised to see a subscription offering with usage limits if that somehow appealed to them. MapTiler already offers their service as a subscription so they’re even closer if they hosted a server like this on their own.
https://github.com/mapbox/mcp-server
I don’t want to undermine the author’s enthusiasm for the universality of the MCP. But part of me can’t help wondering: isn’t this the idea of APIs in general? Replace MCP with REST and does that really change anything in the article? Or even an Operating System API? POSIX, anyone? Programs? Unix pipes? Yes, MCP is far simpler/universal than any of those things ended up being — but maybe the solution is to build simpler software on good fundamental abstractions rather than rebuilding the abstractions every time we want to do something new.
Jonovono · 5h ago
MCP is not REST. In your comparison, it's more that MCP is a protocol for discovering REST endpoints at runtime and letting users configure which REST endpoints should be used at runtime.
Say I'm building an app and I want my users to be able to play Spotify songs. Yeah, I'll hit the Spotify API. But now, say I've launched my app, and I want my users to be able to play a song from sonofm when they hit play. Alright, now I gotta open up the code, add some if statements to hard-code the sonofm API, ship a new version, and show some update messages.
MCP is literally just a way to make this extensible so instead of hardcoding this in, it can be configured at runtime
https://en.wikipedia.org/wiki/HATEOAS
Wait was it? HATEOAS is all about hypermedia, which means there must be a human in the loop being presented the rendered hypermedia. MCP seems like it's meant to be for machine<->machine communication, not human<->machine
layer8 · 2h ago
I agree that HATEOAS never made sense without a human in the loop, although I also have never seen it be described as such. IMO that’s an important reason why it never gained useful traction.
There is a confused history where Roy Fielding described REST, then people applied some of that to JSON HTTP APIs, designating those as REST APIs, then Roy Fielding said “no you have to do HATEOAS to achieve what I meant by REST”, then some people tried to make their REST APIs conform to HATEOAS, all the while that change was of no use to REST clients.
But now with AI it actually can make sense, because the AI is able to dynamically interpret the hypermedia content similar to a human.
NomDePlum · 1h ago
My understanding was that the discoverable part of HATEOAS was meant for machine to machine. Actually all of REST is machine to machine except in very trivial situations.
Not sure I'm understanding your point in hypermedia means there is human in the loop. Can you expand?
renerick · 26m ago
H in HATEOAS stands for "hypermedia". Hypermedia is a type of document that includes hypermedia controls, which are presented by the hypermedia client to a user for interaction. It's the user who decides which controls to interact with. For example, when I'm writing this comment, the HN server gave me a hypermedia document, which contains your comment, a textarea input, and a button to submit my reply, and me, the human in the loop, decides what to put in the input and when to press the button. A machine can't do that on its own (but LLMs potentially can), so a user is required. That also means that JSON APIs meant for purely machine-to-machine interactions, commonly referred to as REST, can't be considered HATEOAS (and REST) due to the absence of hypermedia controls.
Further reading:
- https://htmx.org/essays/how-did-rest-come-to-mean-the-opposi...
- https://htmx.org/essays/hateoas/
MCP is a JSON RPC implementation of OpenAPI, or, get this, XML and WSDL/SOAP.
* https://news.ycombinator.com/item?id=43307225
* https://www.ondr.sh/blog/ai-web
jaredsohn · 2h ago
Feels like segment.com but for calling APIs rather than adding libraries to the frontend.
nikolayasdf123 · 4h ago
so... is this OpenAPI then?
lobsterthief · 3h ago
Basically, yes. But with much more enthusiasm!
kvdveer · 5h ago
The main difference between MCP and REST is that MCP is self-described from the very start. REST may have OpenAPI, but that's a later add-on, and we haven't quite standardised on using it. The first step of exposing an MCP is describing it; for REST that is an optional step that's often omitted.
xg15 · 4h ago
Is it "self-described" in the sense I can get a list of endpoints or methods, with a human- (or LLM-) readable description for each - or does it supply actual schemata that I could also use with non-AI clients?
(Even if only the former, it would of course be a huge step forward, as I could have the LLM generate schemata. Also, at least, everyone is standardizing on a base protocol now, and a way to pass command names, arguments, results, etc. That's already a huge step forward in contrast to arbitrary Rest+JSON or even HTTP APIs)
Spivak · 3h ago
For each tool you get the human description as well as a JSON schema for the parameters needed to call the function.
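As a sketch of what one entry in a tools/list response carries (the field names are MCP's; the example tool is invented): the inputSchema is ordinary JSON Schema, so even a non-AI client can validate arguments against it.

    tool_descriptor = {
        "name": "search_tickets",
        "description": "Search the issue tracker and return matching tickets.",
        "inputSchema": {  # plain JSON Schema for the call's arguments
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Free-text search"},
                "limit": {"type": "integer", "minimum": 1, "maximum": 100},
            },
            "required": ["query"],
        },
    }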
talos_ · 2h ago
You're getting an arbitrary string back though...
Szpadel · 5h ago
isn't SOAP also self-described?
kerng · 1h ago
When I read about MCP the first time and saw that it requires a "tools/list" API, it reminded me of COM/DCOM/ActiveX from Microsoft, which had things like QueryInterface and IDispatch. And I'm sure that wasn't the first time someone came up with dynamic runtime discovery of the APIs a server offers.
Interestingly, ActiveX was quite the security nightmare for very similar reasons actually, and we had to deal with infamous "DLL Hell". So, history repeats itself.
souldeux · 4h ago
And gRPC with reflection, yeah?
hansonkd · 3h ago
and GQL with reflection?
notpushkin · 2h ago
JSON-LD?
light_hue_1 · 5h ago
But you're describing it in a way that is useless to anything but an LLM. It would have been much better if the description language had been more formalized.
Majromax · 4h ago
> It would have been much better if the description language had been more formalized.
To speculate about this, perhaps the informality is the point. A full formal specification of something is somewhere between daunting and Sisyphean, and we're more likely to see supposedly formal documentation that nonetheless is incomplete or contains gaps to be filled with background knowledge or common sense.
A mandatory but informal specification in plain language might be just the trick, particularly since vibe-APIing encourages rapid iteration and experimentation.
0x696C6961 · 5h ago
The description includes an input and output json schema.
caust1c · 4h ago
In my mind the only thing novel about MCP is requiring that the schema be provided as part of the protocol. Sure, it's convenient that the shape of the request/response wrappers is all the same; that certainly helps with management, using libraries that can wrap dynamic types in static types. But everyone was already doing that with APIs; we just didn't agree on what that envelope's shape should be. With the requirement that the schema be provided with the protocol, and the carrot of AI models seamlessly consuming it, that was enough of an impetus.
marcosdumay · 3h ago
> the only thing novel about MCP is requiring the schema is provided as part of the protocol
You mean, like OpenAPI, gRPC, SOAP, and CORBA?
sneak · 3h ago
You can’t connect to a gRPC endpoint and ask to download the client protobuf, but yes.
ahmedtd · 44m ago
It's not enabled by default, but you can, via gRPC Reflection:
* https://github.com/grpc/grpc-java/blob/master/documentation/...
* https://grpc.io/docs/guides/reflection/
You can then use generic tools like grpc_cli or grpcurl to list available services and methods, and call them.
gdecaso · 1h ago
The main difference between MCP and REST is `list-tools`.
REST APIs have 5 or 6 ways of doing that, including "read it from our docs site", HATEOAS, OAS running on an endpoint as part of the API.
MCP has a single way of listing endpoints.
OJFord · 47m ago
> The main difference between MCP and REST is `list-tools`.
> REST APIs have 5 or 6 ways of doing that
You think nobody's ever going to publish a slightly different standard to Anthropic's MCP that is also primarily intended for LLMs?
gavinray · 1h ago
WSDL + XML APIs have been around since 1998.
OpenAPI, OData, gRPC, GraphQL
I'm sure I'm missing a few...
spenczar5 · 5h ago
honestly, yes - but MCP includes a really simple 'reflection' endpoint to list the capabilities of an API, with human readable docs on methods and types. That is something that gRPC and OpenAPI and friends have supported as an optional extension for ages, but it has largely been a toy. MCP makes it central and maybe that makes all the difference.
spudlyo · 5h ago
At a previous job most of our services supported gRPC reflection, and exploring and tinkering with these APIs using the grpc_cli tool was some of the most fun I had while working there. Building and using gRPC services in golang left a strong positive impression on me.
lobsterthief · 3h ago
I had the same experience working with GQL :)
bayesianbot · 5h ago
My first thought as well. But maybe people wanting to plug their apps into their AI will at least force developers to actually implement the interface, unlike plain APIs, which are mostly unheard of among the general population and thus often not offered?
TZubiri · 2h ago
Damn, I just read this and it's comforting to see how similar it is to my own response.
To elaborate on this: I don't know much about MCP, but usually when people speak about it, it's in a buzzword-seeking kind of way, and the people who are interested in it make these kinds of conceptual snafus.
Second, and this applies not just to MCP but to things like JSON, Rust, and MongoDB: there's a phenomenon where people learn the complex stuff before learning the basics. It's not the first time I've cited this video of Homer studying marketing, where he reads the books out of order: https://www.youtube.com/watch?v=2BT7_owW2sU . It makes sense that this mistake is so common; the amount of literature and resources is like an inverted pyramid. There's very little classical foundation and A LOT of new stuff, most of which will not stand the test of time. Typically you have universities to lead the way and establish a classical corpus and path, but this is such a young discipline that 70 years in we still haven't found much stability: universities have gone from teaching C, to Java, to Python (at least in intro to CS), and maybe they will teach Rust next. This buzzwording seems more like trying to predict the future, and there will be way more losers than winners in that realm. And the winners will have learned the classics in addition to the new technology; learning the new stuff without the classics is a recipe for disaster.
jampa · 5h ago
I don't want to sound like a skeptic, but I see way more people talking about how awesome MCP is rather than people building cool things with it. Reminds me of blockchain hype.
MCP seems like a more "in-between" step until the AI models get better. I imagine in 2 years, instead of using an MCP, we will point to the tool's documentation or OpenAPI, and the AI can ingest the whole context without the middle layer.
qsort · 5h ago
Regardless of how good a model gets, it can't do much if it doesn't have access to deterministic tools and information about the state of the world. And that's before you take into account security: you can't have a model running arbitrary requests against production, that's psychotic.
I don't have a high opinion of MCP, and the hype it's generating is ridiculous, but the problem it supposedly solves is real. If it can work as an excuse to get providers to expose an API for their functionality like the article hopes, that's exciting for developers.
mtkd · 4h ago
It's very different to blockchain hype
I had similar skepticism initially, but I would recommend you dip a toe in the water before passing judgement
The conversational/voice AI tech now dropping + the current LLMs + MCP/tools/functions to mix in vendor APIs and private data/services etc. really feels like a new frontier
It's not 100% but it's close enough for a lot of usecases now and going to change a lot of ways we build apps going forward
moooo99 · 3h ago
Probably my judgement is a bit fogged. But if I get asked about building AI into our apps just one more time I am absolutely going to drop my job and switch careers
mtkd · 3h ago
That's likely because OG devs have been seeing the hallucination stuff, unpredictability, etc., and questioning how that fits with their carefully curated perfect system
What blocked me initially was watching NDA'd demos a year or two back from a couple of big software vendors on how Agents were going to transform enterprise ... what they were showing was a complete non-starter to anyone who had worked in a corporate because of security, compliance, HR, silos etc. so I dismissed it
This MCP stuff solves that, it gives you (the enterprise) control in your own walled garden, whilst getting the gains from LLMs, voice etc. ... the sum of the parts is massive
It more likely wraps existing apps than integrates directly with them, the legacy systems becoming data or function providers (I know you've heard that before ... but so far this feels different when you work with it)
moooo99 · 1h ago
> That's likely because OG devs have been seeing the hallucination stuff, unpredicability etc. and questioning how that fits with their carefully curated perfect system
That is the odd part. I am far from being part of that group of people. I'm only 25; I joined the industry in 2018 as part of a training program in a large enterprise.
The odd part is, many of the promises are a bit déjà-vu even for me. "Agents are going to transform the enterprise" and other promises do not seem that far off the promises that were made during the low-code hype cycle.
Cynically, the more I look at the AI projects as an outsider, the more I think AI could fail in enterprises largely because of the same reason low code did. Organizations are made of people and people are messy, as a result the data is often equally messy.
bwfan123 · 3h ago
There are 2 kinds of use cases that software automates: 1) those that require accuracy, and 2) those that don't (social media, ads, recommendations).
Further, there are 2 kinds of users that consume the output of software. a) humans, and b) machines.
Where LLMs shine is in the 2a use cases, i.e., use cases where accuracy does not matter and humans are the end users. There are plenty of these.
The problem is that LLMs are being applied to 1a and 1b use cases, where there is going to be a lot of frustration.
How does MCP solve any of the problems you mentioned? The LLM still has to access your data, still doesn't know the difference between instructions and data, and still gives you hallucinated nonsense back – unless there's some truly magical component to this protocol that I'm missing.
ashwinsundar · 2h ago
I had a use case - I wanted to know what the congresspeople from my state have done this week. This information is surprisingly hard to just get from the news. I learned about MCP a few months ago and thought that it might be a cool way to interact with the congress.gov API.
I made this MCP server so that you could chat with real-time data coming from the API - https://github.com/AshwinSundar/congress_gov_mcp. I’ve actually started using it more to find out, well, what the US Congress is actually up to!
bryancoxwell · 5h ago
But this whole post is about using MCP sans AI
iLoveOncall · 5h ago
MCP without AI is just APIs.
MCP is already a useless layer between AIs and APIs, using it when you don't even have GenAI is simply idiotic.
The only redeeming quality of MCP is actually that it has pushed software vendors to expose APIs to users, but just use those directly...
ricardobeat · 5h ago
And that’s the whole point - it’s APIs we did not have. Now app developers are encouraged to have a public, user friendly, fully functional API made for individual use, instead of locking them behind enterprise contracts and crippling usage limits.
candiddevmike · 4h ago
Do you have an example of a company who previously had an undiscoverable API now offering a MCP-based alternative?
drivers99 · 4h ago
> it’s APIs we did not have
Isn't that what we had about 20 years ago (web 2.0) until they locked it all up (the APIs and feeds) again? ref: this video posted 18 years ago: https://www.youtube.com/watch?v=6gmP4nk0EOE
(Rewatching it in 2025, the part about "teaching the Machine" has a different connotation now.)
Maybe it's that the protocol is more universal than before, and they're opening things up more due to the current trends (AI/LLM vs web 2.0 i.e. creating site mashups for users)? If it follows the same trend then after a while it will become enshittified as well.
iLoveOncall · 5h ago
Right, but we would have had them even if MCP did not exist. The need to access those APIs via LLM-based "agents" would have existed without MCP.
At work I built an LLM-based system that invokes tools. We started before MCP existed and just used APIs (and continue to do so).
Its engineering value is nil, it only has marketing value (at best).
> we will point to the tool's documentation or OpenAPI
You can already do this as long as your client has access to an HTTP MCP.
You can give the current generation of models an OpenAPI spec and it will know exactly what to do with it.
nikolayasdf123 · 4h ago
you don't even need MCP for that. just access to hosted swagger file.
arbuge · 4h ago
I could see that happening... perhaps instead of plugging in the URL of the MCP server you'd like to use, you'd just put in the URL of their online documentation and trust your AI assistant of choice to go through all of it.
caust1c · 4h ago
It's incredible for investigating audit logs. Our customers use it daily.
I wasn't able to find a good source on it, but I read a couple of times that Anthropic (builders of MCP) do astroturfing/shilling/growth hacking/SEO/organic advertisement. Everything I've read so far about MCP and Claude, and the hype I see on social media, is consistent with that: hype and no value.
jasondclinton · 1h ago
This is false.
afro88 · 37m ago
The author is missing the bit that the LLM provides: automatically mapping input parameters to things the user wants to do, and responses to the way the UI displays them.
Take out the LLM and you're not that far away from existing protocols and standards. It's not plugging your app into any old MCP and it just works (like the USB-C example).
But, it is a good point that the hype is getting a lot of apps and services to offer APIs in a universal protocol. That helps.
sureglymop · 3h ago
I've thought of this as well, but in reality, aren't MCP servers mostly just clients for pre-existing APIs?
For example, the Kagi MCP server interacts with the Kagi API. Wouldn't you have a better experience just using that API directly, then?
On another note, as the number of python interpreters running on your system increases with the number of MCP servers, does anyone think there will be "hosted" offerings that just provide a sort of "bridge" running all your MCP servers?
graerg · 1h ago
This has been my take too, and maybe I'm missing something, but my thinking has been that in the ideal case there's an existing API with an OpenAPI spec you can just wrap with your FastMCP instantiation. This seemed neat, but while I was doing authenticated requests and tinkering with Goose, I ended up just having Goose run curl commands against the existing API routes. I suspect that with a sufficiently well-documented OpenAPI spec, MCP is kinda moot.
On the other hand, in the absence of an existing API, you can implement your MCP server to just [do the thing] itself, and maybe that's where the author sees things trending.
neoden · 4h ago
So much scepticism in the comments. I spent last week implementing an MCP server, and I must say that "well-designed" is probably an overstatement. One of the principles behind MCP is that "an MCP server should be very easy to implement". I don't know, maybe it's a skill issue, but it's not that easy at all. But what is important, imo, is that so many eyes are looking in one direction right now. That means it has a good chance of having all its problems solved very quickly. And second, it's often so hard to gather a critical mass of attention around something to create an ecosystem, but that is happening right now. I wish all the participants patience and luck)
newtwilly · 2h ago
It's pretty easy if you just use the MCP Python library. You just put an annotation on a function and there's your tool. I was able to do it and it works great without me knowing anything about MCP. Maybe it's a different story if you actually need to know the protocol and implement more for yourself
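For reference, a minimal sketch of that annotation flow using the Python SDK's FastMCP class (the server name and tool are placeholders):

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo-server")

    @mcp.tool()
    def add_numbers(a: int, b: int) -> int:
        """Add two integers."""  # the docstring becomes the tool description
        return a + b

    if __name__ == "__main__":
        mcp.run()  # serves over stdio by default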
klabb3 · 3h ago
> One of the principles behind MCP is that "an MCP server should be very easy to implement".
I’m not familiar with the details but I would imagine that it’s more like:
”An MCP server which re-exposes an existing public/semi-public API should be easy to implement, with as few changes as possible to the original endpoint”
At least that’s the only way I can imagine getting traction.
mattmanser · 3h ago
We've done it before, it hasn't worked before, and it's only a matter of years if not months before apps start locking down the endpoints so ONLY ChatGPT/Claude/etc. servers can use them.
Interoperability means user portability. And no tech bro firm wants user portability, they want lock in and monopoly.
vinkelhake · 4h ago
While reading this, the old ARexx (Amiga Rexx) popped into my head. It was a scripting language that in itself wasn't very noteworthy. However, it also made it easy for applications to expose functionality through an ARexx port. And again, offering up an API itself isn't noteworthy either. But it shipped by default in the system and if an application wanted to open itself up for scripting, ARexx was the natural choice. As a result, a ton of applications did have ARexx ports and there was a level of universality that was way ahead of its time.
Come to think of it - I don't know what the modern equivalent would be. AppleScript?
billmcneale · 2h ago
Microsoft introduced this in Windows in 1993; it's called COM and is still in (heavy) use today.
It basically powers all intercommunication in Windows.
layer8 · 4h ago
PowerShell with COM interfaces.
inheritedwisdom · 4h ago
Lowering the bar to integrate and communicate is what has historically allowed technology to reach critical mass and enabled adoption. MCP is an evolution in that respect and shouldn’t be disregarded.
We had a non-technical team member write an agent to clean up a file share. There are hundreds of programming languages, libraries, and APIs that enabled that before MCP, but now people don't even have to think about it. Is it performant? No. Is it the "best" implementation? Absolutely not. Did it create enormous value in a novel way that was not possible with the resources, time, and technology we had before? 100%. And that's the point.
citizenpaul · 3h ago
>non technical team member write an agent to clean up a file share
This has to be BS (or you think it's true) unless it was like 1000 files. In my entire career I've seen countless crazy file shares that are barely functional chaos. In nearly every single "cleanup" attempt I've tried to get literally ANYONE from the relevant department to help, with little success. That is just for ME to do the work FOR THEM. I just need context from them. I've on countless occasions had to go to senior management to force someone to simply sit with me for an hour to go over the schema they want to implement. SO I CAN DO IT FOR THEM, and they don't want to do it, and they literally seemed incapable of doing so when forced to. COUNTLESS times. This is how I know AI is being shilled HARD.
If this is true then I bet you anything in about 3-6 months you guys are going to be recovering this file system from backups. There is absolutely no way it was done correctly and no one has bothered to notice yet. I'll accept your downvote for now.
Cleaning up a file share is 50% politics, 20% updating procedures, 20% training and 10% technical. I've seen companies go code red and practically grind to a halt over a months long planned file share change. I've seen them rolled back after months of work. I've seen this fracture the files shares into insane duplication(or more) because despite the fact it was coordinated, senior managers did not as much as inform their department(but attended meetings and signed off on things) and now its too late to go back because some departments converted and some did not. I've seen helpdesk staff go home "sick" because they could not take the volume of calls and abuse from angry staff afterwards.
Yes I have trauma on this subject. I will walk out of a job before ever doing a file share reorg again.
You'll roll it out in phases? LOL
You'll run it in parallel? LOL
You'll do some <SUPER SMART> thing? LOL.
gavinray · 1h ago
> "The author discovers API's/JSON RPC"
I'm too young to be posting old_man_yells_at_cloud.jpg comments...
rubatuga · 53m ago
Can someone link to this supposed toaster with DP-alt mode? That supposedly runs on 240W? (Max PD power)
belter · 13m ago
This is all starting to look like autonomous driving. We are nowhere near solving it but everybody acts like it's here.
bigmattystyles · 4h ago
I thought MCPs just ‘figured out’ using docs how to call a program’s API. Won’t it matter that many APIs just suck?
chopete3 · 2h ago
The real accident is that the prompts became a programming language. I don't think the ML Engineers set out to create a general purpose programming language.
The A2A (agent to agent) mechanism is another accidental discovery for interoperability across agent boundaries
nimish · 2h ago
Interoperability is, and always was, the hardest part of programming systems together. It's telling that the AI tooling needed sustained non-AI effort to expose the interfaces via MCP (or WS-* or REST or an enterprise service bus or XML or CORBA or EJB or...)
I know this is nit-picky and not really relevant to the actual meat of the story, but a toaster (outside of a gag gift or gimmick) cannot run on USB-C since your typical toaster draws ~1kW and USB-C power spec tops out at 240W.
hnlmorg · 4h ago
A car lighter also cannot run a pizza oven for the same reason.
Someone should write an AI tool that evaluates every top article in hacker News and provides the appropriate XKCD comic as a comment.
dexterdog · 2h ago
And then a few steps later it's just bots talking to bots. Then what do we read when we're on the loo?
bovermyer · 2h ago
> emotional support portable fan
I can't be the only person that non-ironically has this.
MontagFTB · 2h ago
Bret Victor had an old video where he talked about a world in which computers very organically figured out how to interoperate. MCP feels like the first realization of that idea.
iambateman · 4h ago
Where do I get started with MCP? I’m all in, but kinda…confused?
A REST API makes sense to me…but this is apparently significantly different and more useful. What’s the best way to think about MCP compared to a traditional API? Where do I get started building one? Are there good examples to look at?
randomcatuser · 3h ago
Yeah, one way to think about it is like... protocols restrict things, so that people can expect the same stuff.
With a traditional API, people can build it any way they want, which means you (the client) need API docs.
With MCP, you literally restrict it to 2 things: get the list of tools, and call a tool (using the schema you got above). The key insight is just: add 1 more endpoint that lists the APIs you have, so that robots can find them.
Finally, hook it up to an LLM client. It's dead simple to do in Claude Code: create an .mcp.json file and define the server's startup command.
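On the wire, those two things are plain JSON-RPC 2.0 methods. A sketch of the two requests, with invented ids and tool name:

    # The two calls described above, as JSON-RPC 2.0 payloads.
    list_tools_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

    call_tool_request = {
        "jsonrpc": "2.0", "id": 2,
        "method": "tools/call",
        "params": {
            "name": "search_tickets",            # taken from the tools/list response
            "arguments": {"query": "login bug"}  # must satisfy that tool's inputSchema
        },
    }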
airstrike · 4h ago
It's kinda like a REST API in which the schema tags along with the requests.
The use case in AI is sort of reversed such that the code runs on your computer
stefan_ · 3h ago
I think I'm living in a parallel universe. You can tell an LLM in a million ways what "tools" it can "call". Anthropic & co standardized a shitty variant so they have an uniform way of letting others play in their sandbox, until they invariably decide which of these things make sense and then usurp them in a desperate way out of the commodity rat race to the bottom.
mudkipdev · 3h ago
Anyone else feel like this article was written with ChatGPT
neuronic · 3h ago
> Anyone else feel like this article was written with ChatGPT
Not in this particular case. At this point I am starting to wonder if the
ummadi · 56m ago
So does that mean MCP is good to integrate along with agentic AI?
furyofantares · 4h ago
Agents (presumably) increase the demand for APIs and if those APIs as well as already existing APIs get exposed as MCPs then I can see it.
It is dependent on agents actually creating new demand for APIs and MCP being successful as a way to expose them.
quotemstr · 5h ago
It's articles like this that tell you we're close to peak hype. There's nothing revolutionary about a text encoding plus a schema. SOAP could do this 20 years ago.
rikafurude21 · 5h ago
This reminded me of that HN comment on the Dropbox announcement post where the user says there's nothing new about it since FTP and USB sticks exist. Also, anyone who has ever had the misfortune of using SOAP knows how horrendous it is. Truth is, sometimes the "new thing" does it better and wins out. Applications have standardized APIs now because of AI hype. This is a step in the right direction.
AnotherGoodName · 5h ago
SOAP was worse than horrendous though. I'm sure I'm not the only one hit by "well, Java SOAP and .NET SOAP encode differently so they don't work together well" (let alone all the other different implementations, each with their own similar differences).
Or how about ‘oh it looks like your client is using SOAP 1.2 but the server is 1.1 and they are incompatible’. That was seriously a thing. Good luck talking to many different servers with different versions.
SOAP wasn’t just bad. It was essentially only useable between same languages and versions. Which is an interesting issue for a layer whose entire purpose was interoperability.
b0a04gl · 4h ago
before, we just figured out apis, read some docs, guessed, and moved on... now apis gotta work for llms. they don't read anything, they just parse, so standardising shifts... from neat docs to explainable api specs: schema + inputs + reflection + docstrings + specs + X. else the agent just skips it. we are not only the consumer now
phreeza · 3h ago
Is this basically the XML/RSS/semantic web of this tech wave?
brap · 3h ago
I guess I’m finally old enough to become old-man-yelling-at-cloud.
I’m convinced that the only reason why MCP became a thing is because newcomers weren’t that familiar with OpenAPI and other existing standards, and because a protocol that is somehow tied to AI (even though it’s not, as this article shows) generates a lot of hype these days.
There’s absolutely nothing novel about MCP.
roenxi · 5h ago
... MCP is almost literally just a JSON schema and a "yo, this stuff exists" for AI. It is great to have it standardised and we're all very thankful not to be using XML but there just isn't that much there.
MCP is fulfilling the promise of AI agents being able to do their own thing. None of this is unintended, unforeseen or particularly dependent on the existence of MCP. It is exciting, the fact that AI has this capability captures the dawn of a new era. But the important thing in the picture isn't MCP - it is the power of the models themselves.
layer8 · 3h ago
XML actually works better with LLMs than JSON.
zahlman · 52m ago
Why?
shalev123 · 3h ago
Oh boy, if only our personal AI assistants could be as reliable as a good old-fashioned pizza run by the CEO. The irony is delicious - we're moving towards universal plugins not because of some grand vision, but because everyone's desperate to make sure their AI doesn't go on an energy-saving nap mid-task.
It's almost poetic how we're now incentivizing interoperability simply because our digital buddies have to eat (or rather, drink) to stay awake. Who would've thought that the quest for connectivity would be driven by the humble Watt hour?
I guess when it comes down to it, even AI needs a good power-up - and hopefully, this time around, the plugins will stick. But hey, I'll believe it when my assistant doesn't crash while trying to order takeout.
Uml2013 · 55m ago
So does that mean it's good to integrate MCP with agentic AI?
moron4hire · 4h ago
This isn't a snide comment, I am legitimately asking. I don't understand the difference between MCP and REST. I know there are differences because I've used it a little. I mean, like, on an existential level. Why isn't it just REST? What parts do MCP give us that REST doesn't?
taytus · 4h ago
The author glosses over some practical realities. Just because something can be repurposed doesn't mean it should be. MCP was designed with specific assumptions about how AI models consume and process information. Using it as a general plugin system might work, but you'd likely hit limitations around things like authentication, real-time communication, or complex data flows that weren't priorities for the AI use case.
croes · 4h ago
Universal but insecure
OJFord · 5h ago
HTTP: A (Deliberately) Universal Plugin System
TZubiri · 3h ago
> What if it's just "a standardized way to connect AI models literally anything to different data sources and tools"?
Then you aren't exploring a novel concept, and you are better served learning about historical ways this challenge has been attempted rather than thinking it's the first time.
Unix pipes? APIs? POSIX? JSON? The list is endless; this is one of those requirements you can identify as just being a basic one of computers. Another example is anything that is about storing and remembering information. If something is so foundational, there will have been tools and protocols dealing with it since the 70s.
For the love of god, before diving into the trendy new thing, learn about the boring old things.
spiritplumber · 5h ago
Yes, I'm old. Old enough to remember the MCP when he was just a chess program! He started small, and he'll end small!
rlboston · 4h ago
Sounds like programming! What happened to low code/no code? I AM old, retired in fact. IT has more "middleware" than the Library of Congress, and mainframes still exist. But I will dig around, because I'm still curious. Carry on. LOL
The accident isn't that somehow we got a protocol to do things we couldn't do before. As other comments point out MCP (the specificaiton), isn't anything new or interesting.
No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned.
I don't know how long it'll last, but I sure appreciate it.
In my experience Claude and Gemini can take over tool use and all we need to do is tell them the goal. This is huge, we always had to specify the steps to achieve anything on a computer before. Writing a fixed program to deal with dynamic process is hard, while a LLM can adapt on the fly.
But the way I see it, AI agents created incentives for interoperability. Who needs an API when everyone is job secure via being a slow desktop user?
Well, your new personal assistant who charges by the Watt hour NEEDS it. Like when the CEO will personally drive to get pizzas for that hackathon because that’s practically free labor, so does everyone want everything connected.
For those of us who rode the API wave before integrating became hand-wavey, it sure feels like the world caught up.
I hope it will last, but I don’t know either.
I tried to find a rebuttal to this article from Slack, but couldn't. I'm on a flight with slow wifi though. If someone from Slack wants to chime in that'd be swell, too.
I've made the argument to CFOs multiple times over the years why we should continue to pay for Slack instead of just using Teams, but y'all are really making that harder and harder.
[0]: https://www.reuters.com/business/salesforce-blocks-ai-rivals...
As it is, I'm going to propose that we move more key conversations outside of slack so that we can take advantage of feeding it into ai. It's a small jump from that to looking for alternatives.
Perhaps but we see current hypes like Cursor only using MCP one way; you can feed into Cursor (eg. browser tools), but not out (eg. conversation history, context etc).
I love Cursor but this "not giving back" mentality originally reflected in it's closed source forking of VS Code leaves an unpleasant taste in the mouth and I believe will ultimately see it lose developer credibility.
Lock-in still seems to be locked in.
Though the general API lockdown was started long before that, and like you, I’m skeptical that this new wave of open access will last if the promise doesn’t live up to the hype.
In practice, the distinction is little more than the difference between different HTTP verbs, but I think there is a real difference in what people are intending to enable when creating an MCP server vs. standard APIs.
To this point, GUIs; going forward, AI agents. While the intention rhymes, the meaning of these systems diverge.
the APIs that used to be free and now aren't were just slightly ahead of the game, all these new MCP servers aren't going to be free either.
> Want it to order coffee when you complete 10 tasks? MCP server.
With a trip through an LLM for each trivial request? A paid trip? With high overhead and costs?
(The mashup hype was incredible, btw. Some of the most ridiculous web contraptions ever.)
Mcp is a fad, it’s not long term tech. But I’m betting shoveling data at llm agents isn’t. The benefits are too high for companies to allow vendors to lock the data away from them.
> All of this has happened before, and all of this will happen again.
”It” here being the boom and inevitable bust of interop and open API access between products, vendors and so on. As a millenial, my flame of hope was lit during the API explosion of Web 2.0. If you’re older, your dreams were probably crushed already by something earlier. If you’re younger, and you’re genuinely excited about MCP for the potential explosion in interop, hit me up for a bulk discount on napkins.
I actually disagree with the OP in this sub-thread:
> "No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned."
I don't think that's happened at all. I think some interoperability will be here to say - but those are overwhelmingly the products where interoperability was already the norm. The enterprise SaaS that your company is paying for will support their MCP servers. But they also probably already support various other plugin interfaces.
And they're not doing this because of hype or new-fangledness, but because their incentives are aligned with interoperability. If their SaaS plugins into [some other thing] it increases their sales. In fact the lowering of integration effort is all upside for them.
Where this is going to run into a brick wall (and I'd argue: already has to some degree) is that closed platforms that aren't incentivized to be interoperable still won't be. I don't think we've really moved the needle on that yet. Uber Eats is not champing at the bit to build the MCP server that orders your dinner.
And there are a lot of really good reasons for this. In a previous job I worked on a popular voice assistant that integrated with numerous third-party services. There has always been vehement pushback to voice assistant integration (the ur-agent and to some degree still the holy grail) because it necessarily entails the service declaring near-total surrender about the user experience. An "Uber Eats MCP" is one that Uber has comparatively little control over the UX of, and has poor ability to constrain poor customer experiences. They are right to doubt this stuff.
I also take some minor issue with the blog: the problem with MCP as the "everything API" is that you can't really take the "AI" part out of it. MCP tools are not guaranteed to communicate in structured formats! Instead of getting an HTTP 401 you will get a natural language string like "You cannot access this content because the author hasn't shared it with you."
That's not useful without the presence of a NL-capable component in your system. It's not parseable!
Also importantly, MCP inputs and outputs are intentionally not versioned nor encouraged to be stable. Devs are encouraged to alter their input and output formats to make them more accessible to LLMs. So your MCP interface can and likely will change without notice. None of this makes for good API for systems that aren't self-adaptive to that sort of thing (i.e., LLMs).
https://www.palantir.com/docs/foundry/ontology/overview
I always imagined software could be written with a core that does the work and the UI would be interchangeable. I like that the current LLM hype is causing it to happen.
I'm just baffled no software vendor has already come up with a subscription to access the API via MCP.
I mean obviously paid API access is nothing new, but "paid MCP access for our entreprise users" is surely on the pipeline everywhere, after which the openness will die down.
Optionality will kill adoption, and these things are absolutely things you HAVE to be able to play with to discover the value (because it’s a new and very weird kind of tool that doesn’t work like existing tools)
Heck, if AIs are at some point given enough autonomy to simply be given a task and a budget, there'll be efforts to try to trick AIs into thinking paying is the best way to get their work done! Ads (and scams) for AIs to fall for!
https://github.com/mapbox/mcp-server
Say i'm building a app and I want my users to be able to play spotify songs. Yea, i'll hit the spotify api. But now, say i've launched my app, and I want my users to be able to play a song from sonofm when they hit play. Alright, now I gotta open up the code and do some if statements hard code the sonofm api and ship a new version, show some update messages.
MCP is literally just a way to make this extensible so instead of hardcoding this in, it can be configured at runtime
https://en.wikipedia.org/wiki/HATEOAS
There is a confused history where Roy Fielding described REST, then people applied some of that to JSON HTTP APIs, designating those as REST APIs, then Roy Fielding said “no you have to do HATEOAS to achieve what I meant by REST”, then some people tried to make their REST APIs conform to HATEOAS, all the while that change was of no use to REST clients.
But now with AI it actually can make sense, because the AI is able to dynamically interpret the hypermedia content similar to a human.
Not sure I'm understanding your point in hypermedia means there is human in the loop. Can you expand?
Further reading:
- https://htmx.org/essays/how-did-rest-come-to-mean-the-opposi...
- https://htmx.org/essays/hateoas/
* https://news.ycombinator.com/item?id=43307225
* https://www.ondr.sh/blog/ai-web
(Even if only the former, it would of course be a huge step forward, as I could have the LLM generate schemata. Also, at least, everyone is standardizing on a base protocol now, and a way to pass command names, arguments, results, etc. That's already a huge step forward in contrast to arbitrary Rest+JSON or even HTTP APIs)
Interestingly, ActiveX was quite the security nightmare for very similar reasons actually, and we had to deal with infamous "DLL Hell". So, history repeats itself.
To speculate about this, perhaps the informality is the point. A full formal specification of something is somewhere between daunting and Sisyphean, and we're more likely to see supposedly formal documentation that nonetheless is incomplete or contains gaps to be filled with background knowledge or common sense.
A mandatory but informal specification in plain language might be just the trick, particularly since vibe-APIing encourages rapid iteration and experimentation.
You mean, like OpenAPI, gRPC, SOAP, and CORBA?
* https://github.com/grpc/grpc-java/blob/master/documentation/...
* https://grpc.io/docs/guides/reflection/
You can then use generic tools like grpc_cli or grpcurl to list available services and methods, and call them.
REST APIs have 5 or 6 ways of doing that, including "read it from our docs site", HATEOAS, OAS running on an endpoint as part of the API.
MCP has a single way of listing endpoints.
> REST APIs have 5 or 6 ways of doing that
You think nobody's ever going to publish a slight different standard to Anthropic's MCP that is also primarily intended for LLMs?
OpenAPI, OData, gRPC, GraphQL
I'm sure I'm missing a few...
To elaborate on this: I don't know much about MCP, but usually when people speak about it, it's in a buzzword-seeking kind of way, and the people who are interested in it make these kinds of conceptual snafus.
Second, and this applies not just to MCP but even to things like JSON, Rust, and MongoDB: there's this phenomenon where people learn the complex stuff before learning the basics. It's not the first time I've cited this video of Homer studying marketing, where he reads the books out of order: https://www.youtube.com/watch?v=2BT7_owW2sU . It makes sense that this mistake is so common; the amount of literature and resources is like an inverted pyramid, with very few classical foundations and A LOT of new stuff, most of which will not stand the test of time. Typically you have universities to lead the way and establish a classical corpus and path, but this is such a young discipline — 70 years in and we are still not finding much stability. Universities have gone from teaching C, to teaching Java, to teaching Python (at least in intro to CS); maybe they will teach Rust next. This buzzwording seems more in line with trying to predict the future, and there will be way more losers than winners in that realm. And the winners will have learned the classics in addition to the new technology; learning the new stuff without the classics is a recipe for disaster.
MCP seems like more of an "in-between" step until the AI models get better. I imagine that in 2 years, instead of using an MCP server, we will point to the tool's documentation or OpenAPI spec, and the AI will ingest the whole context without the middle layer.
I don't have a high opinion of MCP, and the hype it's generating is ridiculous, but the problem it supposedly solves is real. If it can work as an excuse to get providers to expose an API for their functionality, as the article hopes, that's exciting for developers.
I had similar skepticism initially, but I would recommend you dip a toe in the water before passing judgement.
The conversational/voice AI tech now dropping + the current LLMs + MCP/tools/functions to mix in vendor APIs and private data/services etc. really feels like a new frontier
It's not 100% there, but it's close enough for a lot of use cases now, and it's going to change a lot of the ways we build apps going forward.
What blocked me initially was watching NDA'd demos a year or two back from a couple of big software vendors on how Agents were going to transform the enterprise... what they were showing was a complete non-starter to anyone who had worked in a corporate environment, because of security, compliance, HR, silos, etc., so I dismissed it.
This MCP stuff solves that: it gives you (the enterprise) control in your own walled garden, whilst getting the gains from LLMs, voice, etc. ... the sum of the parts is massive.
It more likely wraps existing apps than integrates directly with them, the legacy systems becoming data or function providers (I know you've heard that before ... but so far this feels different when you work with it)
That is the odd part. I am far from being part of that group of people. I'm only 25; I joined the industry in 2018 as part of a training program at a large enterprise.
The odd part is, many of the promises are a bit déjà vu even for me. "Agents are going to transform the enterprise" and similar promises do not seem that far off the ones that were made during the low-code hype cycle.
Cynically, the more I look at the AI projects as an outsider, the more I think AI could fail in enterprises largely because of the same reason low code did. Organizations are made of people and people are messy, as a result the data is often equally messy.
Further, use cases can be split along two dimensions: 1) accuracy matters vs. 2) accuracy doesn't matter, and there are 2 kinds of users that consume the output of software: a) humans and b) machines.
Where LLMs shine is in the 2a use cases, i.e., use cases where accuracy does not matter and humans are the end users. There are plenty of these.
The problem is that LLMs are being applied to 1a and 1b use cases, where there is going to be a lot of frustration.
I made this MCP server so that you could chat with real-time data coming from the API - https://github.com/AshwinSundar/congress_gov_mcp. I’ve actually started using it more to find out, well, what the US Congress is actually up to!
MCP is already a useless layer between AIs and APIs; using it when you don't even have GenAI is simply idiotic.
The only redeeming quality of MCP is actually that it has pushed software vendors to expose APIs to users, but just use those directly...
Isn't that what we had about 20 years ago (web 2.0) until they locked it all up (the APIs and feeds) again? ref: this video posted 18 years ago: https://www.youtube.com/watch?v=6gmP4nk0EOE
(Rewatching it in 2025, the part about "teaching the Machine" has a different connotation now.)
Maybe it's that the protocol is more universal than before, and they're opening things up more due to the current trends (AI/LLM vs web 2.0 i.e. creating site mashups for users)? If it follows the same trend then after a while it will become enshittified as well.
At work I built an LLM-based system that invokes tools. We started before MCP existed and just used APIs (and continue to do so).
Its engineering value is nil; it only has marketing value (at best).
Anthropic wants to define another standard now, btw: https://www.anthropic.com/engineering/desktop-extensions
You can already do this as long as your client has access to an HTTP MCP server.
You can give the current generation of models an OpenAPI spec and they will know exactly what to do with it.
https://blog.runreveal.com/introducing-runreveal-remote-mcp-...
Take out the LLM and you're not that far away from existing protocols and standards. It's not as if you can plug your app into any old MCP server and it just works (like the USB-C example).
But, it is a good point that the hype is getting a lot of apps and services to offer APIs in a universal protocol. That helps.
For example, the Kagi MCP server interacts with the Kagi API. Wouldn't you have a better experience just using that API directly, then?
On another note, as the number of Python interpreters running on your system grows with the number of MCP servers, does anyone think there will be "hosted" offerings that just provide a sort of "bridge" running all your MCP servers?
On the other hand, in the absence of an existing API, you can implement your MCP server to just [do the thing] itself, and maybe that's where the author sees things trending.
I’m not familiar with the details but I would imagine that it’s more like:
”An MCP server which re-exposes an existing public/semi-public API should be easy to implement, with as few changes as possible to the original endpoint”
At least that’s the only way I can imagine getting traction.
Interoperability means user portability. And no tech bro firm wants user portability, they want lock in and monopoly.
Come to think of it - I don't know what the modern equivalent would be. AppleScript?
It basically powers all intercommunication in Windows.
We had a non-technical team member write an agent to clean up a file share. There are hundreds of programming languages, libraries, and APIs that enabled that before MCP, but now people don't even have to think about it. Is it performant? No. Is it the "best" implementation? Absolutely not. Did it create enormous value in a novel way that was not possible with the resources, time, and technology we had before? 100%. And that's the point.
This has to be BS (or you just think it's true) unless it was like 1000 files. In my entire career I've seen countless crazy file shares that are barely functional chaos. In nearly every single "cleanup" attempt, I've tried to get literally ANYONE from the relevant department to help, with little success. That is just for ME to do the work FOR THEM. I just need context from them. I've on countless occasions had to go to senior management to force someone to simply sit with me for an hour to go over the schema they want to try to implement. SO I CAN DO IT FOR THEM, and they don't want to do it, and they literally seemed incapable of doing so when forced to. COUNTLESS times. This is how I know AI is being shilled HARD.
If this is true then I bet you anything in about 3-6 months you guys are going to be recovering this file system from backups. There is absolutely no way it was done correctly and no one has bothered to notice yet. I'll accept your downvote for now.
Cleaning up a file share is 50% politics, 20% updating procedures, 20% training, and 10% technical. I've seen companies go code red and practically grind to a halt over a months-long planned file share change. I've seen changes rolled back after months of work. I've seen this fracture file shares into insane duplication (or worse) because, despite the fact that it was coordinated, senior managers did not so much as inform their departments (but attended meetings and signed off on things), and by then it was too late to go back because some departments had converted and some had not. I've seen helpdesk staff go home "sick" because they could not take the volume of calls and abuse from angry staff afterwards.
Yes I have trauma on this subject. I will walk out of a job before ever doing a file share reorg again.
You'll roll it out in phases? LOL
You'll run it in parallel? LOL
You'll do some <SUPER SMART> thing? LOL.
I'm too young to be posting old_man_yells_at_cloud.jpg comments...
The A2A (agent-to-agent) mechanism is another accidental discovery for interoperability across agent boundaries.
http://archive.today/OUymS
But you’re right, it does kind of miss the point.
I can't be the only person that non-ironically has this.
A REST API makes sense to me…but this is apparently significantly different and more useful. What’s the best way to think about MCP compared to a traditional API? Where do I get started building one? Are there good examples to look at?
With a traditional API, people can build it any way they want, which means you (the client) need API docs.
With MCP, you literally restrict it to 2 things: get the list of tools, and call a tool (using the schema you got above). Thus the key insight is just: add one more endpoint that lists the APIs you have, so that robots can find them.
Example time:
- Build an MCP server (the equivalent of "intro to Flask 101"): https://developers.cloudflare.com/agents/guides/remote-mcp-s...
- Now you can add it to Claude Desktop/Cursor and see what it does
- That's as far as I got lol
Then use FastMCP to write an MCP server in Python (see the sketch below): https://github.com/jlowin/fastmcp
Finally, hook it up to an LLM client. It's dead simple to do in Claude Code: create an .mcp.json file and define the server's startup command.
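For what it's worth, a complete FastMCP server can be this small (a sketch assuming the fastmcp package; the add tool is a toy example):

    from fastmcp import FastMCP

    mcp = FastMCP("demo")

    @mcp.tool()
    def add(a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    if __name__ == "__main__":
        # Speaks MCP over stdio by default, so a client like Claude Code
        # can spawn it from the startup command defined in .mcp.json.
        mcp.run()

FastMCP derives the tool's schema from the type hints and docstring, which is what ends up in the tools/list response that clients see.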
The use case in AI is sort of reversed, in that the code runs on your computer.
> Anyone else feel like this article was written with ChatGPT
comments are actually written by ChatGPT.
It is dependent on agents actually creating new demand for APIs and MCP being successful as a way to expose them.
Or how about "oh, it looks like your client is using SOAP 1.2 but the server is 1.1 and they are incompatible"? That was seriously a thing. Good luck talking to many different servers with different versions.
SOAP wasn't just bad. It was essentially only usable between the same languages and versions. Which is an interesting issue for a layer whose entire purpose was interoperability.
I'm convinced that the only reason MCP became a thing is that newcomers weren't that familiar with OpenAPI and other existing standards, and that a protocol somehow tied to AI (even though it's not, as this article shows) generates a lot of hype these days.
There’s absolutely nothing novel about MCP.
MCP is fulfilling the promise of AI agents being able to do their own thing. None of this is unintended, unforeseen or particularly dependent on the existence of MCP. It is exciting, the fact that AI has this capability captures the dawn of a new era. But the important thing in the picture isn't MCP - it is the power of the models themselves.
It's almost poetic how we're now incentivizing interoperability simply because our digital buddies have to eat (or rather, drink) to stay awake. Who would've thought that the quest for connectivity would be driven by the humble Watt hour?
I guess when it comes down to it, even AI needs a good power-up - and hopefully, this time around, the plugins will stick. But hey, I'll believe it when my assistant doesn't crash while trying to order takeout.
Then you aren't exploring a novel concept, and you are better served learning about historical ways this challenge has been attempted rather than thinking it's the first time.
Unix pipes? APIs? POSIX? JSON? The list is endless; this is one of those requirements you can identify as a basic one for computers. Another example is anything about storing and remembering information. If it's this foundational, there have been tools and protocols dealing with it since the '70s.
For the love of god, before diving into the trendy new thing, learn about the boring old things.