Chapter 1 of 9 · Model Context Protocol
The room before the protocol
Before MCP, every AI integration was a one-off — and your terminal already hides the receipt.
You ask Claude to summarise your Notion doc. Two months ago it couldn't. What changed wasn't the model — it was the wiring between the model and your data.
And if you read this as "oh, function-calling, but newer", you're missing what the wiring used to cost.
The bill nobody added up
Before late 2024, every pairing of an AI app and an external system was a custom adapter. Cursor wanted to read your filesystem? Cursor wrote that adapter. Claude Desktop wanted the same? Anthropic's team wrote a different one. Your internal Slack bot wanted it? You wrote a third.
With M hosts and N tools, you got M × N adapters. Three hosts × five tools is fifteen integrations to author and — quietly worse — fifteen integrations to keep alive when any one of them changes shape.
Crank M to 4 and N to 9 — a small team with a handful of SaaS surfaces — and the count is 36. Toggle the protocol on and it collapses to 13. Same hosts. Same tools. One layer between them.
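The arithmetic above is worth a two-line sanity check. A throwaway sketch (the function names are ours, not the spec's):

```python
def adapters_without_protocol(hosts: int, tools: int) -> int:
    # Every host writes its own adapter for every tool: M x N.
    return hosts * tools

def adapters_with_protocol(hosts: int, tools: int) -> int:
    # Each host implements the protocol once, each tool exposes
    # itself once, and they meet in the middle: M + N.
    return hosts + tools

print(adapters_without_protocol(3, 5))  # 15
print(adapters_without_protocol(4, 9))  # 36
print(adapters_with_protocol(4, 9))     # 13
```

Note that the second count grows linearly as you add hosts or tools, while the first grows multiplicatively; that gap is the whole pitch.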
USB-C for AI
That's the elevator pitch the spec leans on, and the metaphor is load-bearing. MCP — the Model Context Protocol — is the single shared shape that sits between hosts and tools, so every host learns the protocol once and every tool exposes itself once. The cable in the middle does the translation. Anthropic shipped the open spec in November 2024; by Q1 2026 the supported-clients list reads like a roll call of current AI tooling: Claude, ChatGPT, VS Code, Cursor, Zed, Replit, Codeium, Sourcegraph.
The metaphor breaks where you'd expect — USB-C carries power and bytes; MCP carries capabilities, and capabilities have to be negotiated. We'll get there. For now, hold onto the shape: one protocol, M + N integrations, every host and every tool meeting in the middle.
How we got here
The protocol didn't arrive in a vacuum. Each of the four pre-MCP eras moved some of the burden — but the M × N never collapsed.
Read those panels in order and a pattern shows up: every era moved the burden one layer further from the application code, but always stopped short of the boundary that mattered. Schemas got typed but stayed inside one host's SDK. Frameworks abstracted vendors but stayed inside one runtime. The integration point — the place where a host meets a tool — kept getting re-implemented per host.
MCP closes that boundary by naming three roles — a host (Claude Desktop, Cursor, VS Code), a client (the per-server connection manager that lives inside the host), and a server (the process that exposes a tool, a resource, or a prompt). Chapter 2 formalises these. For now, it's enough to know that one protocol speaks across all three.
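As a rough mental model of those three roles (a sketch in plain Python; the class names and wiring are ours, not any SDK's), the host owns one client per server it connects to:

```python
class Server:
    """A process that exposes a tool, a resource, or a prompt."""
    def __init__(self, name: str):
        self.name = name

class Client:
    """The per-server connection manager living inside the host."""
    def __init__(self, server: Server):
        self.server = server

class Host:
    """The AI application: Claude Desktop, Cursor, a Slack bot."""
    def __init__(self):
        self.clients: list[Client] = []

    def connect(self, server: Server) -> None:
        # One client instance per server connection.
        self.clients.append(Client(server))

host = Host()
host.connect(Server("filesystem"))
host.connect(Server("postgres"))
print(len(host.clients))  # 2
```

The point of the shape: the host never talks to a server directly; it always goes through a client, and it holds as many clients as it has servers.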
First glimpse: a message on the wire
You won't learn the wire here — that's chapter 3 — but it's worth seeing one message before we go further. When a host first connects to a server, one of the earliest things it asks is tools/list: what can you do?
```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/list"
}
```

Three fields. JSON-RPC 2.0, a decade and a half older than the protocol that uses it. The server replies with the list of tools it offers, each described in a shape the client can render. We'll pull that response apart in chapter 3, decode it inside the capability handshake in chapter 4, and watch a server hand-author one in chapter 6.
For this chapter, what matters is that this same message works whether the host is Claude or Cursor or your Slack bot, and whether the tool is filesystem search or a Postgres query or a flight-booking API. The cable in the middle no longer cares.
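To see that exchange in motion, here's a minimal sketch using nothing but the standard library: the request as a host would serialize it, plus a hypothetical server reply (the tool name and description are invented for illustration, not taken from any real server):

```python
import json

# The request a host sends soon after connecting (chapter 3 covers transport).
request = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

# Over the stdio transport, messages travel as one JSON object per line.
line = json.dumps(request) + "\n"

# A hypothetical reply, shaped like the spec's result envelope.
reply_line = (
    '{"jsonrpc": "2.0", "id": 2, "result": {"tools": ['
    '{"name": "search_files", "description": "Search the local filesystem"}'
    ']}}'
)
reply = json.loads(reply_line)

# Responses are matched to requests by id, not by arrival order.
assert reply["id"] == request["id"]
for tool in reply["result"]["tools"]:
    print(tool["name"])  # search_files
```

Nothing here is host-specific or tool-specific: any host can emit that request, and any server can answer it, which is the whole point of the shared shape.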
Predict the count
One last check before we move on. A new SaaS API ships tomorrow. Your team — Claude Desktop, Cursor, and an internal Slack bot — wants to use it. Without the protocol, that's three adapters, one per host. With it, the count is one: a single server that all three hosts connect to.
That ratio — one server, N hosts — is what makes MCP a protocol instead of a framework. A framework lives inside one runtime. A protocol crosses runtimes by naming the shape they have to share.
What's next
We've named the problem and watched it collapse. Now: what does the solution actually look like at the wire — what are the parts, what does each one do, and who talks to whom? Chapter 2: three roles, one connection →