What Is WebMCP? The New Language Between Websites and AI
AI & Automation · February 15, 2026 · 10 min read · by Matthias Meyer


SEO made websites visible to Google. WebMCP makes them usable for AI agents. What agents.json is and why the direction is clear.

Imagine someone asks ChatGPT: "Find me a web designer in Mallorca who also builds AI chatbots." What happens next? The agent searches the web, finds websites, reads their content -- and tries to understand what these businesses offer. For most websites, it has to guess. It parses HTML, interprets text, draws conclusions from headings, and hopes it gets it right.

That's roughly as efficient as hiring a new employee and telling them: "Read our website and guess what we do." It works, sort of. But there's a better way.

The SEO Parallel -- 20 Years Later

In the early 2000s, there was a similar problem. Search engines could find websites but couldn't properly understand them. The solution: SEO. Meta tags, structured data, clean HTML. Websites learned to speak Google's language.

WebMCP is the same concept -- but for AI agents instead of search engines.

SEO = Optimizing websites for search engines. WebMCP = Optimizing websites for AI agents.

The difference: search engines index and link. AI agents want to act. They don't just want to know a restaurant exists -- they want to book a table. They don't just want to see prices -- they want to request a quote. They don't just want to browse portfolios -- they want to find the right project for their client.

What WebMCP Means Technically

WebMCP stands for "Web Model Context Protocol." It sounds technical, but the core is simple: it's a proposal for how websites should tell AI agents what they can do and how to interact with them.

At its heart are two files that live under /.well-known/ on a website:

agents.json -- The Menu for AI Agents

The agents.json file lists what services a website offers and how an agent can access them. It describes:

  • Name and description of the business
  • Available tools with endpoint, method, and parameters
  • Capabilities like supported languages or protocols

When an AI agent fetches studiomeyer.io/.well-known/agents.json, it doesn't get vague marketing copy. It gets a clear list: "Here you can filter portfolios, here you can request a quote, here you can book a consultation."
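What might such a file look like? Here's a minimal sketch. The exact schema is still evolving with the community spec, so treat the field names and the endpoint as illustrative, not authoritative:

```json
{
  "name": "StudioMeyer",
  "description": "AI-first digital studio for web design and AI chatbots",
  "tools": [
    {
      "name": "request_quote",
      "description": "Request a price estimate for a website project",
      "endpoint": "/api/v1/quote",
      "method": "POST",
      "parameters": {
        "pages": "integer",
        "features": "array of strings, e.g. [\"blog\", \"contact-form\"]"
      }
    }
  ]
}
```

The point isn't the exact keys -- it's that an agent can read this in one request instead of parsing your entire site.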

agent-card.json -- The Resume

While agents.json describes the available tools, agent-card.json is more like a profile. It follows the A2A protocol (Agent-to-Agent) and describes:

  • Skills of the business (what can it do?)
  • Input/output formats (how does it communicate?)
  • Protocol version and capabilities

Together, these two files form a machine-readable interface. No interpretation needed, no guessing, no HTML parsing.
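For comparison, a rough agent-card.json sketch. This follows the shape of the A2A draft (v0.3.0), but field names may shift as the spec matures, and the values here are placeholders:

```json
{
  "protocolVersion": "0.3.0",
  "name": "StudioMeyer",
  "description": "Web design and AI automation studio",
  "url": "https://studiomeyer.io",
  "capabilities": { "streaming": false },
  "defaultInputModes": ["text/plain", "application/json"],
  "defaultOutputModes": ["application/json"],
  "skills": [
    {
      "id": "web-design",
      "name": "Web Design",
      "description": "Custom websites for SMBs",
      "tags": ["design", "web"]
    }
  ]
}
```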

Let's Be Honest: The Standard Is Still Young

Now for the part many marketing articles leave out. WebMCP is not a finished W3C standard supported by all browsers and agents. As of February 2026:

  • There's a W3C Community Group working on the Web Model Context Protocol
  • The agents.json specification is maintained as a community project on GitHub (nicepkg/agents.json)
  • Google has announced initial experiments with navigator.modelContext in Chrome 146
  • The A2A protocol (Agent-to-Agent, v0.3.0) is further along but still a draft

What does this mean practically? Not every AI agent supports agents.json. ChatGPT, Claude, and others can already analyze websites and use APIs -- but they don't automatically look for an agents.json file. That's coming.
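This is also why agent-side code today has to probe rather than assume. A minimal sketch of what that looks like -- the helper names are my own, and the fallback path (plain HTML parsing) is exactly what agents do now:

```python
import json
import urllib.parse
import urllib.request


def well_known_url(site_url: str) -> str:
    """Build the conventional agents.json location for a site."""
    parts = urllib.parse.urlsplit(site_url)
    return f"{parts.scheme}://{parts.netloc}/.well-known/agents.json"


def fetch_agents_manifest(site_url: str, timeout: float = 5.0):
    """Try the well-known manifest first. Return None if it's missing,
    so the caller can fall back to parsing HTML like agents do today."""
    try:
        with urllib.request.urlopen(well_known_url(site_url), timeout=timeout) as resp:
            return json.load(resp)
    except (OSError, ValueError):
        return None
```

If the manifest exists, the agent gets structure for free; if not, nothing breaks. That graceful degradation is what makes deploying the file today a no-risk move.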

The situation is reminiscent of robots.txt in the 90s: first came the crawlers, then the need for a way to communicate with them. The crawlers are here. The communication is being standardized.

Why the Direction Is Clear Regardless

Three reasons why WebMCP isn't hype but a logical evolution:

1. AI Agents Need Structured Data

LLMs can read and interpret HTML. But it's error-prone, slow, and unreliable. Just as search engines didn't stay with pure HTML and required structured data (Schema.org, JSON-LD), AI agents need structured endpoints.

2. The Major Players Are Investing

Anthropic released MCP (Model Context Protocol) as an open-source standard. Google is working on WebMCP integration in Chrome. Microsoft is integrating agent capabilities into Copilot. When the three largest AI companies are heading in the same direction, that's not a coincidence.

3. Agent Traffic Is Growing Exponentially

AI agents still account for a small fraction of web traffic. But the growth curve is steep. Every other ChatGPT user employs web search. Claude can analyze websites. And this is just the beginning -- agents acting on behalf of users will become the norm.

What WebMCP Means for Businesses

The practical question: what does this give me today?

Short Term (2026): Preparation

  • Build structured APIs: Not just beautiful websites, but machine-readable endpoints
  • Deploy agents.json and agent-card.json: Costs almost nothing, positions you as an early adopter
  • Maintain structured data: JSON-LD, Schema.org -- this pays into SEO and AI readiness
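The structured-data piece is the most mature of the three, since Schema.org JSON-LD is already a settled standard. A small example of the kind of block you'd embed in a page's script tag with type "application/ld+json" (the business details here are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "StudioMeyer",
  "description": "Web design and AI chatbot development",
  "address": {
    "@type": "PostalAddress",
    "addressRegion": "Illes Balears",
    "addressCountry": "ES"
  },
  "areaServed": "Mallorca"
}
```

Search engines consume this today; AI agents increasingly read the same markup, so one investment serves both.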

Medium Term (2027-2028): Early-Mover Advantage

  • AI agents will increasingly look for agents.json
  • Websites with structured endpoints will be preferred
  • Those who build early will have experience when the market picks up

Long Term: A New Traffic Source

  • Agent traffic as its own channel alongside organic and paid
  • Transactions directly via API endpoints
  • AI agents as a sales channel

Five Use Cases That Work Today

WebMCP isn't just theory. Here are five scenarios that are already possible with an agents.json file and matching API endpoints:

1. Portfolio Discovery: An AI agent searches for a web designer on behalf of its user. Instead of crawling websites, it queries agents.json, filters by industry and style, and presents matching projects.

2. Automatic Price Calculation: "What does a website with 10 pages, blog, and contact form cost?" The agent sends a structured request to the /api/v1/quote endpoint and gets an estimate back.
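On the server side, a quote endpoint like that can be a small, deterministic function. The pricing model below is entirely made up to show the shape of a structured request turning into a structured response:

```python
# Hypothetical handler logic for a /api/v1/quote endpoint.
# All prices are invented for illustration.
BASE_PRICE = 1200          # base price in EUR (assumption)
PRICE_PER_PAGE = 150       # per page beyond the first (assumption)
FEATURE_PRICES = {"blog": 400, "contact-form": 150}


def estimate_quote(pages: int, features: list[str]) -> dict:
    """Turn a structured agent request into a structured estimate."""
    total = BASE_PRICE + PRICE_PER_PAGE * max(pages - 1, 0)
    total += sum(FEATURE_PRICES.get(f, 0) for f in features)
    return {
        "currency": "EUR",
        "estimate": total,
        "pages": pages,
        "features": features,
    }
```

An agent sends `{"pages": 10, "features": ["blog", "contact-form"]}` and gets a number back -- no form, no phone call, no guessing from a pricing page.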

3. Appointment Booking: The agent books a consultation directly, without the user having to fill out a form.

4. Website Audit: An agent checks whether a website is AI-Ready and delivers a score with specific improvement suggestions.

5. Tool Generation: An agent generates agents.json files for other businesses -- based on industry and business model.

These aren't hypothetical scenarios. StudioMeyer's own agents.json implements tools covering exactly these use cases. Are they being used en masse today? No. Are they ready when usage comes? Yes.

The robots.txt Analogy

In 1994, Martijn Koster proposed robots.txt. Back then, most websites didn't have one. Why would they -- there were barely any crawlers. Then came Google, Yahoo, and the others. And suddenly robots.txt wasn't optional but essential.

With WebMCP, we're roughly in the mid-90s, translated to AI agents. The agents are here. The standards are emerging. Adoption will follow.

The difference: in the 90s, the transition took 10 years. In the AI era? More like 2-3.

What This Means for Your Website

Three concrete steps you can take today:

Step 1: Check your APIs. Does your website have structured endpoints? Or just HTML pages? If the latter: start small. An /api/services endpoint that returns your services as JSON is a beginning.
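How small can that beginning be? Here's a sketch using nothing but Python's standard library -- the service catalogue is a placeholder you'd replace with your real offering:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative catalogue; swap in your actual services.
SERVICES = [
    {"id": "web-design", "name": "Web Design", "from_eur": 1200},
    {"id": "ai-chatbot", "name": "AI Chatbot Development", "from_eur": 900},
]


class ServicesHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/services":
            body = json.dumps(SERVICES).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)


# To serve locally:
# HTTPServer(("", 8000), ServicesHandler).serve_forever()
```

In production you'd use your existing framework, but the principle holds: one endpoint, one JSON response, and an agent can read your offering without touching your HTML.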

Step 2: Create an agents.json. List what your website can do. What information can an agent retrieve? What actions can it perform? The file format is simple -- a JSON array with tools, endpoints, and parameters.

Step 3: Measure. Who's hitting your API endpoints? How is agent traffic developing? This data will be decisive in the coming years.
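Measuring doesn't require new tooling either; your access logs already contain the signal. A rough sketch of classifying hits -- the user-agent substrings below are examples of known AI crawlers, not an exhaustive or authoritative list:

```python
# Example markers of AI-related crawlers; extend as new agents appear.
AGENT_MARKERS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended")


def is_agent_hit(user_agent: str) -> bool:
    return any(marker.lower() in user_agent.lower() for marker in AGENT_MARKERS)


def count_agent_hits(log_entries: list[dict]) -> dict:
    """Count agent vs. other hits on API and .well-known paths."""
    counts = {"agent": 0, "other": 0}
    for entry in log_entries:
        if entry["path"].startswith(("/api/", "/.well-known/")):
            key = "agent" if is_agent_hit(entry["user_agent"]) else "other"
            counts[key] += 1
    return counts
```

Track that ratio month over month and you'll see the curve the rest of this article is describing.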

Conclusion: Not If, but When

WebMCP isn't a question of if, but when. The infrastructure is being built. Standards are emerging. AI agents are getting better. And at some point -- probably faster than most think -- the question won't be "Do I need agents.json?" but "Why didn't I do this sooner?"

SEO was the same in the 2000s. Those who invested early had an advantage that lasted for years. WebMCP won't be any different.

The technology is ready. Adoption is coming. The only question is: are you?

Matthias Meyer



Founder & AI Director at StudioMeyer. Has been building websites and AI systems for 10+ years. Based on Mallorca for 15 years, he runs an AI-first digital studio with its own agent fleet, 680+ MCP tools, and 5 SaaS products for SMBs and agencies across DACH and Spain.

webmcp · ai-ready · agents-json · ai-agents · web-standard