Google-Agent is a new user agent from Google that identifies AI agents autonomously navigating the web on a user's behalf — reading pages, filling forms, comparing prices, and executing purchases. Officially documented since March 20, 2026, it is not a crawler like Googlebot but a User-Triggered Fetcher that responds to direct user requests. It is also Google's first user agent to support the Web Bot Auth protocol for cryptographic bot verification.
If you've been watching Google's crawler documentation over the past months, you know the pattern: a new user agent pops up every few weeks. But Google-Agent is different. It's the first official identifier for AI agents that autonomously act on the web — and it comes with a cryptographic identity card.
What is Google-Agent and how does it differ from Googlebot?
Google-Agent is a User-Triggered Fetcher — meaning it only visits websites when a real human gives it a task. This is a fundamental difference from Googlebot, which crawls automatically in the background.
| Property | Googlebot | Google-Agent |
|---|---|---|
| Type | Crawler | User-Triggered Fetcher |
| Trigger | Automatic (crawl schedule) | User request ("Book me a hotel") |
| What it does | Read pages, index | Navigate, click, fill forms, purchase |
| IP ranges | common-crawlers.json | user-triggered-agents.json (new, separate) |
| robots.txt | Respects | Ignores (user-triggered) |
| Identification | User-Agent + Reverse DNS | User-Agent + Web Bot Auth (cryptographic) |
Desktop User-Agent string:

```
Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Google-Agent; +https://developers.google.com/crawling/docs/crawlers-fetchers/google-agent) Chrome/W.X.Y.Z Safari/537.36
```
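A first-pass server-side check can key on the `Google-Agent` product token rather than the full string, since the Chrome version placeholder (`W.X.Y.Z`) varies per request. A minimal sketch:

```python
import re

# Match the "Google-Agent" product token documented in the UA string above.
# The Chrome version segment varies, so we key on the token alone.
GOOGLE_AGENT_RE = re.compile(r"\bGoogle-Agent\b")

def is_google_agent(user_agent: str) -> bool:
    """Rough first-pass check; the UA string is spoofable,
    so pair this with Web Bot Auth verification."""
    return bool(GOOGLE_AGENT_RE.search(user_agent))

ua = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
      "Google-Agent; +https://developers.google.com/crawling/docs/"
      "crawlers-fetchers/google-agent) Chrome/120.0.0.0 Safari/537.36")
print(is_google_agent(ua))  # → True
```

Remember: a matching string proves nothing by itself. That is exactly the gap Web Bot Auth closes.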
What is Project Mariner and why does it matter?
Project Mariner is Google's AI agent that uses the Google-Agent user agent. First shown as a research prototype in December 2024 and expanded at Google I/O 2025, it's based on Gemini 2.0 and can:
- Navigate the web autonomously — open websites, follow links, scroll
- Fill out forms — contact forms, booking forms, checkout processes
- Compare prices — visit multiple shops and summarize results
- Execute actions — prepare bookings, fill shopping carts
Mariner works on the Observe-Plan-Act principle: It analyzes page content, plans a sequence of steps, and executes them — always under user control. Since the I/O update, Mariner runs in Cloud VMs and can handle up to ten tasks in parallel.
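The Observe-Plan-Act principle can be sketched as a minimal control loop. The names and data shapes below are illustrative only, not Mariner's real API; a real agent would consult an LLM in the planning step:

```python
from dataclasses import dataclass, field

# Illustrative Observe-Plan-Act loop; all names are hypothetical,
# not Project Mariner's actual interface.
@dataclass
class AgentTask:
    goal: str
    steps: list = field(default_factory=list)
    done: bool = False

def observe(page: dict) -> dict:
    # Observe: extract the relevant state from the current page.
    return {"title": page["title"], "forms": page.get("forms", [])}

def plan(goal: str, observation: dict) -> list:
    # Plan: derive the next actions (a real agent would ask an LLM here).
    if observation["forms"]:
        return [f"fill form on {observation['title']}", "submit"]
    return [f"search for {goal}"]

def act(task: AgentTask, actions: list) -> None:
    # Act: execute the planned steps and record them on the task.
    task.steps.extend(actions)
    task.done = "submit" in actions

task = AgentTask(goal="book a hotel in Berlin")
page = {"title": "Hotel booking", "forms": ["checkin-form"]}
act(task, plan(task.goal, observe(page)))
print(task.steps)  # → ['fill form on Hotel booking', 'submit']
```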
Current status (April 2026): Available only in the US, only for Google AI Ultra subscribers ($249.99/month). European rollout is announced but without a concrete date. Based on past patterns, Europe typically follows a few months after the US launch.
What is the Web Bot Auth protocol?
Perhaps the most important innovation: Google is experimenting with cryptographic bot verification for the first time. Instead of identifying itself only via a user-agent string (easily faked), Google-Agent signs its HTTP requests with a private key.
How it works:
- Google-Agent has a key pair (private key + public key)
- Every HTTP request is signed — the private key creates a cryptographic signature
- The server verifies — using the public key from a public directory
- Result: Mathematical proof that the request really comes from Google
Google-Agent's signature identity is https://agent.bot.goog. Web Bot Auth itself is an IETF draft (Internet Engineering Task Force), making it a candidate for an open internet standard.
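Web Bot Auth builds on the IETF HTTP Message Signatures spec (RFC 9421). The server-side flow above can be sketched as: rebuild the signature base the agent signed over, then hand that base string plus the public key to a crypto library for the actual Ed25519 check (omitted here). The header values below are invented for illustration:

```python
# Sketch of the verification flow: rebuild the RFC 9421 signature base
# from the covered components, then verify it with a crypto library
# (Ed25519 check not shown). The request values are made up.

def signature_base(components: list, params: str) -> str:
    """One line per covered component, closed by the
    @signature-params pseudo-component, as in RFC 9421."""
    lines = [f'"{name}": {value}' for name, value in components]
    lines.append(f'"@signature-params": {params}')
    return "\n".join(lines)

covered = [
    ("@authority", "example.com"),
    ("signature-agent", "https://agent.bot.goog"),
]
params = '("@authority" "signature-agent");created=1735689600;keyid="test-key"'
base = signature_base(covered, params)
# `base` is what gets passed to the Ed25519 verify call along with the
# signature from the request's Signature header and the published public key.
```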
Who already supports it:
- Google (Google-Agent)
- Cloudflare (WAF integration)
- Akamai (bot management)
- Amazon (AgentCore Browser)
Think of it as a digital passport for bots. Previously, any bot could claim to be Googlebot. With Web Bot Auth, identity is cryptographically secured and verifiable in real time.
What does Google-Agent mean for website owners?
robots.txt doesn't apply
Google-Agent is a User-Triggered Fetcher and ignores robots.txt — just like a regular browser. Because it acts on behalf of a real user, you cannot block Google-Agent via robots.txt.
Check your WAF and bot protection
Aggressive anti-bot measures (CAPTCHAs, JavaScript challenges, rate limiting) can block Google-Agent — frustrating the user who delegated a task. Review your WAF rules and ensure Google-Agent is not falsely blocked.
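The safe pattern is: never allowlist on the User-Agent string alone, and never challenge a verified agent. A sketch of that gating logic, where `signature_valid` stands in for a real Web Bot Auth check:

```python
# Sketch of WAF gating logic. `signature_valid` stands in for a real
# cryptographic Web Bot Auth verification; the decision values are
# illustrative, not any vendor's rule syntax.

def waf_decision(user_agent: str, signature_valid: bool) -> str:
    claims_google_agent = "Google-Agent" in user_agent
    if claims_google_agent and signature_valid:
        return "allow"       # verified agent: skip CAPTCHA / JS challenge
    if claims_google_agent:
        return "challenge"   # claimed but unverified: could be a spoof
    return "default"         # normal traffic: existing rules apply

print(waf_decision("Mozilla/5.0 ... Google-Agent; ...", signature_valid=True))
# → allow
```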
Structured data becomes a competitive advantage
AI agents use structured data to execute tasks: comparing product prices, checking availability, preparing bookings. Websites with clean Schema markup (Product, Offer, LocalBusiness, FAQ) are preferred — because the agent can extract data more reliably.
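As a sketch, the Product/Offer markup mentioned above can be emitted as JSON-LD; the hotel data here is a placeholder, not a real listing:

```python
import json

# Minimal schema.org Product + Offer markup as JSON-LD.
# All values are placeholders for illustration.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Hotel Night",
    "offers": {
        "@type": "Offer",
        "price": "129.00",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}
# Embedded in the page head so agents can parse it without heuristics:
markup = f'<script type="application/ld+json">{json.dumps(product)}</script>'
```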
agents.json and A2A protocol
The Agent-to-Agent (A2A) protocol by Google defines how AI agents communicate with each other. Websites with an agents.json file (at /.well-known/agents.json) signal to AI agents what capabilities they offer. This is the future of web interaction — no longer human-to-website, but agent-to-website.
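What such a file might contain can only be sketched: the agents.json / agent-card schema is still evolving, so every field name below is a hypothetical placeholder, not a spec:

```python
import json

# Hypothetical agents.json payload. The A2A agent-card schema is still
# evolving; treat these field names as placeholders, not a standard.
agents_manifest = {
    "name": "Example Shop",
    "description": "Product search, price comparison, checkout",
    "capabilities": ["search", "compare-prices", "checkout"],
    "url": "https://example.com",
}
# Would be served at /.well-known/agents.json:
payload = json.dumps(agents_manifest, indent=2)
```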
Checklist: What to prepare now
Even though Google-Agent isn't active in Europe yet — preparation is simple and never hurts:
- Server logs: Track Google-Agent as a separate bot type (distinct UA string)
- WAF rules: Ensure Google-Agent is not blocked
- Schema markup: Product, Offer, LocalBusiness, FAQ — clean and current
- agents.json: Serve at /.well-known/agents.json (describes website capabilities)
- Forms: Semantic HTML, ARIA labels, clear field names — agents need to understand forms
- Web Bot Auth: Monitor — if you use Cloudflare or Akamai, it will be supported automatically
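The first checklist item, tracking Google-Agent in server logs, needs nothing more than a small log scan. A sketch assuming the common combined log format, with synthetic log lines:

```python
import re
from collections import Counter

# Count Google-Agent hits per path in a combined-format access log.
# The regex assumes the standard combined format; the log lines are synthetic.
LINE_RE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) [^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def google_agent_hits(log_lines):
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and "Google-Agent" in m.group("ua"):
            hits[m.group("path")] += 1
    return hits

log = [
    '1.2.3.4 - - [01/Apr/2026:12:00:00 +0000] "GET /products HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 ... Google-Agent; ..."',
    '5.6.7.8 - - [01/Apr/2026:12:00:01 +0000] "GET /products HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (X11) Firefox/121.0"',
]
print(google_agent_hits(log))  # → Counter({'/products': 1})
```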
Conclusion
Google-Agent marks the beginning of a new era: Agentic Search. Instead of presenting search results to humans and letting them click, Google sends an AI agent to complete the task. For website owners, this means: if you're machine-readable, you get the traffic. If you're not, you get skipped.
The good news: Everything that matters for SEO and AI visibility — structured data, semantic HTML, clear content — is also relevant for Agentic Search. Those who invest in Schema markup and AI-ready infrastructure today are prepared.
Further reading: What is the MCP Protocol? | Schema Markup Guide | FAQ and HowTo Schema
