AI & Automation · April 3, 2026 · 9 min read · by Matthias Meyer

Google-Agent: The New User Agent for AI Agents Explained

Google-Agent is the first user agent for autonomous AI agents on the web. This article covers what it does, how Web Bot Auth works, and what website owners should prepare now.

Google-Agent is a new user agent from Google that identifies AI agents autonomously navigating the web on a user's behalf — reading pages, filling forms, comparing prices, and executing purchases. Officially documented since March 20, 2026, it is not a crawler like Googlebot but a User-Triggered Fetcher that responds to direct user requests. Google-Agent is the first to use the Web Bot Auth protocol for cryptographic bot verification.

If you've been watching Google's crawler documentation over the past months, you know the pattern: a new user agent pops up every few weeks. But Google-Agent is different. It's the first official identifier for AI agents that autonomously act on the web — and it comes with a cryptographic identity card.

What is Google-Agent and how does it differ from Googlebot?

Google-Agent is a User-Triggered Fetcher — meaning it only visits websites when a real human gives it a task. This is a fundamental difference from Googlebot, which crawls automatically in the background.

| Property | Googlebot | Google-Agent |
| --- | --- | --- |
| Type | Crawler | User-Triggered Fetcher |
| Trigger | Automatic (crawl schedule) | User request ("Book me a hotel") |
| What it does | Reads pages, indexes | Navigates, clicks, fills forms, purchases |
| IP ranges | common-crawlers.json | user-triggered-agents.json (new, separate) |
| robots.txt | Respects it | Ignores it (user-triggered) |
| Identification | User-Agent + reverse DNS | User-Agent + Web Bot Auth (cryptographic) |

Desktop User-Agent string:

```
Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Google-Agent; +https://developers.google.com/crawling/docs/crawlers-fetchers/google-agent) Chrome/W.X.Y.Z Safari/537.36
```
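For log analysis, the two fetchers can be told apart with a simple substring check on the UA string. A minimal sketch (the UA alone is trivially spoofable, so treat this as a logging hint, not an access-control decision; Web Bot Auth is the strong check):

```python
def classify_google_ua(user_agent: str) -> str:
    """Rough first-pass classification of Google fetchers by UA substring.

    A UA string can be faked by anyone, so this is only suitable for
    tagging log lines, never for granting or denying access.
    """
    if "Google-Agent" in user_agent:
        return "google-agent"   # user-triggered AI agent
    if "Googlebot" in user_agent:
        return "googlebot"      # classic crawler
    return "other"

ua = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
      "Google-Agent; +https://developers.google.com/crawling/docs/"
      "crawlers-fetchers/google-agent) Chrome/W.X.Y.Z Safari/537.36")
print(classify_google_ua(ua))  # google-agent
```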

What is Project Mariner and why does it matter?

Project Mariner is Google's AI agent that uses the Google-Agent user agent. Introduced at Google I/O 2025, it's based on Gemini 2.0 and can:

  • Navigate the web autonomously — open websites, follow links, scroll
  • Fill out forms — contact forms, booking forms, checkout processes
  • Compare prices — visit multiple shops and summarize results
  • Execute actions — prepare bookings, fill shopping carts

Mariner works on an Observe-Plan-Act loop: it analyzes page content, plans a sequence of steps, and executes them, always under user control. Since the I/O update, Mariner runs in cloud VMs and can handle up to ten tasks in parallel.

Current status (April 2026): Available only in the US, and only for Google AI Ultra subscribers ($249.99/month). A European rollout has been announced, but without a concrete date. Based on past patterns, Europe typically follows a few months after the US launch.

What is the Web Bot Auth protocol?

Perhaps the most important innovation: Google is experimenting with cryptographic bot verification for the first time. Instead of identifying itself only via a user-agent string (easily faked), Google-Agent signs its HTTP requests with a private key.

How it works:

  1. Google-Agent has a key pair (private key + public key)
  2. Every HTTP request is signed — the private key creates a cryptographic signature
  3. The server verifies — using the public key from a public directory
  4. Result: Mathematical proof that the request really comes from Google
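Web Bot Auth builds on the IETF HTTP Message Signatures work (RFC 9421): the signer covers selected request components in a canonical "signature base" string, signs that string, and the verifier reconstructs the same base from the incoming request to check the signature. The exact component list and parameters Google uses are not public, so the values below are illustrative. A simplified sketch of building such a base:

```python
def signature_base(components: dict[str, str], key_id: str, created: int) -> str:
    """Build a simplified RFC 9421-style signature base (illustrative).

    Each covered component becomes one line of the form `"name": value`,
    followed by an @signature-params line that pins the component list,
    creation time, and key id. The signer signs exactly this string; the
    verifier rebuilds it from the request and verifies the signature.
    """
    lines = [f'"{name}": {value}' for name, value in components.items()]
    params = (
        "(" + " ".join(f'"{name}"' for name in components) + ")"
        + f';created={created};keyid="{key_id}"'
    )
    lines.append(f'"@signature-params": {params}')
    return "\n".join(lines)

base = signature_base(
    {"@authority": "shop.example", "@method": "GET", "@path": "/checkout"},
    key_id="agent.bot.goog-key-1",  # hypothetical key id
    created=1774000000,
)
print(base)
```

The actual signing step (Ed25519 over this base) and the public-key lookup are handled by the Web Bot Auth implementation; the point of the sketch is that both sides must derive byte-identical input for the math to work.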

The signing identity is https://agent.bot.goog. Web Bot Auth itself is an IETF (Internet Engineering Task Force) draft, which makes it a potential internet standard.

Who already supports it:

  • Google (Google-Agent)
  • Cloudflare (WAF integration)
  • Akamai (bot management)
  • Amazon (AgentCore Browser)

Think of it as a digital passport for bots. Previously, any bot could claim to be Googlebot. With Web Bot Auth, identity is cryptographically secured and verifiable in real time.

What does Google-Agent mean for website owners?

robots.txt doesn't apply

Google-Agent is a User-Triggered Fetcher and ignores robots.txt — just like a regular browser. Because it acts on behalf of a real user, you cannot block Google-Agent via robots.txt.

Check your WAF and bot protection

Aggressive anti-bot measures (CAPTCHAs, JavaScript challenges, rate limiting) can block Google-Agent — frustrating the user who delegated a task. Review your WAF rules and ensure Google-Agent is not falsely blocked.
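One way to avoid false positives is to exempt requests that both claim the Google-Agent UA and carry HTTP Message Signature headers from interactive challenges. A sketch under that assumption; in production the presence check must be replaced by actual signature verification, or delegated to a WAF/CDN that supports Web Bot Auth, since headers alone are spoofable:

```python
def should_serve_challenge(headers: dict[str, str]) -> bool:
    """Decide whether to show a CAPTCHA/JS challenge (sketch only).

    Exempts requests that claim Google-Agent AND present the RFC 9421
    Signature / Signature-Input headers. Presence is not proof: a real
    deployment must verify the signature itself.
    """
    h = {k.lower(): v for k, v in headers.items()}  # normalize header names
    claims_google_agent = "Google-Agent" in h.get("user-agent", "")
    has_signature = "signature" in h and "signature-input" in h
    if claims_google_agent and has_signature:
        return False  # likely agent traffic acting for a user: don't block the task
    return True

req = {
    "User-Agent": "Mozilla/5.0 ... (compatible; Google-Agent; ...)",
    "Signature": "sig1=:MEUCIQ...:",
    "Signature-Input": 'sig1=("@authority" "@method");created=1774000000',
}
print(should_serve_challenge(req))  # False
```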

Structured data becomes a competitive advantage

AI agents use structured data to execute tasks: comparing product prices, checking availability, preparing bookings. Websites with clean Schema markup (Product, Offer, LocalBusiness, FAQ) are preferred — because the agent can extract data more reliably.
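For a shop page, that typically means schema.org Product markup with a nested Offer. A minimal JSON-LD example (product name, SKU, price, and URL are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Shoe",
  "sku": "EX-1042",
  "offers": {
    "@type": "Offer",
    "price": "89.90",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock",
    "url": "https://shop.example/trail-shoe"
  }
}
```

An agent comparing prices across shops can read `price`, `priceCurrency`, and `availability` directly instead of scraping them out of the page layout.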

agents.json and A2A protocol

The Agent-to-Agent (A2A) protocol by Google defines how AI agents communicate with each other. Websites with an agents.json file (at /.well-known/agents.json) signal to AI agents what capabilities they offer. This is the future of web interaction — no longer human-to-website, but agent-to-website.
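There is no single finalized schema for this file yet, so the following is a purely hypothetical sketch of what a minimal capability descriptor might look like (all field names and values are invented for illustration):

```json
{
  "name": "Example Shop",
  "description": "Outdoor gear store with live stock and checkout",
  "capabilities": ["product-search", "price-lookup", "checkout"],
  "endpoints": {
    "products": "https://shop.example/api/products"
  }
}
```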

Checklist: What to prepare now

Even though Google-Agent isn't active in Europe yet, preparation is simple and never hurts:

  1. Server logs: Track Google-Agent as a separate bot type (distinct UA string)
  2. WAF rules: Ensure Google-Agent is not blocked
  3. Schema markup: Product, Offer, LocalBusiness, FAQ — clean and current
  4. agents.json: Serve at /.well-known/agents.json (describes website capabilities)
  5. Forms: Semantic HTML, aria labels, clear field names — agents need to understand forms
  6. Web Bot Auth: Monitor — if you use Cloudflare or Akamai, it will be supported automatically
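For item 5, a form an agent can reliably understand pairs every input with a label and a descriptive `name`. A minimal example (field names are illustrative):

```html
<form action="/book" method="post">
  <label for="checkin">Check-in date</label>
  <input id="checkin" name="checkin_date" type="date" required>

  <label for="guests">Number of guests</label>
  <input id="guests" name="guest_count" type="number" min="1" max="8" required>

  <button type="submit">Request booking</button>
</form>
```

Typed inputs (`date`, `number` with `min`/`max`) and explicit labels give an agent the same cues they give screen readers: what each field is and what values are valid.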

Conclusion

Google-Agent marks the beginning of a new era: Agentic Search. Instead of presenting search results to humans and letting them click, Google sends an AI agent to complete the task. For website owners, this means: if you're machine-readable, you get the traffic. If you're not, you get skipped.

The good news: Everything that matters for SEO and AI visibility — structured data, semantic HTML, clear content — is also relevant for Agentic Search. Those who invest in Schema markup and AI-ready infrastructure today are prepared.

Further reading: What is the MCP Protocol? | Schema Markup Guide | FAQ and HowTo Schema

Matthias Meyer


Founder & AI Architect

Full-stack developer with 10+ years of experience in web design and AI systems. Builds AI-ready websites and AI automations for SMBs and agencies.

Tags: google-agent · agentic-search · web-bot-auth · project-mariner · ai-crawler · robots-txt