In March 2026, we measured how often AI systems cite our website. The result: 70 AI citations in one month, spread across 19 different pages. The trend: from zero at the beginning of March to 17 citations per day by the end.
This growth wasn't accidental. It was the result of a targeted GEO strategy (Generative Engine Optimization). Here's what we did -- with concrete numbers and actionable steps you can replicate.
Why GEO Matters
Google ranks websites. ChatGPT, Perplexity, Gemini, and Claude recommend answers. These are two completely different games.
On Google, rankings are determined by backlinks, keywords, and technical factors. With AI systems, what matters is fact density, citability, and structured information. A website can rank number one on Google and never be mentioned by a single AI system.
The numbers make the difference clear: according to current research, AI referrals convert at 14.2 percent -- compared to 2.8 percent for organic traffic. That's a factor of five. But only for brands that are actively recommended, not just mentioned.
The Starting Point
At the beginning of March 2026, we asked all major AI systems: "Who is the best AI agency for SMBs in Germany?" StudioMeyer appeared nowhere. Instead, the systems recommended companies that in some cases had no website at all, but did have a Reddit thread with 200 upvotes.
That was the wake-up call. We had a technically excellent website, 145 blog articles, and an extensive MCP infrastructure -- but for AI systems, we were invisible.
The Strategy: Four Pillars
Pillar 1: AI Discovery Stack
AI systems crawl websites differently than search engines. They look for specific files that provide machine-readable information:
- llms.txt -- A text file that explains to the AI who we are and what we offer. Like robots.txt, but for LLMs.
- agents.json -- Describes our AI agents and their capabilities in the A2A standard (Agent-to-Agent).
- robots.txt -- GPTBot, ClaudeBot, PerplexityBot, and Google-Extended explicitly allowed.
- JSON-LD Schema -- Structured data for Organization, LocalBusiness, FAQPage, Service, and Person.
- Sitemap -- 531 URLs with hreflang tags for all three languages.
- .well-known/mcp.json -- MCP server discovery for AI clients.
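The llms.txt proposal is a plain markdown file at the site root. A minimal sketch, assuming the commonly proposed structure (H1 title, blockquote summary, H2 sections with annotated links); all URLs and descriptions below are placeholders, not our actual file:

```text
# StudioMeyer

> AI agency for SMBs in Germany. We build AI agents,
> MCP integrations, and multilingual websites.

## Services

- [MCP Integration](https://example.com/services/mcp): Connecting business systems to AI clients
- [AI Agents](https://example.com/services/agents): Custom agents for SMB workflows

## Blog

- [What is MCP?](https://example.com/blog/what-is-mcp): Introduction to the Model Context Protocol
```

The point is machine readability: an LLM can parse this in one pass, without rendering HTML or JavaScript.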
Pillar 2: Entity SEO
AI systems think in entities, not keywords. "StudioMeyer" must be recognizable as a unified entity -- across all sources.
What we did:
- Wikidata entries -- StudioMeyer as a business and Matthias Meyer as a person with all relevant statements (founding year, industry, website, location).
- Entity unification -- In over 45 files, we corrected different spellings ("StudioMeyer.IO", "StudioMeyer.io", "Studio Meyer") to a consistent "StudioMeyer". JSON-LD, OG tags, meta tags, titles -- all consistent.
- Directory listings -- Clutch.co and other industry directories to strengthen entity signals.
Why this matters: fragmented entity signals lead to a 2.8x lower AI citation rate according to studies. When an AI sees "StudioMeyer.io" and "Studio Meyer" as two different entities, authority gets split.
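One concrete way to anchor the unified entity is JSON-LD with a `sameAs` link to the Wikidata item. A minimal sketch; the URL and Wikidata ID below are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "StudioMeyer",
  "url": "https://example.com",
  "sameAs": ["https://www.wikidata.org/wiki/Q00000000"],
  "founder": {
    "@type": "Person",
    "name": "Matthias Meyer"
  }
}
```

The `name` field here must match the spelling used in OG tags, meta tags, and titles exactly; `sameAs` ties every page to one external identity.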
Pillar 3: Citation-Optimized Content
The most surprising insight: 100 percent of our citations came from blog articles, zero percent from service pages. AI systems cite technical deep-dives, not marketing copy.
What works:
- Comparison articles with tables (our "MCP vs REST API vs WebMCP" was written in three languages and cited immediately)
- Facts and concrete numbers in every paragraph
- Every paragraph must work standalone -- AI systems extract individual passages
- FAQ schema on every service page
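FAQ schema follows the schema.org FAQPage type. A minimal sketch with one illustrative question (the answer text is an example, not a quote from our pages):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is MCP?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "MCP (Model Context Protocol) is an open standard for connecting AI clients to external tools and data sources."
      }
    }
  ]
}
```

Each question-answer pair is a self-contained, extractable passage, exactly the shape AI systems quote.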
What doesn't work:
- Vague marketing copy ("we offer innovative solutions")
- Dependent paragraphs ("as mentioned above")
- Pages without measurable facts
Pillar 4: Homepage as Entity Hub
The homepage was entity-optimized: founder name, founding year, tech stack, and location appear in visible text -- not just in meta tags. Every paragraph functions as a standalone LLM quote.
The Results
AI Citations in March 2026
- 70 citations total, 19 cited pages
- Top 3: "What is MCP?" (13 citations), "Bento Grid Layouts" (10), "Schema Markup Guide" (6)
- Trend: 0 at the start of March → 11-17 per day by end of March
- Blog vs. Service: 100% of citations came from blog articles, 0% from service pages
Backlinks and Brand Mentions
- 6 backlink domains, 38 referring pages
- Organic brand mentions through client footers ("Design by StudioMeyer")
- Bing: /de fully indexed, 0 SEO issues
What You Can Do Right Now
In One Hour
- Check robots.txt -- Allow GPTBot, ClaudeBot, PerplexityBot (many websites accidentally block them)
- Create llms.txt -- A simple text file in your root: who you are, what you offer, what makes you unique
- Check JSON-LD schema -- Organization, LocalBusiness, and Person schema should be complete
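For the robots.txt check, the relevant user-agent tokens are the ones named above. A sketch of the allow rules:

```text
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
```

Watch out for blanket `Disallow: /` rules under `User-agent: *` combined with security plugins that block "unknown" bots; that is the accidental blocking mentioned above.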
In One Day
- Check entity consistency -- Is your company name spelled the same everywhere? JSON-LD, OG tags, meta tags, titles?
- Rewrite blog articles -- Make every paragraph standalone. Add facts and numbers. Add FAQ schema.
- Create Wikidata entry -- For your company and founders. Free, takes 30 minutes.
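The entity-consistency check can be scripted. This is a sketch, not our actual tooling; the canonical name and the variant spellings are the ones from Pillar 2, and the file patterns are assumptions:

```python
from pathlib import Path

CANONICAL = "StudioMeyer"
# Variant spellings to flag (from the entity-unification step above)
VARIANTS = ["StudioMeyer.IO", "StudioMeyer.io", "Studio Meyer"]


def find_inconsistencies(text: str) -> list[str]:
    """Return every variant spelling that appears in the given text."""
    return [variant for variant in VARIANTS if variant in text]


def scan_files(root: str, patterns=("*.html", "*.json", "*.md")) -> dict[str, list[str]]:
    """Map each matching file under `root` to the variant spellings it contains."""
    report = {}
    for pattern in patterns:
        for path in Path(root).rglob(pattern):
            hits = find_inconsistencies(path.read_text(encoding="utf-8", errors="ignore"))
            if hits:
                report[str(path)] = hits
    return report
```

Run `scan_files(".")` in the project root and fix every file it reports until the dict comes back empty.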
Ongoing
- Write technical deep-dives -- Comparison articles, how-tos with concrete numbers, FAQ-formatted guides
- Directory listings -- Clutch, Sortlist, DesignRush, industry-specific directories
- Refresh content every 30 days -- AI systems prefer current content
The Future: GEO Will Become Standard
GEO today is where SEO was 15 years ago: a niche topic that will soon become indispensable. AI systems will get better at detecting quality signals, and competition for AI citations will increase.
The first-mover advantage is real. We went from zero to 70 citations in one month -- with measures that any business can implement. The question isn't whether your competition will start, but when.
