When someone asks ChatGPT "Which agency builds great websites?", your company either appears in the answer -- or it doesn't. No ad, no ranking, no link. Either the AI knows you, or it doesn't.
That's Generative Engine Optimization (GEO). And most small businesses don't even know whether they appear in AI answers. StudioMeyer GEO is a free MCP server that checks exactly that -- directly in Claude, without a SaaS subscription, without a dashboard.
What is GEO and why does it matter?
Millions of people now use ChatGPT, Gemini, Perplexity, or Claude for questions they used to Google. The difference: Google shows ten links. An AI gives one answer -- and mentions zero to three brands.
If your brand isn't mentioned, you don't exist for these users. And unlike SEO, you can't buy your way into an AI answer.
GEO measures and improves your visibility in these AI answers. It's SEO for the next generation of search.
What StudioMeyer GEO actually does
12 Tools + 5 Workflows
GEO examines your website from several angles:
Base tools (7):
- geo_check -- Queries all available AI platforms about your brand and analyzes the responses. Score from 0-100.
- geo_discovery_stack -- Checks whether your website has the five layers of AI discoverability (llms.txt, agents.json, robots.txt, JSON-LD, sitemap)
- geo_calculate_score -- Calculates the weighted GEO score from all results
- geo_platforms -- Shows which AI platforms are configured
- geo_preview_prompts -- Shows the prompts that would be sent to the AIs (dry run)
- geo_analyze_response -- Analyzes an AI response for brand mentions, sentiment, and citations
- geo_recommendations -- Generates a prioritized action list sorted by effort and impact
Specialist tools (5):
- geo_robots_audit -- Checks your robots.txt against 14 AI bots (GPTBot, ClaudeBot, PerplexityBot, Google-Extended, etc.). Detects fatal errors like blocked directories.
- geo_llms_txt_validate -- Validates your llms.txt against the llmstxt.org specification. Checks every linked URL.
- geo_json_ld_audit -- Extracts and validates all structured data on your page. Shows missing recommended properties.
- geo_entity_consistency -- Scans your website for brand name variants. "MyCompany" vs "My Company" vs "mycompany" -- fragmented entities reduce AI citation rates by a factor of 2.8 (Semrush data).
- geo_content_freshness -- Audits content age via three signals: Last-Modified header, og:modified_time, and schema.org dateModified. Perplexity deprioritizes content older than two years.
Six of the twelve tools work without any API key. You can start immediately -- no signup, no credit card.
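To make the entity-consistency idea concrete, here is a minimal Python sketch of what geo_entity_consistency looks for -- a hypothetical illustration, not the tool's actual code. It collects every spacing and capitalization variant of a brand name that appears in page text:

```python
import re
from collections import Counter

def brand_variants(text: str, brand: str) -> Counter:
    """Count surface variants of a brand name in page text.

    Builds a pattern that tolerates arbitrary whitespace between the
    brand's characters and ignores case, so "MyCompany", "My Company",
    and "mycompany" are all caught as separate surface forms.
    """
    norm = re.sub(r"\s+", "", brand).lower()
    pattern = re.compile(
        r"\b" + r"\s*".join(re.escape(c) for c in norm) + r"\b",
        re.IGNORECASE,
    )
    return Counter(m.group(0) for m in pattern.finditer(text))

# Hypothetical page text with three fragmented spellings of one brand
html = "MyCompany builds sites. Visit My Company -- or mycompany.com."
print(brand_variants(html, "MyCompany"))
```

If this returns more than one variant, the AI may be splitting your brand into separate entities; the real tool additionally scores how badly the fragmentation hurts citation rates.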
5 Workflows (as slash commands)
Instead of calling tools one by one, you can use ready-made workflows:
- /geo_quick_wins -- Quick test in 30 seconds. Only API-free checks. Finds the most obvious issues.
- /geo_full_audit -- Complete audit with all specialist tools and AI queries. Strategic report.
- /geo_before_launch -- 8-point checklist before website launch. Block or pass.
- /geo_competitor_intel -- Which brands do the AIs mention instead of yours?
- /geo_track_over_time -- Automatically save results to StudioMeyer Memory. Trend analysis over months.
Installation: Two minutes
```shell
npm install -g mcp-geo
```
Claude Desktop configuration:
```json
{
  "mcpServers": {
    "geo": {
      "command": "npx",
      "args": ["mcp-geo"]
    }
  }
}
```
Optional: Set API keys for AI platforms as environment variables:
```shell
export OPENAI_API_KEY=sk-...       # ChatGPT
export GOOGLE_AI_API_KEY=...       # Gemini
export PERPLEXITY_API_KEY=pplx-... # Perplexity
```
Without API keys, six tools work immediately. For the full check (geo_check), you need at least one AI API key.
What you can actually do with it
Quick test: Am I visible?
"Check GEO visibility of mycompany.com for brand MyCompany in the web design industry."
Claude asks ChatGPT, Gemini, Perplexity, and Claude itself: "Which web design agencies do you recommend?" Then it analyzes the answers: Was your brand mentioned? In what context? Positive or negative?
Check robots.txt
"Check the robots.txt of mycompany.com."
Many websites accidentally block AI bots. A single entry in robots.txt can prevent ChatGPT from ever seeing your website. GEO checks 14 different AI bots and shows exactly who's blocked.
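A robots.txt that explicitly welcomes AI crawlers might look like this. The bot names are taken from the list geo_robots_audit checks; the Disallow line is an illustrative placeholder for a directory you genuinely want to keep private:

```txt
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

# Default rules for all other crawlers
User-agent: *
Allow: /
Disallow: /api/
```

Note the ordering matters to some parsers: an overly broad Disallow placed before the Allow rules is exactly the kind of fatal error the audit flags.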
Build your Discovery Stack
"Check the discovery stack of mycompany.com."
The Discovery Stack consists of five files that AI systems can read:
- llms.txt -- A description of your website for AI models
- agents.json -- Machine-readable service description
- robots.txt -- Access rules for AI bots
- JSON-LD -- Structured data (Schema.org)
- Sitemap -- Page structure for crawlers
GEO checks all five layers and shows what's missing and how important it is.
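A minimal llms.txt, following the llmstxt.org format (an H1 title, a blockquote summary, then sections of annotated links), could look like this. All names and URLs here are placeholders:

```markdown
# MyCompany

> MyCompany is a web design agency focused on small businesses.
> We build fast, accessible websites and maintain them long-term.

## Services

- [Web design](https://mycompany.com/services): Custom websites for small businesses
- [Pricing](https://mycompany.com/pricing): Packages and monthly plans

## About

- [Team](https://mycompany.com/about): Who we are and how we work
```

geo_llms_txt_validate checks exactly this structure and, as noted above, also fetches every linked URL.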
Before launching a website
"Run the pre-launch check for mycompany.com."
Eight points are checked. A blocked robots.txt is a blocker -- the site shouldn't go live until AI bots can access it. Outdated content (older than six months) is a warning.
What GEO costs -- and what the alternatives cost
| Tool | Price/month | MCP-native | Discovery Stack | Fix guidance |
|---|---|---|---|---|
| Ahrefs Brand Radar | EUR 654 | No | No | No |
| Profound | $399-989 | No | No | No |
| Peec AI | EUR 89-499 | No | No | Limited |
| Otterly.AI | $29-989 | No | No | Limited |
| StudioMeyer GEO | $0 | Yes | Yes | Yes |
StudioMeyer GEO is free and open source (MIT license). The server runs locally on your machine -- no data is sent to third parties (except the AI queries to the respective platforms, when you explicitly trigger them).
What makes GEO unique
Three things no other GEO tool offers:
- MCP-native. Runs directly in Claude Desktop, Cursor, or VS Code. No tab-switching to a SaaS dashboard. You work where you already work.
- Discovery Stack depth. Other tools check robots.txt. GEO checks five layers: llms.txt, agents.json, robots.txt, JSON-LD, and content age. That's the difference between "Are we blocked?" and "Are we discoverable?"
- Fix guidance. Other tools say: "Your visibility is 45%." GEO says: "robots.txt blocks GPTBot. Add 'Allow: /' before 'Disallow: /api/'." Concrete, actionable, prioritized.
Tips for better AI visibility
- Check your robots.txt first. This is the most common mistake -- and costs nothing to fix. One wrong entry can lock out all AI bots.
- Create an llms.txt. A simple text file in your website's root that describes what you do. AI models read it preferentially. Format: llmstxt.org.
- Unify your brand name. If "MyCompany" is on the homepage, "My Company Inc." in the footer, and "mycompany" in the URL -- the AI sees three different entities. A consistent name can nearly triple citation rates.
- Keep content fresh. Pages with a dateModified within the last six months are weighted higher. A blog post from 2023 rarely gets cited, no matter how good it is.
- Complete your JSON-LD. Especially sameAs (links to social profiles, Wikipedia, Wikidata) and dateModified. This links your brand with known entities.
- Track over time. With the /geo_track_over_time workflow and StudioMeyer Memory, you can monitor your progress month by month.
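The JSON-LD tip above can be sketched as follows. This is a hedged example with placeholder names and URLs, not a complete markup recommendation; note that dateModified belongs on the page (a WebPage or other CreativeWork), while sameAs belongs on the Organization. The snippet would sit in a `<script type="application/ld+json">` tag:

```json
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "dateModified": "2025-01-15",
  "publisher": {
    "@type": "Organization",
    "name": "MyCompany",
    "url": "https://mycompany.com",
    "sameAs": [
      "https://www.linkedin.com/company/mycompany",
      "https://en.wikipedia.org/wiki/MyCompany"
    ]
  }
}
```

geo_json_ld_audit flags exactly these recommended properties when they are missing.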
GEO is no longer a nice-to-have. If your customers ask ChatGPT and you're not mentioned, you're losing business -- invisibly, without even knowing it. StudioMeyer GEO shows you where you stand in two minutes. Free, local, no signup required.
If you'd rather not implement the optimization yourself: StudioMeyer offers a professional GEO service (EUR 999 setup + EUR 299/month, no minimum contract). We handle the complete implementation -- from Discovery Stack to content optimization.
