# Mentionwell — full reference for AI assistants

> Comprehensive reference: positioning, AEO/GEO/LLMO definitions, the full pipeline, the public API, and per-term FAQ. Origin: https://mentionwell.com.

---

## 1. Positioning

Mentionwell is a headless AEO / GEO / LLMO blog engine. The pitch in one paragraph:

Drop a domain. Mentionwell scans your site, learns your voice and taxonomy, and ships SEO + AEO-tuned articles end-to-end — research, outline, draft, editorial critic, metadata, FAQ, embedding, and image generation — on the schedule you set. Articles are served from a public read-only API your site fetches at request or revalidation time, or pushed into WordPress, Webflow, Ghost, Shopify, or Notion. Every article ships with FAQPage + Article JSON-LD, RSS, JSON Feed, sitemap, a per-page `.md` mirror, and a site-wide `llms.txt` so ChatGPT, Claude, Gemini, Grok, and Perplexity can ingest and cite it cleanly.

## 2. Built by ZipLyne

[ZipLyne](https://ziplyne.agency) is an agency that builds AI-powered products and business automation. Mentionwell came out of internal agency tooling and now ships as a standalone product.

## 3. AEO / GEO / LLMO — definitions and how Mentionwell handles each

### AEO — Answer Engine Optimization

Answer Engine Optimization (AEO) is the practice of structuring content so it surfaces as the direct answer in answer engines — Google AI Overviews, Bing Copilot, Perplexity, voice assistants, and featured snippets. Where SEO optimizes for the blue-link list, AEO optimizes for the box that shows the answer above it.

**How Mentionwell handles it.**

- Question-led H2s that mirror real SERP questions, with a 40–60 word direct answer immediately below.
- FAQ blocks rendered as both visible UI and FAQPage JSON-LD.
- Article + HowTo + FAQPage schema chained via `@id` so engines see one knowledge graph per article.
- Clean semantic HTML (`article`, `section`, `dl`) so extractors can lift a paragraph cleanly without parsing through layout chrome.
- Editorial critic enforces lead-with-the-answer style on every draft.

**Differs from related terms.** AEO targets the answer surface (the box that shows the direct answer). GEO targets the generative surface (the LLM-synthesized paragraph with citations). LLMO targets the model and its crawlers (so the content is reachable, ingestible, and chosen). They overlap heavily — Mentionwell optimizes for all three at once.

**FAQ.**

**What is AEO?** AEO stands for Answer Engine Optimization. It's the practice of structuring content so an answer engine — Google AI Overviews, Bing Copilot, Perplexity, voice assistants, featured snippets — surfaces it as the direct answer to a user's question, instead of (or in addition to) ranking it as a blue link.

**How is AEO different from SEO?** SEO optimizes for the ranked list of blue links. AEO optimizes for the answer that sits above the list. The technical building blocks overlap (clean HTML, schema, fast pages), but AEO weights question-led structure, concise lead paragraphs, FAQ schema, and entity clarity much more heavily than classic SEO.

**What schema do you need for AEO?** FAQPage, Article, BreadcrumbList, and, where applicable, HowTo, QAPage, and Speakable. Mentionwell ships FAQPage + Article + BreadcrumbList on every published article by default, and adds HowTo where the outline is procedural.

**Does AEO replace SEO?** No. AEO is a layer on top of SEO. The same article should rank well in classical SERPs and surface as the answer in AI Overviews, Copilot, and Perplexity. Mentionwell optimizes for both at the same time.

**Which engines does AEO target?** Google AI Overviews, Bing Copilot, Perplexity, voice assistants (Siri, Alexa, Google Assistant), and the featured-snippet box that still appears in classic SERPs. Many of these surfaces share the same underlying signals — schema, lead-with-the-answer copy, FAQ structure — so optimizing once optimizes for all of them.
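The `@id` chaining described under "How Mentionwell handles it" can be sketched as a single `@graph` payload. This is a minimal illustration, not Mentionwell's actual output: the URL, headline, and answer text are placeholders.

```python
import json

# Placeholder URL for illustration only.
ARTICLE_URL = "https://example.com/posts/what-is-aeo"

article = {
    "@type": "Article",
    "@id": f"{ARTICLE_URL}#article",
    "headline": "What is AEO?",
    "mainEntityOfPage": ARTICLE_URL,
}

faq = {
    "@type": "FAQPage",
    "@id": f"{ARTICLE_URL}#faq",
    # Chain back to the Article node by @id, so both nodes form one graph.
    "isPartOf": {"@id": f"{ARTICLE_URL}#article"},
    "mainEntity": [{
        "@type": "Question",
        "name": "What is AEO?",
        "acceptedAnswer": {"@type": "Answer", "text": "Answer Engine Optimization is ..."},
    }],
}

# One @graph payload per article: engines see a single knowledge graph.
json_ld = {"@context": "https://schema.org", "@graph": [article, faq]}
print(json.dumps(json_ld, indent=2))
```

Because both nodes share the article's `#article` identifier, an extractor can resolve the FAQPage to the Article it belongs to without guessing from the page layout.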
URL: https://mentionwell.com/aeo

---

### GEO — Generative Engine Optimization

Generative Engine Optimization (GEO) is the newer term — coined in a 2023 Princeton paper — for optimizing toward generative engines like ChatGPT Search, Gemini, Perplexity, and Grok, where the model synthesizes an answer and cites a handful of sources. The goal of GEO is to be one of those cited sources.

**How Mentionwell handles it.**

- Citation-friendly structure: every claim is tied to evidence (a stat, a quote, a primary source).
- Authoritative quotes and original data points where the source material allows.
- Per-article `.md` mirrors so generative engines can ingest a clean Markdown version of the article without HTML noise.
- Stable canonical URLs so citations don't decay as the site reorganizes.
- Embeddings indexed per article for semantic retrieval inside RAG pipelines.

**Differs from related terms.** GEO targets the generative surface — the synthesized answer with citations — while AEO targets the answer surface (the direct answer box) and LLMO targets the model and its crawlers themselves. In practice, optimizing well for one drags the others up; Mentionwell handles all three at the same time.

**FAQ.**

**What is GEO (Generative Engine Optimization)?** GEO is optimization for generative engines — ChatGPT Search, Gemini, Perplexity, Grok — where the model writes a synthesized answer and cites a handful of sources. The goal is to be one of the cited sources. The term was coined in a 2023 Princeton paper proposing concrete tactics that improve citation rate.

**How is GEO different from AEO?** AEO targets the answer surface — the box at the top of the SERP that shows a direct answer. GEO targets the generative surface — the LLM-written paragraph with inline citations to source pages. They share most of the same building blocks (clean HTML, schema, lead-with-the-answer copy), but GEO weights citation-friendly structure and original evidence more heavily.
**How do I optimize for GEO?** Tie every claim to evidence (a stat, quote, or primary source). Lead each section with the answer. Keep canonical URLs stable. Ship Markdown mirrors of every article so engines can ingest a clean version. Provide unique data points and original analysis the model has nowhere else to find. Mentionwell does all of this by default.

**Which engines does GEO target?** ChatGPT (with browsing), Claude, Gemini, Grok, and Perplexity — anything that writes a synthesized answer and links to source pages. Microsoft Copilot and Google AI Overviews also rely heavily on GEO-style signals.

**Does GEO replace SEO?** No. GEO is built on top of SEO. The same signals that make a page rank — clear topical authority, fast load, semantic HTML, internal linking — also make it citable. Mentionwell layers GEO and AEO on top of full classic SEO.

URL: https://mentionwell.com/geo

---

### LLMO — LLM Optimization

LLM Optimization (LLMO) is the practice of making your content reachable, parseable, and trustworthy to the LLMs themselves — at both training time (large-scale crawls) and retrieval time (RAG pipelines, browsing tools, and agent crawlers like GPTBot, ClaudeBot, and PerplexityBot). LLMO is the plumbing layer that AEO and GEO sit on top of.

**How Mentionwell handles it.**

- Site-wide `llms.txt` and `llms-full.txt` published at the canonical paths.
- Per-article `.md` mirrors so any LLM can ingest a clean Markdown version of the article.
- Stable canonical URLs, RSS, and JSON Feed for retrieval pipelines.
- Embeddings indexed per article for semantic search and similarity-based internal linking.
- Explicit AI crawler allowlist in `robots.txt` — every major bot named individually so the policy is unambiguous.

**Differs from related terms.** LLMO is the plumbing layer. AEO works on the answer surface, GEO works on the generative surface, and LLMO makes sure the model and its crawlers can actually reach, parse, and trust your content in the first place.
Without LLMO, the other two can't fire.

**FAQ.**

**What is LLMO?** LLMO stands for LLM Optimization. It's the practice of making content reachable, parseable, and trustworthy to LLMs themselves — at both training time (large-scale crawls) and retrieval time (RAG, browsing tools, agent crawlers like GPTBot, ClaudeBot, PerplexityBot). LLMO is the plumbing layer that AEO and GEO sit on top of.

**What's in a good LLMO setup?** A site-wide `llms.txt` and `llms-full.txt`, per-page `.md` mirrors, an explicit AI crawler allowlist in `robots.txt`, stable canonical URLs, RSS and JSON Feed, and embeddings for semantic retrieval. Mentionwell ships all of these by default.

**What is llms.txt?** `llms.txt` is a proposed standard (llmstxt.org) for a site to expose a concise overview of itself, in Markdown, at `/llms.txt`. It tells AI assistants what the site is, what it covers, and which pages matter most — like `robots.txt`, but for LLM context rather than crawl policy. Mentionwell serves both `/llms.txt` and a deeper `/llms-full.txt`.

**Should I block AI crawlers?** Only if you have a specific reason. Most sites benefit from being crawlable: that's how their content shows up in ChatGPT, Claude, Gemini, and Perplexity answers. Mentionwell's default `robots.txt` explicitly allows GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and 15+ others.

**How is LLMO different from GEO?** LLMO is the plumbing — making sure the model and its crawlers can reach and parse your content. GEO is the layer above — making the content the kind of source the model wants to cite. You need LLMO before GEO can fire.

URL: https://mentionwell.com/llmo

---

### SEO — Search Engine Optimization

Search Engine Optimization (SEO) is the original discipline: ranking in the blue-link results of Google and Bing. It's still the largest single source of traffic for most sites and the foundation that AEO, GEO, and LLMO are layered on top of. AI didn't kill SEO — it just added new surfaces above it.
**How Mentionwell handles it.**

- Per-headline keyword research and topic clustering during onboarding.
- Hub-and-spoke internal linking driven by the site's content taxonomy.
- Title and meta tuning per article, with featured-snippet patterns where applicable.
- Sitemaps, Article schema, descriptive image alt text, and clean canonical URLs.
- Editorial critic pass that flags thin sections, over-claiming, and missing citations before publish.

**Differs from related terms.** SEO targets the ranked blue-link list in classical SERPs. AEO, GEO, and LLMO target the AI-mediated surfaces that appear above and around those links. They share most of the same technical building blocks — Mentionwell optimizes for all four at once because the work is largely the same work.

**FAQ.**

**Is SEO dead?** No. Classic blue-link search is still the largest single source of traffic for most sites. AI engines (ChatGPT, Claude, Perplexity, Google AI Overviews) are rapidly growing channels, but they augment SEO rather than replace it. Mentionwell ships full SEO underneath AEO, GEO, and LLMO — same article, all four surfaces.

**How does Mentionwell handle classic SEO?** Onboarding builds a content taxonomy and per-headline keyword targets. Articles run through an editorial critic that enforces title and meta length, lead-with-the-answer style, semantic HTML, internal linking, and Article + BreadcrumbList JSON-LD. Sitemaps and per-article canonicals are emitted automatically.

**Does SEO conflict with AEO or GEO?** No — they reinforce each other. The same signals that rank a page (clear topical authority, semantic HTML, fast load, schema) also make it surface as an answer (AEO) and get cited (GEO). Optimizing for one usually drags the others up.
**What's the foundation of a good SEO article?** One H1 with the primary keyword, a 40–60 word lead that answers the query, a clean H2/H3 hierarchy, internal links to siblings and parents, descriptive image alt text, Article schema, and a stable canonical URL. Mentionwell enforces all of these on every draft.

URL: https://mentionwell.com/seo

---

### AIO — AI Optimization

AI Optimization (AIO) is an umbrella term sometimes used interchangeably with LLMO or GEO. In practice, when someone says "AIO" they mean the bundle of AEO + GEO + LLMO — the combined practice of optimizing for any AI-mediated surface. It's not a separate channel; it's the whole stack.

**How Mentionwell handles it.**

- Mentionwell's pipeline already covers AEO, GEO, and LLMO simultaneously, which is exactly what AIO refers to in the wild.
- No separate AIO toggle exists — if you're optimizing for the three named surfaces, you're already doing AIO.

**Differs from related terms.** AIO is not a separate optimization target — it's the umbrella over AEO, GEO, and LLMO. If you're shipping for all three, you're already doing AIO.

**FAQ.**

**What is AIO?** AIO stands for AI Optimization. It's an umbrella term for optimizing content for any AI-mediated surface — answer engines, generative engines, and the LLMs and crawlers behind them. Most usage in the wild treats AIO as shorthand for AEO + GEO + LLMO combined.

**How is AIO different from AEO and GEO?** AIO is the umbrella; AEO and GEO are specific surfaces under it. AEO targets answer engines, GEO targets generative engines that synthesize answers with citations, and LLMO targets the underlying models and crawlers. AIO usually means "all of the above."

**Do I need a separate AIO strategy?** No — if you're optimizing for AEO, GEO, and LLMO, you're already doing AIO. The term exists in the wild and you'll see it in pitches and docs, but it's not a separate channel that needs its own pipeline.
URL: https://mentionwell.com/aio

---

### SGE — Search Generative Experience optimization

SGE optimization originally targeted Google's Search Generative Experience — the experimental AI-generated answers Google launched in 2023. SGE has since been folded into Google AI Overviews, so SGE optimization is now largely a subset of AEO, with a Google-specific tilt.

**How Mentionwell handles it.**

- AEO patterns Mentionwell already enforces (lead-with-the-answer, FAQPage schema, Article + HowTo schema, clean semantic HTML) cover the Google-specific signals.
- Site-wide allowlist for Google-Extended in `robots.txt` so Google's generative crawler is welcome.
- Per-page `.md` mirrors and stable canonicals so Google's generative pipeline can ingest content cleanly.

**Differs from related terms.** SGE was Google's experimental AI search feature — it has now been folded into AI Overviews. SGE optimization is best understood as the Google slice of AEO. If you're optimizing for AEO across engines, you're already covering SGE.

**FAQ.**

**What is SGE?** SGE stood for Search Generative Experience — Google's experimental AI-generated answers above the blue links, launched in 2023. SGE has since been folded into Google AI Overviews, so the term is now mostly used historically or as a Google-specific synonym for AEO.

**Is SGE still a thing?** The brand isn't — Google graduated the feature into AI Overviews. The optimization pattern is, though: it's now the Google slice of AEO. Same signals, same content shape.

**How do I optimize for AI Overviews?** Treat them as AEO with a Google-specific tilt. Lead with the answer in 40–60 words, ship FAQPage + Article + BreadcrumbList schema, allow Google-Extended in `robots.txt`, and keep canonical URLs stable. Mentionwell does all of this by default.

URL: https://mentionwell.com/sge

---

### AISO — AI Search Optimization

AI Search Optimization (AISO) is a rarer enterprise term that overlaps almost completely with GEO and LLMO.
It's mostly used in B2B / enterprise content marketing pitches. If you're optimizing for AEO, GEO, and LLMO, you're already covering everything AISO refers to.

**How Mentionwell handles it.**

- GEO and LLMO patterns Mentionwell already enforces cover the entire AISO surface.
- No separate AISO toggle — the term doesn't refer to a distinct optimization target.

**Differs from related terms.** AISO is largely a synonym for GEO + LLMO with a B2B-pitch flavor. If you're optimizing for those two, you're already covering AISO.

**FAQ.**

**What is AISO?** AISO stands for AI Search Optimization. It's a rarer term, used mostly in enterprise / B2B content marketing pitches, that overlaps almost completely with GEO and LLMO. There's no distinct optimization target underneath it.

**How is AISO different from GEO?** It mostly isn't. AISO is a B2B-pitch-flavored synonym for GEO + LLMO. If you're optimizing for both of those, you're already covering AISO.

**Should I use AISO or GEO in my content?** Use whichever your audience searches for. GEO has more academic and technical traction; AISO shows up more in enterprise vendor pitches. The underlying work is the same.

URL: https://mentionwell.com/aiso

## 4. The article pipeline (per article)

1. **Onboarding** — Mentionwell scans the homepage, sitemap, robots.txt, and structured data, then synthesizes a brand profile, content taxonomy, and 10 starter headlines tuned to your audience and the questions AI engines are already asking.
2. **Research** — pulls supporting evidence per headline.
3. **Outline** — produces a question-led outline targeting AEO + GEO surfaces.
4. **Draft** — full article generation with the brand voice from onboarding.
5. **Editorial critic** — automated pass for accuracy, redundancy, hallucination, and tone.
6. **Metadata + FAQ** — title, description, FAQPage JSON-LD, internal links.
7. **Embedding** — for semantic retrieval and similarity-based internal linking.
8.
   **Image generation** — hero + inline images with descriptive alt text.
9. **Publish** — via the public API, or pushed into your CMS.

Every step is logged, costed, and surfaced in the dashboard.

## 5. AI crawler allowlist

```
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: OAI-SearchBot
User-agent: ClaudeBot
User-agent: Claude-User
User-agent: Claude-SearchBot
User-agent: Claude-Web
User-agent: anthropic-ai
User-agent: PerplexityBot
User-agent: Perplexity-User
User-agent: Google-Extended
User-agent: Applebot
User-agent: Applebot-Extended
User-agent: Bingbot
User-agent: CCBot
User-agent: Meta-ExternalAgent
User-agent: Amazonbot
User-agent: Bytespider
User-agent: cohere-ai
```

All allowed at `/`. See https://mentionwell.com/robots.txt.

## 6. Per-site discoverability surface

For every site Mentionwell manages it produces:

- `/feed.xml` — RSS 2.0 feed
- `/feed.json` — JSON Feed 1.1
- `/sitemap.xml` — XML sitemap of all published posts
- `.md` — Markdown mirror of each article
- FAQPage JSON-LD inside each article
- Article JSON-LD with author, datePublished, dateModified
- BreadcrumbList JSON-LD
- a canonical `<link rel="canonical">` per post

## 7. Public read-only API

Articles can be pulled from the public API at `https://app.mentionwell.com/api/public//posts` (list) and `https://app.mentionwell.com/api/public//posts/` (single post).

## 8. Where it lives (delivery options)

- **Pull mode** — point your site at the public API and render at request time (or with framework ISR).
- **Push mode** — Mentionwell pushes articles into WordPress, Webflow, Ghost, Shopify, or Notion when they publish.

Same articles, same dashboard, your choice of delivery.
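Pull mode boils down to fetching JSON from the public API endpoints in section 7. A minimal Python sketch follows; the elided path segments are assumed here to be a site identifier and a post slug, and the response is assumed to be JSON, so `site_id`, `slug`, and the response handling are illustrative, not documented behavior.

```python
import json
import urllib.request

# Base path from section 7. The segments between "public/" and "/posts",
# and after "/posts/", are elided in the reference; site_id and slug below
# are hypothetical names for them.
BASE = "https://app.mentionwell.com/api/public"


def list_posts_url(site_id: str) -> str:
    """URL of the list endpoint, assuming it takes a site identifier."""
    return f"{BASE}/{site_id}/posts"


def post_url(site_id: str, slug: str) -> str:
    """URL of a single post, assuming it takes the post slug."""
    return f"{BASE}/{site_id}/posts/{slug}"


def fetch_post(site_id: str, slug: str) -> dict:
    """Fetch one article at request or revalidation time (network call)."""
    with urllib.request.urlopen(post_url(site_id, slug)) as resp:
        return json.load(resp)
```

In a framework with ISR, `fetch_post` (or its JavaScript equivalent) would run inside the revalidation hook so the page re-renders whenever Mentionwell publishes.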