# AI / coding agents

> Copy-paste setup prompts for Cursor, Claude Code, Lovable, Bolt, v0, ChatGPT, Replit, Windsurf.

The fastest path: copy one prompt, paste it into your favourite AI coding tool, and let it wire up the blog. Every agent gets the same prefilled context.

## What the dashboard gives you

Open **Ship** for any site. The page surfaces a single, comprehensive prompt with:

- Your three credentials (`MENTIONWELL_API_URL`, `MENTIONWELL_SITE_SLUG`, `MENTIONWELL_API_KEY`)
- The brand profile (audience, tone, CTA)
- The exact API endpoints to call
- File scaffolds for the agent to create
- Acceptance criteria the agent must verify before claiming "done"

Click **Copy setup prompt**. You're done with Mentionwell.
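Once the agent has run, the credentials end up in the destination repo's `.env.local` looking roughly like this (the values below are placeholders, not real credentials or the real API host — your generated prompt contains the actual values):

```
# .env.local — placeholder values; copy the real ones from your setup prompt
MENTIONWELL_API_URL=https://api.example.com
MENTIONWELL_SITE_SLUG=my-site
MENTIONWELL_API_KEY=mw_xxxxxxxxxxxx
```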

## Where to paste it

| Tool | How |
|---|---|
| **Cursor** | Open your destination site repo. Cmd-L → paste. |
| **Claude Code** | In your destination site repo: `claude` → paste. |
| **Lovable** / **Bolt.new** / **v0** | Click "Open with prompt" in the dashboard. The prompt is pre-encoded in the URL. |
| **ChatGPT** / **Claude.ai** | "Open in ChatGPT" / "Open in Claude" buttons load the prompt. |
| **Replit Agent** | New Repl with the prompt pasted as the initial brief. |
| **Windsurf** | Cascade chat → paste. |

## Why this works

The prompt is built from your live site config. The agent doesn't have to guess at:

- env var names
- the brand voice
- which framework you're using
- whether you need ISR / SWR / cache headers
- what to do about JSON-LD / canonical / OG tags

…all of which it would otherwise hallucinate. We hand it the answer.
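As a concrete illustration, here is roughly what the data-fetching half of that looks like in a Next.js App Router project. This is a sketch, not Mentionwell's documented API: the `/posts/<slug>` path, the Bearer auth header, and the 5-minute revalidation window are all assumptions — the prompt the dashboard generates specifies the real endpoints and headers.

```typescript
// Build a post URL from the env-provided API base.
// The `/posts/` path is an assumption for this sketch — use the
// endpoints from your generated prompt.
function postUrl(apiUrl: string, slug: string): string {
  return `${apiUrl.replace(/\/+$/, "")}/posts/${encodeURIComponent(slug)}`;
}

// Fetch one post, letting Next.js cache the response with ISR.
// Pass in MENTIONWELL_API_URL / MENTIONWELL_API_KEY from your environment.
async function fetchPost(
  apiUrl: string,
  apiKey: string,
  slug: string
): Promise<unknown> {
  const res = await fetch(postUrl(apiUrl, slug), {
    headers: { Authorization: `Bearer ${apiKey}` },
    // `next.revalidate` is a Next.js-specific fetch extension;
    // the cast keeps plain TypeScript happy outside Next.js.
    next: { revalidate: 300 },
  } as RequestInit);
  if (!res.ok) throw new Error(`Mentionwell fetch failed: ${res.status}`);
  return res.json();
}
```

The prefilled prompt saves the agent from inventing each of these details on its own.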

## What the agent should produce

Every prompt ends with the same "definition of done":

1. `MENTIONWELL_*` env vars added to `.env.local` and the hosting provider.
2. `/blog` index renders posts.
3. `/blog/[slug]` renders an article (HTML, metadata, JSON-LD).
4. RSS `<link>` added to `<head>`.
5. Build passes.
6. Browser-tested both pages.
7. Reply with `MENTIONWELL READY.`

If your agent skips any of those, ask it to complete and verify the missing steps before merging.
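To make step 3 concrete, the JSON-LD part might look like the sketch below. The `Post` field names here are assumptions — the real response schema comes from the API reference embedded in your prompt:

```typescript
// Hypothetical post shape — the real field names come from the API reference.
interface Post {
  title: string;
  slug: string;
  publishedAt: string; // ISO 8601 date string
  excerpt: string;
}

// Build a schema.org Article object, ready to serialize into a
// <script type="application/ld+json"> tag on the article page.
function articleJsonLd(post: Post, siteUrl: string) {
  return {
    "@context": "https://schema.org",
    "@type": "Article",
    headline: post.title,
    datePublished: post.publishedAt,
    description: post.excerpt,
    mainEntityOfPage: `${siteUrl}/blog/${post.slug}`,
  };
}
```

Step 4 is similarly small: a single `<link rel="alternate" type="application/rss+xml">` tag in the site's `<head>`, pointing at the feed URL from your prompt.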

## Going off-script

You don't have to use the prefilled prompt. You can also point your agent at this docs site directly:

> Read https://mentionwell.com/llms.txt and follow the quickstart for the framework I'm using. Use these env vars: MENTIONWELL_API_URL=…

Both paths work. The prefilled prompt is just faster.


---

Canonical URL: https://mentionwell.com/docs/ai-prompts
Live HTML version: https://mentionwell.com/docs/ai-prompts
Section: No-code & AI builders
Site index for AI ingestion: https://mentionwell.com/llms.txt
Full reference: https://mentionwell.com/llms-full.txt
