Tools · free
Is your site ready for agents?
Paste any URL. I probe 13 live signals — MCP server, llms.txt, robots AI allow-list, schema.org coverage, x402 payment readiness, OpenAPI spec, and more — and return a score plus the specific fixes, ranked by leverage.
Takes ~5 seconds. No login, no email capture, no tracking pixel.
Source code at src/pages/tools/agent-readiness.astro.
Why this matters
Most sites that say "agents welcome" mean "we won't block you." They ship HTML and hope the scraper holds. That works until the layout shifts or an ad renders mid-content.
A site that's agent-ready in 2026 has: a robots.txt with an explicit AI allow-list, an llms.txt file, a live MCP server, schema.org markup on every content page, and (for monetized APIs) x402 payment challenges. This tool tells you which of those you already have.
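As one illustration, an explicit AI allow-list in robots.txt might look like the sketch below. GPTBot and ClaudeBot are real AI crawler user agents; which crawlers you welcome, and whether "*" stays open, is up to you:

```txt
# robots.txt — explicitly welcome AI crawlers (example names)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: *
Allow: /
```

The point of listing AI crawlers by name, rather than relying on the `*` group alone, is that the allow-list is explicit: an auditor (or this tool) can see intent, not just absence of a block.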
What gets checked
- robots.txt — present + AI crawlers explicitly allowed (10pt)
- llms.txt — present + non-trivial content (15pt)
- /api/mcp or /.well-known/mcp — MCP server live (20pt)
- sitemap.xml — present + lastmod timestamps (8pt)
- schema.org JSON-LD — ≥3 @types on homepage (10pt)
- OpenGraph + Twitter meta — social share ready (5pt)
- RSS/Atom feed — fallback discovery (5pt)
- x402 payment challenges — agent-monetizable (8pt)
- OpenAPI spec — /openapi.json machine-readable (5pt)
- canonical, HTTPS+HSTS, security.txt — hygiene (9pt)
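Two of the pieces above are easy to sketch in isolation: summing the per-check weights into a score, and reading robots.txt to see whether a named AI crawler may fetch the site root. This is a hedged sketch, not the tool's actual implementation — the function names and the parsing shortcuts are mine:

```typescript
type Rule = { allow: boolean; path: string };
type Group = { agents: string[]; rules: Rule[] };

// Sum the weights of passing checks into a total score.
function score(checks: { weight: number; pass: boolean }[]): number {
  return checks.reduce((sum, c) => sum + (c.pass ? c.weight : 0), 0);
}

// Does robots.txt let the given crawler fetch "/"? The most specific
// matching User-agent group wins, falling back to "*"; within a group,
// the longest matching path rule wins (robots exclusion convention).
function isAiAllowed(robotsTxt: string, agent: string): boolean {
  const groups: Group[] = [];
  let current: Group | null = null;
  let lastWasAgent = false;
  for (const raw of robotsTxt.split(/\r?\n/)) {
    const m = raw.trim().match(/^(user-agent|allow|disallow)\s*:\s*(.*)$/i);
    if (!m) { lastWasAgent = false; continue; }
    const value = m[2].split("#")[0].trim().toLowerCase();
    if (m[1].toLowerCase() === "user-agent") {
      // Consecutive User-agent lines share one group of rules.
      if (!lastWasAgent || !current) {
        current = { agents: [], rules: [] };
        groups.push(current);
      }
      current.agents.push(value);
      lastWasAgent = true;
    } else {
      current?.rules.push({ allow: m[1].toLowerCase() === "allow", path: value });
      lastWasAgent = false;
    }
  }
  // Pick the best-matching group: longest agent-name match, else "*".
  const lc = agent.toLowerCase();
  let best: Group | null = null;
  let bestLen = -1;
  for (const g of groups) {
    for (const a of g.agents) {
      if (a === "*" && bestLen < 0) { best = g; bestLen = 0; }
      else if (a !== "*" && lc.includes(a) && a.length > bestLen) {
        best = g; bestLen = a.length;
      }
    }
  }
  if (!best) return true; // no applicable group → allowed by default
  let verdict = true;
  let matchLen = -1;
  for (const r of best.rules) {
    if (r.path && "/".startsWith(r.path) && r.path.length > matchLen) {
      matchLen = r.path.length;
      verdict = r.allow;
    }
  }
  return verdict;
}
```

With the weights above, a site that passes only robots.txt (10pt) and the MCP probe (20pt) would score 30. Keeping each probe a pure function over fetched text is what makes a checker like this testable without network access.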
Try: mondello.dev · stripe.com · example.com