Legible's core job is to turn existing HTML pages into clean Markdown that AI systems can read. That remains the foundation because clean Markdown stays readable, low-noise, and broadly compatible with today's crawlers and tools.
On top of that foundation, Legible can generate more structured AI context for agent workflows and lightweight related links that help models move between relevant pages without turning every document into a full site menu.
The simple model
- Clean Markdown is the default public page output.
- Structured AI Context is the agent-oriented representation built from the same content.
- Related content blocks help crawlers and agents move from one useful page to the next.
- `llms.txt` and `ai-sitemap.json` remain the main site-level discovery layer.
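For orientation, a site-level `llms.txt` file in the emerging llms.txt convention is just a short Markdown index: an H1 title, a one-line blockquote summary, and H2 sections of links. The sketch below is illustrative; the site name, sections, and URLs are placeholders, not real Legible output:

```markdown
# Example Co

> Example Co turns existing HTML pages into clean, AI-readable Markdown.

## Docs

- [Setup guide](https://example.com/docs/setup.md): getting started
- [API reference](https://example.com/docs/api.md): endpoints and auth

## Optional

- [Blog](https://example.com/blog.md): product updates
```

`ai-sitemap.json` plays a similar role in machine-readable JSON form, enumerating the same pages for crawlers that prefer structured input.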
Why Legible keeps clean Markdown first
Legible is designed around a simple promise: take the HTML content a customer already has and make it readable to AI systems without forcing a CMS migration or manual rewrite. Clean Markdown is still the best format for that public job because it preserves page flow, removes layout noise, and works well for citation, summarization, and page-level crawling.
That means Legible should not replace every public Markdown page with a field-value dump. Public AI-readable pages still need to read like pages.
- Better for broad crawler compatibility.
- Better for humans reviewing what an AI system can read.
- Better for page-level citation and context.
- Better for keeping the product true to Legible's core value proposition.
What Structured AI Context adds
Some agent workflows need more than narrative page text. A coding tool, support copilot, or website-building agent may benefit from a more explicit representation of the same content: summary fields, capabilities, constraints, FAQs, and source metadata in a stable structure.
That is where Structured AI Context comes in. Internally, this can be rendered in a Markdown-KV style format, but the customer-facing idea is simpler: Legible can give agents a more deterministic view of the same content without creating a second content system.
- Best for Legible Connect and AI export workflows.
- Best for support copilots and internal assistants.
- Best when a tool needs explicit facts, constraints, and summaries instead of only page prose.
- Still derived from the same content library customers already manage in Legible.
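To make the idea concrete, a Markdown-KV style rendering of Structured AI Context might look like the sketch below. Every field name and value here is illustrative, not a committed schema:

```markdown
# Structured AI Context: Pricing

- source_url: https://example.com/pricing
- summary: Three plans (Free, Pro, Enterprise), billed monthly or annually.
- capabilities:
  - Unlimited public Markdown pages on Pro and above
- constraints:
  - Free plan is limited to 50 published pages
- faq:
  - Q: Can I switch plans mid-cycle? A: Yes, at the next billing date.
```

The point is that an agent can parse fields like `summary` and `constraints` deterministically, while the underlying facts come from the same page the clean Markdown output is built from.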
Clean Markdown vs Structured AI Context
Clean Markdown
- Default public page output
- Best for AI crawlers and readable page delivery
- Preserves page structure and narrative flow
Structured AI Context
- Alternate agent-oriented output
- Best for Connect, export APIs, copilots, and tools
- Adds deterministic fields like summary, capabilities, constraints, and FAQs
Why related content blocks matter
AI systems do not only need readable pages. They also need help finding the next useful page. Legible already provides site-level discovery through `llms.txt` and `ai-sitemap.json`, but a page-level related-links block adds a lightweight local navigation layer without repeating the full site menu on every document.
The right output is small and relevant. A pricing page might point to FAQ, contact, and feature overview pages. A docs article might point to setup, troubleshooting, and API pages. The goal is to improve local pathfinding, not recreate the full website header and footer inside Markdown.
- Keep related links to a small, high-signal list.
- Prefer next-step and same-topic pages.
- Avoid repeated global navigation and token-heavy menus.
- Use page-level links as a local aid, not a replacement for `llms.txt`.
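In practice, the related-links block appended to a page's Markdown can be as small as the following sketch (the heading text and link set are illustrative):

```markdown
## Related pages

- [Pricing FAQ](/faq#pricing)
- [Contact sales](/contact)
- [Feature overview](/features)
```

Three or four high-signal links like these cost few tokens but give a crawler or agent an obvious next step.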
How Legible chooses related links
The best implementation is deterministic first. Legible can use sitemap structure, content type, customer labels, and route patterns to identify likely related pages. Over time, semantic similarity can improve the ranking, but the main selection logic should stay explainable and stable.
That is why the recommended model is heuristic generation first, optional semantic reranking second, and no generative LLM in the request path.
- Same section or hub pages get strong priority.
- Same content type and label improve relevance.
- Strategic next-step pages like pricing, docs, FAQ, demo, or contact can be boosted when appropriate.
- The final list should stay short, deduped, and useful.
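The heuristics above can be sketched as a small deterministic ranker. This is a minimal illustration under assumed inputs: the `Page` fields, score weights, and `STRATEGIC_SECTIONS` list are hypothetical, not Legible's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Page:
    url: str
    section: str                    # e.g. "docs", "pricing" (from route patterns)
    content_type: str               # e.g. "article", "landing"
    labels: set = field(default_factory=set)  # customer-applied labels

# Hypothetical boost list for strategic next-step pages.
STRATEGIC_SECTIONS = {"pricing", "docs", "faq", "demo", "contact"}

def related_links(current: Page, candidates: list, limit: int = 4) -> list:
    """Deterministic heuristic ranking: no generative LLM in the request path."""
    scored = []
    for page in candidates:
        if page.url == current.url:
            continue                            # never link a page to itself
        score = 0
        if page.section == current.section:
            score += 3                          # same section/hub: strong priority
        if page.content_type == current.content_type:
            score += 2                          # same content type improves relevance
        score += len(page.labels & current.labels)  # shared labels improve relevance
        if page.section in STRATEGIC_SECTIONS:
            score += 1                          # boost strategic next-step pages
        if score > 0:
            scored.append((score, page.url))
    # Sort by score descending, then URL, for a stable, explainable order.
    scored.sort(key=lambda item: (-item[0], item[1]))
    seen, result = set(), []
    for _, url in scored:
        if url not in seen:                     # dedupe
            seen.add(url)
            result.append(url)
    return result[:limit]                       # keep the final list short
```

Semantic reranking, when enabled, would only reorder this already-small candidate list, so the selection logic stays explainable.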
Customer control without turning this into a nav builder
The best customer experience is automatic by default with lightweight override controls. Most teams want Legible to choose sensible related links automatically, but some will want to pin a pricing page, exclude a weak suggestion, or add one important external destination like a developer portal or help center.
- Auto mode should remain the default.
- Customers should be able to pin links from the content library.
- Customers should be able to exclude poor suggestions.
- A small number of guarded external links can be allowed for advanced use cases.
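One way these controls could surface is a small per-page override object alongside the content library entry. The shape below is a hypothetical sketch, not a shipped API; all keys and values are placeholders:

```json
{
  "related_links": {
    "mode": "auto",
    "pinned": ["/pricing"],
    "excluded": ["/blog/old-announcement"],
    "external": [
      { "url": "https://developers.example.com", "label": "Developer portal" }
    ]
  }
}
```

Keeping the surface this small is what separates a lightweight override from a full navigation builder.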
Where customers will see these features
- Public AI-readable Markdown pages stay clean and readable.
- Legible Connect and AI export surfaces can expose Structured AI Context for agent workflows.
- The dashboard can preview both outputs and show related-link behavior.
- The same underlying content still powers chat, discovery, and retrieval.
How this fits with the rest of Legible
This feature set is not a new content system. It is an extension of Legible's existing content layer. The same content library that powers clean Markdown, `llms.txt`, `llms-full.txt`, AI chat, and export APIs can also power structured agent context and related page navigation.
That is the real product advantage: one source of truth, multiple high-value outputs.
