Why this matters
Legible gives Intercom a cleaner, more current knowledge layer than raw site scraping. The same content customers publish to the web can be retrieved as structured documents and chunks for support AI.
When Intercom And Legible Fit Together
Use this setup when you want your Intercom AI assistant to answer from the same content Legible already syncs and cleans for AI delivery. This is especially useful for docs-heavy or marketing-led sites whose content changes frequently.
Recommended Integration Pattern
- Treat Legible as the content preparation and retrieval layer.
- Use Content Chat to test how well the content answers real questions before rollout.
- Sync by chunks if your Intercom workflow expects retrieval-ready passages rather than whole pages.
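To make the chunk-based option concrete, here is a minimal sketch of passage selection over retrieval-ready chunks. It assumes chunks arrive as dicts with `id` and `text` fields (a hypothetical shape for illustration, not the documented chunk schema) and ranks them by naive keyword overlap with the customer question:

```python
import re


def top_chunks(question, chunks, k=3):
    """Rank retrieval-ready chunks by keyword overlap with the question.

    Chunk shape ({"id": ..., "text": ...}) is an assumption for this
    sketch; check the actual chunk fields your integration receives.
    """
    q_terms = set(re.findall(r"\w+", question.lower()))

    def overlap(chunk):
        return len(q_terms & set(re.findall(r"\w+", chunk["text"].lower())))

    return sorted(chunks, key=overlap, reverse=True)[:k]


# Example chunks as they might look after Legible's cleaning step.
chunks = [
    {"id": "c1", "text": "Reset your password from the account settings page."},
    {"id": "c2", "text": "Pricing plans are billed monthly or annually."},
    {"id": "c3", "text": "Contact support to reset two-factor authentication."},
]

best = top_chunks("How do I reset my password?", chunks, k=1)
```

A production integration would use a real retriever (embeddings or your existing search layer) instead of keyword overlap; the point is that chunk-level sync lets you pass only the relevant passages to the assistant.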
Legible syncs website content
-> Legible cleans and chunks it
-> Your integration pulls documents or chunks from the AI Export API
-> Intercom AI assistant uses that retrieved context to answer customer questions

What To Pull From Legible
- Use `ai-index` to discover available documents and detect updates.
- Use `ai-content` if your Intercom workflow benefits from full Markdown documents.
- Use `ai-chunks` if you want passage-level ingestion for support answers and better citation grounding.
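The update-detection part of `ai-index` can be sketched as a diff between two index snapshots. This assumes each index entry maps a document id to a last-modified timestamp (a hypothetical response shape, chosen for illustration; check the actual `ai-index` fields):

```python
def changed_docs(previous, current):
    """Return ids of documents that are new or updated between snapshots.

    `previous` and `current` map document id -> last-modified string.
    This mapping is an assumed shape, not the documented ai-index schema.
    """
    return sorted(
        doc_id
        for doc_id, stamp in current.items()
        if previous.get(doc_id) != stamp
    )


# Snapshot taken on the last sync vs. the one just fetched.
previous = {"docs/setup": "2024-05-01", "docs/billing": "2024-05-01"}
current = {
    "docs/setup": "2024-05-08",    # updated page
    "docs/billing": "2024-05-01",  # unchanged
    "docs/faq": "2024-05-08",      # new page
}

to_refresh = changed_docs(previous, current)
```

Only the ids in `to_refresh` would then be fetched again via `ai-content` or `ai-chunks`, keeping sync traffic proportional to what actually changed.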
Why Customers Use Legible Here
- Cleaner answers than raw HTML ingestion.
- Less manual upkeep than exporting and re-uploading help content by hand.
- Better alignment between public website content and support assistant behavior.
