
Why am I seeing unexpected sources?

Written by Maurizio Wagenhaus
Updated over 4 months ago

AI models sometimes reference sources that seem random or irrelevant to humans. This happens because AI selects sources differently than we do.

How AI finds sources: When we test your prompts, AI breaks them down into multiple search terms (called "query fan-out") and searches the web for relevant content. Sometimes this process finds sources through unexpected search terms, like adding "2025" for recent info or "forums" when looking for opinions.
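To make "query fan-out" concrete, here is a minimal sketch of the idea: one prompt is expanded into several related search queries. The expansion rules, the extra terms, and the function names are illustrative assumptions, not the actual implementation used during testing.

```python
# Minimal sketch of "query fan-out": expanding one prompt into several
# search queries. The expansion rules below are illustrative assumptions,
# not the real implementation.
from datetime import date


def fan_out(prompt: str) -> list[str]:
    """Expand a single prompt into multiple related search queries."""
    year = date.today().year
    queries = [prompt]                            # the original prompt
    queries.append(f"{prompt} {year}")            # bias toward recent results
    queries.append(f"{prompt} forums")            # surface opinions and discussion
    queries.append(f"best {prompt} comparison")   # comparative angle
    return queries


if __name__ == "__main__":
    for q in fan_out("project management software for small teams"):
        print(q)
```

Each expanded query can surface sources the original prompt never would have, which is why a citation can trace back to a search term you did not write.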

AI also prioritizes sources that are easy to cite, even if they're not the most authoritative. A less relevant source might get chosen because it has a clear paragraph that summarizes a topic well.
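A toy scoring sketch can show why this happens. The features and weights below are assumptions made purely for illustration; no AI model publishes how it actually weighs sources.

```python
# Illustrative toy scorer showing why an "easy to cite" source can outrank
# a more authoritative one. Weights and features are assumptions for
# illustration only, not how any AI model actually ranks sources.
from dataclasses import dataclass


@dataclass
class Source:
    name: str
    authority: float         # 0..1, how authoritative the site is
    has_clear_summary: bool  # has a quotable paragraph that sums up the topic


def citability(source: Source) -> float:
    """Toy score: a clear, quotable summary counts for more than authority."""
    return 0.4 * source.authority + 0.6 * (1.0 if source.has_clear_summary else 0.0)


sources = [
    Source("well-known industry site", authority=0.9, has_clear_summary=False),
    Source("small niche blog", authority=0.4, has_clear_summary=True),
]
for s in sorted(sources, key=citability, reverse=True):
    print(f"{s.name}: {citability(s):.2f}")
```

In this toy example the niche blog wins simply because it offers a clean, quotable summary, which mirrors the behavior described above.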

What you can do about it: If an unexpected source only appears for one AI model or pops up occasionally, don't worry about it. One-off appearances aren't worth your time.

If a source shows up consistently across multiple AI models, treat it as genuinely relevant and consider optimizing for it. The sources that appear repeatedly are the ones worth your attention.
