Hallucination

A hallucination in AI refers to a phenomenon where a generative AI model (such as a large language model, or LLM) produces information that is incorrect, incoherent, or entirely fabricated, while presenting it as factual.

These errors stem from the very nature of LLMs, which are designed to predict statistically plausible text rather than to produce verified responses: at each step the model chooses the continuation that is most likely given its training data, with no built-in check that the resulting statement is true.
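
As a rough illustration of that mechanism, the toy sketch below (not a real LLM, and the probability table is invented for the example) hand-codes a next-word distribution and always returns the most probable continuation. A fabricated statement comes out just as confidently as a correct one, because plausibility, not truth, drives the choice.

```python
# Toy illustration: the "model" is a hand-written table of next-word
# probabilities. It always picks the statistically most plausible
# continuation, with no check of whether the resulting claim is true.
next_word_probs = {
    ("The", "Eiffel", "Tower", "is", "in"): {"Paris": 0.90, "France": 0.08, "Berlin": 0.02},
    ("The", "first", "person", "on", "Mars", "was"): {"Neil": 0.55, "Buzz": 0.30, "nobody": 0.15},
}

def predict_next(context):
    """Return the most probable next word for a known context."""
    probs = next_word_probs[tuple(context)]
    return max(probs, key=probs.get)

# A plausible and correct continuation:
print(predict_next(["The", "Eiffel", "Tower", "is", "in"]))           # Paris
# A plausible but fabricated continuation: no one has walked on Mars,
# yet the highest-probability word asserts a "fact" anyway.
print(predict_next(["The", "first", "person", "on", "Mars", "was"]))  # Neil
```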