Canvas Cloud AI

Hallucination


Definition

A hallucination occurs when an AI model generates plausible-sounding but factually incorrect or fabricated information, like a confident student giving a wrong answer that sounds right.

Real-World Example

An AI chatbot confidently cited a court case that did not exist; this classic hallucination caused real problems when the fabricated citation was included in a legal brief.

Cloud Provider Equivalencies

Hallucination is an AI/ML behavior (a model producing incorrect but plausible output), not a cloud service. All major clouds offer tools to reduce hallucinations (e.g., grounding with retrieval, evaluation, guardrails, and monitoring), but there is no direct one-to-one service named “Hallucination.”
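One of the mitigation patterns mentioned above, grounding with retrieval, can be sketched in a few lines. The snippet below is a minimal, provider-agnostic illustration (all function names are hypothetical, not any cloud's API): retrieved passages are placed in the prompt, the model is told to answer only from them, and a crude post-check flags answers whose content never appears in the sources.

```python
def build_grounded_prompt(question, passages):
    """Assemble a prompt that restricts the model to retrieved passages.

    Grounding reduces hallucination risk: the model is instructed to
    answer only from the supplied context and to admit when the answer
    is not present, rather than inventing a plausible-sounding one.
    """
    context = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using ONLY the sources below. "
        "Cite sources like [1]. If the sources do not contain the "
        "answer, reply exactly: I don't know.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}\nAnswer:"
    )


def is_grounded(answer, passages):
    """Cheap heuristic groundedness check: does any content word of the
    answer appear in the retrieved passages? Real services use far more
    sophisticated (often model-based) groundedness scoring."""
    if answer.strip() == "I don't know.":
        return True  # declining to answer is the safe, grounded outcome
    words = {w.lower().strip(".,") for w in answer.split() if len(w) > 4}
    source_text = " ".join(passages).lower()
    return any(w in source_text for w in words)
```

In practice the prompt from `build_grounded_prompt` would be sent to whichever hosted model you use, and a check like `is_grounded` (or a managed guardrail service) would run over the response before it reaches the user.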
