#AIHallucination
AI confidently invents false information, fooling unsuspecting users.
Origin & Impact
As ChatGPT usage spread, users discovered it would confidently generate fake citations, nonexistent laws, and fabricated facts. The term “hallucination” became the standard label for AI’s tendency to invent plausible-sounding but false information. Viral examples included lawyers submitting AI-fabricated case citations to courts and students turning in papers with imaginary sources. The phenomenon highlighted critical limitations of large language models despite their impressive capabilities.
Related Hashtags
#ChatGPT #AIlimitations #FactCheck