Hallucination (Code)
In the context of AI-generated code, a hallucination is when the model confidently produces something that doesn't actually exist: an import from a package that was never published, a method on an object that doesn't have it, a config flag that looks right but isn't real. The code looks fine, and the tests even pass if they don't cover that path. Then you deploy and it blows up. Or worse: an attacker squats the non-existent package name on npm, and your machine pulls their code the next time you run an install. This is called 'slopsquatting', and it's already a documented supply-chain attack vector.

Hallucinations are more common when you ask for obscure libraries, recent versions, or language features the model's training data didn't cover well.

The fix: always verify that imports actually exist, check method signatures against real docs (not AI-quoted docs), and run the code end-to-end before calling it done. Our checklist makes 'fake imports' the first thing you check, because it's the cheapest bug to catch and the most expensive to ignore.
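As a minimal sketch of the "verify imports actually exist" step, the snippet below uses Python's standard-library `importlib.util.find_spec` to flag import names that don't resolve in the current environment. A flagged name is either hallucinated or simply not installed; either way it needs a human look before you run an install command against it. The module names here are illustrative, and `definitely_not_a_real_pkg` is a hypothetical stand-in for a hallucinated import.

```python
import importlib.util

def find_unresolvable_imports(module_names):
    """Return the subset of module_names that cannot be resolved locally.

    find_spec returns None when no importable top-level module matches
    the name, which catches both hallucinated packages and packages
    that were never installed.
    """
    missing = []
    for name in module_names:
        if importlib.util.find_spec(name) is None:
            missing.append(name)
    return missing

# 'json' is stdlib and real; the second name stands in for a
# hallucinated import and should be flagged.
print(find_unresolvable_imports(["json", "definitely_not_a_real_pkg"]))
# → ['definitely_not_a_real_pkg']
```

Note this only checks local resolvability, which is the point: if a name is missing locally, look it up in the registry yourself rather than blindly installing it, since the slopsquatted package may exist there on purpose.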