AI Hallucinations Introduce New Software Supply Chain Vulnerabilities

Researchers from the University of Texas at San Antonio, the University of Oklahoma, and Virginia Tech have identified a novel software supply chain threat arising from hallucinations in code-generating large language models. The phenomenon, termed slopsquatting, occurs when LLMs suggest fictitious package names during code generation. Malicious actors can exploit this by publishing packages under those non-existent names, leading unsuspecting developers to incorporate potentially harmful code into their projects.

Key Findings

In a study of 16 popular LLMs, none were free from package hallucinations. The models generated over 205,000 unique fictitious package names, 81 percent of which were unique to the model that produced them. Commercial models exhibited hallucination rates of at least 5.2 percent, while open-source models showed higher rates at 21.7 percent. Notably, 58 percent of these hallucinati...
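Because a hallucinated package name only becomes dangerous once a developer installs it, one practical mitigation is to screen LLM-suggested dependencies against a vetted allowlist before running the installer. The following is a minimal Python sketch of that idea; the package names and the helper function are illustrative assumptions, not something described in the study (in practice a team would also query the registry itself, e.g. PyPI's JSON metadata endpoint, and inspect a package's publish date and download history before trusting it):

```python
# Hypothetical sketch: flag LLM-suggested dependencies that are not on a
# vetted allowlist before installing them. The allowlist contents and the
# example suggestions below are made up for illustration.

TRUSTED_PACKAGES = {
    "requests", "numpy", "pandas", "flask",  # packages your team has vetted
}

def flag_suspect_packages(suggested, trusted=TRUSTED_PACKAGES):
    """Return the suggested package names absent from the trusted allowlist.

    Anything returned here should be held for manual review (registry
    lookup, maintainer check, publish-date check) rather than installed.
    """
    return sorted(set(suggested) - set(trusted))

if __name__ == "__main__":
    llm_suggestions = ["requests", "fastjson-utils", "numpy", "easy-crypto-tools"]
    # The two unfamiliar names are flagged instead of being installed blindly.
    print(flag_suspect_packages(llm_suggestions))
```

A simple allowlist will not catch every slopsquatting attempt, but it converts a silent `pip install` of a hallucinated name into an explicit review step, which is where this class of attack is easiest to stop.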