Subj : Hallucinated packages could be the next big security risk hitting
To   : All
From : TechnologyDaily
Date : Tue Apr 02 2024 15:00:05

Hallucinated packages could be the next big security risk hitting AI developers

Date: Tue, 02 Apr 2024 13:49:19 +0000

Description: What if someone creates a previously hallucinated package - and then makes it malicious?

FULL STORY
======================================================================

The risk of Generative AI tools hallucinating - suggesting sources or tools that don't exist - has long been a concern for developers. Now, experts have warned that if a threat actor discovers a Generative AI hallucination of, let's say, a software package, they can actually build that package and make it malicious. That way, hugely popular AI tools end up distributing malware.

Not purely theoretical

Bar Lanyado, a cybersecurity researcher at Lasso Security, recently set out to see whether the risk is purely theoretical, and concluded that it could be abused in the wild.

For his analysis, he collected almost 50,000 "how to" questions that developers might ask Generative AI tools while building a software solution. He focused on five ecosystems - Python, Node.js, Go, .NET, and Ruby - and put the questions to GPT-3.5 Turbo, GPT-4, Gemini Pro, and Coral.

GPT-4 hallucinated (essentially, made software packages up) 24.2% of the time, repeating the same answers in 19.6% of cases. GPT-3.5 hallucinated 22.2% of the time, with 13.6% repetitiveness, while Gemini hallucinated 64.5% of the time, with 14% repetitiveness. Finally, Coral returned hallucinations 29.1% of the time, with 24.2% repetitiveness.

So far, so good: in theory, these four tools would often suggest that developers download the same non-existent packages. If the researcher noticed this, so could hackers, who could create those hallucinated packages to carry malicious code and let Gen AI promote them.

It works in practice, too, Lanyado said. He took one of the hallucinated package names and published it. To verify the number of real downloads, he also uploaded a dummy package, so that automated scanner downloads could be eliminated from the total. The results were astonishing, he concluded: in three months, the fake, empty package got more than 30,000 authentic downloads - and still counting.

More from TechRadar Pro

ChatGPT malware use is growing at an alarming rate
Here's a list of the best firewalls around today
These are the best endpoint security tools right now

======================================================================
Link to news story:
https://www.techradar.com/pro/security/hallucinated-packages-could-be-the-next-big-security-risk-hitting-ai-developers

--- Mystic BBS v1.12 A47 (Linux/64)
 * Origin: tqwNet Technology News (1337:1/100)