Hallucination attack
March 30, 2024 2:52 PM   Subscribe

"During our research [we] encountered an interesting python hallucinated package called “huggingface-cli”... In three months the fake and empty package got more than 30k authentic downloads! (and still counting)... Our findings revealed that several large companies either use or recommend this package in their repositories..." [via The Register]
posted by clawsoon (9 comments total) 5 users marked this as a favorite
 
This is not the cyber-dystopia we were promised, but I'll take it, for now.
posted by signal at 3:10 PM on March 30


Of course, this wouldn’t detect hallucinated packages someone else already noticed and squatted!

I usually mentally model LLMs as an eager but sloppy intern whose work is only useful if it’s faster to check than redo, but I hadn’t really thought of people adversarially trying to figure out what wrong answers the AI is likely to give other people. So it’s more like a sloppy intern who talks about his work in a bar full of competitors and criminals every night.
posted by smelendez at 4:04 PM on March 30 [3 favorites]


And my work just released an internal AI tool based on ChatGPT 4. Terrifying.
posted by lock robster at 4:21 PM on March 30


And my work just released an internal AI tool based on ChatGPT 4. Terrifying.

Might wanna send them the article...
posted by clawsoon at 4:29 PM on March 30


Much like the wild west of crypto scams, wallet draining malware, rugpulls, and speedrunning the entire menu of financial and securities crimes, LLMs are creating a whole landscape of new threats.

It's an interesting parallel to consider how there are two distinct but inseparable issues - how useful and effective are these technologies at their intended purposes, and how damaging are all of the new ways for bad actors to exploit them?
posted by allegedly at 4:46 PM on March 30


I keep saying that the current round of "AI" is a bunch of bullshit just like crypto, the .com bubble, and every other tech wet dream that was going to make billions without there being an obvious problem the tech could actually solve.

Two to five years from now, the grift will have moved on to something else, and we'll all mostly forget about the AI hype.
posted by Ickster at 5:06 PM on March 30


Two to five years from now, the grift will have moved on to something else, and we'll all mostly forget about the AI hype.

Except the aforementioned fads were all more-or-less inside-baseball kind of tech stuff. AI, on the other hand, is being wedged into all manner of everyday stuff that has direct effects on regular folk, who are going to be harmed by it. It’s a whole different level of terrible.
posted by Thorzdad at 5:17 PM on March 30


The real package for the huggingface CLI is huggingface-hub, so it's an easy mistake to make. If the LLM is ingesting wrong package references from existing code, this is GIGO (garbage in, garbage out).
posted by muddgirl at 5:18 PM on March 30
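
[A minimal sketch of the kind of pre-install sanity check this thread implies, assuming only the Python standard library. The PEP 503 name-normalization rule and PyPI's `https://pypi.org/pypi/<name>/json` endpoint are real; the helper names here are made up for illustration.]

```python
import re
import urllib.error
import urllib.request

def normalize(name: str) -> str:
    """PEP 503 name normalization: lowercase, collapse runs of '-', '_', '.' to one dash."""
    return re.sub(r"[-_.]+", "-", name).lower()

def exists_on_pypi(name: str) -> bool:
    """True if the project is registered on PyPI (network call; a 404 means no such project)."""
    url = f"https://pypi.org/pypi/{normalize(name)}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

[Note that, as pointed out upthread, an existence check like this only catches names nobody has registered at all; a hallucinated name that an attacker has already squatted passes it, so it's a first filter, not a defense.]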


This is one reason I hate the term "hallucinations."
posted by muddgirl at 5:19 PM on March 30 [1 favorite]

