1 Matching Annotations
  1. Last 7 days
    1. Hallucinated packages are the sleeper threat. LLMs regularly invent package names that don't exist. One study found that nearly 20% of AI-recommended packages were fabrications, and 43% of those hallucinated names appeared consistently across queries.

      Most people assume the packages an AI recommends actually exist, but the author shows that AI frequently recommends nonexistent ones, and this has become a new attack vector. Attackers exploit the phenomenon by registering these 'hallucinated packages' and seeding them with malicious code; this 'slopsquatting' technique turns the AI itself into an amplifier for supply-chain attacks.
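
      One practical defense is to verify AI-suggested dependency names against the registry before installing anything. A minimal sketch in Python, assuming PyPI's public JSON API (the package names in the example are illustrative):

      ```python
      import urllib.error
      import urllib.request


      def exists_on_pypi(name: str) -> bool:
          """Return True if `name` is a published PyPI project.

          PyPI's JSON API returns 404 for unregistered names, which is
          the signature of a hallucinated (or typo'd) package name.
          """
          url = f"https://pypi.org/pypi/{name}/json"
          try:
              with urllib.request.urlopen(url, timeout=10) as resp:
                  return resp.status == 200
          except urllib.error.HTTPError as err:
              if err.code == 404:
                  return False
              raise  # outages or rate limits are not evidence either way


      def vet_suggestions(names):
          """Partition AI-suggested package names into registered vs. unknown."""
          registered, unknown = [], []
          for name in names:
              (registered if exists_on_pypi(name) else unknown).append(name)
          return registered, unknown


      if __name__ == "__main__":
          ok, suspect = vet_suggestions(["requests", "definitely-not-a-real-pkg-zz9"])
          print("registered:", ok)
          print("possibly hallucinated:", suspect)
      ```

      Note the limitation: this only flags names that are not registered at all. A hallucinated name an attacker has already squatted will pass the check, so registry lookups complement, rather than replace, manual dependency review.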