Understanding AI Package Hallucination: The latest dependency security threat

  • Published: 21 Jul 2024
  • In this video, we explore AI package hallucination: a threat that arises when AI code-generation tools hallucinate open-source packages or libraries that don't exist. We explain why this happens, show a demo of ChatGPT inventing multiple nonexistent packages, and discuss why this is a prominent threat and how malicious hackers could harness it. It is the next evolution of typosquatting. A minimal defensive check is sketched after the chapter list below.
    Introduction: 0:00
    What is AI package hallucination: 0:12
    Sacrifice to the YouTube Gods: 0:33
    How AI models find relationships: 0:45
    Lawyer uses hallucinated legal cases: 1:18
    How we use open-source packages: 1:39
    How ChatGPT promotes packages: 2:17
    Example of AI package hallucination: 2:51
    Why is package hallucination a security risk: 3:46
    How many packages are hallucinated: 5:37
    Protection measures against AI package hallucination: 6:18
  • Science
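
As a concrete illustration of the protection measures the video discusses, here is a minimal sketch (not from the video; the helper and the package names are hypothetical) that checks AI-suggested dependency names against the PyPI JSON API before anything is installed:

```python
# Minimal sketch: verify AI-suggested package names against PyPI
# before installing them. The "suggested" list is a hypothetical
# example of names an AI assistant might produce.
import urllib.error
import urllib.request


def exists_on_pypi(name: str) -> bool:
    """Return True if `name` is a registered project on PyPI."""
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            # No such project: the name may be hallucinated.
            return False
        raise  # Other HTTP errors are real failures; surface them.


suggested = ["requests", "flask-gpt-utils"]  # hypothetical AI output
for pkg in suggested:
    if exists_on_pypi(pkg):
        print(f"{pkg}: found on PyPI (still vet it before installing)")
    else:
        print(f"{pkg}: NOT on PyPI; likely hallucinated, do not install")
```

Note that existence alone is not proof of safety: an attacker may have already registered a commonly hallucinated name, so also vet unfamiliar packages (download counts, maintainers, linked repository) before installing.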

Comments • 3

  • @Kabodanki  3 months ago  +2

    I work at an AI company, and I have to say... GPT is flawed; it is just a step toward something else. Hallucination means we don't know what the answer will be. We can tweak, but ultimately we are never 99% sure of the answer, and for a lot of use cases that is absolutely unacceptable. Most of our clients have a hard time tuning their setup.

    • @GitGuardian  3 months ago  +1

      It is going to be very interesting to watch what comes next. Thanks for sharing your insights.