WebLLM: A high-performance in-browser LLM Inference engine

  • Published: 21 Nov 2024

Comments • 3

  • @erkintek 10 hours ago +1

    3 years ago I ran some models in TensorFlow.js, also on mobile. They were not performant, but way cheaper than running on a server.

  • @SriniVenkata-my4uw 11 hours ago

    Excellent video

  • @nri_raj 17 hours ago +1

    3rd like 😂
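
The first comment mentions running models in the browser with TensorFlow.js, the same in-browser inference idea WebLLM builds on for LLMs. A minimal sketch of what that looks like with the TensorFlow.js API (not from the video; the model URL and input shape are placeholders, assuming a Keras-style layers model exported as model.json):

    import * as tf from '@tensorflow/tfjs';

    // Placeholder URL; any TF.js layers model hosted as model.json works here.
    const MODEL_URL = 'https://example.com/model/model.json';

    async function run(): Promise<void> {
      // Fetch the model topology and weights, then build the model
      // entirely in the browser; no server-side inference is involved.
      const model = await tf.loadLayersModel(MODEL_URL);

      // Dummy input; the shape must match the model's expected input.
      const input = tf.zeros([1, 224, 224, 3]);
      const output = model.predict(input) as tf.Tensor;
      output.print();

      // TF.js tensors hold GPU/CPU buffers, so free them explicitly.
      input.dispose();
      output.dispose();
    }

    run();

Inference runs on the client's backend (WebGL, WASM, or WebGPU), which is the trade-off the commenter describes: no server-side inference cost, at the price of client performance.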