How Transformer.js Can Help You Create Smarter AI In Your Browser

  • Published: Nov 2, 2024

Comments • 5

  • @torquebiker9959 · 5 months ago +1

    wow, that's nice!!

  • @MaxM9000 · 10 months ago +2

    My biggest gripe with this library is that the model conversion process is not well documented. I've tried converting and using several types of models for supported tasks, and they always give me a problem in some way when I try to load them.

    • @nerding_io · 10 months ago

      I've had some issues too, and it took some digging. However, I posted a GitHub issue and it was resolved quickly. Plus, it was fun to dig.
      What were you converting? Maybe I can make a video on the conversion process.

    • @MaxM9000 · 10 months ago +2

      @nerding_io I've tried a small range of models, including BERT (for pooled embeddings), Flan-T5 (for conditional text generation with LangChain.js), and Whisper (for automatic speech recognition on Node.js, Electron, and React/React Native). None of these projects has ever made it past loading the converted models.
      My main conversion method is the optimum-cli export command. Transformers.js also ships its own conversion script, but using it means downloading the repo, and it adds quantization on top of the conversion, which I don't need. Even when I have used that script, I still can't get the models working.
      A video covering both methods and showing them actually working would be a great help to people who want to convert their models for mobile, web, or local applications.
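For context on the workflow discussed in this thread, here is a minimal sketch (not taken from the video) of loading a locally converted ONNX model with Transformers.js. It assumes the model was exported beforehand with optimum-cli, that the output was placed under a local ./models/ folder in the layout Transformers.js looks for, and that the model id 'bert-base-uncased' and all paths are placeholders.

```js
// Assumed setup: the model was exported ahead of time with, for example,
//   optimum-cli export onnx --model bert-base-uncased ./models/bert-base-uncased/
// and the resulting model.onnx was then moved into an onnx/ subfolder, so that
// ./models/bert-base-uncased/ contains config.json, the tokenizer files,
// and onnx/model.onnx (the location Transformers.js checks for local models).
import { pipeline, env } from '@xenova/transformers';

env.localModelPath = './models/';  // look for models in this local folder
env.allowRemoteModels = false;     // fail fast instead of falling back to the Hub

// quantized: false because the optimum-cli export above produces an
// unquantized model.onnx rather than the model_quantized.onnx default.
const extractor = await pipeline('feature-extraction', 'bert-base-uncased', {
  quantized: false,
});

// Mean-pooled, normalized sentence embedding (the BERT use case mentioned above).
const embedding = await extractor('Hello world', { pooling: 'mean', normalize: true });
console.log(embedding.dims);
```

In my experience, the two details that most often break locally converted models are the onnx/ subfolder layout and the quantized flag, since Transformers.js defaults to looking for a quantized model file.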