My biggest gripe with this library is that the model conversion process is not well documented. I've tried converting and using several types of models for supported tasks, and something always goes wrong when I try to load them.
I've had some issues too, and it took some digging. However, I posted a GitHub issue and it was resolved quickly. Plus, it was fun to dig. What were you converting? Maybe I can make a video on the conversion process.
@nerding_io I've tried a small range of models, including BERT (for the pooled embeddings), Flan-T5 (for conditional text generation with LangChainJS), and Whisper (for automatic speech recognition on Node.js, Electron, and React/React Native). None of these projects have actually made it past trying to load the models. My main method of conversion is the optimum-cli export command. Transformers.js also has its own script, but I'd have to download their repo, and their conversion script also adds quantization when I just need the models converted. Even when I've used that script, I can't get the models working. A video going over both methods and showing them working would be a great help to people who want to convert their models for mobile/web/local applications.
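For anyone following along, here's a minimal sketch of the two conversion routes mentioned above. The model ID (`google/flan-t5-small`), output folder names, and exact flags are my own illustration, not from the thread; double-check them against the Optimum and Transformers.js docs for your model.

```shell
# Route 1: Optimum's exporter (no quantization by default).
pip install "optimum[exporters]"
optimum-cli export onnx --model google/flan-t5-small flan_t5_onnx/

# Route 2: the Transformers.js conversion script (requires cloning the repo;
# this script quantizes in addition to converting).
git clone https://github.com/xenova/transformers.js.git
cd transformers.js
pip install -r scripts/requirements.txt
python -m scripts.convert --model_id google/flan-t5-small
```

One gotcha that may explain the loading failures: if I remember right, Transformers.js expects the exported `.onnx` files to live in an `onnx/` subfolder of the model directory (e.g. `flan_t5_onnx/onnx/model.onnx`), so a plain optimum-cli export often needs its files moved before the library will load them. Worth verifying against the Transformers.js "custom models" documentation.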
Wow, that's nice!!
Thanks!