@@yanaitalk I am getting this error: No module named 'transformers.models.cohere.configuration_cohere' at the line [llama3 = AutoModelForCausalLM.from_pretrained(..)]. Somewhere on the internet I found that I need to update the transformers version, so I changed it in requirements.txt to the latest version, but I am still getting the error. Can anyone help with that, please? Also, I downloaded the whole model from Hugging Face and uploaded it to my drive. Can I use it directly instead of downloading it with the line above where I am getting the error?
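On the second question: `from_pretrained` accepts a local directory path as well as a hub ID, so a checkpoint already saved to a mounted Drive folder can be loaded directly. A minimal sketch of that idea, with a small guard so you fall back to downloading when the folder is incomplete (the Drive path, hub ID, and helper name here are illustrative, not from the notebook):

```python
from pathlib import Path

def resolve_model_source(local_dir: str, hub_id: str) -> str:
    """Return the local checkpoint directory if it looks usable,
    otherwise fall back to the hub ID so from_pretrained downloads it."""
    p = Path(local_dir)
    # A usable local checkpoint has at least a config.json next to the weights.
    if p.is_dir() and (p / "config.json").is_file():
        return str(p)
    return hub_id

# Usage (paths are hypothetical; assumes Drive is mounted in Colab):
# source = resolve_model_source("/content/drive/MyDrive/llama3",
#                               "meta-llama/Meta-Llama-3-8B")
# llama3 = AutoModelForCausalLM.from_pretrained(source)
```

Either way, the missing-module error itself usually points at a stale `transformers` install in the running kernel, so the runtime needs a restart after upgrading.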
Nice video and notebook, thanks for sharing!
Thank you!!
So this isn't a local Llama 3 model? Does Hugging Face store API request data?
This is a local model, downloaded from Hugging Face.
@@yanaitalk Just to be clear, you downloaded the Llama 3 model to your hard drive?
@@kylewilliams4721 Yes, in the notebook it is downloaded to the local drive in Colab.
Superb.. Thanks