A video on code instruct would be very useful.
Can I just say that thumbnail is amazing!!
A bunch of future videos on LLMs seems like a good idea. I see many good ideas in the comments.
My contribution:
Benchmarks (different CPUs, memory speeds)
Different GPUs
Running on gaming laptops, since they are super popular
Nvidia vs AMD
@Gar, this is dead on.
Can you share your thoughts on building large language models specific to an industry that can be run locally? Also, a video on local code generation would be great. Thanks.
Yes please do a video on code instruct.
Yes please tell us about code instruct.
Great video! If you can run a large LLM on your PC, you can also run it on a hosting service, right? How would this work if you want to use the LLM for AI voice calling?
This is amazing, thanks for uploading this. I followed your instructions and, omg, local AI 👍🍻
Hi, it would be nice to see the code instruct case. Is it possible for these models to interact with applications on your machine, such as MS Excel? A use case would be where you give it a task and it opens Excel, writes to a file and saves it to a location, or opens an existing file, reads some data and interprets the information. Please help.
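The model itself won't open applications, but a small script can do the bridging: ask a locally served model for content through an OpenAI-compatible endpoint and then write the result to an Excel file. Below is a rough sketch assuming Python with the requests and openpyxl packages and a local server on port 1234; the URL, model name, prompt, and file name are all illustrative assumptions, not anything from the video.

```python
# Rough sketch: ask a locally hosted model for some rows, then save them to an .xlsx file.
# The endpoint, model name, prompt, and output path are assumptions; adapt them to your setup.
import requests
from openpyxl import Workbook

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",
        "messages": [{
            "role": "user",
            "content": "List three months with example sales figures, one 'month,amount' pair per line.",
        }],
    },
    timeout=120,
)
text = resp.json()["choices"][0]["message"]["content"]

wb = Workbook()
ws = wb.active
ws.append(["Month", "Amount"])
for line in text.splitlines():
    if "," in line:
        month, amount = line.split(",", 1)
        ws.append([month.strip(), amount.strip()])
wb.save("model_output.xlsx")
```

Opening an existing file and asking the model to interpret it works the same way in reverse: read the cells with openpyxl and include them in the prompt.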
I see CPU-only, locally run Windows AI chatbots gaining more and more acceptance because many people simply can't afford expensive GPUs. This will spread from nerds to more mainstream users in the same way PCs first arrived on the scene. I'm looking forward to more videos on open-source LLMs and effective speed-up tweaks for CPU-only setups.
GPU prices may go down very soon. Canon, a former leader in chip-making equipment, has recently developed a technology that, if it becomes mainstream, could reduce costs by 100x or more.
Hi Gary, you are the best. I'm a newbie and have a question: I exceeded the token limit. Is there any way to get more tokens, or do we have to wait, and if so, how long? Thanks Gary.
Not that signed necessarily means it is safe either, but the chances of a problem are slimmer.
I do hope they just open-source it once they are ready for that, just like LLaMA itself.
Apparently the Windows .exe is now signed. 👍
Hi Gary, can you make an episode using this software to create multiple agents on our own machine that will create programs? The lead agent suggests a variety of projects based on keywords (e.g. automation of business cards) and the other agents create a program that can do it. That would be fun to have on my PC.
Will all the restrictions be removed or no?
Code instruct should be next
I definitely want to learn about the Code one. I'd like to make a Chat one to help people who are shut-ins, people afraid to go out. They would have something to be a virtual friend. How would someone with little coding ability teach a model?
Say I want to make something that builds a desktop application for Windows and Mac and makes it executable by clicking an icon: how do I load a coding language into it? It would be great if I could teach it any subject using PDFs, or use scholarly articles about psychology, for example, for it to learn psychology. Then make it capable of going on the web to find information if the information isn't in the model. Can you show how to do that? That would be so amazingly helpful, and I'd be able to cancel my ChatGPT subscription.
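One low-effort way to get part of the way there, without retraining anything, is to pull the text out of a PDF and hand it to the locally served model as context alongside each question. A minimal sketch, assuming Python with the pypdf package and an OpenAI-compatible local server on port 1234 (the file name, endpoint, and model name are assumptions, not anything from the video):

```python
# Minimal sketch: answer a question using text extracted from a local PDF as context.
# The file name, endpoint, and model name are assumptions; adjust them to your setup.
import requests
from pypdf import PdfReader

reader = PdfReader("psychology_article.pdf")  # any local PDF you want the model to draw on
context = "\n".join(page.extract_text() or "" for page in reader.pages[:5])  # first few pages only

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",
        "messages": [
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: What is the article's main claim?"},
        ],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```

Searching across many documents, or falling back to the web when the answer isn't there, is the same idea scaled up (usually called retrieval-augmented generation), but that is a bigger project than a single script.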
I just installed this on my MBA 13", wish I had more RAM, but about 17% downloaded at 12:33am. Might be a Saturday project, going to bed. $0.02
Please tell us about code instruct, Gary 🙂
Do I need an Nvidia GPU, or is AMD fine?
Can we access the app via some form of API using localhost?
Yes. There is an OpenAI-compatible localhost server.
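For anyone wondering what that looks like in practice, here is a minimal sketch in Python that talks to an OpenAI-compatible local server. The port (1234), model name, and prompt are assumptions; check the Server tab in LM Studio for the address it actually prints.

```python
# Minimal sketch: query a locally hosted, OpenAI-compatible server from Python.
# The base_url, model name, and prompt are assumptions; adjust them to match your setup.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="local-model",  # many local servers simply use whichever model is currently loaded
    messages=[{"role": "user", "content": "Explain in one sentence what a local LLM server is."}],
)
print(response.choices[0].message.content)
```

Because the request format matches OpenAI's, most existing OpenAI client code can be pointed at the local server just by changing the base URL and API key.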
Code instruct please!
Good video.
Glad you enjoyed it
I get the infamous error "Failed to load model. Error: Error loading model. Exit code: 1" for all models that I've tried.
I have a very beefy laptop: an Intel(R) Core(TM) i9-10885H CPU (yes, it has AVX2), 64 GB of RAM, and an NVIDIA Quadro T2000 with Max-Q Design GPU.
I have installed the C++ requirements, I have AVX2, I have a good enough CPU, the latest version of Windows 10, the 0.25 version of LM Studio, and the latest GPU drivers.
I have tried smaller and bigger models, but nothing works and I am stuck with the above error :(
Did you download the models inside of LM Studio or externally? Did you change any of the default settings (like GPU acceleration, etc.)?
@@GaryExplains I used LM Studio to download the models and they are in a location like C:\Users\user\.cache\lm-studio\models
I tried enabling/disabling GPU acceleration with 5 layers, but without any luck. All other settings are default.
Oh. That isn't good. Sadly I can't help more than that, as it works fine for me on Windows and Mac. I am also using the 0.25 version now. I guess you could try contacting LM Studio via Discord.
@@GaryExplains Done that.
They told me they will release a debugging version to get more info on the exit code 1 error, as it is a popular topic on Discord.
Thanks, and keep up the good work.
And off topic, as a funny thing: when I try to explain something to my girlfriend (or past girlfriends), they all rolled their eyes, signaling that they don't want any explanation (you know, the new feminist mansplaining-shaming tactic).
When I see each of your videos and you say "if you wanna know more, let me explain", I always imagine your wife rolling her eyes and internally saying Noooooooo :)
So guess what just happened. I installed LM Studio on another PC, and I am seeing the same error as you!!!
I can't load my model because I keep getting an error: error loading model, exit code: 1.
I need this but with code, not a no-code tool.
An LLM locally, with any language?
Try the Code Llama model.
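For anyone who, like the comment above, wants code rather than a no-code tool, one common route is the llama-cpp-python package, which loads GGUF model files directly from a script. This is a hedged sketch; the package choice, model file name, and prompt are assumptions, not something from the video.

```python
# Hypothetical sketch: run a local GGUF model (e.g. a Code Llama build) from Python code.
# The model path and prompt are illustrative; point model_path at a GGUF file you have downloaded.
from llama_cpp import Llama

llm = Llama(model_path="./models/codellama-7b-instruct.Q4_K_M.gguf")

output = llm(
    "Write a Python function that reverses a string.",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```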
There is a beta for a Linux version.
Q: Should all non-lawyer people get a court-appointed AI attorney?
Obviously not.
You would need a human lawyer, so if you just used an AI, you would effectively be representing yourself as a "lawyer". I am not sure it is even possible for you to run an AI in court.
So, as Gary wrote, obviously not, lol.
Wait, your name isn't Garry Explains?
🤦♂️
@@GaryExplains chill, it was a joke.
🤦♂️
Holy sh*t, I thought Explains was just a weird last name until I read your comment. So his username is like that because he “explains” things? 🤯🤯
@@Gubby-Man I honestly can't tell if you are being sarcastic. But just in case you aren't, I literally say at the beginning of most of my videos, "hello my name is Gary Sims and this is Gary Explains" 🤦♂️
Not only is this garbage not signed, but it also installs itself on a system volume without any prompt or option to change the destination 🤬
Hmmm. It is signed. That changed shortly after the video was released. Are you sure you are using a legitimate version? 😲
Unsigned binary... yeah, no thanks lol.
They fixed that since I made the video 👍
How Do We Run Our Own ChatGPT-like LLM On Our Brain?
I have exams, that would be helpful!
Thanks In Advance 🫠
There are some projects that use this. But I won't give more details, you cheating bstrd.
So what? If he can get a better grade, why wouldn't he? How can you be so selfish? @@thcookieh
@@doublea9891 Wow, so now not helping someone cheat, and letting people be mediocre, is selfish... OK, so be it.