A fairer comparison with cloud GPUs would take into account the cost of the energy consumed by your local machine :) Any idea what that costs a month if used 24/7?
What about storage? They charge you for storage, and they charge you for hosting databases too. So the energy cost is somewhat balanced. You also need to send your data back and forth, and that's energy required for the internet.
You're right. I did the math, and running a GPU 24/7 for a month would cost about 20 euros/month (consumption : 0.28 kW, price: 0.1 euros/kWh --> 0.28*0.1*24*30 = 20 euros / month), which is much less than my gut-level expectations.
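For anyone wanting to redo that electricity math with their own numbers, here's a quick sketch (the 0.28 kW draw and €0.10/kWh rate are just the figures from the comment above; swap in your own):

```python
# Rough monthly electricity cost of a machine running 24/7.
# power_kw and price_per_kwh here are the comment's example figures,
# not measurements -- plug in your own GPU's draw and local tariff.
def monthly_energy_cost(power_kw: float, price_per_kwh: float,
                        hours_per_day: float = 24, days: int = 30) -> float:
    """Cost of running continuously for `days` days."""
    return power_kw * price_per_kwh * hours_per_day * days

print(f"{monthly_energy_cost(0.28, 0.10):.2f} euros/month")  # -> 20.16 euros/month
```

Worth noting the result scales linearly with both draw and tariff, so at a 0.30 €/kWh tariff the same machine would cost about three times as much.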
Beautiful machine, that. Reminds me of the movie "War Games". You opted for the 2080 Ti there, but as far as I understand it does not have dual DMA engines. Apparently the Titan X (Pascal?) had them, which is unique in the GeForce lineup. Which do you think is more important: Tensor Cores or something like dual DMA for asynchronous communication with the host system?
I'm building an AI model rn for machine learning / deep learning, and I have built it with different novel approaches and techniques. Instead of using a huge deep learning model, I have made a design where it is as smart as a deep learning model with billions of nodes, but with just a few hundred million nodes, to match a multi-purpose model instead of one built for just one thing. It's still a work in progress though.
@@isbestlizard not sure about the PCIe spacing on that MB; the RTX 3090s are 2.5-3 slot cards. The Gigabyte Designare MB has 4 cm PCIe spacing (center to center), so the nearby PCIe slot gets covered by the 3090 cards! Means out of 4 slots you could use only 2!! And the PCIe spacing on the Designare is considered wide compared to others.
@@isbestlizard We did consider that option. Blower types are loud (down-clocked too), especially when running 4x 350W 3090s. We might use PCIe 4.0 riser cables to mount all four. A proper thermal solution will be another issue!
I had fun building my rig last year, I'm glad to see we practically chose the same components save for the cooling and the mobo
I have built machines all my life. My first build was a SOL20 back in 1977. It had 4K of RAM and used a TV as the monitor. You had 40 characters per line and 8 rows on the TV (the graphics card was capable of 80 characters per line and 16 rows, but it was too blurry on a TV screen). I used a tape recorder to back up my programs. The first program I wrote printed "I will not chew gum in class" 500 times. I used my dad's teletype machine to print it out and handed it to the teacher! Fast forward: I just built an AMD Threadripper with 16 cores and installed an NVIDIA 3090 GPU, in a Corsair Carbide Series Air 540 case with a Corsair 1000W power supply. It also has 32 GB of RAM and a 4TB M.2 SSD that set me back a bunch!
That old system sounds so cool! It’s crazy how far pc technology has come since then
man just how old are you? xD we new gen fellas fall far behind before we can experience such changes. Inspiring. Thank You.
@@argish I'm turning 61 this month. Now I am playing with Arduino controllers. I designed a can crusher using an Arduino controller and a NEMA 23 stepper motor, which produces 60 pounds of force! Look up "Kent Harris Arduino can crusher" on YouTube.
@@kentharris7427 cool stuff! I'll check it out!
@@kentharris7427 also, Happy 61st B'day in advance!!
This is exactly what I need & you explained it really clearly! Please keep going with more videos!
Thanks a ton for the video! As someone who is training RL models it was comforting to find someone who built a rig for the same purpose.
I would love to see an update on this one. What are you running today? I hope not the RTX 2080, at least not as your primary GPU.
Excellent! This was so clear and straight to the point! Thank you for this video! It will help me in my decision on what kind of Deep Learning rig to patch up in the upcoming months :)
What does your hardware look like today? Still Intel? Still NVidia?
Is a GPU with only 8GB VRAM even useful? (that's what I have)
Please continue with the videos, I really like them. I had lost motivation for learning ML, but your videos have inspired me to continue learning ML. Great work!
Thanks for the video. Very helpful. In my own experience of building machines, most of what you said is true. The only problem is when something goes wrong, mostly with hardware parts; then the headache can be very frustrating.
Would you suggest going with Intel for the CPU instead of AMD Ryzen, since Intel has some optimized libraries for Python, and there is even a model optimization toolkit called OpenVINO, which is also by Intel? Thanks!!!
But.. when you have to upgrade the CPU, you would need to replace the ENTIRE system (because Intel likes to change the socket EVERY 2 generations) 😂
Just a note: with Ryzen CPUs, get RAM with as high a clock speed as you can, because the CPU is built using chiplets, and the transfer speed between them is determined by your RAM speed.
I found water cooling absolutely essential to get to 4 GPUs and above. With 4x 2080s it throttled even in a 62-degree room with an air conditioner blower on it. I am now using four Ampere-based RTX 3090s and will be adding 3 more in January. It works well, but I have about 250 hours into it. My next machine I am buying from Bizon.
Great video. Exactly what I was looking for. Thanks a mil.
Useful vid... Thank you!
This is such a good video. I especially like the War Machine PC build. It is awesome!
I assume this setup would work incredibly well for gaming as well? In case one would want to use the machine for both activities?
Yes, but can it run Crysis?
@@theaihacker777 probably not sadly
@@theaihacker777 my Texas instrument TI-83 can. On FULL
@@daviedood2503 dude! I got one of those! So epiccc
@@CE-vd2px haha I used to program mine to do all my work for me. 😂
On one hand, I KNEW HOW to do the work manually. So if it was taken away from me, I could still churn out A's on my work; it'd just take longer. Said f that, put this little thing to work. So I wrote the code and saved it in one of the slots. That huge whopping 12 KB of massive storage..
Every problem I was given, I had a program for it. All I had to do was put in the variables, hit enter, and there was the answer..
It took the teacher longer to flip to the back of the book and find the given answer than it did for me to just type in the variables and hit enter. 😂
Hi Michael, any chance to do a video on the system OS set up of your AI training machine? Like what and how to install, things that we need to watch out for ... thanks!
Exactly what I was going to ask!
We need this.
This is very good info, especially the cloud comparison, as that is the major decision point. Thanks!!
How many GPUs did you finally end up with? What was the final configuration?
Ahhh this stuff is exciting! I literally just bought a used gpu server that's shipping with 4 quadro rtx 5000s. Partnering it up with 2x xeon gold 6148s and 384gb ddr4 2600 :D:D... scored it all for ~$4700 usd
I am going to nitpick because I expected a setup video, not a parts list. This is in good clean fun though; I did like the video even if I've seen too many "parts" videos lol. Keep up the good work, keep having fun with making videos, and stay silly my guy.
Few small corrections:
- (20) (80)
- PCIE, the I always comes first
- "Each GPU generally takes about 2 spaces of the PCIE slot" Welp, I have news for ya! Size is measured in slots, so if a card takes 2 PCIE slots, then it's a 2-slot card. Here's a mouthful: "The GPUs I chose are 2-slot cards, which means the PCIE slot adjacent to each one will be inaccessible, and that can be further hindered by bigger cards."
- Storage: you should've taken the opportunity to let newcomers know that an SSD comes in 2 forms, one being M.2 and the other being SATA. SATA is a bit slower than M.2.
- PSU, I would've liked to see a solid resource.
- Case isn't just for style. Airflow matters, Size for upgrades matter.
The claim that RAM clock speed will not yield positive results is wrong. AMD's Ryzen lineup REQUIRES faster RAM, as well as dual-channel RAM, in order to perform at peak efficiency. I have a Ryzen 9 5950X, and I upgraded from DDR4 2400 MHz to 3600 MHz and saw a very impressive boost in performance. I understand that Intel does not need it, but if you're running a Ryzen chip (AM4) you should absolutely aim for a minimum of 3200 MHz; if you can afford to go higher, by all means do so.
All out of love! Would love to see a tutorial on using Colossal-AI, or even beginner stuff like setting up a chatbot using OpenAI API, or a deep dive into this conversational AI by Jinho D. Choi emory.gitbook.io/conversational-ai/
Congrats. Very cool !
Your build is still very expensive; top-tier CPUs/GPUs are optimized for 3D games. You can spend maybe 500 bucks for a retired HP/Dell workstation with 2 Intel Xeon E5 CPUs, which easily provide 20 cores / 40 hyper-threads. Those Xeons are optimized for long-running, multi-threaded backend calculation. Regarding GPUs, you do not need an RTX 2080 Ti; buy 3 Nvidia Tesla K80 GPU cards. Each only costs 150 dollars but is equipped with 24 GB of GPU memory, so you get 20 CPU cores / 128 GB RAM / 72 GB of GPU memory for only 950 USD. That server build is purely for deep learning; you cannot play 3D games on such a server, but for a true deep learning fan, that is the cheap and powerful machine needed.
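As a sanity check on that budget math (using the comment's quoted prices, which are not current market rates):

```python
# Itemized total for the suggested used-workstation build.
# Prices are the figures quoted in the comment, not current market rates.
workstation_usd = 500          # retired dual-Xeon HP/Dell box, incl. 128 GB RAM
k80_usd, k80_count = 150, 3    # Tesla K80, 24 GB of GPU memory each
total_usd = workstation_usd + k80_count * k80_usd
total_vram_gb = k80_count * 24
print(total_usd, "USD for", total_vram_gb, "GB of GPU memory")  # -> 950 USD for 72 GB of GPU memory
```

The quoted numbers do add up, though anyone pricing this today should re-check eBay listings first.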
Thanks bro! Totally like your videos & your energy!
Thanks, loved your recommendation.
Have you updated it since then, and if yes, what are the specs for 2024?
This guy is really good at explaining
Thanks Michael for such a crisp and informative video. May I ask, which OS do you use for your War Machine?
Are RAID volumes really needed for these setups?
That was awesome ! Thanks man
I want to go all out and get the best bang for my buck: gets an Intel
In the context of deep learning, yes.
LOL
@@unknownperson3691 I guess if you ignore threadrippers being a thing then sure.
lmao
@@elGringo69 epic! lmao
Did you burn up your SSDs? I did when ETH mining.. SLC is the only way to go, and that means Optane (petabyte writes) as the PCIe-linked NVMe for the data dump (writes), as it is as fast as the PCIe 4 bandwidth (4x), and the lanes are really only 8x, since 16x is 2 8x lanes.
What do you think about ECC RAM? is it important?
I have a spare 3080 and want to build a deep learning machine/home server PC. Probably wont be that fast, but I figure it will be good enough for me to further my education.
Brilliant!! Thank you for sharing 🙏
Hi A.I. Hacker, just ordered a P620 Thinkstation with RTX A4000 GPU. I did not do a custom build, because I am relying on the Workstation e.g. it has ECC Ram + 3 year Premier support + I guess the resale value is higher than a custom build. But great build + video ;-)
Have you tested the BERT model with 1 GB of corpus data on your deep learning machine?
Which GPU is needed to start deep learning?
Cool! 👏 Thank you for the video! What about the OS? Maybe Linux? If Linux, which one?
Hi, many thanks for the great video. One quick question, if I choose AMD Ryzen, would I be able to install CUDA suit on it?
Do you use the Lambda stack?
Is it possible to add more GPU processing power with additional external GPUs? At what point does it make more sense to use an AMD Threadripper with 64 cores, 128 threads, and up to 2 TB of RAM?
Hey Michael, any suggestions for power backup in case of a power outage?
Super helpful video! Thanks for sharing this!
If they don't have any cheap motherboards, you can design the PCB in a modeling app and send it to PCBway.com, and they will make it for you. Then you can add the components yourself.
Can I use a different GPU, or a different-generation GPU?
If I buy old GPUs from a marketplace, I can save even more.
Thanks, really super build for deep learning.
Does a multi GPU setup need to use SLI? I have an HP Z840 Dual Xeon workstation where only specific Quadro cards support SLI
I'm in this situation right now - need to buy a rig for my PhD work. Need to do reinforcement and federated learning in addition to deep learning.
The more I'm looking, the more confused I become.
How about expansion? What if I split configuration across 2 towers? And keep updating them? I would need additional software to virtualize all of it?
Hi, super explanatory and easy to follow video! Do you have any updates? Maybe using "cheap" AMDs like 6700xt? haha
I built a mining rig with 9x3090’s and a 5950x … I wonder if I can turn that into a deep learning rig? Do training machines work well, or at all, with pcie risers?
I loved this video, thank you!!!!
Question: how about those new AMD CPUs? Are they good as well?
They work as well! I needed an AMD chip that supports 4 GPUs, and the latest models with 4-GPU support were more expensive than my model, although they did do better on benchmarks.
@@theaihacker777 Thanks for your reply. I am thinking of the latest AMD Ryzen Threadripper 3970X (32 cores), but do you think even the AMD 3950X will deliver almost similar performance?
@@chrish8941 oh, thanks for those details.
Does multithreading require syncing with all of the other threads? Because as I imagine how it would work, some of the threads might get to the 3rd layer while the other ones are still at the 1st layer of the NN.
is this for running things like blender AI, fusion 360, stable diffusion, and deepfakes? Im curious about this stuff and wanna know how to build a pc for that in mind. Thanks for the video!
Great video! I am fascinated by the tension of computers; what is it that you used this machine for?
Hey Michael... thank you so much for the info. Have a question though: can I set up this machine and allow my ML team to access it to perform training activities? Have you tried that?
I'm building my own HPC cluster farm, 4 Clusters with 32 servers each to start. Do you have any experience running your DL over clusters?
One other issue with training in the cloud is getting your data there quickly. Getting 500 gigs from different sources into the cloud for an experiment is just a pain in the ass and can take a couple of days.
Was thinking of making an AI rig. I have 8x 3090s and a 5800X, or I could also get a 5900X. Where should I start?
thanks, bro. well explained.
Everyone seems to have built their own dual-GPU setup, or at least tried using an inexpensive crypto/gaming setup with a high-end motherboard for machine learning. That has worked for many, but the question I think you should really answer is: WHAT LINUX DISTRO did you use?
Michael,
Thanks for the video. I needed some help with building my rig.
Would it be possible to do a brief chat?
I read online that 64 cores is ideal. Xeon maybe, or AMD Threadripper. GPU and RAM matter too.
In the case of doing reinforcement learning in Python and building your environment in a separate simulator (linking them via an API), is it still possible to use the cloud?
What do you think about the 24 GB Tesla K80 video card with dual GPUs? They are going for $170 on eBay (I picked up 4 of them). I will be picking up 1 of the new 3090/3080 cards for my system when they come out. But for now, how do you think the K80 will hold up for deep learning?
Could you comment on overclocking the CPU for machine learning? When I do overnight runs, I feel like stability of the CPU is more important than a few percentage gains from overclocking.
If overclocked correctly, the OC should have no impact on stability. If it does, reduce the OC (or take a different OC path / more voltage etc.) till it doesn't. The only things you need to worry about with OC are the temp your room will take and the power consumption of the hardware.
110% wattage for the PSU seems a little too low to me. Also if you're gonna use GPUs with blower style coolers anyways it doesn't really matter what CPU cooler you get since it will be loud AF regardless.
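To put rough numbers on PSU headroom — all wattages below are illustrative guesses, not measurements for any specific build:

```python
# Back-of-envelope PSU sizing: sum estimated component draw, then
# compare headroom factors. These wattages are hypothetical placeholders;
# check your actual parts' specs (GPU draw dominates in a DL rig).
draws_w = {"cpu": 150, "gpu_each": 300, "rest": 100}  # mobo/RAM/drives/fans
num_gpus = 2
system_w = draws_w["cpu"] + num_gpus * draws_w["gpu_each"] + draws_w["rest"]
for headroom in (1.1, 1.3, 1.5):
    print(f"{headroom:.0%} sizing -> {system_w * headroom:.0f} W PSU")
```

With these placeholder draws, 110% sizing lands at 935 W for an 850 W load, which is why many builders prefer the 130-150% range for sustained training workloads.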
I am new to the world of AI/ML, but I have planned to run personal study and research for the next year, so I have invested in my rig. Below is the component list:
AMD Ryzen 5900X processor,
MSI S360 AIO liquid cooler,
MSI X570S Ace Max motherboard,
XPG Spectrix D60G 3200 MHz 32 GB RAM (4x8 GB),
MSI RTX 3080 Suprim X 10gb Graphics card,
WD Black SN770 Gen4 1TB M.2 NVMe SSD,
WD Blue 1tb 7200rpm HDD,
NZXT C850 80+ Gold power supply.
Please let me know whether this is a good system to start with for my journey towards AI ML, deep learning, etc.
I'm a beginner too. I was researching a lot, and I found out that the more GB of VRAM you have, the better for ML. Maybe when I can afford a 3090 I'll go for it; it has 24 GB, and it's at a lower price in my country than the 4090.
So informative video, huge thanks!!!
You did a fine job.
Great video. Very helpful.
Great video, I already have a PC built but I actually have trouble running PyTorch on Anaconda, I run into all sorts of environment issues etc. Could you do a tutorial on how to actually run Deep learning code on Jupyter Notebook locally? That would be very appreciated and subbed!
Yeah for sure. What OS are you running? Also checkout docker if you're linux, or wsl2 with cuda if you're on windows.
@@theaihacker777 I'm using Windows 10. I have been using Google Colab since starting out and want a simple solution with a Jupyter Notebook style. I'm not familiar with WSL2, but it looks like I would need to know Linux, and it looks complicated? Not sure how WSL2 will help?
@@seanteng1234 I run fast.ai (based on top of Pytorch) on an AMD Ryzen system with a single RTX 2070. I tried running Windows on this box, but there's an annoying bug under Windows where it takes forever for your training epochs to start running. In the end, I had to install Linux on the box instead to get faster training cycles. I use a separate PC with Windows to just run Jupyter notebooks on the Linux server.
I've found that instructions on the Internet for setting up various libraries can be outdated very quickly, so you'll need a bit of patience and to do a bit of Googling around to fix library version incompatibilities.
What about operating system?
My build:
CPU: AMD Ryzen 7 3700x (best bang for buck for now)
Motherboard: Tuf gaming x570 (with wifi ha)
GPU: RTX 3090
RAM: G.SKILL Flare x D4 3200M (2x16G) C16
SSD: Samsung 980 NVMe M.2 SSD 1TB
Case: Corsair 4000D Airflow TG
CPU cooler: Corsair H100x
Fan: Corsair 3 pack fan
Power supply: ROG Strix 1000W PSU
I have room to add a couple more GPUs. I suspect I might have to upgrade my CPU first after adding 1 or more GPUs.
Which tower have you used? The tower he used is not available anymore.
I just want to ask if there is any existing study on how much storage capacity is required for deep learning?
I'm going for the R9 5950X, looking at G.Skill Trident 4x16 GB; will it be OK at 3000 MHz? GPU: RTX 3080.
Can you explain what you're using the machine for instead of how to build it?
Thanks for showing me how to be independent!
Thanks so much for this, and your other videos, they're extremely high quality! I have a few questions.
1. Does the manufacturer of the GPU matter much (besides potential noisiness) or will they be pretty much identical? I see a lot of different companies that produce the 2070/2080/2080ti.
2. PC Part Picker shows the Asus GeForce RTX 2080 Ti is out of stock. What would you replace it with?
3. Any thoughts on used GPUs? I assume they're a bad idea but am curious to hear what you think.
Thank you!
Hey thanks for watching!
1. Manufacturer doesn’t matter much, but I would do my due diligence on the brand before purchasing.
2. I would look for any 2080 Ti with a blower-style fan if you’re going with a multi-GPU setup. I wouldn’t care much about the manufacturer, but like I said above, I’d research to make sure the brand is not faulty.
3. I would consider buying used if it saves me a lot of coin! Just make sure they’re in top condition.
Just a hint, since your comment is pretty recent: the next-gen Nvidia GPUs (i.e. RTX 3080 Ti) are coming out soon, rumored around September, so if you can wait I would probably not buy a 2080 Ti now. You could probably get it cheaper soon, or get something even faster.
Plip agreed!
@@vibingcat1 Thank you! I was wondering when those are coming out. I can definitely wait until then.
@@chrish8941 Thank you, this is awesome and very helpful
GPUs and TPUs are free to use for 30 hours per week on Kaggle.
What operating system do you recommend?
Hello, if I buy an Intel Xeon v3 (for example a 2678 v3 or 2680 v3), would I have any compatibility problems?
I have a GTX 1660 Ti
Hm? Is it not a Corsair chassis?
When my dude said he was going to buy some more GPU when he gets ballin, he went from Chang- Chad
Haha I wish I saw this 6 months ago :) great video!!
I saw you haven't been active recently, do you have another channel?
Heyy. What OS did you use?
A fairer comparison with using cloud GPUs would take into account the price of the energy consumption of your local machine :) Any idea what that costs a month if used 24/7?
What about storage? They charge you for storage. They charge you for hosting databases too. The energy cost is somewhat balanced. You also need to send your data back and forth. That's energy required for internet.
You're right. I did the math, and running a GPU 24/7 for a month would cost about 20 euros/month (consumption : 0.28 kW, price: 0.1 euros/kWh --> 0.28*0.1*24*30 = 20 euros / month), which is much less than my gut-level expectations.
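The arithmetic in that comment can be sketched as a small helper, using the figures given above (0.28 kW draw, 0.10 euros/kWh) — real costs will vary with your local electricity price and the rig's actual load:

```python
# Rough electricity cost of running a GPU rig 24/7, using the figures
# from the comment above (0.28 kW draw, 0.10 euros/kWh).
def monthly_energy_cost(power_kw: float, price_per_kwh: float,
                        hours_per_day: float = 24, days: float = 30) -> float:
    """Energy cost over one month, in the same currency as price_per_kwh."""
    return power_kw * price_per_kwh * hours_per_day * days

cost = monthly_energy_cost(0.28, 0.10)
print(f"{cost:.2f} euros/month")  # -> 20.16 euros/month
```

Plugging in a higher tariff (many places pay 0.30 euros/kWh or more) triples that figure, which shifts the cloud-vs-local break-even point considerably.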
Beautiful machine that. Reminds me of the movie "War Games". You opted for the 2080 Ti there, but as far as I understand it does not have dual DMA engines. Apparently the Titan X (Pascal?) had them which is unique in the GeForce lineup. Which do you think is more important : Tensor Cores or something like dual DMA for asynchronous communication with the host system?
Hey Michael, do you think AMD CPUs are good for Deep Learning? Or would you rather choose Intel? I've heard that the Intel MKL is important.
Depending on how much vram you have on your GPU, you may need above 4G decoding on your motherboard. Older motherboards don't have this.
Would you get the new Nvidia GPUs for deep learning?
Is this deep learning computer like a neural net CPU??? (Skynet????)
Rocking that made KC hat. Gang Gang.
Gang Gang!
I’m building an AI model right now for machine learning / deep learning. I’ve built it with different novel approaches and techniques: instead of using a huge deep learning model, I made a design that’s as smart as a deep learning model with billions of nodes, but with just a few hundred million nodes, to match a multi-purpose model instead of one built for a single task. It’s still a work in progress.
I would like to build one but I'm way ignorant on coding and building.
Thank you!
What happens if I run a model on an RTX 3070 but the model needs more memory? Does the model stop or just run slower?
It stops. That's why some people recommend getting an RTX 3060 with 12 gigs of memory, over a 3060ti/3070/3070ti that only has 8 gigs of memory.
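A back-of-the-envelope VRAM estimate makes that recommendation concrete. This is a rough sketch, not a precise rule: it assumes fp32 training with Adam (4 bytes for weights + 4 for gradients + 8 for the two optimizer moments, about 16 bytes per parameter), and it ignores activations and framework overhead, which also depend on batch size:

```python
# Back-of-the-envelope VRAM estimate for training a model (a rough
# sketch; real usage also depends on batch size, activations, and
# framework overhead, so treat this as a lower bound).
def training_vram_gb(n_params: int, bytes_per_param: int = 16) -> float:
    """fp32 weights (4) + gradients (4) + Adam moments (8) = ~16 bytes/param."""
    return n_params * bytes_per_param / 1e9

# A 350M-parameter model needs roughly 5.6 GB just for weights, gradients,
# and optimizer state, before activations -- tight on an 8 GB 3070 once you
# add a reasonable batch size, but comfortable on a 12 GB 3060.
print(f"{training_vram_gb(350_000_000):.1f} GB")  # -> 5.6 GB
```

Mixed precision, gradient checkpointing, or a smaller batch can pull the real number down, but when even the lower bound exceeds your card's VRAM, the job will fail with an out-of-memory error rather than run slower.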
Thanks for sharing. Any recommendations for a motherboard that supports 4x 3090s, along with the 3990X?
Get an EPYC processor and an ASRock ROMED8-2T.
@@isbestlizard Not sure about the PCIe spacing on that motherboard; the RTX 3090s are 2.5-3 slot cards. The Gigabyte Designare motherboard has 4cm PCIe spacing (center to center), and the neighboring PCIe slot gets covered by the 3090 cards! That means out of 4 slots you could only use 2!! And the PCIe spacing on the Designare is considered wide compared to others.
@@zainfadhil2588 Use the Gigabyte 3090 Turbo; it's a 2-slot blower card, will be perfect!
@@isbestlizard We did consider that option. Blower types are loud (and down-clocked too), especially with four 350W 3090s. We might use PCIe 4.0 riser cables to mount all four. A proper thermal solution will be another issue!