Thanks! I've been under a lot of stress early in my data science career. This video has spurred a lot of great ideas. Thanks a lot!
damn bro u just left 50 buckaroos and no reply :(
YouTube seems to file away these super thanks in a discreet location lol. Hope the stress has lifted by now for you, and thank you!
Thanks! I bought the hardcover nnfs book. Didn't know about it before. A month ago I bought the Asus ROG Strix G17 2021 model for $2000 (MSRP was $1800 at launch, but availability was always limited). It has an NVIDIA RTX 3070 GPU with 8 GB video RAM, 16 GB regular RAM, a Ryzen 9 CPU, and a 1 TB SSD. It was the sweet spot among all the Nvidia GPU laptops I reviewed, and $2000 was my budget. The 3080 GPU starts having heat issues that reduce how much of a performance bump you get; this one does really well with its cooling fans. I used the free Google Colab before, and language model epoch speed is twice as fast on this laptop. I get frustrated using only Colab all the time: data has to be downloaded/uploaded to Colab servers each time I run a notebook, only one notebook can be connected at a time, so all code snippets need to fit in one notebook, and there are frequent outages. Mounting a large dataset from Google Drive really slows down processing, so that's not an option for deep learning. I am so happy I bought a GPU laptop. I will use it for tweaking and pilot testing, then run the really large models on paid cloud services. It's very convenient having the conda environment set up just the way I want, and Windows Subsystem for Linux 2 and Ubuntu set up the way I like too. Thank you for your years of teaching online. I am so glad there are teachers like you in this world, at this time.
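For anyone who hasn't hit the Drive-mounting slowdown mentioned above, this is the usual pattern in a Colab notebook (a minimal sketch; the dataset path is hypothetical). Every read under the mount goes over the network, which is why large datasets crawl:

# Typical Colab Drive mount (sketch; the dataset path below is hypothetical).
from google.colab import drive

drive.mount('/content/drive')  # prompts for Google authorization inside the notebook

# Reads under /content/drive go through the Drive backend over the network,
# so iterating a large dataset from here is much slower than Colab's local /content disk.
dataset_dir = '/content/drive/MyDrive/my_dataset'  # hypothetical path

A common workaround is to copy or unzip the data into Colab's local /content storage once per session before training, which is exactly the repeated download/upload step the comment is complaining about.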
Woohoo! The new studio and server in the background are looking awesome! I'm still planning mine, but working on it :D
Thanks for the invaluable explanation on how laptop GPUs are slightly underpowered versions of the desktop ones. It took me a long time and I learned it the hard way!
Mac trackpads usually do not physically click; they just have a pressure sensor and haptics to simulate the click. Hence they usually have the same click feel everywhere, whereas the Razer trackpad is hinged at the top, hence the harder press.
So you're saying there's a device that just simulates the tactile "click" feeling? That's very interesting!
Yeah, the tech is similar to the iPhone Force Touch sensors, I believe. Not sure exactly which MacBook generations have it, but some of the recent ones do (I checked the 9th-gen Intel and the M1 MacBook Pro).
@@smyaknti Every MacBook since the 2015 MacBook Pros has had this, I believe.
why the fuck would you subject yourself to a trackpad, use a mouse my g
Thanks for the review! I was actually looking for a review of this new version of the Tensorbook, but none of the ones I found was as instructive as this one. Good job!!!
Your videos' production quality has skyrocketed; really nice to watch.
Please, we bought 5 Tensorbooks for our research team, and we have never regretted anything more!
The computers overheat very quickly when training networks on them, and just opening PyCharm makes the fans go crazy.
We have sent them back as "defective", but every new Tensorbook we get has the same problem.
If you intend to train networks for longer periods of time, it will heat up and shorten the lifetime of the components, as it is a compact device with minimal airflow.
I'm not here to argue, and some might have good experiences with this, but I do want to share ours, as we have fully regretted our purchase.
I would never train networks on something like this anyway, but it would be nice to be able to do small local tests with a GPU, not to mention fast local inference.
Thanks for the review. I remember your breakdown of using the cloud vs. local, and for me, since I'm not doing data science all the time, the cloud is the cost-effective way to go.
I have done an embarrassing amount of machine learning using a mere laptop... I need a DL laptop.
For those times when you need to do deep learning on the move.
Also, I needed to upgrade my laptop anyway, which was 4.5 years old and cost me $700. A $2000 laptop is something I could have bought anyway as my main, recent-model laptop. And now I can train deep learning models on it using the PyTorch and TensorFlow GPU libraries. The 2x speed increase over the free Google Colab GPU was unexpected icing on top. The Asus ROG Strix G17 is a very nice laptop for general use. (I don't play games, but have watched movies on it.) Jennifer :-D
The Asus ROG Strix G17 looks nice for TensorFlow, and I wish there had been more info on laptops in this category before I bought a new one. Thanks.
Thank you for the review, and all the work you put into the channel and the book.
On a MacBook you do not have real buttons under the trackpad but a Taptic Engine, which is why you get the same click feedback across the whole trackpad.
When you were testing this laptop against the 1080, did you run any TensorFlow tests? The reason I ask is that the 1080 and the M1 do not have any Tensor Cores but the 3080 does, so I would expect the 3080 to really shine in that category. Also, can we have access to the test suite you used? I'm curious to compare what I'll see here versus the results you got.
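The test suite from the video isn't shared here, so as a rough stand-in, a minimal sketch of the kind of run that exercises Tensor Cores is a mixed-precision Keras fit on synthetic data (the layer sizes, batch size, and epoch count below are made up for illustration):

import time
import tensorflow as tf

# Mixed precision routes matmuls through FP16 Tensor Cores on 20/30-series GPUs;
# on a 1080 (no Tensor Cores) the same script falls back to ordinary math.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Synthetic data, sized only to keep the GPU busy.
x = tf.random.normal((8192, 1024))
y = tf.random.uniform((8192,), maxval=10, dtype=tf.int32)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1024,)),
    tf.keras.layers.Dense(4096, activation="relu"),
    tf.keras.layers.Dense(4096, activation="relu"),
    tf.keras.layers.Dense(10, dtype="float32"),  # keep the final layer in FP32
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))

start = time.time()
model.fit(x, y, batch_size=256, epochs=3, verbose=0)
print(f"3 epochs took {time.time() - start:.1f}s")

Comparing the wall-clock time of something like this on the 3080 against a 1080 or Colab's assigned GPU would show roughly how much the Tensor Cores help, though it's only a toy proxy for the benchmarks in the video.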
I have the same laptop, but it seems the MSI WE76 with an A5000 and an i9 on board is a better solution for DL tasks.
M1 Macs have a lot of tensor compute. The issue is that most deep learning libraries don't have very good support for them. So unless you want to use Apple's libraries, it might not make sense.
If so, why the heck isn't Apple just making the intermediate software?
yay new sentdex video, time to watch it :D
Interesting review. Are there any worthy 17" alternatives?
Do you run things overnight on this, and for long stretches of time? Wondering how it handles the heat/cooling.
Nice review; however, I was expecting more testing with different models for deep learning training, or even running very heavy models like the ones from KoboldAI.
Why does no one show live usage of this thing??
"sounds like an oxymoron to me" damn my first thought too
Is your book available on Amazon's Kindle store?
Wow, great info. So which laptops are good enough for studying deep learning while keeping the price under $2,000? 🤔 Thanks a lot.
I also recently bought a Tensorbook. Everything is perfect except I am getting weird text that appears during boot-up and shutdown. Is this also happening to you?
Good review.
Just wanted to add that the mobile workstation GPUs are just rebranded mobile gaming GPUs. Chip companies can just charge a lot more money for them because they're "workstation" class.
It's no longer just the confusing laptop SKUs for Nvidia Ampere; they've got like 8 desktop versions of the 3060 now.
My ThinkPad has a tiny GPU, so I can test whether my code works on CUDA before running it in Colab, or even try running it at home. I love the keyboard; the screen and battery life aren't great. The touchpad is alright, with a few multi-touch gestures. Still trying to get more into using the nipple.
I'm not an ML/AI engineer, so please forgive me if this is a bad question.
There's lots of press around all of the amazing ML ASICs. For the 80W allocated to the GPU, using an ML ASIC would make more sense; obviously that would probably be much harder in something like a Razer laptop, which is designed around a GPU.
On the other hand, I imagine with a different chassis it would be possible to implement. Software/firmware/hardware problems would need to be worked out, but after a few generations it could be magical, no?
Hi, how about the CPU performance? I see there is an Intel i7-11800H CPU, so how about the frequency: can it reach 4.2 GHz, or does it just run at the base frequency?
On the website, they say the laptop has a 3080 Ti. So is it Max-Q or Ti?
3080 Max-Q is not as good as a normal 3080 laptop GPU.
If only. If only the Razer logo on the outside wasn't painted green and was instead dark black like the outside color; it would be one of the most beautiful things computers have made.
As for Ethernet, why don't they make a smaller version of the port? I'm not a tech scientist or engineer, but it's necessary, and there are many (including me) who are bothered by the limits of Wi-Fi.
Doing so via the web gets expensive though, unless you find a way to use your own computer and save some fees.
But so yes, you do need to have the actual hardware with you, especially because of the peripherals you can only access if you have the computer in your own hands. Limitations arise when you plan to use the web: no access to micro SD cards (how will you transfer data?), and no access to any of the USB ports (how will you use other peripherals or microcontrollers?). I think, then, that if you are strictly some sort of software engineer you could do fine, but if you are an engineer of some sort dealing with electronics and hardware-related stuff, you'd want to have the laptop on your lap.
What laptop is that A6000 you talked about in the video?
It's a desktop A6000 by Lambda as well.
Doesn't really make a lot of sense from a business point of view, as it seems like a product aimed at a niche group; I mean, if its main audience is people into deep learning, why not just make a desktop card, like a DPU? It would probably be cheaper and easier to scale with desktop power supplies.
This is what Jesse from Breaking Bad would have become if he had studied instead of inventing chili powder goodies.
Great video
What I do to keep temperatures in check on my laptop under heavy load is use some customized ThrottleStop profiles. Amazing software; it barely affects performance when set up right.
I do have to add that on newer processors undervolting is locked down because of the Plundervolt exploits.
I bought the laptop for ML courses. It's awesome, and I've been using Ubuntu for the past 15 years.
How is it holding up?
@@tacorevenge87 So far so good, I use it to crunch ML data every day.
@@tuapuikia awesome. Thank you
Thanks.
Please, can you answer this question?
Can a language model learn animal language and translate it to English?
For now, don't worry about datasets.
No. There is no way to know what animals say. The closest humans have come is talking VERY basic sign language with monkeys.
I'm not a fan of this type of stuff, but if you are, here is the answer: it depends on the dataset. If you have a really good amount of quality data, then you can train a model as usual. But you will spend 97% of your time building a good dataset and trying to understand whether there is some natural relationship between an animal's behavior and our language. Maybe house pets are a good start, like dogs that have spent generations with us. Genetics is an excellent indicator.
From a biology standpoint: most animal language is physical behavior, and it changes from species to species. So you would first have to create a dataset of visual/kinetic behaviors for your software to analyze.
@@BenjaSerra but the thing is how will I create that dataset?
If I were a freshman student, I would ask my parents for this, even though in the end I would play games all day.
I just want a laptop with the 12GB 3060 instead of the 6GB one.
Very neat laptop
chair?!
Even if you have a thick laptop with great cooling, doing deep learning on a laptop will run hotter than running the same workload on a PC.
It might be good for entry-level tasks or for learning, but for people who actually do intensive tasks, like your GTA series, I don't think it would be possible on a laptop.
For laptops, I'm personally done with Linux/Windows and don't think I'm going back. The first thing is the crazy heat and fan noise even with just VS Code open and some browser tabs. The other thing is that if I were asked to work on some crazy DL project, I would be provided with adequate compute, or I'd rather rent cloud compute if it's freelance work. Macs are the way to go for an everyday workflow that I can take anywhere and use comfortably, like on an airplane.
My advice from my personal experience with this laptop - Please don't buy it. It is not worth the money.
The laptop is a barebones unit sold by Sager. It has overheating problems with prolonged use.
The battery lifetime is horrible. Once the battery wears out, it swells, almost ripping out the touchpad.
It is very difficult to get a replacement battery. I contacted Lambda Labs, and their support was horrible once the unit was sold.
Their advice to me: get a new laptop.
For machine learning, it is better to build a custom desktop, with the many cooling accessories available, than to waste money buying a Tensorbook laptop.
Starlink is coming!
Why buy from Lambda over buying directly from Razer? I’d assume Razer is cheaper. And what’s the deal with Lambda asking $500 extra to include Windows 10 Pro (dual boot) over just Ubuntu?
It's not cheaper from Razer, last I checked. I'd only guess Lambda gets a bulk discount.
As for the cost of Windows, not sure, but you can also just put it on there yourself if you like.
Anyone have the link where I can rent the A100 in the cloud? Cheers!!
The one mentioned in the video is: lambdalabs.com/service/gpu-cloud
What is the use case for this? Why not just remote in to a desktop from any plain laptop? I think this is overkill; most programmers use multiple monitors + an external keyboard anyway, and if traveling, they can just remote in to the desktop from any laptop to make minor changes.
I'm not trying to knock the product, more that I don't understand why a laptop like this is necessary over a remote connection.
Plane travel, car travel, boat travel to start. Anywhere when wifi isn't great/available, to start. From there, it's really an argument of why you'd ever have any local compute. Here, it's a question of how many hours/days of training you do. That said, this all was a foreseen question, and is covered in the video :P
@@sentdex Thanks for the reply. I didn’t have the time to watch the video yet. Just wanted to hear an example of what the use case would be before watching, since I couldn’t think of a solid case from my end
In my case I started by accessing our powerful desktop remotely, but after some months I ended up buying a gaming laptop. Running code remotely is not a problem, but the lag we have (at least in Argentina) while writing and debugging code is painful!! Now I write the code on my laptop and once it runs fine I go back to the desktop. I think Starlink would fix this issue, but it's not arriving here anytime soon.
"most programmers use multiple monitor + external keyboard anyways" seems like you haven't met most programmers then.
@@xynyde0 In a workplace setting, your programmers use one monitor? I’m not too sure about at home, since I don’t see what people use there.
Aren't these giga expensive?
They really are.
I feel like you should center your camera a bit better (lower, so that you’re more in the center) and then zoom in a bit more. Almost looks like you’re in the background now
Is there really any reason to need to run a GPU locally? For training jobs, you can submit those over SSH to your desktop, even from an Android mobile device 😀
This is addressed in the video. There are lots of reasons why, and many of those reasons overlap with why you'd want it to be a laptop. With enough training time, cloud also stops making sense.
We want GTA V part 2!
Same here! We're working on it. Currently back-documenting the project up to this point.
Hello Sir 👋
More like Deep Marketing! ;-)
Base MacBook Air + cloud for me.
Wow
Now it's time to check FPS benchmarks :)
PyTorch has added an MPS backend for the M1, so a MacBook Pro is a really good alternative. It is a proven laptop and it has a powerful GPU, so I think it makes more sense to go with a MacBook. Also, for more powerful things, I think you should just go with the cloud or a dedicated workstation.
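For anyone curious what that looks like in practice, here is a minimal sketch of picking the MPS backend with a CUDA/CPU fallback (the tensor sizes are arbitrary and just for illustration):

import torch

# Prefer Apple's Metal (MPS) backend on M1/M2 Macs; fall back to CUDA or CPU elsewhere.
if torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

# An arbitrary matmul just to confirm work actually runs on the chosen device.
a = torch.randn(2048, 2048, device=device)
b = torch.randn(2048, 2048, device=device)
c = a @ b
print(f"Ran a {c.shape[0]}x{c.shape[1]} matmul on {device}")

Worth noting that MPS support landed in PyTorch 1.12 and some ops still fall back to the CPU, so performance varies by model.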
😆
Just buy the latest MacBook
Nope, not if you want to do deep learning on it.