Does a Deep Learning Laptop Exist? - Tensorbook Review

  • Published: 18 Jul 2022
  • Reviewing Lambda and Razer's Tensorbook, a laptop aimed at deep learning, with 16GB of VRAM (GPU memory), 64GB of RAM, 2TB of NVMe storage and an 8-core Intel i7-11800H CPU.
    lambdalabs.com/deep-learning/...
    Neural Networks from Scratch book: nnfs.io
    Channel membership: / @sentdex
    Discord: / discord
    Reddit: / sentdex
    Support the content: pythonprogramming.net/support...
    Twitter: / sentdex
    Instagram: / sentdex
    Facebook: / pythonprogramming.net
    Twitch: / sentdex

Comments • 96

  • @zskater1234
    @zskater1234 1 year ago +2

    Thanks for the review! I was actually looking for a review of this new version of the Tensorbook, but none of the ones I found was as instructive as this one. Good job!!!

  • @ChaiTimeDataScience
    @ChaiTimeDataScience 1 year ago +22

    Wohoooo, the new studio and server in the background are looking awesome! I'm still planning mine, but working on it :D
    Thanks for the invaluable explanation of how laptop GPUs are slightly underpowered versions of the desktop ones. It took me a long time and I learned it the hard way!

  • @datasciyinfo5133
    @datasciyinfo5133 1 year ago +7

    Thanks! I bought the hardcover book nnfs. Didn't know about it before. A month ago I bought an Asus ROG Strix G17 2021 model for $2000 (MSRP was $1800 at launch, but availability was always limited). It has an NVIDIA RTX 3070 GPU with 8 GB of video RAM, 16GB of regular RAM, a Ryzen 9 CPU, and a 1 TB SSD. It was the sweet spot among all the Nvidia GPU laptops I reviewed; $2000 was my budget. The 3080 GPU starts having heat issues that reduce how much of a performance bump you get, and this one does really well with its cooling fans. I used free Google Colab before, and language model epoch speed is twice as fast on this laptop. I get frustrated using only Colab all the time: data has to be downloaded/uploaded to Colab servers each time I run a notebook, only one notebook can be connected at a time (so all code snippets need to fit in one notebook), and there are frequent outages. Mounting a large dataset from Google Drive really slows down processing, so it is not an option for deep learning. I am so happy I bought a GPU laptop. I will use it for tweaking and pilot testing, then run the really large models on paid cloud services. It's very convenient having the conda environment set up just the way I want, and Windows Subsystem for Linux 2 and Ubuntu set up the way I like too. Thank you for your years of teaching online. I am so glad there are teachers like you in this world, at this time.
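
    The Drive-mount slowdown described above is usually avoided by copying the dataset onto the Colab VM's local disk once per session and reading it from there; a minimal sketch, with hypothetical paths:

      from google.colab import drive   # Colab-only helper
      import os, shutil

      drive.mount('/content/drive')                      # Drive is a slow, network-backed filesystem
      src = '/content/drive/MyDrive/my_dataset'          # hypothetical dataset location on Drive
      dst = '/content/my_dataset'                        # fast local disk of the Colab VM

      if not os.path.exists(dst):
          shutil.copytree(src, dst)                      # pay the copy cost once per session

      DATA_DIR = dst                                     # point the training code at the local copy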

  • @Kevin_KC0SHO
    @Kevin_KC0SHO 1 year ago +2

    Thank you for the review, and all the work you put into the channel and the book.

  • @sanjaymarison5629
    @sanjaymarison5629 1 year ago +1

    Your videos' production quality has skyrocketed, really nice to watch.

  • @shaunbowman8036
    @shaunbowman8036 1 year ago +4

    Thanks! I've been under a lot of stress early in my data science career. This video has spurred a lot of great ideas. Thanks a lot!

    • @Nabikko
      @Nabikko 1 year ago

      damn bro u just left 50 buckaroos and no reply :(

    • @sentdex
      @sentdex  1 year ago +2

      YouTube seems to file away these super thanks in a discreet location lol. Hope the stress has lifted by now for you, and thank you!

  • @kenchang3456
    @kenchang3456 1 year ago +19

    Thanks for the review. I remember your breakdown of using the cloud vs local, and for me, not doing data science all the time, the cloud is the cost-effective way to go.

  • @lashlarue59
    @lashlarue59 1 year ago +2

    When you were testing between this laptop and the 1080, did you run any TensorFlow tests? The reason I ask is that the 1080 and M1 do not have any Tensor Cores but the 3080 does, so I would expect the 3080 to really shine in that category. Also, can we have access to the test suite you used? I'm curious what I'll see here versus the results you got.

  • @Zartris
    @Zartris 1 year ago +13

    Please, we bought 5 Tensorbooks for our research team, and we have never regretted anything more!
    The computer overheats very quickly when training networks on them, and just opening PyCharm makes the fans go crazy.
    We have sent them back as "defective", but every new Tensorbook we get has the same problem.
    If you intend to train networks for longer periods of time, it will heat up, lowering the lifetime of the components, as it is a compact device with minimal airflow.
    I'm not here to argue with people, and some might have good experiences with this, but I do want to share our experience, as we have fully regretted our purchase.

    • @sorvex9
      @sorvex9 1 year ago

      I would never train networks on something like this anyway, but it would be nice to be able to do small local tests with a gpu, not to mention inference locally with speed.

  • @smyaknti
    @smyaknti 1 year ago +11

    Mac trackpads usually do not click; they just have a pressure sensor and haptics to simulate the click. Hence they usually have the same click feel everywhere, whereas the Razer trackpad is hinged at the top, hence the harder press.

    • @sentdex
      @sentdex  1 year ago +2

      So you're saying there's a device that just simulates the tactile "click" feeling? That's very interesting!

    • @smyaknti
      @smyaknti 1 year ago +4

      Yeah, the tech is similar to the iPhone Force Touch sensors, I believe. Not sure which MacBook generations have it, but some of the recent ones do (I checked the 9th gen Intel and the M1 MacBook Pro).

    • @bharat1031
      @bharat1031 1 year ago +1

      @@smyaknti every MacBook since the 2015 MacBook Pros has this, I believe

    • @egeerdem8272
      @egeerdem8272 1 year ago

      why the fuck would you subject yourself to a trackpad, use mouse my g

  • @datasciyinfo5133
    @datasciyinfo5133 1 year ago +4

    Also, I needed to upgrade my laptop anyway; it was 4.5 years old and I had paid $700 for it. A $2000 laptop is something I could have bought anyway as my main, recent-model laptop, and now I can train deep learning models on it using the PyTorch and TensorFlow GPU libraries. The 2x speed increase over the free Google Colab GPU was unexpected icing on top. The Asus ROG Strix G17 is a very nice laptop for general use. (I don't play games, but have watched movies on it.) Jennifer :-D

    • @qzorn4440
      @qzorn4440 1 year ago

      The Asus ROG Strix G17 looks nice for TensorFlow; I wish there had been more info on laptops in this category before buying a new one. Thanks.

  • @Mutual_Information
    @Mutual_Information 1 year ago +7

    I have done an embarrassing amount of machine learning using a mere laptop.. I need a DL laptop.

  • @SumGuyLovesVideos
    @SumGuyLovesVideos 1 year ago

    yay new sentdex video, time to watch it :D

  • @panconqueso9195
    @panconqueso9195 1 year ago

    Nice review; however, I was expecting more testing with different models for deep learning training, or even running very heavy models like the ones from KoboldAI

  • @induction7895
    @induction7895 1 year ago +8

    For those times when you need to do deep learning on the move.

  • @jxsl13
    @jxsl13 1 year ago +1

    On a MacBook you do not have real buttons under the trackpad but a Taptic Engine, which is why you get the same click feedback across the whole trackpad.

  • @EvilAbs
    @EvilAbs 1 year ago

    Interesting review. Are there any worthy 17" alternatives?

  • @wadwad6571
    @wadwad6571 1 year ago +2

    I have the same laptop, but it seems the MSI WE76 with an A5000 and an i9 on board is a better solution for DL tasks.

  • @ToTheGAMES
    @ToTheGAMES 1 year ago

    What I do to keep temperatures in check on my laptop under heavy load is use some customized ThrottleStop profiles. Amazing software; it barely influences performance when set up right.

    • @ToTheGAMES
      @ToTheGAMES 1 year ago

      I do have to add that on newer processors undervolting is locked down because of the Plundervolt exploits.

  • @qzorn4440
    @qzorn4440 1 year ago

    Wow, great info. So which laptops are good enough to study deep learning and keep the price around or under $2,000? 🤔 Thanks a lot.

  • @KieranHolroyd
    @KieranHolroyd 1 year ago

    I'm not an ML/AI engineer, so please forgive me if this is a bad question.
    There's lots of press around all of the amazing ML ASICs. For the 80W allocated to the GPU, using an ML ASIC would make more sense; obviously that would probably be much harder in something like a Razer laptop, designed around GPU space.
    On the other hand, I imagine with a different chassis it would be possible to implement. Software/firmware/hardware problems would need to be worked out, but after a few generations it could be magical, no?

  • @Veptis
    @Veptis 1 year ago

    No longer just confusing laptop SKUs for Nvidia Ampere; they've got like 8 desktop versions of the 3060 now.
    My ThinkPad has a tiny GPU, so I can test whether my code works on CUDA before running it in Colab, or even try to run it at home. I love the keyboard; the screen and battery life aren't great. The touchpad is alright, with a few multi-touch gestures. Still trying to get more into using the TrackPoint nub.
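
    That smoke-test workflow is mostly a matter of writing device-agnostic code; a minimal PyTorch sketch, with a placeholder model and batch:

      import torch

      # use whatever accelerator is present, otherwise fall back to CPU
      device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

      model = torch.nn.Linear(32, 2).to(device)   # placeholder model
      x = torch.randn(8, 32, device=device)       # placeholder batch

      out = model(x)
      print(out.shape, out.device)                # the same script runs on a laptop GPU or in Colab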

  • @keco185
    @keco185 1 year ago +3

    M1 Macs have a lot of tensor compute. The issue is that most deep learning libraries don't have very good support for them, so unless you want to use Apple's libraries, it might not make sense.

    • @NisseOhlsen
      @NisseOhlsen 1 year ago +2

      If so, why the heck isn't Apple just making the intermediate software?

  • @jasonpham1426
    @jasonpham1426 1 year ago

    I also recently bought a tensorbook. Everything is perfect except I am getting weird text that appears during boot up and shutdown. Is this also happening to you?

  • @mohamed-bana
    @mohamed-bana 1 year ago

    Is your book available on Amazon's Kindle store?

  • @mingyangyu998
    @mingyangyu998 1 year ago

    Hi, how about the CPU performance? I see there is an Intel i7-11800H CPU, so how about the frequency: can it reach 4.2GHz, or does it just run at the base frequency?

  • @ares31337
    @ares31337 3 months ago

    Do you run things overnight on this, and for long stretches of time? Wondering how it handles it + heat/cooling.

  • @jjpp1993
    @jjpp1993 1 year ago +1

    Why does no one show live usage of this thing??

  • @saturdaysequalsyouth
    @saturdaysequalsyouth 1 year ago

    Good review.
    Just wanted to add that the mobile workstation GPUs are just rebranded mobile gaming GPUs. Chip companies can just charge a lot more money for them because they're "workstation" class.

  • @user-xn1um6kf8p
    @user-xn1um6kf8p 1 year ago

    On the website, they say the laptop has a 3080 Ti. So is it the Max-Q or the Ti?

  • @tacorevenge87
    @tacorevenge87 9 months ago

    Great video

  • @vuongbinhan
    @vuongbinhan 1 year ago +2

    "sounds like an oxymoron to me" damn my first thought too

  • @hernanliseras4395
    @hernanliseras4395 1 year ago

    What laptop is that A6000 that you talked about in the video?

    • @tacorevenge87
      @tacorevenge87 9 months ago

      It's a desktop A6000, by Lambda as well

  • @ita6072
    @ita6072 1 year ago

    If only. If only the Razer logo on the outside wasn't painted green, and was instead dark black like the outside color, it would be one of the most beautiful things that computers have made.
    As for Ethernet, why don't they make a smaller version of the port? I'm not a tech scientist or engineer, but it's necessary, and there are many (including me) who are bothered by the limits of Wi-Fi.

  • @festro1000
    @festro1000 1 year ago

    It doesn't really make a lot of sense from a business point of view, as it seems like a product aimed at a niche group; I mean, if its main audience is those into deep learning, why not just make a desktop card like a DPU? It would probably be cheaper and easier to scale with desktop power supplies.

  • @jordiarellano1996
    @jordiarellano1996 1 year ago

    Anyone have the link where I can rent the A100 in the cloud? Cheers!!

    • @sentdex
      @sentdex  1 year ago +1

      The one mentioned in the video is: lambdalabs.com/service/gpu-cloud

  • @camdenparsons5114
    @camdenparsons5114 1 year ago

    Is there really any reason to need to run a GPU locally? For training jobs, you can submit those over SSH to your desktop, even from an Android mobile device 😀

    • @sentdex
      @sentdex  1 year ago +4

      This is addressed in the video. There are lots of reasons why, and many of those reasons overlap with why you'd want it to be a laptop. With enough training time, cloud also stops making sense.
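
      As a rough illustration of where that crossover can sit, a back-of-the-envelope break-even check (both prices below are assumptions, not figures from the video):

        # break-even: GPU-hours of training at which owning hardware beats renting
        hardware_cost = 3500.0    # assumed laptop/workstation purchase price, USD
        cloud_rate = 1.10         # assumed on-demand GPU price, USD per hour

        break_even_hours = hardware_cost / cloud_rate
        print(f"Renting stays cheaper up to ~{break_even_hours:.0f} GPU-hours of training")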

  • @serta5727
    @serta5727 1 year ago

    Very neat laptop

  • @adempc
    @adempc 1 year ago

    Thanks.

  • @ashu-
    @ashu- 1 year ago +3

    Even if you have a thick laptop with great cooling, doing deep learning on a laptop will run hotter than running the same workload on a PC.
    It might be good for entry-level tasks or for learning, but for people who actually do intensive tasks like your GTA series, I don't think it would be possible on a laptop.

  • @pradiptahafid
    @pradiptahafid 1 year ago

    If I were a freshman student, I would ask my parents for this, even though in the end I would play games all day.

  • @ashreid20
    @ashreid20 1 year ago

    chair?!

  • @nsxkkxlnmiyo8722
    @nsxkkxlnmiyo8722 6 months ago

    But yes, you do need to have the actual hardware with you, especially because of the peripherals you will have access to if and only if you have the computer in your own hands. Limitations arise when you plan to use the web: no access to microSD cards (how will you transfer data?), no access to any of the USB ports (how will you use other peripherals or microcontrollers?). I think if you are strictly some sort of software engineer, then you could do fine, but if you are an engineer of some sort doing electronics or hardware-related stuff, you'd want to have the laptop on your lap.

  • @nsxkkxlnmiyo8722
    @nsxkkxlnmiyo8722 6 months ago

    Doing so via the web gets expensive though, unless you find a way to use your own computer and save some fees.

  • @shreeshaaithal-
    @shreeshaaithal- 1 year ago

    Please can you answer this question:
    Can a language model learn animals' language and translate it to English?
    Now don't think about datasets.

    • @ToTheGAMES
      @ToTheGAMES 1 year ago +4

      No. There is no way to know what animals say. The closest humans have come is talking VERY basic sign language with monkeys.

    • @BenjaSerra
      @BenjaSerra 1 year ago +1

      I'm not a fan of this type of stuff, but if you are, here is the answer: it depends on the dataset. If you have a really good amount of quality data, then you can train a model as usual. But you will spend 97% of your time building a good dataset and trying to understand whether there is some natural relationship between an animal's behavior and our language. Maybe house pets are a good start, like dogs that have spent generations with us. Genetics is an excellent indicator.

    • @DoctorNemmo
      @DoctorNemmo 1 year ago +1

      From a biology standpoint: Most of animal language is physical behavior, and it changes from species to species. So you would have to first create a dataset of visual/kinetic behaviors for your software to analyze.

    • @shreeshaaithal-
      @shreeshaaithal- 1 year ago

      @@BenjaSerra But the thing is, how will I create that dataset?

  • @Totial
    @Totial 1 year ago

    Starlink is coming!

  • @hamzak.741
    @hamzak.741 1 year ago

    This is what Jesse from Breaking Bad would have become if he had studied instead of inventing chili powder goodies.

  • @amafuji
    @amafuji 1 year ago

    I just want a laptop with the 12GB 3060 instead of the 6GB one.

  • @damaomiX
    @damaomiX 1 year ago +2

    3080 Max-Q is not as good as a normal 3080 laptop GPU.

  • @SpaceTimeBeing_
    @SpaceTimeBeing_ 1 year ago +1

    Damn, why does it run Ubuntu instead of Pop!_OS? Ubuntu has become crap thanks to snap packages.

  • @cedricdb
    @cedricdb 1 year ago

    Why buy from Lambda over buying directly from Razer? I’d assume Razer is cheaper. And what’s the deal with Lambda asking $500 extra to include Windows 10 Pro (dual boot) over just Ubuntu?

    • @sentdex
      @sentdex  1 year ago +4

      It's not cheaper from Razer, last I checked. I'd only guess Lambda gets a bulk discount.
      As for the cost of Windows, not sure, but you can also just put it on there yourself if you like.

  • @jawwadsabir4620
    @jawwadsabir4620 1 year ago

    For laptops, I'm personally done with Linux/Windows and don't think I'm going back. The first thing is the crazy heat and fan noise, even with just VS Code open and some browser tabs. The other thing is, if I am asked to work on some crazy DL project, I would be provided with equivalent compute power, or I'd rather rent cloud compute if it's freelance work. Macs are the way to go for an everyday workflow that I can take anywhere and use comfortably, like on an airplane.

  • @TheOriginalJohnDoe
    @TheOriginalJohnDoe 1 year ago

    I feel like you should center your camera a bit better (lower, so that you’re more in the center) and then zoom in a bit more. Almost looks like you’re in the background now

  • @MrMusicaify
    @MrMusicaify 1 year ago +1

    What is the use case for this? Why not just remote in to the desktop via any plain laptop? I think this is overkill; most programmers use multiple monitors + an external keyboard anyway, and if traveling you can just remote in to the desktop from any laptop to make minor changes.
    I'm not trying to knock the product; it's more that I don't understand why a laptop like this is necessary over a remote connection.

    • @sentdex
      @sentdex  1 year ago +5

      Plane travel, car travel, boat travel to start. Anywhere when wifi isn't great/available, to start. From there, it's really an argument of why you'd ever have any local compute. Here, it's a question of how many hours/days of training you do. That said, this all was a foreseen question, and is covered in the video :P

    • @MrMusicaify
      @MrMusicaify 1 year ago

      @@sentdex Thanks for the reply. I didn’t have the time to watch the video yet. Just wanted to hear an example of what the use case would be before watching, since I couldn’t think of a solid case from my end

    • @Totial
      @Totial 1 year ago +1

      In my case I started out accessing our powerful desktop, but after some months I ended up buying a gaming laptop. Running code remotely is not a problem, but the lag we have (at least in Argentina) when writing and debugging code is painful!! Now I write the code on my laptop and once it runs fine I go back to the desktop. I think Starlink would fix this issue, but it's not arriving here soon.

    • @xynyde0
      @xynyde0 1 year ago

      "most programmers use multiple monitor + external keyboard anyways" seems like you haven't met most programmers then.

    • @MrMusicaify
      @MrMusicaify 1 year ago

      @@xynyde0 In a workplace setting, your programmers use one monitor? I’m not too sure about at home, since I don’t see what people use there.

  • @anupakulathunga345
    @anupakulathunga345 1 year ago

    Wow

  • @excelfan85
    @excelfan85 1 year ago

    Now it's time to check FPS benchmarks :)

  • @jakobpcoder
    @jakobpcoder 1 year ago +1

    We want GTA V part 2

    • @sentdex
      @sentdex  1 year ago +3

      Same here! We're working on it. Currently back-documenting the project up to this point.

  • @MadlipzMarathi
    @MadlipzMarathi 1 year ago

    PyTorch has added an MPS backend for M1, so the MacBook Pro is a really good alternative. It is a proven laptop and it has a powerful GPU, so I think it makes more sense to go with a MacBook. Also, for anything more powerful, I think you should just go with the cloud or a dedicated workstation.
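
    For reference, switching PyTorch onto the Apple GPU is just a device-selection change, assuming a PyTorch build recent enough to ship the MPS backend:

      import torch

      # use Apple's Metal (MPS) backend when available, otherwise fall back to CPU
      device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

      x = torch.randn(4, 3, device=device)   # placeholder tensor
      print(x.device)                        # mps:0 on a supported Mac, cpu otherwise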

  • @Kage1128
    @Kage1128 1 year ago +1

    Aren't these giga expensive?

  • @Stinosko
    @Stinosko 1 year ago

    Hello Sir 👋

  • @SeaJay_Oceans
    @SeaJay_Oceans 1 year ago

    More like Deep Marketing! ;-)

  • @jjpp1993
    @jjpp1993 1 year ago

    Base MacBook Air + cloud for me

  • @waverider1674
    @waverider1674 1 year ago +1

    My advice from my personal experience with this laptop - please don't buy it. It is not worth the money.
    The laptop is a barebones unit sold by Sager. It has overheating problems with prolonged use.
    The battery lifetime is horrible. Once the battery wears out, it swells, almost ripping out the touchpad.
    It is very difficult to get a replacement battery. I contacted Lambda Labs; their support was horrible once the unit was sold.
    Their advice to me - get a new laptop.
    For machine learning it is better to build a custom desktop, with the many cooling accessories available, than to waste money buying a Tensorbook laptop.

  • @TheLeoPoint
    @TheLeoPoint 1 year ago

    😆

  • @TheOriginalJohnDoe
    @TheOriginalJohnDoe 1 year ago

    Just buy the latest MacBook

    • @sentdex
      @sentdex  1 year ago +6

      Nope, not if you want to do deep learning on it.

  • @tuapuikia
    @tuapuikia 1 year ago +1

    I bought the laptop for ML courses. It's awesome, and I have been using Ubuntu for the past 15 years.

    • @tacorevenge87
      @tacorevenge87 9 months ago

      How is it holding up?

    • @tuapuikia
      @tuapuikia 9 months ago +1

      @@tacorevenge87 So far so good, I use it to crunch ML data every day.

    • @tacorevenge87
      @tacorevenge87 9 months ago

      @@tuapuikia awesome. Thank you