CUDA Explained - Why Deep Learning uses GPUs

  • Published: 29 Dec 2024

Comments • 136

  • @deeplizard
    @deeplizard  6 years ago +17

    Check out the corresponding blog and other resources for this video at:
    deeplizard.com/learn/video/6stDhEA0wFQ

    • @kudoamv
      @kudoamv 6 years ago

      So I can't use computer vision programs that require a GPU because I'm an AMD card user?

    • @mae1000
      @mae1000 4 years ago

      @@kudoamv Actually, you can, I think. AMD has invested in that field too; google "gpuopen".

  • @debajyotisg
    @debajyotisg 5 years ago +85

    Great job speeding Jensen Huang up. xD

  • @larryteslaspacexboringlawr739
    @larryteslaspacexboringlawr739 5 years ago +7

    5:13 Ballmer ambush, panic-clicking to skip (thank you for the awesome video)

  • @majeedhussain3276
    @majeedhussain3276 6 years ago +9

    This channel seriously deserves a million subs. I've been watching many series from this channel. Great work!!!!! Keep going; I'm sure this channel is going to be flooded with subscribers someday.

    • @deeplizard
      @deeplizard  6 years ago +1

      Thank you, Majeed! We're glad to hear you've been enjoying multiple series here, and we're happy to have you as an engaged member of the community! Always appreciate seeing your comments :)

  • @Xiler6969
    @Xiler6969 5 years ago +55

    Congratulations, you've impressed me. A very professional series. Right to the good stuff: clear and sharp voice, broad yet specific explanations.

  • @ajwadakil6020
    @ajwadakil6020 4 years ago +3

    This channel should have more subscribers, seriously

  • @engkamyabi
    @engkamyabi 2 years ago +2

    Amazing video and loved the short clips! Thank you!

  • @shivangitomar5557
    @shivangitomar5557 2 years ago +2

    The BEST VIDEO on this topic!

  • @Sikuq
    @Sikuq 4 years ago +5

    Beautifully done, Chris. Wow. Thanks. I learned a lot.

  • @Aweheid
    @Aweheid 4 years ago +2

    Rich, informative video!! No explanation is better than yours!!

  • @gabriellugmayr2871
    @gabriellugmayr2871 5 years ago +10

    Wow, I saw the first 4 videos of the PyTorch series and am impressed by how much time & effort you put into these tutorials. Thanks a lot.
    Also, you have developed enormously (although the older tutorials were already very good).

  • @IgorAherne
    @IgorAherne 6 years ago +17

    Good overview. Also, having 8 cores won't necessarily speed up computation by exactly 8x; perhaps by 7x in practice.
    I just wish you would mention that processors support SSE2, AVX2, and similar instruction sets that let each core do 8 summations/multiplications/shifts/etc. at a time, rather than one by one. This allows the CPU's registers to process arrays in chunks of 8. Many C/C++ programmers don't know about those, and build programs that are by default doomed to underperform.
    So I feel everyone is always unfair towards the CPU. Everybody points at the cores, but each core can (and should) use intrinsics, doing parallel work.
    This matters especially with various RNNs, where we only win by moving the entire algorithm to the GPU to avoid data-transfer bottlenecks, and when the RNN is decently wide in each layer.
    Also, the CPU is really flexible when it comes to 'if/else' or while loops, reacting quickly and nimbly when a branch occurs.

    • @deeplizard
      @deeplizard  6 years ago +2

      Hey Igor - Thanks for adding these details. Great stuff. Much appreciated! 🙏
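
A rough illustration of Igor's point about SIMD, as a minimal Python sketch (assuming NumPy is installed): NumPy's vectorized operations dispatch to compiled loops that can use SSE/AVX instructions, while a plain Python loop touches one element at a time.

    import time
    import numpy as np

    a = np.random.rand(10_000_000)
    b = np.random.rand(10_000_000)

    # One element at a time: no SIMD, plus interpreter overhead.
    t0 = time.perf_counter()
    slow = [x * y for x, y in zip(a, b)]
    print("python loop:     ", time.perf_counter() - t0)

    # Vectorized: one call into a compiled, SIMD-friendly loop.
    t0 = time.perf_counter()
    fast = a * b
    print("numpy vectorized:", time.perf_counter() - t0)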

  • @taj-ulislam6902
    @taj-ulislam6902 8 months ago +1

    Very professional video. Good information.

  • @bazzinga204
    @bazzinga204 3 years ago +2

    Beautifully explained.

  • @estebansevilla2229
    @estebansevilla2229 1 year ago +10

    I ended up here because my daughter is learning "AI" in high school, and now I need to understand how all this works to build her a PC.

  • @mayur9876
    @mayur9876 6 years ago +4

    Thanks for putting in all the effort.

  • @Ammothief41
    @Ammothief41 5 years ago +8

    Any idea how the RTX graphics cards and their tensor cores compare to the standard GTX GPUs? Is that something TensorFlow or PyTorch takes advantage of?

    • @deeplizard
      @deeplizard  5 years ago

      Haven't seen the comparisons. And yes: www.geforce.com/hardware/technology/cuda/supported-gpus

  • @MarcelloNesca
    @MarcelloNesca 3 years ago +4

    Thank you very much; I'm currently learning deep learning, and this was the perfect explanation of why I need a good GPU.

    • @chinonsoalumona6734
      @chinonsoalumona6734 2 years ago

      Hi Marcello, how's your deep learning experience going?

  • @jamesferry8484
    @jamesferry8484 6 years ago +5

    Thank you for sharing. Very helpful.

    • @deeplizard
      @deeplizard  6 years ago +1

      Hey James - You are welcome!

  • @AdrianDucao
    @AdrianDucao 5 years ago +144

    my girlfriend and I have been doing a lot of deep learning lately

    • @osumanaaa9982
      @osumanaaa9982 4 years ago +72

      You sure it's not just shallow computations? :p

    • @q3d385
      @q3d385 4 years ago +5

      @@osumanaaa9982 🤣

    • @Aditya_Kumar_12_pass
      @Aditya_Kumar_12_pass 4 years ago +30

      I hope you are not hoping for any output.

    • @anamitrasingha6362
      @anamitrasingha6362 4 years ago +3

      @@Aditya_Kumar_12_pass YouTube should have a Haha react xD

    • @vibesnovibes6320
      @vibesnovibes6320 4 years ago +11

      How many layers for protection? Are you clear on how backpropagation is supposed to work? 😂😂😂

  • @Not0rious7
    @Not0rious7 4 years ago +1

    Nice video, I like all the graphics you used. Where do you find them?

  • @durafshanjawad5250
    @durafshanjawad5250 4 years ago +1

    Very Helpful Series

  • @Henry..s
    @Henry..s 4 years ago +6

    What I learned from this video is that Nvidia GPUs got their speed from the CEO

  • @jordan5253
    @jordan5253 5 years ago +3

    Jesus @2:41 I spit out my water lol

  • @nozaxi0327
    @nozaxi0327 6 years ago +4

    Thank you. This series is so helpful for me.

    • @deeplizard
      @deeplizard  6 years ago +1

      Glad to hear that, Jesse! You're welcome!

  • @sqliu9489
    @sqliu9489 3 years ago

    brilliant explanation

  • @kudoamv
    @kudoamv 6 years ago +3

    PLEASE HELP. Is it something like:
    "you have to download PyTorch with CUDA if you want to use the GPU, or else you will only be able to use the CPU"?
    I am an AMD user.

    • @JasperHatilima
      @JasperHatilima 5 years ago +3

      CUDA is NVIDIA's platform, so it only supports NVIDIA cards like the GTX GPUs. For AMD, the framework for parallel programming is OpenCL... which unfortunately does not have a development community as big as CUDA's.

    • @ankitaharwal5886
      @ankitaharwal5886 5 years ago +1

      I can feel your pain 😔 😔
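
For the PyTorch side of this thread, a minimal sketch of the check being described (assuming the CUDA build of PyTorch is installed; on a machine without a supported NVIDIA GPU it simply falls back to the CPU):

    import torch

    # True only if PyTorch was installed with CUDA support AND a
    # working NVIDIA GPU + driver are present.
    use_cuda = torch.cuda.is_available()
    device = torch.device("cuda" if use_cuda else "cpu")

    t = torch.randn(3, 3).to(device)   # tensor lives on the GPU if available
    print("running on:", t.device)

For AMD cards, note that PyTorch has since added ROCm builds (Linux only, as far as I know), which expose the same torch.cuda API.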

  • @e4r281
    @e4r281 6 years ago +75

    Maybe if we start telling people the brain is an app they will start using it.

    • @clbl8706
      @clbl8706 4 years ago +2

      That's some cringeworthy joke my grandma would share on FB.

    • @e4r281
      @e4r281 4 years ago

      @@clbl8706 actually, it's not a joke.

    • @nikhilmathur3351
      @nikhilmathur3351 4 years ago +1

      @@e4r281 Unlike you

    • @hailongvan8285
      @hailongvan8285 4 years ago

      ok commie

  • @vibesnovibes6320
    @vibesnovibes6320 4 years ago +1

    Great video

  • @ajaykrishnan4277
    @ajaykrishnan4277 6 years ago +4

    you just nailed it

  • @SW-ud1wt
    @SW-ud1wt 8 months ago

    I have an HP Envy Core i7 laptop with a GeForce RTX 2050 card; can it be used for machine learning tasks?

  • @mexzarkashiev2435
    @mexzarkashiev2435 5 years ago

    So which graphics card should I buy for deep learning?

  • @RohanSawant-s3q
    @RohanSawant-s3q 8 months ago

    😄very nice video

  • @henson2k
    @henson2k 4 years ago

    When multiple apps are using CUDA, how is that managed by the GPU? Can the GPU execute different kernels at the same time?
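
Roughly: separate processes share a GPU by time-slicing between their contexts (unless something like NVIDIA's MPS is in use), while kernels launched on different CUDA streams within one process may overlap when the GPU has spare resources. A minimal sketch of streams in PyTorch (assuming a CUDA-capable setup):

    import torch

    s1 = torch.cuda.Stream()
    s2 = torch.cuda.Stream()

    a = torch.randn(2048, 2048, device="cuda")
    b = torch.randn(2048, 2048, device="cuda")

    # Work issued on different streams is allowed to run concurrently;
    # within a single stream, operations execute in order.
    with torch.cuda.stream(s1):
        c = a @ a
    with torch.cuda.stream(s2):
        d = b @ b

    torch.cuda.synchronize()   # wait for both streams to finish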

  • @True38
    @True38 5 years ago

    Did they remove all those stats/functions in a newer version of Cudo? Because I just recently downloaded it, and the only thing I can see on the screen is CPU, XMRig, and h/s on the left and Payout Coin on the right. That's it! I'm using the CPU but want to use the GPU, and can't see any option. Please help if you can.

  • @jackt9535
    @jackt9535 2 years ago

    I'd like to know whether you can use a dedicated graphics card for deep learning when your CPU has no iGPU.
    This would help me a lot with my screen cable management issue (I'm new to this)!
    Thanks!

  • @anujsaboo7081
    @anujsaboo7081 5 years ago +1

    You have a mistake in the quiz section:
    Q. Different PyTorch components are written in different programming languages. PyTorch is written in all of the following programming languages except?
    Ans. Java (on the blog, the correct answer is shown as Python)

    • @deeplizard
      @deeplizard  5 years ago +1

      Hey Anuj - Thank you so much for pointing this out! I've fixed it. You may need to clear your cache to see the change.
      Chris

  • @TheTariqibnziyad
    @TheTariqibnziyad 5 years ago +11

    Finally these nerds had the guts to give something a funny name: "embarrassingly parallel" xD

    • @deeplizard
      @deeplizard  5 years ago +3

      🤓

    • @zackrider3708
      @zackrider3708 3 years ago +1

      @@deeplizard Can the new Xe server GPU from Intel handle AI or deep learning workloads like an Nvidia GPU?

  • @faizahmed8034
    @faizahmed8034 4 years ago

    Aren't GPUs used for image processing (i.e., converting binary data into graphics pixels)? If so, how can we use them for mathematical computations?

  • @rifatmasud
    @rifatmasud 5 years ago +3

    Anyone have the link to that video about "Python is slow"?

    • @deeplizard
      @deeplizard  5 years ago +2

      Added it to the description. Here you go: ruclips.net/video/DBVLcgq2Eg0/видео.html

    • @arkamitra4345
      @arkamitra4345 5 years ago +1

      GIL is evil 😓

  • @benjaminhansen4808
    @benjaminhansen4808 3 months ago +3

    You explained this whole AI trend 5 YEARS AGO

  • @robertatvitalitystar2444
    @robertatvitalitystar2444 6 years ago +4

    ty! Peeling the plastic off a brand-new GPU is a good day, lol.

  • @qusayhamad7243
    @qusayhamad7243 4 years ago +1

    thank you

  • @krishna_o15
    @krishna_o15 4 years ago

    I agree, since Soumith said it.

  • @lancelotxavier9084
    @lancelotxavier9084 5 years ago +4

    Nvidia is holding back processing power in order to make selling their products sustainable.

    • @deeplizard
      @deeplizard  5 years ago +5

      This is important. A company's incentive to make a profit can be a double-edged sword. Consider the same problem in healthcare or biotech.

    • @henson2k
      @henson2k 4 years ago

      @@deeplizard conflict of interest indeed; every software developer knows that LOL

  • @magelauditore333
    @magelauditore333 4 years ago

    5:25 I was like wtf. Can anyone share the link to the whole video? 😂😂 Man, I got excited and started shouting at home

    • @deeplizard
      @deeplizard  4 years ago

      It's a popular one. Google and you will find 😂

  • @adithi729
    @adithi729 4 years ago

    Sir, is there any way to do CUDA programming online? I mean, is there any online compiler available now? My system does not support CUDA... pls help

  • @henson2k
    @henson2k 4 years ago

    At the research stage I can see how Python is an acceptable choice. However, for production systems Python is too large and not fast enough!

  • @_SupremeKing
    @_SupremeKing 1 year ago +1

    Although I'm still confused, I just picked up some new knowledge as a layman.

  • @ezevictor4448
    @ezevictor4448 5 years ago

    Is a GTX 960M or 1050M worth using?

  • @MMphego
    @MMphego 4 years ago +1

    Subscribed.............

  • @zeppybrawlstars3906
    @zeppybrawlstars3906 5 years ago +1

    So when I play games with advanced AI, I should make sure my GPU is ready

  • @Priya_dancelover
    @Priya_dancelover 2 years ago +1

    excellent

  • @vky771
    @vky771 6 months ago +1

    Everyone wishes they saw your video 5 years back 😅

  • @isbestlizard
    @isbestlizard 5 years ago +2

    OH MY GOD WOW are you a lizard too? i love ai and stuff as well :D

  • @alkeryn1700
    @alkeryn1700 5 years ago

    Guys, don't use CUDA; use HIP so it runs everywhere,
    or use OpenCL or SYCL, but don't let your software get stuck on proprietary, platform-specific hardware and software.

  • @louerleseigneur4532
    @louerleseigneur4532 4 years ago +1

    thank you, thank you

  • @A-Predator
    @A-Predator 4 years ago +1

    I just learned the name of deep learning
    😵

  • @kanui3618
    @kanui3618 5 years ago

    Can I combine an Nvidia GTX 1070 or higher with an AMD Ryzen 5?

  • @richarda1630
    @richarda1630 3 years ago

    Exciting times, and thrilling ones if you read the WBW blog

  • @pscheuerling
    @pscheuerling 4 years ago

    I came here to learn how to utilize deep learning cores for training my own AI... I still don't know why I should buy cores I can't use.

    • @A-Predator
      @A-Predator 4 years ago +1

      😵

    • @lakeguy65616
      @lakeguy65616 2 years ago

      The forward pass relies on matrix math, which can be run through CUDA (a software layer) and executed on an NVIDIA GPU. The more GPU cores, the faster the process. A GPU with 100 cores will perform this step roughly 10x faster than a GPU with 10 cores (in general...).
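
A minimal sketch of that comparison on a single matrix multiply (assuming a CUDA-capable GPU and the CUDA build of PyTorch; the actual speedup depends heavily on the card and the matrix sizes):

    import time
    import torch

    x = torch.randn(4096, 4096)
    w = torch.randn(4096, 4096)

    # CPU matmul
    t0 = time.perf_counter()
    y = x @ w
    print("cpu:", time.perf_counter() - t0)

    # GPU matmul. CUDA kernels launch asynchronously, so synchronize
    # before and after to get an honest wall-clock timing.
    xg, wg = x.cuda(), w.cuda()
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    yg = xg @ wg
    torch.cuda.synchronize()
    print("gpu:", time.perf_counter() - t0)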

  • @sajolsajol8393
    @sajolsajol8393 1 year ago +1

    wow!

  • @beomheelee1249
    @beomheelee1249 3 years ago +1

    You're such a genius!! Thanks

  • @SuperMIKevin
    @SuperMIKevin 1 year ago +1

    Lmao, I can't believe they actually named it embarrassingly parallel.

  • @hedonismbot1508
    @hedonismbot1508 1 year ago

    Never in my lifetime would I have ever imagined "Embarrassingly [something]" would be an actual technical term.

  • @DinoFancellu
    @DinoFancellu 4 years ago

    It's a pity that AMD doesn't seem to support CUDA. Their new Big Navi cards look really nice apart from that.

  • @avdhutjadhav5657
    @avdhutjadhav5657 2 years ago

    Does that mean a GPU with more CUDA cores is better for deep learning?

  • @Joco5012
    @Joco5012 3 years ago

    Wow, was it really OK to do that much coke back in the day? 5:12 damn dude, take it down a notch

  • @Infinityxx3
    @Infinityxx3 6 years ago +1

    u need some music in the vid... attracts views... cool vid

    • @deeplizard
      @deeplizard  6 years ago

      What kind of music do you like?

    • @Infinityxx3
      @Infinityxx3 6 years ago +1

      @@deeplizard PEWDIEPIE "Bitch Lasagna"? jk... any relaxing music when u talk... when u change camera shots etc... ;)

    • @deeplizard
      @deeplizard  6 years ago

      haha. That sent us on a tangent. Hadn't seen that before.🤣

  • @alebecker12
    @alebecker12 10 months ago

    I have a doubt: does Nvidia have a monopoly on such hardware? If not, does CUDA only work on Nvidia hardware?

    • @deeplizard
      @deeplizard  10 months ago +1

      Yes. Nvidia built CUDA. It only works with their hardware.

  • @robertsmith512
    @robertsmith512 5 years ago +1

    SUBBED !
    EVERYBODY SUB THIS CHAN !
    THIS ONE KNOWS HIS STUFF !
    GO LOOK AT THE PLAYLIST LIB !

    • @deeplizard
      @deeplizard  5 years ago

      Thanks Robert! Note that there are two of us here. 🦎🦎

  • @りょりょりょ-b6s
    @りょりょりょ-b6s 4 years ago

    5:57 Tokyo Tech showing up is hilarious lol

  • @MeowsyDancer
    @MeowsyDancer 3 years ago

    I will now send this to anyone who asked why I bought a 3090! rip wallet tho

  • @MilesBellas
    @MilesBellas 1 year ago +3

    Jensen is better at 2x

  • @cubul32
    @cubul32 4 years ago

    Former CEO - and we can see why.

    • @deeplizard
      @deeplizard  4 years ago

      Although, he became a billionaire from his tenure at Microsoft 😄

  • @invest8198
    @invest8198 3 months ago

    Anyone here who bought $NVDA in 2018?

  • @ErrorRaffyline0
    @ErrorRaffyline0 4 years ago

    I really dislike CUDA because it's not open-source and AMD is not able to use it; it makes development for both AMD and NVIDIA much harder.

  • @iLevelTechnology
    @iLevelTechnology 7 months ago

    He doesn't explain tensors very well, but overall good job

  • @Drtsaga
    @Drtsaga 4 years ago

    "Deep learning" should not be in the title, in my opinion.

  • @jacobcorr337
    @jacobcorr337 3 years ago +2

    CUDA not explained at all

    • @lakeguy65616
      @lakeguy65616 2 years ago +1

      CUDA is a software layer that interfaces with Nvidia GPUs to allow porting some problems (think: the forward pass) to the GPU, where they can be done in parallel. (If your PC has an Nvidia GPU, then with software like PyTorch you tell it that CUDA is available and to send certain operations to the GPU for parallel processing.) Vastly oversimplified.
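
A minimal sketch of that "send it to the GPU" step in PyTorch (the layer sizes here are arbitrary placeholders):

    import torch
    import torch.nn as nn

    model = nn.Linear(784, 10)   # a toy one-layer "network"

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)     # parameters now live in GPU memory

    batch = torch.randn(64, 784, device=device)
    logits = model(batch)        # forward pass runs on the GPU if moved
    print(logits.shape, logits.device)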

  • @escapefelicity2913
    @escapefelicity2913 3 years ago

    I'm glad I don't need to listen to Jensen Huang.

  • @mmm-ie5ws
    @mmm-ie5ws 8 months ago

    u did a really bad job of explaining why GPUs are better at parallel computing than CPUs.