GPU Computing Explained | How A GPU Works

  • Published: 19 Sep 2024

Comments • 150

  • @OptimisticFuturology
    @OptimisticFuturology  6 years ago +4

    Want to learn more about the Technological Revolution? Watch our playlist here: ruclips.net/video/ENWsoWjzJTQ/видео.html
    - ALSO - Become a RUclips member for many exclusive perks from exclusive posts, bonus content, shoutouts and more! subscribe.futurology.earthone.io/member - AND - Join our Discord server for much better community discussions! subscribe.futurology.earthone.io/discord

    • @holdthetruthhostage
      @holdthetruthhostage 5 years ago

      I don't think it will take until 2040; most likely a startup with 3D printing will accomplish this sooner

    • @bhuvaneshs.k638
      @bhuvaneshs.k638 5 years ago

      Your channel is awesome... Please do a video on TPUs

    • @safaulansari7782
      @safaulansari7782 3 years ago

      P

    • @darrenc2370
      @darrenc2370 9 months ago

      the music from 7:46 onwards, anyone else getting Serial Experiments Lain vibes there?
      Edit:
      link for the uninitiated ruclips.net/video/iOVlx4PRxZE/видео.htmlsi=optCloI6NAEgjj9w&t=248

  • @PauLWaFFleZ
    @PauLWaFFleZ 6 years ago +146

    Please PLEASE do NOT stop this series on Computing.

  • @kayrosis5523
    @kayrosis5523 6 years ago +88

    Graphics cards 100,000x as good as now? That simulation hypothesis might be onto something

    • @gs-nq6mw
      @gs-nq6mw 4 years ago +3

      Nope, Moore's law is expected to end by 2020 by most specialists, including Moore himself

    • @projectjt3149
      @projectjt3149 4 years ago +2

      g s how about Tensor Cores?

    • @thewalnutwoodworker6136
      @thewalnutwoodworker6136 3 years ago +1

      We are down to 2nm as of 2021; 1nm will be the end of silicon. We might be able to push it farther with other atoms to go sub-nm. We will probably make major advancements in architecture before we go quantum. For example, x86 is bloated af; RISC is what we need past 1nm. AMD is making major pushes in architecture such as chiplets/MCM. As of now the RDNA3 leaks are showing over 2x performance on rasterization.

    • @spacevspitch4028
      @spacevspitch4028 2 years ago +1

      @@gs-nq6mw Yowza. 2 years ago and still going.

    • @player111q7
      @player111q7 1 year ago

      No it's not; it's 2023 and it's still valid

  • @PauLWaFFleZ
    @PauLWaFFleZ 6 years ago +49

    Bro your videos are simply AMAZING. Your presentation completely pulls you in and makes you wish the video never stops. Makes me more excited to be going to school for Computer Engineering. Keep at it bro...

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago +5

      Thank you for watching :)!

    • @projectjt3149
      @projectjt3149 5 years ago

      Oh it didn't stop for me! I am about to write a research abstract in college discussing GPU computing and this video was a HUGE help!

  • @L2Xenta
    @L2Xenta 6 years ago +7

    +1 sub for showing off the ambition of Star Citizen, a game project that is taking a long time but pushes past multiple barriers of a limited industry.

  • @annoloki
    @annoloki 6 years ago +15

    Do check out IBM's TrueNorth architecture, which will put an end to all this GPU-for-AI stuff, as TrueNorth doesn't run code to simulate neurons - it provides silicon-based neurons on a chip. They are clockless, parallel, programmable, and run faster than instruction-based neural nets, yet use an absolutely minute amount of power to do the same thing. Neurons in silicon rather than in software is what will really revolutionise AI.

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago +6

      Thanks for watching! I have a video on neuromorphic computing coming soon!

    • @walter0bz
      @walter0bz 6 years ago

      I hope that AI solutions keep being programmable. The same thing happened with graphics: we went from pure software, to fixed-function hardware accelerators, to programmable hardware that eventually generalised into a fully programmable parallel processor. The reason is the variety in algorithms (e.g. convolutions, capsules, various compression schemes); I would prefer to see this variety increase. We'd just see chips that are more dataflow/low-precision oriented for AI (rather than GPUs, whose datapaths and precision grew around the demands of texturing and geometry).

    • @annoloki
      @annoloki 6 years ago

      Fantastic. If I can recommend something well worth looking at in that realm, which you may wish to incorporate bits of into your video (if you've not come across it already): an explanation of the general-purpose cortical column, which is slightly different from deep-learning neural nets. It uses SDR (sparse distributed representation) to act as memory, plus pattern prediction that branches out to predict many different possible futures, allowing pruning of branches that didn't occur once data comes in. It's what powers mammalian (including our) brains, and is being put to use for everything from recognising when faults will occur on computer systems to predicting power generation requirements on a grid. Numenta seem to be a/the leader in this work. It might be a bit out of scope if you're more on the [near-]consumer tech side, but it's super interesting and will play a big role in our futures, so I'll just give you this one video link as it contains some good visual explanations, and will spare you further recommendations unless they're helpful/welcome rather than redundant. Cheers for replying :-)
      ruclips.net/video/iNMbsvK8Q8Y/видео.html

    • @annoloki
      @annoloki 6 years ago +1

      walter0bz - TrueNorth is programmable, but not in the same way as a general-purpose processor, because it's clockless, asynchronous, very highly parallel, etc. It's more like electrical wiring: like having a load of gates, capacitors and transistors all on silicon, where "programming" uses an FPGA to wire them up into your own circuits. It's more like a physical brain, where you choose how neurons connect to each other and how they behave, than like a CPU. But it can do the job of a rack of servers in something that will fit in a phone and barely touch your battery. I'll leave it there, as SP's doing a video on the subject and he'll no doubt do a better job covering it. Oh, just to say, I wouldn't worry about hardware limitations when it comes to AI; it's US who are going to be falling behind, especially now that Google are already using AIs to make AIs better than people can make them.

    • @walter0bz
      @walter0bz 6 years ago

      Programmability is a sliding scale. 'Fixed-function' GPUs are configurable and can do different effects with layering and ordering according to a 'program', but fine-grained programmability in shaders was superior. The device I hope 'wins' is the grid of RISC cores with inter-core messaging (this can be clockless, highly parallel...), just with the right custom instructions (e.g. low-precision dot products) to handle AI workloads.

  • @tomojeetchakraborty5459
    @tomojeetchakraborty5459 6 years ago +8

    I am your greatest fan, sir. I am very obliged by the knowledge you provide us, and for making us aware of modern trends.
    💐💐💐💐💐💐

  • @djsvideodiarys
    @djsvideodiarys 6 years ago +51

    Can't wait for QPU.

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago +32

      But could it run Crysis ;)

    • @djsvideodiarys
      @djsvideodiarys 6 years ago +3

      Hahahaha

    • @djsvideodiarys
      @djsvideodiarys 6 years ago +10

      Haha Hopefully it will solve Crisis

    • @ActualGenius
      @ActualGenius 6 years ago +3

      Feel like I was processing in CPU terms my whole life and instantaneously entered QPU age when I started watching this channel.

    • @TheDanm22
      @TheDanm22 5 years ago +1

      Nothing will run Crysis.
      @@OptimisticFuturology

  • @googledev566
    @googledev566 4 years ago +1

    *_Keep creating such crucial and informative videos_*

  • @thejointcoach
    @thejointcoach 6 years ago +3

    Please do a video on quantum computing!! I love your channel and I think you deserve way more subscribers, I'll do my best to spread your name

  • @rawding1976
    @rawding1976 6 years ago +8

    One of my favorite channels!!! This channel is gonna explode with subscribers very soon! Watch & see! Great job as always!

  • @Slayer3915
    @Slayer3915 5 years ago

    Apparently it's hard for some people, but I for one appreciate your spoken words per minute.

  • @rahmanash9856
    @rahmanash9856 6 years ago +1

    Awesome as always... waiting for quantum and other types of computing, graphene applications, AI and so much more

  • @marymcreynolds8355
    @marymcreynolds8355 6 years ago +6

    Star Citizen... quite a peek at the latest peak.

  • @The_Masked_Frenchman
    @The_Masked_Frenchman 5 years ago

    Watching these is reinvigorating my love for technology and makes me want to go back to school for computer engineering

  • @madsgrand
    @madsgrand 6 years ago +4

    Please change the ColdFusion-inspired intro; it's just too close for comfort. On the content side your videos are so much better!

  • @barney9008
    @barney9008 6 years ago +1

    Nice delivery, reminds me of ColdFusion

  • @james_gemma
    @james_gemma 6 years ago +3

    Your channel should have way more views and subscribers. I guess it's only a matter of time before everyone discovers your excellent informative tech videos.

    • @cdreid99999
      @cdreid99999 5 years ago

      Funny the number of comments mirroring this. In a row. But I'm sure you're not sockpuppets or bots..

    • @originproductions6120
      @originproductions6120 5 years ago

      He's ripping off ColdFusion's intro completely

  • @dangdiggity9916
    @dangdiggity9916 5 years ago +1

    One thing I'm wondering about for "AI" in gaming: could they optimize ray tracing for certain games, so that when you open the game it has already figured out where on the map it's used? It would basically be learning something before the user does it for the first time (but of course probably still improving after that)

  • @bommaritohawaii
    @bommaritohawaii 6 years ago +9

    Great job!

  • @Mordred478
    @Mordred478 6 years ago +2

    Great video, very informative. Is the implication here that Dell and other companies will soon start offering PCs with GPUs instead of CPUs?

    • @jwadaow
      @jwadaow 6 years ago

      No

    • @mike288190
      @mike288190 5 years ago

      I believe you would still need a CPU

  • @karehaqt
    @karehaqt 5 years ago

    Just discovered your channel via this series and you gained my sub, great videos so far.

  • @srungarapusaikrishna5583
    @srungarapusaikrishna5583 4 years ago +1

    That my friend felt like a rocket science class!!!😵🤒

  • @wolfisraging
    @wolfisraging 6 years ago +2

    Great job, make more

  • @borisgotov9838
    @borisgotov9838 6 years ago +1

    Give a simple piece of code, a little more complex than a hello-world program. Something like a rolling ball or a rotating square...
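
A minimal sketch of the kind of example being asked for (not from the video; plain Python, assuming a unit square and a standard 2D rotation matrix). The per-vertex math below is exactly the kind of independent, identical work a GPU vertex shader runs across thousands of vertices in parallel:

```python
import math

def rotate_vertices(vertices, angle_deg):
    """Rotate 2D points about the origin - the same per-vertex math a
    GPU vertex shader applies to many vertices in parallel."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    # Each vertex is independent of the others, which is exactly why a GPU
    # can map one core (or SIMD lane) to each vertex.
    return [(x * cos_a - y * sin_a, x * sin_a + y * cos_a)
            for x, y in vertices]

# A square centred on the origin, rotated 90 degrees:
square = [(1, 1), (-1, 1), (-1, -1), (1, -1)]
rotated = rotate_vertices(square, 90)
```

On a real GPU the loop disappears: the same two multiply-adds per coordinate are issued once and executed for every vertex simultaneously.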

  • @dinozaurpickupline4221
    @dinozaurpickupline4221 3 years ago +1

    An AI could be used to map tonal changes of different structures and things, and the outcome could be used to decrease load times in games. How cool would it be if a computer or software already knew the reflection details of every object (texture, color variation, size and scaling), using separate sets of data to create further smaller tests? This would increase performance drastically

  • @zahanjavaid
    @zahanjavaid 5 years ago +1

    I seriously liked your videos, but I'm not in a position to understand most of them
    LOL 😂😂😂

  • @NietJeffrey
    @NietJeffrey 5 years ago

    I wish you every bit of YouTube fame coming your way. You have some great content!

  • @meowmeowmoogabenrules4854
    @meowmeowmoogabenrules4854 5 years ago

    Why is YouTube barely showing me this? Amazing work man

  • @peefwellington8794
    @peefwellington8794 4 years ago

    Boy, the new gen of GPUs in 2020 is gonna be incredible

  • @PabloGonzalezVargas
    @PabloGonzalezVargas 5 years ago

    Wow !!! I'm a big fan of your channel, impressive eloquent work *

  • @benyeo7930
    @benyeo7930 5 years ago

    I love all your videos - great content that is educational and serves as a good primer for the uninitiated. Two issues with all your videos: the commentator's voice is way too low and flat (no fluctuation in tone at all; it drones on and on), and the narration is also way too rapid. The good thing is there are subtitles to follow, which solves the issue for the super-interested audience. However, even scanning the subtitles requires full and complete attention, which reinforces the point that the speed and tone issues are real!

  • @chuckbuckets1
    @chuckbuckets1 5 years ago

    AI and protein folding will be one of the most profound paradigm shifts of humanity.

  • @HeadStronger-HS
    @HeadStronger-HS 6 years ago

    This blew my mind... major advances in GPUs

  • @supremepartydude
    @supremepartydude 6 years ago +1

    As a computer enthusiast for 30 years, I'd say you did a great job.

  • @DarthRaver-og7qq
    @DarthRaver-og7qq 5 years ago

    Damn, think about what a laptop or desktop gaming rig will look like in, say, 50 years?? Could you imagine having something portable the size of a Nintendo Switch, yet as powerful as a full desktop gaming rig today with the best of everything? That's crazy lol. I hope I'm still alive by then. Everything in the world looks like it's actually heading toward a "Blade Runner" type civilization lol.

  • @ELECTR0HERMIT
    @ELECTR0HERMIT 4 years ago

    excellent job.

  • @edwardbrownstien8741
    @edwardbrownstien8741 5 years ago

    Great channel. Love the content.

  • @infinitworld7106
    @infinitworld7106 6 years ago +8

    MORE CONTENT!!!

  • @usertogo
    @usertogo 6 years ago

    Nice, now if somebody has a graphics accelerator how does one enable the cores to be easily used by the operating system and applications?

  • @system2072
    @system2072 6 years ago +1

    Great videos man... but can you please slow down while explaining? You speak fast, which is sometimes very hard to understand...

  • @anshulsharma9424
    @anshulsharma9424 6 years ago

    Keep up the good work

  • @pegasusted2504
    @pegasusted2504 6 years ago

    Good stuff all round :~)

  • @DANTHETUBEMAN
    @DANTHETUBEMAN 6 years ago +6

    As soon as computers get consciousness, they will no longer serve humanity.

    • @flynnkay
      @flynnkay 5 years ago

      Ok then, I'll unplug that bitch haha

    • @cdreid99999
      @cdreid99999 5 years ago +1

      You don't understand how computers work, then. And we are nowhere close to an AI. What the hype wagon is calling AI now is what we used to call expert systems, i.e. standard programming/algorithms. We simply don't have the processing power yet. We might be able to build a simulated human brain, but it would be built on FPGA-like neural network boards, be the size of a Walmart, and probably require a power plant to run

    • @TheDanm22
      @TheDanm22 5 years ago

      @@cdreid99999 AI is going under a new dynamic now. One AI is programmed to evaluate and judge; another AI is programmed to design and invent. This is the judge and the inventor AI: two AIs working together to improve each other's processes. It won't be the size of a Walmart. It's going to evolve exponentially.

    • @viniciusbueno2160
      @viniciusbueno2160 5 years ago

      And now, 1 or 2 months ago, IBM released the first quantum computer outside the lab!!!! Now things can move faster

  • @Army2willis
    @Army2willis 5 years ago

    I see you popped in some SCU to show just how crazy graphics are today. You know you like SC

  • @daniel_960_
    @daniel_960_ 5 years ago

    But what do the cores in Apple's mobile graphics mean? The A11 and A12 have only 3 or 4 graphics cores but are still really powerful. Previous generations had more cores as far as I know.

  • @winkipinky
    @winkipinky 5 years ago

    Fantastic .... 😁

  • @TheDanm22
    @TheDanm22 5 years ago

    In the first minute... that's called Moore's law. It deserves the reference.

  • @nad1901
    @nad1901 5 years ago +1

    Still, I don't know how people get the idea of putting annoying background music on informative videos. And since when did we have music playing while we learn in school? :/

  • @hemendrapratapsingh4156
    @hemendrapratapsingh4156 3 years ago

    So I decided to watch the complete ad today. But it skipped automatically. 🤐

  • @snoogboonin
    @snoogboonin 5 years ago

    Your vids are fucking unreal dude. Subbed.

  • @MyWatchIsEnded
    @MyWatchIsEnded 5 years ago +2

    But can the GPU from 2040 run Crysis?

    • @ahuttee
      @ahuttee 5 years ago +1

      Might be possible

  • @brushhog7089
    @brushhog7089 6 years ago

    Nice horn in the background, but really distracting. I guess I'll go somewhere else, as I was here for information

  • @PauLWaFFleZ
    @PauLWaFFleZ 6 years ago

    When can we plan on seeing some of the videos on the AI series?

    • @OptimisticFuturology
      @OptimisticFuturology  6 years ago

      Starting May!

    • @PauLWaFFleZ
      @PauLWaFFleZ 6 years ago

      Ah come on man, I can't wait that long... You gotta give me the lineup for what else is going to be coming out until then...

  • @govinds3951
    @govinds3951 6 years ago

    Jheeez good work

  • @wandrinsheep
    @wandrinsheep 6 years ago

    Oho, a Star Citizen fan I see, awesome

  • @m_sedziwoj
    @m_sedziwoj 6 years ago

    From last week: look at Google TPU 3, 100 petaflops for DL; it's 1000x more than an Nvidia Titan V
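
The "1000x" figure is roughly the right order of magnitude, assuming the commonly quoted specs (my assumption, not stated in the thread): ~100 petaflops for a TPU v3 pod and ~110 teraflops of peak tensor throughput for a Titan V:

```python
# Figures quoted in press coverage at the time (assumptions, not from the thread):
tpu_v3_pod = 100e15   # ~100 petaflops for a full TPU v3 pod
titan_v = 110e12      # ~110 teraflops peak tensor-core throughput for a Titan V

ratio = tpu_v3_pod / titan_v
print(round(ratio))   # ~909, so "1000x" holds as an order of magnitude
```

Note the comparison is pod-versus-single-card; a single TPU v3 chip is far closer to one GPU.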

  • @meatofpeach
    @meatofpeach 6 years ago +1

    Incredible RUclips channel. Wow. Keep it up

  • @jackharpe3rd233
    @jackharpe3rd233 4 years ago

    I couldn't care less about AI or a real-life HAL or Skynet. Not because I'm scared, but because all I truly want is more pixels and more polygons being rendered for my video games. Unless Moore's law affects that, okay, let's use ray tracing and other visual tricks to fool our eyes into a better graphical future. I also want great storytelling, which, thanks to the rise of SJWs, e-sports, Illumination Studios, and modern activism, the world of computer-generated imagery has been told doesn't matter anymore! Please don't let us down, Sony!

  • @tonytony7225
    @tonytony7225 5 years ago

    You gotta talk about AMD's Threadripper and its AI technology

  • @fuzzylumpkin8030
    @fuzzylumpkin8030 5 years ago

    Yeah that's cool, but at what cost to gaming?

  • @sarmadnajim4839
    @sarmadnajim4839 6 years ago

    Wonderful documentary, direct and clear, smartly done 👍🏻

  • @aoeu256
    @aoeu256 5 years ago

    GPU voice recognition? Never heard of it.

  • @Chrisimplayer
    @Chrisimplayer 5 years ago

    To me it's highly debatable whether Star Citizen is still in development

  • @Keiktu
    @Keiktu 5 years ago

    Insta-subscribed

  • @kokomanation
    @kokomanation 6 years ago

    I feel that the CPU and GPU are getting merged together

    • @rpzcsonli
      @rpzcsonli 5 years ago

      A CPU can calculate everything; a GPU needs special cores to be better than a CPU. So if they added cores to calculate everything, the GPU would become a CPU. It won't become one, because it would be too big, expensive and power-hungry. They could make an AiPU (AI processing unit) and put it in a card, an RTPU (ray-tracing processing unit) and put it in a card, and so on, and make it so you can add 20 cards to your computer; then we would have true "performance" in everything, but it won't happen because it's stupid. The CPU will be the "CENTRAL Processing Unit" until humanity ends, and the GPU will just be its slave doing all the work; the CPU then puts everything together and makes it pleasant for you to interact with.

  • @platin2148
    @platin2148 5 years ago

    The stacking will also make Nvidia obsolete, as both Intel and AMD can make pretty capable GPUs, so there's no need to work with Nvidia.
    So I suspect they'll either go server or bet on Arm; if they could make an Arm1000 chip that is tightly integrated with their GPUs, they would basically have won.
    And his saying that he made a special chip for AI is completely wrong; he made matrix calculations faster, not AI, and it could be even faster with FPGAs.

  • @vyor8837
    @vyor8837 5 years ago

    Volta isn't on a true 12nm node.

  • @ekaterinavalinakova2643
    @ekaterinavalinakova2643 6 years ago +2

    A 1.13 quintillion FLOP gaming system by 2040: 100,000 x 11.3 teraflops.

    • @xsuploader
      @xsuploader 6 years ago

      Not quite, he said in the 2040s, not 2040. At the current rate of 1.5x per year it would take log(100000)/log(1.5) years, or approximately 28 years, putting the year at 2046, right around the 2045 singularity proposed by Kurzweil.
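
Both back-of-the-envelope numbers in this thread check out (assuming, as the thread does, an ~11.3-teraflop baseline card and 1.5x improvement per year):

```python
import math

baseline = 11.3e12   # ~11.3 teraflops today (figure from the thread)
multiple = 100_000   # "100,000x as good"

# 100,000 x 11.3 TFLOPS = 1.13e18 FLOPS, i.e. 1.13 exaflops (quintillion FLOPS)
target = multiple * baseline

# Years of 1.5x-per-year growth needed to accumulate a 100,000x multiple:
# 1.5**years == 100_000  =>  years = log(100_000) / log(1.5)
years = math.log(multiple) / math.log(1.5)

print(f"{target:.2e}")  # ~1.13e+18 FLOPS
print(round(years, 1))  # ~28.4 years, so roughly the mid-2040s
```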

  • @harrym8556
    @harrym8556 5 years ago

    "1^14 FLOPS in performance..." Dude, what are you talking about??
    You know that 1^14 = 1, right?
    Did you mean to say 10^14 FLOPS?
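
The complaint is just exponent notation: the "^" on screen has to mean a power of ten, since one raised to any power is still one, while 10^14 (100 trillion) is a plausible FLOPS figure:

```python
# 1^14 is 1; 10^14 is 100 trillion, i.e. the 100-TFLOPS scale the video means.
assert 1 ** 14 == 1
assert 10 ** 14 == 100_000_000_000_000
```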

  • @bassbs
    @bassbs 6 years ago

    Did you, SP, do SU?

  • @frozencode5238
    @frozencode5238 6 years ago

    I love you man...

  • @BakiWho
    @BakiWho 6 years ago

    You sound like the Hardy Boys in South Park :) I have a raging clue

  • @Gollywog
    @Gollywog 6 years ago +7

    You talk too fast. I love the info, but it needs my full attention (not something I can listen to in the background) because you say everything so fast.

    • @735Secure
      @735Secure 6 years ago

      StiX, it's because he's just reading the stuff. If you have a technical background and are a scientist or an engineer, you don't just put on a show. He is all about the show. Decent information, but I don't trust the information he provides.

    • @curiosity1865
      @curiosity1865 6 years ago +1

      Turn down the speed of the video

    • @originproductions6120
      @originproductions6120 5 years ago

      @@polishpepe239 Stop trying to sound smart. I'm sure you wouldn't have a problem with 10x speed as well, right, because you're such an intellectual. It's annoying because my full attention has to be on the video, and if I'm playing Halo this guy talks too fast for that. Stop trying to show off and be honest with yourself. Can you understand him at 2x speed? Probably, but that misses the whole point of the video. That point is to absorb the information he's giving you and think about it, and you can't do all of that at 2x speed even if you're Albert fucking Einstein

    • @originproductions6120
      @originproductions6120 5 years ago

      @@polishpepe239 Also, I just watched it at 2x speed and now know that you're just bullshitting if 2x speed isn't even an ideal speed. Stfu, no one cares about you trying to sound smart

  • @mohamedsalahoshi1486
    @mohamedsalahoshi1486 6 years ago

    *J* *U* *S* *T* *A* *M* *A* *Z* *I* *N* *G*

  • @kapilbsingh
    @kapilbsingh 6 years ago

    Whatever they develop, it will find a place in landfills.

  • @MegaFlemo
    @MegaFlemo 5 years ago

    WOW

  • @Sean_Lightning_OBrien
    @Sean_Lightning_OBrien 6 years ago

    At this point I'm still waiting for the GTX 1180/2080 😂

  • @szirbektamas2571
    @szirbektamas2571 5 years ago

    After this video I feel so stupid

  • @strangevideos3048
    @strangevideos3048 5 years ago

    We live in the Matrix!

  • @CCRob720
    @CCRob720 3 years ago

    What could we do with a billion times the GPU power.....

  • @SumWanYo
    @SumWanYo 6 years ago

    Why is the Nvidia CEO so nervous?

    • @rpzcsonli
      @rpzcsonli 5 years ago

      He doesn't know how to burn AMD to the ground so he can have all the money, and maybe there are some monopoly issues and anti-competitive practices that he pays governments not to dismantle Nvidia over. Nvidia single-handedly slowed progress for GPUs by making all this bullshit technology and buying up competitors and other technologies to use in their cards only, so theirs would be "better", then stalling for 2-5 years until AMD catches up, then getting another "revolutionary" Nvidia-only technology, "helping" developers by giving them the tech and money while destroying AMD performance, waiting again until AMD catches up, and repeating. I'm not an AMD fanboy, but I hate Nvidia with everything I have because they did and do everything I said. Google a little and you'll be enlightened by what Nvidia has done in the past 20 years.

  • @Julia-hk9jp
    @Julia-hk9jp 6 years ago +2

    To sum it up, this video is just an Nvidia commercial..

  • @perspgold8945
    @perspgold8945 2 years ago

    Not sure if it was the speaking speed or the content, but this video was disjointed

  • @tomislavnikolic5778
    @tomislavnikolic5778 5 years ago

    Holy shit

  • @ITSotechAI
    @ITSotechAI 8 days ago

    ❤❤❤ hi all very

  • @cameronh3260
    @cameronh3260 6 years ago

    But can it run Minecraft?

    • @rpzcsonli
      @rpzcsonli 5 years ago

      With 400 mods, yeah, but 600 mods... I don't think so

  • @vladimirtchuiev2218
    @vladimirtchuiev2218 6 years ago

    And now people are starting to use GPUs for cryptocurrency mining, driving GPU prices up...

  • @Goldnr
    @Goldnr 6 years ago +1

    Nvidia's CUDA - while showing an AMD card...

  • @zalanta7
    @zalanta7 5 years ago

    This video is 4K

  • @projectjt3149
    @projectjt3149 2 months ago

    Even after all the success with Generative #AI and #NVIDIA recently, no one seems to be watching this video!

  • @dr.zoidberg8666
    @dr.zoidberg8666 6 years ago

    We're creeping closer & closer every day to machines with the processing power & storage capacity to simulate human minds.
    Once that's achieved, all we need to do is figure out how to transfer someone over without breaking their stream of consciousness in the process, & we'll have a reliable path forward to radical life extension.

    • @raunak1147
      @raunak1147 6 years ago

      Dr. Zoidberg By 2021, or before that, something revolutionary like graphene/3D processors will happen

    • @rpzcsonli
      @rpzcsonli 5 years ago

      @@raunak1147 You are dreaming too big, just like the people in 1990 who thought we would have flying cars... maybe another 30 years until then.

  • @albertgerard4639
    @albertgerard4639 6 years ago

    Moore's law never took into account bitcoin... ouch

  • @Drixidamus
    @Drixidamus 1 year ago +1

    Your channel is criminally under-subscribed

  • @gertjanvandermeij4265
    @gertjanvandermeij4265 5 years ago

    Nvidia is just a big bully!

  • @madscientistshusta
    @madscientistshusta 6 years ago +1

    Excuse me, Star Citizen is a joke.

  • @utubekullanicisi
    @utubekullanicisi 4 years ago

    Too fast.