How Nvidia Won Graphics Cards

  • Published: 16 Oct 2021
  • In 1995, there were over thirty different companies competing with one another to build the best graphics chips for the personal computer.
    Six years later, there would be only three, with one clearly in the lead: Nvidia.
    As of this writing, Nvidia Corporation is the 15th biggest company in the world, worth half a trillion dollars.
    Their graphics cards sell out like gangbusters the second they come onto the market.
    And the company is seeking to buy ARM for $40 billion.
    In this video, we are going to look back into the past and see how a little startup came up from behind everyone else to dominate the graphics card industry en route to becoming the world-leading tech juggernaut it is today.
    Links:
    - The Asianometry Newsletter: asianometry.com
    - Patreon: / asianometry

Comments • 777

  • @Asianometry
    @Asianometry  2 года назад +100

    Hope you enjoyed the video. Like and subscribe, etc. etc.
    For other company profiles, check out the playlist: ruclips.net/p/PLKtxx9TnH76Qod2z94xcDNV95_ItzIM-S

    • @Conservator.
      @Conservator. 2 года назад +2

      Excellent as always, thank you!

    • @vladdx
      @vladdx 2 года назад +3

      Wait, why was "Going Fabless" written when switching from SGT to TSMC? The SGT deal was also fabless because Nvidia didn't own any fabs then either

    • @passiveincomestream8813
      @passiveincomestream8813 2 года назад +2

      So good.

    • @Travlinmo
      @Travlinmo 2 года назад +1

      Excellent review of the history. Seeing the Doom and Quake footage reminded me of the simple days of getting Doom running and just wondering what that noise was behind me. :)

    • @solarwolf678
      @solarwolf678 2 года назад

      How did you comment before the video started

  • @excitedbox5705
    @excitedbox5705 2 года назад +1144

    My dad worked at Zeng Labs starting in 1996 before it was bought by ATI. Stayed on until Ati was bought by AMD, and stayed on at AMD during the GPU wars. He left AMD in 2005 or 2006. His team built the first Radeon chip.

    • @SianaGearz
      @SianaGearz 2 года назад +30

      You mean Tseng Labs?

    • @VandalIO
      @VandalIO 2 года назад +15

      Lies !

    • @truthsocialmedia
      @truthsocialmedia 2 года назад +6

      I had a Verite 2200 powered by Tseng Labs. It was a disappointment

    • @SianaGearz
      @SianaGearz 2 года назад +15

      @@truthsocialmedia I think you might be confusing something. V2200 is entirely Tseng-free. And Tseng never finished its 3d chip, at least not before going belly up.
      But it was indeed fairly useless.

    • @argh6666
      @argh6666 2 года назад +4

      So?

  • @tneper
    @tneper 2 года назад +760

    One correction:
    NVIDIA did coin the term GPU in 1999.
    GPU is short for Graphics Processing Unit (not General Processing Unit).
    GPGPU is the term for General Purpose computing on GPUs. (as far as I know it was not coined by NVIDIA though)
    General purpose computing on GPUs started to become more common after programmable shaders were introduced in 2001, with the NV20.
    Great video. Loved hearing again about the early days of NVIDIA. There's a lot more to the story for sure, but this hit all the right notes, thank you!
    I've worked for NVIDIA from 1999 through today. I lived through a good portion of this, it was (and is) exciting.

    • @mingdianli7802
      @mingdianli7802 2 года назад +7

      What was your university degree in?

    • @tneper
      @tneper 2 года назад +77

      @@mingdianli7802 I dropped out of high school when I was 13

    • @VandalIO
      @VandalIO 2 года назад +3

      I thought it was 3dfx

    • @xchazz86
      @xchazz86 2 года назад +4

      Can you get me a GPU?

    • @lembkamb
      @lembkamb 2 года назад +4

      Maybe you could collaborate with asianometry , a new concept video, interview video

  • @SuperlativeCG
    @SuperlativeCG 2 года назад +256

    Nvidia should sponsor a NASCAR to confuse the fans every time there's a new driver.

  • @curtswartz1897
    @curtswartz1897 2 года назад +67

    I used to work in the video card industry in the valley. I helped design video cards, and all the chip companies like Nvidia, Tseng Labs, Rendition and 3Dlabs would come and show us their latest wares. We actually made a 3DFX Voodoo card that sold very well. At one point I put together a video card that was both PCI and AGP on a single card. You would just flip the card over and change the bracket to use the card with the other bus. Wow, such memories.

    • @alexlo7708
      @alexlo7708 Год назад

      The 1st idea leading to USB-C and the Apple charger.

  • @joelcorley3478
    @joelcorley3478 2 года назад +266

    You mention the IBM PGA, but your presentation seems to ignore other 2D graphics adapters (with and without acceleration) that rose to prominence with Windows before 3D graphics. Also there were actually a number of graphics chip design houses that were already on the scene creating chips with 3D acceleration (or deceleration depending on who you ask) when nVidia and 3dfx arrived on the scene. Admittedly none were particularly good by comparison.
    Also every last IHV on the market was writing device drivers for their graphics accelerator chips. The system OEMs didn't write squat. OEMs contracted out all the productization and customization to board manufacturers which hired their own device driver development teams for that purpose. At least that was true in the '90s.
    Admittedly nVidia always had a substantial device driver development team practically from the get-go. But that was actually for self-serving reasons that probably weren't apparent to the public. The early nVidia designs had something unusual - its own integrated command execution pipeline tied to its own DMA channels. However all of the less common commands in the pipeline were actually virtualized and farmed back out to the host CPU for simulation. To accomplish this virtualization, nVidia needed a much larger driver team and that team needed to be more involved with the silicon design team. That's actually what initially drove their vertical integration - they just couldn't rely on board manufacturers to address issues in this chip virtualization system - though at first they tried.
    Also about the demise of 3dfx: Before 3dfx approached it, STB Systems was probably the largest or second largest contract graphics board manufacturer in the world. All the major OEMs bought cards from STB Systems. But unlike companies like Diamond and Matrox, STB Systems did not sell self-branded retail products to the public pre-merger. Instead its business model was to take sample silicon from graphics chip makers, spin a board and customize the driver with its own library of graphics optimizations. (The optimizations were the secret sauce it used to sell to OEMs because they impacted benchmark numbers.) It would then offer these sample boards up to the OEMs and each OEM would order SKUs, usually from the two best performing chipsets. This model kept STB Systems' Mexico PCB factory line near 100% capacity for several years.
    Before 3dfx made its offer, STB Systems had seen huge success with both the nVidia Riva 128 and the TNT. At the time of the merger announcement about 90% of STB Systems' sales were nVidia TNT-based boards and every major system OEM was buying them. Post-merger announcement, nVidia obviously refused to offer any new chip designs to STB Systems. What's worse, 3dfx had never been forced to meet an OEM development cycle and even if it had, their new Banshee was at best competing with the nVidia TNT (and not even beating that) and not the current generation silicon.
    When 3dfx and STB Systems merged they were flush with something like $100M in cash. However STB Systems had done a capital lease financing arrangement on its Mexico production line and those multi-million dollar lease payments had to be made each month whether the production lines were producing products 3dfx/STB could sell or not. It didn't take very long before the fabs in Mexico were idle and the company was staring bankruptcy in the face, because the few wholly 3dfx-based boards they produced sold only a tiny fraction of what the Mexico fabs could spit out. Also STB Systems had just built a new headquarters that it had to pay for and all those engineers STB and 3dfx had on staff didn't work for free.
    So it wasn't too long after the merger that they went looking for suitors. nVidia cut them a deal to buy out their intellectual property and hire their device driver and firmware developers. The remains of 3dfx entered bankruptcy and the support staff were shown the door.
    Regards,
    Joel Corley,
    Windows Device Driver Developer,
    Recently Retired from Microsoft,
    Formerly a Developer for STB Systems and 3dfx...

    • @dbcooper.
      @dbcooper. 2 года назад +9

      RIP who read this all along

    • @arenzricodexd4409
      @arenzricodexd4409 2 года назад +38

      @@dbcooper. it is a nice read

    • @emmausgamer
      @emmausgamer 2 года назад +21

      @@dbcooper. It's a very interesting read.

    • @stevencorley6691
      @stevencorley6691 2 года назад +3

      😱 that was one hell of a long comment

    • @Kaboomnz
      @Kaboomnz 2 года назад +17

      Thank you for your post, it's a very interesting read.

  • @RenzoTravelsTheEarth
    @RenzoTravelsTheEarth 2 года назад +29

    My uncle used to work for SGI back in the early 90s and he said that at some stage he remembers some people pitching the idea of a consumer graphics card and they basically got laughed out of the room.
    He was also telling me they basically invented video on demand but at the time the bandwidths available were too low to be practical and the project was eventually abandoned.

    • @georgesdev4577
      @georgesdev4577 Год назад +7

      I can confirm the first point. It happened during the worldwide sales conference around 1992 or 1993. Audience of about 500 sales and engineers.
      Top management dismissed the idea.
      Some engineers mumbled their disagreement in the back of the room.
      The rest is history...

    • @georgesdev4577
      @georgesdev4577 Год назад +2

      Regarding VoD, that is right too.
      But it was not abandoned, it was spun off into Kasenna and had many commercial successes, including at Tier 1 operators.

    • @TheUtuber999
      @TheUtuber999 Год назад +1

      Probably laughing out loud while filing for the patent.

    • @lemagreengreen
      @lemagreengreen Год назад +1

      @@TheUtuber999 Not quite but a few SGI engineers were listening and agreed, those are the guys left shortly after and founded 3dfx, using their experience to design that first Voodoo chip.

    • @Martinit0
      @Martinit0 2 месяца назад

      B2B and B2C distribution are very different beasts. SGI may well have been capable of solving the technical challenge but then there is the go-to-market strategy that has to match the customer. Jon gave us right in this video the example of how you can fail with 3dfx trying to go from B2B (selling chips to gfx card makers) to B2C (selling finished cards to consumers). Fail you will if not prepared for differences in sales and marketing approach.

  • @AirCrash1
    @AirCrash1 2 года назад +17

    From my experience in the industry it was how Nvidia released driver updates for their cards, sometimes 5 years after they stopped selling them, which was practically forever compared to the other players who released graphics cards with buggy drivers, never fixed them, and stopped releasing updates after 1 year. In the early days of Windows a graphics driver problem would result in a BSOD, loss of work, and no real clue what the actual problem was with the PC. I saved the company I worked for thousands in warranty repairs by only using reliable and supported graphics cards and ditching all the other brands. Just about all the other PC components would clearly show themselves if there was a fault. Graphics card problems cost a fortune to sort out: customers bringing back machines that worked all day in the workshop but would BSOD when the client was doing some unusual operation. I would return the cards to the OEM, they would test them and send them back. A nightmare of epic proportions.

  • @TheNefastor
    @TheNefastor 2 года назад +63

    Such a trip down memory lane! I remember living through it all, but of course back then we didn't have all this information about how it all came to pass. Thanks, man, I enjoyed that.

    • @playnochat
      @playnochat 2 года назад +5

      I had Nvidia's Riva 128 in my first computer, but immediately bought a Voodoo 2, because the Riva was such trash. It was only sold as an OEM graphics card. I don't even know if any game actually worked well with it. Not that it was NVidia's fault, because DirectX was such a bad joke at that time:
      "Plug and play" -> "Plug and pray"
      "3D accelerator" -> "3D decelerator"
      At least DirectX had built-in software rendering support, so you could play games with it as the last resort.

    • @jhutfre4855
      @jhutfre4855 11 месяцев назад

      @@playnochat Indeed, I also remember the Riva as quite trash; in fact, I even had to play FIFA 99 in software mode! Why did I pay for that card??

  • @noutram1000
    @noutram1000 2 года назад +57

    There's clearly a follow-up to this as NVidia GPUs began to be used in things like neural networks, crypto mining and cutting-edge supercomputers... You could argue that without video gaming we wouldn't have had 'the third wave' of AI that we now see is so transformative.

    • @brodriguez11000
      @brodriguez11000 Год назад +6

      Agreed. The never ending quest by gamers for realism (physics, compute shaders, etc) helped pay for the R&D that later opened up all those other needs.

    • @johndoh5182
      @johndoh5182 Год назад

      There's also a follow-up to this, as Nvidia is now not so friendly with its board partners (AIBs) anymore. One of the bigger names in N. America, EVGA, just dropped out.
      They're clearly trying to move to a point where they manufacture their own products and don't have any board partners in the future.
      The latest boondoggle almost shows the lack of concern between Nvidia and the AIBs: the move to a high-power 12V power connector from the PSU to the GPU. Nvidia I'm sure did a lot of research on this connector, and on their own branded boards the way it's mounted is different from the way probably all the AIBs mount it, and now AIB GPUs are creating a fire hazard on the brand new RTX 4090. Meanwhile Nvidia's own GPUs don't have a problem. Nvidia is probably the company that wanted the AIBs to move to this high-power connector (AMD doesn't use it) because Nvidia developed the standard for it, but most likely failed to give the AIBs ALL the research data that went along with the development of this connector.

  • @sweealamak628
    @sweealamak628 2 года назад +227

    This reminds me of another game changer back in the day: Sound Blaster. Would be nice to have a recap of the rise and fall of Creative.

    • @TheNefastor
      @TheNefastor 2 года назад +13

      AWE32... The first time I played with a true MIDI synthesizer. It blew our socks off back then. I remember playing too much Rise of the Triad just because the music was so good.

    • @daviidgray1681
      @daviidgray1681 2 года назад +13

      Excellent suggestion given the legacy of Creative Technologies. A technology hardware company founded in Singapore and sufficiently 'global' to achieve a NASDAQ listing. This was truly rare 20 years ago.

    • @mceajc
      @mceajc 2 года назад +9

      Agreed! I still have an Audigy2 card - which was (and probably still is) superb - but no good driver support for the latest OS's. It did things that no sound driver/card I've seen in ten years has been able to do. It seems Realtek do things /just/ well enough that it's not worth shelling out for anything better unless you are a real enthusiast.

    • @sweealamak628
      @sweealamak628 2 года назад +5

      @@TheNefastor I was a little kid back then and everyone would crowd in front of the family computer to marvel at the interactive game play with sound effects better than the arcade. Somehow I wish I could mod my PC and play those games again.

    • @sweealamak628
      @sweealamak628 2 года назад +8

      @@daviidgray1681 Yep. Very rare indeed. I had a glimpse of the founder himself when he was a humble computer repairman in a rickety old shopping mall. A hardworking and honest man that repaired my brother’s computer. Little did we know he would go on to much bigger things.

  • @pdsnpsnldlqnop3330
    @pdsnpsnldlqnop3330 2 года назад +40

    Also important was how Nvidia were able to recruit whole teams from SGI when Microsoft announced they were doing Fahrenheit with SGI - this was in the days when Microsoft also bought SoftImage and wanted to take over the workstation space with Win NT, causing SGI difficulties apparent to all their staff. It wasn't even seen as a big deal for SGI staff to defect. So they did, taking their knowledge of the fundamentals with them.

    • @LMB222
      @LMB222 2 года назад +11

      Now I understand why Windows NT was also built for the MIPS architecture, the one used by SGI.

    • @excitedbox5705
      @excitedbox5705 2 года назад +5

      Nvidia has always played dirty.

  • @fraktaalimuoto
    @fraktaalimuoto 2 года назад +100

    Nvidia going for general-purpose GPU computing with CUDA was a genius move. As a GPU computing expert doing physics simulations, I can say it is now a significant performance and cost benefit in the high-performance computing sphere.

    • @TheNefastor
      @TheNefastor 2 года назад +9

      Didn't AMD try to do the same thing with OpenCL? I'm a total noob compared to you, but I remember OpenCL being praised for being open-source, unlike CUDA. Did something go wrong? Or is AMD competitive in that area?

    • @TheNefastor
      @TheNefastor 2 года назад +7

      @@minespeed2009 I've recently started learning deep-learning and yeah, I'm a bit scared of being trapped in the Nvidia ecosystem the way I got trapped into the Microsoft ecosystem.

    • @franciscoanconia2334
      @franciscoanconia2334 2 года назад +6

      @@TheNefastor Just wait for the XILINX acquisition to be complete. OpenCL and FPGA processing will compete favorably against CUDA. Also, CUDA uses an API-oriented architecture while OpenCL is written in machine code and executed directly.
      So far the NVIDIA ecosystem has gained traction for sure. But AMD is on a roll and I wouldn't be surprised if they somehow developed new synergies with their x86+GPU+FPGA ecosystem all under a single roof.

    • @TheNefastor
      @TheNefastor 2 года назад +1

      @@franciscoanconia2334 that would be great since I speak VHDL too 😄

    • @alexandresen247
      @alexandresen247 2 года назад +1

      how does it compare to openCL? why did you choose to use cuda instead of openCL?

  • @2drealms196
    @2drealms196 2 года назад +29

    Thank you for covering Nivea. Love their lavender infused moisturizing cream.

    • @Cypherdude1
      @Cypherdude1 2 года назад +1

      LOL. Nivea is pissing off a lot of people. They are going to stop producing their 30 series video chips this month. At least Intel is going to produce new video cards. There seems to be a never-ending insatiable appetite for video cards. I'm wondering when this shortage will end, if ever. They're saying the shortage will remain for 18 more months. At this point, I don't think people will care who makes the card or what the performance is as long as they get SOMETHING which is stable.

    • @mattmexor2882
      @mattmexor2882 2 года назад

      @@Cypherdude1 Nvidia isn't going to stop producing 30 series cards this month

    • @michaelmassino6344
      @michaelmassino6344 2 года назад +2

      @@Cypherdude1 Look again at the spelling of the above comment. Nivea makes cosmetics.

    • @David-lr2vi
      @David-lr2vi 2 года назад +1

      😂

    • @sandyleask92
      @sandyleask92 2 года назад

      @@mattmexor2882 This makes my purchase of an already overpriced 3080 Ti worth it. They are doing this to keep prices high, I hear.

  • @LikaLaruku
    @LikaLaruku Год назад +6

    I remember the graphics card wars.
    I remember when graphics cards cost less than a complete desktop PC set.

  • @Edward135i
    @Edward135i Год назад +4

    3:31 SGI was so far ahead of everyone at this point that they could have easily been the size of Nvidia today. SGI created the graphics hardware for the N64, and the N64 dev kits were SGI workstations. What an incredible blunder they made not getting into the consumer space.

  • @toothofthewolf
    @toothofthewolf 2 года назад +58

    Odd that Matrox didn't get a mention. The Matrox Millennium was a top-selling card for a while. Intel's MMX extensions were also a resource Quake utilized that helped Nvidia piggyback. But otherwise this doco was very on point.

    • @pfefferle74
      @pfefferle74 2 года назад +12

      Matrox was only able to compete for a while because the 2D performance of their chips and DAC converters for VGA was one of the best out there, especially at higher resolutions. But when more and more games shifted to 3D and DVI became a thing, they were left in the dust.

    • @andycristea
      @andycristea Год назад +3

      Quake does not use MMX.

    • @alexlo7708
      @alexlo7708 Год назад

      How about S3?

    • @fffUUUUUU
      @fffUUUUUU Год назад +2

      ​@@andycristea agree

    • @Martinit0
      @Martinit0 2 месяца назад

      I had a Matrox card specifically for their good 2D performance (at the time I didn't care about 3D games).

  • @CodeCharmer
    @CodeCharmer Год назад +6

    Didn't some of the top engineers at Nvidia come from Silicon Graphics Inc? I think it's fascinating how people moved around from various companies. From the '70s to the '90s I think fewer than 100 people really shaped the whole home computer market. Probably fewer than 50 in the chip and motherboard design space.

  • @RalfStephan
    @RalfStephan 2 года назад +15

    Actually, Castle Wolfenstein was the first 1st person shooter. Its wide distribution in the mailbox scene prepared for Doom's success.

    • @EivindSkau
      @EivindSkau 4 месяца назад

      One could argue that Ultima Underworld was the first one as it was released two months before Wolfenstein.

  • @THE16THPHANTOM
    @THE16THPHANTOM 2 года назад +22

    damn, i have never once made the connection between Nvidia and the Latin Invidia. this is the second time this has happened to me. first time being the connection between Asus and Pegasus. so much for the creativity part of my brain.

    • @Interestingworld4567
      @Interestingworld4567 2 года назад +5

      If you put it this way, I think Nvidia with their name are trying to say that everyone else who is not them is jealous, meaning 'envidia' in Spanish.

    • @SuperBlackmen10
      @SuperBlackmen10 7 месяцев назад

      I have read another theory saying the name came from the similarity to the word "video"

    • @renecruz2964
      @renecruz2964 4 месяца назад

      Lol I feel the same way

  • @la7era1u54
    @la7era1u54 2 года назад +4

    3Dfx, I haven't heard that company name in a long long time. The first graphics card I ever bought was a 3Dfx. I remember buying it like it was yesterday. I hadn't played games on a computer since the Commodore days in the 80s, but I had very fond memories of playing games all weekend and summer with the other boys from the neighborhood, so when I was finally on my own I went and bought a PC and a game. I took it home and tried to play it. After getting practically zero framerate I read the box and realized I needed a GPU. I knew what they were, but had no clue how to shop for one, so I started researching, and over 20 years later I still haven't stopped. That 3Dfx card was what got me interested in how PCs work and ultimately had me going back to school to take as many classes as I could on the subject

  • @snawsomes
    @snawsomes 2 года назад +6

    I hope this video makes you a lot of money over the years. Simply one of the best lectures about the history of graphics cards on the net.

  • @sarcasmo57
    @sarcasmo57 2 года назад +6

    Oh dude! I remember those old Voodoos; they had a cable in the back of the PC that went from the video card into the 3D card. My first was the Banshee, 16 MB of power! It was awesome at the time.

  • @tradito
    @tradito 2 года назад +10

    22:00 pretty sure they mean Graphics Processing Unit, not general processing unit.

  • @crylittlebitch
    @crylittlebitch 2 года назад +10

    Great video. Personally, I would've liked a section that goes more in-depth into the recent years, as well as the ARM deal. Also, I'd love a video similar to this one for AMD, as well as one on the merger between AMD and Xilinx. Hopefully some of these topics will make it into future videos!

  • @punditgi
    @punditgi 2 года назад +7

    Superb video with great explanations! Keep these videos coming!

  • @scottfranco1962
    @scottfranco1962 2 года назад +8

    Just one small addition: When Intel pushed onboard graphics, where the graphics memory was part of the main memory of the CPU, it was thought that the video solution would actually be faster, since the CPU would have direct access to the frame buffer, as well as having all of the resources there to access it (cache, DMA, memory management, etc). The reason they lost that advantage in the long run was the dual advantages of VRAM, or dual-ported video RAM, a RAM that could be read and written by the CPU at the same time as being serially read out to scan the video raster device, as well as the rise of the GPU, meaning that most of the low-level video memory access was handled by a GPU on the video card that did the grunt work of drawing bits to the video RAM. Thus Intel ran instead down the onboard video rabbit hole. Not only did they not win the speed race with external video cards, but people began to notice that the onboard video solutions were sucking considerable CPU resources away from compute tasks. Thus the writing was on the wall. Later, gamers only knew the onboard video as that thing they had to flip a motherboard switch to disable when putting a graphics card in, and nowadays not even that. It's automatic.

    • @PaulSpades
      @PaulSpades 2 года назад

      Shared CPU-GPU memory still is a good idea; sadly the graphics APIs didn't take advantage of it one bit, and I think Intel's graphics drivers were always a bit crap. As such, only game consoles and some mobile devices properly implemented a memory pool with both CPU and GPU access. The Vulkan API seems to treat GPU and CPU cores symmetrically, and shared memory on the Zen platform should finally fix this on mainstream PCs.
      At this point, modern CPU cores support SIMD with AVX extensions, so I don't know if there's any point in having separate graphics-oriented APIs and architectures for GPUs... the architectures seem to have merged towards each other (GPUs getting more general instruction support and CPUs getting more parallel data instructions).

  • @fisherfish9589
    @fisherfish9589 2 года назад +4

    The greatest move NVIDIA did was acquiring the marketing genius Brandon “Atrioc” "glizzy hands" Ewing. He pitched and developed the iconic Greenout campaign. He also pitched "Frames Win Games" Esports campaign, which is currently the flagship Esports campaign for NVIDIA and its hardware partners. Truly the GOAT of marketing

  • @justacasualgamer5774
    @justacasualgamer5774 2 года назад +20

    Nvidia did a lot of dirty work to win market domination ~ the CEO is cunning. That's what makes a successful company.

    • @ConorDoesItAll
      @ConorDoesItAll 2 года назад +10

      Yeah most companies have been stabbed by Nvidia but they have so much money that they don’t care.

  • @EveGantian
    @EveGantian Год назад +2

    Insanely interesting, having grown up using computers in the whole period covered by this video, it's super cool to now understand all these brands, technologies, terminologies, strategies and decision making by the companies. Thank you so much for the information - I thoroughly enjoyed it. Please keep up the good work.

  • @aniksamiurrahman6365
    @aniksamiurrahman6365 2 года назад +4

    A follow-up on the GPU wars would be awesome.

  • @jackeldogo3952
    @jackeldogo3952 Год назад

    I love this video; it's like what I lived through in the 90s trying to build and constantly improve my Frankenstein machines. All the churn with the video cards takes me down memory lane (3Dfx, Intel i740, D3D, OpenGL, AGP vs PCI, Real3D, Voodoo 1, 2, Banshee, etc). Thanks!

  • @Jimblefy
    @Jimblefy Год назад +2

    Your videos are all very well researched and very interesting. Thanks to you and your team.

  • @Ivan-pr7ku
    @Ivan-pr7ku 2 года назад +7

    3Dfx was a typical cash-burning startup in the dotcom era of the late 90s. They captured the PC gaming market at the right time with a simple, working solution, but they always struggled to keep evolving, with unusually long R&D cycles, while relying too much on recycling their old and proven tech. But alienating the OEM partners would be their biggest undoing out of a long string of missteps... and they had no chance of competing with Nvidia's 6-month release cadence.

  • @wino0000006
    @wino0000006 2 года назад +4

    Jeez - I remember all these graphics card revolutions in the 90s: ATI, RIVA TNT, VOODOO 3Dfx. It was the time when the AGP standard was introduced before being replaced by PCI Express.

  • @pyrosphynx5449
    @pyrosphynx5449 2 года назад

    this took a lot of effort and it turned out great! well done

  • @girohead
    @girohead 2 года назад

    Excellent overview and history! Thanks, well researched.

  • @esvegateban
    @esvegateban 2 года назад +19

    Don't forget that Nvidia never stopped selling their GPUs to third-party companies, so today there's a myriad of brands using Nvidia chips from which to choose.

  • @JamesLaserpimpWalsh
    @JamesLaserpimpWalsh 2 года назад

    Cheers for the vid. Concise and accurate. Good work

  • @alixxworkshop846
    @alixxworkshop846 2 года назад +2

    This was one of those videos that I've really enjoyed watching and learned something new... Good job SIR!

  • @exinetv1894
    @exinetv1894 2 года назад

    Man, I just love your voice and calm explanation style.

  • @djnavari
    @djnavari 2 года назад +10

    This is an excellent channel. I'm definitely becoming a Patreon supporter; this content is so well curated. It's just technically excellent.

  • @JohnGaltAustria
    @JohnGaltAustria 2 года назад

    Very well done overview.

  • @crunchworks22
    @crunchworks22 6 месяцев назад

    I love your channel! I'm a mechanical engineer who switched from heavy machinery to semiconductor fab equipment engineering (CMP). Your videos have been so helpful to me learning how all this crazy wafer stuff works.

  • @swamihuman9395
    @swamihuman9395 Год назад

    - Excellent.
    - Thx for all your effort, and talent.
    - I got into computer graphics in the early 90s, so, I experienced all the history you covered, but did not know any of the back story... until now. Thx, again.

  • @robertmorrison107
    @robertmorrison107 2 года назад +1

    Great video. I wish you'd do one on Matrox as well.

  • @MisakaMikotoDesu
    @MisakaMikotoDesu 2 года назад

    Love these videos! Thanks!

  • @BaghaShams
    @BaghaShams 2 года назад +5

    GPU always stood for Graphics Processing Unit; GPGPU is the term for General Purpose Graphics Processing Unit

  • @sepolopez6706
    @sepolopez6706 2 года назад

    Great video as always! Keep in mind that without ASML nothing would be possible. Thanks!

  • @hrishikeshb
    @hrishikeshb 2 года назад +4

    Great video and I loved knowing the history behind the giant. If I've read correctly, a GPU is a Graphics Processing Unit and not a General Processing Unit as you mention towards the end of the video - en.wikipedia.org/wiki/Graphics_processing_unit. The rise of AI and ML sciences made these chips popular because they are faster by orders of magnitude than a regular CPU with more or less the same power consumption. Nvidia is leveraging its know-how in developing graphics chips to now capture this market.

  • @organichand-pickedfree-ran1463
    @organichand-pickedfree-ran1463 2 года назад

    20:42 that Unreal Tournament screen shot is some good nostalgia :D

  • @GBlunted
    @GBlunted 2 года назад +3

    So nostalgic! All these brand names take me back to my childhood and asking Santa for Voodoo cards, MMX processors and AGP slots...

  • @fcfdroid
    @fcfdroid Год назад

    Now this is the video I was looking for! Good stuff dude 😎

  • @RonJohn63
    @RonJohn63 Год назад

    18:37 I remember STB. They made a good Tseng Labs ET4000/W32 board that ran well with OS/2.

  • @nufosmatic
    @nufosmatic Месяц назад

    1:15 - in 1991 triangle-database graphics engines were $M machines that needed a $M computer to run their databases and were used to provide displays for simulation flight trainers. I worked for a company that provided the simulation computer that requested a set of displays at 15/20/30/60Hz. Five years later those graphics machines were obsolete...

  • @Bonifaquisha
    @Bonifaquisha 2 года назад +6

    For those interested: The IBM Professional Graphics Controller is called 'PGA' because the A can stand for either Array or Adapter. In modern English the letters 'A' and 'C' are commonly used interchangeably, much like the letters 'ß' and '§'. This can often be confusing for those who speak British English because IBM chose to use the uncommon 'Controller', instead of the classic spelling 'Aontroller'.

  • @RevDrCCoonansr
    @RevDrCCoonansr 2 года назад

    What an awesome channel! Liked, commented, subbed! Thank you!

  • @JohnnyWednesday
    @JohnnyWednesday 8 месяцев назад +1

    The Cromemco "Dazzler" was the first graphics card in 1976. (and you're about 10 years too late for the graphics revolution - the Amiga was doing all of this in 1985 as well as the GUI. The screenshot of GLQuake you show is actually the software renderer - it doesn't use bilinear filtering. If you go to the place you sourced the image you'll see you took the wrong screenshot - this is a software shot next to the GL shot for comparison)

  • @NatrajChaturvedi
    @NatrajChaturvedi 2 года назад +2

    I was young during the days of games like Doom 2, Quake 1 and 2, Red Alert etc., and I remember being able to run these games perfectly on our 486 machine without any discrete graphics installed. I know my dad, and he hated the idea of paying extra for discrete cards, so I'm sure there wasn't one in our system.
    We did have Sound Blaster tho and the games looked and sounded awesome!!

  • @wonlop469
    @wonlop469 2 года назад +9

    I can only speak to my experience as a PC builder since the late 90's.
    It was the Drivers.
    At least in my experience, Nvidia drivers were generally more stable than their peers'.

    • @cpt_bill366
      @cpt_bill366 Год назад +1

      Uhh, NO. My Voodoo worked just fine out of the box while my friends with Nvidia TnT cards were always complaining about constantly patching. To this day Nvidia's rapid patch schedule is annoying, because every time I patch I need to reset my resolution scaling. I just stopped patching as a result.

    • @wonlop469
      @wonlop469 Год назад +1

      @@cpt_bill366 You're going way back there, brother, to when we had to mess with IRQs and such. I think my first was a GeForce 6600 on AGP. Never messed with ISA cards so I can't speak to them. God, IRQs were a pain in the arse, along with 9600 baud modems.

  • @josecuervo3351
    @josecuervo3351 2 года назад

    Excellent historical summary!

  • @milleniumsalman1984
    @milleniumsalman1984 2 года назад

    Very good video explanation

  • @CaptainDangeax
    @CaptainDangeax 4 месяца назад

    I was running a computer shop between 1997 and 2000, and I saw brands come and go. I sold a lot of 3dfx Voodoo1, Voodoo2 and Banshee. But in 2000, the Riva 128 and Riva TNT were arriving and changing the face of the market. NVidia's king-making choice was to handle the drivers themselves. Even with ATI there's always a wheelbarrow full of compatibility issues, even today.

  • @petergambier
    @petergambier Год назад

    Great show thanks Asianometry,
    My whole gaming world improved when I installed the Voodoo card, it was as different as night and day.

  • @djmips
    @djmips 6 месяцев назад

    I worked at one of those 3rd party board companies in the mid nineties and so far this is the only presentation that gets it right. Great job.
    I was even offered a job at Nvidia who directly recruited me in their drive to vertically integrate graphics drivers. I am sad I said no.
    But I could tell that Nvidia was going to win it all; on my visits, they shared with me their secret inner sanctum where their computers were situated running that vaunted design simulation. Their record of getting chips out on first turn was unmatched. It's almost like your observation about TSMC - they would win with cycle times.

  • @hamsta11
    @hamsta11 Год назад

    Great video! There's also the issue where 3dfx kept suing nVidia, who just patiently bore the brunt of the lawsuits until they finally bought them. It's a good business lesson: don't worry about doing things that your competitor might sue you for if you plan to just eventually buy them out.

  • @williamogilvie6909
    @williamogilvie6909 Год назад

    Very informative. In 1986 I joined a company in British Columbia, Gemini Technology, that had great ambitions to dominate the graphics card market. We designed an EGA chipset using Daisy workstations. I taped out the design at LSI Logic in Sept., 1987, a few months after IBM introduced the VGA. Earlier that year we had spent a month in Japan, at Seiko Epson's design center. I don't know why that effort failed, since there was little transparency there. After completing the design at LSI, I moved to Video-7, an up and coming graphics card company that had just completed a VGA chipset, and was selling graphics boards. My LSI Logic EGA chip, surprisingly, found some customers. At V-7 we examined the IBM PGA, trying to reverse engineer the chipset, and I also helped integrate my previous design with a V-7 product. Gemini eventually failed and was absorbed by Seiko Epson. Video 7 merged with an LSI Logic division, but had some legal problems that required the CEO to wear an ankle bracelet. I continued designing custom chips and even ran into my former Gemini Tech. co-workers at a Silicon Valley design center. Most of all I enjoyed the time I spent in Japan.

  • @nufosmatic
    @nufosmatic Месяц назад

    17:05 - 2008 - The company I worked for had a choice between ATI and nVidia for a general-purpose computing solution using graphics processors for a defense platform. Our team spoke ATI as a first language, which was hieroglyphics; the customer wanted performance and nVidia but couldn't get both. We settled on ATI. That product lasted one generation. CUDA led the way. And the customer did not need our specialized hieroglyphics programmers...

  • @refinedsolutions1513
    @refinedsolutions1513 2 года назад +6

    Good to acknowledge the legend Mr Carmack. Blessings from the yoruba-Oduduwa.

  • @zodiacfml
    @zodiacfml 2 года назад +6

    0:15 dang. Nvidia is now #11, next to TSMC, at 545B right now. I don't see any reason except the rumors that Nvidia is slowing down production to maintain high GPU prices

    • @elon6131
      @elon6131 2 года назад +2

      it's a BS rumour that makes no sense lol, as nvidia sells the GPUs at the same price regardless of final price to consumers. Nvidia's stock price has everything to do with their dominance in data center and AI, very little to do with gaming sales even though these represent a good chunk of their revenue still.

    • @zodiacfml
      @zodiacfml 2 года назад +1

      @@elon6131 While I did not watch that rumor, Best Buy increased their prices on Nvidia cards.
      That is shocking new info. Nvidia received flak from some of their investors when crypto crashed in 2018 and crashed the stock; they argued that Nvidia downplayed their revenue from graphics cards/mining in 2018.

    • @elon6131
      @elon6131 2 года назад +2

      @@zodiacfml and then nvidia got mad at them and they had to roll back. Though FE editions are the only ones nvidia directly sells, the final retail price is not in their hands, nor do they benefit from any markup the retailer puts on the product. The Best Buy increase was a Best Buy move (consistent with their pricing on the other cards, until nvidia had them roll it back). nvidia's FE MSRP has not changed.
      I am aware they got flak after the crypto downturn, though it was most certainly not malicious, as nvidia cannot know who buys their cards once they are in retailers' hands (and lying on reports would be very, very illegal. You do not mess with the SEC).

    • @zodiacfml
      @zodiacfml 2 года назад

      @@elon6131 good one.👍

    • @cedricchiu9763
      @cedricchiu9763 2 года назад

      crypto mining

  • @Mr30friends
    @Mr30friends 2 года назад +15

    7:25
    A video on Malta and the fabs it has managed to lure in might be interesting.

    • @RandomByte89
      @RandomByte89 2 года назад +2

      Is that Malta the country, or Malta in NY?

    • @Mr30friends
      @Mr30friends 2 года назад

      @@RandomByte89 the country of course

    • @ntabile
      @ntabile 2 года назад

      @@RandomByte89 If Malta in NY, THEN Global Foundries

  • @MostlyPennyCat
    @MostlyPennyCat 9 месяцев назад

    I remember when I first had a go with Linux, it was Debian.
    I couldn't figure out how to install drivers.
    Right up until the point you realise they just get detected and loaded as kernel modules at boot or, at most, pulled in with an apt-get command.
    Certainly was an eye opening moment in my journey as a software engineer and architect.

  • @lexingtRick
    @lexingtRick 2 года назад +5

    Québec's Matrox, the Parhelia, the origin of the ring bus. Canadian Ati picked it up. Matrox is still alive today.

    • @SianaGearz
      @SianaGearz 2 года назад +2

      Parhelia, aka too little, too late, too expensive.

    • @aniksamiurrahman6365
      @aniksamiurrahman6365 2 года назад

      What's the situation of Matrox today?

    • @jcurran8860
      @jcurran8860 2 года назад +1

      We bought the latest Matrox hardware to get the upper hand in Descent II. Fun times in online gaming.

  • @user-7165jdhrnxymzn
    @user-7165jdhrnxymzn Год назад +1

    Great video! 👍🏻 Could you please make a video about Matrox, 3dfx, Number Nine? Thank you

  • @Drumaier
    @Drumaier Год назад

    Nice video, and it was incredible to see that little graphics card that Intel made in the '90s that went nowhere... same with Arc now. I don't understand why the No. 1 CPU maker in the world can't just make some competitive, good-performing GPUs. They have experience designing chips and financial resources.

  • @philflip1963
    @philflip1963 2 года назад

    More info on the detailed architecture of these systems would be interesting.

  • @evansearch7935
    @evansearch7935 2 года назад +3

    Wait, does “GPU” not stand for “graphics processing unit”?

  • @harrison3452
    @harrison3452 2 года назад +5

    It’s a shame how hard it is presently to get hold of the new 3000 series GPUs from Nvidia. I really hope this changes; for a long time I built my PCs with AMD, but the new GPUs they came out with make gaming more affordable

    • @Statek63
      @Statek63 Год назад

      Just wait for the crypto craze to die (in pain) and you'll be good 🙂

  • @llamathenerd1672
    @llamathenerd1672 2 года назад

    2:36 my word, that is a beautiful card.

  • @freddyng1843
    @freddyng1843 2 года назад +8

    Excellent video. One thing you missed out was Matrox. I believe they were the 4th market player back in the early 2000s.
    Though Nvidia GPUs usually cost more than their rivals, stability is something I always look out for, which both Nvidia and Intel provide. The red team has time and time again disappointed me in its quality. The introduction of the Radeon Vega APU, though, changed the whole game for budget PCs again. I hope Intel's newest foray into the GPU space will bring a new level of benefits for consumers.
    Here are some of the Nvidia GPUs that I have ever owned and used includes:
    - Riva TNT
    - GeForce 2 MX
    - GeForce 6600 GT
    - GeForce GTX 660 Ti
    - GeForce GTX 770
    - GeForce GTX 1060
    Unfortunately bloody crypto miners are ruining GPU prices for gamers now.

  • @anv.4614
    @anv.4614 2 года назад

    Thanks for your analysis.

  • @ArnaudMEURET
    @ArnaudMEURET 2 года назад +2

    I owned every one of these early graphics cards and monitored Carmack’s .plan file eagerly in these years. The timeline is not entirely clear and some events and declarations end up seeming intermingled, albeit not intentionally.

  • @picobyte
    @picobyte 2 года назад

    Double Voodoo2 with a Viper550 was my cannon game system; later I got the 770 and so on. The early days of gaming were one of the best rides of my life.

  • @pastasauce99
    @pastasauce99 2 года назад

    Great reference on computer-aided design. That must have been cool, having a dad that did that. I do logic design, and when I was a kid I used to play around with a software called Powder Toy. When I got a little bit older I was referred to a $1 software called Digital Logic Sim. It let you build simple processors and logic gates, even memory. This is not an ad, as I don't really know the people who made it, but it's a fun pastime. Fun fact: Jensen and Dr. Lisa Su from AMD are relatives.

  • @angelg3986
    @angelg3986 2 года назад +3

    Somehow Nvidia got the devs of machine learning frameworks to use their proprietary CUDA API instead of OpenCL or GLSL. This made a huge impact, making their GPUs the only ones capable of AI. It would be interesting to have a video on why/how all developers chose CUDA. It still seems a strange choice, locking them to a vendor.

    • @PaulSpades
      @PaulSpades 2 года назад +1

      I'm not an AI guy, but... GLSL is OpenGL's hack for opening up the graphics pipeline to software - a C-like language to process vertex and pixel data. While it can be used to process data, it needs at least another API layer above it, because it presents entirely the wrong abstractions for general computing. And the language compilers aren't anything to write home about either. OpenCL is that API layer above GLSL, and it's not a great API from what I've read. So CUDA being more popular is not at all surprising.
      Vulkan seems to be a lot better though - it presents GPU and CPU cores (or any other type of processors) the same way, with the same memory model. This was the whole AMD heterogeneous compute idea, and they've been executing it on the hardware level for more than a decade; now the software is there to provide a decent programming model for it.
      TL;DR: there's no use making brilliant hardware if nobody can write software for it; what ABI or API and software tools you present to a programmer matters a lot.

    • @angelg3986
      @angelg3986 2 года назад +1

      @@PaulSpades CUDA wasn't so good at the beginning - just a proprietary way to write shaders. But I agree that AMD and the rest (Intel?) neglected what impact this would have a few years later.

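    To make the programming-model point in this thread concrete, here is a minimal CUDA sketch (a generic vector add added for illustration; the array size and names are arbitrary, and only the standard CUDA runtime and nvcc are assumed). Host and device code share one C++-style source file and a kernel launch is a single line, whereas an equivalent OpenCL program needs explicit platform, device, context and queue setup plus a kernel compiled from a source string at runtime - a large part of why framework authors gravitated to CUDA.

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Each thread adds one element; the grid covers the whole array.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                  // 1M elements (arbitrary size)
        const size_t bytes = n * sizeof(float);

        // Host buffers
        float* ha = (float*)malloc(bytes);
        float* hb = (float*)malloc(bytes);
        float* hc = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device buffers and host-to-device copies
        float *da, *db, *dc;
        cudaMalloc((void**)&da, bytes);
        cudaMalloc((void**)&db, bytes);
        cudaMalloc((void**)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover n elements
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(da, db, dc, n);

        // Copy the result back (this also synchronizes with the kernel)
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f\n", hc[0]);           // expect 3.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }
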
  • @PlanetFrosty
    @PlanetFrosty 2 года назад

    I worked as Director of Development at an SGI major integrator and we later added Sun Microsystems and I become CTO of a Telecom Manufacturer who pattered with one that hand licensed equipment manufactured in Taiwan.

  • @AediionTV
    @AediionTV 2 года назад +22

    Now this is a real Brandon "Atrioc" "Marketing Extrodidinare" "G.H." "Ewing" Ewing classic

  • @williamlloyd3769
    @williamlloyd3769 2 года назад +2

    Recall getting these graphic boards as hand me downs from our stock trade floor. Traders were upgrading their desktops in the 1990s at will so why let last years model go to E-waste.
    PS - are you going to do a part 2 on impact of digital currency on the graphics market?

  • @mkmishra.1997
    @mkmishra.1997 2 года назад

    one feedback i would like to add to an otherwise great content is to increase the microphone volume output

  • @RayMak
    @RayMak 2 года назад +12

    I still remember their Voodoo Graphics Card

  • @RichieH81
    @RichieH81 2 года назад

    What a great video, I’ve just arrived in Taipei and hopefully after government quarantine I’ll get out to do some tech related sight seeing. If any one has any recommendations please let me know.

  • @FingersKungfu
    @FingersKungfu 11 месяцев назад +1

    I still remembered when the Voodoo cards ruled the market.

  • @epposh
    @epposh 2 года назад +18

    i've always thought that GPU stood for graphics processing unit... perhaps you mean to say GPGPU?

    • @William-Morey-Baker
      @William-Morey-Baker 2 года назад +8

      It does mean graphics processing unit... he was wrong... at least it was relatively small, but it makes me doubt a lot of the other stuff he says

    • @javaguru7141
      @javaguru7141 2 года назад

      @@William-Morey-Baker All people make mistakes, even really braindead ones. It doesn't necessarily have any deeper implications.

    • @javaguru7141
      @javaguru7141 2 года назад

      @@William-Morey-Baker I should also add that that kind of "gotcha" thinking can be very toxic and is often used to reinforce anti-intellectualism. Be careful to avoid bias.

    • @epposh
      @epposh 2 года назад

      @@javaguru7141 I kinda had the same thoughts as william morey-baker. This wasn't just a slip of the tongue - this is basics; and if he can make a mistake in this, it makes me question his knowledge about computers. I wonder if he's just someone simply reading a script, without understanding what he's reading, which I've seen happen too many times in other youtube videos.
      P.S. This is a very simple point that william is making. Big phrases like "toxic thinking" or "reinforce anti-intellectualism" are completely out of place here. Sounds more like you saw this whole sentence somewhere else and decided to paste it here to appear smart.

    • @javaguru7141
      @javaguru7141 2 года назад

      @@epposh *sigh*... I won't bother opening a conversation about the latter. But regarding the former, I watched the video again and I'm pretty convinced it really was just that - a slip of the tongue. NVIDIA really did start pushing GPGPU around that time. The acronym for that is General-Purpose GPU. You could easily backronym that as "General-Purpose Unit". I wouldn't be surprised if someone at NVIDIA actually uttered those words at some point. The important information he was trying to convey was accurate.

  • @stevechiasson5086
    @stevechiasson5086 2 года назад +1

    The Voodoo was my very first GPU, paired with an Intel 333 MHz single-core CPU and, if I remember correctly, 128 MB of RAM. Ah, the memories :)

  • @illshootyou5199
    @illshootyou5199 Год назад

    Showing my age... i played Quake 1 competitively and remember having a 3DFX Voodoo2 and loved GLQuake haha

  • @shawnstangeland3011
    @shawnstangeland3011 2 года назад

    Loved my visit to Tainan...people are so friendly

  • @Martinit0
    @Martinit0 2 месяца назад

    I hadn't considered the "Nvidia taking control of the driver" point as explicitly as you put it. In fact, I suspect that by doing that they helped all their card OEM customers by taking away the burden of writing and keeping their drivers up to date.

  • @bujin5455
    @bujin5455 2 года назад +1

    A very interesting comparison there at the end between the different vertical integration paths. People have a hard time understanding which things you should bring in-house that will materially add to product quality, versus materially adding to complexity and being a burden on quality. Reminds me of the people who seem to think Apple buying TSMC is a good idea. Nothing could be further from the truth.

  • @MikkoRantalainen
    @MikkoRantalainen Год назад

    Great documentary about GPU history! I'm old enough to have actually bought a Voodoo graphics card with my own money when it was new.
    I feel that the drivers of NVIDIA cards are still the most important part of the technology. It appears that AMD (ex ATI) hardware may actually have more computing power but drivers fail to surface it to the applications.
    In addition, OpenCL took too many years to create, and NVIDIA's proprietary CUDA took the market; many apps (including the open source Blender 3D) have proper hardware support for CUDA only, even today. AMD is supposed to have something cooking for Blender 3D version 3.5 but the actual performance remains to be seen.

  • @jacobwcrosby
    @jacobwcrosby 2 года назад

    The picture of the motherboard at 20:24 - does it actually have onboard graphics, or is it what appears to me to be an 'APU', as AMD referred to them...?

    • @SianaGearz
      @SianaGearz 2 года назад +1

      That be an ASRock A790GX with an AMD AM2 CPU socket, so no APU support. It has a Radeon HD 3300 integrated into the board chipset.