NVIDIA's Greatest GPU Is Not What You'd Think...

  • Published: 22 Aug 2024

Comments • 146

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 2 months ago +51

    What a lot of today's reviewers forget is that 1080p wasn't always the most popular resolution. When this card was released, 1024x768 was the most common.
    Many LCDs had that resolution. Expensive, larger LCD monitors (22" used to be considered large, remember that?) were often only 1680x1050 or 1600x900.

    • @coccoborg
      @coccoborg 2 months ago +5

      Yes!!! I had a 1440x900 panel :D

    • @JohnSmith-iu8cj
      @JohnSmith-iu8cj 2 months ago +3

      1680x1050 was the standard for new TFTs at the time. Older TFTs mostly had 1280x1024.

    • @JamesSmith-sw3nk
      @JamesSmith-sw3nk 2 months ago

      @@JohnSmith-iu8cj Resolution mostly varied by LCD panel size. In 2008, according to the Steam survey, over 70% of PC gamers ran at either 1024x768 (I ran that resolution on a 17" LCD) or 1280x960 (I never ran that one).

    • @manitoba-op4jx
      @manitoba-op4jx 2 months ago +2

      @@JohnSmith-iu8cj I still have my 1280x1024 Dell monitor. It's been my primary monitor for 20 years now.

    • @FullyBuffered
      @FullyBuffered 2 months ago +1

      True! These tests were meant to show what performance is like at a modern resolution, but 1080p did not become mainstream until around 2010 or so. However, higher resolutions were already available on larger CRTs.

  • @Sam-K
    @Sam-K 2 months ago +47

    In a lot of ways, the 8800 GTX was the 3090 Ti of its time.
    Like the 8800 GTX, the 3090 Ti considerably raised the power envelope - from the generally accepted 250W to a staggering 450W - and as a result was about 1.8x faster than the previous-gen 2080 Ti.
    And now, the 220W 4070 Super and 245W 7700 XT are considered "power efficient" cards. I'm pretty sure things will eventually go up to 500-600W, now that we have finally hit the silicon barrier and it's getting harder and harder to shrink transistors and/or run them more efficiently.
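
    As a rough sanity check on the efficiency side of that comparison, here is a minimal host-side sketch using the approximate figures quoted above (about 1.8x the performance of a 2080 Ti at 450W vs 250W) - illustrative numbers, not measurements:

    ```cpp
    #include <cstdio>

    int main() {
        // Approximate figures quoted in the comment above (illustrative only).
        double perf_3090ti  = 1.8;    // performance relative to a 2080 Ti = 1.0
        double watts_2080ti = 250.0;  // commonly cited board power for the 2080 Ti
        double watts_3090ti = 450.0;  // board power for the 3090 Ti

        // Performance per watt relative to the 2080 Ti comes out to ~1.0x,
        // i.e. most of the raw gain was bought by simply spending more power.
        double perf_per_watt = perf_3090ti / (watts_3090ti / watts_2080ti);
        printf("3090 Ti perf/W vs 2080 Ti: %.2fx\n", perf_per_watt);
        return 0;
    }
    ```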

    • @harryshuman9637
      @harryshuman9637 2 months ago +10

      Also, don't forget, at $600+ it was 2-3 times more expensive than the GPUs that came before - compared to the $250 of the mainstream 8800 GT. Kinda like the RTX 3090/4090 vs the 3070/4070.
      Also keep in mind, people would buy 2, 3 and sometimes 4 of these cards for SLI, so top-tier GPU setups were around $2000 - again, similar to the 3090s and 4090s of today.
      Just to highlight that nothing has really changed in the past 18 years, and top-tier PCs of today aren't really that much more expensive than the top-tier PCs of 2006.

    • @Sam-K
      @Sam-K 2 months ago +13

      @@harryshuman9637 Pretty much. That's exactly why I consider the 4070 Ti to be Nvidia's flagship. Cards beyond the 4070 Ti are Titan-class behemoths with ridiculous TDPs and terrible price-to-performance ratios. You're paying a lot for not a whole lot of extra performance.
      For example, the 4090 offers ~60% better performance than the 4070 Ti, yet costs twice as much, as far as MSRP is concerned.
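
      (Taking those figures at face value: 1.6x the performance at 2x the price works out to 1.6 / 2 = 0.8, i.e. roughly 20% less performance per dollar for the 4090.)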

    • @StaelTek
      @StaelTek 2 months ago +2

      more like 8800 Ultra, imo

    • @AlpineTheHusky
      @AlpineTheHusky 2 months ago

      A big "issue" is also how well those cards scale with additional power. You don't lose a lot of performance by limiting power, yet you gain a good bit by just increasing the power budget - unlike older gens, where you would put in tons of power and get only slight gains.

    • @harryshuman9637
      @harryshuman9637 2 months ago

      @@Sam-K You are comparing gaming performance though, but in 2024 you aren't buying a top-tier GPU for gaming. You are buying it for ML applications, and the 4090 is around twice as fast as a 4070.

  • @TheBasedSociety
    @TheBasedSociety 2 months ago +8

    I have the Quadro version of the 8800 GTX. Same deal, but it has 1.5 GB of VRAM instead. Absolutely massive for the era.

  • @PixelPipes
    @PixelPipes 2 months ago +14

    A VERY excellent video, and exactly the point of view I would have taken. It was clear some years ago, even before the AI craze, that CUDA was the distant leader of the supercomputing-on-a-GPU revolution, and that was all because, as you said, they laid the inroads early on and basically formed the industry from the ground up around their products. It was hard for AMD or other competitors to break in, because it wasn't just the GPGPU industry they were pursuing, it was _NVIDIA's_ GPGPU industry. The 8800 GTX and its G80 GPU were described at the time by multiple publications as an "inflection point", but they couldn't have known back then how right they'd be.

    • @FullyBuffered
      @FullyBuffered 2 months ago +1

      Very true - thanks for the kind words man :)

  • @Ivan-pr7ku
    @Ivan-pr7ku 2 months ago +16

    The foundation for CUDA was a set of programming and debugging tools derived from the Open64 project in the early 2000s -- an open source compiler for Itanium and x86-64.

  • @piked86
    @piked86 1 month ago +3

    Your videos are so well produced and researched. I wish you made them more frequently.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 2 months ago +37

    If you own an 8800 GTX, use MSI Afterburner and set it to the 8800 Ultra's clocks. They are the same GPU; the Ultra just has a better cooler.

    • @TheBasedSociety
      @TheBasedSociety 2 months ago +7

      The Ultra has a newer G80 revision and lower-latency memory.

    • @Mr371312
      @Mr371312 2 months ago +6

      The same silicon doesn't equal the same "specs" - I learned that the hard way. Power delivery, number of phases, cooling solutions for components outside the core.

    • @JamesSmith-sw3nk
      @JamesSmith-sw3nk 2 months ago +1

      @@Mr371312 It often can. When it comes to overclocking, it's ALWAYS:
      "Your mileage will vary."

    • @TheBasedSociety
      @TheBasedSociety 2 months ago +3

      @@JamesSmith-sw3nk I have a Quadro FX 5600, the pro 8800 GTX with 1.5 GB. It achieves exactly the same clocks as a stock 8800 Ultra and not 1 MHz more.

    • @Different_Name_
      @Different_Name_ 1 month ago

      Binning

  • @TimmyJoePCTech
    @TimmyJoePCTech 2 months ago +14

    Extremely informative and interesting :)

    • @MuzdokOfficial
      @MuzdokOfficial 2 months ago +1

      Hi there Timmy

    • @CopperPopperComputers
      @CopperPopperComputers 2 months ago +2

      Whoa Timmy, nice to see you again, bud. Miss your videos.

    • @FullyBuffered
      @FullyBuffered 2 months ago +1

      Thanks for the kind words Timmy - good to see you around! :)

  • @wertywerrtyson5529
    @wertywerrtyson5529 2 months ago +7

    Going from 5% of revenue from data centers in 2015 to 87% - that is a massive change. They barely even need gamers anymore. I remember when the 8800 GTX came out, but I didn't pay much attention to CUDA, as the gaming performance was so impressive.

    • @FullyBuffered
      @FullyBuffered 2 months ago

      Indeed! At the moment gaming has become a side hustle for them...

  • @ProjectPhysX
    @ProjectPhysX 2 months ago +5

    More important than CUDA: the 8800 GTX was also Nvidia's first GPU to support OpenCL, the open GPGPU language that is equally fast/efficient as CUDA, yet works on all hardware from Nvidia, AMD, Intel, Apple, ARM, ... OpenCL is increasingly important, especially in science, where new supercomputers regularly have GPUs from different vendors - the same OpenCL code works everywhere and developers don't have to waste years on code porting.
    13:29 AMD's counterpart to CUDA is not ROCm, but HIP, which is their proprietary nonsense language that doesn't even work on most of their own GPUs.
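
    For anyone curious what the portability argument looks like in practice, here is a minimal vector-add sketch in CUDA C++ (purely illustrative - the names and sizes are arbitrary). The HIP version of the same kernel is nearly identical, differing mainly in the runtime API prefix (hipMalloc/hipMemcpy instead of cudaMalloc/cudaMemcpy), which is what makes automated CUDA-to-HIP translation feasible; an OpenCL version expresses the same kernel as source compiled at runtime.

    ```cuda
    #include <cstdio>
    #include <cuda_runtime.h>

    // Element-wise vector add: each GPU thread handles one index.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        float *a, *b, *c;
        // Managed (unified) memory keeps the sketch short; needs a reasonably modern GPU/driver.
        cudaMallocManaged(&a, n * sizeof(float));
        cudaMallocManaged(&b, n * sizeof(float));
        cudaMallocManaged(&c, n * sizeof(float));
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        int block = 256;
        int grid  = (n + block - 1) / block;   // enough blocks to cover all n elements
        vecAdd<<<grid, block>>>(a, b, c, n);
        cudaDeviceSynchronize();

        printf("c[0] = %.1f\n", c[0]);         // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }
    ```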

    • @Slavolko
      @Slavolko 2 months ago

      Both CUDA and HIP are dialects of C++, so I don't know why you're singling out HIP specifically, especially when HIP is able to run on more than just AMD GPUs.

    • @FullyBuffered
      @FullyBuffered 2 months ago +1

      Interesting - thanks for the comment!

  • @Geekzmo
    @Geekzmo 1 month ago +1

    Great video! It's nice to see you back!

  • @exaltedb
    @exaltedb 2 months ago +3

    I own an 8800 GT, which has basically a shrunk-down, slightly reconfigured G80 - the G92 - where the memory interface was cut down but counteracted with twice the TMUs. It was probably too good of a card for where NVIDIA placed it.

  • @veilside62
    @veilside62 2 months ago +2

    I used to do distributed computing (SETI@home) back in the day, and the first time I saw my 8800 working with CUDA instead of the CPU, it was so fast I thought there was a problem with it, because there was no way it could be that much faster than a CPU.

  • @Snufflegrunt
    @Snufflegrunt 2 months ago +5

    The 8800GTX was the most important thing to happen to computers since the IBM 5150.

  • @Obie327
    @Obie327 2 months ago +1

    Been using Nvidia since the beginning and enjoyed the ride. My first G80 was the 8800 GTS, and then the XFX Ultra XXX edition in July of 2007 to play Crysis. But my favorite G-series card was the final iteration on a 55nm G92 die, my EVGA GTS 250 1 GB. That GPU had refined, higher clock speeds and a refined process. Thanks, Fully Buffered, for the look back and the informative video.

  • @knightfall71
    @knightfall71 1 month ago +2

    I remember the 8800 GTX - I wanted one bad lol, but I could only afford the 7600 GS :(. That card was still 225 bucks, and I was earning 90 bucks a week as a first-year apprentice mechanic lol.

  • @_odaxelagnia
    @_odaxelagnia 2 months ago +2

    Your videos are like a treat

    • @FullyBuffered
      @FullyBuffered 2 months ago +1

      Thank you! That's great to hear :)

  • @MrHav1k
    @MrHav1k 2 months ago +2

    It definitely was the turning point, no doubt about it. Thing is, it really took over a decade for it to take off, and 15 years for CUDA to become the staple it is today.

  • @MadeleineTakam
    @MadeleineTakam 2 months ago +1

    Excellently presented as always.

  • @laz7354
    @laz7354 2 months ago +3

    Hard work took a decade to pay off, but it paid off big.

  • @legomaniac601
    @legomaniac601 2 months ago +1

    I got a pair of 8800 GTs in an XPS I picked up recently, so it shows how far we have come - from a pair of 8800 GTs in SLI to where I am now with my 3080 12GB.

  • @Micecheese
    @Micecheese 2 months ago +1

    I loved the AMD Opteron quad-CPU setup, but that case is really expensive.
    This old 8800 GTX was a huge step up in Nvidia's game; CUDA helped me in more ways than just FPS improvements in older games.

  • @AbsolutelyCisDude
    @AbsolutelyCisDude 15 days ago

    Just found your channel - love the calming way you speak and the deep dives into Nvidia's GPUs.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 2 months ago +2

    Great video. I owned three 8800 GTXs, and I remember that for about a week I was 14th on 3DMark's scoreboard. Tri-SLI stuttered a lot in games, so I eventually just ran 2-way SLI and used the 3rd card for PhysX.

  • @geort45
    @geort45 2 months ago +5

    EVGA SR-2 FTW!

  • @roki977
    @roki977 2 months ago +1

    Never owned an 8800 GTX, but I did own a BFG 8800 GTS 320 and 2x EVGA 8800 GTs in SLI; they were way too hot. Those were super exciting times for gaming, with great games.

  • @lwwells
    @lwwells 2 months ago +1

    This card excited me so much when it came out that I bought 3 of them for 3-way SLI. THAT WAS A MISTAKE.

    • @FullyBuffered
      @FullyBuffered 2 months ago

      Haha that would have been an awesome setup back in the day!

    • @lwwells
      @lwwells 2 months ago

      @@FullyBuffered I felt like a king for two generations. 😂

  • @jmtradbr
    @jmtradbr 2 months ago +1

    CUDA is the market default. It's like Adobe: even if a program better than Photoshop or Premiere exists, they will still ask for Adobe products on your CV.

  • @razorsz195
    @razorsz195 2 months ago +1

    One day I'll get around to setting up my dream beast: two G92 8800 GT Gold cards in SLI with an Athlon II X2 running over 4 GHz - a golden theme for a golden era. Back then I was using an FX 5200 and an ol' Sempron, and the only thing it was good at was binging forums and reviews of the hardware I could never afford :')

    • @FullyBuffered
      @FullyBuffered 2 months ago

      Do it!

    • @razorsz195
      @razorsz195 2 months ago

      @@FullyBuffered Got a 7950 GT SLI system in mind too, with an E6500K. Still hunting for a second Sparkle 7950 GT though - need to sell on some PCs first before making any more :P

  • @SinaFarhat
    @SinaFarhat 2 months ago +2

    Thanks for an informative video!

  • @ccleorina
    @ccleorina 2 months ago +1

    I have collected three 8800 GTS cards and two 8800 Ultras; sadly, they died for unknown reasons. But my four 8800 GTXs are still working great in 2024 inside my retro gaming PC.

  • @Olibelus
      @Olibelus 2 months ago +2

    Great video!!

  • @eightyd2554
    @eightyd2554 2 months ago +1

    Casual SR-2

  • @wewillrockyou1986
    @wewillrockyou1986 1 month ago +1

    Nvidia became a trillion-dollar company not because they made good hardware, but because they made useful software... See: Microsoft.

  • @fr8shaker289
    @fr8shaker289 2 months ago +1

    Informative. Thank you.

  • @VladislavKusmin
    @VladislavKusmin 1 month ago

    In 2007 I was looking at the 8800, but chose an ATI 19xx instead. Unfortunately it died after about 4-5 years, but until then I didn't particularly suffer from the lack of unified shaders.

  • @lemagreengreen
    @lemagreengreen 2 months ago +3

    I have sort of wondered if we'll move on from the "GPU" naming convention given it's just a general purpose vector processor that we've been running for a long time now.

  • @ButtaDawg6969
    @ButtaDawg6969 1 month ago

    Old GPUs are so funky and fun.

  • @DanielCardei
    @DanielCardei 2 months ago +2

    The 8800 GTX was a groundbreaking product that pushed the boundaries of what was possible in GPUs.

  • @Sitharii
    @Sitharii 2 months ago +1

    Definitely the two data-center/AI landmarks for nVIDIA are the *Tesla* (8xxx lineup) and *Fermi* (GTX 5xx lineup) architectures!!
    Both of them were heavily compute-based architectures that paved the way for the AI era.
    Coincidentally, these two have always been my most beloved architectures, although I'm (was) just a videogamer...

    • @FullyBuffered
      @FullyBuffered 2 months ago +1

      True!

    • @GrainGrown
      @GrainGrown 1 month ago

      It's written NVIDIA, not "nVIDIA".

    • @Sitharii
      @Sitharii 1 month ago

      @@GrainGrown I've been an nVIDIA customer since 1999 (*Riva TNT2 Ultra*), so I know very well how it's written. Search "nVIDIA logo" on Google and check the letters: the "n" is lowercase and the other letters are capitals, but they all have the exact same size - which, of course, I can't reproduce in MS Word, since in MS Word lowercase and capital letters have different sizes... 😉
      (*It can be written the way you say as well, but that doesn't mean the way I write it isn't correct too.)

  • @SedatedByLife
    @SedatedByLife 1 month ago

    Loving the new haircut. I always thought you were gorgeous but now... woof. 😊

  • @tonycrabtree3416
    @tonycrabtree3416 2 months ago

    Pretty sure it’s “Ai, Ai, Ai” 😂😂😂 or at least that’s what so many CEOs keep saying at every shareholder meeting.

  • @JohnSmith-iu8cj
    @JohnSmith-iu8cj 2 months ago +1

    Gamers and nerds, unite!

  • @TCBOT
    @TCBOT 1 month ago

    Just showing off the EVGA SR-2 lol.

  • @inkysteve
    @inkysteve 2 months ago +1

    Can AI explain why one man needs so many spatulas?

  • @cheedam8738
    @cheedam8738 2 months ago

    I never really understood what that long plastic piece that extends the GPU is - what is it actually? Some older AMD Radeon HD blower-type cards had those too.

    • @Mrproud696
      @Mrproud696 2 months ago +5

      It lets you put the card into or pull it out of a server easily.

    • @TheBackyardChemist
      @TheBackyardChemist 2 months ago +3

      Stiffening bracket for mechanical integrity in transport. It is intended to be used in a case that has slots to support the weight from both sides.

    • @lemagreengreen
      @lemagreengreen 2 months ago +1

      I think it was just like in the very old days when "full length" ISA cards sometimes had a little extension to lock into support brackets at the front of cases. Maybe Nvidia anticipated the sag problems of the future even if it wasn't strictly necessary with these cards?

    • @JohnSmith-iu8cj
      @JohnSmith-iu8cj 2 months ago

      I have the first dual-slot 7800 GTX with that extender, from Dell. It has no CUDA.

    • @arenzricodexd4409
      @arenzricodexd4409 2 months ago

      @@JohnSmith-iu8cj Those were an older architecture. Nvidia only supports CUDA on unified-shader architectures.

  • @dubment
    @dubment 2 months ago

    The GTX 10 series over the 900 series brought a performance leap like the one you mentioned from the 7950 to the 8800 GTX. Funny that when they released the 1070 Ti, the only difference between it and the GTX 1080 was GDDR5X and 128 more CUDA cores - basically the shader count of an entire 8800 GTX...

  • @DragonBane299
    @DragonBane299 2 months ago +2

    WOOWW your hair is so nice!
    It's always nice when you post a video - highlight of my day every time!

    • @FullyBuffered
      @FullyBuffered 2 months ago +1

      Many thanks for the kind words! I'm glad to hear that! :D

  • @WhoCaresGamingIsDeadCuzOfAI
    @WhoCaresGamingIsDeadCuzOfAI 1 month ago

    It's hilarious that the 8800 GTX can now get spanked and DESTROYED by an AMD Radeon 780M lmmfao! The 780M (in the 8700G) gives you the performance of a Radeon R9 280X today, but lol in the CPU!! It's also very close to the desktop R9 290 and obliterates the flagship (at the time) HD 6990 6GB dual-GPU CrossFire card - a $1500 GPU in 2010.

  • @Moderna_
    @Moderna_ 2 months ago +2

    Thanks cuda core, really cool

  • @WhoCaresGamingIsDeadCuzOfAI
    @WhoCaresGamingIsDeadCuzOfAI 1 month ago

    2032

  • @LawrenceTimme
    @LawrenceTimme 2 months ago

    Aipu

  • @DuneRunnerEnterprises
    @DuneRunnerEnterprises 2 months ago +1

    Also, it became the basis for the 9800, which later became the 250! 😊😊😊

    • @Warbob11
      @Warbob11 2 months ago

      I pulled a 9800+ from someone's old build years ago and it still works. I keep it in my collection of older GPUs that I use for my 32-bit Windows XP builds, where I play everything I can get working parts for.

    • @DuneRunnerEnterprises
      @DuneRunnerEnterprises 2 months ago

      @@Warbob11
      Got 3 of 'em now.
      8)

    • @classic_jam
      @classic_jam 2 months ago +2

      Indirectly, those all use G92 and not G80. But G92 is a more efficient refresh with upsides and downsides.

    • @DuneRunnerEnterprises
      @DuneRunnerEnterprises 2 months ago

      @@classic_jam
      Maybe.
      I might make a video, going from the 8800 to the 250.
      Just got to set up a platform.

  • @WhoCaresGamingIsDeadCuzOfAI
    @WhoCaresGamingIsDeadCuzOfAI 1 month ago

    Yeah, Nvidia, the anti-consumer, anti-competitive POS GPU company.

  • @MrFreeman1981
    @MrFreeman1981 2 months ago

    Your GPU test is being bottlenecked by the CPU, and your input lag is also horrible, which is clearly visible at 9:02 when moving the mouse. In my opinion it is unplayable like this. Yes, there are some casual players who don't know the feel of actual 5ms-frametime, 144Hz-and-up gameplay, who are fine with it and even play on a controller, but I could not game like this.

    • @FullyBuffered
      @FullyBuffered 2 months ago

      I can imagine! Thanks for the comment

  • @TheRealEtaoinShrdlu
    @TheRealEtaoinShrdlu 1 month ago +1

    It's "H100", not "Haayitch100".

  • @VITAS874
    @VITAS874 2 months ago

    Now Nvidia is a greedy company.

    • @bp-it3ve
      @bp-it3ve 2 months ago

      more like always

    • @VITAS874
      @VITAS874 2 months ago

      @@bp-it3ve Earlier they held their pricing appetites in check. Now they make excuses and lie with "we will lower prices later".

    • @LawrenceTimme
      @LawrenceTimme 2 months ago

      Sounds like something a poor person would say.

    • @VITAS874
      @VITAS874 2 months ago

      @@LawrenceTimme Sounds like something a rich man would say. And if high prices are normal for you, then I feel sorry for you.

    • @VITAS874
      @VITAS874 2 months ago

      @@bp-it3ve They used to keep their appetites in check. Now they are lying by saying "we will lower prices after Covid".

  • @х0хлы-підоры
    @х0хлы-підоры 2 months ago

    "туда" ("over there" - a play on CUDA sounding like the Russian "куда", "where to")

  • @stuartthurstan
    @stuartthurstan 2 months ago

    Please, let's try to forget about those cringe, disgusting "gamer" GPUs. Everyone knows that Nvidia is really an AI server company.

    • @FullyBuffered
      @FullyBuffered 2 months ago

      That's certainly their trajectory...

  • @homelessEh
    @homelessEh 2 months ago +1

    I think all this AI crap should be a crime against humanity... I do not approve of the AI crap happening these days.

    • @homelessEh
      @homelessEh 2 months ago

      It'll get to the point where it'll be a being whose blood is electric, and it's lobbying for rights that supersede human rights so it can become an electric god... At that point our blood will become a mere color to be used in a twisted palette when it subverts us, in a Matrix-like future, under the guise of ART... sounds crazy, I know.

    • @FullyBuffered
      @FullyBuffered 2 months ago

      You can certainly make some strong arguments against it...

  • @supabass4003
    @supabass4003 2 months ago

    We all helped create the world's most valuable company. Even though Nvidia has basically told gamers to get rich or touch grass, I'm happy I threw my hat into the ring 24 years ago.
    7h3 m0r3 y0u buy 7h3 m0r3 y0u 54v3

    • @LawrenceTimme
      @LawrenceTimme 2 months ago

      Why were you a shareholder?