The Evolution of Nvidia GeForce Graphics (1999-2022)

  • Published: 23 Aug 2024

Comments • 4.4K

  • @NikTek
    @NikTek  Год назад +2314

    19:58 Well this aged like fine Milk XD

    • @PrimeJade
      @PrimeJade Год назад +43

      Lol

    • @untouchedforce6388
      @untouchedforce6388 Год назад +160

      $1200

    • @dominique-jq553
      @dominique-jq553 Год назад +7

      Frr

    • @onionguy1
      @onionguy1 Год назад +4

      real

    • @martbro135
      @martbro135 Год назад +74

      Yep... Can't believe Nvidia gave it such a high price! $2500 is totally insane. You can buy a car for that. Will definitely result in less profit for Nvidia. Graphics cards like the RTX 3090 are like 4x cheaper but can still get you 60+ FPS on almost any game even with high settings. Only the hardcore Nvidia fans will buy this until the price goes down.

  • @AudioVisionary000
    @AudioVisionary000 2 года назад +1126

    The demos they had earlier in the 2000s are still very impressive to see what they could do

    • @dutchvanderlinde4969
      @dutchvanderlinde4969 Год назад +16

      proto RT 😅

    • @Vindsvelle
      @Vindsvelle Год назад

      @@dutchvanderlinde4969 How ya figger, Dutch?

    • @puciohenzap891
      @puciohenzap891 Год назад +6

      Yep, 3Dmark06 firefly forest

    • @raylopez99
      @raylopez99 Год назад +5

      Demos are what? Images that took two weeks to render? Layouts that are known ahead of time so they can be rendered easily? Or is it a random image done in real time? Not impressed with demos. Apparently in the last few months some research papers have shown you can do real-time ray tracing on a random scene that actually works quickly, using some new hardware and software, but I doubt it's in production yet.

    • @charleshines8523
      @charleshines8523 Год назад +8

      and back then seeing those the first time was just a whole other level of awe!

  • @jessecampos4358
    @jessecampos4358 2 года назад +4264

    The GTX 10 series cards were and still are pretty great. They just refuse to die.

    • @CN-mo4uq
      @CN-mo4uq 2 года назад +244

      so is the 900 series

    • @ValexNihilist
      @ValexNihilist 2 года назад +317

      @@CN-mo4uq definitely agree with you there. Wife's pc has a gtx 970 that still runs games like Elden Ring and Borderlands 3 with no issues at respectable framerates and detail settings.

    • @RonnieMcNutt666
      @RonnieMcNutt666 2 года назад +164

      1070, 1080 and even 1080 Ti + 6600 XT/6600 are currently the best value GPUs by quite a lot

    • @vipinchandra8929
      @vipinchandra8929 2 года назад +31

      @@ValexNihilist with potential FSR 2.0 performance, it'll give these 900 series gpus another life boost.

    • @RonnieMcNutt666
      @RonnieMcNutt666 2 года назад +2

      @@LUL69214 i guess, but those GPUs should go down too if the 20, 30 series and 6700 XT+ GPUs go down too

  • @Shorkshire
    @Shorkshire Год назад +866

    I love how optimistic that $699 price tag for the 4080 was lmao

    • @laggianput
      @laggianput Год назад

      @enrique amaya hail satan

    • @maxdeborde6772
      @maxdeborde6772 Год назад

      @enriqueamaya3883 amen

    • @curie1420
      @curie1420 Год назад +123

      ​@enrique amaya jesus didnt make the gpu prices lower so no

    • @user-zg1yv2yy4s
      @user-zg1yv2yy4s Год назад +38

      @enrique amaya following a nonexistent fictional "god" will be a regret
      so is following what nvidia says in their 4070 ti demos

    • @Puphahadd
      @Puphahadd Год назад +2

      that would have been too nice

  • @KroryykDB
    @KroryykDB Год назад +403

    God I love the old tech demos. One of my first memories of crazy graphics was back in the later 90s as a kid and my sister brought me to one of her friend's houses and he showed me a graphical test of liquid physics that was blood red. I was truly amazed as I've never seen that before and it's been stuck in my mind ever since.

    • @joeturner7959
      @joeturner7959 Год назад +7

      Minus Nalu. They only show a painted version.

    • @cachalotreal
      @cachalotreal 11 месяцев назад +2

      Experiencing that in the 90s sounds magical

  • @zydeox1221
    @zydeox1221 2 года назад +1732

    I gotta say, the evolution in the starting price is also astounding.

    • @FR4M3Sharma
      @FR4M3Sharma 2 года назад +49

      Yeah, came down here to comment the same. That's almost 10 times the price too.

    • @arnox4554
      @arnox4554 2 года назад +127

      To be fair, inflation accounts for a big chunk of that price, plus there are much more premium components in the more modern GPUs, and finally, they're using absolute bleeding edge tech to make these GPUs.
      Still probably overpriced though.

    • @jimmydandy9364
      @jimmydandy9364 2 года назад +19

      @@arnox4554 Yeah our wallets are bleeding too :P The use of premium components is debatable - NVIDIA has had a history of shoddy QA and quality issues with prematurely failing GPUs, 8800? and others. Companies in the modern world are cutting corners - cost cutting and increasing prices. There is more to a graphics card than the premium caps. I've been had by a company called BFG, lifetime warranty my arse, now that company is bankrupt, no surprise. Were they making quality stuff? NO, all hype, none of the cards I purchased from them lasted more than a few months. Never had issues with other brands like EVGA or ASUS, but I know a lot of casual gamers that had premature failures on their 3000 series GPUs though, it's not uncommon - so the argument of paying more for quality is complete and utter bullshit. We should be paying $1k-$1.5k less on the latest cards. And now with Intel desktop Arc GPUs being a total flop and epic fail (just like I predicted), NVIDIA is in the right position to gouge prices even higher.

    • @Noqtis
      @Noqtis 2 года назад +90

      @@arnox4554 I hate the argument of "new technology". The first GPU Nvidia ever made was 250 bucks and it was the newest technology back then. Not only was it infinitely harder to make the first model than everything after that, but even if they are using more premium materials by now, all the extra cost is diminished by all the logistics that are now in place to support the tech industry and which were not even thought of 20 years ago.
      Just like people nowadays think games cost more because they are harder to make, when in reality, just making a Game Boy game like Pokemon, you had to overcome so many limitations; it was much harder than making a shitty CoD/FIFA clone for the hundredth time

    • @arnox4554
      @arnox4554 2 года назад +6

      @@Noqtis As I said. I agree that they're still overpriced, but I do still think that something like the RIVA TNT was definitely more cheaply made than even the 1080 Tis of today. I mean, just look at that damn cooler alone. Those old ass GPUs only needed those dinky little fans with a little heatsink to cool those chips. (Although with that said, Nvidia's chips are starting to get pretty fucking absurd in terms of TDP, especially with these Lovelace chips if the rumors are anything to go by. I WISH we only needed a tiny fan for today's GPUs.)

  • @jasjeetsingh1401
    @jasjeetsingh1401 2 года назад +3743

    I came here expecting memes and left with some knowledge

  • @MLFreese
    @MLFreese Год назад +221

    Back in 1999 and 2000, my high school AV class had a very expensive render station donated to it that would take minutes or even hours to render what a GeForce1 could do in real time. That era was amazing to witness.

    • @Rehn98
      @Rehn98 11 месяцев назад

      Im gay for god @enriqueamaya3883

    • @an2thea514
      @an2thea514 9 месяцев назад +12

      @enriqueamaya3883 can Jesus render faster than a GeForce1 can?

    • @daddylocs6253
      @daddylocs6253 7 месяцев назад +3

      ​@an2thea514 unless you ripped out his brain and used it for some form of bio horror human computer then no

    • @an2thea514
      @an2thea514 7 месяцев назад

      @@daddylocs6253 yeah, my joke was funnier when the religious spam reply still existed.

  • @215Days
    @215Days Год назад +17

    Those old demo showcases are so creative and so nice to look at, still impressive even today; my favourite has to be that green golem at 11:06

    • @shiro3146
      @shiro3146 6 месяцев назад

      Seems like the graphics demos from the early stage of GPUs (somewhere in that era when Nvidia was a new company and 3dfx still existed) all the way to the Windows Vista/PS3/Xbox 360 era, which was around 2006-2008, were more focused on the fact that a GPU can hardware-accelerate 3D rendering, real-time reflections and lighting, while past 2008 they focused more on showing that a GPU can create something uncannily close to real life, with subsurface scattering on skin and the like. Around 2018 it seems the GPU market shifted its focus to refining the 3D pipeline and introducing hardware ray tracing,
      while from 2020 till today the GPU market seems to have moved to AI and complicated math

  • @Shubbhaaam
    @Shubbhaaam 2 года назад +369

    Oh wow
    A serious and informative NIKTEK video

    • @orangejulep8665
      @orangejulep8665 2 года назад +13

      yes a serious video

    • @NikTek
      @NikTek  2 года назад +96

      there are some memes mixed in this though XD

    • @Shubbhaaam
      @Shubbhaaam 2 года назад +36

      @@NikTek I missed the part where that's my problem

    • @NikTek
      @NikTek  2 года назад +46

      @@Shubbhaaam Spiderman 3 theme starts playing*

    • @Shubbhaaam
      @Shubbhaaam 2 года назад +22

      @@NikTek I am gonna put some like in your eyes

  • @jolness1
    @jolness1 Год назад +74

    I still love my 1080Ti(s). Still very efficient for the time and outside of RT, they still hold their own in 1440P well enough for most games.
    Really wanted a 3080FE but couldn't get a chance to get one until a couple months before the 40 series launch.

    • @ertgb_
      @ertgb_ Год назад +1

      I'm gonna keep using mine until I actually need to upgrade. It's probably gonna come down to Nvidia ending driver support for them.

    • @TheBigNoobVR
      @TheBigNoobVR 6 месяцев назад +1

      Aperture science1!1!1!1!1!1

  • @matthewpierce7717
    @matthewpierce7717 8 месяцев назад +9

    I’ve still got a “retro” PC with dual 8800 Ultras in SLI. Each GPU has an aftermarket Thermaltake air cooler that used to be a relatively popular upgrade back in 2006-2007. The whole build is in a limited edition fatal1ty case that’s made completely out of thick gauge sheet aluminum. Even though I haven’t turned it on in years, I’ll always keep it around!

  • @schwartzseymour357
    @schwartzseymour357 2 года назад +902

    "Computer graphics today are the best they've ever been". They are always the best they've ever been.

    • @Davethreshold
      @Davethreshold 2 года назад +41

      That's a damn good point!🤡

    • @anewman
      @anewman 2 года назад +61

      But this time they're the best they've ever BEAN

    • @tarekcat1
      @tarekcat1 2 года назад +5

      They're getting better overtime 💀

    • @4gbmeans4gb61
      @4gbmeans4gb61 2 года назад +9

      That we know of. Aliens be like wtf is this crap.

    • @Sulfen
      @Sulfen 2 года назад +1

      If you think about it they’re also outdated at the same time because you know something better will be available in the future.

  • @supermasterfighter
    @supermasterfighter 2 года назад +471

    Niktek should have covered the 1080 ti after the 1080, that card was REALLY something special

    • @NikTek
      @NikTek  2 года назад +134

      I honestly wanted to cover that card too but I couldn’t find any demos and I wanted to keep this video from being too long, so I mentioned it as the better option when the 2080 was out

    • @jasperfianen3431
      @jasperfianen3431 2 года назад +25

      Bro i got so lucky i got a 1080 ti for 150 dollars

    • @sirbughunter
      @sirbughunter 2 года назад +1

      @@jasperfianen3431 This year?

    • @jasperfianen3431
      @jasperfianen3431 2 года назад +1

      @@sirbughunter ye

    • @jasperfianen3431
      @jasperfianen3431 2 года назад +15

      @@sirbughunter no last year sry (december)

  • @TheWizardGamez
    @TheWizardGamez Год назад +55

    Looking at the past feels so weird, especially considering how recent this was. Whatever that guy's name was who coined the doubling in processing power… man… I can feel it. This vid made me try to get my 360 to not red-ring. It ran for a couple of minutes (thank the heavens), and the technically impressive games felt like the kind of indie game a first-timer would make.

    • @Starfuly
      @Starfuly Год назад +2

      It’s called Moore’s law fyi
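      (For context: Moore's law, as usually stated, is transistor counts doubling roughly every two years. A rough worked figure for the 1999-2022 span this video covers, using that common two-year approximation rather than any number from the video:

      $N(2022) \approx N(1999) \cdot 2^{(2022-1999)/2} = N(1999) \cdot 2^{11.5} \approx 2900 \cdot N(1999)$)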

  • @learningthehardway
    @learningthehardway Год назад +66

    As someone who has owned 75 percent of these cards, the 280, 580, 780, 980, 1060, 1080ti, 2060 and 3090ti all hold a special place in my heart. This doesn't change the fact that the GeForce3 will forever live in my heart as the first real GPU that was presented to me. Now im just waiting for that 4090ti to come out.

    • @MikeJD
      @MikeJD Год назад +1

      Oh yes, getting my first real gaming card in 2000 was a massive leap in performance and it truly showed me how the hardware is progressing

    • @oozly9291
      @oozly9291 Год назад +6

      Why do you waste that much money on sometimes worse cards🤦‍♂️?

    • @learningthehardway
      @learningthehardway Год назад +17

      @@oozly9291 I've been building computers for 20 years personally and for different reasons. For instance, I ran the 1080ti for years till the 3090ti came out. I had the 2060 in my wife's build. I built my daughters computer with a 1060 that I got from a pawn shop for almost nothing. I now use the 1080ti for a home NAS. The only reason I want to buy the 4090TI coming out is because I plan on buying the Pimax Vision 8K to get both eyes in 4k up to 90hz in games like DCS. Ehh I don't need it but its a hobby along with being an IT person at work.

    • @kezoalbertkastell726
      @kezoalbertkastell726 Год назад +1

      Gimme one

    • @armathyx
      @armathyx Год назад +1

      @@learningthehardway Let's all be real about why we really buy new cards: bragging rights.

  • @amdftw
    @amdftw 2 года назад +890

    I love these generational looks. Would be fun to see the history of Radeon now… 😁

    • @webx135
      @webx135 2 года назад +41

      Gone full circle coming back to the Radeon 7000 series.

    • @PatrickStarthnxroxrock
      @PatrickStarthnxroxrock 2 года назад +5

      Agreed. He should make another video!

    • @ZERARCHIVE2023
      @ZERARCHIVE2023 2 года назад +7

      E2 7110 be like : i can run 2D game and 3D polygonal shapes...but can't emulate or play games from 2010s

    • @jothain
      @jothain 2 года назад +9

      ATI/AMD has had some really good products, like the 9800 Pro. Nvidia didn't have anything sensible to compete with it. I've never been a fanboy and have always disliked them. I think my stack of cards over all my years is about 50/50 between those two. _Always_ buy with a good performance-to-price ratio.

    • @milo20060
      @milo20060 2 года назад +5

      r9 280 at around 300 euros.
      Never forgetting that beast.

  • @CODA96
    @CODA96 2 года назад +381

    I remember that nobody ever managed 60FPS on 4k with the 900 series on demanding games. When the 1070+1080 came out, people finally managed it with a SINGLE GPU. NVIDIA really changed the game with their 1000's lineup. I still own a 1060 and its still kicking.

    • @jenkims1953
      @jenkims1953 2 года назад +5

      but any 80 series can do 4K, mileage varying, from the 780 or R9 480 onwards lol. I would say the 980 is the bare minimum for 4K now.

    • @johnnyblaze9217
      @johnnyblaze9217 2 года назад +13

      @@jenkims1953 The 3060 can do a bit of 4K, especially with DLSS

    • @snowy9635
      @snowy9635 2 года назад +5

      I have a gtx 1050 ti

    • @dylanbasstica8316
      @dylanbasstica8316 Год назад +4

      got a gtx 760 in my older system I use for classic gaming and games that can only run on windows 7, still runs great to this day.

    • @CODA96
      @CODA96 Год назад

      @@dylanbasstica8316 VRAM is the biggest limit here. But i recently upgraded to a 3060 😀

  • @andreymoura9865
    @andreymoura9865 Год назад +7

    "Today I'll sleep early."
    Me 2:22 AM:

  • @stephenwakeman3074
    @stephenwakeman3074 Год назад +3

    The most notable change between the 7 and 8 series was not the video memory at all. It was the change to a unified shader architecture from the previously separate vertex and pixel shader pipelines. It was an enormous shift, one which ATI/AMD was first to market with before Nvidia, and it was used in the Xbox 360 hardware.

  • @NikTek
    @NikTek  2 года назад +668

    What was the first GPU that you owned?

    • @tahatahiri3124
      @tahatahiri3124 2 года назад +62

      A laptop GTX 960M, it stuck with me until the end

    • @NikTek
      @NikTek  2 года назад +42

      @@tahatahiri3124 that's great, did it do well in games that you've played?

    • @mynameisciko4885
      @mynameisciko4885 2 года назад +26

      Family pc has gt 7200
      And my first gpu is gtx 550ti from 2011

    • @ChristoAg1
      @ChristoAg1 2 года назад +13

      GTX 460 1GB

    • @nomad_12
      @nomad_12 2 года назад +22

      gt 210 and I played crysis

  • @JoseSanchez-xj3xn
    @JoseSanchez-xj3xn 2 года назад +87

    My first card was the Gainward GeForce 3 Ti 200 128MB GS. This was also my very first online purchase (through eBay) back in October of 2003. I have kept it to this day, it's a very special piece of personal computer history.

    • @rexlr695
      @rexlr695 Год назад +3

      GeForce 3 was badass. It was the basis of the original Xbox GPU with the first Halo, which pushed Microsoft into the console market (which was considered nearly impossible at the time vs. Sony, Nintendo, and to some degree Sega). I'm really, really surprised that wasn't mentioned in this video.

    • @ruudschmitz1
      @ruudschmitz1 Год назад

      Gainward was big in those days. So were the boxes 😉

    • @OnTheRocks71
      @OnTheRocks71 Год назад +1

      What up GeForce 3 Ti200 buddy! I remember when I finally got my hands on one to replace a well past its prime TNT2. It was revolutionary. I feel like those moments in PC hardware are getting few and far between.

  • @JosiahBradley
    @JosiahBradley Год назад +4

    Major correction but tessellation was in retail games using ATi cards 6 years before Fermi. All the way back to Quake 1. It looked great in Morrowind with the 9c patch and UT as well.

  • @darkstar_-hi6wp
    @darkstar_-hi6wp Год назад +2

    Am I the only one who wants to see the rest of the demo videos? 12:15 I came for the graphics cards but I stayed for the videos.

  • @Capanel
    @Capanel 2 года назад +48

    It can't be overstated how big the 8800 GTX was for its time.
    Crysis was released shortly after its launch, and the card was an absolute monster for anything that was thrown at it at the time. It was legendary and so many people had them.

    • @AbsolutePixelMaster
      @AbsolutePixelMaster 2 года назад +8

      Yeah he kind of just blew past that card. I would consider it one of the most significant in Nvidia's history.

    • @str1xt
      @str1xt 2 года назад +1

      It was a beast when it came out. A game changer indeed. I was so excited when I went to pick mine up.

    • @terminalvelocity4858
      @terminalvelocity4858 2 года назад

      Yep, I remember a time when a top-tier graphics card was $500. Still have two of 'em.
      *Also made great heaters in the Winter. 🤣

    • @boboznek7090
      @boboznek7090 2 года назад +4

      8800 gtx and 1080ti are easily the greatest nvidia gpus to ever exist in history. he probably overlooked 8800 gtx since it came out over a decade ago.

    • @Trgn
      @Trgn 2 года назад +2

      8800 gtx was peak gaming for 90s teenagers

  • @chrisrogers1092
    @chrisrogers1092 Год назад +361

    Interesting seeing these price predictions and how waaaaay off they are for the 40 series.

    • @BlackhatAudio
      @BlackhatAudio Год назад +1

      He didn't take into account that Biden would toss 3 trillion dollars into an already overheated economy and blow it up.

    • @pillington1338
      @pillington1338 Год назад +53

      I laughed when I saw the $699-750 prediction now that it’s out and is $1200.

    • @Vector_Ze
      @Vector_Ze Год назад +26

      @@pillington1338 Looking on Amazon, I don't see any 4080s. They have 4090s, priced from $2200 to $2500. That's insane in my book.

    • @BlackhatAudio
      @BlackhatAudio Год назад +17

      @@pillington1338 yeah he was way off but didn't include the problems that have occurred since Biden took over.

    • @anthosm
      @anthosm Год назад

      @@Vector_Ze you aren't seeing any 4080s because they haven't been released yet

  • @erer5106
    @erer5106 Год назад +5

    I remember the first GPU I ever bought was a GTX 470 in 2014, 2 years later I bought a GTX 1050ti. That thing served me very well until I built a new computer in 2020.

  • @sridrawings4510
    @sridrawings4510 Год назад +5

    9:04 no way that would work on the 8800 GTX, it looks like it's straight from RTX-series levels of graphics

    • @jees3474
      @jees3474 Год назад +1

      Well it does

    • @MrTefe
      @MrTefe Год назад

      that is not RTX that is reflections

    • @sridrawings4510
      @sridrawings4510 Год назад

      @@MrTefe ik but what I mean is it's close to today's RTX levels of graphics

    • @MrTefe
      @MrTefe Год назад

      @@sridrawings4510 not really

    • @sridrawings4510
      @sridrawings4510 Год назад

      @@MrTefe yes it is

  • @phenom957
    @phenom957 2 года назад +90

    Something oddly captivating about late 90's, early 2000's GPU benchmarks. Fantastic video.

    • @zxb995511
      @zxb995511 2 года назад +13

      It was how imaginative the benchmark demos were.

  • @V1perKiller
    @V1perKiller 2 года назад +171

    I remember asking my dad for a 6800 Ultra, but prices in my country were super high, so I ended up with a 6200 AGP instead lol. Damn, time flies

    • @NikTek
      @NikTek  2 года назад +31

      just like this current GPU shortage that has been happening these past 2 years

    • @yan3066
      @yan3066 2 года назад +12

      @@NikTek 2 years ?! Wow, time flies for sure.

    • @LNCRFT
      @LNCRFT 2 года назад +2

      I also had an AGP model of the 6200. That was quite a bad card but defined my childhood and powered my favorite games

    • @V1perKiller
      @V1perKiller 2 года назад +4

      @@LNCRFT Yes, it was so bad that I had to go to my friends house to play Crysis after school. I remember he had a 7600GTS, and we had to play on low settings. Still, I was jealous.

    • @ekstrapolatoraproksymujacy412
      @ekstrapolatoraproksymujacy412 2 года назад +1

      hahahaha quite a reality check, 6200 was the worst possible new card back then xd

  • @rayeasom
    @rayeasom 8 месяцев назад +1

    My first graphics card was a Voodoo 2. Seeing GLQuake for the first time was absolutely mind blowing. The step up from base, non-GL Quake was unreal.

  • @alaproskater
    @alaproskater 10 месяцев назад +2

    The 6800 GT will always be the best in my heart, i had the best days of my life with it

  • @shinnosukenohara5264
    @shinnosukenohara5264 2 года назад +177

    Times have changed, I expected a 40 second meme, but watched the whole 20 minutes of knowledge about something I never got and am never gonna get.
    All hail Gamers without a GPU🔥

    • @joopozo
      @joopozo 2 года назад

      GeForceNow is here to save me

    • @str1xt
      @str1xt 2 года назад +4

      Without a GPU? That would be my first pc, which was a Commodore 64 back in 1984. You used a cassette tape to load the games, which took forever. Good times. The first game I played on it was the Hunchback of Notre Dame. Mesmerising at the time.

    • @EndahParmadiyanti
      @EndahParmadiyanti 2 года назад +1

      Oh I didn't realize this video is 20 minutes long😅. I really enjoy watching it🤩

  • @jobbus22
    @jobbus22 2 года назад +48

    I remember these Nvidia demos for each generation. I could not believe what graphics they could do, back in the day

  • @TheWizardGamez
    @TheWizardGamez Год назад +8

    I remember when the 980 was the hottest thing on the block. I knew one guy who was a ducking rich man. Got an SLI setup and we were all jealous of him. Man…. Those were the days

  • @baronsengir187
    @baronsengir187 Год назад +4

    The GeForce 4200 and the 970 have been the best bang for your money in all of Nvidia's history.

  • @shrekwes4957
    @shrekwes4957 2 года назад +39

    Those old 90’s Benchmarks just hit so much different idk how to describe it.

    • @ballstothewall38
      @ballstothewall38 Год назад +4

      I remember upgrading my graphics to the 256 in order to get decent frames in Battlefield.

    • @shrekwes4957
      @shrekwes4957 Год назад +1

      @@ballstothewall38 Ahh the GeForce 256. I remember my brothers PC had one when he was first getting me into PCs. Back when pulling 60 FPS was akin to 120 nowadays.

  • @LOLTROLL0245
    @LOLTROLL0245 2 года назад +198

    Id love to see an AMD history vid like this

    • @siali85
      @siali85 2 года назад +7

      me too 😀

    • @NikTek
      @NikTek  2 года назад +103

      i’ll start working on it very soon!

    • @abram730
      @abram730 2 года назад +8

      AMD's history starts with them buying ATI. There was a lot of drama there. AMD and Nvidia were very close before that. ATI was always jumping ahead of Nvidia in tech. They gambled everything on using the XBox360 to jump ahead of Nvidia with universal shaders. XBox360 even had tessellation. The XBox360 was almost an entire generation ahead of ATI's PC offerings and many PC gamers jumped to console gaming. That bankrupted ATI as they didn't get much from XBox sales, and AMD bought them. Nvidia's 8000 series was their first series with universal shaders and when they introduced CUDA. It was their first PhysX capable card too. Their base architecture remained the same until the 400 series and never fully supported DX10.1. Rather they jumped to DX11 with the 400 series. AMD however stayed on the same architecture and never fully supported DX 11. They pushed conspiracy theories rather than innovating. Nvidia went so far as to enable multi-threaded rendering in games not coded for it in a driver update. Devs weren't coding for it because AMD didn't support it. DX12 was just Mantle, an API designed for AMD hardware, with some name changes. They still had the same slides in the documentation. DX12 was basically Glide. It didn't improve graphics, but slowed down Nvidia cards. Nvidia had to change their cards to work better with an older way of doing things, as they were doing universal pipes, rather than a single hardware scheduler.

    • @astaris7931
      @astaris7931 2 года назад +1

      @@siali85

    • @GeneralMerc
      @GeneralMerc 2 года назад

      @@NikTek thanks

  • @memohdfromwwe_
    @memohdfromwwe_ 3 дня назад +1

    I remember my first video card was a GeForce4 MX 440 PCI (not AGP) back in 2003ish, I was blown away by how much more powerful it was versus integrated graphics.

  • @Savitarax
    @Savitarax Год назад +1

    I just barely got a retro pc setup! An original GeForce 256, except it's the DDR version!
    Installing it and realizing 64-bit Windows was not used until 2003 made me realize just how much stuff has changed.
    Really was an interesting time to be alive!

  • @anthonysmith6413
    @anthonysmith6413 2 года назад +46

    I remember getting a new PC with the 8800GTX. For that time it was so expensive and huge, me and my friends couldn't believe it. We even used the exact tech demos shown here to test it out.
    Well and the Tech-Demo called Crysis.
    Good times.

    • @NikTek
      @NikTek  2 года назад +4

      That’s great, the good old times :)

    • @CantankerousDave
      @CantankerousDave 2 года назад +1

      I had the plebian 8800 GTS, the 320mb model. I played through Crysis on it, but the poor thing was practically melting down by the end.

    • @brightwebltd2864
      @brightwebltd2864 2 года назад +1

      Remember playing on machines with core 2 Q6600 and two 8800GTXs in SLi with 4GB of RAM in 2007/08. That was a beast - ran maxed out Crysis all day, everyday like it was calc.exe

    • @WallachiaTacos
      @WallachiaTacos 2 года назад +3

      “Hey I got a gpu”
      “Can it run crysis tho?”

    • @fillmorehillmore8239
      @fillmorehillmore8239 2 года назад

      I remember MW on a new P75.

  • @StievenStefanovic
    @StievenStefanovic 2 года назад +33

    The FX 5800 Ultra, also called "The Hairdryer". I remember it for its "Ultra"-loud fans and high temperatures. That was the big complaint about it and drove many people to ATI Radeons

    • @Gatorade69
      @Gatorade69 2 года назад

      Yep. I had a GeForce 2, 3 and 4. Went Radeon around that time, also because the price and performance of Radeon cards were getting very good around that time.

    • @maanmahmoud4537
      @maanmahmoud4537 2 года назад

      @@Gatorade69 Radeon card is always best

  • @fuzzblightyear145
    @fuzzblightyear145 Год назад +1

    VoodooRush, Voodoo2, GeForce2GTS, GEforce4Ti, GTX8800, GTX260, GTX460 and still rocking my GTX980.
    Man, those early demos brought back some memories

  • @DanielGT_93
    @DanielGT_93 Год назад +2

    You guys think that the 1080 being two 980s was brutal? Look back at what the 8800 GTX did to the 7800 GTX and come back here to talk. Boy, i owned one G92 (8800 GTX refresh) from 2009 to 2015 and ran almost anything at 1080p. Only after DX11 adoption did i change GPU.

  • @SashimiSteak
    @SashimiSteak 2 года назад +40

    My dad bought me a Geforce2 and I used it for years.
    As a student I didn't really have much cash so I went for budget AMD GPU after that for many years.
    Eventually I got my first job and bought a GTX 580, then another GTX 580 for SLI and then another GTX 580 for tri SLI.
    Then I got married and bought a property. With a mortgage on my back I couldn't afford to upgrade for a long time. Until that marriage broke off and we sold/split all our assets. Quit my job, work casual for 3 years before going back into full time work and started earning double of what I used to make.
    With that additional cash I finally upgraded to an RTX 3090 in 2020.

    • @angrysocialjusticewarrior
      @angrysocialjusticewarrior 2 года назад

      Marriage is a scam man. ALWAYS keep them in the girlfriend category. That marriage contract is nothing more than a will... with the difference being that a regular will gives away your assets when you die, and a marriage contract gives away your hard earned assets when you divorce.

    • @fiece4767
      @fiece4767 2 года назад +5

      See how life can get great again when you lose 180 pounds of dead weight. Remember kids, never get married, work on yourself and your hobbies and dont look back. Enjoy your 3090, you deserve it, she can serve you for many years.

  • @x98334
    @x98334 2 года назад +46

    I remember going from a Voodoo 2 12 MB card to the GeForce 256. I cannot even explain how awesome that was. My first computer however was an AST 486 at 33 MHz and that had a 512 KB graphics chip installed; i found that out after buying a game that required a total of 1 MB of video ram. Oh, glory days.

    • @ChrisM541
      @ChrisM541 2 года назад +1

      I remember purchasing two Voodoo 2 cards and running them in SLI. What an eye-opener at the time. It's a pity Nvidia killed off 3dfx after their buy-out, then again, they do reign supreme in anti-competitive practices - right up to today.

    • @billlam7756
      @billlam7756 2 года назад

      I remember needing to buy a 4 MB Diamond Stealth GPU to run Rainbow Six back in 98 😂

    • @phillycheesetake
      @phillycheesetake 2 года назад

      @@ChrisM541 3dfx killed itself through anti-competitive practices, seeing them as a victim of Nvidia is the wrong take IMO.

    • @ChrisM541
      @ChrisM541 2 года назад +1

      @@phillycheesetake Bullsh#t, lol!

    • @phillycheesetake
      @phillycheesetake 2 года назад +3

      @@ChrisM541 This isn't arguable, 3dfx did kill itself through anti-competitive practices.
      They bought out a board partner and tried to bring all manufacturing in-house. They ran into a manufacturing bottleneck and as a result couldn't get their chips to market. As a result, you can still buy new VSA-100 chips for not much money, because chip production vastly over-ran card production.
      Nvidia isn't the reason 3dfx doesn't exist, 3dfx is the reason 3dfx doesn't exist. And I say that as a 3dfx fanboy.

  • @sheed64
    @sheed64 7 месяцев назад +1

    The 8800GTX was such a monster when it came out. Legendary card. I remember Company of Heroes bringing a couple of PCs I had to their knees. Ran like butter on the 8800. Was the only card that could feasibly run Crysis as well (albeit on lower settings).

  • @yaromakc
    @yaromakc Год назад +3

    The Evolution of Nvidia GeForce Graphics....249USD-->>1999USD

  • @TheBeatboxHitmanTwo
    @TheBeatboxHitmanTwo 2 года назад +89

    It's crazy how far technology has come. Can imagine in 2030

    • @adrianafamilymember6427
      @adrianafamilymember6427 2 года назад +5

      The MU 4 F X-10 Tendy 🍌

    • @wizzotizzo
      @wizzotizzo 2 года назад +8

      @@adrianafamilymember6427 what

    • @viltiarnusa
      @viltiarnusa 2 года назад +6

      @@adrianafamilymember6427 3:57 "Be nice you need more friends" lol

    • @Villosa64
      @Villosa64 2 года назад +6

      Nvidia prolly got some crazy shit in the background that they dont want to release yet

    • @misy4ru3
      @misy4ru3 2 года назад +11

      MSRP 1999 dollars, TDP 1400W in 2030

  • @DjVortex-w
    @DjVortex-w 2 года назад +26

    One thing that's not conveyed in this video is the increase in _resolution_ that each generation was capable of (as in, a card from the next generation being able to run the same demos and games as the previous generation but at a significantly higher resolution). This is quite an important factor, besides the visuals.

    • @theredpanda00
      @theredpanda00 Год назад +1

      And I'm here still using a 1440x900 monitor from 2008, but it's very special to me even if it isn't 1080p. Plus for some reason it's 75Hz, something I never would've known if I hadn't stopped using it for my PS3 (that's why it's special, it's always been there through the years) and used it for a PC for once. And yes, I still game with it

  • @kwlxxi4813
    @kwlxxi4813 Год назад +2

    15:26 "Now, what's probably my favorite release of all time" 👍 Still a beast!

  • @jeffkardosjr.3825
    @jeffkardosjr.3825 Год назад +2

    15:20 Now they can fake the Moon landing just using a graphics card and some software.

  • @exp-eri-mental
    @exp-eri-mental 2 года назад +121

    The thing that strikes me is how imaginative the demos were back in the day. Such unique art style we can't seem to replicate today.

    • @JacobKinsley
      @JacobKinsley 2 года назад +16

      It's what most early cgi looked like

    • @XpRnz
      @XpRnz 2 года назад +16

      You can clearly see they weren't really art-directed and often made by companies that didn't specialize in these types of visuals so they came out very quirky and unusual and directed towards certain effects they wanted to showcase.

    • @rolux4853
      @rolux4853 2 года назад +3

      @@XpRnz they looked much better back then!

    • @pulledtrigger
      @pulledtrigger 2 года назад +5

      Some of it felt eerie but unique, it feels nostalgic like the ps1/2 Era

    • @CommanderTato
      @CommanderTato 2 года назад +1

      Luna, my all time favourite.

  • @bs-yn7su
    @bs-yn7su 2 года назад +78

    256 / first gpu
    2 / real time lighting, per pixel lighting, higher triangle counts
    3 / shading effects, reflective textures, complex facial geometry
    4 ti 4600 / real time volumetric fur, anti-aliasing
    fx 5800 ultra / cinematic depth of field blur, lighting effects
    6800 ultra / complex vertex and fragment shaders
    7800 /
    8800 / bouncy physics, realistic faces
    9800 /
    280 / 1 billion transistors
    480 / tessellation
    580 /
    680 / hair works system
    780 / physics, destruction, smoke effects
    980 / real time global illumination
    1080 / realistic and photographic
    2080 / rtx real time ray tracing
    3080 / double rtx
    4080 / 75%

    • @EinSwitzer
      @EinSwitzer 2 года назад +2

      really

    • @TacticalPhoenixYT
      @TacticalPhoenixYT 2 года назад +2

      well 1080, not really. It was pretty much just faster and more efficient.

    • @EinSwitzer
      @EinSwitzer 2 года назад

      @@TacticalPhoenixYT meaning new software YAY!!!

    • @TacticalPhoenixYT
      @TacticalPhoenixYT 2 года назад

      @@EinSwitzer The real diff was with turing, due to its RT cores. But older cards can still RT, just at worse fps.

    • @EinSwitzer
      @EinSwitzer 2 года назад +2

      @@TacticalPhoenixYT I'm aware of this I Was able to get my 1080Ti Msi Seahawk corsair edition to clock 2.1 max 1.987ghz average max ram based on values then went into games that supported it 60 to 100fps just . like . 2080Ti . searching for missing ram chip.. and channels to more so tech the 1080ti could have 2x the ram based on thread to register... relax I give it all to nvidia or intel who ever's parts I use to get more iNPUT I give them the findings.

  • @volvo09
    @volvo09 Год назад +1

    Great video! Loved the addition of the Nvidia demos from all the generations.

  • @AmirZaimMohdZaini
    @AmirZaimMohdZaini Год назад +1

    It's pretty amazing to see Nvidia engineers doing demo stuff that pushed their GPUs to their limits, especially in the early 2000's era when it was like they were trying to compete with the PS2 and OG Xbox consoles.

  • @Kyori0
    @Kyori0 2 года назад +48

    i love these old demos, they're so cool for some reason

    • @qudel5644
      @qudel5644 2 года назад +4

      yeah the nostalgia with them is so great

    • @bigboat8329
      @bigboat8329 2 года назад +5

      some of them were creepy as hell

    • @qudel5644
      @qudel5644 2 года назад +1

      @@bigboat8329 true true

    • @dieglhix
      @dieglhix Год назад +1

      try searching demoscene, those that started with 64kb files, they are pretty amazing

    • @memespeech
      @memespeech Год назад

      @@bigboat8329 you meant nniiiice... simps lol

  • @Radek__
    @Radek__ 2 года назад +9

    8:49 a hard slap in to the frog face :)

  • @kurzackd
    @kurzackd Год назад +4

    16:23 -- errr... you kinda missed the 16-series there, buddy... :S

    • @BrucifyMe
      @BrucifyMe 5 месяцев назад

      Technically yeah but he's only doing the _mainstream_ flagships of each generation. The 16 series was based on Turing - the 2080 cards were the flagships there.

  • @jessepotter365
    @jessepotter365 Год назад +1

    GTX 280 x2, GTX 580 x2, GTX 680 x2, GTX 780, GTX 980 ti, GTX 1080 ti, RTX 3080, RTX 4090. It's been amazing (and expensive) living through this evolution.

  • @paulmorphy6187
    @paulmorphy6187 2 года назад +29

    Very interesting video. It's funny how you fail to mention that the GeForce 256 was the first card to use hardware T&L (transform and lighting)... at the time this was its big selling point and the feature that was supposed to make it different from previous cards.

  • @ydoucare55
    @ydoucare55 2 года назад +13

    3:45 - seriously sounds like you say "shitting effects"

  • @Kougeru
    @Kougeru Год назад +2

    The 4080 price here is so optimistic 😂. If only it was accurate. It's insane how we went from $499 in 2012 to $1199 in 2022 for this tier of GPU. The average person's yearly income has barely changed since then and yet prices are up by over 2x.
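    (A rough check of that jump, as a minimal sketch: the $499 and $1199 figures are the ones cited in the comment above, presumably the GTX 680 and RTX 4080 launch MSRPs, and the ~1.27 cumulative 2012-2022 US inflation factor is an assumed round number for illustration, not from the video.)

    # Rough comparison of the xx80-tier launch-price jump described above.
    price_2012 = 499            # USD, as cited in the comment
    price_2022 = 1199           # USD, as cited in the comment
    cpi_factor = 1.27           # assumed 2012 -> 2022 inflation multiplier

    nominal = price_2022 / price_2012                  # ~2.4x in raw dollars
    real = price_2022 / (price_2012 * cpi_factor)      # ~1.9x in 2022 dollars
    print(f"nominal: {nominal:.1f}x, inflation-adjusted: {real:.1f}x")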

  • @tofu_golem
    @tofu_golem 7 месяцев назад +1

    Your estimated price of the 4080 was hilariously optimistic.

  • @todorsamardzhiev144
    @todorsamardzhiev144 2 года назад +44

    The increase of video memory on 8800GTX wasn't that important at the time. Unified shader architecture, the raw number of shaders (128 vs 24+8 for 7800GTX), the DirectX 10 support, and the 384bit bus width were all more impressive.

    • @brightwebltd2864
      @brightwebltd2864 2 года назад +2

      That 384bit bandwidth blew everything else out of the water, even for a few years after

  • @ogshotglass9291
    @ogshotglass9291 Год назад +13

    At this point, I would love for Nvidia to show pretty much the same types of videos as their original to showcase just how much their technology has improved and its capabilities.

  • @Coyote27981
    @Coyote27981 8 месяцев назад +1

    I was there gandalf, 3000 years ago...
    My first nvidia graphics card was a TNT2, but i got a geforce 2 in 2000.
    It was a completely different beast.

  • @kaiperdaens7670
    @kaiperdaens7670 8 месяцев назад +1

    The diss at around 04:05 is crazy.
    And it feels like it is directed at modern gamers.

  • @2DarkHorizon
    @2DarkHorizon 2 года назад +5

    The graphics in 2012 at 12:45 still look better than nearly all game characters today.

    • @rodellrellin4228
      @rodellrellin4228 Год назад +1

      Thats awful alot so that much they used Nvidia characters went used.

  • @Totto_90FPS
    @Totto_90FPS 2 года назад +110

    Believe it or not, i just wanted to google the evolution of Nvidia GPUs LMAO 😂

    • @NikTek
      @NikTek  2 года назад +16

      Lol XD

    • @captainvenom7252
      @captainvenom7252 2 года назад +1

      Wait, 2 hours ago?
      Even though the vid was uploaded 1 hour ago.

    • @FrostyTheOne_
      @FrostyTheOne_ 2 года назад +2

      @@captainvenom7252 channel member ig

    • @captainvenom7252
      @captainvenom7252 2 года назад

      @@FrostyTheOne_ unlisted viewers ig

    • @GM-os1bl
      @GM-os1bl 2 года назад +3

      Talking about supply and demand :)

  • @gohtwm
    @gohtwm Год назад +1

    The Supersonic Sled and Design Garage demos look really fun tbh

  • @Serveck
    @Serveck Год назад +1

    Made the jump from a GeForce 2 to a 9800 GTX to an RTX 3070 Ti; the jumps in quality are astounding, but they were all used to play Half-Life.

  • @ccramit
    @ccramit 2 года назад +8

    I remember getting into PC gaming around 2007 and wishing I could afford two 8800's in SLI. Ended up with just a single 8600GTS, but it still ran CoD4 very well.
    Ah, to be young and poor. Now I'm old and poor.

  • @toespic
    @toespic 2 года назад +10

    4:13 Imagine flexing u have a 4060ti early

  • @Beateau
    @Beateau Год назад +1

    No one gonna talk about how creepy that demo at 4:00 is?

  • @Hereford1642
    @Hereford1642 Год назад

    Started gaming with a GeForce MX 440 in a Dell Dimension playing BF 1942. I was 14 and it got me into looking at other cards that could play PlanetSide 1. I remember the ATI 9800 Pro and the GeForce FX 5900 Ultra having a big battle. I got an FX 5600 and was so happy being able to play BF 2. Stopped for a long time and missed many generations; the first card I truly bought was a GTX 960 that served me well. Now I am a games developer and 3D artist pushing these cards for all they're worth. I love 3D graphics. I now have an RTX 2080 Super using Unreal Engine 5; I am very lucky to be able to do this kind of thing.

  • @ijhuana
    @ijhuana 2 года назад +8

    Great video mate, it was at the end of the video that I recognized the logo. Could not believe you are capable of producing amazing memes and super well-designed and commented long videos too.

    • @NikTek
      @NikTek  2 года назад +5

      Thanks a lot, I’m glad to hear the positive feedback from all of you! I’m known for being a tech memer but topics like these have always intrigued me, so I made this video

  • @HardProduct
    @HardProduct 2 года назад +5

    10:55 so true, I remember this card getting so hot you could literally feel the heat with your hand not even touching it.
    Grill Force - The way it's meant to be grilled 😂🤣😂

  • @LooNeYlv
    @LooNeYlv Год назад +1

    My Nvidia graphics road started in 2001 on a shared family computer with a GF2 (AGP), GF4 (AGP) and GF 6600 (AGP); then our home computer was upgraded and had my first PCI-E card, a GF 8800. Then, when i was 14/15, i spent tens of hours researching many forum sites for the 'best' hw combo and built my own 'gaming pc' from money that i saved up, with a GTX 280. It held the crown very well, as i only replaced it when i upgraded to a new platform and a GTX 660 Ti, then a GTX 1070 Ti, and now i have a Z690 platform with an RTX 3070 Ti.
    P.S. 14:03 The original PhysX engine was actually developed by Ageia; you needed a separate expansion card and special drivers to be able to run this realistic physics engine. Then Nvidia just bought the SDK rights in 2008 and integrated it into its own system, so if you had an Nvidia GPU you could just 'enable' the PhysX system and render the realistic physics in demos and the few games that implemented this system.
    So it was already doable in GTX 280 times (but back then it was still preferred to have a separate, older Nvidia card with at least 256mb of video memory slotted in, which you dedicated to the PhysX work in the Nvidia driver hub.)
    Then in ~2011 there were the first public rumors that Nvidia was working on a new physics engine, and in the 2013 lineup Nvidia released the overhauled PhysX engine, so now it was even more realistic and dedicated PhysX cards were gone, as it all was done, and could be done, with one powerful GPU, the GTX 780. 😉

  • @elleodurkin409
    @elleodurkin409 Год назад +2

    OK, so NV1 to NV5 aren't in the GeForce category, but I think it would be good to include at least the TNT and TNT2 to make this video comprehensive. And, I suppose, mention when they changed from PCI to AGP to PCIe but there's a large overlap for those interfaces. Remember when you had to buy a new motherboard for a new GPU? I'm sure there was at least one generation when I also had to buy new RAM-so much for upgrading one thing! I've still got my GeForce 256 in a box, I was impressed that it could also capture video but the "3D glasses" had a flicker that was unpleasant for most people.

  • @viltiarnusa
    @viltiarnusa 2 года назад +16

    3:57 "Be nice you need more friends" lol

  • @m.hasler7263
    @m.hasler7263 Год назад +3

    Well that price prediction for the 40s aged like milk

  • @alexello1189
    @alexello1189 7 месяцев назад

    The first pc I remember using was a standard cream windows tower; I played some Pajama Sam, Learn to Type, and Rogue Squadron. Later on in 2005 my dad got us this sick gaming pc in a red case that had a viper logo. That one had an Nvidia GeForce 7800 in it and about 8gb ddr2 ram. With that I started playing Pirates of the Caribbean Online, a little bit of WoW (till the free trial expired), and Minecraft. Later on it started having startup issues and we had to upgrade. We got an Origin gaming laptop with a gtx 980m and 16gb ddr3. With that I was able to get into Steam and play TF2, CSGO, and Skyrim. In late 2021 the laptop's battery was basically degraded and it would often overheat despite numerous cleanings with computer duster and a vacuum. Currently I got a completely custom build with 16gb ddr4, a 9th-gen i7, and an RTX 3060; the only downside is my motherboard doesn't have a TPM chip and therefore can’t upgrade to windows 11….
    But it runs rdr2 no problem so that’s good 👍

  • @BritishBoy
    @BritishBoy Год назад

    Some of those old Nvidia demos were straight NIGHTMARE FUEL

  • @StacyDubC
    @StacyDubC 2 года назад +27

    AGP to PCI-E was a massive step too and often overlooked. Bandwidth increase was no joke.

    • @shapeshiftsix
      @shapeshiftsix 2 года назад

      Not really at first, it took a few gens before pci-e was any faster than agp 8x

    • @StacyDubC
      @StacyDubC 2 года назад +1

      @@wayn3h I had a 6800 ultra and then they brought out Pcie. After that mobos stopped supporting AGP and PCIE was kicking AGP ass. I felt abandoned.
      I remember having ISA and PCI in the same board back in the day. Not too slow for 56k modem, everything else was PCI

  • @jothain
    @jothain 2 года назад +10

    Cards have changed a lot over the past years in so many ways. My first card (a 3D one) was a Voodoo 2 and it was a massive upgrade in all technical ways, the likes of which we just haven't seen since those days. Power consumption is an interesting one too. I still have a 6990 lying around somewhere and boy, that created a lot of heat.

  • @lanasmasher594
    @lanasmasher594 6 месяцев назад

    It's crazy how those early cards were responsible for some of the best animated movies that came out in the early 2000s

  • @ElJuli24
    @ElJuli24 Год назад +1

    Back in 2004 my mom gave me my first real pc, it didn't have a gpu, only amd integrated graphics and 512 MB of ram, I was able to play Max Payne 2, World of Warcraft BC (very very slow graphics) and GTA SA. I enjoyed it a lot even with only those simple components. In 2011 I got a pc with a 512 MB gpu and 8gb ram. It's amazing how technology advances. This week i will build my 4090 FE rig, but you will always remember that first pc.

  • @MikeDoesRandomThings
    @MikeDoesRandomThings 2 года назад +17

    Still using my 1080, 10 series was the last time I recall Nvidia ever being reasonable.

    • @genesisgaming3756
      @genesisgaming3756 2 года назад

      ehhhh, the 1660 SUPER is really good, it just doesn't love 1080p. it is really great for VR and 1440p though.

    • @angrysocialjusticewarrior
      @angrysocialjusticewarrior 2 года назад +1

      1080 is ancient bro. hopefully you finally catch up with modern times by upgrading to an RTX 4070 when it comes out.

    • @genesisgaming3756
      @genesisgaming3756 2 года назад

      @@angrysocialjusticewarrior depends what you do, the 1080 can run FH5 at like 50FPS, but it doesn't do RTX. Personally, I don't really like "RTX ON" games, it doesn't look quite right.

    • @bored78612
      @bored78612 2 года назад +1

      @@genesisgaming3756 Control looks great. That is the only one that is tho.

    • @fuzzypanda1684
      @fuzzypanda1684 2 года назад

      @@angrysocialjusticewarrior Lol, judging by your avatar and name I'm guessing you're joking, in which case thanks for the laugh!

  • @Icureditwithmybrain
    @Icureditwithmybrain 2 года назад +13

    I got into pc gaming in 2001 but never actually had a pc with a graphics card until 2008. I currently have a RTX 3070 in my pc.

    • @xxgamergirlxx7917
      @xxgamergirlxx7917 2 года назад +1

      Planning to upgrade my PC soon from a 1050 to a 3070
      How's ur experience with the 3070?

    • @black_shadow8137
      @black_shadow8137 2 года назад

      @@xxgamergirlxx7917 I have laptop omen 16 with rtx 3070 and its overkill.

    • @ferryry3358
      @ferryry3358 2 года назад

      I've a RTX 3070 Ti. it's a monster when I play games at 1440p@144fps.

    • @xxgamergirlxx7917
      @xxgamergirlxx7917 2 года назад +1

      @@ferryry3358 wish I could get a Ti myself.
      But the price is already getting out of my budget.

    • @ferryry3358
      @ferryry3358 2 года назад +1

      @@xxgamergirlxx7917 yeah understandable. The price was as much as I pay for my apartment monthly. If I haven't saved up that much I couldn't afford this GPU. I had a GTX 960 previously and it is definitely a huge upgrade.

  • @blutadlerx
    @blutadlerx Год назад +2

    Started with a GT 440, then an R9 270X, a GTX 970 and now finally my lifelong dream, a GPU to max out every setting in every game in 1080p and some even in 1440p and 4K, the AMD RX 6700 XT. Always wanted a card like this, my entire life ♥️😍

    • @fredriksvard2603
      @fredriksvard2603 Год назад

      Started with canopus pure3d 6mb

    • @blutadlerx
      @blutadlerx Год назад

      @@fredriksvard2603 technology has come so far, I love it :)

  • @glitchtrap1987
    @glitchtrap1987 2 года назад +3

    Thanks for the effort but my man just skipped the entire 16 series

  • @leonkernan
    @leonkernan Год назад +3

    Damn, I remember the jump in quality going from an S3 Trio 64 to a 3DFX Card, then a TNT2...

  • @coffeepot3123
    @coffeepot3123 Год назад +1

    Back when i was a kid i would visit a friend who had the werewolf one playing on his pc.
    Shit was amazing, really drummed up a kid's imagination.

  • @DenverStarkey
    @DenverStarkey 6 месяцев назад

    Also fun fact: Doom 3 began development in 1999, when the GeForce GTS was new, and didn't make it to market till the GeForce 6800 Ultra was the card everyone wanted, in August 2004.

  • @Tek4uOfficial
    @Tek4uOfficial 2 года назад +6

    3:34 My Intel HD 620 be like: Finally found the opponent 😁

  • @Volcom1947
    @Volcom1947 Год назад +49

    That MSRP for the 4090 is $1600 what an absolutely insane time to be alive.

    • @TheSlickmicks
      @TheSlickmicks Год назад +7

      I am completely disgusted by the price tags. And people bought it. And people will buy the 4080 and the fake 4080. It's gross. If people just told Nvidia how they felt with their wallets, by not buying, then Nvidia would be forced to cut prices to a fair margin.

    • @frtzkng
      @frtzkng Год назад +3

      I remember back in 2016 I built a mid to high-end PC for $1,500. The 980 Ti was like $600 or something. Now we're talking $2,000 and more for the same bracket, with the RTX xx80 non-Ti cards alone costing around $1,000. Although it's insane what even mid range PCs are capable of now.

    • @TheRealMikeMichaels
      @TheRealMikeMichaels Год назад +4

      ​@@frtzkng No GPU should cost over £600. Period.

    • @high-octane-stunts86
      @high-octane-stunts86 6 месяцев назад

      Actually the price makes sense, people back then would have to pay around $600 for a card that is 10 times slower than the 4090, so it’s actually a good deal if you think about it

    • @anhduc0913
      @anhduc0913 6 месяцев назад +1

      ​@@high-octane-stunts86 bruh. It's like saying the world's first computer was supposed to cost 5 cents because it's thousands of times weaker than your phone (spoiler: it wasn't cheaper). That gpu was cutting edge at the time, and pretty much the only gpu back then. It should fall in range with the 70-class cards right now, so it's a premium product, but the demand just wasn't there yet. For the 4000 series cards, only the 4080 and above are overpriced. The 4070 is slightly overpriced, but the rest are normal. The bigger problem lies in the fact that nvidia made them worse for their tier; all cards from the 4070 down should have been moved back one tier with their specs. They leveraged better technology and software to cut corners with the 4000 mid and low range, while overpricing the high range.

  • @theroyalaustralian
    @theroyalaustralian Год назад +3

    7:35 The PS3 has a 60nm GT 710 actually, not a 7800.

  • @VioletGiraffe
    @VioletGiraffe 11 месяцев назад

    I wish this video were more technical and highlighted the achievements of each generation. For example, the most notable change with the 8800 GTX was not 768 MB of VRAM, it was the unified shading architecture.

  • @themanman7642
    @themanman7642 2 года назад +8

    Never expected a long NikTek vid but I'm not disappointed, this is really good

    • @NikTek
      @NikTek  2 года назад +4

      I’m so glad to hear that thanks!