I might have commented this already on another video, but as someone who has a higher-end card but is currently playing on a GTX 960 4GB and an RX 6400 PCIe 3.0, I've learned that as long as I have driver support and a low-heat-output PC that keeps FPS over 65-70, texture quality doesn't matter too much. Thank you for making GTX 900/Maxwell owners feel valid
You're absolutely welcome. Maxwell has aged incredibly well🙌
It's amazing how cheap these flagships and high-end cards in general are nowadays; something that was at one point the absolute best you could buy for a considerable amount of money is now a budget option.
Hell, I'm at the point where a card like the 1080 Ti I dreamed of as a kid is slowly being considered budget-oriented, while the 1060 is considered a proper budget option
Look at how people look at the 4090 right now. People will pay nearly $2000 XD.
When the 7070 Ti rolls around, we'll look back at the 4090 like how we look at the 980 Ti now LOL.
It is a decade old...
It's only because the improvements have stagnated that anyone even remembers it exists
@@myne00 Stagnated? The 10, 30 and 40 series have made absolutely massive leaps in performance
The GTX 980 was overlooked because of how good the 970 was. Usually there is a sizeable gap in performance between the 70 and 80 class. However, if you were lucky enough, you could quite literally turn your 970 into a 980 by overclocking it to around a 1500 MHz core clock. I had an ASUS Strix 970 myself and was lucky to do just that. You could also buy the likes of the Gigabyte Extreme Gaming and not bother overclocking at all. The 970 was a one-hit wonder in that regard, never again.
Also, the PS4 did run God of War natively at 1080p 30fps; the PS4 Pro had the same textures, better effects and higher resolution, but with checkerboard rendering at 30fps. The 60fps Performance mode enabled dynamic resolution scaling, but the Jaguar CPU dropped the ball as always. The PS4 Pro's GPU is an RX 480 with lower frequencies, which puts it in the ballpark of the R9 280X you reviewed; the 380X rehash also had 4GB of VRAM. So the 980 getting a 65fps average at native 1080p sounds about right; it is faster than the 280X/380X and not bottlenecked.
This card can still handle most recent titles, but it also depends on the target resolution in my opinion. For example, if the monitor's resolution is below 1080p, or 900p at 60Hz, the card can offer great performance on low-medium presets. But again, if someone wants to purchase the card for around 30 to 40 bucks, it's worth it even in 2024
It's crazy, I remember when the 980 was the top of the top... and everyone praised the 980 like today's 4090
I recently built a PC with an i5-6500 and a GTX 960 4GB, and I am able to play Valorant at 240 fps on average, so for esports titles at 1080p those old graphics cards still kick in !!
Old cards are perfect for budget eSports machines🙌
My fiancée's computer, built from old parts of mine, is a 5600X / GTX 1070 combo and it still runs most everything at 1440p with 40+ fps. The only things that give it much issue are the most modern games with high-end volumetric clouds/effects/lighting; those usually need to be turned down to medium or less.
Should tell them to upgrade the GPU cuz it's bottlenecking the 5600X; the 5600X could be paired with something like a 6750 XT or 3070
@@AquibDoki She just got the 5600X because I took it out of my old setup, just like the 1070, when I upgraded to a 7800X3D to pair with my 3090. I know it bottlenecks, but the most modern games she ever plays are Baldur's Gate 3 & Borderlands 3, and it still performs at 1440p highest settings with 70+ fps, so no reason for us to upgrade now. If she ever needs more, we'll definitely upgrade
In general, a lot of games from the past few years need an 8GB GPU to run at 1440p. And I think people should be getting a minimum of 12GB for 1440p if they are buying a GPU in late 2024, and a minimum of 8GB for 1080p. But, as recent testing with a 2060 6GB has shown me, 6GB is probably going to be OK for a couple more years. I think you're going to be turning settings way down in GTA VI with a 6GB card, at 1080p.
4GB GPUs are really only good for older games and esports in 2024, with few exceptions. Hogwarts can run on a 4GB GPU, but just barely. Same for Starfield, which can run on a 6GB card at 1440p with medium-high settings. Cyberpunk is fine on a 4GB card at 1080p and a 6GB card at 1440p.
Fun fact, the 750 Ti is actually a 900 series GPU. But if you only have 750 Ti money, you should be buying a Quadro K2200, which is basically a 4GB 750 Ti and is more useful in 2024 due to the VRAM. It still has driver support because it is also a Maxwell GPU. It is clocked a bit lower than the 750 Ti, but nothing a little OC with Afterburner can't fix. Hypothetically, the K2200 has better-binned silicon, but whether you can get one running more powerful than a 750 Ti really depends on how the GPU was used originally, which is highly variable for used enterprise hardware. It is also the cheapest way to have NVENC support.
I have an i7 4790 and I've upgraded my GTX 970 to a 980 (a $7 upgrade overall) and I'm very happy with this card
Maxwell and Pascal were Nvidia's fine wine fr
100%, Nvidia did everything right with these generations imo
Great video. If I may, I'd suggest adding a frame gen benchmark for Lossless Scaling. Even x4 frame gen plays fine
Damn... I am old... I remember not long ago the GTX 980 being a beast and the new thing...
I remember that too. I even remember when Nvidia released the 780 Ti🤣
i bought this as a second graphics card to pair with my 4080.
I needed another graphics card so that I can handle multiple different refresh rates across multiple monitors (one pair running at 240hz and a third running at 60hz)
I am currently using a GTX 960 4GB and it runs my normal games like Minecraft or Black Mesa just fine.
this is basically what an RX 580 is, so no one really forgot it
they still use it basically when they buy a 580
HUGE ANIME BREASTS
@@KokoroKatsurawtf 💀
I sold my 1050 Ti, then got a 1650 Super; sold that and got a GTX 1070; traded up for an RTX 3070; sold that and got an R9 280X for £20 just to get the monitor to come on, as I have an i5-11400F which doesn't have integrated graphics.
Gamed on the R9 280X for several months - Fortnite, GTA 5 etc - and it was usable.
Using an RTX 2070 Super now and it's great for 1080p gaming.
Keeping the R9 280X as an emergency backup. Best £20 I ever spent, great card for emulation too.
All the influencers are already talking about the RTX 5000 series and Bro is going completely the other way.
If you are willing to compromise a bit and you really want to play something, there are also options with 4GB VRAM and an old card. Although the RTX 2000 has already aged better, on the feature
*Compare the GTX 980 vs the RX 6500 XT to see how well it stacks up?*
Sub-180W, compatible with retro XP dual-boot builds. This is a great GPU, just a tad faster than the legendary 1060 6GB.
Yep it sure is, it's aged quite well too despite its 4GB of VRAM
awesome as always
Thanks man!
I was thinking about swapping my Sapphire Nitro RX 580 8GB for my friend's ASUS Strix GTX 980 DirectCU II OC because his case isn't suitable for the size of the graphics card. Will there be any decrease or increase in performance? I'm currently using a Ryzen 5 5600
The RX 580 should be faster
How does the GTX 980 compare to the Ryzen 5 7600 iGPU with a 4GB frame buffer set for it?
It will smoke it. The Ryzen iGPUs are pretty weak. I could make a video on it to be fair
@@ProYamYamPC even the 780M in the 8700G?
@@gejamugamlatsoomanam7716 I imagine it’s close. But regular non-G CPUs wouldn’t have a chance
Do the GTX 1060 3GB, it performs better than most 4GB cards for some reason
Nvidia actually cared about performance back then. Pascal saw incredible gains
Wow nearly a decade old!
Plz do the RX 7900 GRE or RX 7800 XT vs the PS5 Pro, can we skip it?
I can do that easy. The 7900 GRE will smoke the PS5 Pro
Man I have a 4070S and I’m already thinking of upgrading to the 5080 when it comes out 😂
I'm still using a 1050 Ti
The GPU I used in my first custom gaming rig
@@ProYamYamPC I got mine used from a random old PC someone gave me
Had this
I use the 980 Ti daily
Got it used for $100 back during the black plague
Also, be mindful of different settings using more or less of the CPU. You're saying "oh, the GTX 980 got this," but your CPU was also running double duty compared to the low test.
Awesome GPU, one of the goats imo
You technically needed 12GB since Atlas (ARK: Survival Evolved's pirate sister)... however, it's fine if you don't have 8GB for most games, as long as you're capable of playing games at 30-ish fps without all the eye candy
I'm surprised how well 4GB kinda holds up, considering the cost of a card like the GTX 980. If you stick to esports and older games, 4GB is perfectly fine
For older games, 4GB and 6GB cards work great. For newer games, even 8GB is starting to not be enough, so you need at least 12GB
I've noticed 8GB is still OK if you're sticking to rasterisation. However, that doesn't excuse Nvidia for releasing $400 8GB GPUs with a meager 128-bit bus.
On topic, Maxwell is so good it's the reason that when it came to me selling an old graphics card, I sold my 1070 and kept my 750 Ti 4GB 😁
Totally off-topic, I've just posted a massive YT comment on another channel regarding PCI-E 3.0 x16 vs 4.0 x16 and 4.0 x8 for cards like the 3090, 4090 and now the 5090, given we know it will be approx 40-50% faster than the 4090. Summary of my comment elsewhere: PCI-E 4.0 x16 will be enough for the 5090, and PCI-E 3.0 x16 or PCI-E 4.0 x8 can do the 5090 with about a 10% reduction in fps!
Just thought folks here might also like to know now that I've finished researching it all!
Cheers 🍻
Up to the RTX 4090, every card will still perform the same regardless of whether it's on PCIe 3.0 or 4.0. It's possible that you'll lose a few FPS in some titles with a 4090, but that's almost nothing more than measurement inaccuracy, it's that small. I've had the same experience myself with a 5800X and an RTX 3080. I changed the motherboard from X470 to X570, so PCIe 3.0 to 4.0, and there were no differences.
@@hrbt78 It's really not margin of error; you have to understand which tests to perform, as not all tests max out the PCI-E throughput whereas other use cases do - and that even applies to the 3090 on PCI-E 3.0 x16. On that card you lose 6% in most games, but in some games that actually saturate PCI-E it's even more than that.
It's for sure not margin of error, as it comes down to how each application/game engine uses PCI-E.
Btw, the 3080 doesn't fully saturate PCI-E 3.0 x16; even the 3090 only just does, by about 6-10%, and the 4090 does by about 10-15%.
The 3080 Ti and 3090 are about the point at which PCI-E 3.0 x16 can be fully saturated, and for the 4000 series it's about the 4070 and up, e.g. 4070 Ti, 4080, 4080 Super, 4090.
Games with lots of grass really show up PCI-E throughput saturation 😉
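For anyone wanting to sanity-check this thread, the theoretical link bandwidth figures are easy to compute from the published PCIe spec rates (whether a given game actually saturates them is the empirical part the commenters are debating). A minimal sketch, using the spec's per-lane rates after encoding overhead:

```python
# Approximate usable PCIe bandwidth per lane, in GB/s, after encoding
# overhead (8b/10b for gens 1-2, 128b/130b for gen 3 onward).
PER_LANE_GBPS = {
    1: 0.25,   # 2.5 GT/s * 8/10
    2: 0.5,    # 5.0 GT/s * 8/10
    3: 0.985,  # 8.0 GT/s * 128/130
    4: 1.969,  # 16  GT/s * 128/130
    5: 3.938,  # 32  GT/s * 128/130
}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

# The three configurations discussed in the thread:
for gen, lanes in [(3, 16), (4, 8), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_bandwidth(gen, lanes):.1f} GB/s")
```

This makes one point in the thread concrete: PCIe 3.0 x16 and 4.0 x8 both land at roughly 15.8 GB/s, which is why they tend to behave identically in testing, while 4.0 x16 doubles that headroom.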