Games :
Assassin's Creed Mirage - 0:06
Cyberpunk 2077 - 1:00
Senua's Saga Hellblade II - 2:03
Ghost of Tsushima - 3:04
Alan Wake 2 - 4:07
Forza Horizon 5 - 4:56
Hogwarts Legacy - 5:50
Avatar Frontiers of Pandora - 6:47
Horizon Forbidden West - 7:47
Starfield - 8:34
System:
Windows 11
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz - bit.ly/3XlBGdU
RADEON RX 580 8GB - bit.ly/37Trv6S
GeForce GTX 1060 6GB - bit.ly/31b1TAL
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
I hope "bodycam" can be added for testing
Next time add Valorant plz
as a 1060 user, I wouldn't dare to run Alan Wake 2 lol, that thing is already melting the 4090
You can play AW2 in Vulkan mode, about 30 fps avg
😔
I have 4070 and still am scared about buying it lol
@@MultiKamil97 and I thought Starfield was poorly optimized, 😂 at least Starfield is playable on RX 580
@@drew2626 Starfield looks and plays like a 10-year-old game
I remember buying the RX 580 when it was super cheap and selling it for double the price when the mining craze happened
I sold both my 570 and 1060 at the price I paid for them lul
Now u can get a 580 for $100.
Evil
You are what is wrong with this world.
Gtx 1060 : 2016
Rx 580 : 2017
Dont forget that
yet rx 580 was cheaper
@@JOHNTECH112 gtx better
@@Homiealrt nope
@@JOHNTECH112 yes*
@@Homiealrt did you watch the same video💀
I had 1060 6GB until march this year and it was an amazing card. It served me really well.
I’m still running one 😢 If I last another year, will 4090s be CHEAP..er?
@@spacetoilet no
This battle will go on forever...
No battle, Nvidia is years ahead. OK, a few extra frames with Radeon, but the colours on Nvidia are so much better and the drivers work better
@@haggis4952 I love nvidia too...... they're making me rich
@@haggis4952 lmao, you gotta be severely dented in the head to think that a 6GB VRAM GPU has more of a future than an 8GB VRAM GPU
This is the last battle 😢
@@haggis4952 The facts are given, but he doesn't listen and still wants it proved wrong that it's not better 💀
Difference in tsushima is insane though.
Most likely vram limitation 6gb vs 8gb
Yes, in Alan Wake too
The 1060 isn't even at 85% GPU usage, kinda weird, and it also draws very little power. The stupid thing is that the game is NVIDIA-sponsored and has a performance bug on their hardware, even their latest hardware.
@@Sam-cq9bj That's due to the VRAM memory speed bottleneck blocking the true performance of the GPU; add 999MHz to the memory and you'll see a difference in performance and consumption
@@Sam-cq9bj it's a problem with FSR
This was a legendary battle back then....
Still running my Sapphire RX 580 with a 3300X. Gets me decent fps in PUBG and XDefiant. :)
Guys, what's a good budget GPU to buy in 2024 for 1080p gameplay?
buy a used one
New- 4060
Used- rx 6600
If you go on cheap buy used 6600, 6600xt, 2060Super, 3060, or 6700xt
You can also buy new 4060 or 6700xt too
6600/xt
4070s ti would be good for native
A difference of 50 watt is massive
Lord RX 580❤
These 2 masterpieces are still kicking just because they had enough VRAM in their launch days
Excluding Alan Wake which is unplayable at 18fps anyway, the 580 almost always wins, thx, good job
The Legends are back ❤️
1060 never had 110w
The 580 would be able to run better textures; if you put textures on ultra in all games, the 1060 will be asking for help.
I'll remember these 2 video cards. They used to have a huge difference, but in the newest games it's a little bit.
Not really fair tho... the 480 came out at the same time as the 1060; the 580 came out a year later. Also the 580 was to compete with the 1070, not the 1060. A fair comparison for age is the 470 against the 1060, and we know who wins then
The 590 is competing with the 1070, not the 580
Yeah the 580 came out later to compete with 1070 as the 480 tried
That's not true, my fuzzy little man peach @@tanuviel1929
The RX 580 is a 480, just clocked a little higher.
RX 480: 1266MHz
RX 580: 1340MHz
The RX Vega 56 competed with the GTX 1070, not the 580
@@tanuviel1929 Nope, Vega 56.
The RX 590 was just a special-edition RX 580.
The 590 was 12nm instead of 14nm;
that's why it could boost to 1560MHz instead of the 580's 1340MHz.
Other than that it's the same GPU
What's wrong with Alan Wake on RX 580?
The game's not optimized for it. Look at the power draw on the 580, it's about half of what it is in other games.
The RX 500 series has no mesh shader tech, which the game depends on entirely
@@RaulMazuchelli Neither do Nvidia GTX cards
@@EX0007 i know, but they ve alternative tech that can simulate, even that limited, something the AMD RX 500 dont
If you use Linux and a certain launch parameter, Alan Wake 2 will run at 30fps on the same settings as shown in this video, twice as fast as the GTX one
Check videos if you don't believe me
This cards ❤
Top.
GoT looks much worse on the AMD. Maybe that‘s why it performs way better?
lol how do you describe "looks much worse"? looking for reasons?
@@bervirus Just open your eyes... The character and some of the environment look way more blurry on AMD here. And why should I look for reasons for Nvidia? I've owned many Nvidia and AMD cards in my life, so I'm totally unbiased. Just wondered.
@@v1d Do you know a bit about video encoding? Basically, if you encode 30fps and 60fps at the same bitrate, the one with lower fps will look clearer, and the one with higher fps will look more blurry or start showing artefacts, because it has only half the data for each frame.
Anyway:
- You can't compare image quality from a video uploaded to YouTube.
- It has been reported a lot that ALL and ONLY GTX 10 series cards, even the 1080 Ti, are underperforming in GoT, and it's not because they look better or the others (including the Nvidia 16 series) look much worse.
@@bervirus OK, maybe you're right? Like I said, my initial question was just a question. But if I take your word for it, shouldn't Nvidia look worse in Alan Wake 2 because of the way higher fps?
@@v1d Well no, AW2 is just an unoptimized mess, especially for the RX card
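The encoding point in this thread is simple arithmetic: at a fixed bitrate, each frame of a 60fps upload gets half the data of a 30fps one. A minimal sketch (the 8 Mbps figure is an assumption, roughly typical for YouTube 1080p):

```python
# Per-frame data budget at a fixed video bitrate: bitrate / fps.
def bits_per_frame(bitrate_bps: int, fps: int) -> float:
    """Average encoded bits available for each frame."""
    return bitrate_bps / fps

BITRATE = 8_000_000  # 8 Mbps, an assumed YouTube-1080p-ish figure

low_fps = bits_per_frame(BITRATE, 30)
high_fps = bits_per_frame(BITRATE, 60)
print(f"30 fps: {low_fps:,.0f} bits/frame")   # twice the budget per frame
print(f"60 fps: {high_fps:,.0f} bits/frame")
```

So the higher-fps capture really does have less data per frame to spend on detail, which is why side-by-side image quality in an upload like this isn't conclusive.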
Rx 580 all the way
Rx card pc restart problem plz help me
You must have some deep mess. Reinstall your OS and install the latest motherboard/video drivers
I used to have a GTX 1060 4 years ago, then I sold it to a friend, never realizing it's good in the new games... Now I have an RTX 3060 Ti.
90 watt vs 130 watt
The 1060 was stock
both are stock
GTX 1060 6gb on paper Boost Clock 1709 MHz.
but they boost higher usually 1800+
@@zakur0978 mine hit 2050MHz on the core
FineWine™
So Ghost of Tsushima runs better on an RX 580 than a PS4 lol
Both 1060 and 580 are better than ps4 bro 💀💀💀
RX 580 is better than PS4 Pro, PS4 is much weaker
RX580 is literally equal to an XB1 X, the PS4 is not even close
@jonathanxjames8634 Comparing apples and pears. Consoles perform vastly better than a PC with a GPU and CPU of similar specs. For one, the GPU is integrated into the CPU (so much faster buses) and shares the same memory as the CPU (which eliminates huge amounts of copying between RAM and VRAM). Secondly, the 1st party development kits, and (to a lesser extent) 3rd party engines, can target the exact specification (known CPU, known GPU, dozens of special optimizations designed into these custom chips), whereas DX/Vulkan/GL games have to run on a variety of CPUs and GPUs and cannot utilize the same optimizations nearly as well (i.e. cannot be coded nearly as "close to the metal").
@@barneypitt3868 this game says otherwise. First party titles are optimised for consoles.
Starfield is so unoptimized
RX 580 8GB VRAM
Which one is better? None IMO; time for an upgrade my friend.
The RX 580 is almost better, but its lack of DX 12.1 really does hurt it.
Pointless
I remember the early stages of the rivalry, when the 1060 was mopping the floor with the 580. How the times have changed. AMD really leveled up their drivers.
Alan Wake 2 wtf ? hahaha
cpu limited moment
RX 590 😊 Nice 💯
It's a 580 lol, can't you read?
@@EX0007 I think he meant that he owns a 590
i5-12400, 32GB and an RX 580: a great solution for gaming if you don't have the money for a new modern graphics card 😂
I play modern games at 30-40 fps at 1080p. I don't need more for gaming.
Yet the 1060 sold way more than the 580 when it was $100 more expensive
We must Go green
Because of marketing; now the 580 is much more popular than the 1060
8GB Vs 6GB
So happy to go with rx580 over 1060 years ago
1060 years ago wowww you must be old 😲
Alan Wake 2 got beef with AMD 💀💀💀
1060 6gb easy win!
Average 1060 user:🗿🍷
Average rx580 user: 🗿🍷
Bro, try online games: Apex, CS2, Rust, Call of Duty
This video is kinda fake; Alan Wake II doesn't look like 5fps, it's more like 15fps in the video.
For the 580 I sped the video up almost 2x to synchronize it with the 1060
7800X3D and DDR5 6000? This is such an unrealistic setup
It's simply the hardware he has, and this way nobody can complain about a "bottleneck" 🤣
This is a gpu test you realise?
Why would anyone still use mid range cards from 2016 anymore? It's been 8 years so people should have enough money for new GPUs.
If you don't want them, then don't buy these GPUs. Those are for brokies like me.
Because nowadays under-$150 GPUs are way worse than they used to be
Because they are dead cheap and have a tremendous value. Price performance is top.
I'm sure you or your parents are in debt, so it's funny that you're talking.
Even if you're not buying them it's interesting to see how they compare nowadays.
You forgot to mention that the AMD RX 580 has the worst software support and product quality: it restarts your PC while gaming, shows a green screen, and the AMD bug report error pops up. On the other hand, the GTX 1060 is truly optimized
No it doesn't, no problems. Have u ever had one? Or are u just talking sh.., Nvidia fanboy? 😂
Alan wake rx 580 5fps😂
you can get 30 fps if you run it on linux
Performance per watt: Nvidia easy win. For a more realistic bench, compare the GTX 1060 to the RX 480 (same year); Nvidia wins easily again
The release dates of the RX 580 and RX 480 are 8 months apart, and thus the RX 480 was only out for 8 months. The 580 is just a refined version of the 480, except it clocks better, which makes the 580 roughly 6% faster.
The RX is more powerful
The 580 was always a better/stronger card + the 8gb vram will make it last longer
Tell that to Alan Wake 2, Robocop, Remnant..
Yup I’m almost surprised how smooth the Cyberpunk gameplay was
ahahha ALWAYS... starting from the RX 480's release
But for the extra FPS you get, it consumes much more than the 1060.
@@FrEffectsIntroforOffer24 an extra ~34W for 4 extra fps in Cyberpunk at high settings without upscaling is not bad, considering you shouldn't be playing at high settings without FSR on either card
If the parents still pay the electricity bill, the 580; if you have to pay yourself, the 1060 😂😂
Yeah, like 30W is really a difference. Be serious, kid, 30W is nothing; we are talking about a 30W difference, NOT 200W
Do you know how much 30W is worth? It's like a 1 penny difference, be serious...
580/480 will go down as one of the best mid-range graphics cards in history
no
Yes @@delimanyak-y2v
@@somefish9147 The 580 is a power reactor with GTX 1060 performance; the GTX 1060 is better, with better energy efficiency
@@delimanyak-y2v A 30W difference is nothing, dude, we are talking about 30W, not 200W, tf are u on?!
Rx580 is the legend
after all these years the 580 is still better
Alan Wake 2: 💀
Always was better and more stable.
It always was
but the power draw is too big, 180W?💀
I'm using a GTX 1060, but for electrical efficiency I limit it to 60% with peak power draw at 80W I think
@@qwertyuseradmin Optimization issue from the game's side. The 580 is drawing barely half the power it does in the other games.
Can someone explain Alan Wake 2? How is the 580 so low?
The RX 580 consumes more power than the GTX 1060
You can totally notice the 1-4 fps. When I saw this I threw my 1060 in the trash where it belongs.
Why r7 7800x3d 😢
Need a better power supply for rx580
It's only a 30W-50W difference; a 300W PSU is more than enough
@@szympHans I would still go around 500W for other things
@@GioKuster-s8m 500W? 300W is really enough, trust me, even 300 could be too much
@@szympHans I would still get higher for futureproofing
@@szympHans The rx580 Nitro has peak power draw of 240 watts. A 300 watt PSU is nowhere near adequate.
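The PSU back-and-forth in this thread is really just peak-draw arithmetic: add up the component peaks and leave headroom. A rough sizing sketch (the wattages, the 30% headroom, and the helper itself are illustrative assumptions, not a spec):

```python
import math

def recommended_psu(gpu_peak_w: float, cpu_peak_w: float,
                    rest_of_system_w: float = 50,
                    headroom: float = 0.30) -> int:
    """Peak system draw plus a safety margin, rounded up to the next 50 W."""
    peak = gpu_peak_w + cpu_peak_w + rest_of_system_w
    return math.ceil(peak * (1 + headroom) / 50) * 50

# e.g. an RX 580 Nitro+ spiking to ~240 W alongside a ~90 W CPU
print(recommended_psu(240, 90))  # 500
```

On those assumptions, the "go around 500W" camp lands about right for a Nitro+ class card, while a 300W unit only pencils out if the card really stays near its undervolted draw.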
GTA 6 on these cards will run at 5FPS
Will be good trust me
nah 30 fps with fsr
if it uses the 12.2 API, then yes
@@superhitragini yeah that's cope lol, did you even see the performance on "last" year's alan wake 2 lol, maybe it will run "next" year's gta 6 at 30fps if it was 480p...
@@Shockload It's not cope, I think he's actually right on that. If GTA 6 runs on an Xbox Series S, it will run on the 580. Besides, @m8x425 is right: if GTA 6 uses DX Ultimate (12_2), then all the 580 users are doomed.
Gtx 1060 🎉❤
Btw, just to let u know, these two are at different resolutions; some games use FSR and some don't
Alan wake 5 fps 😂😂
Check again, it got optimized; it runs 30 fps avg in high-demand areas on a GTX 1060 with FSR2 at native
Boss, are you using DirectX 12 or 11?
Alan Wake has a grudge against AMD; they didn't optimize the game at all
They're not playable anymore anyway. I don't recommend buying them. A 6500 XT at minimum
You can clearly see the difference in Alan Wake 2: raw GPU performance. In other titles the RX 580 wins simply because it has 2GB more VRAM.
The patches benefited the GTX cards more, and it doesn't use raw performance.
The games that do take advantage of the cards' full raw performance are Ghost of Tsushima and CoD MW.
gtx 970 vs rx 580 please
Is that the Aisurix RX 580? If not, is it just the same as the Aisurix?
No, that is the original 2304SP RX 580, and from the GPU & mem clocks, I think it's the Sapphire Nitro+ one.
Better than GTX 1650
When the 1060 came out, it ran circles around the 580. Now it's the opposite
King Rx580
Rx 580 8gb is better than gtx 1060 6gb
Alan wake 2 5fps😶
Ghost of Tsushima is proof that Nvidia just pulled an "Apple":
they intentionally slow down old GPUs via drivers so people will buy newer GPUs.
In the past, the GTX 1060 was always better than the RX 580, right from when both first came out
Then explain Alan Wake 2 being much worse on AMD. But the answer is simple: more VRAM is why the 580 was better; 2GB really makes a difference. If both had 6GB, they would be tied, or Nvidia a little better.
@@andrwsarmestrong1170 Check how Linux and a launch parameter made the RX 580 run Alan Wake 2 at 30fps 1080p
@@PanchoGDMKWii Simple: Linux uses the Vulkan API, which is better than DX12 and even better on AMD, apart from Linux itself being lighter than W10. But we are talking about Windows here, not Linux.
@@andrwsarmestrong1170 I know, but it's worth letting people know
Perhaps the other way around too. AMD is known for updating drivers long after the release to make GPUs faster.
What many don't get is that the 580 needs manual tweaks.
Undervolting: set power to 110-120W at stock clocks, or allow OC.
There are GDDR5 memory timing optimizations available for free from the mining folks, which improve bandwidth by a fair margin.
If the PC supports ReBAR, the 580 can also reap the benefit of it in Cyberpunk, Forza Horizon 5 & other games
Resizable BAR*
It's still marginal gains; it's a very mature card, and you're going to be lucky to squeeze out another 10-15%, which won't really make a difference.
By the same token, the 1060 can be tweaked.
@@ishiddddd4783 exactly.
@@ishiddddd4783 I have never seen GDDR5 memory timing optimization nor ReBAR support for the 1060. On UV: every GPU can be undervolted. A 10~15% gain would result in a net 15~18% gain over the 1060 when everything works OK
The RX 480-580 can't compete with the 1060 6GB on performance/efficiency.
If you lower the clock speed and voltage to match the 1060's power draw, it will become slower.
But it's true that the RX 470/480/570/580 always run at 1.15V, and usually you can decrease that voltage to 1.06, on some even lower at 1400MHz.
You also have to understand that the RX 480/580 is less efficient than Vega 56 when it comes to performance/efficiency, I think.
The best solution is power limiting it to, let's say, 120-130W, and then trying to lower the voltage so the GPU can boost higher when the game uses fewer cores.
But even with that it just can't: the GTX 1000 series were the most efficient GPUs, and AMD couldn't compete with that efficiency.
I'm not talking about mining... no idea there, btw
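The efficiency argument above boils down to FPS per watt. A toy comparison (all FPS and wattage figures here are made-up illustrations of the reasoning, not benchmark data):

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt drawn."""
    return fps / watts

# Illustrative assumptions: similar fps, very different board power.
stock_580   = perf_per_watt(60, 180)  # assumed stock RX 580
limited_580 = perf_per_watt(55, 130)  # assumed RX 580 power-limited to 130 W
stock_1060  = perf_per_watt(55, 120)  # assumed stock GTX 1060

# Power limiting improves the 580's efficiency a lot, but under these
# numbers the 1060 still comes out slightly ahead.
print(f"{stock_580:.2f}  {limited_580:.2f}  {stock_1060:.2f}")
```

Under those assumed numbers, tuning closes most but not all of the efficiency gap, which is the commenter's point.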
Do the same with 1650 vs rx 470/570 vs 1050 ti
Rx 570 is faster
The RX 570 is about 40% faster than the 1050 ti and 15% faster than the 1650.
First give me a like
👎
I didn't know the RX 580 uses that many watts
just undervolt
@@KHALID47 I have a system with a 580 and (for my specific model at least) it doesn't have much undervolt potential; even if I managed to undervolt it to a point where the difference in power consumption is actually noticeable, it would then lose a substantial amount of performance.
@@fourty9933 I undervolted from 1.18V to 1.09V, which is like 90 mV
>130W
>A LOT
Lmao, it's still less than or the same as the other budget cards of recent years: 2060 (160W), 3060 (170W), 4060 (115W).
The original TDP for the RX 580 is around 170W and for the 1060 it's 130W. I think the author has heavily undervolted both cards.
@@fourty9933 don't confuse undervolting with underclocking
1660Ti vs 1060
Would love to see the RTX 3060 12GB vs 4060 8GB and the 6700 XT, and their performance margin a decade later, in the near future :D
Yeah I don’t see the 4060 aging nearly as well as those 12GB cards after a decade
@@drew2626 sorry, my mistake, what I meant to write was half a decade*
@@mindmazejoyed even then, 2028, or five years after its release, doesn't sound great for the 4060
@@drew2626 ikr but nvidia tries to convince us otherwise with their overpriced new "60" tier gpus
@@mindmazejoyed after less than a year my 4060 Laptop has started giving me so many problems with different games (with certain games like Cyberpunk and Metro Exodus, which used to run great, now being unplayable)
Is the GTX 1660 Ti still good?
Yes, my card ❤
After all the tests, it's safe to announce the final winner is
the GTX 1070!!!, with which you get the best of both worlds: better DX12 and driver support, and 8GB VRAM! Both the GTX 1060 and RX 580 have their own fatal drawbacks, with the former having gimped 6GB VRAM and the latter outdated driver support; neither is recommended by today's standards.
No, the real winner is the RTX 4090. Because if you take price out of the equation then the fastest card on the market will obviously win.
@@oxaile4021 Current prices of the 1070, 1060 and RX 580 are $95, $75, and $60 respectively. I think it's worth spending the extra $35 to go from the RX 580 to the 1070.
@@withcarename All garbage; just save for a used 2060 Super at $130. It performs a little better than the RX 6600, but with DLSS and RT.
Fair
except that the GTX 1070 still runs slower in Ghost of Tsushima, and it's almost twice the price of an RX 580
Rx 580 was always a better card
The RX carries
Who would've doubted it!
Rx 580😊
I think if you're comparing in 2024, you should use current games, because old games don't have many players anymore; it doesn't make sense to test on them.
he is testing the latest games
He literally tested only the latest games