I had to rewind a LOT on this one. I really struggled to concentrate on what you were saying with such interesting and gorgeous gameplay footage. Then again that is probably due in part to my literal ADHD lol
I have a 4GB RX 470 and I have tested the RE3 demo. I just set textures to High (1GB) and shadows to High. I also used DX11 to get a bit more FPS. Smooth gameplay at 1080p. Setting textures to High (8GB) just uses more VRAM so that textures don't need to stream as much. For example, if set to High (0.5 GB) you'll see more texture pop-ins, like a blurry texture turning into a sharper texture right before your eyes. Why is that not discussed in detail in this video? Is that Tim's responsibility?
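The pop-in behaviour described here is essentially a cache-budget effect. A toy sketch in Python (hypothetical texture sizes and an assumed LRU eviction policy, not Capcom's actual streamer) of why a bigger texture budget means fewer blurry-to-sharp transitions:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture streamer: when the VRAM budget is exceeded,
    the least-recently-used texture is evicted and must later be
    re-streamed (seen on screen as a blurry-to-sharp pop-in)."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB
        self.pop_ins = 0

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)  # cache hit, no pop-in
            return
        self.pop_ins += 1                    # miss: texture streams in blurry first
        self.resident[name] = size_mb
        while sum(self.resident.values()) > self.budget_mb:
            self.resident.popitem(last=False)  # evict least-recently-used texture

# A larger budget (the "High (8GB)" setting) keeps more textures resident,
# so revisiting an area causes fewer pop-ins than "High (0.5GB)".
small, large = TextureCache(512), TextureCache(8192)
scene = ["wall", "floor", "zombie", "door"] * 3  # revisit the same assets
for tex in scene:
    small.request(tex, 200)
    large.request(tex, 200)
print(small.pop_ins, large.pop_ins)  # small cache thrashes, large one doesn't
```

With the 512MB budget every revisit cycles a texture back in, while the 8GB budget keeps everything resident after the first pass.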
Hi. What happened? The RX 5600 XT is so slow compared to the RX 5700; in the reviews they were so close. Did you use the reference clocks or the new clocks?
For those who want a higher resolution than 1440p, something close to 4K, 1800p is a pretty nice bump in quality. And to get more detail they can use RIS if they are on AMD, or the equivalent sharpening filter Nvidia offers.
Going from these results it looks like my old GPU is still fine for 1080p gaming, at least until the next-generation consoles come out and devs can start upping the general graphics quality in upcoming titles (the rumoured remake of Crysis notwithstanding - that is a series made for melting graphics cards).
@@hououinkyouma1458 I mean you should compare it to the new upcoming PC hardware. But when you add paid online, expensive games, and no DisplayPort support (which would force you to buy a very expensive OLED just for the HDMI 2.1, if you want to fully get the "4K 120 fps"), they are not a good value.
I woke up and the first thing I saw when I opened YouTube was that thumbnail. Long story short, my brother is mad at me for laughing too loud and waking him up.
@Salt Maker I guess that makes sense; the in-game menu gave me warnings when I maxed out textures because it crossed the limit. I thought it was a demo thing to have the option for crazy settings.
Radeon VII at $550 new on Amazon back in Jan 2020, plus 3 months of Xbox Game Pass, was too hard to pass up as someone who plays at 4K. Despite being EOL, it's still new on Amazon at $599. Really competitive at that price for 1440p+ users.
On the thumbnail, we can see Steve after benchmarking for 16 hours straight with no food and no sleep.
They've been upping their thumbnail game these days.
That's me with the cheese block at 2am.
Monitor Steve has been released!
Jarrod, which output does he use?
HDMI, DP, or DVI?
Plot twist: they just zoomed into the monitor
What if... WE are in the monitor with him...?
7:18 The Vega 64 bar is green... Steve, you have failed us again. A full re-edit and reupload seems to be the only way out this time.
It wishes it was a GTX 1080 Ti.
Be careful of what you say, or else it'll use GN's powermod and actually get there.
@@Hardwareunboxed still beats 1080 right?
Just bought this Strix Vega 64. I am not disappointed.
@@RKroese Now imagine if you bought a good AIB card like Nitro+ :D
That thumbnail is amazing
I expected an April Fools' joke telling me that the GT 1030 would smash them all :(
My 9600GT feels your pain
Playable on nforce
My Radeon 9800 Pro feels you
The DDR4 or the GDDR5 version??? LOL!
@@bigal2688 The DDR3 one to rule them all.
To the owner of the GT 1030... Steve has tested this just for you! Those painful 6 frames show dedication to testing.
I tested this demo with a GT 1030 myself, but ofc no one with a GT 1030 will play games at max XD. The game is very playable on low settings at 900p; I get 50-ish fps.
@@zayd1111 Use the sharpening filter and Nvidia hardware upscaling if you have a 1080p display, it saves lives.
@@zayd1111 an improvement of literally more than 8x
My guess is upping settings just gobbles up vram. Anyway this is why we need low settings testing. People assume a gt 1030 wouldn't be playable ever.
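For anyone curious, that "more than 8x" is just the ratio of the two reported numbers (the 6 fps max-settings figure is from the video; the ~50 fps low-settings figure is the commenter's own, so treat it as approximate):

```python
# Rough check of the "more than 8x" claim for the GT 1030.
max_settings_fps = 6    # video's max-settings benchmark result
low_settings_fps = 50   # commenter's reported low-settings/900p figure
improvement = low_settings_fps / max_settings_fps
print(f"{improvement:.1f}x")  # prints 8.3x
```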
That's his way of telling people still on a 1030 or 1050 to get a job.
@@Hobbes4ever Ahh yes, the first world presumptiveness is strong in this one
Loved the thumbnail 🤣
I'm glad I haven't sold my 1080 Ti; what a monster of a card it still is.
Nvidia's best work
Indeed. It took AMD 2 years before they could even beat that card.
For sure it was the last GPU from Nvidia that was 100% worth its money. At least in my opinion.
Well, AMD's main focus is servers and desktop CPUs. Now they are gunning for the GPU side. I doubt AMD will win, but it'll be competitive again.
Waka Flocka Project Performance-wise, I'm quite certain Nvidia will win. Price-wise however...
This is a monstrous amount of testing done. Are you even human?!
Cheers, mate!
1080Ti legacy continues. What a card.
@KarlKlaus vonS Pls read my comment here (the last one) regarding your 970 and other Maxwell gpus. www.techspot.com/article/2001-doom-eternal-older-gpu-test/
Looove my 1080 Ti. 4k gaming, baby!
I LOL'ed hard at the thumbnail :D
I would like a Hardware Unboxed t-shirt printed with the thumbnail pls
1080 Ti is one mighty GPU. I think this is the best value card NVIDIA ever released. It was sold for $700 3 years ago and it performs better than a $700 card today.
I saw the thumbnail and my brain thought "huh, they're playing RE3 on a rig with 34 graphics cards".
My brain can be very stupid.
No, You just need to plug in power cord to your brain. It got loose.
Brain latency sucks man. You need to download faster brain cells. Maybe 3600mhz in dual channel.
@@SuperSilvi1990 shit I can't afford that kind of upgrade
@@benmac7552 damn dude, I'm so sorry
Hardware Unboxed is one of the few channels out there who provide benchmarks from many different cards from both Nvidia and AMD and that's why I love their benchmark videos, the numbers don't lie.
It's crazy how strong the 1080 Ti still is. Anyone gaming at 1440p with that card will still be set for another 3 years.
My 1080 Ti has to be the best GPU buy I've ever made. Three years later and it's still a top dog.
My 12MB Voodoo II card was not listed?? 🤔
Thank you so much for keeping the VII in your benchmarks!!! 👍
*Waves Radeon VII flags* Always nice to see the Radeon VII climb the performance ladder as resolution increases. I kind of wish Steve would test the card at 1200MHz memory, as this seemed to be a really common overclock for the memory and is the max AMD will allow. But I won't be holding my breath for that. I think most reviewers would like to drop that card from their test suite ASAP, unfortunately.
That's my dream card.
@@techbuildspcs Same here, and lots of people still want that card, regardless of what some people say.
Luke Himself yep, it’s gonna age well. I mean it already has lol.
@@LukeHimself I like mine ^_^
@@OverlordActual See, would you SELL IT for a 5700XT? How about a 2070 Super?
I really doubt it, really, really, doubt it..
But "it's not for gaming" lmfao...
So the 1080 Ti gets 116 frames at 1440p, putting it a smidge behind the 2080S (but it OCs better than the 2080S, so they're basically the same card if you tinker). That's insane; used Tis are the best deal on the net atm.
@@Patrick73787 The Super OCs badly from what I've seen; it certainly can't match the very easy 10% uplift even a reference Ti hits, and this is often 12% with a bit of tweaking. Check GamersNexus for Ti overclocking. Here's the 2080S performing awfully, frankly, for a card that came out 2 years later than the Ti: i2.wp.com/babeltechreviews.com/wp-content/uploads/2019/07/Untitled-3-11.jpg?resize=696%2C331&ssl=1
Idk why, but this channel may have the best camera fidelity I have ever seen on YouTube.
Wow, I remember the 1050 leaving the 560 in the dust, but now...
Same with the 1030 vs 550
Or 1060 vs 580
Or 1070ti vs Vega56
Let's not even talk about older generation GPUs
You know, i think there is a reason why i always recommend AMD for the long term
Never buy low end Nvidia.
@@SuperSilvi1990 it ain't low end
@@danielkatanaofficial A 1030 and a 1050 aren't low end?
But in all honesty, Nvidia's low-end cards just don't have enough VRAM.
Nvidia is still relevant for its more polished drivers, but AMD provides more long-term value as time goes on; its raw power is plentiful but hindered by its own drivers.
Man the 1080 ti is such a beast.
Well, my Sapphire Pulse Vega 56 is still kicking these days...
Woah! 1080 Ti owners are sitting pretty right now. That card is still absolutely monstering the competition.
Radeon VII results are impressive; the card was $500 a month ago. For that price it's unbeatable.
But considering it's often on par with or even below the 5700 XT in other modern titles, it's not that great an investment...
@@joerg385 Yeah, I was thinking about that before, but the Radeon VII comes with 16GB of HBM2, which is not something to be thrown away even for gaming. It's a better card than the 5700 XT in every DX12 or Vulkan title at 1440p or 4K, which leads me to believe that in the newer AAA titles (most of them will support DX12 or Vulkan) the Radeon VII will be better and it will last longer. Of course, only if AMD keeps supporting it with drivers, which is the most important thing.
@@joerg385 1440p or 4K is where the card shines; 1080p doesn't stress it enough.
I was not interested in this video. But the thumbnail... I couldn't resist. It's too glorious.
Thanks mate, I'm glad we were able to get you here :D
Great video as usual; please add the 1080 Ti.
We need a Performance & Graphics optimization guide for Resident Evil 3 Remake from Tim.
Refer to the RE2 guide ;)
Also use DX11 to get more frames on every card; for some reason DX12 performs worse in this title.
@@BombaJead Agreed. I seem to get more fps and lower CPU usage in DX11 compared to DX12, where it's lower fps and higher CPU usage (in the demo with the latest drivers).
@@Hardwareunboxed Bannerlord when? Thank you so much for your work tho!
@@Hardwareunboxed On 02-04-2020 there's a new AMD driver specifically for RE3... www.amd.com/en/support/graphics/amd-radeon-5700-series/amd-radeon-rx-5700-series/amd-radeon-rx-5700-xt
I wake up randomly at 5:30 am and watch some YouTube, and all of a sudden Hardware Unboxed uploads videos.. Happens all the time lol..
Hopefully Steve won't turn into a Benchmark Zombie after two Resident Evil games :)
Resident Evil 3 came free with the purchase of my RX 5700, so it's nice to see that the game runs well on it :D Can't wait to play when it's out.
1080 Ti the beast.. waiting for the next-gen GPUs for 4K.
Salt Maker It’s now starting to age for sure. But definitely not like a burnt turd wtf you talking about?
I can't stress enough how much more appreciated the new content is right now especially in this period, thank you guys.
Purchased a Vega 56 Strix (fixed the thermal pads myself) and it's still a much better purchase than a 2060 or 5600 XT, which cost $100-150 more in my country.
Looooove the hard work you guys do with these benchmarks.
Good to see the older Pascal cards can still handle 1080p and 1440p. Nothing to be ashamed about at 4K; those GPUs are out of most people's reach.
That thumbnail is a work of art. Basically forced me to like the video before even watching it! Thank you Steve!
The overclocked Radeon VII is easily the second fastest GPU when it comes to Resident Evil 3. I get better performance than a 2080 Super, and the 5700 XT is well behind.
Seems a bit ironic that someone called "Bang4BuckPC Gamer" is pointing out that an overclocked Radeon VII beats a 5700XT! XD (but he does make a totally valid point).
@@syncmonism Yes, I love my VII, but 'Bang for Buck' isn't it lol
It's a tinkerers and enthusiasts card that barely qualified as competitive when it came out.
Only if you water cool the card. Otherwise its OC is kinda shit. I own an MSI RVII, which has a shit die with a stock voltage of 1120 mV /barf. It def needs to be water cooled in order to undervolt more and get more performance.
@@sparda9060 The VII is all silicon lottery.
I was lucky enough to get one that can run stable at 990mV without even a washer mod. It's quieter than my old GTX 970, and I reckon with a repaste/washer it would be pretty damn silent.
But at stock voltage it's unbelievable. It runs hot and loud and heats the room so much that you don't need heating in the winter.
Member when every game had a demo?
Good times, more honest and consumer-friendly. Good times...
My 1080 Ti just holding up very well in this game.
The GeForce GT 1030 is pumping out some slide show performance right there!
i imagine there's powerpoint effects between every frame
Looking forward to playing this game
I think I've gotten quite my money's worth out of the 1080 Ti that I bought in 2017. Only now with Warzone do I feel I need to upgrade, as I can't get much more than 100 FPS at 1440p with almost all settings on their lowest.
The 1080 Ti looks quite strong in this title, beating the RTX 2080 at 1080p and matching it at 1440p ;)
Even though I don't really plan on playing most games or upgrading my GPU, I really enjoy watching your benchmark videos.
Great work Steve!! Keep it up.
RX580 gang, where you at!
Sorry I went to make fun of my roommate with his 1060 and he's a big fan of RE2 and 3 remakes.
Yup, I remember running RE2 at max settings preset 1440p at locked 60 fps, I'm sure it'll be the same!
$115 580 8GB, I'm still happy I made that choice!
I got an R9 390X, does that count?
@@beastestusernameever I am sure the 390X probably kicks the GTX 980's ass now.
@@beastestusernameever That's the cousin, old Vega, close enough!
My RX 480 from way back in 2016 has really been a good investment. Probably going to wait until the next big AMD card to upgrade to 1440p.
1080ti still looking like a great card here!
@Salt Maker What? No they are great in every game. I upgraded from a 1080ti to a 2080ti and while it obviously is faster, the improvement is not mind blowing or even noticeable in any game other than RDR2.
There is no game my 2080ti plays well that my 1080ti didn't play well either. Well maybe RDR2 at high resolution is literally the only exception I found
@Salt Maker Your response was "In non demanding games yeah" to which I responded that this is not true at all.
I don't need benchmarks thanks, I obviously already watch them. I am more interested to know what demanding games the 1080ti is struggling with?
As for a CPU bottleneck, no I would say my 3700x isn't bottlenecking anything, so not sure why you even went on this tangent?
My 2080ti is obviously putting out a lot more FPS than my 1080ti was but the difference between 120 - 140 fps isn't exactly noticeable unless you are monitoring the fps.
@Salt Maker I didn't argue that, you are presenting a strawman argument so you are arguing with yourself there.
You are aware of the channel I commented on... right? You do realise they do a LOT of benchmarks... right?
I am well aware the 3700x will not produce as high FPS as say the 9700k (more often than not) but I knew this going into it (as I watch benchmark vids) and I didn't care as I wanted Intel to have competition and not give them any more of my money. I run 3600mhz ram btw...
You don't seem to understand the difference between hyperbole and quoted stats, as per my example of the FPS difference. There is no bottleneck in my system, you are way off there.
You didn't answer my question and have gone on a tangent that had nothing to do with my original comment or your reply.
Again, what demanding games does the 1080ti struggle with? (other than RDR2 as I myself acknowledged).
@Salt Maker Moron, you can't even answer basic questions, you don't understand hyperbole and you clearly don't read comments or you would have seen I clearly stated the ram I use well before you responded.
You wouldn't answer the only question I repeatedly asked which was what demanding games the 1080ti struggles with because you know your initial response was stupid and you cannot defend it.
I have encountered some idiots on here but fuck me....
Now go troll someone else kid, I am most definitely done with your dumb ass
That’s an incredible amount of testing. Hats off to you mate. 👍🏻
1080Ti what a gem. Skipping 3000 series as well at this rate.
Now what we need is a Hardware Unboxed optimization guide from Tim!
I am so proud of my GTX 1080 Ti even after 3 years of happy gaming!!! The best Nvidia GPU ever!!!
@Salt Maker Are you really that stupid??? It's the best GPU for when it was launched!!! Duhhhh.. Of course after 3 years the newest and greatest, aka the 2080 Ti, will be faster!!! The 1080 Ti was way ahead of its time, that's my point!! Even after all these years it's only beaten by the 2080 Ti!!! By the way, I game at 240Hz/1080p... who gives a fuck about 4K...
Nicee. Thanks for the benchmark scores. Can't wait to play the game now :)
Really love the voice of Steve.
And yes, I'm a man, but I can hear him clearly and my ears are not 100%.
Still love my Radeon Vega 56 blower.
For 280 euros I got a nice card.
And with FreeSync for the lower fps it's still working great, and then there's the option of Radeon Image Sharpening.
Steve, did you try Radeon Image Sharpening on this game?
He isn't a Vega fan ...😒
I think a lot of people also don't realize that Vega cards are actually very efficient when you're just web-browsing (etc.). I recall that when they're not doing anything demanding, they're just as efficient as their Nvidia (Pascal) equivalents, with the fan not even needing to run. If they used a lot more power, and always had to have the fan running even on the desktop, then that would really suck, though they'd still be good cards even then (for the right price, or for the right user who needs their superior performance in certain non-gaming applications).
@@swordfishffm I think Vega is somewhat underrated. I wasn't willing to run a blower model, and I wasn't willing to trust Gigabyte's dual fan model either (it had a lot of bad reviews). I was close to buying one, but I couldn't find a good dual fan model for a good price, and I ended up finding a great price on a good triple fan 1070 Ti instead.
@@syncmonism The max power use I saw on my Vega 56 was 225 watts, and at that moment I blew the RX 5700 away.
And no, the blower was not like a jet engine.
I use an ASRock blower model, one of the last ones.
Not the Samsung memory.
All Samsung memory is overvolted from the factory.
@@syncmonism The Sapphire models are great as usual! I own a ROG Strix and I'm happy with it. Have a nice weekend mate :)
Thank you for marking my card in green. It really does stand out. Love my Strix Vega 64.
RX 570 rocks! I love polaris so much.
Honestly, I'm a bit disappointed by the performance here. With modern titles we've seen the RX 570 8GB regularly beating the 1060 6GB; here it seems to be behind the 580 by exactly the difference in specs. But then, maybe that just speaks for the game being well optimised for the architecture.
Feeling pretty good about my 570 also. The more demanding games become, the more we'll get out of it :>
How is the 1080 ti doing so well in this title when even the 2070 beats it in a lot of other titles?
Where's the 1080Ti gang ?!
I wish all games had such great optimization. My 1080 Ti, which I got for $450 weeks before the 2000 series came out, performs better in this game than a 2080 Super, which costs $900. It's a shame to keep a 1080 Ti stock: I am using an aggressive overclock on mine, 2025MHz on the core and +900MHz on the memory, which gives around a 13% gain. That's why it's a lot faster :)
@@Patrick73787 In my country it's $900 converted from euros to USD. And it doesn't change the fact.
Man, the 570 and 580 are really holding up well. No regrets getting the 580 8GB so far.
I love your tests. Seriously, I always base my purchases on these tests. You do an amazing job.
Look at my Vega 56 running like an RTX 2060 and a GTX 1080, LMAO. I paid $250 for it, money well spent!!
It runs like that in ONE game!!!
xxfurloxx You can say that again!
You could get a 2060 for 250 dollars
@@maxzett Nah, it's not uncommon to find numbers like this in other games.
More importantly, it's yet another sign for FUTURE performance.
Nvidia just dumps their old graphics cards in the dust and stops optimizing for them the moment the new generation is out.
That too is without the game-ready drivers; with the game-ready drivers, things should get even better if nothing abnormal happens.
Great video and thanks for all the effort! Funny to see the 1080 Ti still shine so well. Re the game itself, btw, 12:35 shows something which really bugs me about modern games: the elevator opening is such a stupid forced jump scare (on rails). The doors open far too fast, and surely given the obvious risk (has she never seen zombie movies?) one would at least try to block the doors before opening them. There's plenty of junk around; we saw her walking past it. Modern games can look very nice, but they are sorely lacking in interactive functionality. Very little of what one can see are things one can do anything with. It's as if most of the environment has been glued down. Sometimes this gameplay style works well because the pace is quick and the design quite clever, such as in "The Last of Us", but RE games jar a little too much in this regard, with too many cases where events look like they've been pulled from badly written Hollywood movies.
Awesome thumbnail Steve!
Great video as always, thanks for Benchmarking :)
Another Hardware Unboxed mega benchmark. A must watch, IMHO.
Steve these thumbnails are getting ridiculous!
And I love it! 😂
Steve's thumbnail is gold
Thanks as always for these Steve, hope you and the family are all good mate👍
It's at times like these that I'm very happy about the GTX 1080 Ti I bought 3 years ago.
@Mykel Hardin For 450, in 2018, that's a great deal!! But yeah, that was the worst part; I'm just glad I bought it when it cost 750 dollars.
@Mykel Hardin If the 3080 Ti is 800-900 dollars and is 50% better than the 2080 Ti, I'm going to buy it.
@Mykel Hardin Maybe. If they sold it for the price I said, everyone would buy it and they would make a lot more sales. So there's an opportunity.
@Mykel Hardin I'm also hoping for the PS5 and Xbox Series X to drive down GPU prices.
@Mykel Hardin Exactly, I genuinely refuse to pay more than 1000 dollars for a GPU.
I had to double take that thumbnail.. well done.
The 1070 is still kicking ass, to be honest. I'm definitely upgrading next year, but man, what a great buy for a 4-year-old GPU.
The game works better on DX11, though. Just saying. Results could be quite different.
Great content. Thanks for all the trouble.
I thought there were no reflections without RT? Is Nvidia wrong?
And those shadows. How in the world can a game have shadows without performance-killing RT?
If you mention the GTX 1060, why is it not included as a reference on the charts?
I’m astonished by the performance of the previous gen GPUs. Why aren’t they ever recommended for PC builds? Are those cards discontinued?
Well nothing better than starting a new month with some well made benchmarking. Much appreciated.
Was your Vega 64 stock? If not, what clocks did it run at (core clock + HBM)?
I believe he runs EVERY Vega card at complete stock, no undervolt, no tweaking, nothing.
The VRAM meter in the options is a bit of an oddball. It might be different in the full game, but in the demo I didn't see any meaningful performance difference between textures set under my available VRAM and maxed-out textures that should exceed it. My VRAM does get entirely utilized with textures set to the maximum quality, but I didn't notice any stutter or slowdown during gameplay. Specs: R7 3700X, 32GB @ 3400MHz CL14 with manually tuned subtimings, RX 5700 XT on PCIe 3.0 x16.
I had to rewind a LOT on this one. I really struggled to concentrate on what you were saying with such interesting and gorgeous gameplay footage. Then again that is probably due in part to my literal ADHD lol
7:24 – the Vega 64 is shown with a green line.
I have a 4GB RX 470 and I have tested the RE3 demo. I just set textures to High (1GB) and shadows to High. I also used DX11 to get a bit more FPS. Smooth gameplay at 1080p. Setting textures to High (8GB) just uses more VRAM so that textures don't need to stream as much. For example, if set to High (0.5GB) you'll see more texture pop-ins, like a blurry texture turning into a sharper one right before your eyes. Why is that not discussed in detail in this video? Is that Tim's responsibility?
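The behaviour described above (the texture setting is just a streaming budget: a bigger pool keeps more textures resident, so fewer visible pop-ins) can be sketched as a simple LRU cache. This is a hypothetical illustration, not Capcom's actual streaming code; the budget and texture sizes are made up.

```python
from collections import OrderedDict

class TexturePool:
    """Toy model of a VRAM texture-streaming pool with LRU eviction."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture id -> size in MB
        self.used_mb = 0
        self.pop_ins = 0               # times a texture had to stream in

    def request(self, tex_id, size_mb):
        if tex_id in self.resident:
            # Cache hit: texture is already in VRAM, no pop-in.
            self.resident.move_to_end(tex_id)
            return
        # Miss: the player briefly sees the low-res version (pop-in).
        self.pop_ins += 1
        # Evict least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used_mb -= evicted_size
        self.resident[tex_id] = size_mb
        self.used_mb += size_mb

# Same access pattern, two different budgets: four 150MB textures
# cycling in view over 32 frames.
frames = [0, 1, 2, 3] * 8
small = TexturePool(budget_mb=300)   # like a "High (0.5GB)" setting
large = TexturePool(budget_mb=800)   # like a "High (8GB)" setting
for tex in frames:
    small.request(tex, size_mb=150)
    large.request(tex, size_mb=150)
print(small.pop_ins, large.pop_ins)  # small thrashes, large loads each once
```

With the small budget every request misses (the working set never fits), while the large budget only streams each texture once, which matches the "more VRAM = fewer pop-ins, same framerate" observation in the comment.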
Hi.
What happened to the RX 5600 XT? It's so slow compared to the RX 5700; in the reviews they were so close.
Do you use the reference clocks or the new clocks?
Love the thumbnail, Steve. Keep doing it.
For those who want a higher resolution than 1440p, something close to 4K, try 1800p; it's a pretty nice bump in quality. And to get more detail, you can use RIS if you're on AMD, or the alternative offered by Nvidia.
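To put the "1800p is close to 4K" suggestion in numbers, a quick pixel-count comparison (a minimal sketch; all resolutions assumed to be 16:9):

```python
# Pixel counts for common 16:9 resolutions, showing where 1800p
# sits between 1440p and native 4K.
resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    share = count / pixels["4K"]
    print(f"{name}: {count:,} pixels ({share:.0%} of 4K)")
```

1800p renders roughly 56% more pixels than 1440p but only about 69% of native 4K, which is why it lands in a sweet spot: noticeably sharper than 1440p without the full 4K cost, and a sharpening filter like RIS covers much of the remaining gap.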
Where is the optimization guide??
I think in the graphs you misspelled the VRAM on the GTX 1050: it's 2GB, not 4GB.
@Setzer K Not really, there's like a 20 percent difference in performance between them.
Level1Techs tested RE3 with new drivers, I assume the 20.4.1 ones. It shows the 5700 XT running well over 60fps at upscaled 4K, which is pretty good.
LOVE THE THUMBNAIL!!
Will there also be a performance optimisation video like you guys did with the RE2 remake?
Going from these results, it looks like my old GPU is still fine for 1080p gaming, at least until the next-generation consoles come out and devs can start upping the general graphics quality in upcoming titles (the rumoured Crysis remake notwithstanding; that is a series made for melting graphics cards).
5:05 GT 1030... well, weren't they always 2GB cards?
That thumbnail does put a smile on my face.
I still can't get the hilarious thought of "will the new consoles be better than gaming PCs" out of my head.
At the same price, yes, if you just want to play games.
@@hououinkyouma1458 I mean, you should compare it to the new upcoming PC hardware. But when you add paid online, expensive games, and no DisplayPort support, which would force you to buy a very expensive OLED just for HDMI 2.1 (if you want to fully get the "4K 120fps"), they are not good value.
I woke up, and the first thing I saw when I opened YouTube was that thumbnail. Long story short, my brother is mad at me for laughing too loud and waking him up.
The thumbnail is amazing
The demo was crazy; you could max out 8 gigs of VRAM at 1080p. It ran pretty great, though.
@Salt Maker I guess that makes sense; the in-game menu gave me warnings when I maxed out textures because it crossed the limit. I thought it was a demo thing to have the option for such crazy settings.
A Radeon VII at $550 new on Amazon back in Jan 2020, plus 3 months of Xbox Game Pass, was too hard to pass up as someone who plays at 4K. Despite being EOL... it's still new on Amazon at $599. Really competitive at that price for 1440p+ users.
A 2070S would have been better: lower power draw, cooler, quieter, and it's just 3% slower than the VII.