Thanks for this guide. I have a GTX 1080 and play at 1440p. Before this guide I used "HIGH" with "Tessellation ON", but now I've switched to "ULTRA" with "Tessellation OFF" and I must say: the game looks better with almost the same performance.
@@maincharacter2677 Yes, but I ended up playing on high with tessellation on, and I added this reshade: www.nexusmods.com/metroexodus/mods/1 Huge visual improvement, looks like extreme graphics on high settings or even better, with hardly any fps loss at all.
@@aNDREeSnell Nice, I'm running the "Tweaked 2: "Quality High" + "Hairworks On" + "Tesselation Off" + Gamma -1" found on this site www.tweakmygames.com/post-unique/Metro-Exodus with a rx 580. I'm definitely gonna try this reshade mod !
Damn, Nvidia is ready to sacrifice the performance of their own customers as long as they can cripple AMD... That leaves a bad taste in the mouth whether you are an Nvidia or AMD customer. If AMD were more competitive on high end cards and offered ray tracing as well, I wouldn't have switched to Nvidia. Anyway, I'm sure AMD can do better than Deep Vaseline Super Sample with DirectML (it's hard to imagine doing worse than DLSS, as DLSS does even worse than a simple resolution scaler).
AMD HAS offered ray tracing far longer than Nvidia. Go look up 'Radeon Rays'. It's been part of the Vulkan API for several years now, but only after Nvidia did their dog and pony RTX show with grossly misrepresented footage (that Star Wars demo that needed a host of Titans to render at sub 24fps) has ray tracing apparently become a thing. Just like how any phone tech like wireless charging or face/fingerprint scanning ONLY starts existing after Apple does it...
@@zybch Who said Nvidia invented ray tracing and was the only one that could do such a task?? Not me. I despise Nvidia's proprietary stuff and selling practices, but I despise toxic fanboys as well, whether they are green or red. However, nobody can deny they're pushing real-time ray tracing for games more than anyone at the moment. And Nvidia invented the dGPU and invented programmable shaders. I am very well aware that AMD can do ray traced rendering as well. But does Radeon Rays enable AMD GPUs to render ray tracing fast enough for real-time ray tracing in gaming? No. Are there any games featuring ray tracing through AMD Radeon Rays? The answer is again no. Did you know that any GPU, whether from AMD or Nvidia, and even before RTX/Volta, can do real-time ray tracing?! Just at much poorer performance than what RTX cards manage today, because those GPUs weren't specifically designed for real-time ray tracing. PowerVR was even able to design an ASIC in 2016 that would give the same ray tracing performance as a 980 Ti for only 10W. It was like comparing a desktop GPU to a smartphone GPU. And haven't you seen that the Star Wars demo runs at over 30+ FPS at 4K DLSS on a single RTX 2080 Ti? Even an RTX 2060 could run the Star Wars demo at over 24FPS at 1080p without DLSS, since it handles the demo at 2560x1440 DLSS at around 35FPS.
It's much easier and better for viewers to spot differences and judge for themselves if you use the whole screen for each setting, for example going from low to medium to high etc and switching back and forth depending on the settings you are currently talking about. Other than that, great vid much appreciated!
I don't think the devs care about AMD users at all. Both AMD CPU and GPU performance is far below what many other titles are able to achieve. I'm honestly fine not buying this game on principle. PC games should work for all gamers, not just a few. I didn't get into PC gaming to play this exclusivity BS that Nvidia seems to encourage. It's not exclusive in the strict sense of the word, but it encourages that mentality.
Take a look at the Xbox One X version of Metro. They've done an outstanding job on AMD hardware, and hardware that's far more limited than what is required to run the game at a similar fidelity (generally between high and ultra) on PC, and way, way cheaper. So they can do it, but on PC they don't. What does the Xbox not have? A GPU produced by Nvidia and the associated GameWorks program, whose only task is to make any competitor to Nvidia look worse...
You can get away with using tessellation on AMD GPUs, you just have to lower it from x64 or whatever is used down to x4/x8, and the game will look better without a huge performance hit.
It's a vicious circle with GameWorks and game sponsorships. GPUs are evaluated by performance in top-selling games, but Nvidia is able to either sponsor or get GameWorks into more games than AMD. We can't blame AMD for not competing; their hardware is up to the task, but not against a set of games designed from the ground up to run better on the competition.
@Chiriac Puiu Yes. It's one thing to make your product better than your competition. And it's another thing to do that through unethical means. What Nvidia is doing is the latter. Which is why I switched to Radeon 4 years ago. Can't wait for Navi to be excellent in price-to-performance this July.
Very useful and informative content. I have watched many videos to solve my graphical issue of this game but couldn’t find a suitable one and then your video came and spoke everything out loudly. Thanks mate cheers 💙
wile123456 I’d understand your post better if you used proper grammar. Why is it that every AMD fanboy has atrocious grammar? There must be an underlying reason for this... I’m sure the answer won’t be pretty...
I played this game on my overclocked 1070 at 2K and barely got stable 60fps in most areas. Still a great game. Might replay it once I get an RTX 2080 or whatever card comes out by then. The game definitely has to be experienced with RTX on. The Global Illumination difference is insane!
Which Nvidia and which AMD GPU did you use for the tessellation comparison? Vega is at 1080 level in tessellation. PhysX runs on the CPU and makes the game go from playable to unplayable on one of my i5 3470 PCs. No effect on Ryzen.
The PhysX is for physics-based debris when you're shooting things... when you shoot the ground, chunks of dirt or whatever will fly up and bounce around... stuff like that... it's pretty subtle but a nice touch
You'll get better fps if you lower the tessellation quality from the AMD driver. The default tessellation quality is set to 64x. Some people recommend using 8x or 16x, and some recommend between 4x and 16x. If you have the game and an AMD GPU, try it and see how much it improves your fps and whether you like the quality.
Apex Legends is way more demanding than I expected. I'm getting 30 fps on lowest settings and no AA, when in Fortnite I get 80-100+ fps on lowest settings + 1080p + max textures.
It's crazy how heavy it is on the GPU. I really expected to get a locked 144fps at 1440p max settings with a 2080 Ti... the game also has some issues with G-Sync and/or frame caps, as I get fps drops as soon as I use either of these.
Mint vid. I can sit and adjust while listening and watching; learnt a lot and it plays at a steady 60 fps. I have been trying to get this to play well for a week haha. Sub gained.
From my experience with the 760 in the past, the GPU is fillrate limited and performed well at 900p in fillrate-heavy games like FFXV. I'd say that 900p on medium to low will net you good performance.
It's obvious drivers for Kepler cards are gimped, so at this point testing Kepler GPUs is pointless. The 780 Ti is beaten even by the older 7970, and the 970 is twice as fast. The much weaker 760 (compared to the 780 Ti) scores only 13 fps :P static.techspot.com/articles-info/1795/bench/Medium.png
Physx for example makes rocks ripple down a hill after shooting or moving a boulder. Or possible debris flying off a wall after shooting it with bullets. Or rock bits flying after throwing a grenade.
I completely forgot that the metro games used these worthless presets for almost everything with no options to manually tweak everything. It's just lazy. Just another reason to wait a year to play this game, I guess. Since my 480 can't handle 1080p60 at high, I'm probably going to need to upgrade, though there's nothing that really makes sense yet. Guess I'll be waiting for 7nm midrange cards.
@@dainiusvysniauskas2049 I didn't mean that I was expecting them to add any settings. I meant that by the time it's out on Steam (and hopefully GOG, as that is my preference) there'll be new midrange cards out more capable of playing this.
@@dainiusvysniauskas2049 It just bothers me that I'd have to turn it down all the way to medium to get a reliable 60+, since there's some pretty noticeable graphical downgrades at that point.
I'm using an RTX 2060 and 16GB RAM on a laptop with default NVIDIA panel settings. The fps drops below 20 frequently during gameplay. Any solution for this would be much appreciated.
Bruh... this time RTX ON to get global illumination is worth it... trust me... it’s much better than what you get in Battlefield V (and its reflections)... you don’t need to play in 4k... getting closer to prerendered graphics in real time is more important and I don’t care if it’s not 144fps... pff... the people these days...
Adrian Z man... I mod games and I understand about graphics... fuck Jensen... Ray Tracing is just the logical step going forward in graphics technology cuz the damn lighting is precisely what makes the big difference between prerendered cutscenes and real time graphics... it’s just that with the hardware we have right now many people won’t understand yet... only time will make them open their eyes and go like “ohhhh it was all about simulating how the light bounces in real life creating reflections and refractions”... oh no wait... that is too deep... lol
Only problem is even the 2070 struggles without RTX in 1440P (I've seen 50fps in some areas) and RTX kills its performance. In 1080P or with a 2080+ maybe
"I'm clearly too much of an enlightened graphics god... I know too much about graphics and I worship Jensen Huang everyday by watching his RTX reveal event... I'm far too deep for mere peasants to understand..." Nigga that's what you sound like. Get off your high horse and accept the fact that the devs and Nvidia crippled the game's performance.
Great video. I'd love it if you guys did SLI benchmarks in these. Not for every card obviously. Just testing one SLI setup would be great, just to gauge the level of support/scaling/issues.
Not nearly as beautiful as RDR2. It actually looks outdated in comparison, yet runs like total shit on hardware that overpowers consoles by a significant margin. PC gaming never changes.
@@Vsevolod3788 As a hardcore PC GAMER with a GTX 1080 OC'd I'm disappointed that the visuals aren't good enough to justify the performance hit in Metro Exodus. Red dead 2 looks stunning. :)
Complains about good quality fur/hair details but doesn't know what PhysX and tessellation do in Exodus. Damn, nice judgement for clicks there! You should really start following Digital Foundry's Metro Exodus analysis more to improve your ideas about this stuff. About TressFX: why don't we get to see a good implementation of it in non-Tomb-Raider titles even though it's AMD sponsored? Like RE2 Remake, AC Odyssey or the likes of The Division? It's like saying that just because Hairworks is demanding on certain GPUs which are weaker in tessellation, it shouldn't be implemented for much better visuals. Metro's Hairworks might not be that noticeable, but on the animals in The Witcher 3 it was damn good. It's a good thing we have channels like DF which can actually pinpoint the differences for viewers, unlike some who have a hard time differentiating basic bullet PhysX.
TressFX is open source so most developers use their own implementation without the branding; Square Enix is calling it Purehair (Mankind Divided, ROTR); Guerrilla games also used it in Horizon Zero Dawn and since it's included in their engine it's probably being used in their next project and also in Death Stranding; Just because you don't see a TressFX logo everywhere it doesn't mean it's not being used in one shape or another, especially with both Xbox One and PS4 being based on AMD hardware;
Thank goodness there's not a lot of difference between AMD and Nvidia GPUs w/ PhysX this time around. It drove me mad that in Metro LL it was CPU-based for users of AMD cards yet included FLUID-based PhysX... GAH
You know, I find it interesting that you found almost zero effect with motion blur. I had horrible stuttering (and almost total freezes for up to 1 second) when doing turns larger than 90 degrees with my GTX 1070. Turning off the motion blur, the turn stuttering went away. (I still get these weird freezes on every "action" a character makes, for example reloading or pulling up your backpack, or things that are heavily scripted.)
AMD does really have to do something about the GPU memory management, which seems especially problematic on lower end cards. If you look at 1080p, the RX 560 4GB vs 2GB VRAM difference is massive, especially when compared to the 1050 3GB vs 2GB ...
After months of troubleshooting and a lot of denial (I convinced myself that 8 gigs of memory was the cause of the issue), I can finally conclude that 95% of stuttering issues are due to the HDD being too slow or faulty. Switched to an SSD and all I can say is... Holy shaait! It worked!
I would honestly love to know what GPUs you tested on. I think that's kind of imperative for a performance comparison of settings. As we know, certain generations of GCN handle tessellation a lot better than others.
Tip: tweak some things manually in the driver preset for the game and you can get an extra 15% FPS. With my FX 8320 + 16GB + 1063 at QHD (2560x1440) I went from 38-44 FPS on locked medium-high settings to 54-60 FPS just by adjusting some options in the driver's game/global presets in Resident Evil 2 Remake.
I love your videos but this video is not representative. The early areas are less demanding than the later ones (especially the desert). I have a GTX1070 myself, overclocked to 2000MHz. I get like 45 FPS on ultra in the desert while normally I get about 70-80 (1080p). You should really consider this difference if you evaluate the performance of the GPUs regarding Metro Exodus.
To fix issues with AMD cards in GameWorks titles, go into your control center: Gaming, Global Settings, select Tessellation Mode and override application settings. Set maximum tessellation to 8x and it will fix all issues across the board.
A lot of people have asked us to check AMD's driver level tessellation features. We've done so, and the results are available here: www.patreon.com/posts/24832847
In short, you will see a performance improvement, but it won't bring AMD cards up to the level of Nvidia cards and our recommendation is still that you should disable the feature
What am I not getting here? Off is 17% faster than On? AMD forced 4x is 25% faster than Off? But only 2% faster than On? Doesn't make sense to me that way. Shouldn't it then be around 40% faster than On?
No, Off is 25% faster than AMD forced 4x, you have it the wrong way around. On with 4x tessellation is 2% faster than the regular, default on
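Since the confusion here is mostly about which result each percentage is measured against, here is a minimal sketch in Python with made-up fps numbers (not the actual figures from the Patreon post) showing why "faster than" percentages with different baselines don't simply add up:

```python
# A minimal sketch (hypothetical fps numbers, not the video's actual data) showing
# why "X% faster than Y" figures with different baselines compose multiplicatively.

def faster(a, b):
    """How much faster result a is than result b, in percent."""
    return (a / b - 1.0) * 100.0

fps_on_default = 60.0   # hypothetical: tessellation on, driver at application default (64x)
fps_on_forced4x = 61.2  # hypothetical: tessellation on, driver override forced to 4x
fps_off = 70.2          # hypothetical: tessellation off in-game

print(f"Off vs default On: {faster(fps_off, fps_on_default):.1f}% faster")      # ~17%
print(f"Forced 4x vs On:   {faster(fps_on_forced4x, fps_on_default):.1f}% faster")  # ~2%
print(f"Off vs forced 4x:  {faster(fps_off, fps_on_forced4x):.1f}% faster")     # ~14.7%, not 17% - 2%

# Relative differences multiply rather than add: 1.147 * 1.02 ~= 1.17, so you can't
# subtract or stack the percentages, and you certainly don't end up anywhere near 40%.
```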
If this game can't do LOW 1080p with motion blur off at 60fps on an RX 560, then it's an unoptimised POS.
Hardware Unboxed The purpose of the tessellation override is to keep almost the same IQ while improving performance, even if slightly. 2-3% faster at x16 is still free performance
If I'm reading this correctly...
Off vs. Default: min fps 47.9%
Off vs. x16: min fps 33.3%
So there's a "big" 11% improvement in the 1% lows?
So I think limiting the max tessellation is a good idea on AMD, especially on cards like the RX 480?! (Vega has more tessellation power as far as I know.)
An image comparison between default and the 8x and 16x limits would be nice.
Btw, which card was tested?
It just GameWorks!
Jarrod'sTech I see what you did there (☞゚ヮ゚)☞
☜(゚ヮ゚☜)
(☞゚ヮ゚)☞
kek.jpg
I see you in the comments of this channel all the time
This video is also supposed to be 4K but YouTube are once again taking their time to process it.
Oh boy did I get yelled at for saying that Ngreedia plagueworks cripples your performance. If only we could see modern games that don't have crippling software in them... *coughs in DOOM, Far Cry 5*
Does that mean YouTube comes with GameWorks as well?
At least it's not stuck at 240p or 360p lol
There is no problem. I think only around 10-15% of users have 4K displays and watch 4K videos on YouTube. The most used is 1080p and sometimes 720p.
Can I get a locked 60 fps with high settings and hairworks on my gtx 1070?? (1440p)
The graphic options are a joke...
Like all metro games
Is it funny at least?
After playing Re2 Remake, you could tell right away.
@@BrianBarbaGonzalez You'd think they would've improved that after all these years of developing Exodus, and with how popular PC gaming has gotten.
@@TheUruse I was honestly expecting RE2 Remake to have way less options, but nope.
Advanced PhysX enables destruction effects like little stones falling off rocks when shot and things like that. But it is a CPU feature and doesn't use many resources.
Advanced PhysX was moved to the CPU some time ago for games. They say it's because CPUs have gotten faster, but from what I've noticed, old implementations are still a resource hog, so there might be some other reasons behind it.
It works with smoke too and it kills your FPS. Try it lol
@@Code-n-Flame Agreed, smoke filled rooms kill FPS hard
The load that simple things like smoke put on the hardware is an utter joke. It's an AAA title, but the team's lack of experience in optimizing these types of titles is really exposed. It's rather crudely made in that regard.
@@IoNoobMaster It mostly uses frightfully old x87 code; even the original PhysX cards did with their 'custom' processor (I've still got one somewhere). It's now just a scam used to help push GameWorks and runs entirely on the CPU, just like MS's Havok middleware. Using SSE2 it can run between 1.5 and 2x faster (and that was back in 2010), but Nvidia sticks to their x87 implementation to this day.
You don't play Exodus on what you prefer, it's what you HAVE lol.
I suppose people could try lowering the tessellation level in the AMD drivers to see how it stacks up both visually and performance-wise.
Yep, definitely worth trying. We stick to the regular in game settings for these videos so we didn't cover this.
By default in the AMD driver the tessellation level is 64x (the maximum); you can switch it to 16x to strike a balance between visuals and FPS.
@@leanhk Get it down to 4x - you won't see a big difference.
@@mariodrv depending on the game/implementation ;)
Please test this Tim. Just a quick performance figure will do. You can show it in the news update video.
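For anyone wondering why capping the driver override helps so much, here is a back-of-the-envelope sketch of how the amount of generated geometry grows with the tessellation factor. The numbers are approximate (the exact triangle count depends on the partitioning mode the game requests) and the patch count is made up:

```python
# Back-of-the-envelope sketch: for a triangle patch tessellated with a uniform factor N,
# the hardware tessellator emits on the order of N^2 small triangles, so the factor
# scales the geometry load roughly quadratically.

def approx_triangles(base_patches: int, factor: int) -> int:
    return base_patches * factor * factor

patches = 10_000  # hypothetical number of tessellated patches in a scene
for factor in (64, 16, 8, 4):
    print(f"factor {factor:>2}x -> ~{approx_triangles(patches, factor):>12,} triangles")

# factor 64x -> ~  40,960,000 triangles
# factor 16x -> ~   2,560,000 triangles  (16x less geometry than 64x)
# factor  4x -> ~     160,000 triangles  (256x less geometry than 64x)
# This is why capping the override at 16x or 4x recovers so much performance while the
# on-screen difference is often hard to spot at normal viewing distances.
```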
It's interesting that when Nvidia sponsors a title it most of the time punishes AMD a lot more, yet when AMD sponsors a title it runs great on both AMD and Nvidia. Makes you wonder, don't it.
It is because AMD makes their tech open. Nvidia just copies AMD's features while AMD has to reverse engineer Nvidia's. AMD could have kept Mantle to themselves, but they don't have the money to throw at developers to make them use proprietary tech, so we got Vulkan.
As a side effect, most of Nvidia's features end up being short-lived gimmicks because they keep them closed off. No developer wants to use something that only half of gamers have (remember that AMD is in consoles), so there are only as many GameWorks games as Nvidia is willing to buy.
And of course you have a link to proof with a significant enough selection of games tested, do you? Because if we talk personal experience, mine tells me that it is completely random. Sometimes it even looks like Nvidia sponsored games work better on AMD -___-
Not really, we all know Nvidia are greedy scum that would sabotage even their own products as long as it ends up hurting their competitor as well. It's why I never buy their products; it would go against my humanity to support such an evil corporation.
@@amirabudubai2279 It is because AMD has no tech, not because of open tech. AMD sponsored titles don't have any additional features, because AMD doesn't have any, and all the features they hype left and right everywhere, like with the Vega release, were either imaginary or later cancelled, and even when something was miraculously implemented nobody even noticed. An AMD sponsored game is just your regular, stagnant, unambitious game that isn't even trying to do anything interesting with graphics.
These titles are just demanding, and Nvidia GPUs handle them better.
Disabling Tessellation, a feature from 2010, in year 2019. Such a progress!
Yeah weird suggestion eh. Especially when he was getting well above 60fps with it on. It's not like this is a twitch shooter.. I'm playing Exodus with all the fruit on...
Thank AMD for still not building a GPU that is competent at tessellation in 2019.
cabbage-eating prick Tessellation was a selling point for Messiah in the late 90's; it's just embarrassing for AMD at this point.
@@soulshot96 nVidia's making AMD look bad with GimpWorks in a particular game. Sure, it's AMD's fault!
They fucked the tessellation in this game. 20% hit is insane
Check for a heavily tessellated sea hidden under the world.
Crysis 2 xDDDDDDDD
Or search for hairworks on a bunch of buffalos 100km away
You just have to look around. They use a ridiculous amount of tessellation which deforms some objects, making them look completely unnatural.
Steve W, how many atoms do flat surfaces have in the real world? AMD is all about fake materials.
Actually you just need to look at the brickwork. This guy's lack of discernment is pretty bad.
First of all, I just want to say thank you for these optimization videos, you literally make the game-experience for me (and probably a lot of others) that much better. As you mentioned, Metro Exodus lacked a lot of graphical options, but I noticed that there is a user.cfg file in the saved game folder which has a LOT more options. If you could investigate that as well, that would put the icing on the cake! The reason I am saying this is because once you enter the second "area" (desert area) FPS drops DRAMATICALLY, so much that if you consistently had 80 fps before no matter where you went in the first area, you would dip to low 50s and 40s. It's insane. One option for this was to drop the grass-shadow option in the file mentioned above. Just mentioning this in case you want to make a "remake" of this video. Anyway, as always, great video!
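If anyone wants to script that kind of user.cfg tweak rather than hand-editing the file, here is a minimal Python sketch. The save-game path and the option name (r_grass_shadows) are placeholders/assumptions, so check your own user.cfg for the real key and location before running it, and keep the backup it writes:

```python
# Minimal sketch: flip one option in Metro Exodus's user.cfg, keeping a backup.
# The path and the key name below are hypothetical placeholders.
from pathlib import Path
import shutil

CFG_PATH = Path.home() / "Saved Games" / "metro exodus" / "user.cfg"  # assumption: adjust to your install
OPTION = "r_grass_shadows"                                            # hypothetical key name
NEW_VALUE = "0"

def set_cfg_option(path: Path, key: str, value: str) -> None:
    shutil.copy2(path, path.with_name(path.name + ".bak"))   # back up the original first
    lines = path.read_text(encoding="utf-8").splitlines()
    found = False
    for i, line in enumerate(lines):
        if line.strip().startswith(key):
            lines[i] = f"{key} {value}"                       # overwrite the existing entry
            found = True
    if not found:
        lines.append(f"{key} {value}")                        # or append it if it wasn't there
    path.write_text("\n".join(lines) + "\n", encoding="utf-8")

set_cfg_option(CFG_PATH, OPTION, NEW_VALUE)
```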
This whole video was great, start to finish
lol
The video is out for like 3 mins
You're joking right?
Ganondorf
I watch it in 5x speed
As of a couple years ago, PhysX can run entirely on the CPU, and properly (i.e. without deliberately crippled code), apparently because nVidia realized it wasn't earning them any more sales.
What that means is that the performance impact of PhysX in this game will be felt through differences in CPU, not GPU.
I know it's a lot of work, but you can't do an effective optimization guide by just using the fastest CPU available. Today, you'd need to check the effect of four cores, six cores, and eight cores. And, ideally, different versions of each. Running at stock, because >95% of people do not overclock.
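One cheap way to approximate that kind of core-count testing on a single machine is to pin the game process to 4, 6 or 8 logical cores and re-run the same benchmark pass. It only emulates core count, not cache size or clocks, so treat the results as indicative. A sketch using the third-party psutil package; the process name is an assumption:

```python
# Rough core-scaling test: pin an already-running game process to the first N logical CPUs.
# Note: on an SMT CPU the first N logical CPUs may map to fewer physical cores, so adjust
# the CPU list if you want whole physical cores only.
import psutil

GAME_EXE = "MetroExodus.exe"   # assumption: check the actual process name in Task Manager

def limit_cores(process_name: str, n_cores: int) -> None:
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(list(range(n_cores)))   # pin to the first n logical CPUs
            print(f"Pinned {process_name} (pid {proc.pid}) to {n_cores} logical CPUs")
            return
    raise RuntimeError(f"{process_name} is not running")

limit_cores(GAME_EXE, 6)   # then run the benchmark pass and note avg / 1% low fps
```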
Except Nvidia owns PhysX, and even though it's now implemented in software, the terms of using GameWorks prevent devs from using it unless ONLY an Nvidia GPU is there.
Tim, PhysX doesn't affect anything because it is the newer, less shitty PhysX that runs on the CPU, not the GPU. PhysX was always a super simple particle system, not hard to run, but it was locked down by Nvidia and forced to run on the GPU, despite GPUs being able to optimise well for it given the chance.
PhysX as an Nvidia brand is dead. It is the default physics engine in Unreal Engine 4, basically used in all its games, yet its name is never mentioned. Why? Because there it's an underlying system, not a dirty tactic by Nvidia to sell more GPUs. I guess 4A Games, being sponsored up the ass so much, didn't get the memo that it isn't 2010 anymore and PhysX isn't a thing like that anymore.
I have a question: I set PhysX to run on the GPU from the Nvidia control panel. Will it affect my game performance?
@@addychikky Since Tim says in this video that PhysX does nothing for the performance, that likely means it's the new version that runs on the CPU (this game is GPU bottlenecked with this 9900K, meaning more strain on the CPU doesn't matter).
I think the setting will only affect you if you have a low end GPU. Would be cool if Hardware Unboxed did a test to confirm my theory.
How do you sell your GPU product and make money? Advertising is a must in this industry. Maybe you have a better strategy for selling your GPUs.
@@masprassaja3818 You can advertise your GPU without creating locked-down technologies that keep the graphics of all video games held back and hostage, since there is less incentive for developers to use them when only a few people can.
wile123456, CPU PhysX effects are nowhere near the same quality as GPU PhysX. Stop lying, AMDog.
I always wait for your optimization video before I buy a game.
Same.
I love these optimization guides. Otherwise I wouldn't know which settings are the most efficient. This was a godsend for me in AC Odyssey at 4K to get 60FPS at all times. Thanks for these videos!
Did you test the difference between DirectX 11 and 12? There are people saying you can gain a fair performance increase from DirectX 11 when running the higher presets.
Have you ever considered adding a frametime graph to your benchmarks and analyses? I think that it shows a much more accurate representation of the smoothness of the gameplay, and it can substitute the % lows so it won't take more space on your slides...
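For anyone who wants that kind of frametime graph from their own runs, here is a minimal sketch. It assumes a CSV of frame times in milliseconds, e.g. exported from PresentMon/OCAT/CapFrameX (the column name and file name are assumptions), plots the frametimes with matplotlib, and also computes a 1% low figure (here: the average fps of the worst 1% of frames, which is one common convention):

```python
# Plot a frametime graph and compute average fps plus a 1% low figure from a capture CSV.
import csv
import matplotlib.pyplot as plt

def load_frametimes_ms(path: str, column: str = "MsBetweenPresents") -> list[float]:
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def one_percent_low_fps(frametimes_ms: list[float]) -> float:
    worst = sorted(frametimes_ms, reverse=True)       # slowest frames first
    slice_ = worst[: max(1, len(worst) // 100)]       # worst 1% of frames
    return 1000.0 / (sum(slice_) / len(slice_))       # their average frame time, as fps

frametimes = load_frametimes_ms("metro_exodus_run.csv")   # hypothetical capture file
print(f"Average fps: {1000.0 * len(frametimes) / sum(frametimes):.1f}")
print(f"1% low fps:  {one_percent_low_fps(frametimes):.1f}")

plt.plot(frametimes)
plt.xlabel("Frame number")
plt.ylabel("Frame time (ms)")
plt.title("Frametime graph (spikes = stutter)")
plt.show()
```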
truth is, the game was rigged from the start
I see what you did there. Now I have the theme in my head too. Lol.
I pirated it and it doesn't run so great
Testing each of the graphic settings on the Windows Store version is a nightmare.
Saving changes....1%
@@SpongyWhale You won't have this problem if you can launch the game in DX11. Getting the DX11 version running is another problem though...
This is by far the best video I've seen so far explaining Metro Exodus PC settings.
You guys are doing a great job for us PC gamers with your optimization videos, I've followed all your recommendations for SOTR, Metro Exodus & just recently The Division. I've been getting constant 60-90s fps with the occasional frame drops (just certain short segments at 40-50 fps which are certainly acceptable for those several seconds) on my 2080Ti/i7-4770K @ 3440x1440 Ultrawide. Hope you guys continue to post such guides for all the major titles. I trust your recommendations more than GeForce Experience.
Gameworks needs to die!!
Nah, it's cool. AMD poorfags can just disable them if they want.
@@FlawlessWorldAces I run a GTX 1080 and still think GameWorks should die. The effects can be achieved through much less intensive and hardware-agnostic means; there's no reason to gimp the performance of your own hardware with crappy APIs.
FlawlessWorldAces it hurts Nvidia too lol.
@@FlawlessWorldAces While AMD provides quality for a fair price, Nvidia scams its dumb userbase with overpriced hardware and useless features, and it just works.
Jonathan Ellis 😂 24 AMD retards liked that comment. 😂
Agree, ultra does the game more justice. Metro is created to give you that atmospheric feeling; that's why it is a slow-paced game, so you can enjoy the playable art the devs have put their hard work into.
Dudes, with my FX 8350 and GTX 1060 I'm getting 45 to 75 fps and I am incredibly happy.
Nice. ✌️ How? On High settings ?
thanks for the guide Tim. And yeah, AMD users who still care to play with tess and hairworks; try a tessellation factor override of 16x in the adrenaline profile.
As much as I hate Nvidia, the devs are responsible for this abomination of performance. It is unacceptable for example that the tessellation hits AMD cards for 50% of the fps.
Well they received payment from nvidia.
It's the same problem in basically every GameWorks game ever. And since the performance hit on AMD's cards is always way harder than on an Nvidia card, it's a pretty safe call that it's not the fault of game developers somehow trying to optimise their games as badly as possible to piss off fans, but Nvidia, who tries to give themselves an advantage over the competition when compared in benchmarks.
The issue is, when AMD sponsors a title they don't intentionally try to screw the competition with such "features", and an AMD sponsored title will usually work the same on an Nvidia GPU or slightly worse (or even better if we consider the latest Assassin's Creed). I don't mind the idea of Nvidia helping to fund games; after all, the studios need all the money they can get. But this Hairworks and forced tessellation gimping has to stop, and this is why I just can't purchase an Nvidia GPU even if I wanted to. They don't update their outdated Control Panel + Experience software; they just try to gimp every title they fund. The devs clearly knew what they were doing when they made this game, and the paychecks they got were surely welcome. They are human after all, and everyone has a price.
der_Balrog 13 lmao no it's not the same case. Tomb Raider is GameWorks and is fine on AMD hardware; The Division was too and no issue on AMD hardware. We can keep digging, but it's obvious that not every GameWorks title hurts AMD. We can see cases where AMD's name on a game hurts Nvidia too. You can stop spreading lies now.
The tessellation problem is just a limitation of AMD's driver implementation and architecture. Tessellation is a fairly well-known technique and AMD (at the time ATI) was really into it years and years ago. It is a known fact that AMD GPUs are just bad at tessellation, and they should be ashamed of that.
Big fan of this channel ! And finally FIRST ! :D
Keep up the great work guys ! I hope you guys reach a million subs by the end of the year !
All the love and best wishes, from INDIA ! :)
Damn they've gone full greedy with this game, Nvidia pays them to use GameWorks and Epic pays them to pull out of Steam... Nope not gonna buy this one even tho I love Metro.
I wouldn't be surprised if they put it back on steam before February 2020 because of poor sales in Epic store.
Well companies don't give a damn if you're going to buy it in 2020. They're gonna lose money and not make any other Metro titles again.
@@zetyadrian I also don't give a damn if they don't make any more because if this is how they're gonna do it then I don't want it
Parsa I am on the last level in Metro Exodus and I hate to say it but am a tad disappointed with it. The first 2 games are 2 of my favorite of all time (started gaming in 1981), but 4A changing the formula to a mostly open world game really did something to the magic. It's not that the areas aren't interesting or filled with detail and little stories, or that the new backpack system isn't great. or the game doesn't look amazing. It's just the game feels way more like Far Cry 2 than a Metro game because of the subtle tonal shift from linear corridor to open world.
Full greedy? This is the biggest game they've done. How entitled are you that you won't support a company that wants to make the most out of their only game in 5 years? Give me a break man, you're just a bitch
Metro Exodus is the first game where I had to use V-sync. I'm using an i5 6400 and RX 580 4GB with a 60Hz 1080p IPS monitor. My settings are set to High with tessellation, Hairworks, PhysX and motion blur turned off. When I started playing my FPS was around 60; in rare situations it drops to 30, but most of the time it is above 60, and I had some lag spikes that really annoyed me. It could be because of the big frame rate differences, so I enabled V-sync (on, full) and it's really smooth and crisp, no screen tearing at all; even when I move around it feels a lot smoother. So if anyone with lower-end hardware has the same issue, try enabling V-sync and see. As I said, I don't normally prefer V-sync, but in this game it helped me a lot.
The game also stutters on my X5670 + GTX 1070 setup without V-sync; it needs some optimization (a lot).
Or you could have bought a monitor with FreeSync and not bothered with it.
Hairworks also introduces stuttering on my system with a GTX 1080.
Me too (1080ti)
I'm getting really sick of games using TAA without the option to disable it. I'm playing on a 1440p 24-inch monitor, so all TAA does is turn games into a blurry mess. BFV and Metro look like they've got textures from 5 years ago because of it.
Reshade will fix it.
Like the other guy said, download ReShade, select the Metro exe, select DirectX 10+, and add LumaSharpen. Start the game and adjust the sharpening amount. I did this with FFXV.
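For anyone curious what LumaSharpen actually does under the hood, it's essentially an unsharp mask applied to the luma channel, which is why it counteracts TAA blur. A toy Python sketch of the idea (not ReShade's actual shader; the filenames, blur radius and strength are just placeholders) showing what the sharpening amount controls on a screenshot:

```python
# Toy luma-channel unsharp mask: add back high-frequency detail softened by TAA.
import numpy as np
from PIL import Image, ImageFilter

def luma_sharpen(path: str, strength: float = 0.65) -> Image.Image:
    img = Image.open(path).convert("YCbCr")
    y, cb, cr = img.split()                                            # sharpen luma only
    y_arr = np.asarray(y, dtype=np.float32)
    blurred = np.asarray(y.filter(ImageFilter.GaussianBlur(1.5)), dtype=np.float32)
    detail = y_arr - blurred                                           # detail lost to blurring
    y_sharp = np.clip(y_arr + strength * detail, 0, 255).astype(np.uint8)
    return Image.merge("YCbCr", (Image.fromarray(y_sharp), cb, cr)).convert("RGB")

luma_sharpen("metro_screenshot.png", strength=0.65).save("metro_sharpened.png")
```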
Same with RE2.
Varun Shewale thanks for that tip!
High PPI monitors are soooo good; I hope they release a 4K 24" 144Hz screen in a few years. It would look so nice.
Grabbing that Epic Store $$$ and the Nvidia $$$... is there anything Deep Silver won't do for money?
Why wouldn't they go with the Epic store when it gives a better cut to the developers? It's a no-brainer.
@@null643 My bad, yeah the publishers. But it's the obvious choice when someone gives you a better deal.
@@Labyriiint They make the same amount of money when EPIC lowers the sales price.
The only reason they're making "more" is because they're keeping the 60 euro price tag in the EU and they got a big briefcase full of money.
@@null643 Exclusivity can be bad, but it is good for gamers as a whole. It is no coincidence that many of the best games each generation are exclusives; platform owners spend a lot of money trying to develop system sellers.
It can be bad; paying to have an already-made game not published on other platforms is pretty scummy. But there are many more games like Half-Life, Mario, Zelda, and Uncharted. Mario is the last AAA 3D platformer because publishers deem the market too small to support them. As a fan of 3D platformers, I benefit greatly from exclusives.
Don't get it. I've got about 5 launchers already. I usually just add .exe files to Steam. This one I bought for £6 on eBay and play offline; I don't even use the Epic launcher.
Love this type of video that does in-depth performance testing. Thanks guys.
I wish that when you do those tweaking videos you would dive deeper into the game files to find out if it's possible to tweak graphical options that way.
I'm sure there are other videos for that. This series has always been about using the graphical settings available in the settings menu. I would consider myself a knowledgeable PC user, but after working with computers all day I just want to play my games and not fiddle too much. Unless I'm really bothered by a game's visuals/performance, then I go and tweak. However, videos like this are detailed but uncluttered and give me the info that I need using the built-in settings.
To any lucky duck watching this in 2021, setting the game to high provides more detail to the grime on your guns
THIS IS THE WAY MY BROTHERS FOR THE ENHANCED EDITION! After hours of extensive testing, I've found the "Sweet Spot" optimized settings targeting 90-120+fps avg. for any system close to this setup spec and with these high quality in-game settings -
I have a Zotac AMP Extreme RTX 2080 Ti OC'd to ~2000MHz core & 7950MHz mem.
i7 8700k OC'd 5Ghz 1.39Vcore All Cores (no AVX)
16GB DDR4 3000mhz RAM
240hz G-Sync 1ms Monitor AW2518H
NVME 1TB SSD 2k R/W
With these bang-for-the-buck quality vs performance settings below, I'm usually able to get between ~90-120 fps in most environments! -
Resolution: 2880x1620 (4K IS NOT WORTH THE FPS HIT, 2880x1620 + Reshade Sharpen Filter looks near native 4k, +40% more fps! )
Quality: Ultra
VSYNC: Off (if you are getting more fps than your monitor supports, either turn this on or use an fps cap such as RivaTuner to keep fps at least 10fps lower than your monitor's refresh rate to prevent tearing)
MOTION BLUR: Off
RAY TRACING: High
DLSS: Quality
REFLECTIONS: Hybrid (if you feel your FPS is high enough try Ray Traced, but will hit performance 10-20%)
VRS: 2X
HAIRWORKS: On
PHYSX: On (Box Filled in)
TESSELLATION: On
FIELD OF VIEW: 3 Extra Ticks (I prefer this since it allows you to see the watch easily on arm)
NOTE: I STRONGLY recommend using the ReShade sharpening filter called CAS to help counter the blur from DLSS and to increase sharpness dramatically, to get a 4K look without the performance hit. Also, the filter "Fake HDR" helps create more contrast and helps the ray tracing version of Exodus keep its dark look from the original. To be honest, the default look of the game feels like vaseline is smeared all over the screen; this is where the Fake HDR filter helps out by making the colors, darks and brights "pop" more. With these 2 filters you're only losing a few fps, maybe 3% at most, BUT IT LOOKS SO MUCH BETTER!
My reshade filter sweet spot settings are -
CAS.fx: Contrast 1.0 , Sharpening intensity 0.63
FakeHDR: Power 1.3 , Radius 1= 0 , Radius 2= 0.060
There are also a few .ini file tweaks to fix some annoying things the game has by default, like mouse acceleration, motion blur still being present, etc. A list of fixes can be found on
@t
Lastly, there are some Nvidia Control Panel & Nvidia Inspector Tweaks I've made to squeeze out the most performance such as disabling Ansel, etc. If you wish to find out more details contact me on Discord ( V1rtuou5@/2213 ) and I'd be glad to help you get the best possible experience, ENJOY! P.S. - If you're not playing on RANGER HARDCORE with no HUD...YOU'RE DOING IT WRONG :P
hairworks tanks performance.
The big difference between HairWorks and TressFX is that the first uses "stream output" (also known as transform feedback) and the other uses compute shaders. The stream output specification was created to capture geometry so that calculations can be performed on it (like modifying positions to simulate movement from the wind or from character motion). The specification requires that the output is written in sequential order, which proved very inefficient on tiled GPU architectures, so now we have the compute shader specification, where the geometry operations are done asynchronously.
Compute shaders are the recommended approach, but stream output was kept to avoid compatibility issues with older APIs and the games that use it. It is still used in modern engines: Unity uses it to render characters, The Witcher 3 uses it on some monsters like the sirens, and Mafia 3 makes it a requirement.
Stream output hurts AMD's GCN architecture significantly, since the GPU needs to wait for one point to finish before moving to the next, leaving it underutilized for a while. It gets worse with bigger GPUs, since there are more CUs sitting without a task; in other words, it is more sensitive to frequency than to raw compute power. This should also apply to Nvidia's Turing architecture, since it's their first asynchronous one (remember that in the presentation Jensen said they can now perform integer and floating point operations simultaneously). You can see that Turing's biggest performance gains were in AMD-optimized titles. You can also look at Resident Evil 2, which leans heavily on compute shaders to calculate illumination: in that game the difference between the GTX 1060 and the RX 580 is around 20%, but between the RTX 2080 and the Radeon VII that difference doesn't exist.
Also remember that HairWorks uses tessellation and MSAA on top, so you get more geometry plus an AA mode that works on that geometry. The TressFX found in Shadow of the Tomb Raider is a modified version named PureHair that uses tessellation on top; it doesn't cost nearly as much performance since the geometry calculations are done asynchronously, and it also isn't overdone.
I hope that with the Turing architecture more and more game developers stop using stream output in favour of compute shaders. CD Projekt needs to remove all stream output usage from their engine in order to have a well-optimized game, and the same goes for Unity.
If you want to know more, here is a blog explaining why stream output is bad and why it's now in the Vulkan API:
jason-blog.jlekstrand.net/2018/10/transform-feedback-is-terrible-so-why.html
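(Not GPU code, just a rough CPU-side sketch in plain Python of the scheduling idea above: forcing results to be produced strictly one after another, as the stream-output path effectively does, versus letting independent work items run in parallel, as the compute path does. vertex_work() and the sizes are made-up placeholders, not anything from 4A's engine.)

```python
# Conceptual sketch only: plain Python standing in for GPU work, to show the
# scheduling pattern, not the math. vertex_work() and the sizes are made up.
import time
from concurrent.futures import ProcessPoolExecutor

def vertex_work(i):
    # Stand-in for per-strand hair/geometry math (e.g. wind displacement).
    x = float(i)
    for _ in range(50_000):
        x = (x * 1.0000001 + 1.0) % 1000.0
    return x

N = 400

def ordered_stream_output_style():
    # Each result must be produced and committed in buffer order, one after
    # another, so nothing else runs while an item is being worked on.
    return [vertex_work(i) for i in range(N)]

def unordered_compute_style():
    # Independent work items are spread across all available workers;
    # only "every item gets done" matters, not the order they finish in.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(vertex_work, range(N)))

if __name__ == "__main__":
    t0 = time.perf_counter()
    ordered_stream_output_style()
    print(f"serialized ('stream output'-like): {time.perf_counter() - t0:.2f}s")

    t0 = time.perf_counter()
    unordered_compute_style()
    print(f"parallel   ('compute'-like):       {time.perf_counter() - t0:.2f}s")
```

On a multi-core machine the second version finishes several times faster, which is roughly the same reason wide GPUs with lots of idle CUs hate the serialized path.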
The way it's meant to be gimped! Radeon 7 takes a big fat dump on the 2080 once you disable the shit Gimpworks tesselation!
@CREED Whatever you say champ, I OC'd my 7 to just over 2Ghz core and 1200Mhz memory and its beating almost all 2080 OC tests I'm seeing.
Man that advert for the movie before the video was pretty good.
Thanks for this guide.
I have a GTX 1080 and play at 1440p. Before this guide I used "HIGH" and "Tessellation ON", but now I've switched to "ULTRA" and "Tessellation OFF" and must say:
The game looks better with almost the same performance.
I had the same card and the same options too; I will now try Ultra with tessellation off and see if it looks better at the same fps.
Andre Lundquist was it any better?
@@maincharacter2677 Yes, but I ended up playing on High with tessellation on, and I added this ReShade: www.nexusmods.com/metroexodus/mods/1
Huge visual improvement; it looks like Extreme graphics on High settings, or even better, with barely any fps loss at all.
Andre Lundquist thank you!
@@aNDREeSnell Nice, I'm running the "Tweaked 2: Quality High + Hairworks On + Tessellation Off + Gamma -1" preset found on this site www.tweakmygames.com/post-unique/Metro-Exodus with an RX 580.
I'm definitely gonna try this reshade mod !
I was looking for a channel like yours. You do extensive work, it's very informative, there's no 'clownesqueries', no faces, no nonsense. I'm in! :)
Damn, Nvidia is ready to sacrifice the performance of their own customers as long as they can cripple AMD... That leaves a bad taste in the mouth whether you are an Nvidia or an AMD customer.
If AMD were more competitive on high-end cards and offered ray tracing as well, I wouldn't have switched to Nvidia.
Anyway, I'm sure AMD can do better than Deep Vaseline Super Sampling with DirectML (it's hard to imagine doing worse than DLSS, as DLSS does even worse than a simple resolution scaler)
AMD HAS offered ray tracing for far longer than Nvidia. Go look up 'Radeon Rays'. It's part of the Vulkan API and has been for several years now, but only after Nvidia did their dog and pony RTX show with grossly misrepresented footage (that Star Wars demo that needed a host of Titans to render at sub-24 fps) has ray tracing apparently become a thing.
Just like how any phone tech like wireless charging or face/fingerprint scanning ONLY starts existing after apple does it...
@@zybch Who said Nvidia invented ray tracing and was the only one that could do such a task?? Not me. I despise Nvidia's proprietary stuff and selling practices, but I despise toxic fanboys as well, whether they are green or red.
However, nobody can deny they're pushing real-time ray tracing for games more than anyone at the moment. And Nvidia invented the dGPU and programmable shaders.
I am very well aware that AMD can do ray traced rendering too. But does Radeon Rays enable AMD GPUs to render ray tracing fast enough for real-time use in gaming? No. Are there any games featuring ray tracing through AMD Radeon Rays? The answer is again no.
Did you know that any GPU, whether from AMD or Nvidia, and even before RTX/Volta, can do real-time ray tracing?! Just at much poorer performance than what RTX cards achieve today, because those GPUs weren't specifically designed for real-time ray tracing. That is how PowerVR was able to design an ASIC in 2016 that delivered the ray tracing performance of a 980 Ti at only 10W. It was like comparing a desktop GPU to a smartphone GPU.
And haven't you seen that the Star Wars demo runs at over 30 FPS at 4K with DLSS on a single RTX 2080 Ti?
Even an RTX 2060 can run the Star Wars demo at over 24 FPS at 1080p without DLSS, since it handles the demo at 2560x1440 with DLSS at around 35 FPS.
@@SagittariusAx You're so damn thick its amazing you can get yourself out of bed in the morning.
@@zybch Prove me wrong instead of just being an asshole.
It's much easier and better for viewers to spot differences and judge for themselves if you use the whole screen for each setting, for example going from low to medium to high etc and switching back and forth depending on the settings you are currently talking about. Other than that, great vid much appreciated!
I don't think the devs care about AMD users at all. Both AMD CPU and GPU performance is far below what many other titles are able to achieve.
I'm honestly fine not buying this game on principle. PC games should work for all gamers, not just a few. I didn't get into PC gaming to play this exclusivity BS that Nvidia seems to encourage. It's not exclusive in the strict sense of the word, but it encourages that mentality.
Take a look at the Xbox One X version of Metro. They've done an outstanding job on AMD hardware, and hardware that's far more limited than what is required to run the game at similar fidelity (generally between High and Ultra) on PC, and way, way cheaper. So they can do it, but on PC they don't.
What does the Xbox not have? A GPU produced by Nvidia and the associated GameWorks program, whose only task is to make any competitor to Nvidia look worse...
You can get away with using tessellation on AMD GPUs; you just have to lower it from x64 (or whatever is used) to x4/x8, and the game will look better without a huge performance hit.
Better than off, and not much worse than the default on.
@@supercalifragilistaphobic2146 Honestly, when not looking at screenshots I can't tell the difference between 8x and 64x tessellation.
Cryisfree can i do something similar with nvidia ?
It's a vicious circle with GameWorks and game sponsorships. GPUs are evaluated by performance in top-selling games, but Nvidia is able to either sponsor or get GameWorks into more games than AMD. We can't blame AMD for not competing; their hardware is up to the task, just not against a set of games designed from the ground up to run better on the competition.
This. With Nvidia's bigger marketshare, it's no wonder they get more devs to make games that run better on their product.
@Chiriac Puiu Yes. It's one thing to make your product better than your competition. And it's another thing to do that through unethical means. What Nvidia is doing is the latter. Which is why I switched to Radeon 4 years ago. Can't wait for Navi to be excellent in price-to-performance this July.
Except their hardware doesn't compete; the Radeon VII is an over-volted, overclocked Vega 64 and can't even beat a 1080 Ti. Nice buy for $700 :D
Very useful and informative content.
I have watched many videos to solve my graphical issue of this game but couldn’t find a suitable one and then your video came and spoke everything out loudly.
Thanks mate cheers 💙
All flavors of gimpworks plus RageTracing and Deep Learning Smudge Subsampling
wile123456 How are those AMD features? Oh wait you don’t have any! 😂
@@Mr.Honest247 no they are shit Nvidia features everyone hates no matter their brand of gpu. I'm always so shocked how fanboys are so bad at reading
The AA is actually pretty good. Literally the only problem is its an RTX card only thing.
@@Mr.Honest247 tressfx> hair works Free sync > gsync . if you consider copious amounts of tessellation a feature i feel bad for you LMAO
wile123456 I’d understand your post better if you used proper grammar. Why is it that every AMD fanboy has atrocious grammar? There must be an underlying reason for this... I’m sure the answer won’t be pretty...
Best comparison video so far for this game. Thanks!
I actually feel gameworks wasn't too bad on this one. Hairworks only drops a reasonable 2-5 frames and PhysX hardly affects it at all on my 970.
I played this game on my overclocked 1070 at 2K and barely got stable 60fps in most areas. Still a great game. Might replay it once I get an RTX 2080 or whatever card comes out by then. The game definitely has to be experienced with RTX on. The Global Illumination difference is insane!
Mate, this work is marvelous, keep doing it; it's the sole reason why I'm subbed to this channel. Thanks very much.
Thanks Tim and Steve for all the work you do!
Which Nvidia and which AMD gpu did you use for the tessellation comparison? Vega is 1080 level in tessellation.
PhysX runs on the CPU and makes the game go from playable to unplayable on one of my i5 3470 PCs. No effect on Ryzen.
The PhysX is for physics-based debris when you're shooting things... When you shoot the ground, chunks of dirt or whatever will fly up and bounce around... stuff like that... it's pretty subtle but a nice touch
What is red and smells like blue paint?
Red paint.
Good to know, will definitely use this info in 2020 when it launches on Steam!!!
What happens with the tessellation performance if you cap a maximum in the AMD driver software?
You"ll get better fps if you'll lower tessellation quality from AMD driver. The default tessellation quality is set to 64x. Some people recommends to use 8x or 16x and some recommends to use between 4x and 16x. If you have the game and a amd gpu try it and see how much it will improve your fps and if you like the quality.
This game got cracked by cpy.
Your optimization videos are the best! Thanks!
I know it's not the most demanding game, but is an Apex Legends Optimization Video possible? Love your videos! :33
Apex Legends is way more demanding than I expected. I'm getting 30 fps on lowest settings and no AA, when in Fortnite I get 80-100+ fps on lowest settings at 1080p with max textures
I mean, it's a competitive title anyway, so set everything to low or off besides 16x AF at native res :)
Apex is surprisingly demanding. Turn off ambient occlusion, it's the most demanding setting by far.
Your pc specs?
It's crazy how heavy it is on the GPU. I really expected to get a locked 144 fps at 1440p max settings with a 2080 Ti... The game also has some issues with G-Sync and/or frame caps, as I get fps drops as soon as I use either of them.
Mint vid; I can sit and adjust while listening and watching. Learnt a lot, and it plays at a steady 60 fps. I have been trying to get this to play well for a week haha. Sub gained.
Amazing work, once again! However, can you somehow include GTX 760 in further bunch-of-cards testing? Thanks!
From my experience with the 760 in the past, the GPU is fillrate-limited and performed well at 900p for fillrate-heavy games like FFXV. I'd say that 900p on medium to low will net you good performance.
It's obvious the drivers for Kepler cards are gimped, so at this point testing Kepler GPUs is pointless. The 780 Ti is beaten even by the older 7970, and the 970 is twice as fast. The much weaker 760 (compared to the 780 Ti) scores only 13 fps :P.
static.techspot.com/articles-info/1795/bench/Medium.png
Lucky Number8 that test seems very inaccurate.
@@EvilTurkeySlices Can you prove this test is inaccurate? It's Hardware Unboxed's test after all, and you are watching their channel right now :).
Lucky Number8 an RX 550 definitely doesn't beat a 770; the 770 is similar to a 7970 in performance. I own both.
Physx for example makes rocks ripple down a hill after shooting or moving a boulder. Or possible debris flying off a wall after shooting it with bullets. Or rock bits flying after throwing a grenade.
Waiting for lowspecgamer optimization. LoL
Can't wait to play this game in a year when it's released on steam :)
Nvidia notworks is the cancer of PC gaming.
Awesome video mate thanks for the help!
Meanwhile I'm happy to get 30 FPS on low settings with my AMD Ryzen 2200G with Vega 8 graphics
LMFAO
DX11 also performs better than DX12, which is worth mentioning. Or at least it does on mid-range GPUs.
I completely forgot that the metro games used these worthless presets for almost everything with no options to manually tweak everything. It's just lazy.
Just another reason to wait a year to play this game, I guess. Since my 480 can't handle 1080p60 at high, I'm probably going to need to upgrade, though there's nothing that really makes sense yet. Guess I'll be waiting for 7nm midrange cards.
Let's hope that Navi delivers!
Highly doubt devs will add more options
@@dainiusvysniauskas2049 I didn't mean that I was expecting them to add any settings. I meant that by the time its out on steam (and hopefully gog, as that is my preference) there'll be new midrange cards out more capable of playing this.
@@HeretixAevum Oh, I see. But frankly, with couple of settings turned off even current cards should be capable of providing very smooth gameplay
@@dainiusvysniauskas2049 It just bothers me that I'd have to turn it down all the way to medium to get a reliable 60+, since there's some pretty noticeable graphical downgrades at that point.
I'm using an RTX 2060 and 16 GB RAM on a laptop with default Nvidia Control Panel settings. The fps drops below 20 frequently during gameplay.
Any solution for this would be much appreciated
Metro: Exodus has been cracked!
Shading rate does also add a supersampling effect. It doesn't just increase the shading rate.
Bruh... this time RTX ON to get global illumination is worth it... trust me... it’s much better than what you get in Battlefield V (and its reflections)... you don’t need to play in 4k... getting closer to prerendered graphics in real time is more important and I don’t care if it’s not 144fps... pff... the people these days...
Go back to watching the 10 hour 'it just works' video and praise Jensen Huang.
Adrian Z man... I mod games and I understand about graphics... fuck Jensen... Ray Tracing is just the logical step going forward in graphics technology cuz the damn lighting is precisely what makes the big difference between prerendered cutscenes and real time graphics... it’s just that with the hardware we have right now many people won’t understand yet... only time will make them open their eyes and go like “ohhhh it was all about simulating how the light bounces in real life creating reflections and refractions”... oh no wait... that is too deep... lol
Only problem is even the 2070 struggles without RTX in 1440P (I've seen 50fps in some areas) and RTX kills its performance. In 1080P or with a 2080+ maybe
@@kendokaaa let me just go throw 1200$ down on a card that can even play it at decent framerates at 1080p
"I'm clearly too much of an enlightened graphics god... I know too much about graphics and I worship Jensen Huang everyday by watching his RTX reveal event... I'm far too deep for mere peasants to understand..." Nigga that's what you sound like. Get off your high horse and accept the fact that the devs and Nvidia crippled the game's performance.
Caspian Sea murders my performance. Go from 80s to 45 real quick
7:11 --> PC Master Race
Great video. I'd love it if you guys did SLI benchmarks in these. Not for every card, obviously; just testing one SLI setup would be great, just to gauge the level of support/scaling/issues.
This game is beautiful.
Not nearly as beautiful as RDR2. It actually looks outdated in comparison, yet runs like total shit on hardware that overpowers consoles by a significant margin. PC gaming never changes.
@@Vsevolod3788 As a hardcore PC GAMER with a GTX 1080 OC'd I'm disappointed that the visuals aren't good enough to justify the performance hit in Metro Exodus. Red dead 2 looks stunning. :)
1:05 I think 'incidentally' would be more appropriate here. It's not a coincidence that you're using the best-performing card for capture purposes.
Boycott Nvidia and Gimpworks games until Nvidia makes gameworks an open standard instead of a non optimizable black box.
The Extreme and Ultra settings are so intensive that the guy on the right actually stands up! 03:08
😆
Complains about good quality fur/hair details but doesn't know what PhysX and tessellation do in Exodus. Damn, nice judgement for clicks there! You should really start following Digital Foundry's Metro Exodus analysis to sharpen your ideas about this stuff. About TressFX: why don't we see good implementations of it in non-Tomb Raider titles even though it's AMD sponsored? Like the RE2 remake, AC Odyssey, or the likes of The Division? It's like saying that just because HairWorks is demanding on certain GPUs which are weaker in tessellation, it shouldn't be implemented for much better visuals. Metro's HairWorks might not be that noticeable, but on the animals in The Witcher 3 it was damn good. It's a good thing we have channels like DF which can actually pinpoint the differences for viewers, unlike some who have a hard time differentiating basic bullet PhysX.
TressFX is open source so most developers use their own implementation without the branding; Square Enix is calling it Purehair (Mankind Divided, ROTR); Guerrilla games also used it in Horizon Zero Dawn and since it's included in their engine it's probably being used in their next project and also in Death Stranding;
Just because you don't see a TressFX logo everywhere it doesn't mean it's not being used in one shape or another, especially with both Xbox One and PS4 being based on AMD hardware;
@@d4t6ix Well I was going to answer, but it seems that you already did. Brilliantly by the way ! Good day to you sir !
@Narbonne Alexandre; Thank you, kind sir; Have a great day!
Trying to find a way to combine the words "shill" and "consumer"
Would you ever consider doing this style of video for older games that are still pretty popular?
The game is so overrated, trite and dull.
I pity your taste
Great video! You have to do more ''game optimization'' videos! :D
Thank goodness there's not a lot of difference between AMD and Nvidia GPUs w/ PhysX this time around.
It drove me mad that in Metro LL it was CPU-based for users of AMD cards yet included FLUID-based PhysX... GAH
You know, I find it interesting that you found almost zero effect from motion blur. I had horrible stuttering (and almost total freezes for up to 1 second) when doing turns larger than 90 degrees with my GTX 1070. Turning off motion blur, the turn stuttering went away. (I still get these weird freezes with every "action" a character makes, for example reloading or pulling up your backpack, or with things that are heavily scripted.)
AMD really does have to do something about their GPU memory management, which seems especially problematic on lower-end cards. If you look at 1080p, the RX 560 4GB vs 2GB VRAM difference is massive, especially when compared to the 1050 3GB vs 2GB...
Holy macaroni I didn't realise just how much of a performance hit Tessellation had
After months of troubleshooting and a lot of denial (I convinced myself that 8 GB of memory was the cause of the issue), I can finally conclude that 95% of my stuttering issues were due to the HDD being too slow or faulty. Switched to an SSD and all I can say is... Holy shaait! It worked!
I would honestly love to know which GPUs you tested on. I think that's kind of imperative to a performance comparison of settings. As we know, certain generations of GCN handle tessellation a lot better than others.
Did you get to try any SLI testing? I heard the game scales pretty well.
9:48 The PhysX option is what makes smoke move, adds snow physics, and sends rock fragments flying when you shoot the ground or a wall
Tip: do some things manually in the driver's preset for the game. You will get an extra 15% FPS.
With my FX-8320 + 16GB + 1063 at QHD I went from 38-44 FPS locked at medium-high settings to 54-60 FPS in Resident Evil 2 Remake just by adjusting some options in the driver's per-game/global presets. That's 2560x1440, for reference.
I love your videos but this video is not representative. The early areas are less demanding than the later ones (especially the desert).
I have a GTX1070 myself, overclocked to 2000MHz. I get like 45 FPS on ultra in the desert while normally I get about 70-80 (1080p).
You should really consider this difference if you evaluate the performance of the GPUs regarding Metro Exodus.
In Caspian Sea(desert), foliage shadows kill FPS dramatically.
Advanced PhysX makes it so that when you shoot the ground, debris flies out and stays there instead of you just getting a small bullet hole
This game doesn’t look bad even at low
Thank you for the advice!
To fix issues with AMD cards in GameWorks titles, go into your control center: Gaming > Global Settings, set Tessellation Mode to "Override application settings", and set the maximum tessellation level to 8x. It will fix all the issues across the board.