@@fjalls depends on the game. Usually I take the preset that gives me sufficient FPS for what I play, then put the texture resolution and render distance to max, I try disabling AA (1440p and higher makes aliasing less noticable), disable triple buffering, vsync and motion blur (although I'm pretty sure the effect on performance is minimal), and sometimes I play with particle/foliage density to achieve good look/perf balance.
@@WayStedYou Consoles are pretty much always significantly less powerful than medium to high level gaming PCs. Consoles are cheap and very accessible though, while PC tends to be for serious enthusiasts who want the most out of their hobby. PC gaming is less accessible due to the greater cost of hardware and technical know-how required to operate the machine.
@@sonicbroom8522 I second that, also 3DRealms were great at optimizing the game like Duke Nukem Forever back then it never put strain on computer system.
I would say Rainbow Six Siege is what happens when you got an insanely optimized game. I mean cards from 10 years ago can run it at over 60 fps at 1080p on Ultra.
I usually set everything on ultra, then I turn off all the options I hate with passion (motion blur, chromatic aberration etc.) If I'm not satisfied with my FPS, I configure each option individually, to see how much they change in terms of graphics quality and performance.
@@toxicavenger6172 some settings like film grain, chromatic aberration and lens flare aren't things we see with our eyes but cameras see it like that but i've no idea why games try to replicate a camera lens rather than human eyes
I really needed to hear this. I find that I’ve started to expect too much from games. The best gaming time of my life was being a kid playing MKII and Altered Beast on my Sega Mega Drive, yet as an adult I want to play at 1440P Ultra settings and I feel let down if it’s not performing good. As my mum used to say “don’t let perfection be the enemy of good”. Really appreciate this perspective.
Myea, I pretty much agree. I also come from a time when it was just "Cool, this game is a thing and runs.", no matter how it looked or performed. - But that's just not how games are anymore. They're not two-dimensional side-scrolling games that are fine at like 30fps or anything like that. - Games are much more made after "real life", even if it's fantastical, and it SHOULD basically be high-fidelity and high-performance. - There's nothing wrong with that, it's just how it should be, really. - Then again, expecting too much is indeed not good.
@@LawrenceTimme It’s like you didn’t read a thing I said lol. Also, you’d have to be high on crack to think that 1440p is pixelated, so I take it you’re not actually serious. I have a 4k display too, but I prefer to game on my 1440p 144hz LG monitor. The issue is that I worry too much about frames when it doesn’t even really matter. I gamed at 720p 30fps for years and loved it. That was enough to have fun then, so it shouldn’t matter now when I’m getting a much better experience now.
@@Psyopcyclops Many people are influenced by marketing and hype and while some of us are more aware than others we are all effected by it if you consume media from channels like Digital Foundry or other channels that glamorize high end hardware, maxed out settings and ray tracing. You can often see the effects of hype and marketing in the comments section when people talk more about maxing out the settings and their hardware than the game and what they liked about it. There's a reason why hardware manufacturers make sponsored content with some of the bigger channels, why they send them their flagship for "review" well before the item goes on sale because influencers are a part of marketing today. I'm not saying there's anything wrong with wanting to max out graphics of course but what I'm saying is that you can enjoy a game and get immersed into it playing in high, medium or even low if you had no other choice. Of course it depends on the person and their preferences as well. I for example can dive into and enjoy a retro rpg as much as if not more than many AAA games today.
@@Psyopcyclops Unfortunately the more educated about games we become, the more you'll feel that way. I know I would of saved a lot of money and probably be just as happy with something like Series S if I didn't understand the differences between settings. That said I do like the rest of the possibilities like modding our games on PC and can't go back to exclusively playing on console's ever again.
As a small-time indie dev, I'm kinda glad you made this video. When I've implemented graphical settings, I've always thought to myself that I might as well let people set the render scale to 16x native if they want to, or render shadows at 16x normal resolution if they want to. It's literally zero skin off my back to just increase the max value of the slider, and I imagine many of the "ultra" settings come from a similar mindset. However, I've started noticing people who just max out everything without second thought, even though the maxima are just ridiculous settings, and then they come and complain that it doesn't perform well, so your point of "complaining about poor optimization at ultra settings" really hit home with me. It pains me, but I've started thinking that I probably have to start limit the maximum settings to "reasonable" values just to keep people from harming themselves.
I think a good approach is to make somewhat reasonable decisions to push hardware and future-proof the graphics a bit, but if there's a lot of potential "harm", you can make it customizable via an INI file or such free of in-game boundaries. But I guess that depends on the engine you work with, how easy this can be done and if you want to spend the time for that. Also, resolution scale shouldn't be a part of the presets and default to 1x/100% and be adjustable separately. I assume most people just leave it like that and don't push the slider to max, like it's done for the preset.
the reason they maxed everything is because they spend a lot of money on top of the line graphics card and got dissapointed that they could have bought a more cheaper alternative
I get the impression that the highest modes are basically 'optimizations off' and it's really meant for 'forward compatibility', or in other words, use those settings in the future when revisiting the game and your new GPU can run it at such a high frame rate that it doesn't matter.
That's absolutely true, and some game even warn you about it, like Kingdom Come for example. But that worked until last year. Now that you can't find a new GPU to buy, it's just depressing to look at these gorgeous games from my 980 Ti while knowing I will never get this kind of visual quality. I can't afford (and don't want to) pay scalper prices, and when I upgrade it will be for something better than my current 980 Ti, but anything above that GPU is still $500+ today while it's really a mid-range card these days. Entry-level and mid-ranged GPUs simply don't exist anymore. A brand new RX 590 was going for under 200$ two years ago, often under 150$ with good deals, now you can't find a used one under 300$, and you'd still need to be lucky for that as most are sold 400$+
@@mikeonlinux5491 980Ti is like 1070 or 1070Ti. If you are willing to play at 30 or so fps...let's say 30 - 40 instead of 144 and above,you can match the visual settings of vastly supperior hardware just fine, especialy if at 1080p. 3090 not required...unless the game is an absolute mess requiring 3000 patches to work well enough on anything below highest of highest end.
@@GameslordXY Absolutely, I play Cyberpunk 2077 at capped 30 FPS, because I'd much rather have a butter smooth 30 FPS than FPS doing yoyo between 35 and 60+ FPS. This somehow just feels smoother. I play all my games with ultra settings, and this 980Ti is still good enough, especially since as you said I'm playing on dual 1080p 75 Hz monitors. It's just that I tend to like games that are terribly optimized lol like that CP 2077 or Star Citizen, and for these, the 980Ti can be limiting. I'd rather have RTX active and 60+ FPS like in my other games.
But often these ridiculous settings scarcely make a visual difference. Like Tim mentioned I have found games where a lower quality setting looked better. Why wouldn't future players prefer more fps or quieter fans than an unnoticeable effect?
In the days of yore, "Ultra" was a means for the developer to just push boundries, to see how far they could take it. Or so it seems when you look at older titles. But High could sometimes nearly double the framerate you got. hesedays, when you look at Ubisoft, they clearly label they intent for you to play at "High" settings, they always mention this on their spec sheet for games. In generall I have always felt games are optimized around the "High" preset, not Ultra. Ultra feels like a "Do waht you can without breaking it" kind of setting for developers.
The ultra settings are for when you buy a new gpu and you are all enthusiastic about it and crank everything to the max. After a while the honeymoon ends and rationality kicks in, so you lower them to high for more fps or for the same fps with less power consumption and noise. Personally I prefer optimized settings as sometimes the high preset drops some settings such as anisotropic filtering to x8 which is dumb as x16 has hardly any performance impact.
I've actually found that GeForce Experience doesn't do a bad job at optimization most times since the software already knows what card you're running and what it should be capable of. Always have to murder motion blur though.
Ultra definitely looks better in a lot of games, its subtle but it is. It's like RT lighting, its subtle but it's a big difference as you go through the world
Yeah, That's not true at all for most people who can crank everything to max. I typically run everything on maxed settings in 1440p. The only reason I would lower settings if it's an online fps game where I want to hit my monitors 165hz. But if I can get 165fps with maxed settings I wouldn't lower the settings ever. Before when the 2070 super came out again I ran everything maxed out but in 1080p. Unless my card has issues running it I wouldn't lower the settings...
Some differences, even if there are plenty on ultra, dont often look objectively *better* . They just look different. Medium and high graphics are sufficient, and really it is only the *low* textures that look objectively bad due to low poly , pixelated look.
This is a pretty good summary. Low settings are there to make a game playable on old or non-gaming hardware (like work laptops). On med and high settings almost all reasonable performance cost improvements have been gained. The rest is just marginal gains at huge cost.
Keep in mind that many of not most multi-platform AAA titles from the last 3 to 5 years use comparitive low or low-to-medium fixed settings on consoles (PS4/XBone) while running at 60, or even just 30 FPS. I'm sure that is going to drastically change once developers start learning real optimization techniques for the current gen consoles. But by the time those games start rolling out, PC hardware will have had a few more upgrades to overcome the lack of PC optimization, and the settings will still likely be comparable to medium or lower settings for the PC versions.
Arguably the biggest performance impact to the Borderlands games is to disable the black outlines, which are applied as a post-processing effect. These instructions are usually given with a caveat that the game looks significantly worse, but I think I prefer it. In any case the net effect is far less than often stated, especially when you're in the middle of a gunfight.
I like to see a FPS showdown, how painful was it to listen to bitchy blue boys saying “we’re still the best” when they had 5-10 FPS more on games hitting 100+ FPS.
The difference might be minor, but if you have a keen eye, you can still see it. Especially for a game you put hundreds of hours in. For example, I know exactly how each graphics setting in GTA 5 will affect the look of the game.
Game developers should give better names and descriptions for the settings, such as: Low - Try this if Standard is unacceptably slow on your computer. Standard - Good visual quality. Captures the intended look and feel of the game well. High - Great visual quality. Recommended default for high-end hardware. Highest - Nearly indistinguishable from High, but with a significant performance cost. Punishing - Useful for benchmarking computer hardware or heating home. Zero visual advantage over Highest.
All studios would need to do is rename “very high” to “ultra” and no one would even notice, while still giving people the warm fuzzy feeling of being on the highest setting, which let’s face it, is 99% of wanting it.
Sometimes even putting games on medium is more than enough. Heck, if I have the choice of 60fps on high or 90-120fps on medium with slight less details, I'll always go for the medium settings because I really am so used to high fps gaming that dips below 60fps are noticeable and annoying af....
It's due to things like Draw distance and Level Of Detail (rendering and detailing of distant objects) as well as Shadow & Reflection Resolution. The latter is sometimes MUCH higher (often 2x -- for example going from 2048x2048 to 4096x4096 -- or even higher) when going from High to Ultra.
I used the Hardware unboxed advised BF5 optimal settings and went up 25 fps and I don't really see a difference Hope you guys will do another optimization of settings for games soon
@@Relex_92 warzone actually has like half the graphics settings unchangeable. Like it let's you change them but doesn't actually do anything. Also it's vram thing lies out if it's arse. My 8gb card has most things on low at 1440p but I have to lower textures because of vram usage
Finally someone is digging into this ultra madness :) Good point on the "if it runs poorly on ultra, it's poorly optimized" attitude. Also, I would like to see MORE optimized settings videos, it's very useful to know which settings have big performance impact for each individual game. It's not always the same and dependent on the actual implementation.
I think they do have their use especially for developers to flex an engines capabilities but I also think that we shouldn't bundle multiple aspects into 1 "Ultra" settings. Shadows for example can be broken down to draw distance, contact hardening, resolution, filtering, instead of just "Ultra". This would give players more room to configure instead of having to sacrifice certain aspects of shadows by dropping from just ultra to high. I've seen so many cases where dropping shadows from just ultra to high takes away soft shadows and dramatically decreases the filtering quality. This goes to other settings as well like reflections, lighting, post processing etc
It is useful for benchmarks OR if your current card runs that setting at high enough frame rate for your monitor. E.g. some may want to lock a 60fps or others will have a high refresh 144fps but accept a 1% low of 80fps.
One of my pet peeves is when a whole bunch of effects are just lumped into "post processing". Ubisoft used to be awful for this back in the day. Was no way to turn off motion blur without turning off particle effects lol
I do agree but they are not thinking about us who people who know how to use the settings they're thinking about people who are unfamiliar with the settings and what does what they trying to keep things as simple as possible
@@kennethshores8547 Just because a lot of people don't know how to work around them doesn't mean they shouldn't do it. They can still keep the presets with the settings and only anyone comfortable with going deeper can go into those "advanced settings" and the present name changes to "custom"
@@kennethshores8547 ubisoft games have done a good job recently by showing pictures of comparison in their settings menu but some aren’t really accurate and it’s really not much to go by, detailed explanations and comparisons would be preferable
IMO, most games' medium setting already look more than good enough for me and I'm perfectly fine re-visiting games 4-6 years later if I liked them enough to re-play them at higher details after a GPU upgrade.
@Go To Channel [𝐋!𝐯𝐞 S.A.X] Doesn't matter to them even if they can't tell the difference between Ultra and High, because they are going back to the graphics setting and set it to Ultra anyway, just so they can tell their mates they have bigger e-penis.
@@AngelicRequiemXcan barely tell the difference on newer game, just worse optimization and more resolution on texture that the human eye can't percieve smh
This was a good one! Only had PC for a year and I'm just starting to really get a grasp of settings and what not. This seals the deal for high. Thank you!
I prefer going through each setting individually untill I find the perfect mix of quality and performance. It does take a bit more time but you only have to do it once per game so it's not too bad. I do it like this because not every single setting has the same impact on performance. For example anisotropic filtering won't be maxed out if I use a lower preset than ultra but 16x anisotropic filtering pretty much doesn't lower the performance while making the game look better. Also I usually don't like motion blur and selecting some preset often turns on the motion blur. One of many reasons why I prefer PC gaming is because I can play around with settings untill it's exactly how I like it.
Anyone who has ever owned a potato PC will have learnt this the hard way already! This is a brilliant video. A high-end PC with a high resolution monitor with a high refresh rate, on 'high' settings is the best experience in recent games.
Well, this actually applies even for low settings, but in the exact opposite way. In many games, there are settings where the difference in HW demands between low and medium is pretty much negligible, but they bump the visual quality up significantly.
As a software developer, I would say that it's quite hard to do... * Game should detect your OS, RAM (amount, clocks and timings), your CPU (number of cores/ threads, frequency, available instructions, generation), your videocard (model, amount of VRAM, drivers version), SSD/HDD (read write speed for big and small files) * Run a test with current preset, make some assumptions on what to change * Change settings and run another test to make sure that it guessed correct. * Repeat last two steps if necessary (but no more that 3 times to not piss off a consumer completely!)
I'm not sure if it would result in good visual experience. Leaving aside complexity of such a mechanism, I'm afraid there would be frequent blurring and creepy change in scenery.
My biggest gripe with PC gaming is the potential minefield of graphics settings. The games that offer an immediate preview of a setting change are a god send. This at least gives the user a chance of figuring out what each setting actually does in game rather than having to try and remember what the visuals looked like before changing a setting. And that's if the game even allows the setting to change without a restart.
There's not point in saying that, as sometimes the High or Very High settings is barely different from Ultra, perhaps only when you have a still and look really closely and then compare. - But other games/engines might have more significant steps between the settings.
I wish games would allow downloading with choices of which graphics settings to download so that people who are playing on low, for instance, won’t have to store extremely high quality textures and models so they can save space too
That's true, I have 1080 monitors, and most games these days are 100+ Gb because of ultra quality 4K textures. If I could only download Ultra 1080p, I would get even more games installed :)
Not as stark of a difference but works the other way around as well, i.e., not downloading the lower res textures if you're going to play it at high settings
@@JohnFromAccounting depends on the game, resolution, and your subjective preferences. Eg. Raytracing might be more accurate, but that doesn't necessarily mean it'll be more visually pleasing. Art of all mediums has incorrect/inaccurate lighting & shading. Maybe the difference is only noticeable when you're not actively playing - at which point you have to question how much it's worth. In Doom Eternal, aside from particles & a few lighting effects, I couldn't tell much difference between medium & high at all, and low still looked good to my eye
If visually it makes little to no difference right now, in 3-5 years it won't look better. Also how keen are you to play 3-5 year old games for enjoying the maximum visual quality?
yes your kind of right i got similar opinion been on amd train and rx 580 where nice 1080p for two years than two years rx 5600 xt (1440:P mediun high)granted neither those where ultra but where able to run high texture keep in mind show some more visual intensive game and there ton easier game to run and if you buy two big title a years i guess could make it a 3 years refresh circle but than you get nothing as resell value in normale time wich certainly ain't as easier to refresh but kind dark age and not expecting whit more advance process and lower production out put likely that we will see anything great for good 3-4 years.
@@Hardwareunboxed I play new titles mostly on medium cause I don't have the best graphics card and optimisation on release is often lacking. And 3-5 years later if I enjoy the game I can crank it out to max, maybe even with a resolution bump. So why not?
I don't think I've ever started playing a game without quickly running through its settings. Been disabling motion blur, film grain, chromatic aberration, TAA, oversharpening, DoF, vignette, etc since I was a kid. Also usually go with medium/high settings on most options except stuff like textures and model detail, regardless of my rig. It sometimes brings it from 60 to 100 FPS sometimes with next to no noticeable difference and 5-10 minutes of effort.
Videos like this and Digital Foundry optimized settings are great. It's crazy how many settings barely change things visually up to medium or low, but you get huge performance boosts.
@@joker927 It'll probably be very similar to Far Cry 5, hopefully they've improved CPU utilisation a bit, as that wasn't great and obviously ray tracing will be pretty demanding until you have an Nvidia RTX gpu
Days Gone being the outlier seems to really be due to SSGI (which allows nicer shadowing & bounce lighting within screenspace), you could likely still step down on shadow resolution & such. So yeah, don't stick to presets, custom settings all the way!
I 500% agree. in most games, especially fast paced/competitive multiplayers i run medium with custom settings for details. i rather the game run smoothly that look pretty. if im playing a slow/med paced or single player then ill run ultra so hard it'll make you cry how beautiful a game can be. (who knew sims on maxed settings could bring a room of adults to tears...)
As a game developer i spend 90% of my time optimising medium and high settings as those are the settings that I actually want players to use. Ultra settings are just there so that players can turn things uo to max if they have the hardware to support it. When i get feedback saying my game runs and is "unoptimised" it's always from someone who insists on running ultra settings despite them basically looking the same as high settings but for half the fps.
Agree, i dont understand why gamers bother with max settings just to get screenshoot like graphics, for example in forza horizon 5 even at low settings (not lowest) i dont even notice that much difference compared to higher setting when playing normally as long texture still at high setting unless i compered them side by side using screenshot at static scene
@@REgamesplayer if the game looks great and runs well at default settings it is well optimised. If the end user decides to run 4xSSAA with all ultra on a 2060 and then complain about it being unoptimised that's user error.
@@bobjames1521 We are not talking about custom settings. We are talking about ultra settings. Ultra =/= Max. Furthermore, SSAA is actually a very good setting to turn up. At least you would notice the difference rather than whatever we saw Ultra settings to be in those games shown in this video.
I'm mostly concerned with average fps, aiming at 120fps in 4K. I usually set everything to max at first, check what the average fps looks like, and if below ~120 I start to tweak settings until I get a stable 120+. This is pretty much always possible, but in some extreme cases I've been known to lower resolution.
Agreed, this would be much more useful. especially if the game could automatically do that based on what monitor you are connected to, at least for the very first load.
that don't really work cause, most game engines can't change every single graphical setting without a restart or reload.. just some settings can be changed, only resolution can be easily scaled on the fly.
@@gabrielandy9272 They don't change "every single graphical setting", they just change what they can. Forza Horizon 3 and 4 had dynamic quality settings that would change things like foliage density and draw distance based on framerate target, yet both also needed the restart for certain settings.
Ultra is for the future, the rich, and those wishing to flex their e-peen. High is for those who want a balance of performance and graphics. Medium is for the budget conscious. Low is for the thrift store shoppers. Very low is for Mac users.
Ultra settings can be worth it though when your graphics card is powerful enough to run ultra settings at native resolution while your monitor has low refresh rate and thus can't take advantage of the higher FPS from high settings without screen tearing.
A lot of the time Ultra will bump up things like reflection probe cubemap projections up to 4k which is a lot heavier to render but is often practically unnoticeable, especially depending on the object that's reflective.
It's a cool video, but most of the time presets don't actually set up everything on ultra, I mean even "ultra" presets have some settings on high and you can crank it up to real ultra manually, so you'll get some more noticeable differences in quality.
Most people don't bother customizing their settings, they just click a preset and go. Having the "Ultra" setting only be kinda-sorta "Ultra" lets the game devs - who actually want their game to be playable regardless of how optimistic their players may be - chuckle a bit and say "Ha, yeah, here's your Ultra Setting, ya wank. Now just play the damn thing."
I think it depends on what hardware you have. I usually don't buy the flagship cards so turning down the settings to get good smooth frame rates is normal for me, I like to keep the framerates at 90 or higher if possible.
Textures always to ultra, shadows high/medium, ambient occlusion medium/high. Other than that turn most post processing off or low. This I find at 1440p gives the best image to performance ratio (1080 Ti).
Ultra on Cyberpunk and many other games is just 1.5 myopia mode, things close are still the same, things far away or on the side are blurry, and that's it.
In my experience the general "Rule of Thumb" for game grafics settings is: Low: If you can´t get the game running or you are an max FramesPerSecond fetishist. Medium: "Time to Time" gamer or low bugged gamer who wants to enjoy the game in a comfortable state. High: Full Time, hobbiest gamer that wants to enjoy the game in all its glory. Very High/Ultra/etc.: Enthusiasts that want to enjoy the full pontential of the games they play and/or the game is 5 Years old and your midrange GPU is getting bored on high settings.
Excellent video. My advice for cases where you need to turn down even more settings , shadows is very good choice to lower as making them less sharp and detailed makes them look more natural in most games ( not all as it depends on implementation).
Agree, in some games I want to lower them to make them look "better"/more natural to me. But depending on implementation, lowering can make them just more pixelated, which is not really wanted.^^
not to mention sure you may notice the changes when switching back and fourth looking for differences, but when in the action of the game, last thing you are worried about is reflections in the puddle on the ground
Maybe, but it all adds up to the general ambiance of the game. Sure, in the middle of a gunfight you won't notice these reflections in the puddles, but when you are actually walking around and taking in the atmosphere, trying to pretend you are living in that game world, these reflection help to get even more immersed.
I went with a 3060 ti for this reason actually. I play on high settings on nearly all games and don't plan on going 4k any time soon so it just made sense to get the least expensive current gen card and skip next gen then hope the 5070 is a beast lol. This gen has been nuts.
Man thank for you putting this video out! I have been saying this for years and am always so annoyed with the obsession for ultra settings for PC enthusiasts. It probably won't change the culture for most, but it at least will get the conversation started and some people thinking :)
Ultra were suppose to be a future-proof setting for someone in years after release with never generations of hardware. For some reason now a lot of players have to to play on those Ultra settings and bash about "unoptimized" game.
5:38 ah, that explains why the Ultra image appeared less sharp than High, another thing about not manually tweaking your games is that you'll leave a lot of effects that are subjective in terms of improvement to the presentation up to the developers and they generally have pretty bad taste/go with graphical trends that for a lot of people (like me) are straight up horrible and degrade the image quality; like here you can see Chromatic Aberration, and also things like Motion Blur, Bloom (this one was specially bad some years ago), film grain and etc Always tweak your games, even if you can run on Ultra, that's what the PC platform is for
Ultra is what allows me to tell my wife I need a newer better computer. This is how I escape the depression that is real life. I want the best images possible! Ultra is the only option!
I have been buying the top hardware on the market for 9 years. I ditched SLI for the first time when the 3090 came out (had 2 2080 Tis, 2 1080 Tis before that, etc). Same for CPUs… always the top. Spent tens of thousands on PC hardware over the years. 4K max settings with a minimum of 60 has always been impossible to achieve for at least 50% of modern games. For example, no money can buy 4K maxed out CP2077 and AC Valhalla. This would really affect me negatively. I am now slowly learning to leave well enough alone and accept that there is nothing wrong with lowering a few settings, especially if they don’t improve graphics drastically. Also, sometimes the blame lays squarely at the developers’ feet for not spending time to optimise their games.
The only settings I used is based on Gpu temperature not eye candy. Dial down and dial down till you get a good balance between graphics and desirable gpu temperature.
I mean its already quite common knowledge, so i thought, that games are made/optimized for low/mid/high settings, anything above is just achieved with brute force and the fps loss to quality gain ratio is laughable. For years now games do even lock good at low settings, something that was a big change at some point as previously games looked utter rubbish on anything but medium. The reason for all this development is simply put the streamlining process in multiplatform development. Where low/mid/high just resemble the different consoles the game is going to be released on. At least something good that came from multiplatform development.
Honestly, I wish they made ultra actually Ultra by that I mean making it where not even the highest end cards of today can run it but it's meant as an option to future proof it.
i think that's what they did a few years back. I can turn up the settings for old games to their limit and have it fly even on my modern laptop which is a potato for today's games.
ultra is for future proofing. They should allow even higher settings with a warning so that in the future games can easily be set higher with better cards
@Mc76 Same result does not mean equal weight. A 1000 dead of old age or a 1000 dead from desease in NOT the same. An expensive graphics card is a top tier card that cost a lot. An overpriced graphics card is the same card valued beyond its actual worth due to factors not relevant to the performance of the card itself. Get what I'm saying?
@JeremyCuddles Pretty much everything you said is wrong except for "circumstanaces", but that just avoids the issue of current price, real price and performance.
I agree its accurate looking at recent history but with announcements of shortages pretty much across the board for many components looking to hold out for quite a while (comapines announcing they expect to meet demand in 2024 etc.) tsmc increasing the cost of allocation by 20% and depending on most likely how accepted crypto remains and the way governments deal with it there is still plenty of uncertainty.
whoa there Tim your talking sense. Has Nvidia sent out another threating letter about your editorial direction yet? I mean they're "All in" on FPS draining features and your talking about something that goes against their philosophy of up charging up the sphincter for a minor bump in performance.
Glad to see this video on this specific topic nowdays that the marketing bs has stirred the interest towards such features (like 8k and Ultra) Ultra was never meant for the "optimal gameplay experience" . Obviously this video takes the graphics quality point of view first but, if you think about it, developers have certain target fps based on minimum hardware all the way to recommended hardware and that's what dictates the LOW to HIGH presets settings. Ultra is just flexing the engine with no regards to optimising the gaming experience smoothness or otherwise the fps or performance cost.
In a nutshell : Low = 20% quality Medium = 70% quality High = 98% quality Very High = 99% quality Ultra High = 100% quality High Textures vs Ultra Textures have huge visual clarity but for shadows, ambient occlusion, reflections, tessellation and other shit stuffs, there is no significant visual loss on high compared to ultra.
Great video! You should include No Man's Sky in your testing. The differences between presets in that one are easily perceptible and quantifiable. Also, stronger emphasis on VRAM and GPU temperatures would be much appreciated. FPS and texture fidelity are no longer the most important criteria in an ecosystem where the majority of GPUs are old and pretty burned from usage.
Thanks Tim! it would have been very interesting tho to run the test also with an older GPU like a 2060/Vega or something similar, so even people with last-gen GPUs could see that they can still run most of the games that they want to play.
Yes you hit the nail on head if take a look at say steam This quick grab of the lists of top 12 GPU that total around 45% of GPU in play So I like to see guide that features GPU that most people have and playing with It is interesting that 3070 is 11 on this list and is only has 1.69% So most people are using older hardware can we see more of this and advice on getting the best out of it NVIDIA GeForce GTX 1060 NVIDIA GeForce GTX 1050 Ti NVIDIA GeForce GTX 1650 NVIDIA GeForce RTX 2060 NVIDIA GeForce GTX 1050 NVIDIA GeForce GTX 1660 Ti NVIDIA GeForce GTX 1070 NVIDIA GeForce GTX 1660 SUPER NVIDIA GeForce GTX 1660 NVIDIA GeForce RTX 2070 SUPER NVIDIA GeForce RTX 3070 AMD Radeon RX 580
I always play games on High. Ultra is meant for reviewers to provide screen shots and clips of the highest graphics settings. I can’t tell a difference between High and Ultra.
The only game I ever waited to play on max settings was RDR2 for which I waited a couple of years and played it on a 6700XT on max seetings. It wa totally worth it though. Cant imagine playing that masterpiece any other way than maxed out. I new from the start I needed to wait to experience that game in all its glory.
This is why I use a mix of medium and high, I cant tell the difference between some settings and my fps sky rockets. Its allowed me to have a satisfactory experience at 4k 120 on an rtx 3070 instead of a 3080. And where 4k 120 doesnt work, 1440p with TAA does the job
I've been happy with a tweaked Medium setting in most of my games for a while now. Though I will say that from when Skyrim first came out my hardware could only handle it at low. A few years ago I got a 580 and seeing it on the highest preset made it come back alive for me. So my thoughts ultra is for your old favorites to breathe new life into them and screenshots. Stick with a mix of medium and high for the new age stuff
Does anyone actually use the presets? i feel like most PC gamers configure the settings themselves.. you can even go over the max preset in most games if you do that
I rarely bother to change individual settings unless I really want to gain extra performance for minimal visual fidelity loss. Id rather play and get immersed instead of switching settings periodically midgame, or wasting half an hour just turning the knobs before i even get to play :)
Turn up textures and texture filtering to Ultra though 😉
✔
I always run High settings, even with my 3090. Just isn't worth it performance wise.
And please, reduce pretty much any other setting before resorting to running your monitor at a non-native resolution. 🙃
@@sciencetablet2634 it's the worst. Fxaa>Taa all day haha
Nvidia & AMD CFOs would like to have a little chat with you guys :)
The best graphics setting has ALWAYS been "Custom"
What do you mean? You turn off motion blur or what?
@@fjalls depends on the game. Usually I take the preset that gives me sufficient FPS for what I play, then put the texture resolution and render distance to max, I try disabling AA (1440p and higher makes aliasing less noticable), disable triple buffering, vsync and motion blur (although I'm pretty sure the effect on performance is minimal), and sometimes I play with particle/foliage density to achieve good look/perf balance.
Amen
@@arowhite it was a joke by ok 😊
No, it has been Tim's preset
No, you have to buy the most expensive videocard. Buy at least 2.
The more you buy, the more you save.
It just works.
yes Lord Jensen 🥴
don't ask questions, just consume product then get excited for next product.
HahahahahHa
Are your 'Pascal friends' still your friends if they held onto their Pascal cards instead of upgrading? Rhetorical question, no need to answer. :D
Thanks Jensen, I took your advice and bought 2x AMD 6900XT!
A wise man once said,
"Ultra is for screenshots!, High is for gaming."
-TD
For me it was from 800x600p for gaming and 1920x1080p for screenshots.
Disagree. The inclusion and ambience of ultra is way better. It's how the director of art intended it to be shown.
@@letsplayblis1115 Lmfao
@@letsplayblis1115 strange that they turn then all off on console then
@@WayStedYou Consoles are pretty much always significantly less powerful than medium to high level gaming PCs. Consoles are cheap and very accessible though, while PC tends to be for serious enthusiasts who want the most out of their hobby. PC gaming is less accessible due to the greater cost of hardware and technical know-how required to operate the machine.
Doom Eternal showing what a properly optimized game looks like.
No other developer is on the level of iD Software
yup, the only thing needs to be considered is if u surpass the vram limit.. ultra settings on 1080p use almost 6gb..
@@sonicbroom8522 I second that, also 3DRealms were great at optimizing the game like Duke Nukem Forever back then it never put strain on computer system.
Exactly.
I would say Rainbow Six Siege is what happens when you got an insanely optimized game. I mean cards from 10 years ago can run it at over 60 fps at 1080p on Ultra.
I usually set everything on ultra, then I turn off all the options I hate with passion (motion blur, chromatic aberration etc.) If I'm not satisfied with my FPS, I configure each option individually, to see how much they change in terms of graphics quality and performance.
Amen bro, dont forget lens flare. That shits for people witout eyes
Same here...not sure why SSO, DoF, blur and that other crap is even in a shooter.
@@mercurio822 I'll never understand why they out lens flare in a game. Our eyes don't have lens flare so why would they in a game?
@@toxicavenger6172 some settings like film grain, chromatic aberration and lens flare aren't things we see with our eyes but cameras see it like that but i've no idea why games try to replicate a camera lens rather than human eyes
@@shutlol8378 Probably because some people think it looks cool and/or better.. As long as you can turn stuff like that off I do not really care..
I really needed to hear this. I find that I’ve started to expect too much from games. The best gaming time of my life was being a kid playing MKII and Altered Beast on my Sega Mega Drive, yet as an adult I want to play at 1440P Ultra settings and I feel let down if it’s not performing good. As my mum used to say “don’t let perfection be the enemy of good”. Really appreciate this perspective.
Myea, I pretty much agree. I also come from a time when it was just "Cool, this game is a thing and runs.", no matter how it looked or performed. - But that's just not how games are anymore. They're not two-dimensional side-scrolling games that are fine at like 30fps or anything like that. - Games are much more made after "real life", even if it's fantastical, and it SHOULD basically be high-fidelity and high-performance. - There's nothing wrong with that, it's just how it should be, really. - Then again, expecting too much is indeed not good.
The problem is youre playing at 1440p. This isn't enjoyable and very pixelated. You need 4k 144hz minimum ultra settings.
@@LawrenceTimme It’s like you didn’t read a thing I said lol. Also, you’d have to be high on crack to think that 1440p is pixelated, so I take it you’re not actually serious. I have a 4k display too, but I prefer to game on my 1440p 144hz LG monitor. The issue is that I worry too much about frames when it doesn’t even really matter. I gamed at 720p 30fps for years and loved it. That was enough to have fun then, so it shouldn’t matter now when I’m getting a much better experience now.
@@Psyopcyclops Many people are influenced by marketing and hype and while some of us are more aware than others we are all effected by it if you consume media from channels like Digital Foundry or other channels that glamorize high end hardware, maxed out settings and ray tracing. You can often see the effects of hype and marketing in the comments section when people talk more about maxing out the settings and their hardware than the game and what they liked about it. There's a reason why hardware manufacturers make sponsored content with some of the bigger channels, why they send them their flagship for "review" well before the item goes on sale because influencers are a part of marketing today.
I'm not saying there's anything wrong with wanting to max out graphics of course but what I'm saying is that you can enjoy a game and get immersed into it playing in high, medium or even low if you had no other choice. Of course it depends on the person and their preferences as well. I for example can dive into and enjoy a retro rpg as much as if not more than many AAA games today.
@@Psyopcyclops Unfortunately the more educated about games we become, the more you'll feel that way. I know I would of saved a lot of money and probably be just as happy with something like Series S if I didn't understand the differences between settings. That said I do like the rest of the possibilities like modding our games on PC and can't go back to exclusively playing on console's ever again.
As a small-time indie dev, I'm kinda glad you made this video. When I've implemented graphical settings, I've always thought to myself that I might as well let people set the render scale to 16x native if they want to, or render shadows at 16x normal resolution if they want to. It's literally zero skin off my back to just increase the max value of the slider, and I imagine many of the "ultra" settings come from a similar mindset.
However, I've started noticing people who just max out everything without second thought, even though the maxima are just ridiculous settings, and then they come and complain that it doesn't perform well, so your point of "complaining about poor optimization at ultra settings" really hit home with me. It pains me, but I've started thinking that I probably have to start limit the maximum settings to "reasonable" values just to keep people from harming themselves.
I think a good approach is to make somewhat reasonable decisions to push hardware and future-proof the graphics a bit, but if there's a lot of potential "harm", you can make it customizable via an INI file or such free of in-game boundaries. But I guess that depends on the engine you work with, how easy this can be done and if you want to spend the time for that.
Also, resolution scale shouldn't be a part of the presets and default to 1x/100% and be adjustable separately. I assume most people just leave it like that and don't push the slider to max, like it's done for the preset.
Ez fix: check the system date and only unlock ultra options after 2023 lmfao
the reason they maxed everything is because they spend a lot of money on top of the line graphics card and got dissapointed that they could have bought a more cheaper alternative
I don’t think so the game will age better with ridiculous settings since future gpus will run it better
yep,,,
Most people are just clueless.
sometimes, I wonder if they even think outside theirown scope.
I get the impression that the highest modes are basically 'optimizations off' and it's really meant for 'forward compatibility', or in other words, use those settings in the future when revisiting the game and your new GPU can run it at such a high frame rate that it doesn't matter.
That's absolutely true, and some game even warn you about it, like Kingdom Come for example. But that worked until last year. Now that you can't find a new GPU to buy, it's just depressing to look at these gorgeous games from my 980 Ti while knowing I will never get this kind of visual quality. I can't afford (and don't want to) pay scalper prices, and when I upgrade it will be for something better than my current 980 Ti, but anything above that GPU is still $500+ today while it's really a mid-range card these days. Entry-level and mid-ranged GPUs simply don't exist anymore.
A brand new RX 590 was going for under 200$ two years ago, often under 150$ with good deals, now you can't find a used one under 300$, and you'd still need to be lucky for that as most are sold 400$+
@@mikeonlinux5491
980Ti is like 1070 or 1070Ti.
If you are willing to play at 30 or so fps...let's say 30 - 40 instead of 144 and above,you can match the visual settings of vastly supperior hardware just fine, especialy if at 1080p.
3090 not required...unless the game is an absolute mess
requiring 3000 patches to work well enough on anything below highest of highest end.
@@GameslordXY Absolutely, I play Cyberpunk 2077 at capped 30 FPS, because I'd much rather have a butter smooth 30 FPS than FPS doing yoyo between 35 and 60+ FPS. This somehow just feels smoother.
I play all my games with ultra settings, and this 980Ti is still good enough, especially since as you said I'm playing on dual 1080p 75 Hz monitors.
It's just that I tend to like games that are terribly optimized lol like that CP 2077 or Star Citizen, and for these, the 980Ti can be limiting. I'd rather have RTX active and 60+ FPS like in my other games.
The Witcher 2 has settings called Ubersampling and Cinematic Field of View that still don't work well in 4K a decade later.
But often these ridiculous settings scarcely make a visual difference.
Like Tim mentioned I have found games where a lower quality setting looked better.
Why wouldn't future players prefer more fps or quieter fans than an unnoticeable effect?
In the days of yore, "Ultra" was a means for the developer to just push boundries, to see how far they could take it. Or so it seems when you look at older titles. But High could sometimes nearly double the framerate you got.
hesedays, when you look at Ubisoft, they clearly label they intent for you to play at "High" settings, they always mention this on their spec sheet for games. In generall I have always felt games are optimized around the "High" preset, not Ultra. Ultra feels like a "Do waht you can without breaking it" kind of setting for developers.
Exactly. Kinda like consoles work until they are late gen
Yeah. E.g. Horizon Zero Dawn, the Playstation 4 quality is only "Medium" on PC and already looks very good.
The ultra settings are for when you buy a new gpu and you are all enthusiastic about it and crank everything to the max. After a while the honeymoon ends and rationality kicks in, so you lower them to high for more fps or for the same fps with less power consumption and noise. Personally I prefer optimized settings as sometimes the high preset drops some settings such as anisotropic filtering to x8 which is dumb as x16 has hardly any performance impact.
Screenshots too, highest settings, pause, photo mode, screen shot, return to high
I've actually found that GeForce Experience doesn't do a bad job at optimization most times since the software already knows what card you're running and what it should be capable of. Always have to murder motion blur though.
Or you can just turn on vsync to not wasting extra fps from like playable 60 or 75fps
Ultra definitely looks better in a lot of games, its subtle but it is. It's like RT lighting, its subtle but it's a big difference as you go through the world
Yeah, That's not true at all for most people who can crank everything to max. I typically run everything on maxed settings in 1440p. The only reason I would lower settings if it's an online fps game where I want to hit my monitors 165hz. But if I can get 165fps with maxed settings I wouldn't lower the settings ever. Before when the 2070 super came out again I ran everything maxed out but in 1080p. Unless my card has issues running it I wouldn't lower the settings...
Some differences, even if there are plenty on ultra, dont often look objectively *better* . They just look different. Medium and high graphics are sufficient, and really it is only the *low* textures that look objectively bad due to low poly , pixelated look.
This is a pretty good summary. Low settings are there to make a game playable on old or non-gaming hardware (like work laptops). On med and high settings almost all reasonable performance cost improvements have been gained. The rest is just marginal gains at huge cost.
I feel like this applies more to textures and geometry more then effects, ect. But, fair point.
>and really it is only the low textures that look objectively bad due to low poly
And thats my only real complaint about Valheim.
Keep in mind that many of not most multi-platform AAA titles from the last 3 to 5 years use comparitive low or low-to-medium fixed settings on consoles (PS4/XBone) while running at 60, or even just 30 FPS.
I'm sure that is going to drastically change once developers start learning real optimization techniques for the current gen consoles. But by the time those games start rolling out, PC hardware will have had a few more upgrades to overcome the lack of PC optimization, and the settings will still likely be comparable to medium or lower settings for the PC versions.
Arguably the biggest performance impact to the Borderlands games is to disable the black outlines, which are applied as a post-processing effect. These instructions are usually given with a caveat that the game looks significantly worse, but I think I prefer it. In any case the net effect is far less than often stated, especially when you're in the middle of a gunfight.
I’d love to see a blind test with users that claim that Ultra is the only setting they’ll accept.
Ultra is for screenshots, high is for playing the game! No one should put the settings to ultra haha
Testing this blind?
Yeah, that will work. lol
@@dom_m7879 it feels empowering to play at ultra 1440p
So I keep it at ultra I like it.
I like to see a FPS showdown, how painful was it to listen to bitchy blue boys saying “we’re still the best” when they had 5-10 FPS more on games hitting 100+ FPS.
The difference might be minor, but if you have a keen eye, you can still see it. Especially for a game you put hundreds of hours in. For example, I know exactly how each graphics setting in GTA 5 will affect the look of the game.
Before watching the video:
Plays at ultra.
After watching the video:
Plays at high for 10 minutes.
After 10 minutes:
Plays at ultra
Game developers should give better names and descriptions for the settings, such as:
Low - Try this if Standard is unacceptably slow on your computer.
Standard - Good visual quality. Captures the intended look and feel of the game well.
High - Great visual quality. Recommended default for high-end hardware.
Highest - Nearly indistinguishable from High, but with a significant performance cost.
Punishing - Useful for benchmarking computer hardware or heating home. Zero visual advantage over Highest.
Its amazing how invisible the difference ultra and high is in some games. Yet they even use like more than twice the VRAM, too.
All studios would need to do is rename “very high” to “ultra” and no one would even notice, while still giving people the warm fuzzy feeling of being on the highest setting, which let’s face it, is 99% of wanting it.
Sometimes even putting games on medium is more than enough. Heck, if I have the choice of 60fps on high or 90-120fps on medium with slight less details, I'll always go for the medium settings because I really am so used to high fps gaming that dips below 60fps are noticeable and annoying af....
It's due to things like Draw distance and Level Of Detail (rendering and detailing of distant objects) as well as Shadow & Reflection Resolution. The latter is sometimes MUCH higher (often 2x -- for example going from 2048x2048 to 4096x4096 -- or even higher) when going from High to Ultra.
Because consoles can't run it so they slapped in "ultra" setting to pretend they catered to PC.
I can't help turning AA to the max. Even in 8K resolution with 4X MSAA, I can still see fuzzy and shimmering edges.
I used the Hardware unboxed advised BF5 optimal settings and went up 25 fps and I don't really see a difference
Hope you guys will do another optimization of settings for games soon
Whats ur pc specs?
Intel 3770K @4.8 ghz 32GB ram 2600mhz and a 980Ti on a Z77 sabertooth from Asus
I hope they do the same for 2042 🙏🏻🙏🏻🙏🏻
@@jeffreybouman2110 f, just upgrade your cpu and mobo for another 20- 35 fps extra
@@richard35791 haha yes but this was free ;)
This tests should have even more impact on slower GPUs and GPUs with limited VRAM. Great content! Cheers!
@@Relex_92 warzone actually has like half the graphics settings unchangeable. Like it let's you change them but doesn't actually do anything. Also it's vram thing lies out if it's arse. My 8gb card has most things on low at 1440p but I have to lower textures because of vram usage
Finally someone is digging into this ultra madness :) Good point on the "if it runs poorly on ultra, it's poorly optimized" attitude. Also, I would like to see MORE optimized settings videos, it's very useful to know which settings have big performance impact for each individual game. It's not always the same and dependent on the actual implementation.
"high is the place to be"
Yes, yes it is.
Ultra settings are when you want to play a game from 2014
I think they do have their use especially for developers to flex an engines capabilities but I also think that we shouldn't bundle multiple aspects into 1 "Ultra" settings. Shadows for example can be broken down to draw distance, contact hardening, resolution, filtering, instead of just "Ultra". This would give players more room to configure instead of having to sacrifice certain aspects of shadows by dropping from just ultra to high. I've seen so many cases where dropping shadows from just ultra to high takes away soft shadows and dramatically decreases the filtering quality. This goes to other settings as well like reflections, lighting, post processing etc
It is useful for benchmarks OR if your current card runs that setting at high enough frame rate for your monitor. E.g. some may want to lock a 60fps or others will have a high refresh 144fps but accept a 1% low of 80fps.
One of my pet peeves is when a whole bunch of effects are just lumped into "post processing". Ubisoft used to be awful for this back in the day. Was no way to turn off motion blur without turning off particle effects lol
I do agree but they are not thinking about us who people who know how to use the settings they're thinking about people who are unfamiliar with the settings and what does what they trying to keep things as simple as possible
@@kennethshores8547 Just because a lot of people don't know how to work around them doesn't mean they shouldn't do it. They can still keep the presets with the settings and only anyone comfortable with going deeper can go into those "advanced settings" and the present name changes to "custom"
@@kennethshores8547 ubisoft games have done a good job recently by showing pictures of comparison in their settings menu but some aren’t really accurate and it’s really not much to go by, detailed explanations and comparisons would be preferable
IMO, most games' medium setting already look more than good enough for me and I'm perfectly fine re-visiting games 4-6 years later if I liked them enough to re-play them at higher details after a GPU upgrade.
Except for textures and model quality though. I agree with him that as much as we can, turn those up as they are really noticeable.
This. I can't wait to go back & play Metro Exodus Enhanced Edition some day.
Someone used to post this about MSFS 2020: 4 graphics settings
Low = Low
Medium = Medium
High = Medium with less FPS
Ultra = Medium with least FPS
😀
Low = Low
Medium = Low with less FPS
High = Low with even less FPS
Ultra = Low with least FPS
@@H3LLGHA5T You wish.
@Go To Channel [𝐋!𝐯𝐞 S.A.X] Doesn't matter to them even if they can't tell the difference between Ultra and High, because they are going back to the graphics setting and set it to Ultra anyway, just so they can tell their mates they have bigger e-penis.
@@AngelicRequiemXcan barely tell the difference on newer game, just worse optimization
and more resolution on texture that the human eye can't percieve smh
This was a good one! Only had PC for a year and I'm just starting to really get a grasp of settings and what not. This seals the deal for high. Thank you!
can i ask how old are you?
I prefer going through each setting individually untill I find the perfect mix of quality and performance. It does take a bit more time but you only have to do it once per game so it's not too bad. I do it like this because not every single setting has the same impact on performance. For example anisotropic filtering won't be maxed out if I use a lower preset than ultra but 16x anisotropic filtering pretty much doesn't lower the performance while making the game look better. Also I usually don't like motion blur and selecting some preset often turns on the motion blur. One of many reasons why I prefer PC gaming is because I can play around with settings untill it's exactly how I like it.
I've almost always steered away from Ultra details as the difference between this setting and High is minimal.
Anyone who has ever owned a potato PC will have learnt this the hard way already! This is a brilliant video. A high-end PC with a high resolution monitor with a high refresh rate, on 'high' settings is the best experience in recent games.
Me, that runs every game on lowest settings: "Interesting..."
Settings above low are merely hypothetical.
@@thedoofguy5707 Agreed.
I even own a RTX 3080 and still play pretty much everything at low, unless higher settings still reach 144fps
TBH, even low looks significantly better than 10 years ago 🤷🏻♂️
Well, this actually applies even for low settings, but in the exact opposite way. In many games, there are settings where the difference in HW demands between low and medium is pretty much negligible, but they bump the visual quality up significantly.
Games should give you the option to select the target FPS. All effects should be automatically tuned to maintain that FPS.
As a software developer, I would say that it's quite hard to do...
* Game should detect your OS, RAM (amount, clocks and timings), your CPU (number of cores/ threads, frequency, available instructions, generation), your videocard (model, amount of VRAM, drivers version), SSD/HDD (read write speed for big and small files)
* Run a test with current preset, make some assumptions on what to change
* Change settings and run another test to make sure that it guessed correct.
* Repeat last two steps if necessary (but no more that 3 times to not piss off a consumer completely!)
@@igorthelight come on just do it bro
@@marcoslightspeed5517 It's possible tho! ;-)
Some games can do that. I think dead by daylight has that option.
I'm not sure if it would result in good visual experience. Leaving aside complexity of such a mechanism, I'm afraid there would be frequent blurring and creepy change in scenery.
My biggest gripe with PC gaming is the potential minefield of graphics settings. The games that offer an immediate preview of a setting change are a god send. This at least gives the user a chance of figuring out what each setting actually does in game rather than having to try and remember what the visuals looked like before changing a setting. And that's if the game even allows the setting to change without a restart.
Agreed
Yeah the preview is a must-have imo. Makes it take 2-3 minutes to find the optimal setting for every individual item.
@reality8793 guess you never had a good one then
@reality8793 for me, i like to have a more powerful card, so i can get high constant framerates without dlss.
The only setting I prefer at ultra is textures. Everything else I'm fine with high to medium.
yeah a lot of the time settings such as shadows on ultra are only noticeable sharper at higher resolutions.
There's not point in saying that, as sometimes the High or Very High settings is barely different from Ultra, perhaps only when you have a still and look really closely and then compare. - But other games/engines might have more significant steps between the settings.
shadows?
draw distance is one of my biggest pet peeves Nothing takes me out of the game like stuff popping in
and texture filtering/anisotropic filtering
reducing quality settings used to be much more noticeable in older games
I wish games would allow downloading with choices of which graphics settings to download so that people who are playing on low, for instance, won’t have to store extremely high quality textures and models so they can save space too
That's true, I have 1080 monitors, and most games these days are 100+ Gb because of ultra quality 4K textures. If I could only download Ultra 1080p, I would get even more games installed :)
@@mikeonlinux5491 It would even save some energy. Less Gb>faster sending>less workload on the server.
call of duty does that now kind of
I know i late but hopefully every triple a title implement this for higher graphics settings that budget gamers wont even use and just wasting space
Not as stark of a difference but works the other way around as well, i.e., not downloading the lower res textures if you're going to play it at high settings
This didn't age very well with Steve's going on full bore on Ultra proving 8 GB of VRAM is not enough.
Modern games seems to be visually impressive on low settings and not much different on ultra.
Low settings used to look hideous, but nowadays the graphics have gotten so good that there is not that much difference between the settings anymore.
Low setting and high resolution is a winner for me
Not true, there is still a big difference, but Ultra is pushing the last 1% of visuals and it costs 50% more hit to performance.
@@JohnFromAccounting I'd challenge you to see much of a difference at 0:52 ;)
@@JohnFromAccounting depends on the game, resolution, and your subjective preferences.
Eg. Raytracing might be more accurate, but that doesn't necessarily mean it'll be more visually pleasing. Art of all mediums has incorrect/inaccurate lighting & shading.
Maybe the difference is only noticeable when you're not actively playing - at which point you have to question how much it's worth.
In Doom Eternal, aside from particles & a few lighting effects, I couldn't tell much difference between medium & high at all, and low still looked good to my eye
You can mix up settings and get basically console quality, which is good enough and save lot of performance depending of the games
I feel ultrasettings is a more future option when you play a game 3-5+ years from now. And its good for that, I reckon.
If visually it makes little to no difference right now, in 3-5 years it won't look better. Also how keen are you to play 3-5 year old games for enjoying the maximum visual quality?
@@Hardwareunboxed ultra is not good enough when you play in 8K. Texture resolution is noticeably lacking.
@@allansh828 who is actually playing in 8K?
yes your kind of right i got similar opinion been on amd train and rx 580 where nice 1080p for two years than two years rx 5600 xt (1440:P mediun high)granted neither those where ultra but where able to run high texture keep in mind show some more visual intensive game and there ton easier game to run and if you buy two big title a years i guess could make it a 3 years refresh circle but than you get nothing as resell value in normale time wich certainly ain't as easier to refresh but kind dark age and not expecting whit more advance process and lower production out put likely that we will see anything great for good 3-4 years.
@@Hardwareunboxed I play new titles mostly on medium cause I don't have the best graphics card and optimisation on release is often lacking. And 3-5 years later if I enjoy the game I can crank it out to max, maybe even with a resolution bump. So why not?
"they hated him because he spoke the truth"
@@awaken9106 no Jesus Christ
@@Pulser43 Steve from Gamers Nexus?
@@TheStraightGod nope!
Ultra is for screenshots, High is for gaming - Tech Deals
I don't think I've ever started playing a game without quickly running through its settings. Been disabling motion blur, film grain, chromatic aberration, TAA, oversharpening, DoF, vignette, etc since I was a kid. Also usually go with medium/high settings on most options except stuff like textures and model detail, regardless of my rig.
It sometimes brings it from 60 to 100 FPS sometimes with next to no noticeable difference and 5-10 minutes of effort.
Videos like this and Digital Foundry optimized settings are great. It's crazy how many settings barely change things visually up to medium or low, but you get huge performance boosts.
I really hope either/both do one for Far Cry 6 that game looks like a GPU killer on ultra.
Do you have a link to that video
@@joker927 It'll probably be very similar to Far Cry 5, hopefully they've improved CPU utilisation a bit, as that wasn't great and obviously ray tracing will be pretty demanding until you have an Nvidia RTX gpu
The foolish man goes for Ultra
The intelligent man goes for High
But the wise man goes for Custom
And greedy man goes for very low or low(to get max performance)
Days Gone being the outlier seems to really be due to SSGI (which allows nicer shadowing & bounce lighting within screenspace), you could likely still step down on shadow resolution & such.
So yeah, don't stick to presets, custom settings all the way!
Yes, there was difference in global illumination, I think Tim forgot to mention that one.
I 500% agree. in most games, especially fast paced/competitive multiplayers i run medium with custom settings for details. i rather the game run smoothly that look pretty. if im playing a slow/med paced or single player then ill run ultra so hard it'll make you cry how beautiful a game can be. (who knew sims on maxed settings could bring a room of adults to tears...)
As a game developer i spend 90% of my time optimising medium and high settings as those are the settings that I actually want players to use. Ultra settings are just there so that players can turn things uo to max if they have the hardware to support it. When i get feedback saying my game runs and is "unoptimised" it's always from someone who insists on running ultra settings despite them basically looking the same as high settings but for half the fps.
Agree, i dont understand why gamers bother with max settings just to get screenshoot like graphics, for example in forza horizon 5 even at low settings (not lowest) i dont even notice that much difference compared to higher setting when playing normally as long texture still at high setting unless i compered them side by side using screenshot at static scene
How someone else has to know what you are thinking? It is your fault for putting ultra out there without a disclaimer what it is for.
@@REgamesplayer if the game looks great and runs well at default settings it is well optimised. If the end user decides to run 4xSSAA with all ultra on a 2060 and then complain about it being unoptimised that's user error.
@@bobjames1521 We are not talking about custom settings. We are talking about ultra settings.
Ultra =/= Max. Furthermore, SSAA is actually a very good setting to turn up. At least you would notice the difference rather than whatever we saw Ultra settings to be in those games shown in this video.
@@REgamesplayer Since when does yellow cab drivers tell toyota engineering team how to build an engine?
I'm mostly concerned with average fps, aiming at 120fps in 4K. I usually set everything to max at first, check what the average fps looks like, and if below ~120 I start to tweak settings until I get a stable 120+. This is pretty much always possible, but in some extreme cases I've been known to lower resolution.
I wish more games had a 'target fps' setting, rather than detail presets.
Agreed, this would be much more useful. especially if the game could automatically do that based on what monitor you are connected to, at least for the very first load.
that don't really work cause, most game engines can't change every single graphical setting without a restart or reload.. just some settings can be changed, only resolution can be easily scaled on the fly.
@@gabrielandy9272 They don't change "every single graphical setting", they just change what they can. Forza Horizon 3 and 4 had dynamic quality settings that would change things like foliage density and draw distance based on framerate target, yet both also needed the restart for certain settings.
Ultra is for the future, the rich, and those wishing to flex their e-peen. High is for those who want a balance of performance and graphics. Medium is for the budget conscious. Low is for the thrift store shoppers. Very low is for Mac users.
Ultra settings can be worth it though when your graphics card is powerful enough to run ultra settings at native resolution while your monitor has low refresh rate and thus can't take advantage of the higher FPS from high settings without screen tearing.
A lot of the time Ultra will bump up things like reflection probe cubemap projections up to 4k which is a lot heavier to render but is often practically unnoticeable, especially depending on the object that's reflective.
Ultra low is the way to go.
It's a cool video, but most of the time presets don't actually set up everything on ultra, I mean even "ultra" presets have some settings on high and you can crank it up to real ultra manually, so you'll get some more noticeable differences in quality.
Most people don't bother customizing their settings, they just click a preset and go. Having the "Ultra" setting only be kinda-sorta "Ultra" lets the game devs - who actually want their game to be playable regardless of how optimistic their players may be - chuckle a bit and say "Ha, yeah, here's your Ultra Setting, ya wank. Now just play the damn thing."
This. Plus a lot of times, everything below ultra looks horrendous e.g. texture quality in a lot of games.
I think it depends on what hardware you have. I usually don't buy the flagship cards so turning down the settings to get good smooth frame rates is normal for me, I like to keep the framerates at 90 or higher if possible.
Remember the OG Crysis days when graphical presets actually made a massive difference?
Good old times...
Yep, I remember those days. I would say the 2000s and early 2010s were the last times ultra mattered from the start.
I don't buy a top-end graphics card to play modern games on max.
I buy them to play games 6 years from now on standard.
At this point, medium shadows and everything at high with anti aliasing to low or off is the best way to deal with performance
That said I still do appreciate optimization videos, not all games indicate performance impact of individual settings.
This type of content is exactly why I stay with this channel. Thanks!
The thing with "Ultra" settings is that it's a lot like a button labeled "do not press." You have to press it.
Textures always to ultra, shadows high/medium, ambient occlusion medium/high. Other than that turn most post processing off or low. This I find at 1440p gives the best image to performance ratio (1080 Ti).
This video needs to be included in every YTubers build guide. For enternity. Well done.
Ultra on Cyberpunk and many other games is just 1.5 myopia mode, things close are still the same, things far away or on the side are blurry, and that's it.
That chromatic aberration
I just make sure that I’m playing at my preferred FPS (120 FPS). As long at its stable, I put it at ultra if I can.
In my experience the general "Rule of Thumb" for game grafics settings is:
Low: If you can´t get the game running or you are an max FramesPerSecond fetishist.
Medium: "Time to Time" gamer or low bugged gamer who wants to enjoy the game in a comfortable state.
High: Full Time, hobbiest gamer that wants to enjoy the game in all its glory.
Very High/Ultra/etc.: Enthusiasts that want to enjoy the full pontential of the games they play and/or the game is 5 Years old and your midrange GPU is getting bored on high settings.
TLDR: graphical quality and fidelity has diminishing returns. Has always been the case, will always be
Been using HWU Optimized settings on RDR2 for ages. Then came the DLSS implementation, now I hit 80-100 fps playing at 4K.
@Russell White Rdr2 was made with 4K in mind, 1080p not considered. They had consoles in mind.
Excellent video. My advice for cases where you need to turn down even more settings , shadows is very good choice to lower as making them less sharp and detailed makes them look more natural in most games ( not all as it depends on implementation).
Agree, in some games I want to lower them to make them look "better"/more natural to me. But depending on implementation, lowering can make them just more pixelated, which is not really wanted.^^
not to mention sure you may notice the changes when switching back and fourth looking for differences, but when in the action of the game, last thing you are worried about is reflections in the puddle on the ground
Maybe, but it all adds up to the general ambiance of the game. Sure, in the middle of a gunfight you won't notice these reflections in the puddles, but when you are actually walking around and taking in the atmosphere, trying to pretend you are living in that game world, these reflection help to get even more immersed.
@@mikeonlinux5491 no
@@dann6067then explain why “no”
Just because you have high end setup doesn't mean you have to use the most possible graphics option
Ultra may be dumb, but I just got a laptop with a 4060 and a 144hz display, you'd best believe I'm going to crank everything to max.
I went with a 3060 ti for this reason actually. I play on high settings on nearly all games and don't plan on going 4k any time soon so it just made sense to get the least expensive current gen card and skip next gen then hope the 5070 is a beast lol. This gen has been nuts.
now imagine the price of that 5070
@@Kirbydreaming only about $5070 lol
@@TheRoboticFerret Yeah, its out of control. I am still bleedin from the 4070 ti i got :/ I will have to use it until the end of my life i suppose lol
Man thank for you putting this video out! I have been saying this for years and am always so annoyed with the obsession for ultra settings for PC enthusiasts. It probably won't change the culture for most, but it at least will get the conversation started and some people thinking :)
Ultra were suppose to be a future-proof setting for someone in years after release with never generations of hardware. For some reason now a lot of players have to to play on those Ultra settings and bash about "unoptimized" game.
5:38 ah, that explains why the Ultra image appeared less sharp than High, another thing about not manually tweaking your games is that you'll leave a lot of effects that are subjective in terms of improvement to the presentation up to the developers and they generally have pretty bad taste/go with graphical trends that for a lot of people (like me) are straight up horrible and degrade the image quality; like here you can see Chromatic Aberration, and also things like Motion Blur, Bloom (this one was specially bad some years ago), film grain and etc
Always tweak your games, even if you can run on Ultra, that's what the PC platform is for
"Ultra Quality Settings are Dumb"
They've always been.
Ultra is what allows me to tell my wife I need a newer better computer. This is how I escape the depression that is real life. I want the best images possible! Ultra is the only option!
I have been buying the top hardware on the market for 9 years. I ditched SLI for the first time when the 3090 came out (had 2 2080 Tis, 2 1080 Tis before that, etc). Same for CPUs… always the top. Spent tens of thousands on PC hardware over the years.
4K max settings with a minimum of 60 has always been impossible to achieve for at least 50% of modern games. For example, no money can buy 4K maxed out CP2077 and AC Valhalla. This would really affect me negatively.
I am now slowly learning to leave well enough alone and accept that there is nothing wrong with lowering a few settings, especially if they don’t improve graphics drastically. Also, sometimes the blame lays squarely at the developers’ feet for not spending time to optimise their games.
And all of a sudden, I find a video that makes me feel right in doing what I do. Thanks Tim!
Haha ive been playing on ultra with kena and dropped it to high and gained 20fps win win
The only settings I used is based on Gpu temperature not eye candy. Dial down and dial down till you get a good balance between graphics and desirable gpu temperature.
Ultra settings are future proofing. They're not dumb, they're just not to be used on games that aren't at least 5 years older.
I mean its already quite common knowledge, so i thought, that games are made/optimized for low/mid/high settings, anything above is just achieved with brute force and the fps loss to quality gain ratio is laughable.
For years now games do even lock good at low settings, something that was a big change at some point as previously games looked utter rubbish on anything but medium.
The reason for all this development is simply put the streamlining process in multiplatform development. Where low/mid/high just resemble the different consoles the game is going to be released on.
At least something good that came from multiplatform development.
I play on medium capped to 60. Gasp. But my mediocre hardware runs cool af and locked 60 always. Good enough for me.
Honestly, I wish they made ultra actually Ultra by that I mean making it where not even the highest end cards of today can run it but it's meant as an option to future proof it.
i think that's what they did a few years back. I can turn up the settings for old games to their limit and have it fly even on my modern laptop which is a potato for today's games.
@@victortitov1740 yeah that’s exactly what they did. For example they did it with the first Farcry.
@@Nib_Nob-t7x Yeah that would make more sense.
ultra is for future proofing. They should allow even higher settings with a warning so that in the future games can easily be set higher with better cards
2:04 go here and then press the -> and
I wouldn't use the word "expensive" for the graphics cards in this day and age, I'd use the word "OVERPRICED" because that's far more accurate.
@Mc76 Same result does not mean equal weight. A 1000 dead of old age or a 1000 dead from desease in NOT the same. An expensive graphics card is a top tier card that cost a lot. An overpriced graphics card is the same card valued beyond its actual worth due to factors not relevant to the performance of the card itself. Get what I'm saying?
@JeremyCuddles Pretty much everything you said is wrong except for "circumstanaces", but that just avoids the issue of current price, real price and performance.
I agree its accurate looking at recent history but with announcements of shortages pretty much across the board for many components looking to hold out for quite a while (comapines announcing they expect to meet demand in 2024 etc.) tsmc increasing the cost of allocation by 20% and depending on most likely how accepted crypto remains and the way governments deal with it there is still plenty of uncertainty.
This is gonna be very accurate for the Battlefield 2042 beta!
whoa there Tim your talking sense.
Has Nvidia sent out another threating letter about your editorial direction yet? I mean they're "All in" on FPS draining features and your talking about something that goes against their philosophy of up charging up the sphincter for a minor bump in performance.
Glad to see this video on this specific topic nowdays that the marketing bs has stirred the interest towards such features (like 8k and Ultra)
Ultra was never meant for the "optimal gameplay experience" .
Obviously this video takes the graphics quality point of view first but, if you think about it, developers have certain target fps based on minimum hardware all the way to recommended hardware and that's what dictates the LOW to HIGH presets settings.
Ultra is just flexing the engine with no regards to optimising the gaming experience smoothness or otherwise the fps or performance cost.
In a nutshell :
Low = 20% quality
Medium = 70% quality
High = 98% quality
Very High = 99% quality
Ultra High = 100% quality
High Textures vs Ultra Textures have huge visual clarity but for shadows, ambient occlusion, reflections, tessellation and other shit stuffs, there is no significant visual loss on high compared to ultra.
I love this channel. It has restored my passion for pc hardware. Keep up the great work!
Great video! You should include No Man's Sky in your testing. The differences between presets in that one are easily perceptible and quantifiable.
Also, stronger emphasis on VRAM and GPU temperatures would be much appreciated. FPS and texture fidelity are no longer the most important criteria in an ecosystem where the majority of GPUs are old and pretty burned from usage.
Thanks Tim!
it would have been very interesting tho to run the test also with an older GPU like a 2060/Vega or something similar, so even people with last-gen GPUs could see that they can still run most of the games that they want to play.
Yes you hit the nail on head if take a look at say steam
This quick grab of the lists of top 12 GPU that total around 45% of GPU in play
So I like to see guide that features GPU that most people have and playing with
It is interesting that 3070 is 11 on this list and is only has 1.69%
So most people are using older hardware can we see more of this and advice on getting the best out of it
NVIDIA GeForce GTX 1060
NVIDIA GeForce GTX 1050 Ti
NVIDIA GeForce GTX 1650
NVIDIA GeForce RTX 2060
NVIDIA GeForce GTX 1050
NVIDIA GeForce GTX 1660 Ti
NVIDIA GeForce GTX 1070
NVIDIA GeForce GTX 1660 SUPER
NVIDIA GeForce GTX 1660
NVIDIA GeForce RTX 2070 SUPER
NVIDIA GeForce RTX 3070
AMD Radeon RX 580
I always play games on High. Ultra is meant for reviewers to provide screen shots and clips of the highest graphics settings. I can’t tell a difference between High and Ultra.
The only game I ever waited to play on max settings was RDR2 for which I waited a couple of years and played it on a 6700XT on max seetings.
It wa totally worth it though. Cant imagine playing that masterpiece any other way than maxed out. I new from the start I needed to wait to experience that game in all its glory.
This is why I use a mix of medium and high, I cant tell the difference between some settings and my fps sky rockets. Its allowed me to have a satisfactory experience at 4k 120 on an rtx 3070 instead of a 3080. And where 4k 120 doesnt work, 1440p with TAA does the job
wish I had a 3080, my current config (5800x oc and gt 1030 oc ) is really badly balanced
hang in there my friend!
I've been happy with a tweaked Medium setting in most of my games for a while now. Though I will say that from when Skyrim first came out my hardware could only handle it at low. A few years ago I got a 580 and seeing it on the highest preset made it come back alive for me. So my thoughts ultra is for your old favorites to breathe new life into them and screenshots. Stick with a mix of medium and high for the new age stuff
What I like about you guys is that a personal rant takes the form of organization, clarity, and graphs of relevant examples. Very nice.
Does anyone actually use the presets? i feel like most PC gamers configure the settings themselves.. you can even go over the max preset in most games if you do that
I rarely bother to change individual settings unless I really want to gain extra performance for minimal visual fidelity loss. Id rather play and get immersed instead of switching settings periodically midgame, or wasting half an hour just turning the knobs before i even get to play :)