@@bobmarl6722 you're not going to be able to play any game at 4K and 30fps on a 1060 unless you have all your settings turned down to medium or low, including anti-aliasing and everything else. But then again, you don't need anti-aliasing at 4K, so that doesn't matter anyway. My son has a 1060 in his laptop and he can't do anything beyond 1080p Ultra, because the 1060 series is a 1080p card.
That was never the intention of this video. He acknowledges that Ultra settings are great for future proofing a game, but for a mid range card they're a waste of valuable GPU power.
I am using a GTX 1070 AMP Extreme with an i5 6600K on dual 1080p monitors. I normally play on high settings, and I have personally not seen any big difference between high and ultra, while the fps difference between the two is massive, so high is OK for me.
Good to see that this channel isn't just about making performance vids; it actually explains shit, doesn't leave the average viewer guessing, and you can actually learn a thing or two about computers.
Ultra = use it if you can max out everything
High = beauty and FPS
Medium = more FPS but still looks good
Low = not that great, but it gives you FPS
15% resolution scale, ultra low, 15 FPS: well, it runs
Eh, even if I CAN technically "Ultra" a setting, if I can knock it down to "high" and get 10+ more FPS with a barely noticeable visual hit, I will take that extra smoothness/responsiveness over subtleties in the image that will be lost in motion ANY day, but that's me.
This sums up everything without needing such a huge video!!!! :P :P :P A few things aren't correctly presented in the video, where there is a difference and they try to say it's similar. Well, "Ultra" is Ultra, and your comment sums up everything.
Ultra is not enough; in some games you have to go manual, put everything to the maximum, then go into the Nvidia Control Panel and set texture filtering to High Quality and negative LOD bias to Clamp.
My personal feeling on this is that tweaking game settings for your most beloved games for that best balance between graphical fidelity and frame rates is just as fun as tuning hardware for that optimal efficiency point on the frequency-voltage/power scale.
This was incredibly informative, now I won't have to worry as much about missing out on graphical details on ultra settings when my rig can only support smooth performance on high or lower. Thanks for alleviating that.
Ultra and high settings usually don't look that different from medium nowadays, but back when Crysis released, the difference could be night and day. Baseline visual fidelity is so friggin' high nowadays, dialling down a couple of settings doesn't feel like giving up key parts of the experience anymore.
For me it should at least be high to ultra settings. I'm only gonna do Medium settings if the game is really demanding and doesn't get 50-60 FPS average.
These settings are resource intensive, but you don't lose much visual fidelity by dropping down a peg, and you gain a lot of performance.
>Leaves motion blur enabled, which looks worse and costs performance.
I think developers should focus more on scene composition (and the manual crafting part) rather than pure resolution and detail. Things like lighting, shadows, contrast and colors are much more important during gameplay (which is 99% of the time) than having 4K textures, which you can only appreciate if you stand and stare at them.
Ultra settings really are beyond pointless. The increase in visual fidelity is marginal, yet the performance difference is insane. In this video we had to stop and analyze a scene carefully just to see the difference. If your hardware can run ultra then great, run it, but if it can only do high then there is no reason to upgrade. The vast majority of people won't even notice, or care to notice, even a mix of medium and high. Once you are immersed in a game and are having fun, the last thing you really care about is whether you can see a higher resolution shadow 5 miles away on some obscure tree in the background.
And yet, if you really are immersed and don't just run around pew-pewing people, you start to notice every little detail. I've lost count of how many times I was in the air in Far Cry 5 looking at the horizon and thinking, wow, incredible foliage. Maxed, of course.
That is because most games are designed around medium-high settings due to consoles. Which is a good thing, since it increases the life of our PC components before we absolutely have to upgrade. Usually the best performance-to-fidelity balance is achieved at the base console settings, because that is what the game, levels, models and set pieces were designed to run on. There are very few games, like Battlefield, The Witcher 3 or Doom, that actually take advantage of your PC hardware. I have a GTX 1060 and most of the time I just lower the settings from ultra to all high just to get a stable 60, or just use GeForce Experience's recommended settings for a smooth 60. Ultra settings are usually just overkill features like MSAA, or experimental features like VXAO or Hairworks, that tank performance hard.
On older games I use Ultra settings; on newer ones I usually have to blend between high and medium, just due to not having the super gaming PC that is required for ultra or high all the way through. Mine plays Skyrim Special Edition on ultra and it looks great, and the FPS never dips below 50 frames, which is good. I usually get concerned if the FPS dips drastically while playing, like one area holds 60 fps in any situation and then all of a sudden you're in combat or whatnot and it dips below 30; that usually tells me a setting is too high.
That's subjective. For me, it's worth it. For you, it may not be. A Rolls-Royce will get you to the same places as a Hyundai, but some want the extra comfort and trimmings.
I've found that the difference between graphic presets has become less and less drastic, especially after the current console generation launched. Do you think that there will be more of a difference in the future, say around the end of this console gen?
Differences are becoming minor as time goes on. You can only make GPU cores so small before they break easily. We need a new method of making processors if we're going to see a big leap again.
We're reaching a point of diminishing returns in terms of graphics, in my opinion. I think we'll see more of a push for CPU development in the next generation; games over the last year or so have become increasingly dependent on CPU resources. I hope the Ryzen/PS5 thing is true because that is a fantastic chipset.
I clearly see the difference in CPU usage between games from before 2015 and games from then on. I have an i3 6100, and in games from before 2015 almost every game (even open worlds) pushes my GPU (RX 470 Nitro+ 8GB) to 100% and the CPU isn't the bottleneck, but in games after that it's really rare for me to play one where the CPU isn't the bottleneck. (I know it's only a dual core with HT and it's weak, but I think I can only see that difference because I've been playing on a low end CPU.)
Depends on the game/setting, but in general you see the most noticeable difference going from "low" to "medium", with the variance becoming increasingly subtle after that.
Yeah, agree with this. I realized it with The Witcher 3: on my GTX 660 I set mostly medium and a few high. When I upgraded to a 1060 6GB I set mostly very high and a few high. And despite the fps difference (30 vs 60), visually I didn't mind at all with The Witcher 3 on my GTX 660. It looked good enough (at least on my 1080p monitor) and the difference didn't feel that big.
I thought so, too, because his German pronunciation was so perfect at the beginning where he said "Angst", and also in the Wolfenstein part. But then I hesitated again, since his English also sounds so good. Since I'm not an English native speaker I might not hear it as well as German, but it sounds great to me. And certainly, great work in general! Awesome video!
NovaPrima listening to the "und auf Wiedersehen" at the very end of the video, I'm 99% sure that he is either a native German or has lived for many many years in the country at least. It's just way too perfect :D
I am mainly a PC gamer.. for many many years! And I learned something new here and there. So.. thank you for these kind of videos. Keep it up with your great work guys!
This should honestly be required viewing for everyone that plays games on a PC. I've lost count of the times I've explained this concept of settings optimization to people. You take it to the next level with this video though. It was really great to learn about what is actually happening "under the hood" with certain settings, and why they can sometimes be such strains on performance. This is top notch content right here! :D
Personally, even if I can run a game, I like turning the settings down so my fans stay quiet. I COULD run Overwatch with high texture quality but my GPU would be really noisy.
At around 10:30 you used the word "poignant" to mean something like "clear", "noteworthy", or "salient", but the meaning of "poignant" is tied specifically to how emotionally affecting or sentimentally charged something is, not simply how salient it is. Just a tip in case you use that word regularly in your videos. BTW I found the content in this video on the whole well-written and very informative so don't take this as any sort of blanket criticism of the script.
I'm upgrading my pc for the first time in 10 years, after playing lots of console or new pc games at really low settings. This video really helped me to stop stressing about needing the newest and best graphics card to really 'experience' the games. Thank you, friend.
Really good descriptions and sound advice in this video, great job. What bugs me the most is when I see people using "high" or "ultra" as reference points, like, "can machine X play games at ultra?" or something to that effect. Or admiring games that run slow on "ultra" because that means the graphics are amazing, right? All those labels are completely arbitrary. They literally have no meaning outside what the developers assigned to them in that one specific case in that one specific game. And those "ultra" settings can really get unreasonable, but that's kinda their point. So don't feel bad if you can't run the game well at those settings. It's really not hard to make a game more CPU and GPU intensive by just setting a couple of values to unnecessarily high numbers. Like, you'd probably only need to change a number or two in a reasonably modern game to make every individual blade of grass always fully render in a 10 mile radius and call it "super-mega-ultra vegetation" or something. Does that mean the game has amazing graphics just for that? No. Should you feel bad that your PC can't run it, nor possibly any PC that will be made during your lifetime? Absolutely not. Unless it's a very old or undemanding game, dial those things that you can hardly even notice back a bit; it will be more worthwhile to ease the strain on your hardware, making it last longer and consume a bit less power, especially if it's a laptop.
@Transistor Jump No, I agree with him. I've seen people saying "I'm switching to console gaming" or "PC gaming is a waste of money" all because they can't run games at 4K max settings lol.
I expect to be able to play any game I own at ultra and get 60fps, but that's because I have one of the best GPUs on the market and my system should be able to handle it. However, I do have a 1440p 144Hz monitor, so I do turn settings down to be able to hit 144fps. But really, what gets me is unoptimized games. Having to play at medium settings to get 60fps with 50% GPU usage and 20% CPU just bothers me.
Remembering what your screen resolution actually is, is the biggest thing people forget to do. They run ultra textures, which are sometimes effectively 4K textures, on a 1080p monitor, when textures half the size, or even a quarter of the size, would look exactly the same in most circumstances. It eats up more VRAM and GPU power for literally no visual improvement.
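(Back-of-the-envelope sketch if anyone wants rough numbers: this assumes uncompressed RGBA8 textures with a full mip chain. Real games use block compression, which cuts these figures by 4-8x, but the 4:1 ratio between the two texture sizes stays the same.)

```cpp
#include <cstdio>

// Approximate VRAM cost of an uncompressed RGBA8 texture with a full mip chain.
// The mip chain adds roughly one third on top of the base level.
double textureMiB(int width, int height, int bytesPerPixel = 4) {
    double base = double(width) * height * bytesPerPixel;
    return base * (4.0 / 3.0) / (1024.0 * 1024.0);
}

int main() {
    std::printf("4096x4096: %.1f MiB\n", textureMiB(4096, 4096)); // ~85.3 MiB
    std::printf("2048x2048: %.1f MiB\n", textureMiB(2048, 2048)); // ~21.3 MiB
    return 0;
}
```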
Depends on what settings are available. Geometry, textures and lighting are the most important factors and do make a difference in some cases; I've found textures look better on the highest setting when you go beyond 1080p.
I hope all future games will have a more user-friendly graphics settings UI, which clearly shows you the visual differences that each setting will bring and instantly shows you the impact on framerate. I think it will be easier for us to find a balance without testing over and over again by ourselves.
Most important to me is ambient occlusion, in most games where there are a lot of elements around you. Sometimes it can make some scenes more beautiful than they would be with realistic linear light. The way it fades the shadows is very pleasant. I always have it maxed out.
I suspect Deus Ex is using the same textures for most objects, but increases texture size for details like dirt, newspapers, graffiti, very similar to how The Witcher 3 deals with textures, where medium uses high quality textures for faces but low quality textures for clutter.
A little tip: in Rise of the Tomb Raider, just set everything to very high (beyond very high in some settings) and set anti-aliasing to SMAA. Solid 60 all the way on an i5 6400 and a 1060 6GB with 12GB RAM, even with settings that are just "On" at very high pushed higher, like Pure Hair I think. It even works in other games. Anti-aliasing hits everything harder, and just setting it to SMAA, not SMAAx4 or more, will increase frames from an average of 45 fps to 60, even reaching 70 in some areas. Tested on AC Origins, ROTTR, Kingdom Come Deliverance, DXMD, and Crysis 3. (Edit: SMAA doesn't look much different from x2, but is a bit noticeable against SMAAx4.)
For me, very high textures led to a shortage of VRAM on a GTX 1060, which should not be the case on a 6GB GPU. There was definitely a memory leak: I ran MSI Afterburner and saw the VRAM usage slowly climb even as I entered new areas. This disappeared when I set the textures to High. Maybe they've patched it now. I should go back and check it out.
@@ColdieHU did that before.. but it was on the 1366 socket with the i7 920.. and that was a triple channel platform.. nowadays it's usually either dual or quad.. so yeah 12 GB seems stupid..
What I've discovered, after getting sucked into chasing the frame rate and having the highest end motherboard/CPU/GPU, is that it's great if you're the type who is into building systems for bragging rights and has the money. Watching all the benchmarking and overclocking videos started taking me down that path. What is hard to find is videos stating what makes a great rig for real world playing. With the RX 500 series finally coming back down to a more reasonable price, it's possible to get a 580 with 8 gigs of high bandwidth memory. Take that and put it with a 144Hz 1080p FreeSync monitor; buying the two together is close to the cost of one mid-to-high end Nvidia GPU. From my experience, I wish I had figured this out sooner; how much money and trouble I would have saved myself.
@@nexxusty Past games do support it. Some with excellent scaling, and others may require an alternative SLI profile. However, over the last 2 years far fewer games support it, and it's been dwindling.
Nice, but one important feature was left out of this one: resolution scaling and its visual/framerate impact. What resolution scale should you choose, and is dynamic resolution scaling good or too extreme? Also, which upscaling methods are best and which ones should you stay away from? Update: Also, is it worthwhile to invest in HDR displays, and what impact does HDR have on gaming frame rates?
Very insightful video, Alex. Thank you for this. When it comes to PC Gaming, I usually want the best performance and visual fidelity, and find it a real hard battle to balance those two factors. Sometimes going full Ultra really doesn't do anything, we're talking about a really small, marginal gain in visuals, almost to the point where there's a diminishing return. Lately I've been on the quest of trying to reach 144 fps in all games at max settings on a 1080p 144hz monitor. So far it's been impossible. I went from a 970, to a 1070 Ti, to a 1080 currently, and I'm thinking of going for a 1080Ti. I simply cannot max out the fps without sacrificing a lot of settings. It's funny when people say things like a 1080 is overkill for 1080p. Maybe for 60 fps it kind of is, but anything over that and you'll be struggling.
Sometimes the expensive settings are worth it. Like VXAO in RotTR. It looks so, so much better than HBAO+ or even SSAO. You notice it in every scene, because the lighting and shadowing seem much more accurate and realistic.
But the massive hit to frame rate it has means it is worth it only for those with expensive hardware to match. Plus you can kiss DX12 goodbye, since it doesn't support it.
I always like to try Ultra settings on every new game I get just to see how my PC can handle it, but settings usually get turned down if I can't hit a consistent 60 FPS at my monitor's native resolution.
For the Deus Ex example, I would like to point out that instead of checking the floor textures you should check the billboard textures (which look pathetic on very high). One may argue that they don't matter, but in a stealth/exploration playstyle you are forced to see them up close, and they look gross. Otherwise a very good analysis, and you have really proven to be a good addition to the DF team (y)
Meanwhile I played the newest Deus Ex at 1440p full ultra with 2x MSAA and got 80-ish fps average :D GTX 1080 Ti FE + 6700K @ 4.6GHz (I have a vid on my channel)
Yea I'm not sure why he was focusing so much on the ground. Felt like he should be looking at walls or just objects in general, above the ground. I always thought that anisotropic filtering was the setting that had more control over ground textures and at different angles, not texture quality. That and tessellation. Whereas texture quality affected more the walls, doors, buildings, tree trunks, rocks/boulders, general objects, clothing, etc, you know, things that stand up vertically, up + down along the y-axis. I'm probably wrong tho. But that's what I noticed. Maybe it depends on the game too
Ultra settings are typically made for future hardware. It’s one aspect I really like about PC gaming. Returning to an older game with new hardware is really fun.
Not trying to be cocky, this channel is great and you actually learn a lot from his videos, but the fact that this channel is one of the few on YT to have metal music as background (besides metal music channels, obviously) was the main reason for me to subscribe to him. Yeah, I love metal. Kudos, my fellow heads!
@DigitalFoundry Ultra settings are great and all, but I think it's more than enough if you can get 1080p with locked 60 fps *at least* on PC gaming. You're not really going to actually notice much of these graphical settings when actually playing the game. What you will notice is frame rate and uneven frame pacing. So in short, I think that overall it's more worth keeping settings at Medium or High.
I agree with everything except I think no matter if the resolution is 1080p or 1440p, you should aim for locked 100 fps or so (and obviously invest that little bit for having a 144Hz monitor if you don't have one already). My mind was blown when I moved from 60Hz to 144Hz and I find higher framerate so much more satisfying than ultra settings. I mean I do like graphics quite a lot actually! But still..
@@65EKS65 I feel like it HEAVILY depends on the type of game. If I'm playing a more cinematic game then ultra settings are a lot bigger deal than high fps, however, if I'm playing a shooter then high fps trumps amazing graphics.
@@GaiaGoddessOfTheEarth Yeah I get your idea but I just don't know many "cinematic" games you mentioned. At least for me pretty much every game I play or have played have just been feeling better when it has high fps. Tho I don't have 4k monitor so I haven't invested that much into the image quality, for me 1440p is already good enough and I can play pretty much any game maxed too nowadays. I just love the smoothness of +100fps compared to ~60fps but I guess it can depend on the person too.
In summary, you'd likely gain 15-20% each (compounded) for reducing shadow draw distances (when it doesn't matter), texture resolutions (in cases where you're hitting the VRAM limit), some post processing (motion blur, depth of field) and 'extra detail'. Excellent stuff, love the way you present this too!
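(If you read those 15-20% figures as frame-rate gains that multiply rather than add, here's a quick back-of-the-envelope check; my own rough interpretation, not numbers from the video.)

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Four independent tweaks, each worth roughly 15% more frames, compound multiplicatively.
    const double perTweak = 1.15;
    const double total = std::pow(perTweak, 4);                       // ~1.75x
    std::printf("Compounded speedup: %.2fx\n", total);
    std::printf("A 60 fps baseline becomes ~%.0f fps\n", 60.0 * total); // ~105 fps
    return 0;
}
```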
This has been one of your best videos yet, since it caters to the majority of people with mid range cards. You guys should really consider starting a series where you pick a mid range card, e.g. a 1050 Ti/960 or somewhere along those lines, and try to optimize AAA games on it for 1080p gameplay with the best graphics-to-performance trade-off.
I cannot thank you enough, Alex. I was psychologically tormented because of the new releases coming up, fearing whether my hardware could meet their requirements. I have a Strix 980 and an i7 6700K with a 1080p 60Hz display, and at the moment, due to the rise in GPU prices and struggling a bit financially, I am currently unable to afford a new GPU for 1080p gaming. I was so shellshocked by new titles being announced every day and the thought that my GPU might not be able to keep up with them. But fortunately, after watching your video I tweaked some graphics settings and now get better performance at no cost, with better energy efficiency too. I'll be truthful: I used to be so obsessed with MAXING out every game. Later I learnt my lesson that maxing out not only isn't required every time, but can also be really taxing on my hardware.
Gopal Chatterjee the 980 is still a damn fine card. It's pretty much a 1060 and will still play at 1080p 60fps no prob for years to come. Sure maybe not at ultra, but like you realized, lowering a few settings here and there to high, it'll be capable of 60 for a good while, at that res. Even lowering to medium after a year or so after that will extend its life even further. Med settings aren't too bad these days. Definitely still very playable and the differences aren't that drastic, night and day, like in the old days. It's a lot more subtle. Sure still noticeable but nothing huge. You'll be fine. I got a 1060 so I'm in the same boat. If I want higher fps or wanna up the res in the future, then yea, might wanna shell out the cash. But I'm satisfied for now. Oh and don't forget, there's also the used market if you wanna upgrade in the future for cheaper. Although I would be wary of it esp for this gen stuff, as a lot of them would have been used to mine. But next gen amd navi might be a lil comforting. It's rumored to be 1080 level for a midrange price of $250 next year, so maybe that'll be something to look forward to, given your budget. Same price as 1060 msrp for 65-75% more performance. Sounds pretty good to me. Idk about nvidia but I'm sure they'll also have something similar, for their midrange. As of now tho their next gen stuff disappointingly looks to be quite exp and not much of an upgrade, same price for the same performance e.g. 1170 = 1080 = $500, 1180 = 1080ti = $600 etc. We'll have to wait and see.
soopasoljah appreciate your response man.. makes me feel a lot better. Thanks a million. :) And you're right, this hunger for ultra settings is absolutely ridiculous unless you're on a 1440p or 4K monitor, where it matters the most. I get a pretty good visual experience with high to medium settings and it's a lot less taxing on my GPU, so it's a win-win. You won't believe how obsessed I used to be.. I didn't even think twice.. even the MSAA settings... 8x... I know... I know.. then I'd start to hate my PC, then I'd delete the game.. then another game releases.. same vicious circle lol.
Also just remember. Maxwell (aka GTX 900 series) was a silly good overclocker. 980s reference from Nvidia run at about 1100mhz, but I have seen them get up to the 1500-1550 range with good cooling and some clever tweaking. Useful to know if you need to elongate the life of your card for just a bit longer!
I watched through the whole video and he actually has some very valid points. When you're actually playing a game you won't be able to tell the difference between minor tweaks or between High/Ultra.
Ambient Occlusion is pretty much my worst enemy here. Most games run under 20fps at 720p when enabled. If there's no option to disable it I can still decrease the resolution to 480p. Some games let you do that in the ini files if it isn't possible in the game options. My target frame rate is always 20 - 30fps.
That usually works for most games. But some games (Yooka-Laylee in my case) won't let me do that. There is a mod that disables bloom, DOF and AA via F-keys, but so far no option to turn off AO. At least it runs just fine at high settings and 480p. Kinda feels like a Rare game for the original Xbox.
I don't care for ultra settings, but for my first rig I would want ultra settings to be playable at high fps so I don't have to worry about my PC not lasting a good while. Basically I would rather spend more money on something that can last me a long time than actually buy the budget parts that might drop in value.
Having a PC that can run ultra settings is one thing, and enjoying the game is a whole other thing. Sometimes we are so distracted by these settings that we forget why we are playing in the first place.
I play for sweet visuals. I barely used fast travel in Odyssey or The Witcher 3; I always like to travel on horseback and enjoy the view, which I can't with low/medium graphics.
The only instance where I find myself playing on mostly Ultra settings is DOOM 2016. That game was VERY well optimized for all types of gaming rigs, whether budget build or beefy. I set everything to Ultra with shadows and particles set to High, and motion blur set to Low. Other games I usually set texture quality to Ultra with everything else set to High or Medium and, in some very rare cases, Low.
Every game: select Ultra, lower shadows to medium, turn ambient occlusion off, reflections on medium. Framerate limited to 3 frames below your monitor's refresh rate. Vsync off in game. Vsync on in the Nvidia Control Panel. G-Sync on. YOU'RE WELCOME.
I do tend to just push everything up to ultra/highest, as long as I can maintain 4k/60fps I'm usually happy. If not, the first thing I turn down is resolution and not settings.
Cmdr Flint Yup. 4K isn't worth it. 1440p isn't worth it especially on my laptop (which is equipped with a 1920x1080 monitor). I max the settings out and thanks to the i7 7700HQ 16Gb of DDR4 at 2.4Ghz a GTX 1070 and an SSD I'm still comfortably running nearly all current-gen games above or at the worst very close to a locked 60 FPS in 1080p. Why would you turn the visual quality down, but increase the resolution? Seems just stupid to me. Still rendering even at 512x512 using Cycles (Blender) can be painfully slow... 😭
Cmdr Flint yeah man, 4K is not worth it if you can't have ultra settings. Who wants to see blurry textures in full-res 4K anyway. At least in 4K you don't need anti-aliasing.
I like that answer. I too value better settings over resolution. (4K at low settings is still awful, I don't care what anyone says.) Just play in windowed mode one resolution down for the best results, imo.
Sometimes having settings on low can be a great advantage rather than setting them on high. Take PUBG for example. P.S. Players at a distance can be spotted more easily. For me, I guess.
Yep :3 there are multiplayer games where low/minimal settings mean fewer objects on the screen. So our opponents can't hide behind thick tall grass, for example, because the grass becomes thin or even disappears on the lowest settings XD
Speaking as someone who went all out with a PC rig about a year ago on a 65" Samsung MU8000, there were times when I cranked up the resolution and the settings to try and get the best picture possible, but it didn't matter because I had already hit the point of diminishing returns. I found that at my viewing distance, 4K offered nothing over 1080 and that I couldn't tell between Ultra and High for some options. I've decided that performance matters more than graphical perfection, and as long as the game looks decent I'll have a blast.
I usually go for manual. In most games, setting shadows to low or the lowest level provides a trade-off that allows much better quality in several other settings.
Great video Alex. I run an i7 4790K and a GTX 970 and must admit I rarely go into the graphics options, as I don't really understand what most of them do. I kinda find the whole massive range of options a bit overwhelming. Thanks to your video I feel able to explore these settings a bit more :).
I currently have a pretty good rig (an Asus ROG VR72gs laptop), but I don't even play recent games on "ultra", for 2 reasons: fluidity and gameplay > beauty, and... I don't like making the hardware heat up for so few bonus details! The goal is not to push the rig to its limits, but to be able to push it while using it efficiently and not having to push it! Just like you want a LOT of audio power to get good sound while under-using it. It's the key to stability; that's just my opinion, however.
Generally, if you don't want to tweak it or you are afraid of all those settings, just set it to the high preset. The reality is, things have changed a lot since the times of the original Crysis, when you were actually missing out if you did not crank it to ultra. Today developers are more aware that people with midrange or even low end machines by far outnumber top end users, and they want to cater to that. So a lot of games are even designed to look good at medium. Sure, high is still a visual step upwards, but they will more often than not make it unnecessary for enjoying the game. And ultra is often a really small step up for a huge performance penalty. Which is why I always agree with the statement "high is for playing and ultra is for screenshots". Because yes, on a screenshot it is more noticeable, since there is no action going on and you have time to focus on the details. But those are details you likely won't notice in the heat of the action.

Also, if you have a GeForce card, trying out GeForce Experience settings can in some cases be really worth it. I had plenty of cases where GeForce Experience would set settings in a wide range from high to low, and it would create a visually really great experience while still being fairly low in demand. I am not saying it is always on point, but from what I saw, it didn't do all that bad of a job. So in at least some cases it might be worth a shot if you don't want to experiment yourself. But as shown, of course the best way is to experiment a bit and find the best mix of quality and performance.
Raios Rogue Consoles usually run games at medium-high at 30fps. If he wanted to run on medium-high on a PC, he'd also have the option to play at 60fps, an option you don't typically get on consoles.
I grew up playing in the lowest settings, low resolution and got 15-20 FPS. In some cases below 10 FPS. I got a GTX 1070 when it came out. You can bet your ass I want Ultra Settings.
*and here I am perfectly happy with a GTX 770* haha. Witcher 3 (and other games) still looks and runs amazing to me, and I'm cool with silky smooth 30FPS. I used to chase top-end hardware and max settings, but it's just a money sink and nothing is ever quite good enough - it never ends - it's kinda weird and a little sinister... but yeh
Yeh I know it's old in the tech world, and I didn't say it was perfect or could do anything; I was just stating a little experience I tend to have when watching DF videos.
I also have a 770 and share your thoughts, especially with today's hardware prices. I bought an Xbox One X and haven't regretted it, much lower price, longer relevancy/lifespan and close enough graphics when compared to a good gaming PC.
Satyasya Satyasya That doesn't sound right, I have a GTX 960(~as powerful as a 770) and Witcher 3 easily runs @60fps w/ a mixture of high and ultra settings on my rig.
I have a 2GB 770, but instead of 1080p at 30fps I lowered the resolution to 900p in order to get a constant 60fps with medium/high settings and no HairWorks. Now that 1070 Ti prices have come down closer to MSRP, I hope I'll get one soon.
A player really has to prioritize which raw graphical and post-processing effects they want to maximize or optimize. For example, I never compromise on ultra textures or extra lighting (for textures: I'll compromise one setting lower when my GTX 980 cannot handle anything above 4GB VRAM). Also, not getting 60, 90, 120 or 140fps is not a big deal. Be prepared to settle at "mid" points like 48fps, 75fps, etc. It works just as well, or near enough to make no matter.

The biggest factor in performance is stability: I don't want my game exhibiting the old term of "choppiness", where framerates hike and dip all over the place inconsistently. A locked framerate -- even one limited by the game settings -- is much more preferable. I can play many pre-2016 games at >60fps, but I choose to lock it at 60 when the games allow me to, to spare the GPU and keep it in sync with my monitor's refresh rate.

Also keep in mind power consumption. I'd rather tweak and optimize (maximize) asset settings like effects than pump up resolution. I prefer marathon gaming sessions on my PC, so I can rack up relatively huge invoices from my hydro provider when I game at 1440p to 4K. By lowering the resolution, you can afford to do this while greatly freeing up resources to pump out performance. (Resolution apparently eats up more of your GPU's attention than virtually anything else.)
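For what it's worth, the idea behind a frame cap is dead simple. This toy loop is purely illustrative (the function and names are made up; real engines, drivers and tools like RTSS do this with far better timing precision), but it shows the principle of never starting the next frame early, which is what keeps frame times even:

```cpp
#include <chrono>
#include <functional>
#include <thread>

// Toy frame limiter sketch: cap a render loop at a fixed frame rate so frame
// times stay even instead of hiking and dipping.
void runCappedLoop(double targetFps, const std::function<bool()>& renderFrame) {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetFps));
    auto nextDeadline = clock::now() + frameBudget;
    while (renderFrame()) {                            // returns false when the game quits
        std::this_thread::sleep_until(nextDeadline);   // wait out any leftover budget
        nextDeadline += frameBudget;                   // schedule the next frame
    }
}
```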
I like ultra settings so that when I revisit a game in 5+ years they're there for me.
Honestly, that's mostly how I think they're supposed to be used ;p
@@KindOldRaven Yes. Ultra is for "future hardware".
It's been 5 years; time you revisit said games.
You guys should start doing settings guides for games.
YES!! This would be amazing.
That would be lovely, we need someone to pick up after Nvidia. Preferably using midrange cards from both manufacturers.
Clell Biggs Very good idea!
Big yes to this!!!
They do it already. Not for every game of course.
My graphics settings in most games
Pick high settings, set textures to ultra and shadows to medium.
Same here!
Same, and it always looks fantastic and I love it... Having shadows set to high or very high or ultra looks great, but medium shadows look good too, and you don't notice the difference too much when the rest of your settings are a mixture of high, very high, and ultra textures etc. at 60fps in 4K.
@@dawgzofwar6674 I think the difference in shadows in The Witcher 3 is very obvious.
I set shadows and motion blur to off most of the time and only focus on textures and anti-aliasing, because I can see my enemies more easily with these settings. But sometimes I need to see the shadows, and then I set them to the lowest possible. I even play Burnout Paradise without shadows.
Same!!! Never turn off shadows. The game looks ugly and pale. At least medium, even if every other setting is low.
I often get more enjoyment out of pumping all the graphics to ultra than actually playing the game.
A man of culture
Preach
That says more about the quality of AAA games than anything else.
That's me 😂
Biggest reason I got Witcher 3 for my new laptop... Now looking for some more demanding games... 'cause I'm sitting over 100 fps maxed.
This is still one of my favorite DF videos. So sensible, and backed up with ample visual evidence, but really helping out most PC gamers understand how to tweak their games.
A part 2 with various Anti Aliasing Filters in various Resolutions (for example: How high does AA need to be at 1440p with a native 1080p screen) would be very appreciated.
The higher your resolution, the less you need AA.
That's generally a true answer, but every game acts differently. Playing a game like Borderlands 1 with its comic design is way better at 1440p. But the performance hit is another question, whether playing at 1080p, high settings with 16x AA, or at 1440p with moderate settings and lower AA is worth it.
Depends on the game and/or engine. At 1440p I can play Wolfenstein 2 without AA, but GTA5 absolutely needs at least 2x MSAA or FXAA to get rid of them jags.
Why would you be running the game with AA at 1440p if your monitor's native resolution is 1080p?
ThrashingBasskill I can't think of any title at 1080p or up that needs 8x aa let alone 16x aa. At that point you might as well downsample from 4k.
Me using ultra settings:
-for better graphical quality and visual fidelity-
For bragging rights
Alex is a great Addition to the DF Crew
Especially now that he's found a good rhythm for his voice.
I like that his videos are nice and simple, too. No distracting music, no fuss, just technical information.
MegaManX4 Him talking is snooze inducing. DF should consider finding someone like John or Tom who can present videos without making us sleepy
L
I like his German accent. It's pretty soothing. When he said "Wolfenstein" and "Uber" I nearly came lol
I don't even play video games. I just want to brag to strangers on the internet about how powerful my rig is.
This is the most sincere comment out there.
it might be sincere but that just makes you an asshole
He was being sarcastic, satirizing certain people who don't play games, but brag about how much money they have or spent on a PC and its components. Some do play games and brag though too. It is sad, but to each their own.
I barely game anymore, yet I built a 7k$ rig because, SK it btch! Bragging rights. And yes I'm an a$$hole!
@@19971997gt ok zoomer
And then there's Doom (2016), where you can jack up the settings and still rock 120-140fps in heavy combat.
I would advise, though, to put sharpening at about 20%-25% to avoid sharpening artifacts that actually degrade the image when up high (forms white, noise outlines on details, ruining the color gradations in many cases).
It's been proven in the past that you don't need a beast of a system to play FPS games. Remember how awesome Half Life 2 looked? The specs for that game were minimal and it still looked incredible. Even running Crysis on its release at the lowest settings still looked impressive.
Thanks for the tip, gonna be replaying DOOM in 2020.
@@aiden_macleod As someone who played Crysis in 2007, I can assure you the game looked like utter dogshit on Low. Medium was where it was at. Anything beyond that, you needed to either get a loan or consider selling a kidney on the black market to get a computer that could run it lol
Really? I only get 30 fps as a standard.
I am kinda shocked that nobody is pointing out that names like "low", "medium", "high" and "ultra" are just arbitrary names chosen by developers. I can add my own graphics setting above ultra and call it "deez nuts". It changes the depth of field sample count to 10,000 samples per pixel.
Afterwards people will complain about how unoptimized my game is cuz they can't max it out on STATE OF THE ART HARDWARE.
This is a constant grievance of mine as a developer.
However, in defense of all the users: most gamedevs do a *TERRIBLE*, ABSOLUTELY GARBAGE SHIT JOB of the following three points:
- Conveying wtf the graphics option you're changing actually does.
- Conveying the visual difference of said graphics option's quality levels.
- Conveying the performance hit of said graphics option's quality levels.
If they actually got their goddamn shit together for once, maybe we wouldn't need these damn videos and tweak guides. How many more times do I have to read something like "SSAO: Adjusts the SSAO quality."
or
"Use compute shaders".
However seeing the code said developers write and what comments they put, I have little hope. They're exactly the same.
"GetOcclusionFactor(); // Gets the occlusion factor"
Damn straight dude. I've been doing this stuff a long time, and when I was younger it was fun keeping up with all the new terms and technologies; but generally we just had simple stuff like AA, AF, dynamic lights and bump mapping to worry about. Then there came a point a few years ago when I realised I just had no idea what the hell all those sliders were doing any more, and what's more, I wasn't sure I cared enough to find out. Every new game I installed would take a term I recognised and throw a random new letter into the acronym.
As I get older, I want to screw around with the settings less and less. Don't get me wrong, I still love to tinker and find that sweet spot. That's WHY I'm a PC gamer. But I hate spending hours flipping every setting on and off again in that trial and error ritual, when instead the devs could just tell me what in the goddamn fuck the "post processing" box actually, specifically, does.
It might seem silly given what you take for granted as a dev, but you actually gave me quite the epiphany with this post, for contextualizing just how much ego there is in my selecting those menu options despite having less and less understanding of what they even do as the years go by. So, thanks for that. :)
Game settings need technical writers.
Non-descriptive yet self-evident descriptions ftw :p (Good comment!)
I never liked it when my friends brag about their rigs and what settings they run at but I'd gladly be a hypocrite if it means showing off deez nuts
I really like the look and feel of classic PC games; there's something particularly enchanting about retro graphics. But after more than 20 years playing games, I've come to the conclusion that what keeps me going is the gameplay. I have to admit that for some people ultra settings are an addiction once you get used to being able to run them on your hardware.
I feel the same way. Amazing visuals are nice to have, but gameplay is what makes a game special
I’ve always advocated for this. My cousin always looks for best story but can’t stand playing some games because it was so boring to play.
Gameplay trumps every aspect. The story and graphics could be amazing, but if I’m not having fun, I won’t touch the game
I used to have an Ultra settings addiction. Thankfully I was cured before dropping my money on a GTX 1080.
The game at ultra in this video isn't even remotely close to how it looks in reality; this is due to the low bitrate of the uploaded video, enforced by YouTube's heavy compression. For reference, The Witcher at Ultra in this video looks like Medium settings. If you want an actual representation, search YouTube for "The Witcher Ultra" and look for videos in 4K, in order to ease the compression and increase the bitrate.
Publishers/developers now focus more on trends... unlike 10-20 years ago, when we actually had more developers experimenting with various types of games (nowadays most of them have been acquired by bigger companies)...
I've been running low and mid range GPUs my whole gaming ''career''
I'm pretty much an expert at this :P
Chasing the higher end cards is often pointless. Much eye candy is not noticeable unless you start comparing it or have mad OCD.
Lee R Definitely agree. Mid range is the sweet spot. With low end you end up having to sacrifice too much, with high end you end up paying too much for something you can barely tell apart unless you're carefully comparing 2 things side by side, unless we're talking 4K on high/ultra, but then that's almost like comparing apples to oranges and the price at this point just isn't worth it for most people.
chrisutubeism - you need new eyes.
adm
It's true, graphics in video games just aren't improving at such a rate that the difference is that obvious. It's a very different situation from 10 or 20 years ago. Hardware makers have just managed to convince people they need all the tiny little details cranked up to the max to enjoy a game, and at a resolution which is far too high for the size of their screen. I mean, PC gamers today seem to think they need as many pixels in the 21" computer screen they view from one foot away as in the 70" television in their living room that they watch from across the room.
@@BigUriel yup
The problem is that many PC gamers have in fact been brainwashed into this "Ultra" or nothing mentality. They think they're somehow "hardcore" but the reality is that they've been doped into buying into something that offers almost no return on investment.
The marketing campaign has worked on these poor souls. They erroneously think, for example, that a GTX 1060 or RX 580 is not a "legit" 1440p/60FPS card because it can't run EVERY game on "Ultra" at that resolution. It's the same story for the GTX 1070, 1070 Ti, 1080 and 1080 Ti.
I know with my previous card, a GTX 1080, I could play the vast majority of games at 4K/60FPS, just not at "Ultra" settings. I now have an EVGA GTX 1080 Ti FTW3 @ 2025MHz paired with an 8700K @ 4.8GHz overclocked under an H100i V2 and 16GB DDR4 RAM @ 3600MHz, and I still can't play EVERY game at 4K/60FPS on "Ultra" settings, but I would still consider the 1080 Ti a legit 4K/60FPS GPU.
The difference between 1440p and 1080p is not so great that you should sacrifice a 144fps experience just to have the higher resolution. 1080p @144fps simply gives a much better gaming experience than 1440p @60fps, so that part of your comparison is just wrong in my opinion. In general, yeah, I admit there are many people "brainwashed" into thinking they "must have ultra", but I don't really care as it doesn't affect me. :D
My friends and I have decently high end PCs, but we are not obsessed with ultra settings so much as with not having fps drops (at least this applies to me, if not to all my friends lol). I play every game at 1440p @144fps, and if I see it dropping much below 144fps I turn down a few settings to get that stable 144fps. My next upgrade will be when 4K @144fps actually becomes doable with a reasonable investment, so I think I will keep my current setup (updated last year) for a good 3-5 years. If games start to require much more, then I'll just keep adjusting the settings lower so that I can still get the 144fps experience.
What bothers me more than anything is unoptimized games. I typically turn settings up to max but that's just because I can. If I'm trying to 1440p 144fps I'll turn settings down to hit it. But when I have to turn down my settings to low or medium to come close to that while only having 50% GPU usage and 30% CPU I get annoyed as my $1500 setup shouldn't be forced to run at such low settings even for 60fps at 1440p.
"but the reality is that they've been doped into buying into something that offers almost no return on investment. "
You're wrong on that. I've bought cards with the idea of achieving ultra settings when in reality I could only just achieve it. My investment was returned about 6 years later when more demanding games came out and I could still play them at 60 FPS on medium-high. Now I have a GTX 1080 and use Nvidia Surround. It should basically be good for the next 6 years.
Nah, I'd only consider the 2080 Ti the 4K 60fps card. The 1080 Ti is really the 2K 144fps max-everything card.
Not really, I just hate aliasing and like to get the most texture and model resolution and best lighting out of everything while maintaining at least 60 fps. You're way overcomplicating this.
A bit disappointed that this didn't even mention AA. I built a Ryzen machine with a 1070 to use with a 4K TV in my living room rather than buy a console. A lot of people told me this would be "unplayable". It's not. At 4K I usually turn AA off completely. It affects performance and at this resolution, a few ft away from the TV, aliasing is not noticeable. As mentioned in this video I do play with graphics options too. Most of the time 40 FPS is perfectly fine too. It doesn't have to be 60 FPS or go home.
Thank you. So many retards think you need 2 gtx 1080tis for 4k, while you can VERY well play any game at 4k with only a 1070 or even 1060 if you use your brain and use proper settings.
Hey, just for the record, once you're playing in 4K you don't actually need anti-aliasing at all, so you should always have it off when you're playing in 4K, because it just hogs up massive amounts of your GPU and PC resources while giving you no returns.
fraggsta I also play at 4K; you can easily hit 60fps with decent settings on a GTX 1060 if you're smart about it.
@@anotherfan2870 Not that I don't believe you or anything, my dude, but I find that extremely hard to believe in general... The 1060 series is at most a 1080p card, or maybe 1440p at extremely low settings and frame rates... Not saying that you haven't been able to accomplish this, but if you did, you're barely running at 30 frames per second, if even that, with almost all of your settings turned down to medium or low; otherwise there's no way you could be running 4K anything on a 1060... And for the record, if you're playing in 4K you absolutely do not need anti-aliasing of any kind; this is a fact, check it out online, just Google it... When you're playing at 4K resolution the only thing anti-aliasing is going to do for you is hog more resources and put even greater stress on your GPU and your entire system as a whole, which is completely unnecessary... So turn off anti-aliasing at 4K and use the VRAM you just saved to turn up a couple of other settings, like shadows from medium to high, and so on and so forth.
@@bobmarl6722 You're not going to be able to play any game at 4K and 30fps on a 1060 unless you have all your settings turned down to medium or low, including anti-aliasing and everything else. But then again, you don't need anti-aliasing in 4K, so that doesn't matter. My son has a 1060 in his laptop and he can't do anything beyond 1080p Ultra, because the 1060 series is a 1080p card.
The hard work you guys put into videos like this is just awesome. Liked and shared. Keep it up.
and killing the hard work of the developers who made some Ultra settings to stand out.
That was never the intention of this video. He acknowledges that Ultra settings are great for future proofing a game, but for a mid range card they're a waste of valuable GPU power.
Vishal Gupta did he say anything about not having ultra settings?
I am using a GTX 1070 AMP Extreme with an i5 6600K on dual 1080p monitors. I normally play on high settings, and I have personally not seen any big difference between high and ultra, while the fps difference between the two is massive, so high is OK for me.
Yeah, I use only one, which is 144Hz; the other 60Hz one is for multitasking only, which I mostly use for Discord, Spotify and Facebook/Twitter.
Good to see that this channel isn't just about making performance vids, and actually explains shit without leaving the average viewer guessing, so you can actually learn a thing or two about computers.
Thanks Randy. Love your ava here :D
Yup, don't treat audiences like morons and you won't get an audience like that! Everyone wins. I love educational videos, and so do many others.
Hi Mr. Marx
Ultra = use it if you can max out everything
High = beauty and FPS
Medium = more FPS but looks good
Low = not that great but it gives you FPS
15% resolution scale ultra low 15FPS : well it runs
Robert Lebrun 8700k 5.2ghz and 1080ti at 2.05ghz = still use high
Fps>>>>>>>>ultra
Eh, even if I CAN technically "Ultra" a setting, if I can knock it down to "high" and get 10+ more FPS with a barely noticeable visual hit, I will take that extra smoothness/responsiveness over subtleties in the image that will be lost in motion ANY day, but that's me.
This sums up everything without making such a huge video !!!! :P :P :P
A few things aren't correctly presented in the video; there are places where there is a difference and they're trying to say it's similar. Well, "Ultra" is Ultra, and your comment sums up everything.
i ultra everything out all i ever do is turn down AA
gtx 1080 ti FE 2038mhz core 5951mhz mem and 6700k 4.6ghz running 1440p 120hz 1ms
Ultra is not enough; in some games you have to go into the manual settings and put everything to the maximum
and go into the Nvidia control panel and set texture filtering to high quality and negative LOD bias to clamp
My personal feeling on this is that tweaking game settings for your most beloved games for that best balance between graphical fidelity and frame rates is just as fun as tuning hardware for that optimal efficiency point on the frequency-voltage/power scale.
This was incredibly informative, now I won't have to worry as much about missing out on graphical details on ultra settings when my rig can only support smooth performance on high or lower. Thanks for alleviating that.
That "Über" was really pleasant to hear
"Eu-bar"
Most games barely look better at ultra; medium-high is fine.
Ultra and high settings usually doesn't look that different from medium nowadays, but back when Crysis released, the difference could be night and day. Baseline visual fidelity is so friggin' high nowadays, dialling down a couple of settings doesn't feel like giving up key parts of the experience anymore.
For me it should at least be high to ultra settings. I'm only going to use Medium settings if the game is really demanding and doesn't get 50-60 FPS on average.
Ozziw162 fuck. Crysis 3 looked amazing even on Consoles.
I'm already contented with the graphics of my ps4 slim. Oh and btw not barley, it's barely
Absolutely Ultra is pointless. You don't see a difference once you are playing.
The amount of work that went into this video is so high. Great quality content.
These setting are resource intensive but you don't lose much visual fidelity by dropping down a peg and you gain a lot of performance.
>Leaves motion blur enabled, which looks worse and costs performance.
I think developers should focus more on the scene composition (and the manual crafting part) rather than pure resolution and detail. Things like lighting, shadows, contrast and colors are much more important during gameplay (which is 99% of the time) than having 4K textures, which you can only appreciate if you stand and stare at them.
loved when Alex went full Deutsch during the Wolfenstein bit.
@@BITCOIlN über with an üüüüüüüüü
Ultra settings really are beyond pointless. The increase in visual fidelity is marginal, yet the performance difference is insane. In this video we had to stop and analyze a scene carefully just to see the difference. If your hardware can run ultra then great, run it, but if it can only do high then there is no reason to upgrade. The vast majority of people won't even notice, or care to notice, even a mix of medium and high. Once you are immersed in a game and are having fun, the last thing you really care about is whether you can see a higher resolution shadow five miles away on some obscure tree in the background.
And yet, if you really are immersed and don't just run around pew pewing people, you start to notice every little detail. I've lost count of how many times i was in the air in far cry 5 looking at the horizon and thinking, wow incredible foliage. Maxed ofc.
osricen 😂😂😂lol
That is because most games are designed around Medium-High settings due to consoles. Which is a good thing, since it increases the life of our PC components before we absolutely have to upgrade. Usually the best performance-to-fidelity ratio is achieved at the base console settings, because that is what the game, levels, models and set pieces are designed to run on. There are very few games, like Battlefield, The Witcher 3 or Doom, that actually take advantage of your PC hardware. I have a GTX 1060 and most of the time I just lower the settings from ultra to all high just to get a stable 60, or I just use GeForce Experience's recommended settings for a smooth 60. Ultra settings are usually just overkill features like MSAA, or experimental features like VXAO or HairWorks, that tank performance hard.
On older games I use Ultra settings; on newer ones I usually have to blend between high and medium, just due to not having the super gaming PC that is required for ultra or high all the way through. Mine plays Skyrim Special Edition on ultra and it looks great, and the FPS never dips below 50 frames, which is good. I usually get concerned if the FPS dips drastically while playing: if one area gives you 60 fps in any situation and all of a sudden you're in combat or whatnot and it dips below 30, that usually tells me a setting is too high.
And that's why having a console makes Sense
Do you really need ultra settings? Of course not. Do you want ultra settings? Of course.
amen
"want"? I want the game to run well.
The question is: are ultra settings worth the cost? The answer is no.
That's subjective. For me, it's worth it. For you , it may not be. A Rolls Royce will get you to the same places as a Hyundai, but some want the extra comfort and trimmings.
REDDY71 Of course, but the difference is not that big; if you have the money and there isn't anything more you need/want to buy, then yeah, go for it.
I keep textures as high as possible and the other settings medium to high
Props for pronouncing "Ü" correctly.
Sincerely, a German viewer.
Alex is German.
How do people usually pronounce "ü" then? :o I hadn't even paid attention before I read your comment :D
@@65EKS65 Most English speakers usually pronounce it as a plain u.. "uber" for example.. or "fuhrer".. whatever.. heard it so many times..
@@Dunkelelf3 Oh OK, that sounds so weird though, for example that "fuhrer" :D Maybe I always subconsciously translated it in my head as "ü" then. :D
@@65EKS65 i pronounce uber as you-ber,
I've found that the difference between graphic presets has become less and less drastic, especially after the current console generation launched. Do you think that there will be more of a difference in the future, say around the end of this console gen?
Differences are becoming minor as time goes on. You can only make GPU cores so small before they start breaking easily. We need a new method of making processors if we're going to see a big leap again.
We're reaching a point of diminished returns in terms of graphics in my opinion. I think we'll see more of a push for CPU development in the next generation; games over the last year or so have become increasingly more dependent on CPU resources. I hope the Ryzen/PS5 thing is true because that is a fantastic chipset.
I clearly see the difference in CPU use between games from before 2015 and from then on. I have an i3 6100, and almost every game (even open worlds) uses my GPU (RX 470 Nitro+ 8GB) to 100%; the CPU isn't the bottleneck in games from before 2015, but in games after that it's really rare for me to find one where the CPU isn't the bottleneck. (I know it's only a dual core with HT and it's weak, but I think I can only see that difference because I've been playing on a low end CPU.)
Depends on the game/setting, but in general you see the most noticeable difference going from "low" to "medium", with the variance becoming increasingly subtle after that.
Yeah, agree with this. I realized it with The Witcher 3: on my GTX 660 I set mostly medium and a few high settings. When I upgraded to a 1060 6GB I set mostly very high and a few high. And despite the fps difference (30 vs 60), visually I didn't mind at all with The Witcher 3 on my GTX 660. It looked good enough (at least on my 1080p monitor) and I didn't feel the difference that much.
Not sure if you are German, but I like the way you pronounce all the German words in Wolfenstein
He is indeed german. At the end you can hear "und auf Wiedersehen" which is german.
I think he's German.
his pronunciation caught me off guard lol
I thought so too, because his German pronunciation was so perfect at the beginning where he said "Angst", and also in the Wolfenstein part. But then I hesitated again, since his English also sounds so good. Since I'm not an English native speaker I might not hear it as well as German, but it sounds great to me.
And certainly, great work in general! Awesome video!
Yup, "Angst" was an instant giveaway. :)
Backtracked a couple of times just to hear Wolfenstein pronounced just right xD
And über
Is he German?
John Chapman yes
No. John Linneman said Alex was American.
NovaPrima Listening to the "und auf Wiedersehen" at the very end of the video, I'm 99% sure that he is either a native German or has at least lived in the country for many, many years. It's just way too perfect :D
I am mainly a PC gamer.. for many many years! And I learned something new here and there. So.. thank you for these kind of videos. Keep it up with your great work guys!
This should honestly be required viewing for everyone that plays games on a PC. I've lost count of the times I've explained this concept of settings optimization to people. You take it to the next level with this video though. It was really great to learn about what is actually happening "under the hood" with certain settings, and why they can sometimes be such strains on performance. This is top notch content right here! :D
The logical and rational answer is frame rate over visual fidelity.. Unless your machine is powerful enough to achieve both.
Martian Manhunter Exactly!
Personally, even if I can run a game, I like turning the settings down so my fans stay quiet.
I COULD run Overwatch with high texture quality but my GPU would be really noisy.
very ratio
much logix
wow
No machine can do ultra 60fps at 4k.....
I would add frame rate AND frame rate stability over fidelity. There is such a thing as unpleasantly choppy 60 fps.
Loving the pc focused content keep up the good work, DF. 😀😀😀
Where's Prince Dizzy anyway?
Nice meme
Mr. Boppin living under a bridge lol
At around 10:30 you used the word "poignant" to mean something like "clear", "noteworthy", or "salient", but the meaning of "poignant" is tied specifically to how emotionally affecting or sentimentally charged something is, not simply how salient it is. Just a tip in case you use that word regularly in your videos. BTW I found the content in this video on the whole well-written and very informative so don't take this as any sort of blanket criticism of the script.
Thanks Ryan! Not always speaking english with others does this I think :D
Best to you and thanks,
Alex
WTF?
When your a English major
Mark Martin you're*
@@tigerheaddude shut up
I'm upgrading my pc for the first time in 10 years, after playing lots of console or new pc games at really low settings. This video really helped me to stop stressing about needing the newest and best graphics card to really 'experience' the games. Thank you, friend.
Really good descriptions and sound advice in this video, great job. What bugs me the most is when I see people using "high" or "ultra" as reference points, like, "can machine X play games at ultra?" or something to that effect. Or admire games that run slow on "ultra" because that means the graphics are amazing, right? All those labels are completely arbitrary. They literally have no meaning outside what the developers assigned to them in that one specific case in that one specific game.
And those "ultra" settings can really get unreasonable, but that's kinda their point. So don't feel bad if you can't run the game well at those settings. It's really not hard to make the game more CPU and GPU intensive by just setting a couple of values to unnecessary high numbers. Like, you'd probably need to change a number or two in a reasonable modern game to make every individual blade of grass always fully render in 10 mile radius and call it "super-mega-ultra vegetation" or something. Does that mean that the game has amazing graphics just for that? No. Should you feel bad that your PC can't run it nor possibly any PC that will be made during your liftime? Absolutely not.
Unless it's a very old or undemanding game, dial those things that you can hardly even notice back a bit, it will be more worthwhile to ease the strain on your hardware, making it last longer and consume a bit less power, especially if it's a laptop.
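To make that concrete, here is a tiny sketch of the idea. Every name and number below is invented purely for illustration (it isn't taken from any real engine or game); the point is just that a preset is usually a label over a handful of raw values, so adding another tier above ultra costs the developer almost nothing:

```python
# Purely illustrative: preset names are arbitrary labels over raw engine values.
PRESETS = {
    "low":    {"grass_draw_distance_m": 40,    "dof_samples": 8},
    "medium": {"grass_draw_distance_m": 80,    "dof_samples": 16},
    "high":   {"grass_draw_distance_m": 150,   "dof_samples": 32},
    "ultra":  {"grass_draw_distance_m": 300,   "dof_samples": 64},
    # Nothing stops a developer from shipping this and letting hardware melt:
    "super-mega-ultra": {"grass_draw_distance_m": 16000, "dof_samples": 10000},
}

def apply_preset(name: str) -> dict:
    """Return the raw values hiding behind a preset label."""
    return PRESETS[name]

if __name__ == "__main__":
    for name, values in PRESETS.items():
        print(f"{name:>16}: {values}")
```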
Keep up with the PC gaming content!!!!!!!!
ArcaneThinker you imply content where there is none lol
I just set everything to max and if the FPS is good enough I leave it there, if not I try and first slightly lower the "fancy" settings.
Lowkey flexing on us with your German pronunciations lol
Above all else.
@Führer des Benutzers German language, German language
@@disputeone Uber alles
It sounds stupid as fuck. Even the devs don’t pronounce it like that.
@@VaydaladaVodalada Well Alex is German so Im pretty sure he has more authority on how to pronounce it
Instant Like for the StarCraft soundtrack! That music gives me goosebumps!
I learned this 15 years ago. Ultra settings do not mean an Ultra experience. New PC gamers need to get that into those thick skulls of theirs.
Yeah, they don't really put enough into ultra for it to be worth it.
H Koizumi Totally agree with you. I'm running an i7 8700 and a GTX 1080 Ti, and I find it funny that I still prefer high/medium. I really don't know why..
@Transistor Jump No, I agree with him. I've seen people saying "I'm switching to console gaming" or "PC gaming is a waste of money" all because they can't run games at 4K max settings lol.
I expect to be able to play any game I own at ultra and get 60fps, but that's because I have one of the best GPUs on the market and my system should be able to handle it. However, I do have a 1440p 144Hz monitor, so I do turn settings down to be able to hit 144fps. But really, what gets me is unoptimized games. When I have to play at medium settings to get 60fps with 50% GPU usage and 20% CPU, it just bothers me.
You don't even have a 2080 Ti, you aren't saying anything good.
Remembering what your screen resolution is actually is the biggest thing people forget to do. They run ultra textures, which are sometimes like 4k textures, on a 1080p monitor, when textures half the size, or even a quarter of the size would look exactly the same in most circumstances. It eats up more GPU power for literally no visual improvement.
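For a rough sense of the memory side of this, here's a back-of-envelope sketch. It assumes plain uncompressed RGBA textures with a full mip chain just to show the scaling; real games use block compression, so the absolute numbers would be smaller, but the quadratic growth is the same:

```python
# Rough sketch: memory footprint of a square texture (uncompressed RGBA8,
# mip chain adds roughly one third on top of the base level).
def texture_mib(side_px: int, bytes_per_pixel: int = 4, mip_chain: bool = True) -> float:
    base = side_px * side_px * bytes_per_pixel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_mib(side):.0f} MiB uncompressed")
# Each doubling of the texture side length quadruples memory use, which is why
# "ultra" texture packs can blow past a mid-range card's VRAM on a 1080p screen.
```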
Depends on what settings are available. Geometry and textures and lighting are the most important factors and do make a difference in some cases, I've found textures look better on the highest setting when you go beyond 1080p.
I hope all future games will have a more user-friendly graphics settings UI, which clearly shows you the visual differences that each setting will bring and instantly shows you the impact on framerate. I think it will be easier for us to find a balance without testing over and over again by ourselves.
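Something like this could be fairly simple if each option exposed a rough GPU cost to the menu. A purely hypothetical sketch, with invented per-setting millisecond costs (no real game publishes numbers like these), of how a settings screen could show a predicted frame rate live:

```python
# Hypothetical: per-option GPU cost estimates in milliseconds (numbers invented).
COST_MS = {
    "shadows":           {"low": 0.8, "high": 1.6, "ultra": 3.0},
    "ambient_occlusion": {"off": 0.0, "ssao": 1.2, "vxao": 4.5},
    "anti_aliasing":     {"off": 0.0, "smaa": 0.7, "msaa_4x": 3.5},
}
BASE_FRAME_MS = 8.0  # cost of everything else in the frame, also invented

def predicted_fps(choices: dict) -> float:
    """Sum the estimated cost of the chosen options and convert to fps."""
    frame_ms = BASE_FRAME_MS + sum(COST_MS[k][v] for k, v in choices.items())
    return 1000.0 / frame_ms

print(predicted_fps({"shadows": "ultra", "ambient_occlusion": "vxao", "anti_aliasing": "msaa_4x"}))  # ~52 fps
print(predicted_fps({"shadows": "high", "ambient_occlusion": "ssao", "anti_aliasing": "smaa"}))      # ~87 fps
```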
I NEED Ultra settings. My eyes lust after the beauty of Ultra Settings and I cannot settle for the less than perfection.
Most important to me is the ambient occlusion in most games where there is a lot of elements around you. Sometimes it can make some scenes more beautiful than they would be with realistic linear light. The way it fades the shadows is very pleasant. I always have it maxed out
I suspect Deus Ex is using the same textures for most objects, but increases texture size for details like dirt, newspapers and graffiti, very similar to how The Witcher 3 deals with textures, where medium uses high quality textures for faces but low quality textures for clutter.
A little tip for Rise of the Tomb Raider: just set everything to very high (beyond very high in some settings) and set anti-aliasing to SMAA. Solid 60 all the way on an i5 6400, 1060 6GB and 12GB of RAM. You can even push what's on very high even higher, like PureHair, which is just "on" at very high. It even works in other games. Anti-aliasing hits everything harder, and just setting it to SMAA, not SMAAx4 or more, will increase frames from an average of 45 fps to 60, even reaching 70 in some areas. Tested on AC Origins, ROTTR, Kingdom Come: Deliverance, DXMD and Crysis 3. (Edit: SMAA doesn't look much different from x2, but looks a bit noticeable against SMAAx4.)
For me, very high textures led to a shortage of VRAM on the GTX 1060, which should not be the case on a 6GB GPU. There was definitely a memory leak: I ran MSI Afterburner and saw the VRAM usage slowly climb up even as I entered new areas. This disappeared when I set the textures to High. Maybe they've patched it by now. I should go back and check it out.
I am not even sure i want to know how you run 12Gb of ram. In any case, you are sacrificing speed.
@@ColdieHU Did that before.. but it was on the 1366 socket with the i7 920.. and that was a triple channel platform.. nowadays it's usually either dual or quad.. so yeah, 12GB seems stupid..
What I've discovered through getting sucked into chasing the frame rate and having the highest end motherboard/CPU/GPU is that it's great if you're the type who's into building systems for bragging rights and has the money. Watching all the benchmarking and overclocking videos started taking me down that path. What's hard to find is videos about what makes a great rig for real world playing. With the RX 500 series finally coming back down to a more reasonable price, it's possible to get a 580 with 8 gigs of high bandwidth memory. Take that and put it with a 144Hz 1080p FreeSync monitor. Buying the two together is close to the cost of one mid-to-high end Nvidia GPU. From my experience, I wish I had figured this out sooner; how much money and trouble I would have saved myself.
SLi/Crossfire is not a thing anymore man.
No games support it. Single GPU only.
@@nexxusty Past games do support it. Some with excellent scaling, while others may require an alternative SLI profile. However, over the last 2 years far fewer games have supported it, and it's been dwindling.
Nice, but one important feature was left out of this one: resolution scaling and its visual/framerate impact. What resolution scale should you choose, and is dynamic resolution scaling good or too extreme? Also, which upscaling methods are best and which ones should you stay away from?
Update: Also, is it worthwhile to invest in HDR displays and what impact does HDR have on gaming frame rates?
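On the resolution scaling part, here's a quick sketch of what the slider actually changes, assuming a 3840x2160 output target. Shading cost only roughly tracks the rendered pixel count (post-processing, UI and CPU work don't scale the same way), so treat the percentages as ballpark figures:

```python
# Pixel count at a given resolution scale, and the rough share of native shading work.
def scaled_pixels(width: int, height: int, scale: float) -> int:
    return int(width * scale) * int(height * scale)

native = scaled_pixels(3840, 2160, 1.0)
for scale in (1.0, 0.9, 0.8, 0.75, 0.5):
    px = scaled_pixels(3840, 2160, scale)
    print(f"scale {scale:.2f}: {px / 1e6:.1f} MP (~{px / native:.0%} of native shading work)")
# A 90% scale already cuts the shaded pixels to ~81% of native, which is why it's
# such a popular first lever before touching any other setting.
```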
Very insightful video, Alex. Thank you for this. When it comes to PC Gaming, I usually want the best performance and visual fidelity, and find it a real hard battle to balance those two factors. Sometimes going full Ultra really doesn't do anything, we're talking about a really small, marginal gain in visuals, almost to the point where there's a diminishing return.
Lately I've been on the quest of trying to reach 144 fps in all games at max settings on a 1080p 144hz monitor. So far it's been impossible. I went from a 970, to a 1070 Ti, to a 1080 currently, and I'm thinking of going for a 1080Ti. I simply cannot max out the fps without sacrificing a lot of settings. It's funny when people say things like a 1080 is overkill for 1080p. Maybe for 60 fps it kind of is, but anything over that and you'll be struggling.
Sometimes the expensive settings are worth it. Like VXAO in RotTR. It looks so, so much better than HBAO+ or even SSAO. You notice it in every scene, because the lighting and shadowing seem much more accurate and realistic.
But the massive hit to frame rate means it is only worth it for those with expensive hardware to match. Plus you can kiss DX12 goodbye, since it doesn't support it.
I always like to try Ultra settings on every new game I get just to see how my PC can handle it, but settings usually get turned down if I can't hit a consistent 60 FPS at my monitor's native resolution.
For the Deus Ex example, I would like to point out that instead of checking the floor textures you should check the billboard textures (which look pathetic on very high). One may argue that they don't matter, but in a stealth/exploration playstyle you are forced to see them up close, and they look gross.
Otherwise a very good analysis and really you have proven to be a good addition to the DF team (y)
meanwhile i played newest deus ex at 1440p full ultra with 2x MSAA and i got 80-ish fps AVG :D
gtx 1080 ti FE + 6700k 4.6ghz (i have a vid on my channel)
Yea I'm not sure why he was focusing so much on the ground. Felt like he should be looking at walls or just objects in general, above the ground. I always thought that anisotropic filtering was the setting that had more control over ground textures and at different angles, not texture quality. That and tessellation. Whereas texture quality affected more the walls, doors, buildings, tree trunks, rocks/boulders, general objects, clothing, etc, you know, things that stand up vertically, up + down along the y-axis. I'm probably wrong tho. But that's what I noticed. Maybe it depends on the game too
Ultra settings are typically made for future hardware. It’s one aspect I really like about PC gaming. Returning to an older game with new hardware is really fun.
These details may not matter individually, but collectively they add up and make a major difference!
AAAND he pronounced Wolfenstein properly! Again!
Alex choosing heavy metal music for this video musical background
JoPe - TuYaTroJoueY What song is that?
That's right. The song is "Hero" from the Serious Sam 3 BFE OST.
Not trying to be cocky (this channel is great and you actually learn a lot from his videos), but the fact that this channel is one of the few on YT to have metal music as background (besides metal music channels, obviously) was the main reason for me to subscribe. Yeah, I love metal. Kudos, my fellow metalheads!
@DigitalFoundry
Ultra settings are great and all, but I think it's more than enough if you can get 1080p with locked 60 fps *at least* on PC gaming.
You're not really going to actually notice much of these graphical settings when actually playing the game.
What you will notice is frame rate and uneven frame pacing.
So in short, I think that overall it's more worth keeping settings at Medium or High.
I agree with everything except I think no matter if the resolution is 1080p or 1440p, you should aim for locked 100 fps or so (and obviously invest that little bit for having a 144Hz monitor if you don't have one already). My mind was blown when I moved from 60Hz to 144Hz and I find higher framerate so much more satisfying than ultra settings. I mean I do like graphics quite a lot actually! But still..
@@65EKS65 I feel like it HEAVILY depends on the type of game. If I'm playing a more cinematic game then ultra settings are a lot bigger deal than high fps, however, if I'm playing a shooter then high fps trumps amazing graphics.
@@GaiaGoddessOfTheEarth Yeah I get your idea but I just don't know many "cinematic" games you mentioned. At least for me pretty much every game I play or have played have just been feeling better when it has high fps. Tho I don't have 4k monitor so I haven't invested that much into the image quality, for me 1440p is already good enough and I can play pretty much any game maxed too nowadays. I just love the smoothness of +100fps compared to ~60fps but I guess it can depend on the person too.
In summary, you'd likely gain 15-20% each (compounded) from reducing shadow draw distances (when it doesn't matter), texture resolutions (in cases where you're hitting the VRAM limit), some post processing (motion blur, depth of field) and 'extra detail'. Excellent stuff, love the way you present this too!
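As a quick worked example of that compounding, treating the 15-20% figures above as rough estimates rather than measurements:

```python
# Individual tweaks that each buy 15-20% more frames multiply rather than add.
gains = [0.15, 0.20, 0.15, 0.20]  # shadows, textures, post-processing, extra detail
total = 1.0
for g in gains:
    total *= 1.0 + g
print(f"Combined speed-up: ~{(total - 1.0):.0%}")   # ~90%
print(f"e.g. 45 fps -> ~{45 * total:.0f} fps")       # ~86 fps
```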
This has been one of your best videos yet, since it caters to the majority of people with mid range cards. You guys should really consider starting a series where you pick a mid range card, e.g. a 1050 Ti/960 or somewhere along those lines, and try to optimize AAA games on it for 1080p gameplay with the best graphics-to-performance trade-off.
Great video, well researched, extensive and always accompanied by relevant footage. Good work.
Yup, showing everything and a graph after that is just a bliss.
I cannot thank you enough, Alex. I was psychologically tormented by the new releases coming up, fearing my hardware couldn't meet their requirements. I have a Strix 980 and an i7 6700K with a 1080p 60Hz display, and at the moment, due to the rise in GPU prices and struggling a bit financially, I'm unable to afford a new GPU for 1080p gaming. I was so shellshocked by new titles being announced every day and the thought that my GPU might not keep up with them. But fortunately, after watching your video I tweaked some graphics settings and now get better performance at no cost, with better energy efficiency too. I'll be truthful: I used to be so obsessed with MAXING out every game. Later I learned my lesson that maxing out is not only unnecessary most of the time, but also taxing on my hardware.
Gopal Chatterjee the 980 is still a damn fine card. It's pretty much a 1060 and will still play at 1080p 60fps no prob for years to come. Sure maybe not at ultra, but like you realized, lowering a few settings here and there to high, it'll be capable of 60 for a good while, at that res. Even lowering to medium after a year or so after that will extend its life even further. Med settings aren't too bad these days. Definitely still very playable and the differences aren't that drastic, night and day, like in the old days. It's a lot more subtle. Sure still noticeable but nothing huge. You'll be fine. I got a 1060 so I'm in the same boat. If I want higher fps or wanna up the res in the future, then yea, might wanna shell out the cash. But I'm satisfied for now. Oh and don't forget, there's also the used market if you wanna upgrade in the future for cheaper. Although I would be wary of it esp for this gen stuff, as a lot of them would have been used to mine. But next gen amd navi might be a lil comforting. It's rumored to be 1080 level for a midrange price of $250 next year, so maybe that'll be something to look forward to, given your budget. Same price as 1060 msrp for 65-75% more performance. Sounds pretty good to me. Idk about nvidia but I'm sure they'll also have something similar, for their midrange. As of now tho their next gen stuff disappointingly looks to be quite exp and not much of an upgrade, same price for the same performance e.g. 1170 = 1080 = $500, 1180 = 1080ti = $600 etc. We'll have to wait and see.
soopasoljah Appreciate your response man, it makes me feel a lot better. Thanks a million. :) And you're right, this hunger for ultra settings is absolutely ridiculous unless you're on a 1440p or 4K monitor where it matters the most. I get a pretty good visual experience with high to medium settings and it's a lot less taxing on my GPU, so it's a win-win. You won't believe how obsessed I used to be.. I didn't even think twice.. even the MSAA settings... 8x... I know... I know.. Then I'd start to hate my PC, then I'd delete the game.. then another game would release.. same vicious circle lol.
Psychologically tormented? God talk about first world problems...
Also just remember. Maxwell (aka GTX 900 series) was a silly good overclocker. 980s reference from Nvidia run at about 1100mhz, but I have seen them get up to the 1500-1550 range with good cooling and some clever tweaking. Useful to know if you need to elongate the life of your card for just a bit longer!
It's all in your head. These days ultra isn't worth it. Crysis was the last game that Ultra really made a difference.
always nice to hear Wolfenstein pronounced correctly
Alex Battaglia coming in hot with that pronunciation of Wolfenstein
I watched through the whole video and he actually has some very valid points. When you're actually playing a game you won't be able to tell the difference between minor tweaks or between High/Ultra.
High settings. Gotta get that minimum of 60fps.
Insane that 60fps is detectable to me and 30fps then looks choppy.
C_ Rom that's what I'm saying fam!
Hi Alex. Great content. This time also really well spoken, great rhythm and narrated with excitement. Keep it up!!
retropolis gaming Alex is German, right?
yep
Ambient Occlusion is pretty much my worst enemy here. Most games run under 20fps at 720p with it enabled. If there's no option to disable it, I can still decrease the resolution to 480p. Some games let you do that in the ini files if it isn't possible in the game options. My target frame rate is always 20-30fps.
That usually works for most games, but some (Yooka-Laylee in my case) won't let me do that. There is a mod that disables bloom, DOF and AA via F-keys, but so far there's no option to turn off AO. At least it runs just fine at high settings and 480p. Kinda feels like a Rare game for the original Xbox.
@@IkesDaddelbox Out of curiosity what GPU are you using? I'm amazed that modern consoles would beat your PC in games.
I feel ya, pal. The last thing I could play on my PC was BF3. I was playing at 576p@15fps.
This is the most accurate and informative video I've ever seen on YouTube; thanks for all the advice and examples you showed, dude!!
One of the best comparisons and tutorials on graphics settings I have ever seen
nice job man !
I don't care much for ultra settings, but for my first rig I would want ultra settings to be playable at high fps so I don't have to worry about my PC not lasting a good while. Basically, I would rather spend more money on something that can last me a long time than buy budget parts that might drop in value.
Having a PC that can run ultra settings is one thing, and enjoying the game is a whole other thing. Sometimes we are so distracted by these settings that we forget why we are playing in the first place.
I play for sweet visuals. I barely used fast travel in Odyssey or The Witcher 3; I always like to travel on horseback and enjoy the view, which I can't do with low/medium graphics.
"Lara still looks great too" ;)
We need another special setting for her.
The only instance where I find myself playing on mostly Ultra settings is DOOM 2016. That game was VERY well optimized for all types of gaming rigs, whether budget build or beefy. I set everything to Ultra with shadows and particles set to High, and motion blur set to Low. Other games I usually set texture quality to Ultra with everything else set to High or Medium and, in some very rare cases, Low.
Every game. Select Ultra, lower shadows to medium, turn ambient occlusion off. Reflections on medium. Framerate limited to 3 frames less your monitor. Vsync off in game. Vsync on in Nvidia Panel. G-sync on. YOU'RE WELCOME.
I do tend to just push everything up to ultra/highest, as long as I can maintain 4k/60fps I'm usually happy. If not, the first thing I turn down is resolution and not settings.
Cmdr Flint
Yup. 4K isn't worth it. 1440p isn't worth it especially on my laptop (which is equipped with a 1920x1080 monitor).
I max the settings out, and thanks to the i7 7700HQ, 16GB of DDR4 at 2.4GHz, a GTX 1070 and an SSD, I'm still comfortably running nearly all current-gen games above, or at worst very close to, a locked 60 FPS at 1080p. Why would you turn the visual quality down but increase the resolution? Seems just stupid to me.
Still rendering even at 512x512 using Cycles (Blender) can be painfully slow... 😭
Cmdr Flint Yeah man, 4K is not worth it if you can't have ultra settings. Who wants to see blurry textures in full res 4K anyway? At least in 4K you don't need anti-aliasing.
Same here. Recent games have in-game resolution scaling, so I set everything to ultra and have 4k output rendered at 90%.
SpicyHotTech Wait, really? I didn't notice aliasing much on a 27 inch 4K display. Maybe on a bigger screen or TV it's noticeable?
I like that answer. I too value better settings over resolution. (4K at low settings is still awful, I don't care what anyone says.) Just play in windowed mode one resolution down for the best results imo.
Sometimes having settings on low can be a great advantage rather than setting them to high.
Take PUBG for example.
PS:
Players at a distance can be spotted more easily.
For me, I guess.
Yep :3 There are multiplayer games where low/minimal settings mean fewer objects on the screen, so our opponents can't hide behind thick tall grass, for example, because the grass becomes thin or even disappears at the lowest settings XD
You know what? It's all about that "oh cool, my PC is strong enough to make this game obey" feeling.
Speaking as someone who went all out with a PC rig about a year ago on a 65" Samsung MU8000, there were times when I cranked up the resolution and the settings to try and get the best picture possible, but it didn't matter because I had already hit the point of diminishing returns.
I found that at my viewing distance, 4K offered nothing over 1080 and that I couldn't tell between Ultra and High for some options. I've decided that performance matters more than graphical perfection, and as long as the game looks decent I'll have a blast.
I usually go for manual settings. In most games, setting shadows to low or the lowest setting provides a trade-off that allows much better quality in several other settings.
Do we NEED ultra settings? Of course not. Do I still spend 4k between my PC and monitor to get ultra settings? Of course.
Same! I got to play on ultra after I moved from my failed state to Europe. No regrets.
Loving the DF focused content PC. Keep it up!
Meme nice. Up keep it!
Great video Alex. I run an i7 4790K and a GTX 970 and must admit I rarely go into the graphics options, as I don't really understand what most of them do. I kinda find the whole massive range of options a bit overwhelming. Thanks to your video I feel able to explore these settings a bit more :).
It's called Google
I currently have a pretty good rig (asus ROG VR72gs laptop) but I don't even play recent games on "ultra", for 2 reasons: fluidity and gameplay > beauty, and... I don't like making the hardware heat up for such little bonus detail! The goal is not to push the rig to its limits, but to be able to push it while using it efficiently, without actually having to push it! Just like you want a LOT of audio power to get good sound while under-using it. It's the key to stability; that's just my opinion, however.
Very good video, hard to actually find anything that explains settings this well
The most important settings for a good looking game experience are:
1 Antialiasing
2 Texture
3 Reflection
Nah.
1) Textures
2) Lighting
3) Shadows
4) FXAA/MLAA/TAA is all anyone needs
5) Reflections
6) LOD
DF: Do you need ultra settings?
Me: Laughs in Crysis.
All of my games ever: high preset, mediocre shadows, vsync on.
Except for MGS the phantom pain. That game is optimized as heck
Generally, if you don't want to tweak things or you are afraid of all those settings, just set it to the high preset. The reality is, things have changed a lot since the days of the original Crysis, when you were actually missing out if you did not crank it to ultra. Today, developers are more aware that people with midrange or even low end machines by far outnumber top end users, and they want to cater to that. So a lot of games are even designed to look good at medium. Sure, high is still a visual step upwards, but more often than not it isn't necessary to enjoy a game. And ultra is often a really small step up for a huge performance penalty. Which is why I always agree with the statement "high is for playing and ultra is for screenshots". Because yes, on a screenshot it is more noticeable, since there is no action going on and you have time to focus on the details. But those are details you likely won't notice in the heat of the action.
Also, if you have a GeForce card, trying out the GeForce Experience settings can in some cases be really worth it. I've had plenty of cases where GeForce Experience would set options across a wide range from high to low, and it would create a visually great experience while still being fairly low in demand. I'm not saying it's always on point, but from what I saw, it didn't do all that bad of a job. So in at least some cases it might be worth a shot if you don't want to experiment yourself.
But as shown, of course best way is to experiment a bit and find best mix of quality and performance.
One of the most useful videos on this whole channel, and that's saying a lot! Thanks for this video.
For me Medium or High settings.
Arturo Rosales
Have you considered buying a gaming console?
VariantAEC Maybe he wants 60fps.
Raios Rogue Consoles usually run games at medium-high at 30fps. If he wanted to run on medium-high on a PC, he'd also have the option to play at 60fps, an option you don't typically get on consoles.
At least the visual sacrifice will be worth it for him as he is pursuing 60 FPS, I assume.
Arturo Rosales Yup, custom hovering around medium-high with 60 fps target (or slightly higher with 30 fps target, depending on the game of course).
I grew up playing at the lowest settings and low resolutions, getting 15-20 FPS, in some cases below 10 FPS. I got a GTX 1070 when it came out. You can bet your ass I want Ultra settings.
By the day I can afford a GTX 1070, it won't be able to play ultra anymore. 😂 Having 20fps fun with a 750 Ti right now.
*and here I am perfectly happy with a GTX770* haha Witcher 3 (and other games) still look and run amazing to me and I'm cool with silky smooth 30FPS.
I used to chase top-end hardware and max settings, but it's just a money sink and nothing is ever quite good enough - it never ends - it's kinda weird and a little sinister... but yeah.
It is when the frame pacing is good, which is what is implied by my comment.
Yeh I know its old in the tech world, and I didn't say it was perfect or could do anything, I was just stating a little experience I tend to have when watching DF videos.
I also have a 770 and share your thoughts, especially with today's hardware prices. I bought an Xbox One X and haven't regretted it, much lower price, longer relevancy/lifespan and close enough graphics when compared to a good gaming PC.
Satyasya Satyasya That doesn't sound right; I have a GTX 960 (about as powerful as a 770) and The Witcher 3 easily runs @60fps with a mixture of high and ultra settings on my rig.
I have a 2GB 770, but instead of 1080p at 30fps, I lowered the resolution to 900p in order to get a constant 60fps with medium/high settings and no HairWorks. Hopefully, now that 1070 Ti prices have come back down closer to MSRP, I'll get one soon.
Since researching for a new GPU I'm kinda addicted to watching these types of comparison videos 😅
A player really has to prioritize which raw graphical and post-processing effects they want to maximize or optimize. For example, I never compromise on ultra textures or extra lighting (for textures: I'll compromise one setting lower when my GTX 980 cannot handle anything above 4GB VRAM).
Also, not getting 60, 90, 120 or 140fps is not a big deal. Be prepared for settling at "mid" points like 48fps, 75fps, etc. It works just as well, or near enough to make no matter. The biggest factor in performance is stability: I don't want my game exhibiting the old term of "choppiness", where framerates hike and dip all over the place inconsistently. A locked framerate -- even one limited by the game settings -- is much more preferable. I can play many pre-2016 games at >60fps, but I choose to lock it at 60 when the games allow me to, to spare the GPU and keep it in sync with my monitor refresh rate.
Also keep in mind power consumption. I'd rather tweak and optimize (maximize) asset settings like effects than pump up resolution. I prefer marathon gaming sessions on my PC, so I can rack up relatively huge invoices from my hydro provider when I game at 1440p to 4K. By lowering resolution, you can afford to do this while greatly freeing up resources to pump out performance. (Resolution apparently eats up more of your GPU cores' attention than virtually anything else.)
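On the locked-framerate point above, here is a minimal sketch of the idea behind a frame cap: a naive limiter that just sleeps out whatever is left of each frame's time budget. Real drivers and engines pace frames far more carefully than this, but it shows why a cap smooths out the hikes and dips instead of letting the GPU race ahead:

```python
# Naive frame limiter sketch: render, then sleep out the rest of the frame budget.
import time

def run_capped(render_frame, target_fps: float = 60.0, frames: int = 300) -> None:
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                       # however long the frame actually takes
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)     # burn the leftover budget instead of racing ahead

if __name__ == "__main__":
    # Fake 5 ms frames capped to 60 fps for about one second.
    run_capped(lambda: time.sleep(0.005), target_fps=60.0, frames=60)
```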