Me, at home, after exhausting day at work: 1. Start FF XV or Witcher 3 with ultra settings 2. Watch the scenery for a couple of minutes 3. Exit the game 4. Go to sleep
Lmao ikr. Usd 1/4 of our currency and average income is half of that in the us... I built "budget" pc here and all my friend call my pc godly. The only reason I hate living here.
@@kamrankambang7953 To be fair, many people in many countries (except many people in wealthy countries which have nice PC market) don't earn that much so yep, even if its midrange gaming PC, it will be seen as "godly" PC based on its price.
I prefer doing custom settings, in most games just lowering a few settings and leaving the rest maxed can give you a big fps boost. Shadows are usually one of the settings that kill fps.
Just setting shadows on medium will give you amazing results (usually low is pretty shitty and distracting). Thats why one of the first things ultra low budget PCs do to run games is disable shadows.
Shadows, AA, AO and volumetric lighting are the usual offenders. In the case of Odissey, volumetric clouds is sometimes all you have to decrease to gain double digit fps boosts.
When I still had my 580, using medium settings and bumping textures to ultra was my go-to move for most games. That along with a Freesync monitor is a pretty damn good gaming experience.
I neither stream nor benchmark, I just like the best possible experience :/ Granted it's not $1500 cards, more like $800 ones lol. Rift likes the extra power as well, only set the thing up a couple times a month but still nice to have.
@@brkbtjunkie plus it's split in 1/4 each so it's even harder to tell any difference... he should have done the split screen only for fps and full screen on top of it to really show the differences I don't have any of those games but the difference between all medium to all ultra is pretty significant in most games I have, very high to ultra often really isn't a big deal but in most games the draw distance alone makes a huge difference the most impactful however is resolution... any game I tested looked better on medium settings 4k than on ultra settings 1080p so I'd recommend aiming for the highest resolution your hardware can handle
Patrick Jerome Obaldo its depends , i cant play without AA , i just cant tolerate rough edges, but these above settings can be turned down and totally unnecessary
Sadly, things that make the lighting more realistic is one of the highest priorities. I'd take that over texture resolution or anti-aliasing, or anisotropic filtering, for example. Personally, when my GPU isn't powerful enough, I turn everything down to low/off, except ambient occlusion, and I always keep native resolution. Those are the only thing that matter more than higher framerate, to me. (I'm generalizing, though.) Ambient occlusion and realistic lighting gives me a better view of the geometry of the game world, and distances and such. Native resolution gives me the sharpness I need. Apart from that, framerate is the highest priority, so I always aim for 100fps.
The lighting in the AC run was different for each run so it was hard to compare. It was cloudy in high and very high, therefor everything appeared more washed out vs the shiny look from the sun on low and ultra. That being said I totally agree with the general consensus that ultra vs high is generally negligible and not justified for the performance hit.
It is not helpful when game companies do not make their benchmarks exactly the same run to run... notice people standing in different places? I could run the benchmark 20 times and it varies a few FPS each time. Having said that, while the min and max numbers can vary by a lot, the average tends to be pretty similar.
I noticed the lighting and shadows were different on ultra, but they didn't necessarily look better, just different. Certainly not worth the fps difference.
If you're on PC in the first place then you're mostly likely interested in gaming with better visuals. However, those who are fascinated by high end graphics are in for a ride. Its important to know that you should never feel jealous about how good someone else's game look's or runs. Instead make sure you know how to tweak and optimize your games for your current rig and setup. As time then goes by you will upgrade parts and you can perhaps increase the eye candy in your games. With each upgrade, each game and with the more you learn, you will appreciate and find just how incredible high end PC gaming is in terms of graphical fidelity. You will definitely be hooked once you see the difference between normal and high end, I'd say it changed my whole perspective on regular "gaming". I worked my way up pretty slowly over the years, learning, appreciating and experimenting every step through the journey. I now run my games in 4K with a pretty beefy rig. I can say that the difference is truly night and day and was well worth the trouble.
Visual fidelity has not really been a major thing for a while now, the current gen pro consoles have surprisingly decent graphics chips. But then again i'm not interested in running my games at 30 or 60 fps in 2019 and i'm definitely not going to play shooters on a controller. Sometimes i buy a console just for one or two exclusives (like smash and such) but otherwise the typical console exclusives (that generally weigh on the casual side) don't interest me and multiplatform games run better on PC anyway.
I hate generalisations like that. AC Odyssey is a terribly optimised game, yes, but it's one or two settings that influence the drop the most, I can't remember which ones though. So rather than going by presets, you can just fine tune what you want higher and what you want lower. For instance, in many games shading is quite taxing, but doesn't impact visuals much, but textures might not impact performance, but do affect visuals. If you just use "high" or "ultra", you're going to loose a lot, for not that big of a gain. Just spend some time, and play with the settings, it usually takes 5-10 mins at most.
1000% this. A lot of games are like that. I usually max everything and if my fps is too low I look for the most often 1-3 settings that are tanking it and lower them a bit. Usually it is something stupid like motion blur or max AA or shadow details that you dont' really need maxed to still look really good. Turn down AA and shadows a notch or two and you wont' notice much if any difference but could get like 10+fps back. This video is good for the lazy or disinterested who just use presets but if you're willing to tweak a bit you can have your cake and eat it too.
Tinkering around with the settings is the first thing I do when I launch a new game! By the way, in Odyssey, it's the volumetric clouds setting that eats a ton of FPS, switching from Ultra to High gave me a 10 FPS boost on my RX 590.
This is the true answer. Some settings have almost no performance hits, but others take a huge chunk. Namely shadows and shaders. But for the average person, this video will still help them to make the most of their gpu with ease.
In a way playing on low-end hardware is more fun because you focus more on the game than the visuals, and it makes you appreciate it more once you do upgrade.
PC gamers should do themselves a favor and stop selecting quality levels by tier. Everyone PC gaming should put the effort into knowing what each setting does, and customize the selection. An 8GB RX580, should have a mix of medium to highest, depending on the setting. An example is the texture settings. For right now, almost all games should be set to the maximum texture quality on an 8GB card (or second highest in the rare title). Some settings have a much larger impact than others, and some have a large visual impact, but not necessarily have a large performance impact. Sometimes there are post processing settings that you might prefer on low or off. Yes you can select a tier and play, but you are already PC gaming, and taking the time to install the game. Go the last 5% and set it up properly. Even on my 1080ti and 1080 systems you'll find a mixture of settings, because I don't like the look of all of the highest settings, some are garbage, and sometimes the resolution bump is the best setting of all.
If only games has some standard meaning when naming them "low", "medium", etc... but they dont.. in some cases the high and ultra settings seem to be mostly placebo, while in others its a true doubling of texture resolution
Completely disagree. Going through each and every individual setting is the most massive headache about PC gaming, and if you're switching between laptop and desktop frequently (me), your cloud saves also save your settings in about half the games you play. So this is a DAILY issue, and I f***ing hate it. And I have the knowledge to understand what's what for the most part. I still want the SHORTEST path from start screen to playing, I don't want to have to think about it, but indeed, the reality of PC gaming is that frame stutters happen unless you tweak settings. What I will happily advocate for is more developers including options for different levels of simplification, particularly a middle ground with sliders for, e.g. "textures" and "effects." A Hat in Time, as an example, has options that separate "Environment surfaces," and "model detail," as well as "effects" and "shadows." This is fine. And failing that, tiered, generalized selections are also fine. Doom (2016) as another example, also has the tiered settings, but with only a few others that you set separately. V-sync and anti-aliasing, film grain, motion blur, kind of the fluffy extras depending how much of a "cinematic" experience you want, and I actually like this setup quite a lot. The only time I'm ok with going in by individual selections is competitive shooters/eSports games where having the most smooth and consistent framerate is in fact quite critical.
@@ryans3199 I can see where you're coming from with the cloud saves. I don't use them personally, so I didn't really factor that in. We have a bunch of gaming systems on our LAN, but we tend to keep each user at one system. I have had people switch systems, but I write scripts for the saves to backup and switch between the systems. If we're talking about missing features, it wouldn't be all that hard to have a selection for cloud saves to include the in game settings with the save games, or not. I realize occasionally a game will have those settings in the actual save, but that's what selective cloud disabling is good for. Here's another one I think I'm not the only PC gamer that's tired of seeing. PC games used to allow several save slots, or at least unlimited saves. I've seen a game go from absolutely open to multiple users, to the next in the series only allowing three slots, and one save per slot, and the next only having one slot one save. I'm sick of these games that don't even allow multiple saves for one user. Some of us would rather have functionality, than just relying on a game to save for us, but we lose everything else. This isn't a cheaping out issue. I realize they want you to purchase three copies, if three people want to play, which makes me sick. Many of the games we have, we own four copies for coop LAN play. I just can't stand the loss of things that made PC gaming great. If you don't want all of the extra features to make the game run really well, that's fine, and I think they should leave the quick tiers in, but every year there are less settings available to tweak, things get more dumbed down, and half of the time we have to hope they did a decent job porting it. At least most of the games I can still outside of the game get into the configs and fix their mess. Many games have a couple of small mistakes in their config, that never gets addressed, but murders performance. I would call foul play, if I didn't know that it normally comes down to time restraints, or no funding to do a proper job for the PC version. Anyway, originally my point wasn't to force everyone to take time to tweak. My point was that once you know what all of this stuff does, it literally takes 5 minutes with good OSD software, to test and set settings, to get a much better 10-100 hour experience (depending on the game of course). If this was a 5% thing, it would be a waste of time, but many many times I can achieve the same or much better performance with similar visuals to a card that's a tier higher, by monitoring performance demands per setting, and judging the visual trade off each time.
I have a 1080 and 1080ti system, but I upgrade some of the LAN PC's by hand me down upgrading, so I have a system with a GTX 970 that's heavily modded, but it's still far behind the newer systems. It helps to do the stuff I'm talking about in those scenarios. The slowest is an R9 280 OC, and it still runs DOOM 2016 1080p locked 60. Before tweaks it couldn't manage 900p locked 60. I should mention that that system has an equally weak CPU. Fair enough on making it easier, but it needs to not sacrifice quality. Calling levels of AA low, medium, high, because you expect the user to be too dumb or lazy to understand types of AA is awful. I agree that some of those settings should be separate from the slider, or combined into a second slider. That's my biggest problem with single selection tiers.
Wow! I love it when someone proves how much we have been brainwashed, it's up to us to accept that, I have! Thank You Tech Deals, your honesty is is so refreshing!
I'm from argentina and I remember I saw the thumbnail of your video just before I refresh the RUclips's start page. And honestly tought I'll never see this video again on my feed...until now...thanks for remember us the real spirit of gaming
So the title isn't clickbait, the information is clearly and concisely presented, and the point you set out to prove is well made. It's like I'm on the wrong internet.
@@dreamshooter90 as long as the games are not the new top AAA games id bet he gets good 60fps on most games. i pick up all my games on sale (so more than 2 years old) and they tend to run perfect on my rx 580 at 60fps with lots of GPU power left over so the GTX 1060 will be about the same. plus his display is between 1920x1080p and 2560x1440p so it's biger but not too much harder than 1080p. sounds like a nice size display :D
I remember when you first started. I think I subbed then, thinking at least this guy goes through the trouble to explain why, verses just throwing up a bunch of graphs. Still think your channel is the dogs danglies and still subbed. Good to see its grown well. Good luck.
I bought the SSD 2 months ago, great for the money. I appreciate the point you are trying to make. Constant 95 degrees is a little meh. People should look what games they want to play and what moniitors they want to drive. Good and consistent frametimes is extremely important for your gaming experience, maybe even more important than the difference between high and very high level of detail. BTW: If you ever played without motion blur, tearing and low latency (Input lag) on a 144 Hz panel with a consistent high framerate, you NEVER want to go back. :-)
This makes so much sense. Thank you! I guess it all depends on how much you can crank out from your rig. For adventure games like AC Odyssey 30 FPS is OK but for First-Person-Shooter games like Metro Exodus 45 FPS is what should be the minimum for a better playing experience.
You are my hero! When you have time and no great deals to show, please do a video about the *most expensive and useless* game graphics effects, as SSAO , FSAA 4x, etc.
Love this Video! Here in Germany I confronted some youtubers whith this content: Nobody wanted to talk about this theme. Manufacturers wants to sell expensive equipment and of course only newest (and most expensive) components can fullfill our (theirs) highend dreams...
Depends on the game, World of Warships and Elite Dangerous (my main games) are perfectly playable on my PC at ultra detail. Some of the enjoyment especially in Elite is looking out at a gorgeous view of a galaxy/planet from somewhere that Noone has ever gone before!
b550m, 5800x, 3060ti, 32gb 3200 ram. I play on a 1080x60 monitor. I play on ultra settings with RT off. I never drop below 60 fps in any game I play. Hardest game for my computer is Hogwarts Legacy. No stutters, smooth as butter, 60 fps the whole time. If I dropped under, or had stutters, I'd go to high settings. But 1080 for the win :)
Sir... U just said exactly the same words which i say to my friends who foolishly argue on playing everything at maximum setting available and really suffers huge fps drop for a marginal increase in visual quality which no one ever notices when u r really in middle of the action..... I am so happy u made this video, u know.. They will listen to you... Not me... I am jealous :)))
the thing is, most competitive or esports shooters / players does not even go as high as ultra. We have mostly set custom settings or low in order to get the best fps available, even when we have a beastly rig. We need to be able to spot enemy in pixels (CSGO), we set foliage or textures to low so that we can spot enemy hiding in the dam grass (PUBG), we set Ambient Occlusion off so that we can see enemy hiding in the dark corner in a room. (R6S). Even thou most of us have beastly rig, we still game at lower details with massive FPS.
I have an i7-8700 and still my GTX 960 4GB Card lol, it’s the only “old” part of my PC, but at least I can still play most games at like mixed graphic settings.
Why you are doing that to yourself? :-) GTX 1660 is not that expensive but you will have much smoother gaming. gpu.userbenchmark.com/Compare/Nvidia-GTX-960-vs-Nvidia-GTX-1660/3165vs4038 and: ruclips.net/video/4jGIRoDF-bI/видео.html
I can't believe some people think they need to play on ultra in order to enjoy the game. Whole my life I've been optimizing my budget and graphic details trying to get 75+ fps and it feels great every time. It's a game, not a movie. Smoothness over realistic detailed scenery any time! No matter if it's competitive or story driven game.
THANK you, Tech Deals! There was another other video about this posted a while back pointing out that developers really don't balance their games for Ultra setting performance. That and that Ultra visuals are marginally better than High at best, and not worth the wild performance cost.
10:45 -- "low detail is absolute garbage ... a whole other game" 11:22 -- "except for the grass, these are amazingly similar" Even so, a very useful video.
If you are playing older games, it's fine to play at ultra, also getting a gpu a bit over kill for your resolution at the moment (e.g. 1660ti, 2060 at 1080p). As long as it stays above 60 I'm fine with ultra, but if it drops I have no problem dropping to high settings instead.
You don't need to drop an entire preset on AC:O. Volumetric clouds is a ridiculously demanding setting. Drop it to medium and you're golden. And yeah, I like ultra for playing, I'm a snob.
I love to play RPG's, so the details are something I look for... let's take the Witcher 3 example, looking at the scenery is awesome :) I play some games at High and others at Ultra, difference is not noticeable at all between these 2 presets.
I agree with you mostly. I think it is only sometimes worth it cranking that shit up to max. But I have a major critique point. Instead of showing the quartered window you should've shown us the games in full screen. We were essentially looking at a 1080p image downsampled to 480p. Therefore we couldn't tell a difference in texture quality, tessalation and small shadows and stuff. The only difference I could clearly always see was the reflections. Like I said, great point but poor evidence.
Biggest difference is in the draw distance and pop-in of objects. I always crank draw distance at the expense of detail because you really do notice it.
Love your videos, always adressing the important topics and the real deal for (let's be honest) the majority of the gamers out here who don't game on an RTX or i9-9900k.
I have that very same Cooler Master case. It surprises me how under-appreciated it is, considering its airflow, modularity and obscenely low price. The only true complain I have with it, is its acrilyc side panel, but at its price-point, its totally understandable. If you are on a budget and looking for a good case, don't hessitate to buy it.
@@gu3z185 Just to clarify: You are saying "It's a better graphics card than a shit graphics card that's also 2+ years old". Edit: The 1060 is 3 years old in July.
It's insane how much performance you can squeeze out of the games just pulling some setting down to high. Sometimes even one setting. I would never use presets myself.
What kind of glue do you use for your dentures? I keep spitting mine out during intense gaming sessions and getting saliva and denture glue all over my mic. But yours seem to stay in ok... What do you use?
Jesus this channel is always 100% real. Just seeing the thumbnail was enough for me, instant like! But one extra fact: Even for content creators you don't really need the best hardware. This is something I only learned once I got really good hardware. You get diminishing returns in video quality for every gaming quality improvement. One of the biggest reasons is RUclips's codec. Even after tricking RUclips into using their better codec still doesn't give you 1:1 detail in your videos. I noticed that if you upload a 1080P video in 4K, only then you get some of that quality you see while playing. But you can't watch RUclips videos at 4K on mobile which a lot of people use. So... TL;DR If you are a content creators thinking ohh, ok, I NEED a good pc, because it will show in videos. Yes, but not as much as you think at all. A budget build is all you will need.
As someone who plays at 4K on a 1080Ti, I can't agree enough. That high to ultra jump makes a massive difference in a lot of titles. I will say though, it also depends on if I'm playing single player or multiplayer.
The worst part is that the same computer I used to play Lol at max details in 2011 barely hits 60fps in some teams fights nowadays. At least the last time I played it, two years ago
Everyone is going to find their own sweet spot for gameplay settings. I got used to playing Rocket League at moderately low settings so I could keep my frames up around 120. Then I discovered I could create a custom fan curve in Afterburner for my RX 560 and now I play with medium settings at 144 fps steady. And whereas the card was running above 75ºC before I can now keep it right around 60ºC with just a bit of extra fan noise--most of which doesn't matter since I frequently wear earbuds to play anyway.
Stuart Dow - ASUS PG279Q 👍 it’s a couple years old now, but still better than about anything under $1000. 4K is too hard for AAA high refresh rate, even with a 2080ti.
In the Assassins Creed Odyssey engine, its a unique case, the driving factor in performance between High, Very High, and Ultra, is volumetric clouds. From medium on up, that setting offers NO measurable graphics improvement, with severe performance hits the higher you go. So you can run the whole game on ultra, with clouds on Medium, and it'll run great.
I have a system with an 1660ti and one with an RTX 2070. The 1660ti is a fine card for the price. But even ignoring the RTX features completely the 2060 is a better card when it comes to raw performance. If you want to go for the 1660ti go for a cheaper model. I have the MSI 1660ti gaming x which is only about 30 € less than an entry 2060. I can say that the 30 € in savings is not worth it in that case.
I think a very important reason to lower detail is missed. In some games the bad guy can be hidden behind detail and take you longer to find, so turning detail down let's you find/kill them first. Some games benefit from shadows and some don't. If shadows is on I can determine if someone is coming around the corner in some FPS games, so I camp and wait instead of getting camped. In others shadows hide things that take you more time to discover important things. I recommend trying settings for each game you play and use the one that benefits YOU and your hardware, there's just no other way to do it IMO. And for Space Invaders sake, lose that psychological influence of Ultra vs other settings. I have found no game that Ultra setting is best as far as competing goes, hardware performance aside of course.
@@MrYedige Eh Nothing special, DDR4 3000 15-15-15-35. Needed good performance, but when I purchased the RAM, the prices were still pretty high. That being said, I've had no problem driving 240 FPS is most of my competitive games, with at least 144 in the more demanding ones. I don't think I'll be getting a Turing anytime soon, but I think that would be necessary to ensure 240FPS in the most demanding games. It's all butter in any event, and I have no complaints. What are you running?
This video is in a way for those whom dont like fiddling with their graphical settings to find out which option affects fps and how much visual difference it outputs. Note that not all games are only specifically gpu bound, but are also cpu dependent. Regarding ac odyssey, hardware unboxed has a detailed video on what settings are optimal to use without sacrificing any of the visual fidelity.
I think it depends on your monitor. If you have a 144Hz panel you will notice the drop in frames on mid range or lower hardware when you go up to ultra
Quite an important point to always keep in mind when you're looking to go buy/upgrade to a mid(budget)/high tier GFX card is, what games do you play? Buying a budget card can be good to a certain point, specifically when your games are solo offline campaign modes, the big jump/difference with performance comes in for all the MMORPG's. You load in and there's massive battles (PvP) happening and all of a sudden your mid tier budget card is hitting overload due to 30-150+ other players casting/spells/combat animations all at the same time, then you notice the big drop and low to mid tier cards will definitely need to drop down on graphics (possibly lowest setting) to handle that and save you fighting at 10 frames a second due to the card being maxed out completely. Just something to think about if you play any of those MMORPG's with high population on the servers.
I have an i5-4440, 8GB of DDR3-12800, and a GTX660. Just today, I played AC: Unity on the lowest settings, at 768p, on my 1080p monitor. Ran just fine, but my VRAM (2GB) was filled to the brim. I plan to keep this rig for another few years. Maybe swap the GPU if the day comes when I can no longer launch a game at 720p. And I'm planning to build a LAN rig with a Q6600 taking center stage (or Q8xxx or Q9xxx, depending on how cheap I can find them). I want to bolt on an old 15" 720p laptop screen to make it a true portable LAN machine. I think I might even be able to run Fallout:NV on medium! :) You work with what you have in this business. ;) Computer building and gaming can be a great hobby, even if you're working with 10 year old components (and building $100 computers)!
Difference between very high and ultra is definitely noticeable in the lighting, especially on the water. But it's not a big deal, I wouldn't honestly care or even notice if I wasn't researching GPUs in the first place. Oh, by the way, your videos have been really helpful in making my choice of GPU. Still weighing and balancing my options but you've put a lot into perspective.
Just got around to watching this and it is so true. On another note, this is why I don't use Geforce Experience graphics optimizer. It always seems to put too much emphasis on visuals so when you go from optimized settings to very high or high, you get drastically better performance in most games.
Yeah for some reason he thinks that people buy 2080ti's for 1080P 60hz screens lol. Plus an RX580..like that's not something to relate to lol. That card was 'middle of the road' when it came out in 2017 lol... it's almost halfway through 2019.
60fps doesn't cut it, simply. Personally, I can't live with anything below about 80fps minimum. I only have a 100hz (34440x1440) monitor, sadly, so I make sure to keep it at a steady vsync'd 100fps. (With a 2080 Ti)
I think that's an industry secret. Once I've started to look outside Anandtech for GPU benchmarks, I noticed they're usually at Ultra or High settings. I felt that is intentional so that GPUs will never catch up or overkill with recent games. Those settings are good for standard testing across the industry but poor for actual gaming. Since 1024x768 resolution, I've never needed AA or higher quality AA. I want the images sharp as AA always blurs it for me. Additionally, I've noticed in one game that my GPU power consumption drops to half from when I disabled some effects. I am using a Vega 56 at 4k resolution where at the higher setting, it shows 180 watts then it hovers 70-90 watts at lower settings.
Man, you dont get it. you dont get any of this pc masterrace thing. we play on ultra just to inflate our little ego and brag to our friends that their laptop is shit and our pc is great while the truth is we cant even tell the damn difference between high vs ultra and 1440p vs 4k on a completely blind test. Not to mention that watching those 4k resolution and 144fps on our games can give some nice erection in our whole gaming session. so with that said, HOW DARE YOU SAY WE SHOULD PLAY ON HIGH!?
It really depends on the game, but for the most part, you're right. I think a Single Player game is worth playing at Ultra, but online/competitive games you probably shouldn't unless you've got a PC that'll handle like 150fps or higher on Ultra or Low on games like Siege, Modern Warfare and Battlefield, then i'd rather Ultra. I only care for 120fps, so I don't need to drop to Low if I had a 2080Ti if it gets me from 130fps to 160fps for example. I want a 3080 when it releases, because my 1070 can't handle 1440p 60fps at even high settings on some games these days, but I'm okay turning the settings down provided the game doesnt look ugly after doing so
High for gaming. Ultra for screenshots. Now, that's how you call my attention
And no clickbait at all
Better than linus tech tips
Jakov Pavelić She(Linus)is a transexual gaming bitch
Agreed - straight to the punchline. Love it
@@thaNZdon everything i ever asked myself about gaming and computers this guy answerd, saved me the time to test things.
This dude can sell weapons of mass destruction with detailed explanation.
LOL, agreed.
And benchmark it too.
"Hello and welcome to arms deals!"
He is good, he is.very good
He can even sell a can of air and I'll happily take it.
Professional... no clickbait... straight to the point... THIS IS QUALITY CONTENT...
tech is the man!
Me, at home, after exhausting day at work:
1. Start FF XV or Witcher 3 with ultra settings
2. Watch the scenery for a couple of minutes
3. Exit the game
4. Go to sleep
No you don't.
@@jcdenton4911
Lol shut-up
Same here ..
that's totally me, but i play games only at friday night/saturday/sunday.
same here
I have 2080 ti SLI and i just sit playing heaven benchmark all day
Bwahahahaha
Of course lol
What can expect from trash gpu.
Azusa Snowflake how is it trash? The price is ass but not the card.
do you think we care , brag somewhere else
This man is the most informative person I’ve seen on RUclips
High detail for gaming. Ultra for screenshots.... t-shirt idea? I think so.
wanna be called a peasant?
@@6ix750 you can only call him a peasant if he says 900p 20fps low detail for gaming 😂
"midrange pc" in my country we call it supercomputer
Lmao ikr. Usd 1/4 of our currency and average income is half of that in the us... I built "budget" pc here and all my friend call my pc godly. The only reason I hate living here.
@@kamrankambang7953 To be fair, many people in many countries (except many people in wealthy countries which have nice PC market) don't earn that much so yep, even if its midrange gaming PC, it will be seen as "godly" PC based on its price.
@Pavel Vanek Malaysia. MYR
I call it low range
@@Hilani tung le kaye
I prefer doing custom settings, in most games just lowering a few settings and leaving the rest maxed can give you a big fps boost. Shadows are usually one of the settings that kill fps.
Exactly
Just setting shadows on medium will give you amazing results (usually low is pretty shitty and distracting).
Thats why one of the first things ultra low budget PCs do to run games is disable shadows.
Shadows, AA, AO and volumetric lighting are the usual offenders. In the case of Odissey, volumetric clouds is sometimes all you have to decrease to gain double digit fps boosts.
When I still had my 580, using medium settings and bumping textures to ultra was my go-to move for most games. That along with a Freesync monitor is a pretty damn good gaming experience.
dont bother buying top end GPUs. you got the right idea. I used to do that with low to mid range gpu and I was happy.
RX 580 can max any game tho, the thing is, ubisoft engine is trash, that's why in assasins creed you get this score.
thats what i do for my daughters pc.works great for her.normally a custom mix of high and medium tho.
Hahaahaa righttt?????
2080ti on low details at 720p = ultimate gaming!!!!!!
@@julatzschmulatz9607or Xeon Platinum 8180M
MegaMinihulk lol 😂 good one
Ultimate gaming at 1000 fps :)
csgo yt be like.
420 fps blaze it
ME: I don´t fall for clickbait
Tech Deals: High for Gaming, Ultra for screenshots
Hold my cervesa
Chela 😂
The 1500 dollar video card crowd are NOT gaming. They are posting links to futuremark scores.
ha ha i love it so true
I pre-ordered my 1080Ti back then, OC it for like 2 hours and then raged and played the shit out of it since then. I absolutely love it!
I played titanfall 2 for 5 hours last night...
They play fortnite xd
I neither stream nor benchmark, I just like the best possible experience :/ Granted it's not $1500 cards, more like $800 ones lol. Rift likes the extra power as well, only set the thing up a couple times a month but still nice to have.
Bruh, is there anything wrong with my eyes? The medium setting was looking the best to me.
More realistic graphics kills fantasy creativity for me might be the same thing.
Colorblind mode gayming without being colorblind.
APOORV CHOUDHARY hard to tell after its run through RUclips compression
@@brkbtjunkie plus it's split in 1/4 each so it's even harder to tell any difference... he should have done the split screen only for fps and full screen on top of it to really show the differences
I don't have any of those games but the difference between all medium to all ultra is pretty significant in most games I have, very high to ultra often really isn't a big deal but in most games the draw distance alone makes a huge difference
the most impactful however is resolution... any game I tested looked better on medium settings 4k than on ultra settings 1080p so I'd recommend aiming for the highest resolution your hardware can handle
i think i saw it to bro...your eyes are good..it looked smoother to me too.
It def looked better than high for some reason
In 90% of the games, the main culprits are Shadows, Ambient Occlusion & Volumetric Clouds or Lighting!
Anti-Aliasing as well.
@@patrickobaldo883 depends on the type - at 1080p you can get away with post-processed anti-aliasing (not true for lower resolutions tho)
Patrick Jerome Obaldo its depends , i cant play without AA , i just cant tolerate rough edges, but these above settings can be turned down and totally unnecessary
@Transistor Jump LOL
Sadly, things that make the lighting more realistic is one of the highest priorities. I'd take that over texture resolution or anti-aliasing, or anisotropic filtering, for example.
Personally, when my GPU isn't powerful enough, I turn everything down to low/off, except ambient occlusion, and I always keep native resolution. Those are the only thing that matter more than higher framerate, to me. (I'm generalizing, though.)
Ambient occlusion and realistic lighting gives me a better view of the geometry of the game world, and distances and such. Native resolution gives me the sharpness I need. Apart from that, framerate is the highest priority, so I always aim for 100fps.
it's like the difference between MSAA 8x or 2x.
the little improvement in graphics doesn't worth the massive fps drop.
The lighting in the AC run was different for each run so it was hard to compare. It was cloudy in high and very high, therefor everything appeared more washed out vs the shiny look from the sun on low and ultra.
That being said I totally agree with the general consensus that ultra vs high is generally negligible and not justified for the performance hit.
It is not helpful when game companies do not make their benchmarks exactly the same run to run... notice people standing in different places?
I could run the benchmark 20 times and it varies a few FPS each time. Having said that, while the min and max numbers can vary by a lot, the average tends to be pretty similar.
I noticed the lighting and shadows were different on ultra, but they didn't necessarily look better, just different. Certainly not worth the fps difference.
If you're on PC in the first place then you're mostly likely interested in gaming with better visuals. However, those who are fascinated by high end graphics are in for a ride.
Its important to know that you should never feel jealous about how good someone else's game look's or runs. Instead make sure you know how to tweak and optimize your games for your current rig and setup.
As time then goes by you will upgrade parts and you can perhaps increase the eye candy in your games. With each upgrade, each game and with the more you learn, you will appreciate and find just how incredible high end PC gaming is in terms of graphical fidelity.
You will definitely be hooked once you see the difference between normal and high end, I'd say it changed my whole perspective on regular "gaming".
I worked my way up pretty slowly over the years, learning, appreciating and experimenting every step through the journey. I now run my games in 4K with a pretty beefy rig. I can say that the difference is truly night and day and was well worth the trouble.
A lot of pc players care more for performance than graphics.
On xbox one x I get better graphics and colors because of the 4K and hdr but pc I get better performance, 1440p, 144hz
@@bretts5571 True. I got myself a 4K HDR monitor for my main PC setup and now I can never go back.
Visual fidelity has not really been a major thing for a while now, the current gen pro consoles have surprisingly decent graphics chips. But then again i'm not interested in running my games at 30 or 60 fps in 2019 and i'm definitely not going to play shooters on a controller. Sometimes i buy a console just for one or two exclusives (like smash and such) but otherwise the typical console exclusives (that generally weigh on the casual side) don't interest me and multiplatform games run better on PC anyway.
@Geo Tech News ye and pay the j$wish banksters for the rest of your life?
I hate generalisations like that. AC Odyssey is a terribly optimised game, yes, but it's one or two settings that influence the drop the most, I can't remember which ones though. So rather than going by presets, you can just fine tune what you want higher and what you want lower. For instance, in many games shading is quite taxing, but doesn't impact visuals much, but textures might not impact performance, but do affect visuals. If you just use "high" or "ultra", you're going to loose a lot, for not that big of a gain. Just spend some time, and play with the settings, it usually takes 5-10 mins at most.
Excellent point.
1000% this. A lot of games are like that. I usually max everything and if my fps is too low I look for the most often 1-3 settings that are tanking it and lower them a bit. Usually it is something stupid like motion blur or max AA or shadow details that you dont' really need maxed to still look really good. Turn down AA and shadows a notch or two and you wont' notice much if any difference but could get like 10+fps back. This video is good for the lazy or disinterested who just use presets but if you're willing to tweak a bit you can have your cake and eat it too.
I agree. In the case of Odyssey, volumetric clouds and AA are the demanding settings.
Tinkering around with the settings is the first thing I do when I launch a new game! By the way, in Odyssey, it's the volumetric clouds setting that eats a ton of FPS, switching from Ultra to High gave me a 10 FPS boost on my RX 590.
This is the true answer. Some settings have almost no performance hits, but others take a huge chunk. Namely shadows and shaders. But for the average person, this video will still help them to make the most of their gpu with ease.
In a way playing on low-end hardware is more fun because you focus more on the game than the visuals, and it makes you appreciate it more once you do upgrade.
PC gamers should do themselves a favor and stop selecting quality levels by tier. Everyone PC gaming should put the effort into knowing what each setting does, and customize the selection. An 8GB RX580, should have a mix of medium to highest, depending on the setting. An example is the texture settings. For right now, almost all games should be set to the maximum texture quality on an 8GB card (or second highest in the rare title). Some settings have a much larger impact than others, and some have a large visual impact, but not necessarily have a large performance impact. Sometimes there are post processing settings that you might prefer on low or off. Yes you can select a tier and play, but you are already PC gaming, and taking the time to install the game. Go the last 5% and set it up properly. Even on my 1080ti and 1080 systems you'll find a mixture of settings, because I don't like the look of all of the highest settings, some are garbage, and sometimes the resolution bump is the best setting of all.
If only games has some standard meaning when naming them "low", "medium", etc... but they dont.. in some cases the high and ultra settings seem to be mostly placebo, while in others its a true doubling of texture resolution
yeah, I almost always set shadows to low/medium - they still look acceptable and fps boost is really big
Completely disagree. Going through each and every individual setting is the most massive headache about PC gaming, and if you're switching between laptop and desktop frequently (me), your cloud saves also save your settings in about half the games you play. So this is a DAILY issue, and I f***ing hate it. And I have the knowledge to understand what's what for the most part. I still want the SHORTEST path from start screen to playing, I don't want to have to think about it, but indeed, the reality of PC gaming is that frame stutters happen unless you tweak settings. What I will happily advocate for is more developers including options for different levels of simplification, particularly a middle ground with sliders for, e.g. "textures" and "effects." A Hat in Time, as an example, has options that separate "Environment surfaces," and "model detail," as well as "effects" and "shadows." This is fine. And failing that, tiered, generalized selections are also fine. Doom (2016) as another example, also has the tiered settings, but with only a few others that you set separately. V-sync and anti-aliasing, film grain, motion blur, kind of the fluffy extras depending how much of a "cinematic" experience you want, and I actually like this setup quite a lot.
The only time I'm ok with going in by individual selections is competitive shooters/eSports games where having the most smooth and consistent framerate is in fact quite critical.
@@ryans3199 I can see where you're coming from with the cloud saves. I don't use them personally, so I didn't really factor that in. We have a bunch of gaming systems on our LAN, but we tend to keep each user at one system. I have had people switch systems, but I write scripts for the saves to backup and switch between the systems. If we're talking about missing features, it wouldn't be all that hard to have a selection for cloud saves to include the in game settings with the save games, or not. I realize occasionally a game will have those settings in the actual save, but that's what selective cloud disabling is good for. Here's another one I think I'm not the only PC gamer that's tired of seeing. PC games used to allow several save slots, or at least unlimited saves. I've seen a game go from absolutely open to multiple users, to the next in the series only allowing three slots, and one save per slot, and the next only having one slot one save. I'm sick of these games that don't even allow multiple saves for one user. Some of us would rather have functionality, than just relying on a game to save for us, but we lose everything else. This isn't a cheaping out issue. I realize they want you to purchase three copies, if three people want to play, which makes me sick. Many of the games we have, we own four copies for coop LAN play. I just can't stand the loss of things that made PC gaming great. If you don't want all of the extra features to make the game run really well, that's fine, and I think they should leave the quick tiers in, but every year there are less settings available to tweak, things get more dumbed down, and half of the time we have to hope they did a decent job porting it. At least most of the games I can still outside of the game get into the configs and fix their mess. Many games have a couple of small mistakes in their config, that never gets addressed, but murders performance. I would call foul play, if I didn't know that it normally comes down to time restraints, or no funding to do a proper job for the PC version. Anyway, originally my point wasn't to force everyone to take time to tweak. My point was that once you know what all of this stuff does, it literally takes 5 minutes with good OSD software, to test and set settings, to get a much better 10-100 hour experience (depending on the game of course). If this was a 5% thing, it would be a waste of time, but many many times I can achieve the same or much better performance with similar visuals to a card that's a tier higher, by monitoring performance demands per setting, and judging the visual trade off each time.
I have a 1080 and 1080ti system, but I upgrade some of the LAN PC's by hand me down upgrading, so I have a system with a GTX 970 that's heavily modded, but it's still far behind the newer systems. It helps to do the stuff I'm talking about in those scenarios. The slowest is an R9 280 OC, and it still runs DOOM 2016 1080p locked 60. Before tweaks it couldn't manage 900p locked 60. I should mention that that system has an equally weak CPU. Fair enough on making it easier, but it needs to not sacrifice quality. Calling levels of AA low, medium, high, because you expect the user to be too dumb or lazy to understand types of AA is awful. I agree that some of those settings should be separate from the slider, or combined into a second slider. That's my biggest problem with single selection tiers.
i might be the only one but i think high looks better than ultra on this game o_o
Yeah, light is yellowish whilst on uktra its grey wich is dull
6:14 The real thing starts, you're Welcome.
Tech can be a prof, able to turn 20 seconds into 4 minutes of intro
doing god's work lmao thanks homie
Wow! I love it when someone proves how much we have been brainwashed, it's up to us to accept that, I have! Thank You Tech Deals, your honesty is is so refreshing!
Why am I seeing Medium settings look better?
This what I kept thinking. Coloring looks better
me too!
Different cloud patterns
and i thought i was the only one
I turn off a lot of the lighting effects I find tacky. I like to think I play with sunglasses on, just without sunglasses.
I'm from argentina and I remember I saw the thumbnail of your video just before I refresh the RUclips's start page. And honestly tought I'll never see this video again on my feed...until now...thanks for remember us the real spirit of gaming
Im a simple man. I see tech deals I click and like 😎
You also recycle the stupidest meme/saying on the internet.
I'm a simple man. I see settings I click ultra.
even as a computernerd since 1985.. i'm subscribed..
Yeah, all the big tech youtubers today make it sound like games only run if they're in ultra.
Tech Deals coming in with the facts. This is why I love this channel.
I couldn't agree more.
@That Poo Head no
@That Poo Head yes
>"can you legitimately tell the difference?"
>"see a difference?"
>plays the videos in 540p, tiled
"facts"
QAOSbringer what
So the title isn't clickbait, the information is clearly and concisely presented, and the point you set out to prove is well made. It's like I'm on the wrong internet.
sometimes i push graphics to ultra just to take screenshot FeelsBadMan u got me
Judging quality by youtube is impossible. It degrades footage so much that ultra looks like low when you actually play it.
YUP
Just as I opened up RUclips, this video popped up :D. I personally game at 2560x1080p on high with a GTX 1060 and all my games run perfectly 👌🏻
What is perfect and what are your games?
@@dreamshooter90 as long as the games are not the new top AAA games id bet he gets good 60fps on most games.
i pick up all my games on sale (so more than 2 years old) and they tend to run perfect on my rx 580 at 60fps with lots of GPU power left over so the GTX 1060 will be about the same.
plus his display is between 1920x1080p and 2560x1440p so it's biger but not too much harder than 1080p. sounds like a nice size display :D
@@Solidlightvideo wtf are you talking about. His 2560x1080 panel has 27.6M pixels and a 1080p panel 21M pixels. Thats a 30% difference.
@@BenState But it's not 30% harder to run. 1440p is about 30-35% harder to run though
@@jackoberto01 do the math
Every time I see a video on RUclips testing benchmarks they always put it ultra instead of just high
I remember when you first started. I think I subbed then, thinking at least this guy goes through the trouble to explain why, verses just throwing up a bunch of graphs.
Still think your channel is the dogs danglies and still subbed. Good to see its grown well. Good luck.
I bought the SSD 2 months ago, great for the money. I appreciate the point you are trying to make. Constant 95 degrees is a little meh. People should look what games they want to play and what moniitors they want to drive. Good and consistent frametimes is extremely important for your gaming experience, maybe even more important than the difference between high and very high level of detail. BTW: If you ever played without motion blur, tearing and low latency (Input lag) on a 144 Hz panel with a consistent high framerate, you NEVER want to go back. :-)
i play low for gaming, ultra detail for....
BLUESCREEN
good one
This makes so much sense. Thank you! I guess it all depends on how much you can crank out from your rig. For adventure games like AC Odyssey 30 FPS is OK but for First-Person-Shooter games like Metro Exodus 45 FPS is what should be the minimum for a better playing experience.
You are my hero!
When you have time and no great deals to show, please do a video about the *most expensive and useless* game graphics effects, as SSAO , FSAA 4x, etc.
Alessandro Borges check Digital Foundry
Love this Video!
Here in Germany I confronted some youtubers whith this content:
Nobody wanted to talk about this theme.
Manufacturers wants to sell expensive equipment and of course only newest (and most expensive) components can fullfill our (theirs) highend dreams...
want a bone?
@@notoriousbig3k your bone?
Depends on the game, World of Warships and Elite Dangerous (my main games) are perfectly playable on my PC at ultra detail. Some of the enjoyment especially in Elite is looking out at a gorgeous view of a galaxy/planet from somewhere that Noone has ever gone before!
this is the right answer....
b550m, 5800x, 3060ti, 32gb 3200 ram. I play on a 1080x60 monitor. I play on ultra settings with RT off. I never drop below 60 fps in any game I play. Hardest game for my computer is Hogwarts Legacy. No stutters, smooth as butter, 60 fps the whole time. If I dropped under, or had stutters, I'd go to high settings. But 1080 for the win :)
8:30 at medium CPU hits 98 C
monkaS
and 7:30 was wondering if anyone else noticed
also saw this...i have a 7700k @5ghz dellided and under a 360mm aio max hits 80c max
Wow that a lot
*89
Delidding is a must for the 7700k, hits 60c on prime 95 Nochtua d15 cooler, 30c idle 5ghz, AIO are overrated in my opinion
Sir... U just said exactly the same words which i say to my friends who foolishly argue on playing everything at maximum setting available and really suffers huge fps drop for a marginal increase in visual quality which no one ever notices when u r really in middle of the action..... I am so happy u made this video, u know.. They will listen to you... Not me... I am jealous :)))
Dude I love you legit your way of talking and explaining things is just amazing and keeps the viewer interested...Keep going!
I am really happy that you make your videos direct, simple and understandable, that is something not everyone can do
You want stable performance. 60 -100 fps at all times
then you need intel :(
Stable does not mean high fps, it means that its the same fps. I would take 60fps over jumping from 100 to 200 every sec
Yeah, at least. For single player. And rock solid 144 fps/144 fps+ for online shooters.
the thing is, most competitive or esports shooters / players does not even go as high as ultra. We have mostly set custom settings or low in order to get the best fps available, even when we have a beastly rig. We need to be able to spot enemy in pixels (CSGO), we set foliage or textures to low so that we can spot enemy hiding in the dam grass (PUBG), we set Ambient Occlusion off so that we can see enemy hiding in the dark corner in a room. (R6S). Even thou most of us have beastly rig, we still game at lower details with massive FPS.
I have an i7-8700 and still my GTX 960 4GB Card lol, it’s the only “old” part of my PC, but at least I can still play most games at like mixed graphic settings.
Why you are doing that to yourself? :-)
GTX 1660 is not that expensive but you will have much smoother gaming.
gpu.userbenchmark.com/Compare/Nvidia-GTX-960-vs-Nvidia-GTX-1660/3165vs4038
and:
ruclips.net/video/4jGIRoDF-bI/видео.html
I can't believe some people think they need to play on ultra in order to enjoy the game. Whole my life I've been optimizing my budget and graphic details trying to get 75+ fps and it feels great every time. It's a game, not a movie. Smoothness over realistic detailed scenery any time! No matter if it's competitive or story driven game.
This is why I look at Tech Deals and Digital Foundry. So informative.
Framerate over Ultra Graphics. Hehehe
THANK you, Tech Deals! There was another other video about this posted a while back pointing out that developers really don't balance their games for Ultra setting performance. That and that Ultra visuals are marginally better than High at best, and not worth the wild performance cost.
10:45 -- "low detail is absolute garbage ... a whole other game"
11:22 -- "except for the grass, these are amazingly similar"
Even so, a very useful video.
If you are playing older games, it's fine to play at ultra, also getting a gpu a bit over kill for your resolution at the moment (e.g. 1660ti, 2060 at 1080p). As long as it stays above 60 I'm fine with ultra, but if it drops I have no problem dropping to high settings instead.
I had budget pc for my entire life. Thanks God I bought a relative good one, for the first time I will play with freaking shadows on maX settings.
"how well the loin cloths are hanging". That made me spit my water. Thanks Tech Deals as always for an awesome video
2080 Ti and 9700K is what I'm playing at high detail with.
i have so much "But, but."
Although, I've tested games myself at different detail settings and you are completely accurate. Nice video.
did you just assume my processor!!!!! ryzen 2600 rx 570 8gb..........
I feel the same too 😛. Ryzen 7 1700, r9 390, 32gb
@@suiton20 wtf? That gpu bottleneck
@@techbuildspcs there's no big bottleneck here its close to ur gpu
You don't need to drop an entire preset on AC:O. Volumetric clouds is a ridiculously demanding setting. Drop it to medium and you're golden. And yeah, I like ultra for playing, I'm a snob.
I love to play RPG's, so the details are something I look for... let's take the Witcher 3 example, looking at the scenery is awesome :)
I play some games at High and others at Ultra, difference is not noticeable at all between these 2 presets.
Roberto Carlos Definitely a game worth cranking the details on.
I agree with you mostly. I think it is only sometimes worth it cranking that shit up to max. But I have a major critique point. Instead of showing the quartered window you should've shown us the games in full screen. We were essentially looking at a 1080p image downsampled to 480p. Therefore we couldn't tell a difference in texture quality, tessalation and small shadows and stuff. The only difference I could clearly always see was the reflections.
Like I said, great point but poor evidence.
Look at those teeth, I bet he even smiles at funerals😬
You're the guy backstabbing people at funerals
whats wrong with you.
Lol
That made me chuckle..thanks
Biggest difference is in the draw distance and pop-in of objects. I always crank draw distance at the expense of detail because you really do notice it.
Love your videos, always adressing the important topics and the real deal for (let's be honest) the majority of the gamers out here who don't game on an RTX or i9-9900k.
I have that very same Cooler Master case. It surprises me how under-appreciated it is, considering its airflow, modularity and obscenely low price. The only true complain I have with it, is its acrilyc side panel, but at its price-point, its totally understandable. If you are on a budget and looking for a good case, don't hessitate to buy it.
why do i feel like medium looks better than high? XD
For real.
Because the RX580 is a terrible graphics card lol
@@itsJoshW that makes no sense lmao, it's just a better 1060 for 160 bucks.
@@gu3z185 Your point?
@@gu3z185 Just to clarify: You are saying "It's a better graphics card than a shit graphics card that's also 2+ years old". Edit: The 1060 is 3 years old in July.
It's insane how much performance you can squeeze out of the games just pulling some setting down to high. Sometimes even one setting. I would never use presets myself.
Could you stretch and pad this video any further? I didn't get what you were saying after you only repeated it for 67 times
You could just stop watching after hearing it for the first time and leave.
What kind of glue do you use for your dentures? I keep spitting mine out during intense gaming sessions and getting saliva and denture glue all over my mic. But yours seem to stay in ok... What do you use?
GTA V is the only game that I can find difference between different graphics settings...
And ark
I think it's because you only play GTA V :)
Jesus this channel is always 100% real. Just seeing the thumbnail was enough for me, instant like! But one extra fact: Even for content creators you don't really need the best hardware. This is something I only learned once I got really good hardware. You get diminishing returns in video quality for every gaming quality improvement. One of the biggest reasons is RUclips's codec. Even after tricking RUclips into using their better codec still doesn't give you 1:1 detail in your videos. I noticed that if you upload a 1080P video in 4K, only then you get some of that quality you see while playing. But you can't watch RUclips videos at 4K on mobile which a lot of people use. So...
TL;DR If you are a content creators thinking ohh, ok, I NEED a good pc, because it will show in videos. Yes, but not as much as you think at all. A budget build is all you will need.
You always cover everything. Glad I'm subscribed !
emmm...6-40...is the medium and high are right?seems like right upper screen is a medium cause there is no global illumination and shadows...
you went wrong when you said "awesome rx 580"
As someone who plays at 4K on a 1080Ti, I can't agree enough. That high to ultra jump makes a massive difference in a lot of titles. I will say though, it also depends on if I'm playing single player or multiplayer.
No, I'm playing League of Legends on low settings with 40fps
I know the feeling dude...
Oof
The worst part is that the same computer I used to play Lol at max details in 2011 barely hits 60fps in some teams fights nowadays. At least the last time I played it, two years ago
Everyone is going to find their own sweet spot for gameplay settings. I got used to playing Rocket League at moderately low settings so I could keep my frames up around 120. Then I discovered I could create a custom fan curve in Afterburner for my RX 560 and now I play with medium settings at 144 fps steady. And whereas the card was running above 75ºC before I can now keep it right around 60ºC with just a bit of extra fan noise--most of which doesn't matter since I frequently wear earbuds to play anyway.
Ultra is for gaming at 1440p 144hz (on my 7700k/2080ti)
Me too but I've a 2700x
What monitor are you running if you don't mind my asking? Looking to upgrade so looking for recommendations 👍
@@ErasedPrime same here
Stuart Dow - ASUS PG279Q 👍 it’s a couple years old now, but still better than about anything under $1000. 4K is too hard for AAA high refresh rate, even with a 2080ti.
lmao those who chose 2700x still getting 144hz....rip intel=too much cost
In the Assassins Creed Odyssey engine, its a unique case, the driving factor in performance between High, Very High, and Ultra, is volumetric clouds. From medium on up, that setting offers NO measurable graphics improvement, with severe performance hits the higher you go. So you can run the whole game on ultra, with clouds on Medium, and it'll run great.
Can you make a video of gtx 1660ti vs rtx 2060 if which is good to buy. Thanks in advance more power to your channel! 👍👍👍
just buy AMD :))
Thanks @Patrick Burwash appreciated for your quick response. 👍
I have a system with an 1660ti and one with an RTX 2070. The 1660ti is a fine card for the price. But even ignoring the RTX features completely the 2060 is a better card when it comes to raw performance. If you want to go for the 1660ti go for a cheaper model. I have the MSI 1660ti gaming x which is only about 30 € less than an entry 2060. I can say that the 30 € in savings is not worth it in that case.
Get a RX570. Much more value.
@Patrick Burwash That's...not true. Digital Foundry has shown both BFV and Metro Exodus running with RT on high settings, at a steady 1080p60fps.
I think a very important reason to lower detail is missed. In some games the bad guy can be hidden behind detail and take you longer to find, so turning detail down let's you find/kill them first. Some games benefit from shadows and some don't. If shadows is on I can determine if someone is coming around the corner in some FPS games, so I camp and wait instead of getting camped. In others shadows hide things that take you more time to discover important things.
I recommend trying settings for each game you play and use the one that benefits YOU and your hardware, there's just no other way to do it IMO. And for Space Invaders sake, lose that psychological influence of Ultra vs other settings. I have found no game that Ultra setting is best as far as competing goes, hardware performance aside of course.
Hmm, I run a GTX 1080 at lowest settings plus draw distance max.
You must enjoy sniping in battlefield.
Carlos Stout then you probably have a very fast cpu. High frame rates is life.
@@MrYedige True and true.
@@CarlosElPeruacho and high bandwidth/low latency ram?
@@MrYedige Eh Nothing special, DDR4 3000 15-15-15-35. Needed good performance, but when I purchased the RAM, the prices were still pretty high. That being said, I've had no problem driving 240 FPS is most of my competitive games, with at least 144 in the more demanding ones. I don't think I'll be getting a Turing anytime soon, but I think that would be necessary to ensure 240FPS in the most demanding games. It's all butter in any event, and I have no complaints. What are you running?
This video is in a way for those whom dont like fiddling with their graphical settings to find out which option affects fps and how much visual difference it outputs.
Note that not all games are only specifically gpu bound, but are also cpu dependent.
Regarding ac odyssey, hardware unboxed has a detailed video on what settings are optimal to use without sacrificing any of the visual fidelity.
✌😏 Low 720p FTW! 💻
If your rig can handle it, why would you not game on ultra settings? Something to think about
I think it depends on your monitor. If you have a 144Hz panel you will notice the drop in frames on mid range or lower hardware when you go up to ultra
IF you're stopping to look at grass, tile, ground and building details while gaming... you're gaming wrong.
thumbs up for this guy who REALLY knows the ideal gaming settings
are you sure the labels are right ? ..... ;D
I think it's due to the change of time at medium benchmark there is barely any clouds in the sky so it is more brighter than others.
Perfect hair : Check
Perfect teeth : Check
Ungodly perfect level of information : Check
Hotel : Trivago
Having the FPS counter on the upper corner of the screen is a dis service we gamers do to ourselves. Just play and have fun! (Easier said than done)
That is true, but I always turn afterburner not to see fps but more about cpu and gpu temp.
Quite an important point to always keep in mind when you're looking to go buy/upgrade to a mid(budget)/high tier GFX card is, what games do you play?
Buying a budget card can be good to a certain point, specifically when your games are solo offline campaign modes, the big jump/difference with performance comes in for all the MMORPG's. You load in and there's massive battles (PvP) happening and all of a sudden your mid tier budget card is hitting overload due to 30-150+ other players casting/spells/combat animations all at the same time, then you notice the big drop and low to mid tier cards will definitely need to drop down on graphics (possibly lowest setting) to handle that and save you fighting at 10 frames a second due to the card being maxed out completely.
Just something to think about if you play any of those MMORPG's with high population on the servers.
i have similar specs but I have a Core i7-4790 and ddr3 16gb of ram
I have an i5-4440, 8GB of DDR3-12800, and a GTX660. Just today, I played AC: Unity on the lowest settings, at 768p, on my 1080p monitor. Ran just fine, but my VRAM (2GB) was filled to the brim. I plan to keep this rig for another few years. Maybe swap the GPU if the day comes when I can no longer launch a game at 720p.
And I'm planning to build a LAN rig with a Q6600 taking center stage (or Q8xxx or Q9xxx, depending on how cheap I can find them). I want to bolt on an old 15" 720p laptop screen to make it a true portable LAN machine. I think I might even be able to run Fallout:NV on medium! :)
You work with what you have in this business. ;) Computer building and gaming can be a great hobby, even if you're working with 10 year old components (and building $100 computers)!
Difference between very high and ultra is definitely noticeable in the lighting, especially on the water. But it's not a big deal, I wouldn't honestly care or even notice if I wasn't researching GPUs in the first place.
Oh, by the way, your videos have been really helpful in making my choice of GPU. Still weighing and balancing my options but you've put a lot into perspective.
rtx 2080 ti 1080p....... bruh only unaware competitive streamers.
dylan eijkman or 144hz ultra
Just got around to watching this and it is so true. On another note, this is why I don't use Geforce Experience graphics optimizer. It always seems to put too much emphasis on visuals so when you go from optimized settings to very high or high, you get drastically better performance in most games.
60fps doesn't cut it for a 144hz 4k screen.. nobody is going to be dumb enough to buy a 2080ti to play on 1080p.. come on..
Yeah for some reason he thinks that people buy 2080ti's for 1080P 60hz screens lol. Plus an RX580..like that's not something to relate to lol. That card was 'middle of the road' when it came out in 2017 lol... it's almost halfway through 2019.
60fps doesn't cut it, simply. Personally, I can't live with anything below about 80fps minimum.
I only have a 100hz (34440x1440) monitor, sadly, so I make sure to keep it at a steady vsync'd 100fps. (With a 2080 Ti)
I think that's an industry secret. Once I've started to look outside Anandtech for GPU benchmarks, I noticed they're usually at Ultra or High settings. I felt that is intentional so that GPUs will never catch up or overkill with recent games. Those settings are good for standard testing across the industry but poor for actual gaming.
Since 1024x768 resolution, I've never needed AA or higher quality AA. I want the images sharp as AA always blurs it for me.
Additionally, I've noticed in one game that my GPU power consumption drops to half from when I disabled some effects. I am using a Vega 56 at 4k resolution where at the higher setting, it shows 180 watts then it hovers 70-90 watts at lower settings.
Man, you dont get it. you dont get any of this pc masterrace thing. we play on ultra just to inflate our little ego and brag to our friends that their laptop is shit and our pc is great while the truth is we cant even tell the damn difference between high vs ultra and 1440p vs 4k on a completely blind test. Not to mention that watching those 4k resolution and 144fps on our games can give some nice erection in our whole gaming session. so with that said, HOW DARE YOU SAY WE SHOULD PLAY ON HIGH!?
It really depends on the game, but for the most part, you're right. I think a Single Player game is worth playing at Ultra, but online/competitive games you probably shouldn't unless you've got a PC that'll handle like 150fps or higher on Ultra or Low on games like Siege, Modern Warfare and Battlefield, then i'd rather Ultra. I only care for 120fps, so I don't need to drop to Low if I had a 2080Ti if it gets me from 130fps to 160fps for example. I want a 3080 when it releases, because my 1070 can't handle 1440p 60fps at even high settings on some games these days, but I'm okay turning the settings down provided the game doesnt look ugly after doing so