It makes a lot of sense for recording footage. Being able to zoom in post production is crazy useful. so from a gaming perspective a content creator may get a lot out of it.
The thing that drives me batshit crazy is the limited draw distance on objects, vegetation and LOD's even when it's cranked all the way up. So down the road when your computer can crush it you're still limited to seeing pop in.
i absolutely hate LOD in games (at least their current implementation). There should be a way to calibrate it so the game checks your resolution and FOV, and using that it goes through the assets you can see and calculates how many pixels it takes up on your screen, and calculates LOD from there. Because currently, you can still see sizable objects that are just blurs or grey blocks until you get extremely close to it which ruins the viewing experience for me.
This is already a problem with old 3d games right now. It likely won't stop being a problem either. There are game devs right who STILL use framerate to calculate physics and whatnot. The industry hates future-proofing. It's why live-service games can get away with becoming literally unplayable once the central servers get shut down.
Exactly. It's like yeah, I get it, when San Andreas was being made at the time, Rockstar couldn't even fathom the idea of a 4090 graphics card. But with just how powerful our modern hardware is, I should easily be able to have the entire State loaded in at once with all the high LOD models loaded in and have ZERO performance hit
When I got my 1080ti I started running all my games at 4K (DSR) on a 1080p display because in older games, render distance and LOD were tied to resolution. Perfect supersampling AA and dramatically less geometry pop in.
@@drunkchey9739 yup, if my 1080p display had better color I would use that over my 1440p display. Especially because 4k DSR at 1440p doesn't look right and 5K is out of my performance tolerance.
@@drunkchey9739 really is, I remember downsampling from 8K for my 1080p monitor in some old games. Every line looked absolutely perfect. Crazy that the trend nowadays is to render at less than native res, then upscale and slap TAA on top.
4:27 the temporal issues that come with it, TAA is a plague to modern games that implement it badly. oversharpen to compensate, ghosting, blur, dependency for certain effects, etc.
Image ghosting in Satisfactory was quite distracting the last time I played it, with the effect being super smeary, which reminded me of Doom 2016, specifically the scene in Samuel Hayden's Office when he waves his hands around, trails are left behind on his carpet. Out of FSR, DLSS and TSR and plain old TAA, I remember DLSS being the least image disrupting method, though it's still noticeable on the conveyor belts. It made me consider turning off anti-aliasing entirely, though that'll cause issues too since stuff like screen-space contact shadows relies on TAA to smooth it out! So it's become an issue that TAA is relied upon to make some effects work, which is an unfortunate circumstance to be in. I kinda just try to ignore it, since nothing is perfect, so it's a matter of compromises... Edit, since somehow I couldn't grammar check correctly, and also an elaboration on the scene in Doom 2016.
I'd really love to know Phillip's thoughts on TAA and all the ghosting it produces. I recently bought a 2k screen with a 4070ti super to accompany it. After trying some modern games I felt something was wrong, but couldn't quite put my finger on it. The biggest offender for me was The Finals, I get that it uses RT to update light in destroyed buildings and it would shimmer a lot without some temporal solution, so it forces TAA or TAA-based upscalers as it's only options, leaving out traditional AA or even no AA at all. But I really hate what it does to image clarity especially on longer distances, players and other moving objects become mushy blobs with long trails after them. No matter the settings, I always see those ugly afterimages and it makes my blood boil since that obviously flawed technology is pushed on me with no alternatives present.
I think you're right, but I do very much see the importance of having the option. Txaa makes some of the best hair rendering I've ever seen, most likely only replicateable by super sampling. It's the copper of antialiasing solutions, you know? Like how, sure, platinum, gold and silver ARE more conductive, but they're nowhere near as abundant as copper. Hence why they use it in everything. And when combined with the sharpening filters of fsr and the like, it can make for a solid stopgap. I actually prefer cranking the sharpness in everything I use on my steam deck, since blurriness bothers me alot more than sheer pixelation. But that's the key difference, there I have a choice of how my games are scaled, console players don't. And when the game devs want to make a game run well on console, you used to have to jump through all kinds of hoops and truely work for your frame rates, show some technical wizardry. Now they're gonna be tempted (and sometimes even forced by management) to just slap some fsr on that b, and call it a day.
@@322-Dota2 what about crap like cyberpunk 2077 which has built in taa and if you disable it through config game just breaks because game was built on it?
@@322-Dota2 While you might not feel a need for AA at 8K, there will still be artifacts. To truly not need AA anymore, you need insanly high resolutions. If you don't believe me, go to blur busters ufo test page, select the aliasing visibility test and measure how far from the screen you need to be to not see any artifacts any more. Then calculate the needed resolution for a normal viewing distance from that.
@@B.E.3.R I've never even heard of it until now, and I've even booted up CS2 more than once (to test whether my potato can run it comfortably, or at all in the case of the Linux install).
Personally, while I see the value in 8k, I definitely am more pleased with the current refresh rate arms race going on in the monitor space currently. The benefit of 8k relative to the performance cost is definitely very high, and while it is not without value, I personally think that chasing higher refresh rates (240hz/360hz/480hz) is the more beneficial solution. We are still very far away from hitting truly diminishing returns when it comes to responsiveness (okay, maybe this one not so much), motion smoothness, and *especially* motion clarity. Low MPRT is absolutely critical to having a truly sharp gaming experience IMO and goes woefully underdiscussed compared to cranking the resolution. What is the point of rendering such an extreme amount of video information when the vast majority of it ends up smeared and indecipherable simply because our eyes don't mesh well with how sample and hold displays display the image? It takes a performance dump of already potentially *questionable* value and makes it even more situationally beneficial, further diminishing the appeal in running such high resolutions.
I sacrificed colour accuracy and brightness for a different IPS monitor of the same resolution, but upgraded from 60Hz to 165Hz. Oh my god it was so worth it.
TBH Frame gen is the way to go for motion clarity, at 300+ fps you already have enough responsiveness, and since frame gen has less latency and less artifacts at higher fps. The downsides would be very small. The only limitation is that the optical flow accelerators on current cards weren't made for 300>1000fps. Frame gen can be seen as an enhanced BFI, where instead of flashing a black frame to increase motion clarity, you flash an interpolated frame.
Why not both? I have a 4k240 display. Target of equivalent motion clarity to 1000fps seems to be the end game, although BFI could give that at much lower refresh rates. Still, I don't play games with extremely high motion. Anything in 3 figures fps is sufficient for my needs, yet my resolution itch isn't satisfied and I'd like more options beyond 4k. Options exist going both ways for whatever is preferred and given long enough I'm sure they'll converge.
@@snowwsquire Interpolation is definitely the way forward but, yeah, you do need a notably high starting FPS to overcome the shortcomings of modern interpolation. The optical flow processor stuff is mostly marketing however, applications like Lossless Scaling can already do 4x interpolation even on GPUs without specialized hardware and it isn't reliant on any actual game data, it just captures the game window and interpolates it like a video and it does it very well, with about as little latency as is possible for that style of interpolation and pretty decent quality as well. Though obviously, first party solutions from companies like Nvidia where the game feeds behind the scene data does yield higher quality output. It's also quite a bit better than black frame insertion on account of not only having no brightness hit (making HDR feasible) it also provides increases in motion smoothness that BFI fails to achieve.
Imagine working hard on making a decent video, being eager to hear your viewer's feedback only to be bombarded with a reddit tier comment chain spamming "hmm yes, maybe it does have purpose, thank you for pointing this out"
The biggest use case for 8K is headsets, VR or otherwise. That said I still think gaming at 8K (outside of a headset) is dumb and likely will be for a very long time. The diminishing returns are pretty extreme and there are lots of other metrics worth boosting before that. I also don't think downscaling from 8K counts as gaming at 8K. Ultimately what you see is still at 4K and only requires a 4K screen. And 5K and 6K aren't 8K. The returns diminish exponentially and it's rare to see people arguing 5K is too much for example.
Thanks for the nuanced take. Most of the time people claim that there is no advantage to 8K, while, as you said, there is one, it's just not even remotely worth it for now. If you are talking native 8K I fully agree that it will be a bad idea for a long time. The thing is though, upscaling is getting better and better, so gaming on an 8K display could become reasonable quite a bit faster. Currently even just the upscaling cost to 8K isn't worth it, but two GPU gens in the future that will probably not be a noticable performance impact anymore. For completly matching peak human vision, you'd need 2400ppd. I tried to measure that for myself and ended up with >1700ppd instead, which is >86K on a 32" at 80cm distance. In realistic use cases it will be indistinguishable from a 16K screen and even that is very far into diminishing returns. Personally, I would prefer 6K, as the benefit of 8K+ is too little and I would rather have the resources spent on refresh rate and color depth. But I expect the industry to push for 8K instead.
Im pretty sure average ppd on the best eyes is lower than 500ppd. Maybe there is a study that says a focal point can see that high. So in that case just focus 1440p into that area. But the rest of the screen would realistically never need to cross 16k. Or 15360p. Regardless of size. Apple "retina" is 120ppd. And double that is probably goodenough@@davidruppelt
It's true. Downscaling isn't the same as native at all. It's the same as watching a video in 720p that was made at 4k. It's a huge upgrade in image quality versus a video made at 720p. But at the end of the day the clarity isn't anything more than 720p. Theortically even if the bitrate was the same. It would still apply that you can not zoom in (or bring your head closer) to actually see detail. It's the same story with upscalers. Atleast for the current ones. You can test by turning off forced taa/upscaling. You'll see that you don't actually lose any detail. With upscaling a blade of grass can be ultra sharp. Without it you'll see the 16 raw pixels. But if you turn up the real resolution of that to 64 pixels. You would actually be able to see the veins and stuff on piece of grass. Yet you would still see squares on the squares so you have people who still call it inferior to the ultra sharp perfect circle.
@@jwhi419 I have the 2400ppd from the "NVIDIA Automotive Screen Density Calculator" where they claim that to be "approx. limit of high contrast feature detection". I don't think there is a practical benefit of a screen that capable over a 300ppd screen with AA, but if you truly want a resolution that under no circumstances needs AA, than 2400ppd it is. 300ppd would be "20/4 vision; approx. limit of alignment detection (hyperacuity)" and is equivalent to a 32" 16K screen at 80cm distance. That should in my opinion be the endgoal. Apples 120ppd are "20/10 vision; practical upper limit of visual acuity". That is a more realistic goal for now and would probably be good enough for me, but would need AA to avoid the edge cases where it isn't. I did a test with blur busters "Worst-Case Aliasing Visibility Test" and there I could still see the artifacts from a distance equivalent to 1700ppd. There may very well be some error in my test design, but at least I believe the conclusion to be valid, that 120ppd are not enough for worst case scenarios.
@@davidruppelt i see. However i do think that if the tech to create a 32 inch 16k display exist. Then the tech to do the same in vr with a headset weighing 200grams would probably exist too. Well the chief of blur busters has talked about an end goal display technology in his forum before. The Holo deck from star trek. I do not recall the details but i think he ignores pixels at such a high resolution. Of course his deal is more about the motion clarity. Since just like you noted that you would need more than 1000ppd for the line test. You would need more than 10,000hz to move each of those pixels visibily to a human.
Obviously when people say 8k is a gimmick, they are talking about the here and now. The hardwarerequirements and costs are just not worth it. Things might change in 10 to 20 years, as they always do, but thats not a revolutionary idea.
That's not obvious, many people say resolutions higher than 4k are _completely_ worthless due to the limits of human vision. This is already visible on smaller devices, 4k laptops aren't much sharper than 1440p ones and I'll never need a 4k phone even in 20 years. I agree with philip that the limit for monitors/TVs is a bit higher, I wish there were more 5-6k options, but there's a limit there too and 8k will likely be the last increase. VR headsets will go even higher but once we have better foveation and reprojection it might be a bit meaningless to talk about native resolution/refresh rate
@@speedstyle. laptop displays above 1080p have demonstrated how useless they are for more than 10 years now. but still they keep being produced despite the fact that you have to sit 2 inches away to even tell. intel didn't even allow unfiltered scaling from 4k to 1080 until 2019, and every laptop before that is arbitrarily not allowed to use the objectively simplest method of rescaling in existence. and then they pretended it is an actual feature that took effort to implement when they so graciously decided that the consumer worms were allowed to use it.
@@speedstyle. i had a 1440p phone in 2014. all my phones since have had lower resolution because that's just useless in a phone. my current phone is considerably larger, but it's something like 2300x1080, a much lower resolution (also ultrawide) and it's still as sharp as i'll ever need.
I don't find 4K that much better than 1440p personaly already, but I still can understand that others enjoy higher resolutions. It's just not that interesting to me. Of course higher resolution is better, but framerates and a smooth visual are also important.
Still, in the future 4K and even 8K will become that "smooth framerates and clarity" sweetspot. I also have a 1440p screen and am very happy with it, yet I can't wait for 4K to become as easy to run as 1080p nowadays, I love replaying favourite games whenever I get a higher resolution screen, just so I can experience detail I wouldn't have otherwise.
Same here. I first went from full HD to 4K, which was a huge difference. Much later I got a 1440p high refresh rate screen and the difference in sharpness is unnoticable in desktop/text sharpness. Maybe for something like Escape from Tarkov it would be nice for spotting stuff in the distance, but 4K high refresh rate is prohibitively expensive both in processing power and screen price. No way I'm paying NVIDIA the "we price for the elite" cost that they have imposed on newer RTX cards. High end AMD is already expensive enough.
I definitely noticed 1440p to 4k. I don’t notice 1800p to 4k. Its like how 144hz vs 240 is noticeable. 240 to 360 is harder. Its just pass the barrier. Though 500hz is approximately the frame rate an untrained eye will notice.
I actually still see a huge difference because I jump recently from 22 inch 1080P to 27inch 1440P and it's not much better even ppi is just a bit better than before but not that much. On my old 43inch 4K TV even at 1 meter looked so much more crazy because it wasn't just resolution, just dropping slightly the resolution on it the game was still very defined but lost a lot of brightness/colour/contrast because beside resolution there's also massive more colour information that makes a lot more changing in between stuff. Problem is now some games I used to play fine at 1080P now I sadly have to play as vaseline in my new 1440P monitor cause I lack hardware with my Strix RX6700XT 🫣 Ps: I wish I had a 27 inch 4K monitor, that would make me very happy but I would sadly lack hardware to push such high res, but later on a PC upgrade even upscaling works much better for 4K than on a 1440P or 1080P monitor since like FSR quality will start from a higher resolution in 1st place.
And what about VR too, we are already almost approaching 8k with the apple vision pro. Regardless of the success of the device, the screen resolution is something that is praised when compared to other VR headsets.
I feel like you meant this as a joke but yeah, in 5 years with a decent card it will be a normal option.... I started playing games in 4k in 2017 with a gtx 1070. I could play games in 8k with my current GPU if i wanted to.
@@OttoLP personally in 5 years I fully expect I will only just have adopted 4k, but I'm also not buying anything that could be construed as high end lol. I just can't bear to spend that much. But yeah, just as 4k seemed very silly and wasteful, but is now seeming more and more viable... So too will 8k. I expect we are more than 5 years away from 8k reaching mainstream, but I don't doubt that we will start seeing it again by 2030.
@@pancakerizer Hahahaha, yeah that would be crazy playing games at 4k 240fps like it is nothing. Though I doubt they will improve on path tracing performance as much in 2 generations, it might get 2-4 times faster at most I guess. Fun thought though :D
@@OttoLPhonestly going off of 4K (which never took market dominance) It would be about 8 years from now for people to start adopting it in mass. Because high frame rate monitors will lag behind by about that much
Pixel quality > pixel count; furthermore, art direction > fidelity Chasing such high technical standards has game developers and artists overworked for diminishing visual returns, install sizes that occupy 1/4 of a console's hard drive, and products that often need to be smothered in layers of AI and TAA vaseline in order to run at a stable frame rate at all. Meanwhile, Nintendo still makes billions shipping games at 720-1080p, while still earning praise for visuals
Nintendo is lacking in 3D high quality visuals and FPS, they don't have any truly competitive games except for smash and the smash community has to constantly work around Nintendos hardware. Nintendo is only successful because of their art style and family friendly brands, it has nothing to do with pixel quality, they are often the worst. Shipping games that run better on a smartphone for $60 is predatory af. Switch games look fine handheld but hook it up to a 4k TV and any 3D game is gonna look like complete dogshit, especially 3rd party ports like doom eternal or witcher 3. Yes it's technically impressive running on a android tablet designed in 2017 but I get better performance emulating it on my phone or even Xbox cloud streaming
Does it though? Or is this another "feels" argument. People always talk about devs and artists being overworked even if there's not even any proof of it, it's a lazy argument that says very little.
i think the main polarizing point of upscaling is that instead of being a free bonus performance or clarity boost developers have started using it as a replacement for optimization to ship games out faster and cheaper, and at that point you still need high end hardware to run it well which kind of defeats half of the point of upscaling
"developers have started using it as a replacement for optimization" I've seen this statement in some form or another for probably 3 years now and I still have not seen any proof or evidence to back it up. People have even come up with elaborate conspiracy theories where developers have conspired with Nvidia to make their games run worse to make DLSS seem more valuable. So lucio-ohs8828, do you have any proof whatsoever or are you making shit up and talking out your ass like everyone else?
@@kiri101 Please dont excuse the generic slop that is gaming nowadays, its clear that devs dont have any talent anymore. Its all just talentless UE or Unity devs making bad games and using AI so that they are at least running lmao Star Wars Battlefront 2 came out 6 years ago and ran fine on a 1070 or 1080 and looks much better than just about every game nowadays. Perhaps hiring actually talented people that care about their product or having a custom engine isn't such a bad idea? Memory is very cheap, GPUs are not.
for gaming i agree, a crisp 4k image is enough (that's something you don't get with upscalers or the post shader salad of current games btw). But i'd argue that for desktop use, 8k is useful. i already use my 4k screen as a "canvas" of sort, everything windowed, nothing full screen, and i drag stuff around. i'd love to have a 90" 8k screen so i can use any small portion of it in 1440~ 4k and be left with a ton of extra useful canvas space. beats having a ton of monitors for this same task.
Same, like there is definitely a physical limit to how sharp we can see, like we know that eagles can see things at a distance that just aren't visible in any detail to a human. I don't know what that limit is but if 8K has no noticable clearer sharpness than 4K at a reasonable viewing distance then I really don't see the point except for screens that are meant to be looked at partially. The money spent on 8k hardware, could then be spent for higher refresh rate/HDR/OLED, whatever else gives better image quality. Also, just my personal preference. I recently finally upgraded from 1080p to 1440p, and while it was a clear improvement, it wasn't as much as I thought it would be. I will definitly upgrade to 4K at some point, but my expectations aren't very high about 4K so it might still be another 5-10 years until I do the upgrade.
I've always appreciated your videos on tech and specs (including resolutions) because it's always felt like you were hopping on the next big thing before anyone else even acknowledged its future widespread use. I remember a world a bit more than 10 years ago when the majority view on the internet was that 4K was useless because we had hit diminishing returns at 1440p and wouldn't benefit from more pixels on monitors that sit so close to our eyes. Now the same is being said about 8K while quality 4K monitors can be bought on Amazon for 250£ and have thousands of glowing reviews. It's a uniquely human thing to feel so pedantically correct about "tech advances aren't worth it anymore! we need other transcendental innovations!"... and to be so harshly proven wrong when the inevitable happens. Great video!
I mean, I still think so. 4k is still a gimmick and almost no content is consumed at that resolution and gains no real benefits from it. Objectively you can in most cases not tell a difference on a tv with those distances, and a computer monitor gains so little benefits that a 4x performance penalty cannot be justified in any way.
One of the issues I still se in "gaming" monitor reviews is that they point out that the increased pixel count is useless while gaming, if so maybe use it while doing other stuff. Personally I prefer 4k gaming even at 27 inches and I look forward to the time when I get my first 8k screen. I hope it could be have better colours as than my 4k smartphone.
1440p on a phone is a gimmick. I've had a phone with 1440p for a while, but I lowered the resolution to 1080p and noticed no difference. At 400ppi already, what's the point of even higher density? It just drains your battery faster and the only way to notice a difference is with a microscope.
Yeah, that's why I still game with a 1060 @ capped 30fps. No need for the extra frames - I don't notice any difference. 30 frames is enough. The only way to tell the difference is by scouring a video frame-by-frame.
30fps on sample and hold displays is really quite significantly poorer. BFI helps a lot here because your eyes have their own motion blur which blurs the frames together significantly unless you have black between them so you can see the frame without the next frame blurring into it as much. Movies have real motion blur to compensate for this effect so it doesn't look unnatural @@Dancyspartan
While it is eventually possible that the 8K would be a realistic option on screen monitors. One is personally more interested in the possibilities of 8k VR. With double 8k strapped to your eyeballs, it would make an unprecedentedly clear viewing experience.
sadly, although i think it does have a purpose, and thank you for pointing this out, the LOD changing on models will appear a lot more obvious at 8k. at 4k playing skyrim with max object fade its still noticable at ~40 inches
@@2kliksphilip just googled unreal 5 nanite cloud and it looks very cool, hopefully newer games hold up to the future! i wonder if older games can handle 8k with your rtx 4090 and how they handle the higher res, if theres noticable popin/LOD switching? could be an interesting test :)
@@bebobo1 there seems to be a hard limit before it switches LOD models. i think you can actually regenerate distant LOD for skyrim using external tools. i remember doing something similar for getting the map to work with roads+seasons mods, something like xlodgen iirc?
It is prettty wild to me that it isn't supported directly more often. It isn't a very well-known idea because you need to set it up in such roundabout ways.
I actually think we will never get there. Just like phone batteries get bigger (denser) and CPUs more efficient but every phone has "a day and a bit" battery life, our graphics cards will get more powerful, but games will have more polygons or shaders or textures or in case they can't add anything else, more and more rays being traced and noise being compensated. So 8k high quality will unfortunately not happen I think. But still, if we get to 4.5k scaled down, 5k, 6k... people will hopefully see the difference. I had, seven years ago, one of the first GPUs marketed as "true 4k card", even back then, without a 4k monitor, the anti-aliasing of running at a higher res and downscaling was just something that I couldn't get over. And now I always have to spend much more than I'd "need" on my PCs, just because of this.
This is a pretty good answer. I don't think that we'll "never get there", but like you're pointing out, everything is going to scale the same to the point that we'll hardly see a benefit. Batteries get better and CPUs get faster and more efficient, and developers are like "It's free real estate," and never bother to optimize. Sure, someone might want an 8k gaming monitor, but do they want to pay for the video card to run that monitor?
I personally have made the experience that using DLSS or other AI upscalers beyond native to essentially do superresolution but with AI upscaling usually just means you are importing the upscaling artifacts these produce into your native resolution rendering which was previously pristine. This is of course also just a tradeoff you can make depending on your preference. Just wanted to mention that this is not simply better than native most of the time. There are visible drawbacks that I at least notice without actively trying to pay attention to it. To be a bit more concrete, the visual artifact I always find annoying is how distant objects that have a high-contrast transition to a background will leave obvious and dark ghosting artifacts in motion, usually caused by camera motion. Great video btw. Thanks philip!
I used to think that 4k was pointless compared to 1440p, but now I realize my real beef is with TV manufacturers, game consoles, and to a lesser extent monitor manufacturers who force us into exponential upgrades without offering adequate options in the middle. I always thought that consoles and TV manufacturers targeting 4k without any love for 1440p was stupid. I think it's stupid that there's not a lot of 1800p monitor options. I think that it's incredibly stupid that TV manufacturers are going right to 8k, and it's especially stupid how the consoles will fall right in line, and likely won't support monitors that will inevitably be created in-between.
It's a shame 6k doesnt come to tv. Imax films are shown in 6k at the high end theaters. It could easily be transferable to tvs. It's the reason why 4k exists in the first place
@@jwhi419yeah, 4k is a mere 8 megapixels and 6k is 20 megapixels, the sweet spot for most cameras too. 8k is a whopping 36 megapixels being rendered on screen. And you need to be pretty close to get the extra detail.
4:20 Yes but the "native" here is still using taa, with is why I dont like upscaling at all, I personally dont like the blur taa gives. And would rather use not aa at all, even with all the crunchy pixels, hell at 1440p and higher aa isnt "really" needed (especially at 8k wink wink). And if proformance is needed I would rather drop resolution and use something like smaa, then use upscaling. I wish modern games kept clear of taa. r/fucktaa has some good info.
AA is definitely still needed at 4k for _most_ people. You might not mind that unstable raw look, but there’s a reason TAA became so ubiquitous: most people do. Besides the fact that modern and future games look very different to old games without AA, as the more complex your geometry and shading are, the more unstable the image is going to look without AA. In order for geometry and shading to advance, some sort of AA that can solve temporal instability is necessary.
Always Disabling AA gang here. Yes, I would much rather have jaggies (which are not visible on 1440p unless you stay still and do one of those "zoom ins" on the screen, which I'm not doing because I'm playing the game) and a higher framerate than a lower framerate and a blurry mess of an image full of ghosting.
@@Mankepanke I wonder how much of this is a generational thing. I'm in my mid thirties and among people in that age bracket it's not uncommon to not care about aliasing, even if on the more glaring side. It's like my brain doesn't register it as a problem.
SMAA sucks for vegetation so its as a method of AA its pretty much useless on games with lots of vegetation. DLAA is clearer than TAA so try that. DLAA/TAA being temporal methods allows games to run certain effects at lower samples to optimise performance. Volumetric lighting can be ran at 1/4 res and TAA/DLAA can be used to makes this a complete image with little artifacting. being able to save 1.1ms on a render of each frame can mean the difference of going from 55 to 60fps. similar methods are used in screen space reflections.
8K is objectively better than 4K and you can see the difference under specific circumstances. However due to diminishing returns and extreme hardware requirements I foresee its adoption outside TVs being very slow. According to Steam Hardware Survey over 50% of participating Steam users have a primary display resolution of 1080p or lower and 4K adoption is at less than 10%. To me this says that we're years away from 8K gaming being anything more than a curiosity for most people especially as any resolution is a moving target when it comes to performance (the price to performance stagnation in the GPU market doesn't help either). Another issue to consider is that monitors last a long time. It is not uncommon for people to keep using the same monitor even if they completely upgraded their PC in the meantime. This likely contributes to the slow adoption of higher resolutions.
In modern games medium, and often low settings look actually quite similar to high/ultra in many cases (as you pointed out in another video). Honestly, in demanding games I put everything on medium or low (or look at "optimized settings and then continue to pair back things) and play as close to native 4K as my aging 6700XT will allow.
Totally disagree on upscaling. DLSS has a known input lag problem, and taa which smears the image when in motion. I can't play with upscalers without noticing these problems
@@DraxilSpada-vc2wg oh yea ok that makes alot of sense, I was thinking like an old 19' monitor or something which would be terrible to look at🤣 On a 7' that res would be perfect and i'm sure it makes games and other programs run alot easier on your pc.
@@Sk0die The hilarious part is I use it with an RTX 4090, it allows me to crank things up to stupid amounts, or just run my PC without burning a figurative hole in the motherboard.
@@DraxilSpada-vc2wg you are experiencing a level of bottleneck I didn't even know was possible. please tell me you have another display you use as well? I'm fairly certain my ryzen APU could run any game in existence on high settings with that display, I think the 4090 could be a tad overkill😂
I'll just leave a little shout out for tiny little 8K screens. We need them! VR with 8K per eye and super bright HDR goated OLED would be amazing. Foveated rendering takes care of not having GPU to drive all those pixels and 8K gives you amazing clarity that has to be getting close to what our eyes can even deal with. I'm no vision to pixel expert but I do notice that in VR, the same resolution looks so much worse than it does on a monitor a few feet away from me. So lets get on with it and get to 8K! It can only help VR heads wanting that to trickle down to little screens for their faces.
It's all about the detail and aliasing, not the literal pixel size. Supersampling to 8k could provide perfect image quality, but an 8k display just pushes those issues down the road.
my internet's down, i have no idea when it will be back, so i'm using a mobile hotspot to watch this video. i'm watching a video on 8K resolution in 144p
i play all my games at 4K on a 1080p monitor because it just looks that good. i imagine playing at 8K on a 4K monitor would be just as amazing of a jump in image quality. as more 8K monitors start to appear, i'll be happy to get a 4K monitor for cheap
1:45 Is it odd though? Because medium graphics take away a LOT of detail. Sure, most agree that ULTRA settings are overkill and don't do much, but HIGH graphics are the sweetspot. Medium is definitely a downgrade, even at 4K. It's easy to see in the GTA V gameplay you provided.
yea many games have a big fidelity stepdown from high to medium than from ultra to high. High settings + high resolution is the best combo visuals. I'd take 4k high over 1440p ultra any day.
I did find it interesting that he used GTAV as the example, because in most modern games I would agree that you can drop the graphic settings quite a bit before it becomes very noticeable, but GTAV has a very obvious drop from high to "medium" (which is actually the lowest setting available). Medium is basically the xbox360/ps3 graphic settings.
For 7 years i gamed on 1360x768p on a 19 inch tv. It wasn't until 2024 that i finally upgraded. I got a used 24 inch 1080p monitor from craigslist and it's wonderful. I honestly don't think i'd ever want 4k gaming or above, 1080p at this size and distance looks perfectly fine for me.
I think more importantly than explaining why 8k might be worth it, you said that you look forward to the future, without it letting you detract from what you enjoy in the present. I think people are motivated to upgrade not just because "it's new", but also because it means that what they currently have "is old".
I remember Monster Hunter Rise on PC having broken TAA (literally didn't work at all, dunno if it got fixed) and I used my extra GPU headroom to run at 6K. It downsampled to 4K using DLDSR and the image quality was superb ❤
Philip digs deep into things I have no access to or interest in. I like Philip for that (and more). Keeps me more open-minded and helps me embrace the future rather than shut down with only the things I'm familiar and content with, while automatically rejecting everything new. Thanks for the reality check, Philip. I'm still rocking my GTX 1080 on a 1080p screen anyway because I don't care enough to invest into an upgrade, but thanks nonetheless.
Buying a 27 inch 4K monitor was the best gaming decision I ever made, despite the internet saying 4K "isn't worth it at that size". HA! That showed them!
Definitely not too much resolution. It's just a matter of preference whether you focus on spatial or temporal resolution, as you currently can't reasonably have both. For productivity I would prefer even more. 27" 5K or 32" 6K. Sadly there are not a lot of options, and they are all non gaming displays and very expensive.
Fantastic video, the combination of DLSS with DLDSR has such a profound effect on image quality it's kind of ridiculous, games like RDR2 still look extremely blurry at 4K whether it be with DLSS quality (even with the latest .dll file) or at native, but when DLSS is combined with DLDSR the game is transformed as far as clarity goes, most of this clarity is coming from DLDSR as obviously it's rending at a higher res than your monitor, but it really seems to be the AI component here that cleans the image up to a staggering degree, these technologies along with RTX HDR have really enriched my PC gaming experience, great video!
Finally someone that said this... I went as far as to buy pro display XDR specifically for the 6k res at 32" (tho the rest of the features are also very welcome - like bye, bye IPS glow) and the difference is SO CLEAR compared to 4k - I have a freaking astigmatism btw. People pretending anything above 4k being useless always was a mistery to me - it has to be just repeating tech reviewers who in turn have a agenda - no sane person would dismiss 8k as it is dismissed by majority of people if they actually saw the difference. You might not care about it, accept what you have, but to state for e.g. that 8k won't come and 4k is the "last resolution" and we'll go with fidelity only from now on is just stupid.
People back in the day kept saying there is no point of HD resolution and Blu Ray. Now nobody wants to touch a 480p video with a stick, or even 720p. We'll get to 8K, but perhaps we have to increase the refresh rate first, 4K 120 > 8K 60 imo.
Ah so its one of those cycles. I didnt know that, but it doesnt surprise me. Eventually people will probably say 1080p is unusable and 1440p turns into the new 1080p (as in the bare minimum like it is now), and then 2k and so on and so on.
There are always going to be people who are fine with 720p and 30 FPS. Look at the new Zelda game that came out last week. To me the jaggies and smudged shadows are very distracting though.
@@winterhell2002 True but thats on a smaller screen as well so its not as noticable. If it was 1080p on a 21inch or 45 inch screen it would be way more noticeable. The switch upscales though when docked and connected to an external screen
Instead of higher fidelity. I wish more games would focus on physics. Destructible buildings and stuff. And just be able to interact with everything. Hell Let Loose is perfect in many ways. Except performance and absolutely everything is static in that game… Imagine a Hell Let Loose with fully destructible buildings
You're doing an apples to oranges comparison here. The common complaint is that a 4k display (at it's intended viewing distance), is indistinguishable from an 8k display, is not addressed here. Rendering games at higher resolutions, such as with some anti-aliasing techniques will produce a better image no doubt, but buying a higher resolution display won't help. Now, there are exceptions, such as split-screen gaming, or displays which have abnormally close viewing distances (like those found in VR headsets), where 8k is desirable, but those are still exceptions. I have to stress, that this all hinges upon you using your display at a reasonable viewing distance - too close and you'll get eye fatigue, too far and you won't even make out the difference between 1080p and 4k. Perhaps the biggest takeaway I find in your video is the fact that you didn't even compare a real 8K display to a 4K one.
@@2kliksphilip Yes, but you're viewing them from the same distance. In actual use, you should keep the 32" display farther from your head than the 16" one.
@@2kliksphilip I'm not sure even this gets to the core point though - the suggestion is that 4k content on the 16" 4k display will probably look identical to 8k content downsampled to 4k on the 32" 4k display at the same viewing distance. Downsampling from higher resolutions is literally the most primitive kind of anti-aliasing, but it's the most accurate and produces the least artifacts. If it's the only benefit of rendering at 8k then 8k monitors probably are just a fad, and we're really saying we just want better AA at 4k...
You could be right, and I might be totally misunderstanding you. The question for me is whether 8k content displayed on a 32" 8k monitor looks any sharper than the same 8k content downsampled and displayed on a 32" 4k monitor. Not sure if that kind of research has even been done though, so who's to say. But I feel like we're getting to the point where the anti aliasing is happening in our eyes at this kind of pixel density
@@2kliksphilip I actually bought a 8k samsung 55 inch because i was shopping for a 4k display and seen the exact same display i was gonna purchse in 8k on sale for the exact same price so i was like fuck it. The pixel density a 4k 55 inch tv is only 80 but at 8k it is 160 its very noticeable imo. I been gaming on it for a year and a half and it did not disappoint. I do not regret the decision i dont play many new games in 8k, But i now play all of my older games at 8k stuff pre 2015 gets 60 fps + on my 4070. Even older games like left 4 dead 2 i get 120 fps+ even with 200+ mods enable. Even some newer games like bf1 at max graphics i get about 55fps which isnt great for a fps but bf1 at 8k is the most insanely beautiful piece of gaming media i've even seen. Its actually pretty great and it does a lot more for games than most people would like to expect. at 8k 0 anti aliasing is required and objects in far distances become so clear especially on a 55 inch. Things like text or any small details effects like fire, flying debris, bugs, smoke become so clear. Things like power lines are literally a LINE with 0 stair stepping. im not denying these things are a "nuance" but they are quite beautiful to see in person. if u are obsessed with graphics imo 8k does something no raytracing or shader could ever do it provides clarity that is unimaginable.
Went to curry's to check the new 8k tv's and was shocked that they are displayed with soft upscaled videos and "looked" worse than the 4k tv next to it
I did a couple of calculations and with the same angular size of the pixel that suits me, an 8k monitor will be 86 inches diagonally and will occupy 120 degrees of horizontal vision, does this make sense? this is madness
good point, most vr games already suck ass on good graphics, meaning the bar is set low for the player base anyway, wouldnt hurt if u played another game on low quality graphics but with 8k
Touching on the final point you mentioned in the video about wishing people would be more excited for future technology and new standards, I feel a large reason for the widespread dislike of that kind of stuff has been heavily influenced by the economy, and people's ability to actually go out and buy new tech these days without spending more than they can really afford. I feel if the economy were to improve and people had more spending power we'd see a large shift in how people see new technology.
What I'm gonna look at in 10 or however many years time is what games I play right now will even recognize that 8K exists (in the game's settings menu, not a config file). I know devs are a lot more aware of higher resolution gaming but it's gonna be interesting, especially with the increased prevalence of indie games.
I was looking at the office on a 27 inch 1440p screen 1.5 meters away and was blown away that i could read the text on the book shelves. Idk why but it added a lot... FYI; Normally I watch from 2.5 meters away on a 65 inch 4K OLED. I tried moving drom 3.5 m up to 0.5m, but I just can't read it. Maybe it was created in 1080p and i am gaslighting myself, but I think 8K might actually make a difference for me. (If movies were 8k :p) I have had a 4k oled tv for 8 years now, so maybe I just got very used to it.
I remember choosing 1280*1024 Low in Oblivion over 800*600 High, mostly because the native resolution of my TFT display. Loved Quake III in 1600*1200 on a CRT which was considered a super high resolution. I think 8k is beautiful. I hope one day I'll be able to afford it. Now I'm stuck with 1440P on a 3080 on a 240Hz display. We just can't have everything. :( Great video!
The optimal distance chart at 2:45 is much more relevant for media consumption than for gaming. As an ex-eyefinity user, it's okay to not have the whole screen in the focused view and have it fill up my peripheral vision.
I agree with all points made. I would like to add though, that for the leap to 8k-quality to not kill storage space, development companies would need to optimize games, and would need even further to make uncompressed files an option on the side and not the onl way to download their program.
8K does have a purpose. Driving down the price for 4K.
It makes a lot of sense for recording footage.
Being able to zoom in post production is crazy useful. so from a gaming perspective a content creator may get a lot out of it.
@@RusticRonnie 4k already does the trick, usually. You can zoom in on a quarter of the screen and still have a 1080p image.
@@RusticRonnie, not really, 8k is slower to work with on anything but the very, very high end stuff
@Dummigame not if the end product is meant to be in 4K
The thing that drives me batshit crazy is the limited draw distance on objects, vegetation and LOD's even when it's cranked all the way up. So down the road when your computer can crush it you're still limited to seeing pop in.
I could not agree more. A 5800x3d and a 7900xtx should be enough to see the horizon but all i see a line of lods
hmm yes, maybe it does have purpose, thank you for pointing this out
i absolutely hate LOD in games (at least their current implementation). There should be a way to calibrate it so the game checks your resolution and FOV, and using that it goes through the assets you can see and calculates how many pixels it takes up on your screen, and calculates LOD from there. Because currently, you can still see sizable objects that are just blurs or grey blocks until you get extremely close to it which ruins the viewing experience for me.
This is already a problem with old 3d games right now. It likely won't stop being a problem either.
There are game devs right who STILL use framerate to calculate physics and whatnot. The industry hates future-proofing. It's why live-service games can get away with becoming literally unplayable once the central servers get shut down.
Exactly. It's like yeah, I get it, when San Andreas was being made at the time, Rockstar couldn't even fathom the idea of a 4090 graphics card. But with just how powerful our modern hardware is, I should easily be able to have the entire State loaded in at once with all the high LOD models loaded in and have ZERO performance hit
Looks good watching on my phone
It looks good on my smart watch.
Even though they are generally just a bit higher than 1080p or sometimes 1440p, Phones process the Pixel density of the Gods.
hmm yes, maybe it does have purpose, thank you for pointing this out
Hmm, yes...
hmm, yes...
hmm, monke...
@@Nyllsor maybe it does have purpose
combo breaker
When I got my 1080ti I started running all my games at 4K (DSR) on a 1080p display because in older games, render distance and LOD were tied to resolution. Perfect supersampling AA and dramatically less geometry pop in.
This is the way!
I've been playing at 4k for over a decade and have since switched to 8k on 55" Samsung TV 3 years ago and I can't go back
supersampling is criminally underrated, the crispness it can bring to even a simple 1080p screen is insane
@@drunkchey9739 yup, if my 1080p display had better color I would use that over my 1440p display. Especially because 4k DSR at 1440p doesn't look right and 5K is out of my performance tolerance.
DSR is just amazing on 1080p
@@drunkchey9739 really is, I remember downsampling from 8K for my 1080p monitor in some old games. Every line looked absolutely perfect. Crazy that the trend nowadays is to render at less than native res, then upscale and slap TAA on top.
"You don't have to be polarized about literally everything ever"
Woah there buddy, this is the internet we're talking about
4:27 the temporal issues that come with it, TAA is a plague to modern games that implement it badly. oversharpen to compensate, ghosting, blur, dependency for certain effects, etc.
Image ghosting in Satisfactory was quite distracting the last time I played it, with the effect being super smeary, which reminded me of Doom 2016, specifically the scene in Samuel Hayden's Office when he waves his hands around, trails are left behind on his carpet. Out of FSR, DLSS and TSR and plain old TAA, I remember DLSS being the least image disrupting method, though it's still noticeable on the conveyor belts. It made me consider turning off anti-aliasing entirely, though that'll cause issues too since stuff like screen-space contact shadows relies on TAA to smooth it out! So it's become an issue that TAA is relied upon to make some effects work, which is an unfortunate circumstance to be in. I kinda just try to ignore it, since nothing is perfect, so it's a matter of compromises...
Edit, since somehow I couldn't grammar check correctly, and also an elaboration on the scene in Doom 2016.
I'd really love to know Phillip's thoughts on TAA and all the ghosting it produces. I recently bought a 2k screen with a 4070ti super to accompany it. After trying some modern games I felt something was wrong, but couldn't quite put my finger on it. The biggest offender for me was The Finals, I get that it uses RT to update light in destroyed buildings and it would shimmer a lot without some temporal solution, so it forces TAA or TAA-based upscalers as it's only options, leaving out traditional AA or even no AA at all. But I really hate what it does to image clarity especially on longer distances, players and other moving objects become mushy blobs with long trails after them. No matter the settings, I always see those ugly afterimages and it makes my blood boil since that obviously flawed technology is pushed on me with no alternatives present.
I think you're right, but I do very much see the importance of having the option.
Txaa makes some of the best hair rendering I've ever seen, most likely only replicateable by super sampling.
It's the copper of antialiasing solutions, you know? Like how, sure, platinum, gold and silver ARE more conductive, but they're nowhere near as abundant as copper. Hence why they use it in everything.
And when combined with the sharpening filters of fsr and the like, it can make for a solid stopgap.
I actually prefer cranking the sharpness in everything I use on my steam deck, since blurriness bothers me alot more than sheer pixelation.
But that's the key difference, there I have a choice of how my games are scaled, console players don't.
And when the game devs want to make a game run well on console, you used to have to jump through all kinds of hoops and truely work for your frame rates, show some technical wizardry. Now they're gonna be tempted (and sometimes even forced by management) to just slap some fsr on that b, and call it a day.
Yep, It would be a better comparison if the "Native" image was without TAA. I prefer SMAA even if it still has some pixelated parts.
@@qualitycontent1021 I agree, the finals has some of the worst TAA imo
just give me some new form of anti aliasing that doesn't turn image into a ghosthunt
at 8k you don't need anti aliasing
@@322-Dota2 what about crap like cyberpunk 2077 which has built in taa and if you disable it through config game just breaks because game was built on it?
@@322-Dota2 While you might not feel a need for AA at 8K, there will still be artifacts. To truly not need AA anymore, you need insanly high resolutions. If you don't believe me, go to blur busters ufo test page, select the aliasing visibility test and measure how far from the screen you need to be to not see any artifacts any more. Then calculate the needed resolution for a normal viewing distance from that.
CMAA is hands-down the best cheap AA, unfortunately very few games use it: Yakuza 3-5 remasters, FFXVI, CS2 and that's about it
@@B.E.3.R I've never even heard of it until now, and I've even booted up CS2 more than once (to test whether my potato can run it comfortably, or at all in the case of the Linux install).
Personally, while I see the value in 8k, I definitely am more pleased with the current refresh rate arms race going on in the monitor space currently. The benefit of 8k relative to the performance cost is definitely very high, and while it is not without value, I personally think that chasing higher refresh rates (240hz/360hz/480hz) is the more beneficial solution. We are still very far away from hitting truly diminishing returns when it comes to responsiveness (okay, maybe this one not so much), motion smoothness, and *especially* motion clarity.
Low MPRT is absolutely critical to having a truly sharp gaming experience IMO and goes woefully underdiscussed compared to cranking the resolution. What is the point of rendering such an extreme amount of video information when the vast majority of it ends up smeared and indecipherable simply because our eyes don't mesh well with how sample and hold displays display the image? It takes a performance dump of already potentially *questionable* value and makes it even more situationally beneficial, further diminishing the appeal in running such high resolutions.
I sacrificed colour accuracy and brightness for a different IPS monitor of the same resolution, but upgraded from 60Hz to 165Hz. Oh my god it was so worth it.
TBH Frame gen is the way to go for motion clarity, at 300+ fps you already have enough responsiveness, and since frame gen has less latency and less artifacts at higher fps. The downsides would be very small. The only limitation is that the optical flow accelerators on current cards weren't made for 300>1000fps. Frame gen can be seen as an enhanced BFI, where instead of flashing a black frame to increase motion clarity, you flash an interpolated frame.
Why not both? I have a 4k240 display. Target of equivalent motion clarity to 1000fps seems to be the end game, although BFI could give that at much lower refresh rates. Still, I don't play games with extremely high motion. Anything in 3 figures fps is sufficient for my needs, yet my resolution itch isn't satisfied and I'd like more options beyond 4k. Options exist going both ways for whatever is preferred and given long enough I'm sure they'll converge.
@@snowwsquire Interpolation is definitely the way forward but, yeah, you do need a notably high starting FPS to overcome the shortcomings of modern interpolation. The optical flow processor stuff is mostly marketing however, applications like Lossless Scaling can already do 4x interpolation even on GPUs without specialized hardware and it isn't reliant on any actual game data, it just captures the game window and interpolates it like a video and it does it very well, with about as little latency as is possible for that style of interpolation and pretty decent quality as well. Though obviously, first party solutions from companies like Nvidia where the game feeds behind the scene data does yield higher quality output. It's also quite a bit better than black frame insertion on account of not only having no brightness hit (making HDR feasible) it also provides increases in motion smoothness that BFI fails to achieve.
@@TeeBeeDee you say it’s mostly marketing and then detail how it’s higher quality, and the bigger part is that it’s not run on the shaders.
no thank you i can only see 40 pixels across
-burger40
you forgot the 6
Dude lying about his identity thinking we won't know about his 6 shm
@@roborogue_ interestingly in his icon it faintly says "burger40", no 6.
@@korakys looks like it says "burger10"
@@korakys also his username is in fact burger40, it's just the unique tag that shows in comments that's burger406
Imagine working hard on making a decent video, being eager to hear your viewer's feedback only to be bombarded with a reddit tier comment chain spamming "hmm yes, maybe it does have purpose, thank you for pointing this out"
hmm yes, maybe it does have purpose, thank you for pointing this out
hmm yes, maybe it does have purpose, thank you for pointing this out
hmm yes, maybe it does have purpose, thank you for pointing this out
hmm yes, maybe it does have purpose,thank you for pointing this out.
hmm yes, maybe it does have purpose, thank you for pointing this out
The biggest use case for 8K is headsets, VR or otherwise.
That said I still think gaming at 8K (outside of a headset) is dumb and likely will be for a very long time. The diminishing returns are pretty extreme and there are lots of other metrics worth boosting before that.
I also don't think downscaling from 8K counts as gaming at 8K. Ultimately what you see is still at 4K and only requires a 4K screen.
And 5K and 6K aren't 8K. The returns diminish exponentially and it's rare to see people arguing 5K is too much for example.
Thanks for the nuanced take. Most of the time people claim that there is no advantage to 8K, while, as you said, there is one, it's just not even remotely worth it for now. If you are talking native 8K I fully agree that it will be a bad idea for a long time. The thing is though, upscaling is getting better and better, so gaming on an 8K display could become reasonable quite a bit faster. Currently even just the upscaling cost to 8K isn't worth it, but two GPU gens in the future that will probably not be a noticable performance impact anymore.
For completly matching peak human vision, you'd need 2400ppd. I tried to measure that for myself and ended up with >1700ppd instead, which is >86K on a 32" at 80cm distance. In realistic use cases it will be indistinguishable from a 16K screen and even that is very far into diminishing returns.
Personally, I would prefer 6K, as the benefit of 8K+ is too little and I would rather have the resources spent on refresh rate and color depth. But I expect the industry to push for 8K instead.
Im pretty sure average ppd on the best eyes is lower than 500ppd. Maybe there is a study that says a focal point can see that high. So in that case just focus 1440p into that area. But the rest of the screen would realistically never need to cross 16k. Or 15360p. Regardless of size.
Apple "retina" is 120ppd. And double that is probably goodenough@@davidruppelt
It's true. Downscaling isn't the same as native at all. It's the same as watching a video in 720p that was made at 4k. It's a huge upgrade in image quality versus a video made at 720p. But at the end of the day the clarity isn't anything more than 720p. Theortically even if the bitrate was the same. It would still apply that you can not zoom in (or bring your head closer) to actually see detail.
It's the same story with upscalers. Atleast for the current ones. You can test by turning off forced taa/upscaling. You'll see that you don't actually lose any detail. With upscaling a blade of grass can be ultra sharp. Without it you'll see the 16 raw pixels. But if you turn up the real resolution of that to 64 pixels. You would actually be able to see the veins and stuff on piece of grass. Yet you would still see squares on the squares so you have people who still call it inferior to the ultra sharp perfect circle.
@@jwhi419 I have the 2400ppd from the "NVIDIA Automotive Screen Density Calculator" where they claim that to be "approx. limit of high contrast feature detection". I don't think there is a practical benefit of a screen that capable over a 300ppd screen with AA, but if you truly want a resolution that under no circumstances needs AA, than 2400ppd it is.
300ppd would be "20/4 vision; approx. limit of alignment detection (hyperacuity)" and is equivalent to a 32" 16K screen at 80cm distance. That should in my opinion be the endgoal. Apples 120ppd are "20/10 vision; practical upper limit of visual acuity". That is a more realistic goal for now and would probably be good enough for me, but would need AA to avoid the edge cases where it isn't. I did a test with blur busters "Worst-Case Aliasing Visibility Test" and there I could still see the artifacts from a distance equivalent to 1700ppd. There may very well be some error in my test design, but at least I believe the conclusion to be valid, that 120ppd are not enough for worst case scenarios.
@@davidruppelt i see. However i do think that if the tech to create a 32 inch 16k display exist. Then the tech to do the same in vr with a headset weighing 200grams would probably exist too.
Well the chief of blur busters has talked about an end goal display technology in his forum before. The Holo deck from star trek. I do not recall the details but i think he ignores pixels at such a high resolution. Of course his deal is more about the motion clarity. Since just like you noted that you would need more than 1000ppd for the line test. You would need more than 10,000hz to move each of those pixels visibily to a human.
Obviously when people say 8k is a gimmick, they are talking about the here and now. The hardwarerequirements and costs are just not worth it.
Things might change in 10 to 20 years, as they always do, but thats not a revolutionary idea.
That's not obvious, many people say resolutions higher than 4k are _completely_ worthless due to the limits of human vision. This is already visible on smaller devices, 4k laptops aren't much sharper than 1440p ones and I'll never need a 4k phone even in 20 years. I agree with philip that the limit for monitors/TVs is a bit higher, I wish there were more 5-6k options, but there's a limit there too and 8k will likely be the last increase. VR headsets will go even higher but once we have better foveation and reprojection it might be a bit meaningless to talk about native resolution/refresh rate
@@speedstyle.
laptop displays above 1080p have demonstrated how useless they are for more than 10 years now.
but still they keep being produced despite the fact that you have to sit 2 inches away to even tell.
intel didn't even allow unfiltered scaling from 4k to 1080 until 2019, and every laptop before that is arbitrarily not allowed to use the objectively simplest method of rescaling in existence.
and then they pretended it is an actual feature that took effort to implement when they so graciously decided that the consumer worms were allowed to use it.
@@speedstyle. i had a 1440p phone in 2014. all my phones since have had lower resolution because that's just useless in a phone. my current phone is considerably larger, but it's something like 2300x1080, a much lower resolution (also ultrawide) and it's still as sharp as i'll ever need.
"You don't have to be polarized about literally everything ever"
This quote applies to so much rn, thank you
in the year 2027 i shall look forward to viewing your nostril hair in 8k at max settings
I don't find 4K that much better than 1440p personaly already, but I still can understand that others enjoy higher resolutions. It's just not that interesting to me. Of course higher resolution is better, but framerates and a smooth visual are also important.
Still, in the future 4K and even 8K will become that "smooth framerates and clarity" sweetspot. I also have a 1440p screen and am very happy with it, yet I can't wait for 4K to become as easy to run as 1080p nowadays, I love replaying favourite games whenever I get a higher resolution screen, just so I can experience detail I wouldn't have otherwise.
Same here. I first went from full HD to 4K, which was a huge difference. Much later I got a 1440p high refresh rate screen and the difference in sharpness is unnoticable in desktop/text sharpness. Maybe for something like Escape from Tarkov it would be nice for spotting stuff in the distance, but 4K high refresh rate is prohibitively expensive both in processing power and screen price. No way I'm paying NVIDIA the "we price for the elite" cost that they have imposed on newer RTX cards. High end AMD is already expensive enough.
It starts to looks better on TV's IMO (55 inch and higher). On computer monitors it's just meh.
I definitely noticed 1440p to 4k.
I don’t notice 1800p to 4k.
Its like how 144hz vs 240 is noticeable.
240 to 360 is harder.
Its just pass the barrier.
Though 500hz is approximately the frame rate an untrained eye will notice.
I actually still see a huge difference because I jump recently from 22 inch 1080P to 27inch 1440P and it's not much better even ppi is just a bit better than before but not that much.
On my old 43inch 4K TV even at 1 meter looked so much more crazy because it wasn't just resolution, just dropping slightly the resolution on it the game was still very defined but lost a lot of brightness/colour/contrast because beside resolution there's also massive more colour information that makes a lot more changing in between stuff.
Problem is now some games I used to play fine at 1080P now I sadly have to play as vaseline in my new 1440P monitor cause I lack hardware with my Strix RX6700XT 🫣
Ps: I wish I had a 27 inch 4K monitor, that would make me very happy but I would sadly lack hardware to push such high res, but later on a PC upgrade even upscaling works much better for 4K than on a 1440P or 1080P monitor since like FSR quality will start from a higher resolution in 1st place.
And what about VR too, we are already almost approaching 8k with the apple vision pro. Regardless of the success of the device, the screen resolution is something that is praised when compared to other VR headsets.
hmm yes maybe it does have purpose, thank you for pointing this out
...in 2030 with a RTX 8090
2030 is just 5 years and 3 months away. If I'm lucky I'll own a 4K 240 Hz monitor by then
I feel like you meant this as a joke but yeah, in 5 years with a decent card it will be a normal option....
I started playing games in 4k in 2017 with a gtx 1070. I could play games in 8k with my current GPU if i wanted to.
@@OttoLP personally in 5 years I fully expect I will only just have adopted 4k, but I'm also not buying anything that could be construed as high end lol. I just can't bear to spend that much.
But yeah, just as 4k seemed very silly and wasteful, but is now seeming more and more viable... So too will 8k.
I expect we are more than 5 years away from 8k reaching mainstream, but I don't doubt that we will start seeing it again by 2030.
@@pancakerizer Hahahaha, yeah that would be crazy playing games at 4k 240fps like it is nothing. Though I doubt they will improve on path tracing performance as much in 2 generations, it might get 2-4 times faster at most I guess. Fun thought though :D
@@OttoLPhonestly going off of 4K (which never took market dominance)
It would be about 8 years from now for people to start adopting it in mass. Because high frame rate monitors will lag behind by about that much
Pixel quality > pixel count; furthermore, art direction > fidelity
Chasing such high technical standards has game developers and artists overworked for diminishing visual returns, install sizes that occupy 1/4 of a console's hard drive, and products that often need to be smothered in layers of AI and TAA vaseline in order to run at a stable frame rate at all. Meanwhile, Nintendo still makes billions shipping games at 720-1080p, while still earning praise for visuals
Nintendo is lacking in 3D high quality visuals and FPS, they don't have any truly competitive games except for smash and the smash community has to constantly work around Nintendos hardware. Nintendo is only successful because of their art style and family friendly brands, it has nothing to do with pixel quality, they are often the worst. Shipping games that run better on a smartphone for $60 is predatory af. Switch games look fine handheld but hook it up to a 4k TV and any 3D game is gonna look like complete dogshit, especially 3rd party ports like doom eternal or witcher 3. Yes it's technically impressive running on a android tablet designed in 2017 but I get better performance emulating it on my phone or even Xbox cloud streaming
Does it though? Or is this another "feels" argument. People always talk about devs and artists being overworked even if there's not even any proof of it, it's a lazy argument that says very little.
i think the main polarizing point of upscaling is that instead of being a free bonus performance or clarity boost developers have started using it as a replacement for optimization to ship games out faster and cheaper, and at that point you still need high end hardware to run it well which kind of defeats half of the point of upscaling
Work expands to fill available resources. Just like how applications use crazy amounts of memory nowadays.
hmm yes, maybe it does have purpose, thank you for pointing this out
Sometimes it's a lack of optimization, sometimes its just a harder problem to solve
"developers have started using it as a replacement for optimization"
I've seen this statement in some form or another for probably 3 years now and I still have not seen any proof or evidence to back it up. People have even come up with elaborate conspiracy theories where developers have conspired with Nvidia to make their games run worse to make DLSS seem more valuable.
So lucio-ohs8828, do you have any proof whatsoever or are you making shit up and talking out your ass like everyone else?
@@kiri101
Please dont excuse the generic slop that is gaming nowadays, its clear that devs dont have any talent anymore. Its all just talentless UE or Unity devs making bad games and using AI so that they are at least running lmao
Star Wars Battlefront 2 came out 6 years ago and ran fine on a 1070 or 1080 and looks much better than just about every game nowadays. Perhaps hiring actually talented people that care about their product or having a custom engine isn't such a bad idea?
Memory is very cheap, GPUs are not.
I still think it doesn't have a purpose, but I appreciate you arguing the opposite view
for gaming i agree, a crisp 4k image is enough (that's something you don't get with upscalers or the post shader salad of current games btw).
But i'd argue that for desktop use, 8k is useful. i already use my 4k screen as a "canvas" of sort, everything windowed, nothing full screen, and i drag stuff around. i'd love to have a 90" 8k screen so i can use any small portion of it in 1440~ 4k and be left with a ton of extra useful canvas space. beats having a ton of monitors for this same task.
Same, like there is definitely a physical limit to how sharp we can see, like we know that eagles can see things at a distance that just aren't visible in any detail to a human. I don't know what that limit is but if 8K has no noticable clearer sharpness than 4K at a reasonable viewing distance then I really don't see the point except for screens that are meant to be looked at partially. The money spent on 8k hardware, could then be spent for higher refresh rate/HDR/OLED, whatever else gives better image quality.
Also, just my personal preference. I recently finally upgraded from 1080p to 1440p, and while it was a clear improvement, it wasn't as much as I thought it would be. I will definitly upgrade to 4K at some point, but my expectations aren't very high about 4K so it might still be another 5-10 years until I do the upgrade.
I've always appreciated your videos on tech and specs (including resolutions) because it's always felt like you were hopping on the next big thing before anyone else even acknowledged its future widespread use. I remember a world a bit more than 10 years ago when the majority view on the internet was that 4K was useless because we had hit diminishing returns at 1440p and wouldn't benefit from more pixels on monitors that sit so close to our eyes. Now the same is being said about 8K while quality 4K monitors can be bought on Amazon for 250£ and have thousands of glowing reviews. It's a uniquely human thing to feel so pedantically correct about "tech advances aren't worth it anymore! we need other transcendental innovations!"... and to be so harshly proven wrong when the inevitable happens. Great video!
I mean, I still think so. 4k is still a gimmick and almost no content is consumed at that resolution and gains no real benefits from it.
Objectively you can in most cases not tell a difference on a tv with those distances, and a computer monitor gains so little benefits that a 4x performance penalty cannot be justified in any way.
One of the issues I still se in "gaming" monitor reviews is that they point out that the increased pixel count is useless while gaming, if so maybe use it while doing other stuff. Personally I prefer 4k gaming even at 27 inches and I look forward to the time when I get my first 8k screen. I hope it could be have better colours as than my 4k smartphone.
1440p on a phone is a gimmick. I've had a phone with 1440p for a while, but I lowered the resolution to 1080p and noticed no difference. At 400ppi already, what's the point of even higher density? It just drains your battery faster and the only way to notice a difference is with a microscope.
Yeah, that's why I still game with a 1060 @ capped 30fps. No need for the extra frames - I don't notice any difference. 30 frames is enough. The only way to tell the difference is by scouring a video frame-by-frame.
@@Dancyspartan
this is just bait now
30fps on sample and hold displays is really quite significantly poorer. BFI helps a lot here because your eyes have their own motion blur which blurs the frames together significantly unless you have black between them so you can see the frame without the next frame blurring into it as much. Movies have real motion blur to compensate for this effect so it doesn't look unnatural @@Dancyspartan
While it is eventually possible that the 8K would be a realistic option on screen monitors. One is personally more interested in the possibilities of 8k VR. With double 8k strapped to your eyeballs, it would make an unprecedentedly clear viewing experience.
sadly, although i think it does have a purpose, and thank you for pointing this out, the LOD changing on models will appear a lot more obvious at 8k. at 4k playing skyrim with max object fade its still noticable at ~40 inches
although i would love 2x pixel density on my primary big monitor, even just for productivity
@@2kliksphilip just googled unreal 5 nanite cloud and it looks very cool, hopefully newer games hold up to the future! i wonder if older games can handle 8k with your rtx 4090 and how they handle the higher res, if theres noticable popin/LOD switching? could be an interesting test :)
im pretty sure there are lines in the .ini files that control draw distance, but it's been a while so i don't remember exactly which ones.
@@bebobo1 there seems to be a hard limit before it switches LOD models. i think you can actually regenerate distant LOD for skyrim using external tools. i remember doing something similar for getting the map to work with roads+seasons mods, something like xlodgen iirc?
I too remember when I was excited for newer, better technology. Thanks for making me remember that. It's OK to be excited about things!
Down scaling is amazing. It makes the image so clear I can't go back to 2K after witnessing its beauty.
It is prettty wild to me that it isn't supported directly more often. It isn't a very well-known idea because you need to set it up in such roundabout ways.
@@colbyboucher6391SSAA is the same thing if you can find it in the settings.
I actually think we will never get there. Just like phone batteries get bigger (denser) and CPUs more efficient but every phone has "a day and a bit" battery life, our graphics cards will get more powerful, but games will have more polygons or shaders or textures or in case they can't add anything else, more and more rays being traced and noise being compensated.
So 8k high quality will unfortunately not happen I think. But still, if we get to 4.5k scaled down, 5k, 6k... people will hopefully see the difference. I had, seven years ago, one of the first GPUs marketed as "true 4k card", even back then, without a 4k monitor, the anti-aliasing of running at a higher res and downscaling was just something that I couldn't get over.
And now I always have to spend much more than I'd "need" on my PCs, just because of this.
This is a pretty good answer. I don't think that we'll "never get there", but like you're pointing out, everything is going to scale the same to the point that we'll hardly see a benefit. Batteries get better and CPUs get faster and more efficient, and developers are like "It's free real estate," and never bother to optimize. Sure, someone might want an 8k gaming monitor, but do they want to pay for the video card to run that monitor?
8k and even higher is relevant for VR as optics continue to improve.
I personally have made the experience that using DLSS or other AI upscalers beyond native to essentially do superresolution but with AI upscaling usually just means you are importing the upscaling artifacts these produce into your native resolution rendering which was previously pristine. This is of course also just a tradeoff you can make depending on your preference. Just wanted to mention that this is not simply better than native most of the time. There are visible drawbacks that I at least notice without actively trying to pay attention to it. To be a bit more concrete, the visual artifact I always find annoying is how distant objects that have a high-contrast transition to a background will leave obvious and dark ghosting artifacts in motion, usually caused by camera motion.
Great video btw. Thanks philip!
I used to think that 4k was pointless compared to 1440p, but now I realize my real beef is with TV manufacturers, game consoles, and to a lesser extent monitor manufacturers who force us into exponential upgrades without offering adequate options in the middle. I always thought that consoles and TV manufacturers targeting 4k without any love for 1440p was stupid. I think it's stupid that there's not a lot of 1800p monitor options. I think that it's incredibly stupid that TV manufacturers are going right to 8k, and it's especially stupid how the consoles will fall right in line, and likely won't support monitors that will inevitably be created in-between.
It's a shame 6k doesnt come to tv. Imax films are shown in 6k at the high end theaters. It could easily be transferable to tvs. It's the reason why 4k exists in the first place
@@jwhi419yeah, 4k is a mere 8 megapixels and 6k is 20 megapixels, the sweet spot for most cameras too. 8k is a whopping 36 megapixels being rendered on screen. And you need to be pretty close to get the extra detail.
4:20 Yes but the "native" here is still using taa, with is why I dont like upscaling at all, I personally dont like the blur taa gives. And would rather use not aa at all, even with all the crunchy pixels, hell at 1440p and higher aa isnt "really" needed (especially at 8k wink wink).
And if proformance is needed I would rather drop resolution and use something like smaa, then use upscaling. I wish modern games kept clear of taa. r/fucktaa has some good info.
AA is definitely still needed at 4k for _most_ people. You might not mind that unstable raw look, but there’s a reason TAA became so ubiquitous: most people do. Besides the fact that modern and future games look very different to old games without AA, as the more complex your geometry and shading are, the more unstable the image is going to look without AA. In order for geometry and shading to advance, some sort of AA that can solve temporal instability is necessary.
Always Disabling AA gang here. Yes, I would much rather have jaggies (which are not visible on 1440p unless you stay still and do one of those "zoom ins" on the screen, which I'm not doing because I'm playing the game) and a higher framerate than a lower framerate and a blurry mess of an image full of ghosting.
@@Mankepanke I wonder how much of this is a generational thing. I'm in my mid thirties and among people in that age bracket it's not uncommon to not care about aliasing, even if on the more glaring side. It's like my brain doesn't register it as a problem.
SMAA sucks for vegetation so its as a method of AA its pretty much useless on games with lots of vegetation. DLAA is clearer than TAA so try that. DLAA/TAA being temporal methods allows games to run certain effects at lower samples to optimise performance. Volumetric lighting can be ran at 1/4 res and TAA/DLAA can be used to makes this a complete image with little artifacting. being able to save 1.1ms on a render of each frame can mean the difference of going from 55 to 60fps. similar methods are used in screen space reflections.
TAA is horrendous and wish more games used straight upscaling instead.
8K is objectively better than 4K and you can see the difference under specific circumstances. However due to diminishing returns and extreme hardware requirements I foresee its adoption outside TVs being very slow. According to Steam Hardware Survey over 50% of participating Steam users have a primary display resolution of 1080p or lower and 4K adoption is at less than 10%. To me this says that we're years away from 8K gaming being anything more than a curiosity for most people especially as any resolution is a moving target when it comes to performance (the price to performance stagnation in the GPU market doesn't help either).
Another issue to consider is that monitors last a long time. It is not uncommon for people to keep using the same monitor even if they completely upgraded their PC in the meantime. This likely contributes to the slow adoption of higher resolutions.
It's me!! I'm in the 50%!! I don't have much choice though since it's my laptop's screen, but the high refresh rate does feel really good.
Objectively higher, not better.
@@spectrobit5554 I meant that it's objectively better in terms of quality. Sorry for the confusion.
In modern games medium, and often low settings look actually quite similar to high/ultra in many cases (as you pointed out in another video). Honestly, in demanding games I put everything on medium or low (or look at "optimized settings and then continue to pair back things) and play as close to native 4K as my aging 6700XT will allow.
yup. medium effects with ultra textures. Best visuals to performance.
Totally disagree on upscaling. DLSS has a known input lag problem, and taa which smears the image when in motion. I can't play with upscalers without noticing these problems
me watching this video at 240p on my 640x480 monitor:
ah yes, quality
I don't think i've ever used a monitor of such low resolution in my entire life. 720p ones for sure but 480p is next level, how big is it?
@@Sk0die It's a 7 inch monitor. :3
USB powered, so my PC powers the monitor as well as displaying to it, uses much less power.
@@DraxilSpada-vc2wg oh yea ok that makes alot of sense, I was thinking like an old 19' monitor or something which would be terrible to look at🤣
On a 7' that res would be perfect and i'm sure it makes games and other programs run alot easier on your pc.
@@Sk0die The hilarious part is I use it with an RTX 4090, it allows me to crank things up to stupid amounts, or just run my PC without burning a figurative hole in the motherboard.
@@DraxilSpada-vc2wg you are experiencing a level of bottleneck I didn't even know was possible. please tell me you have another display you use as well?
I'm fairly certain my ryzen APU could run any game in existence on high settings with that display, I think the 4090 could be a tad overkill😂
mm yes maybe it does have purpose, thank you for pointing this out
"i grew up in a time where i looked forward to the future" - that one hurt
hmm yes, maybe it foes have purpose, thank you gor pointing this out
i love u 2kfilip im sorry everyone is commenting on ur vid with a copypaste
I'll just leave a little shout out for tiny little 8K screens. We need them! VR with 8K per eye and super bright HDR goated OLED would be amazing. Foveated rendering takes care of not having GPU to drive all those pixels and 8K gives you amazing clarity that has to be getting close to what our eyes can even deal with. I'm no vision to pixel expert but I do notice that in VR, the same resolution looks so much worse than it does on a monitor a few feet away from me. So lets get on with it and get to 8K! It can only help VR heads wanting that to trickle down to little screens for their faces.
People called me crazy too when i played GTA V at 4K. It only ran at 35-50 fps on medium-high settings on my R9 Nano, but i thought it was great.
It's all about the detail and aliasing, not the literal pixel size. Supersampling to 8k could provide perfect image quality, but an 8k display just pushes those issues down the road.
the thing 8k brings to the table is cream to an already frosted cake
The more pixels the better, but 8K simply isn't topical yet.
my internet's down, i have no idea when it will be back, so i'm using a mobile hotspot to watch this video. i'm watching a video on 8K resolution in 144p
"Added clarity and realism to the distant buildings..."
-> Shows clip of GTA V.
Hmm yes maybe it does have a purpose, thank you for pointing that out
i play all my games at 4K on a 1080p monitor because it just looks that good. i imagine playing at 8K on a 4K monitor would be just as amazing of a jump in image quality. as more 8K monitors start to appear, i'll be happy to get a 4K monitor for cheap
super sampling is underrated!
This did not make me be all like 'hmm yes maybe it does have purpose, thank you for pointing this out'
1:45 Is it odd though? Because medium graphics take away a LOT of detail. Sure, most agree that ULTRA settings are overkill and don't do much, but HIGH graphics are the sweetspot. Medium is definitely a downgrade, even at 4K. It's easy to see in the GTA V gameplay you provided.
yea many games have a big fidelity stepdown from high to medium than from ultra to high. High settings + high resolution is the best combo visuals. I'd take 4k high over 1440p ultra any day.
he's coping.
I did find it interesting that he used GTAV as the example, because in most modern games I would agree that you can drop the graphic settings quite a bit before it becomes very noticeable, but GTAV has a very obvious drop from high to "medium" (which is actually the lowest setting available). Medium is basically the xbox360/ps3 graphic settings.
@@NimbleSnek i wonder how fat the Nvidia check is..
@@TerrorByte69420
if he was paid by nvidia wouldn't he be telling you to only play at ultra so you're forced to spent 10000 dollars on a flagship gpu?
8k has sense in vr
I don’t even own a computer or anything and i still look forward to all your videos.
hmm no, definitely it does not have purpose, curse you for pointing this out
hmm yes maybe it does have purpose, thank you for pointing this out - but have you considered 16K?
The issue with DLSS is that in motion it turns pretty darn blurry and its just nothing compared to native
Me watching on 1080p screen : hmm yes maybe it does have purpose, thank you for pointing this out
For 7 years i gamed on 1360x768p on a 19 inch tv. It wasn't until 2024 that i finally upgraded. I got a used 24 inch 1080p monitor from craigslist and it's wonderful. I honestly don't think i'd ever want 4k gaming or above, 1080p at this size and distance looks perfectly fine for me.
watching this without eyes
I think more importantly than explaining why 8k might be worth it, you said that you look forward to the future, without it letting you detract from what you enjoy in the present. I think people are motivated to upgrade not just because "it's new", but also because it means that what they currently have "is old".
Guy like me still chilling on 1080p
man I cant wait for 16K to be a thing so I can finally see what everyone is so hyped about with 4K
I could listen to philip talk about rendering techniques and resolutions all day honestly
I'm still running some newer games at 720p on my 970 :)
Based and budget pilled.
cool to see poor brothers like me 💪
My main display is 1080P but my secondary is an old 720P HDTV. We will make it brother!
I remember Monster Hunter Rise on PC having broken TAA (literally didn't work at all, dunno if it got fixed) and I used my extra GPU headroom to run at 6K. It downsampled to 4K using DLDSR and the image quality was superb ❤
Philip digs deep into things I have no access to or interest in. I like Philip for that (and more). Keeps me more open-minded and helps me embrace the future rather than shut down with only the things I'm familiar and content with, while automatically rejecting everything new.
Thanks for the reality check, Philip. I'm still rocking my GTX 1080 on a 1080p screen anyway because I don't care enough to invest into an upgrade, but thanks nonetheless.
Hmm Yes Maybe It Does Have Purpose, Thank You for Pointing This Out Philip
Your point on downscaling really did seal the deal, you do definitely have a point.
Buying a 27 inch 4K monitor was the best gaming decision I ever made, despite the internet saying 4K "isn't worth it at that size". HA! That showed them!
Definitely not too much resolution. It's just a matter of preference whether you focus on spatial or temporal resolution, as you currently can't reasonably have both. For productivity I would prefer even more. 27" 5K or 32" 6K. Sadly there are not a lot of options, and they are all non gaming displays and very expensive.
Wait but where's the 2kliksphilip manspread nutshot in this video?
hmm yes, mabye it does have a purpose, thank you for point this out
Fantastic video, the combination of DLSS with DLDSR has such a profound effect on image quality it's kind of ridiculous, games like RDR2 still look extremely blurry at 4K whether it be with DLSS quality (even with the latest .dll file) or at native, but when DLSS is combined with DLDSR the game is transformed as far as clarity goes, most of this clarity is coming from DLDSR as obviously it's rending at a higher res than your monitor, but it really seems to be the AI component here that cleans the image up to a staggering degree, these technologies along with RTX HDR have really enriched my PC gaming experience, great video!
Cyberpunk 2077 at 6k is something I wish more people could experience its so immersive
Finally someone that said this...
I went as far as to buy pro display XDR specifically for the 6k res at 32" (tho the rest of the features are also very welcome - like bye, bye IPS glow) and the difference is SO CLEAR compared to 4k - I have a freaking astigmatism btw. People pretending anything above 4k being useless always was a mistery to me - it has to be just repeating tech reviewers who in turn have a agenda - no sane person would dismiss 8k as it is dismissed by majority of people if they actually saw the difference. You might not care about it, accept what you have, but to state for e.g. that 8k won't come and 4k is the "last resolution" and we'll go with fidelity only from now on is just stupid.
Me watching this at 480p on a 1080p screen: "Interesting."
I'm still in a 32 inch 1080p club.
Every game must look like Minecraft I bet
People back in the day kept saying there is no point of HD resolution and Blu Ray. Now nobody wants to touch a 480p video with a stick, or even 720p. We'll get to 8K, but perhaps we have to increase the refresh rate first, 4K 120 > 8K 60 imo.
Ah so its one of those cycles. I didnt know that, but it doesnt surprise me. Eventually people will probably say 1080p is unusable and 1440p turns into the new 1080p (as in the bare minimum like it is now), and then 2k and so on and so on.
There are always going to be people who are fine with 720p and 30 FPS. Look at the new Zelda game that came out last week. To me the jaggies and smudged shadows are very distracting though.
@@winterhell2002 True but thats on a smaller screen as well so its not as noticable. If it was 1080p on a 21inch or 45 inch screen it would be way more noticeable. The switch upscales though when docked and connected to an external screen
Instead of higher fidelity. I wish more games would focus on physics.
Destructible buildings and stuff. And just be able to interact with everything.
Hell Let Loose is perfect in many ways. Except performance and absolutely everything is static in that game…
Imagine a Hell Let Loose with fully destructible buildings
Would love for VR and all those things you mentioned to become more focused upon.
I think 1080p is gonna be the hill I die on.
There is a really unique benefit of 4k that a lot of people dont really consider: Its a perfect integer scale of commonly used resolutions.
i have never gamed on a resolution higher than 1080p
You're doing an apples to oranges comparison here. The common complaint is that a 4k display (at it's intended viewing distance), is indistinguishable from an 8k display, is not addressed here. Rendering games at higher resolutions, such as with some anti-aliasing techniques will produce a better image no doubt, but buying a higher resolution display won't help. Now, there are exceptions, such as split-screen gaming, or displays which have abnormally close viewing distances (like those found in VR headsets), where 8k is desirable, but those are still exceptions. I have to stress, that this all hinges upon you using your display at a reasonable viewing distance - too close and you'll get eye fatigue, too far and you won't even make out the difference between 1080p and 4k.
Perhaps the biggest takeaway I find in your video is the fact that you didn't even compare a real 8K display to a 4K one.
@@2kliksphilip Yes, but you're viewing them from the same distance. In actual use, you should keep the 32" display farther from your head than the 16" one.
@@2kliksphilip I'm not sure even this gets to the core point though - the suggestion is that 4k content on the 16" 4k display will probably look identical to 8k content downsampled to 4k on the 32" 4k display at the same viewing distance.
Downsampling from higher resolutions is literally the most primitive kind of anti-aliasing, but it's the most accurate and produces the least artifacts. If it's the only benefit of rendering at 8k then 8k monitors probably are just a fad, and we're really saying we just want better AA at 4k...
@@2kliksphilip maybe these guys are visually impaired?
You could be right, and I might be totally misunderstanding you. The question for me is whether 8k content displayed on a 32" 8k monitor looks any sharper than the same 8k content downsampled and displayed on a 32" 4k monitor.
Not sure if that kind of research has even been done though, so who's to say. But I feel like we're getting to the point where the anti aliasing is happening in our eyes at this kind of pixel density
@@2kliksphilip
I actually bought a 8k samsung 55 inch because i was shopping for a 4k display and seen the exact same display i was gonna purchse in 8k on sale for the exact same price so i was like fuck it.
The pixel density a 4k 55 inch tv is only 80 but at 8k it is 160 its very noticeable imo.
I been gaming on it for a year and a half and it did not disappoint.
I do not regret the decision i dont play many new games in 8k, But i now play all of my older games at 8k stuff pre 2015 gets 60 fps + on my 4070.
Even older games like left 4 dead 2 i get 120 fps+ even with 200+ mods enable.
Even some newer games like bf1 at max graphics i get about 55fps which isnt great for a fps but bf1 at 8k is the most insanely beautiful piece of gaming media i've even seen.
Its actually pretty great and it does a lot more for games than most people would like to expect. at 8k 0 anti aliasing is required and objects in far distances become so clear especially on a 55 inch.
Things like text or any small details effects like fire, flying debris, bugs, smoke become so clear. Things like power lines are literally a LINE with 0 stair stepping. im not denying these things are a "nuance" but they are quite beautiful to see in person. if u are obsessed with graphics imo 8k does something no raytracing or shader could ever do it provides clarity that is unimaginable.
Clearly, the only ones bold enough to advance the realm of resolutions is the capable men and women that create NSFW games and animations.
Went to curry's to check the new 8k tv's and was shocked that they are displayed with soft upscaled videos and "looked" worse than the 4k tv next to it
I did a couple of calculations and with the same angular size of the pixel that suits me, an 8k monitor will be 86 inches diagonally and will occupy 120 degrees of horizontal vision, does this make sense? this is madness
Me, clicks on this video at 3 AM. Looks at the "hmm" comments.
It's time to go to sleep!
goodnight
@@fowlizm9065 gn!
The day I'll be able to run it, 8K will be the best antialiasing technique for my 4K monitor.
8k is the resolution VR needs to reach
good point, most vr games already suck ass on good graphics, meaning the bar is set low for the player base anyway, wouldnt hurt if u played another game on low quality graphics but with 8k
Touching on the final point you mentioned in the video about wishing people would be more excited for future technology and new standards, I feel a large reason for the widespread dislike of that kind of stuff has been heavily influenced by the economy, and people's ability to actually go out and buy new tech these days without spending more than they can really afford. I feel if the economy were to improve and people had more spending power we'd see a large shift in how people see new technology.
Everyone already knows the human eye can’t see past 8gb of ram.
Me, still running a GTX 970, only using 1440p because the 1080p monitor I had died: "Hm, yes, yes."
What I'm gonna look at in 10 or however many years time is what games I play right now will even recognize that 8K exists (in the game's settings menu, not a config file). I know devs are a lot more aware of higher resolution gaming but it's gonna be interesting, especially with the increased prevalence of indie games.
I was looking at the office on a 27 inch 1440p screen 1.5 meters away and was blown away that i could read the text on the book shelves. Idk why but it added a lot...
FYI; Normally I watch from 2.5 meters away on a 65 inch 4K OLED.
I tried moving drom 3.5 m up to 0.5m, but I just can't read it. Maybe it was created in 1080p and i am gaslighting myself, but I think 8K might actually make a difference for me. (If movies were 8k :p)
I have had a 4k oled tv for 8 years now, so maybe I just got very used to it.
I remember choosing 1280*1024 Low in Oblivion over 800*600 High, mostly because the native resolution of my TFT display. Loved Quake III in 1600*1200 on a CRT which was considered a super high resolution. I think 8k is beautiful. I hope one day I'll be able to afford it. Now I'm stuck with 1440P on a 3080 on a 240Hz display. We just can't have everything. :( Great video!
As a strategy gamer higher resolution can really add a lot more immersion getting to see more of the map in some.
8k and beyond makes the The most sense for VR type applications.
I just want to thank you for not making this a 53 min long documentary, starting from the Big Bang till today.
The optimal distance chart at 2:45 is much more relevant for media consumption than for gaming. As an ex-eyefinity user, it's okay to not have the whole screen in the focused view and have it fill up my peripheral vision.
GTA visually looks great, well until you lower the textures
I agree with all points made.
I would like to add though, that for the leap to 8k-quality to not kill storage space, development companies would need to optimize games, and would need even further to make uncompressed files an option on the side and not the onl way to download their program.