This video was actually recorded and uploaded in 8K, but YouTube's taking its sweet, sweet time to display it at that resolution. So 4K will have to do :(
I'll return..
@@Tholi I'll be back too...
I'll have to rewatch later!
too bad i have a 1080p monitor so i won't be able to see all those pixels
for all 4 people that own an 8k display
Seeing Crysis 3 being run in 8K reminds me of back in 2014, when someone tried to run Crysis in 8K to produce watercolour-painting-like screenshots; the game only ran at like 2fps. It's amazing how far we have come.
I remember doing this! I had a GTX 770 4GB and I was taking screenshots at up to 8192x4608, which is *even higher* than the 8K seen in this video, lol. The fact that it ran at all to begin with kind of blew my mind.
Who are "we"? The majority of people have 1080p, and the 4090 runs some games below 60 FPS at 4K
@@-Ice_Cold- i mean from 2 fps average to 50 fps is a 25x improvement
@@-Ice_Cold- those games are very poorly optimized then, or their systems are.
German programming be like that. I remember buying a 1070 and playing this game on 3 1080p monitors so that I could see the bow in its entirety. XD
Nobody is running Jedi Survivor at 8K anytime soon lol
I am confident even NASA computer cannot do that
Nobody is running Jedi Survivor
@Deathheart161 5090 in about 3-5 months, 6090 two years after that if they keep their regular two year cycle and naming scheme.
@@LinuxPlayer9 how dare you steal my joke before I even made it
Jedi Survivor is just a terrible PC port. The game stutters at 480p minimum settings.
Doom Eternal would have been interesting. It's the best-optimized title I know.
It runs at about 70-80 fps in my testing with maximum settings (without AA).
@Scorpion95 at 8k?? That's crazy good wtf what kind of wizardry did they pull when optimizing
@@SolarFlare307 Yep, at 8K. It runs really, really well. At 4K I got about 240 fps, but I'm partly CPU-limited (Ryzen 3950X)
I tried it at 8k and it had some odd artifacts, as if the shaders the game uses are a bit broken at 8k. Performance was solid though
@@prgnify Would be weird to have megatexture artifacts, since Doom Eternal uses id Tech 7, and they actually dropped support for megatextures with that iteration. Digital Foundry's video on Doom Eternal explains that they moved away from megatextures because it made asset creation more artist-friendly.
my main takeaway is that at 8K resolution, the blur of FXAA has so little impact on image quality that it becomes good again
perhaps at 16k we'll reach the point where even TAA is not unbearable
@@OfficialFo at 16k TAA gonna look like 1080p MSAAx2 😂
@@hombrepepega3472 honestly every step we take is just to recreate old methods it seems
@@hombrepepega3472 With all the benefits of TAA? Yes please!
@@OfficialFo at 16k we don't even need any AA enabled because we already have built in SSAA
I think your video style suits you well. You might not have helalallot of money and resources like big channels such as LinusTechTips, but your more amateur, made-at-home style works when you talk about the history of graphics cards, Nvidia's and AMD's past, and how the long-term trend in price-to-performance and the value of graphics cards is actually really good knowledge that other channels just don't show. I often get bored of modern YouTube channels and their videos of "OMG I GOT THE RTX 696969 SUPER with 10000000 gigahertz which gives me 999999999fps at the price of 10 000€ per card". You show how fanboyism doesn't make sense, and you actually put in the effort to make videos about any topic, for example the budget GPU market videos, which might not get a whole lot of views but are still such important information for a small group of people. And then sometimes you make these silly 8K videos, which are also enjoyable. So I really enjoy your video style no matter the topic.
Agreed. It's just cozy no bullshit content
Watching this video in 720p and can confirm that 8K resolution looks crystal sharp
@@talison461 I've got the video on 2160p60 4K. Never knew YouTube videos could look so crisp
"helalallot"
0:24 4090 owner here; this card is *definitely NOT* overkill for 4K, not even close. Especially if you want 120Hz+ with no compromises in graphics settings. In fact, it's barely adequate for 4K gaming, so much so that CPUs are still being benchmarked at 1080p and 1440p, because at 4K you're heavily GPU-bottlenecked, even with a 4090.
if it handles every game at 4K 60fps, it literally IS very adequate for "4K gaming". yes, i think the 5090 will be the true 4K high-fps GPU, BUT to say the 4090 is "barely" adequate is factually wrong.
@@SerkuXY 60 FPS was the standard 20 years ago. For a good 4K gaming experience you want to be getting 120 *minimum* to keep up with modern 4K displays, which the 4090 can't do in every game.
@@Psythik Sixty fps didn't become the PC standard until just under 15 years ago. It wasn't until after the Great Consolization of 2008, when GPUs kept getting faster and faster but consoles remained static, that a VSync-locked 60 fps became the minimum.
With respect, you are wrong about a good 4K gaming experience. The main reason you like higher framerates is that it reduces Eye-Tracking Motion Blur on Sample-and-Hold displays (It also can reduce input latency, but that shouldn't be a problem to begin with). On a CRT there is literally _zero_ ETMB at _any_ framerate. You get better motion clarity at 60 fps (in fact _perfect_ clarity) than you do at 480 fps on an LCD/DLP/OLED.
I have a 4K 55" OLED and I know how much worse motion clarity is compared to a CRT. But I'm not going to waste GPU power pumping out extra frames when I desperately need that power for high settings, high FoV, and AA. I would _prefer_ higher framerates and lower ETMB, but there is absolutely nothing unplayable about 60 fps.
The 4090 can't even do a VSync-locked 60 fps in every game, so I'm confused about you talking about 120+ fps. What are you doing --- dropping the settings down to minimum?
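The sample-and-hold point above is easy to put numbers on. A back-of-the-envelope sketch (the helper name and the 1000 px/s pan speed are made up for illustration; the rule of thumb is blur width ≈ tracking speed × frame persistence):

```python
# Eye-tracking motion blur (ETMB) on a full-persistence display:
# the eye tracks the moving object while each frame stays frozen for a
# whole refresh, so the image smears across the retina.

def etmb_px(speed_px_per_s: float, fps: float) -> float:
    """Blur width in pixels for a full-persistence (sample-and-hold) display."""
    persistence_s = 1.0 / fps  # the frame is held for the entire refresh
    return speed_px_per_s * persistence_s

for fps in (60, 120, 480):
    print(f"{fps:>3} fps: {etmb_px(1000, fps):5.1f} px smear at 1000 px/s")
# 60 fps -> 16.7 px, 120 fps -> 8.3 px, 480 fps -> 2.1 px.
# A CRT strobes each frame for well under a millisecond, so the
# persistence term (and with it the smear) collapses toward zero.
```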
@@Psythik high refresh rate in single player games looks like shit. i prefer to play at 60 fps a lot.
then why even bother buying a fkn card that costs 2000 dollars?????
Nice taste in using Mirror's Edge Catalyst music in the beginning. A perfect fit for recording a video on a sunny day like this.
So nice of you for mentioning Mirror's Edge Catalyst and The Talos Principle 2 - these games deserve more attention.
It turns out that even if graphics cards *could* push enough pixels to comfortably run games at 8K, game developers are in no rush to optimise their games to work comfortably at resolutions beyond 4K without the UI being borked or various texture/graphical bugs appearing.
Devs don't seem to be in a rush to optimise their games to work comfortably at *any* resolution, it often seems
@@TheSyntheticSnake Keep in mind that it's not the fault of devs, shareholders need infinite growth regardless and only think short term, they don't give a damn if the game is garbage as long as 'graph go up'.
I mean, I can't exactly blame them lol, how many people are running 8K monitors and have the power to actually do it in the first place
@@kristoffer3000 True to a degree, but you do get cases where the devs themselves are the issue. Bethesda and From spring to mind, with their outright denial that there's anything wrong
@@TheSyntheticSnake Well yeah but they're more cult-like than anything else.
Nice vid, definitely excited to see how upscaling plays its part! So I came here from a Switch video... Love the juxtaposition of the term "playable" coming from watching Switch content to this. 720p-900p, 25fps, low settings, missing textures, 1/2 res shadows, 1/2 framerate reflections: "Fantastically playable!"
0:40 nice jumpscare
That cut from RT on to RT off in Hitman was super clean. That can't have been easy to do.
That's pretty easy, since it's a fixed camera path. Just trim the clips to where the benchmark begins, put them on top of each other in the timeline, and cut the topmost clip where you want the transition to be.
RT Always OFF!
man I love just chilling out and running around in Mirror's Edge Catalyst
me watching this on 360p on a phone: mmh yes
If not for the stats displayed on the HUD, it all looks the same, except when the fps obviously dips to 20.
The video is only 4k anyway
@@Leonard.L.Church it's in 8K lol
@@MrJokster666 YouTube doesn't do 8K
@@Leonard.L.Church yes it does. it's in 8k
1:02 are those the sugarless diarrhea gummy bears XD
Truly a pinnacle of the Amazon reviews lore.
bringing up bad memories
The fartist lives on.
Part 2 coming soon
Love the choice of music!
8:18 the POWER of editing
*seamless transition*
I was looking for this comment haha, it was so unexpected but noticeable which made it hilarious
Super interesting, thanks Philip! It's wild that anything at all can run at 8K WITH playable frame rates. What a day we live in. I'm still on 1080p 😅
You said a bit about Ultra Graphics 4K being much better than 8K. Would you ever do a full video on this? I'm curious to know if you think 8K will ever be worth it at all, even if we have GPUs which can run it well. How much can our eyes really see, right?
I remember watching a video from you years ago reviewing 4K. I think it had GTA footage. The times have really changed 🤔
i love watching videos that just benchmark cards, such a calming vibe for some reason
I quite enjoy your hardware adventures, thanks.
Applying Anti-Aliasing at 8K is like heating the sun with a blowtorch. You're really not gonna notice the difference
The difference is easily noticeable in the lack of framerate
you might slightly notice some shimmering at like 20-30 cm from a 27-inch monitor, because the PPI of such a monitor would be 326, while the eye's resolvable PPI at that distance works out to about 338. A 10K resolution would be practically perfect for gaming, without any aliasing or shimmering, just because it actually has a higher PPI than our eyes can resolve. Even on a 31.5-inch display, 10K would work perfectly.
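Those figures roughly check out. A sketch of the arithmetic, assuming the commonly cited ~1 arcminute (60 pixels per degree) acuity limit; real eyes and content vary, and the helper names are made up for illustration:

```python
import math

def panel_ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

def acuity_ppi(distance_cm: float, arcmin: float = 1.0) -> float:
    """PPI at which one pixel subtends `arcmin` at the given viewing distance."""
    pixel_mm = 10 * distance_cm * math.tan(math.radians(arcmin / 60))
    return 25.4 / pixel_mm  # 25.4 mm per inch

print(panel_ppi(7680, 4320, 27))  # ~326 PPI: 8K at 27 inches
print(acuity_ppi(30))             # ~291 PPI resolvable at 30 cm
print(acuity_ppi(25))             # ~349 PPI resolvable at 25 cm
```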
You can still get temporal aliasing from materials that are so high detail that even 8K won’t resolve them properly. Modern games use TAA for a ton of tricks that you can't really brute force with a higher resolution.
Luckily TAA looks a ton more crisp if you give it a higher resolution to sample. Similar to what he was saying about FXAA in this video.
That's why 2x MSAA at 8K is plenty and going any higher makes no sense most of the time.
people said the same about 4k and it's still unplayable without AA due to the shimmering
Mirror's Edge is just a beautiful game. I recently played through the OG one again and man, it's just gorgeous. Sure, the textures aren't that sharp anymore, but in motion you could easily sell it as a modern game. It just shows how important good lighting is, and the very limited color palette helps you even more. While Catalyst was more polished with the movement, from the art style and the feeling of a real place and story I prefer the OG one. And the Still Alive soundtrack is just awesome.
*"While Catalyst was more polished with the movement, from the art style and the feeling of a real place and story I prefer the OG one."*
You're kidding me? Catalyst not only did not improve upon those aspects... it regressed?
All your videos are bangers but this one was especially great!
I haven't even seen an 8K monitor being sold
You can use a tv.
I haven't even seen an 8k TV being sold
1080p will look excellent on an 8k tv with integer scaling.
@@UncleJemima there are many 8K TVs
Dell 8k
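On the integer-scaling point: 8K is exactly 4× 1080p on each axis, so each source pixel can map to a clean 4×4 block with no filtering. A minimal nearest-neighbour sketch (illustrative only, not any particular TV's scaler; the function name is made up):

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int = 4) -> np.ndarray:
    """Nearest-neighbour upscale: repeat each pixel factor x factor times."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in frame
frame_8k = integer_upscale(frame_1080p)                  # (4320, 7680, 3)
# Each 1080p pixel becomes an exact 4x4 block: no blur, no ringing,
# which is why 1080p content can look so clean on an 8K panel.
```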
0:05 Been using 4K since 2016; there wasn't really a card that could drive 4K properly until the RTX 3090 (and I had Titans, SLI rigs, etc.)
Been using 4K since 2016 lmao you a legend :)
The 2080 could run 4K, and so can other 30-series cards
@@Zwank36 wdym not properly. Plenty of games ran smoothly at 4k on PCs
You can run most games from before 2015 at 4K on a 3070. Even some recent titles, but not the triple-A games. But to be honest, as cool as RT looks, the perf hit matters more than switching from 1080p to 4K, and 4K is just so much better than any RT at 1080p. I'll always prefer a sharper, better-looking image over good reflections. I'll probably use RT when we can all play 8K games on a budget graphics card at 150fps; then I'll activate RT to get perfect quality, even if it's 60fps. I don't give a damn about DLSS; I'll never use it unless I'm very, very poor and can't afford a card with good rasterisation capability.
I have played many games --- maxed out (the only way I play) --- at 4K on a 980 Ti or 2080 Ti, at a VSync-locked 60 fps. The most demanding include:
The Division (2080 Ti)
DOOM 2016 (2080 Ti)
Dying Light (2080 Ti)
Batman: Arkham Knight (2080 Ti)
Just Cause 3 (2080 Ti)
The Evil Within (2080 Ti)
Watch_Dogs (2080 Ti) (Dropped below 60 fps when driving fast --- could have avoided this by dropping settings)
SOMA (2080 Ti)
Crysis 3 (2080 Ti)
BioShock: Infinite (980 Ti)
Dishonored (980 Ti)
RAGE (980 Ti)
Great entertaining yet educational video, much appreciated. I subscribed when I saw your Snapdragon video, which was very intelligently done. I am no real gamer, but I have a delidded 14900K and 4090 that I built, plus a Sony 8K TV, and I occasionally play some games like the recent Avatar, Flight Sim, Forza Horizon, Lies of P, etc., and seem to get away with 8K. But after a while I backed it down to 4K, because the games are obviously not developed for 8K at all and do not look that much better. It's always nice to push the 4090 to the limit for science, as you say! All the best to you, and keep creating magic content.
Thank you for the lovely comment! And thank you for understanding that I do this stuff not because it's sensible, but because it isn't. And it will be great to look back on in 3, 5 or 10 years' time, when the situation will have drastically changed.
Amazing choice of music in the opening there :)
It is so weird to me that Batman: Arkham Knight, an almost 10-year-old game, looks better than many of these much newer games.
It was a game ahead of its time. Sure, it had poor optimisation, but nowadays, most of those problems are fixed.
Art direction and style do a lot for perceived quality, for sure. We have better tech available today, but that doesn't just automagically make things look better.
Bro seriously
AAA decline, graphics have got worse
@@RichardPhillips1066 TAA is to blame for a lot of that, it's so damn bad.
Considering that with upscalers 1440p produces way better quality than 1080p native while performing the same or better, 8K could have better quality than 4K with the same or better performance, so I'm looking forward to the dedicated video!
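The pixel-budget arithmetic behind this idea, for anyone curious (the scale factors are the commonly cited per-axis ratios; exact values vary by upscaler and version, and the helper name is made up):

```python
# Internal render resolution for DLSS-style upscaling: the scale factor
# applies per axis, so "Performance" (50%) renders only 1/4 of the pixels.

def internal_res(out_w: int, out_h: int, scale: float = 0.5) -> tuple[int, int]:
    """Per-axis internal resolution for a given output and scale factor."""
    return int(out_w * scale), int(out_h * scale)

print(internal_res(7680, 4320))         # (3840, 2160): 8K Performance costs about native 4K
print(internal_res(2560, 1440, 2 / 3))  # (1706, 960):  1440p Quality
```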
AH MIRROR EDGE OST GOT ME EDGING AND GOONING
Looking at the unplayable frame rates in 8K reminded me of the time I asked on the Bloodborne subreddit about the stability of the 60 fps mod for the PS4 Pro and got jumped on by people, because "there's no reason to get that", "30 fps is more than enough", I "wouldn't notice a difference between 30 and 60", and a higher fps number "doesn't matter at all - it's all just Afterburner flexing". Never went back there; I'm just sitting here waiting for the PC port
Console fanboys are just trying their absolute hardest to cope with reality. Good on you for not touching that echo chamber ever again 👍
We're all waiting brother
May the day come soon
Have fun waiting for it; it will never be ported. Or do you really think FromSoftware will take the money and time to port a 10-year-old game to PC?
ShadPS4 can emulate Bloodborne kind of well, but you should wait to try it out until it runs at 15-60FPS
Hasn't it been miiiighty interestin' how the consolers suddenly stopped talking about "muh 30 fps is mo' cinematic" the moment their consoles could do 60 fps?
Miiiiighty interestin' indeed...
The background music on this video is excellent
TALOS PRINCIPLE 2 MENTIONED!!!
So sad that you didn't try to run a REALLY demanding game like Minecraft
Unironically would obliterate frames unless using stuff like nvidium
@alexlxpg4985 Older versions might run better. I can say for a fact release 1.0 and lower will but not sure about anything above.
I love these videos. The more of them you make, the cooler the time capsule becomes. If you look back on what 2034's flagship card can do, you'll have the 1080Ti and 4090 to compare it to. Cool!
The Mirror's Edge music is so good
Really enjoyed this style of video, especially because of the personal touches and jokes ❤.
1:06 This looks so animated. I only think it isn't because of the background.
@embudo-xx4qu Fluffykins has some awesome fur. I've always wondered how Philip manages to keep it that way
Can't wait for this video with the RTX 5090
the mirror's edge music makes the video so calming
the one game my GPU will always fear is modded Minecraft; I literally built my 3080 setup just for it... It's endlessly tweakable to run on potatoes or spaceships, and people have still been pushing the boundaries with 4K/8K textures on top of ray-traced shaders. I think it'd be a fun video to run a bunch of different types of modern MC with optimization mods at varying resolutions, texture packs, and shader packs. (Your previous ray-traced MC vids looked great too!)
Guess I'll be able to see 8k at my home....30 years later...
@@LisSolitudinous looking at the rate at which 4K became standard, 8K will become standard in around 10 years. But COVID seemed to slow down the industry very drastically, so realistically it'll be around 15 years, unless some kind of boom happens
10-15
Some honest surprises in here, it's doing better in a lot of these than i was expecting.
0:58 I thought this was footage from a game at first
It is! The game is called Stray.
@@eplixc8267 nahhhh
@@MauveDash
man.. that Mirror's Edge music. i would sit and let that play in the background for HOURS on my ps4. good memories. thanks phil
Just a great reminder that Mirror's Edge is an awesome game ❤
I liked the music for the different games you did with the video
Can i just say, props to Solar Fields for his soundtrack for Mirror's Edge and ME:C. God-tier soundtracks.
His other stuff is also incredible
My favorite game to check raytracing with is Control, you should really consider adding that to your lineup. It was one of the first games with proper upscaling and raytracing and it did a decent job of it, and with an updated DLSS DLL it actually looks phenomenal
Super ultrawide 2160p - which exists now and is affordable to enthusiasts - is probably the closest we'll get to 8K gaming anytime soon. Foveated rendering would make it viable, even. But then again, most games already break at 1440p 32:9.
I remember watching your videos on 4K and ray tracing and thinking how far away we were from it. Now here we are
I still can't think of any justification for 8k.
Like I've actually had the chance to do a real side-by-side comparison in a Micro Center, and unless you are literally mashing your face up against the screen, the difference is negligible at best.
4K is my endgame.
what content did you watch? there are no 8k movies
That's what they say every single time the technology increases in anything. "There's no point for A, B is enough for me"
@@kniazjarema8587 Our perception has limits and we are reaching them. You think going from 8K to 16K to 32K is going to be game-changing on a 27-inch monitor? How about a 32?
Adding pixels helps to a point, and we've basically reached that point in terms of monitors. 4K on a 27-inch screen is very clear.
@@paulcox2447 People were saying this about 1080p back in the day and 1280x1024 before that.
The reason you can't see a difference between 4K and 8K is because no game is designed for that resolution. If you play a PS1 game at 8K, it's not going to look any different than at 900p. The PS1 FF7 isn't going to look better at 8K than the modern FF7 remake does at 720p. A higher resolution cannot show detail that simply isn't there. MSAA x2 or x4 maxes out any modern game's visuals at 4K, but because MSAA is so demanding, only a few games use it anymore.
If games had 8K shadows/lighting/AO/water effects and textures using UE 7-8, things would change fast. That would require something like at least 256 gigs of VRAM with DDR9+ running multiple chiplets at 5GHz, with equivalent RAM and 10GHz 256-core CPUs - technology and engines that are not even remotely ready for today's market yet. Our current era is to that future 8K era what the early PS3 era is to us today.
We'll eventually get there, but it's going to take nearly 20 years, maybe 15 at the earliest. The next big resolution for the enthusiast market is going to be 5k followed by 6k.
I haven't tried gaming with resolutions higher than 1080p personally, and I don't really see that changing any time soon, but it is cool to see how far tech has come.
One thing that might be viable for running the game at 8K is to use no AA and then downsample to the 4K output. As someone who hates TAA, this might be a way to get good anti-aliasing without the terrible ghosting from TAA.
(Yes, I'm aware that this is basically what MSAA is already, but most modern games don't even have an alternative AA that isn't temporal, and this could be a way to get good anti-aliasing in these games, especially in the future when this level of performance in a GPU is midrange.)
That's not what MSAA is, but that is what SSAA is or DSR for a more recent driver level implementation. DLAA also kind of works from higher res and then down sampling. If you want it, it's available, but you might not like the frame rate that you get in modern games with SSAA derivatives.
what you need is DLAA
@@SeelBees DLAA does not perform any upscaling or downsampling, it is just applying the algorithm to the native image with no upscaling, so it tends to look much worse than DLSS quality with 2x DSR applied, since that actually upscales to the DSR res then downsamples back to your screen.
@@epoch151 true, that was misrepresentation on my side. DLDSR is working from higher res, DLAA works on native res image.
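What this sub-thread describes, rendering high and filtering down, reduces to a plain box downsample in the simplest case. A toy sketch of the 8K-to-4K idea (DSR/DLDSR use their own, smarter filter kernels; this only shows the principle, and the function name is made up):

```python
import numpy as np

def box_downsample(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average each factor x factor block of pixels into one output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

tile = np.random.rand(8, 8, 3).astype(np.float32)  # small stand-in render
print(box_downsample(tile).shape)                  # (4, 4, 3)
# For the real case: an 8K render (4320x7680) with factor=2 yields 4K
# (2160x3840). Every output pixel averages 4 rendered samples, so edges
# get intermediate coverage values: spatial AA with zero temporal ghosting.
```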
I love the music in the background, it’s more relaxed than what you usually put in which I sometimes find irritating
1:09 nice lion ya got there
great video, can't wait for the upscaling one
The question of "can GPU X do 8K?" is actually very situational. I have a 4K display, and I use VSR in older games, such as Postal 2, Half-Life 2, Far Cry or FEAR to supersample from 7680x4320. Those games run perfectly fine in 8K on a 6900XT. So the question is really not just "can GPU X do 8K", but rather "can GPU X run the games I want in 8K".
No shit, Sherlock. Nobody's asking if a 750 Ti can run Doom at 8K. They want to know if current games are playable.
Love the Mirror's Edge music in the background. Also, I've been gaming in 8K on an 8K TV for a couple of years now with a 3090. Older games run great at 8K, and for newer games you can turn some settings down or upscale to 8K if needed
1080p 60fps supremacy
2k 60FPS golden middle
Thank you Phillip, i appreciate these kind of videos that push the limits which make me feel better about not owning an 8k monitor
Honestly I'm struck by how pretty DXMD still looks, not sure exactly how to word it, it just looks clean and sharp and it really contrasted with the visual noise of all the rubble and foliage in BF1 and Crysis 3 that were tested before and after.
Impressive......loved this video 🙏
While an interesting video, I doubt 8K should be a goal for anything other than VR. There are already numerous people who can't see any difference between Full HD and Ultra HD, so I think 8K would turn into a big placebo effect.
Thank you for your services to the gaming industry
MIRRORS EDGE CATALYST MUSIC, AAAAAAAAAAAAAAAAAAAAA ITS SO BEAUTIFUL
You know, I have been gone for 12 days and your channels are the ones I have gone to first.
The fun thing about vr is that it stops your gpu from being overkill. You can have a 4090 and still struggle with some games. It's just too hungry lol
VR also makes SLI make sense again, as you can split the two screens across the GPUs, each of which then renders fewer pixels and possibly gives even higher framerates.
@@fungo6631 Haha, never thought of it this way, but it seems logical.
@@fungo6631 Yes and no.
SLI was made to use multiple GPUs to render a set of frames cooperatively. In VR the GPU renders 2 different sets of frames, one for each eye. Assuming you can hook up the 2 screens to different cards and have 2 x16 PCIe slots, SLI is technically unnecessary, as each GPU will have its own set of frames to generate. Since 2 cards need to be coordinated, this will most likely increase the CPU workload.
@@LAndrewsChannel All you need is to slightly change the transformation matrix as the scenery is largely the same with just a slightly different camera position for each eye.
I suppose something like Infinity cache could help here. In any case, I believe GPU bandwidth constrained use cases would benefit the most as VR resolutions tend not to be dual monitor aspect ratios, so each GPU would only draw half the pixels.
@@fungo6631 It doesn't matter that the "scenery is largely the same", you still need to go through the whole pipeline after changing the view to get the frame for the other eye, you can't just "reuse" pixels and not render half of them. The only "reuse" is that of the world's geometry.
Please look up NVidia's pages on multi view rendering and VR SLI. Many smart people worked on those and if there was a better way to do it with the current hardware I am pretty sure they would do that instead.
When we have AAA games in 4k at 120fps on igpu i will be happy 😢
9:18 CS2 very well optimized 😂😂😂
Instantly paused the video, liked, and subscribed when you started Jedi and said it's easily one of the buggiest games of all time.
There's a tool called Special K developed by Kaldaien. Would be interesting to see your point of view on it.
The Samsung Odyssey Neo G9 mini-LED 57" ultrawide has a native resolution of 7680x2160 at 240Hz. It would be a good alternate test for a very high resolution fps test.
meanwhile my 1060 is still powering any game i play on 1080p 💪
As long as you're not playing Alan Wake 2 you should be fine, unless you enjoy the 20fps experience
Good luck running the most demanding games
it's good for 1080p 60 fps but not for much more, i suggest upgrading to an rx 6600 or better
The best thing about the whole video is the cute cat at 0:58🥰
0:26 I disagree with the claim that the RTX 4090 is "overkill for 4K", especially if you try running games with RT maxed out at native 4K while also trying to maintain 120 fps (since all 4K gaming monitors support at least 120 Hz).
Nobody's running RT native when upscaling makes far more sense
People that claim this usually play less demanding games or are fine with 60/30fps at native or use upscaling+FG without a second thought.
In The Talos Principle 2, DLSS->DLAA on the LOW preset gets you ~130fps at 4k ruclips.net/video/ziLm1Ea8xDY/видео.html
@@2kliksphilip I get that. My point is that it's a bit early to call any card overkill for 4K until that's possible. We're clearly nowhere close to enough GPU power for truly native 4K gaming especially at higher refresh rates.
If only there was a way to get RT working okay on the 4090 at 4K. Some way of scaling up the graphics, using some kind of upscaling the pixels in some scalable way
@@2kliksphilip I want to make it clear that I'm not against upscaling. It's a technology that has its uses and that can be of great value under the right circumstances. I'm also not saying RTX 4090 owners shouldn't use it in RT games at 4K. However the fact is that having to use upscaling at 4K with the RTX 4090 is proof that it's not "overkill for 4K" in the same way that the GTX 1080 Ti was overkill for 1080p in 2017 (I actually think it made a lot of sense for 1440p given that even in GPU heavy titles it was able to maintain enough fps to almost max out the refresh rate of 1440p gaming monitors at the time). The upscaling setting can be thought of as another quality setting where off is the equivalent of the high setting and performance is the equivalent of the low setting (yes I'm aware it's not that simple and that there are some cases where DLSS can make more accurate guesses about the image compared to native rendering but it's not guaranteed to do that).
Finally if the RTX 4090 was truly overkill for 4K instead of talking about using DLSS at 4K we would be talking about using DLAA (anti-aliasing via DLSS) at 4K or running games at 4K with over 100% resolution scaling.
I absolutely love experimental gaming testing videos like this and this channel in particular is just one of my absolute favorites, amazing editing and great dialogue. Whenever I feel down Phil always has my back. please upload more videos like this 🥰
Much love from Sweden! 🇸🇪
you know, i feel you. I got an RX 7900XTX for a 1440p 144hz display and it feels SOOO OVERKILL hahaha
broo the Mirror's Edge Catalyst song in the background brings back so many memories 😭😭
MIRRORS EDGE
literally the same reaction
😥
You should have tried some Call of Duty games and Doom - all well-optimized games. I don't have a 4090, but I've been playing Infinite Warfare and WW2 at 8K 60fps recently; haven't tried Doom, but it's certainly well optimized.
never seen it explained when we will ever need eight thousand pixels horizontally
-burger40
Less aliasing and less blur. It's 100% diminishing returns, I don't think anyone is seriously suggesting it as a standard.
@@existentialselkath1264 i am going to sit 1mm from my 50 inch screen and complain there aren't enough pixels
@@burger406 personally, I don't see the point in an 8k screen, but rendering at 8k (when achievable) certainly has its advantages.
Play a game at 4k without anti aliasing and it'll still be a shimmery mess. However, enabling anti aliasing usually means blurring in motion, trailing artifacts, etc. If instead you can supersample to an even higher resolution like 8k, you get less aliasing, no artifacts, and a much clearer image worthy of your 4k screen.
What a great video with lots of good explanation! I really think the 4090 is still a good "8K card". Sometimes we just need to take a few steps back, like lowering graphics settings to medium or high, or maybe even low, if we want an "enjoyable" 8K experience on a 4090. It will still look good enough to most people, probably XD. May I ask if there are any updates on that separate video about 8K gaming with upscaling enabled? Very curious to see it and compare it to this video!
The thing is, 4K will never truly be a thing, as the bar keeps changing. Don't forget that when the 780 Ti and Titan Black came out, they could run all the newest games at 4K on a single card at 60 fps. Then, when Maxwell came out, to run the newest games at 4K you needed a 980 Ti and/or Titan X in SLI. Then came Pascal: huzzah, 4K gaming on a single card. Then came Turing, and you weren't running the newest games at 4K on that; back to 1440p it was. You see, there's a pattern here. The bottleneck being memory bandwidth, the solution was HBM memory, which has since been completely abandoned. But those HBM2 cards were phenomenal at 4K; they took a smaller hit at higher and higher resolutions. I've no doubt an 8192-bit HBM3 GPU would perform only slightly worse at 4K than at 1440p
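The bandwidth claim is easy to sanity-check: peak bandwidth = bus width × per-pin data rate / 8. A quick sketch (the per-pin rates are ballpark figures for each memory type, and the function name is made up):

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(384, 21.0))   # ~1008 GB/s: RTX 4090 (384-bit GDDR6X)
print(bandwidth_gb_s(8192, 6.4))   # ~6554 GB/s: hypothetical 8192-bit HBM3 card
```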
wow, what camera do you use? your cat looks beautiful with that camera
i am so used to your voice, humor and videos. it always feels like another dose of nicotine.
mirror's edge catalyst's theme hits you right in the soul.
Sometimes I wonder: will today's GPUs be like my old GTX 680? Will today's 4090 be considered a good budget card in 15 years? In 10 to 20 years, will the best GPU play games or videos at 16K or 32K? Will we even see a difference if we play at 16K? Will future games look more real, with 1000-polygon screws, or will we get 16K textures where you can see the blood and sweat that went into the game? It's fun to think about the future, but in real life GPUs get like 20% faster each new generation. Even some good GPUs like the 3070 suffer from the 8GB of VRAM Nvidia put on them.
I really don't even see the need for any anti-aliasing at 8K; it just seems like an all-around performance hit. Only if you're using an 8K 65-inch TV as a monitor on your desk might you need AA of some form, but honestly it's the same with 4K. I doubt you'd see any difference (other than framerate) from couch distance. Cool vid baby, teehee
i am still "stuck" with a 1080p144hz monitor and i love it. but thinking about 8K being a thing in like 5-10years is just insane. i am already blown away when i see 4K because it feels unreal in terms of how a game or movie CAN look. i always get flashbacks of my pentium 4 cpu that i had as a kid and always realize we've come so far!
Thank you for showing me what 8K looks like, it's neat. - Me, on a phone with the sun behind my back, on 480p video quality
Wow for some reason you sound really young in this video lol. I don't mean this in a negative (or positive) way, your voice just almost reminds me more of your older videos than your newer ones haha
I love listening to Philip tell me how poorly these games are running at 8K while I watch this video on YouTube mobile at 720p. I can really feel every pixel
Playing at 8K on a 1080p screen is effectively 16x (4x4) SSAA. Looks dope!! You can see things in detail down to a mile away and even further!
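For the arithmetic: the downsampling ratio applies per axis, so the effective sample count is its square:

```python
# Samples per displayed pixel when rendering at 8K for a 1080p screen.
render_w, render_h = 7680, 4320
display_w, display_h = 1920, 1080

per_axis = render_w // display_w                             # 4x per axis
samples = (render_w * render_h) // (display_w * display_h)   # 16 samples
print(per_axis, samples)  # 4 16 -> effectively 4x4 (16x) ordered-grid SSAA
```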
I love how most of the games you tested are games that I have never heard of.
I click on Mirror's Edge thumbnail videos.
Thumbs up for the Mirror's Edge Catalyst soundtrack at the beginning!!!!!
unrelated but holy shit i have never heard of "talos principle 2" but that might be the most beautiful game i have ever seen. those trees somehow look more realistic than real life.
somehow Battlefield 1 is the 2nd best looking game in this video. i don't know what it is about that game, but it always looked better than current-gen games to me.
Very cool video, friend.