To put things into perspective: 1080p is 2 megapixels, 1440p is 3.6 megapixels, 4K is 8.2 megapixels, 8K is 33.2 megapixels.
And 16K is "only" 132.7 megapixels.
or 16x4K or 64x1080p
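For anyone who wants to sanity-check those numbers, here's a quick Python sketch (assuming the usual 16:9 dimensions for each resolution name):

```python
# Pixel counts for common 16:9 resolutions, compared against 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
    "16K":   (15360, 8640),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.0f}x 1080p)")
```

Running it gives roughly 2.1, 3.7, 8.3, 33.2 and 132.7 megapixels, with 16K landing at 64x the pixels of 1080p.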
It is the perfect resolution. No pixelation visible to the naked eye at a decently close distance. 4K and 8K are sharp, but put lots of detail on screen and you can still notice it. 1080p is just not good anymore; it needs strong AA and still isn't good enough, might as well play at 1440p with no AA. At 1440p and 4K you can turn on an upscaler and it looks good, plus you gain performance. 1080p with an upscaler looks soft, so it's better used as a budget resolution. The next monitor I buy certainly won't be 1080p.
Well no wonder my old 2060 hated my friend's TV. I was making it do 6x the work compared to my old screen. Got a 7900 XT and a 1440p display now.
Yeah, that's why I don't believe for a sec that the GPU actually computed all that... that's a resolution 1500% higher than 4K...
@@mclarenf1gtr99 stop using cancerous TAA on 1080p and it will look good again
Can't believe the 4070 is considered "mid-range" 💀💀
Bro i have a 1650s
PC gaming in general is getting more expensive, to the point that I'll probably be done with "modern" gaming on one within the next few years unfortunately. My 10400F + 2080 Super combo has been showing its age for a while now.
I mean if you look at the 40 series hierarchy it goes: 4060, 4060 ti, 4070, 4070 super, 4070 ti, 4070 ti super, 4080, 4080 super and then 4090. (4050 doesn’t count)
It's right in the middle of the series.
It has a small die, under 300 mm². It's just a mid range card priced very high.
Random Gaming in UberHD
If this video blows up, I swear sketchy marketplace sellers are probably going to have a field day with it. "16K capable" could become the next buzzword
😂
fr lol, probably gonna see "16k capable gaming pc" with a gt 710
the fact that we don't have commercial 16K displays yet that are easy to obtain makes this video useful for later
@@hyperturbotechnomike 4 4k Monitors is only 8k tho. Just like 4 2k (FullHD) Monitors is "only" 4k, resolution wise.
@@lS727 maybe "QUHD"?
Human eyes can't see in higher resolution than 8K so unless you are an Alien with super resolution eyes just stick with 8K.😂
@@GManWrites whoa, huge factors are at play here: the size of the screen and your distance relative to it.
You won't even see a difference between 8K and 16K unless you have special alien eyes.
Your comment about mid range being an ambiguous term is true. I remember when a mid range card was like £199 or £249. Pretty sure that's what I got my R9 290 for back in its day.
1080 will be retro in a few years
it already is 💀, it's just a worse 3060 at this point, with less vram too.
1440p is the sweet spot. I played at 4K on a 4K TV and barely saw any difference. Unless you're supposed to stick your face on the screen, I guess.
@@fourty9933What u talking about lmao💀
For a lot of reviewers it already is. 1440p is Perfect and 4k is still very high end...even tho many mid range cards can run 4k now
@@chedds He thought he was talking about the card 💀
We powering the Vegas Sphere with this one
the sphere is 16K x 16K not 16K
@@dumbfloppa I see
😂
they said the Sphere is powered by 150 RTX A6000 cards. So yea, a single 4070 Super is not even close lol
@@armando1is1great bruh ofc, it's a joke man
PC building companies and some content creators keep trying to get people to "move on" to 1440p 120/144 FPS while the majority (including myself) still uses and is fine with 1080p. IMO it's mostly just a way to push high-end hardware.
I still play at 1080p too 🤫😁
Just don't make the mistake I did and get a 4k monitor with a 3060, even with DLSS most games struggle!
@@unisonarcanine With new games a 3060 will struggle at 4k, but if you play stuff from the PS4 gen it might do okay. PS3 gen it definitely will do 4K max settings on pretty much everything. Emulation too. There are tons of great older games to play if you wanna use that 4K monitor, and it will still make a huge difference to their image quality even though 4k wasn't commonplace when some of them released.
Going back to play older games that you used to enjoy on much weaker hardware or console, even though they looked and ran like ass at the time, is half the fun of PC gaming imo. You're maybe missing out on a handful of brand new games with a 3060, but you've got like 30 years worth of "backwards compatibility" and a machine that can max most of it at 4k or higher.
1440p objectively looks a lot better, but 1080p is fine if that's all you can get. It's the bare minimum for me and a growing number of other people, though.
@@Scornfull with my 4k monitor I prioritise FPS over resolution so I usually end up at 1440p with DLSS anyway (4k + DLSS is often still too slow)
I just wanted to say thank you for the videos and the time you put into getting them out.
One day they'll have 128k resolution!
And then the 1080K will be the new 1080p gaming.
Well, the human visual system can only resolve around 500 million points of light, between the built-in visual receptors and the small, fast eye movements that amplify the overall resolution somewhat akin to temporal AA. 32K is over 528 million pixels. If one were to construct a display for use by multiple people (say something like the walls of a holodeck from Star Trek) then I could maybe see something like a 128K display put to use, or even higher (256K, 512K, 1024K, 2048K, 4096K), but we're talking a pretty good sized wall at that point.
@@jtenorj humans don't see pixels though, so technically things could still look even better at 64K. Of course, if you just move back further from your 32K TV, I think things will start to look like an actual window. 8K looks slightly blurry, especially when you look at a window and then look at the TV
I managed to get Middle-earth: Shadow of War running at 30 fps at around 12K with my 6800 XT 16GB, but 16K was a slideshow.
Nice. Apart from the slideshow part of course haha
I have an RX 6800 XT. I'm gonna have to try that game at 8K on the 4K TV I use as a monitor on my desk. Sounds like it should run decent enough for me.
put in perspective that 16K = 16x4k and 64x1080p so 30fps at 12k is not bad at all.
Funny thing... That first car you were driving in San Andreas looks just like the Chevy Impala I drove in high school. (Yes, I'm old.)
12 GB is mid range?!? My God, I just saved enough to buy a 3gb GTX 1060!
(Dammit Brazil)
3gb in 2024? Ouch!
@@unisonarcanine Brazil is though man
💀
@@unisonarcanine If he's playing older games he's good, I have 6 and it's more than enough for what I play
Mid range for the latest games like Alan Wake 2 with min 8gb vram requirement at 1440p Ultra Path Tracing Enabled
What if you applied some aggressive DLSS at the 16K res? Would that change anything?
Some mild DLSS or FSR or XeSS, like quality mode, might help a bit, but if you crank it to performance mode it would probably defeat the point of using 16K in the first place.
@@jtenorj i was mostly thinking that for the games that failed to run. but yeah i see what you mean.
"Less rural crimes and potholes" thats so true as a fellow british person 😭
'Less rural crime and potholes' 😂
Mad that there's games from over a decade ago that current hardware still can't handle within the bounds of the developer limitations.
In about 20 years when 16K is the new standard, people will go back and watch this video to see how these cards play their favorite retro games.
😂
@@RandomGaminginHD love the channel my guy. Peace from the USA
16K is never gonna be the standard, definitely not in 20 years. It's far beyond the point of diminishing returns. The only case where 16K resolution may occur is with really big screens or with static images designed to be zoomed in to see details; it's not gonna be worth it for anything real-time. Even if you could run it at a high framerate at some point, it would be a waste of energy.
@@b4ttlemast0r I don't think that's true. Even at 32 inches, I can see aliasing on a 4K OLED. We still have a ways to go.
Consoles will be native 4K in 20 years.
People giving you a hard time over the use of the term mid range feels unfair especially considering you accurately call it a "modern mid range" GPU which is what it is. Always enjoy your videos and look forward to any new releases :)
Buddy that card is not modern mid-range by any terms, the price dictates its positioning, this is a low tier high end chip. RTX 4080 being high tier high end. RTX 4090 being super high end. It's simple, but mid-range, that's nonsense.
@@GATERISTIC lol
@@GATERISTIC Delusional. Price doesn't determine class; it's the performance that counts. The 4070 matches the last gen 3080 and in terms of performance it falls in the middle of the 40 series.
It's definitely midrange by today's standards.
Resolution is like the level of polish on a car. The more it shines, the more it sells.
Unless you have your face firmly planted on a 100 inch TV you're not going to see the difference, and even then you probably still won't.
Well, running 8K on a smaller 4K TV right in front of my face is like running it at 4x SSAA which is nice, but could be better. Running a game at 16K on a 4K TV (again, the viewing distance is fairly close in my case) would be like running 4K at 16x SSAA which may be a bit overkill(8x SSAA is probably the realistic max one would notice over 4x SSAA) but I bet it would be noticeable. I'd probably stick with running older games at 8K on my 4K TV when possible and reasonable. But that's me.
@@jtenorj 16K native on a 16K display is not the same as 16K running on a 4K display with lots of supersampling. But I'll admit that at that resolution everything gets difficult to see anyway. 8K supersampling on a 4K display looks ultra smooth, but 16K native = 16x 4K or 64x 1080p. That's a lot of pixels; you'd have to stand right in front of a 30 ft screen to see the individual pixels. 16K is an absurd resolution, only sensible for some IMAX theatres, archive storage/master tapes or giant billboards.
@@KneppaH My 4K TV is a 43 inch (so between 40 and 48). When I lean in I can see pixels, but if I lean back things smooth out.
For a long while I was playing PC Building Simulator 1 at 4K max settings on my FX 8350 and RX 480 8GB PC getting between 25-30 fps.
I have a separate Ryzen 7 1700 machine I tried the game on, first with an RX 6700 XT. My frame rate shot up to over 100 fps(TV is basically 60 Hz but can do like a doubling to 120 Hz for smoothness I think) but my lows were still relatively low like 60ish. Upgrading to an RX 6800 XT made my lows improve to near average but the average didn't move so I think I have a CPU bottleneck. Need to maybe try my R9 3900X or just move straight to my 5800X3D.
PCBS1 doesn't have any kind of AA available by default. Looks pretty good at 4K native with some jaggies. I've tried 1080p and it's horrible by comparison. Also tried 1440p and while better clearly isn't as good as 4k. I should try to use the driver to super scale it from 8k.
I was recently playing 2005's Battlefield 2 in bot mode in 8K on the 4K TV and it was quite crisp. I don't think 16k is available in the options menu, but maybe I can force 16K from the driver. Might try that in the near future and see if I can notice any difference.
True, and on top of that, the games that will actually run at 16K won't even have 2K/4K textures in their game files 😂
@@W-meme Yeah, older games won't likely have super high res textures. But running some of these games at much higher resolution then down sampling to 4K should all but eliminate jaggies from the edges of polygons.
RandomGamingin16K
nice video :D
I used to play certain multiplayer games at super high resolution, farthest draw distance, and super low details. When you can very clearly see the other team on the other side of the map, it makes sniping that much easier.
Please do comparisons of different GPU's at the same power limit. For example 1660 Super vs 2060 Super@120 watt. Or 1060, 1660 super, 2060 super, 3060 and 4060 all of them on power limit 100 watts.
I'd like to see how Medal of Honor Allied Assault would deal with it
This game got me into online fps gaming. There's actually a free version you can download and play online
Yeah, GTA San Andreas is very pristine and clear at 16K. This reminds me of upscaling the resolution of classic Genesis/Mega Drive games like Sonic; they look very sharp and pristine at 1080p when using an upscaler and SCART and HDMI cables.
Very important information on eyes
Gonna save this video in a digital time capsule until youtube has 16k video option.
me watching 4k videos on a 720p monitor
Weird how I'm still seeing tons of aliasing even with the crazy level of supersampling.
Is it an issue with the recording software, perhaps?
You always do a great job with your videos! Where in England do you live?
As a little fun fact, the human eye can only see in about 32K, meaning we are now a quarter of the way to being able to... at least look at a freeze-frame of what it would be like to run a game at real-life resolution
3:33 Bad Company 2. To this day I still play it on PC, using Project Rome to find online servers. What a game
6:26 ahh, driver crash, great
Well this experiment was cool
shadow of war has 16k textures in the game
In no world is a 4070ti midrange 😂. Great video tho bro.
Not sure if anyone else mentioned this, but you can click on the error box and press ctrl + c and after u close the game you can paste it into notepad or something to see the error
RGHD really grew from Celeron iGPUs and recycling yard find PCs to the 1080 Ti video then now to “mid range” 4070 Supers and 16K gaming 😂
I listened to the voice in this video to fall asleep in the morning
Some games that don't have the option to go that high will sometimes run at it anyway if you go into windowed mode and maximize the window, regardless of whether the 16K option was in the game or not, assuming you are running your desktop at 16K lol
When I play at 4k on my 4k monitor, some games prefer fullscreen 100% scaling (original screen size) some games won't start without this. I don't have a 16k monitor but I assume this might apply.
Another reason I feel like AMD actually cares about its customers: I also got FSR 3.1 and frame gen on my last gen RX 6800 with 16GB of VRAM. AMD FTW.
0:54 Brightness increase when hitting pedestrian.
It's because motion blur kicks in when you're driving at speed, and the motion blur effect slightly reduces the brightness. When the car hits the pedestrian, the car slows down, so the motion blur goes away and it's back to the default brightness 😊
You should try an A100 or H100 80GB, since i'm sure there's no price limitations for your channel! 80gb of vram might just be enough for this endeavor. Not to mention output ports- who needs em?
I'm definitely a fan of being able to find tricks to run stupidly high resolutions, but I really do hope they stop at 8K for computer monitors. There's no reasonable situation where 16K will ever be useful. It's already really hard to justify 8K, because outside of big cinema screens, or sitting insanely close to your monitor, you won't see a difference.
This just isn't correct at 8K. Things still don't look as clear as they do in real life. I want my monitor to look like a window. In fact I want to remove my window and replace it with a monitor. I can't do that at 8K cuz it's not clear enough. I think I have to go to like 64k since humans can theoretically see up to 32k
In probably 5 to 10 years' time you will probably do a similar video running Cyberpunk in 16K, and then, my friend, I'll have seen it all
The 4070 Super FE is easily a great little GPU, that is, of course, for the price of GPUs nowadays.
remember you may have a lot of people saying 16k is pointless and will never be popular but they said the same thing about 4k 10 years ago and even about 720p once upon a time
The games just don't have enough pixels of detail to even fill the screen at 16K; it all just gets smoothed out.
I have this card (the FE), I love how it looks and performs so much
It's criminal to call a 600 dollar card mid range
It is mid range in the stack; price wise it is not mid range
Not really... it is mid range by product stack. Sorry that some of the most powerful consumer gpus ever made are a bit pricey? Complain about something else like new car prices or sum.
@@FacialVomitTurtleFights Yes, but keep in mind top end cards from 10 years ago were legit half the price (even accounting for inflation)
It is mid range though
Price doesn't determine class. It's the performance that matters and the 70 class was always midrange.
I'm rolling, "sharper than my eyeballs can see" 💀💀
It's actually not, I game on an 8K TV and I still wish I had 16k because it's not perfectly clear. All you have to do is look out the window and see how much better it looks compared to your 8K TV
I set the DSR on my 2K display to a max of 5K to test my 2060 Super OC 8GB with Factorio & 7 Days to Die. Every time I set 7 Days to Die to 5K it would go back to 2K, and Factorio would only run in 2K, but Factorio has no res options. I then found out Factorio runs at whatever your desktop res is, so I set that to my DSR setting of 5K and Factorio now runs at 5K 60fps. 7 Days to Die will now also keep the 5K setting, but on low to mid settings I only get about 20 to 25fps. I do not own BioShock, but maybe setting your desktop res to the max DSR would make it work.
Me playing on 1080p with 16gb of vram... Hmmm, yes, interesting...
You can see the very tiny windows using the zoom tool (Magnifier) in Windows. Just hold the Win key and press + several times.
P.S. Half-Life 2 Lost Coast, 3080ti, 16k - 133.25 fps average.
I like how the 480 number is in the width (W) res and the 640 number is in the height (Y) res lol
I also can't bear to call a 4070 Super a mid range card; something like a 2080 or 3060 Ti is probably closer to mid range these days
Then I guess you just like living in the past...
If someone showed you the 4000 series desktop gpus, and pointed to the gpu that was 4th from the bottom (with 5 gpus above it), what would you call it?
Everything is about context
There is a lowest end Ferrari too
are these resolutions real? Or just an option, but with no real effect? For example a 4K res. with x4 AA to downscale it for smoother edges?
To have the benefit of this resolution you will have to have a display with the same number of pixels. 16K = 16 x 4K or 64 x 1080p. That is an insane number of pixels. It only makes sense for 30 ft displays and standing right in front of them. 1080p = 2 megapixels. 16K = 132 megapixels. That's a big difference in resolution. 16K supersampling on a 4K display is a waste of resources and energy. 8K on a 4K display is the best you can get. After that the next logical step is native 8K on an 8K display.
@@KneppaH Technically, 16K on a 4K screen (16x SSAA) is overkill. However, 8x SSAA is not (it's better than the 4x SSAA of running 8K on a 4K screen). If you have a largish (TV size) screen right in front of you (on a desk) I would contend one can spot the difference. It may not be practical in some cases (doing 16K on a 4K screen with newer games drops the performance to unenjoyable levels), but for older games where it can easily be done with decent frame rates, why not?
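The SSAA equivalence in this thread is just the ratio of rendered pixels to displayed pixels; a minimal sketch, assuming standard 16:9 dimensions:

```python
# Effective supersampling factor when rendering above the display resolution.
def ssaa_factor(render, display):
    rw, rh = render
    dw, dh = display
    return (rw * rh) / (dw * dh)

display_4k = (3840, 2160)
print(ssaa_factor((7680, 4320), display_4k))   # 8K rendered on a 4K screen  -> 4.0
print(ssaa_factor((15360, 8640), display_4k))  # 16K rendered on a 4K screen -> 16.0
```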
Me who still games at 720p... LOL
I feel like that's fine. 1080p might actually be slightly worse. When you go to really low resolutions like 720p, 480p and all that, it just looks pixely, but at these weird middle resolutions like 1080p and 1440p it just looks like you have bad vision, and at the upper end like 4K and 8K it just looks like a slightly blurry window. I wish we had 16K and 32K though, so we could have it look like an actual window
I get that this was just for funsies, but I never see the point in running at a higher resolution if you have to turn everything down to the lowest settings. It's so much better looking overall to have lower resolution and higher settings. Sure it's not as sharp, but 1440p high is better than 4K low any day of the week thanks to better effects, shadows, foliage, etc.
Yeah I agree tbh
I def agree with this on modern games where we've got more effective and cheaper post-process anti-aliasing solutions like TAA and DLSS/FSR. Reduced aliasing has always been the biggest improvement gained from higher resolutions imo, and that is largely taken care of at any resolution over 1080p nowadays. The increase in overall image quality between 1440p and 4K obviously is still visible, even 4k to 8k is visible (including down sampling), but with anti-aliasing taken care of separately it's less of a big deal in modern games.
Where higher resolution really makes a difference is in older games that lack good post-process anti-aliasing (so games from the 7th console generation or older mostly), but thankfully when playing most of those on any semi-recent GPU you can easily run them maxed at 4k/5k/8k or whatever. Anything beyond 2x super sampling is probably overkill though imo.
I think it will be a very, very long time before we're seriously talking about playing games at 16k being a thing. Even with downsampling on an 8k display I mean. I think at that point we're well into diminishing returns in terms of image quality.
@@yewtewbstew547 Even back when 720p was standard and most high end cards could barely do 1080p and 1440p was a pipe dream, I still believed it was better to do 720p high or ultra than to do 1080p low or medium. The game just looks worse. It's true that a lot of modern games still look great at low presets, but you can still see a difference between low and even medium, and it's better to have lower resolution at higher settings than the other way around, and I think it always will be that way.
@@ThatMetalheadMan Yeah there are settings that you just have to prioritize over everything sometimes. Stuff like draw distance in games with big maps.
@@yewtewbstew547 something like PUBG or Fortnite maybe is better to have high res and low settings other than draw distance for an advantage but those are niche and not the usual for mainstream games
is there a big difference between the regular 4070 and the Super version? will I get similar performance to the Super version if I pick the standard one?
Not much, but the extra money for the Super is worth it
@@snakeshepard9761 One way to put it is the vanilla 4070 has less than 6000 CUDA cores(5888 to be precise) while the 4070 Super has over 7000. Similar clock speeds.
Jeez... if only there was a world wide search engine in which a person could ask that question...
Me still playing at 900p in 2024, haha, I'm used to it
Still have a GTX 980 😅😢😂
Respect for Bad Company 2.
how about using an upscaler?
I always play on 16K on my little GTX 970
*opens minesweeper
The fact that people are getting butthurt over the 4070 being midrange is hilarious 🤣
People really need to stop living in the past, the 4070 is a midrange card.
someone once said you need 1gb of VRAM per K of resolution so that’s probably why it didn’t work. lol
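The "1 GB per K" line is just a rule of thumb, but the raw render-target cost is easy to estimate; a rough sketch (the 4-byte-per-pixel, three-buffer figure is an assumption, and real engines allocate far more targets than this):

```python
# Back-of-envelope framebuffer memory at a given resolution.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    # e.g. colour + depth/stencil + one post-process target, 32 bits per pixel each
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(f"4K:  {framebuffer_mb(3840, 2160):.0f} MB")    # ~95 MB
print(f"16K: {framebuffer_mb(15360, 8640):.0f} MB")   # ~1519 MB
```

Multiply that by the dozen or more render targets a modern deferred renderer keeps around and it's easy to see how 16K could chew through 12 GB on its own.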
Have you ever tested 16k resolution with 4090?
well... 16K is out of my reach. still... might be a useful video for the future.
why do some of the games have pretty noticeable jaggies? is it just because of no anti aliasing?
It's most probably because of the downsampling for the video. Different scaling algorithms give different outcomes.
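If it is the downscale, the choice of filter matters a lot: nearest-neighbour just throws samples away and brings the jaggies back, while an area/box filter averages them the way supersampling would. A small sketch, assuming Pillow is installed and "capture_16k.png" is a stand-in filename:

```python
# Compare downscaling filters on a high-res capture.
from PIL import Image

src = Image.open("capture_16k.png")      # hypothetical 15360x8640 frame grab
target = (3840, 2160)

src.resize(target, Image.Resampling.NEAREST).save("downscale_nearest.png")  # keeps aliasing
src.resize(target, Image.Resampling.BOX).save("downscale_box.png")          # averages samples
```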
16K... WHY... Also depends a lot on having a very light game to run at that res
16x SSAA on a 4K display (which is admittedly a bit overkill, but 8x SSAA could be noticed over the 4x SSAA of running 8K on a 4K display; mine is a 43 inch TV on my desk).
fun.
I've been saying this for years: if you are just gaming, why go for higher res and expensive hardware? Isn't it about the game, about how much fun it is, with hardware that's enough to enjoy the game? (around 75fps on high)
This is pretty accurate. I picked up a GTX 1080 TI for free and another free 4k monitor from 7 years ago and I can still get 60 FPS at native 4K.
Although 1080p just isn't good enough for me at least
The lack of vram is really problematic
half life was the only game where you got more fps than 20 because it managed to stay within 12 gigs of vram somehow
4070 is mid range? Gotta get more bread
Apparently so. I’d consider it high end but 600 mid range is where we’re at now for some reason lol
@@RandomGaminginHD To imagine that less than 10 years ago paying 800€ for a high-end GPU was considered madness and now we're paying 600/700 for mid-range ones... The GPU crysis really showed companies that gamers will pay anything for the new hot thing
@@RandomGaminginHD very true, correct response.
Mainly because people ignore cheaper but equivalent options, and both companies are hellbent on keeping pandemic pricing
@@YetAnotherSADXFan I still think it will bite Nvidia in the ass down the line, at least as far as gaming GPU sales go, because I reckon it will result in people upgrading less frequently. It's not like people just magically have an extra ~$1,000 to throw away every 2-4 years compared to what people were earning 10 years ago, and when you combine that with the (imo) quite dramatically reduced number of high quality games releasing per year nowadays, the incentive to regularly upgrade just isn't there anymore.
Am I dreaming or does San Andreas have some aliasing at 16K?
Interesting 🤔
Remember: 16K is 16 x 4K. 64 x 1080p
Try setting your desktop resolution to 16K to make it work
forget the card, what monitor are you using?!
My guess is either a 1080p monitor with high refresh or a 4K TV with 60Hz refresh. I know he's had one of each at one time or another, and may still have both.
Oh my science
Wonder how an rx 6600 would do it
Just get rx 580 8gbs for 30-40 bucks used and call it a day. Fullhd high/medium 60fps everything
Where Skyrim
xD
I think you should do this test with a 7900xtx 😊
This could even work because the card has 4 GB more VRAM and runs better in rasterization.
@@pixelpiet4211 *12 GB more VRAM
@@pixelpiet4211the xtx has 24gb vram
@@pixelpiet4211 The 7900 XTX has 24GB of vram to the 4070 Super's 12GB. That's double. I wonder if he still has that RTX 4080 Super or if he had to give it back.
@@jtenorj sorry ,confused with the GRE 😁
Well, I'd heard of DLDSR but not the DSR tool... 😮
Meanwhile me with a Dell Inspiron laptop barely getting 50 fps at the lowest settings
Hey thats the one i have 🎉
970 mid range 1070 mid range 2070 mid range 3070 mid range and GUESS FKIG WHAT! 40 GOT DAMN 70 MID RANGE!
I swear if I see one more dmas comment "4070 mid range? yeah sure." You may not like it, but yeah its a fkn mid range gpu. Get over it.
try outputting to a CRT screen
12gb vram not enough? im sure the 16gb 4060 ti would have done better 🦐🦐🦐🦐🦐🦐
Are six shrimps a modern symbol of technology misused?
@@BenAkenobi yes
For old games it's better to use 1080p res, because the textures look so fake at high res
how much would you sell that gpu to me for?
Try it again a few years from now with an RTX 7070 with 24 GB of VRAM; then maybe you can play 16K at 60 FPS. 😊
4070 super midrange?
Apparently so
yes because the 4080 and 4090 are high end. The 4060 is the entry-level card in the series.
Upper mid range, maybe? I mean there's still the 4070 Ti, 4070 Ti Super, 4080, 4080 Super and 4090 above it. Below are only the 4060, 4060 Ti(8GB/16GB) and the vanilla 4070. I suppose one could also include the 3050 6GB since even though it's based on last gen architecture, Nvidia just released it not too long ago.
Wouldn't it be 15K? Not 16K? The horizontal resolution is closer to 15,000 than 16,000. 16K would be 16000x9000
(also the 4070 Super is DEFINITELY NOT midrange, that's one of the highest end cards you can buy)
It's in the middle of the RTX 40 series hierarchy so it is indeed midrange in that sense
Agree on resolution naming, somewhat disagree on card tier. Current US pricing for new cards 7/23/2024(based on PCPartPicker):
4090 $1700+
4080 $1100+
4080S $960+
4070 Ti Super $770+
4070 Ti $710+
4070S $580+
4070 $540+
4060 Ti 16GB $433+
4060 Ti 8GB $365+
4060 $280+
3050 6GB $160
4070 Super is right in the middle.
Everything with less than 16 GB of VRAM is the middest of the midrange
mid-range? yeah sure
Yeah I’d call it high end myself but apparently not
@@RandomGaminginHD Cards like the 4080, 4080 Super and the 4090 should be called enthusiast level cards because these cards are priced as if they are high production equipment, definitely not consumer friendly prices, so in a sense the RTX 4070 Super is the new GTX 1080 Ti of 2024 :P
@@Russell970 A big part of the problem too imo is that a lot of the bigger games releasing nowadays are really bloated in terms of how they actually utilize your GPU. You'll be playing something that looks about 10% better than a PS4 game, but it wants 2x the VRAM and an extra 150 watts just to maintain 60fps.
Unpopular to say this probably, but I've noticed that Unreal Engine 4/5 and 3D Unity games are the worst culprits.
@@yewtewbstew547 Thanks to DLSS, FSR and other upscalers the devs became lazy when it comes to optimizing their games and focused instead on releasing games years before they are finished (speaking of Cyberpunk 2077 and Starfield as an example)