The game is on *8k*
RUclips restricts quality in *4k*
My screen is *1080p*
Me watching this video in 144p
My PC just shuts down when I try to watch
lol my eyes are 240p so
@@ashneel my eyes 144p at 20 fps
Ash Neel try wiping your glasses
well my screen is 720p
8K is just basically overkill for this generation. Hell, even 4K isn't mainstream yet on PC.
8K fps counts today are like 4K fps counts in 2015. I'm interested in what fps counts will be normal in 2025.
Yung Glut basically Ampere GPUs should move that 4K segment to a playable and enjoyable state, as they mentioned before. It may be the first GPU in the world to make 4K gaming a decent pick for 2020 and onwards.
But if Nvidia wants to push the 4K gaming experience to the next level, by 2025 we may have a decent performance boost, where a 2080 Ti-class card at 4K gets a playable 60fps rather than 20-35fps
But as I said before, only about 0.73% of players play at native 4K resolution on highest settings in most games, which means 4K is basically still out of reach for most users
Isn't 8K an exaggeration? Who can buy it, and who could even use 16K or 32K? By the time I have the money to run 8K at 144fps, up to 4K at 60fps will be the reality here at home
@@DuaLL1337 yeah, the problem is that pixel count quadruples with every doubling of resolution. I guess in 2025 you'll play 4K at 144Hz, and 8K at 60fps depending on the game
DuaLL 4K is already overkill
Nice to know 2x SLI is supported well; all the games took full advantage of both GPUs!
Reviewers lie and tell you it's dead while their personal rigs have sli.
@@blkshp25 Agree!!!
@@GeniusFreak it's game by game. Some perform significantly worse in SLI mode, others perform well. All in all it's not worth the money for the extra GPU unless you need it for something other than gaming.
Took years for those games to be optimized properly tho ;)
@@blkshp25 couldn't be more true lol. I just realized that.
Remember 3 years ago when getting 4k 60 fps max settings was the goal? 3 years ago no one would've even thought about gaming in 8k in somewhat playable frame rates.
I tried all 3, and I feel 2K is the sweet spot, because at 2K the game can still look good and at the same time get high FPS, which to me IS THE MOST IMPORTANT. 4K and 8K look REALLY nice, but I don't like it when I only hit 60fps.
@That pro who lags on a 27 inch screen , no.
Just use Nvidia game filters instead of rendering your games at 4K and higher... You just sharpen the image and you will see no real difference from 4K and higher, while you only lose like 5-10 fps... Best feature ever
Who's here after the announcement of rtx 3000 series
Mee
Hi guys
Yep
me 😋
My dream, and I only have 1 RTX card
just go to the store
and here I am wishing for a GTX 1660 Super
@@hamzahdzulfiqar8173 I got an RTX 2060 a year ago.
@@dinsya4906 does it work well for you?
@Chris zhang nice... what games are you playing and how many fps do you get with it?
* 2080 Ti SLI + 8K Monitor RIG
Nobody: What did it cost?
Wallet: Everything....
Right now, cheaper than the 3090. At $400 each for used ones, I'm wondering what would be the best setup for a pimax 8k headset?
The lack of VRAM is evident in some games. If it had more than 11GB on one card, the performance would be much better..
"lack of VRAM"
It is enough for gaming...
If you are playing in 8k ultra with ray tracing enabled and dlss , of course maybe is not enough, but NO ONE plays in 8k, the majority that has this kind of cards plays at 4k high-ultra details and rtx on (if the game is compatible with it)
Only games poorly optimized can use that amount of VRAM (11gb)
check The Witcher 3, it uses less than 2gb of VRAM on 1080p. If games were good optimized they will use MAXIMUM 4-5gb vram on 1080p.
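For a sense of scale in this VRAM argument, here is a minimal back-of-the-envelope sketch of how much memory just the screen-sized render targets take as resolution grows. The buffer list and formats are assumptions (a simplified deferred pipeline), not what any specific game allocates; real engines use more targets, compression, and textures on top.

```python
# Back-of-the-envelope render-target memory by resolution. Illustrative only:
# the buffer list below is an ASSUMPTION (a simplified deferred pipeline),
# not what any particular game actually allocates.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

# (description, bytes per pixel) -- assumed formats
BUFFERS = [
    ("triple-buffered back buffer (RGBA8 x3)", 4 * 3),
    ("depth buffer (D32)",                     4),
    ("G-buffer (4x RGBA16F)",                  8 * 4),
]

for name, (w, h) in RESOLUTIONS.items():
    total_bytes = sum(w * h * bpp for _, bpp in BUFFERS)
    print(f"{name}: {w * h / 1e6:5.1f} MPixels, ~{total_bytes / 2**20:6.0f} MiB of render targets")
```

Even under these toy assumptions, every screen-sized buffer is 16x bigger at 8K than at 1080p (roughly 95 MiB vs 1.5 GiB here), so once textures, geometry, and driver overhead are added, 11GB stops being comfortable.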
RTX 3xxx series will have 24GB of VRAM. Next step into the future.
When Vram is not an issue ruclips.net/video/BUoQv5g-Jjo/видео.html
Damn.. will there ever be a benchmarking video without Crysis?
Because that game still brings PCs to their fucking knees, that's why.
I can't wait for the upcoming GPUs this year...
@kyojuro rengoku 😭😭😭😭
@kyojuro rengoku I will buy an RTX 3070 Super or 3080, but it depends on the price
U mean in 2 weeks?
@kyojuro rengoku oh my God!! So expensive
@@ThunderBlastvideo No, 2 months, because RTX 3090 Will arrive first
When BF5 runs smoother than Crysis. (Happy Crytek noises.)
3090 sli could totally run every game above 60fps on 8k
16K
When 8K is mainstream, the 3090 will be worth $500
@m.h gaming right now an 8K monitor is not attainable; even 4K is hardly mainstream. And we are not only talking about the monitor itself, but also ports and cable technology: right now 4K 144Hz monitors can only be achieved through video compression. It might take 3-4 years to get 1. 8K monitors at reasonable prices, 2. video content widely available in 8K, 3. new ports and cables which can support 8K at higher refresh rates. It would be no surprise if it takes 4-5 years to achieve all of these, and how much do you think the flagship will be worth after 2-3 generations?
@m.h gaming 4K TVs have been available for at least 4 years; how many people are using 4K monitors on PC? I don't think the number is higher than 50%
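The compression point checks out with simple arithmetic. A hedged sketch, assuming 8-bit RGB over DisplayPort 1.4's HBR3 link and ignoring blanking overhead: raw pixel data for 4K 144Hz already exceeds the payload bandwidth, which is why those monitors shipped with Display Stream Compression.

```python
# Sanity-checking the compression claim: raw pixel data vs DisplayPort 1.4.
# Assumes 8-bit RGB (24 bpp) and ignores blanking overhead, so the real
# requirement is even higher than computed here.

def raw_gbps(width, height, hz, bpp=24):
    """Uncompressed video bandwidth in Gbit/s, blanking ignored."""
    return width * height * hz * bpp / 1e9

DP14_PAYLOAD = 32.4 * 0.8  # HBR3: 32.4 Gbit/s raw, minus 8b/10b encoding = 25.92

for label, mode in [("4K 60Hz",  (3840, 2160, 60)),
                    ("4K 144Hz", (3840, 2160, 144)),
                    ("8K 60Hz",  (7680, 4320, 60))]:
    need = raw_gbps(*mode)
    verdict = "fits" if need <= DP14_PAYLOAD else "needs DSC or a newer link"
    print(f"{label}: {need:5.1f} Gbit/s vs {DP14_PAYLOAD:.2f} available -> {verdict}")
```

4K 60Hz needs about 12 Gbit/s and fits easily; 4K 144Hz needs about 28.7 Gbit/s and 8K 60Hz about 47.8 Gbit/s, both past what DP 1.4 can carry uncompressed.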
Your pc 2x RTX 2080TI GTA V 8K: 55 fps
My Pc: Burning
GTA 5 = Trash "game"
Андрей Шевцов ?
@@curtisg8399 bruh
Андрей Шевцов ?
Me: I'm getting only 60 fps(Battlefield 5)
My friend: LOL your pc is so trash:)
Me: At 8K:)
My friend: NANI!?
that pro who lags It was a joke, chill 🥺
I switched from a PS4 to PC last month, so I know how it is playing at 60fps. It's not bad at all. But 240 FPS is definitely buttery smooth. I get like 50 FPS at 4K high settings, let alone 8K.
Reminix PS4 and Xbox games are already locked at medium-high settings depending on the game, so why y'all complaining lol, that's as good as my PC lol, and they easily hit 60 fps
that pro who lags lol I’m fine 720p 20 FPS but like ok
I didn't expect this much performance at 8K by this time. Many games are somehow playable in 8K. A few years ago you were counting seconds per frame when playing in 8K. I am really interested to see how the upcoming 3080 Ti will perform.
3090 ti*
Ampere is here boi
Great video. What monitors were used for 4K/8K? Nvidia DSR?
I used Elgato Game Capture 4K60 Pro MK.2 (7680x4320 @ 30Hz, custom res)
Your 2080Ti's are often hovering at around 50% performance at 1440p, instead of showing you the most that they can do. Hence for benchmarking purposes you should keep your *"Power Management Mode"* in the "Nvidia Control Panel" at _"Prefer Maximum Performance"_ instead of the default _"Optimal Power"._
No, it's actually that his processor can't keep up with these high framerates. Even though it's the top-of-the-line Intel CPU for gaming, it still isn't fast enough to push like 160-240fps all the time. It doesn't have anything to do with the power setting. Although it says like 50% of the CPU is being used, just 1 core out of the 10 needs to be at 100% and the CPU can't put out more frames.
Why is the CPU working less at 8K than at 4K and 2K??
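To see this kind of bottleneck yourself, here is a minimal sketch using the third-party psutil package (an assumption; any per-core monitor works): total CPU usage can sit around 50% while a single core, often the game's main or render thread, is pinned at 100%.

```python
# Total CPU usage can read ~50% while one core is pinned at 100%,
# which is exactly the single-thread bottleneck described above.
# Requires the third-party psutil package: pip install psutil
import psutil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one sample per logical core
average = sum(per_core) / len(per_core)
hottest = max(per_core)

print(f"average load: {average:5.1f}%  hottest core: {hottest:5.1f}%")
if hottest > 95 and average < 60:
    print("-> likely single-thread bound: the game's main/render thread is "
          "saturating one core while the others sit mostly idle.")
```

This also answers why the CPU appears to work less at 8K: the GPU becomes the limiter, frames arrive more slowly, and the CPU spends more time waiting.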
This video 4k
My monitor 1080p
My internet 360p
My glasses 240p
My eyes 144p
2K is not 1440p. 1440p is called QHD. The 2K resolution is 2048x1080.
It's weird to see that you sometimes get less than 1/4 of the fps at 8K compared to 4K, even though it's only 4x the number of pixels
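The pixel arithmetic is easy to check, and it suggests why the drop can be worse than 4x: equal per-pixel cost would predict exactly 1/4 the fps, so a steeper drop points at something nonlinear, plausibly VRAM spilling into system memory at 8K (an inference, not something the video measures).

```python
# Pixel counts: 8K is exactly 4x the pixels of 4K, so with purely
# per-pixel cost you'd expect about 1/4 the fps, not less.

res = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}
pixels = {name: w * h for name, (w, h) in res.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['4K']:.2f}x of 4K)")
# A drop steeper than 4x from 4K to 8K hints at a nonlinear cost,
# e.g. render targets no longer fitting in the 11GB of VRAM.
```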
So you can't use this setup in Cold War? Only one 2080 Ti would work, correct?
Thanks for the video. I currently run a 10900K with 2 RTX 2080s SUPER in SLI at 1440p.
Oh my gawd, imagine open-loop watercooling these and overclocking them
JayzTwoCents did this, check it out
@@MrPeps-vb8oi ok ty :)
I think that graphics cards from the RTX series like the 2080 Ti work over NVLink... not SLI... or am I missing something?
Bro, which is the best processor for editing & gaming under $133?
I think an i3 9th gen or an AMD Ryzen 3 2200G would be good
Ryzen 3 3300X definitely ($120)
how does cpu clock speed affect the gpu utilization?
Contrary to popular misconception, SLI is not "dead," it just doesn't scale in EVERY game. If you've got a high refresh 1440p or 4K monitor and are looking for next gen card performance in AAA games today, it does provide benefit.
Which brand of RTX 2080 Ti do you have here?
What program are you running for all your information, GPU, CPU plus FPS?
4 rtx 2080 sli, best config for 8k
Nah, rtx 3090
@@obama69890 Much money
2 RTX 3090 Tis in SLI would assure 8K 60Hz or stereo 4K VR
16 titan v desk pc 16k 500fps lololol
Is there a difference between 2K and 4K with respect to the display?
Is there any point to this video if YouTube has limited image quality?
Nice work bro ❤️... I believe this is the future of gaming 🤪🌀❤️🤫
When it comes to 4K, the work falls basically on the GPU alone; the CPU does almost nothing at 4K. That's another thing that kills the performance
The video looks amazing
You have an 8K TV?
Over $3000 in GPUs alone to play 8K at 60fps. That's insane. Still, thanks for this very informative video. 👌
4K just got popular in 2020. I am so interested to watch 8K gameplay, and am looking forward to 8K & 16K HDR gaming in the future
What's the big difference between the 4K and 8K? For me, I didn't find any difference in the video
You are most likely watching in 2160p or under. You can’t see those extra pixels. You can’t fit 4320 pixels into 2160
Ark is a joke when it comes to optimization
As usual
Makes it good for benchmarking
It's the #1 graphical game tho
Rest in peace rtx 20s
Yeah, the fps they get in Crysis 3 at 4320p is what I get at 1080p on low settings.
Which of these 'p's is better to play at: low, middle, or high?
But that's not native 8K. I'm really curious what native 8K looks like. And could you tell me how aliasing fares when you bump it to 8K using GeForce software? Does it manage to get rid of it?
harshlens what is native 8k then?
Up to 4k, 3080 is slightly faster than 1080ti sli.
So I'll assume that the 4080 hopper against the 2080ti sli will have the same ratio.
My screen is 1080p. It all looks the same, no difference? Is that my problem?
Is there even a difference between 4k and 8k wtf
Yes, one makes me sweat. The other one swear.
8K basically means every quarter of your screen is an entire 4K display, which is insane and totally unnecessary. You can tell by the FPS difference between the two that gaming at 8K takes a huge hit even with this beast of a GPU
First thing, you are watching RUclips, and RUclips doesn't support x265; the bitrate is lower than the actual uploaded video.
If you want to see the real difference, get an 8K TV and play modern PC games that support 8K resolution, then change it to 4K; you will easily see the huge difference
@@accuratemrstuff3842 You really need at least an 8K 85-inch TV to notice the biggest difference... They are too expensive to justify the price... None of us have done the proper testing to notice the difference, so whatever anybody says in the comments means nothing... Also the performance hit is so bad because of the lack of VRAM... Unfortunately, it looks like the 3090 is only going to have 12GB of VRAM... games will need to support DLSS to be able to reach 8K 60fps in newly released AAA games
@@michaelangst6078 Yeah, that's what I was thinking. I play on a 27-inch and I heard 1440p is perfect because I sit like 40 inches away from it.
The i9-10900K and KF may be the fastest processors for games, but these processors' lack of PCIe 4.0 support for a PCIe Gen 4 SSD is discouraging.
The 11th gen will support it 👌 RTX 30 + PCIe 4.0 + 11th gen + NVMe SSDs that support PCIe 4 at 8000MB/s, maybe 10000+. Water cooled, it will be able to do 8K at 120 fps easy
@@brunopizzi1478 It may be, but let's get to the facts: AMD launched PCIe 4.0 support almost at the beginning of the i9-9900K and KF generation, so many Intel fans were looking forward to Intel launching something with PCIe 4.0 support. However, when Intel finally launched something, it was the 10900K and KF, which is practically a 9900K with more cores, and it still forced people to buy new motherboards, since it uses the Z490 chipset and isn't compatible with the 9900K's Z390.
So EVERYONE was waiting for Intel to give AMD some response in the 10th generation, which it didn't.
So, despite being the fastest processor for games, the 10900K is a huge disappointment; it is just a 9900K with more cores and it still requires you to purchase a new mobo.
And that's not to mention the fact that it looks like Intel has parked on 14nm lithography.
@@vladdraculLV I agree man, but I expect the 11th gen will be the bomb
What about 8K + DLSS?
so buy me an 8ktv and I will buy the 3090... fuckstick
Maybe try next time with Titan RTX?
RTX SLI at 1080p = God mode, 300fps. Something I would like to see
3:40 the FPS at 4K and 2K are the same lol
The fps cap in those games is 144fps, so the real numbers are much higher than what it's showing
That is either because of cpu bottleneck, or the game caps cutscenes at 144fps
Fadi Issa you’re stupid that’s not a cpu bottleneck
Fadi Issa Yeah what?
@@zenith1352 sorry didn't see your comment when I wrote mine. At one point the fps went to 145, so I am not sure if it is really capped at 144fps. We can see the GPU is not fully utilised.
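One hedged way to settle whether a game is really capped at 144fps is to look at the frametime distribution rather than the fps counter: a cap shows up as frametimes clustering at the cap period (1/144 s, about 6.94 ms), and a stray "145 fps" reading is just rounding jitter. A minimal sketch with made-up sample data:

```python
# A frame cap shows up as frametimes piling up at the cap period:
# 1/144 s is about 6.94 ms. A stray "145 fps" reading is rounding jitter.
# The samples below are MADE-UP illustration data, not from the video.
from statistics import median

frametimes_ms = [6.94, 6.93, 6.95, 6.94, 6.90, 6.94, 6.96, 6.95]  # hypothetical log

typical = median(frametimes_ms)
share_at_cap = sum(abs(t - typical) < 0.1 for t in frametimes_ms) / len(frametimes_ms)

print(f"median frametime: {typical:.2f} ms -> {1000 / typical:.0f} fps")
if share_at_cap > 0.9:
    print("frametimes cluster tightly at one period -> almost certainly a cap;"
          " a GPU limit would produce a noisy spread instead.")
```

The under-utilized GPU mentioned above fits the same picture: a capped game stops asking for frames, so the GPU never reaches 100%.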
I think for gaming 1080p is optimum.
Smoother gameplay or more resolution?
Half half
Music pls 0:00 - 0:19?
Does PCIe 3.0 bottleneck the 2080 Ti SLI?
just to clarify, 2K is not 1440p
QHD vs. UHD-I vs. UHD-II would be
correct, but nobody understands that😅
Most people think that, so what can you do about it.
Tell me what you want but on a normal sized monitor you DO NOT need 4k let alone 8k.
For a big ass tv or (super) ultra wide I can understand
Please tell me, what is the software in the video?
Phá Chuyên Gia looks like MSI Afterburner, using the monitoring mode
We're not ready for 1080p 240fps in Warzone, so... maybe the RTX 3000 can do it
How much power is the Ultra-PC consuming?
100%
Is titan rtx sli a thing?
can u show the pc? never seen such low temps, love ur videos
Didn't even notice lol, maybe it's 5 fans, not water cooling
You have a custom loop ?
Yes👌
Pictures -> RUclips community tab
Can you do this same test with Doom Eternal please.
Why is the fps at 4K higher than at 2K?
Why don't the graphics cards push their limits at 8K, while the i9 does?
Why does anything below 40 fps look choppy as hell?
Could you test 3090 with ARK?
I can play 4K at 60 fps, is that good?
can you please show your build
Oh, why don't you use a CPU that supports more PCIe lanes? The results would be much better
not really... ruclips.net/video/2NSlMWvEH3I/видео.html
I don't understand, where does all this FPS come from with 2 graphics cards? And if it is 2 graphics cards, why are they both running at 100%?
And where are the differences?
patiently waiting for a benchmark on the avengers.
If I read correctly, you have the two GPUs in an x8/x8 configuration (at least that's what the Z490 product description says: x16/x0 or x8/x8), which surprises me, because I tried the same thing in The Witcher with 2 MSI Gaming X Trio 2080 Tis and got a good 60 FPS less than you do in your video. I thought it was down to the x8/x8 link on my end, but apparently you use the same config and it just works for you.
The Intel i9-10900K only has 16 PCIe lanes, so only x8/x8 is possible.
Anti-aliasing was disabled in The Witcher 3. Settings: 04:42
@@BENCHMARKSFORGAMERS May I also ask how you route the two GPUs in your loop? GPU - Radiator - GPU?? I'd like to build mine the same way, but I'm afraid the tubing won't work out...
@@rudirenntier3175 Radiator - GPU - CPU - Radiator. Video of the PC: bit.ly/32rM0qk
The order has hardly any influence on the temperatures: ruclips.net/video/ea7X8s_6rJo/видео.html
For a moment I thought 4320p was 4K
Can you please add Anno 1800?
What is the name of the intro song
2080 ti = 3090
Please 3090 sli
Monitor 8K?
8K looks like it struggles to maintain 30fps, overheats your graphics card, and raises your electricity bill.
same people 10 years later:
still struggling with my old 3090 Ti SLI to keep stable 144 fps on my shitty 8k monitor that has only 144hz anyway...
Did the Pegomastax in ARK scare anyone else and then give you ARK flashbacks?
waiting for 3080 ti sli
red dead redemption 2 pliz
I didn't see any video of that game running at 8k, maybe for the next gen gpus... xd
bruh u didnt remove the fps lock in bf5
Temps are below the average temperatures in some parts of India. Wow
The GPU temps are so low; is this water block cooling?
Yes it is watercooled
No, I think watercooled would be even colder than this
2K is 1920x1080 because it has about 2 million pixels. The original 2K is 2048x1080, but that's for cinema; for mainstream it was cut to 1920x1080. 2560x1440 is not 2K, it's Quad HD.
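For reference, the conventional labels and pixel counts behind this naming debate; marketing usage is loose, and "2K" gets applied to both 1080p and 1440p. A quick sketch:

```python
# Conventional resolution labels and pixel counts. Marketing usage is loose:
# "2K" gets applied to both 1080p and 1440p, but these are the usual names.

names = {
    "DCI 2K (cinema)": (2048, 1080),
    "FHD / 1080p":     (1920, 1080),
    "QHD / 1440p":     (2560, 1440),
    "UHD / 4K":        (3840, 2160),
    "8K UHD":          (7680, 4320),
}

for label, (w, h) in names.items():
    print(f"{label:16s} {w}x{h} = {w * h / 1e6:5.2f} MPixels")
# 1920x1080 is ~2.07 MP, the closest consumer match to "2K";
# 2560x1440 is ~3.69 MP, closer to "2.5K" than to 2K.
```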
The right-side fps is my PC at 2K. Good old RX Vega 64 Nitro+.
Jesus, the 10900K is so strong, only like 20% usage at 4K and 8K... sure, Intel is overpriced as hell, but if money isn't a problem and you really want the best performance, go Intel
Isn't 2k 1080p?
No, 1920x1080 is called FHD or Full High Definition, 1080p.
2560x1440 is called 2K or 1440p.
There, now you know ;)
@@mobarakjama5570 oh ok lol, thx
Tommaso Tapparo 1920*1080p is 2k. 2560*1440p is 2,5k. 3840*2160p is 4K
@@Porche992 wrong friend :)
none of these games or any for that matter are optimized for 8k. doesn’t matter if you have flagship spec, you’ll still get low fps.
What’s the end song
If you ever feel useless, remember:
8K TVs exist.
so it CAN run crysis but at 30 fps