Do you think the 4070 Ti SUPER will perform this well in games released in 2025?
Yes sir, got one myself
For €1000 a card in most countries... I really don't care.
It more than likely will, yes. (But I'm happy with my $500 RX 6950 XT at 1080p 😝😝😝)
I think so. Right now, the performance is exceptional and thanks to DLSS, one can tinker settings to play the game at smooth framerates and good visuals.
It won't hold up into next year. It barely holds up now at 4K. You need to use DLSS. That was my comment before watching. I kind of know already what it's like. I have an RTX 4070 Ti Super.
The 4070 Ti Super is a great card, but I would also recommend the 4070 Super (non-Ti). The Ti Super is $200 more expensive (in the US) but delivers only 15-20% more performance. Notice the VRAM being utilized in this video: whenever we got close to 12GB, we had to dial back the settings anyway.
Good point. I love the 4070 SUPER. It's been my daily driver throughout 2024.
Actually, I don't think so. You can play at medium settings but ULTRA textures and get great performance out of this card. Dialing back to medium textures is awful, since textures don't require compute power, just memory size. 16GB will be much better than 12 over the next 2 years.
16gb vram is the minimum now.
@@pf100andahalf minimum means to run the game on low settings
I have not been able to push my 4070 Ti hard enough to force it to run out of VRAM at playable framerates. It always runs out of power first, no matter what texture quality, RT settings and FG type I use.
12GB is a comfortable buffer size for modern uncompromised 1440p gaming and slightly compromised 4K gaming.
My 4070 TI Super is strictly for 1440p.
The jump from 1080p was expensive, but it was more of an upgrade than 1440p to 4K would be.
The one upgrade I'm looking at is a 1440p OLED to replace my IPS 170Hz monitor.
4K will be for my next system upgrade in a few years.
Why not upgrade to 4K OLED later instead of spending money on 1440p OLED? IPS gets the job done well for now, no? 4K is great. It's become my go-to resolution because of how versatile it is thanks to DLSS. You don't even need a powerful GPU to enjoy games on a 4K monitor nowadays.
I don't get why people say the jump from 1440p to 4K isn't as much of an upgrade as from 1080p to 1440p. 4K is more than double the pixels of 1440p, while 1080p is more than half of 1440p, so the 4K jump is the bigger one. This math doesn't add up 😂
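The pixel math in the comment above is easy to check. A quick sketch (the resolutions are the standard 16:9 figures these labels refer to):

```python
# Pixel counts for the three common 16:9 resolutions under discussion.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
# 1080p: 2,073,600 px; 1440p: 3,686,400 px; 4K: 8,294,400 px

# 4K has 2.25x the pixels of 1440p, while 1080p has 0.5625x
# (i.e. more than half), so 1440p -> 4K is the larger relative jump.
ratio_4k = pixels["4K"] / pixels["1440p"]        # 2.25
ratio_1080 = pixels["1080p"] / pixels["1440p"]   # 0.5625
print(ratio_4k, ratio_1080)
```

Of course, pixel count isn't the whole story; perceived sharpness also depends on screen size and viewing distance, which is what the "strictly 1440p" crowd is arguing about.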
Why do people keep saying "strictly"? Not at all; just use DLSS, and in most games it's more than enough. I am using a 3080 12GB, which is a bit weaker, and I am very happy with the performance. You don't have to max out every single game; some settings are for the future. It has always been this way. I have been playing Jedi: Fallen Order and the newer one, Cyberpunk 2077, GoW Ragnarok, and other games, and I never have to go below DLSS Balanced to get at least 80+ fps.
Stick with 1440p; just pick a monitor with higher pixel density than others. It looks pretty sharp, not much difference from 4K at all.
Stalker's performance is not the GPU's fault. The game is optimized like horse ass, as many UE5 games are.
I would say the 4070 Ti Super is more a solid 1440p card; got one myself for 2K gaming.
It's entry level 4K so that makes it the ultimate 1440p card, I'd argue that it's overkill even.
Same here bro
For Wukong: shadows high, global illumination medium, visual and hair effects medium, vegetation high. Then try RT Very High at 4K with the RTX 4070 Ti Super and see the magic.
Let's make one thing VERY CLEAR. The 4070 Ti Super is NOT a 4K GPU. Just because it can, doesn't mean it should. I have one and I love it, but I also have realistic expectations.
Its max digital resolution is 7680x4320. So it can do 4K, but it depends on the game; you get from 50 to 90 fps, so it works very well.
I picked up a 4070 Ti Super a few weeks ago to upgrade from my RX 5700 XT. Best upgrade after 4 years. I play on a 144Hz ultrawide monitor, and these cards are perfect for it. Immersive gaming at high-to-ultra settings with high fps.
And as a video editor and motion graphics artist, the 16GB of VRAM and the CUDA cores are a big plus for me. My workflow is massively faster; it's just super smooth compared to my old 5700 XT. Thinking of upgrading my monitor to an ultrawide OLED next 😊
underrated channel!!
Excellent video...recently got myself this card. Do you feel like 32" is too large for productivity? I'm thinking 27" at 4k.
Remember that if you enable ray tracing, raster-based shadows and ambient occlusion should be turned off.
Nice video. Just bought a samsung 43in 4k monitor. Moving from 32in 2k. I think i'll be happy with my 4070 ti super performance
Nice upgrade. Enjoy!
The 4080 Super is the best combo with a 9800X3D.
ordered
@@avash2009 The 9800X3D is great, but I’d recommend just waiting for the new cards come January.
Disable the MPO feature in Windows and you'll get 10 more frames.
I just installed my MSi Shadow 3x RTX 4070Ti Super and I absolutely love it! I usually play at 1440P max settings and it does that with ease! I have a 165hz 1440P monitor and it easily plays Assassin's Creed Mirage and Valhalla at 144fps+ on max settings in most areas. I was able to get around 120fps in Stalker 2 with DLSS quality and 1440P optimized settings. This card does more than I could need!
Great to hear! Enjoying the experience is what matters most.🙂
@theivadim your videos are the reason I chose that card. I have an i9 11900kf CPU. Do you think that CPU will bottleneck the 4070Ti Super? It didn't seem to be bottlenecking on most of the games I've played other than maybe Stalker 2...
Exactly, in some CPU-demanding games, you will see a bottleneck, but most perform just fine. Ultimately, only upgrade if the FPS you see on screen is lower than what you’d consider a good experience.
@@theivadim it's already maxing my 165hz monitor out in a lot of games, so I should be good to go for a while. I would like to upgrade to a 4k monitor in the future, now that I have the GPU capable of some 4k gaming.
Good idea. Thanks to DLSS resolution upscaling, you can get a great experience on a 4K monitor. It’s a worthwhile upgrade, in my opinion.
I got mine so I could play modern games with maxed out settings at 175fps 1440p.
My 5800X3D is starting to show its age, especially in Stalker 2.
I sold my 4070 and got the TUF OC 4070 Ti Super, and I play at 4K all day, so it was a big upgrade: not just 30 fps more, but more like 60 when the 4070 ran over its VRAM. So I'm happy with mine. But that's just me though.
This card is unstoppable at 1080p, no ai needed
nice video mate, can you make the same video for 4070 super and ultrawide
Um... I'm not sure what camera or setup exists that lets one record hands-free...
Depends on the games you wanna play on it. If you wanna play older titles from the PS4/XBO era, or games released on both the new and old generation of consoles, then yeah, it can be a great 4K GPU for that. If we're talking next-gen games only, then it's a big NO, unless the game is well optimized or doesn't push fidelity beyond what the current hardware can handle.
RT is still a joke of a feature that eats way too many resources compared to normal high-quality baked lighting solutions. It's nice as an additional feature on top of traditional lighting.
The only advantage of RT right now is that developers can save time and money by doing everything in real time instead of waiting for the old lighting solutions to render. At a very high cost to the end user, so YOU, the customer. Not to mention that the majority of PC gamers who had enough money to buy a new GPU in recent years all use a 3060 or a 4060, so they will never be able to even enable most of these advanced RT features without turning their game into a Nintendo 64 slideshow.
When mainstream hardware finally catches up to the demands developers push in their new games, that feature will be worth something. But by then they will try to force something else, just to make people upgrade their hardware.
We could have had games with the graphical quality of RDR2 or Batman: Arkham Knight for a few more years, and I'm sure the majority of gamers would not have complained one bit, if the games were good and fun to play. In the meantime, RT could have caught up to where it's actually usable for mainstream consumers, without cutting your fps in half or more.
FG is also very bad. I prefer to play my games with DLSS Quality and have 90-100 fps if I can, rather than use FG and instantly feel the added latency on my mouse or controller. I'm amazed that some people can't feel the huge delay in response time this feature adds. Or better yet, how some developers are now pushing us to FG from 30 to 60 fps. Good luck with that.
Overall, the current generation of GPUs is very bad: overpriced, with features they can't fully handle. An $850+ GPU that has to lower settings at 2K in modern games. Joke of a product.
But considering that the 5070 is expected to have only 12GB of VRAM and perform at the level of a regular 4070 Ti (not Super), this GPU will easily last you until the next generation of consoles launches, so at least until late 2026/early 2027. Not to mention you will always be playing above console settings: better resolution, graphics, framerate, and RT. I mean, a 3070 is still doing better in third-party games than a PS5 Pro, so I would not worry about it not lasting until then.
me: my 4080super is no longer enough for 1440p.
author: is 4070ti super enough for 4k?
A higher resolution like 4K doesn't make games more fun; it just sharpens the image a tad, and on a 27-inch monitor it's not a big difference, which is why I'm sticking with 1080p to enjoy more fps. The PC gaming industry is trying to convince people they need to play in 4K to get them to buy new hardware, because not many people struggle to run any game at 1080p now.
I can't believe that's a 32" monitor. Lol. Why does it look so small? 😅
probably because it’s curved
Probably because the camera lens is ultrawide. Optical illusion.🙂
I think this card shines at 1440p high settings; it's the king of that resolution. For 4K it's still good as an entry-level option, but if you can stretch to the 4080, that card performs better at 4K. If I had to cut my budget for a 4K setup, though, this card is a great option.
Nice video!! What can you say about your speakers? I want to get some. And for the monitor, should I go with the Samsung G8 OLED or MSI?
I highly recommend waiting for the RX 8800 XT that thing will be a beast.
Is it better to just play at 1440p native? Or is 4K DLSS still better? Can you explain, please?
4K DLSS Performance will still look better than 1440p native. The quality of the visuals comes not just from the GPU but from the monitor, via what's called "pixel density". If you play 1440p on a 4K monitor, you are basically 'stretching' each rendered pixel across multiple physical pixels to fill the screen, which results in lower visual quality (think of expanding a 640x480 photo to fill a 1920x1080 screen; it will look blurry). Using 4K DLSS, each pixel actually fills the space it is designed to fill: instead of being 'stretched', the image is digitally upscaled to the monitor's native grid, and because 4K monitors have higher pixel density, 4K DLSS is essentially using more pixels, resulting in a sharper image.
When it comes to FPS, 1440p native will (in most cases) give you a higher FPS count than 4K DLSS Performance. So at the end of the day, you can have either better visuals or higher FPS; you can't have the best of both worlds.
Hope this helps.
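To make the trade-off above concrete: DLSS renders internally at a fraction of the output resolution and upscales the rest. A small sketch, using the commonly cited per-axis scale factors for each DLSS mode (these defaults are an assumption; individual games can override them):

```python
# Commonly cited per-axis render-scale factors for DLSS modes.
# NOTE: these are assumed defaults; games may use different ratios.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales to the output."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

# At 4K output, DLSS Performance renders internally at 1080p,
# and DLSS Quality renders internally at 1440p.
print(internal_resolution(3840, 2160, "Performance"))
print(internal_resolution(3840, 2160, "Quality"))
```

This is why 4K DLSS Performance can be cheaper than it sounds: the GPU is doing roughly 1080p worth of shading work, while the output still targets the 4K monitor's full pixel grid.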
@ Thank you very much, sir. I was just watching videos saying 4K is not necessary, and also that if you use DLSS or FSR the quality will be lower. Maybe, but after your explanation it makes sense. I just want to know if the extra money for DLSS is worth it compared to FSR. Could you please explain?
@@pchx Both DLSS and FSR use similar upscaling approaches; DLSS is just Nvidia's solution and FSR is AMD's. However, Nvidia has locked DLSS so only RTX GPUs can use it; if you own an AMD GPU, you can't use DLSS. AMD's FSR is not locked, so it can be used on AMD, Nvidia, and Intel GPUs. DLSS is more advanced than FSR and therefore looks considerably better in most cases, though FSR is advancing and the gap is slowly closing.
Regarding your comment that "4K is not necessary": that is true. 4K is not necessary at all. Yes, it looks better (even with DLSS/FSR), but you are trading a lot of performance for visual quality. I only use a 1440p monitor and play games on my 4070 Ti Super at 1440p DLSS Quality. That way I can turn the graphical settings up high and still get decent FPS.
At the end of the day everyone is different and it comes down to the type of games you play. There is no real "right or wrong".
The last thing I will say is that the 4070 Ti Super is not really designed to be a 4K GPU, even when using DLSS Performance. You can use it at 4K, but if you want to play a game that has Ray Tracing and you want to use those features, the 4070 Ti Super won't deliver high FPS. But if you invest in a good 1440p HDR, high refresh rate monitor, you will probably have a better overall gaming experience.
@ thank you professor
The air flow inside that case is terrible. Install an intake fan.
The case has 3 intake fans to the right of the motherboard. I have the same case, but in black.
When are you gonna get an OLED!? Get the Curved AlienWare QD-OLED 32 inch.
I’d love to, but I can’t afford it right now. I’m planning to get one sometime next year.
So a 4070 Ti Super, a £700-£800 card, can't do 4K, just like every other sub-£1000 GPU; they all need upscaling. And upscaling has been around since I bought my first 50-inch 720p plasma TV years ago. All this software upscaling making crap cards look better is proof we were never going to have GPUs that can play native 4K at a reasonable price for years to come.
Upscaling works on my MSI 1080 Ti 11GB, and it plays games if you turn everything down, just like we watched in this video.
Get the 4070 ti super for 1440p. Good vram so it should hold up. It’s mid at 4k.
I disagree. I’m enjoying this GPU much more on a 4K monitor compared to 1440p.
4000 series won't hold up well. Once 5000 releases, Nvidia will start to decrease 4000 performance in new games. Next year you will need at least 4080 Super to get more or less decent 1440p experience. Additional VRAM won't help here.
@@stangamer1151 based on what?
@theivadim it won't age well in 4k. While it might be fine today, it's not a good investment if you have a 4k monitor. It's a 1440p gpu. Especially if you actually plan to use ray tracing
@@stangamer1151 what are you talking about ? this isn't apple lmao, what do you mean decreasing 4000 performance ? the 3000 series still work fine
my PC has an RTX 4070 Super on a 1440p OLED monitor at 240hz and it works great on every game I've played on with max settings. I would imagine the 4070 ti super performs even better. If you have an OLED monitor running at 1440p 27", 4k isn't really a necessity with these great looking high quality OLED high refresh rate monitors IMO.
Thinking of upgrading to oled. Currently using 144hz ultrawide VA monitor. Should i get a 27” 1440p oled or ultrawide oled. Will it be a downgrade graphically from ultrawide to 1440p? I really want to upgrade to an oled but scared of burn in. Monitor will be used for productivity and gaming but more on productivity.
@pinoyxbox I have a PG34WCDM OLED. It's the most expensive 1440p ultrawide you can buy, but it's beautiful and bright. You can always play at 27 inches as well.
I'll never go back to backlit/edge-lit panels.
As long as you run the maintenance care cycles, you don't have to worry about burn-in. Neither a 27-inch nor a 34-inch will be a downgrade; just go with what you think is best for you.
It will do just fine; however, I am more concerned about my 4070 super. Will it be ready for 2025? On lower resolution, of course.
I'm sure it’ll work fine in most games. Except for the new Indiana Jones game, based on the system requirements shared by the developers. There's always that one game that has extremely unreasonable requirements.🙂
It's a 4K card at 60fps; anything higher, depending on the title and settings, is just not happening. People so focused on 4K, especially for triple-A titles, would benefit more from at least a 4080 Super. I game at 4K up to 165Hz on a 4090, and it struggles even in some games today at native; hell, even DLSS sometimes barely does it justice. I myself love high graphics settings, love observing the quality in games, reflections, etc. A bigger issue is optimization in games. I've had a 4070 Ti; for what I used it for, it was great at 4K 60fps, but pushing towards 120 or higher without sacrificing settings is a struggle, and lack of VRAM can come into play in very specific scenarios. All in all, it depends on the title and the PC config: CPU/GPU combo, RAM capacity. The 4070 Ti is a strong card, but it shines best at 1440p, which is more than enough for today's games. Best sweet spot, IMO.
This is my FINAL GPU. I want to play 1440p ULTRA with RT Ultra
Pairs nicely with my Alienware 3225QF OLED.
So you can actually limit the game's fps just from RivaTuner?
Yes. It's a handy feature. RTSS does it best.
@@theivadim does limiting FPS provide a smoother experience?
@@kasadam85 In games that suffer from stuttering - yes.
Can you test this configuration in Battlefield 2042 on all presets, please? Thanks!
I won’t be doing that anytime soon. I’ve moved on to testing the RX 7800 XT now.
I'm a crossover-type gamer. Although I love my maxed-out single-player experiences, I also love competitive gameplay. For me, 4K is just not good enough. The 4070 is fine if you stay away from 4K. My reason for staying away: at 4K, running at a high competitive refresh rate is not possible, at least for 'regular' money. However, 1440p can be found at 240Hz, which I find the perfect blend for my type of gamer profile. Pricing isn't nice yet, and some specs aren't truly possible without a GPU upgrade beyond a 4090.
Way too expensive, still 900 euros in my country....not worth it...
Thank you for the video
For a smooth 4K experience it's the 4080/4080 Super
Really hate PC gaming these days. Too expensive just to hit 60 fps on 4K, and that's with all the AI bs.
Is 4070 Ti SUPER better than the RX 7900XT? And is the RX 7900 GRE better than both???
The RX 7900 GRE is slightly better than the RX 7800 XT, but the RX 7900 XT is far more powerful than both. If you can afford it, go for the RTX 4070 Ti Super, as it's overall the superior product in my opinion. But if you care about the value you get for each dime you spend, the RX 7900 XT is the way to go.
8800XT could be THE card vs the 4070ti S.
Why 4K? I don't see much difference between 2K and 4K. Why lower the performance so much just to play at 4K, going from 160+ fps at 2K to below 60 fps at 4K? I don't understand why you would put $3000 into a PC and then play below 60 fps.
is the rtx 4070 enough for 1080p ?
Yes, but for the best visual experience, I recommend pairing it with a 4K monitor and using DLSS.
@@theivadim OK, thank you. Can you try the RTX 4070 Super at 1080p with RT?
4K? Don't you mean 2K with at least 100fps?
4070 ti super imo is the best Nvidia card in the 4000 generation
In God of War Ragnarok, disable Vsync and you will get higher frame rates. I tested this scenario several times. Vsync drops the frame rate by 20 to 40 fps.
I think i need a 4090 for higher fps on 4k
The weapons in the tested game feel like trash; no feedback from the guns, and the sound is poor... how is it amazing???
The 4*** series is an extreme unlucky GPU! 😮😮 you might think "oh how can a GPU be unlucky" but I would point to the extreme unlucky nature of the number 4 and the letter N! Case closed!!! 😢
You shouldn't trust this dude tbh; he's biased af for Nvidia.
Why? He is showing proof of the performance directly on video, so how is he biased?
If it were under $500...
This GPU is not for 4K; it's a 1440p GPU.
👌👌👏👏
this is a 1440p card for sure
Ti and Super, God damn!
1440p card
i will give u same keys for 50% less lol
No
such a misleading video
Useless 😂