This is actually pretty cool, especially for people like me who want to game at 4K high refresh without needing to spend a fortune on a 4090 or 5090. I couldn't care less about fake frames as long as they look good and gameplay is smooth.
True
It's not smooth, and it has lots of visual tearing, latency and input lag. Even if it says you're playing at, say, 100+ fps, the game is actually running at 60 fps, and if the game is running at under 60 fps and you use DLSS and frame generation, the experience is horrible.
@@deadpoolf1 You haven't even used DLSS 4, so how would you know?
@@thetheoryguy5544 Person has functional eyeballs though 😂🎉❤
@@thetheoryguy5544 Because you can clearly see issues even in Nvidia-controlled environments. Also, DLSS 4 is not a new technology; software versions of it from other, non-GPU companies are already out, and you can clearly see issues with those too. DLSS is good for ray tracing, but being forced to use it in every game is very bad; it shows how bad game development has become. And if you support DLSS, FSR and FG in every game, you're helping make PC gaming far worse: you're making a world where we get a lot less for our money while being charged more, and where we don't even own our own fps anymore. Plus, expect PC games to be in an even worse optimisation state because of UE5. Example: Stalker 2, where you can't run the game at native resolution because UE5 is broken at native resolution.
So is this with frame gen on or off? If on, please show the same demo with frame gen off to actually show the GPU power instead of software trickery.
Software trickery is what you will get for the foreseeable future, because brute force ain't cutting it anymore. Look at the spec of the 5090, all for a measly 30% uplift.
@@MaggotCZ That depends on your definition of "cutting it".
@@Case_ The definition is in the HW itself. Nvidia went "back" to a 512-bit bus so it can maximize bandwidth, over 20K cores, etc., all for a measly perf uplift, because the huge gen-to-gen gains are over. The software "cheats" are the future.
@@MaggotCZ Given that there are no reliable benchmarks available (exactly my original point), saying the 5090 has a "measly performance uplift" is not warranted. It's reasonable to expect some 30-40% uplift in raw power, which is along the lines of what the generational improvement tended to be in the past.
@@Case_ Well yeah, but that 30-40% came at a cost. The spec of the 5090 is absolute brute force. It's been like a decade since the last time Nvidia used a 512-bit bus, and it has a 575W TDP. My point is that for the absolute top-tier perf, the HW just can't cope anymore.
Meanwhile, with the "software cheat", a measly 5070 with its shitty spec can reach far higher numbers. I think that's the focus going forward, because there will come a point in time where Nvidia just can't make the HW any better, but they can with SW and the whole AI shit. We're only at the start so far. The other point is that raster will slowly disappear, so it won't even be a metric in a few years, because every game will have RT baked in.
(picture of Christian Bale)
"IMPRESSIVE. VERY NICE."
"LET'S SEE IT WITH DLSS OFF."
Yes, with a $2000 4090, so shhh
yes babbby😂
Hey, I'm just a dumb guy. They won't be able to make the 4090 run DLSS 4?
they need more money
No, the 4090 has no DLSS 4
it will have dlss 4, just not the multi-frame generation part
Is DLSS good for competitive gaming?
Frame generation is not; it adds a lot of latency. They introduced Reflex 2, which reduces latency, but it normally won't be enough to compensate for the latency increase the new frame generation introduces.
Yes, you need at minimum a 165Hz G-Sync compatible monitor. I always use DLSS Quality and FG, and it runs well in COD; I get 140-180 fps and I didn't feel a difference in input lag with mouse and keyboard.
I love frame gen, it's such a good innovation. Can't wait to try DLSS 4
It's terrible
I used it in Warzone for a bit; it's decent enough for casual gaming but not comp, because it has a bit of input lag. However, Reflex 2 is coming, and apparently it reduces PC latency three times as much as the original, so this could maybe improve the input lag that is caused by DLSS, frame gen, etc.
I don't want to play at 4K while compromising the graphics settings. I need to appreciate the visuals as well as be competitive. I think this new machine would do well if there are no delays.
180 vs 234, and 1 fake frame vs 3 fake frames... idk, sounds like a fake video card release
If you can't sling a brick down at NVIDIA, then you will have one at your doorstep.
How does this work in FPS games? Will there be any delays?
So with DLSS 4 on both cards and multi-frame gen on the 5070, will the fps be equal?
I should have asked him to open a portal and explained that AI can't be calculated spontaneously. The DLSS in the new Nvidia chips, with their fake frame generation, could improve performance by predicting where a portal might be placed before it's actually positioned. This would let the AI focus on rendering a specific path of vision through the portal instead of the entire game environment.
But there’s still a gap - the portal isn’t working as it should. Every time I open it, the frame rate drops, no matter what. I just want a consistent game experience, not a fancy Doctor Strange portal that tanks my performance.
Genuine question, would it just be better to have dlss on and have frame gen off? Or are they both turned on at the same time
Both turned on.
DLSS 3.5 + FG in 4090
DLSS 4 + MFG in 5070
Where are the input latency numbers?
"Feel free to test it" Ends video. 0_0
Wiggle the moouuuuuuuuuusssseeeeeeeee. Let's compare the screen tearing like in LTT's video.
Multi frame generation is kinda weird. Say you're generating from 60fps to like 120, it'll still feel like you're playing at 60. It's less of a "performance enhancer" and more of a "frame smoother". Idk if I'd recommend upgrading unless you're on a 20 series card tbh
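That "frame smoother" point can be made concrete with a rough back-of-the-envelope sketch. This is only an illustration of the idea, not Nvidia's actual pipeline: it assumes input is sampled once per real rendered frame and that generated frames only change what is displayed (the function name and numbers are hypothetical).

```python
# Rough sketch (assumption, not Nvidia's actual pipeline): frame generation
# multiplies the *displayed* frame rate, but input is only sampled on real
# rendered frames, so responsiveness still tracks the base fps.

def frame_gen_estimate(base_fps: float, multiplier: int) -> dict:
    real_frametime_ms = 1000.0 / base_fps      # how often the game simulates / reads input
    displayed_fps = base_fps * multiplier      # what the fps counter shows
    return {
        "displayed_fps": displayed_fps,
        "displayed_frametime_ms": 1000.0 / displayed_fps,
        "input_sample_interval_ms": real_frametime_ms,  # unchanged by frame gen
    }

print(frame_gen_estimate(60, 2))  # ~120 fps shown, input still sampled every ~16.7 ms
print(frame_gen_estimate(60, 4))  # ~240 fps shown, input still sampled every ~16.7 ms
```

Under that assumption, a 60 fps base multiplied to 120 or 240 fps looks smoother in motion, but input is still only read about every 16.7 ms, which is why it still "feels" like 60.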
Bruh, who are you to be recommending anything? Go sit down and be quiet
The RTX 5070 is the best card for people like me who don't want to spend £2k plus
It means a 5070 in a laptop can easily manage 4K at 100fps for the next 2-3 years
Is there a good reason DLSS 4 can't run on a 4090?
It can
DLSS 4 is exclusive to the RTX 5000 series
The new cards have more AI TOPS and are a different architecture. Also, Nvidia only lets its newer cards have the fancy new tech, to push consumers into buying them.
@@Bipul-j7m Of course that's what they're gonna say, so they can take your money.
No, the DLSS 4 Transformer model for upscaling will be available for the 40 series
I'm in the process of building my first pc. Should I wait for the RTX 5070 or buy a RTX 4070 super?
Wait for radeon 90 series, then decide
Don't buy AMD, it's always behind RTX
@onepieceluffy7692 Are you sure? The AMD 9070 was performing close to an NVIDIA 4080 Super in BO6 at the CES event, and that's just their mid-range card...
Nvidia stopped producing good GPUs after the 1080 Ti
RTX 4070 Ti Super or RTX 5070 Ti?
The 5070 Ti will offer around 4080 Super-level performance.
@@03chrisv yeah
Also, the memory transfer speed is sort of increased 😁
The 5070 Ti will have much more performance, faster VRAM, and a new generation of Tensor and RT cores, and the 5070 Ti is $50 cheaper
So they are selling us interpolation lol
That’s what I do. I watch…
They are roasting their own product lol rip 4090
The game has to support multi frame generation; if it doesn't, the 4090 is better.
I just bought a 4K 160Hz monitor and I'm currently getting 70 fps at 4K with a 3080 Ti. I think a new 5070 would get me to nearly 140 stable frames at 4K
Marvel rivals needs to be optimized better
works fine for me at 4k with 4080
Fake lame frames don't mean it can beat it, gtfo. Nvidia is just using all this new tech to pump out crappy higher frame counts now. Take off all that extra frame gen crap and then see how it really compares to the 4090. I have one myself in an ITX build, and honestly it's more than what I need, so it's pointless for me to want a 5090. What is it, like a 20% increase only? No thanks. Maybe when something better comes out I'll think about it, but the 5090 just doesn't seem all that great to me imo
It's NOT for you. You'd be insane to buy a 4090 at around the same price as a 5090 if you're building a new computer or even upgrading. It's as if people don't believe there are fake frames being generated with the 40 series. When you say "FAKE", does it not beat the 40 series with this fakeness you keep talking about?
Not even the 5080 can remotely touch the 4090. Not by a long shot
I wouldn't go and say that lmao. The 5070 will already beat the 4090 in games that support DLSS 4. In games where raster matters more? Sure, the 4090 is still stronger, but the 5080 is a better card than the 4090 for sure: faster memory and better features, including DLSS 4. That's why all the 4090 users are selling their cards on FB Marketplace lol
Nah, u r wrong by a long shot
@@ALPINE_M4GTS My RX 6800M laptop is faster than a 5090 that way, if I use Lossless Scaling, which offers frame generation up to 20x
@ idk about that bro 😂
@@Sonu-w7o5q Why are you embarrassing yourself 😂, I think you are a newbie
On my AMD 6800 XT I am getting 250+ fps
I'll believe it when I see it. Possibly cherry-picked the fastest 5070 and the slowest 4090
And even after that, it's pretty impressive... The slowest 4090 is still way better than a 4080S, but I still wanted 16 GB of VRAM in the 5070
It's cool for the budget, but no way it looks the same. On the 4090 the image clarity and sharpness are way better.
Oh no, you're upset that your 40 series card doesn't compete
Cope cry
@onepieceluffy7692 No it's not, it's facts. Fake frame generation, at least for now, doesn't look good. I play mostly simulators, and at least in VR the DLSS looks like crap. Looks like you want to believe big corporate marketing. Do you really believe a $2000 cheaper card can push the same quality? Only in your dreams.
@@carterdemuth The 5070 has 6k CUDA cores, the 4090 has 16k CUDA cores; the 5070 has 12 GB VRAM, the 4090 has 24 GB VRAM. He's right
@@carterdemuth Look at Nvidia's official graph for the 4070 vs 5070 in Plague Tale and Far Cry 6. The 5070 gets 30% better performance than the 4070 at 1440p. The 4070 gets ~80 FPS in Plague Tale, which means the 5070 would get ~104 FPS.
The 4090 gets ~155 FPS in that game at the same resolution and settings. That means the 4090 is 49% faster than the 5070. It's not even close. Plus, add the program called Lossless Scaling, which already does 4X frame generation in any game, and the 4090 outperforms the 5070 by a long shot.
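For what it's worth, the arithmetic in that comment is internally consistent. Here's a quick sketch of the same calculation; the ~80 FPS (4070) and ~155 FPS (4090) figures are the commenter's readings of Nvidia's chart, not independent benchmarks, and the variable names are just for illustration.

```python
# Sanity-check of the comparison above. The ~80 FPS (4070) and ~155 FPS (4090)
# numbers come from the comment (read off Nvidia's chart), not measured here.

fps_4070 = 80.0                # Plague Tale, 1440p, per the comment
uplift_5070_vs_4070 = 0.30     # 5070 ~30% faster than 4070 (Nvidia's graph)

fps_5070_est = fps_4070 * (1 + uplift_5070_vs_4070)   # ~104 FPS
fps_4090 = 155.0
advantage_4090 = fps_4090 / fps_5070_est - 1           # ~0.49, i.e. ~49% faster

print(f"Estimated 5070: {fps_5070_est:.0f} FPS")
print(f"4090 advantage over the 5070: {advantage_4090:.0%}")
```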