🔥 Games -70% OFF: ene.ba/Analista
🔥 Buy the best games at the lowest price!: ene.ba/Analista-Games
🔥 Buy digital cards at the best price!: ene.ba/Analista-Tarjetas
- DLSS 4 adds Multi Frame Generation exclusively for RTX 50 graphics cards.
- Multi Frame Generation is an improved version of Frame Generation that allows us to multiply fps up to x4.
- Multi Frame Generation allows us to select the fps multiplier: x2 (similar to DLSS 3), x3, or x4.
- DLSS 4 is also an improvement for all RTX graphics cards thanks to the Transformer model. This new model replaces the CNN model, improving the DLSS upscaler and Ray Reconstruction. Any DLSS-capable GPU benefits from this.
- The graphics card used for all shots was an RTX 5090.
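For readers who want the arithmetic behind those multipliers, here is a minimal sketch in Python. It only illustrates the relationship between rendered and displayed frame rates described above; the base fps is an assumed figure, not a measurement, and this is not NVIDIA's implementation.

```python
# Minimal sketch of what a frame-generation multiplier does to the frame rate.
# Illustrative only; the base fps below is an assumption, not a benchmark.

def displayed_fps(rendered_fps: float, multiplier: int) -> float:
    """Each rendered frame is followed by (multiplier - 1) generated frames."""
    return rendered_fps * multiplier

base = 70.0  # e.g. 4K DLSS Performance without frame generation (assumed)
for m in (2, 3, 4):
    print(f"x{m}: ~{displayed_fps(base, m):.0f} fps displayed, "
          f"but input is still sampled only ~{base:.0f} times per second")
```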
How do you prefer to play your games? Would you use all the AI features or native only?
Guys, see you on Max Payne 3 multiplayer, it's so fun and still played
What is your CPU dude?
Where is the latency?
Watch more than just one video
📈 to the moon
Hidden😂
Around 30ms with dlss 4. Pretty much same as first gen of frame gen according to some tests out on the net atm
On the moon
And they said rasterization quality was the best, that DLSS looks bad. Honestly, I don't know if the community is made up of AMD Radeon boys or if they're stuck in 2019, when the first version of DLSS was in its infancy. Nvidia has been training its AI model for more than 6 years; AMD Radeon is only just entering the AI world and the community applauds as if it were the big new thing. It only shows the community's hypocrisy.
Nvidia is about 2-3 generations ahead of the competition xd.
What they do with their technology is pure black magic 😮
AMD has been in the AI field for a while with its Ryzen processors; only now getting into it on the GPU side is another matter. What tips the balance between AMD and Nvidia is the latter's crazy prices. Besides, it's also not good that a GPU like the 5070 runs at 30 fps at 1440p in full raster and only reaches 170 fps with AI. I hope that day never comes.
this is not raster. path tracing isn't raster.
The only thing I see is that without AI the game falls apart at 4K... Literally the 9-year-old GTX 1080 is barely different in performance from a card that's supposed to cost 2300 euros; it makes me sick how dependent graphics cards are becoming. And the games are worse, more and more poorly optimized. Why bother? DLSS will fix it for me anyway :)) Let's have the SH2 remake render everything even though you can't see it through the fog.
Keep in mind, when you see fuzzies without DLSS? That's TAA ruining the image.
😂
TAA is a curse.
@@THOUGHTB so is aliasing. TAA is necessary; MSAA can't be used anymore because it drains performance and it only applies to geometry anyway
@@Spillow-C I'd rather play a game with no AA than one with TAA on. TAA just blurs the image to hide the aliasing. MSAA can't be used? Forza Horizon 5 uses it on consoles and the image is one of the best you can get on them.
@@Spillow-C TAA is the reason we have so many blurry games in the last 5 -7 years.
There's a bit of graphics card in all that artificial intelligence
Well, it's the future; deep learning is here to stay
I think it's a good thing; in a few years games won't have to worry much about optimization and can focus on gameplay
@@JohanesK. I think AI-personalized video games will arrive someday
cry about it
@@JohanesK. no, it isn't
Why the fuck DLSS 4 looks better than native while being smoother? I didn't see this coming
Because its anti-aliasing is better than the shitty TAA that plagues every modern game.
@@TheCactuar124 Exactly, wouldn't look that good if put up against dlaa
@@TheCactuar124 yeah DLAA is so much better than all other aa methods.
DLSS2/3/4/XeSS/FSR4 are all AI-enhanced TAA.. it is the same thing except their respective AI models do the sampling and denoising part.. it is why it's easy to implement in today's games.
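For anyone curious what "AI-enhanced TAA" means in practice, here is a minimal sketch of the temporal-accumulation idea those techniques share: blend jittered frames into a running history. This is a hand-written illustration only; real TAA also reprojects with motion vectors, and the upscalers named above replace the fixed blend/rejection heuristics with a trained network.

```python
import numpy as np

# Sketch of temporal accumulation, the core idea shared by TAA and the AI
# upscalers built on top of it. Purely illustrative, not any vendor's code.

def accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average: low alpha = smoother image but more ghosting."""
    return (1.0 - alpha) * history + alpha * current

rng = np.random.default_rng(0)
truth = np.full((4, 4), 0.5)                        # the "correct" pixel values
history = truth + rng.normal(0, 0.2, truth.shape)   # start from one noisy frame

for _ in range(32):                                 # each new frame is a noisy, jittered sample
    frame = truth + rng.normal(0, 0.2, truth.shape)
    history = accumulate(history, frame)

print("mean abs error after accumulation:", float(np.abs(history - truth).mean()))
```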
DLSS is drawing some of the missing parts of the frame with a trained AI. That's why it looks better. Personally, DLSS 4 is fine by itself; I won't use frame generation
Thanks a ton! This is my first time sending in comments for comparison, and you crushed it! Awesome work! 🎉🎉🎉
Imagine if Nintendo actually implements this on the switch 2, it would even be able to automatically upscale switch 1 titles without needing any huge patches for the new console
I think they won't be able to do it because this type of upscaling is strictly available on Nvidia cards only. I mean, if it was that easy, the PS5 Pro would've just used it instead of relying on PSSR
@@D1smantle PlayStation uses chips from AMD, not Nvidia; that's why they made PSSR, because before PSSR the options were FSR 2.2 or dynamic scaling. But Nintendo uses Nvidia chips in the Switch 1 and now in the Switch 2, and the new chip can access DLSS
@@D1smantle Switch 2 and Switch 1 use Nvidia GPUs. But the Switch 2 won't be able to use Multi Frame Generation, which is exclusive to Blackwell GPUs (Switch 2 uses Ampere). But Ampere GPUs are compatible with the other DLSS 4 features.
The PS5 Pro can only dream of having DLSS. It does not have Nvidia hardware.
@D1smantle Nintendo is working with Nvidia. The current Switch SoC is made by them, so it is that easy for them, since they already work with Nvidia, but not for Sony, which works with different companies
@stephkm3655 well... if that's the case, the Switch 2 might just rely on DLSS, since upscaling seems to be the new gimmick now. Also (idk if the specs have been revealed yet), I doubt it will be able to do anything reasonable with its hardware without upscaling, seeing as they gave it a 4K display. Also, not just the PS5, but basically anything on AMD can only dream of having DLSS; that's why they rely on a close approximation
When DLSS 4 was activated, did you notice any stuttering or visual distortions in the image?
And what about latency?
look at the right, frame gen x4 in Cyberpunk, the cars' rear lights xd, the image is terrible..
@@desiqneer5317 Bro, there's nothing wrong with the vehicles. I saw that part of the video, with the Cyberpunk 2077 gameplay, and there is no change for the worse, only the increase in fluidity. What are you talking about?
With Frame Gen garbage you always, ALWAYS get visual errors.
@@Guharo Because of course, you tell me "if there are visual errors, l-l-look at the cars"... brother, I have watched the video 4 times on a 4K screen, with the video slowed down... AND I COULDN'T SEE ANYTHING. There is not a single visual error.
@@lechugaconsal8604 These guys are crazy and simply don't want to accept the new technologies that are excellent.
70-80 4K Native at Max in AW2 is absolutely nuts, that's the best showcase I think here on the power of the card. No one in their right mind is going to be playing at native 4K with DLSS4 to begin with, it's a BIG upgrade in quality (tested it myself on CB2077 today on my 4070 Ti on patch 2.21, at 1440p...basically clears up all the major DLSS ghosting issues, much sharper than the native TAA, and Ray Reconstruction is now also fixed ie. no more painterly smearing even with PT). But to see it able to power through that forest at native res...not to mention the hilariously high framerates with Frame Gen, which yes...I know it adds some input lag, these aren't multiplayer games.... But of course, the take away will somehow be "CB2077 running at 30fps on RTX 5090 OMEGALUL!!!" because the world is dumb.
Is dlss4 already available?
@@DjCliff86 in CyberPunk yes
That is not with path tracing on. You're getting excited about nothing.
Some gamers still think 4K 60fps is real. I mean, it's real in some games, but not in Cyberpunk, because the devs of Cyberpunk didn't optimize the game well. Don't curse Nvidia for it
x3 and x4 were full of artifacts.
x2 Though was really enjoyable to watch.
I didn't see artifacts lol
@@NamelesFameles He's clueless, I think he is a 4090 owner
It really shows GPU developers working harder for games than the game developers themselves 🤡
1:48 meh... this looks like it's been smeared with soap
Meh, get a brain
It's not perfect, but it's a lot better than it was a year ago. New DLSS is actually pretty impressive in 4K in CB2077, the difference in clarity is very small compared to native. I actually enjoy playing with DLSS now when half a year ago I played without DLSS because of the artifacts and blur. I use DLSS Quality mode though, it's the best one
The quality this channel puts into its comparisons seems to me the best on YouTube
It's so sad to live in an era where even the best video graphics card isn't strong enough to play a 5-year-old game at the highest possible settings... 10 years ago we would talk about how to reduce input latency and prefer RAM with lower CL.
I cannot believe how lazy developers have become these days. All I want to say is that we played Crysis 2-3 and GTA 5 on the Xbox 360, a console released in 2005. Just go and check how good Fight Night Champion looks on the 360.
The devs are giving us choices; it's not meant to be played with everything maxed out right now. That could come in handy when revisiting the game in the future with better hardware. The game looks amazing on "high" settings
You're just being ignorant and don't understand it. This is with PATH TRACING, not standard ray tracing which is already very demanding. For context, path tracing is what a lot of animated movies use...that's not cheap, and in real time?
@@lum1notryc828 Don't accuse me of being ignorant, I know what Path Tracing is. I don't know what your expectations were when RT was first marketed a few years ago, but they set our expectations for RT at the Path Tracing level. More than 5 years have passed. Let us now have access to the technology they promised. This shame belongs to NVIDIA, not the devs.
On the other hand, my opinion is very clear that developers are getting lazy. I’ve been in the industry.
@@burakbulawe literally do have access to it. You are being ignorant.
To go further, it is rendering at above console frame rates at native 4k res and with FULL path tracing, that is an absolutely ridiculous feat. That would in the past take an entire rendering farm hours per frame.
Input lag: 7 business days
From 30 to 40 ms is not bad at all
@@Jonatan606 It's 13 ms (system latency) for 250 fps if they were real frames. So about 4x the latency. That already tells you that it's not at all the same thing.
Still lower than AMSlow.
@@elpato3190 That's not how it works
@@Shoorik100 it is.
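A back-of-the-envelope version of the latency point argued above, as a minimal Python sketch. The pipeline overhead figures are assumptions chosen to land near the ~30-40 ms and ~13 ms numbers mentioned in this thread; real values depend on the game, Reflex, and the display chain.

```python
# Rough latency arithmetic for frame generation, using assumed numbers.
# With MFG, input is only sampled on rendered frames, so latency tracks the
# rendered frame time (plus pipeline overhead), not the displayed frame rate.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rendered_fps = 62.5                   # assumed base rate before generation
displayed_fps = rendered_fps * 4      # MFG x4 -> 250 fps on screen

latency_with_mfg = frame_time_ms(rendered_fps) + 25.0   # assumed overhead -> ~41 ms
latency_if_real = frame_time_ms(displayed_fps) + 9.0    # assumed overhead -> ~13 ms

print(f"displayed: {displayed_fps:.0f} fps")
print(f"latency with MFG x4:     ~{latency_with_mfg:.0f} ms")
print(f"latency at 250 real fps: ~{latency_if_real:.0f} ms")
```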
1:10 there are problems in MFG… look at the cars' headlights
there are problems throughout the MFG footage, my friend! watch it again looking only at the last screen, the x4 one
Thank you for taking your time to run these tests, though I personally would have liked you to show DLSS Quality as well and not just Performance.
I just replaced some DLSS 3.8 games with DLSS4, I must say the RDR2 improvement along with CP2077 is INSANE. Game looks better than native while running better.
Do you notice any artifacts, stuttering, or any kind of instability?
Nice video, but what do you think about the image quality on dlss 4?
DLSS Balanced and Quality?
In the Outlaws there seems to be something not set up right in the gameplay with DLSS 4. It's like it's not loading the shadows?
PC the way it was meant to be played.
Please don't let me see comments saying that DLSS looks better than Native, especially in Cyberpunk.
Cyberpunk does NOT have a Native Resolution option. The game's "native" has TAA. DLSS is better than TAA? Sure.
Better than native? Never. Not even DLAA.
Native in Cyberpunk with no TAA would be a jaggy mess like every other game that relies on it when you disable it, so yes DLSS looks better than native since it clears all of that lol. The days of MSAA are over and DLAA is the next best thing. DLSS Performance alone looks leagues better than TAA while giving a big performance boost, with the new model it's going to get even better.
@@TGP482 DLAA is the next best thing, but it is not good. Games neither need nor should rely on TAA to fix artifacts in SS or AO. Those are documented and fixable, if publishers allowed for that.
TAA is not meant to be comparable to DLSS, since they're both based on relatively common technology, but TAA's older and stagnant (not up-to-date, nor adapted to the game as much, since it has no ML).
There are plenty of people who have made videos and articles about the problems of TAA, when we have major things like FXAA or literally any other type of AA that can be implemented.
Again, comparing TAA to Native is just plainly wrong; of course DLSS/DLAA looks better than "native TAA" when it is "not rendering in 4K internally".
@@lopesmorrenofim Do you have games in mind where native looks leagues ahead, so much better that it's worth giving up the performance gain compared to DLSS in its newest versions? Would love to get a glimpse of that
DLSS Quality or native DLAA should be better than native, and even more so when you use ray tracing or path tracing
@@AdaptacionGamer Again: they're better if you're comparing to a TAA'd render, not Native 4K. Then you get the AA argument, but that's beyond the point and fixable.
DLAA is NEVER better than actual native.
No latency at all really, and it's brand new so it'll be fixed. Really promising tech!
What CPU did you use? 🙂
Will try this on my 4090, but it's hard to believe Performance is what I will use, as it normally doesn't look good
Where's lossless scaling?
In most games I won’t be using dlss4 as a 4090/5090 user
2:03 What's going on here? Is that a bug or is it supposed to look that way without ray reconstruction?
Noise; it appears when you are using RT, and in this case it's path tracing
@@AdaptacionGamer Just noise? It looks a little too messy and undefined to be that.
That's because of TAA
@@cun7us Every time a ray of light bounces, it generates noise in the image, and it is worse when we talk about path tracing, since more rays are traced than in common ray tracing
@@AdaptacionGamer It looks awful and it's expensive on the GPU. I'd much rather have simple shadow maps that look less accurate but don't have that horrible looking mess.
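The noise described above is ordinary Monte Carlo variance: each pixel is an average of randomly traced light paths, and with only a few paths per pixel the estimate is grainy, which is why a denoiser (or Ray Reconstruction) is applied. Below is a tiny, self-contained illustration of that effect; the numbers are made up and this is not any game's renderer.

```python
import random

# Tiny illustration of Monte Carlo noise in path tracing: a pixel's value is
# the average of randomly sampled light paths, and the error shrinks roughly
# with the square root of the sample count. Real-time renderers denoise
# instead of taking hundreds of samples per pixel. Values are assumptions.

def estimate_pixel(samples: int, true_value: float = 0.4) -> float:
    total = 0.0
    for _ in range(samples):
        # each "path" returns a noisy contribution around the true radiance
        total += random.uniform(0.0, 2.0 * true_value)
    return total / samples

random.seed(1)
for spp in (1, 4, 16, 64, 256):
    errors = [abs(estimate_pixel(spp) - 0.4) for _ in range(200)]
    print(f"{spp:>3} samples/pixel -> avg error {sum(errors) / len(errors):.3f}")
```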
With frame generation, the generated frames were very noticeable; I preferred not to generate them, it was an unpleasant feeling. Have they improved this? Watching it in a video doesn't feel the same as playing it.
You'll have to try it yourself to confirm
I would be curious to compare DLDSR to this new transformer solution. I'd assume DLDSR would still be better but maybe not? Or maybe DLDSR will just run even better now.
This is not the battle for performance anymore,
this is the battle of upscaling quality.
I tried it myself and I seriously can't see any difference between DLSS Quality and DLSS Performance when using the transformer update.
If AMD and Intel can't keep up, the future of gaming is bleak.
Thank you Nvidia for this new software and these technologies. Thanks to this strong competitor, I can buy AMD GPUs for less than they deserve. Greetings from Turkey!
Next Time 4090 vs 5090 vs 5080
DLSS 4 is great and FG 4x is great, and I'm glad the transformer DLSS feature is coming to the 40 and the 30 series, but the base frames in Cyberpunk are still horrible
It's like 20% faster, which is not a generational leap
DLSS 4 and FG 4x are great like I said, and the latency is good, but it's definitely not twice as fast as the 4090 like Nvidia was claiming
It's just some marketing BS
Wouldn't it be better if you took advantage of DLSS 4's Quality mode?
Wow Nvidia smashed it from 33 to 200 wow that’s like 6x. Even the 2x is awesome.
DLSS Performance already boosts it to 70, so the 200 is from the x4
It's not as bad as everyone's harsh reviews make it sound
Though frame gen seems good only up to x3
It's a great technology, so I hope more games get support for it
Especially VR
DLSS is next level
Nvidia leaving the competition another generation behind
With the 50 series out, I can finally buy the 1050 😅.
P.S. It would be interesting if you made a video comparing Wuthering Waves.
1:29 The video starts here
It feels like with new GPUs now, we are buying features instead of raw performance. Anyway, I always expect there to be some kind of latency with frame-gen... Especially on competitive titles.
That’s the reality of trying to squeeze the same amount of performance gains out of the same old methods, diminishing returns
@@neilranson4185 True, people expect the technology to progress linearly with the graphical demands of current games, but it definitely isn't possible without features like upscaling and frame gen. Our expectations are so high that Nvidia, AMD and Intel had to incorporate these AI technologies to give us acceptable performance, which wouldn't be possible with just raw power, which every other person seems to cry about these days.
There is a physical limitation with lithography and semiconductors.
Software has to make up for the lack of brute force now.
but it does have more raw performance... also latency between frame gen x2 and x4 is the same. Also, when Reflex 2 comes out it'll be the same if not less than native
There's little to no latency difference
In the end I ended up liking DLSS 4; it fixed many of the artifact errors that were quite annoying in most AI upscaling, and it really is a better leap. What doesn't convince me is that Nvidia forces you to change your PSU every few years to accommodate the card's power delivery, like going from ATX 2.0 to 2.4, 3.0, 3.5 and 5.0. It seems like a rip-off to swap power supplies just to benefit the companies, consume more, and throw away a PSU that could easily last you about 8 years. And yes, I know adapters exist, but they tend to have failures and problems delivering power, and for that very reason the new PSUs already come with this new connector to deliver power directly, without the risk of ruining your graphics card. AMD has to step up to show whether it's at Nvidia's level and what kind of competition it wants to offer.
Well, you don't necessarily have to buy the 5090; the 5080 is also coming with 16 GB of VRAM. I know VRAM isn't the most important thing, but the chip shouldn't be far behind in performance either, and it should be friendlier on power consumption
@@AdaptacionGamer THE 5080 HAS 16 GB OF VRAM
@ alright then man, thanks for the little detail; anyway I don't think anyone's going to have a breakdown over not buying Nvidia's very best
It looks better with DLSS 4 than native HAHA, greetings to the Nvidia haters 😘
Yes, it is a disappointment to see new hardware being so poor and weak instead of delivering a jump in raw performance as it did back then... but you have to admit, those generated frames are BONKERS... it rarely shows any artifacts that mess up the experience, and it makes the games run so freaking smooth it's practically unbelievable.
Already had a blast with the 4060Ti doing wonders with the Frame Generation, and now this new Multi FG getting even better results... At the end of the day, even through handicaps like these, you're still getting a better visual and performance than last-gen, so for me it's still a win
You aren't getting better performance, you're getting perceived performance only when you aren't actually playing. Your input is going to be delayed significantly when only 1 out of every 4/5 frames is real
They should bring this to the Xbox Series S; I think they don't so that demand for POWERFUL hardware doesn't disappear)
Dude, you...
It's indeed interesting, but the input lag is still a major problem..
Definitely. It looks good in the Fraps numbers, but very bad for input.
How do you know?
@@jamesramsey1442 This is what Nvidia literally explains to you and shows you how it works.
@@jamesramsey1442 Because I already have a 40 series card, and frame interpolation works exactly like that. It is only visual.
@@lopesmorrenofim How do you know it is a "problem" with these cards too?
What's the name of the music?
They're using DLSS Performance to get 80 FPS in Cyberpunk without Frame Gen? Well, at least Frame Gen isn't needed, but I see that sometimes it's not enough
It's using path tracing; if you turn it off it easily reaches 140 or maybe 200, no idea, I obviously don't have a 5090. But believing a graphics card can run games smoothly with path tracing at 4K without upscaling is like asking a supercar to go as fast as an F-22. Yes, it's a supercar, but come on, don't ask for the impossible
I get that it must be an Nvidia thing, but who buys a 5090 to use DLSS on Performance?
Is it just me, or does DLSS 4.0 look WAY BETTER than native? Even the sharpness of the shadows looks smoother and more realistic than native. Do consoles and AMD graphics have the same issue with this?
Native runs with TAA and the AI solution is superior, so unless the console versions will use an AI upscale (maybe the PS5 pro does) they’re kinda stuck
I have a 4090; in CP77 at max settings, 2160p, RT Overdrive, native DLAA, it runs at 25 fps, 22 low, 36 max. The 5090 gets 28 frames?
Actually, x2 is only 50% more; x3 is barely double.
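That observation is consistent with frame generation having a per-frame cost: generation eats into the rendered rate before multiplying it. A minimal Python sketch with assumed numbers (the 5 ms generation cost and 70 fps base are illustrative, not measured values):

```python
# Why "x2" often lands closer to +50% fps: generating frames costs GPU time,
# so the rendered (base) frame rate drops before it gets multiplied.
# All numbers are assumptions for illustration.

def fps_with_framegen(base_fps: float, multiplier: int, gen_cost_ms: float = 5.0) -> float:
    base_frame_ms = 1000.0 / base_fps
    effective_base = 1000.0 / (base_frame_ms + gen_cost_ms)  # slower rendered rate
    return effective_base * multiplier

base = 70.0  # fps without frame generation (assumed)
for m in (2, 3, 4):
    out = fps_with_framegen(base, m)
    print(f"x{m}: ~{out:.0f} fps displayed (~{out / base:.1f}x the original {base:.0f})")
```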
Dlss 4 is good with fg 👌
4k DLSS Performance is just 1080p...
What a joke that the 5090 can't even do 4K at 30 fps native...
It is kind of shameful. Even with max settings and path tracing I would expect it not to dip below 30fps. That being said, the differences in visual quality between native and DLSS 4 aren't huge.
What a stupid comment. It upscales from 1080p... and looks very good on 4K monitors; if you owned one you would know it. You're also complaining about 4K native results with path tracing maxed out while your card probably can't run the game at all, even at lower settings. If you have no clue what you are talking about, then simply don't comment. Path tracing is not meant to run on pure raster performance; the tech is extremely demanding. And look back at this comment a couple of years from now, because everybody will focus on software solutions rather than hardware, since producing ever more advanced hardware is far more expensive than software R&D.
Path tracing is severely intensive, even for the best GPU in the market. Think of Crysis in 2007...
@@zensaiybro Even if Nvidia paid you to write this, you wouldn't defend them this hard
because of path/ray tracing. you just have no idea what you are talking about if you expect ray tracing to run natively without denoising and upscaling. there would be no ray tracing in games period without upscaling and if you run the game without any raytracing features then you can easily run it without using upscalers.
Is it just me, or does DLSS 4 using x2 perform better than the previous DLSS using just x2 (frame generation)?
That could be good for 40xx models when operating DLSS 4.0
They changed it, yes. Even on 40 series cards frame gen will be better and use less VRAM
Where can I get the DLSS 4 DLL files?
A question: will you soon do one with the 5080 on the same games?
The embargo for the RTX 5080 doesn't end until January 29, so there won't be any reviews until then.
RED DEAD REDEMPTION 2! IN 4K! MSAA 8X ULTRA
I don't understand why the game can't run natively
Good job 👏🏼👏🏼
CPU bottle necked even at 4x
I can't keep up with this BS anymore.. "CNN models" & "transformers", what the hell lol
upgrade your CPU, its bottlenecking your GPU
as soon as you use dlss performance, your GPU usage drops to 70%
So the new transformer model only affects shadows generated by PT with RR? It doesn't affect normal shadows? I ask because, if so, that's a shame; a 5090/5080 isn't for everyone haha
Well, supposedly the more rays bounce in the image, the more noise there is; that's why some shadows look distorted, and why they look sharp when Ray Reconstruction is used. It should affect all shadows
Is DLSS 4 available for the 40 series, or is it exclusive to the 50 series?
@@Giovannivalderrabano The upscaler is, in theory, going to be available for all RTX cards; it's just exclusive to the 50 series for now. It's like the DLSS 3 upscaler, which at the time was exclusive to the 40 series and is now available for both the 20 and 30 series.
As for Multi Frame Generation, that one will indeed be exclusive to the 50 series because of how it works. The tensor cores on the RTX 40 simply can't generate that many frames.
@@lechugaconsal8604 The DLSS 3 upscaler was always compatible with 20 and 30 series cards, even Ray Reconstruction. Only frame generation is exclusive. The new transformer model of the upscaler is available in Cyberpunk 2077 and runs on older GPUs.
@@X1M423 That's exactly what I'm saying, my friend... well, what I'm saying now. My comment is from three days ago, when the tests weren't conclusive, given that the video only shows the quality and the performance uplift with the new cards, not whether they work on the RTX 20, 30 or 40 (which we now know they do).
Why show DLSS Performance why not Quality?
Because at 4K, Performance mode is recommended. It gives better results than native resolution
@@sput231br9 Not native. TAA. Never native.
Because DLSS performance is just 1080p.
If you saw the comparison, both the DLSS Quality and the Performance surpass (or equal, in the worst case) the quality of the native resolution. That is to say, for quality, you are not going to lose anything. Why, then, use the DLSS Performance in 4K? Because its base resolution starts from 1080p, making the performance much higher than if it were rescaled from 1440p, as the DLSS Quality mode does.
@ It is not native. He didn’t show native resolution. He showed a 4K output of a TAA render.
Cyberpunk does NOT render natively, period. Bad comparison.
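For reference on the 1080p/1440p figures cited in this thread, the sketch below computes the internal render resolution for each DLSS mode at a 4K output, using the commonly documented per-axis scale factors. Exact scales can vary per game, so treat the numbers as typical values rather than guarantees.

```python
# Internal render resolution per DLSS mode at a 4K output, using the commonly
# documented per-axis scale factors (Quality 2/3, Balanced ~0.58,
# Performance 1/2, Ultra Performance 1/3). Exact values can vary per game.

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode:>17}: {w}x{h} -> upscaled to 3840x2160")
```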
Every generation the cards bring less raw power and depend more on AI
My question was answered with this video. The question was whether you could choose how many frames multi frame gen would generate; for me 2x is already great
2X is the standard frame gen that already existed before. And yes, the games let you choose between 2X, 3X and 4X.
Warning: AMD fanboy alert in the comments
The 5000 series, the latest GPUs, still cannot run Cyberpunk 2077 at max graphics with path tracing without using DLSS or frame gen lol
And even the 6000 series won't do it, because AI is the future
Now you can go back to your cave 😂
AMD Boy???? raster boy??? NOUU RT??? 😂😂😂😂😂😂😂😂
@@Fenarok the transformer model will look like native or even better with the next upgrades. This is just the beginning
Bottleneck?? Upgrade your cpu old man
rtx 5090 vs rtx 4090
Foquin diabolicol
Song name?
ibrahim majga- cierny baca
B-but the PS5 Pro with 33 TF
Hahaha sure, building a PC with a 5090 comes out way cheaper. In fact a Series S is even better, my friend, it has more power than the 5090 and it's cheaper too, and with the exclusives it's a win-win, and everything at 8K, a delight.
With DLSS 4, anything from a 3070 up runs circles around the Pro. You don't need a 40 series, much less a 50 series
All that excitement about Transformer model and you can barely tell the difference @3:26. The detail seems about the same to me. Amazing how Nvidia can fool people by increasing the sharpness filter.
The difference can be seen with lots of motion going on.
Look at any cable or hair and it will look anti-aliased compared to the old model
DLSS4 performance resolution is still worse than native
DLSS Performance on an RTX 5090, I see, got it.
MFG on the 4000 series NOW
Not going to happen..
@@Endert0217 It will happen, eventually in fact. Not only are they considering it, but on top of that, on the 4000 series it worked with the OFA and now it doesn't.
@@SweetFlexZ Yes, but only at 2x.
What witchcraft is this
amazing video as always
RTX 5090, 30 FPS performance in a 5-year-old game 😂😂😂 It's just trash. I can play at 30 fps on my console 😅
Look at this. Developers refuse to optimize their games because DLSS boosts performance; it's annoying. And this upscaling stuff is being pushed all the time
Outlaws wtf, what terrible optimization
DLSS finally doesn't completely suck. You should use Quality preset though, Performance sucks
Does it have reflex 2?
IDK
ABC
The best ❤
Config pc
Cool
disgusting.
Those native performances are rough. Too bad everything's AI now. It is what it is, i guess.
MFGx4 on
Low latency off
The latency is still uploading 😂
PS5 pro better :)
🤣
what a joke
JAJAJAJAJAJA
Even series x is better than that shitty ps5 pro.
hahaha it's trash