No Way this is Good... WTF is DLSS 4??

  • Published: 23 Jan 2025

Comments • 4.6K

  • @rdineshkartikdk
    @rdineshkartikdk 15 days ago +1430

    0:03 It is the RTX 4090, not the RTX 5090

    • @vextakes
      @vextakes  15 days ago +494

      u right, my b. didn't even catch it in the edit

    • @christophermullins7163
      @christophermullins7163 15 days ago +57

      @@vextakes it's ok... you're still the goat

    • @Shahzad12357
      @Shahzad12357 15 days ago +24

      @@vextakes take some rest brah

    • @Shahzad12357
      @Shahzad12357 15 days ago

      @@vextakes you need it

  • @TheKartoffel101
    @TheKartoffel101 15 days ago +3498

    In the RTX 6000 generation, you receive an LSD pill instead of a GPU, allowing you to visualize all those frames directly in your mind.

    • @anttikangasvieri1361
      @anttikangasvieri1361 15 days ago +210

      lower latency since the frames are in your mind, man

    • @H3c171
      @H3c171 15 days ago +55

      @@anttikangasvieri1361 yeah, but it depends on your CPU generation and VRAM capacity. Most people can't afford a high-end build, so they'll probably get lag and driver crashes

    • @phutureproof
      @phutureproof 15 days ago +9

      I'd pay for that lol

    • @ezoni8438
      @ezoni8438 15 days ago +28

      meanwhile AMD just releases another 6800xt

    • @CLANIMEProductions
      @CLANIMEProductions 15 days ago +15

      nice stolen comment lil bro

  • @marcopazzi117
    @marcopazzi117 15 days ago +5757

    We went from not owning games to not owning frames.

    • @FO0TMinecraftPVP
      @FO0TMinecraftPVP 15 days ago +65

      COPE
      Nvidia is the best

    • @cin2110
      @cin2110 15 days ago +518

      @@FO0TMinecraftPVP Projecting troll buddy boy boy.

    • @megaham1552
      @megaham1552 15 days ago +9

      @@seaneckhart9914 What lol

    • @MrEditorsSideKick
      @MrEditorsSideKick 15 days ago +1

      @@seaneckhart9914 Jews? You mean you? Autocorrect going crazy.

    • @actualyoungsoo
      @actualyoungsoo 15 days ago +60

      Do you realise DLSS is still done by the GPU hardware? You have no clue what you are talking about smh
      No car in the world runs with an engine that is not tuned. No program in the world runs without optimization. The same goes for GPUs. Graphics cards need utilization methods to use as much of the GPU hardware as possible, and DLSS is the method that does that. People are ignoring the fact that DLSS is still done by the GPU hardware, not by a server GPU connected via the internet.

  • @itag295
    @itag295 15 days ago +2639

    Nvidia's incoming drivers:
    dlss4 = false;
    if (series >= 5000) { dlss4 = true; }

    • @tibui-c4k
      @tibui-c4k 15 days ago +287

      Too verbose.
      dlss4 = series >= 5000;

    • @tehJimmyy
      @tehJimmyy 15 days ago

      @@tibui-c4k this guy codes

    • @itag295
      @itag295 15 days ago +149

      @@tibui-c4k not sure c++ can compile that 🤔

    • @Lord_Arbiter
      @Lord_Arbiter 15 days ago +53

      Only multi frame generation is exclusive to the 50 series; all the remaining features are coming to the 40, 30 and 20 series

    • @christophermullins7163
      @christophermullins7163 15 days ago +12

      @JithuChilukuri warp Reflex is supposedly locked to the 50 series... that technology with 2x frame gen could be incredible. Just with artifacts sprinkled in, but it'll feel nice!

  • @xdrastig_4207
    @xdrastig_4207 14 days ago +437

    So we've basically reached a point where graphics cards can't physically be pushed much further, and even at the maximum possible hardware performance, UE5 still runs like shit

    • @Circaninesix
      @Circaninesix 14 days ago +122

      Game optimisation is where to go next. Don't know how that became optional nowadays. The hardware we have is incredibly powerful, but the software is too bloated to run well on even the best cards sometimes.

    • @GravitasZero
      @GravitasZero 14 days ago +23

      @@Circaninesix not sure anyone knows how to optimize games anymore these days

    • @PREDATEURLT
      @PREDATEURLT 14 days ago +9

      @@Circaninesix The answer is simple: DLSS and frame gen are the only optimization game developers use; without them, 720p 30 FPS gaming is coming back.

    • @clouds5
      @clouds5 14 days ago +13

      Blackwell (RTX 5000) apparently uses the same process node as the 4000 series. They are doing the same thing Intel was doing 10 years ago: they are so far ahead that they can basically re-release the 4000 series under a new name with small tweaks (the 5070 seems to be about the same as the 4070 Super...). Pretty sure the next gen on a new process node will be much more efficient. Just skip Blackwell. If you need a new GPU now, look for deals/used on a 4070 Ti Super or 4080 Super.
      Because yes, we will probably stagnate until we have competition again. AMD is working on chiplet GPUs; Nvidia is all in on AI. So maybe eventually AMD can do for GPUs what they did for CPUs with Ryzen?

    • @robinsebelova7103
      @robinsebelova7103 14 days ago +5

      @@Circaninesix every increase in hardware performance means publishers can spend less money on optimization. Frame generation is an especially harmful technology in this respect.

  • @teckniec2
    @teckniec2 15 days ago +775

    I like how games are becoming so unoptimized that we first needed more powerful hardware and now need fake frames. All to make a game that doesn't look any better than one built 10 years ago... minus the lighting.

    • @o_o9039
      @o_o9039 15 days ago +49

      yeah, imagine if games were actually optimized and all the AI went toward improving graphics instead of minimizing the quality lost while adding performance

    • @blackface-b1v
      @blackface-b1v 15 days ago +23

      Ngl
      Cyberpunk, Black Myth: Wukong and Hellblade actually look great
      And beat games from like 2015

    • @pcwalter7567
      @pcwalter7567 15 days ago +26

      @@blackface-b1v definitely not Battlefront 2...

    • @bobbygetsbanned6049
      @bobbygetsbanned6049 15 days ago +34

      They should be using AI to optimize the game code rather than generate fake frames and fake resolutions.

    • @blackface-b1v
      @blackface-b1v 15 days ago +1

      @pcwalter7567 Battlefront 2 looks incredible
      I would consider it an outlier
      It still looks worse than Black Myth tho... even if by a little bit

  • @denmaakujin9161
    @denmaakujin9161 15 days ago +1168

    I don't like where we are going with fake everything

    • @elsevillaart
      @elsevillaart 15 days ago +1

      Started with fake boobs and toupees.

    • @owdoogames
      @owdoogames 15 days ago +55

      The Matrix

    • @YTStopCensoringFreedomOfspeech
      @YTStopCensoringFreedomOfspeech 15 days ago +105

      Not only that, everything costs more. This GPU generation is beta testing AI technologies with false advertising on performance. Why would you use frame generation to reintroduce the visual errors and input lag that all previous generations were trying to get rid of? As long as people keep paying for this, it's gonna keep happening. The problem is that people who can afford these cards keep paying for worse value, so the developers have less incentive to change their model. There is also a lack of competition, since AMD is literally owned by the same family. I guess we will have to wait for China to catch up before we see real competition.

    • @trevor972m4
      @trevor972m4 15 days ago +1

      What is real? right?

    • @EyemachineStudios
      @EyemachineStudios 15 days ago +2

      their next GPU will be designed, named, and manufactured by AI

  • @ronokoplhatake7463
    @ronokoplhatake7463 15 days ago +1725

    DLSS 4 will make devs even lazier at optimizing their games than they already are

    • @poka26ev2
      @poka26ev2 15 days ago +175

      They will probably just download movie assets with millions of polygons and triangles

    • @tux_the_astronaut
      @tux_the_astronaut 15 days ago +97

      @@poka26ev2 they already are

    • @samgoff5289
      @samgoff5289 15 days ago +17

      Why do you care, if the end result is good and makes games look and play better? I don't care whether it's the game company, Microsoft, or Nvidia who improves games

    • @draco2xx
      @draco2xx 15 days ago +9

      cutting even more shortcuts

    • @dr_diddy
      @dr_diddy 15 days ago +184

      @@samgoff5289 based on current trends, this is not making games better, mate.
      The fact that we are now getting recommended spec sheets that include frame gen and DLSS to reach 60 fps is a tragedy.
      If those targets were 120 Hz then sure, I'd have no issues, but 60???

  • @og.StudMuffin
    @og.StudMuffin 15 days ago +191

    The slow release of DLSS features makes it harder for people to realise that those tiny artifacts aren't what games normally look like. People just assume the blur and jitter is normal.

    • @FatherLamb
      @FatherLamb 15 days ago +27

      Yep! I was called out with "Oh, it's just how the game is", yet no... it's not. A lot of people like myself NOTICE these issues when using DLSS or FSR & FG.
      It's NOT normal at all. The fact it's being shrugged off as "Meh, whatever" is not good.

    • @banner7310
      @banner7310 14 days ago +7

      I'm never using DLSS or any frame gen crap. A 30 fps, 33.33 ms frametime doesn't change much when it's rendered out to 100 fps, so it's gonna feel like crap
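
      The frametime arithmetic in the comment above checks out; here is a minimal sketch (the 30 fps base and ~100 fps frame-generated figures come from the comment; the helper name is mine):

```python
# Frame generation raises the displayed framerate, but input is still
# sampled only on the real (rendered) frames.
def frametime_ms(fps: float) -> float:
    """Time between frames in milliseconds."""
    return 1000.0 / fps

base_fps = 30.0        # real rendered frames
displayed_fps = 100.0  # after frame generation

print(f"base frametime:      {frametime_ms(base_fps):.2f} ms")      # 33.33 ms
print(f"displayed frametime: {frametime_ms(displayed_fps):.2f} ms")  # 10.00 ms
# The game still reacts to input at ~33 ms intervals, so motion looks
# smoother but the controls still feel like 30 fps.
```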

    • @sudifgish8853
      @sudifgish8853 14 days ago +12

      Nvidia and parts of the game industry are basically conditioning people to think that eating shit is perfectly normal and that people complaining about it are stupid. They just have to do it long enough, and it will only be the "stupid old folks" who go on about how games were better back then. Blurriness and temporal artifacts are already just accepted, even though they make new games look worse than games over 10 years old.

    • @stangamer1151
      @stangamer1151 13 days ago +3

      To be honest, the blur has been here since TAA was introduced about 12 years ago. DLAA/DLSS actually usually reduces blur and ghosting compared to TAA.
      If you disable all kinds of AA, modern games look awful due to their complex geometry. The flickering is so annoying. DLAA/DLSS is a must to eliminate these artifacts.

    • @og.StudMuffin
      @og.StudMuffin 13 days ago +2

      @stangamer1151 antialiasing used to be a performance hit for a quality bump. With DLSS and similar tech we are giving up quality for performance. Other AA renders at high res and samples down for better quality; DLSS takes low res and upscales it for better FPS, which causes the bad image quality. We want more powerful GPUs at good prices. Nvidia is giving us more FPS at lower quality and calling it more power.
      Nobody wants to disable AA, but when it's on, games run like shit because "you can just upscale for better FPS with TAA/DLSS".
      There are 2 issues tho: developers cut corners, and Nvidia keeps giving them reasons to.

  • @rns10
    @rns10 15 days ago +336

    2:36 The 40 series had DLSS 3, but the 30 series didn't.
    Seems like these GPU releases are more like software updates than hardware upgrades.

    • @Kobe4Life2424
      @Kobe4Life2424 14 days ago +6

      DLSS 4 is available on the 40 series, just not the multi frame generation part of it, since that probably requires something in the Blackwell architecture that can't be brought over to the older gen

    • @rns10
      @rns10 14 days ago +27

      @Kobe4Life2424 Look at the specs of the 40 Super series GPUs and the 50 series GPUs: other than being 5-10% better hardware, there is no reason to believe this can't run on even a 4090, 4080 Super or 4070 Ti

    • @Kobe4Life2424
      @Kobe4Life2424 14 days ago +5

      @rns10 You're simply looking at the CUDA core counts and VRAM capacities? They don't tell the full picture. Look at the AI specs. For multi frame gen, most likely the Blackwell architecture along with the higher AI TOPS is needed to take advantage of the tech. But hey, we can all simply speculate. As long as Nvidia keeps making me money, I'll keep buying their GPUs

    • @lyxsm
      @lyxsm 14 days ago +7

      @@rns10 This would be true if it was purely about performance, but it's not. The 40 and 50 series have different processor architectures, with one being made for such features while the other isn't. This ofc doesn't necessarily mean these features are impossible to implement on the 40 series, but rather that the implementation makes no sense, since it would only run at a heavy additional cost in compute that outweighs any potential gain.

    • @Doomguy-777
      @Doomguy-777 14 days ago +13

      @Kobe4Life2424 This is all snake oil BS. The RTX 4090 literally has almost 3x the CUDA cores, RT cores and Tensor cores (which means running far more complex AI/LLM models) and a lot more VRAM than the RTX 5070, yet it won't get MFG (also an AI feature, like DLSS 4) while the latter will. The entire "aRcHiTeCtUrE" talk is BS, just like when they made DLSS accessible to the older RTX 20 series after a while, once community complaints saw through their marketing ploy and artificial technical limitations. It's all software and drivers limiting the capabilities, including FG and MFG, as shown by the Lossless Scaling software working with every GPU model using the same kind of technology, just not a proprietary one like Nvidia's MFG.
      You are being gaslit big time into thinking you need to buy the "new generation" cards when in truth they didn't evolve much hardware-wise. That's how they make a lot of profit: put in the minimum amount of effort to sell the barely better thing at a higher price. The RTX 5070 will not be better than the RTX 4090 at all. Give the 4090 DLSS 4 and MFG in a year, and it will blow the 5070 out of the water in every way imaginable.
      We are not talking about a limitation by architecture, like Intel changing CPU socket formats so often that you have to buy a new motherboard frequently. It's all made up and enforced through their drivers and software.

  • @leon3295
    @leon3295 15 days ago +858

    That CEO will say anything to get his stocks up lol. AI AI AI A AI

  • @curie1420
    @curie1420 15 days ago +930

    so... GPU companies sell us fake frames and fake resolution, meanwhile game studios tell us to be happy not owning games? I'd rather leave this hobby tbh

    • @robotnikkkk001
      @robotnikkkk001 15 days ago +35

      ...Soon enough, it'll maybe even be about *_food_*, tho
      ........Indeed a nightmare IRL

    • @kingsugoyt
      @kingsugoyt 15 days ago +99

      Just buy old games or wait for these new overpriced games to go on major sales before sweeping them up. Or consider sailing the high seas

    • @Rastamanjungle
      @Rastamanjungle 15 days ago +77

      dude, there are so many super games you can play right now on an average PC.
      Most of the best games ever are like 5-20 years old; all the new ones are just clones, barely anything new.
      And the graphics aren't that much better. I'm playing Stalker 2 currently on top settings, but I can tell you now that the 5-year-old Metro Exodus had better graphics, atmosphere, gameplay and everything.

    • @RADkate
      @RADkate 15 days ago +6

      define fake, because they are still being output; rasterization is just as inaccurate as vector reconstruction lol

    • @Justfillintheblank
      @Justfillintheblank 15 days ago +35

      @@RADkate If you have to make a BS argument like this, then you already know you're BSing yourself. It's fake frames because those frames are AI-generated.

  • @kingpic
    @kingpic 14 days ago +42

    Keyword here is fake. Fake frames vs real frames. Real frames will always be the only benchmark.

    • @jujuyee2534
      @jujuyee2534 10 days ago +1

      true

    • @phoenixvance6642
      @phoenixvance6642 10 days ago +4

      Fake frames were supposed to be supplementary... They could have been a way to get enthusiasts into 8K gaming without needing the best cards! What the hell happened...

  • @michael77309
    @michael77309 15 days ago +579

    The era of unoptimized games that can only muster 10 fps is here; big L for the gamers

    • @reza7531
      @reza7531 15 days ago +57

      at this point let's just start reading books as entertainment again, since movies, shows, anime, games, social media and everything else you can think of has become trash in the past 6 years

    • @kadmus78
      @kadmus78 15 days ago +31

      Optimizing games is a thing of the past. They're at the "optimizing gamers" stage now.

    • @pleaseenteranamelol711
      @pleaseenteranamelol711 15 days ago +2

      I don't care, because I don't care about any of these games anyway. Time to grow up.

    • @BradJohannsen
      @BradJohannsen 15 days ago +7

      And watch, DF will champion this crap like it's a win for gamers everywhere.

    • @randombleachfan
      @randombleachfan 15 days ago +14

      Unoptimized AAA games. Not sure about indie games; I think indie games are okay.

  • @Thundermonk99
    @Thundermonk99 15 days ago +224

    One of the more subtle issues introduced by the focus on Frame Generation + Upscaling is that the downsides (input lag, artifacts) don't impact the entire GPU stack equally. Rather, these downsides are much more pronounced for entry level or mid-range GPUs than on the top-end cards. On a 4090/5090, the base framerate (pre-framegen) is high enough that input latency is at an acceptable level and introducing 5-10ms of additional input latency isn't the end of the world. That same tradeoff doesn't make nearly as much sense when the base framerate is 30fps. Visual artifacting tells a similar story. On a 4090/5090 where the base framerate is 80-144 FPS, individual generated frames won't stay on the screen for very long, so any artifact is extremely short-lived and likely not to be noticed. When you go further down the stack and the base framerate is 30-50 FPS, visual artifacts would be far more noticeable.
    Even DLSS upscaling shares a similar problem. Relying on upscaling from 1440p --> 4k or 1080p --> 4k is a much better experience than upscaling from 720p to 1440p. It's easy to recommend DLSS Balanced for 4k users, but harder to recommend DLSS balanced for a 1080p monitor user.
    These technologies tend to put their best foot forward on the highest-end hardware, but the experience of using these techniques on lower-end hardware is a lot more mixed.
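
    The artifact-persistence point in the comment above can be made concrete with a small sketch, assuming 2x frame generation; the framerate figures and the function name are illustrative, not measured:

```python
# How long each displayed frame stays on screen after 2x frame
# generation, as a function of the base (rendered) framerate.
def generated_frame_persistence_ms(base_fps: float, gen_factor: int = 2) -> float:
    """Each displayed frame persists for 1000 / (base_fps * gen_factor) ms."""
    return 1000.0 / (base_fps * gen_factor)

for base in (30, 60, 120):
    print(f"{base} fps base -> {generated_frame_persistence_ms(base):.2f} ms per displayed frame")
# 30 fps base  -> 16.67 ms: a flawed generated frame lingers long enough to notice.
# 120 fps base ->  4.17 ms: artifacts flash by almost imperceptibly.
```

This is why the same frame-gen setting feels very different on a 4090 than on a mid-range card: the lower the base framerate, the longer each generated frame (and its artifacts) stays visible.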

    • @GFClocked
      @GFClocked 15 days ago +7

      Btw, same with TAA. The more fps and resolution you have, the better all these technologies work. So if you're poor, womp womp, too bad, you're not good enough for Jensen's jacket gang

    • @MrMozkoZrout
      @MrMozkoZrout 15 days ago +9

      Yeah, it's really a paradox. All this tech could be beneficial if it could "fake" performance on cards that don't really have it, but it doesn't really work there. It's really only usable on high-end stuff, which would probably be powerful enough to give a good experience without all this tech in most games anyway; it's basically just extra cream on top. I really hope somebody will come up with a better budget offering, be it AMD or even Intel or someone else

    • @grantlikecomputers1059
      @grantlikecomputers1059 15 days ago +1

      Isn’t this true with all previous versions of DLSS as well though? I suppose it’s probably more pronounced with multi-frame gen since more of the frames aren’t really rendered…
      But it sounds to me like NVIDIA has tried their best to allow you to choose exactly which settings work best to find the sweet spot for your hardware/workload. It’ll come down to individual preference whether you’d like to play with no artifacts at 20-30FPS or with a bit more latency or artifacts at 100+ fps.

    • @seeibe
      @seeibe 15 days ago +3

      "The more you buy the more you save"

    • @yourhandlehere1
      @yourhandlehere1 15 days ago +1

      80-144 fps in what? 1080p solitaire? Cyberthang it's only 20-27

  • @ryr2277
    @ryr2277 15 days ago +264

    Game devs/publishers in the next 3-8 years: our game runs natively at 15 fps with unnoticeable visual improvement, but thanks to AI 4x multi frame gen we can now run it at 60 fps :D
    Wait... optimization? What's that? We're too lazy to do whatever that thing is anyway

    • @robotnikkkk001
      @robotnikkkk001 15 days ago

      ...Well, there's still some good thing, hehe
      ....an advantage in any competitive game... hehe...

    • @xxelitewarriorxx
      @xxelitewarriorxx 15 days ago +8

      You do know that path tracing was enabled, right? With it disabled, the 5090 pushes 100+ fps at 4K raster

    • @ASlaveToReason
      @ASlaveToReason 15 days ago +4

      AMD is doing that right now with a custom version of FSR 4 for handhelds. It's specifically to 3x+ the battery life, and in that context I'm all for it.

    • @pieterlindeque7798
      @pieterlindeque7798 15 days ago +12

      @@xxelitewarriorxx so you're telling me a damn 5090 that costs more than an entire PC can't max out a game? That's amazing.

    • @xxelitewarriorxx
      @xxelitewarriorxx 15 days ago +1

      @pieterlindeque7798 The 5090 can max out every game if we're talking raster and normal ray tracing on max, but if you add path tracing it cannot achieve 60 fps natively without those features, especially at 4K resolution

  • @GENKI_INU
    @GENKI_INU 14 days ago +30

    1:40 Speaking of... Can we talk about how the RTX 5070 on laptops will still be stuck on 8 GB of VRAM?

    • @kururhai4531
      @kururhai4531 12 days ago +3

      hell nah, it's a 5060 in disguise
      a 5070 with a 128-bit bus is a real shame

    • @SerpentBoySnake
      @SerpentBoySnake 9 days ago

      Crazy

  • @Lu5ck
    @Lu5ck 15 days ago +175

    Nvidia is making a huge claim: they are upscaling from 480p to 4K!

    • @kelet-studios
      @kelet-studios 15 days ago +8

      3klikphillip did that experiment

    • @harrison00xXx
      @harrison00xXx 15 days ago +7

      From personal upscaling experience (not in real time), even with heavy AI rendering and enough time to render, anything below 720p upscaled to 4K is useless. It will look 100% fake and smeary, somehow painting-like.
      Even 720p to 4K is already "hard", which is why I would rather upscale 720p content to 1440p (2x). For real-time upscaling it's much less resource-hungry on the CPU and "AI" side, and for, say, video upscaling, you save A LOT OF SPACE by making it 1440p instead of 4K; at 1440p it can even look better than the (too extremely) upscaled 4K.
      I'm playing, for example, Cyberpunk, Stalker 2 and a few GPU-hungry online survival games with DLSS. I have "just" an RTX 2080, but I often want to enjoy 4K60 or 1440p120 on an OLED TV. To make it worse, the fps aren't allowed to drop below 40 fps or the frametimes above 12 ms, otherwise G-Sync doesn't work on the TV (40-120 Hz G-Sync range), so in games that drop below 40 fps or have framedrops/stutter due to too high a resolution, DLSS is a game-changer that keeps the "old" 2080 still "nice" to use.
      But where possible, I avoid ANY "fake" enhancements, including DLSS, even at the cost of reducing effects, texture detail, render distance, etc. down to a certain point.
      In my opinion, AI doesn't have any business enhancing every aspect of graphics, fps, etc. To me, DLSS, and now this frame generation that 2-4x's the fps, looks like a BAD ATTEMPT to compensate for resource-hungry, mainly UE5 games trying to look hyperrealistic.
      Good thing I'm nowadays often playing older titles from my childhood; those classic games are so much more entertaining and better, despite the much weaker PCs of those days.
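
      A quick sketch of the upscaling ratios discussed above (standard 16:9 resolutions; the function name is illustrative):

```python
# Linear and pixel-count scale factors between common 16:9 resolutions.
RES = {"720p": (1280, 720), "1080p": (1920, 1080),
       "1440p": (2560, 1440), "4K": (3840, 2160)}

def scale_factors(src: str, dst: str) -> tuple[float, float]:
    """Return (linear scale, pixel-count scale) going from src to dst."""
    (sw, sh), (dw, dh) = RES[src], RES[dst]
    return dw / sw, (dw * dh) / (sw * sh)

lin, px = scale_factors("720p", "4K")
print(f"720p -> 4K:    {lin:.0f}x linear, {px:.0f}x pixels")  # 3x linear, 9x pixels
lin, px = scale_factors("720p", "1440p")
print(f"720p -> 1440p: {lin:.0f}x linear, {px:.0f}x pixels")  # 2x linear, 4x pixels
```

The upscaler has to invent 8 of every 9 output pixels going from 720p to 4K, but only 3 of every 4 going to 1440p, which is consistent with the comment's observation that the 2x path holds up much better.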

    • @Relativizor
      @Relativizor 15 days ago +1

      No. From 1080p to 4K. Then optical flow for the rest.

    • @Jackthetraveller
      @Jackthetraveller 15 days ago

      you know you can choose what it upscales from; it looks really good upscaling from 1440p to 4K

    • @Jackthetraveller
      @Jackthetraveller 15 days ago

      you are not supposed to upscale from anything lower than 1080p, and you'll only need to do that if you've got a really bad GPU.

  • @paincake2595
    @paincake2595 15 days ago +532

    Jensen is pretty good at marketing; saying my iGPU "brute force renders" games at 40 fps makes it sound like heavy lifting.

    • @alandiegovillalobos
      @alandiegovillalobos 15 days ago +26

      Your iGPU is such a brute 😅 NVIDIA GPUs are civilized.

    • @paincake2595
      @paincake2595 15 days ago +37

      @@alandiegovillalobos man, imagine what my iGPU could do if it had ""AI"".

    • @Wodagazowana5000
      @Wodagazowana5000 15 days ago +1

      My PS5 renders at 40 no problem
      It's PC, it always sucks @paincake2595

    • @alandiegovillalobos
      @alandiegovillalobos 15 days ago +12

      @ hahaha 😂 oh I can’t. Will have to ask my buddy with the flashing jacket. I’m sure it’ll perform at 5090 levels though.

    • @StormierNik
      @StormierNik 15 days ago +18

      Yeah, it makes it sound like raw performance is some uncultured, savage way of doing things.

  • @hybridgaming09
    @hybridgaming09 15 days ago +238

    NVIDIA has evolved from a hardware company into a software company

    • @dante19890
      @dante19890 15 days ago +12

      it's cuz they reached max level in hardware and are trying to get AI leveled up

    • @enzoguimaraescost
      @enzoguimaraescost 15 days ago

      Yeah, RTX 5060 8 GB @@dante19890

    • @reignmans
      @reignmans 15 days ago +9

      @@dante19890 lmfao

    • @FBI_Agent_
      @FBI_Agent_ 15 days ago +7

      @@dante19890 Fucking hell my man is clueless jeez

    • @badpuppy3
      @badpuppy3 15 days ago +17

      It's called Moore's Law bro. We've reached the physical limits of what hardware can do.

  • @dazza1970
    @dazza1970 14 days ago +11

    Maybe it's just me, but no one talks about the Cyberpunk demo where the 50 series card (so 5070-5090, I guess) is getting LESS than 30 fps when no AI is used... on a STATIC screen. Is this really progress???

    • @avatarion
      @avatarion 14 days ago +2

      The 50 series cards are getting more frames than the 40 series, so yes, it's progress.

    • @pedropierre9594
      @pedropierre9594 7 days ago +1

      It's the Cyberpunk Overdrive settings, which make any card sweat, so it is on par with current gen or slightly better. BUT I've seen a recent benchmark with multi frame gen at 50-ish ms latency, so yeah, as fake-frame as can be

  • @xeofela
    @xeofela 15 days ago +575

    Paying $2k for a GPU whose raw power gives you less than 30 fps at 4K is a crime

    • @alejandroespinosa4623
      @alejandroespinosa4623 15 days ago +19

      Would you buy an overpriced 4090 instead ?

    • @Monkey56021
      @Monkey56021 15 days ago +97

      Lol they say it's 2k but you goddamn well know that you won't get any for less than 2500

    • @loayzc10
      @loayzc10 15 days ago +14

      You can't possibly think it will provide 30fps...

    • @roxxedk9897
      @roxxedk9897 15 days ago +16

      RT is included in this figure, and it's a tech that simply can't be scaled without fancy trickery. With it off, it's 70+ on last gen.

    • @Balnazzardi
      @Balnazzardi 15 days ago +33

      Dude, that would only happen in the most demanding games at the highest ray tracing/path tracing settings, and only if you don't use DLSS or frame gen (for example, Indiana Jones and the Great Circle won't run at more than 30 fps with everything maxed out, including path tracing, unless you enable DLSS or frame generation).
      If you run path tracing/ray tracing at low or medium, you get playable framerates even without DLSS or frame generation, and if you disable ray tracing entirely, you really don't have to worry about framerate whatsoever at the highest settings possible.
      I mean, yeah, you can argue that for that kind of money a GPU should run a game at full ray tracing settings without DLSS/frame generation, but when Nvidia's competitors don't have GPUs that can do that either (at all, in fact), a lot of this whining is ridiculous. It's literally only at those highest possible ray/path tracing settings in the most system-heavy games that you get such poor framerates without DLSS, and from my personal experience using DLSS 3 with a 4070 Super, as an average joe gamer I didn't see or feel any drawbacks. Sure, if you know what to look for or are more sensitive to latency, you can notice the difference, but even then, depending on the game, you can ask yourself: does it really matter?

  • @everade
    @everade 15 days ago +218

    Frame Warp is atrocious. Look at the text on the right side: it's predicting garbage based on prior frames, and it flickers heavily when it snaps back to the actual rendered frame.
    Additionally, they essentially take a 3D camera but move it in 2D, meaning you no longer get an accurate representation of where you're actually standing. To hide that, they MUST cut out your weapon and enemies and move them back to where they are supposed to be, based only on depth data. In reality, every object in 3D space would need to be cut out and moved to accommodate the new camera position, not just the characters and weapons, and then prediction inpainting would fill in the empty space. The actual frame you get is completely fake, in both viewing angle and many of its rendered pixels.
    Don't forget, you will move your camera WAY faster in actual gameplay, meaning you could end up with 30% of the pixels being inpainted with TONS of gibberish and flickering. What about a rocket flying towards your face? Will it be moved as well? What about pillars, walls and so on? They will most likely just be ignored.
    Also don't forget that your GPU just wasted tons of compute rendering pixels that get cut out and moved out of frame, and then wastes additional compute generating far worse visuals.

    • @hyl
      @hyl 15 days ago +27

      Frame warp is very cheap and proven to perform. It doesn’t use AI, and it has existed as a feature for VR games under the name “asynchronous spacewarp” for over seven years at this point. It does wonders to improve latency in VR, where latency is much more significant. The artifacting is really not that significant, even in VR, and it can make VR so much less nauseating. If you need to minimize latency as much as possible, the consequence of these artifacts is worth it, but you can always disable it if you don’t need the reduction of latency.
      All this will do is inpaint live while waiting for the next frame to be created, so it wouldn’t make your frames you would usually get have lower quality. The alternative to inpainting is a still frame. Inpainting paints between frames, not the frames themselves, so I don’t think you should be too worried about image quality reduction. It can only give a user more information.

    • @everade
      @everade 15 days ago +12

      @@hyl You're right, it's not really "AI". It's a predictive rendering algorithm based on prior frames. In short, it has no idea what's supposed to be there, so it predicts it and fills in garbage. Just open the official video, and open your eyes: it's flickering already on subtle movements. If you pause at the right moments, you see how scrambled some sections become. The text on the right side is scrambled in between certain frames during movement. And this showcase was a best-case scenario; it will be way worse during actual gameplay. I'm sure there are tons of players with bad vision who may not notice it. Me, I get hella distracted by subtle movements in the frame, because I've trained my brain to aim at moving things for the past decades.
      But sure, we can deactivate it; then you'll have to compete against players with an advantage, which isn't optimal either.

    • @hyl
      @hyl 15 days ago +11

      @ I get what you’re saying. For me though, 60 FPS through some spacewarp solution would be preferable over 30 FPS native. Though warping does result in artifacts like you say, 30 FPS to me looks like the entire frame is artifacting because of how slow the frame updates. If you want to see a bit more about this topic I think Linus Tech Tips made a video looking into the feature under the title “Asynchronous Reprojection”

    • @zerocal76
      @zerocal76 15 days ago +11

      @@hyl LTT is the last channel you want for any in-depth technical analysis. I'd be waiting on a Digital Foundry video for stuff like this!

    • @Kannibole
      @Kannibole 14 days ago +1

      Yeah, and I don't know if it's just me, but the actual lighting of the frames always seems a bit off (in the showcase, the ceiling lights especially), as in a very subtle overall flickering which is very annoying and can be super distracting
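The reprojection idea discussed in this thread (warping the last rendered frame by the newest camera delta, then filling in the revealed edge) can be sketched in a few lines. This is a minimal illustration with hypothetical helper names, not Nvidia's or Oculus's actual implementation; real systems inpaint the revealed border with a predictive fill rather than just clamping to the edge pixel as done here.

```python
def frame_warp(frame, dx, dy):
    """Reproject the last rendered frame by the newest camera delta (dx, dy).
    Revealed pixels at the border are filled by clamping to the nearest edge
    pixel -- a stand-in for the inpainting a real implementation would do."""
    h, w = len(frame), len(frame[0])
    clamp = lambda v, hi: max(0, min(v, hi))
    # Each output pixel samples the input at (y - dy, x - dx), clamped in-bounds.
    return [[frame[clamp(y - dy, h - 1)][clamp(x - dx, w - 1)]
             for x in range(w)] for y in range(h)]

# A tiny 3x4 "frame" with distinct pixel values.
frame = [[0, 1, 2, 3],
         [4, 5, 6, 7],
         [8, 9, 10, 11]]
warped = frame_warp(frame, dx=1, dy=0)  # content shifts one pixel right
print(warped[0])  # [0, 0, 1, 2] -- the left column is duplicated edge pixels
```

The duplicated left column is exactly the "revealed" region a shipping implementation would have to fill with something more plausible; the latency win comes from this warp being far cheaper than rendering a whole new frame.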

  • @blobeyeordie
    @blobeyeordie 15 days ago +151

    Imagine playing Hogwarts Legacy and everyone is a Harry Potter by Balenciaga character

    • @Shares1004
      @Shares1004 15 days ago +18

      to be honest, that would be kinda funny in its own way^^

    • @GENKI_INU
      @GENKI_INU 12 days ago +2

      @@blobeyeordie There's probably a mod for that

    • @blobeyeordie
      @blobeyeordie 12 days ago +1

      @@GENKI_INU i would play it just for that reason

    • @CoffeeDrinkerKim
      @CoffeeDrinkerKim 10 days ago +3

      “Imagine playing Hogwarts Legacy” I’d rather watch paint dry.

    • @MegaDarklee
      @MegaDarklee 10 days ago +1

      @@CoffeeDrinkerKim the game was good though. I thoroughly enjoyed it, although I feel they dropped the ball on post-launch support.

  • @AlexLexusOfficial
    @AlexLexusOfficial 14 days ago +11

    at this point DLSS 4 just feels like you're watching a video, not rendering the frames

  • @dwayne_
    @dwayne_ 15 days ago +140

    9:39 The average PC gamer is not the average person. They will notice all these flaws.

    • @deejnutz2068
      @deejnutz2068 15 days ago +16

      No they won't.

    • @fixo5132
      @fixo5132 15 days ago +14

      Especially those paying $1K+ for a card...

    • @dwayne_
      @dwayne_ 15 days ago +12

      @deejnutz2068 I think they will, especially if they switch from native resolution. I'm not saying it's so bad that they won't use it; a lot of people use the current DLSS. All I'm saying is that PC gamers are not as oblivious as people make them out to be.

    • @TakenWithout
      @TakenWithout 15 days ago +6

      I agree with this because any gamer who’s even somewhat into competitive games is already trained to spot pixels that look different (because that means an enemy) so they’re gonna notice “wrong” pixels in AI generated frames

    • @Nicktoon100
      @Nicktoon100 15 days ago +1

      lol

  • @guille92h
    @guille92h 15 days ago +255

    Graphics haven't improved much since 2019, but now we get unoptimised games. RE2 Remake (2019) and SWBF2 (2017) looked great, didn't even have ray tracing, and could run on many GPUs.

    • @guille92h
      @guille92h 15 days ago +57

      I prefer good, stable graphics over AI-generated ultra-realistic graphics that look strange, but Nvidia wants games to be artificially more demanding and to require exclusive features available only on their latest GPUs, so they can sell $2,000 GPUs. They even pay developers to force games to include their features.

    • @enzoguimaraescost
      @enzoguimaraescost 15 days ago +2

      Good times

    • @torchbearer3784
      @torchbearer3784 15 days ago +19

      Graphics have not improved much since 2010. I have been playing older games and been blown away by how good some of them still look.

    • @張彥暉-v8p
      @張彥暉-v8p 15 days ago +1

      You have bad eyes, bro.

    • @brando3342
      @brando3342 15 days ago +21

      Graphics peaked in 2016, and have actually been degrading since 2019 since this AI trash really got going.

  • @frankjohannessen6383
    @frankjohannessen6383 15 days ago +29

    When the RTX 90 series comes out there won't be any pixels or game at all; your GPU will just have an AI that will convince you that you've just played the most awesome game ever.

    • @Phininx
      @Phininx 15 days ago

      You will just be livestreaming your videocard :)

    • @owdoogames
      @owdoogames 15 days ago +1

      That's one way to clear my backlog XD

    • @upanddown1132
      @upanddown1132 14 days ago +1

      Yea and it will cost $5k

  • @Egon_Freeman
    @Egon_Freeman 14 days ago +29

    @8:59 YESSSS, those "duplicating" thingies are exactly the same artefacts you get when doing "optical flow" retiming in video editing. It's essentially trying to estimate _motion vectors_ - and, for that particular part of the image - _failing._ This often happens in areas that are repetitive, so the algorithm doesn't quite know which part is in the past and which is _in the future,_ so to speak (with a repeating pattern it is at times impossible to tell which way the motion is going, since both directions can look exactly the same - at least without knowing a few frames in advance).

    • @raven4k998
      @raven4k998 14 days ago +1

      I do not use DLSS 2 or DLSS 3, so why would I use DLSS 4?

    • @armet1272
      @armet1272 13 days ago

      Will there be a technology to remove those issues?

    • @Egon_Freeman
      @Egon_Freeman 13 days ago

      @@armet1272 It will likely work better with some _jitter_ introduced (it would still be wobbly, but much less so). Increasing the resolution may also help.

    • @StrazdasLT
      @StrazdasLT 12 days ago

      @@raven4k998 You don't have to play video games either.

    • @raven4k998
      @raven4k998 12 days ago

      @@StrazdasLT Are you OK? Are you on something? Because I did not say anything about video games, kid. Have a cookie, calm down, and learn how to read, because I never said anything about video games.
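The ambiguity Egon_Freeman describes above (motion estimation failing on repetitive areas because the pattern matches equally well in more than one direction) can be demonstrated with a toy block matcher. This is an illustrative sketch, not any real optical-flow implementation: a brute-force 1-D search that scores each candidate shift by mean absolute difference and reports every shift that ties for the best score.

```python
def best_shifts(prev, curr, max_shift):
    """Brute-force 1-D block matching: score each candidate shift by the mean
    absolute difference between curr and prev displaced by that shift, and
    return every shift that ties for the lowest error."""
    n = len(curr)
    scores = {}
    for s in range(-max_shift, max_shift + 1):
        pairs = [(curr[i], prev[i - s]) for i in range(n) if 0 <= i - s < n]
        scores[s] = sum(abs(a - b) for a, b in pairs) / len(pairs)
    best = min(scores.values())
    return sorted(s for s, err in scores.items() if err == best)

# A strictly periodic pattern (period 4), shifted right by one pixel:
pattern = [0, 9, 0, 0] * 8
moved = pattern[-1:] + pattern[:-1]
print(best_shifts(pattern, moved, max_shift=4))  # [-3, 1]: two perfect matches
```

The matcher finds two zero-error shifts (+1, the true motion, and -3, one period in the other direction); whichever one an interpolator picks for a given block, neighbouring blocks may pick the other, which is exactly the "duplicating" wobble seen in retimed video and generated frames.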

  • @schoologylibrarybot4311
    @schoologylibrarybot4311 15 days ago +550

    If the game already feels like shit DLSS isn’t going to help

    • @FO0TMinecraftPVP
      @FO0TMinecraftPVP 15 days ago +1

      COPE
      Amd is for poor people

    • @googleplaynow9608
      @googleplaynow9608 15 days ago +32

      DLSS makes tons of shit feeling games way better already...🤡

    • @fs5866
      @fs5866 15 days ago +11

      Actually I think this is a net positive for everyone; there aren't that many demanding games unless you turn on ray tracing

    • @Nurix09
      @Nurix09 15 days ago

      @@googleplaynow9608 DLSS helps, but it becomes a problem if the true framerate is 20 fps and the rest are fake frames.

    • @longplaylegends
      @longplaylegends 15 days ago +46

      ​@@fs5866 Uhhh... Unless you have a 4070 class card or better, I think a lot of people would disagree with you.

  • @telepop123
    @telepop123 15 days ago +194

    Vex is pointing out issues with DLSS 4 in the Digital Foundry video that Digital Foundry didn't point out themselves.

    • @arthurarthur6083
      @arthurarthur6083 15 days ago

      DF are Nvidia bootlickers

    • @cythose4059
      @cythose4059 15 days ago +5

      It's still too early to say whether it's good or not. DLSS 1.0 and 2.0 weren't great in many games either, but DLSS 3.7 is absolutely great compared to FSR, and in some games even compared to native, if you play at 1440p or 4K.

    • @RichardPhillips1066
      @RichardPhillips1066 15 days ago

      That's because DF are industry shills

    • @aweirdwombat
      @aweirdwombat 15 days ago +51

      Honestly I've been kind of disappointed in Digital Foundry for a while. I don't think they're being paid for glowing reviews or anything, but I think their love of new technology biases them a lot.

    • @pepeedge5601
      @pepeedge5601 15 days ago +43

      @@aweirdwombat
      They seem like tech shills to me.
      Not shills to tech companies, but shills to whatever new tech comes out.
      Overexcitement maybe.

  • @86lanzo
    @86lanzo 15 days ago +144

    Is it just me, or are Nvidia's GPUs straight-up changing our games now?
    If they're changing textures, faces and lighting, they're changing the developer's vision.
    That's very different from just adding fake frames.

    • @s0ulll0warri0r
      @s0ulll0warri0r 15 days ago +17

      there is no such thing as a developer's vision or talented developers anymore

    • @random_person618
      @random_person618 15 days ago +10

      Exactly. Some games are made by devs who pay serious attention to every detail, texture, model and light, and that all gets ruined by NVIDIA using AI to literally recreate their whole game.

    • @marcogenovesi8570
      @marcogenovesi8570 15 days ago +3

      @@random_person618 what are those devs? I have not seen many of those in AAA

    • @random_person618
      @random_person618 15 days ago +2

      @@marcogenovesi8570 So you're saying that just because a developer doesn't do what I previously said, they don't have a game? They still made a whole game that people can play, albeit only the rich people, it's not like they used AI to generate the game. Even Minesweeper or Tetris was made by a developer, it's not like the game spawned in out of nowhere. Now NVIDIA is generating 200+ frames with AI which still ruins a lot of efforts that the developer made, which is unfair because the developer didn't use AI but NVIDIA is forcing people to use DLSS because of the 30 FPS you'd get otherwise. Even in the AI-enhanced face that was shown in this video, it is pretty clear that between the original face and the AI one, a lot of shadows and lighting effects were altered. That's my whole point.

    • @astromos
      @astromos 15 days ago +4

      Your argument makes no sense. The developers can choose whether they want to implement these features. The bad part is if they're used as a crutch to cover for poor optimization.

  • @----.__
    @----.__ 6 days ago +1

    I grew up on the C64, then A500, then on to PC. I've seen graphics evolve massively in my lifetime.
    Never thought I'd live long enough to see them devolve... One would think it would be a better idea to teach current programmers how to optimise instead of making hand-wavium BS software/hardware fixes for the problems that modern programming and forced crunch brings.
    If you ever want to see what _real_ programming looks like, check out the demo scene, most notably the 64kb competitions. Hardcore coders using less memory than a single jpeg can make full motion 3D videos, with banging sound tracks, in 64kb because they _know_ how to code efficiently.

  • @greenwhite79
    @greenwhite79 15 days ago +374

    RTX 5070 with 12GB VRAM is a joke.

    • @josephstewart7696
      @josephstewart7696 15 days ago +25

      For real, I'm building my first PC right now with a 4070 Ti Super and that's got 16 GB of VRAM

    • @BloodLetterG
      @BloodLetterG 15 days ago +12

      If the 5070 is better than the RX 7800 XT (at raw performance) I'm 100% getting the 5070

    • @UltraVegito-1995
      @UltraVegito-1995 15 days ago +68

      @@BloodLetterG your raw performance won't get you there once that GPU is starved of VRAM

    • @BloodLetterG
      @BloodLetterG 15 days ago +7

      @@UltraVegito-1995 Well, DLSS 4 might make up for the VRAM

    • @quakers474
      @quakers474 15 days ago +19

      @@UltraVegito-1995 It's not a 4K card, so it's not going to eat up more than 12 GB of VRAM. I have a 4070 Ti Super and I've hardly ever utilized more than 12 GB, even with ray tracing, which honestly doesn't make a difference in some titles. It's just a bullshit argument for anyone on 1440p or 1080p cards. For 4K, yes, 12 isn't enough.

  • @WyFoster
    @WyFoster 15 days ago +82

    Crazy how FPS is no longer the metric of performance; it is now latency. Because with frame gen we can have all the FPS we want, but the response could be terrible.

    • @agamaz5650
      @agamaz5650 14 days ago +6

      and it is extremely depressing since OLED monitors now have 0.03 ms latency

    • @WyFoster
      @WyFoster 13 days ago

      @agamaz5650 At least they won't make it worse.

    • @Heavenset
      @Heavenset 13 days ago

      The only hope is AMD or Intel, but Intel is still too young and AMD is just an Nvidia copycat

    • @Shaggii_
      @Shaggii_ 11 days ago

      You’re right this is the way things are headed. I never thought things would get this bad tbh

    • @mataloce
      @mataloce 10 days ago +2

      @Heavenset I don't even think they will try to compete... this business model is way too profitable if every one of them applies it...
      Imagine saving on raw architecture and investing heavily in AI models that can be copied between generations, only giving those features to the latest series, and so on... the gaming industry just became the cellphone industry

  • @BDOGreatSaiyanMan
    @BDOGreatSaiyanMan 10 days ago +4

    Fake frames should not count towards performance.
    Pure rasterization is what should matter. The truth is that the 50 series is a MINOR upgrade over the 40 series... The whole generation is nothing more than marketing spin and shows what happens when there is a lack of competition in the marketplace. Remember Intel until Ryzen came along???

    • @iris4547
      @iris4547 9 days ago

      This is the reason 3-4x frame generation is locked to the 50 series: the raw performance gain is only roughly 20% over the 40 series.

  • @BarZaTTacKs
    @BarZaTTacKs 15 days ago +40

    We should change the testing to a pixel quality test where we analyze the screen pixels to kind of hold them more accountable for this

    • @jarrettleto
      @jarrettleto 15 days ago +8

      Yeah, I'm mostly tired of them saying how DLSS looks better now, when every time I turn it on I can see the motion blur and feel the input lag. It looks like shit; stop telling me it doesn't.

    • @forasago
      @forasago 14 days ago +2

      That ship sailed when TAA and dynamic resolution (or fixed upscaling) became the industry standard in the mid 2010s.

  • @daviddamasceno6063
    @daviddamasceno6063 15 days ago +217

    Playing with frame gen on my current 4080 feels like playing on console on a very bad TV with no game mode. It's doable but man, it's so unpleasant. I hate that this is the future that NVidia is pushing.

    • @siriansight
      @siriansight 15 days ago +1

      profound

    • @tessierrr
      @tessierrr 15 days ago +31

      And people are literally praising frame gen 🤣 a higher number must mean better 🤣

    • @francistaylor1822
      @francistaylor1822 15 days ago +18

      I hate fake frames; I dunno, it just feels wrong

    • @andrewsolis2988
      @andrewsolis2988 15 days ago +18

      This is why, instead of hoping AMD competes to lower Nvidia's prices, the masses need to support the competition to send a message. If the AMD 9070 can hit 4080 levels on raster, I am switching. I hate when companies feel entitled to our money due to blind loyalty! There is still hope, as NOBODY knows the true performance of the 9070; only AMD and board partners do.

    • @RonaldoxSiu
      @RonaldoxSiu 15 days ago +8

      You're just picky; frame gen is good even from 40 fps to 80~

  • @diogoalmeidavisuals
    @diogoalmeidavisuals 15 days ago +37

    Here's what I don't see anyone mention: how does this affect performance in VR,
    where generating different content for each eye can be an issue?

    • @rich4513
      @rich4513 15 days ago +8

      Yep... AFAIK VR can't utilize frame gen without messing up the render

    • @michaeljamesm
      @michaeljamesm 15 days ago

      chameleon eyes vision

    • @TheMalT75
      @TheMalT75 14 days ago +1

      Aeons ago, I actually bought an LG TV with passive stereoscopic 3D and was watching "3D Blu-rays" on it. In principle, amazingly cool. However, compression-based mismatch of "textures" such as brick walls presented the left eye not only with a different 3D perspective than the right eye, but also with a different texture, so the flat brick wall itself became a confusing, frame-by-frame-changing 3D-ish object. Very trippy! Kryptonian armor in the movie "Man of Steel" as an example of that effect is still stuck in my brain, and I cannot imagine VR being different...

    • @ethyr
      @ethyr 8 days ago +1

      VR is the only thing I look forward to in gaming nowadays... and it's sad to see pure rasterization take a backseat.

  • @Johnnyynf
    @Johnnyynf 14 days ago +5

    About the Reflex 2 stuff:
    I have seen some videos about similar tech in VR, as it's more graphically demanding and requires low latency so you don't feel dizzy.
    The real innovation I think they have here is that they don't need a bigger image ready to be moved around; instead they use outpainting

  • @tonmoyratul3799
    @tonmoyratul3799 15 days ago +296

    this is the time for AMD to shine by giving us actual performance and frames instead of fake frames.

    • @hyperturbotechnomike
      @hyperturbotechnomike 15 days ago +42

      I'm waiting for an intel B770

    • @dante19890
      @dante19890 15 days ago +51

      Yeah, since when has AMD come and saved the day?

    • @peanut93able
      @peanut93able 15 days ago +3

      true because fsr4 upscaling looks much better now

    • @christroy8047
      @christroy8047 15 days ago +16

      @@peanut93able We don't know how good FSR4 will be yet.

    • @jonji-kz2nz
      @jonji-kz2nz 15 days ago +11

      AMD is always behind in GPUs

  • @loayzc10
    @loayzc10 15 days ago +34

    It does not add 200+ frames on top of 24 or whatever. The frame rate is first boosted with DLSS to get a decent base of 60, then 60x4 = 240.

    • @Patataa4k
      @Patataa4k 15 days ago +10

      Yeah, true. I don't understand how Vex doesn't realize something so simple to see.

    • @JonBslime
      @JonBslime 15 days ago +11

      Because there was also an example of a game running at 124 fps with DLSS 4, which means a 30 fps base… and even if it's a 60 fps base, it still adds latency, if you actually watched the entire video. It's simply a bad gimmick to get people to buy more.

    • @nameless.0190
      @nameless.0190 15 days ago +5

      @@JonBslime Tbh I find frame gen even more pointless because of that... if you can already get 60 fps in single-player games, what's the point of using frame gen and doubling latency to get more? Low-end cards do not have enough VRAM for it (FG eats it up like a mf), so that is out of the question.
      The more beneficial use case would be multiplayer games, but the latency penalty is too much for it to be viable. Maybe one day, if they can get the latency penalty down to within a few ms.

    • @Patataa4k
      @Patataa4k 15 days ago +4

      @@JonBslime It's easy to do the calculation:
      30x4 = 120
      60x4 = 240
      What I don't understand is the thumbnail of the video, which says 214 fake frames. Doing the math, it's easy to see what the base FPS is with DLSS Performance etc., and then apply the 4x of MFG. If I have to use DLSS for better performance, I use it. I also have to say that I'm not a big fan of frame generation.

    • @alperen1383
      @alperen1383 15 days ago +1

      120 hertz already looks smooth as butter; why would I go even higher and add more latency and artifacts?
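The arithmetic in this thread (upscale first to raise the rendered base, then multiply with MFG) can be written out explicitly. This is illustrative bookkeeping, not Nvidia's published pipeline numbers; the 2x upscaling gain below is an assumption for the example, and only the rendered frames sample fresh player input.

```python
def mfg_breakdown(native_fps, upscale_gain, gen_factor):
    """Rough bookkeeping for an upscale-then-generate pipeline:
    - upscaling multiplies the *rendered* frame rate (assumed gain),
    - multi frame generation multiplies the *displayed* frame rate.
    Returns (rendered, displayed, generated) frames per second."""
    rendered = native_fps * upscale_gain
    displayed = rendered * gen_factor
    generated = displayed - rendered  # frames that are interpolated, not rendered
    return rendered, displayed, generated

rendered, displayed, generated = mfg_breakdown(native_fps=30, upscale_gain=2, gen_factor=4)
print(rendered, displayed, generated)  # 60 240 180
```

So a "240 fps" figure can decompose into 60 rendered and 180 generated frames, which is why the base frame rate, not the displayed one, is the number that matters for responsiveness.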

  • @rodneypearce6113
    @rodneypearce6113 15 days ago +92

    Faces at 2:06 - the AI face looks like it has a plastic sheen and feels not quite right, but the standard one to me looks more genuinely human. How is that better?

    • @Rov-Nihil
      @Rov-Nihil 15 days ago

      Luckily we've been trained a bit to spot AI-faked images. It's all smoothing from now on, folks!

    • @HomelessShoe
      @HomelessShoe 15 days ago

      Well, not everyone is born looking like a goblin. There are genuinely natural beauties.
      And are you saying I am plastic? lol, I have naturally glowing skin, because I take care of my body and skin.

    • @rodneypearce6113
      @rodneypearce6113 15 days ago +31

      @HomelessShoe lol username does not check out.

    • @scarm_rune
      @scarm_rune 15 days ago +23

      @@HomelessShoe why are you getting flared up over a comment?

    • @jokered1133
      @jokered1133 15 days ago +15

      I think the standard one looked better because it looks more game-like, while the AI face just looks like a deepfake, you get the live action face and you try to plop it into a 3d rendered scene and it will look weird.
      Keeps reminding me of fully 3D rendered movies and how acting will be a relic of the past, and all movies will look just like real life but rendered on Unreal Engine 9.

  • @bigeddiekane4995
    @bigeddiekane4995 9 days ago

    I'm terrible at finishing long YouTube videos, but dang, man, you have a flow that is so smooth. You have your claims, evidence, backup videos, and a way to keep the watchers engaged. I enjoyed the way you put everything together flawlessly and explained exactly what's in the summary. 🔥

  • @Rov-Nihil
    @Rov-Nihil 15 days ago +49

    Remember when TVs had this super smooth motion feature and how well those sold?

    • @dante19890
      @dante19890 15 days ago +15

      They still have them; motion interpolation is very common in TVs. But it carries a really high input-lag penalty and looks very bad for gaming, because TVs don't have access to the game's internal values.

    • @tonytrampolini2958
      @tonytrampolini2958 15 days ago +5

      Literally every modern smart TV ships with this, lmao. Terrible example.

    • @ryr2277
      @ryr2277 15 days ago +4

      I really hate that feature on TVs. It turns a 24 fps cinematic movie into a TV series with weird jelly-like movement and noticeable ghosting.

    • @draco2xx
      @draco2xx 15 days ago

      TVs still have those, and it's good. It doesn't hold up for games, but for movies it's great.

    • @tonytrampolini2958
      @tonytrampolini2958 15 days ago

      @@ryr2277 Agreed, I hate it. But it hasn't hurt TV sales at all, like this comment suggests.

  • @dragonmares59110
    @dragonmares59110 15 days ago +50

    My issue is that it is still $1,000 for 16 GB of VRAM... being stuck with a 3070, I cannot financially justify upgrading yet

    • @TheMissingDislikeButton
      @TheMissingDislikeButton 15 days ago +1

      Technically it's $750; the 5070 Ti has 16 GB

    • @dragonmares59110
      @dragonmares59110 15 days ago +19

      @@TheMissingDislikeButton It will be 1000-1200 euros in Europe, same as what happened with the 4070 Ti

    • @RudiRatte-v1p
      @RudiRatte-v1p 15 days ago

      @@dragonmares59110 yes, it's a shame that after 4 years you can't get a card that's twice as fast for 400 bucks.

    • @BladeRunner031
      @BladeRunner031 15 days ago +10

      I switched from a 3070 Ti to a 3080 Ti 12 GB and the difference is huge. Sounds stupid, but it is; those 4 GB really make the difference in games that get close to using 7 GB or more. But the biggest, like huuuge, performance boost I got was switching from a 10700K to a 9800X3D.
      For example, in The Witcher 3, just switching cards got me 4 fps, from 73 to 77, but fewer fps drops due to memory. After buying the new CPU, that jumped to 110 fps.

    • @NihongoWakannai
      @NihongoWakannai 15 days ago +2

      Nvidia could easily give more, but they know they can milk people for AI by forcing high VRAM onto more expensive cards.

  • @pastuh
    @pastuh 15 days ago +18

    When you play VR games, you might feel discomfort or pain in your eyes. Why does this happen? It's often due to flickering lights. Sometimes you notice the flickering, and sometimes you don't, but your brain still processes it.
    Obvious random DLSS lighting artifacts will strain your eyes, even when simply watching monitor pixels.

    • @metabang03
      @metabang03 15 days ago +4

      DLSS, THE GUARANTEE YOU WILL BE EXPERIENCING YOUR GAME IN THE FUCKING PAST

  • @elMario02
    @elMario02 14 days ago +11

    0:23 The 3070 came out with 8 GB of VRAM and the 2080 Ti has 11 GB. The 2080 Ti has aged better...

  • @kauamachadodeoliveira
    @kauamachadodeoliveira 15 days ago +11

    Fun fact: if you use the driver-level frame generation on an AMD GPU together with the in-game frame generation, you get basically the same result as DLSS 4, except for the image quality, for now at least

  • @AShortsskipper
    @AShortsskipper 15 days ago +35

    From not owning games to not owning frames

  • @olebrumme6356
    @olebrumme6356 15 days ago +11

    What some don't seem to understand is that no matter how much Nvidia reduces input latency, you will still have the base framerate's latency. So say a base framerate of 30 fps that MFG gets up to 120 fps: you will have 33.33 ms, and NOT the 8.33 ms you would get from native 120 fps.

    • @rikuleinonen
      @rikuleinonen 15 days ago

      Keep in mind that DLSS can easily double your base frame rate if you're GPU bound due to AI upscaling, which positively affects input lag. If your base frame rate without DLSS is 30 fps, then with DLSS 4 it may jump all the way to 240 fps with your base fps doubling. It's honestly crazy technology and doesn't look as bad as people make it out to be.
      Only for companies to throw it all under a bus and lower your base frame rate to 15 fps. Say hello to good old TAA smearing, because you WILL experience it if you didn't before.

    • @manefin
      @manefin 15 days ago +2

      @@rikuleinonen When I use frame gen, my input lag always increases.
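The latency arithmetic in this thread can be made concrete. A simplified sketch (it ignores render queueing, Reflex, and the extra hold time frame generation itself adds, so real numbers will be somewhat worse): displayed smoothness scales with the generated rate, but fresh game state only arrives once per rendered frame.

```python
def latency_floor_ms(base_fps):
    """Input latency can't drop below the time between *real* (rendered) frames."""
    return 1000.0 / base_fps

def displayed_frame_time_ms(base_fps, gen_factor):
    """Perceived smoothness follows the displayed rate: base_fps * gen_factor."""
    return 1000.0 / (base_fps * gen_factor)

# 30 fps base, 4x multi frame generation:
print(round(displayed_frame_time_ms(30, 4), 2))  # 8.33  -- looks like 120 fps
print(round(latency_floor_ms(30), 2))            # 33.33 -- still feels like 30 fps
```

The gap between those two numbers is the disconnect people describe: the screen updates every 8.33 ms while the game only reacts every 33.33 ms at best.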

  • @TrueReshiram
    @TrueReshiram 9 days ago +4

    I'm sick of all this AI BS!
    I want raw power and speed; I'm tired of TAA and other lazy practices.
    Every game I play that uses Unreal Engine is a blurry mess, but the ones that don't look sharp!

  • @andrewsolis2988
    @andrewsolis2988 15 days ago +17

    I refuse to fall into this trend. Please AMD, focus on raster!!! Thank you Vex for calling out BS! Respect!!

  • @lordfreezer9550
    @lordfreezer9550 15 days ago +62

    Meanwhile us 1080p literal GODS playing any game at 200 FPS with these barely sufficient 4k focused cards

    • @visionop8
      @visionop8 15 days ago +8

      You do have a point there. I like 1440p myself but I'll go to 1080p for framerate purposes as I did in Dead Space Remake. The secret is a high-end audio receiver with an upscaler and game mode with low latency. 1080p games actually look incredible on a 4k screen and you can run with REAL framerates with little to no input lag.

    • @alexlara5326
      @alexlara5326 15 days ago +4

      I like 4k sorry 😢

    • @XiLock_Alotx
      @XiLock_Alotx 15 days ago +11

      Staying at 1440p for the foreseeable future. My 4090 and 7800x3d are more than capable for at least this entire gen.

    • @jready1455
      @jready1455 15 days ago +1

      must suck to be able to see pixels on your screen

    • @bruhtholemew
      @bruhtholemew 15 days ago

      Not on the latest poorly optimized UE5 games at least.

  • @stephanmilius3598
    @stephanmilius3598 15 days ago +98

    Just use "lossless scaling" software for multi frame generation if you have an older card. Works fine for me.

    • @user-ey2mz9cz3h
      @user-ey2mz9cz3h 15 days ago +5

      Lossless Scaling works fine, but as shown it uses PC resources to generate those frames.
      Performance- and optimisation-wise MFG is better, because it runs in an optimized workflow and can access software-, driver- and hardware-level stuff (it's all Nvidia).
      It has more potential.
      But does it look good, does it perform well, and is it worth spending the cash? Especially since all the DLSS/Reflex optimisations also reach the 4000 series; combined with Lossless Scaling it's up there too. That leaves a 20-30% improvement, for how much?
      I don't think it's a bad call to pick up a second-hand 4090 now, while all the 4090 sellers are rushing to sell because "the 5070 is going to perform like a 4090."

    • @ArdaU
      @ArdaU 15 days ago +6

      It's so bad that I can't even use that shit for watching anime; it creates artifacts like crazy. I would rather have 60 fps.

    • @Raynhardx
      @Raynhardx 15 days ago

      @@ArdaU Same. Tried it on Factorio, and the artifacts in the intermediate frames are way worse than the desire for more fps. Thankfully I found another way to run Factorio at 120 fps native via a mod. I guess the tech only really works if the images are already extremely busy with millions of polygons and textures. For 2D games or anime, every glitch sticks out like a sore thumb.

    • @ArdaU
      @ArdaU 15 days ago

      @@Raynhardx FSR 3 frame generation is great, btw; it really feels like you're actually getting high fps, but when I move, if I look closer, it feels a bit weird for some reason

    • @theomcreich821
      @theomcreich821 15 days ago +5

      Sorry, but it's just not usable; I have characters vanish when turning the camera, and very visible artifacts.

  • @its_lucky2526
    @its_lucky2526 11 days ago +3

    I am SO confused. Why would a next-generation GPU have the same performance as the previous one? Isn't that a BAD thing?

    • @Iwasfoundunharmed
      @Iwasfoundunharmed 11 days ago

      Why not deliver more performance but ALSO add the frame gen?

  • @SueDoeNym-b4d
    @SueDoeNym-b4d 15 days ago +8

    17:03 Frame Warp is not actually that new. John Carmack was talking about what he called Time Warp, which they used in VR, back in 2017.
    If anyone is interested, it's in the UMKC talk at around the 30-minute mark.

  • @P4el1co
    @P4el1co 15 days ago +7

    Convolutional neural networks tend to be smaller and more computationally efficient than transformer models, meaning that the new model will probably use more VRAM and more cores.

    • @HansWurst-I
      @HansWurst-I 15 days ago +1

      Yup, so the FPS uplift should be way lower.
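The size argument in this thread can be illustrated with back-of-the-envelope parameter counts. These are generic textbook formulas with made-up layer sizes, not the actual DLSS model dimensions (which Nvidia has not published in this form).

```python
def conv_layer_params(c_in, c_out, k):
    """Weights in one k x k convolution layer, plus one bias per output channel."""
    return c_in * c_out * k * k + c_out

def self_attention_params(d_model):
    """Weights in one self-attention block: Q, K, V and output projections,
    each d_model x d_model (biases and the usual following MLP ignored)."""
    return 4 * d_model * d_model

# Illustrative sizes only -- not the real DLSS model dimensions.
print(conv_layer_params(64, 64, 3))   # 36928
print(self_attention_params(256))     # 262144
```

Even with modest widths, a single attention block carries several times the weights of a typical conv layer, which is the intuition behind expecting the transformer model to cost more VRAM and compute per frame.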

  • @AintImpressed
    @AintImpressed 15 days ago +10

    Could you please remake this video when the new model from DLSS 4 is released on the 40 series? Curious how the transformer model is going to do against the current CNN.

  • @fazepancake
    @fazepancake 14 days ago +3

    How is the raw performance going to compare to previous Nvidia GPUs?

  • @Mathster_live
    @Mathster_live 15 days ago +55

    Classic manufactured generational improvement

    • @TheMSilenus
      @TheMSilenus 15 days ago +1

      Why does Blackwell have more AI TOPS than Ada, then? No improved tensor cores?

  • @imacomputer1234
    @imacomputer1234 15 days ago +84

    Vote with your wallets.

    • @chloemarietaylor4036
      @chloemarietaylor4036 15 days ago +12

      Entirely, sticking with my 980ti for yet another gen. Eyeing up the Arcs.

    • @draco2xx
      @draco2xx 15 days ago +3

      The 4090 is doing just fine; not worth upgrading yet, and it will get cheaper now. Scalpers will have to sell at a loss 🥳🥳🎉

    • @GoodStoryGames
      @GoodStoryGames 15 days ago +1

      I'm definitely getting a 5080

    • @alexanderbaca7352
      @alexanderbaca7352 15 days ago +2

      AMD IGPU for me, Nvidia is just gambling for frames.

    • @javiervelez9329
      @javiervelez9329 15 days ago +5

      I will when I buy a 5080. More raw performance and better RT and FG than anything AMD has to offer.

  • @kanashi6736
    @kanashi6736 15 days ago +20

    Thanks for making this, man. It's horrifying how many people are just lapping this shit up. It's so disappointing to me.

    • @jmangames5562
      @jmangames5562 15 days ago

      I hope nobody buys them so I can get a few!! I see way more man-children hating on the 50 series in the comments, 80% at least. Good, stay on your past gen and shut up!!

    • @twotonekrazy7051
      @twotonekrazy7051 15 дней назад +1

      In order to sell a product you need consumers.
      Nvidia is pushing fake frames so hard because there are so many people out here willing to buy this bullcrap.
      It fucking sucks.

  • @ap-kl8lq
    @ap-kl8lq 11 дней назад +1

    5:34 this is my main concern, which no video showed
    the artifacts are already very visible with what we have, I can only imagine it'll look way worse with multi frame gen

  • @asdsad17
    @asdsad17 15 дней назад +39

    i'm more interested in the upscaler upgrade.
    maybe dlaa will be less blurry now.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 15 дней назад +6

      DLAA blurry? Wtf are you talking about? With the 3.7 update it is so sharp, plus you can manually adjust sharpness using DLSSTweaks

    • @Wodagazowana5000
      @Wodagazowana5000 15 дней назад +1

      @@DragonOfTheMortalKombat how to know what sharpness is correct tho

    • @SamuraiSupreme
      @SamuraiSupreme 15 дней назад +11

      @@DragonOfTheMortalKombat the shits blurry, this is hard cope icl

    • @christophermullins7163
      @christophermullins7163 15 дней назад +1

      Yeah I am stoked to see if the lower res upscaling looks better. Performance at 4k is as far as I can go but I prefer balanced. Maybe there is some magic that makes performance look great so I can actually run pathtracing at 60fps 4k with framegen

    • @eiidisa4214
      @eiidisa4214 15 дней назад +1

      dlaa is better than native what are you on 💀

  • @Noum77
    @Noum77 15 дней назад +56

    RIP game optimisation

    • @dante19890
      @dante19890 15 дней назад +10

      PC gaming has never been optimized. Always relied on bruteforce

    • @enzoguimaraescost
      @enzoguimaraescost 15 дней назад +7

      @@dante19890 of course it was: Rainbow Six, Valve games, etc

    • @dante19890
      @dante19890 15 дней назад +8

      @enzoguimaraescost you cant make an argument for the 0.02 percent. Pc gaming has never relied on optimization as a whole. Cuz u can't optimize a game for every pc hardware configuration.

    • @henry789
      @henry789 15 дней назад +1

      @@dante19890 Half-Life 2 says otherwise.

    • @mitchhudson3972
      @mitchhudson3972 15 дней назад +7

      @@dante19890 Tf does "brute force" mean? Every game used to be optimized like crazy just to get it to run. Quake, Doom, Crysis, etc. Pretty much every game from that era was running with 1/50 the processing power and looks about as good as games from way later.

  • @NewFoundPanda
    @NewFoundPanda 15 дней назад +31

    so the real question is: will the 5070 even be worth buying if you're not going to use DLSS? and if you're upgrading from a 3070 to a 5070, would the raw power alone be worth it for 2K gaming?

    • @X37V_Freecazoid
      @X37V_Freecazoid 15 дней назад +8

      Better the 5070 Ti for 16GB of VRAM; it's really a must-have in such awful times of no optimization

    • @NewFoundPanda
      @NewFoundPanda 14 дней назад +1

      @@X37V_Freecazoid Thanks for info

    • @qwertyqwerty-zi6dr
      @qwertyqwerty-zi6dr 14 дней назад +2

      5070 ti all day

    • @persephone9360
      @persephone9360 14 дней назад

      yes, absolutely. since no one actually gave you a yes or a no.

    • @dasboot5366
      @dasboot5366 14 дней назад +1

      I'm not a fan of AMD or Nvidia, but I had a very negative reaction to the first RTX 2000 generation. I already realized then that they would keep selling this scam, and I was not mistaken.

  • @MinskUK
    @MinskUK 15 дней назад +10

    We want clear images with good native frame rates. The next wave of AAA games will potentially have neither. I guess we’ll see.

    • @tux_the_astronaut
      @tux_the_astronaut 15 дней назад +5

      Optimization is a thing of the past for AAA. Frame generation and DLSS will make Nvidia more money and save money for the studios, all while gamers pay more for less.
      Everyone wins except gamers

    • @TheLoneWolfling
      @TheLoneWolfling 14 дней назад +1

      @@tux_the_astronaut In the short term.
      In the longer term?
      Obligatory reminder that one of the major factors leading to the video game crash of 1983 was a bunch of shovelware releases that initially sold like hotcakes until consumers caught up.

    • @StrazdasLT
      @StrazdasLT 12 дней назад

      What you want stopped existing in 2015.

  • @Kamijin01
    @Kamijin01 15 дней назад +7

    My question is: will this be some kind of gateway to poor game optimization? Seems like a lot of shortcuts that may affect the base game 🤷‍♂️

  • @niluss6
    @niluss6 15 дней назад +2

    I really don't understand how DLSS is still marketed as such a big thing. TVs did frame interpolation years before DLSS, and DLSS still has the same artifacting and latency problems.

  • @akyo19
    @akyo19 15 дней назад +43

    In future years we won't even need a GPU... we will pay for a "cloud" service that allows our GPU-less PC to access AI and get 400 fps... I don't like it x)

    • @markbrettnell3503
      @markbrettnell3503 15 дней назад +6

      Agreed. Look at this online only gaming bs we have already.

    • @johnc8327
      @johnc8327 15 дней назад +7

      Where you been homie? They have that now with Xbox cloud and GeForce now.

    • @akyo19
      @akyo19 15 дней назад +2

      @@johnc8327 yeah, it's similar, but I'm actually talking about paying for a non-existent GPU.. With cloud gaming you still need the console, right? (I really just care about PC, never looked into it tbh)

    • @NihongoWakannai
      @NihongoWakannai 15 дней назад +7

      Because more input lag is what everyone wants

    • @Grysham
      @Grysham 15 дней назад

      Nah, they have cloud streaming for PC games already. GeForce Now was actually pretty good: fast high-end graphics with no noticeable latency as long as I had a good internet connection. The problem was licensing. GeForce Now only lets you play games you own on Steam or Epic (or GOG), and only if the publishers lease the game on the platform. That means tons of games aren't on there.
      Was fun while I couldn't get a graphics card during Covid, but I prefer being able to play on my own PC.

  • @fredEVOIX
    @fredEVOIX 15 дней назад +15

    "you will own no real frames and be happy"
    The 3090 was around 2 GHz, the 4090 is around 3 GHz. I feared the 5090 would not see the same 70% jump in performance because they weren't going to repeat that clock increase, and sadly I was correct.

    • @TheRealLink
      @TheRealLink 15 дней назад +2

      Can't gauge performance from clock frequency alone these days.

  • @ggmgoodgamingmichael7706
    @ggmgoodgamingmichael7706 15 дней назад +5

    This is worse than the old triple-buffering input lag. It has to hold back real frames so it can interpolate fake frames in between.
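The buffering point above can be sketched with simple arithmetic (my own back-of-the-envelope model, not Nvidia's actual pipeline): to interpolate toward the next real frame, the presenter has to hold that frame back for at least one real-frame interval.

```python
# Toy model of interpolation delay: the newest real frame must be held back
# one full real-frame interval while generated frames are shown, so the
# minimum added latency equals the base frame time, regardless of how many
# fake frames are inserted.

def framegen_stats(base_fps: float, multiplier: int) -> tuple[float, float]:
    real_frame_ms = 1000.0 / base_fps    # time between real frames
    output_fps = base_fps * multiplier   # displayed frame rate
    added_latency_ms = real_frame_ms     # newest real frame is delayed this long
    return output_fps, added_latency_ms

print(framegen_stats(60, 2))   # 120 fps shown, ~16.7 ms extra delay
print(framegen_stats(30, 4))   # 120 fps shown, ~33.3 ms extra delay
```

Note the second case: the same 120 fps output hides twice the added delay, which is why frame gen from a low base feels so much worse.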

  • @Foxlum
    @Foxlum 14 дней назад +2

    Frame Warp is literally just the same thing Space Warp is in VR, but for flatscreen games; it wouldn't be that hard to apply it elsewhere. It's literally just motion reprojection via motion vectors, the same motion vectors these upscalers need to keep distortion minimal. Nvidia is really putting an Apple spin on this stuff.
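A toy version of that reprojection idea, reduced to a 1D scanline under my own simplifying assumptions (nothing like a production implementation): shift already-rendered pixels by the latest camera motion so the displayed image reflects newer input than the frame that was actually rendered.

```python
# Warp a rendered scanline by the latest camera shift. Edge gaps are filled
# by clamping to the nearest known pixel; real engines inpaint these holes.

def warp_scanline(pixels: list[int], camera_shift: int) -> list[int]:
    n = len(pixels)
    out = []
    for x in range(n):
        src = min(max(x + camera_shift, 0), n - 1)  # sample with clamping
        out.append(pixels[src])
    return out

print(warp_scanline([10, 20, 30, 40], 1))   # [20, 30, 40, 40]
print(warp_scanline([10, 20, 30, 40], -2))  # [10, 10, 10, 20]
```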

  • @egotist-ical
    @egotist-ical 15 дней назад +4

    Not excited about the frame gen, and at the 9070 XT vs. 5070 price point, I think the former might make up for DLSS Super Res. with its superior rasterization, plus the inherent quality difference between upscaled and native

  • @LHGaming25
    @LHGaming25 15 дней назад +23

    Frame generation tech has horrible latency at lower frame rates. It's great for increasing performance if the game can already hit a stable ~60 fps, taking it to say 120 or 180, but otherwise the input lag is terrible

    • @johnc8327
      @johnc8327 15 дней назад

      With DLSS super sampling, you would need a super weak GPU to not get 60fps. Even a 5060 will likely be able to DLSS balanced + frame gen to put out a better experience than a 3090 using no DLSS at all. Brute force rendering will be a worse experience unless you have a 5090.

    • @christophermullins7163
      @christophermullins7163 15 дней назад +1

      I was thinking.. if it went from like 120 to 135 or whatever, that means the base frame rate is 68 instead of the 60 fps of the old FG, which is a few ms better latency. 🤷 Better is better I guess.

    • @christophermullins7163
      @christophermullins7163 15 дней назад

      @@johnc8327 no offense, but everyone with a brain would know you'd get a better experience with a 3090 in any game at any settings. Not even talking about the VRAM amount of the 5060. Frame gen is useless for most people

    • @ArtorioVideojogos
      @ArtorioVideojogos 15 дней назад +1

      @@johnc8327 Frame time is still bad, even with reflex + boost

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 15 дней назад

      ​@@ArtorioVideojogos lol no it isn't. Latency is better and so is smoothness.
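The base-frame-rate arithmetic in this thread checks out roughly like this (my own illustrative numbers, assuming 2x frame gen in both cases):

```python
# If frame gen multiplies output fps, the base (real) frame rate is just
# output_fps / multiplier, and its frame time sets the input-latency floor.

def base_frame_time_ms(output_fps: float, multiplier: int) -> float:
    return 1000.0 / (output_fps / multiplier)

old = base_frame_time_ms(120, 2)  # 60 fps base -> ~16.7 ms per real frame
new = base_frame_time_ms(135, 2)  # 67.5 fps base -> ~14.8 ms per real frame
print(round(old - new, 2))        # ~1.85 ms lower base frame time
```

So "better is better" holds, but the improvement is a couple of milliseconds, not a halving of latency.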

  • @Foorythegreat
    @Foorythegreat 15 дней назад +5

    You really don't have a logical reason why the Reflex 2 technology comes to 50-series cards first? Really? It's money...

  • @ensuredchaos8098
    @ensuredchaos8098 10 дней назад +1

    The problem is that Nvidia themselves stated that you need a minimum of 60 FPS to not have latency, the 5090 was shown not even being able to crack 30 FPS in Cyberpunk without DLSS on. Not all games have DLSS which is worrying as well, will those games just run exactly the same as previous gen of GPUs (or worse)? Seems like a tough sell.

  • @leyonden9999
    @leyonden9999 15 дней назад +5

    "The more you buy... the more frames you save." That's right, new quote

  • @michaeloconnor9739
    @michaeloconnor9739 10 дней назад +3

    Frame generation is just expensive motion blur

  • @tech1m502
    @tech1m502 15 дней назад +7

    I completely agree, 1 of every 4 frames are real........ that's madness =(

    • @cloudycolacorp
      @cloudycolacorp 15 дней назад +5

      When you think about it, it's like half of that 1 frame that is real given upscaling is happening too

    • @tech1m502
      @tech1m502 15 дней назад

      @@cloudycolacorp Yeah you are right, even worse hahah

    • @user-HAYMONjustregyoutuber99
      @user-HAYMONjustregyoutuber99 15 дней назад

      @@tech1m502 And that HALF-real frame got upscaled textures and is being sharpened by AI

    • @owdoogames
      @owdoogames 15 дней назад +1

      If only 1 of every 4 frames are real, then Nvidia better accept only 1 in every 4 of the $$$ that I pay for their GPUs are real. Give me my $500 5090 now, Jensen.

    • @tech1m502
      @tech1m502 15 дней назад

      100%
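The "1 in 4" point compounds with upscaling. A quick sketch of the arithmetic, under my own assumptions (4x multi frame gen, and a performance-style upscale rendering half the resolution per axis):

```python
# Fraction of displayed pixels that were natively rendered: one in every
# `multiplier` frames is real, and each real frame renders only
# 1 / scale_per_axis**2 of the output pixels before upscaling.

def native_pixel_fraction(multiplier: int, scale_per_axis: float) -> float:
    rendered_frames = 1.0 / multiplier
    rendered_pixels = 1.0 / (scale_per_axis ** 2)
    return rendered_frames * rendered_pixels

print(native_pixel_fraction(4, 2.0))  # 0.0625 -> 1/16 of pixels rendered natively
```

By that rough count, only about one displayed pixel in sixteen comes straight from the renderer; the rest are inferred.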

  • @alexsnow5
    @alexsnow5 10 дней назад +2

    Nvidia took the 4070, mixed it with Lossless Scaling, put a 5070 label on it, made that "new tech" exclusive... and called it a day.

  • @whatif8741
    @whatif8741 15 дней назад +9

    AI is going to add to latency. Next gen we will probably see something to tackle it

    • @liberteus
      @liberteus 15 дней назад +9

      The AI will generate frames and play the game. You'll just watch.

    • @fedrecedriz6868
      @fedrecedriz6868 15 дней назад

      Nvidia Reflex 2 says hello

    • @jokered1133
      @jokered1133 15 дней назад

      Yeah Nvidia Reflex 3 will go to the future and grab the future frames for you and put them on screen so you can have -15ms latency, players on RTX6090 have an advantage on older cards and you will rank up in the future using AI TOPS

  • @HairryTurttleneck
    @HairryTurttleneck 15 дней назад +13

    literally nothing is real anymore... anything, anywhere

    • @StrazdasLT
      @StrazdasLT 12 дней назад

      It never was. It's all just triangles masquerading.

  • @jeanjean0015
    @jeanjean0015 15 дней назад +23

    Fake FPS must be paid for with fake money. 😀

    • @forasago
      @forasago 14 дней назад

      This NFT is worth 2000 dollars, trust me, nvidia.

    • @iftifn
      @iftifn 13 дней назад

      @@forasago And it's generated with AI! AI AI AI AI
      This comment should be upvoted much more!
      We should pay fake money for the fake world!

    • @pooshpoosh9232
      @pooshpoosh9232 9 дней назад

      Money itself is already fake, as the government can create as much as it wants
      Don't own money; find a way to own assets. Otherwise, in the age of money printing, you'll become ever poorer

  • @KarelGita
    @KarelGita 15 дней назад +12

    The cheapest 4070 Ti Super is over $1000 in Canada. The 5070 is listed at about $800

    • @patsk8872
      @patsk8872 15 дней назад +6

      4070 Ti Super has a lot more raw, actual rendering performance. But if you're fine with AI generated frames then...

    • @supertony02
      @supertony02 15 дней назад

      @@patsk8872 most people would still take the 5070

    • @zeenxdownz
      @zeenxdownz 15 дней назад

      ​@@patsk8872 No?

    • @Tech-is1xy
      @Tech-is1xy 15 дней назад +1

      @@patsk8872 Where are you getting this from? The 5070 seems like it will be very close in performance to a 4070 Ti Super. Maybe slightly weaker. Not "a lot more", as you're stating.

    • @azraelfaust487
      @azraelfaust487 14 дней назад

      @@patsk8872 have you tested the 5070 yourself ? what games did you play bro enlighten us

  • @mibbio2148
    @mibbio2148 15 дней назад +5

    15:50 or option 3: it's just in that preview version for comparison and will be removed before the game/update releases.

  • @khirasier
    @khirasier 15 дней назад +13

    Lossless Scaling has had 4x frame gen for a while already.

    • @francescoatzori7417
      @francescoatzori7417 15 дней назад

      Xd

    • @Velocifero
      @Velocifero 15 дней назад +8

      With horrendous artifacting... Makes it completely unusable.

    • @LordVader1887
      @LordVader1887 15 дней назад +2

      @@Velocifero and you think DLSS 4 won't?

    • @dante19890
      @dante19890 15 дней назад +2

      it's on the same level as a 10-year-old TV's motion interpolation. pretty much unusable

    • @Tech-is1xy
      @Tech-is1xy 15 дней назад +11

      @@LordVader1887 You want us to believe a solo developer has made a frame generation algorithm as robust as a multi-billion-dollar company's?

  • @Superdazzu2
    @Superdazzu2 14 дней назад +1

    The DLSS super resolution and frame gen improvements coming on the 30th of January seem incredible; can't wait to try them (improvements to the DLSS upscaling model, plus VRAM, latency, and performance improvements for DLSS FG)

  • @MrAnimescrazy
    @MrAnimescrazy 15 дней назад +6

    I can't wait for the raster benchmarks of the 5090 vs the 4090

  • @Porcuman
    @Porcuman 14 дней назад +8

    "Impossible without AI": that's when you know you've become a slave to technology

  • @Skeleleman
    @Skeleleman 15 дней назад +6

    Oh well thanks for making Reflex 2 available on previous series.
    Good to know that I can finally use advertised frame gen after 3 years.

  • @thatzaliasguy
    @thatzaliasguy 15 дней назад +2

    Remember the days back when raw rasterized performance was king and 60 fps meant only 16 ms of latency... This new timeline f'ing sux.

  • @AlinaTaylor-p4g
    @AlinaTaylor-p4g 15 дней назад +5

    Love your videos. Thank you for the content

  • @povertyiscreated2265
    @povertyiscreated2265 15 дней назад +10

    Future games will be entirely AI. They're using AI compute as a benchmark. Software is more important than hardware.

  • @Beztebyo
    @Beztebyo 13 дней назад +4

    2:07 they literally just gave her the Bold Glamour TikTok filter😭💀

  • @AttilaV.F.
    @AttilaV.F. 13 дней назад

    Dude your videos got so much better, congrats man, big step up!

  • @maximumjesus
    @maximumjesus 15 дней назад +6

    are we gonna see video game characters with 3 rows of teeth and 8 fingers on each hand?

  • @tokertalk9648
    @tokertalk9648 14 дней назад +8

    With that tacky leather jacket, the CEO must have a robot fk doll programmed with the latest AI tech. Im sure the extra frames in the newer GPUs help him get off easier but it sure as hell makes gaming worse!