DLSS 3 Kinda Sucks

  • Published: 2 Feb 2023
  • Twitter: / swallowfire
    Message me anywhere if you have any suggestions or comments, I'd love to hear from you!

Comments • 120

  • @videosallnight
    @videosallnight 9 months ago +10

    Everybody knows that DLSS stands for Dirty Little Secrets Scam !!!!

    • @soulbreaker1467
      @soulbreaker1467 2 months ago

      Their fanboys don't, and they defend it. I've seen people saying the 4060 beats the RX 7900 XTX lol

    • @InsidePlay2023
      @InsidePlay2023 19 days ago

      @@soulbreaker1467 Yes, the 4060 beats the 7900 XTX in terms of power consumption, features, and future-proofing because it has DLSS and FG, so you make no sense here, Mr "lol"

  • @mand_oh
    @mand_oh 1 year ago +48

    I agree. I really think that 1. DLSS 3.0 should’ve been called DLFG (Deep Learning Frame Generation), because calling it DLSS is kinda misleading, and 2. we should really be considering asynchronous reprojection over techniques like frame generation: with the former you don’t need to fake any frames, but the gameplay still manages to be way smoother (see the sketch after this thread). It’s kind of baffling to me that nothing outside of VR has adopted it yet, because it’s basically free responsiveness.

    • @Swallowfire
      @Swallowfire  1 year ago +5

      Yes, I notice reprojection in VR quite often and it's waaaaaay less intrusive and ugly than the horrible stutter I feel when frame generation is on.

    • @darkengine5931
      @darkengine5931 1 year ago +1

      @@Swallowfire There's a huge conceptual problem with VR and RTX as well. Most of NVIDIA's state-of-the-art path tracing demos rely on heavy use of biasing and denoising hacks to run at real-time or pseudo real-time framerates. They've just barely gotten it to where you don't notice too many artifacts from all this heavy filtering if both eyes are seeing the same image. But with stereoscopic vision, you notice it so much! I'm a CG dev and got so excited about techniques like ReSTIR and OptiX until I tested them in a VR context. It completely falls apart in VR, and if we can't use those techniques, we're still miles away from true real-time path tracing.
      I haven't tested DLSS in VR, but I imagine it's similar, because all these techniques rely on spatiotemporal filtering (trying to fill in missing information by looking at nearby pixels within a frame, or at data from previous frames across time). These techniques all start to show their artifacts 10x more with stereoscopic vision.

    • @Mikelica69
      @Mikelica69 1 year ago

      🤓☝️

    • @mhnoni
      @mhnoni 5 months ago

      Indeed, I just found out about this. I thought frame generation was based on the old frame, not making up a new one. Why did they do this? Makes no sense to me.
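
A minimal sketch of the asynchronous-reprojection idea mand_oh describes above (the trick VR runtimes use): when a fresh frame isn't ready, re-warp the last rendered frame with the latest camera rotation so input still feels instant, instead of inventing an in-between frame. The function name and the small-angle shift model are illustrative assumptions, not any runtime's actual API:

```python
# Toy rotation-only reprojection: approximate a small yaw turn as a
# horizontal image shift. Real reprojection warps with the full rotation
# matrix (and ideally per-pixel depth); this keeps only the intuition.
import numpy as np

def reproject(last_frame: np.ndarray, yaw_delta_rad: float,
              fov_x_rad: float) -> np.ndarray:
    h, w = last_frame.shape[:2]
    # For small angles, a yaw of d radians moves the image by
    # roughly d / fov_x * width pixels.
    shift = int(round(yaw_delta_rad / fov_x_rad * w))
    out = np.zeros_like(last_frame)
    if shift >= 0:
        out[:, shift:] = last_frame[:, :w - shift]  # revealed edge stays black
    else:
        out[:, :w + shift] = last_frame[:, -shift:]
    return out

# The renderer is late, but the player turned 2 degrees since the last
# frame, so present a re-warped frame instead of a stale one.
frame = np.random.rand(720, 1280, 3)
warped = reproject(frame, np.radians(2.0), fov_x_rad=np.radians(90.0))
```

The revealed black edge is why real implementations render slightly oversized frames or fill the gap; the key property is that the warp uses the newest input, which is where the "free responsiveness" comes from.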

  • @MocchiDraws
    @MocchiDraws 7 months ago +4

    We've gone from frame tearing to reality tearing

  • @aaronreed8402
    @aaronreed8402 1 year ago +3

    I'm in the market for a new build, and I appreciate the information that you presented here. Thanks for backing up your argument with evidence too!

  • @ChrisPFuchs
    @ChrisPFuchs 1 year ago +4

    It's been working really well for me in Cyberpunk. Swapping the .dll out for 1.0.7 already shows pretty big progress. The biggest issues I have with it are artifacts in the menus and some UI artifacts. In game I actually prefer its image quality to DLSS 2's.

    • @Swallowfire
      @Swallowfire  1 year ago +1

      The Cyberpunk update wasn't out while I was making the video. The continuous first person view really helps prevent the issues that annoy me. I'm sure I wouldn't mind it in God of War either with a single continuous shot and no cuts.

    • @hypetelevision
      @hypetelevision 11 months ago

      what card are you using?

    • @ChrisPFuchs
      @ChrisPFuchs 11 months ago

      @@hypetelevision I have a 4090. I think I agree with the above comment: the artifacts in first person shooters are minimized with DLSS 3, but then input latency becomes more noticeable. I used DLSS 3 in Hogwarts Legacy, and the character artifacting right in front of the screen was super annoying. The technology's pretty cool though, and I still find I'll use it in certain games.

  • @watchyoursixmate
    @watchyoursixmate 11 months ago +1

    Quick question: does anybody get artifacts on the Microsoft Flight Simulator loading screen, just before you load into the world with your plane, with DLSS 3 enabled? I get them only when DLSS 3 is on; they don't happen when it's turned off. Just wondering if anybody else has this issue with DLSS 3?

  • @makingtechfriendlyindia
    @makingtechfriendlyindia 9 months ago

    Is it causing motion sickness?

  • @danielgomez7236
    @danielgomez7236 11 months ago +6

    See, DLSS 3 frame generation will get better over time, and it needs to be tested with each game; all AI needs some work, testing, and fine-tuning.
    It may not look good at first in fast-moving scenes, but it will get better and be very important in the future

    • @prozac1127
      @prozac1127 8 months ago +1

      Even with DLSS super res, I run native and get an upscaled image. If I run low res, then there are artifacts and prediction errors

  • @BelfastBiker
    @BelfastBiker 1 year ago +9

    honestly, it looks like fantastic technology.

  • @TheIndulgers
    @TheIndulgers 1 year ago +3

    Right for sure. Tons of artifacts when jumping between the logs.

    • @Munenushi
      @Munenushi 10 months ago

      this ^ I don't see how some people can't just see it.... the ghosting, blurring, etc.
      maybe it has something to do with their monitors, the response time, or something like that - if a monitor has a really slow response time like 8ms or something, they'll see 'ghosting' anyway, and totally won't notice the 'ghosting' artifacts in the game lol

  • @BeckOfficial
    @BeckOfficial 9 months ago +1

    Frame interpolation has always been a nightmare at high speeds, where objects move a lot from frame to frame. It has been ever since the mid-'00s with the Twixtor plugin for After Effects, which could slow down your footage to simulate super slow motion. If an object moved too much from frame to frame, it would create a lot of artifacts.
    Especially when you introduce motion blur and high speeds, or very complex objects like fluids (waves crashing), it becomes nearly impossible to make it look good without any artifacts unless you already have a high frame rate. The lower the frame rate, the more artifacts, because objects move further from frame to frame. I would say you need at least 100 fps before turning frame gen on would even start to make sense. And by that point it almost doesn't matter anymore.
    I highly doubt that this feature will ever look as good as the native frame rate at sub-80fps. At least not with traditional interpolation. Maybe if you interpolated the raw data before rendering, instead of interpolating the finished rendered images, it could work without artifacts. But that would probably tank performance instead of boosting it.
    I hope that I am wrong though, and that we see some insane AI magic in the future.
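
A tiny numerical illustration of the point above, that interpolation artifacts scale with per-frame motion. Where motion estimation fails, an interpolator degenerates to blending the two frames, and the half-intensity "ghost" region grows with how far the object moved. The 1-D frames here are a made-up toy, not any interpolator's real output:

```python
# Two 1-D "frames" containing an 8-pixel object; blend them 50/50 (what
# you effectively get where the flow estimate breaks) and count the
# half-intensity ghost pixels. Small motion -> thin ghost edges; large
# motion -> a full double image.
import numpy as np

W = 64
def frame_with_object(x0: int) -> np.ndarray:
    f = np.zeros(W)
    f[x0:x0 + 8] = 1.0
    return f

for motion in (2, 16):                 # pixels moved between frames
    a = frame_with_object(10)
    b = frame_with_object(10 + motion)
    blend = 0.5 * (a + b)              # failed-flow fallback: ghosting
    ghost_px = np.count_nonzero(np.isclose(blend, 0.5))
    print(f"motion={motion:2d}px -> {ghost_px}px of half-intensity ghost")
# motion= 2px -> 4px of half-intensity ghost
# motion=16px -> 16px of half-intensity ghost
```

With a correct flow field the interpolator would instead draw the object at the true midpoint with no ghost at all, which is why low frame rates (big per-frame motion, harder flow estimation) hurt twice.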

  • @ElluSoler
    @ElluSoler 1 year ago +2

    I noticed some random black points (artifacts) in the right-hand Witcher 3 video too

  • @DeepakJindal07
    @DeepakJindal07 1 year ago +4

    To be honest, these ghosting artifacts are much better than intermittent stutters and delays; anyone who plays games like Battlefield or COD would not mind them. They want flawless gaming. So both DLSS and FSR are fine by me.

    • @Swallowfire
      @Swallowfire  1 year ago +2

      DLSS frame generation looks like stutters to me. DLSS and FSR upscaling don't cause that. Leave them on but frame generation off.

  • @elusivemindsstudios
    @elusivemindsstudios 1 year ago +1

    My dude what was that first game you were playing?

    • @Swallowfire
      @Swallowfire  1 year ago

      The side scroller? That's a game called Fist Forged in Shadow Torch. Metroidvania with a rabbit and a big metal fist. It's great.

  • @jeffreyhines8794
    @jeffreyhines8794 1 year ago +8

    People always look for something to complain about. I love it so far. Also, I'm not playing my games at super slowed-down speeds just so I can catch artifacts or issues. If you are constantly looking for errors, you will find them in everything; instead of taking the time to enjoy it, people use the time to find errors

    • @Swallowfire
      @Swallowfire  1 year ago +2

      Did you notice it in the Spider-Man cutscene I showed?

    • @JABelms
      @JABelms 1 year ago +4

      I've played at 4K since 2014. Trust me, once you've seen and know these issues, you will see them. It's just an acquired skill

    • @michaking3734
      @michaking3734 11 months ago

      amen!

  • @Munenushi
    @Munenushi 10 months ago +1

    0:23 if I may, the "magic" you speak of is just a form of FSR actually (lol) - it is rendering the game at a lower resolution, then stretching it up ("upscaling") and then 'enhancing' it with sharpening, contrast adjustments, etc. etc. - you end up with a better framerate but slightly less fidelity
    (interestingly, it is similar to the "T-Buffer" gimmick from 3dfx, which NVIDIA later acquired; it was a 'blending of frames' to try to alleviate aliasing at the time)
    I don't see how some people can't just see it.... the ghosting, blurring, etc..
    maybe it has something to do with their monitors, the response time, or something like that - if a monitor has a really slow response time like 10ms or something, they'll see 'ghosting' all the time anyway, and totally won't notice the 'ghosting' artifacts in the game when dlss3 is on lol
    going AMD for now, yes... great vid examples
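
A toy model of the render-low/upscale/sharpen pipeline described above: a plain bilinear upscale plus an unsharp mask. This is not AMD's actual EASU/RCAS math (those are edge-adaptive kernels); it only shows the structure of the trick, i.e. that the fidelity loss happens in the low-resolution render and sharpening can only boost contrast, not recover detail:

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int) -> np.ndarray:
    # Sample the low-res image at fractional coordinates of the big grid.
    h, w = img.shape
    ys = (np.arange(h * scale) + 0.5) / scale - 0.5
    xs = (np.arange(w * scale) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    fy = (ys - y0)[:, None]
    fx = (xs - x0)[None, :]
    tl, tr = img[y0][:, x0], img[y0][:, x0 + 1]
    bl, br = img[y0 + 1][:, x0], img[y0 + 1][:, x0 + 1]
    return tl*(1-fy)*(1-fx) + tr*(1-fy)*fx + bl*fy*(1-fx) + br*fy*fx

def unsharp(img: np.ndarray, amount: float = 0.6) -> np.ndarray:
    # "Enhancing": add back the difference from a local blur.
    blur = img.copy()
    blur[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                        img[1:-1, :-2] + img[1:-1, 2:] +
                        img[1:-1, 1:-1]) / 5.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

low = np.random.rand(360, 640)                 # stand-in for a low-res render
fake_720p = unsharp(bilinear_upscale(low, 2))  # "upscaled + enhanced"
```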

  • @drumyogi9281
    @drumyogi9281 1 year ago +2

    Thank you for informing me that you can't use vsync with DLSS. That is very important information. I do have one question, though. Can you cap your FPS still?

    • @Swallowfire
      @Swallowfire  1 year ago +1

      It depends on the game. I was able to cap it in Witcher, but not Spider-Man.

    • @drumyogi9281
      @drumyogi9281 1 year ago +1

      @@Swallowfire doesn't that just cause screen tearing?

    • @Swallowfire
      @Swallowfire  1 year ago

      In Spider-Man and Need for Speed, it does. Even on a freesync display. Very ugly and annoying.

  • @03chrisv
    @03chrisv 11 months ago +2

    It's getting better. Frame generation is a first-gen technology, so expect it to get better and better over time. Currently it's at a state that is very usable in a lot of games.

  • @Conner06
    @Conner06 1 year ago +3

    maybe if devs even bothered to optimize their games we wouldn't have to use these horrible technologies like FSR and DLSS. Way too reliant on them.

    • @Swallowfire
      @Swallowfire  1 year ago +1

      Ya. If every game ran like Doom Eternal we wouldn't be in this mess.

  • @vidfreak56
    @vidfreak56 8 months ago +1

    I got it right. It was so obvious FG was the one on the right. I could literally feel the FG compared to native.

  • @TheL1arL1ar
    @TheL1arL1ar 1 year ago +2

    I thought I was watching digital foundry for a second…. lol right on brother…

  • @GotAdam1
    @GotAdam1 1 year ago +4

    I am really impressed with FG on my 4090 FE, and DLSS is great. Sure, it can introduce artifacts in certain situations, but for new tech it's impressive. As far as pricing, yeah, it's getting ridiculous, but AMD is actually worse value at the minute, as it doesn't have the performance or the features, and the coolers are garbage in comparison

    • @Swallowfire
      @Swallowfire  1 year ago

      Oh really? I didn't know the coolers on the FE models weren't up to scratch.

    • @imnotusingmyrealname4566
      @imnotusingmyrealname4566 10 months ago +1

      @@Swallowfire AMD reference cards sell in low numbers. They are perfectly fine, just not the quietest. Way better than anything previously.

    • @imnotusingmyrealname4566
      @imnotusingmyrealname4566 10 months ago +2

      You might have enough VRAM spending an insane 1600 dollars on a graphics card, but the 4060, 4060 Ti, 4070, and 4070 Ti already don't have enough VRAM, and frame gen requires a lot of VRAM. NVIDIA cards age like milk. Also no, AMD has equal or better performance with substantially more VRAM (except the horrid 7600, and except at the very highest tier, where their RDNA 3 architecture didn't meet performance targets). The only difference is tech features. Prices will only ever go up. Have fun spending 2500 dollars on a 5090, but hey, it will likely have 32GB VRAM with a 512-bit bus, so it must be worth it.

  • @susiolu
    @susiolu 6 months ago

    For laptop gamers like me it's even worse: our 4070 mobile has only 8 gigs of memory and almost 20-30% less performance.

  • @lorenztor1990
    @lorenztor1990 1 year ago +1

    lol, still rocking a 1080.

    • @Swallowfire
      @Swallowfire  1 year ago

      Love the 1080. Had a founders edition in my PC for like 4 years.

  • @trickygaming3211
    @trickygaming3211 1 month ago +1

    this doesn't apply to every game though

  • @DanoeRenouva
    @DanoeRenouva 11 months ago +1

    Maybe DLSS needs its own processor chip on the graphics card.

  • @jakejoyride
    @jakejoyride 1 year ago

    Beautiful video

  • @DrunkedMunk
    @DrunkedMunk 10 months ago +1

    I would buy three Steam Decks if I could link them all up and have one machine with three times the performance of a single Steam Deck. SLI, anyone?

    • @Swallowfire
      @Swallowfire  10 months ago

      Hahaha and it would still be cheaper than a 4090

  • @user-te4zh2dz9r
    @user-te4zh2dz9r 6 months ago

    2:35 Hello! I am not an English speaker and I am very interested to know: you say the words "far" and "simulator" in the British manner (British pronunciation), although the country on your channel is the USA. Why is this so?

  • @Denis-wk7km1
    @Denis-wk7km1 6 months ago

    The blogger doesn't understand that these artifacts can be fixed and frame generation can be improved. Ten months later, in December 2023, I can't see any problems in many DLSS 3 FG games

  • @aperson8693
    @aperson8693 10 months ago +1

    In games such as Hogwarts Legacy, where the optimization is shit, FG is actually extremely useful in straight-up doubling my fps

    • @KainniaK
      @KainniaK 10 months ago +4

      The problem with this is that some game companies are gonna be like: hey, let's make more money by not paying our devs to optimize our game; we'll just slap some DLSS on it to get a higher frame rate instead.

  • @jonathan130
    @jonathan130 1 year ago +9

    ok, but at least they bring something new; it's not perfect, but it's a big leap.

    • @greg8909
      @greg8909 1 year ago +3

      Creating frames that the game engine doesn't produce makes the input lag even bigger than with DLSS 2.0, and even more so on a lower-end CPU.
      It also lowers the visual fidelity of the game.
      But yeah, you have more FPS appearing on the screen...
      Most people won't notice the difference, so personally I prefer less input lag and more visual fidelity over more FPS.
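
Rough numbers behind the latency point above, assuming (as with DLSS 3's interpolation) that frame generation has to hold back one real frame so it can insert an image between the previous and current ones. The one-frame-of-buffering model is a simplification; real pipelines add Reflex and compositing offsets that vary per game:

```python
# Back-of-the-envelope: doubling displayed FPS by interpolation costs
# roughly one *native* frame time of extra input latency, because the
# newest rendered frame is withheld while the in-between frame shows.
def fg_latency_penalty_ms(native_fps: float) -> float:
    return 1000.0 / native_fps   # ~= one buffered native frame

for fps in (30, 60, 120):
    print(f"{fps:3d} fps native -> ~{2 * fps} fps shown, "
          f"+{fg_latency_penalty_ms(fps):.1f} ms input lag")
# 30 fps native -> ~60 fps shown, +33.3 ms input lag
# 60 fps native -> ~120 fps shown, +16.7 ms input lag
# 120 fps native -> ~240 fps shown, +8.3 ms input lag
```

This is why the penalty is worst exactly where frame generation is most tempting: the lower the native frame rate, the bigger the added lag.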

  • @maddogfargo3153
    @maddogfargo3153 9 months ago +1

    DLSS looks bad, especially when you watch the videos slowed down to 25%. This CLEARLY shows you 2 important things:
    1) DLSS still has tons of blurriness and temporal smearing / ghosting! You can even notice it at normal playback speed, but slowing it down makes it even more obvious.
    2) Frame Generation 'augments' your frames by adding GARBAGE frames in between QUALITY frames for 'fake' smoothness. Essentially, it just adds NOISE at a fast pace.
    So much for the groupthink on 'superior image quality'. DLSS + frame gen looks like blurry noise. But hey, people gotta cope with wasting all that $$$ on nVidia hype somehow. 🤣

  • @druout1944
    @druout1944 9 months ago

    DLSS is like that fable "The Emperor's New Clothes": it's pretending to be something it isn't. It cannot look as good as native by its very implementation; it is not rendering at full native resolution. They can say "super sampling" all they want, but it is actually rendering with less fidelity than native, hence the blurry awfulness in games.

  • @valentinvas6454
    @valentinvas6454 10 months ago +2

    I'm kinda late, but thank you for making this video. I find it pretty weird when people (and Nvidia) try to argue that DLSS 3 makes weaker RTX 4000 cards a better choice than more powerful Radeon or RTX 3000 cards because of frame generation... As if this thing worked perfectly. I think many people don't understand that the extra DLSS 3 frames are not even generated by the game engine, so you can't do anything during these generated frames. That's why the game doesn't feel more responsive even though you see more frames.

    • @Nerathim
      @Nerathim 8 months ago

      Frame gen works extremely well in some games, Alan Wake 2 being a good example. Same for Cyberpunk 2077.
      As long as your game isn't extremely fast paced, if you hit 45fps without FG, FG will feel good.

  • @BelfastBiker
    @BelfastBiker 1 year ago +2

    If we can only see it in FS when you slow things down drastically, then the issue doesn't exist.

    • @Swallowfire
      @Swallowfire  1 year ago

      I perceive it as a stutter, like shader compilation. It's impossible to show that in a 60fps video from a 100+fps source file, and even if I could, I don't know how many other people would feel it if they're not the one pressing the button, so I used individual corrupt frames to demonstrate what's actually happening that makes me think the game is stuttering.

    • @BelfastBiker
      @BelfastBiker 1 year ago

      @@Swallowfire ok cheers

    • @Munenushi
      @Munenushi 10 months ago

      you can actually see it in game, in his example video here. it's like old-school 'ghosting' issues.... some perceive it as stuttering, some as blurring, etc. it is totally noticeable while playing

  • @KainniaK
    @KainniaK 10 months ago

    This is all good and well, but the only other company that sells GPUs good enough to play the latest games on is AMD, and their solutions are even worse. In fact, they are so bad that in order to possibly gain some market share, they have been paying game companies not to support DLSS.

    • @Swallowfire
      @Swallowfire  10 months ago

      I agree. The whole thing with Starfield and FSR is super shady. FSR itself is fine. I did a direct comparison a while ago and DLSS always took the performance crown, but FSR and XeSS looked better in some very specific circumstances.

  • @geeknproud321
    @geeknproud321 1 year ago +1

    If you haven't bought a card yet, basically any flavor of used 3080 is what you want. It does all the important tricks, and raytracing is perfectly doable at 1440p max settings. Avoid mined cards by buying an LHR card. The 4080 is a great card, being much more efficient than the 3080 for what it gives you, but that pricing...

  • @desiraqble
    @desiraqble 1 year ago +3

    quit complaining, you have a 4080 dude 😂 you get high frame rates anyways lol
    thanks for the video anyways!

    • @Swallowfire
      @Swallowfire  1 year ago

      HAHA I love the card! I just don't love Nvidia's shady marketing.

    • @desiraqble
      @desiraqble 1 year ago +1

      @@Swallowfire agreed, i’m still stuck on a 1650 super so i like to try to keep up

    • @Swallowfire
      @Swallowfire  1 year ago

      Hey that's valid. My media PC had a 1650 until January this year. Very solid card considering what they cost before the GPU shortage haha

    • @desiraqble
      @desiraqble 1 year ago

      @@Swallowfire that's when I built my system, should have waited

  • @user-gh8dj4sm7m
    @user-gh8dj4sm7m 10 months ago +1

    oh my god, it really "sucks" because when I'm playing a game I slow the F game down to 18% speed and see a shadow, holyyyyy, experience ruined .. lmao. DLSS 3 is the best thing ever. I have an RTX 4060 Ti that gives the same performance as a 3090 Ti with DLSS 3, not exaggerating at all; go check it out yourself. The difference is so minimal that it's negligible, and there's no need to say it sucks. Pretty sure DLSS 4 or something will provide better AI; it's in the early stages and still looks amazing.
    I remember having a GT 1030 and shitty graphics; compared to that, now when I play games I don't look at slowed-down footage to point out how horrible it is, LOL!

  • @Komentujebomoge32
    @Komentujebomoge32 1 year ago +3

    Maybe Intel's graphics cards? I do not like AMD. I know, nobody cares

    • @Swallowfire
      @Swallowfire  1 year ago +1

      I was thinking about buying an Arc now that the price has dropped.

  • @peko8091
    @peko8091 10 months ago +1

    not kinda sucks
    it sucks

  • @hoshikuzuvenus
    @hoshikuzuvenus 8 months ago

    underrated video

    • @Swallowfire
      @Swallowfire  8 months ago

      YouTube doesn't show it, but this is my most disliked video ever: 19% dislikes lmao
      Lots of salty 40-series owners.

    • @hoshikuzuvenus
      @hoshikuzuvenus 8 months ago +2

      @@Swallowfire Some people can't handle the truth. Also, this only being available on the newest high-end GPUs enables game developers to poorly optimize their games.
      Keep up the good work man. Subscribed!

  • @darkengine5931
    @darkengine5931 1 year ago +3

    As a dev, I really dislike it conceptually, even though a lot of users seem to have taken a liking to it. What I worry about, regardless of how much people like it, is that this tech will very likely slow down progress in the real-time computer graphics industry toward actually being able to rasterize (or even raytrace in the future) at full resolution. It alleviates pressure on us developers, as well as hardware manufacturers, to *actually* build software and hardware fast enough to rasterize or raytrace 1SPP (or higher with true AA) at real-time framerates on high-resolution displays.
    It also defeats the functional purpose of playing at a higher screen resolution. It might not entirely defeat the aesthetic purpose if people don't mind the artifacts from spatiotemporal sampling, but say someone wants to play at a higher resolution because they want to spot enemies a kilometer away to snipe. DLSS would actually not help here at all and generally makes things worse, since a raytracer rendering at 1/4th to 1/9th screen resolution is typically unable to raytrace distant objects that become smaller than a pixel in size: at 1RPP there aren't enough rays being cast, so the rays will mostly miss those distant objects (see the toy Monte Carlo after this thread). A rasterizer has a similar issue, because with fewer pixels most of those distant triangles will want to rasterize to the same pixel and won't be drawn, or will be overwritten by the depth test. Super sampling can't insert things the rasterizer or raytracer never rendered in the first place.
    So if someone has the option between playing a game with DLSS at 2160p vs. playing without it at 1280p at roughly the same FPS, they will actually see more information clearly at 1280p: objects at a distance, fine textural details, strands of hair, things instantly appearing in-frustum, since again the rasterizer/raytracer has more pixels to work with. So depending on our tastes, maybe 2160p with DLSS looks better to some people than 1280p without, but 1280p without still offers much more visual information. If the main point of increased screen resolution is not just to make things look pretty but to see more data, then DLSS doesn't help at all and will likely slow progress in this area. If it's just to make things look pretty at the cost of visual information, then it may or may not succeed depending on the person's idea of what pretty means.
    As for what looks pretty, personally I prefer fewer visual artifacts at the cost of resolution. I realize not everyone is the same, but I seem to be like you. I come from a visual arts background (VA/CompSci dual major) and I might be a bit visually obsessed. There are lots of visual things most people don't notice that I tend to notice, so I generally prefer DLSS off and just dial down the screen resolution to get the frame rates.

    • @Swallowfire
      @Swallowfire  1 year ago +1

      I totally see your point; other comments are also saying it's absurd that PC players are content using upscalers like a console. I wouldn't go that hard. My main reason for using DLSS is that the anti-aliasing it provides on Quality mode (66.6%) is often better than the AA built into the game. Forza Horizon 5's AA is very ugly and DLSS cleans it up nicely. Then again, I've always had top-of-the-line hardware. If you're running a 2060 then DLSS might be absolutely necessary to get 60fps.

    • @darkengine5931
      @darkengine5931 1 year ago +1

      ​@@Swallowfire Makes sense! Are you comparing DLSS to TAA or MSAA or both if I may ask? MSAA should hypothetically yield the best quality (or at least the closest to ground truth), since it's actually rendering at the sub-pixel level and akin to rendering at 4+ times the frame buffer resolution, but that's very expensive. TAA actually seems worse than FXAA IMO since the temporal sampling across time can lead to bigger artifacts smeared across frames.
      DLSS at least seems superior to me for antialiasing to TAA. I'm really not a big fan of trying to sample across frames or across pixels and look for data to reuse that matches since it biases the results, but at least DLSS does it in a really smart way to try to (not always succeeding) reduce the artifacts as much as possible. I'm hoping in the not-too-distant future that we can get rid of spatial and temporal samplers and just take multiple samples per pixel as with the case of MSAA.

    • @Swallowfire
      @Swallowfire  1 year ago +1

      Usually I'm comparing DLSS to TAA. Unfortunately a lot of games force TAA on when an upscaler is off; Hogwarts and Forza both do this. MSAA is definitely best but, like you said, crazy on the GPU.
      DLSS is the best but it does break down. Play Hi-Fi Rush if you want to see DLSS absolutely fall to bits lol

    • @TotalXPvideos
      @TotalXPvideos 11 months ago +1

      That's exactly what I fear as well: games that can't be played without DLSS 3 even on 70-80 models. I tested out Remnant 2 yesterday, and on an RTX 3070 with settings on high at 1440p it wouldn't hit 60 fps if I turned DLSS 2 off, and that game is NOT some next-gen graphics beast; I don't think it even has ray tracing and such.
      Now imagine other games like that in the future: the game runs at like 40 fps without DLSS 3, and to get it to 70 you need DLSS 3, but since it has fewer and fewer frames to work with, the whole image will just be terrible.

    • @03chrisv
      @03chrisv 11 months ago +1

      Doesn't DLSS 4K quality mode render at around 1800p internally, and balanced mode at around 1440p? Meaning it still has more pixel information to work with than native 1280p. I think you were referring to DLSS 4K performance mode, which is around 1080p internal resolution. I think in many cases even DLSS 4K performance mode still looks a bit better than native 1440p, let alone 1280p
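
A quick Monte Carlo for the ray-budget argument darkengine5931 makes above: with roughly one ray per internal pixel, an object's chance of being hit in a given frame is about its coverage measured in internal pixels, so shrinking the internal render to 1/4 of the output pixels cuts the hit rate of sub-pixel targets by about 4x and they flicker frame to frame. The coverage numbers are made up for illustration, not any real renderer's figures:

```python
# Hit rate of a tiny on-screen object under a 1-ray-per-pixel budget.
import random

def hit_rate(coverage_native_px: float, res_scale: float,
             trials: int = 200_000) -> float:
    cov = coverage_native_px * res_scale       # coverage in internal pixels
    return sum(random.random() < min(cov, 1.0)
               for _ in range(trials)) / trials

native = hit_rate(0.5, 1.0)    # distant target covering half a native pixel
quarter = hit_rate(0.5, 0.25)  # same target, internal render = 1/4 the pixels
print(f"native: visible ~{native:.0%} of frames, "
      f"quarter-res internal: ~{quarter:.0%}")   # ~50% vs ~12%
```

No upscaler can restore a target the tracer never hit, which is the sense in which "super sampling can't insert things that were never rendered".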

  • @-Rizecek-
    @-Rizecek- 1 year ago +1

    I personally started playing at native 1440p at 125 FPS. DLSS can give you a better picture, but it can also make it worse and add ghosting.
    DLSS frame generation is a total abomination.

    • @Swallowfire
      @Swallowfire  1 year ago

      I find DLSS is at its best when it's making up for bad anti-aliasing. Forza Horizon 5 has horrible AA and DLSS smooths it out perfectly.

  • @ever611
    @ever611 1 year ago +1

    the true issue with dlss 3 frame generation is that, when you absolutely need more frames, you will get more visual artifacts and feel less responsiveness

    • @Swallowfire
      @Swallowfire  1 year ago

      The extra input delay drove me nuts in Cyberpunk Overdrive with frame gen on. Couldn't stand it with a controller or a mouse.

    • @deyandimitrov7287
      @deyandimitrov7287 10 months ago

      @@Swallowfire Well, you need a base framerate of at least 50-60 for the latency not to make the game unplayable. RT Overdrive would make that impossible, as even a 4090 can't run it properly. It is a great technology, though; as I am on an AMD GPU, I am waiting for Fluid Motion Frames.

  • @shihabuddinahmad7284
    @shihabuddinahmad7284 1 year ago

    Why are you not recommending XeSS? For 4 GB GPU (GTX 1650 Super) users it is the only "DLSS" for me. Intel is a new GPU manufacturer, but it is rising faster than any newcomer, if I am not wrong. I believe the Arc B380 will be the most used GPU in the Steam survey in the future.

    • @Swallowfire
      @Swallowfire  1 year ago

      When I made this video, I didn't own an Arc. I have since made a video comparing DLSS, FSR, and XeSS, and I was very impressed by it. Check it out!

  • @rammer3232
    @rammer3232 11 months ago +1

    I don't even notice those 0.1 millisecond glitches; people "hate" on things too easily.

    • @Swallowfire
      @Swallowfire  11 months ago

      On a 144hz display, I see them as stutters. Feels like playing a game with the dreaded shader compilation stutter.

    • @YuriMomoiro
      @YuriMomoiro 10 months ago

      "I don't have this problem, so no one else does."
      Some people have more sensitive eyes than others.

  • @marcleblanc7295
    @marcleblanc7295 1 year ago +4

    Your eyes look sexy when upset :) I love my 4090 and flight sim, all I gotta say. And no, I don't slow anything down to find something to complain about; I enjoy the flying, and since getting a 4090 I'm getting better first-try landing scores too. 120 fps and up, all settings ultra, terrain detail 400 all the way up, all options completely maxed out, never below 100, and I've never noticed jitter or stutter.

  • @BonusCrook
    @BonusCrook 1 year ago +1

    Or just buy a used 3090 and fix the thermal pads

    • @Swallowfire
      @Swallowfire  1 year ago

      Friend of mine did that and loves it. Even though it CPU limits everything cos he's only got a 3700X lol

  • @pcneststayinformed
    @pcneststayinformed 1 year ago +1

    I can't believe that DLSS 3.0 is so bad that it is actually a turn off. I am good with my 1080 Ti. Nice video bud.

    • @Swallowfire
      @Swallowfire  1 year ago +1

      1080ti is a beast. One of the best cards they ever made.

    • @geeknproud321
      @geeknproud321 1 year ago

      My 1080 Ti was a stopgap from a 1070 to a 3080. Just not enough horsepower to do justice to some newer games like Metro and others. Glad I skipped RTX 2000 because the RT performance sucked on those cards. But yeah, the 1080 Ti didn't do the job for me. Even if some games are just terribly optimized, throwing more power at them is still the solution. Also, raytracing when done right is KICKASS.

  • @MultiTesseract
    @MultiTesseract 1 year ago +1

    Pixel peeping

    • @opmacace523
      @opmacace523 1 year ago

      It's pretty noticeable; you don't need to pixel peep if you are used to 120+ Hz

  • @kingiument4627
    @kingiument4627 11 months ago +2

    Ah yes Nvidia making a new technology that requires new hardware that supports it sooooo evil uhuh

    • @kevinerbs2778
      @kevinerbs2778 11 months ago

      It'd be better if older tech could use it; that way we create less e-waste. It's why I support SLI/CrossFire for DX11 and mGPU for DX12. We need mGPU support in more games.

    • @kingiument4627
      @kingiument4627 11 months ago +1

      @@kevinerbs2778 CrossFire likely won't ever really work well. Also, did you not see what I said about it needing hardware that actually supports the new stuff? Older stuff simply can't take advantage of it

    • @Munenushi
      @Munenushi 10 months ago +1

      AMD made their tech:
      - support old cards
      - support any game
      - support competitor (ngreedia) cards
      up to you tho

    • @kingiument4627
      @kingiument4627 10 months ago

      @@Munenushi Nvidia has the same thing; it's called NIS, and it was on par with FSR 1, but it hasn't been worked on since FSR 2

  • @wizzenberry
    @wizzenberry 5 months ago

    The number of people with a 40 series coping in these comments is insane; DLSS 3 is just a scam to sell overpriced GPUs

  • @LowRiderFuckYou
    @LowRiderFuckYou 1 year ago +2

    oh yes, after this video I'm running out to buy a crappy AMD GPU! Not at all. A 4070 Ti is on the way.