AMD's FSR 3 Frame Generation Analyzed

  • Published: 6 Sep 2024

Comments • 2.5K

  • @elheber 11 months ago +1431

I'd love to see an analysis of gameplay footage with half the frames deleted, using only the odd synthetic frames, so we can see how the artifacts look on their own in real time.

    • @Mojo_DK 11 months ago +121

      That is an awesome idea.

    • @zedsdeadbaby 11 months ago +46

      That's a very clever idea, I would like to see this also.

    • @DragonOfTheMortalKombat 11 months ago +53

      Let's see how real or fake those frames are.

    • @monsterboomer8051 11 months ago +18

      that is a very interesting idea

    • @goldenpanda7004 11 months ago +35

      Interesting idea, but useless in application
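
      The experiment proposed in this thread is straightforward to prototype. A minimal sketch in Python (assuming the capture alternates rendered and generated frames, with the synthetic ones at odd indices; the frame labels are purely illustrative):

      ```python
      def synthetic_only(frames):
          """Keep only the odd-indexed frames, i.e. the (assumed)
          interpolated ones when rendered/generated frames alternate."""
          return frames[1::2]

      # Toy sequence: 'R' marks rendered frames, 'G' generated ones.
      capture = ["R0", "G1", "R2", "G3", "R4", "G5"]
      print(synthetic_only(capture))  # → ['G1', 'G3', 'G5']
      ```

      On real footage the same selection could be done with something like ffmpeg's `select` filter (e.g. `select='mod(n,2)'`), then played back at half rate to inspect the generated frames in isolation.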

  • @Zordonzig 11 months ago +649

    I can only hope AMD and game devs get better with FSR3 over time. We really really need AMD to be competitive.

    • @yellowflash511 11 months ago +97

      Not really because people who say this only want that so they can buy Nvidia at cheaper rates.

    • @saminavy7124 11 months ago +22

      @@yellowflash511 so true!

    • @SirCaco 11 months ago +34

      @@yellowflash511 I'm sure there's truth to that when it comes to certain people, but that doesn't mean it's not true. It's very important for competition to continue. There can be no stagnation, or there will be complacency, and that means higher prices for worse products.

    • @qlum 11 months ago +23

@@yellowflash511 Not true for me. I just like to torture myself by running Linux, but not hard enough to torture myself with Nvidia's Linux drivers.

    • @evrythingis1 11 months ago

      It's going to take much more than being better than Nvidia in every way to break Nvidia's monopoly. AMD has had the better offerings for the majority of their time competing with Nvidia, it has never affected market share. Just like Google with their illegal search monopoly, or Microsoft with their illegal x86/64 operating system monopoly. Hell, AMD had to sue Intel for over 4 billion dollars just to break into their illegal CPU monopoly.

  • @thaddeus2447 11 months ago +8

I don't know what Tim is talking about; from the footage he presented I see better sharpness and fine detail on the FSR3 side compared to a blurry image on the DLSS3 side. So what kind of DLSS superiority is he talking about? FSR3 works even on GTX 10 series cards and does as good a job as DLSS3, which only works on the 40 series.

    • @kenshirogenjuro873 11 months ago +2

I seldom noticed any "superiority" on the DLSS side either. In some places I noticed stutter on the DLSS side while it was perfectly smooth on FSR. Not sure if that's an artifact of capturing or something else.

  • @keyboard_g 11 months ago +198

    Frame generation. The less you need it, the better it is.

    • @markhackett2302 11 months ago +4

FG is useful for a monitor that has to run a slower overdrive setting in some games, IF the FG rate pushes it into the faster overdrive setting.
MOST monitors need two or three overdrive settings; a single overdrive setting that works everywhere is a high-end feature. Don't use FG for the game, use FG for the monitor.

    • @cosmic_gate476 11 months ago +1

      ​@@markhackett2302 Useless on OLED monitors with no overshoot

    • @C0BEX 11 months ago

@@cosmic_gate476 Meaning it's still useful for 99% of monitor owners out there lol

    • @Ronaldo-se3ff 11 months ago

@@cosmic_gate476 If you have those, then you don't need FG.

    • @trackingdifbeatsaber8203 11 months ago

@@cosmic_gate476 OLED is still really expensive; if you can afford an OLED screen, I doubt you would be using upscaling.

  • @notacopa 11 months ago +300

    In case people don’t know, AMD fixed the VRR issue with the most recent beta drivers

    • @dendonflo 11 months ago +8

Where do you find the most recent beta drivers? I can only find the link to the page for the first release of AFMF :\

    • @majorgg66 11 months ago +18

      They will break it in the next one. I swear.

    • @puffyips 11 months ago

@@majorgg66 Then don't download those

    • @jonRock 11 months ago +15

      Unless I'm seeing a different driver, Freesync is only working with Fluid Motion Frames, and it is still not working with FSR3.

    • @Xilent1 11 months ago +2

      They watched the video first too

  • @eyeOfAC 11 months ago +570

    They got the optical flow thingy working on compute, and reasonably well too. That’s the main point, the rest is just teething problems for new tech. This was supposed to be impossible without hardware support, as Nvidia put it. I’m impressed

    • @arthurbonds7200 11 months ago +42

      They never said it was impossible - they said it wouldn't look very good, and a lot of the problems here actually point to such. Remember that NV is still positioning DLSS as one of the crowns of RTX (which it absolutely is) so putting out a lower quality for specific products in the stack will do nothing but damage DLSS' reputation for consistency among all supporting hardware.
      Can they do it? Sure, given enough time and incentive. In the meantime, they're currently content letting FSR take the second fiddle since it runs on their own hardware too anyway.

    • @dakbassett 11 months ago

      @@arthurbonds7200 They said it wasn't possible. Google it.

    • @markhackett2302 11 months ago +119

@@arthurbonds7200 No, they said it was impossible. "Oh, without the new optical flow tensor instructions you wouldn't get good frame rates!!!!", but we got good frame rates without the use of the optical flow tensor cores.

    • @HunterTracks 11 months ago +101

@@arthurbonds7200 But it does look good, that's the thing. The interpolated frames themselves don't look any worse than DLSS 3 FG's. The issues with frame pacing are completely separate and quite solvable.

    • @slayerr4365 11 months ago +23

@@HunterTracks Are you one of those people who spouts BS like "FSR and DLSS look the same" lol. Friendly reminder that FSR3 hard-locks you into FSR 2 upscaling. I don't care what fps you are getting; if you are using FSR 2 over DLSS 2 on any RTX card just so you can use frame gen, then you are insane for how much visual quality you are giving up.

  • @BigHeadClan 11 months ago +439

I recall Nvidia's frame gen also having teething issues when it first came out.
I'll give AMD a chance to get its ducks in a row with the technology.
If nothing else, I very much appreciate that they included it on nearly all cards on the market, not just their latest generation.

    • @Jackson-bh1jw 11 months ago

I think the cheat here is that at the DLSS 3 announcement, DLSS 3 WAS frame gen, unified... now DLSS 3.5 is announced with frame gen excluded, but they didn't say that and everyone is confused, and they are claiming to have all of DLSS for all cards. A big scam.

    • @Davinmk 11 months ago +39

Some of the stuff listed in the video has already been fixed.

    • @TheBackyardChemist 11 months ago +7

      It even runs on an Nvidia 1060, although it is not particularly useful on that card

    • @jinx20001 11 months ago +14

The difference is AMD just spent the last year, plus some delays, getting its ducks in a row... why do people keep making the excuse that Nvidia wasn't perfect at launch so AMD doesn't have to be, when the difference is that Nvidia led the way with the tech and AMD took its sweet time (a year, no less, plus delays) and still shipped it with problems? If this doesn't tell people AMD is extremely far behind, I don't know what will... it's like AMD panics whenever Nvidia introduces a new feature and flaps around like a duck for a year trying to imitate it, always coming off worse.

    • @JM-kn9dh 11 months ago +46

@@jinx20001 Nvidia developed the technology for their servers, then adapted it for gaming, and the result is much weaker cards per dollar for the average person. AMD also takes their time because they make it open source.

  • @ij6708 11 months ago +234

Looking back, DLSS 3 didn't have a smooth launch either. I vaguely remember visual artifacts, especially with UI, and some other issues.
FSR3 is more interesting simply because it's not hardware-locked; hell, it'll even run on iGPUs.

    • @Jackson-bh1jw 11 months ago +47

AMD buried: PhysX, G-Sync (remember your old god? "G-Sync compatible" is just FreeSync), Nvidia Hairworks, "DressFX", SLI, DLSS 2 (waaay more games on FSR), and RTX (20% usage) is half dead, while DLSS 3 only runs in 5 games and shares the same destiny. Meanwhile NVIDIA buyers are forced to pay the RTX tax of double the price, with most lineups discontinued (GTX).
I only see win after win here. Nvidia tech superior? Man, you have a mountain of taxed, unupdated, dead techs around you; enjoy.
And the RX 6600 and RX 6800 can't be beaten in their league.

    • @mrsasshole 11 months ago +1

      @@Jackson-bh1jw Yeah, Jackson I have to agree, I'm a little surprised by the dreary tone of this video. The current problems (reportedly resolved in the beta drivers) seem minor and very surmountable. And given the broad support FSR3 is going to enjoy versus the tiny subset of cards compatible with DLSS 3, this seems like a feature to celebrate.

    • @Le_Mon9 11 months ago +43

@@Jackson-bh1jw Nvidia tech may be superior, but the fact that they lock users out with their proprietary BS is what kills it.

    • @rvs55 11 months ago +4

That is true. But unfortunately, first to market matters, because by the time a competitor comes out with a competing technology, the incumbent has already stabilized theirs or moved on to a second, more enhanced iteration. The market doesn't appreciate second place or give any sympathy to the underdog. They'll just go for the one that works better.

    • @Nib_Nob-t7x 11 months ago +26

@@Jackson-bh1jw A lot of games still use PhysX; as an example, this September Payday 3, Lies of P, and Mortal Kombat 1 all released and use PhysX. Games just got rid of the option to turn it off.
Besides that, most of what you said is wrong, like your false claim that FSR supports more games when in reality DLSS does. DLSS 3 doesn't only support 5 games; it supports 46+. G-Sync still exists and gets updates, and it also offers more features like ULMB2 and the ability to use adaptive sync at all framerates. Hairworks is still being worked on and improved; a few months ago they were able to simulate 86,000 strands of hair at 17 ms per frame.

  • @Hjominbonrun 11 months ago +8

    So first release of FSR3 vs DLSS3?
    Would be great to see another review after 2 or 3 updates.

    • @googleslocik 11 months ago

      "the will fix it after release" ideology is one of the most delusional fanboy defence
      it will never be as good as dlss3, same way dlss is much better than fsr2, nvidia might be a shady company, but clearly they didnt lie when they said the new hardware in 4xx series is designed to do it right

  • @bournechupacabra 11 months ago +210

    Given that AMD has nailed the quality and overhead aspects of frame generation, I'm optimistic that fixing the anti-lag and frame pacing issues won't be too difficult. They clearly released a beta version of the feature and called it a full release. But once those issues are sorted out it will be great.
I think FSR will always be behind in terms of upscaling image quality, but as long as they are close enough, I think that's a big win.

    • @totalprocall6894 11 months ago +56

People don't give NVIDIA enough stick for the fact that their tech is locked behind their own hardware. I'll take FSR and its open source nature over DLSS for this reason any day. The difference in real-time use is minuscule... I have PCs with both vendors' hardware.

    • @JohnSmith-ro8hk 11 months ago +9

Uh, it's actually a preview driver that is not part of the main driver branch. It is the definition of a beta driver...

    • @grantusthighs9417 11 months ago +3

@@totalprocall6894 They're not doing it out of the kindness of their heart. They know they have no chance of developers implementing it if it were locked to their hardware, as Nvidia basically dominates the market. They have tried before, with better tech and higher market share, and failed.

    • @HunterTracks 11 months ago

      ​​@@grantusthighs9417"AMD hardware" encompasses the entire console market (except Switch, but that's its own little bubble). Even without that, it's a few days of dev work for 10% more potential reach, that's more than enough for most big studios.
      Either way, it's quite obvious that AMD did it for their own benefit, but a win for consumers is a win for consumers either way. It's still something to celebrate, no matter who does it.

    • @magnomliman8114 11 months ago +2

@@totalprocall6894 That "open source" part clearly doesn't help AMD's tool get better, so what's the point then?

  • @Drovek451 11 months ago +143

    I played around a bit in the Forspoken demo and actually thought it looked fine, though my 6950XT was indeed maxing out my monitor with FrameGen.

    • @alexandruilea915 11 months ago +9

      I have a 6900XT and can't wait for FSR 3 FG to be in Witcher 3. I get like 60-70 FPS native with max details on a UWQHD monitor that has 144hz refresh rate. At 120-140 fps it would be just there. I would like to keep the option for no upscaling tho.

    • @LagiohX3 11 months ago +1

Yeah, I tried it on a 3090. At first, looking for the artifacts, it looked bad (the effect lessens at native res), but just playing through, it felt nice. The game itself looked like ass sometimes: bad shadows (what are RT shadows for?), the MC model, and effects that are a bit low-res. Idk if it's an artistic choice, but they feel good yet look bad sometimes. I like the aggressiveness of the enemies and the gameplay though.

    • @markhackett2302 11 months ago +1

That may be what FG is best for, either DLSS or FSR: taking a game that requires VRR because it only gets 60-80 frames natively, and letting you lock it to V-sync because FG gets you above the monitor's refresh rate almost all the time.

    • @slayerr4365 11 months ago +3

It's so cute watching all the AMD babies come out of the woodwork after a year of bashing frame gen because they couldn't use it, all of a sudden falling in love with the technology now that they finally have access to their own version, albeit one worse in every way than DLSS frame gen lol.
At least I won't have to keep reading delusional anti-frame-gen comments on this channel now that the peasants can use it too. I thank AMD for that.

    • @iseeu-fp9po 11 months ago +26

      @@slayerr4365 Stop fanboying. Nobody cares what brand you use. Better technology benefits everyone, and btw.: I'm currently on an Nvidia GPU.

  • @KeepAnOpenMind 11 months ago +80

The reason this works best at maximum refresh rates is simple: the motion vectors' velocity is smaller the higher the frame rate is. The frames are predicted based on that velocity, and the greater it is, the harder it is to predict (correctly).
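
    The effect described here can be put into numbers: interpolation predicts an in-between frame from per-frame motion, and a linear prediction deviates less from the true motion when frames are closer together. A toy 1-D sketch (the constant-acceleration motion model and all of the numbers are illustrative assumptions, not anything from FSR or DLSS):

    ```python
    def midpoint_error(fps, v0=50.0, accel=100.0):
        """Error of a linear midpoint prediction for 1-D motion with
        constant acceleration over one frame interval dt = 1/fps."""
        dt = 1.0 / fps
        x0 = 0.0                                    # position at frame N
        x1 = x0 + v0 * dt + 0.5 * accel * dt * dt   # true position at frame N+1
        linear_mid = (x0 + x1) / 2                  # interpolated frame's estimate
        true_mid = x0 + v0 * (dt / 2) + 0.5 * accel * (dt / 2) ** 2
        return abs(linear_mid - true_mid)

    # Doubling the base frame rate quarters the prediction error:
    print(midpoint_error(60), midpoint_error(120))
    ```

    The residual works out to accel * dt^2 / 8, which is why halving the frame interval cuts the error by four.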

  • @rickgear2579 11 months ago +99

    I won't lie, I am legit impressed with FSR3/FG. While certainly not as good as DLSS3/FG, I've seen enough that I can say Radeon owners aren't getting a drastically inferior experience. FSR3 could definitely use a few more tweaks and will likely get better with time, but those multi-$100 premiums for nvidia GPUs start to look less worth paying for. I don't disagree that it's not ready for primetime, but as a preview, I like what I see.
    Well done AMD. The tech certainly isn't as good as DLSS3/FG, but that makes sense given how much of a premium you have to pay to take advantage of it.

    • @slayerr4365 11 months ago +5

People won't be getting a much worse experience from FSR 3 once the issues are fixed. They are getting a much worse experience from being locked to FSR 2 upscaling in order to use FSR 3. If they found a way to let 20 and 30 series owners still use DLSS 2 with FSR frame gen, they probably would have made tens of thousands of new AMD fans. As it is, I would recommend that no 20 or 30 series user ever use FSR 3, because it means losing the quality of DLSS, and that is absolutely never worth it.

    • @jackconrad4814 11 months ago +10

Not just AMD users; also the majority of Nvidia users. TONS of Nvidia buyers who bought 2000 and 3000 series cards on the promise of a better feature set got left in the dust at the first opportunity with DLSS3.

    • @chronossage 11 months ago +4

      I wouldn't be too sure about that. There are too many NVidia stans out there. Remember the big thing a month or two ago when they were all getting mad about games not having DLSS while FSR is just fine? All the consoles currently use FSR upscaling and no one there seems to care or think about it. Sadly everyone will just get mad that Nvidia still won't play nice while still buying their GPUs. I'd still rather just play my games at native. I'm starting to think I'll just never try to move to 4k and stay at 1080p for the rest of my days.

    • @gusmlie 11 months ago +1

Shame AMD doesn't innovate on GPUs. If Nvidia didn't do something for AMD to copy, the GPU market would still be at the 8800 GTX.

    • @cloakofshadow1 11 months ago +1

@@chronossage Say what you want, Nvidia doesn't have a reputation for abysmal drivers.

  • @dragojess 11 months ago +75

Hot take: requiring a high base framerate to work well is a good thing. It discourages devs from using FSR3 to make up for poor framerates; it only enhances the experience on an already good framerate.

    • @slayerr4365 11 months ago +10

Fair, but not entirely true. Seeing as devs are already using upscaling as a replacement for optimisation, the thought process will more likely be "what's the lowest frame rate we can get to with upscaling that is acceptable to boost up with frame gen to be playable".
If 2023 has taught us anything, it is that game developers are the laziest people on earth who will do everything in their power to not spend a single second making their games playable.
On a related note, I also don't agree with his take that you need at least 100 fps for frame gen to be worth it. For me, on DLSS frame gen, anything over about 75 fps with frame gen on (so around 45 native fps) was enough to make the game feel perfect. I'm playing Cyberpunk at 80 fps with FG on path tracing currently, and the game feels worlds better than at around 50-60 fps with FG off and lower settings. I think a lot of people are using bad monitors, which gives them a skewed view of how much you need for it to look and feel smooth.

    • @Coffeeenjoyer31 11 months ago +3

      I understand what you mean, but I can't see why you would enable this feature if your fps was already acceptable. Just a lower quality image and worse input latency. I would consider using it if fps was below 60 to hopefully bring the smoothness up to par with 120hz refresh rate.. just my opinion though.

    • @PetrisonRocha 11 months ago +3

      @@Coffeeenjoyer31 Yes! If I have real stable 60 fps, why would I want fake low quality high lag 100 fps instead? Makes no sense at all to me.

    • @Coffeeenjoyer31 11 months ago +1

      @@PetrisonRocha this is why Nvidia locking frame gen to RTX 40 series is particularly annoying. A 4060 has enough performance where you wouldn't likely be in a situation where above 60 fps is unachievable.. so kinda meh

    • @robertstan298 11 months ago +1

      THIS. Hopefully we'll only see it AT MOST in quasi-120Hz/fps PS5/XSX game modes. Which in turn, may mean they'll put some actual effort into integrating it into the render pipeline and result in a great implementation on PC as well.

  • @McMaxW 11 months ago +31

    I appreciate the detailed analysis... but I'm sure I would not notice 90% of these issues while gaming.

    • @halrichard1969 11 months ago +15

      You and me both brother. Sometimes I feel like many of these YTers become hyper critical about most everything in order to generate interest. Beyond looking decent, my focus is gameplay, gameplay, gameplay. Graphics are really secondary. And hey, most people do not play on ultra settings anyway because they are not playing on the latest greatest hardware. Sometimes, I feel like a motherless child. :D

    • @3Dant 11 months ago

      ​@@halrichard1969 It's their job so they're naturally going to be more focussed on these things than the average person. And I think in general they're trying to hold AMD to a higher standard, which is a good thing. The more AMD feels like they need to achieve feature parity with Nvidia, the better it is for consumers. It's debatable whether they put too much emphasis on how it compares to Nvidia features, given that the major use case is for those who can't run those Nvidia features (AMD cards, older Nvidia cards), but in general I think it's a good thing.

    • @bassyey 10 months ago +1

I won't notice any of these in my indies and JRPGs lol. Western AAA games rely on tech to be interesting.

  • @beat_g9368 11 months ago +324

Conclusion: when buying a new GPU, don't rely on frame generation; buy the best one you are willing to afford that renders the highest fps natively.

    • @SpudCommando 11 months ago +50

      So if you care about game raster then buy AMD because Nvidia is overpriced for fps

    • @donsly375 11 months ago +2

      but bru i'm not a peasant like you

    • @rinsenpai135 11 months ago +7

      That's why i got a 3060 laptop instead of a 4050 laptop. Despite CPU being more powerful in the 4050 laptop, the 3060 is stronger at native 1080p and 1440p than the 4050. Considering I don't play games with upscaling options, this is the best choice to do.

    • @004vamsi 11 months ago +2

      So, a 4090 then?

    • @monsterboomer8051 11 months ago +42

Nvidia is stupid; their 4070 is a good GPU, but its 12 GB of VRAM is a joke. I bought an RX 7800 XT instead.

  • @Jackintosh117 11 months ago +5

Are the background clips the wrong way round from 16:16 to 17:12?
The left clip is straight-up higher quality, and the commentary is saying that DLSS looks better?
Pause at 16:45 and tell me the right doesn't look like a blurry, lower-quality version of the left image.

    • @Ben-Rogue 11 months ago

Nope, they just seem to prefer DLSS's blurry upscaling over some FSR shimmering. I'm sure if they toned down the FSR sharpening slightly it might improve the shimmering, at a slight cost to the obvious sharpness win it has over DLSS here.

    • @WayStedYou 11 months ago

@@Ben-Rogue The DLSS looks smudged and the FSR looks sharp, but like you said, probably too sharp; if it were toned down it would look much nicer.

  • @ItsDillan 11 months ago +88

    This is going to be like G-Sync.
    AMD will lag behind in engineering a viable alternative that's able to be used by a broader group of hardware, the whole time Nvidia will claim that it's just "not possible" for them to use *Insert newest DLSS here* on certain hardware. Eventually AMD will catch up and Nvidia will have to change their marketing strategy or look like idiots.
    AMD occasionally manage to do some really interesting things. FSR now, but mantle before was so forward thinking that a lot of the API was supposedly absorbed into Vulkan and DX12.

    • @phutureproof 11 months ago +21

And let's not forget that today every single one of us is using the AMD64 architecture in our Intel CPUs lol
(at least kind of!)

    • @MarineRX179 11 months ago +14

This has always been the case:
3D Vision
PhysX
G-Sync
DLSS
Nvidia has always been and always will be about pushing self-serving proprietary tech.
There's nothing wrong with that, but when they alienate even their own user base with older gen cards for the sake of pushing them into upgrading (milking them), that is simply greed.

    • @mirage8753 11 months ago +2

Yes, Nvidia sells technologies, while AMD sells graphics cards. Look at the 4060/Ti: they are useless without frame generation, and the lower power consumption is barely worth the upgrade.

    • @mirage8753 11 months ago

      @@phutureproof indeed

    • @MarineRX179 11 months ago

@@mirage8753 That's because in reality they are actually a 4050/4050 Ti, but Ngreedia labels them higher just so they can sell lower-tier cards at higher pricing.

  • @Nightrain5555 11 months ago +20

FSR 3, for being new and just launched, is VERY promising. Big props to AMD!

  • @ClaudeHomml 11 months ago +4

Did you mistakenly switch the FSR3 and DLSS videos? I found it rather confusing that just about every video clip labeled FSR3 had obviously more detailed and sharper textures than the DLSS-labeled ones, while the commentary was saying the exact opposite. Even through the YouTube codec, the DLSS textures looked obviously lower resolution and way more blurry.

  • @mycosys 11 months ago +213

Seems like a pretty promising public beta to me; I look forward to the full/FSR3.1 release on Starfield.

    • @csl750 11 months ago +10

      Well if it is a beta release... then there shouldn't be too much 'panic' per say... issues then are expected...

    • @InnuendoXP 11 months ago

      ​@@csl750per se* fyi

    • @Terepin64 11 months ago +7

      Beta? More like alpha.

    • @mycosys 11 months ago

      Alpha Made Drivers @@Terepin64 👍👌

    • @csl750 11 months ago +5

@@LTNetjak And due to AMD trying to make the tech available for a wide selection of cards... they're gonna struggle.

  • @anonymoususer7985 11 months ago +59

    So glad Daniel Owen has been going through the settings with the community to help and show people since the tech launched last month

    • @Gl0ckb1te 11 months ago +11

He's very good. Seems like a wholesome dude. HUB is still the GOAT.

    • @sergioaraujo7943 11 months ago +6

Never watched him, but the behavior of a part of his audience, spamming every place (it is NOT restricted to HUB) that comments on another youtuber's work, pretty much guarantees that I'll never watch him. He might be a really good youtuber, but I am tired of seeing this cultish kind of spam everywhere.

    • @anonymoususer7985 11 months ago

@@sergioaraujo7943 Maybe you want to take your issues up with HUB, seeing as Tim mentions Daniel's name in the video you are commenting on. Or do you not want to listen to the video from HUB either?

    • @AKSBSU 11 months ago +10

@@sergioaraujo7943 I think he's a math teacher or something and makes the videos solo from his house. He seems like a decent dude. I've watched his videos here and there for quite a while.

    • @Gl0ckb1te 11 months ago +21

@@sergioaraujo7943 That is the worst reason to not watch someone's content.

  • @TheGoncas2 9 months ago +3

One thing that I don't see people talking about enough is how EXPENSIVE the DLSS FG pass is. Doing the math, it costs 5 ms at 4K on an RTX 4090, while FSR3 only costs 2.4 ms. That's LESS THAN HALF, and it makes a difference in frame rates, especially at high refresh rates! If AMD can do it, Nvidia should optimize the DLSS3 cost too.
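
    Taking this comment's per-frame costs at face value (they are the commenter's figures, not verified here), the throughput difference can be worked out under a deliberately simplified model in which the generation pass runs serially with rendering:

    ```python
    def fg_output_fps(base_fps, fg_cost_ms):
        """Displayed frame rate for interpolation-style frame generation:
        each rendered frame is paired with one generated frame, and the
        generation pass adds fg_cost_ms to every pair (serial model)."""
        pair_time_ms = 1000.0 / base_fps + fg_cost_ms  # render + FG pass
        return 2000.0 / pair_time_ms                   # two frames per pair

    # At a 120 fps base (~8.3 ms per rendered frame):
    print(fg_output_fps(120, 5.0))   # 5.0 ms pass (the quoted DLSS figure)
    print(fg_output_fps(120, 2.4))   # 2.4 ms pass (the quoted FSR3 figure)
    ```

    Under this model the cheaper pass preserves noticeably more of the doubled frame rate, which is why the overhead matters most at high refresh rates.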

  • @rockindude3001 11 months ago +8

Thanks for the review!
I think it would have been helpful for you, the review, and the viewers to have taken a deeper dive into the technology. AMD and the GPUOpen project have some deep dives into how FSR2.2 works. For example, which TAA implementation (game native or FSR) is chosen is up to the developers, IIRC. They CAN choose the FSR TAA, but they can also hook in their own.
Shimmering of fine details in FSR may be reduced by a better implementation by the developers (like you suggested, letting IoA choose *their* TAA should be possible).
The same goes for UI elements rendered at a lower fps than the interpolated frames; this is surely down to the devs deciding which elements go where (interpolated or not) for screen presentation.
Anyway, cheers for your content! Your work is really appreciated, and I hope I voiced my criticism respectfully and constructively!

  • @MrQwertypoiuyty 11 months ago +28

Good or bad, even if it's subpar compared to DLSS 3.5, I'm just glad FSR 3 is here, as it saved me from my itch to upgrade from an RTX 3080 to a 4080. I say GOOD JOB, AMD!

    • @PhilosophicalSock 11 months ago +1

It is subpar compared to DLSS3; not 3.5 by any means.

    • @Jtwizzle 11 months ago

Now we just need more games with support. Could be a while before there's a decent library.

    • @dudanvictor4203 11 months ago

@@PhilosophicalSock 3.5 too

    • @SwurvGG 11 months ago +1

      Thank you amd! Will probably buy a 5080 next year instead of the next amd card 😂

    • @PhilosophicalSock 11 months ago +4

@@dudanvictor4203 Nope. 3.5 is about ray reconstruction.

  • @kenshirogenjuro873 11 months ago +115

    As touched on in the video, a lot of what we’re being presented with does suggest they were trying to meet some internal deadline, maybe a promise made to shareholders or who knows what. The announcement seemed almost hushed, and with only two relatively less relevant games so far, it’s all very much like they’re still ironing a lot out before a likely more boisterous public announcement I’m sure they are trying to get to sooner rather than later.

    • @midnightaf
      @midnightaf 11 месяцев назад +41

      Did you forget that DLSS 3 also had issues at launch and looked like a smeary mess?

    • @user-jd3pk1bz8e
      @user-jd3pk1bz8e 11 месяцев назад +50

      Remember, the first DLSS implementation in Cyberpunk was trash. All companies are rushing products because the market doesn't allow for waiting. Profits are more important than quality delivery nowadays.

    • @aratakeneikawa6589
      @aratakeneikawa6589 11 месяцев назад +3

      They announced it a year ago

    • @crazyvlogs9453
      @crazyvlogs9453 11 месяцев назад

      @@user-jd3pk1bz8e yeah, even more so for AMD now; after all, they are racing to catch up with NVIDIA in this area, so they want to rush things more than any other company

    • @mojojojo6292
      @mojojojo6292 11 месяцев назад +3

      Rushed? It was supposed to be out Q1 this year. It's already half a year late.

  • @MyrKnof
    @MyrKnof 11 месяцев назад +5

    Look at 17:00... I have NO idea what Tim is talking about there, because the FSR looks WAAAAY sharper. Of course, that might be CAS sharpening, which would also introduce/explain shimmering. So what do you want? Sharpness or Vaseline?

  • @MrX-qs3gv
    @MrX-qs3gv 11 месяцев назад +20

    I am impressed with FSR 3 in Forspoken on my 4070 Ti. I swear my next GPU will surely be an AMD

    • @PhilosophicalSock
      @PhilosophicalSock 11 месяцев назад +7

      lol, what is that logic if you have an RTX 4000

    • @imp_ct
      @imp_ct 11 месяцев назад +2

      @@PhilosophicalSock Obvious bs from AMD fanboys

    • @technicallycorrect838
      @technicallycorrect838 11 месяцев назад

      Surely .... (Aware)

  • @krjal3038
    @krjal3038 11 месяцев назад +12

    I don't enjoy frame generation at all even on a top tier system but I'm really happy to see AMD improving the competition. Hopefully FSR upscaling can improve more as well.

  • @huggysocks
    @huggysocks 11 месяцев назад +106

    You guys may want to check the release notes, because they tell you how to fix frame pacing. Works great for me with both the original and experimental drivers; in fact, I can use Fluid Motion with frame gen.

    • @inducedapathy1296
      @inducedapathy1296 11 месяцев назад +4

      Fluid Motion is the driver setting, right? I saw one guy using a beta driver that enables frame gen via the driver, however it cuts off when you move around fast. Kinda cool that you can enable frame gen via a driver instead of the game, which might help in older games like, say, EU4, CK3, Total War, etc. Those are more CPU bound though and don't require fast movement like an FPS.

    • @huggysocks
      @huggysocks 11 месяцев назад +7

      @@inducedapathy1296 you can even combine Fluid Motion and frame gen in newer games to get super high frame rates; works amazing. Fluid Motion can even speed up games like L.A. Noire that had a 24hz cap, though this makes facial animations run a bit too fast.

    • @ultrademigod
      @ultrademigod 11 месяцев назад +2

      This post should be pinned, because most people won't even be aware of it, and probably won't read the release notes.

    • @AlexRubio
      @AlexRubio 11 месяцев назад +5

      Good luck with that; HWU don't really bother, and if they do it'll be a shocker. I've already moved on from this channel, but I was willing to give them a chance. I was screaming this at my phone, but whatever

    • @Doug-mu2ev
      @Doug-mu2ev 11 месяцев назад +1

      What do the release notes say to do to fix frame pacing? I did not see anything other than turning Vsync on. Was it capping frames to half your monitor's refresh rate?

  • @mymobilestube
    @mymobilestube 11 месяцев назад +64

    Thanks for putting in the effort to dissect this on such a granular level and explaining it in such a straightforward way ❤

  • @theriddick
    @theriddick 11 месяцев назад +24

    I just want them to fix the VRR support, which only works sometimes and not for the driver method. AND ALSO I want them to support HDR!
    Those are the two big-ticket items for me to start using FSR3. Perhaps it will be a per-game situation, but until we get an official release and not a preview driver release, that may not happen.
    I found the exact same issues as HU here in my short test. Some people however are getting different results, and I think it may come down to monitor configuration.
    17:00 You have FSR sharpening turned on but the DLSS comparison doesn't have sharpening dialed up.

    • @xKamiiii
      @xKamiiii 11 месяцев назад +2

      no one cares about hdr. i'd like to protect my eyes, thank you

    • @xzaramurd
      @xzaramurd 11 месяцев назад +13

      @@xKamiiii I care very much about HDR. It looks incredible.

    • @mariozenarju6461
      @mariozenarju6461 11 месяцев назад

      @@xKamiiii Games that support HDR allow you to set brightness levels. If you're having a bad experience with it, it's on you

    • @ij6708
      @ij6708 11 месяцев назад +14

      HDR is a visual upgrade surpassing RTX tech, simply for the fact that it's not taxing at all performance-wise and it likewise drastically improves image presentation. So I guess I'm part of the "no one" who cares about it

    • @xKamiiii
      @xKamiiii 11 месяцев назад +1

      @@xzaramurd no it really doesn't. it's too flashy. i prefer raw colours

  • @SSJfraz
    @SSJfraz 11 месяцев назад +5

    That clickbait title man, jesus! 🤦‍♂

  • @RETROxGAMER
    @RETROxGAMER 11 месяцев назад +3

    As we can see all over YouTube, pixel-peeping analysis in slow motion with magnified images is one thing, and the improved overall experience for a lot of gamers with older GPUs, non-Nvidia GPUs, and even last-gen Nvidia GPUs that were supposedly too weak for frame generation is a totally different pair of socks. These gamers will gladly sacrifice some image quality and latency to play their games more comfortably. Contrary to what many people tend to believe watching tech channels, the majority of gamers don't own high refresh rate VRR monitors and high-end GPUs, so FSR3 is a really good thing despite its flaws.

  • @devonmoreau
    @devonmoreau 11 месяцев назад +9

    Impressive work! Such a technical, organized, well delivered video ❤

  • @mondzi
    @mondzi 11 месяцев назад +3

    15:57 There seems to be a big difference in sharpness settings when comparing upscaling in FSR3 vs DLSS3 (FSR3 seems to have sharpness cranked too high).

  • @glenwaldrop8166
    @glenwaldrop8166 11 месяцев назад +4

    I recommend setting a frame cap at 1080p and enabling frame generation on both respective brands.
    Playing with FSR 3.0 FG enabled with a cap of 60 (monitor limit) in Forspoken lowered the CPU and GPU load so much it felt like I was playing a UE3 or old UE4 title vs a UE5 one. The input latency was massively improved.

  • @rsherhod
    @rsherhod 11 месяцев назад +200

    I hope Nvidia allow devs to separate the UI rendering pipeline, like AMD has.

    • @B16B0SS
      @B16B0SS 11 месяцев назад +7

      I'm sure they will - it's more of a hassle for devs though if their UI isn't built in such a way to take advantage of the API

    • @flocker4415
      @flocker4415 11 месяцев назад +1

      They are probably trying to find a way to fix the UI issues within the frame generation pipeline rather than just leaving those UI elements at native framerate.
      Because it's not a perfect solution either: if those UI elements move or are in world space (like waypoints, health bars, etc.) they really appear to stutter compared to the rest of the game.

    • @names-are-for-friends
      @names-are-for-friends 11 месяцев назад +1

      It's bizarre they didn't do this in the first place. They know it's a bad idea to run post-processing before FG, so why not UI elements too? I have to doubt it's all that much effort when UI is overlaid in the first place

    • @KeepAnOpenMind
      @KeepAnOpenMind 11 месяцев назад +2

      If I remember correctly, the DLSS integration manual advised rendering the HUD elements natively, and not upscaling/generating those. Either way, if the developer cared, this is what they would do, but here we are.

    • @KeepAnOpenMind
      @KeepAnOpenMind 11 месяцев назад +2

      Apparently, not all game devs care enough to do that properly.

  • @TheDean420
    @TheDean420 11 месяцев назад +11

    I have been using the Driver based Fluid Motion Frames in Starfield. I now get 200+ fps at 3440x1440 on my 7900xtx. It looks and feels so much better than Native rendering at Ultra

  • @Kylethejobber
    @Kylethejobber 11 месяцев назад +29

    When I tried FSR3 I was impressed. It will be nice that game developers will be able to use it on the Xbox and PS5.

    • @mikem2253
      @mikem2253 11 месяцев назад +6

      Hope so. 60fps minimum though so I hope devs use it properly on console.

    • @juanblanco7898
      @juanblanco7898 11 месяцев назад +1

      ​@@mikem2253 With developers already upscaling to 4K from resolutions as low as 720p in recent titles, I don't see them shying away from going the same road here and interpolating from sub-60 FPS as well. I reckon it's going to be a complete shit show if they start implementing FG on consoles.
      As stated in the video, FG is only of real benefit on powerful hardware that's basically already capable of achieving HFR by itself, if not to the same degree as with FG added on top of it.

    • @erikruelas6737
      @erikruelas6737 11 месяцев назад

      It will work on the PS5 but be crippled compared to the Series X

    • @MickdeRaad
      @MickdeRaad 11 месяцев назад +1

      ​@@juanblanco7898 inb4 720p30 games upscaled and frame-generated to 4K60 lol

  • @shadouqh8370
    @shadouqh8370 11 месяцев назад +7

    Great video Tim! Honestly though, I have the same opinion as Steve. I simply miss the days when GPUs were just about raw rasterization and improving performance with each iteration.

  • @winblood8758
    @winblood8758 11 месяцев назад +4

    If you have to zoom in 300% and slow down 4x in order to see differences, that proves to me that you won't notice a difference at full speed with no zoom 😝

  • @AshtonCoolman
    @AshtonCoolman 11 месяцев назад +29

    This video needed to be clearer that this is a technology preview. The VRR issue has already been resolved in the latest beta drivers. This video is being presented as if this were FSR3's final form. (Disclaimer: I have a 4090 and don't currently own any AMD cards. I just believe in this info being represented properly.)

    • @AMD718
      @AMD718 11 месяцев назад +17

      Unfortunately, even when AMD releases a newer version of FSR3 with VRR compatibility, anti-lag+ working properly with FG active, etc., many people will still tune into this video when researching FSR3 and come to the conclusion it's crap. What AMD has done here is impressive even in its unfinished, beta form, with teething issues galore. They've proven frame-generation doesn't need dedicated hardware-based optical flow accelerators. Most folks thought AMD was bullshitting when they said 2023 release, and that they were years away from anything, since it is reported to have taken NVidia 3 years to develop DLSS3 FG. Yet here we are with a solid frame generation base upon which to build a compelling feature that works across multiple generations of cards and brands. I don't think that really comes across in this video.

    • @TheKareem87
      @TheKareem87 11 месяцев назад

      @@AMD718 lol you can thank Tim for that

    • @cc0767
      @cc0767 11 месяцев назад

      you can only review something which currently exists, not a potential version 3 years in the future

    • @markhackett2302
      @markhackett2302 11 месяцев назад

      @@cc0767 Or one in current day, just a couple of days after recording, because that doesn't make NVidia look good.

    • @cc0767
      @cc0767 11 месяцев назад

      @@markhackett2302 they may have fixed 1 particular issue, not magically made the software solution better than the one with dedicated hardware

  • @KeepUp101-j1y
    @KeepUp101-j1y 11 месяцев назад +5

    This is a win for the gaming community. Let's go AMD!!!

  • @GENKI_INU
    @GENKI_INU 11 месяцев назад +1

    9:48 But why use VRR if frame tearing doesn't bother you, or is extremely minimal to begin with?
    Especially when VRR is known to sometimes cause issues with monitor features or game graphics features, performance or otherwise? And there are better ways to cap frame rate too.

  • @sparkplugbarrens
    @sparkplugbarrens 11 месяцев назад +8

    It would be great if you could circle or mark the problematic areas in the image that you are referring to in your script, because I personally find it very difficult to identify what exactly you are criticizing. Anyway, still love your very in-depth reviews of frame generation and upscaling technologies!

  • @_Kaurus
    @_Kaurus 11 месяцев назад +3

    So it sucks unless your FPS is already high enough to not require it. A 100% solution for a zero problem. Fantastic.

  • @mcbellyman3265
    @mcbellyman3265 11 месяцев назад +49

    You may have seen the Switch version of No Man's Sky now has a custom-built FSR2 implementation which looks really impressive despite the low resolution. Unfortunately it probably took a lot of work and is likely game-specific, which is a real shame, as it gives a glimpse of what FSR can do with some optimisation.

    • @mparagames
      @mparagames 11 месяцев назад +15

      @@lunarvvolf9606 I mean, indirectly it builds company trust, also xbox and ps5 use amd hardware

    • @iurigrang
      @iurigrang 11 месяцев назад +1

      It's such a shame, because the improvements in No Man's Sky are where it matters the most: temporal stability.
      FSR has nailed overall detail surprisingly well, but I'm often using in-house TAAU over FSR simply because of how distracting the artifacts are. Such a shame

    • @mparagames
      @mparagames 11 месяцев назад

      @@lunarvvolf9606 Again, AMD sells GPUs and CPUs to companies like Microsoft and Sony for their consoles, and for the Steam Deck, and even other companies use their features on hardware that would be unsupported by Nvidia; for instance, Nintendo uses FSR in some of their games. This all means that companies are more prone to sign a contract with AMD, or renew their contracts, to have AMD hardware in their devices.
      Of course, in the PC market Nvidia sells way more, but AMD's strategy is more focused on the mobile and console departments right now, as they have the advantage in that market.

    • @Adept_Austin
      @Adept_Austin 11 месяцев назад

      ​@@lunarvvolf9606 I think AMD's decision to make their technology manufacturer-agnostic is more of a byproduct of their commitment to backwards compatibility for their own hardware. FSR3 isn't hardware dependent, which allows it to run on 3 generations of AMD GPUs, whereas DLSS3 only runs on the current-gen NVIDIA ones. Once AMD has their tech running on their own cards in that way, it's obviously possible to run it on NVIDIA cards. Their cross-platform compatibility is probably to avoid the backlash from software-locking their tech away from competitors, because that would be kind of a bad look. Besides, *IF* AMD and NVIDIA cards have feature parity, why wouldn't you go AMD? They support their cards longer and they give you more performance for the price. I'm not saying it makes the most business sense, but it's a noble goal.

    • @mparagames
      @mparagames 11 месяцев назад +5

      Plus, the fact that AMD's technology is open makes it work on ALL AMD hardware as well, unlike Nvidia's technology, where for example the Nintendo Switch cannot make use of DLSS despite running an Nvidia GPU, because that GPU is "too old". Nintendo themselves use FSR, and as more companies use it AMD's brand reputation increases, and outside of the PC builders market this is important.
      More and more brands are deciding to go with AMD for their APU/mobile solutions; Microsoft, Sony, and Steam are already doing that, and Nintendo potentially could with their next console release.
      Remember that AMD has less market share overall than Nvidia, which means that the only reliable way for them to induce consumers and companies to use their solutions is by making them available on multiple vendors' hardware.

  • @Satook
    @Satook 11 месяцев назад +2

    Re: combining DLSS upscaling and FSR frame gen (at 18:00), this is probably very difficult due to how they're integrated into the overall rendering process. Rendering engines are already highly conditional and complex. Trying to have two different "end of pipeline" techs working in tandem is very difficult and potentially impossible, depending on what access (to generated data) they provide to the calling code.
    Suggesting it's due to AMD not wanting to give NVIDIA users a leg up completely ignores the complexity and limitations of the software involved.

    • @blegi1245
      @blegi1245 11 месяцев назад +2

      You have to remember that these are the same people who, without evidence, spread the "AMD is preventing Starfield from having DLSS" FUD. In that context, them demanding that AMD implement DLSS in FSR is very on brand. It cannot possibly be AMD's job to implement Nvidia's proprietary bullshit into otherwise open source code.

  • @Vicric
    @Vicric 11 месяцев назад +2

    Latency comparisons with DLSS3 should have been done without vsync for the Nvidia tech. It harms latency for no benefit, and the comparison with FSR3 frame gen is misleading.

    • @Mr.NiceUK
      @Mr.NiceUK 11 месяцев назад

      With Reflex on, vsync doesn't harm latency; vsync with Reflex on basically limits fps to a little below the refresh rate automatically.

  • @HoozahYK
    @HoozahYK 11 месяцев назад +5

    Did you generate Tim's reactions in the thumbnail with FSR 3 😂??

  • @cocobos
    @cocobos 11 месяцев назад +7

    At 16:00, are you sure DLSS is superior? Looks blurrier to me.

    • @PineyJustice
      @PineyJustice 11 месяцев назад +4

      "Oh no, there is a tiny amount of shimmer at 3x zoom, let's just say the blurry one looks better and hope nobody notices." FSR3 Quality did improve by a noticeable amount in Immortals compared to FSR2; Quality looks the same as or better than native, and Native AA mode looks incredible. Can't speak for Forspoken.

    • @cocobos
      @cocobos 11 месяцев назад

      ​@@PineyJustice yea, both have their pros and cons. I can live with a little shimmer, but a blurrier image?

    • @PineyJustice
      @PineyJustice 11 месяцев назад +1

      @@cocobos DLSS was the blurrier one as you correctly pointed out, yet DLSS was declared to be the winner because there was a tiny bit of shimmer when zoomed in 300% on blades of grass. Blurry can be fine, sharp can be fine, it just depends on the game aesthetic.

    • @mingyi456
      @mingyi456 11 месяцев назад

      @@PineyJustice I can quite easily see the shimmer at 1x zoom before Tim zooms in on the pond though. I am watching the video on a 1080p monitor although the video resolution is still set to 4k, maybe that is why I cannot as easily notice the difference in sharpness.

    • @PineyJustice
      @PineyJustice 11 месяцев назад +1

      @@mingyi456 Only if you're really looking for it. Also there is zero shimmering in Immortals; it seems to be a problem with Forspoken. Also just after, at 16:27, you can clearly see that FSR is sharper and more detailed than DLSS.

  • @EdCGSantos
    @EdCGSantos 11 месяцев назад +4

    Enjoyed Rich's take (DF) much more than this one, and I enjoy HU content. It's good stuff for the community, but it is in its early stages. It's tough to get this level of bashing so early on. DLSS3 had a poor start also... and that's using dedicated HW.

    • @TheKareem87
      @TheKareem87 11 месяцев назад

      Yeah, the DF video was definitely better. Same facts, less biased reporting imo.

  • @kusumayogi7956
    @kusumayogi7956 11 месяцев назад +1

    You need to make a video about AFMF + FSR 3 FG. That is more interesting; I'm curious about the latency with Anti-Lag+ on vs off.

  • @maximada2003
    @maximada2003 11 месяцев назад +7

    As an owner of a 3070, I really wanted AMD to drop a huge win bomb here, as I feel cheated by NVIDIA that a 3000 series card can't use all the Nvidia tech.

    • @mrvr6165
      @mrvr6165 11 месяцев назад +4

      You have tensor cores and they can run the NV optical flow engine that's used under the hood in DLSS3 for frame gen... which you should be allowed to!
      As a 3060 Ti owner who's been upsampling vids to 60fps with tensor cores since the 2070, I'm done with Nvidia if they keep this up. Maybe I'll support Intel next round.

    • @NXDf7
      @NXDf7 11 месяцев назад +7

      @@mrvr6165 they want your money, and they'll keep doing shit like locking tech out to make you buy the newer generations. RTX was an excuse, DLSS3 is also an excuse. What will be the next excuse?

    • @arenzricodexd4409
      @arenzricodexd4409 11 месяцев назад +1

      That is simply stupid. You can't expect older hardware to always support newer tech.

    • @OutOfNameIdeas2
      @OutOfNameIdeas2 11 месяцев назад

      ​@@mrvr6165 they have been doing this since they started making GPUs. They are a very evil company.

    • @OutOfNameIdeas2
      @OutOfNameIdeas2 11 месяцев назад +7

      ​@@arenzricodexd4409 The difference is whether it can't vs whether Nvidia will allow it. Nvidia won't allow it unless there is enough backlash

  • @the.wanginator
    @the.wanginator 11 месяцев назад +25

    I've been playing around with Fluid Motion Frames in other titles, and I'm impressed. The key is to configure the game so the microstutter gauge on the built-in OSD is at 0%.
    With this tech preview driver, I've achieved the best results when:
    - Leaving VRR on
    - Turning VSync off
    - Setting an in-game frame cap at the peak of your monitor's refresh rate
    My results have varied from game to game. Some titles don't care for it, while others love it. As long as my native FPS is above 90 or so, I'm good.
    P.S. I've even been able to get good results when enabling ray tracing, maxing out graphics options and/or upping the in-game render resolution from 100% (1440p) to as high as 150% (or 4K).

  • @gustavoalmeida1579
    @gustavoalmeida1579 11 месяцев назад +3

    If you can't reach the full refresh rate of your monitor with FSR3, you can always lower the refresh rate so it syncs, for example lowering a 160Hz monitor to 120Hz.

    • @mirage8753
      @mirage8753 11 месяцев назад

      Yes, and it seems that if you can reach 60-70fps at native or upscaled resolution, then 120fps is pretty much possible, so vsync won't be a problem. So the main intention of frame generation is to reach high fps on 100Hz+ monitors. Upscaling is for getting up to 60fps at high resolutions, while frame generation is for high refresh rates. So now you can have 100+fps at 4K on max settings in games with upscaling and FG without any noticeable quality loss.

  • @raresmacovei8382
    @raresmacovei8382 11 месяцев назад +31

    FSR3 with a capped fps, while hitting that fps 100% of the time, looks absolutely flawless, and input lag is great too.
    In my perception, 60 fps with FG is fine on a controller, while 80 fps with FG is fine with a mouse.

    • @AKSBSU
      @AKSBSU 11 месяцев назад +10

      I've just messed around with it in Forspoken, which was thrown in with a graphics card, and the latency was far less of a problem than I expected. I thought it would feel pretty sluggish, but it looked good and was still decently responsive. The game still sucks, but it looks better while sucking.

  • @TheIndulgers
    @TheIndulgers 11 месяцев назад +3

    22:39 This is not a fair comparison. The 4090 is a faster card at a SIGNIFICANTLY higher price, so it's obvious it will have lower input latency because of its higher base performance.
    Blaming AMD for not incorporating Reflex into FSR 3 is ridiculous.
    Yes, FSR 3 has some issues, notably the vsync issue. But the fact that the IQ is generally superior with fewer UI issues than DLSS3, the latency penalty is minimal, the output fps is higher, and it's available on more cards is a huge win.

  • @C0BEX
    @C0BEX 11 месяцев назад +1

    Honestly, when looking at the direct comparisons, to me FSR3 had better image quality for everything in the background. Foliage, textures, etc. seemed sharper. Also, in the comparison between native vs FSR3 at 18:26, why is the FSR image so much brighter? Is it a time-of-day issue in the game? Another question: you said FSR is good for maxing out your monitor refresh rate, but is it also OK to go above the refresh rate?

  • @TrevorM1992
    @TrevorM1992 11 месяцев назад +4

    I have owned RTX and Radeon cards and have never once turned on DLSS, FSR or any of these features. Call me old-fashioned, but pure rasterization at native resolution is the way to go for me.

    • @cocobos
      @cocobos 11 месяцев назад

      If you keep upgrading your high-end GPU, then it's fine with native.

  • @rENEGADE666JEDI
    @rENEGADE666JEDI 11 месяцев назад +7

    To sum up my situation: Radeon 7900 XTX Nitro+, Forspoken 4K ultra RT, I have around 45-60+ fps native. I set vsync on on the monitor, and thanks to FSR3 I have 90-120 capped FPS. I don't feel lag. The quality is awesome. I don't care about the rest. I'm keeping my fingers crossed that they add this to as many games as possible, because in my case it works great.

    • @cts006
      @cts006 11 месяцев назад

      Which FSR quality setting is that? FSR AA?

  • @mkcristobal
    @mkcristobal 11 месяцев назад +19

    This actually looks very promising if they get it to work properly with VRR and Vsync off, and I don't see why they couldn't do that 👍🏻

    • @jeffreypaul9428
      @jeffreypaul9428 11 месяцев назад +2

      Sounds like they got the frame gen part right, they have to make it work in all cases now.

  • @HunterTracks
    @HunterTracks 11 месяцев назад +1

    I swear, saying that using frame generation at 30 FPS to bring you to 60 FPS isn't worth it due to motion artifacts is like saying that cooking your food isn't worth it because you get fewer nutrients. Playing at 30 FPS is likely to have a much more noticeable and negative impact on your gaming experience than frame generation artifacts could ever hope to.

    • @Mr.NiceUK
      @Mr.NiceUK 11 месяцев назад

      It also isn't worth it because latency is the biggest issue at 30fps, and frame gen will only make the latency worse.

    • @HunterTracks
      @HunterTracks 11 месяцев назад

      ​@@Mr.NiceUK Latency is still very much serviceable in my experience, and there are many games that don't require low latency (for instance, Baldur's Gate 3).
      Either way, it would've at least been a good point to include.

  • @tisjester
    @tisjester 11 месяцев назад

    I just have to cut in (watching from the start of the video). Your talking head videos are so crisp and clear. It is a pleasure to watch your videos. (and your Audio is always on point) Now back to the super informative video I know is coming 😄

  • @K1ngDhruv
    @K1ngDhruv 11 месяцев назад +15

    Considering my wallet, I am completely satisfied with the FSR3 technology, given it's free

  • @RTXonerix
    @RTXonerix 11 месяцев назад +14

    What about using FPS limit? If a game runs at 120+fps, I would just set an fps limiter to 120, wouldn't that fix the uneven frame generation?

    • @optimalsettings
      @optimalsettings 11 месяцев назад +13

      you also can set vsync global in the driver

    • @tonymorris4335
      @tonymorris4335 11 месяцев назад +4

      Did you miss where he did that and it didn't help at like 7 minutes?

    • @nevcairiel
      @nevcairiel 11 месяцев назад +3

      He tried setting a FPS limit in the video, it did not work.

    • @optimalsettings
      @optimalsettings 11 месяцев назад

      @@nevcairiel ah ok. sorry. thought it would be like nvidia.

    • @cerebelul
      @cerebelul 11 месяцев назад +3

      It works if you set it with Riva Tuner.

  • @jameslewis2635
    @jameslewis2635 11 месяцев назад +5

    I expect that the implementation of this technology is a bit lacking right now (remember how bad the first few titles with FSR and DLSS performed) but that should improve some, at least to become a bit more consistent.

  • @AncientGameplays
    @AncientGameplays 11 месяцев назад +1

    Let's not forget that although DLSS is MUCH better generally, FSR has way less ghosting

  • @frostilver
    @frostilver 11 месяцев назад

    4:47 Have you mislabeled these? Because the right looks like it has optical flow enabled.
    Edit: Oh I see. The framerate tells. Though FSR3 looked quite bad at the higher framerate.

  • @katsuenyagaming
    @katsuenyagaming 11 месяцев назад +4

    It's maybe because of compression, but I fail to notice any quality difference in the video provided.

    • @igorthelight
      @igorthelight 11 месяцев назад

      Are you watching it on a big monitor or on 6.5" smartphone screen? ;-)

  • @GewelReal
    @GewelReal 11 месяцев назад +5

    Not even close but glad we got competition!

    • @DonaldHendleyBklynvexRecords
      @DonaldHendleyBklynvexRecords 11 месяцев назад +1

      In practice it's very close actually; these guys are what's breaking FSR3

    • @GewelReal
      @GewelReal 11 месяцев назад

      No VRR is a deal-breaker. But AMD really looks like they want to make a good feature set, so I hope we can get the improvements soon

    • @DonaldHendleyBklynvexRecords
      @DonaldHendleyBklynvexRecords 11 месяцев назад

      @@GewelReal VRR works, just under certain settings... plus these are beta drivers; how people draw conclusions off beta anything is beyond me

  • @Jens-sl5je
    @Jens-sl5je 11 месяцев назад +11

    Even if it's not on the same level as DLSS/FG, it's for all cards and it's new. NV FG was also not perfect at the start...

    • @tonymorris4335
      @tonymorris4335 11 месяцев назад +2

      You're not wrong, but it's been almost a year since they announced it was coming out... That's more than long enough that it shouldn't still be broken and fucked on release. If not to get the tech refined and working well, wtf has the delay been for?

  • @dastard12
    @dastard12 11 месяцев назад +2

    Assuming everyone has VRR is a strange assumption; the least-upgraded component is usually the monitor. I've had two 165Hz 1440p monitors for 7 years and don't really plan on upgrading until they break.

  • @sheikhtashdeedahmed
    @sheikhtashdeedahmed 11 месяцев назад +1

    Tim, I believe you've made a mistake @2:55.
    FSR frame gen does not require FSR 3 upscaling. It only needs FSR 3 tech to be enabled, at least with Native AA mode enabled.
    The way you've said it initially makes it appear that the upscaling is needed. Native AA, of course, is not upscaling.

  • @shanent5793
    @shanent5793 11 месяцев назад +7

    Stick to the facts, AMD isn't locking DLSS out of FSR frame generation. It's Nvidia's own policies that make it unavailable

    • @shanthoshravi5073
      @shanthoshravi5073 10 месяцев назад

      doesn't mean they shouldn't allow frame generation without any upscaling

  • @DeadLead
    @DeadLead 11 месяцев назад +3

    You can create custom resolutions with lower refresh rates like 80Hz, 70Hz, etc. and benefit from frame generation with no judder. You will be happy with the result. It is obviously not a finished feature, yet it's a nice tool that will extend the life of old GPUs with no brand separation, and it should be appreciated.

  • @user-gt6mi7is7l
    @user-gt6mi7is7l 11 месяцев назад +3

    Actually, I see no difference between FSR3 FG and DLSS3.5 FG frame tearing in the video. Only the words from this guy set them apart. Does he have better eyes than me?

  • @Shamsteak
    @Shamsteak 11 месяцев назад +2

    I still think FSR 3 is going to be amazing for those of us who still aim for 1080p 60fps on non-VRR setups. Unfortunately, PC gaming is in an odd state with poorly optimized software and overpriced high-end hardware, and this could help low/mid-range setups stay playable at reasonable fps.

  • @92F0XBody
    @92F0XBody 11 months ago

    HU mentioned Daniel Owen! He definitely looks up to your channel so it’s cool to see you give him some shine ☺️
    Been waiting for Tim’s review! Thanks 🍻

  • @danhowell3574
    @danhowell3574 11 months ago +4

    This all reminds me of the old Crossfire and SLI issues.

  • @picolete
    @picolete 11 months ago +3

    In my personal opinion, DLSS 3 looks blurrier than FSR 3, at least in these examples; it looks like it's using some kind of TAA.

    • @darreno1450
      @darreno1450 11 months ago +1

      I'm glad I'm not the only one that noticed that. I hear one thing and see another.

  • @KeepAnOpenMind
    @KeepAnOpenMind 11 months ago +24

    I think you should be able to find only the generated frames (instead of guessing where they are) by using RenderDoc or NVIDIA Nsight. You should be able to see the pipelines, and you'll be able to see the DLSS passes for sure; not sure about FSR 3 FG.

  • @jonbeckham6463
    @jonbeckham6463 11 months ago +1

    What is it like to see the difference in any of the examples? I'm impressed y'all see anything different. The world must be different for those with better vision than mine.

  • @GoufinAround_
    @GoufinAround_ 11 months ago +2

    My whole problem with these technologies is that I got into PC gaming precisely so I wouldn't need upscaling technology to get better performance. I think this is why I'm essentially becoming a mainly console gamer again: PC gaming is so expensive if you want that intended higher performance without upscaling tech.

    • @mosesdavid5536
      @mosesdavid5536 11 months ago

      A 6700 XT is $310, an RTX 3070 is $350.
      Both are stronger than consoles... you don't need to upscale with them.

    • @wildeyshere_paulkersey853
      @wildeyshere_paulkersey853 10 months ago

      @@mosesdavid5536 Do you get the SSD, processor, motherboard, power supply, etc. all for free with the GPU?

    • @mosesdavid5536
      @mosesdavid5536 10 months ago

      @@wildeyshere_paulkersey853 What kind of stupid question is this? Are you expecting to build a rig for the same price as a console selling at a loss? Never mind that the 6700 XT and 3070 are 20-35% faster than a PS5, not including RT.

    • @wildeyshere_paulkersey853
      @wildeyshere_paulkersey853 10 months ago

      @@mosesdavid5536 Them selling at a loss isn't my problem.

  • @DennisRamberg
    @DennisRamberg 11 months ago +41

    From what I've heard from AMD, it seems that FSR 3 frame generation is deeply intertwined with the FSR upscaling algorithm itself (it even made a lot of sense to develop it that way). This is why you can't turn FSR off: "Native AA" still uses the FSR pass due to how interconnected these technologies are, and a lot of the benefits would be lost if frame generation were decoupled from the upscaling pass.
    I think the main scenario for FSR 3 at the moment is letting 60fps games make use of 120Hz 4K TVs with a VSYNC lock, which could prove beneficial on consoles. You don't get any latency reduction from the tech, but at least games can utilize that 120Hz and look smoother than if running at 60Hz.

    • @richardhunter9779
      @richardhunter9779 11 months ago +10

      Like all "smart" upscaling technologies, FSR 2 is based on re-projection; therefore, it makes TAA a redundant pass, as that too is based on re-projection.

    • @cocobos
      @cocobos 11 months ago +3

      They (AMD) already said during the reveal conference that FG uses the frames from FSR. They can't just use the frames from DLSS. 😅

    • @emilioestevezz
      @emilioestevezz 11 months ago +1

      @@cocobos Can't, or won't?

    • @NXDf7
      @NXDf7 11 months ago +8

      @@emilioestevezz both

    • @zaidlacksalastname4905
      @zaidlacksalastname4905 11 months ago

      Motion smoothing in 2023. Nature is healing

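As a footnote to the thread above: the crudest possible "generated frame" is a plain 50/50 blend of two rendered frames. Real FSR 3 and DLSS 3 frame generation instead warp pixels along motion vectors and optical flow, which is exactly why a naive blend like this toy sketch ghosts every moving object (illustrative only, not either vendor's algorithm):

```python
import numpy as np

def blend_midpoint(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Naive interpolated frame: a per-pixel average of two frames.
    Anything that moved shows up twice at half brightness (ghosting);
    real frame generation avoids this by warping along motion vectors."""
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return mid.astype(np.uint8)

a = np.zeros((2, 2), dtype=np.uint8)      # dark frame
b = np.full((2, 2), 200, dtype=np.uint8)  # bright frame
print(blend_midpoint(a, b))               # every pixel = 100
```

The artifacts discussed in the video come from where that motion-vector warping breaks down (disocclusions, UI elements, fast particles), not from simple blending like this.
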
  • @OnlyAnOpinion20
    @OnlyAnOpinion20 11 months ago +5

    If this works on the Xbox and PS5, then console players might finally get to play at 4k 120fps after all.

    • @WayStedYou
      @WayStedYou 11 months ago

      Yeah, all it will take is running the game at 1080p internally using FSR while not actually being 120fps.

    • @Netsuko
      @Netsuko 11 months ago +2

      Sadly I doubt it. At least not with this first iteration. It seems like they have a long way to go.

    • @OnlyAnOpinion20
      @OnlyAnOpinion20 11 months ago

      @@Netsuko yeah completely agree, it's definitely a work in progress

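On the "1080p internally" point above: AMD documents fixed per-axis scale factors for FSR's quality modes (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), so 4K output in Performance mode really does render at 1080p internally. A quick sketch (the helper name is invented for illustration):

```python
# Per-axis scale factors AMD documents for FSR 2/3 quality modes.
SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

# 4K output in Performance mode renders internally at 1080p.
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So a "4K 120fps" console mode built on FSR Performance plus frame generation would be rendering 1080p frames and interpolating half the output, which is the caveat being raised in the reply above.
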
  • @Columbus666
    @Columbus666 11 months ago +5

    AMD worked hard to create an open technology for ALL players and developers, and just for that, they win my respect. Like FreeSync vs G-Sync.
    Now it's interesting to know the difference between these two technologies; I'm not very surprised by the result.
    The difference is that FSR 3 will be implemented in all games very quickly and for all graphics cards, compared to DLSS 3 and the 0.1% of gamers who bought an RTX 4000 card.
    I'm not too thrilled either that these two companies are rushing into upscaling technology and that developers see this as an opportunity to stop optimizing their games. It's yet another way of making our hardware obsolete more quickly.

  • @Santoroz
    @Santoroz 11 months ago +1

    FSR works great with my 1080 Strix in Cyberpunk 2077. Very impressive that it works with older Nvidia GPUs. G-Sync (no VSYNC).

  • @Macho_Man_Randy_Savage
    @Macho_Man_Randy_Savage 11 months ago +2

    Decided to try out Forspoken just as the new patch dropped and can't complain, as I get a locked and smooth 120fps on my C2. There's some smearing in the corners at times and stutters when it loads a new section. The included DLSS does look better, but it only gives me a stuttery 70-80fps, so I'll gladly use FSR 3 for this title.

  • @a9udn9u-vanced
    @a9udn9u-vanced 11 months ago +3

    I'm not going to be able to spot any of these differences, and FSR 3 is more accessible, so it's a clear win for me.

  • @scifitherapy
    @scifitherapy 11 months ago +10

    Shoutout to Daniel Owen! He is a full time teacher yet has managed to cram in many FSR3 videos since it launched. Worth checking out!

    • @gameurai5701
      @gameurai5701 11 months ago +1

      Really likeable guy; I discovered his channel half a year ago. I think he strikes a great balance by presenting his videos in a casual yet somewhat detailed way, which is not easy to achieve.

  • @paul1979uk2000
    @paul1979uk2000 11 months ago +10

    It's not really about whether FSR 3 destroys DLSS 3; it's more that AMD keeps undercutting the exclusive features Nvidia comes up with to lock us into their ecosystem.
    In a way, what AMD is doing reduces the need for DLSS by being more open. After all, most developers would rather support one technology than two or three, and all that's needed is for the tech to be good enough and open enough that it works on much more hardware, as well as consoles. I'm not sure if this frame gen will work on the current consoles, but it should, looking at how well it works on older-gen GPUs on PC.
    DLSS is good tech, but it's always going to be limited in its support by being vendor-locked. This kind of tech needs to be open enough that it works on any GPU of the last few generations, and on consoles, for it to really take off. FSR delivers that; XeSS could, but it favours Intel hardware over rivals'.
    DLSS is the better tech for now, but the gap has closed so much that to most gamers and developers it likely doesn't matter; good enough is what most of the mainstream will care about, and FSR is in that ballpark.

    • @mosesdavid5536
      @mosesdavid5536 11 months ago

      Which amd card do you own?

    • @Born_Stellar
      @Born_Stellar 11 months ago +2

      Nvidia said G-Sync needed extra hardware to support it as well, and now we've all but moved on to AMD's implementation in almost every product but a few extra-expensive ones. I hope FSR goes the same way; at this point it does kind of prove you don't need AI cores to do it.

    • @TheKazragore
      @TheKazragore 11 months ago

      Don't forget that the consoles use AMD tech, so there is an incentive for devs to use FSR in their games from the get-go.

    • @markhackett2302
      @markhackett2302 11 months ago

      @@TheKazragore Unless NVidia manage to get someone to scream "AMD are Blocking DLSS!!!!" again.

  • @bleeb1347
    @bleeb1347 11 months ago +2

    9:45 - Saying that most gamers are using VRR today is like saying "why doesn't everyone just buy an RTX 4090?"
    Lol.
    No man. Most gamers are not running VRR. In fact, 90% of PC gamers are running cheap monitors under $100 with 60-75Hz refresh rates that don't even support FreeSync. Get. Real.

  • @calvinminer4365
    @calvinminer4365 11 months ago +2

    I would love to see frame generation for video. Some movies and most anime would benefit greatly. There are videos of Akira with a 60fps interpolated render, and it looks fantastic. Having that work on the fly for any video would be huge.

    • @bbbbbbb51
      @bbbbbbb51 11 months ago

      Disgusting. Cinematic fps is a standard for a reason. Fake frames on movies look awful.

    • @videogaminbiker889
      @videogaminbiker889 11 months ago +1

      I use SVP 4 to achieve exactly this. I can live-interpolate up to 4K 120fps, limited only by my hardware. It works with YouTube as well. I love it; great program.

  • @Aremisalive
    @Aremisalive 11 months ago +10

    Not the conclusion I was expecting. FSR frame Gen works brilliantly for me. VRR works on my 6950XT.

  • @WereCatf
    @WereCatf 11 months ago +7

    Will there be an updated look if/when AMD fixes the issues? Also, just out of curiosity, has there been any word on Intel implementing frame generation as well for XeSS?

    • @eliadbu
      @eliadbu 11 months ago +1

      HUB usually follows up when there are new variants or improvements to this kind of tech, so there's no reason to think they won't with FG. As for Intel, I've heard nothing official or leaked; maybe they'll introduce something new when Battlemage releases.

  • @drummers2945
    @drummers2945 11 months ago +1

    Yeah, but you missed what it's trying to give the end user (whether they use it or not). Between team green and team red, team green might be all that, but at a rather significant cost. I think further down the track FSR will work great.

  • @nsuinteger-au
    @nsuinteger-au 11 months ago

    Did you do any recent work on the studio lighting, or change the recording camera? In this video the studio, Tim, and the background look stunning.

  • @chronox837
    @chronox837 11 months ago +5

    Jesus Christ, look at all the people who have something valuable to say about a 31-minute video that was uploaded not even 10 minutes ago.

    • @Ben-Rogue
      @Ben-Rogue 11 months ago +1

      Because you don't have to watch the entire video before commenting on something in the first 10 minutes of the video...

    • @dangerous8333
      @dangerous8333 11 months ago

      Stay in school.