I'd love to see an analysis of gameplay footage deleting half the frames and using only the odd synthetic frames, so are how the artifacts look on their own in real time.
@@yellowflash511 I'm sure there's truth to that when it comes to certain people, but that doesn't mean it's not true. It's very important for competition to continue. There can be no stagnation, or there will be complacency, and that means higher prices for worse products.
It's going to take much more than being better than Nvidia in every way to break Nvidia's monopoly. AMD has had the better offerings for the majority of their time competing with Nvidia, it has never affected market share. Just like Google with their illegal search monopoly, or Microsoft with their illegal x86/64 operating system monopoly. Hell, AMD had to sue Intel for over 4 billion dollars just to break into their illegal CPU monopoly.
Don't know what Tim is talking about, but from footage he represented i see better sharpness and fine detail on FSR3 side compared to blurry image on DLSS3 side. So what kind of DLSS superiority he is talking about? FSR3 works even on GTX 10 series cards and does as good job as DLSS3 Which only works on 40 series.
I seldom noticed any “superiority” on the DLSS side either. In some places I noticed stutter on the DLSS side while it was perfectly smooth on FSR. Not sure if that an artifact of capturing or something else.
FG is useful to turn a monitor that has to have a slower overdrive setting for some games, IF that FG rate puts it into the faster overdrive setting. MOST monitors have two or three overdrive settings required, it is a high end feature if it has a single overdrive setting. Don't use FG for the game, use FG for the monitor.
They got the optical flow thingy working on compute, and reasonably well too. That’s the main point, the rest is just teething problems for new tech. This was supposed to be impossible without hardware support, as Nvidia put it. I’m impressed
They never said it was impossible - they said it wouldn't look very good, and a lot of the problems here actually point to such. Remember that NV is still positioning DLSS as one of the crowns of RTX (which it absolutely is) so putting out a lower quality for specific products in the stack will do nothing but damage DLSS' reputation for consistency among all supporting hardware. Can they do it? Sure, given enough time and incentive. In the meantime, they're currently content letting FSR take the second fiddle since it runs on their own hardware too anyway.
@@arthurbonds7200 No, hey said it was impossible. "Oh, without the new flow tensor instructions you wouldn't get good frame rates!!!!", but we got good frame rates without the use of the optical flow tensor cores.
@@arthurbonds7200But it does look good, that's the thing. The interpolated frames themselves don't look any worse than DLSS 3 FG. The issues with frame pacing are completely separate and quite solvable.
@@HunterTracks Are you one of these todds that spouts bs like "fsr and dlss look the same" lol. Friendly reminder that fsr3 hard locks you into fsr 2 upscaling. I don't care what fps you are getting if you are using fsr 2 over dlss 2 on any rtx card just so you can use frame gen then you are insane for how much visual quality you are using.
I recall Nvidia frame gen also having teething issues when it first came out. Ill give AMD its chance to get its ducks in a row with the technology. If nothing else I very much appreciate they included it on nearly all cards on the market not just their latest generation.
i think the cheat here is that on DLSS 3 announcement was that DLSS 3 WAS Frame Gen unified..... now DLSS3.5 is announced as Frame Gen excluded, but they didnt said that and everyone is confused, and they are claiming to have all the DLSS for all cards, a big scam.
the difference is AMD just spent the last year and some delays getting its ducks in a row... why do people keep making the excuse for AMD that nvidia wasnt perfect on launch so why should AMD be perfect when the difference is nvidia lead the way with the tech and AMD takes its sweet ass time, a year in this case no less and delays in order to still drop it with problems. If this doesn't tell people AMD are extremely far behind i dont know what else will... its like AMD panic when Nvidia introduce a new feature and flap around like ducks for a year to try and imitate it always coming off worse.
@@jinx20001Nvidia developed the technology for their servers, then adapted it for gaming and the results is much weaker cards per dollar for the average person. AMD also takes their time cause they make it open source.
Looking back in time, DLSS 3 didn't have a smooth launch either. I vaguely remember visual artifacts especially with UI and some other issues. FSR3 is more interesting simply for the fact that it's not hardware locked, hell it'll even run on igpus
AMD buried: PhysX, G-Sync (remember your old god? "G-Sync compatible" its just Freesync), Nvidia Hairworks, "DressFX", SLI, DLSS 2 (waaay more games on FSR), and RTX (20% usage) is half dead, and DLSS 3 just runs in 5 games and shares the same destiny, while NVIDIA buyers forcedare forced to pay the RTX-tax of double the price, and most lineups discontinued (GTX). I only see win after win here, nvidia tech superior? Man you have a mountain of taxed unupdated dead techs around you, enjoy. And RX 6600 and RX 6800 cant be beaten on their league.
@@Jackson-bh1jw Yeah, Jackson I have to agree, I'm a little surprised by the dreary tone of this video. The current problems (reportedly resolved in the beta drivers) seem minor and very surmountable. And given the broad support FSR3 is going to enjoy versus the tiny subset of cards compatible with DLSS 3, this seems like a feature to celebrate.
That is true. But unfortunately, first to market matters because by the time the competitor comes out with a competing technology, the incumbent has already stabilized theirs or have gone to a second better or more enhanced iteration. The market doesn't appreciate second place, or give any sympathy for the underdog. They'll just go for the one that works better.
@@Jackson-bh1jwa lot of games use physX still as an example in September payday 3, lies of P, and Mortal Kombat 1 all released and use physX. Games just got rid of the option to turn it off. Besides that most of what you said is wrong. Like your false claim that FSR supports more games when in reality DLSS does. Dlss 3 doesn’t only support 5 games it’s supports 46+. G-sync still exist and gets updates, it also offers more features like ULMB2 and the ability to use adaptive sync at all framerates. Hairworks is still being worked on and improved a few months ago they were able to simulate 86,000 strands of hair at 17 ms per frame.
"the will fix it after release" ideology is one of the most delusional fanboy defence it will never be as good as dlss3, same way dlss is much better than fsr2, nvidia might be a shady company, but clearly they didnt lie when they said the new hardware in 4xx series is designed to do it right
Given that AMD has nailed the quality and overhead aspects of frame generation, I'm optimistic that fixing the anti-lag and frame pacing issues won't be too difficult. They clearly released a beta version of the feature and called it a full release. But once those issues are sorted out it will be great. I think FSR will always be behind in terms of upscaling image quality but as long as they are close enough I think that's a big win
People don't give NVIDIA enough stick for the fact their tech is locked behind their own hardware. I'll take FSR and its open source nature, over DLSS for this reason anyday. The difference in real time use is miniscule...I have pc's with both vendors hardware.
@@totalprocall6894they're not doing it out of the kindness in their heart. They know they have no chance of developers implementing it if it was locked to their hardware as Nvidia basically dominates the market. They have tried before with better tech and higher market share and failed.
@@grantusthighs9417"AMD hardware" encompasses the entire console market (except Switch, but that's its own little bubble). Even without that, it's a few days of dev work for 10% more potential reach, that's more than enough for most big studios. Either way, it's quite obvious that AMD did it for their own benefit, but a win for consumers is a win for consumers either way. It's still something to celebrate, no matter who does it.
I have a 6900XT and can't wait for FSR 3 FG to be in Witcher 3. I get like 60-70 FPS native with max details on a UWQHD monitor that has 144hz refresh rate. At 120-140 fps it would be just there. I would like to keep the option for no upscaling tho.
Yeah I tried it on a 3090 at first looking for the artifacts it looked bad, it lessens the Effect at Native Res. but just playing through it felt nice. The game itself looked like ass sometimes, bad shadows (what is RT shadows for?), the MC model, effects are a bit low res idk if its an artistic choice but it feels like they are good but look bad sometimes, I like the aggressiveness of the enemies and the gameplay tho.
That may be what FG is best for, either DLSS or FSR. Turn a game that requires VRR because on occasion it gets 60-80 frames natively but you can lock it to Vsync because that is getting you above the rate of the monitor almost all the time.
It's so cute watching all the amd babies come out of the woodworks after a year of bashing frame gen because they can't use it all of a sudden falling in love with the technology now they finally have access to their own version albeit worse in every way than dlss frame gen lol. At least I won't have to keep reading delusional anti frame gen comments on this channel now the peasants can use it too. I thank amd for that.
The reason upscaling works the best at maximum refresh rates is simple: the motion vectors velocity is less the higher the number of frames is. The frames are predicted based on the velocity and the greater it is, the harder it is to predict (correctly).
I won't lie, I am legit impressed with FSR3/FG. While certainly not as good as DLSS3/FG, I've seen enough that I can say Radeon owners aren't getting a drastically inferior experience. FSR3 could definitely use a few more tweaks and will likely get better with time, but those multi-$100 premiums for nvidia GPUs start to look less worth paying for. I don't disagree that it's not ready for primetime, but as a preview, I like what I see. Well done AMD. The tech certainly isn't as good as DLSS3/FG, but that makes sense given how much of a premium you have to pay to take advantage of it.
People won't be getting a much worse experience from fsr 3 once the issues are fixed. They are going a much worse experience from being locked to fsr 2 upscaling in order to use fsr 3. If they found a way to let 20 and 30 series owners still use dlss 2 with fsr frame gen they probably would of made tens of thousands of new amd fans. As it is I would recommend any 20 or 30 series users to ever use fsr 3 because it means losing the quality of dlss and that is absolutely never worth it.
Not just AMD users, also the majority of nvidia users. TONS of nvidia buyers of 2000 and 3000 with the promise of a better feature set got left in the dust at the first opportunity with DLSS3
I wouldn't be too sure about that. There are too many NVidia stans out there. Remember the big thing a month or two ago when they were all getting mad about games not having DLSS while FSR is just fine? All the consoles currently use FSR upscaling and no one there seems to care or think about it. Sadly everyone will just get mad that Nvidia still won't play nice while still buying their GPUs. I'd still rather just play my games at native. I'm starting to think I'll just never try to move to 4k and stay at 1080p for the rest of my days.
Hot take: requiring a high base framerate to work well is a good thing, it discourages devs using fsr3 to make up for poor framerates, it only enhances the experience on an already good enough framerate
Fair but not entirely true. Seeming devs are already using upscaling as a replacements for optimisation the thought process will more likely be "what's the lowest frame rate we can get to with upscaling that is acceptable to boost up with frame gen to be playable" now. If 2023 has taught us anything it is that game developers are the laziest people on earth who will do everything in their power to not spend a single second working on making their games playable. On a related but not related note I also don't agree with his take that you need at least 100 fps for frame gen to be worth it. For me on dlss frame gen anything over about 75 fps with frame gen on so around 45 native fps was enough to make the game feel perfect. Playing cyberpunk with 80 fps fg on path tracing currently and the game feels worlds better than playing at around 50-60 fps with fg off and lower settings. I think a lot of people are using bad monitors which is giving them a skewed view on how much you need for it to look and feel smooth.
I understand what you mean, but I can't see why you would enable this feature if your fps was already acceptable. Just a lower quality image and worse input latency. I would consider using it if fps was below 60 to hopefully bring the smoothness up to par with 120hz refresh rate.. just my opinion though.
@@PetrisonRocha this is why Nvidia locking frame gen to RTX 40 series is particularly annoying. A 4060 has enough performance where you wouldn't likely be in a situation where above 60 fps is unachievable.. so kinda meh
THIS. Hopefully we'll only see it AT MOST in quasi-120Hz/fps PS5/XSX game modes. Which in turn, may mean they'll put some actual effort into integrating it into the render pipeline and result in a great implementation on PC as well.
You and me both brother. Sometimes I feel like many of these YTers become hyper critical about most everything in order to generate interest. Beyond looking decent, my focus is gameplay, gameplay, gameplay. Graphics are really secondary. And hey, most people do not play on ultra settings anyway because they are not playing on the latest greatest hardware. Sometimes, I feel like a motherless child. :D
@@halrichard1969 It's their job so they're naturally going to be more focussed on these things than the average person. And I think in general they're trying to hold AMD to a higher standard, which is a good thing. The more AMD feels like they need to achieve feature parity with Nvidia, the better it is for consumers. It's debatable whether they put too much emphasis on how it compares to Nvidia features, given that the major use case is for those who can't run those Nvidia features (AMD cards, older Nvidia cards), but in general I think it's a good thing.
conclusion: when buying a new gpu, don't rely on the frame generation, buy the best one you are willing to afford that renders the highest fps natively
That's why i got a 3060 laptop instead of a 4050 laptop. Despite CPU being more powerful in the 4050 laptop, the 3060 is stronger at native 1080p and 1440p than the 4050. Considering I don't play games with upscaling options, this is the best choice to do.
Are the background clips the wrong way round at 16:16 to 17:12? The left clip is straight up higher quality, and the commentary is stating that DLSS looks better? Pause at 16:45 and tell me the right doesnt look like a blurry and lower quality version of the left image.
Nope, they just seem to prefer the DLSS blurry up-scaling over some FSR shimmering. I'm sure if they tuned down the FSR sharpening slightly it might improve the shimmering, at a slight cost to obvious sharpness win it has over DLSS here
This is going to be like G-Sync. AMD will lag behind in engineering a viable alternative that's able to be used by a broader group of hardware, the whole time Nvidia will claim that it's just "not possible" for them to use *Insert newest DLSS here* on certain hardware. Eventually AMD will catch up and Nvidia will have to change their marketing strategy or look like idiots. AMD occasionally manage to do some really interesting things. FSR now, but mantle before was so forward thinking that a lot of the API was supposedly absorbed into Vulkan and DX12.
This has always been the case: 3D Vision PhysX Gsync DLSS Nvidia has always been and always will be about pushing self-serving proprietary There's nothing wrong with that, but when they alienate even their own user-base with older gen cards for the sake of pushing them into upgrading (milking them), that is simply greed.
yes, nvidia sells technologies, while amd sells graphic cards. look at 4060/ti, they are useless without frame generation. lower power consumption barely worth the upgrade
@@mirage8753 That's because in reality they are actually 4050/4050ti, but Ngreedia label them higher just so that can sell lower tier cards at higher pricing.
Did you mistakenly switch the FSR3 and DLSS videos? I found it rather confusing that just about every videoclip labeled FSR3 had obviously more detailed and sharper textures, than the DLSS-labeled ones. While the commentary was saying the exact opposite. Even through youtube-codec the DLSS textures looked obviously lower resolution and way more blurry.
Never watched him but the behavior of a part of his audience spamming every place (it is NOT restricted to HUB) that comments something about another youtuber work pretty much guarantee that I'll never watch him. He might be a really good youtuber but I am tired of seeing this kind cultish of spam everywhere.
@@sergioaraujo7943maybe you might want to take your issues up with HUB, seen as Tim mentions Daniel’s name in the video you are commenting on or do you not want to listen to the video from HUB either
@@sergioaraujo7943 I think he's a math teacher or something and makes the videos solo from his house. He seems like a decent dude. I've watched is videos here and there for quite a while.
One thing that I don't see people talking about enough is how EXPENSIVE the DLSS FG pass is. Doing the math, it costs 5ms at 4K on a RTX 4090, while FSR3 only costs 2.4ms. It's LESS THAN HALF, and it makes a difference in the frame rates, especially at high refresh rates! If AMD can do it, Nvidia should optimize the DLSS3 cost too.
Thanks for the review! I think it would have been helpful for you, the review and the viewers to have taken a deeper dive into the technology. The AMD and GPUOpen Project have some deep dives into how FSR2.2 works. For example which TAA implementation (game native or FSR) is chosen is up to the developers iirc. They CAn choose the FSR-TAA, but they can also hook in their own. Shimmering of fine details in FSR may be reduced by a better implementation by the developers (like you suggested, letting IoA choose *their* TAA should be possible). The same goes for UI elements rendered at lower fps than the interpolated frame, this surely is down to the devs deciding which elements need to go where (interpolated or not) for screen presentation. Anyway, cheers for your content! Your work is really appreciated and I hope I voiced my criticism respectfully and constructively!
Good or bad, even if it's subpar with DLSS 3.5, I just am glad FSR 3 is here as it saved me from my itch to upgrade from RTX3080 to 4080. I say GOOD JOB, AMD!
As touched on in the video, a lot of what we’re being presented with does suggest they were trying to meet some internal deadline, maybe a promise made to shareholders or who knows what. The announcement seemed almost hushed, and with only two relatively less relevant games so far, it’s all very much like they’re still ironing a lot out before a likely more boisterous public announcement I’m sure they are trying to get to sooner rather than later.
Remember first dlls implementation on cyberpunk was trash. All companies are rushing products because the market doesn't allow waiting procedures. Profits are more important than quality delivery nowadays.
@@user-jd3pk1bz8e yeah, even more for AMD now, after all they are running to catch up with NVIDIA in this area, so they want to rush things more than any other company
Look at 17:00.. I got NO idea what Tim is talking about there, because the FSR looks WAAAAY sharper. ofc that might be CAS sharpening, which would also introduce/explain shimmering. So what do you want? Sharpness or Vaseline?
I don't enjoy frame generation at all even on a top tier system but I'm really happy to see AMD improving the competition. Hopefully FSR upscaling can improve more as well.
You guys may want to check the release notes cause it tells you how to fix frame pace on there. Works great for me with both the original and experimental drivers in fact, I can use fluid motion with frame gen.
Fluid motion is the driver setting right? I saw the one guy using a beta driver that enables framegen via the driver however it cuts off when you move around fast. Kinda cool that you can enable framegen via a driver instead of the game which might help in older games like say eu4 ck3 total war etc. Those are more cpu bound though but don't require fast movement like an FPS.
@@inducedapathy1296 you can even combine fluid motion and frame gen in newer games to get super high frames rates works amazing. Fluid motion can even speed up games like LA Noir that had a 24hz cap, though this makes facial animations run a bit too fast.
Good luck with that HWU don't really bother, if they do it'll be a shocker. I've already moved on from this channel, but I was willing to give them a chance. I was screaming this on my phone, but whatever
What does the release notes say to do to fix frame pace? I did not see anything other than turn Vsync on. Was it frame capping to half your monitor refresh rate?
I just want them to fix the VRR support which only works sometimes and not for driver method. AND ALSO I want them to support HDR! Those are the two big ticket items for me to start using FSR3. Perhaps it will be on a per game situation, but until we get official release and not preview driver release, that may not happen. I found the exact same issues as HU here in my short test. Some people however are getting different results and I think it may come down to monitor configuration. 17:00 You have FSR sharpening turned on but DLSS comparison doesn't have sharpening dialed up.
HDR is a visual upgrade surpassing RTX tech simply for the fact that it's not taxing at all performance wise to use and it likewise drastically improves image presentation. So guess I'm part of the "no one" who do care about it
As we can see all over youtube, pixel peeping analysis in slow motion with magnified images is one thing, and the improved overall experience for a lot of gamers with older gpu's, non nvidia gpu's and even last gen nvidia gpu's that supposedly were too weak for frame generation is a totally different pair of socks. These gamers will gladly sacrifice some image quality and latency to play their games more comfortably. Contrary to what many people tends to belive watching tech channels the majority of gamers don't own high refresh rate VRR monitors and high end gpu's so FSR3 is a really good thing despite it's flaws.
I recommend setting a frame cap at 1080P and enabling frame generation on both prospective brands. Playing with FSR 3.0 FG enabled with a cap of 60 (monitor limit) on Forspoken lowered the CPU and GPU load so much it felt like I was playing a UE3 or old UE4 title vs a UE5. The input latency was massively improved.
They probably are trying to find a way to fix the UI issues within the frame generation pipeline rather than just leaving those UI elements at native framerate Cause it's not a perfect solution either, if those ui elements move or are in world space (like way points, health bars etc.) they really appear to stutter compared to the rest of the game.
it's bizarre they didn't do this in the first place. they know it's a bad idea to run post processing before FG, so why not UI elements too? I have to doubt it's all that much effort when UI is overlaid in the first place
If I remember correctly, it was advised in the DLSS integration manual, to render the HUD elements natively, and not upscale/generate those. Either way, if the developer cared, this was what they would do, but here we are.
I have been using the Driver based Fluid Motion Frames in Starfield. I now get 200+ fps at 3440x1440 on my 7900xtx. It looks and feels so much better than Native rendering at Ultra
@@mikem2253 With developers already upscaling to 4K from resolutions as low as 720p in recent titles I don't see them shying away from going the same road here and interpolating from >60 FPS as well. I reckon it's going to be a complete shit show if they'll start implementing FG on consoles. As stated in video, FG is only of real benefit for powerful hardware that's basically already capable of achieving HFR by itself, if not to the same degree as with FG added on top of it.
Great video Tim! Honestly though I have the same opinion as Steve. I simply miss the days when GPUs were just about raw rasterization, and improving performance with each iteration.
This video needed to be clearer that this is a technology preview. The VRR issue has already been resolved in the latest beta drivers. This video is being sold as this being FSR3's final form. (Disclaimer: I have a 4090 and don't currently own any AMD cards. I just believe in this info being represented properly).
Unfortunately, even when AMD releases a newer version of FSR3 with VRR compatibility, anti-lag+ working properly with FG active, etc., many people will still tune into this video when researching FSR3 and come to the conclusion it's crap. What AMD has done here is impressive even in its unfinished, beta form, with teething issues galore. They've proven frame-generation doesn't need dedicated hardware-based optical flow accelerators. Most folks thought AMD was bullshitting when they said 2023 release, and that they were years away from anything, since it is reported to have taken NVidia 3 years to develop DLSS3 FG. Yet here we are with a solid frame generation base upon which to build a compelling feature that works across multiple generations of cards and brands. I don't think that really comes across in this video.
9:48 But why use VRR if frame-tearing doesn't bother you, or is extremely minimal to begin with? Especially when VVR is known to sometimes cause issues with monitor features or game graphics features, performance or otherwise? And there are better ways to cap frame-rate too.
It would be great if you could circle or mark problematic areas in the image, which you are referring to in your script, because I personally find it very difficult to identify what you are exactly criticizing. Anyway, still love your very in depth reviews of frame generation and upscaling technologies!
You may have seen the Switch version of No Man's Sky now has a custom built FSR2 implementation which looks really impressive despite the low resolution. Unfortunately probably took a lot of work and is likely game-specific, which is a real shame as it gives a glimpse of what FSR can do with some optimisation.
It's such a shame because the improvements in No Man's Sky are where it matters the most: Temporal stability. FSR has nailed overall detail down surprisingly well, but I'm often using inhouse TAAU over FSR simply because of how distracting the artifacts are. Such a shame
@@lunarvvolf9606 Again, AMD sells GPUs and CPUs to companies like microsoft and sony for their consoles, for the Steam Deck, and even other companies use their features on hardware that would be unsupported by nvidia, like nintendo uses FSR in some of their games. This all means that companies are more prone to do a contract with AMD, or renew their contracts, to have AMD hardware on their devices. Of course, on the PC market NVidia sells way more, but AMD's strategy is more focused at mobile and console department right now as they have the advantage on that market.
@@lunarvvolf9606I think AMD's decision to make their technology manufacturer agnostic is more of a byproduct of their commitment to backwards-compatibility support for their own hardware. FSR3 isn't hardware dependent which allows it to run on 3 generations of AMD GPUs whereas DLSS3 only runs on the current gen NVIDIA ones. Once AMD has their tech running on their own cards in that way, it's obviously possible to run on NVIDIA cards. Their cross-platform compatibility is probably to avoid backlash from software locking their tech from competitors because that would be kind of a bad look. Besides, *IF* AMD and NVIDIA cards have feature parity, why wouldn't you go AMD? They support their cards longer and they give you more performance for the price. I'm not saying it makes the most business sense, it's a noble goal.
plus, the fact that amd technology is open makes it so it works onm ALL amd hardware as well, unlike nvidia technology, in which for example the nintendo switch cannot make use of DLSS despite running a nvidia gpu because thast gpu is "too old". Nintendo themselves uses FSR, and as more companies use it AMD's brand reputation increases, and outside of the pc builders makert, ths is important. More and more brands are deciding to go with AMD for their APU / mobile solutions, microsoft, sony, and steam are already doing that, and nintendo potentially could on theiir next console release. Remember that AMD has less marketshare overall than nvidia, which means that the only reliable way for them to induce consumers and companies to use their own solutions is by making them avaliale on multiple hardware.
Re. combining DLSS upscaling and FSR framegen (at 18:00), this is probably very difficult due to how they’re integrated into the overall rendering process. Rendering engine are already highly conditional and complex. Trying to have 2 different “end of pipeline” techs working in tandem is very difficult and potentially impossible depending on what access (to generated data) they provide to the calling code. Suggesting it’s due to AMD not wanting to give NVIDIA users a leg up completely ignores the complexity and limitations of the software involved.
You have to remember that these are the same people who without evidence spread the "amd is preventing starfield from having dlss" fud. In that context them demanding that amd implements dlss in fsr is very on brand. It cannot possibly be nvidias job to implement their own proprietary bullshit into an otherwise open source code.
Latency comparisons with DLSS3 should have been done without vsync for the Nvidia tech. It harms latency for no benefit and the comparisons with FSR3 frame gen is misleading.
"Oh no, there is a tiny amount of shimmer at 3x zoom, lets just say the blurry one looks better and hope nobody notices" FSR3 quality did improve by a noticeable amount in immortals compared to FSR2, quality looks same or better than native and native AA mode looks incredible. Can't speak for forspoken.
@@cocobos DLSS was the blurrier one as you correctly pointed out, yet DLSS was declared to be the winner because there was a tiny bit of shimmer when zoomed in 300% on blades of grass. Blurry can be fine, sharp can be fine, it just depends on the game aesthetic.
@@PineyJustice I can quite easily see the shimmer at 1x zoom before Tim zooms in on the pond though. I am watching the video on a 1080p monitor although the video resolution is still set to 4k, maybe that is why I cannot as easily notice the difference in sharpness.
@@mingyi456 Only if you're really looking for it. Also there is zero shimmering in immortals, seems to be a problem with forspoken. Also just after at 16:27 you can clearly see that FSR is sharper and more detailed than dlss.
Enjoyed Rich's take (DF) much more than this one and I enjoy HU content. It's good stuff for the community but it is in it's early stages. It's tough to get this level of bashing so early on. DLSS3 had a poor start also...and using dedicated hw..
As an owner of a 3070, I really wanted AMD to drop a huge win bomb here as I feel cheated by NVIDIA that a 3000 series card cant use all the Nvidia Tech.
You have tensor cores and they can run the NV tensor optical flow-engine that's used under the hood in DLSS3 for framegen.. which you should be allowed to! As a 3060Ti-owner who's been upsampling vids to 60fps with tensor since the 2070-- I'm done with nvidia if they keep this up. Maybe I'll support Intel next round.
@@mrvr6165they want your money, and they'll keep doing shit like locking tech out to buy the newer generations. Rtx was an excuse, dlss3 also is an excuse. What will be the next excuse.
I've been playing around with Fluid Motion Frames on other titles, and I'm impressed. The key is to configure the game where the microstutter guage on the built in OSD is at 0%. With this tech preview driver, I've achieved the best results when: - Leaving VRR on - Turn VSync Off - Setting an ingame frame cap to the peak of your monitors refresh rate My results have varied from game to game. Some titles don't care for it, while others love it. As long as my native FPS is above 90 or so, I'm good. P.S. I've even been able to get good results when enabling ray tracing, maxing out graphics options and/or upping the in-game render resolution from 100% (1440p) to as high as 150% (or 4K).
If you can't reach the full refresh of your monitor with FSR3, you can always lower the refresh rate to have it sync, for example lowering a 160Hz to 120Hz.
yes, and it seems if you can reach 60-70fps with native or upscaled resolution then 120fps is pretty much possible so vsync wont be a problem. so main intention for frame generation is to reach high fps for monitors with 100Hz+. so upscaling is for high fps up to 60 on high resolutions when frame generation is for high refresh rate. so now you can have 100+fps on 4k on max settings in games with upscaling and fg without any noticable quality loss
FSR3 with a capped fps, while hitting that fps 100% of the time, looks absolutely flawless and input lag is great also. In my perception, 60 fps with FG is fine on a controller while 80 fps with FG is fine on mouse.
I've just messed around with it in Forespoken, which was thrown in with a graphics card, and the latency was far less of a problem than I expected. I thought it would feel pretty sluggish, but it looked good and was still decently responsive. The game still sucks, but it looks better while sucking.
22:39 This is not a fair comparison. The 4090 is a faster card at a SIGNIFICANTLY higher price, so it obvious it will have lower input latency because of its higher base performance. Blaming AMD for not incorporating reflex into FSR 3 is ridiculous. Yes, FSR 3 has some issues, notably the vsyn issue. But the fact that the IQ is generally superior with less ui issues than dlss3, latency penalty is minimal , higher output fps, and available on more cards is a huge win.
Honestly when looking at the direct comparisons, to me, FSR3 had better image quality for everything in the background. Foliage, textures etc. seemed sharper. Also the comparison between native vs FSR3 at 18:26, why is the FSR image so much brighter ? Is it a time of day issue in the game ? Another question, you said FSR is good for maxing out your monitor refresh, but is it also OK to go above the refresh ?
I have owned rtx and radeon cards and have never once turned on DLSS, FSR or any of these features. Call me old fashioned but pure rasterization and native resolution is the way to go for me.
To sum up in my situation Radeon 7900xtx nitro + Forspoken 4k ultra RT, I have around 45-60+ fps native. I set vsync on in monitor, thanks to FSR3 I have 90-120 cap FPS. I don't feel lag. The quality is awesome. I don't care about the rest. I'm keeping my fingers crossed that they add this to as many games as possible because in my case it works great.
I swear, saying that using frame generation at 30 FPS to bring you to 60 FPS isn't worth it due to motion artifacts is like saying that cooking your food isn't worth it because you get less nutrients. Playing at 30 FPS is likely to have a much more noticeable and negative impact on your gaming experience than frame generation artifacts could ever hope to.
@@Mr.NiceUKLatency is still very much serviceable in my experience, and there are many games that don't require low latency (for instance, Baldur's Gate 3). Either way, it would've at least been a good point to include.
I just have to cut in (watching from the start of the video). Your talking head videos are so crisp and clear. It is a pleasure to watch your videos. (and your Audio is always on point) Now back to the super informative video I know is coming 😄
I expect that the implementation of this technology is a bit lacking right now (remember how bad the first few titles with FSR and DLSS performed) but that should improve some, at least to become a bit more consistent.
4:47 Have you mislabeled cuz the right looks like it's optical flow enabled Edit: Oh I see. The framerate tells. Though FSR3 looked quite bad at higher framerate.
You're not wrong but it's been almost a year since they announced it was coming out... That's more than long enough that it shouldn't be broken and fucked on release still. If not to get the tech refined and working well wtf has the delay been for?
Assuming everyone has vrr is a strange assumption, the least upgraded component is usually a monitor, ive had two 165hz 1440p monitors for 7 years and don't really plan on upgrading until they break
Tim, I believe you've made a mistake @2:55. FSR frame gen does not require fsr 3 upscale. It only needs fsr3 tech to be enabled at least with native aa mode enabled. The way you've said it initially makes it appear that the upscale is needed. Native AA ofc, is not upscale.
You can create custom resolutions with lower refresh rate like 80hz, 70hz etc.. and benefit frame generation with no judder. You will be happy with the result. It is obvious its not finished feature, yet its a nice tool that will extend life of old GPUs with no brand separation and it should be appreciated.
Actually I see no difference between FSR3 FG and DLSS3.5 FG frame tearing in the video. Only the words from this guy set them apart. He has better eyes than me?
I still think FSR 3 is going to be amazing for those of us who still aim for 1080p 60fps on non VRR setups. Unfortunately PC gaming is in an odd state with poorly optimized software and overpriced high end hardware and this could help the low/mid range setups be playable at reasonable fps.
HU mentioned Daniel Owen! He definitely looks up to your channel so it’s cool to see you give him some shine ☺️ Been waiting for Tim’s review! Thanks 🍻
I think you should be able to find only the generated frames (instead of guessing where those are), by using renderdoc or NVIDIA nsight. You should be able to see the pipelines and you’ll be able to see the DLSS for sure, not sure about FSR3 FG.
What is it like to see the difference on any of the examples? I am impressed yall see anything different. The world must be different for those with better vision than mine.
My whole problem with these technologies is the fact that I got into PC gaming to not need upscaling technology to get better performance. I think this is why I'm essentially becoming a mainly console gamer again because PC gaming is so expensive to get that intended higher performance without upscaling tech
@@wildeyshere_paulkersey853 What stupid question is this? Are you expecting to build a rig for the same price as a console selling at a loss? Not mind 6700xt-3070 are 20-35% faster than ps5 not including rt
From what I have heard from AMD it seems as if FSR3 frame generation is deeply intertwined with the FSR upscaling algorithm itself (It even made a lot of sense to develop it). This is why you can't turn FSR off... "Native AA" still uses the FSR Pass due to how interconnected these technologies are. And a lot of the benefits would be lost if the frame generation tech decouples from the upscaling pass itself. I think the main scenario for FSR3 at the time being is for 60fps games to make use of 120Hz 4K TVs with a VSYNC lock. Which could prove beneficial on consoles. You don't get any latency reduction from the tech, but at least the games can utilize that 120Hz and look smoother than if running at 60Hz.
Like all "smart" upscaling technologies, FSR 2 is based on re-projection, therefore, it makes TAA an redundant pass, as that too is based on re-projection.
AMD worked hard to creat an open technology, for ALL players and developpers and just for that, he win my respect. Like Freesynch vs G-Synch. Now, it's interresting to know the difference with this two technology, I'm not very surprise by the result. The difference that is FSR3 will be implemented in all games very quickly and for all graphics cards compare to the DLSS3 and the 0,1% of gamer who bought an RTX4000. I'm not too thrilled either that these two companies are rushing into upscaling technology and that developers see this as an opportunity to stop optimizing their games. It's yet another way of making our hardware obsolete more quickly.
decided to try out forespoken just as the new patch dropped and can't complain as i get a locked and smooth 120fps on my C2. there's some smearing in the corners at times and stutters when it loads a new section. the dlss included does look better but it only gives me a stuttery 70-80fps so i'll gladly use fsr 3 for this title.
Really likeable guy, I discovered his channel half a year ago. I think he's doing a great mix by presenting the videos in a casual, yet also somewhat detailed ways, which is not easy to achieve.
It's not really about weather FSR 3 destroys DLSS 3, it's more about AMD keeps reducing these exclusive features that Nvidia keeps coming up with to lock us down to their ecosystem. In a way, what AMD is doing is reducing the need for DLSS by being more open, after all, most developers would rather support 1 over supporting 2 or 3, and all that's needed is the tech to be good enough and open enough that it works on a lot more hardware, as well as consoles, which I'm not sure if this frame gen will work on the current consoles, but it should do looking at how well it works on older gen gpu's on the PC. DLSS is a good tech, but it's always going to be limited in its support being vendor locked, this kind of tech needs to be broadly open that it works on any gpu of the last few gen, and also works on consoles for it to really take off, FSR delivers that, XeSS could but it favours Intel hardware over rival hardware. DLSS is the better tech for now, but the gap has closed so much that to most gamers and developers, it likely doesn't matter, good enough is what most in the mainstream will care about and FSR is in that ballpark.
Nvidia said G-sync needed extra hardware to support it as well and now we've all but moved on to AMD's implementation in almost every product but a few extra expensive ones. I hope FSR goes the same way, I mean at this point it does kind of prove you don't need AI cores to do it.
9:45 - Saying that most gamers are using VRR today is like saying “why doesn’t everyone just buy an RTX 4090?” Lol. No man. Most gamers are not running VRR. In fact, 90% of PC gamers are running cheap monitors under $100 with 60-75hz refresh rates that do not even support FreeSync. Get. Real.
I would love to see frame generation for video. Some movies and most anime would greatly benefit. There are videos of Akira with 60fps interpolated render, it looks fantastic. To have that work on the fly for any video would be huge.
I use SVP 4 to achieve exactly this. I can live interpolate up to 120fps 4k only limited by my hardware. It works with youtube as well I love it great program.
Will there be an updated look if/when AMD fixes the issues? Also, just out of curiosity, has there been any word on Intel implementing frame generation as well for XeSS?
HUB usually follows up when there is a new variants or improvement with this tech, so no reason to think they won't with this FG tech. As with Intel, nothing I have heard officially or leaked, maybe when Battlemage will release, they will introduce new tech.
Yeah but you missed what it’s trying to give the end user (wether they use it or not) between team green and team red team green might be all that but at a rather significant cost, I think later on down the track fsr will work great.
Did you do any recent works on the studio lighting or change to the recording camera? This video looks of the studio, Tim and the background looks stunning.
I'd love to see an analysis of gameplay footage deleting half the frames and using only the odd synthetic frames, so are how the artifacts look on their own in real time.
That is an awesome idea.
That's a very clever idea, I would like to see this also.
Let's see how real or fake those frames are.
that is a very interesting idea
Interesting idea, but useless in application
I can only hope AMD and game devs get better with FSR3 over time. We really really need AMD to be competitive.
Not really because people who say this only want that so they can buy Nvidia at cheaper rates.
@@yellowflash511 so true!
@@yellowflash511 I'm sure there's truth to that when it comes to certain people, but that doesn't mean it's not true. It's very important for competition to continue. There can be no stagnation, or there will be complacency, and that means higher prices for worse products.
@@yellowflash511 Not true for me, I just like my torture myself running Linux, but not torture myself hard enough to use Nvidia's linux drivers.
It's going to take much more than being better than Nvidia in every way to break Nvidia's monopoly. AMD has had the better offerings for the majority of their time competing with Nvidia, it has never affected market share. Just like Google with their illegal search monopoly, or Microsoft with their illegal x86/64 operating system monopoly. Hell, AMD had to sue Intel for over 4 billion dollars just to break into their illegal CPU monopoly.
Don't know what Tim is talking about, but from footage he represented i see better sharpness and fine detail on FSR3 side compared to blurry image on DLSS3 side. So what kind of DLSS superiority he is talking about? FSR3 works even on GTX 10 series cards and does as good job as DLSS3 Which only works on 40 series.
I seldom noticed any “superiority” on the DLSS side either. In some places I noticed stutter on the DLSS side while it was perfectly smooth on FSR. Not sure if that an artifact of capturing or something else.
Frame generation. The less you need it, the better it is.
FG is useful to turn a monitor that has to have a slower overdrive setting for some games, IF that FG rate puts it into the faster overdrive setting.
MOST monitors have two or three overdrive settings required, it is a high end feature if it has a single overdrive setting. Don't use FG for the game, use FG for the monitor.
@@markhackett2302 Useless on OLED monitors with no overshoot
meaning its still usefull for 99% of monitor owners ou there lol@@cosmic_gate476
if you have those then you dont need FG. @@cosmic_gate476
@@cosmic_gate476 OLED is still really expensive if you can afford an OLED screen I doubt you would be using upscaling
In case people don’t know, AMD fixed the VRR issue with the most recent beta drivers
Where do you find the most recent beta drivers ? I can only find the link to the page for the first release of AFMF :\
They will break it in the next one. I swear.
@@majorgg66then don’t download those
Unless I'm seeing a different driver, Freesync is only working with Fluid Motion Frames, and it is still not working with FSR3.
They watched the video first too
They got the optical flow thingy working on compute, and reasonably well too. That’s the main point, the rest is just teething problems for new tech. This was supposed to be impossible without hardware support, as Nvidia put it. I’m impressed
They never said it was impossible - they said it wouldn't look very good, and a lot of the problems here actually point to such. Remember that NV is still positioning DLSS as one of the crowns of RTX (which it absolutely is) so putting out a lower quality for specific products in the stack will do nothing but damage DLSS' reputation for consistency among all supporting hardware.
Can they do it? Sure, given enough time and incentive. In the meantime, they're currently content letting FSR take the second fiddle since it runs on their own hardware too anyway.
@@arthurbonds7200 They said it wasn't possible. Google it.
@@arthurbonds7200 No, hey said it was impossible. "Oh, without the new flow tensor instructions you wouldn't get good frame rates!!!!", but we got good frame rates without the use of the optical flow tensor cores.
@@arthurbonds7200But it does look good, that's the thing. The interpolated frames themselves don't look any worse than DLSS 3 FG. The issues with frame pacing are completely separate and quite solvable.
@@HunterTracks Are you one of these todds that spouts bs like "fsr and dlss look the same" lol. Friendly reminder that fsr3 hard locks you into fsr 2 upscaling. I don't care what fps you are getting if you are using fsr 2 over dlss 2 on any rtx card just so you can use frame gen then you are insane for how much visual quality you are using.
I recall Nvidia frame gen also having teething issues when it first came out.
Ill give AMD its chance to get its ducks in a row with the technology.
If nothing else I very much appreciate they included it on nearly all cards on the market not just their latest generation.
i think the cheat here is that on DLSS 3 announcement was that DLSS 3 WAS Frame Gen unified..... now DLSS3.5 is announced as Frame Gen excluded, but they didnt said that and everyone is confused, and they are claiming to have all the DLSS for all cards, a big scam.
some of the stuff listed in the video already has been fixed
It even runs on an Nvidia 1060, although it is not particularly useful on that card
the difference is AMD just spent the last year and some delays getting its ducks in a row... why do people keep making the excuse for AMD that nvidia wasnt perfect on launch so why should AMD be perfect when the difference is nvidia lead the way with the tech and AMD takes its sweet ass time, a year in this case no less and delays in order to still drop it with problems. If this doesn't tell people AMD are extremely far behind i dont know what else will... its like AMD panic when Nvidia introduce a new feature and flap around like ducks for a year to try and imitate it always coming off worse.
@@jinx20001Nvidia developed the technology for their servers, then adapted it for gaming and the results is much weaker cards per dollar for the average person. AMD also takes their time cause they make it open source.
Looking back in time, DLSS 3 didn't have a smooth launch either. I vaguely remember visual artifacts especially with UI and some other issues.
FSR3 is more interesting simply for the fact that it's not hardware locked, hell it'll even run on igpus
AMD buried: PhysX, G-Sync (remember your old god? "G-Sync compatible" its just Freesync), Nvidia Hairworks, "DressFX", SLI, DLSS 2 (waaay more games on FSR), and RTX (20% usage) is half dead, and DLSS 3 just runs in 5 games and shares the same destiny, while NVIDIA buyers forcedare forced to pay the RTX-tax of double the price, and most lineups discontinued (GTX).
I only see win after win here, nvidia tech superior? Man you have a mountain of taxed unupdated dead techs around you, enjoy.
And RX 6600 and RX 6800 cant be beaten on their league.
@@Jackson-bh1jw Yeah, Jackson I have to agree, I'm a little surprised by the dreary tone of this video. The current problems (reportedly resolved in the beta drivers) seem minor and very surmountable. And given the broad support FSR3 is going to enjoy versus the tiny subset of cards compatible with DLSS 3, this seems like a feature to celebrate.
@@Jackson-bh1jwNvidia tech may be more superior but the fact that they lock out users from their tech with their proprietary bs is what kills their.
That is true. But unfortunately, first to market matters because by the time the competitor comes out with a competing technology, the incumbent has already stabilized theirs or have gone to a second better or more enhanced iteration. The market doesn't appreciate second place, or give any sympathy for the underdog. They'll just go for the one that works better.
@@Jackson-bh1jwa lot of games use physX still as an example in September payday 3, lies of P, and Mortal Kombat 1 all released and use physX. Games just got rid of the option to turn it off.
Besides that most of what you said is wrong. Like your false claim that FSR supports more games when in reality DLSS does. Dlss 3 doesn’t only support 5 games it’s supports 46+. G-sync still exist and gets updates, it also offers more features like ULMB2 and the ability to use adaptive sync at all framerates. Hairworks is still being worked on and improved a few months ago they were able to simulate 86,000 strands of hair at 17 ms per frame.
So first release of FSR3 vs DLSS3?
Would be great to see another review after 2 or 3 updates.
"the will fix it after release" ideology is one of the most delusional fanboy defence
it will never be as good as dlss3, same way dlss is much better than fsr2, nvidia might be a shady company, but clearly they didnt lie when they said the new hardware in 4xx series is designed to do it right
Given that AMD has nailed the quality and overhead aspects of frame generation, I'm optimistic that fixing the anti-lag and frame pacing issues won't be too difficult. They clearly released a beta version of the feature and called it a full release. But once those issues are sorted out it will be great.
I think FSR will always be behind in terms of upscaling image quality but as long as they are close enough I think that's a big win
People don't give NVIDIA enough stick for the fact their tech is locked behind their own hardware. I'll take FSR and its open source nature, over DLSS for this reason anyday. The difference in real time use is miniscule...I have pc's with both vendors hardware.
uh its actually a preview driver that is not part of the main driver branch. it is the definition of a beta driver...
@@totalprocall6894they're not doing it out of the kindness in their heart. They know they have no chance of developers implementing it if it was locked to their hardware as Nvidia basically dominates the market. They have tried before with better tech and higher market share and failed.
@@grantusthighs9417"AMD hardware" encompasses the entire console market (except Switch, but that's its own little bubble). Even without that, it's a few days of dev work for 10% more potential reach, that's more than enough for most big studios.
Either way, it's quite obvious that AMD did it for their own benefit, but a win for consumers is a win for consumers either way. It's still something to celebrate, no matter who does it.
@@totalprocall6894that ''open source'' p)art clearly doesn't help amds tool get better. so what's the point then.
I played around a bit in the Forspoken demo and actually thought it looked fine, though my 6950XT was indeed maxing out my monitor with FrameGen.
I have a 6900XT and can't wait for FSR 3 FG to be in Witcher 3. I get like 60-70 FPS native with max details on a UWQHD monitor that has 144hz refresh rate. At 120-140 fps it would be just there. I would like to keep the option for no upscaling tho.
Yeah I tried it on a 3090 at first looking for the artifacts it looked bad, it lessens the Effect at Native Res. but just playing through it felt nice. The game itself looked like ass sometimes, bad shadows (what is RT shadows for?), the MC model, effects are a bit low res idk if its an artistic choice but it feels like they are good but look bad sometimes, I like the aggressiveness of the enemies and the gameplay tho.
That may be what FG is best for, either DLSS or FSR. Turn a game that requires VRR because on occasion it gets 60-80 frames natively but you can lock it to Vsync because that is getting you above the rate of the monitor almost all the time.
It's so cute watching all the amd babies come out of the woodworks after a year of bashing frame gen because they can't use it all of a sudden falling in love with the technology now they finally have access to their own version albeit worse in every way than dlss frame gen lol.
At least I won't have to keep reading delusional anti frame gen comments on this channel now the peasants can use it too. I thank amd for that.
@@slayerr4365 Stop fanboying. Nobody cares what brand you use. Better technology benefits everyone, and btw.: I'm currently on an Nvidia GPU.
The reason upscaling works the best at maximum refresh rates is simple: the motion vectors velocity is less the higher the number of frames is. The frames are predicted based on the velocity and the greater it is, the harder it is to predict (correctly).
I won't lie, I am legit impressed with FSR3/FG. While certainly not as good as DLSS3/FG, I've seen enough that I can say Radeon owners aren't getting a drastically inferior experience. FSR3 could definitely use a few more tweaks and will likely get better with time, but those multi-$100 premiums for nvidia GPUs start to look less worth paying for. I don't disagree that it's not ready for primetime, but as a preview, I like what I see.
Well done AMD. The tech certainly isn't as good as DLSS3/FG, but that makes sense given how much of a premium you have to pay to take advantage of it.
People won't be getting a much worse experience from fsr 3 once the issues are fixed. They are going a much worse experience from being locked to fsr 2 upscaling in order to use fsr 3. If they found a way to let 20 and 30 series owners still use dlss 2 with fsr frame gen they probably would of made tens of thousands of new amd fans. As it is I would recommend any 20 or 30 series users to ever use fsr 3 because it means losing the quality of dlss and that is absolutely never worth it.
Not just AMD users, also the majority of nvidia users. TONS of nvidia buyers of 2000 and 3000 with the promise of a better feature set got left in the dust at the first opportunity with DLSS3
I wouldn't be too sure about that. There are too many NVidia stans out there. Remember the big thing a month or two ago when they were all getting mad about games not having DLSS while FSR is just fine? All the consoles currently use FSR upscaling and no one there seems to care or think about it. Sadly everyone will just get mad that Nvidia still won't play nice while still buying their GPUs. I'd still rather just play my games at native. I'm starting to think I'll just never try to move to 4k and stay at 1080p for the rest of my days.
Shame AMD don’t innovate on gpu’s. If nvidia didn’t do something for AMD to copy the gpu market would still be at the 8800gtx.
@@chronossage say what you want, Nvidia don't have a reputation for abysmal drivers.
Hot take: requiring a high base framerate to work well is a good thing, it discourages devs using fsr3 to make up for poor framerates, it only enhances the experience on an already good enough framerate
Fair but not entirely true. Seeming devs are already using upscaling as a replacements for optimisation the thought process will more likely be "what's the lowest frame rate we can get to with upscaling that is acceptable to boost up with frame gen to be playable" now.
If 2023 has taught us anything it is that game developers are the laziest people on earth who will do everything in their power to not spend a single second working on making their games playable.
On a related but not related note I also don't agree with his take that you need at least 100 fps for frame gen to be worth it. For me on dlss frame gen anything over about 75 fps with frame gen on so around 45 native fps was enough to make the game feel perfect. Playing cyberpunk with 80 fps fg on path tracing currently and the game feels worlds better than playing at around 50-60 fps with fg off and lower settings. I think a lot of people are using bad monitors which is giving them a skewed view on how much you need for it to look and feel smooth.
I understand what you mean, but I can't see why you would enable this feature if your fps was already acceptable. Just a lower quality image and worse input latency. I would consider using it if fps was below 60 to hopefully bring the smoothness up to par with 120hz refresh rate.. just my opinion though.
@@Coffeeenjoyer31 Yes! If I have real stable 60 fps, why would I want fake low quality high lag 100 fps instead? Makes no sense at all to me.
@@PetrisonRocha this is why Nvidia locking frame gen to RTX 40 series is particularly annoying. A 4060 has enough performance where you wouldn't likely be in a situation where above 60 fps is unachievable.. so kinda meh
THIS. Hopefully we'll only see it AT MOST in quasi-120Hz/fps PS5/XSX game modes. Which in turn, may mean they'll put some actual effort into integrating it into the render pipeline and result in a great implementation on PC as well.
I appreciate the detailed analysis... but I'm sure I would not notice 90% of these issues while gaming.
You and me both brother. Sometimes I feel like many of these YTers become hyper critical about most everything in order to generate interest. Beyond looking decent, my focus is gameplay, gameplay, gameplay. Graphics are really secondary. And hey, most people do not play on ultra settings anyway because they are not playing on the latest greatest hardware. Sometimes, I feel like a motherless child. :D
@@halrichard1969 It's their job so they're naturally going to be more focussed on these things than the average person. And I think in general they're trying to hold AMD to a higher standard, which is a good thing. The more AMD feels like they need to achieve feature parity with Nvidia, the better it is for consumers. It's debatable whether they put too much emphasis on how it compares to Nvidia features, given that the major use case is for those who can't run those Nvidia features (AMD cards, older Nvidia cards), but in general I think it's a good thing.
I won't notice any of these on my indies and jrpgs lol. Western AAA games rely on tech to be interesting.
conclusion: when buying a new gpu, don't rely on the frame generation, buy the best one you are willing to afford that renders the highest fps natively
So if you care about game raster then buy AMD because Nvidia is overpriced for fps
but bru i'm not a peasant like you
That's why i got a 3060 laptop instead of a 4050 laptop. Despite CPU being more powerful in the 4050 laptop, the 3060 is stronger at native 1080p and 1440p than the 4050. Considering I don't play games with upscaling options, this is the best choice to do.
So, a 4090 then?
nvidia is stupid, their 4070 is a good gpu. But it's 12gb vram is a joke. I bought rx7800xt instead.
Are the background clips the wrong way round at 16:16 to 17:12?
The left clip is straight up higher quality, and the commentary is stating that DLSS looks better?
Pause at 16:45 and tell me the right doesnt look like a blurry and lower quality version of the left image.
Nope, they just seem to prefer the DLSS blurry up-scaling over some FSR shimmering. I'm sure if they tuned down the FSR sharpening slightly it might improve the shimmering, at a slight cost to obvious sharpness win it has over DLSS here
the dlss looks smudged and the fsr looks sharp but like you said probably too sharp, if it were toned down it would look much nicer@@Ben-Rogue
This is going to be like G-Sync.
AMD will lag behind in engineering a viable alternative that's able to be used by a broader group of hardware, the whole time Nvidia will claim that it's just "not possible" for them to use *Insert newest DLSS here* on certain hardware. Eventually AMD will catch up and Nvidia will have to change their marketing strategy or look like idiots.
AMD occasionally manage to do some really interesting things. FSR now, but mantle before was so forward thinking that a lot of the API was supposedly absorbed into Vulkan and DX12.
and lets not forget that today every single one of us is using AMD64 architecture in our intel CPU lol
(at least kind of!)
This has always been the case:
3D Vision
PhysX
Gsync
DLSS
Nvidia has always been and always will be about pushing self-serving proprietary
There's nothing wrong with that, but when they alienate even their own user-base with older gen cards for the sake of pushing them into upgrading (milking them), that is simply greed.
yes, nvidia sells technologies, while amd sells graphic cards. look at 4060/ti, they are useless without frame generation. lower power consumption barely worth the upgrade
@@phutureproof indeed
@@mirage8753 That's because in reality they are actually 4050/4050ti, but Ngreedia label them higher just so that can sell lower tier cards at higher pricing.
FSR 3, for being new and just launched, it is VERY promising. Big props to amd!
Did you mistakenly switch the FSR3 and DLSS videos? I found it rather confusing that just about every videoclip labeled FSR3 had obviously more detailed and sharper textures, than the DLSS-labeled ones. While the commentary was saying the exact opposite. Even through youtube-codec the DLSS textures looked obviously lower resolution and way more blurry.
Seems like a pretty promising public beta to me, i look forward to the full/FSR3.1 release on Starfield
Well if it is a beta release... then there shouldn't be too much 'panic' per say... issues then are expected...
@@csl750per se* fyi
Beta? More like alpha.
Alpha Made Drivers @@Terepin64 👍👌
@@LTNetjak and due to AMD trying to make the tech available for a wide selection of cards... they gonna struggle
So glad Daniel Owen has been going through the settings with the community to help and show people since the tech launched last month
Hes very good. Seems like a wholesome dude. HUB is still GOAT
Never watched him but the behavior of a part of his audience spamming every place (it is NOT restricted to HUB) that comments something about another youtuber work pretty much guarantee that I'll never watch him. He might be a really good youtuber but I am tired of seeing this kind cultish of spam everywhere.
@@sergioaraujo7943maybe you might want to take your issues up with HUB, seen as Tim mentions Daniel’s name in the video you are commenting on or do you not want to listen to the video from HUB either
@@sergioaraujo7943 I think he's a math teacher or something and makes the videos solo from his house. He seems like a decent dude. I've watched is videos here and there for quite a while.
@@sergioaraujo7943 that is the worst reason to not watch someones content.
One thing that I don't see people talking about enough is how EXPENSIVE the DLSS FG pass is. Doing the math, it costs 5ms at 4K on a RTX 4090, while FSR3 only costs 2.4ms. It's LESS THAN HALF, and it makes a difference in the frame rates, especially at high refresh rates! If AMD can do it, Nvidia should optimize the DLSS3 cost too.
Thanks for the review!
I think it would have been helpful for you, the review and the viewers to have taken a deeper dive into the technology. The AMD and GPUOpen Project have some deep dives into how FSR2.2 works. For example which TAA implementation (game native or FSR) is chosen is up to the developers iirc. They CAn choose the FSR-TAA, but they can also hook in their own.
Shimmering of fine details in FSR may be reduced by a better implementation by the developers (like you suggested, letting IoA choose *their* TAA should be possible).
The same goes for UI elements rendered at lower fps than the interpolated frame, this surely is down to the devs deciding which elements need to go where (interpolated or not) for screen presentation.
Anyway, cheers for your content! Your work is really appreciated and I hope I voiced my criticism respectfully and constructively!
Good or bad, even if it's subpar with DLSS 3.5, I just am glad FSR 3 is here as it saved me from my itch to upgrade from RTX3080 to 4080. I say GOOD JOB, AMD!
It is subpar with Dlss3, not 3.5 by any means
Now just need more games with support. Could be a while before there's a decent library.
@@PhilosophicalSock3.5 too
Thank you amd! Will probably buy a 5080 next year instead of the next amd card 😂
@@dudanvictor4203 nope. 3.5 is about ray reconstruction
As touched on in the video, a lot of what we’re being presented with does suggest they were trying to meet some internal deadline, maybe a promise made to shareholders or who knows what. The announcement seemed almost hushed, and with only two relatively less relevant games so far, it’s all very much like they’re still ironing a lot out before a likely more boisterous public announcement I’m sure they are trying to get to sooner rather than later.
Did you forget that DLSS 3 also had issues at launch and looked like a smeary mess?
Remember first dlls implementation on cyberpunk was trash. All companies are rushing products because the market doesn't allow waiting procedures. Profits are more important than quality delivery nowadays.
They announced it a year ago
@@user-jd3pk1bz8e yeah, even more for AMD now, after all they are running to catch up with NVIDIA in this area, so they want to rush things more than any other company
Rushed? It was supposed to be out q1 this year. It's already half a year late.
Look at 17:00.. I got NO idea what Tim is talking about there, because the FSR looks WAAAAY sharper. ofc that might be CAS sharpening, which would also introduce/explain shimmering. So what do you want? Sharpness or Vaseline?
i am impressed with fsr 3 on forspoken on my 4070ti I swear my next GPU will be an AMD surely
lol what is that logic if having an rtx4000
@@PhilosophicalSock Obvious bs from AMD fanboys
Surely .... (Aware)
I don't enjoy frame generation at all even on a top tier system but I'm really happy to see AMD improving the competition. Hopefully FSR upscaling can improve more as well.
You guys may want to check the release notes cause it tells you how to fix frame pace on there. Works great for me with both the original and experimental drivers in fact, I can use fluid motion with frame gen.
Fluid motion is the driver setting right? I saw the one guy using a beta driver that enables framegen via the driver however it cuts off when you move around fast. Kinda cool that you can enable framegen via a driver instead of the game which might help in older games like say eu4 ck3 total war etc. Those are more cpu bound though but don't require fast movement like an FPS.
@@inducedapathy1296 you can even combine fluid motion and frame gen in newer games to get super high frames rates works amazing. Fluid motion can even speed up games like LA Noir that had a 24hz cap, though this makes facial animations run a bit too fast.
This post should be pinned, because most people wont even be aware of it, and probably wont read the release notes.
Good luck with that HWU don't really bother, if they do it'll be a shocker. I've already moved on from this channel, but I was willing to give them a chance. I was screaming this on my phone, but whatever
What does the release notes say to do to fix frame pace? I did not see anything other than turn Vsync on. Was it frame capping to half your monitor refresh rate?
Thanks for putting in the effort to dissect this on such a granular level and explaining it in such a straight forward way ❤
I just want them to fix the VRR support which only works sometimes and not for driver method. AND ALSO I want them to support HDR!
Those are the two big ticket items for me to start using FSR3. Perhaps it will be on a per game situation, but until we get official release and not preview driver release, that may not happen.
I found the exact same issues as HU here in my short test. Some people however are getting different results and I think it may come down to monitor configuration.
17:00 You have FSR sharpening turned on but DLSS comparison doesn't have sharpening dialed up.
no one cares about hdr. i'd like to protect my eyes, thankyou
@@xKamiiii I care very much about HDR. It looks incredible.
Games that support HDR allow you to set brightness levels. If you're having a bad experience with it, it's on you@@xKamiiii
HDR is a visual upgrade surpassing RTX tech simply for the fact that it's not taxing at all performance wise to use and it likewise drastically improves image presentation. So guess I'm part of the "no one" who do care about it
@@xzaramurd no it really doesn't. it's too flashy. i prefer raw colours
That clickbait title man, jesus! 🤦♂
As we can see all over youtube, pixel peeping analysis in slow motion with magnified images is one thing, and the improved overall experience for a lot of gamers with older gpu's, non nvidia gpu's and even last gen nvidia gpu's that supposedly were too weak for frame generation is a totally different pair of socks. These gamers will gladly sacrifice some image quality and latency to play their games more comfortably. Contrary to what many people tends to belive watching tech channels the majority of gamers don't own high refresh rate VRR monitors and high end gpu's so FSR3 is a really good thing despite it's flaws.
Impressive work! Such a technical, organized, well delivered video ❤
15:57 There seem to be big difference in sharpness setting when comparing upscaling in FSR3 vs DLSS3 (FSR3 seems to have sharpness cranked too high).
I recommend setting a frame cap at 1080P and enabling frame generation on both prospective brands.
Playing with FSR 3.0 FG enabled with a cap of 60 (monitor limit) on Forspoken lowered the CPU and GPU load so much it felt like I was playing a UE3 or old UE4 title vs a UE5. The input latency was massively improved.
I hope Nvidia allow devs to separate the UI rendering pipeline, like AMD has.
I'm sure they will - its more of a hassle to devs though if their ui isn't built in such a way to take advantage of the API
They probably are trying to find a way to fix the UI issues within the frame generation pipeline rather than just leaving those UI elements at native framerate
Cause it's not a perfect solution either, if those ui elements move or are in world space (like way points, health bars etc.) they really appear to stutter compared to the rest of the game.
it's bizarre they didn't do this in the first place. they know it's a bad idea to run post processing before FG, so why not UI elements too? I have to doubt it's all that much effort when UI is overlaid in the first place
If I remember correctly, it was advised in the DLSS integration manual, to render the HUD elements natively, and not upscale/generate those. Either way, if the developer cared, this was what they would do, but here we are.
Apparently, not all the game devs car enough to do that properly.
I have been using the Driver based Fluid Motion Frames in Starfield. I now get 200+ fps at 3440x1440 on my 7900xtx. It looks and feels so much better than Native rendering at Ultra
When I tried FSR3 I was impressed. It will be nice that game developers will be able to use it for the xbox and ps5.
Hope so. 60fps minimum though so I hope devs use it properly on console.
@@mikem2253 With developers already upscaling to 4K from resolutions as low as 720p in recent titles I don't see them shying away from going the same road here and interpolating from >60 FPS as well. I reckon it's going to be a complete shit show if they'll start implementing FG on consoles.
As stated in video, FG is only of real benefit for powerful hardware that's basically already capable of achieving HFR by itself, if not to the same degree as with FG added on top of it.
It will work on the ps5 but be crippled compared to the series x
@@juanblanco7898inb4 720p30 games upscaled and upframed to 4k60 lol
Great video Tim! Honestly though I have the same opinion as Steve. I simply miss the days when GPUs were just about raw rasterization, and improving performance with each iteration.
So say we all
If you got to zoom 300% and slowdown 4x in order to see differences, proves to me that you won't notice a difference at full speed, no zoom 😝
This video needed to be clearer that this is a technology preview. The VRR issue has already been resolved in the latest beta drivers. This video is being sold as this being FSR3's final form. (Disclaimer: I have a 4090 and don't currently own any AMD cards. I just believe in this info being represented properly).
Unfortunately, even when AMD releases a newer version of FSR3 with VRR compatibility, anti-lag+ working properly with FG active, etc., many people will still tune into this video when researching FSR3 and come to the conclusion it's crap. What AMD has done here is impressive even in its unfinished, beta form, with teething issues galore. They've proven frame-generation doesn't need dedicated hardware-based optical flow accelerators. Most folks thought AMD was bullshitting when they said 2023 release, and that they were years away from anything, since it is reported to have taken NVidia 3 years to develop DLSS3 FG. Yet here we are with a solid frame generation base upon which to build a compelling feature that works across multiple generations of cards and brands. I don't think that really comes across in this video.
@@AMD718 lol you can thank Tim for that
you can only review something which currently exists, not a potential version 3 years in the future
@@cc0767 Or one in current day, just a couple of days after recording, because that doesn't make NVidia look good.
@@markhackett2302 they may have fixed 1 particular issue, not magically made the software solution better than the one with dedicated hardware
This is a Win for the gaming community lets go AMD!!!
9:48 But why use VRR if frame-tearing doesn't bother you, or is extremely minimal to begin with?
Especially when VVR is known to sometimes cause issues with monitor features or game graphics features, performance or otherwise? And there are better ways to cap frame-rate too.
It would be great if you could circle or mark problematic areas in the image, which you are referring to in your script, because I personally find it very difficult to identify what you are exactly criticizing. Anyway, still love your very in depth reviews of frame generation and upscaling technologies!
So it sucks unless your FPS is already high enough to not require it. 100% solution for zero problem. fantastic.
You may have seen the Switch version of No Man's Sky now has a custom built FSR2 implementation which looks really impressive despite the low resolution. Unfortunately probably took a lot of work and is likely game-specific, which is a real shame as it gives a glimpse of what FSR can do with some optimisation.
@@lunarvvolf9606 I mean, indirectly it builds company trust, also xbox and ps5 use amd hardware
It's such a shame because the improvements in No Man's Sky are where it matters the most: Temporal stability.
FSR has nailed overall detail down surprisingly well, but I'm often using inhouse TAAU over FSR simply because of how distracting the artifacts are. Such a shame
@@lunarvvolf9606 Again, AMD sells GPUs and CPUs to companies like microsoft and sony for their consoles, for the Steam Deck, and even other companies use their features on hardware that would be unsupported by nvidia, like nintendo uses FSR in some of their games. This all means that companies are more prone to do a contract with AMD, or renew their contracts, to have AMD hardware on their devices.
Of course, on the PC market NVidia sells way more, but AMD's strategy is more focused at mobile and console department right now as they have the advantage on that market.
@@lunarvvolf9606I think AMD's decision to make their technology manufacturer agnostic is more of a byproduct of their commitment to backwards-compatibility support for their own hardware. FSR3 isn't hardware dependent which allows it to run on 3 generations of AMD GPUs whereas DLSS3 only runs on the current gen NVIDIA ones. Once AMD has their tech running on their own cards in that way, it's obviously possible to run on NVIDIA cards. Their cross-platform compatibility is probably to avoid backlash from software locking their tech from competitors because that would be kind of a bad look. Besides, *IF* AMD and NVIDIA cards have feature parity, why wouldn't you go AMD? They support their cards longer and they give you more performance for the price. I'm not saying it makes the most business sense, it's a noble goal.
plus, the fact that amd technology is open makes it so it works onm ALL amd hardware as well, unlike nvidia technology, in which for example the nintendo switch cannot make use of DLSS despite running a nvidia gpu because thast gpu is "too old". Nintendo themselves uses FSR, and as more companies use it AMD's brand reputation increases, and outside of the pc builders makert, ths is important.
More and more brands are deciding to go with AMD for their APU / mobile solutions, microsoft, sony, and steam are already doing that, and nintendo potentially could on theiir next console release.
Remember that AMD has less marketshare overall than nvidia, which means that the only reliable way for them to induce consumers and companies to use their own solutions is by making them avaliale on multiple hardware.
Re. combining DLSS upscaling and FSR framegen (at 18:00), this is probably very difficult due to how they’re integrated into the overall rendering process. Rendering engine are already highly conditional and complex. Trying to have 2 different “end of pipeline” techs working in tandem is very difficult and potentially impossible depending on what access (to generated data) they provide to the calling code.
Suggesting it’s due to AMD not wanting to give NVIDIA users a leg up completely ignores the complexity and limitations of the software involved.
You have to remember that these are the same people who without evidence spread the "amd is preventing starfield from having dlss" fud. In that context them demanding that amd implements dlss in fsr is very on brand. It cannot possibly be nvidias job to implement their own proprietary bullshit into an otherwise open source code.
Latency comparisons with DLSS3 should have been done without vsync for the Nvidia tech. It harms latency for no benefit and the comparisons with FSR3 frame gen is misleading.
With reflex on, vsync doesn't harm latency, reflex with reflex on basically limits fps to a little below the refresh rate automatically.
Did you generate Tims reactions on thumbnail with FSR 3 😂??
At 16:00, are you sure DLSS is superior? Looks blurrier to me.
"Oh no, there is a tiny amount of shimmer at 3x zoom, lets just say the blurry one looks better and hope nobody notices" FSR3 quality did improve by a noticeable amount in immortals compared to FSR2, quality looks same or better than native and native AA mode looks incredible. Can't speak for forspoken.
@@PineyJusticeyea, both have their pros and cons. I can live with a little shimmer, but blurrier images?
@@cocobos DLSS was the blurrier one as you correctly pointed out, yet DLSS was declared to be the winner because there was a tiny bit of shimmer when zoomed in 300% on blades of grass. Blurry can be fine, sharp can be fine, it just depends on the game aesthetic.
@@PineyJustice I can quite easily see the shimmer at 1x zoom before Tim zooms in on the pond though. I am watching the video on a 1080p monitor although the video resolution is still set to 4k, maybe that is why I cannot as easily notice the difference in sharpness.
@@mingyi456 Only if you're really looking for it. Also there is zero shimmering in immortals, seems to be a problem with forspoken. Also just after at 16:27 you can clearly see that FSR is sharper and more detailed than dlss.
Enjoyed Rich's take (DF) much more than this one and I enjoy HU content. It's good stuff for the community but it is in it's early stages. It's tough to get this level of bashing so early on. DLSS3 had a poor start also...and using dedicated hw..
Yeah, the DF video was definitely better. Same facts, less biased reporting imo.
You need make video about AFMF + FSR 3 FG. That is more interesting, i curious about latency when antilag+ on vs off.
As an owner of a 3070, I really wanted AMD to drop a huge win bomb here as I feel cheated by NVIDIA that a 3000 series card cant use all the Nvidia Tech.
You have tensor cores and they can run the NV tensor optical flow-engine that's used under the hood in DLSS3 for framegen.. which you should be allowed to!
As a 3060Ti-owner who's been upsampling vids to 60fps with tensor since the 2070-- I'm done with nvidia if they keep this up. Maybe I'll support Intel next round.
@@mrvr6165they want your money, and they'll keep doing shit like locking tech out to buy the newer generations. Rtx was an excuse, dlss3 also is an excuse. What will be the next excuse.
That is simply stupid. You can't expect older hardware to always support newer tech.
@@mrvr6165they have been doing this since they started making gpus. They are a very evil company.
@@arenzricodexd4409difference is if it can't vs if Nvidia will allow it. Nvidia won't allow it unless there is enough backlash
I've been playing around with Fluid Motion Frames on other titles, and I'm impressed. The key is to configure the game where the microstutter guage on the built in OSD is at 0%.
With this tech preview driver, I've achieved the best results when:
- Leaving VRR on
- Turn VSync Off
- Setting an ingame frame cap to the peak of your monitors refresh rate
My results have varied from game to game. Some titles don't care for it, while others love it. As long as my native FPS is above 90 or so, I'm good.
P.S. I've even been able to get good results when enabling ray tracing, maxing out graphics options and/or upping the in-game render resolution from 100% (1440p) to as high as 150% (or 4K).
If you can't reach the full refresh of your monitor with FSR3, you can always lower the refresh rate to have it sync, for example lowering a 160Hz to 120Hz.
yes, and it seems if you can reach 60-70fps with native or upscaled resolution then 120fps is pretty much possible so vsync wont be a problem. so main intention for frame generation is to reach high fps for monitors with 100Hz+. so upscaling is for high fps up to 60 on high resolutions when frame generation is for high refresh rate. so now you can have 100+fps on 4k on max settings in games with upscaling and fg without any noticable quality loss
FSR3 with a capped fps, while hitting that fps 100% of the time, looks absolutely flawless and input lag is great also.
In my perception, 60 fps with FG is fine on a controller while 80 fps with FG is fine on mouse.
I've just messed around with it in Forespoken, which was thrown in with a graphics card, and the latency was far less of a problem than I expected. I thought it would feel pretty sluggish, but it looked good and was still decently responsive. The game still sucks, but it looks better while sucking.
22:39 This is not a fair comparison. The 4090 is a faster card at a SIGNIFICANTLY higher price, so it obvious it will have lower input latency because of its higher base performance.
Blaming AMD for not incorporating reflex into FSR 3 is ridiculous.
Yes, FSR 3 has some issues, notably the vsyn issue. But the fact that the IQ is generally superior with less ui issues than dlss3, latency penalty is minimal , higher output fps, and available on more cards is a huge win.
Honestly when looking at the direct comparisons, to me, FSR3 had better image quality for everything in the background. Foliage, textures etc. seemed sharper. Also the comparison between native vs FSR3 at 18:26, why is the FSR image so much brighter ? Is it a time of day issue in the game ? Another question, you said FSR is good for maxing out your monitor refresh, but is it also OK to go above the refresh ?
I have owned rtx and radeon cards and have never once turned on DLSS, FSR or any of these features. Call me old fashioned but pure rasterization and native resolution is the way to go for me.
If you keep upgrading your high-end GPU, then it's fine with native.
To sum up in my situation Radeon 7900xtx nitro + Forspoken 4k ultra RT, I have around 45-60+ fps native. I set vsync on in monitor, thanks to FSR3 I have 90-120 cap FPS. I don't feel lag. The quality is awesome. I don't care about the rest. I'm keeping my fingers crossed that they add this to as many games as possible because in my case it works great.
Which FSR quality setting is that? FSR AA?
This actually looks very promising if they get it to work properly with VRR and Vsync off, and I don't see why they couldn't do that 👍🏻
Sounds like they got the frame gen part right, they have to make it work in all cases now.
I swear, saying that using frame generation at 30 FPS to bring you to 60 FPS isn't worth it due to motion artifacts is like saying that cooking your food isn't worth it because you get less nutrients. Playing at 30 FPS is likely to have a much more noticeable and negative impact on your gaming experience than frame generation artifacts could ever hope to.
Also isn't worth it because the latency is the biggest issue at 30fps, and frame gen will only make the latency worse.
@@Mr.NiceUKLatency is still very much serviceable in my experience, and there are many games that don't require low latency (for instance, Baldur's Gate 3).
Either way, it would've at least been a good point to include.
I just have to cut in (watching from the start of the video). Your talking head videos are so crisp and clear. It is a pleasure to watch your videos. (and your Audio is always on point) Now back to the super informative video I know is coming 😄
Considering my wallet, I am completely satisfied with the FSR3 technology given its free
What about using FPS limit? If a game runs at 120+fps, I would just set an fps limiter to 120, wouldn't that fix the uneven frame generation?
you also can set vsync global in the driver
Did you miss where he did that and it didn't help at like 7 minutes?
He tried setting a FPS limit in the video, it did not work.
@@nevcairiel ah ok. sorry. thought it would be like nvidia.
It works if you set it with Riva Tuner.
I expect that the implementation of this technology is a bit lacking right now (remember how bad the first few titles with FSR and DLSS performed) but that should improve some, at least to become a bit more consistent.
Let's not forget that although DLSS is MUCH better generally, FSR has way less ghosting
4:47 Have you mislabeled cuz the right looks like it's optical flow enabled
Edit: Oh I see. The framerate tells. Though FSR3 looked quite bad at higher framerate.
its maybe because of compression, but i fail to notice any quality difference in this video provided.
Are you watching it on a big monitor or on 6.5" smartphone screen? ;-)
Not even close but glad we got competition!
in practice its very close actually, these guys are whats breaking FSR3
No VRR is a deal-breaker. But AMD looks to really want to make a good feature set so hope we can get the improvements soon
@@GewelReal VRR works just under certain settings....plus these are Beta drivers, how ppl draw conclusions off Beta anything is beyond me
Even its not on the Same level with DLSS/FG its for all cards and its new NV FG was also not perfekt on the start...
You're not wrong but it's been almost a year since they announced it was coming out... That's more than long enough that it shouldn't be broken and fucked on release still. If not to get the tech refined and working well wtf has the delay been for?
Assuming everyone has vrr is a strange assumption, the least upgraded component is usually a monitor, ive had two 165hz 1440p monitors for 7 years and don't really plan on upgrading until they break
Tim, I believe you've made a mistake @2:55.
FSR frame gen does not require fsr 3 upscale. It only needs fsr3 tech to be enabled at least with native aa mode enabled.
The way you've said it initially makes it appear that the upscale is needed. Native AA ofc, is not upscale.
Stick to the facts, AMD isn't locking DLSS out of FSR frame generation. It's Nvidia's own policies that make it unavailable
doesn't mean they shouldn't allow frame generation without any upscaling
You can create custom resolutions with lower refresh rate like 80hz, 70hz etc.. and benefit frame generation with no judder. You will be happy with the result. It is obvious its not finished feature, yet its a nice tool that will extend life of old GPUs with no brand separation and it should be appreciated.
Actually I see no difference between FSR3 FG and DLSS3.5 FG frame tearing in the video. Only the words from this guy set them apart. He has better eyes than me?
I still think FSR 3 is going to be amazing for those of us who still aim for 1080p 60fps on non VRR setups. Unfortunately PC gaming is in an odd state with poorly optimized software and overpriced high end hardware and this could help the low/mid range setups be playable at reasonable fps.
HU mentioned Daniel Owen! He definitely looks up to your channel so it’s cool to see you give him some shine ☺️
Been waiting for Tim’s review! Thanks 🍻
This all reminds me of the old Crossfire and SLI issues.
In my personal opinion DLSS3 looks blurrier than FSR3 at least in this examples, it looks like it's using some kind of TAA
I'm glad I'm not the only that noticed that. I hear one thing and see another.
I think you should be able to find only the generated frames (instead of guessing where those are), by using renderdoc or NVIDIA nsight. You should be able to see the pipelines and you’ll be able to see the DLSS for sure, not sure about FSR3 FG.
What is it like to see the difference on any of the examples? I am impressed yall see anything different. The world must be different for those with better vision than mine.
My whole problem with these technologies is the fact that I got into PC gaming to not need upscaling technology to get better performance. I think this is why I'm essentially becoming a mainly console gamer again because PC gaming is so expensive to get that intended higher performance without upscaling tech
A 6700xt is $310, rtx 3070 is $350
Both are stronger than consoles...you dont need to upscale with them.
@@mosesdavid5536Do you get the SSD, pro, motherb, power sup etc all for free with the GPU?
@@wildeyshere_paulkersey853 What stupid question is this? Are you expecting to build a rig for the same price as a console selling at a loss? Not mind 6700xt-3070 are 20-35% faster than ps5 not including rt
@@mosesdavid5536Them selling at a loss isn't my problem.
From what I have heard from AMD it seems as if FSR3 frame generation is deeply intertwined with the FSR upscaling algorithm itself (It even made a lot of sense to develop it). This is why you can't turn FSR off... "Native AA" still uses the FSR Pass due to how interconnected these technologies are. And a lot of the benefits would be lost if the frame generation tech decouples from the upscaling pass itself.
I think the main scenario for FSR3 at the time being is for 60fps games to make use of 120Hz 4K TVs with a VSYNC lock. Which could prove beneficial on consoles. You don't get any latency reduction from the tech, but at least the games can utilize that 120Hz and look smoother than if running at 60Hz.
Like all "smart" upscaling technologies, FSR 2 is based on re-projection, therefore, it makes TAA an redundant pass, as that too is based on re-projection.
They (AMD) already said, the FG uses the frames from FSR during the revealing conference. They can't just use the frames from DLSS. 😅
@@cocobos cant or wont?
@@emilioestevezz both
Motion smoothing in 2023. Nature is healing
If this works on the Xbox and PS5, then console players might finally get to play at 4k 120fps after all.
yeah all it will take is running the game at 1080p interally using FSR and not actually being 120fps
Sadly I doubt it. At least not with this first iteration. It seems like they have a long way to go.
@@Netsuko yeah completely agree, it's definitely a work in progress
AMD worked hard to creat an open technology, for ALL players and developpers and just for that, he win my respect. Like Freesynch vs G-Synch.
Now, it's interresting to know the difference with this two technology, I'm not very surprise by the result.
The difference that is FSR3 will be implemented in all games very quickly and for all graphics cards compare to the DLSS3 and the 0,1% of gamer who bought an RTX4000.
I'm not too thrilled either that these two companies are rushing into upscaling technology and that developers see this as an opportunity to stop optimizing their games. It's yet another way of making our hardware obsolete more quickly.
FSR works great with my 1080strix in Cyberpunk 2077. Very impressive it works with older Nvidia gpu. G-sync (no vsync)
decided to try out forespoken just as the new patch dropped and can't complain as i get a locked and smooth 120fps on my C2. there's some smearing in the corners at times and stutters when it loads a new section. the dlss included does look better but it only gives me a stuttery 70-80fps so i'll gladly use fsr 3 for this title.
I'm not going to be able to spot any of these differences and fsr 3 is more accessible so it's a clear win for me
Shoutout to Daniel Owen! He is a full time teacher yet has managed to cram in many FSR3 videos since it launched. Worth checking out!
Really likeable guy, I discovered his channel half a year ago. I think he's doing a great mix by presenting the videos in a casual, yet also somewhat detailed ways, which is not easy to achieve.
It's not really about weather FSR 3 destroys DLSS 3, it's more about AMD keeps reducing these exclusive features that Nvidia keeps coming up with to lock us down to their ecosystem.
In a way, what AMD is doing is reducing the need for DLSS by being more open, after all, most developers would rather support 1 over supporting 2 or 3, and all that's needed is the tech to be good enough and open enough that it works on a lot more hardware, as well as consoles, which I'm not sure if this frame gen will work on the current consoles, but it should do looking at how well it works on older gen gpu's on the PC.
DLSS is a good tech, but it's always going to be limited in its support being vendor locked, this kind of tech needs to be broadly open that it works on any gpu of the last few gen, and also works on consoles for it to really take off, FSR delivers that, XeSS could but it favours Intel hardware over rival hardware.
DLSS is the better tech for now, but the gap has closed so much that to most gamers and developers, it likely doesn't matter, good enough is what most in the mainstream will care about and FSR is in that ballpark.
Which amd card do you own?
Nvidia said G-sync needed extra hardware to support it as well and now we've all but moved on to AMD's implementation in almost every product but a few extra expensive ones. I hope FSR goes the same way, I mean at this point it does kind of prove you don't need AI cores to do it.
Don't forget that the consoles use AMD tech, so there is an incentive for devs to use FSR in their games from the get-go.
@@TheKazragore Unless NVidia manage to get someone to scream "AMD are Blocking DLSS!!!!" again.
9:45 - Saying that most gamers are using VRR today is like saying “why doesn’t everyone just buy an RTX 4090?”
Lol.
No man. Most gamers are not running VRR. In fact, 90% of PC gamers are running cheap monitors under $100 with 60-75hz refresh rates that do not even support FreeSync. Get. Real.
I would love to see frame generation for video. Some movies and most anime would greatly benefit. There are videos of Akira with 60fps interpolated render, it looks fantastic. To have that work on the fly for any video would be huge.
Disgusting. Cinematic fps is a standard for a reason. Fake frame on movies is awful visually
I use SVP 4 to achieve exactly this. I can live interpolate up to 120fps 4k only limited by my hardware. It works with youtube as well I love it great program.
Not the conclusion I was expecting. FSR frame Gen works brilliantly for me. VRR works on my 6950XT.
Will there be an updated look if/when AMD fixes the issues? Also, just out of curiosity, has there been any word on Intel implementing frame generation as well for XeSS?
HUB usually follows up when there is a new variants or improvement with this tech, so no reason to think they won't with this FG tech. As with Intel, nothing I have heard officially or leaked, maybe when Battlemage will release, they will introduce new tech.
Yeah but you missed what it’s trying to give the end user (wether they use it or not) between team green and team red team green might be all that but at a rather significant cost, I think later on down the track fsr will work great.
Did you do any recent works on the studio lighting or change to the recording camera? This video looks of the studio, Tim and the background looks stunning.
Jesus christ look at all the people having something valuable to say on a 31 minutes video which was uploaded not even 10 minutes ago
Because you don't have to watch the entire video before commenting on something in the first 10 minutes of the video...
Stay in school.