Kudos to the GPU that allowed the rendering of you guys in the same room.
*Q-Dos. Rich prolly over-clucked it
It’s actually good that Rich is bald, keeps the performance up.
@@zombiebillcosby Nvidia HairWorks is turned off eh?
@@kazumakiryu7559 absolutely! Have you seen how much that tanks performance? It’s no coincidence John has really short hair too. Alex basically takes 50% of the render budget.
@@kazumakiryu7559 I am quite certain it is rendered on an AMD GPU. Rich is moving faster and feels snappier than in real life because of frame generation. Plus anti-aliasing does wonders, I've never seen gamers with such good skin
Looking at the chairs, you can easily get the pecking order at DF. The luxury throne for the overlord, the design chair for the underling, the hardwood bench for the petty foot soldier.
So true... Send love to Alex 😅
I don't think I've ever seen these three in the same room. Awesome shirt, Alex! And hey, Rich and John!
If this works on all GPUs or APUs, this is gonna be an exciting time for handheld enthusiasts like me with the SD or the Ally!
Frame gen only works well when you're already hitting high frame rates, so no.
thats so wrong lmao @@user-im9zp4yp9x
Idk who told yall that but it will only work well on RTX cards. Just buy a newer card already
@@xiyax6241 You're talking about DLSS, I'm talking about FSR, get your facts straight.
@@YouCanCallMeMich I do have my facts straight. FSR 3 will not work the same on GTX cards. It will work best on RTX cards. It's not gonna be like a GTX 970 will have double or triple the fps. Doesn't work like that
It's nice to see you guys together in person ❤
Can't wait for your FSR fluid frames / DLSS frame gen comparison!
Everything FSR 3 is insanely exciting.
I wonder if PS5 will use its FMF or "frame interpolation" to increase fps in situations where fidelity is raised in 60 Hz modes.
Let's say 60 Hz locked games get a graphical update and some resolution bump, and then lock the new interpolated frames to a v-synced 60 fps, making it more stable and faster than normal regular upscaling.
Or it might only be used to get 4K 120 Hz more easily in some games with no loss of graphical settings, while retaining the 4K 60 Hz resolution. The resolution in question is almost always 1440p or 1800p in 60 fps "performance modes", while fidelity modes are almost always 2160p and locked to 30 fps.
Would it be possible to cheat the fps and resolution balance and bump it up, like a cheated boost to 2160p with greater upscaling from FSR 3, abusing the FMF "frame interpolation" to get a stable 16.6 ms 60 fps with greater fidelity and graphics?
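A rough frame-budget sketch of the tradeoff being asked about, in Python; all timings are hypothetical illustrations, not measured numbers:

```python
# Rough frame-budget sketch of the tradeoff described above.
# All timings are hypothetical illustrations, not measured numbers.

def fits_budget(render_ms, upscale_ms, framegen_ms, target_hz):
    """Check whether render + FSR passes fit the displayed-frame budget.

    With frame generation, only every other displayed frame is fully
    rendered, so each real frame gets two display intervals of budget.
    """
    budget_ms = 2 * 1000.0 / target_hz  # two vsync intervals per real frame
    return render_ms + upscale_ms + framegen_ms <= budget_ms

# e.g. a 25 ms render (40 fps) upscaled and interpolated to a 60 Hz target:
print(fits_budget(render_ms=25.0, upscale_ms=1.5, framegen_ms=2.0, target_hz=60))
```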
Yes!!
I wonder how it's going to work in 60 fps locked PS5 situations
@@MatrixModded Fidelity increase most likely
What if they only add the frame interpolation to 120 Hz modes in some games to up the fidelity there 😂 that would be so disappointing.
I want better graphics, better resolution and a locked 60 fps.
I think this would definitely help on my VRR TV, because when it's in the 48-119 fps range there is ghosting on the TV with VRR enabled, but with it locked to 120 fps with FSR3 it would clean all that up.
Although I am more of an Nvidia guy, I can't wait to see this and I am very excited AMD finally has something more like DLSS! Hell, if it can even reach the consoles, that would be amazing! Is it feasible?
Given it will run on "older" PC hardware... surely yes.
If even AMD says it works "best" if the game can already maintain a stable 60 FPS, then that probably means that the game NEEDS to maintain 60 FPS and that as a result it won't be feasible (or much of a benefit) on consoles.
They are showing it off with high refresh rate monitors on PC, that seems to be the target use case. 60 > 120 FPS and such.
@@steel5897 It's also said that maintaining 60 FPS can include the frame rate increase you get from enabling FSR upscaling
@@steel5897 It's the same with DLSS. You need 60 fps so the AI is less glitchy when reconstructing a frame. The lower the FPS, the less information the AI has to make an accurate frame.
@@steel5897
Right, the lower frame rates have two problems. One, the lower the FPS, the more latency you add with frame generation. The second issue is that there's less temporal data, so the reconstruction quality is lower. Probably GHOSTING is more obvious because that last frame's data hangs around when not desired. BUT... I'm sure the temporal issue can be mostly resolved once games are optimized with FSR3 in mind. I have NO doubt.
And 60FPS isn't an exact cutoff. If we assume you have a 120Hz VRR display, then 50FPS-> 100FPS would probably make sense for SLOWER games, where a slight increase in latency in camera movement etc. is worth the tradeoff to see objects move more smoothly on screen. Having said that, most TVs are 60Hz with VSYNC, so you probably want to just design the game to work at a solid 60FPS VSYNC'd in that situation and dynamically drop the RESOLUTION, not the FPS.
The goal with SHOOTERS attached to a TV would be, at minimum, to aim for 60FPS-> 120FPS (with frame gen) as a "Quality" mode option, but aim for 90FPS+ native frames for a performance mode (no frame gen).
So there's likely a bunch of OPTIONS now and in the future depending on how fast-paced a game is:
30FPS (no FG; highest quality)
30FPS-> 60FPS (FG; PROBABLY never going to be used?)
60FPS (no FG)
60FPS-> 120FPS (FG)
90FPS+-> (no FG)
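A minimal sketch of that mode ladder in Python; the thresholds and mode names are taken from the comment above as assumptions, not anything AMD has published:

```python
# Minimal sketch of the frame-gen mode ladder described above.
# Thresholds and mode names are assumptions, not AMD's API or guidance.

def pick_mode(native_fps, display_hz, vrr, fast_paced):
    if fast_paced and native_fps >= 90:
        return "90FPS+ native (no FG)"         # shooters: latency first
    if native_fps >= 60 and display_hz >= 120:
        return "60FPS -> 120FPS (FG)"          # smoothness for a 120 Hz panel
    if vrr and not fast_paced and native_fps >= 50:
        return "50FPS -> 100FPS (FG, VRR)"     # slight latency hit, smoother motion
    if native_fps >= 60:
        return "60FPS (no FG)"                 # 60 Hz panel: drop resolution, not fps
    return "30FPS (no FG, highest quality)"    # FG below 60 likely not worth it

print(pick_mode(native_fps=65, display_hz=120, vrr=True, fast_paced=False))
```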
This is great for emulation, which is sometimes tied to the FPS and so on.
Absolutely, that's what I'm most excited for. Hopefully they get it working in Vulkan as well as DX11 & 12.
You are in luck, they announced all DX11 and DX12 games are supported. I think modders will have a field day with this one @@NBWDOUGHBOY
@@Tameem3000 Hopefully. If they mod it into Vulkan I might actually consider buying one of their cards for an Emulation Build.
I don't think you have to, as FSR3 will work with Nvidia cards (if that's what you are using now). That being said, there is a Vulkan API for AMD, so who knows, maybe down the line we will see FSR3 support for that, or mods. @@NBWDOUGHBOY
The frames are generated by the GPU as it outputs to the monitor. It will have no effect on emulation because it is not giving you more fps during all of the calculations.
if they could get FSR 3 on the consoles, that would be really cool!
I am excited to see how this goes head to head with DLSS 3.5. The best thing about FSR 3 is that they've addressed previous FSR versions' ghosting issue. Let's see how this tech performs in Cyberpunk.
oh man the ghosting with DLSS ain't good either bro. Looks like every car shoots tar out of its ass onto the asphalt.
As long as it gets rid of the shimmer I'll be happy, have to forego RT in most games for a clean raster look on my 7800xt
It’s a major miss if some form of FSR 3 isn’t in Starfield on release or soon after otherwise what was even the point of partnering?
Two big concurrent developments. Starfield's release date must have been planned 7-8 months before release. FSR3 must still have been in flux and they needed the extra time
Nope. The reason might be that FSR 3 came in late and Bethesda didn't want setbacks to the launch date. I would say a wise decision by Bethesda, but the game is patchable, so it can be added later in updates.
@@duladrop4252 Wise and correct answer. A shame those braindead Nvidiots cannot compute ;)
Starfield would have to target a 60 FPS minimum to use FSR 3.
@mmstick Starfield hits 60 but it's unstable, that's why Bethesda locked it at 30fps. DF even said so, it was super stable, even in combat in intense areas, so it could be used with FSR3
love seeing y’all together
Hopefully they can use this tech to improve the graphical fidelity in GT7 VR.
Man.. I ordered psvr2. Can’t wait to try gt7 vr
The one thing you don't want in VR is input lag
@@davidwales9657 You can use FSR Native without the framegen to improve fidelity with no added latency I think
Mind melted: the gents not only in the same place at the same time, but evidence that they have bodies beneath webcam crops… are you all conventionally rendered, or is this an example of FSR 3?!
Really appreciate the 2160p upload instead of 1080p 🙌
Who had the idea of releasing this first on Foreskin, nobody even plays this game anymore
AMD announced that it will work with every single DirectX 11 & 12 title.
Foreskin 😂
my 1080 ti still in the RACE !!!
I don’t have a desktop pc, but considering the messy state of newer GPU’s and their prices I’d be tempted to get an old 1080ti for photography and gaming. But I guess AMD would be the better value going forward, especially with FSR 3.
My man, FSR 3.0 frame interpolation won't be available on GTX 10 series cards :/ Only the "classic" upsampling will be, sadly, so no crazy FPS improvements are to be expected
Rtx cards only.
Loving DF clips as I don't often have the time to watch the full podcast style show
It's also great how it breaks subjects of discussion out to be more easily-referenced discrete videos.
3:10 I also tried it before with the CyberFSR2 mod, which replaces the DLL files for DLSS 2.x and makes it use FSR 2.1 with some glue code - it works anywhere between OK and perfect, depending on the game, and you can set custom scale factors for every preset, including 1.0 (native)
You didn't try FSR3
@@vapecat1982 obviously. how am I supposed to try it? I was just saying that FSR 2.x can technically do it
Isn't IDKFA a double negative?
Looking forward to a comparison of Immortals of Aveum's console settings with FSR2 reconstruction vs FSR3 reconstruction.
Because the amount of ghosting and noise was ridiculous with FSR2.
No way it's going to work. They said you need a minimum of 60fps. How will FSR3 work when you get 480p 60 fps with max FSR 2?
John's last line about hoping FSR is as hot as the room got me good. 🤣🤣🤣
But what about Vulkan? How will I play my stuff on SteamDeck with FSR 3 and all!?!?!
I'm sure it will be applicable, the same way FSHack is already enabling FSR in all Vulkan games right now on Linux.
@@mmstick Nice
Can you use fsr interpolation together with dlss upscaling?
As long as game developers continue to prioritize high fidelity graphics, true 4K gaming will never happen. Recent game releases can cripple the performance of an RTX 4090, which was considered a true 4K gaming GPU a year ago. Now the 4090 can't even give you decent framerates when using ray tracing, and can only give decent framerates when using DLSS, which is purely fake resolution. 2 years from now the RTX 4090 will be called the best 1440p gaming GPU, just like what happened to the previous generation flagship GPUs, and this cycle will never stop unless game developers stop innovating. 😢
you are a very verbose baby
I have an RX 6800 XT and I can barely play at 4K 60 even with FSR; I just realized that I'll always need a high-end GPU for that. I thought 4K gaming would be more mainstream after all these years... 8 years ago I was playing brand new games at 1080p 60 with a low-end GPU for $200. Now you can barely play at 1080p with $300. And 4K only starts to make sense from $1000 and up. Things only get worse every year.
It's possible that when you pair an RX 70xx with a Ryzen processor and activate SAM, you could inject into the processor (when the GPU is 100% utilized) some of the tasks FSR 3 needs to do. Then SAM tech would be more relevant, transferring some heavy tasks the GPU needs to do to the processor instead! (Somehow Ryzen processor utilization in games is very low.)
RTX4070 owner here... I'm rooting for FSR3 to be good!
We're in a really WEIRD situation where DLSS is better in general but FSR exists in games that DLSS doesn't. I'm not on "Team Green" or "Team Red" I'm on "Team ME." Or perhaps, just "Team Gaming." So I'm rooting for COMPETITION that pushes things in the right direction. Same logic applies to Intel XeSS etc.
Bs. Pushes things in the right direction and you bought a 4070. Gamers are just all bark no bite.
@yellowflash511 He bought it because he had the money, and he gets to have DLSS. He said he's Team neither, plus, without DLSS we probably wouldn't have FSR to comepte with.
Compete*
2:52 I think Genshin Impact actually beat Red Dead PS4 to the punch when they replaced their TAA with FSR 2 in a patch over a year ago
Is FSR3 just frame gen? Do they improve the upscaling tech?
yes they improved it.
It's virtually the SAME 'tried and tested' upscaling tech that both AMD and Nvidia use - there is NO 'AI' unit in every Nvidia card, obviously !!! Today's AI is nothing more than a marketing gimmick amounting to no more than wide registers + data sets. Real AI would decide we are too risky and wipe us out in less than a cycle.
no they do not
yes they did pal
@@Yuilix Pumping out virtually literal copies of a previous frame is NOT "improve the upscaling tech" - quite the opposite!
I wonder if VFG or variable frame gen will exist, where you target an FPS (say 60) and the game compensates for the lack of frames to hit the target. Say you are running 60 fps locked and a sudden explosion happens on screen, or an intensive scene shows up and you drop to 56-57 fps. Instead of seeing judder (due to doubling of frames to match the 60 fps target), the game engine starts inserting fake frames where there would have been a doubling of the same frame, so the game perceptually looks like smooth 60 fps, with fake frames added where needed instead of continually producing fake frames between every real frame.
I just don't know if it's even technologically possible to generate fake frames in varying intervals like that.
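A toy sketch of that idea in Python; purely hypothetical, since no shipping frame-gen works this way today:

```python
# Toy sketch of "variable frame gen": interpolate only when a real frame
# misses the 60 fps deadline, instead of generating between every frame.
# Purely hypothetical -- no shipping frame-gen works this way today.

TARGET_MS = 1000.0 / 60.0

def display_slots(real_frame_times_ms):
    """Yield 'real' or 'interp' per display slot for a 60 Hz target."""
    for t in real_frame_times_ms:
        if t <= TARGET_MS:
            yield "real"      # frame arrived on time, show it
        else:
            yield "interp"    # pad the missed slot with a generated frame
            yield "real"      # then show the late real frame

# A brief slowdown (say, an explosion) gets padded instead of juddering:
print(list(display_slots([16.0, 16.2, 18.1, 17.9, 16.1])))
```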
it is possible, but it certainly won't happen on this first generation of this tech.
@@GraveUypo If that happens, that will be perfect.
For Saints Row 2022 they added a kind of FSR 2 native mode, where you can put in a custom resolution scale while using FSR 2, but the implementation is bugged for me
It doesn't matter if FSR3 is as good. I was never sold on DLSS 3 because not every game supports it; FSR3 is making that happen. I bought a 3090 Ti a month before the 40 series launch and I just thought for the price the 3090 Ti was worth it. The 4080, for 100 bucks more, had less VRAM and fewer cores. Now we know the 4080 is a bit faster, especially with DLSS3. Even if FSR3 is a little slower or not as good, I really don't care. I'm all in for FSR3! I'm just going to skip the 40 series and get a 50 or 8000 series GPU next year. I would have been okay with a 7900 XTX if I had just waited, honestly.
A bit faster? The 4080 is definitely quite a bit faster, especially with DLSS and Ray tracing. The cache setup of Ada gives upscaling from lower resolutions even more performance.
@@shebeski only if you play those types of games. I got 24GB of VRAM; the 4080 never will. The 1% low fps on a 4080 will be lower. Sure, in most scenes the average is like 20 to 30fps more, but at 150fps already it's just a waste. The 5080 will be out soon and you will be talking about how trash the 4080 is.
@@SomeRandomGuy369 the 1% lows are not lower... The 5080 won't be out soon.
@@shebeski 12 months is long for you i guess.
Correction on FSR 2 AA being seen first on RDR:
Genshin has had FSR 2 as an anti-aliasing solution for a while
So this is not designed for having a game run around 40-50 fps bringing it to 60 fps vsync? Will it not work or just look bad?
It adds frames but doesn't reduce the input lag. It will be used to get high refresh rate gameplay, like 120 fps. Consoles will render at 60 and use FSR 3 to get to a 120 target
Will this be available on Series X and PS5?
Yes, AMD said it also works on consoles
Only on Xbox 😮
The future is green 💚
FSR has made my decision between the 7900 XTX and 4080 much harder
Do the 7900xtx - I have one, no regrets at all.
Can someone explain to me exactly what AMD's AI Accelerators (Tensor Cores) do?
I heard a rumor months back that while FSR3 will work with RDNA cards, to give a little bit of an incentive to pick up a 7000 card, FSR3 would use the Tensor Cores to give it a performance bump, despite not locking it to them a la the RTX 40 series.
This obviously isn't happening, so I am still confused as to what the Tensor Cores do on RDNA2 and RDNA3.
RDNA2 does not have any dedicated AI cores
Tensor cores are just matrix math units operating on vectors (FP32 on Nvidia, FP16 on AMD).
They're not needed for FSR3.
On AMD, only RDNA3 has them; the feature is called WMMA (Wave Matrix Multiply Accumulate).
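For illustration, the operation such a matrix unit accelerates is just a tiled multiply-accumulate, D = A x B + C. A plain-Python sketch of that op (real WMMA fuses this per wavefront on small FP16 tiles):

```python
# What a matrix/WMMA unit computes, spelled out in plain Python:
# a tiled multiply-accumulate, D = A @ B + C. Illustrative only --
# real hardware fuses this per wavefront on small FP16 tiles.

def wmma_tile(A, B, C, n=4):
    """Naive n x n matrix multiply-accumulate (the op the unit fuses)."""
    D = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            acc = C[i][j]
            for k in range(n):
                acc += A[i][k] * B[k][j]  # one fused multiply-add per element
            D[i][j] = acc
    return D

identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
zeros = [[0.0] * 4 for _ in range(4)]
print(wmma_tile(identity, identity, zeros))  # identity @ identity + 0
```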
I'm excited for FSR3 even as an Nvidia user, but I am wondering how useful it can be in certain situations, as it is limited by how much async compute the game itself uses - and don't new games typically use a lot of that already? Also, with vsync on, isn't that going to add more latency to a frame gen that already has extra latency?
Probably, but would the potential increase in latency be large enough compared to controller and panel latency to be significant?
@@MrSlipstreem probably yes, though on a controller people don't really seem to notice latency as much or at least don't seem to care as much.
I'm sure there's a ton of people who were yelling "but latency bruh, fake frames" before..., but now they're rejoicing.
Still not that interested, I'm playing at 4k 60fps and you need about 60fps anyway otherwise frame generation sucks. Maybe useful for 1440p 120hz/144hz?
You mean mostly the AMD fanboys, who have to be the most hypocritical jerks around. They'll U-turn on a penny whenever something changes to suit them. They'll then either completely lie that they never had a negative stance on the subject matter to begin with, or will make out that somehow everything is now completely different, since AMD has joined the party.
Yes, the difference is you don't have to buy an expensive new GPU with no real performance gains, so it's understandable why people would hate on Nvidia and are excited for FSR 3
@@fafski1199 better to simp and be loyal to a company that doesn't rip your wallet off.
Don't get me wrong, Nvidia makes a good product. But overprices it.
What's next, you gonna poor-shame me?
@@newsciencestuff5540 Better yet, don't be loyal to ANY company, just buy what suits your wants/needs and leave it at that.
Don't specific games need to be augmented to even use FSR3?
Funny how this technique makes freesync redundant, as you pad missing frames with generated ones.
The async approach can do wonders: while the CPU is setting up the frame, the GPU is idle, and in CPU-bound games the problem is compounded, so it can add frames without much latency, unlike DLSS 3.
DLSS 3 also uses an async approach, but it is different from the approach used in traditional async rendering. It also performs well in CPU-bound situations, and has Reflex in addition to resolve the latency issue.
"While CPU is setting up the frame, GPU is idle"
Not true at all. Most games for quite a long time now queue at least one frame to prevent this from happening. That is why Reflex exists, it adaptively changes when the CPU starts pre-rendering (generating a frame) so that the GPU only gets it right when it finishes the previous frame.
@@pixels_per_inch you can't queue up frames like that, I don't know where you got the info but no one has ever done it, because queuing frames will 1 - add input lag, 2 - the GPU doesn't know what the next frame would look like without physics and the other sim work completed, unless you do interpolation like DLSS 3, and even then there will be artifacts. E.g. someone firing a high speed rocket launcher: the GPU wouldn't know whether in the next frame it's exploded or not, since that's the collision engine's job. I'm not sure what Reflex does, but even the CPU can't start working on the next frame without the next IO input - e.g. if there is a sudden mouse movement in the opposite direction, all those pre-rendered frames have to be thrown away.
@@yakovAU Yes, DLSS also does async, but it's their proprietary driver-level async, which runs RT, tensor DLSS and raster all in parallel; it's not DX12 async compute. DLSS 2.0's compute denoising is the only thing done via async compute.
@@pixels_per_inch "While CPU is setting up the frame, GPU is idle" its always true, since GPU is pipelined, it has to wait for previous step in pipeline in this case input assembler and other data needed via CPU, GPUs are still quite dumb, and can't do anything without CPU..
One major question I have is: does FSR 3 rely on the CPU or the GPU? Because many games nowadays are CPU limited, and with upscalers it puts more strain on the CPU. If FSR 3 is software based, it certainly will put even more strain on the CPU
GPU. It's written in HLSL and runs asynchronously on the GPU so it should help overcome CPU bottlenecks like DLSS 3 does
@mar2ck_ In that case I don't have high hopes for FSR3 with nvidia GPUs
Will FSR 3 improve the upscaling algorithm in order to fix the typical ghosting and shimmering exhibited by FSR 2.x, or is it just meant for frame interpolation?
I hope so, otherwise I think it doesn’t really matter. The ghosting is really really distracting.
I seriously hope it does bring improvements to upscaling. Surely frame interpolation is a much more difficult task than temporal upsampling, right? There's no way that the same old FSR2 algorithm is powering FSR3, therefore it should be viable to backport the improved technology to the upsampling process.
Frame gen is okay, but it's more of a 'rich get richer' type of deal - it's only viable if you're already hitting 60. Upscaling is far more relevant to those on lower-end machines.
Didn't FSR 2.2 fix that to an extent?
@@विचित्रलड़का considering there is ghosting in Starfield I’m guessing not.
I’m all for FSR 3.0 as I’ve been using DLSS 3.0 since release. Is FSR 3.0 fake frames as well or no 😂?
Yes it is. Look at a video on YouTube with DLSS 3, slow down the playback speed and you'll see every fake frame (ghosting, artifacts, missing objects, people without arms, legs etc.), but at normal speed you won't notice it (maybe at 30 fps you can, but over 60 fps it's hard)
I want FSR 3 on ray-traced TW3 🥵
Cyberpunk 2077 is on the list😊🔥
Supports DLSS 3.5 too.
Any good games?
*ASYNC Compute not shaders*
Well, that explains why the Nvidia GTX 10 series and previous are not supported for frame gen. So AMD needs RX 5000 but recommends RX 6000 or better. And Nvidia requires RTX 40; however, it's clear they COULD support the RTX 20 and RTX 30 series, and that it was an intentional marketing decision to push frame gen only on their latest hardware. Sure, perhaps the RTX 40 will run things better, but if AMD's frame gen works on an RTX 2060, then obviously Nvidia could make it work too.
DLSS 3.0 needs specific hardware. Rtx40 cards have tensor cores specifically for frame generation. It's a little different...
@@Prisoner_ksc2-303 To my understanding the Nvidia 3xxx cards do actually have optical flow hardware on them, but it's disabled or at least not available; not sure if that's hardware or software at fault
@@mduckernz it's disabled! Nvidia just wants to sell the next generation! Some engineers say that even the 2000 series could run it, but the top brass decided to disable it to sell the next generation of GPUs.
60 fps is recommended, yet in the screenshot it shows 36 fps?
Recommended, not said to not work below 60FPS, I think.
I hope FSR 3 will give my 1080 GTX (non Ti) a little more life.
FSR3 only works on rtx 20 or above according to AMD
@@donsly375 RIP me.
@@Turok1134 FSR 3 upscaling still works on it, however frame generation is not supported.
@@donsly375 No, FSR 3 upscaling still works on GTX 10 and 16 series, it's only frame generation that doesn't work.
@@castlemein5765 the other guy clearly means frame generation by FSR3 and not just the upscaler
now imagine if Forspoken, Starfield, Jedi Survivor and so forth, all become properly optimized how much more FPS you'd get :) and if you already have a 7900xtx, why would you need this technology, wouldn't it benefit lower tier cards?
I’m praying to god AMD catches up with NVIDIA ! so ps5 games can look amazing!
yes yes yes give me those sweet features on Steam Deck
> min 60 fps internal
> steam deck
Will FSR Frame Generation even work on steam deck?
It doesn't support all GPUs "just" most of the CPUs
It will, but the latency and artifacting will be very high at such low native frame rates. Though humans get used to many things, I'm sure some won't mind it.
Great for flight simulator @@branchprediction9923
You laugh, but Spore-Folken sounds like a helluva horror game. 😁
Did you see scaling artifacts with fsr 3? Is it closer to dlss?
does this room even have AC installed??
5:10 I think FreeSync doesn't work with vsync, if I remember right
The problem is, no fucking games use it
AMD needs to get their shit together and support developers of most popular and newest games to implement all this
10:47 out of those 12 games, only 2 have large player base
Maybe 'Starfield' SHOULD have been the launching platform for FSR 3. If only they had actually worked together, AMD and BGS, because FSR 2 in 'Starfield' seems downright "broken". I mean, you can only turn it on and then use a resolution slider, rather than having even different quality modes, and supposedly the 100% setting with FSR on performs slightly worse than native, with also slightly worse visuals because of artefacting and such. It just adds nothing but a generic resolution slider.
This is nothing like the jump you'd see in a game like "Hellblade" or, even more similarly, "AC: Valhalla", in which it was added after development, and both look similar to 'Starfield' in ways. Different engines, of course, and 'Starfield' works a bit differently, with its engine seemingly being a bit of a pain to deal with. But still, you at least got to run those games at significantly higher resolutions with very solid performance, while in 'Starfield' FSR 2 just doesn't help at all.
Very disappointing, to say the least, especially considering it's a bad performer at base level (my 6950XT struggles at 1080p), and I hope they will keep improving on it or eventually implement FSR3 (I have a feeling it will be a great QoL feature down the line). Though perhaps AMD still needs to figure out the drivers for it as well, because it's still early days.
Of course it performs worse than native when set to 100%…
Native wouldn’t be doing anything at all, FSR is a process, it adds processing time.
Just the same as games perform worse when enabling DLAA
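The arithmetic behind that reply, with hypothetical pass costs:

```python
# The arithmetic behind the reply above. Pass costs are hypothetical:
# at 100% scale there is no render saving to pay for the FSR pass itself.

def fps(render_ms, fsr_ms=0.0):
    return 1000.0 / (render_ms + fsr_ms)

print(round(fps(16.7), 1))              # native:             ~59.9 fps
print(round(fps(16.7, fsr_ms=1.0), 1))  # FSR at 100% scale:  ~56.5 fps, slower
print(round(fps(9.0, fsr_ms=1.0), 1))   # FSR at ~67% scale:  render cost drops, net win
```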
alex's laugh is the best.
IDKFA, what does it mean?
Edit: Other dude got big balls.
The cheat code for Doom 1993 on PC.
@@V3ntilator Aah cool, thanks. I knew it looked familiar.
@@mrnicktoyou You're welcome. Probably the world's most famous cheat code.
Almost impossible to forget it. ;)
@@V3ntilator I still remember ABACABB
@@mrnicktoyou Nice. I forgot that one.
Better, they don't lock it behind a new graphics card. Instant W.
I think Nvidia is gifting the gpu market to AMD and moving on to AI.
Thinking about it more, you might be right. But I'm having a bit of a sinister thought about it, though. I'd see them "lose ground" in the GPU market just so they have a future underdog position, to then introduce a new "line" of hardware with AI technology they prepared from having a head start in AI, with DLSS just testing the ground. But this is all deep speculation. :)
What games have good DLSS? I find that in most games that have it, it makes the game look worse.
The fact you can enable FSR on every DX11 and DX12 game is a huge deal for all budget GPUs.
My respect grows more and more for AMD.
FSR3 will compete with CPU/GPU resources to do the frame generation, unlike DLSS3, which has actual dedicated hardware SEPARATE from the rest of the GPU. That's why DLSS3 can transcend CPU limitations. FSR3 will only be useful where there are enough GPU resources left over to generate a frame AND where there is enough time for processing (just like how all DLSS and FSR technologies have an inherent latency in and of themselves). Even with async compute, most games use as much of the frame as possible to render. On top of all that, AMD themselves say FSR3 can only be used with a 60fps base framerate, which limits the possibilities even further, especially on consoles.
that's fundamentally wrong. if DLSS 3 didn't use GPU resources, it would double the frame rate every time. it doesn't, sometimes it even LOWERS the frame rate (Forza Horizon 5 max settings on a 4060), and that's because it does use GPU resources.
plus, the fact it can "transcend cpu limitation" has nothing to do with that. that happens because the fake frames don't need physics, AI, game logic or anything else processed, because they're not real frames. so the CPU is entirely irrelevant. it's basically in-betweening for real-time renderers.
also AMD says it's best used with 60fps, not that it can only be used with 60fps. that is also true for dlss3. it's a latency issue.
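A sketch of that point in numbers; all the millisecond figures are made up for illustration:

```python
# Sketch of why frame gen can exceed a CPU-bound cap: generated frames
# skip the CPU simulation entirely, at the cost of some GPU time.
# All millisecond figures are made up for illustration.

def output_fps(cpu_ms, gpu_ms, framegen_ms, fg_on):
    if not fg_on:
        return 1000.0 / max(cpu_ms, gpu_ms)       # bound by the slower side
    # One real + one generated frame per cycle; FG takes some GPU time.
    cycle_ms = max(cpu_ms, gpu_ms + framegen_ms)
    return 2 * 1000.0 / cycle_ms

print(output_fps(16.7, 8.0, 3.0, fg_on=False))  # ~60 fps, CPU-bound
print(output_fps(16.7, 8.0, 3.0, fg_on=True))   # ~120 fps, CPU cap bypassed
print(output_fps(5.0, 8.0, 3.0, fg_on=True))    # GPU-bound: FG cost now shows
```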
Hmm
@@GraveUypo makes sense
@@GraveUypo the drop in performance is not because DLSS3 takes resources, but because the dedicated hardware couldn't generate frames faster than the GPU spitting them out, so it becomes a bottleneck, IIRC
I'm excited to see how little FSR 3 will help on the Steam Deck. Hmmm, by excited I mean prepared to be disappointed. It's interesting nonetheless since my RX 480 is a bit long in the tooth and I'm overdue for an upgrade.
Damn, those are some huge coke lines on that table...
everything looks sharp but them.
Getting excited for a tech that lets you play at a lower resolution, is going backwards for me.
This will just let devs be even lazier.
Im waiting for Intel.
PS5 FSR 3?
I myself prefer Nvidia with my 4090, but frame generation is awesome, so it would be nice to see AMD succeed with FSR 3. It can still be beneficial to people like me in games that don't support DLSS 3, like AMD-sponsored titles. For example, Starfield.
u should buy a brain not a gpu
@@doubledigital_ Why? You never buy things for yourself?
Maybe a phone every 1 or 2 years? Such a stupid comment.
The RTX 4090 is an amazing GPU. Don't be jealous
are u mad, YOU PEASANT @@doubledigital_
John looks to be in fine shape
Alex what is this hair man
The answer is no. Especially if it's only shown on Forsaken.
Doom cheatcode!
As good as DLSS 3, so as good as trash XD. Ah, love it. I wonder when DLSS 3 will be completely dropped and forgotten? I mean, if it can lower latency, then, maybe it'll hang on, but I doubt it.
I'm pretty skeptical about the quality. FSR 2 was touted as similar quality to DLSS, but it's light years behind.
yeah i agree with you, FSR 2 is so bad in Cyberpunk and RDR 2
Agreed, it's insane how bad FSR2 is in all the games I have tried it in. DLSS is in an entirely different universe right now.
Even the 2.2?
Seems like DLSS 3 and FSR 3 produce similar results to frame interpolation on smart TVs. It's largely the same story in terms of motion clarity, input lag, visual artefacts and the current limits of the technologies. The biggest difference seems to be that a TV's motion interpolation cannot be used in conjunction with VRR. Personally, I use motion interpolation in combination with BFI and the result is incredibly smooth 60 FPS gameplay, with the equivalent persistence blur of 240 FPS gameplay.
DLSS and FSR have more data to produce interpolated frames. TV interpolation doesn't have access to motion vectors, depth buffer, occlusion mask, and so on.
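A toy 1-D sketch of that data gap; the positions and vectors are hypothetical:

```python
# Toy 1-D sketch of the data gap described above: an engine can reproject
# a pixel along its known motion vector, while a TV must estimate motion
# by comparing finished images. Positions/vectors are hypothetical.

def game_interp(prev_pos, motion_vec):
    # Engine-supplied motion vector: place the pixel halfway along its path.
    return prev_pos + 0.5 * motion_vec

def tv_interp(prev_pos, curr_pos):
    # No vectors: guess the motion from the two frames (optical-flow style).
    estimated_vec = curr_pos - prev_pos
    return prev_pos + 0.5 * estimated_vec

print(game_interp(prev_pos=10.0, motion_vec=4.0))  # exact midpoint: 12.0
print(tv_interp(prev_pos=10.0, curr_pos=14.0))     # same answer here, but the
# guess breaks at occlusions, where the engine's depth/occlusion masks help.
```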
@@bltzcstrnx interesting, I don't know what those are. But given how good interpolation on TVs can look, I'm actually surprised it hasn't been implemented in the gaming sphere sooner.
@@kristiangurholt59 most likely because of latency. When watching TV you don't interact with it. So delaying frames doesn't really matter as long as you keep the audio in sync. With games, players do input, which when the responses are delayed is definitely noticeable.
@@bltzcstrnx I understand the argument about latency technically, but outside of competitive online gaming I really don’t think it’s an issue. At least it’s never been for me, even playing on higher difficulties. It’s one of those issues I genuinely think gets blown out of proportion in regards to how much it affects the enjoyment of gameplay. Personally, I’ll trade an extra 1/10 of a second any day to avoid jarring and blurry gameplay.
@@kristiangurholt59 for some sensitive people is not about difficulty or competitiveness. Playing an unresponsive game for these people is actually nauseating, some kind of motion sickness you could say.
Edit: I would describe it further like using VR goggles. Like there's a disconnect between your actions and your sensory perception.
Very promising tech
Oh fuck. If FSR 3 comes to consoles we're gonna see them lazy devs making games in native 720p 15 fps, slapping on FSR3 as a band-aid
stay true, stay native, stay sharp, say no to DLSS & FSR
FSR in Far Cry 6 sucks so hopefully they have sorted it out
I'm going to say what most probably won't. You guys need to focus on your health a bit more. 👍
And this is what I thought: with FSR3 running as a compute shader, it's going to cut into the performance of the GPU and further increase latency. It does however give them more flexibility in improving it through software, whereas Nvidia will be more hardware-locked and major improvements will require a new card. In short, Nvidia FG feels sluggish, and FSR3 will be way worse. This is also why they wouldn't let you try it.
Have U seen the release date, too (that would be the real news, here)…? 😏🙄
It doesn’t use AI efficiently, It’s just shitty fsr2 with frame gen
And? Its usability on all gpu is a big win
Console 60Fps yeah 😃
4k60fps 🎉 maybe even RT
For years it's been like this: when AMD launches something, it's always supposedly going to be better than Nvidia and be the end of Nvidia, but some time later it's always the same: BS!
DLSS 3.5 is way better than FSR 3
I’ve given up hoping that AMD can do anything as good as Nvidia, even years later; I’ll believe it when I see it. If they had any confidence in the technology they’d let people play freely instead of just showing curated scenes in just two games. It would be great if this works well and can be brought to consoles, but I’m not gonna hold my breath.
Well if AMD is going to block other upscaling technologies and FORCE me to use it instead of my preference, then they better make it fucking good. They already ruined 3 games i was excited to play because their shitty business practices.
comment section is the reason gpu prices are a joke
try learning how games are made it helps
PS5 does it again
Middle
@shadowban1777 lol I saw 2 others saying first and last so I did the only other one 😂
i dont even have to watch the video to know it's not better than dlss3... nvidia is in its own league... amd is way behind
You keep feeding on Jensen's milk ;)
@@ChrisM541 nah man it's just facts
@@Spectru91 Nah man the fact is Jensen has been feeding you with his milk for a long, long time ;)
@@ChrisM541 🤣🤣🤣
no, no it's not as good as dlss3
Wow! Unbelievably boring interview!