Literally all it says is that the FSR4 upgrade feature works on RDNA4, which lets you use FSR4 in FSR 3.1-supported games. FSR4 is supposed to still work on the 7000 series, just not as well, from what I've heard. But we'll see.
It’s like AMD RDNA4 couldn’t even be bothered to show up or show anything at CES 2025, like they just quit or forfeited vs RTX 5xxx. So disappointing. They just gave the info to others privately instead of announcing anything at the show.
It feels smoother? Well, if they make you play with a controller, they're trying to hide the latency. A pad masks additional latency pretty well compared to when you use a mouse.
@@PlusTiger Most? No, only the new wave of PC gamers. No PC gamer will play an FPS with a pad; that's something console players have brought over to PC. But third-person hack-and-slash action games are pretty okay with a pad, as they're slower and easier to lock on in, since the game assists.
Super true. As a solutions architect at an MNC bigger than AMD, I can confirm: using a controller is a cheat code for hiding latency. The real test is keyboard and mouse; that's when you get the real experience. And IMO, Nvidia beats AMD in almost every aspect except CPUs, which they're working on now with DIGITS. No guarantees, but I think they'll make a CPU that easily slaps AMD's and Intel's.
@@Pillokun Yep, I only use a gamepad on PC when playing mostly racing games and some adventure ones or whatever; most everything else is mouse and keyboard unless I connect it to my TV...
@@PlusTiger Complete lie lmao. PC gamers rarely use gamepads, only for racing or games like Elden Ring. Otherwise we primarily use mouse and keyboard. A controller is a huge disadvantage and just isn't a good input unless an FPS title (most of them these days) has crazy aim assist.
To be honest, as much as 16GB is not enough on the 5080, 16GB is also not enough on the 9070 XT. AMD needs to make up for its weaker RT performance, and VRAM was an OK way to counter that; if they'd put 24GB on it they would have nailed it. Oh, and not to mention the fact it's still GDDR6... :/
Any info on input lag? There's already backlash over the new DLSS frame generation introducing something like 50 ms of delay on the new 5000 series Nvidia GPUs. If plain upscaling doesn't introduce significant delay, it could be the way to go.
I really liked the thumbnail picture; I wish it was a real design. There are some recently made case fans that are fully metal. I love those. I wish GPUs followed suit with a metal shroud and anodized aluminum blades.
When FSR frame generation initially came out, they were talking about generating at least 4 frames at once between rendered frames being totally possible, I think. I'm assuming that will stay true regardless of the AI direction for upscaling.
Why would I buy a 9070 XT if my 7900 XT could have FSR4? Just because it has better ray tracing (which I never use)? Think, man, just think. Now people have a good reason to buy a reasonably priced GPU with good upscaling, like the 4070 and 4070 Super offered with DLSS.
People with a 7900 aren't gonna be interested in AMD GPUs going forward anyway, since AMD isn't in the high end anymore. Those people will go for a 5080/90 or wait for a 6080/90.
@@arbernuka3679 Doubt that; they just said it's only for the 9000 series, and even if it does come to the 7000 series, it'll either be super limited or delayed.
@@lucaskp16 I've had a 7900 XT since day one. If the upscaling technology has generations, the RX 7000 series should get at least two generations of upscaler support from AMD, like FSR 3 and FSR 4. On the Nvidia side, upscaling has better support right now than on AMD's. I'm talking about AI-based upscaling, not the old manual pixel algorithm. Seriously, not like this...
This will be AMD’s greatest mistake, because if most people have to upgrade to a completely different generation to get FSR4, you might as well just go to Nvidia.
Gotta love that the 7000 series has AI cores that literally don't get used. Speaking of die space: if the AI cores in the 7000 series aren't used here, then they were simply a fucking waste of die space in the first place.
The upscaling and frame generation software solutions are a stopgap until significant advances are made in graphics processing. Basically fake it till you make it.
I hope you are right, because the alternative is that smaller-nm process nodes are approaching a physical limit, meaning GPUs will either need to get bigger and more power-hungry or become increasingly expensive due to lower yields.
Seeing as you need tensor cores now, will the RTX 4000 series be able to run FSR4, since they have tensor cores? I can't wait for DLSS4, to see what that's about.
It was more so people would recognize the point being made. That said, I did change it. But yeah, it's just so you can quickly glance and see what I'm talking about.
It's still bloody Vaseline-smeared crap on both my 1080p and 1440p monitors... It only looks fine on my 4K TV, viewed from further away, and of course it upscales from a higher base resolution there.
FSR 2/3 could've worked better if developers had dedicated more time to working with AMD on it, but of course they were busy doing contract work on DLSS. Still, consumers buying up RTX didn't help; now AMD is also locking out the previous gen. Deserved.
@@AdeptN00B Yeah, my 4080 has been great, aside from the top HDMI port being dead; I just use the second one on my S95B and it's been fine. I also think I just saw that ALL RTX cards will get DLSS 4.0?
I agree with everyone posting about ‘fake frames’ and kicking against the other ‘AI tricks’ and marketing BS from all manufacturers, but the truth is that ‘traditional’ GPU technologies are hitting the constraints of the laws of physics. The fact that the 5090 has a TDP of up to 575 watts is really telling. Gamers' demands for better, more ‘realistic’ graphics with each successive generation mean that these tricks have to be employed to meet expectations of higher framerates, higher fidelity, higher resolution, and acceptable power draw. Personally, I think there's so much fun to be had in gaming that doesn't require such high graphical expectations… look at the success of the Nintendo Switch, Minecraft, and indie games like Among Us, Hollow Knight, and Balatro. In my 40 years of playing games I've noticed that the demand for photorealism and framerates mostly comes from those who solely play AAA FPS blockbusters and competitive shooters, with the desire for graphical improvements making up for the lack of creativity, variety and gameplay interest those genres entail.
The 5080 using DLSS4 hits 250+ fps in Cyberpunk at 4K on Ultra settings with Full Ray Tracing, and the 5090 is twice as fast. Nobody is on Nvidia's heels.
@Conumdrum And it's 28 fps with the fake frames off. Four generations later and Nvidia still can't run Cyberpunk at 30 fps, according to their own website. You're basically giving them $1-2K a year to beta test their AI.
@@brussy1 I laugh. All these years and AMD still can't ray trace. DLSS 4 is going to be better than FSR 4 even without frame gen, and multi frame gen will have people gaming at 4K at 150+ fps in most titles on the cheap. At the end of the day, it's all bad news for AMD.
Fake frames feel terrible in most scenarios, and sometimes feel meh even in story mode, where latency isn't a crazy factor; it still feels off. I just want raw performance, as someone who doesn't even really care about ray tracing.
@@FDFFDS-go1oo Yes, there's also a tool called Lossless Scaling, which at 2x frame gen is pretty comparable to Nvidia's regular game-engine-level implementation. I know you guys love to spew nonsense without any idea what you're talking about. It's "ok" for single-player story games where there's no competition and you're just playing for the looks, but otherwise it's basically not a feature people really care about.
@@Tugela60 Lmao, do you even think before you type? You're comparing a video you watch (pre-rendered) vs a live game that takes inputs and renders in real time, where your input latency is directly affected. Extrapolation has been a thing for a long time, and there's a reason it was never used in games: it's terrible.
Firstly, the gameplay is too smooth to be native 4K output. Both systems are running the same hardware, so the rightmost result looking that much better while running at the same or better performance can't be native output. Also, people have run the same game, Ratchet and Clank with FSR 3.1, on their own machines and confirmed the artifacting on the leftmost monitor matches, while the rightmost still has artifacting but to a much lesser degree, which means it's still being put through an upscaling solution. They haven't called it FSR4, that's true, but it's a very safe bet that it's FSR4 or a successor to 3.1.
@@Conumdrum Your comment still doesn't address or disprove any of my arguments. Besides, there is still a way to notice frame rates higher than the video supports: it will technically be the same framerate, but higher refresh rates look sharper each frame, with less blurring. Anyway, that wasn't even a main point I made; you just nitpicked it from my comment.
A very massive improvement in Ratchet and Clank. I know Ratchet and Clank is one of the worst showcases for AMD FSR 2 & 3. I'm assuming it (FSR4) isn't backwards compatible due to the AI capabilities of the 7000 and older RDNA series GPUs. Well, we'll know more when AMD is more willing to talk about the 9000 series, and there's a definite flurry of rumours floating around about it.
The problem with FSR 3.1 is not the quality. I want more FPS, and if that means worse quality, I'm fine with it. BUT the problem is NVIDIA paying a lot of money to developers for not implementing the latest FSR, and when they do implement it, it's just bad. How do I know it's intentional? Well, modders fix it in a few days. We waited a year and a half for FSR 3.0 in Cyberpunk, and by the time they released it, FSR 3.1 had already been out for half a year. And it doesn't end there, because a year before the official FSR 3.0 patch for Cyberpunk you could download a mod that already got FSR 3.0 working better than what CD Projekt released a year later. Alan Wake 2 gets DLSS 4 while there's still FSR 2 in the game. Throne and Liberty: DLSS frame gen, no FSR frame gen. There's a huge CPU bottleneck in that game, like in every MMO with a ton of players, so frame gen would be very important. And people with an RTX 3080, 3090, 2060, GTX 1080, etc., and of course every AMD user, can't use frame gen, while the game is NVIDIA-sponsored. YouTubers like Gamers Nexus and Hardware Unboxed, and yourself, should talk about this, because NVIDIA is fcking destroying the market. Intel is gone, and look how much more expensive AMD CPUs got. The same thing is happening with NVIDIA GPUs. Do you like that? Do we want that?
"NVIDIA paying a lot of money to the developers for not implementing the latest FSR": we don't know that, and I get tired of hearing it with no proof. It helps no one. I agree with everything else you state, though. Influencers need to speak up more about the crap implementations of FSR and the fact that it's usually old and outdated, especially in games still being actively updated. But I suspect this is also AMD's fault in some way.
The 9070/9070 XT is a GRE with ray tracing. They could have at least done a 9070 XTX more along the lines of a 7900 XT with ray tracing. Can you believe I actually considered a 5070 Ti in my head? I have a GRE, what is wrong with me..
I think that's aiming too high. I think these cards land between the 7800 XT and 7900 XT. AMD uses the VRAM buffer as an upsell feature, so the 7900 XT should still have a reason to exist.
F**k A.I. I run my graphics cards without any type of upscaling, ray-tracing, or any of that stupid nonsense. I want raw performance, and if the card can't deliver without fake frames, I want nothing to do with them. My RTX 4070 Ti Super is the GOAT of this generation. I'm gonna need to pick up a few more.
This is just another reason why Nvidia dominates the market! Even with the newest GPUs, FSR/DLSS will be required in the newest games with RT maxed out. Budget 1080p gaming is the only area where AMD, and now Intel, can compete; though the 5060 is likely to cost about the same and still perform better with DLSS. If running at 1080p, though, I don't think DLSS/FSR will be needed. Still, I don't like the idea of spending money on a product that isn't as good when its price is close to the superior product's.
Do you have any idea what you were trying to say? You sound like most Ngreedia users🐒. You know the new AMD cards will perform 2-3x better at ray tracing, and FSR looks incredible now?!
@@arbernuka3679 What? 2-3x better ray tracing? From the leaks it was supposed to be close to a 4070 Ti in terms of ray tracing; what you're saying would match the new RTX 5080 lol.
@ Compared to AMD’s last gen, sure, but compared to Nvidia’s new 5000 series, AMD will still be outclassed! It also seems FSR4 still has some slight fractal issues, but it's definitely a big improvement over FSR3 from what I’ve seen in similar videos. The big issue is that Nvidia has 90% of the market, so game developers optimize games for Nvidia on PC; Indiana Jones and the Great Circle doesn’t even support FSR yet! (It will eventually.) I always saw AMD as the better option for budget gaming due to lower prices, and if gaming at 1080p with no RT, FSR shouldn’t be needed; so I was going to buy one for my son when I build him a PC. But if Nvidia lowers the price of the 5060, that will make it more competitive.
Showcasing it on Ratchet and Clank, a Sony move 😂. Turns out that in most games PSSR is just bad; I won't be surprised if FSR4 struggles in many games too. Meanwhile DLSS gets polished even more with the upcoming RTX generation. Anyway, Nvidia's marketing is bad as f, but at least DLSS is supported by more games and does a good job with upscaling.
"Obviously AI stuff is the future" I completely disagree. These AI features are cool little hacks when you have aging hardware, but they're mostly there to hide the fact that current cards are just way too slow for modern rendering techniques such as ray tracing. You basically have to sacrifice either image quality (blur/artifacts) or fps to enable the RT features that actually look noticeably better, and the cost is not worth it. It's cool that AMD is catching up, but when I hear upscaling and frame gen I basically tune out, because why would I be interested in those features on mid-to-high-end GPUs? That's something I'd use on an Nvidia xx60-class card. When I buy a 5080, I want pristine, sharp, fast gameplay, not some upscaled frame-gen mess with blur and flicker.
If frame generation makes three frames from one rendered frame, wouldn't that mean there would be four identical frames? If so, wouldn't that just output jerky, albeit higher, frame rates?
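(For what it's worth, the generated frames aren't duplicates: frame generation produces in-between images from pairs of rendered frames, so each inserted frame differs from the ones around it. A toy sketch of the idea, using plain linear blending as a stand-in for the motion-vector methods real drivers use:)

```python
import numpy as np

# Two consecutive "rendered" frames, as tiny grayscale images (toy stand-ins).
frame_a = np.array([[0.0, 0.2], [0.4, 0.6]])
frame_b = np.array([[1.0, 0.8], [0.6, 0.4]])

def generate_between(a, b, n):
    """Blend n intermediate frames between a and b.
    Real frame generation uses motion vectors / optical flow,
    but even naive blending shows the frames are not copies."""
    return [(1 - t) * a + t * b for t in [i / (n + 1) for i in range(1, n + 1)]]

frames = generate_between(frame_a, frame_b, 3)
# frames[0], frames[1], frames[2] all differ from each other and from a and b,
# so the output is smoother motion, not four repeats of the same image.
```

The real debate is about latency and artifacts, not duplicated frames.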
It isn't; you don't need to wait for third-party benchmarks. The 5070 isn't much better than the 4070. It's about 20% faster, which is slightly less than the 4080 but comes close. Nowhere near a 4090, which would require another ~40% of raw performance over the 4080. The 5070 only gets "4090 performance" with multi frame gen, i.e. AI-generated framerates.
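(Taking those percentages at face value, the arithmetic works out roughly like this; the ratios below are the comment's assumptions, not measured benchmarks:)

```python
# Relative raster throughput, normalized to the 4070 = 1.00.
# All numbers are assumptions taken from the comment above, not measurements.
perf = {
    "4070": 1.00,
    "5070": 1.20,  # "about 20% faster" than the 4070
    "4080": 1.25,  # the 5070 is "slightly less ... but comes close"
}
perf["4090"] = perf["4080"] * 1.40  # "another 40% of raw performance over the 4080"

gap = perf["4090"] / perf["5070"] - 1
print(f"A 5070 would need ~{gap * 100:.0f}% more raw performance to match a 4090")
```

Under those assumptions, the raw-performance gap to the 4090 is around 45%, which is why the "5070 = 4090" claim only holds with generated frames included.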
@@David95853 Seems to be the case with FSR 4, as it won't work on the 7000 series, but perhaps some features will still work on older GPUs. An unusual move for AMD, if you ask me.
Since FSR4 is only for the 9000 series, back to Nvidia I go, getting rid of my 7900 XTX for a 5080/5090. I supported AMD through the 7000 series, and that will be the last time.
Same bro, Nvidia lowkey cooking, back to Nvidia. I bought a 7900 GRE like a month ago, and this is such a disappointment, which is crazy because this GPU is a beast. The higher-end 7900s could definitely run FSR 4, but then they'd be faster than RDNA 4, so AMD won't allow it, because then RDNA 4 wouldn't sell.
This whole upscaling debate is huge BS. To spot differences you have to slow down and magnify the screen. IMHO it's just marketing. Anyway, if this is what people want, here you have it.
Remember when the 6000 series came out and didn't have any FSR, but they said they were working on it? Finally they're getting something close to DLSS; it only took 5 years, and it doesn't even work on 6000 series cards.
If AMD doesn’t come out with some info, pricing, and release dates, I will just go with Nvidia. I was hoping for team red to pull off a win, but this is sketch AF!
@@Conumdrum Yes, it's amazing, but you're not buying a GPU, you're buying a software technology. Multi Frame Generation with AI already existed before the RTX 5000. And there is a lot of open-source software that does that.
@@mouadessalim7116 When everyone starts gaming at 120+ fps at 4K on the cheap, AMD is in trouble. Great-looking fake frames won't offend the ones using them.
@@arch1107 Nah, I assume it's because at least NVIDIA keeps support going, and it's not a shock when a generation is just dropped off the map like AMD did with the 7000 series (from a 7900 GRE owner).
@@SkylordRevision1 Bruh... have you forgotten that FSR is compatible even with Nvidia cards, while that same Nvidia said DLSS 3 wasn't capable of running on previous-gen cards? Is your memory that short?
Looking forward to cheaper 7900XTXs. I don't care about upscaling or ray tracing. Complete gimmick designed to fleece people chasing "photorealism" which is for recording videos.... not playing actual games. Nobody cares about puddle reflections in the middle of a CoD match. Literally nobody. It's impressive the first time, then never again.
Making FSR 4 exclusive to RDNA 4 without even offering a fallback layer for the 7000 series is bad PR; they could at least offer the image-quality portion of the update. My next upgrade will be Nvidia when the time comes. Even Intel XeSS has a DP4a fallback, which would've made the most sense. What happened to the 7900 XTX AI cores that are supposed to be there?
Your voice sounds better than the weird stuff you do with it on your normal vids
different mic and setup
Agreed, I can't watch his studio videos, he sounds so unnatural with all the filtering.
agreed, always feels super unnatural in his regular vids as if he's over-prepared and scripted. i like this much more
great content regardless though!
I love his content and slog through it normally in spite of his Tricia Takanawa voice. These CES videos have been perfect thanks to less of the distracting weird voice thing. I hope he agrees and tones it back, because the content is A+.
I watch his videos often and noticed the same thing!! My wife likes to sit in my office and read while I watch YouTube, and she was certain he was an AI voice. Dawid Does Tech Stuff is another weird voice. It doesn't matter to me; I'm here for the content, and Gamer Meld delivers!
For me, I'm done with everyone selling hype for whichever team they love. I'll wait and see FSR4, DLSS4, and the raw power and price of each card when they're really available, and then decide for myself, as usual, what's best FOR ME. My preference is always the price/performance ratio, but that's just me.
People who buy NVIDIA don’t want any “price to performance” compromises. They want the best that’s available while they’re alive to experience it.
@ Homie, you do realize that like 90% of the "no compromises" crowd buying NVIDIA are buying 60- and 70-class cards, right?... That, uh... sounds like a lot of compromises, mate.
You sure it's that, and not marketing and brand recognition that's carried over for more than a decade onto new swaths of products that are nothing like the ones before?
"while they’re alive to experience it" Calm down lol, no one has had a prophetic moment in their life because they paid over $1,000 for a GPU to get 20 more fps in their game over a 70-class card.
@ I’m getting the 5090
Good for you. In reality, the majority go by price/performance, because most people don't have $2K lying around.
@ Just sold one of my 4090s for $2,100; gonna use that. Plus, I opened a separate account and have put in literally a few bucks a week since 2022, which was unnoticeable to me. So I have enough to pick up two 5090s, or a 5090 and a 5080. Whatever I can get on launch day before the scalpers get it.
I care nothing about upscaling or ray tracing! Give me raw power! No gimmicks!
Raw power = more wattage, and more wattage = complaints that it's eating up electricity. But y'all want more power.
So dumb lmao
@@xxelitewarriorxx Maybe invest in power-efficient architecture; look at the mobile industry.
@@xxelitewarriorxx The gimmicks consume part of that wattage, so why not divert it to raw performance?
Learn computer science and engineering and make it happen then. Video games are simulations, I don't care if frames are fake.
So basically the 7000 series AI accelerators are a waste of space that can't be used for AI upscaling. I smell BS on that, but OK.
Because if they allowed FSR 4 on RDNA 3 GPUs, they would beat the 9000 series, so they won't let it happen. Shame on AMD.
Maybe it will be supported over time through drivers, once they've sold enough 90xx series cards. We'll see.
Sony had to push AMD to add some RT and AI acceleration to their architecture, so it's likely stock RDNA3 just wasn't cutting it.
@@SkylordRevision1 These are without FSR and not on the release driver, so expect them to improve:
3DMark Speed Way
7900 XTX (96CU RDNA 3) = 6348
9070 XT (64CU RDNA 4) = 6345
3D Mark Time Spy Extreme
7900 XTX (96CU RDNA 3) = 15260
9070 XT (64CU RDNA 4) = 14558
4080 Super = 14061
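(One way to read those numbers: the 9070 XT roughly matches the 7900 XTX while using 64 CUs instead of 96, which implies a big per-CU gain. Quick back-of-the-envelope math on the scores above; the arithmetic is mine, not an AMD claim:)

```python
# Score-per-CU implied by the 3DMark numbers quoted above.
scores = {
    "Speed Way":        {"7900 XTX": (6348, 96),  "9070 XT": (6345, 64)},
    "Time Spy Extreme": {"7900 XTX": (15260, 96), "9070 XT": (14558, 64)},
}
per_cu_gain = {}
for bench, cards in scores.items():
    old_score, old_cus = cards["7900 XTX"]
    new_score, new_cus = cards["9070 XT"]
    per_cu_gain[bench] = (new_score / new_cus) / (old_score / old_cus) - 1
    print(f"{bench}: ~{per_cu_gain[bench] * 100:.0f}% more score per CU")
```

Roughly 40-50% more score per CU, if those leaked numbers hold up; clocks and memory differences would also factor in, so treat it as a rough indicator only.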
@@stuporman The PS5 runs on RDNA2 (downclocked) and a Zen 2 CPU.
As a 7800 XT owner, I don't know how I feel about this. What are the "AI accelerators" on my GPU for, if not upscaling or frame gen? This time Nvidia is actually porting its tech to previous gens, and they don't seem to be lying about multi frame gen on the 50 series being incompatible with the 40 series, but I'm having a lot more difficulty trusting AMD on this one. Really shady CES for AMD.
TBH it's absolutely trash that they aren't supporting the 7000 series. Who the hell wants to buy a new GPU that's worse than their last one? Shit's ludicrous.
Well, if the hardware has no units for it, what can you do? Nvidia was smart enough to do that 6 years ago, so they can support 4 generations with DLSS. AMD went the wrong way and has to switch to Nvidia's approach now... the same will happen with ray tracing. AMD is always 2 late and 2 bad.
@PrefoX the 7000 series has tensor cores though?
If the future is generating fake frames just to get a higher fps number, then I'm out. I want nothing to do with a marketing gimmick running the entire landscape and tricking people into thinking fake things are good.
Yeah, going to be 100% honest here: DLSS & FSR are extreme letdowns. Gaming companies need to make games with good graphics again instead of being lazy as fuck and depending on DLSS & FSR. Games from 2012 look better than most games now, which is pathetic considering how much "better" the technology is. If this is actually the future of gaming, we are most likely screwed.
Played a 1996 game on an old arcade CRT recently, and I was like, what is this trickery? It looks better than current gen: sharpness, clarity, and colors. Obviously low-quality models, but still. We are going backwards. With every new GPU or monitor I try Cyberpunk, and I usually start Nomad, but I never even leave the garage before uninstalling; it's just a puke fest.
@@Doubleome1 Maybe you have all the blur and motion blur and all that garbage turned on?
@@b.t4604 No lmao. The point is games used to look better before DLSS & FSR, when the creators actually put in the time & effort to make them look good. DLSS & FSR just cover up their laziness & lackluster work.
uh my girlfriend has taken offence to your comment, not really sure why?🤔🤔🤣🤣🤣😁
Didn't AMD say the 7900XTX had tensor cores which were unused.... exactly for FSR 4?
No. AI Upscaling is not the future if people push back against this garbage. Native resolution FTW.
it is now lol
Well say hello to poor fps in alpha games.
@@nahbro3240 Then it's not an alpha. Please finish making the game and optimizing it, stop releasing unfinished games, and please quit brute-forcing games by relying on GPU power to make them work. Before long, cooking instructions will include "play XXX game for 45 minutes at high graphics settings to thoroughly cook meat."
Agreed, but too many people like using upscaling, and I doubt it's going anywhere.
@@nahbro3240 Who gives two shits about alpha games? Not even sure why you mentioned that. If alpha games run like shit, it's due to lack of optimization; in most cases, upscaling won't help that. Besides, it's an alpha; the majority of players don't play alphas anyway.
At some point, AMD has to create something that simply doesn't work on previous generation parts. There's no way to keep advancing if backward compatibility is a hard and fast requirement.
But that's a selling point of AMD, so they should figure it out. I mean, I wouldn't expect it on RDNA 2, but as a company it looks bad if it's not on RDNA 3, which has AI cores.
@@Antiquecurtain It's an expectation we have to let go of if we expect innovation.
No. IF it can run on older hardware, it should be released as such. Unlike Nvidia, who gatekeep new features behind new products for no reason other than selling cards.
@@AluminumHaste But AMD has exactly the same goal -- to sell GPUs. Why should it be compelled to NOT use the same means of getting there?
They already knew they were going AI for FSR4 while still getting ready to launch the 7000 series. They could have added whatever they needed before release so that the 7000 series could have run FSR4. Based on the performance we have seen so far for the 9000 series cards (may not be accurate but time will tell), my 7900 XTX won't be going anywhere. I rarely need upscaling tech anyway, since I see little benefit running 4k over 2k. I can run high to max settings on almost all current games at well over 60+ FPS (usually much higher but never under) unless I do ray tracing. Even then, I don't usually see enough eye-candy benefit to run ray tracing. The only game I run ray tracing on is Cyberpunk 2077 anyway.
Btw, all graphics calculation is matrix ops. The tensor core does tensor ops, i.e. multi-dimensional matrix ops.
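To illustrate the distinction this comment draws, here's a rough pure-Python sketch (all shapes and numbers invented for illustration): an ordinary 2-D matrix multiply, then the same multiply batched across an extra dimension, which is loosely what "multi-dimensional matrix ops" means here.

```python
# A 2-D matrix multiply: the bread and butter of graphics (transforms, projections).
def matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

# A "tensor" op in the loose sense used above: the same matmul applied across
# an extra batch dimension, which is the shape of work tensor hardware chews through.
def batched_matmul(batch_a, batch_b):
    return [matmul(a, b) for a, b in zip(batch_a, batch_b)]

identity = [[1, 0], [0, 1]]
scale2 = [[2, 0], [0, 2]]
print(matmul(scale2, identity))              # [[2, 0], [0, 2]]
print(batched_matmul([scale2], [identity]))  # [[[2, 0], [0, 2]]]
```

Tensor cores accelerate exactly this kind of batched multiply-accumulate in hardware at reduced precision; the sketch only shows the shape of the operation, not the performance.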
you know what else is massive?
ur mooooooom 😂
Stop dragging that meme
AMD's wide open opportunity they're about to blow
@@-in-the-meantime... Makes zero sense to lock FSR 4 to the 9000 series. But wait, AMD wants to play at being Nvidia. It just doesn't work, ffs.
My back
Can't wait to see the 5090 vs the 4090 raster benchmarks.
"raster doesnt matter bla bla bla, you just need 8X frame gen bla bla bla"
Should be around 25-30% better. Nvidia showed numbers for Far Cry 6 in their presentation, which suggest a 27% improvement in that game (in their test scenario). Given that the RT in that game is super undemanding (the game was made for the 20 series, and you can see when you turn on RT with a 3000 or 4000 series that it costs almost no frames at all), we can expect a raster improvement in that range.
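For what it's worth, reading a percentage gain off a chart like that is just a ratio; a trivial sketch, with the FPS numbers as pure placeholders rather than anything Nvidia actually published:

```python
def pct_improvement(new_fps, old_fps):
    """Relative gain of new over old, as a percentage."""
    return (new_fps / old_fps - 1) * 100

# Placeholder figures: if the old card managed 100 fps and the new one 127 fps
# in the same scene, that reads as the ~27% gain quoted above.
print(round(pct_improvement(127, 100)))  # 27
```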
@@OutOfRangeDE 5%
@@OutOfRangeDE Very unlikely. Their benchmarks usually show much more improvement than anyone ever sees in independent benchmarking... they like to deceive everyone into thinking things are better than they are.
It is stated that it's developed for RDNA4, but nowhere is it said it won't work on older RDNA GPUs... at least this is the info from many people who were at CES and talked to AMD.
Truth is we don't know, however that would be an excuse to go get a 9070.
@@JROCC0625 if AMD screws this up, even though nVidia isn't an answer, I think I'll go and get at least a 4080 TBH...
@@WarlockSRBNothing coming from AMD will be close to a 4080, strange comparison.
@@Dexion845 that's not what they are saying. 9070xt is supposed to be close to 4080s
Literally it only says the FSR4 upgrade feature works on RDNA4, which allows you to use FSR4 in FSR 3.2-supported games. FSR4 is supposed to work on the 7000 series, just not as well, from what I've heard. But we will see.
It’s like AMD RDNA4 couldn’t even be bothered to show up or show anything at CES 2025. Like they just quit or forfeited vs RTX 5xxx. So disappointed. They just gave others the info instead of telling us anything at the CES 2025 show.
It feels smoother? Well, if they make you play with a controller, they are trying to hide the latency. A pad hides additional latency pretty well compared to when you use a mouse.
Or maybe because most people are used to playing with controllers
@@PlusTiger Most? No, only the new wave of PC gamers. No PC gamer will play an FPS with a pad; that is something console players have brought over to PC. But third-person hack-and-slash action games are pretty okay with a pad, as they are slower and easier to lock on since the game assists.
Super true. As a solutions architect for an MNC bigger than AMD, I can confirm: using a controller is a cheat code to hide latency. The real experience is when you use keyboard and mouse. And imo Nvidia beats AMD in almost every aspect except CPU, which they are working on now with DIGITS, and no guarantees, but I think they will make a CPU that easily slaps AMD's and Intel's.
@@Pillokun yep, I only use a gamepad on PC when playing mostly racing games and some adventure ones or whatever; most others are mouse and keyboard unless I connect it to my TV...
@@PlusTiger complete lie lmao. PC gamers rarely use gamepads, only for racing or games like elden ring. Otherwise we primarily use mouse and keyboard. Controller is a huge disadvantage and just isn't a good input unless an FPS title (most these days) has crazy aim assist
To be honest, as much as 16GB is not enough on the 5080, 16GB is also not enough on the 9070 XT. AMD need to make up for their lack of RTX performance; VRAM was an ok way to counter that. If they'd put in 24GB they would have nailed it. Oh, also, not to mention it's still GDDR6... :/
Any info on input lag?
There's already backlash over the new DLSS introducing like 50 ms of delay on the new 5000 series nvidia GPUs.
If just upscaling doesn't introduce significant delay, it could be the way to go.
your voice is my sleeping pills, so relaxing to the ears.
I really liked the thumbnail picture; I wish it was a real design. There are some recently made case fans that are fully metal. I love those.
I wish GPUs followed with a metal case and anodized aluminum blades.
Any info on AMD's streaming encoder? Better?
Devs won't implement FSR 4 in their games just for 9070 cards. Good luck finding games with FSR 4.
Right!? Can't even get them to implement FSR 3.1....looking at you CDPR on Cyberpunk 2077!!
But FSR can work on multiple types of GPUs.... Including Nvidia's
When FSR came out initially, they were saying that generating, I think, at least 4 frames at once between rendered frames was totally possible. I'm assuming that will stay true regardless of the AI direction for upscaling.
The RX 7000 series has AI cores. Or AI threads. Why does AMD push away people who bought the RX 7000 series day one? Not cool...
Why would I buy a 9070 XT if my 7900 XT could have FSR 4? Just because it has better ray tracing (never use it)? Think, man, just think. Now people have a good reason to buy a reasonably priced GPU with good upscaling, like the 4070 and 4070 Super did with DLSS.
People with a 7900 are not gonna be interested in AMD GPUs going forward anyway, since AMD is not in the high end anymore.
Those people will go for a 5080/5090 or wait for a 6080/6090.
@@arbernuka3679 doubt that, they just said that is only for 9000 series, and even if its going to be for 7000 series either is going to be super limited or going to be delayed
@@lucaskp16 i have 7900 xt since day one. If the upscale technology have a generations, RX 7000 series should have at least 2 gen upscale technology support from Amd. Like FSR 3 and FSR 4. On Nvidia side Upscale technology have better support right know than Amd. I'm talking for AI based upscale not the old manual pixsel algorithm. Seriously not like this...
This will be AMD’s greatest mistake, because if most people have to upgrade to a completely different generation in order to get FSR4, you might as well just go to Nvidia.
Afaik they said it will come to the 9070 first and later they will look into what cards they will port it to...
I'm only gonna use the SuperResolution option, which has been around for over 20 years. Natively rendered, no fakery.
Gotta love that the 7000 series has AI cores that literally don't get used. Talking about die space: if the AI cores in the 7000 series aren't used here, they were simply a fucking waste of die space in the first place.
Please don't tell me this.... And I hope FSR 4 will be released for cards like the 7900 XTX!!!
tbh knowing amd they probably will put it on earlier cards but maybe it'll be different this time.
It will
The upscaling and frame generation software solutions are a stopgap until significant advances are made in graphics processing. Basically fake it till you make it.
I hope you are right, because the alternative is that the smaller-nm architectures are approaching a physical limit, meaning GPUs will either need to get bigger/more power hungry or become increasingly expensive due to lower yields.
How come the RX 6000 series runs FSR 3.2 on my 6700 XT?
The big upcoming platform for Radeon is UDNA 1. I personally believe this is going to be a really big upgrade. Prolly Radeon 11xxx? Or iunno.
The confetti is fixed.
0:28 you know what else is massive? :)
Price
Your shits?
This segue to our sponsor
.. Oh wait wrong channel.
Low Taper Fade
What is that pc case
If it has hardware accelerated upscaling, it has a chance...
Still, NVIDIA's frame smoothing is really impressive...
See, this is where you need tensor cores now. So will the RTX 4000 series be able to run FSR4, since they have the tensor cores? I can't wait for DLSS4, to see what that's about.
I wish youtubers used proper, real thumbnails. FSR3 is never that blurry. Plus, there is a sharpness slider.
Was about the comment the same thing. Such a BS, clickbait thumbnail.
If you couldn't tell that it was not serious, then this video isn't for you. Go back to playing on your Switch, bud.
The comparison between the two monitors at CES has both computers running FSR 3 & 4 on the performance mode
It was more so people recognize the point being made. With that said, I did change it. But yeah. It’s just so you can quickly glance and see what I’m talking about.
It's still bloody Vaseline crap on both my 1080p and 1440p monitors... It only looks fine on my 4K TV, viewed from further away, and of course it upscales from a higher resolution.
FSR 2/3 could've worked better if developers had dedicated more time to working with AMD on it, but of course they were busy doing contract work on DLSS.
Still, the consumers buying up RTX didn't help; now AMD is also locking out the previous gen. Deserved.
Hey, would the RX 9070 XT go better with my 5800X3D than the RTX 4080 I have paired right now?
Not necessarily, but we don’t know 9070 xt benchmarks yet
Nah keep 4080.
That 4080 is gonna be good for a while, bro. You having issues playing stuff? Keep it till you do.
@@AdeptN00B yeah my 4080 has been great, aside from the top HDMI port being dead; I just use the second one on my S95B and it's been fine. I also think I just saw that ALL RTX cards will get DLSS 4.0?
4080 all day. AMD won’t even give us info. Very sketchy!
There might be a XTX version at the end of Q4 2025 or next year.
I’m curious if this may work like XESS where other GPUs are supported, but it’ll be less performant?
I agree with everyone posting about ‘fake frames’ and kicking against other ‘AI tricks’ and marketing BS by all manufacturers, but the truth is that ‘traditional’ GPU technologies are hitting the constraints of the laws of physics.
The fact that the 5090 has a TDP of up to 575 watts is really telling.
The demands of gamers for better more ‘realistic’ graphics with each successive generation means that these tricks have to be employed to meet expectations regarding higher framerates, higher fidelity, higher resolution, and acceptable power draw.
Personally, I think there’s so much fun to be had in gaming that doesn’t require such high graphical expectations… look at the success of the Nintendo Switch, Minecraft, and indie games like Among Us, Hollow Knight, and Balatro.
In my 40 years of playing games I’ve noticed that demand for photorealism and framerates mostly comes from those who solely play AAA FPS blockbusters and competitive shooters, with the desire for graphical improvements making up for the lack of creativity, variety and gameplay interest that those genres entail.
This is a bad road to go down. Innovation will be parked for a bit.
Yeah, competition is good. How does Nvidia feel now that Intel and AMD are on their heels?
Amd is a nothing burger
The 5080 using DLSS4 hits 250+ fps in Cyberpunk at 4K on Ultra settings with full ray tracing, and the 5090 is twice as fast. Nobody is on Nvidia's heels.
@Conumdrum And it's 28 fps with fake frames off. 4 generations later and Nvidia still can't run Cyberpunk at 30 fps, according to their own website. You're basically giving them $1-2k/yr to beta test their AI.
@@brussy1 exactly bro, they lied. They even said that the RTX 5070 is equal to an RTX 4090 ha ha ha😂😂😂😂
@@brussy1 I laugh. All these years and AMD still can't ray trace. DLSS 4 without frame gen is going to be better than FSR 4, and multi frame gen will have people gaming at 4K at 150+ fps in most titles on the cheap. At the end of the day, it's all bad news for AMD.
Fake frames feel terrible in most scenarios, and sometimes feel meh even in story mode, where latency isn't a crazy factor; it still feels off. I just want raw performance, as someone who doesn't even really care about ray tracing.
You do know that all video is comprised mostly of extrapolated frames right? Including the video you are watching.
Have you tried fake frames?
@@FDFFDS-go1oo Yes. There's also a tool called Lossless Scaling, which at 2x frame gen is pretty comparable to a regular Nvidia implementation at the game-engine level. I know you guys love to spew nonsense without any idea what you're talking about. It's "ok" for single-player story games where there's no competition and you're just playing for the looks, but otherwise it's basically not a real feature people care about.
@@Tugela60 Lmao, do you even think before you type? You're comparing a video you watch (pre-rendered) vs a live game that takes inputs and renders as you go in real time, where your inputs and latency are affected. Extrapolation has been a thing for a long time, and there's a reason it was never used in games: it's terrible.
Only people with a 40 card are angry 😂
I thought every new version of AMD's FSR supported all models.
Then the price better be fair to gain marketshare from nvidia
I just want to know the price already. I want one.
RX9070----Please be under 600 dollars. Please be under 600 dollars. Please be under 600 dollars. Pleeeeeeaaaaassssseeeeeee!!!
waaaay too much money lol...
479 dollars I've heard.
$599.99 incoming
I'm guessing around 450~500 usd
Just get the 5070 at that point
AMD has not called it FSR 4 or even identified what's on the monitor; it could be native 4K gaming.
Firstly, the gameplay is too smooth for it to be native 4K output. Both systems are running the same hardware, so the rightmost result looking that much better while also running at the same or better performance can't be the result of native output. Also, people have run the same game, Ratchet and Clank with FSR 3.1, on their own machines and confirmed the artifacting on the leftmost monitor matches, while the rightmost still has artifacting but to a much lesser degree, which means it is still being put through a solution. They have not called it FSR4, yes, that is true, but it's a very safe bet to assume it is FSR4 or a successor to 3.1.
@@Pvydrow YouTube is 60 fps, that's all you're seeing, and 60 fps is not very smooth any way you slice it.
@@Conumdrum Your comment still does not address or disprove any of my arguments. Besides, there is still a way to notice framerates higher than the video supports: it will technically be the same framerate, but higher framerates will look sharper every frame, with less blurring. Anyway, that wasn't even a main point I made; you just nitpicked it from my comment.
A very massive improvement in Ratchet and Clank. I know Ratchet and Clank is one of the worst games for AMD FSR 2 & 3. I'm assuming FSR4 is not backwards compatible due to the AI capabilities of the 7000 and older RDNA series GPUs. Well, we'll know more when AMD is more willing to talk about the 9000 series, and there is a definite flurry of rumours floating about the 9000 series.
The 7000 series has AI cores; they should have a version for the 7000 series, and as a fan of AMD, people shouldn't let them off the hook on that one.
@@Antiquecurtain I'm not disagreeing on that point, though I'm hoping for more information before I make a judgment.
I wonder when we will see RDNA 4 budget cards like RX 9060.
The 9070 XT and 9070 are budget cards. They have "thrown in the towel " on high end cards.
FSR4 definitely needs to be on... or else it runs the same as a 7900 GRE in both scenarios, ray tracing and raster.
As good as dlss 4? Without something as good as nvidia reflex 2?
AMD has anti lag 2
@ 2nd class
@@Ultrajamz still works better than the nvidia reflex :D
The 7800 XT I got last week was a waste of 500 pounds, no FSR 4. My old GPU broke, so I had to get one.
The problem with FSR 3.1 is not the quality. I want more FPS, and if that means worse quality I'm fine with it. BUT the problem is that NVIDIA pays a lot of money to developers not to implement the latest FSR, and if they do implement it, it's just bad. How do I know it's intentional? Well, modders fix it in a few days. We waited a year and a half for FSR 3.0 in Cyberpunk, and when they released it, FSR 3.1 had already been out for half a year. It doesn't end there, because for about a year before FSR 3.0 officially came to Cyberpunk, you could download a mod that got FSR 3.0 working better than what CD Projekt released a year later. Alan Wake 2 gets DLSS 4 while there is still FSR 2 in the game. Throne and Liberty: DLSS frame gen, FSR no frame gen. There is a huge CPU bottleneck in that game, like in every MMO with a ton of players, so frame gen would be very important. And people with an RTX 3080, 3090, 2060, GTX 1080, etc., and of course every AMD user, can't use frame gen, while the game is NVIDIA sponsored.
Youtubers like Gamers Nexus or Hardware Unboxed, and yourself, should talk about this, because NVIDIA is fcking destroying the market. Intel is gone, and look how much more expensive AMD CPUs got. The same thing is happening with NVIDIA GPUs. Do you like that? Do we want that?
"NVIDIA paying a lot of money to the developers for not implementing the latest FSR " we don't know that and I get tired of hearing it with no proof. This helps no one. I agree with everything else you state though. Influencers need to speak up more about the crap implementation of FSR and the fact it's usually old and outdated especially on games still being actively updated. But I suspect this is also AMD's fault in some way.
To "throw in the towel" means something different that you used it for... just saying. Anyway, good job and thank you for your work so far :)
I wouldn't use either DLSS (3-4) or FSR. The only part that's actually useful is the upscaler; everything else adds delay and is awful.
Just a prime example: PoE2 is using FSR 1 and it's not even released yet.
AMD has a problem pushing features to devs!
The 9070/9070 XT is a GRE with ray tracing. They could have at least done a 9070 XTX to be more along the lines of a 7900 XT with ray tracing. Can you believe I actually considered a 5070 Ti in my head? I have a GRE, what is wrong with me..
I have a GRE but I'm having crashes in Warzone, on the latest drivers as well.
I think that's aiming too high. I think between 7800XT and 7900XT is where these cards land. AMD uses VRAM buffer as an upsell feature so the 7900XT should still have a reason to exist.
So video cards will be gone soon.
I was really hoping they'd come out with a 24 GB version. I was ready to spend my money on AMD.
🤣 I remember being happy playing video games on old consoles. I'm still happy to play on whatever. There are artifacts in the carpet 🤣🤣
It seems you are becoming shorter...
F**k A.I. I run my graphics cards without any type of upscaling, ray-tracing, or any of that stupid nonsense. I want raw performance, and if the card can't deliver without fake frames, I want nothing to do with them. My RTX 4070 Ti Super is the GOAT of this generation. I'm gonna need to pick up a few more.
You should have a counter for how many times you said BUT hehehe.
Idk about everyone else, but I'm tired of all this upscaling and AI stuff. What happened to optimization and real generational performance gains?
The 5090 will be my last card ever. I want native resolution, plus a product that is finally finished when I pay for it..
This is just another reason why Nvidia dominates the market! Even with the newest GPU’s FSR/DLSS will be required on the newest games with RT on full.
When it comes to budget gaming at 1080, that is the only area that AMD and now Intel can compete; though the 5060 is likely to be about the same cost and still perform better with DLSS. If running 1080 though, I do not think DLSS/FSR will be needed. Still, I do not like the idea of spending money on a product that isn’t as good if the price is close to the superior product.
Do you have any idea what you were trying to say? You sound like most ngreedia users 🐒. You know the new AMD cards will perform 2x-3x better in ray tracing, and FSR looks incredible now?!
@@arbernuka3679 What? 2x-3x better ray tracing? From the leaks it was supposed to be close to a 4070 Ti in ray tracing. What you're saying is it could match the new RTX 5080, lol.
@ Compared to AMD’s last gen, sure, but compared to Nvidia’s new 5000 series, AMD will still be outclassed!
It also seems that FSR4 still has some slight fractal issues, but definitely a big improvement over FSR3 from what I’ve seen on similar videos.
The big issue is that Nvidia has 90% of the market, so game developers are optimizing games for Nvidia on PC; Indiana Jones and the great circle doesn’t even support FSR yet! (It will eventually).
I always saw AMD as better option for budget gaming due to lower prices and if gaming 1080 no RT, FSR shouldn’t be needed; so I was going to buy one for my son when I build him a PC. But if Nvidia is going to lower the prices of the 5060, that will make it more competitive.
Showcasing it on Ratchet and Clank, a Sony move 😂. Turns out that in most games PSSR is just bad, so I won't be surprised if FSR4 struggles too in many games. Meanwhile DLSS is getting polished even more with the upcoming RTX cards. Anyway, Nvidia's marketing is bad asf, but at least DLSS is supported by more games and does a good job with upscaling.
We are all doomed.
If you hear yourself saying something like, "I'm assuming..." you should stop yourself immediately.
The more fake frames there are, the less game devs have to do to optimize. Eventually you're gonna get 95%+ fake frames.
"Obviously AI stuff is the future" I completely disagree. These AI features are cool little hacks when you have aging hardware but it's mostly there to hide the fact that current cards are just way too slow for modern rendering techniques such as Ray Tracing. You basically have to kill either image quality (blur/artifacts) or fps to enable the RT features that actually look noticeably better. But the cost is not worth it. It's cool that AMD catches up, but when I hear upscaling and frame gen I basically tune out because why would I be interested in these features with mid-high end GPUs? That's something I'd use on a nvidia xx60 class card. When I buy a 5080 I want pristine sharp and fast gameplay not some upscaled frame gen mess with blur and flicker.
"Nvidias dlss was much better but...."
Have to keep that Nvidia hate strong.
Obviously Nvidia's DLSS will still be better; they are just ahead of AMD's FSR4 in AI upscaling.
Except AMD hasn't said its FSR4
If frame generation makes three frames from one rendered frame, wouldn't that mean there would be four identical frames? If so, wouldn't that just output jerky, albeit higher, frame rates?
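For what it's worth, the generated frames wouldn't be identical: frame generation interpolates new in-between images from two rendered frames (guided by motion vectors and a neural network in the real implementations). A toy sketch using plain linear blending, which is a heavy simplification of the actual technique:

```python
def lerp_frames(frame_a, frame_b, n_generated):
    """Generate n in-between frames by linearly interpolating pixel values.
    Real frame gen uses motion vectors and a neural net; this only shows the idea."""
    frames = []
    for i in range(1, n_generated + 1):
        t = i / (n_generated + 1)  # fraction of the way from frame_a to frame_b
        frames.append([(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)])
    return frames

# Two "frames" as flat pixel lists: generating 3 frames between them
# yields 3 distinct intermediates, not copies of either endpoint.
print(lerp_frames([0.0, 0.0], [4.0, 8.0], 3))
# [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]
```

With interpolation-based approaches, the next rendered frame has to exist before the in-betweens can be shown, which is where the added latency people complain about comes from.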
AMD is screwed if the 5070 is 4090 performance like they say
It isn't; you don't need to wait for third-party benchmarks. The 5070 isn't much better than the 4070: it's about 20% faster, which puts it slightly below, but close to, the 4080. Nowhere near a 4090, which would require another 40% of raw performance over the 4080. The 5070 only gets "4090 performance" with multi-gen AI-generated frames.
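Those tier gaps compound rather than add, which a quick sketch makes clear; every multiplier below is just the rough figure from this comment, not a measured benchmark:

```python
def compound(multipliers):
    """Chain relative performance multipliers (e.g. tier-to-tier gaps)."""
    total = 1.0
    for m in multipliers:
        total *= m
    return total

# ~20% from the 4070 to roughly 4080 class, then ~40% more to reach a 4090:
# the 5070 would need about 68% over the 4070, not 20% + 40% = 60%.
print(round((compound([1.20, 1.40]) - 1) * 100))  # 68
```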
Does FSR use the CPU or GPU more? DLSS is hardware accelerated while FSR is software, so I always expect DLSS to be better.
FSR 4 uses machine learning, like DLSS. It uses hardware acceleration; therefore, it only works on next-generation GPUs.
@@David95853 Seems to be the case with FSR 4 as it won't work on 7000 series, but perhaps some features will still work on older GPU's. Unusual move for AMD if you ask me.
@@dcikaruga It's the people who kept complaining about FSR that got us here. Now that they've got a better solution, y'all are still complaining.
@@chriswright8074 Do you feel I was complaining? FSR was pretty much free when you look at it, and gave life to older GPUs.
AMD's Radeon department is hugely neglected.
My biggest gripe is that FSR 3 is still not supported on loads of games and now we have FSR 4?
I'm planning to go for Nvidia RTX 5080 or RTX 5090 laptop. RIP my wallet.
The future is AI based. The future is lost. I want to see the raw power without FSR or DLSS. Never used those things.
Since this FSR4 is only for the 9000 series, back to Nvidia I go, and I'm getting rid of my 7900 XTX for a 5080/5090. I supported AMD through the 7000 series, and that will be the last time.
Same bro, Nvidia lowkey cooking, back to Nvidia. I bought a 7900 GRE like a month ago and this is such a disappointment, which is crazy because this GPU is a beast. The higher-end 7900s can def run FSR 4, but then they'd be faster than RDNA 4, so AMD won't allow it, as then RDNA 4 wouldn't sell.
AMD IS DOING GREAT, SO ARE NVIDIA AND INTEL
This whole upscaling debate is huge BS. To spot differences you have to slow down and magnify the screen. IMHO it's just marketing. Anyway, if this is what people want, here you have it.
They're lying, just like NVIDIA, that the older gens "just can't".
Massive?
Most games at 2K run fine on my 7900 XTX on defaults, with a fan tweak to keep it cooler.
Remember when the 6000 series came out and it didn't have any FSR, but they said they were working on it? Finally they are getting something close to DLSS; it only took 5 years, and it doesn't even work on 6000 series cards.
at least you have XeSS
@@Toufan1-gr4br But that's not AI upscaling, though. Non-Intel GPUs cannot use the machine learning model and have to use the simpler one.
@@selohcin tbh no one wants to be paying for fake frames 🤣 this tech is so stupid.
I don't give a crap about fake frames. I play with that garbage turned off.
Hahahahaha
If AMD doesn’t come out with some info, pricing, and release dates, I will just go with Nvidia. I was hoping for team red to pull off a win, but this is sketch AF!
I don't think they'll wait that long to announce their line up and features.
480$ for 9070 xt
Nvidia gave a release date for the new 5000 cards. AMD is going to release just before the Nvidia release, just to undercut their sales. You watch!
@@qwertyqwerty-zi6drrumor only
@@tonyd7164 you must be psychic to know this
Gaming as we know it is dead. It's all down to AI now, no real optimization from devs.
We don't need fake frames.
Don't kid yourself, DLSS4 looks fantastic and is low latency
@@Conumdrum You're a sheep.
That's just a "solution" to an intentionally created problem.
I ain't supporting that garbage with my wallet.
@@Conumdrum yes, it's amazing, but you're not buying a GPU, you're buying a software technology. Multi Frame Generation with AI already existed before the RTX 5000, and there is a lot of open-source software that does that.
@@mouadessalim7116 When everyone starts gaming at 120fps+ at 4K on the cheap, AMD is in trouble. Great-looking fake frames won't offend the ones using them.
All video is fake frames dude. You don't watch video?
i'm about to sell my 7900xtx and get an nvidia card out of spite for this lmao
You weird
because you want 75% of your frames to be fake?
@@chriswright8074cope
@@arch1107 Nah I assume it's because at least NVIDIA is supported and not a shock when a generation is just dropped off the map like AMD did with 7000 series (from a 7900 GRE owner)
@@SkylordRevision1 bruh... have you forgotten about fsr being compatible even with nvidia cards? While the same nvidia said dlss 3 wasnt capable of running on previous gen cards? Is your memory that short?
Lol excited like a little kid in a candy store.
Looking forward to cheaper 7900XTXs. I don't care about upscaling or ray tracing. Complete gimmick designed to fleece people chasing "photorealism" which is for recording videos.... not playing actual games. Nobody cares about puddle reflections in the middle of a CoD match. Literally nobody. It's impressive the first time, then never again.
Making FSR 4 exclusive to RDNA 4 and not even offering a fallback layer for the 7000 series is bad PR. They could at least offer the image-quality portion of the update. My next upgrade will be Nvidia when the time comes. Even Intel XeSS has a DP4a fallback, which would've made the most sense. What happened to the 7900 XTX AI cores that are supposed to be there?