Well, they keep locking software technology (upscaling especially) behind chipsets and selling that to you for a premium instead of making the cards cheaper or natively more powerful to better match the prices they are asking for. In other words, GPUs have been a scam for a while now.
Nah but why would you turn on DLSS though there is literally no need. If I want the real world I’ll go outside and if I want a fictional game I will play it. We don’t need games to look exactly like the real world. Some idiots will get confused😂
Well, that's been true for a while. The Human Eye cannot perceive beyond 144 Frames per second. That is why 144Hz screens exist. It's the Human Limit. Yet the mindless consumer still insisted we create higher and higher frame rates for them to blow their cash on. Le Sigh.....
but the cards are still stronger than the last ones without fake frames. so the price getting lower is still amazing for a better card. just don't use the fake frames
All frame generation tricks have one huge weakness: input latency. There is just NO realistic way for the system reliably predict the future. 50 ms latency is way worse than input latency in bad early 2010 60Hz televisions. Even streaming games from cloud can be faster than 50 ms input latency. The output can even look good, but the frame generation presented by Nvidia is still hardly usable for anything more than smoothing out cutscenes with latency like that.
did you hear about Reflex 2 and frame warping? This video didn't show whether Reflex 2 was being used or not. Reflex 2 will be able to know the mouse input at the time of the generated frame and "warp" it to match the input, so in theory it should "feel" like it has nearly the same input lag as with no frame generation.
this is why I dont turn on these AI "enhancements" until I dial in settings to make the base performance good. AI is not gonna make 20fps feel like 200fps. It CAN be useful if you already get 60+fps, to then enhance that to more fps. But if you are getting 20fps, you gotta bring down the settings not add AI stuff. AI is an enhancer to already good performance not a replacement for performance IMO
And I expected them to show input latency WITHOUT frame generation. Personally I don't care what the difference between x2 and x4 is, I care what the diff is between x2 and no FG. Latency is extremely important to be under a certain level, what that level is is different for everyone, but when I tried streaming game that latency was really bad even for a single player game, at least for me. Give me raw performance with low latency even for single player games. I play game like Rogue Trader with 100+ frame, when I limit it to 60, it feels worse.
These "fake frames" are frames your cpu wouldn't be able to compute anyway. Even top dog cpus bottleneck the 4090 at 1440p, and most ppl are still at 1080p
Every week I end up having to download more ram, no matter how many GB's I download my computer seems to be so slow. I'm gonna download a 32 gb package tonight
@@Stinky1am Check the seal on the RAM header where the line feeds into the reservoir, you may have a leak. If you download large amounts the heat will build up and slowly eat away at the seal. Especially if it cools down rapidy after a long download the epoxy can start to crack.
Luke Smith did a video about this years ago, tittled "Computers were faster when they were slower" . Basically, nowadays we got so much power on our hands that optimisation is thrown out of the window - and games are just the newest thing to be hit by this, you can put Linux on a 15 year old Thinkpad and still use it for daily tasks as long as you don't use the internet too much - basic office tasks are more than doable as well as the regular desktop stuff outside of gaming and heavy internet use (or in some cases, at all - I for example used a X220 for 2 years as a daily driver). Given that we do so much on the web these days though, things become obsolete despite them not necessairily needing to be,simply because the web is so damn bloated and unoptimized. Now Games have the same issue so we compensate by throwing fake frames.
On a terminal emulator? Pathetic. You need an external, dumb terminal that uses a hardware serial port. Local terminal emulators are pure bloat. I use arch btw.@@ghost-user559
@@ghost-user559 Yeah I've used w3m before. Not useless but not totally useful either. It's all the JavaScript that terminal-based browsers can't handle that's the issue. I wish there were more alternatives for the terminal, I remember Brodie Robertson talking about Browsh but haven't checked it out yet.
Yeah, it would be a big help if games were properly optimized. Maybe future games will be designed in a different way, like using voxels instead of polygons and pixels.
it's because the raw technology to put into the sillicon it's there yet, we have reached the relative peak of hardware capability to perform - nvidia is now pushing the software, frame generation to be precise
True but like he said in the video to maych this level of performance with just raw horsepower it would cost thousands, consume a massive amount of power and take up alot of space. Ai is going to be the main thing going forward for those reasons unfortunately
Remember the times when games was so well optimized to run on Native graphics without the need of upscaling and fake frames in order to run smoothly because the devs actually cared about polishing their games before release , good times.
Marvel rivals runs smoothly optimized without all those upscaling stuff needed, same goes for elden ring, etc Remember, depends on who the dev is and how they optimized memory usage
What’s funny is that games run better even natively today than before. It’s just a human nature to cry and remember the good old times that actually never existed
@@Mohmar2010 nah man. elden ring runs like shit for the horrible AA that it has. even on fing 3260p downscaled back to 2160 (using DLDSR) the grass is still shimmering xD artistically it looks beautiful except for bright indoor areas that just look like massive light bleed and no shadows (ambient occlusion). for the graphical fidelity that the game has it actually runs pretty bad.
I don't get this dislike for AI. As long as you can make it cheap and I as a consumer can enjoy videogames by shit working with practically "magic". Welcome to the future bros, get on the boat or be left behind. This isn't hurting anyone but you guys who keep on crying about it instead of getting in the race and innovating.
They know they are making the trade from raster to fake frame gen. It is intentional. They hide it because they know what the consumers want, 100% raster performance, no fake frames. There is a move AMD could make, to take the whole market. Target raster and push advertising "raw hardcore performance, no fake frames" and the gamers will flood over. Most of the comments I see are hating fake frames. AMD should treat their version of DLSS frame gen (FSR?) as an optional additional benefit for those who want it - but they should focus on raw performance.
Because the only card that pushed raster in any meaningful way is the 5090 which is about 30% over the 4090. They have no cost effective way to push raster anymore, dies are already huge and power requirements are high too. They'd need to move to chiplets to make it more affordable but that brings its own issues and power requirements would still be high. Using AI models and path tracing is the only way to still move the graphical needle forward without making crazy expensive hardware that few could afford. AMD/Intel aren't even bothering with high end anymore because Nvidia has too much mindshare and they also know there's a raster wall that Nvidia is already hitting up against.
@@MalinkadinkIt's not only that but also that we are reaching the limit of what's physically possible in terms of transistors sizes, we literally can't make them smaller anymore they are already only around 20 atoms wide
as a 3D environment artist DLSS, frame generation and temporal anti aliasing is a serious plague in gamedev. they're bandaid solutions to much deeper problems and youre sacrificing so much information in the g-buffer which determines how the game will look; if you can't get your game to run & render at 16ms frametime (pretty much 60 fps) you're just not doing it right. upscaling tech should be used to make games performant on systems that don't meet minimum requirements for example, the artifacting in games made by large studios with big money invested in it is abyssmal. it isn't easy to get work at the moment especially as an artist and seeing games being dropped with pisspoor performance and even worse, gross graphics that don't even compare to much older titles (in some cases mid 2010s games) is just a gut punch, optimization is like 70% of the job as an artist creating game assets. This industry needs an overhaul.. EDIT: apparently a lot of you cannot comprehend anything longer than a twitter post, DLSS isn't inherently bad; it's being used to cut corners and it is rampant.
We just don't have the tech! GPUs will become the size of a computer case and require 3x the power output! AI is the fix until someone develops a breakthrough.
Sadly it is a war of attrition. People need to talk about the incredible visuals of your game. Screenshots and youtube videos look great with TAA/DLSS and sadly that is all the studios care about. I feel like we lost the days where the community could hold the industry accountable. This issue is sadly in such a difficult spot. Many people don't notice, care or know any better. Maybe some day we can return to sharp images, high frame rates and low latency... At least the indie scene is doing great and are less corruptible.
My biggest issue with the new graphics cards is for games that don’t support DLSS. What about VR systems? Rendering inside of a VR headset is FAR different to a flat screen game and I personally love to see VRAM and raw improvements.
@TheShamefurDispray Correct, however the screens, and games themselves don't use DLSS or any sort of specialised rendering. SteamVR specifically uses their own "Motion Smoothing" frame interpolation for some headsets, but a LOT of people turn it off because it looks awful to have "generated" or interpolated frames in VR. It literally makes the world warp around you and it gives motion sickness in some cases. Raw performance IS required compared to a flatscreen game where software really can make a difference!
@@nonstopnewxy204 that’s where I wanna see more innovations in that side of things. If we have an 5 year old RTX card with high VRAM already, I’d love for companies to look into how we can utilise the VRAM better sure, because we shouldn’t NEED crazy high vram as long as things are optimized. But also it would be good for companies to look into how to make it cheaper and more accessible and possibly just climb those numbers. Unfortunately “cheaper” isn’t something that makes companies more money though. So why do it? I just finished installing a new 4080 into a new PC actually haha, and the VR performance is pretty magical compared to my last card (as you’d sorta half hope/expect lol) It wasn’t cheap, and I wish it wasn’t pricey, but at the same time I’m not sure if I’d rather have waited to get a 5070 or stick with this new card. Only time will tell to see the real world performance y’know?
@@caliginousmoira8565this might be a good thing for people who only game or mainly plays game.. but the thing is those who play game on pc also use it for other thing like render video, render photogrammetry, doing some calculations on the stress of structure, architect, civil engineer, mechanical, physics require a beefy gpu and cpu.. making fake frames on a game won't help speeding up work process..
@@redirectories648 And nobody would choose to upscale and generate fake frames if they didnt have to... I was very excited for 50 series but now I'm just gonna skip them and wait for 60 series... No point upgrading to a GPU that is not even a real upgrade without relying on fake resolution and fake frames.. No thanks.
The most insane part about it all was that when Linus tech tips was allowed to play cyberpunk 2077 on the 4090 and 5090 he was not allowed to change any of the settings nor turn the dlss off. So we literally did not get to see what the performance difference was of the 4090 and the 5090 without dlss / frame generation. Also, it's disgusting how every single time these companies give out these benchmarks they always do it on dlss set on performance as if everyone is just going to automatically accept having the worst quality image of an already downgraded image to begin with as the industry standard. They're pissing in customers mouths and telling everyone its just lemon juice at this point. 😮💨
The problem is we are comparing a normal 200 fps with 4ms, with a fake 200 fps with 57ms. Frames don't matter anymore if I can feel the difference between 60 and 120 fps in a 60hz monitor just by the mouse's latency. I don't even wanna imagine looking a 200 fps with a 24 fps latency. This has to stop, there's no way I'm buying this garbage.
They seem to be pulling some foviated rendering magic though, as seen in Linuses hand on video.(Fringing on screen edges and gun model) If they pull it off, they might manage to "hallucinate" enough frames so that the movement feels like as if your camera is detached from rendering pipeline, like in that one video, but the game logic is running at 24fps, so lower damage numbers in Marvel Rivals, slight delay (ms) between dealing damage and hit registration) and other shicannery that comes from frame rate tied effects.
It won't stope the entirety of big tech from trying to shove AI slop literally down our throats in for at least the next 2 years until the AI winter finally come and silicon Valley will focus on some BS else
I hate this frame gen and upscaling bullshit. I think frame gen going from 60 to 120 fps is fine in singleplayer games, the input delay isnt that different. However, when you upscale 60 to like 240, you still have the 60 fps responsiveness with "240 fps", it doesnt feel smooth to play. Also, all the tests are using slow movement, however, in a game like cyberpunk, you do a LOT of mouse movement, its weird that they never show those types of movements, it feels like they are sort of hiding the downsides of frame gen.
@@triceracopp At least I can dig DLLS, it kinda works in its rawest form. But once you add Frame Generation and so on OOOF. But yeah REQUIRING IT isn't really what you hope for.
50 ms of input lag IS INSANELY HIGH. We have now mice that has 4000Hz or more of polling rate to achieve 1ms and lower input lag and then they add technologies that add 50ms or more
It's really not that high at all. I've hit the highest ranks in competitive in games like CSGO, PUBG, Valorant, Siege, and Overwatch so I'm definitely at the top 1-5% and I can BARELY feel it at all in a game. Obviously don't use it in any of those hyper competitive games, but in the demanding single player titles? You're completely fine dude. It's 99% placebo 😂
@@czproductions its not. I have a 4070 and tried frame gen on cyberpunk with DLSS and while it got VERY smooth it was completely unplayable. My mouse felt like it had no shit like half a second of delay. It was that bad. But again I am one of those people that can actually feel the tiny input lag that VSync gives... but I am not a competitive player, and still I notice these delays. Many people will too. Frame Gen is just out of the equation for me.
@justinhainsworth7726 Oh yeah, I didn't think of that! That makes me feel better if it works well with FG and DLSS. I don't know much about that stuff honestly.
they said u can over clock to get 200 fps already my opinion I'd rather have a cheaper card but with ai than an expensive 120 real frames card but has less base fps than the ai card
I feel like the people saying "No one cares if it's fake frames" has never used frame gen before. You'll get more input lag and artifacting, making it worse than just having a low frame rate. When it gets better sure, but saying 5070 with DLSS and frame gen will be the same as running the game natively on an 4090 is BS.
Nobody cares except a small 1% of the market. They will never see the difference. It's like the Audio or keyboard market where if you aren't sensitive to it and more likely have not experienced it (ex. Mechanical keyboard) , the experience you currently have is already the best experience.
@@surft I disagree, most people still have a mid range PC nowadays with 1080p monitor. The ghosting effect is very noticable when you turn on FG and any ai upscalers, they just dont compare to native. I will happily play nfs 2015 at native 4k60 over marvel rivals at blurry 4k (and often crash anyway). The lag is not noticable but it is very noticable in competitive shooter/fighter
@@iCore7GamingI hope it does but it's not out yet, so we really don't know for sure. It ain't like some secret the companys will cherry pick percents that make them look the best. Don't get me wrong I would LOVE free frame but as of now frame generation is far from "free frames with no/minimal costs"
@@surftYou seriously think people can't tell that they have 1/2 a second of input lag or the screen looking smurry if you move too much. Like I said would love for frame gen to improve and be a great setting for low end users, but as of not it's just now ready for everyday gamers to use.
DLSS was originally for upscaling to your modern 4k display, and Frame Generation was for getting smooth motion on 120Hz monitors. The problem comes in when marketing uses these to inflate performance numbers, and devs use the tech to NOT optimize their games.
You are paying for the extra hardware. But you need different software to drive that hardware. This is how it has always been. Dlss 4 is free and it offers improvements even for 2000 series cards
Good Luck Nivida Physically can't improve preformace without Ai. Not unless you got a pocket that's a black void full of limitless money. On top of that people can't wrap their heads around the idea of current technology being limited. They they make new technology and new way of doing things. You won't see nothing but Ai getting better and Ai technology taking over. That's the future and that's the New technology. But people hate it because it's not currently up to standard or quality they want. But once it is people who complain will forget Ai Upscaling is a thing.
@@BruvvaJoshits not getting stuck its staying in the boat that isn't sinking, wether its new or not its rational to want to stick to the better technology, get off the ai bandwagon lol ai is not gonna solve this
I have mine since 2017 but i wouldn't be able to afford a new one these days LoL. In my country the prices are crazy even for old GPUs, at least when you reach the big budget the differences aren't that big between them, but anyway the entry point became sooo high, unbelievable
Honestly I am still running a 1080 for years now, and it's only been in the last 2 years I have really found it lacking when it comes to performance in the games I typically play and I think alot of that has to do with new games just not being as optimised as they used to be.
@@TwoPlusTwoEqualsFive32 I upgraded my 1080 TI last year to a 3090 TI. I still have the 1080 in my second PC I have inside an arcade cabinet that feeds a second driving arcade cabinet. Forza 4 still looks great on it.
it where its been heading im just amazed that anyone in these chats that consider themselves a pc enthusiast didnt see it coming or is this just fake outrage ... smh , when is saw the size of the 5090 founder before he even said ai i was like yup we here ... that card should be big as or bigger than 4090 ...
You mean like Anti Aliasing? Aniso Filtering? Ambient Occlusion? I suppose if you just got here in a DeLorean from 1985, you remember. Except there weren't graphics cards in your PC then, which have ALWAYS offered some special effects to help games run smooth. "Game sucks, bro. My character is just polys with a texture skin over it. There's no real guts in there."
remember when graphics were at a point where there COULD be a massive upgrade of rasterized performance? come on mam think. They cant keep pushing rasterized performance and make cards affordable and small, and scale to the laptops. ai is necessary, and ai is more exponential than rasterized potential growth. think!
@@Thegooob95 i know i was kidding i guess i just didnt make it clear enough HAHAHA i am pro-ai increasing frames and upscaling and whatnot. i remember when dlss first came out with Control and it was really blurry, and now the drawbacks are damn near imperceptible. i just hope that, like DLSS upscaling, we see that same noticeable generation-on-generation gains with MFG and ray reconstruction (especially w/ regards to latency on MFG)
@@Thegooob95 What are you on about?...literally only last generation from the 3090 to 4090 there was a massive leap in performance (75 to 80% without AI generation crap - RAW performance), AI has it's place and is one metric but the fact is raw horsepower is still a valid metric, especially in regards to input lag. I want to buy the 5090 but at the asking price it's not even worth it... I'd happily pay $1999 if the performance increase was like it was from the 3 to 4 series... in raw power.. it might be but spec numbers are not adding up.. and we are talking about flagship cards here not little15w chips for laptops or handhelds where yes I agree AI can be used a crutch to help make things more liveable.
4080 user here. I almost NEVER enabling DLSS. If I am playing singleplayer game - I wanna see awesome picture. If I am playing multiplayer game - I wanna see my enemies and not the pixel mash.
That's idiotic. DLSS on Quality (if implemented correctly) is basically free frames with no harm for the visuals. There's no pixel mash, most of the people won't even be able to find the difference.
@@delacroixx Idiotic is believe to every corporative bs coming out from any of hardwarecreators. LMAO, dude check the videos from Hardware Unboxed for example. Simple example, I played Ghost of Tsushima, and I can tell you that regular picture is WAY MORE sharp than DLLS with Quality preset. DLSS in some games can be really good, but it is good in first place of lower tier PC to get better frames in exchange with some not critical imagine quality.
@@DanteX3M buddy, I'm using DLSS all the time. In some rare cases it is implemented badly, I agree, but most of the time it's a great way to boost performance without losing anything visually really. Framegen sucks tho so far.
@@TheRealRooslingthe problem is it’s not smooth it just looks smooth even on a controller you can physically feel the difference, any real pc enthusiast will never agree or side with you
its the epitome of modern society. instead of working on improving the actual tech, they focus on only making shortcuts to make it seem like its improved. now the shortcuts are so advanced that it makes you wonder why they spent all that time/money on that instead of just improving the actual tech
What is the benefit of extra frames is there is no improvement on input lag? I just don't get it. It's like turning on that motion gimmick modern TVs have
For me, it's really nice if you own a high refresh rate monitor and a GPU that can get above 60FPS but can't do 144hz. For me, a game running 80FPS and frame generated to 130-144FPS looks much better and doesn't feel completely miserable to play. It's also nice on games with hard framerate caps, like Sekiro. You can turn a 60FPS cap into a 120FPS cap and that's just much nicer for the eyes.
This is only really an issue those that have a bottlenecked CPU, keep anti aliasing on, or keep a low to standard refresh rate on their display and playing 4k.
It will be a matter of taste - there will be artifacts that some people never notice but others find intolerable When they did this with TVS, sports enjoyers got really mad because it had a habit of making the ball disappear 😂
IKR. I have always gotten the 80 class of cards (780, 1080, 3080). I knew the pricing would be rough, but $1,000 MSRP for the x080 card... man that sucks. My 3080 (10GB) feels like it's just starting to show it's age in the latest AAA games. Mainly because of the 10 GB of VRAM (and I'm on 1440p not 4K). Which is why the 5070 non Ti is a complete non starter for me with 12 fking GB of VRAM for $750. Kiss my ass on that one Nvidia. And the "5070 = 4090" thing I call bullshit on already. Nvidia of course means select situations, using the latest DLSS with the now 3 fake frames shoved in-between every real one. Which again I'm sure is all shit that only works on the 5000 series anyways.
It's a complicated matter. Low end cards like 4060 have absolutely garbage value but high end gpus that let you run games with path tracing are absolutely worth it because you see actual improvement in graphics - which to be frank, is in like 3 games right now but that's the developers' fault
Yep, that's what I'm waiting for. Already feel 99.99% confident the "5070 = 4090 performance" claim is pure BS. There will be a few niche titles maybe, and of course only when DLSS is on (prob set to quality), of course frame gen etc. Heck Nvidia's claim probably counts on using DLSS 4 features that don't even work on the 4090. Either way I'm pretty sure plenty of people will be upset when they just pre-order a 5070 believing Nvidia's claims and own benchmarks blindly. Already seen plenty of comments on various tech news sites, and the nvidia sub-reddit (guess not much surprise there) of people believing it 100% zero doubts.
I would be equally intrested to see what testing Hardware Unboxed does too as they have previously tested all these Upscalers against each other and DLSS was doing quite well in those vids, so I'm very curious to see how the new FSR 4 stacks up against the new DLSS 4
The main thing I feel when I go from low FPS to high FPS is how much better it FEELS, not how much smoother it looks. The thing I'm looking for with higher FPS is the lower input lag. I mostly notice FPS when I move the camera and it feels awful when it starts getting below 50FPS to me. There is no point in increasing the framerate if the input lag doesn't also go up, unless you're playing games that don't have much input to notice the lag.
9:19 Important to understand, the less base framerate you have the worst it feels. This is why those fake frames works best of the 1000$+ GPUs, not the 500$ one, and why they cannot use these features to claim the 5070 is as fast as the 4090, even IF you use all the features they introduced here. A 4090 will still have a much better framegen output even if it use x2 boost while the 5070 uses x4, the later will be laggier with a ton of wierd/blurry frames.
Are the demos all slow smoothy panning shots? If so, then there's something being hidden. Did you notice that in the demo a lot of the geometric edges/lines were shaky/jittery? That's also something there. They're basically padding FPS numbers with 3 frames of the same raw image but now drawn by AI. If you've looked at AI generated images these past couple of years, you'd notice that while AI can make really close-looking images to what's fed to it, one thing it can't really do is consistent and very accurate alignment of lines. That's why there's always a skew in the shapes and sizes, lines, and scale of the stuff in the images it produces. 3 whole images made in that manner are being padded in between the actual frames your GPU renders raw. That's why the stuff shake all over the place (the lines aren't overlapping perfectly per frame/image).
That was also addressed in presentation, that technology got better and electricity lines won't be jagged with new DLSS 4, but would be nice to see tests from independent reviewers of course.
@@elivelive I love your mindset NVIDIA won't make anymore cards with good native performance, therefore cards that do that don't exist anymore! Brand loyalty is stupid, why would you care that much about a brand, be loyal to good products, not brands
If you're doubling the resolution, quadruple by area and using DLSS4 frame gen, that means only 1/16 pixels are being rendered by the game. 1920 x 1080 x 30fps = 62,208,000 pixels per second (DLSS4 On) 3840 x 2160 x 120 = 995,328,000 pixels per second 995,328,000 / 62,208,000 = 16
30% extra and it consumes 30% more power. Looks like all they did was whack up the power fed to the chip and upgraded the vram. I’m curious how the 5070 and 5070ti are compared to my 4070ti super.
@@ramborambokitchenkitchen6357 the 5070 outperforms the 4070 but barely so the 5090 will blow your 4070ti super out of the water. the 50 series is essentially just the 40 series with an undersized motherboard and better ai. this is from both vex, gamers nexus, and linus tech tips.
Its concerning how every single tech channel who are able to have the new RTX GPUs are avoiding to say that out loud, or say "well it doesnt bother me" just so people start accepting this AI crap everywhere. Im happy that I was able to find your video among 100+ of videos shilling Nvidias "best release".
Accepting? It's been here. They are merely marketing it to you now that's is mature enough in the common conversation of society. Big difference in my opinion.
5070 with 4090 fake frames!. It should be illegal to benchmark and claim as "Performances" if the GPU or Game MUST HAVE DLLS/FSR,Upscaling,FramaGen to reach the "performances" they are promoting as improvement. They are just Crutches using to cover the flaws and unfinished products.
@@Youngsta0100% those who are smart seek out the information. Low watt brain havers just cling to the marketing information and run with it without second thought.
@@carpetburns1222 don’t blame me for you purchasing a product that was a gimmick. I’ll still be upgrading my 1080ti soon, but wait for real world testing before making my decision.
@@Youngsta0 Next time you buy ground beef that is infected with Listeria, I'll blame you for not doing your homework on what brands of meat have had Listeria outbreaks in the past instead of the company recalling the infected meat in the first place. See how silly your opinion is?
@@MumboJumboZXC I'm still playing games at 1440p on my ultrawide monitor without issues. Newer games I might need to turn down some settings but not often.
same, got myself one of the VR ready ones way back so a nice whopping 11GB of VRAM lol with the industry so messed up and many developers moving into indie studios and projects there's been no need to get a cutting edge card especially for 1 to 2 major AAA titles that end up being "meh" anyway. Gamers are just so desperate for good gameplay these days you could make an adventure game with basic geometric shapes and plenty of people would be happy with it as long as it was fun.
Nah. 12 gig is so you are forced to upgrade. They are purposly leaving a major bottleneck. A fatal weakness. So that your monster of a card needs to get replaced.
COUNTERPOINT: The devs can optimize their shit games. Realism has peaked, polycount has peaked, boredom has peaked. Make a fun game, stylize the graphics, optimize it to run WITHOUT the crutches Ditch UE5 for good measure
I played through Saints Row the 3rd and only recently I have bothered to look at the remastered version and I prefered the OG graphics over the hyper realistic graphics. It's wild to me how I prefer a game from 2011 over a remake released a decade later.
@@iCore7Gaming No, UE5 is unoptimized slop and if you need empirical evidence, Threat Interactive does a great, thorough dissection of all the flaws that UE5 tries to coverup with DLSS, Frame gen, TAA, and the slog that is nanite
as a blender artist and a 3d engine renderer, I hate the AI slop so much, even though I bought an Nvidia 3080 ti, it was because Nvidia just works better than AM, as comparing between AMD radeon R9 Fury X watercooled gpu to Nvidia 1660 super, even though they are roughly equally powered, with AMD i had constant crashes, blue screenings and shredded pixels when using blender (games worked fine though) but then Nvidia works so much better not to mention I actually got to use the gpu power and not render with CPU which is like inly 1% of the potential which AMD would have provided if only I would have been able to use it but I couldnt because there is no support. sure nowadays with blender 3.1 and above, there is options for AMD cards but that requires a rather new card, along the same age from the start of the 4000 series from Nvidia or even newer (to this day I still cannot navigate the names of the AMD cards) and needless to say, I don't fancy getting a weaker card just to have more crashing, more blue screens, more horrible user experience if i want to do anynthing else but gaming. NOW this AI slop, Dlss and the other bs from Nvidia does 0% contribution to making rendering faster or running the whole software in the first place other than makes the already obnoxiously expensive cards even more expensive. its just bullshit around bullshit and there is no way around it, because what can I, a random nobody do? just wait an eternity like a kid with a wishlist for santa and let daddy nvidia make a perfect gpu just for me? just being real, that aint ever gonna happen which is exactly why I am bothered about it.
Exactly this man is lying they got exposed by steam charts and when the ps5 pro launched nearly every pc channel talked about how to build a ps5 pro for $700
He did say the "5070" was midrange and most gamer would be around there for GPU purchasing. NVidia is pushing them to the 5070 TI as that 16gb is really the new minimum needed for current new games.
@RawRustTW8253Nvidia is worth 3 and a half trillion dollars (3,500,000,000,000) with almost 30,000 employees. Obviously the average person isn’t going to be able to make a GPU, but it’s not like you need to be a chef in order to know if food tastes good or not. The same applies to GPUs, nvidia is just spamming AI buzzwords to try to increase their market evaluation as much as possible at the expense of the consumer.
The people riding the 'nuh uh, bruh, it's not REAL frames' bandwagon are even more hilarious. You sound like obnoxious Vegans. "Don't eat that, it's processed and got teh Glutens, bruh!"
We already know from digital foundry that 4x mfg compared to 2x mfg adds barely any latency plus on dlss4 ghodting, image stability and latency itself are vastly improved. You literally cannot compare dlss3.5 and how a game runs and looks right now to what we get with dlss4. Its an entirely different neural model.
This is what I’m all upset about. I grew up to each new card generation having some mind blowing Nvidia Tech Demos that still are cool to look back on. PhysX Smoke/Water/Cloth physics in that one Batman game still look ahead of today’s games. Nvidia stopped giving us more powerful cards that clearly show the games looking better. Now it’s just AI to fake frames to pretend it’s actually more fps. That’s garbage. If I want more frames I want them natural as it can get. No post processing bs like TVs. Get back to making cards that can do more complex shaders and water. Get on water. Smoke and fire too. We still use flat fire and smoke textures. Again Batman Arkham Asylum I think it was? Old game. But SUPER COOL SNOKE! Keep on Ray Tracing. But leave out the AI generating crap.
I liked PhysX too. But that was already so malicious back then. It seemed to me like a feature that was mainly there to make other cards look worse because it only ran on Nvidia cards. It was forbidden for other graphics cards. NVidia gave a lot of money to game developers so that they could incorporate PhysX into their games and put Nvidia's logo on them. From the moment NVIDIA stopped paying for the implantation of PhysX, game developers stopped incorporating it into their games. It would have been better if the feature had been free for all graphics cards, then we might still have the feature today because it would have become standard. The physics are worse in games today unless it's an exceptional game that places extreme value on it.
@@kevinj24535 yeah they should have made it open source. They did the same for RayTracing too to an extent. I mean I get it. I was team Nvidia because of it and don’t have any regrets. I still prefer them over AMD. But the PhysX needs to be open source by now. Like. Let it go. Promote it without putting the Nvidia stamp on it. That would be the nice surprise. Just booting a game “knowing” what to expect. Then boom PhysX on friggin everything. Hair strands. Fur strands. Cloth. Smoke fire and fog. Water that can be redirected if blocked or pushed or even put into a bucket. I want to feel the same feeling as when I went from GTA SanAndreas to GTA IV. The good old days when “next gen” was undoubtably actually next gen. You couldn’t even pretend that you didn’t notice a difference. Once we hit PS4/XBONE I feel like maps just got bigger. And grass. Raytracing is BARELY even noticable. Oh “more accurate shadows and reflections”. Shadows. Ok. Reflections that’s cap. GTA V RayTracing update reflects some cars in the wrong color and buildings are LOD models. Not really worth the update. I’m looking for Wall-E/Toy Story graphics by now.
@ agreed to the max. Or atleast devs making their own version by now. Valve did a thing with their new engine tho. Definitely in the right direction. I hope they use their fog tech for the next “forbidden” Portal 3 game. Volumetric fog pouring out from portals shot in a foggy area would look absolutely incredible.
The CEO specifically bragged about how little frames are actually real in his presentation. NVidia consistently brags about their AI upscaling and AI generated images. They aren't lying to anyone. They are pretty upfront about this. Their entire fucking marketing is about AI shit.
in terms of GPUs this is actually something great that will be improved later on , a lot of unreasonable hate cause there is evil word "AI" that noone really knows much abaut
@amboyman DLSS has already been working really well for me since like version 2, but even V1 was a clear improvement over bilinear upscaling. DLSS is quite clearly no bullshit, it's honestly 75% of what makes their card worth the price tag compared to competitors. You just have to understand what it actually is, and whether it's useful for your own use case. But we'll, RUclipsrs like this and plenty more cannot stand Nvidia being successfull and just have to make up strawman arguments like the title of this video.
This is a stupid argument. If you actually read into DLSS, the arguments against fake frames makes no sense. Essentially, it's a tool that increases transistor-to-frame efficiency and the input lag is reduced through reflex 2 which halves the input delay of reflex 1. Why is there so much hate for the word AI? This is an exact example of how AI is used for good in optimizing transistor usage.
@@maxmustermann3938i dont know man, unless you are running a higher resolution screen, DLSS makes things look fuzzy. Not to mention the lack of VRAM and raw performance means that DLSS runs worse at some points. I hate the micro stutters my 3070 TI gets, and I do not think shilling over more money to team green is going to do anything. At this point I am going to switch fully to AMD to at least get more vram with SAM memory.
7:30 well, if they add one frame in between each actual frame, I can pay 275 actual dollars and then add in between each one a hand-drawn monopoly dollar. The one extra hand-drawn dollar is just my treat, so that they can keep researching some more advanced con tactics with my monopoly money.
I disagree. Frame generation is good for single player game or casual games. For a competitive game we're not worried as much about ray tracing and are probably dropping the settings a bit and running at 240+fps. The difference between the 4090 and 5070 is negligible where it counts. I suggest upgrading the 4090 card to a 50 series as soon as possible; even upgrading from a 4090 to a 5070 will be huge. Nobody wants an old 40 series card in 2025. Get with the times.
@@Cenot4ph yeah, I agree. I think it's because 4090 is a peak of graphics card technology. Where improvements would only mean bigger size, more power consumption and of course unholy price. So they came up with this solution. I guess it's a new era of rendering graphics where every game developer would be forced to implement dlss and frame gen in their projects.
@@Mothmook no, the 4000 series is where this new paradigm is starting. If raw graphics power would work, AMD is able to compete going forward doing just that. My opinion is that AI rendering is the new way forward because it's performs significantly better
Yeah. This just shows that people will buy up anything Nvidia says due to their position. If you really think a 5070 is on par with a 4090 with its minuscule 192 bit bus then you need to be slapped. Lol.
@@Iridiumcosmos you and a lot of people don't seem to understand much at all it seems. This is using multi frame generation in comparison with the 4090 frame gen. This is not a raw performance comparison. It really isn't hard when you apply your brain for a second
Ray tracing became way too big. I, for one, never cared for it. Why would I be excited about realistic lighting? I experience realistic lighting all the time irl, seeing it simulated doesn't make me orgasm I'm sorry. I'd rather say no to it if it hinders performance too greatly. Traditional lighting solutions are fine. No to DLSS, MFG, etc. together with ray tracing. Can I get a GPU that just renders normal shit efficiently? Don't make me overpay for RT cores while also putting software crutches all over the product. We're gonna see an influx of horrendously optimised videogames that are simply unplayable without fake assists. Sad.
now I dont know a single thing about computers or what any of this means, but if ur getting 200+ fake fps and the game looks good and feels good, whats the problem with that
Glad you’re covering this sir. The prices seem “reasonable” if of course the performance is there but I’d say that’s a long shot… a 4090 for $549? I doubt it without assistance from upscaling etc etc
To be fair it does outperform the normal 4070 and is cheaper than it on launch. So at least it’s definitely an upgrade, though they shouldn’t have marketed in the way they did.
Who cares if it's "assisted" or not? A graphics accelerator is also an assistant to the cpu, but you don't see people complaining that nobody is using software rendering to test gaming cpus anymore
@@rithuikrajeev8039 yeah that’s the argument, the 5070 with all enhancements outperforms the 4090 with no enhancements. Yet nvidia markets the 5070 as if it’s better than the 4090 with no enhancements. It’s a sticky road for them to choose to drive on but whatever, they did the same before.
Funny, I was just commenting about fake frames and YT decided to feed me Muta's perfectly timed video. Personally I think AMD should target raster performance, then heavily advertise "no fake frames" and they could take 100% of the gaming market from Nvidia who's lost touch with their consumers.
I like that you are not one of those people that advertise to sell their top-end GPU every generation to buy the new one. Seems to be a trend now :/ Like the housing market, where the houses are now mainly advertised as "investment opportunities" instead of actual housing
In the end, does it actually matter? I can hardly tell the difference between the upscaling/frame gen and normal, and the FPS boost is most certainly worth it
No it doesn't or everybody would be buying AMD like they claim they will but don't. If a company was lying to you, you wouldn't spend $1,500 on a gpu like Mutha did. Lmao
I genuinely believe that frame generation exists to compensate for poor optimization and spec requirements for spec requirements sake. What are we even getting out of these games coming out and "requiring" ray tracing when something like the new Indiana Jones game could be developed with lower hardware targets and still be functionally the same game.
@mind.journey BRO REALLY THINKS COST SAVINGS WILL BE PASSED ONTO HIM AS THE CUSTOMER XD. Bro, wake up. Cost savings havent been passed onto the customer in a hot minute, in really any industry. These companies dont properly compete with each other anymore, they settle into Nashian economics niches, monopolize a small part of the industry, optimize things for themselves, and then pocket the difference for thesmelves. That is what drives them to make a crap DLSS card for unoptimized crap games that are hyper realistic at 1 frame per second with a ton of bugs. Nobody's actually competing to make a good product, just make products that are smoke and mirrors. And somehow, their budgets wind up bloated ANYWAY, because the same lack of competition means they dont have to be smart on how they spend. Case in point, want games to be cheaper? AAA devs could fire their entire DEI kakistocracy staff, hire a much smaller team of actually competent devs, and give them the time and paycheck they need to stick around and churn out quality product. The could hire one decent writer instead of 10 diversity hire RTX developers who got their CS degrees in a cereal box using ChatGPT to code. Companies dont bother optimizing their staff or their games already, because even if people like you finally have had enough and decide you dont wanna buy their slop anymore, then daddy Blackrock will fund their next game anyway so long as it stars a token lead. Literally, you wont see a dime of savings from a 2,000 dollar AI slop generator "graphics" card.
@@mind.journey Glad I didn't have anything in my mouth when I read that comment. Have you ever seen price savings get passed onto the consumer in any meaningful way? This has been going on for much longer than covid, but covid really cemented the idea that you can price things however absurdly and the customer will still buy. Can't really say much more that hatman already said above. And it goes without saying, this thing about requiring raytracing the OP mentioned is perfectly in line with MacroHards Win11s hardware requirements. Short of staying on 10 or switching to linux, lots of ewaste will be coming to a landfill near you. Planned obsolescence anyone?
Just sold part of my Nvidia stock to protect my profits, but I'm holding onto some for the long run because of the company's strong growth prospects. In addition, I'm thinking of expanding the variety in my $400K stock portfolio, but I'm not sure how to handle risks going forward
I managed to grow a nest egg of around 120k to over a Million. I'm specially grateful to my Adviser for his expertise and exposure to different areas of the market
Thank you for the recommendation and I just checked him out on google and found his website, i will sent him an email shortly and I hope he gets back to me soonest
You mean a company that keeps selling you software features locked behind chipsets instead of bringing prices down or making cards natively more powerful to match the cost they are charging is lying to you? Imagine my shock.
Tell me you don't know blackwell has 25% faster cuda cores. Price is very high tho, at least when you consider it's a worse skew than a 192 SM GB202 variant...
@@NickC-IT I dunno about that. I had around 3-4 Nvidia cards before (with EVGA), but AMD is just better honestly. And the interface for Adrenalin is great imo. I'm with my 2nd AMD GPU, and I love it.
@@LordVader1887 Still fascinating that it's the first time the cuda cores themselves are faster. I only checked 10-series to 40-series and they all did 2 instructions per clock now it's 2.5. So the effective clockspeed of the cuda cores is now about 6GHz
The only way this will stop is more competition. I pray Intel gains enough success that they have the confidence to make midrange-high end GPUs and put then for lower prices than Nvidia and AMD but I don't see that as likely.
lol people only want competition so they can buy nvidia cheaper AMD just needs to make compelling gpus thats it ill buy the best one in that price range if they make a 5080 performance card and its ray tracing is equivalent to nvidia and its cheaper ill buy that one
A lot of people in the comment section really don't understand how input lag works 😂. If your raw performance is 20fps it is impossible to get below 1000/20 = 50ms, because the next true frame is only calculated 50ms later by the game engine regardless of how many fake frames you squeeze in between. You can generate a billion frames in between and the lower limit for raw 20fps will still be 50ms.
Yup, they don't know how lag works, don't know how FG works and don't know how time works, apparently. It's sad to see NVIDIA (and AMD) getting away with this obvious grift. But hey, overlays are going to show 200+fps so "muh 5070 is faster than 4090, hur dur"...
They're trying to do to games what SVP and similar programs do to videos, but it's not apples to apples because of the input lag you described. Watching videos isn't really a interactive experience so it won't matter what the lag is from the added fake frames in a video.
reflex 2 is supposed to be able to match the mouse input at the time of the generated frame and "frame warp" it to match the mouse input. So in theory 120 fps generated frames should feel like 120 fps even with 20 fps raw performance.
I used GTX 970 with 4GB (or 3.5GB it was infamous for😂) until 2 months ago. I mean if it was still able to play not so system heavy games at 1080p then your 1060 can still for year or two. Personally after nearly 10 years with GTX 970 I finally upgraded to 4070 Super. I was at first bit on edge whether to wait for these new GPUs, but seeing as the main benefit for 5070 is DLSS4 but still has the same amount of VRAM as 4070/4070 super I can be happy with my decision. Hopefully 12GB of VRAM is still gonna be enough for 1440p or atleast 1080p gaming for many years to come. Lot of ppl are trashing DLSS and frame generation but the good thing about it is that it actually makes 4K gaming possible even with midrange card like 4070S and 5070
@@ultrahd9826 I mean ofc if coy can afford it, especially because it comes with 16GB of VRAM and ofc even base 5070 is better than 4070 super. If I had the time to wait for these new GPUs I would have but my old rig (its HDD) finally gave in, so I had to get PC bit earlier than I had hoped for, but since new GPUs are released every so often, I guess the timing doesnt really matter, there is always the "next big/better thing" 😄
Guaranteed the 5090 vs the 4090 isn’t gonna be the craziest improvement as it is mostly software changes. Waiting for the new drivers will be the real comparison
@@alexmeek610 Unfortunately people like my husband, flight simmers might etc. im only giving it a pass because i get the 4090 to upgrade my 6900xt out of whole ordeal. There will be plenty of people with more money or credit card space than sense. Probably majority.
Went from not owning games to not owning frames
Thats pretty lame
Inb4 frame generation is behind a paywall. I'm sorry about the game industry. I'm just gonna play my archives 😅
What a life eh!😢
@@alphanumeric6582 I actually laughed at it. But I have a weird sense of humor.
Well, they keep locking software technology (upscaling especially) behind chipsets and selling that to you for a premium instead of making the cards cheaper or natively more powerful to better match the prices they are asking for. In other words, GPUs have been a scam for a while now.
75% of the frames are fake, but 100% of the money you spend on this is real.
and 100% of a game is fake
Nah but why would you turn on DLSS though there is literally no need. If I want the real world I’ll go outside and if I want a fictional game I will play it. We don’t need games to look exactly like the real world. Some idiots will get confused😂
Well, that's been true for a while. The Human Eye cannot perceive beyond 144 Frames per second. That is why 144Hz screens exist. It's the Human Limit. Yet the mindless consumer still insisted we create higher and higher frame rates for them to blow their cash on.
Le Sigh.....
@@HTMangaka First it was 24, then 60, now it's 144.
Looks like the human eye is evolving quite fast...
but the cards are still stronger than the last ones without fake frames. so the price getting lower is still amazing for a better card. just don't use the fake frames
6000 series be like here bro take these pills and halucinate you don't even need to generate a frame
And while you're at it hallucinate that the $80 slop you're playing is actually good too
@@one_step_sideways yoo lol
Core Memory Unlocked: PS9 Trailer
Stolen comment from another video
@MrTesterr should i link his profile bro :)
All frame generation tricks have one huge weakness: input latency. There is just NO realistic way for the system reliably predict the future.
50 ms latency is way worse than input latency in bad early 2010 60Hz televisions. Even streaming games from cloud can be faster than 50 ms input latency. The output can even look good, but the frame generation presented by Nvidia is still hardly usable for anything more than smoothing out cutscenes with latency like that.
did you hear about Reflex 2 and frame warping? This video didn't show whether Reflex 2 was being used or not. Reflex 2 will be able to know the mouse input at the time of the generated frame and "warp" it to match the input, so in theory it should "feel" like it has nearly the same input lag as with no frame generation.
This time they are doing it like its done in VR where input is not tied to framerate
this is why I dont turn on these AI "enhancements" until I dial in settings to make the base performance good. AI is not gonna make 20fps feel like 200fps. It CAN be useful if you already get 60+fps, to then enhance that to more fps. But if you are getting 20fps, you gotta bring down the settings not add AI stuff.
AI is an enhancer to already good performance not a replacement for performance IMO
@@resresres1 they were using it, it's automatically used when it's turned on.
And I expected them to show input latency WITHOUT frame generation.
Personally I don't care what the difference between x2 and x4 is, I care what the diff is between x2 and no FG.
Latency is extremely important to be under a certain level, what that level is is different for everyone, but when I tried streaming game that latency was really bad even for a single player game, at least for me.
Give me raw performance with low latency even for single player games.
I play game like Rogue Trader with 100+ frame, when I limit it to 60, it feels worse.
So can I pay 25% real money and use monopoly money for the rest?
hold up, what a minute... i know you.
@@gren2589 wait you do?
These "fake frames" are frames your cpu wouldn't be able to compute anyway. Even top dog cpus bottleneck the 4090 at 1440p, and most ppl are still at 1080p
Fiat currency is basically monopoly money. That's why the prices are so high.
@@xekisxmr forever
Doing benchmark comparisons with different settings should be considered fraud.
Absolutely. Theres actual monetary transactions that will be made due to this trickery
Is it not considered fraud???
@@defskape0068 false advertising at best but technically the info is on the slide so it's just disingenuously presented
sadly its not im glad i had my expectations low for this generation
Absolutely disagree
It's official boys we can now download more FPS, now we just need to download more ram, and have this come full circle
Every week I end up having to download more ram, no matter how many GB's I download my computer seems to be so slow. I'm gonna download a 32 gb package tonight
@@Stinky1am Check the seal on the RAM header where the line feeds into the reservoir, you may have a leak. If you download large amounts the heat will build up and slowly eat away at the seal. Especially if it cools down rapidy after a long download the epoxy can start to crack.
@Appreciation-Community I heard water cooling is good for heat and performance so Ive been spraying the air intake fans with a water spray bottle
@@Stinky1am make sure it's salt water, it helps to increase the boiling point and stop all of the water from evaporating.
technically you can by creating a swap partition (or increasing the paging filesize on windows), so both are achievable.
Luke Smith did a video about this years ago, tittled "Computers were faster when they were slower" . Basically, nowadays we got so much power on our hands that optimisation is thrown out of the window - and games are just the newest thing to be hit by this, you can put Linux on a 15 year old Thinkpad and still use it for daily tasks as long as you don't use the internet too much - basic office tasks are more than doable as well as the regular desktop stuff outside of gaming and heavy internet use (or in some cases, at all - I for example used a X220 for 2 years as a daily driver). Given that we do so much on the web these days though, things become obsolete despite them not necessairily needing to be,simply because the web is so damn bloated and unoptimized. Now Games have the same issue so we compensate by throwing fake frames.
Gotta browse in the terminal brother. It’s the way to combat the bloat.
On a terminal emulator? Pathetic. You need an external, dumb terminal that uses a hardware serial port. Local terminal emulators are pure bloat. I use arch btw.@@ghost-user559
The greed in those day was not a god to all the tech bros.
@@ghost-user559 Yeah I've used w3m before. Not useless but not totally useful either. It's all the JavaScript that terminal-based browsers can't handle that's the issue. I wish there were more alternatives for the terminal, I remember Brodie Robertson talking about Browsh but haven't checked it out yet.
Yeah, it would be a big help if games were properly optimized. Maybe future games will be designed in a different way, like using voxels instead of polygons and pixels.
Fake frames are the only frames we will get from now on. Sadly.
it's because the raw technology to put into the sillicon it's there yet, we have reached the relative peak of hardware capability to perform - nvidia is now pushing the software, frame generation to be precise
Vicariously I live, while gaming dies
True but like he said in the video to maych this level of performance with just raw horsepower it would cost thousands, consume a massive amount of power and take up alot of space. Ai is going to be the main thing going forward for those reasons unfortunately
@@Conraf Just shitty optimization. Excuses. All excuses.
@@Conraf I know. Doesn't change what I said.
Remember the times when games was so well optimized to run on Native graphics without the need of upscaling and fake frames in order to run smoothly because the devs actually cared about polishing their games before release , good times.
Marvel rivals runs smoothly optimized without all those upscaling stuff needed, same goes for elden ring, etc
Remember, depends on who the dev is and how they optimized memory usage
What’s funny is that games run better even natively today than before. It’s just a human nature to cry and remember the good old times that actually never existed
You mean when 30 FPS was considered to be standard?
to be fair, no one played at 4k back then. Most people were playing at 768p or 1024p, very few could play at 1280p.
@@Mohmar2010 nah man. elden ring runs like shit for the horrible AA that it has. even on fing 3260p downscaled back to 2160 (using DLDSR) the grass is still shimmering xD artistically it looks beautiful except for bright indoor areas that just look like massive light bleed and no shadows (ambient occlusion). for the graphical fidelity that the game has it actually runs pretty bad.
A nearly $600 card should be able to run games at 60fps without any AI bullshit. Games are WAY too unoptamized and its actually disgusting.
It can run actually any game, but not in 8K.
@RPGuy1 wasn't Dragon's Dogma 2 struggling on a 4090?
ai bullshit? dlss 4 is not "ai bullshit". it's anti alaising and upscaling. It is a form of optimsation and actually looks BETTER than native.
I don't get this dislike for AI. As long as you can make it cheap and I as a consumer can enjoy videogames by shit working with practically "magic". Welcome to the future bros, get on the boat or be left behind. This isn't hurting anyone but you guys who keep on crying about it instead of getting in the race and innovating.
@@RogueRen dragons dogma was mostly struggling due to CPU bottleneck
Optimization: Gone
Frames: Faked
Hotel: Trivago
underated comment i cant lie lol
This is like a bodybuilder injecting oil into his arms to make his muscles look bigger.
So steroids which helps you build more muscles? So you’re saying that frame generation helps with building more frames?🤦♂️
@@DMTEntity88 no he's saying its like synthol
@@DMTEntity88 no synthol. dum dum...
You mean steroids
@@Somethingfs-sx1ft nope literal oil filling your biceps making them turn into water balloons.
they don't show raster performance anymore
They know what they are doing, Nvidia has always been heavy on marketing too. They've been misleading people for a while.
They know they are making the trade from raster to fake frame gen. It is intentional. They hide it because they know what the consumers want, 100% raster performance, no fake frames. There is a move AMD could make, to take the whole market. Target raster and push advertising "raw hardcore performance, no fake frames" and the gamers will flood over. Most of the comments I see are hating fake frames. AMD should treat their version of DLSS frame gen (FSR?) as an optional additional benefit for those who want it - but they should focus on raw performance.
Because the only card that pushed raster in any meaningful way is the 5090 which is about 30% over the 4090. They have no cost effective way to push raster anymore, dies are already huge and power requirements are high too. They'd need to move to chiplets to make it more affordable but that brings its own issues and power requirements would still be high. Using AI models and path tracing is the only way to still move the graphical needle forward without making crazy expensive hardware that few could afford. AMD/Intel aren't even bothering with high end anymore because Nvidia has too much mindshare and they also know there's a raster wall that Nvidia is already hitting up against.
@@MalinkadinkIt's not only that but also that we are reaching the limit of what's physically possible in terms of transistors sizes, we literally can't make them smaller anymore they are already only around 20 atoms wide
as a 3D environment artist DLSS, frame generation and temporal anti aliasing is a serious plague in gamedev. they're bandaid solutions to much deeper problems and youre sacrificing so much information in the g-buffer which determines how the game will look; if you can't get your game to run & render at 16ms frametime (pretty much 60 fps) you're just not doing it right. upscaling tech should be used to make games performant on systems that don't meet minimum requirements for example, the artifacting in games made by large studios with big money invested in it is abyssmal.
it isn't easy to get work at the moment especially as an artist and seeing games being dropped with pisspoor performance and even worse, gross graphics that don't even compare to much older titles (in some cases mid 2010s games) is just a gut punch, optimization is like 70% of the job as an artist creating game assets. This industry needs an overhaul..
EDIT: apparently a lot of you cannot comprehend anything longer than a twitter post, DLSS isn't inherently bad; it's being used to cut corners and it is rampant.
DLSS is NOT a bandaid though.
@@iCore7GamingOP is saying that DLSS is being used as a bandaid solution by game devs, not that DLSS itself is bad
We just don't have the tech! GPUs will become the size of a computer case and require 3x the power output! AI is the fix until someone develops a breakthrough.
Poo in the loo!
Sadly it is a war of attrition.
People need to talk about the incredible visuals of your game.
Screenshots and youtube videos look great with TAA/DLSS and sadly that is all the studios care about.
I feel like we lost the days where the community could hold the industry accountable.
This issue is sadly in such a difficult spot.
Many people don't notice, care or know any better.
Maybe some day we can return to sharp images, high frame rates and low latency...
At least the indie scene is doing great and are less corruptible.
My biggest issue with the new graphics cards is for games that don’t support DLSS. What about VR systems?
Rendering inside of a VR headset is FAR different to a flat screen game and I personally love to see VRAM and raw improvements.
That isn't true. VR is flat screens.
@TheShamefurDispray Correct, however the screens, and games themselves don't use DLSS or any sort of specialised rendering. SteamVR specifically uses their own "Motion Smoothing" frame interpolation for some headsets, but a LOT of people turn it off because it looks awful to have "generated" or interpolated frames in VR. It literally makes the world warp around you and it gives motion sickness in some cases. Raw performance IS required compared to a flatscreen game where software really can make a difference!
You want more vram and raw performance but are you willing to cough up the money???
@@nonstopnewxy204 that’s where I wanna see more innovations in that side of things. If we have an 5 year old RTX card with high VRAM already, I’d love for companies to look into how we can utilise the VRAM better sure, because we shouldn’t NEED crazy high vram as long as things are optimized. But also it would be good for companies to look into how to make it cheaper and more accessible and possibly just climb those numbers. Unfortunately “cheaper” isn’t something that makes companies more money though. So why do it?
I just finished installing a new 4080 into a new PC actually haha, and the VR performance is pretty magical compared to my last card (as you’d sorta half hope/expect lol)
It wasn’t cheap, and I wish it wasn’t pricey, but at the same time I’m not sure if I’d rather have waited to get a 5070 or stick with this new card. Only time will tell to see the real world performance y’know?
Its crazy if this is where we are at, showing a new gpus performance using dlss and frame gen is stupid.
people championed this.... upscaling and fake frames......
@@caliginousmoira8565this might be a good thing for people who only game or mainly plays game.. but the thing is those who play game on pc also use it for other thing like render video, render photogrammetry, doing some calculations on the stress of structure, architect, civil engineer, mechanical, physics require a beefy gpu and cpu.. making fake frames on a game won't help speeding up work process..
@@redirectories648 And nobody would choose to upscale and generate fake frames if they didnt have to... I was very excited for 50 series but now I'm just gonna skip them and wait for 60 series... No point upgrading to a GPU that is not even a real upgrade without relying on fake resolution and fake frames.. No thanks.
The most insane part about it all was that when Linus tech tips was allowed to play cyberpunk 2077 on the 4090 and 5090 he was not allowed to change any of the settings nor turn the dlss off. So we literally did not get to see what the performance difference was of the 4090 and the 5090 without dlss / frame generation. Also, it's disgusting how every single time these companies give out these benchmarks they always do it on dlss set on performance as if everyone is just going to automatically accept having the worst quality image of an already downgraded image to begin with as the industry standard. They're pissing in customers mouths and telling everyone its just lemon juice at this point. 😮💨
28 fps full rt max settings 5090 no dlss and 20 fps rtx 4090 full rt max settings...
Wait till embargo iz removed check stats and then buy card
i remember when oblivion, gta 4 and crysis completely destroyed higher end systems.
this was way before 2010.
nobody should trust on linus tech
@@DaVizzle_Bro yeah compare a 4090 to a 5090 and there are no difference
I'd rather buy intel or amd gpus
Also where is the 5030 or 5050 or 5060!?
The problem is we are comparing a normal 200 fps with 4ms, with a fake 200 fps with 57ms.
Frames don't matter anymore if I can feel the difference between 60 and 120 fps in a 60hz monitor just by the mouse's latency. I don't even wanna imagine looking a 200 fps with a 24 fps latency.
This has to stop, there's no way I'm buying this garbage.
THIS.
They seem to be pulling some foviated rendering magic though, as seen in Linuses hand on video.(Fringing on screen edges and gun model)
If they pull it off, they might manage to "hallucinate" enough frames so that the movement feels like as if your camera is detached from rendering pipeline, like in that one video, but the game logic is running at 24fps, so lower damage numbers in Marvel Rivals, slight delay (ms) between dealing damage and hit registration) and other shicannery that comes from frame rate tied effects.
They are making an effort to lower latency. Look at nvidea reflex 2. Supposedly lowers latency by also using generated frames down to 2ms
did you not see their latency reduction tech? litterally cuts latency from 60ms to 14ms... you lot have no point.
What is mormal
As a game developer, seeing these comments gives me reassurance that traditional methods of rendering will still be preferred over AI rendering.
It won't stope the entirety of big tech from trying to shove AI slop literally down our throats in for at least the next 2 years
until the AI winter finally come and silicon Valley will focus on some BS else
And about optimizing the game over ai frames bull...
No way! Youre telling me a $549 card won’t outperform a 4090?!? They wouldn’t lie like that!
I hate this frame gen and upscaling bullshit. I think frame gen going from 60 to 120 fps is fine in singleplayer games, the input delay isnt that different. However, when you upscale 60 to like 240, you still have the 60 fps responsiveness with "240 fps", it doesnt feel smooth to play. Also, all the tests are using slow movement, however, in a game like cyberpunk, you do a LOT of mouse movement, its weird that they never show those types of movements, it feels like they are sort of hiding the downsides of frame gen.
@@grandmamp4015, CAN'T WAIT TO GET A 5080 LMFAOOOOOOOOOOOOOOOO!!!!
Nonono u don't understand it's with their dlss they will get the same frame rate in dlss4 games
@@triceracopp At least I can dig DLLS, it kinda works in its rawest form. But once you add Frame Generation and so on OOOF. But yeah REQUIRING IT isn't really what you hope for.
Do you really think someone would do that? Go on stage and lie??
50 ms of input lag IS INSANELY HIGH. We have now mice that has 4000Hz or more of polling rate to achieve 1ms and lower input lag and then they add technologies that add 50ms or more
Most of that lag is caused by a monitor with a low refresh rate and a game you've set the settings way too high for that monitor.
Reflex Pro exist.
It's really not that high at all. I've hit the highest ranks in competitive in games like CSGO, PUBG, Valorant, Siege, and Overwatch so I'm definitely at the top 1-5% and I can BARELY feel it at all in a game. Obviously don't use it in any of those hyper competitive games, but in the demanding single player titles? You're completely fine dude. It's 99% placebo 😂
@@czproductions its not. I have a 4070 and tried frame gen on cyberpunk with DLSS and while it got VERY smooth it was completely unplayable. My mouse felt like it had no shit like half a second of delay. It was that bad. But again I am one of those people that can actually feel the tiny input lag that VSync gives... but I am not a competitive player, and still I notice these delays. Many people will too. Frame Gen is just out of the equation for me.
@@azenyr What resolution and what framerate? I've used frame gen in every tittle and if you get 80-100 FPS with frame gen it feels smooth.
My 1080ti: im tired boss
Lmao I actually picked up one and a 1070 for 2-3rd computers in the house. The RX6xxx was ok. I have one. But this current market can kiss my
Keep hanging in there gtx
It's time to sell boss
what a soldier the 1080 Ti has been, and still is. You'll still play everything with it on 1440p, without any upscaling bullshit.
My 1080Ti still going strong lol
Idk how I feel about these new gpus
The main problem for me is that you can't have Vsync enabled with Frame Gen. Tearing is a big issue for me so the performance isn't worth it.
Yea but at this point most newer monitors have free-sync
@justinhainsworth7726 Oh yeah, I didn't think of that! That makes me feel better if it works well with FG and DLSS. I don't know much about that stuff honestly.
If i will get fake frames,then i will buy it with fake money
Have I got some news for you...
Relax, it's still a great deal for the computing power, so wtf. I'd still buy it, but I feel like there's gonna be a huge shortage
So you’re selling your doge coin for it?
There's a difference between Artificial frames and fake frames. Like atleast get it right, because these BS companies will keep ripping you off.
they said u can over clock to get 200 fps already my opinion I'd rather have a cheaper card but with ai than an expensive 120 real frames card but has less base fps than the ai card
Ah yes, the good ol days where re-encoding 480p to an 720p use to be a thing. But this time, it's with AI.
Reject modernity, embrace tradition... Except price, you still have to pay us 4x as much as a flagship in 2010.
I feel like the people saying "No one cares if it's fake frames" has never used frame gen before. You'll get more input lag and artifacting, making it worse than just having a low frame rate. When it gets better sure, but saying 5070 with DLSS and frame gen will be the same as running the game natively on an 4090 is BS.
Nobody cares except a small 1% of the market. They will never see the difference. It's like the Audio or keyboard market where if you aren't sensitive to it and more likely have not experienced it (ex. Mechanical keyboard) , the experience you currently have is already the best experience.
@@surft I disagree, most people still have a mid range PC nowadays with 1080p monitor. The ghosting effect is very noticable when you turn on FG and any ai upscalers, they just dont compare to native. I will happily play nfs 2015 at native 4k60 over marvel rivals at blurry 4k (and often crash anyway). The lag is not noticable but it is very noticable in competitive shooter/fighter
reflex 2.0 destroys all of your arugments.
@@iCore7GamingI hope it does but it's not out yet, so we really don't know for sure. It ain't like some secret the companys will cherry pick percents that make them look the best. Don't get me wrong I would LOVE free frame but as of now frame generation is far from "free frames with no/minimal costs"
@@surftYou seriously think people can't tell that they have 1/2 a second of input lag or the screen looking smurry if you move too much. Like I said would love for frame gen to improve and be a great setting for low end users, but as of not it's just now ready for everyday gamers to use.
DLSS was originally for upscaling to your modern 4k display, and Frame Generation was for getting smooth motion on 120Hz monitors. The problem comes in when marketing uses these to inflate performance numbers, and devs use the tech to NOT optimize their games.
I wish they focused more on improvements without AI and now you are paying for both software and hardware
You are paying for the extra hardware. But you need different software to drive that hardware. This is how it has always been.
Dlss 4 is free and it offers improvements even for 2000 series cards
Good Luck Nivida Physically can't improve preformace without Ai. Not unless you got a pocket that's a black void full of limitless money. On top of that people can't wrap their heads around the idea of current technology being limited. They they make new technology and new way of doing things. You won't see nothing but Ai getting better and Ai technology taking over. That's the future and that's the New technology. But people hate it because it's not currently up to standard or quality they want. But once it is people who complain will forget Ai Upscaling is a thing.
there is a limit how far they can go without ai
Wait but dlss4 is just on 5000 series
first two replies are so st*pid lmao
Dont delete this video pls
😂
he will
why? its a very bad video ;s
@@kr1me2000 for real lmao, didnt expect something like this from mutahar
dang shill harder
"You will consume the AI slop frames, and you will pay extra for it." -Nvidia
"AI slop" you people really are stuck in the past lmao!
You will own nothing and you will be happy :)
@@timeeternal5756 ??? that argument makes no sense here.
@@BruvvaJosh Calling people stuck in past, just because they don't want to use crappy technology is also argument that makes no sense
@@BruvvaJoshits not getting stuck its staying in the boat that isn't sinking, wether its new or not its rational to want to stick to the better technology, get off the ai bandwagon lol ai is not gonna solve this
"Fine for a single player game" Maybe if you're playing a walking sim or on a low difficulty...
The more you buy, the more Mudahar talks about switching to Linux
a keygen church pfp? I havent seen that before
@drintrovert4564 I was just thinking this, holy.
Switching to Linux is another topic.
@@valenrn8657 no pfp with bot response, dead internet theory
@@thomaselvidge so making a short comment is now being a bot?? Not that hard to click on the profile.
*I could use FSR Frame Gen on my RX 580 and get 120 FPS in a game, then use Lossless Scaling 2x to turn it into 240 FPS!!!*
I have a 4090!!!
yeah. thats usually how new tech works. Bravo
Jan 10th this will be a bigger reality
You cracked the code well done
Tell me you no nothing about AI/ML engineering without telling me you know nothing about AI/ML engineering
@@tag89 Tell me you can't understand a joke without telling me. 🤷
Great, now I can finally afford a 1050Ti.
I have mine since 2017 but i wouldn't be able to afford a new one these days LoL. In my country the prices are crazy even for old GPUs, at least when you reach the big budget the differences aren't that big between them, but anyway the entry point became sooo high, unbelievable
Honestly I am still running a 1080 for years now, and it's only been in the last 2 years I have really found it lacking when it comes to performance in the games I typically play and I think alot of that has to do with new games just not being as optimised as they used to be.
but you just got a 980ti!
You can get a used rx 580 for $50
@@TwoPlusTwoEqualsFive32 I upgraded my 1080 TI last year to a 3090 TI. I still have the 1080 in my second PC I have inside an arcade cabinet that feeds a second driving arcade cabinet. Forza 4 still looks great on it.
2:39 Hey weirdos watching in portrait mode 👋
**** you
What's up homie 👍
I'm not weird. I'm gay.
😅
From Graphics Processing Unit to AI Generating Unit 😭😭
it where its been heading im just amazed that anyone in these chats that consider themselves a pc enthusiast didnt see it coming or is this just fake outrage ... smh , when is saw the size of the 5090 founder before he even said ai i was like yup we here ... that card should be big as or bigger than 4090 ...
Honestly if they would separate the two it would be kinda cool. People who want it can get it, and if you want plain ole raster you can get that.
Remember when graphics cards offered real improvements, and not just elaborate smoke-and-mirrors acts? Those were good times.
You mean like Anti Aliasing? Aniso Filtering? Ambient Occlusion? I suppose if you just got here in a DeLorean from 1985, you remember. Except there weren't graphics cards in your PC then, which have ALWAYS offered some special effects to help games run smooth.
"Game sucks, bro. My character is just polys with a texture skin over it. There's no real guts in there."
@@WindFireAllThatKindOfThingDLSS upscaling? to improve my frames at little cost to clarity? fake pixels, don't want em.
remember when graphics were at a point where there COULD be a massive upgrade of rasterized performance? come on mam think. They cant keep pushing rasterized performance and make cards affordable and small, and scale to the laptops. ai is necessary, and ai is more exponential than rasterized potential growth. think!
@@Thegooob95 i know i was kidding i guess i just didnt make it clear enough HAHAHA i am pro-ai increasing frames and upscaling and whatnot. i remember when dlss first came out with Control and it was really blurry, and now the drawbacks are damn near imperceptible. i just hope that, like DLSS upscaling, we see that same noticeable generation-on-generation gains with MFG and ray reconstruction (especially w/ regards to latency on MFG)
@@Thegooob95 What are you on about?...literally only last generation from the 3090 to 4090 there was a massive leap in performance (75 to 80% without AI generation crap - RAW performance), AI has it's place and is one metric but the fact is raw horsepower is still a valid metric, especially in regards to input lag. I want to buy the 5090 but at the asking price it's not even worth it... I'd happily pay $1999 if the performance increase was like it was from the 3 to 4 series... in raw power.. it might be but spec numbers are not adding up.. and we are talking about flagship cards here not little15w chips for laptops or handhelds where yes I agree AI can be used a crutch to help make things more liveable.
4080 user here. I almost NEVER enabling DLSS.
If I am playing singleplayer game - I wanna see awesome picture.
If I am playing multiplayer game - I wanna see my enemies and not the pixel mash.
That's idiotic. DLSS on Quality (if implemented correctly) is basically free frames with no harm for the visuals. There's no pixel mash, most of the people won't even be able to find the difference.
@@delacroixx
Idiotic is believe to every corporative bs coming out from any of hardwarecreators.
LMAO, dude check the videos from Hardware Unboxed for example.
Simple example, I played Ghost of Tsushima, and I can tell you that regular picture is WAY MORE sharp than DLLS with Quality preset.
DLSS in some games can be really good, but it is good in first place of lower tier PC to get better frames in exchange with some not critical imagine quality.
I got a 4070 I hate dlss always turn it off, would rather turn graphics down for more frames
@@DanteX3M buddy, I'm using DLSS all the time. In some rare cases it is implemented badly, I agree, but most of the time it's a great way to boost performance without losing anything visually really. Framegen sucks tho so far.
@@delacroixx Then you have very shit pc if you need to use DLSS... No one would use DLSS unless they had to... No reason to..
just wait for real benchmarks where everything is laid bare and then make your decision
RIP to well optimized games, unfortunately we'll get only fake frames and AI slop with artifacting going forward
at least we'll still have indie games
Who cares, as long as game is smooth it’s fine with me
@@TheRealRooslingthe problem is it’s not smooth it just looks smooth even on a controller you can physically feel the difference, any real pc enthusiast will never agree or side with you
@@TheRealRoosling if you really like your game being smooth, you should search up latency.
ai slop? you really don't understand what you're talking about do you?
its the epitome of modern society. instead of working on improving the actual tech, they focus on only making shortcuts to make it seem like its improved. now the shortcuts are so advanced that it makes you wonder why they spent all that time/money on that instead of just improving the actual tech
What is the benefit of extra frames is there is no improvement on input lag? I just don't get it. It's like turning on that motion gimmick modern TVs have
Reflex, they're doing a whole new version of it just for this, cuts the input to like 28 ms😊
People now enjoy playing games without Vsync turned on, in a sea of input lag, just for the numbers
For me, it's really nice if you own a high refresh rate monitor and a GPU that can get above 60FPS but can't do 144hz. For me, a game running 80FPS and frame generated to 130-144FPS looks much better and doesn't feel completely miserable to play.
It's also nice on games with hard framerate caps, like Sekiro. You can turn a 60FPS cap into a 120FPS cap and that's just much nicer for the eyes.
This is only really an issue those that have a bottlenecked CPU, keep anti aliasing on, or keep a low to standard refresh rate on their display and playing 4k.
It will be a matter of taste - there will be artifacts that some people never notice but others find intolerable
When they did this with TVS, sports enjoyers got really mad because it had a habit of making the ball disappear 😂
Jensen comes on stage, announces new level of price gouging with "AI/Crypto" as an excuse, gives himself a pay raise, rinse and repeat each year.
Pretty wild that people think current prices of cards are "not that bad".
IKR. I have always gotten the 80 class of cards (780, 1080, 3080). I knew the pricing would be rough, but $1,000 MSRP for the x080 card... man that sucks. My 3080 (10GB) feels like it's just starting to show it's age in the latest AAA games. Mainly because of the 10 GB of VRAM (and I'm on 1440p not 4K). Which is why the 5070 non Ti is a complete non starter for me with 12 fking GB of VRAM for $750. Kiss my ass on that one Nvidia. And the "5070 = 4090" thing I call bullshit on already. Nvidia of course means select situations, using the latest DLSS with the now 3 fake frames shoved in-between every real one. Which again I'm sure is all shit that only works on the 5000 series anyways.
@@sean8102 same here kinda annoying that the 3080 only has 10gb of VRAM otherwise its a solid card, might wait til the 6000 series honestly
It's a complicated matter. Low end cards like 4060 have absolutely garbage value but high end gpus that let you run games with path tracing are absolutely worth it because you see actual improvement in graphics - which to be frank, is in like 3 games right now but that's the developers' fault
@@pingeee 3080 is the new GTX 970
At the end of the day we're just some ordinary gamers...
Ain't that the truth.
You said the thing!!
“I’m a hawk too” ahh cat 🤣🤣🤣
Wait, say that again...
@@Bhante-r1r LOL
This Jensen guy is the new Todd Howard with all the Stockholm Syndrome people worshiping him
He even has a signature jacket
“Ayy Eye” “it just works…”
Snake oil salesman 😂
More like Stockholder syndrome
People worship Todd?
2:08 massive?
You know what else is massive?
LOOOOOOOOOOW TAAAAPERR FAAAAADE
beat me to it
I’ll be excited when Gamer Nexus own these gpus for review. We’ll see if there are actual changes.
30% bigger die and 30% more Power. That's all it's gonna be
Yep, that's what I'm waiting for. Already feel 99.99% confident the "5070 = 4090 performance" claim is pure BS. There will be a few niche titles maybe, and of course only when DLSS is on (prob set to quality), of course frame gen etc. Heck Nvidia's claim probably counts on using DLSS 4 features that don't even work on the 4090. Either way I'm pretty sure plenty of people will be upset when they just pre-order a 5070 believing Nvidia's claims and own benchmarks blindly. Already seen plenty of comments on various tech news sites, and the nvidia sub-reddit (guess not much surprise there) of people believing it 100% zero doubts.
@@kiillabytezfacts over feels bro
I would be equally intrested to see what testing Hardware Unboxed does too as they have previously tested all these Upscalers against each other and DLSS was doing quite well in those vids, so I'm very curious to see how the new FSR 4 stacks up against the new DLSS 4
GN is already on DLSS/FG boat.
Look at their last card review. 40% of the video / charts are with those "functions" ON.
Fact-check me.
The main thing I feel when I go from low FPS to high FPS is how much better it FEELS, not how much smoother it looks. The thing I'm looking for with higher FPS is the lower input lag. I mostly notice FPS when I move the camera and it feels awful when it starts getting below 50FPS to me. There is no point in increasing the framerate if the input lag doesn't also go up, unless you're playing games that don't have much input to notice the lag.
9:19 Important to understand, the less base framerate you have the worst it feels. This is why those fake frames works best of the 1000$+ GPUs, not the 500$ one, and why they cannot use these features to claim the 5070 is as fast as the 4090, even IF you use all the features they introduced here. A 4090 will still have a much better framegen output even if it use x2 boost while the 5070 uses x4, the later will be laggier with a ton of wierd/blurry frames.
Looks like I would be searching for an high-performance AMD PC in 2025.
Thank you, NVIDIA.
Are the demos all slow smoothy panning shots? If so, then there's something being hidden. Did you notice that in the demo a lot of the geometric edges/lines were shaky/jittery? That's also something there. They're basically padding FPS numbers with 3 frames of the same raw image but now drawn by AI. If you've looked at AI generated images these past couple of years, you'd notice that while AI can make really close-looking images to what's fed to it, one thing it can't really do is consistent and very accurate alignment of lines. That's why there's always a skew in the shapes and sizes, lines, and scale of the stuff in the images it produces. 3 whole images made in that manner are being padded in between the actual frames your GPU renders raw. That's why the stuff shake all over the place (the lines aren't overlapping perfectly per frame/image).
That was also addressed in presentation, that technology got better and electricity lines won't be jagged with new DLSS 4, but would be nice to see tests from independent reviewers of course.
Welcome back PlayStation 1
Raw performance > ai
And where are you going to get that raw performance now if not for AI?😂
As a guy building up his Blu-Ray collection, I agree.
Some AI “upscaling” has really ruined a lot of 4K releases of classic films.
@@elivelive I love your mindset
NVIDIA won't make anymore cards with good native performance, therefore cards that do that don't exist anymore!
Brand loyalty is stupid, why would you care that much about a brand, be loyal to good products, not brands
@@webpombo7765 Imagine hating on faster technology
Sandy's muscles > SpongeBob's inflatable muscles
250 fps and 600ms input lag
Lmao
Make up numbers cuz u were wrong about your made up price predictions.
That aged well..lol
everyone doom posting, without seeing any actual benchmarks. Its wild :D
@@opticalsalt2306 Who gives a crap about number of likes
@@opticalsalt2306 The stupid people are always the ones with the most likes
If you're doubling the resolution, quadruple by area and using DLSS4 frame gen, that means only 1/16 pixels are being rendered by the game.
1920 x 1080 x 30fps = 62,208,000 pixels per second
(DLSS4 On) 3840 x 2160 x 120 = 995,328,000 pixels per second
995,328,000 / 62,208,000 = 16
I cant wait to see the raster benchmarks for the 5090 vs the 4090.
30% extra and it consumes 30% more power. Looks like all they did was whack up the power fed to the chip and upgraded the vram. I’m curious how the 5070 and 5070ti are compared to my 4070ti super.
@@ramborambokitchenkitchen63575090 has 21k CUDA cores as opposed to 16k on the 4090
@ramborambokitchenkitchen6357 we will see
@@ramborambokitchenkitchen6357 the 5070 outperforms the 4070 but barely so the 5090 will blow your 4070ti super out of the water. the 50 series is essentially just the 40 series with an undersized motherboard and better ai. this is from both vex, gamers nexus, and linus tech tips.
@@ramborambokitchenkitchen6357 I'm guessing the 5070 will be about 4070 Ti level.
Its concerning how every single tech channel who are able to have the new RTX GPUs are avoiding to say that out loud, or say "well it doesnt bother me" just so people start accepting this AI crap everywhere. Im happy that I was able to find your video among 100+ of videos shilling Nvidias "best release".
Accepting? It's been here. They are merely marketing it to you now that's is mature enough in the common conversation of society. Big difference in my opinion.
They don’t wanna say anything mean to their face or else they won’t get the next invite. Makes sense but also scammy.
hahaha, you people make me laugh so hard... some day you'll understand it
@@Cenot4ph it's your crafty riddles that keep us trying.
@@highpraise-highcritic you need actual brain power or effort to understand this one
5070 with 4090 fake frames!.
It should be illegal to benchmark and claim as "Performances" if the GPU or Game MUST HAVE DLLS/FSR,Upscaling,FramaGen to reach the "performances" they are promoting as improvement. They are just Crutches using to cover the flaws and unfinished products.
It's more negligence from the consumer not doing their homework.
@@Youngsta0100% those who are smart seek out the information. Low watt brain havers just cling to the marketing information and run with it without second thought.
@@Youngsta0what an idiotic take. There is brand trust for a reason.
@@carpetburns1222 don’t blame me for you purchasing a product that was a gimmick. I’ll still be upgrading my 1080ti soon, but wait for real world testing before making my decision.
@@Youngsta0 Next time you buy ground beef that is infected with Listeria, I'll blame you for not doing your homework on what brands of meat have had Listeria outbreaks in the past instead of the company recalling the infected meat in the first place.
See how silly your opinion is?
Still using a 1080ti. Maybe when its a full decade old I will upgrade.
Last good card ngl
@@MumboJumboZXC I'm still playing games at 1440p on my ultrawide monitor without issues. Newer games I might need to turn down some settings but not often.
same, got myself one of the VR ready ones way back so a nice whopping 11GB of VRAM lol with the industry so messed up and many developers moving into indie studios and projects there's been no need to get a cutting edge card especially for 1 to 2 major AAA titles that end up being "meh" anyway. Gamers are just so desperate for good gameplay these days you could make an adventure game with basic geometric shapes and plenty of people would be happy with it as long as it was fun.
Nah. 12 gig is so you are forced to upgrade. They are purposly leaving a major bottleneck. A fatal weakness. So that your monster of a card needs to get replaced.
Just buy a 5070 ti cheaper uprgade then ps5 pro
@@zerosam5541where do you plug the keyboard/controller into the 5070ti?
even my good ol 2080ti has 11gb vram, 3 generations higher only having 12 sounds crazy lol
COUNTERPOINT: The devs can optimize their shit games.
Realism has peaked, polycount has peaked, boredom has peaked.
Make a fun game, stylize the graphics, optimize it to run WITHOUT the crutches
Ditch UE5 for good measure
Ditch UE5? For what?
I played through Saints Row the 3rd and only recently I have bothered to look at the remastered version and I prefered the OG graphics over the hyper realistic graphics. It's wild to me how I prefer a game from 2011 over a remake released a decade later.
ditch UE5? are you on drugs?
@@multi-milliondollarmike5127 Send a message that game devs are no saints either and need to get their shit together?
@@iCore7Gaming No, UE5 is unoptimized slop and if you need empirical evidence, Threat Interactive does a great, thorough dissection of all the flaws that UE5 tries to coverup with DLSS, Frame gen, TAA, and the slog that is nanite
as a blender artist and a 3d engine renderer, I hate the AI slop so much, even though I bought an Nvidia 3080 ti, it was because Nvidia just works better than AM, as comparing between AMD radeon R9 Fury X watercooled gpu to Nvidia 1660 super, even though they are roughly equally powered, with AMD i had constant crashes, blue screenings and shredded pixels when using blender (games worked fine though) but then Nvidia works so much better not to mention I actually got to use the gpu power and not render with CPU which is like inly 1% of the potential which AMD would have provided if only I would have been able to use it but I couldnt because there is no support.
sure nowadays with blender 3.1 and above, there is options for AMD cards but that requires a rather new card, along the same age from the start of the 4000 series from Nvidia or even newer (to this day I still cannot navigate the names of the AMD cards) and needless to say, I don't fancy getting a weaker card just to have more crashing, more blue screens, more horrible user experience if i want to do anynthing else but gaming.
NOW this AI slop, Dlss and the other bs from Nvidia does 0% contribution to making rendering faster or running the whole software in the first place other than makes the already obnoxiously expensive cards even more expensive. its just bullshit around bullshit and there is no way around it, because what can I, a random nobody do? just wait an eternity like a kid with a wishlist for santa and let daddy nvidia make a perfect gpu just for me? just being real, that aint ever gonna happen which is exactly why I am bothered about it.
@17:08 A 20-30% upgrade towards raw performance, is a massive difference/upgrade.
13:32 5080 is not “midrange” “average” come on.
Exactly this man is lying they got exposed by steam charts and when the ps5 pro launched nearly every pc channel talked about how to build a ps5 pro for $700
He did say the "5070" was midrange and most gamer would be around there for GPU purchasing. NVidia is pushing them to the 5070 TI as that 16gb is really the new minimum needed for current new games.
Thank you. Someoone said what I was thinking.
Ye id get why everyone is dissing on nvidia, if you want to then go make gpus ur self and tell me how hard it is
Otherwise.. GET OUT
@RawRustTW8253Nvidia is worth 3 and a half trillion dollars (3,500,000,000,000) with almost 30,000 employees.
Obviously the average person isn’t going to be able to make a GPU, but it’s not like you need to be a chef in order to know if food tastes good or not. The same applies to GPUs, nvidia is just spamming AI buzzwords to try to increase their market evaluation as much as possible at the expense of the consumer.
The people falling for this marketing is absolutely hilarious
CANT WAIT TO GET A 5080 LOL.
What ppl lol? Seems like everyone is calling BS..
@@sw9881, CAN'T WAIT TO GET A 5080 LMFAOOOOOOOOOOOOOOOO!!!!!
Who’s falling for it? Everybody and their mama shits on nvidia every day of their lives
The people riding the 'nuh uh, bruh, it's not REAL frames' bandwagon are even more hilarious. You sound like obnoxious Vegans.
"Don't eat that, it's processed and got teh Glutens, bruh!"
The price isn't reasonable suddenly because you can compare it to the 40 series, the 40 series was massively over priced
We already know from digital foundry that 4x mfg compared to 2x mfg adds barely any latency plus on dlss4 ghodting, image stability and latency itself are vastly improved.
You literally cannot compare dlss3.5 and how a game runs and looks right now to what we get with dlss4. Its an entirely different neural model.
I'm a quake player. I can feel it.
This is what I’m all upset about. I grew up to each new card generation having some mind blowing Nvidia Tech Demos that still are cool to look back on. PhysX Smoke/Water/Cloth physics in that one Batman game still look ahead of today’s games. Nvidia stopped giving us more powerful cards that clearly show the games looking better. Now it’s just AI to fake frames to pretend it’s actually more fps. That’s garbage. If I want more frames I want them natural as it can get. No post processing bs like TVs. Get back to making cards that can do more complex shaders and water. Get on water. Smoke and fire too. We still use flat fire and smoke textures. Again Batman Arkham Asylum I think it was? Old game. But SUPER COOL SNOKE! Keep on Ray Tracing. But leave out the AI generating crap.
the ai generating crap is a crutch to ensure the rt crap runs smooth in the first place that and ue5 games continue to be unoptimized pieces of crap
It is outright shameful how PhysX is STILL not used to it's entire capability by developers.
I liked PhysX too. But that was already so malicious back then. It seemed to me like a feature that was mainly there to make other cards look worse because it only ran on Nvidia cards. It was forbidden for other graphics cards. NVidia gave a lot of money to game developers so that they could incorporate PhysX into their games and put Nvidia's logo on them. From the moment NVIDIA stopped paying for the implantation of PhysX, game developers stopped incorporating it into their games. It would have been better if the feature had been free for all graphics cards, then we might still have the feature today because it would have become standard. The physics are worse in games today unless it's an exceptional game that places extreme value on it.
@@kevinj24535 yeah they should have made it open source. They did the same for RayTracing too to an extent. I mean I get it. I was team Nvidia because of it and don’t have any regrets. I still prefer them over AMD. But the PhysX needs to be open source by now. Like. Let it go. Promote it without putting the Nvidia stamp on it. That would be the nice surprise. Just booting a game “knowing” what to expect. Then boom PhysX on friggin everything. Hair strands. Fur strands. Cloth. Smoke fire and fog. Water that can be redirected if blocked or pushed or even put into a bucket.
I want to feel the same feeling as when I went from GTA SanAndreas to GTA IV. The good old days when “next gen” was undoubtably actually next gen. You couldn’t even pretend that you didn’t notice a difference.
Once we hit PS4/XBONE I feel like maps just got bigger. And grass. Raytracing is BARELY even noticable. Oh “more accurate shadows and reflections”. Shadows. Ok. Reflections that’s cap. GTA V RayTracing update reflects some cars in the wrong color and buildings are LOD models. Not really worth the update. I’m looking for Wall-E/Toy Story graphics by now.
@ agreed to the max. Or atleast devs making their own version by now. Valve did a thing with their new engine tho. Definitely in the right direction. I hope they use their fog tech for the next “forbidden” Portal 3 game. Volumetric fog pouring out from portals shot in a foggy area would look absolutely incredible.
The CEO specifically bragged about how little frames are actually real in his presentation. NVidia consistently brags about their AI upscaling and AI generated images. They aren't lying to anyone. They are pretty upfront about this. Their entire fucking marketing is about AI shit.
in terms of GPUs this is actually something great that will be improved later on , a lot of unreasonable hate cause there is evil word "AI" that noone really knows much abaut
@amboyman DLSS has already been working really well for me since like version 2, but even V1 was a clear improvement over bilinear upscaling. DLSS is quite clearly no bullshit, it's honestly 75% of what makes their card worth the price tag compared to competitors. You just have to understand what it actually is, and whether it's useful for your own use case.
But we'll, RUclipsrs like this and plenty more cannot stand Nvidia being successfull and just have to make up strawman arguments like the title of this video.
This is a stupid argument. If you actually read into DLSS, the arguments against fake frames makes no sense. Essentially, it's a tool that increases transistor-to-frame efficiency and the input lag is reduced through reflex 2 which halves the input delay of reflex 1. Why is there so much hate for the word AI? This is an exact example of how AI is used for good in optimizing transistor usage.
@@maxmustermann3938i dont know man, unless you are running a higher resolution screen, DLSS makes things look fuzzy. Not to mention the lack of VRAM and raw performance means that DLSS runs worse at some points.
I hate the micro stutters my 3070 TI gets, and I do not think shilling over more money to team green is going to do anything. At this point I am going to switch fully to AMD to at least get more vram with SAM memory.
@@amboymanthis has nothing to do with AI prejudice
It's a GPU, it should DRAW the game, not GUESS it
7:30 well, if they add one frame in between each actual frame, I can pay 275 actual dollars and then add in between each one a hand-drawn monopoly dollar. The one extra hand-drawn dollar is just my treat, so that they can keep researching some more advanced con tactics with my monopoly money.
I disagree. Frame generation is good for single player game or casual games. For a competitive game we're not worried as much about ray tracing and are probably dropping the settings a bit and running at 240+fps. The difference between the 4090 and 5070 is negligible where it counts. I suggest upgrading the 4090 card to a 50 series as soon as possible; even upgrading from a 4090 to a 5070 will be huge. Nobody wants an old 40 series card in 2025. Get with the times.
"nobody wants an old 40 series card in 2025"
bro, the 40 series are 2 years old
the amount of people believing that 5070 claim is nuts
with multi frame generation it's very likely true, and this is the new paradigm of graphics rendering
@@Cenot4ph yeah, I agree. I think it's because 4090 is a peak of graphics card technology. Where improvements would only mean bigger size, more power consumption and of course unholy price. So they came up with this solution. I guess it's a new era of rendering graphics where every game developer would be forced to implement dlss and frame gen in their projects.
@@Mothmook no, the 4000 series is where this new paradigm is starting.
If raw graphics power would work, AMD is able to compete going forward doing just that.
My opinion is that AI rendering is the new way forward because it's performs significantly better
Yeah. This just shows that people will buy up anything Nvidia says due to their position. If you really think a 5070 is on par with a 4090 with its minuscule 192 bit bus then you need to be slapped. Lol.
@@Iridiumcosmos you and a lot of people don't seem to understand much at all it seems.
This is using multi frame generation in comparison with the 4090 frame gen. This is not a raw performance comparison.
It really isn't hard when you apply your brain for a second
That subreddit you mentioned at 15:44 is probably half full of Liverpool fans at this point.
😆
Ray tracing became way too big. I, for one, never cared for it. Why would I be excited about realistic lighting? I experience realistic lighting all the time irl, seeing it simulated doesn't make me orgasm I'm sorry. I'd rather say no to it if it hinders performance too greatly. Traditional lighting solutions are fine. No to DLSS, MFG, etc. together with ray tracing. Can I get a GPU that just renders normal shit efficiently? Don't make me overpay for RT cores while also putting software crutches all over the product.
We're gonna see an influx of horrendously optimised videogames that are simply unplayable without fake assists. Sad.
now I dont know a single thing about computers or what any of this means, but if ur getting 200+ fake fps and the game looks good and feels good, whats the problem with that
Glad you’re covering this sir. The prices seem “reasonable” if of course the performance is there but I’d say that’s a long shot… a 4090 for $549? I doubt it without assistance from upscaling etc etc
To be fair it does outperform the normal 4070 and is cheaper than it on launch. So at least it’s definitely an upgrade, though they shouldn’t have marketed in the way they did.
Who cares if it's "assisted" or not? A graphics accelerator is also an assistant to the cpu, but you don't see people complaining that nobody is using software rendering to test gaming cpus anymore
Shouldnt 4090 be able to do the same with its AI TOPS.?
@@rithuikrajeev8039 yeah that’s the argument, the 5070 with all enhancements outperforms the 4090 with no enhancements. Yet nvidia markets the 5070 as if it’s better than the 4090 with no enhancements. It’s a sticky road for them to choose to drive on but whatever, they did the same before.
@@suave605 I believe MFG can be acheived in 40 series cards as well. Looking more like a software lock.
00:24 glitch in the matrix
????
Games today lack good stories, good gameplay, and frames.
What a time we are living in.
Funny, I was just commenting about fake frames and YT decided to feed me Muta's perfectly timed video. Personally I think AMD should target raster performance, then heavily advertise "no fake frames" and they could take 100% of the gaming market from Nvidia who's lost touch with their consumers.
They did this last generation too (40xx).
And on the 30xx series they changed how they measure flops, to again claim 2x.
I like that you are not one of those people that advertise to sell their top-end GPU every generation to buy the new one. Seems to be a trend now :/ Like the housing market, where the houses are now mainly advertised as "investment opportunities" instead of actual housing
1:45 yes. nvidia is running planned obsolescence scam.
In the end, does it actually matter? I can hardly tell the difference between the upscaling/frame gen and normal, and the FPS boost is most certainly worth it
No it doesn't or everybody would be buying AMD like they claim they will but don't. If a company was lying to you, you wouldn't spend $1,500 on a gpu like Mutha did. Lmao
I genuinely believe that frame generation exists to compensate for poor optimization and spec requirements for spec requirements sake. What are we even getting out of these games coming out and "requiring" ray tracing when something like the new Indiana Jones game could be developed with lower hardware targets and still be functionally the same game.
it's ok, because this way the game studios can save a ton on development cost and we can buy the games at a fraction of a price...
@mind.journey BRO REALLY THINKS COST SAVINGS WILL BE PASSED ONTO HIM AS THE CUSTOMER XD.
Bro, wake up. Cost savings havent been passed onto the customer in a hot minute, in really any industry. These companies dont properly compete with each other anymore, they settle into Nashian economics niches, monopolize a small part of the industry, optimize things for themselves, and then pocket the difference for thesmelves. That is what drives them to make a crap DLSS card for unoptimized crap games that are hyper realistic at 1 frame per second with a ton of bugs. Nobody's actually competing to make a good product, just make products that are smoke and mirrors. And somehow, their budgets wind up bloated ANYWAY, because the same lack of competition means they dont have to be smart on how they spend.
Case in point, want games to be cheaper? AAA devs could fire their entire DEI kakistocracy staff, hire a much smaller team of actually competent devs, and give them the time and paycheck they need to stick around and churn out quality product. The could hire one decent writer instead of 10 diversity hire RTX developers who got their CS degrees in a cereal box using ChatGPT to code.
Companies dont bother optimizing their staff or their games already, because even if people like you finally have had enough and decide you dont wanna buy their slop anymore, then daddy Blackrock will fund their next game anyway so long as it stars a token lead.
Literally, you wont see a dime of savings from a 2,000 dollar AI slop generator "graphics" card.
@@mind.journey Glad I didn't have anything in my mouth when I read that comment. Have you ever seen price savings get passed onto the consumer in any meaningful way? This has been going on for much longer than covid, but covid really cemented the idea that you can price things however absurdly and the customer will still buy. Can't really say much more that hatman already said above.
And it goes without saying, this thing about requiring raytracing the OP mentioned is perfectly in line with MacroHards Win11s hardware requirements. Short of staying on 10 or switching to linux, lots of ewaste will be coming to a landfill near you. Planned obsolescence anyone?
@@hatman4818 bro is trying to speedrun the internet outrage buzzword challenge 💀.
@@hatman4818 I was very clearly sarcastic, well apparently not very clearly
Just sold part of my Nvidia stock to protect my profits, but I'm holding onto some for the long run because of the company's strong growth prospects. In addition, I'm thinking of expanding the variety in my $400K stock portfolio, but I'm not sure how to handle risks going forward
I managed to grow a nest egg of around 120k to over a Million. I'm specially grateful to my Adviser for his expertise and exposure to different areas of the market
How can I reach this adviser of yours? because I'm seeking for a more effective investment approach on my savings
Gabriel alberto william is the licensed advisor I use. Just research the name, you will find the necessary details to work with
Thank you for the recommendation and I just checked him out on google and found his website, i will sent him an email shortly and I hope he gets back to me soonest
Was so confused at this comment until I realized it was a 🤖
I'd rather pay more if the focus was power over fake frames
Remember: The more you buy, the more you save.
You mean a company that keeps selling you software features locked behind chipsets instead of bringing prices down or making cards natively more powerful to match the cost they are charging is lying to you? Imagine my shock.
Tell me you don't know blackwell has 25% faster cuda cores. Price is very high tho, at least when you consider it's a worse skew than a 192 SM GB202 variant...
@@NickC-IT I dunno about that. I had around 3-4 Nvidia cards before (with EVGA), but AMD is just better honestly. And the interface for Adrenalin is great imo. I'm with my 2nd AMD GPU, and I love it.
The 5070 is $50 cheaper than the 4070 was at launch.
@@marcasrealaccount 30% higher power draw though, not a great trade off when its 30% more expensive.
@@LordVader1887 Still fascinating that it's the first time the cuda cores themselves are faster. I only checked 10-series to 40-series and they all did 2 instructions per clock now it's 2.5. So the effective clockspeed of the cuda cores is now about 6GHz
This game optimization situation is crazy
700 for 70Ti. The prices are terrible. The 3080 was 800. This needs to stop
The only way this will stop is more competition. I pray Intel gains enough success that they have the confidence to make midrange-high end GPUs and put then for lower prices than Nvidia and AMD but I don't see that as likely.
I will stop it
@@РаЫо Thank you
It's funny and sad at the same time that all it takes is to keep prices fced up for a certain period of time and people start to call them reasonable.
lol people only want competition so they can buy nvidia cheaper AMD just needs to make compelling gpus thats it ill buy the best one in that price range if they make a 5080 performance card and its ray tracing is equivalent to nvidia and its cheaper ill buy that one
A lot of people in the comment section really don't understand how input lag works 😂. If your raw performance is 20fps it is impossible to get below 1000/20 = 50ms, because the next true frame is only calculated 50ms later by the game engine regardless of how many fake frames you squeeze in between. You can generate a billion frames in between and the lower limit for raw 20fps will still be 50ms.
the frames dont get generated by the engine. The core of the gpu does that
@@hanuulaaI think you may have missed the OPs point.
Yup, they don't know how lag works, don't know how FG works and don't know how time works, apparently. It's sad to see NVIDIA (and AMD) getting away with this obvious grift. But hey, overlays are going to show 200+fps so "muh 5070 is faster than 4090, hur dur"...
They're trying to do to games what SVP and similar programs do to videos, but it's not apples to apples because of the input lag you described. Watching videos isn't really a interactive experience so it won't matter what the lag is from the added fake frames in a video.
reflex 2 is supposed to be able to match the mouse input at the time of the generated frame and "frame warp" it to match the mouse input. So in theory 120 fps generated frames should feel like 120 fps even with 20 fps raw performance.
16:43 Me sitting here still running my GTX 1060 6GB 😭😭
I used GTX 970 with 4GB (or 3.5GB it was infamous for😂) until 2 months ago. I mean if it was still able to play not so system heavy games at 1080p then your 1060 can still for year or two. Personally after nearly 10 years with GTX 970 I finally upgraded to 4070 Super. I was at first bit on edge whether to wait for these new GPUs, but seeing as the main benefit for 5070 is DLSS4 but still has the same amount of VRAM as 4070/4070 super I can be happy with my decision. Hopefully 12GB of VRAM is still gonna be enough for 1440p or atleast 1080p gaming for many years to come.
Lot of ppl are trashing DLSS and frame generation but the good thing about it is that it actually makes 4K gaming possible even with midrange card like 4070S and 5070
@@Balnazzardi 12GB will be sufficient for at least 3 years, this dude is tripping.
@@Balnazzardi I was gonna buy the same graphics card you have, the 4070 super 12 GB, but I waited. Now I'm thinking about buying the 5070ti. Should I?
@@ultrahd9826 I mean ofc if coy can afford it, especially because it comes with 16GB of VRAM and ofc even base 5070 is better than 4070 super. If I had the time
to wait for these new GPUs I would have but my old rig (its HDD) finally gave in, so I had to get PC bit earlier than I had hoped for, but since new GPUs are released every so often, I guess the timing doesnt really matter, there is always the "next big/better thing"
😄
Bro i have a GTX 1060 3Gb its still good😂
Guaranteed the 5090 vs the 4090 isn’t gonna be the craziest improvement as it is mostly software changes. Waiting for the new drivers will be the real comparison
I would say its mostly AI capabilities. It is the enhanced AI capabilities which allow for the extra fake frames.
Unless you're in the 3d market, it's not worth upgrading
@@Mohmar2010 who is upgrading from a 4090 to a 5090 this is more for people who bought a rtx titan or a 3090 non ti
@@alexmeek610 Unfortunately people like my husband, flight simmers might etc. im only giving it a pass because i get the 4090 to upgrade my 6900xt out of whole ordeal.
There will be plenty of people with more money or credit card space than sense. Probably majority.
@@alexmeek610I'm upgrading because its a 2 slot design. Thats a huge difference for small form factor builds
We’re never getting optimized games again are we?😢
Frame Gen are fake frames. Raw power is better visually and runs smoothly
false, next.
After upgrading from a 6600 to a 7900xt I’ll stick with this one for the next 10 years lmao
Got my 7900xtx I don’t regret it
AMD will switch to AI too, denying AI is 85 iq.
Somone that actualy bought a 7900xt instead of nivdia
@@BruvvaJosh did I say anything about AI?
@@Scooter227 then what's your issue?
10:57 top right ghosting example
The 5070 is closer in spec to a 4070 super than a 4080
Glad it’s not just me that had a fear about them lying to us, heard AI and got super skeptical