@EpicPianoArrangements A 60hz workstation monitor. Oh yeah, such gaming capabilities in 2014 when 60hz was considered the barely usable standard for gaming
@@theicewitch9328 all RTX cards will receive all the DLSS enhancements (the transformer model, Reflex 2, DLSS 4, etc.); only multi frame generation is 50 series exclusive. I could post you the link but YouTube won't allow it.
Been playing Sandland, Forza Motorsport (8), Ghostwire Tokyo, Deathloop and the Callisto protocol lately. Can’t wait to get the 5070 to match a 4090 performance in those games!
Well, we all hate the false marketing of having 4090 performance, but 5070 being 20~30% better than 4070 and $50 cheaper, does make it enticing. I will wait another 2 years, 6070 is gonna be worth the upgrade.
@@excelbelajar Lol u guys... I have a 2070 Super desktop, and 4060 laptop bought at the end of 2023.. I dont game at max settings and my family doesn't own a gold mine...
@@NeroX-nh8se lol i got a 3080 ti (95% of gaming performance of 3090) for $900 bucks in 2021. But the 5080 is $1000 bucks and a good deal. What’s the 5070, $750? 😭 it used to be $500 lmfaoooo
I'm planning on upgrading from a 3070 to a 5070ti. Do I need to upgrade yet? No probably not, but DLSS3, and now 4, are enough to justify it for me. Full path tracing at a playable framerate? Sign me tf up.
The answer is at 15:27. Worth the upgrade? Depends on what you have now and how you play. If you're on a 4090 now, the resale value on that will get you $1k+ easy. I currently have a 7900 XTX and a 7800 XT and I'm keeping both. Definitely getting a 5090 if I can get my hands on one at launch; if not, these will be selling for $2500+ if supplies are low. If you want to play at 4K with high frame rates and ray tracing with no limitations, yeah, get a 4090 or a 5090. I say just get what you can afford.
I just hope that people that mention 4K gaming never speak again in their lives. The future was 4K and 8K but we are further from that now than we were in 2015.
The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't). The 5070 could be sold at $400 and the 5080 at $650.
Guarantee though that the performance of the 5090 is going to smoke the 5080 by almost double, given the CUDA cores, RT cores and double the VRAM.
People complain about everything, man, even when they reduce prices. They could sell these at $1200 and $800 if they wanted, because people still buy them anyway.
The xx90 cards are prosumer/professional-tier cards. They're not only competing with the xx80/xx80 Ti, but also with Quadro/Ada cards for professional use cases, and in those instances a $2k price point is not that much money.
Not really struggling getting enough frames for most games with my 4090 at 4k even without frame generation, mostly getting 70 to 165 fps on all games even on the new modern games.
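The "almost double" claim in the thread above can be sanity-checked against the paper specs. A quick sketch — the core counts and VRAM figures are taken from launch coverage, so treat them as assumptions to verify:

```python
# Paper-spec comparison for the 5090 vs 5080 "almost double" claim.
# The core counts and VRAM figures below come from launch coverage
# and are assumptions to verify, not official confirmation.
specs = {
    "RTX 5090": {"cuda_cores": 21760, "vram_gb": 32},
    "RTX 5080": {"cuda_cores": 10752, "vram_gb": 16},
}

core_ratio = specs["RTX 5090"]["cuda_cores"] / specs["RTX 5080"]["cuda_cores"]
vram_ratio = specs["RTX 5090"]["vram_gb"] / specs["RTX 5080"]["vram_gb"]

print(f"CUDA core ratio: {core_ratio:.2f}x")  # ~2.02x on paper
print(f"VRAM ratio: {vram_ratio:.0f}x")       # 2x
```

On paper the ratio really is about 2x, though games rarely scale linearly with core count, so the real-world gap is usually smaller.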
This video features Richard Leadbetter of Digital Foundry, joining IGN to discuss the NVIDIA 50 Series GPUs and their implications for PC gaming. Key points include:
1. AI-Driven Performance: The 50 Series GPUs utilize AI technologies like DLSS 4 and multi-frame generation to achieve performance boosts, reducing reliance on traditional transistor density increases due to rising costs.
2. Latency Concerns: While AI-driven features significantly enhance frame rates, they also risk higher latency, which can affect gameplay, especially in fast-paced games.
3. Complex Performance Metrics: Richard highlights that "performance" now includes more than frame rates, as low latency remains critical for gameplay quality.
4. Impact on Handheld Devices: The technology's potential for handheld devices, such as the rumored Nintendo Switch 2, is discussed, emphasizing power efficiency and the possibility of a streamlined DLSS version for mobile platforms.
5. Future Analysis: Digital Foundry plans detailed reviews of the 50 Series GPUs to guide gamers in deciding whether to upgrade, considering the trade-offs between advancements and practical gaming needs.
The GPUs mark a shift in gaming technology but come with nuanced considerations for both PC and handheld markets.
I saw some pics of MSI PC gaming builds, and they were labeled as gaming PCs, with dual 5090 cards, but couldn't see how they were bridged. DX12 games don't support SLI, unless the game developer goes out of their way to make it work, like DICE did with Battlefield 1, which was the only DX12 game that I found to support and run beautifully in SLI.
I would love it if NVidia would develop a standalone box that would upscale and add frames to the output of something like the Nintendo Switch. I would pay a lot for that.
If ray tracing is not a priority, would it be reasonable to purchase a 4080S without reservation? I plan to wait until the 5000 series cards are available and the price of the 4000 series cards decreases.
I don't see why it wouldn't help in some ways, at least. DLSS 4 has been stated and shown to take VRAM usage in games down at least a bit from DLSS 3. I have a feeling that's why the 5080 still has 16GB of VRAM. I think they are going to try and focus on taking VRAM usage down.
Some people don't like DLSS or other similar features the same way they are reluctant to embrace EVs. The reality is, if the industry leaders believe it's the way to go, then it's the way to go.
When I first came across screen stuttering on my Xbox One X in 2018, I thought the TV should add additional frames to circumvent the stutter. It now seems like that will one day be a possibility, though it will probably come from the console. It would be great if the TV could also do it for older platforms and games, along with upscaling and HDR. It would be great watching YouTube at 240Hz. They still need to mocap video footage using AI for UE5, for the ultimate animation quality and realism. AI should be used for effects and graphical embellishments, not interpolated frames, so input lag isn't added in online PvP games. The most low-res basic geometry should still be rendered at 240fps, not boosted up from 35fps with AI interpolation. Thank me later.
DLSS 4 being able to 3x or 4x frame generate is because of the additional time now achieved in pre-determined transistor configurations… The really scary part is explaining how we achieve those configurations and what they now allow us to do, without it starting to sound like the building blocks of a pre-cognitive AI sub-conscious…
Because the money in your pocket is burning your leg? Because you want 75% of the frames it generates to be fake? You cannot live your life without upscaled images? Is it vital for you to dissipate 550 watts of heat inside your room?
Cyberpunk 2077 on the 5090 is going to have crazy high latency and ghosting with that many made-up frames. There was a reason they showed only close-up shots or slow movement. I'm glad RDNA4 is focusing on upscaling rather than frame generation.
Everyone crying about frame generation latency in first person shooters normally play at 1080p on a 24" screen. This new dlss4 is meant for 4k or higher gaming 🤔 😏
I have an awful Dell that I haven't been able to upgrade, mainly because of dimensions, but the 5070 is the same size as the 3070. Does anyone think the 500W PSU could handle the 30W increase?
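For what it's worth, the wattage question above is just budget arithmetic. A rough sketch — the GPU figure is the 5070's announced 250W TGP, but the CPU and "rest of system" numbers are illustrative guesses:

```python
# Rough PSU headroom check for a 500W supply with an RTX 5070.
# gpu_watts uses the announced 250W TGP; the other figures are
# illustrative guesses -- substitute your real components.
psu_watts = 500
gpu_watts = 250   # RTX 5070 announced TGP (30W over the 3070's 220W)
cpu_watts = 125   # hypothetical CPU under gaming load
rest_watts = 75   # motherboard, RAM, drives, fans (guess)

load = gpu_watts + cpu_watts + rest_watts
headroom = psu_watts - load
print(f"estimated load: {load} W, headroom: {headroom} W")
```

Around 50W of headroom is tight: transient GPU power spikes can briefly exceed the rated TGP, so a prebuilt's 500W unit may technically work but leaves little margin.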
My big question is, is it worth selling my 4070ti super to buy the 5070ti? From what I can tell with the info so far, probably not... I can run Indy at 4k, full ray tracing with DLSS quality at 60fps and maxed out settings. If the new AI and DLSS4 (which will come to 4000 series and others anyway) allows me to squeeze even more performance/quality out of it, that'd be great. Might not need to upgrade until 7000 or 8000 series..
It would be great, if with the new transformer based ai model, DLSS 4's 4K performance mode comes closer in image quality to that of DLSS 3.5's 4K quality mode. I hope that could make my 4070 super more than enough for 4K60fps gaming at high settings for many games. Even if it does, will the new transformer based model perform better with more AI TOPS, producing more frames in a particular DLSS mode for 50 series, for example, a 5070 upscaling to 4K much better than a 4070 Super/Ti Super just because it has more TOPS?
Easy answer: if you want the best tech and have the money, get a 5090 or 5080.
Perfect answer: if you have a 4080 or 4090, maybe just upgrade your display.
Best answer: if you have a huge backlog that your current setup plays at great specs, just enjoy your gaming until you're forced to upgrade, then get what you can afford.
Worst answer: splitting rice from rice to find the best rice.
I like this comment. I have a huge backlog and a 3080 Ti to run it. I'm buying an AYN Odin 2 Portal (7-inch OLED cloud-gaming handheld, only 430 grams) to play my games, and because it's 1080p I can get even MORE work out of my 3080 Ti playing this way.
This. Don't understand how or why people upgrade every gen. I'm gonna get the 5090....because I've been rocking a 2080ti since it came out. At 1440p, I've only recently been noticing slow down. With the 5090 I won't upgrade for 5 or 6 years.
If the 5070 is on par with the 4080, that is already very good. I am more inclined to upgrade my ageing Gigabyte Gaming OC 3080 Ti 12GB to a 5070 Ti 16GB, some OC version, most likely from Asus or Gigabyte. Nevertheless, I am still curious to see the 5080/5070 Ti and 9070 XT against the 4080/S in order to understand the gains in rasterization performance. DLSS 4 MFG sounds like an awesome feature, but let's see what the in-game implementations are like.
I know gamers are upset that we can't increase brute-force rendering without it costing a fortune (I guess we can thank TSMC for that), but at least with Nvidia we have resource options. You will obviously have different fps targets for different games. In a triple-A game you can target 45fps and multiply it 2 or 3x, or use the new improved DLSS. If you're playing a comp shooter you probably just want to use DLSS and Reflex. 99.9% of us probably aren't using 4K max settings with path tracing and hoping for some crazy smooth experience, but at least options are available.
Long answer: NVIDIA's recent unveiling of the GeForce RTX 50-series, spearheaded by the RTX 5090, has generated significant interest in the tech community. Priced at $1,999, the RTX 5090 boasts advanced features, including DLSS 4 technology, and promises substantial performance enhancements over its predecessors. However, for many users, the investment may not justify the upgrade. Several factors contribute to this perspective:
Marginal Performance Gains in Traditional Rendering: While NVIDIA claims the RTX 5090 offers up to twice the performance of the RTX 4090, these figures heavily rely on DLSS 4's AI-driven frame generation. In scenarios without DLSS 4, the performance improvement is more modest. For instance, in "Far Cry 6," the RTX 5090 demonstrates approximately a 27% increase over the RTX 4090, and about a 43% increase in "A Plague Tale: Requiem."
Dependence on DLSS 4 for Peak Performance: DLSS 4 introduces multi-frame generation, producing up to three AI-generated frames for each rendered frame, significantly boosting frame rates. However, this technology's effectiveness depends on game support and may not be universally compatible. Additionally, some users express concerns about potential input latency and the authenticity of AI-generated frames, particularly in fast-paced gaming environments.
High Power Consumption and System Requirements: The RTX 5090 has a Total Board Power (TBP) of 575 watts, a notable increase from the RTX 4090's 450 watts. This escalation necessitates robust power supplies and efficient cooling solutions, potentially leading to additional expenses for system upgrades. The increased power draw also raises concerns about energy efficiency and long-term operational costs.
Elevated Price Point: At $1,999, the RTX 5090 represents a significant financial commitment. For many users, especially those who already own high-end GPUs like the RTX 4090, the incremental benefits may not justify the substantial expenditure. The cost-to-performance ratio becomes a critical consideration, particularly when previous-generation GPUs continue to deliver exceptional performance in current applications.
Conclusion: While the NVIDIA RTX 5090 and the 50-series GPUs introduce impressive technological advancements, including DLSS 4, the decision to upgrade should be carefully evaluated. For users with existing high-performance GPUs, the marginal gains in traditional rendering tasks, reliance on game-specific DLSS 4 support, increased power requirements, and the premium price may render the upgrade less compelling. Assessing individual needs, current system capabilities, and budget constraints is essential before committing to such an investment. (All credits go to ChatGPT)
@Eren-da-Jaeger I would stay on it for 2 more years if possible; this generation is not really offering much. If anything, I would go for the 5070 for the power consumption it theoretically has; the others make no sense at all.
Because it cannot predict the future of your decisions (it doesn't know when you click the "shoot" button), but it can recognize that between frames 1 and 3 there is a frame 2, and generate it as a transition.
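The "between 1 and 3 there is 2" idea can be sketched as a naive pixel blend. This is purely illustrative: real frame generation uses motion vectors and a trained model, not a plain average.

```python
# Naive frame interpolation: synthesize frame 2 as a blend of
# rendered frames 1 and 3. Frames here are flat lists of grayscale
# pixel values -- a toy stand-in for real image buffers.
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames at fraction t (t=0.5 gives the midpoint)."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

frame1 = [0, 10, 20, 30]   # hypothetical rendered frame
frame3 = [10, 20, 30, 40]  # next rendered frame

frame2 = interpolate(frame1, frame3)
print(frame2)  # [5.0, 15.0, 25.0, 35.0] -- the synthesized middle frame
```

The catch the comment points at: frame 3 must already exist before frame 2 can be shown, which is exactly why interpolation adds latency and cannot react to a click that happens "between" frames.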
I have a SFF case, I'm upgrading from a 4070 to the 5070 Ti or a 5080. I'm looking at around a 60% to 80% raw performance uplift depending on the graphics card I upgrade to, not to mention I get more VRAM. DLSS 4 multi frame generation is just a cherry on top. Seems worth it to me, especially since I already have a buyer lined up for my 4070, so I'm not paying full price for the upgrade.
Love Digital Foundry and Rich, but they have a few opinions that I don't agree with. Each of them seem to love motion blur and now Rich seems to not mind frame generation in any AAA game. Personally, if the game is an FPS shooter (e.g. Cyberpunk), the added latency is horrid. That's plenty fast-paced to necessitate minimal input lag. Of course it all depends on the base FPS, but that's getting into it.
Uhmmm, but aren't most DLSS 4 features coming to old RTX cards? (I know they won't run the same, but still.) I would say if you own a card from the 4000 series it's not really worth it to upgrade, but if you are still using a GTX or an RTX 2000 series card then an RTX 5070 and beyond are good upgrades.
So I’m a console gamer recently turned PC gamer. Help me out….is it worth me upgrading to the 50 series? I currently have a 4070 super and I game in 1440p.
This is an old argument revolving around notions of 'purity' Us old timers remember when people thought AA/AF and reducing rez was 'cheating' rather than just running higher resolution. The 'real frames' argument in new clothes.
If you have an RTX 40 series card, this is a definite SKIP/PASS, and you should wait for the RTX 6080 and 6090, which I'm sure will be the REAL upgrade. Having the best card from the last gen means you can make it through the current gen. (Ex: I had my RTX 2070 for 4 years before I got a 40 series card; I skipped the 30 series as there was no need.)
I agree. I rented a 2070 Super from EVGA and just traded it in for a 3080 for the price difference, and it was 30-40% faster, maybe less in some games. It's all FOMO and fake advertising.
I am not upgrading until 2030. In Black Friday 2023 I purchased the AMD TUF Radeon RX7900-XTX, because NGreedia was overpriced on their 4000 series cards. What NGreedia is doing these days is horrible. If you take away the DLSS and Ray Tracing along with all the AI fake frames, the 5000 series cards they have now, are no better than the RTX-3090. People should NOT be purchasing the 5000 cards, because they are paying for an overpriced series of cards for fake frames. If the cards stood by itself, it would not look good at all. And NGreedia knows this. NGreedia are deceiving people. NGreedia are just locking performance behind AI software. When I upgraded to the 7900-XTX, I came from the GTX 1070-Ti. So, I will wait until 2030 before I consider upgrading again.
The problem I see with the 5000 series right now is... where are the games? Where are the games that are gonna need a 5080/90 in order to run? There are no titles out there currently that can't be run proficiently on a higher-end 3000 series card, or a 4000 with DLSS. I just don't see the need for anyone with higher than a 3080 to jump up to the 5000 series yet. I just upgraded from a 1080 Ti to a 4090. I bought that 1080 Ti in 2017 and I was able to play nearly everything on medium-high settings up until games like Cyberpunk, and even then it still runs well with FSR. I bought the 4090 in March of this year thinking that if push came to shove I would just sell the 4090 to help pay for the 5090 if there was any new tech exclusive to the Blackwell cards. I think I'll stick with the 4090 for a good while.
Black Myth Wukong is difficult to run at decent fps at my wide 3440x1440 on a 3080 10GB. Saying there is no title that can't be run proficiently on a higher-end 3000 series card is not accurate, since those cards cannot use frame generation :/ And the raw power increase compared to the 3080 at these resolutions should still be decent with a 5080.
I've got a 4090, and I use an LG OLED TV at 144Hz. Multi frame generation is not exciting enough for me to upgrade, and aside from multi frame gen the 40 series is getting all the same DLSS 4 features as the 5090. I don't do competitive shooters and stick to single-player games, and aside from Indiana Jones, Black Myth Wukong and Cyberpunk, I get close to 100fps at native 4K, and well over 100fps when I use DLSS with no frame generation, in most games.
The videostream quality from Richard could use some DLSS 4.
Lol :)))
With the A.I. in the RTX 5090 card, ChatGPT can write you a storyline in 3 minutes for your teacher in college. I would not mind A.I. writing code for a video game; not a game in 3 minutes, but coming out with a game every 3 days. I don't know what people are arguing about: more games made fast for us. A.I., baby. (The developers still have to guide the A.I. on how to code.)
Blurry Richard is more valuable than all IGN staff combined, presented in holographic form
@@timothy-l9b what are you talking about?
💀
I'll pay for my 5090 in cash! 3 out of 4 notes will be photocopied.
NVIDIA funds generation.
you can get a 5070 for that kind of money, mate
@@tedcrilly1 no no, its called artificially generated.
@@MrRahibzz...money
Some of these comments about the new high-end cards... wow, as if they personally had that kind of money in the first place to buy a 5090 lol.
We are getting very close to Nvidia producing the same cards and just locking performance behind software
My 5yr old GPU just got a free upgrade 😊❤
What's the issue with that if it streamlines production?
Software is reproducible. You can code with CUDA; that's why the industry uses Nvidia.
Wouldn't go that far, but when there's obvious diminishing returns trying to upgrade physically, you have to look elsewhere for performance/frame rate gains
They are already doing that. I can bet DLSS 4 could work on 40 series cards.
Love adding in Digital Foundry input to interviews.
power move
Same. Their insight and detail is phenomenal; sometimes they're actually explaining things for us in layman's terms, and I naturally get what they're saying, which is awesome, yet they don't shy away from the technical detail. I love it.
IGN's parent company is part owner of DF now.
With the new ghosting and quality improvements to DLSS, I think the days of the major complaints against it are over. Now if they can lock down the latency and further reach image-quality parity with raster (which obviously would be the case over time), we will finally be in the real future of gaming.
Pretty great that some of the key features are going to work with the older models. It would be great to see an explanation of the features that actually will work with the 20, 30 and 40 series cards.
@@donalgodon im still on the 2070 super at 1440p
DLSS4 using the new model.
@@TRAV1NATOR same, such a great card and I can play pretty much everything at 1440p right now
@@Onepieceownzat low or medium settings 😂
*There was a whole chart by Nvidia depicting this very thing*
I think I’m just going to continue with my RTX 4090 FE and see what the future of the 6090 will be, and I hope it’s brute power and not Ai.
hear, hear 🤞🏾
Same
I remember hearing that the 50 series cards were only able to achieve this increase in performance through AI. The actual raw power of the cards is starting to reach its limit. It will be interesting to see how they progress into the next generation of cards.
There won't be any brute power; it is definitely an AI future. They had been throwing brute power at it for as long as they could; that's why they went to AI. That is a very nice dream you have.
Likely you will see much less change in rasterization performance, if any at all, and newer, better AI path-tracing rendering tech that might even make rasterization look like pixel art.
Why doesn't ign make their videos in 4k, and why is the digital foundry camera 360p?
Why should they? I watch YouTube videos at 360p so my data allowance doesn't run out quickly.
@Alfi-x6b because the world is larger than you
There’s no need, not everyone has a 4k monitor/phone.
Handbrakin the vids down I'd guess?
I’ve been commenting this for years. My guess is they opt for quicker uploads
I think the 5080 is worth it. I have a 3080 Ti, but with a 32:9 monitor it has to work harder, and I think the 5080 is good enough for that.
@@frostyshakes7531 my 4080 Super does well on my 32:9, so the upgrade will definitely be good for your needs
What cpu are u planning to use for the 5080?
Seth and Richard... On the same video.. 🍿🍿🍿
Richard is right, CPU is very important. Get that X3D!
9800x3D is being shipped to me very soon
I would wait; not only is AMD gonna increase production on the X3D CPUs, but the new CPUs they are releasing are gonna further increase supply by late Q1.
The 9800X3D will still bottleneck the 5090 at 1080p.
@@hb-hr1nh are you slow? How would the best CPU bottleneck the best GPU? Or do you just like sounding smart 😂
@Kinslayu Same, waiting for Amazon to "Ship"... 😮💨
Remember that the 4000 series is getting DLSS 4, just not the exclusive multi frame generation feature :)
So is the 30 series, and you can use Lossless Scaling; it has a 4x mode too.
@@ELpeaceonearth "Frame generation bad" until AMD has a version. 😂 DLSS4 should have been the star of the show, because NVIDIA is our only hope since game optimization croaked during the pandemic.
@@ELpeaceonearth Common misconception. Multi frame generation is only one part of DLSS 4. That part is exclusive to the 5000 series, but the improved picture quality and performance is for the 30, 40 and 50 series.
@@aXque I have a 4060; does that mean I shouldn't upgrade? I don't really understand what multi frame generation will be helpful for. The quality? Or more fps?
@@Aabid_Hassannn My recommendation is to wait until dlss 4 comes out, try it out for yourself.
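The feature split this thread is debating can be written down as a small lookup. The mapping paraphrases launch coverage (new transformer model for all RTX cards, 2x frame generation for 40/50, multi frame generation 50-only), so verify it against NVIDIA's official support matrix:

```python
# DLSS 4 feature-to-generation map as described in this thread.
# Paraphrased from launch coverage -- verify against NVIDIA's
# official documentation before relying on it.
dlss4_support = {
    "transformer upscaling / ray reconstruction": {"20", "30", "40", "50"},
    "frame generation (2x)": {"40", "50"},
    "multi frame generation (3x/4x)": {"50"},
}

def features_for(series):
    """List the DLSS 4 features available to a given RTX series."""
    return [f for f, gens in dlss4_support.items() if series in gens]

print(features_for("40"))  # everything except multi frame generation
```

By this table a 4060 still gets the image-quality upgrades, which is the point the replies above are making.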
considering I'm still rocking a 1060 with health issues...Yes, upgrading will be very worth it
If you haven't upgraded by now, you aren't gonna be buying a 50 series. I'm not; my 4080 Super is getting upgraded to DLSS 4 anyway.
If you are upgrading to the 50 series just because you have a 1060, you really need to do your prep and understand what you're doing first.
I have a 1080ti and the upgrade is going to be well worth it.
@@Byakuyabou2 At least multi frame gen will be most beneficial to CPU limited PCs.
That would be like Goku going from base to ultra instinct 😅
The upgrade Will be amazing for you
Can someone summarize this video briefly? I'm a foreigner so it's hard for me to understand.
Best crossover DF vs. IGN
It's not a crossover; IGN owns about 50% (49? 51?) of Digital Foundry.
If my panel only goes up to 144Hz, do I need an overpowered GPU card?
You wouldn't get the benefit of a GPU pushing over 144 fps without screen tearing.
Yes, cause this year and next, the game devs will constantly increase visuals to use it. I thought the same as you 2 years ago: got a 7900 XTX, was planning to use it for the next 4 years, but then AMD took FOREVER with FSR improvements, it looks like crap, and after 2 years many games don't have it. Kingdom Come Deliverance 2 will only have FSR 2.2 per AMD's FSR list. What a joke. Meanwhile game devs LOVE Nvidia, and will keep pushing games to the limit of the hardware that Nvidia provides. I really wanted AMD to do better, especially after banking 1000 dollars on the 7900 XTX in early 2023. Nvidia has the 5080 at 1000 dollars instead of 1200, and DLSS is in KCD2 day one, so goodbye AMD, hello Nvidia.
Yes, because the 5090 will barely hit 4K at 120-144 Hz natively on max settings.
@johnc8327 Not necessarily true; I have a 144Hz monitor and there is a clear difference in smoothness even if my frames are below 144. Similar to a high-Hz TV tbh. I don't know why you guys always say this lol, it's like people don't test for themselves and just listen to what they read.
No
Game mode with frame gen on Samsung TVs adds around 10ms of latency going from 30 fps to 60 fps, and the picture looks so much better! I play all PS4 games this way.
This works pretty well with switch games as well. Got one of them little AA boxes, and the switch is looking alright on the big tv.
Interpolation on TVs doesn't use motion vectors and historically had a giant amount of latency. I guess they've improved the latency, but there's still no data from the game, only raw pixels, which isn't good for artifacts.
I played on my 50 inch TV and bought a monitor at the same time because of the supposed latency... I wasted money on a monitor; games run smooth and beautiful on the TV, and it's wayyy cheaper than a monitor.
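The motion-vector point above is the key difference between TV interpolation and in-engine frame generation, and a toy 1D "frame" shows it (everything here is illustrative, not how any real TV or GPU implements it):

```python
# Toy 1D illustration: why pixel-only interpolation ghosts while
# motion-vector interpolation doesn't. A "frame" is a row of pixels;
# 255 marks a bright object that moves from x=2 to x=6.
def blend(a, b):
    """TV-style: no motion data, just average the raw pixels."""
    return [(x + y) / 2 for x, y in zip(a, b)]

def warp(a, motion_vector):
    """Engine-style: shift each object halfway along its known vector."""
    mid = [0] * len(a)
    for i, v in enumerate(a):
        if v:
            mid[i + motion_vector // 2] = v
    return mid

frame_a = [0, 0, 255, 0, 0, 0, 0, 0]
frame_b = [0, 0, 0, 0, 0, 0, 255, 0]

print(blend(frame_a, frame_b))  # two half-bright ghosts at x=2 and x=6
print(warp(frame_a, 4))         # one full-bright object midway, at x=4
```

The blended result is the classic double-image artifact; the warped one puts the object where it actually was mid-motion, which is what per-game motion vectors buy you.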
We don't need new gpu for games. We need BETTER OPTIMIZATION
I appreciate how everyone thinks they're the type of player who would not be able to play with a 5070 because they're a hardcore pro and would be able to tell the difference easily, when 90% of players would be just fine with a 5070.
13:55 bro tried to drop a Bottom Line & had to quickly retract 😅
You would think a guy that talks about graphics fidelity and image quality issues would have a higher bitrate stream. Lead better, Rich. Leadbetter.
Frame generation is a no-go for me; I always turn it off because of the noticeable input lag (for me).
🧢
just enable Nvidia Reflex man
10ms? Really? Come on man.
he has a xbox series s
@@robe4314 Frame Gen has been known to hit very close to 100 ms and even over it at times.
I love hearing these complex details explained in layman's terms by Rich here. Sometimes on DF proper they assume I know everything by the way that they speak and I just try to keep up haha
Idk what you guys are seeing. That shakiness from the frame gen is a little too jarring. It looks like the object is literally being shaken back and forth. At 4k it will be even more noticeable.
Ima stick with my 4080 Super for now, it's getting DLSS 4 also.
Ya it’s sad but true. I just want to max everything out and I’m starting to see the limit with my vram getting consumed by lighting and frame gen
It seems DLSS 4 will increase performance and use less VRAM on the 40 series
@@mauriciobecker7916 ya it could be great. I do like my 4080 maybe 60 series will be the next big move
@@Blazemaster9same, I love my 4080 Super. I’m gonna wait until 6000 series as well.
With all the discourse about 'fake frames' you'd expect this to make these cards less desirable. But I'm gonna go out on a limb here and say they're going to be sold out for months and have inflated prices for a long time.
the FOMO effect for many
5080 n 5090 not gonna be the case but I can definitely see the 5070s having this problem
20:47 When the 20 series came out... the first 4K monitors came out??? You have got to be kidding me. I had a 4K monitor back in 2015 after the GTX 980 Ti was released...
Does Seth Macy even know anything about computers and the advancements in digital technology?
if it isnt g-sync or freesync who cares if its 4k.
The monitors back then were fake 4K that needed two cables to do 3840x2160. Can you be any more disingenuous?
@@NakedSpaceMan Fyi, the PB287Q monitor was released in 2014 and it only needed 1 DisplayPort 1.2 cable. His statement isn't necessarily disingenuous.
@@NakedSpaceMan williamjake is correct. ASUS PB287Q 4K released in 2014. Was very capable of being plugged in with one cable, so may be do your research before you mouth off. Still have the invoice & date stamped on - bought beginning of 2015 January.
@EpicPianoArrangements A 60hz workstation monitor. Oh yeah, such gaming capabilities in 2014 when 60hz was considered the barely usable standard for gaming
for context: dlss 4 is for ALL RTX cards. only MFG is 50 series exclusive
i dont think 2000 and 3000 series is getting dlss4. Correct me if im wrong.
@@theicewitch9328 all rtx cards will receive all dlss enhancements, so transformer model, reflex 2, dlss 4, etc.
only multgen is 50 exclusive. i could post you the link but youtube won't allow it.
I thought only DLSS 'enchancements' are coming to cards other than RTX 50 series
@@danceoffAsh dlss enhancements are reflex 2, transformer models, dlss 4, etc
but frame gen is 40-series exclusive and multi frame gen is 50-series exclusive
Yeah I’m happy with the improvements my 4070s will receive. Not upgrading.
Not sure why ign would post a 30 minute video of Digital Foundry discussing the same thing they talk about for 30 minutes on all their other videos.
IGN clout chasing
Ad revenue.
Because IGN bought Eurogamer, that's why
y'all both right
Yet here you are watching and even commenting
I think all this a.i upscaling and frame gen will lead to photorealistic gameplay in the next gen of gaming
But does it improve gameplay?
@@paullasky6865 of course
Been playing Sandland, Forza Motorsport (8), Ghostwire Tokyo, Deathloop and the Callisto protocol lately. Can’t wait to get the 5070 to match a 4090 performance in those games!
Not gonna happen 😂
@@Clutch4IceCream It does with frame gen. You won't notice the difference
@@rkwjunior2298 it was a sarcastic comment. The idea is that none of those games support frame gen.
@@Technova_SgrA I bet that a 5070 will still do well in raw performance compared to an older card from that time.
Well, we all hate the false marketing of having 4090 performance, but 5070 being 20~30% better than 4070 and $50 cheaper, does make it enticing.
I will wait another 2 years, 6070 is gonna be worth the upgrade.
9070 must be better bro
10070 is where it’s at.
@@excelbelajar Lol u guys... I have a 2070 Super desktop, and 4060 laptop bought at the end of 2023.. I dont game at max settings and my family doesn't own a gold mine...
@@NeroX-nh8se lol i got a 3080 ti (95% of gaming performance of 3090) for $900 bucks in 2021. But the 5080 is $1000 bucks and a good deal. What’s the 5070, $750? 😭 it used to be $500 lmfaoooo
3080 retail was $700, 3070 retail was $500. Now reduced to $1k and $750? Interesting
I'm planning on upgrading from a 3070 to a 5070ti. Do I need to upgrade yet? No probably not, but DLSS3, and now 4, are enough to justify it for me. Full path tracing at a playable framerate? Sign me tf up.
@@dan-crum same as me 😀
This is pretty insightful. Many thanks!
Building a new computer right now, should I wait for the new monitors with DisplayPort 2.1? I don't know when they're supposed to come out
The answer is 15:27 Worth the upgrade? Depends on what you have now and how you play. If you are working with 4090 now, resale value on that will get you $1k+ easy. I currently have 7900xtx and 7800 xt and keeping both. Def getting 5090 if I can get my hands on it at launch. If not, this will be selling for $2500+ if supplies are low.
If you want to play in 4k and get high frames with Ray Tracing and no limitations, yeah get a 4090, get a 5090. I say just get what you can afford.
I just hope that people that mention 4K gaming never speak again in their lives. The future was 4K and 8K but we are further from that now than we were in 2015.
Because rtx 🤡
This dude knows his stuff man. I've fallen in love with PC, there are so many things you have to factor in when making your decisions
Does anyone know when we can expect to see any benchmarks for these 50XX cards?
About 3 weeks time.
Reviews should start to drop around the release date. You need to find out the performance embargo date for each product.
The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't).
The 5070 could be sold at $400 and the 5080 at $650.
Guarantee though that the performance of the 5090 is going to smoke the 5080 by almost double given the CUDA cores, RT cores and double the VRAM.
Naah they sell a lot. There is no better card
People complain about everything man, even when they reduce prices. They could be sold at 1200 and 800 if they want. Because people still buy them anyway
"The 5070 could be sold at $400 and the 5080 at $650"
This is based on what? How do you know how much they could be sold for?
The xx90-cards are prosumer/professional-tier cards. They're not only competing with xx80/xx80ti, but also with Quadro/Ada, etc for professional usecases. And in these instances a 2k$ price point is not that much money.
Not really struggling getting enough frames for most games with my 4090 at 4k even without frame generation, mostly getting 70 to 165 fps on all games even on the new modern games.
This video features Richard Leadbetter of Digital Foundry, joining IGN to discuss the NVIDIA 50 Series GPUs and their implications for PC gaming. Key points include:
1. AI-Driven Performance: The 50 Series GPUs utilize AI technologies like DLSS 4 and multi-frame generation to achieve performance boosts, reducing reliance on traditional transistor density increases due to rising costs.
2. Latency Concerns: While AI-driven features significantly enhance frame rates, they also risk higher latency, which can affect gameplay, especially in fast-paced games.
3. Complex Performance Metrics: Richard highlights that "performance" now includes more than frame rates, as low latency remains critical for gameplay quality.
4. Impact on Handheld Devices: The technology's potential for handheld devices, such as the rumored Nintendo Switch 2, is discussed, emphasizing power efficiency and the possibility of a streamlined DLSS version for mobile platforms.
5. Future Analysis: Digital Foundry plans detailed reviews of the 50 Series GPUs to guide gamers in deciding whether to upgrade, considering the trade-offs between advancements and practical gaming needs.
The GPUs mark a shift in gaming technology but come with nuanced considerations for both PC and handheld markets.
@@Jakabokbotch2nd Ty
Wow bro, thanks.
@@EXPLORER-hq1us it's an ai bot
@@guccipucci69420 I knew that much 😅, I thought someone took an AI summary using some paid app (because most of them are) and posted it here 😁
@@EXPLORER-hq1us I missed your sarcasm lol
I saw some pics of MSI PC gaming builds, and they were labeled as gaming PCs, with dual 5090 cards, but couldn't see how they were bridged. DX12 games don't support SLI, unless the game developer goes out of their way to make it work, like DICE did with Battlefield 1, which was the only DX12 game that I found to support and run beautifully in SLI.
Will it make any difference for VR gaming
Future is AI whether we like it or not. Brute power gains are now going to be limited gen on gen.
What about Reflex 2? They have yet to discuss its role in this....
Feels like Richard knows more than he claims, especially in regards to latency with MFG : )
I wish IGN would explain to PC gamers how to “crank up your Power supply” as he stated
Just find where the handle goes in and start cranking
I would love it if NVidia would develop a standalone box that would upscale and add frames to the output of something like the Nintendo Switch. I would pay a lot for that.
It's a great point. Upscaling and frame generation could be a game-changer for handhelds!
Is it a good idea to buy a used 4090 now for about $1000?
Great breakdown by Richard Leadbetter 👌
If ray tracing is not a priority, would it be reasonable to purchase a 4080S without reservation? I plan to wait until the 5000 series cards are available and the price of the 4000 series cards decreases.
Wondering how DLSS 4 is going to be on my 4070 Ti Super. Hope it helps a little bit.
I don't see why it wouldn't help in some ways, at least. DLSS 4 has been stated and shown to take VRAM usage in games down at least a bit from DLSS 3. I have a feeling that's why the 5080 still has 16GB of VRAM. I think they are going to try and focus on taking VRAM usage down.
@johnaldridge83 it's supposed to give us 2x frame gen and better quality, but who knows. this frame gen is not the way to go. they drive me nuts lol
Scalpers are going to buy up most of the initial launch supply. Unless they have been stockpiling them which I doubt.
For me not yet I wanna see how well dlss will work on my 3070
some people don’t like DLSS or other similar features the same way they are reluctant to embrace EV. The reality is if the industry leaders believe it’s the way to go then it’s the way to go.
When I first came across screen stuttering on my xbox one x in 2018, I thought the tv should add additional frames to circumvent the stutter. It now seems like that will one day be a possibility, though it will probably come from the console. Would be great if the tv could also do it for older platforms and games, with upscaling and HDR. It would be great watching YouTube in 240hz. They still need to mocap video footage using AI for UE5, for the ultimate animation quality and realism. AI should be used for effects and graphical embellishments, not interpolated frames, so input lag isn't increased in online pvp games. Even the most low-res basic geometry should still be rendered at 240fps, not boosted up from 35fps with AI interpolation. Thank me later.
DLSS 4, being able to 3x or 4x frame generate, is because of the additional time now achieved in pre-determined transistor configurations… The really scary part is explaining how we achieve those configurations and what they now allow us to do, without it starting to sound like the building blocks of a pre-cognitive AI sub-conscious…
Im getting the 5090
Same.
because the money in your pocket is burning your leg?
because you want 75% of the frames it generates to be fake? you cannot live your life without upscaled images?
is it vital for you to dissipate 550 watts of heat inside your room?
It is a good heater 🎉😂
why? It looks shite
@@betag24cn Why do you need others to justify how they spend their own money?
Cyberpunk 2077 on the 5090 is going to have crazy high latency and ghosting with that many made-up frames. There was a reason they showed only close-up shots or slow movement
I'm glad RDNA4 is focusing on upscaling rather than Frame Generation
Just turn off frame generation
This was a very unusual keynote. How little information was given. It's got me morbidly fascinated.
@@khaledaltaban you think you are smart? We didnt know that?? Turn off dlss then you get 25 fps
In 2050, only the first frame is rendered and the rest of the game is a hallucinated wild ride.
If you're on 3000 or below it's worth, 4000 is subjective
the more u buy 💚🖤 the more u save 💚🖤
i was at the keynote. a life memory forever for me.
Rich!!!!! Good Stuff 😊
Everyone crying about frame generation latency in first person shooters normally play at 1080p on a 24" screen. This new dlss4 is meant for 4k or higher gaming 🤔 😏
I have an awful Dell I haven't been able to upgrade mainly because of dimensions but the 5070 is the same size as the 3070 does anyone think the 500w PSU could handle the 30w increase?
Richard is very wise!
So many 5090 variants, hard to decide which brand. 😬
My big question is, is it worth selling my 4070ti super to buy the 5070ti? From what I can tell with the info so far, probably not... I can run Indy at 4k, full ray tracing with DLSS quality at 60fps and maxed out settings. If the new AI and DLSS4 (which will come to 4000 series and others anyway) allows me to squeeze even more performance/quality out of it, that'd be great. Might not need to upgrade until 7000 or 8000 series..
It's not worth it 😂😂 i haven't watched the video, but if you have the 4090, just enjoy it 🎉
It’s worth it to me.
It would be great, if with the new transformer based ai model, DLSS 4's 4K performance mode comes closer in image quality to that of DLSS 3.5's 4K quality mode. I hope that could make my 4070 super more than enough for 4K60fps gaming at high settings for many games. Even if it does, will the new transformer based model perform better with more AI TOPS, producing more frames in a particular DLSS mode for 50 series, for example, a 5070 upscaling to 4K much better than a 4070 Super/Ti Super just because it has more TOPS?
Easy answer: If you want the best Tech and got the money Get you a 5090 or 5080….
Perfect answer: If you got a 4080 or 4090, maybe just upgrade your display.
Best answer: if you have a huge back log that plays the games you currently play at great specs just enjoy your gaming until you’re forced to upgrade and then get what you can afford.
Worst answer: splitting rice from rice to find the best rice
I like this comment. I have a huge backlog and a 3080 ti to run it. Buying an ayn Odin portal 2 (7 inch oled cloud gaming handheld only 430 grams weight) to play my games and because its 1080p, I can get even MORE work out of my 3080 ti playing this way
Nailed it
Play your backlog
Buy some NVIDIA shares with your two grand instead
This. Don't understand how or why people upgrade every gen. I'm gonna get the 5090....because I've been rocking a 2080ti since it came out. At 1440p, I've only recently been noticing slow down. With the 5090 I won't upgrade for 5 or 6 years.
If the 5070 is on par with the 4080, that is already very good. I am more inclined to upgrade my ageing Gigabyte Gaming OC 3080 Ti 12GB to a 5070 Ti 16GB, some OC version, most likely from Asus or Gigabyte. Nevertheless, I am still curious to see the 5080/5070 Ti and 9070 XT against the 4080/S in order to understand the gains in rasterization performance. DLSS 4 MFG sounds like an awesome feature, but let's see what the implementations in games will be like.
how is ai gonna help me with my renders in blender?
We gotta start using latency to measure performance and not frame rate... MSI Afterburner should do an update to show latency...
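The idea of judging performance by frame times rather than an fps counter can be sketched from the per-frame timestamps an overlay tool logs. A minimal illustration (the timestamps below are invented for the example):

```python
# Sketch: deriving frame-time stats (the thing that actually reflects
# smoothness) from per-frame timestamps, the kind of data overlay tools log.

def frame_time_stats(timestamps_s):
    """Return (average frame time in ms, worst frame time in ms)."""
    deltas_ms = [(b - a) * 1000 for a, b in zip(timestamps_s, timestamps_s[1:])]
    avg = sum(deltas_ms) / len(deltas_ms)
    worst = max(deltas_ms)   # a "1% low"-style metric would take the worst 1%
    return avg, worst

# Five frames at ~60 fps plus one 50 ms hitch: the average fps still looks
# decent, but the worst frame time exposes the stutter.
stamps = [0.000, 0.016, 0.033, 0.050, 0.100, 0.116]
avg_ms, worst_ms = frame_time_stats(stamps)
print(f"avg {avg_ms:.1f} ms ({1000/avg_ms:.0f} fps), worst {worst_ms:.1f} ms")
```

This is why a single fps number hides hitches: the same average can cover very different worst-case frame times, and full input-to-photon latency is a separate measurement again (hardware tools like LDAT capture that end to end).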
I know gamers are upset that we can't increase brute force rendering without costing a fortune (I guess we can thank TSMC for that) but at least with Nvidia we have resource options.
You will obviously have different fps targets for different games. In a triple-A game you can target 45fps and multiply it 2 or 3x, or use the new improved DLSS.
If you're playing a comp shooter you probably just want to use DLSS and reflex.
99.9% of us probably aren't using 4K Max settings with path tracing and hoping for some crazy smooth experience but at least options are available.
Oh hell yes! Digital Foundry on ign!?! Big winn!
fake frames do not equal performance! how about i give them $500 for a 5090 and they can use dlss 4 on the cash for the other $1500.
Short answer: no
Why?
Long answer:
NVIDIA's recent unveiling of the GeForce RTX 50-series, spearheaded by the RTX 5090, has generated significant interest in the tech community. Priced at $1,999, the RTX 5090 boasts advanced features, including DLSS 4 technology, and promises substantial performance enhancements over its predecessors.
However, for many users, the investment may not justify the upgrade. Several factors contribute to this perspective:
Marginal Performance Gains in Traditional Rendering;
While NVIDIA claims the RTX 5090 offers up to twice the performance of the RTX 4090, these figures heavily rely on DLSS 4's AI-driven frame generation. In scenarios without DLSS 4, the performance improvement is more modest. For instance, in "Far Cry 6," the RTX 5090 demonstrates approximately a 27% increase over the RTX 4090, and about a 43% increase in "A Plague Tale: Requiem."
Dependence on DLSS 4 for Peak Performance;
DLSS 4 introduces multi-frame generation, producing up to three AI-generated frames for each rendered frame, significantly boosting frame rates. However, this technology's effectiveness depends on game support and may not be universally compatible. Additionally, some users express concerns about potential input latency and the authenticity of AI-generated frames, particularly in fast-paced gaming environments.
High Power Consumption and System Requirements
The RTX 5090 has a Total Board Power (TBP) of 575 watts, a notable increase from the RTX 4090's 450 watts. This escalation necessitates robust power supplies and efficient cooling solutions, potentially leading to additional expenses for system upgrades. The increased power draw also raises concerns about energy efficiency and long-term operational costs.
Elevated Price Point;
At $1,999, the RTX 5090 represents a significant financial commitment. For many users, especially those who already own high-end GPUs like the RTX 4090, the incremental benefits may not justify the substantial expenditure. The cost-to-performance ratio becomes a critical consideration, particularly when previous-generation GPUs continue to deliver exceptional performance in current applications.
Conclusion;
While the NVIDIA RTX 5090 and the 50-series GPUs introduce impressive technological advancements, including DLSS 4, the decision to upgrade should be carefully evaluated. For users with existing high-performance GPUs, the marginal gains in traditional rendering tasks, reliance on game-specific DLSS 4 support, increased power requirements, and the premium price may render the upgrade less compelling. Assessing individual needs, current system capabilities, and budget constraints is essential before committing to such an investment.
(All Credits go to chat GPT)
@@Sets. thank gpt
Long answer: nooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo
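The frame-rate vs latency trade-off raised in the long answer above can be put in rough numbers. A simplified back-of-envelope model (the assumptions are mine: input is only sampled on rendered frames, and interpolation holds back one rendered frame, which is not an exact description of any shipping implementation):

```python
# Back-of-envelope model of multi-frame generation: displayed fps scales
# with the number of generated frames, but input responsiveness stays tied
# to the base render rate, plus roughly one base frame of buffering because
# interpolation needs the *next* rendered frame before it can display.

def mfg_estimate(base_fps, generated_per_rendered):
    displayed_fps = base_fps * (1 + generated_per_rendered)
    base_frame_ms = 1000 / base_fps
    added_latency_ms = base_frame_ms   # one held-back rendered frame
    return displayed_fps, added_latency_ms

fps, extra = mfg_estimate(base_fps=30, generated_per_rendered=3)
print(f"{fps:.0f} fps displayed, ~{extra:.1f} ms extra latency")
```

So a 30 fps base with 3 generated frames per rendered frame displays like 120 fps while feeling like 30 fps with an extra frame of delay, which is why the commenters keep separating "smoothness" from "responsiveness".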
Going from a 30 to 50…might be worth it
Upgrade from what sir ? I have 1080ti, should I upgrade ?
if you need ray tracing perhaps, if not, i wouldnt bother
@@betag24cn yeah naah. I want gameplay and playable framerate. Not super ultra pro lighting and shadows
1080ti should be good for another 7 years, card is a beast
@Eren-da-Jaeger i would stay on it for 2 more years if possible, this generation is not offering really much, if anything. i would go for the 5070, for the power consumption it theoretically has, the others make no sense at all
80 series is now officially a mid-range card… Crazy 😂
If the AI can generate frames why can't the AI figure out how to make the latency lower too at the same time? Can the model not be trained?
You mean Nvidea Reflex 2?
It would be still faster without Frame Generation.
Because it cannot predict the future of your decisions (it doesn't know when you click "shoot" button), but it can realize that between 1 and 3 there is 2 and generate it as a transition.
It can register the click and generate the gun shot tho
It
@Blazemaster9 not good for PVP or PvE.
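The interpolation-vs-prediction point in this thread can be shown as a toy timeline: a click that lands between rendered frames is only read at the next render, and with interpolation that rendered frame is displayed one render interval later still. The numbers are purely illustrative:

```python
# Toy timeline: why generated frames can't reflect a click that happens
# between rendered frames. Input is only sampled at render time, and
# interpolation delays each rendered frame by one render interval so the
# in-between generated frames can be shown first.

RENDER_INTERVAL_MS = 33   # base ~30 fps render loop

def first_frame_showing_click(click_ms):
    # The click is registered at the next *rendered* frame...
    next_render = ((click_ms // RENDER_INTERVAL_MS) + 1) * RENDER_INTERVAL_MS
    # ...and interpolation displays that frame one render interval later.
    return next_render + RENDER_INTERVAL_MS

print(first_frame_showing_click(10))   # click at 10 ms -> visible at 66 ms
```

So no matter how many frames are generated in between, none of them can contain the gunshot, because the generator only ever works from frames that were rendered before and after a known interval.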
It's still weird to me DF is part of IGN now. I'd never watch this video if not for Rich
Keeping my 4090 and waiting till it’s actually worth upgrading
I'll wait for rtx 6090 👀, powered by quantum chips
I have a SFF case, I'm upgrading from a 4070 to the 5070 Ti or a 5080. I'm looking at around a 60% to 80% raw performance uplift depending on the graphics card I upgrade to, not to mention I get more VRAM. DLSS 4 multi frame generation is just a cherry on top. Seems worth it to me, especially since I already have a buyer lined up for my 4070, so I'm not paying full price for the upgrade.
I'm glad that I didn't get a 40 series card last year, now I'm gonna get 5080
Me too. I'm upgrading from my 3080.
5080 should have 24GB Vram. Maybe wait for a ti or super?
I went from a 6800xt to a 4090, right before this announcement. 😂 wish i had waited for the 5090. I wonder what the 60 series will bring...
@@kevogames86nahhh I’m waiting to see what amd has. Seems like it could be interesting for mid range
@@Mr.Dodo- It might only have 18gb or 20gb.
Love Digital Foundry and Rich, but they have a few opinions that I don't agree with. Each of them seem to love motion blur and now Rich seems to not mind frame generation in any AAA game. Personally, if the game is an FPS shooter (e.g. Cyberpunk), the added latency is horrid. That's plenty fast-paced to necessitate minimal input lag.
Of course it all depends on the base FPS, but that's getting into it.
The problem of latency in FPP games on PC is the molasses feel of camera/mouselook. Nvidia is trying to cheat it with an old VR trick in Reflex 2.
Uhmmm but arent most DLSS4 features coming to old RTX cards (I know they wont run the same but still..)
I would say if you own a card from the 4000 series it's not really worth it to upgrade, but if you are still using a GTX or an RTX 2000 series then an RTX 5070 and beyond are good upgrades.
So I’m a console gamer recently turned PC gamer. Help me out….is it worth me upgrading to the 50 series? I currently have a 4070 super and I game in 1440p.
Bro, dont. You're getting DLSS 4 man. You didnt even pay for that. Its given for you for free!
You can't use multi frame gen without the 50 series, just like how regular frame gen is limited to the 40 series. The upscaling improvements come to all RTX cards though.
This is an old argument revolving around notions of 'purity'
Us old timers remember when people thought AA/AF and reducing rez was 'cheating' rather than just running higher resolution. The 'real frames' argument in new clothes.
Finally some sane discussion instead of cursing Nvidia out of frustration.
GeForce now looks really tempting now that 12 out of 16 or whatever frames are cloud frames anyway.
Im just here for Richard. No more, no less. Haha
If you have an RTX 40 series card, this is a definite SKIP/PASS and u should wait for the RTX 6080 and 6090 which im sure will be the REAL upgrade. Having the best card from the last gen, means you can make it through current gen
(Ex: I had my rtx 2070 for 4 years before i got a 40 series card, I skipped the 30 series as there was no need)
i agree, i rented a 2070 super from evga and just traded it in for a 3080 for the price difference and it was 30-40%, maybe less in some games. its all fomo and fake advertising
I am not upgrading until 2030. On Black Friday 2023 I purchased the AMD TUF Radeon RX 7900 XTX, because NGreedia was overpriced on their 4000 series cards. What NGreedia is doing these days is horrible. If you take away the DLSS and Ray Tracing along with all the AI fake frames, the 5000 series cards they have now are no better than the RTX 3090. People should NOT be purchasing the 5000 cards, because they are paying for an overpriced series of cards for fake frames. If the cards stood by themselves, they would not look good at all. And NGreedia knows this. NGreedia are deceiving people. NGreedia are just locking performance behind AI software. When I upgraded to the 7900 XTX, I came from the GTX 1070 Ti. So, I will wait until 2030 before I consider upgrading again.
The problem I see with the 5000 series right now is.... where are the games? Where are the games that are gonna need a 5080/90 in order to run? There are no titles out there currently that can't be run proficiently on a higher end 3000 series card, or a 4000 with DLSS. I just don't see the need for anyone with higher than a 3080 to jump up to 5000 yet.
I just upgraded from 1080ti to a 4090. I bought that 1080 in 2017 and i was able to play nearly everything on med-high settings up until games like Cyberpunk, and even then it still runs well with FSR. I bought the 4090 in March of this year thinking that if push came to shove that I would just sell the 4090 to help pay for the 5090 if there was any new tech that would be exclusive to the Blackwell cards. I think ill stick with the 4090 for a good while.
Black Myth Wukong is difficult to run at decent fps at my wide 3440x1440 on a 3080 10GB. Saying there is no title that can't be run proficiently on a higher end 3000 series card is not accurate, since those cannot use Frame Generation :/ And the raw power increase compared to the 3080 at these resolutions should still be decent with a 5080.
I've got a 4090, and I use an LG OLED TV at 144hz. Multi frame generation is not exciting enough for me to upgrade, and aside from multi frame gen the 40 series is getting all the DLSS 4 features of the 5090. I don't do competitive shooters and stick to single player games, and aside from Indiana Jones, Black Myth Wukong and Cyberpunk, I get close to 100fps native 4k, and well over 100fps when I use DLSS with no frame generation in most games.