Didn't AMD say they were going back to competing with Nvidia only on low- and mid-range GPUs, leaving Nvidia alone at the high end? I seem to remember hearing something about that recently.
It is hard to be disappointed in a product that doesn't exist. I don't spend my life fretting over speculation. After release I will form an opinion.
Back then everyone and their mom wanted a 3060 or 3070 with 8gb. I only managed to get a 6800xt 16gb. They said don't worry about 8gb it's all about the raytracing. Times have changed alright!
If I have an rtx 4070, would the 5070 be compatible? I'm curious if they will have the same connector and similar enough dimensions, wattage and so on, that it will work
6800 XT vs 7800 XT is basically the same card in real-life scenarios. For casual games, around a 5-frame difference. Upgrading to something that provides less than a 25% framerate improvement is pointless for me, considering the prices.
The chair DOES look nice. But it's 500+ USD. That is quite a lot. You can get a full recliner for less. Though I suppose a 10 year warranty IS pretty nice...
Right. Get a La-z-boy recliner, and mount your monitor to the floor. I used a couple of 6x6" steel plates welded to some rectangular tubing, then a standard pneumatic swivel monitor mount on top. Secure everything to the floor with some lag bolts. Keyboard goes in my lap, mouse on the arm rest. If I need to get up or want to watch TV I just swing it out the way up against the wall. Saves a ton of space.
Personally it should be 600 = 12GB, 700 = 16GB, 800 = 20GB. Even then, I'm of the belief that no card sold from 2023 onward should have less than 16GB when they are all going up in price!!
@@volvot6rdesignawd702 For the price of cards today, they really don't offer enough. I have a water-cooled 1080 Ti that cost me about $1200 to $1400. That same figure today, with inflation, should be about $1800; I'd be fine with $2000 for a top-of-the-line card. But here in Australia, a 4090 now goes for $3500.
12gb Vram should actually be the new baseline by this point in technology... 8gb cards are fine for mobile systems and the likes, but not for desktops.
Why do so many people immediately blame GPU manufacturers for poor-performing GPUs when it's devs and studios that have the power to design what their games need? Game engines can almost always be better optimized, resources can be better managed, and projects can be delayed to make this happen, but nowadays we more often than not see rushed releases, poor CPU utilization, and graphics that pale in comparison to modded games from 2 years ago but still require double the VRAM. It feels like AAA studios are now designing their games to be played at minimum on the highest-end specs rather than for the largest audience. And then when you look at the fact that some studios actually partner with GPU manufacturers to create these spec sheets - it just feels like an ugly game of leapfrog where the only way to play your highly anticipated games is to keep buying better hardware every year
The thing is, most people aren't going to pay 50% more for a game so that it runs on 30% less VRAM. Also, VRAM amounts have stayed the same for like 8 years now and games are only getting more complex, so it is mostly the GPU manufacturers' fault. It makes a lot more sense for a GPU manufacturer to spend $15 on 4GB more VRAM than for gamers to pay $30 more for each game, plus fewer game releases because devs would have to spend a lot of time optimizing.
@alextoscano but the thing is games AREN'T getting more complex; graphics are virtually in the same place, if not a worse state, than they were in 2020-2022. We can see it in the way devs use upscaling and anti-aliasing techniques to hide poorly designed sets - the best example of this is literally any video that talks about how Unreal Engine 5 is affecting the industry. Just look at frame generation and DLSS. It's absolutely CRIMINAL that a software bandaid such as framegen is appearing on spec sheets for games, especially when frame gen and upscaling absolutely destroy VRAM, and it all started because of TAA (temporal anti-aliasing). The real issue is that those at the top are pinching pennies at our expense - AAA studios don't HAVE to charge $120 for their games, they don't HAVE to add microtransactions or subscription services - just as GPU manufacturers COULD add more VRAM to their cards or COULD make affordable low-end GPUs but don't
@ It really all comes down to this: we should be buying high-end hardware to push the graphics of our games to the limit, not the other way around. If I were buying games just to push my GPU to its limits, I wouldn't buy games; I'd just let it run FurMark on the 4K stress-testing preset 24/7
@ And this is before you take into account the quality of games that have been coming out. When was the last time you saw a AAA game come out of a major studio that wasn't a buggy, incomplete mess on release?
@@J4rring gotta agree with you, dlss and framegen were meant to help lower end gpus compete at a higher resolution. Instead devs have turned it into bandaid fixes for poor optimizations. Then they want to charge you 70/100 dollar ultimate editions for bugs and 60gb worth of patches on your Internet bill to fix it.
Sorry, just gonna say it with Indy: those specs are devs being lazy. Telling me I need DLSS on for decent frames... this better be referring to people who want over 60. As it stands I think path tracing is garbage; yes, it is easier for devs to implement, but it makes RT even MORE of a drain than it currently is. Hearing about the difference between Cyberpunk's Ultra RT and its path tracing mode, it sounds unnecessary and totally not worth it. Also, this is worse than everything everyone says about Cyberpunk and how unoptimized it is. Everyone, we have a new winner here!
@@NetflixForeign I can't play Cyberpunk without path tracing. It is a higher quality image than ray tracing. It's nice that the 4090 can handle it, since the 3090 was basically a slideshow with it turned on.
@@NetflixForeign DLSS + Frame Gen for Cyberpunk at 4K to use path tracing, yes. It looks way better that way than without. But I also want all my knobs set to 11 at the cost of fps, as long as it's high enough. Otherwise I'll wait until the next gen's top-end card and play it with that one.
I'm playing Indiana Jones and the Great Circle right now on a 3090 and I'm blown away at the performance with everything maxed out (except for full RT which I have off). I am getting a solid consistent 70FPS in 4k HDR. So definitely no problems here and it's not using anywhere near all my VRAM
Wow, if that 8600 is only going to have 8GB on it, they better price it at $99 or it will never sell. I wouldn't buy anything with less than 16GB, but according to a recent Hardware Unboxed poll, there are some people who would pay up to $100 for one. Hopefully there are two variants of the 8600, an 8600 and an 8600 XT, and the 8600 is the one with only 8GB, while the 8600 XT has the previously reported 12GB. In the prior generation, the 7600 has 8GB while the 7600 XT has 16GB. Even with 12GB, I'll not be considering the 8600 XT unless it's really cheap.
Nvidia has shot itself in the foot with the 5000 series with its lack of VRAM and appallingly slow bus speeds; Nvidia's greed is getting the better of it.
Hey, I have an AMD Radeon RX 6750 XT and I'm just having anxiety over its 12GB of VRAM. Will it be sufficient for 5 years if I run games at preset settings (mostly FPS games and sometimes AAA open-world games, at only 1080p)? I want to make use of it for at least 5 years, then upgrade to a new GPU.
Sorry mate, but "having to sign in to get driver updates" as you stated is and was totally false. Bad form on your part. Drivers were easily downloadable directly from the site.
so don't use the app. I have never used Geforce Experience. I actually rip it out of the drivers before installing them. Like the guy says, just download them from the site.
I've just built a new system for myself and I've put a second hand 7800xt in it as a placeholder for something new next year. At this point I'm so happy with it, I might just keep it for longer. There's no way I'm downgrading memory size and the Nvidia options with 16Gb are likely going to be hideously expensive. I guess if the 8800xt is exceptional I might be swayed. The intel cards are definitely looking better, but I'm not prepared to risk the poor compatibility issues.
Go Intel; wait for the specs of the B770 cards. My guess is we will see 16GB to 20GB (maybe 20GB). But forget cheap AMD or Nvidia cards: Nvidia has the mind share and market share (because people are either stupid or like getting rolled), and AMD will follow suit by just undercutting Nvidia's pricing with better VRAM (obviously not on their crap 8GB junk)!!
Yeah, really disappointing that they would rather screw us customers because they get better margins, and Nvidia is extremely anti-competitive, especially with VRAM. Intel is pretty much our only hope for reasonably priced GPUs.
Listen, I have only had Nvidia GPUs: a GT 720, later a GTX 1050 Ti, after that an RTX 2070 Super, then an RTX 3070 OC, and my last GPU from Nvidia was an RTX 4070 Super, which I had to return because it was just not good and basically a badly priced GPU overall, especially in Europe. I bought it for 729.99 EUR, a crazy price for what is basically an entry-level mid-range GPU. Overall the GPU can run games, but in most new titles we have now I ran into problems multiple times with VRAM allocation and not enough VRAM at all, plus stupid stutters and bad, bad, bad drivers. I'm sorry, I know AMD has good driver support now (in the past it wasn't that great), but I expected better results from Nvidia as the GPU leader. It was a damn shame; even my RTX 3070 OC didn't have as many driver issues as my RTX 4070 Super. Most of the time the drivers didn't break the game itself but apps like OBS or other recording programs, and frame gen used to stop working in many of my games after updating the driver. A damn shit show. The only things I loved about the 40 series were DLAA and DLSS; those were beautiful.

But I chose not to wait, and maybe that was wrong, because I picked an RX 7900 GRE for 630 EUR. Still expensive, but 100 EUR less than the RTX 4070 Super, and that's a big price gap in my book, so I picked it up. What I found was nice: the drivers were great, the Adrenalin software is damn nice stuff and very easy to learn, 16GB of VRAM, and around 10-15% better performance at 1440p or 4K. In games that use more VRAM this makes a big lead; in online games the RTX 4070 Super and RX 7900 GRE were on par, but the RX 7900 GRE always delivered better 1% lows, maybe because of the bigger VRAM. Overall I don't regret going with team red, and I don't care that they don't push out as many drivers as Nvidia does; the point is that those drivers work, and that's the main thing for me. I'm also done with Nvidia: prices go to the moon, the new gen won't be any different, and for maybe 10% higher performance I don't care at all, not worth it, so I'm staying with team red. The only thing I really miss is DLAA and DLSS, because FSR 3.1 is really good and with this version AMD shrank the gap between FSR and DLSS quite a bit, but quality mode and below are still clearly better on DLSS, no question. I just really wish AMD would release the new FSR4 for those of us with RX 7000 series cards; I hope it works with our GPUs as well.

And when it comes to this new AMD 8000 series, in short, it's not going to be a big performance improvement; it's going to be something like a beta test for new and better RT performance, in my opinion. So who are those GPUs for? Basically, in my opinion, people with RX 5000 or 6000 series cards; for them, upgrading to something like an 8700 XT or RX 8800 XT is very worth it, I think. For me, if AMD finally gives us FSR4 with at least 90% of DLSS's visual quality, I will pay all the money for any AMD GPU, because the only thing holding AMD back is a good upscaler. Many people don't even use RT, but they do use upscalers, and most AMD users would rather use Intel XeSS than AMD FSR, because most of the time devs implement an old FSR version or implement FSR wrong, like in Cyberpunk 2077, the worst FSR implementation I've ever seen. Btw, sorry for my English, writing on the phone at 4am 😂.
The raw number of CUs isn't the end-all of performance. The 7800 XT has equal or better performance than the 6800 XT with 60 CUs vs 72. People fixate too much on those numbers when to the end user they essentially mean nothing - what will matter is how well it performs and what price they set.
You're not just a "pretty big guy". There are some other things to say, that could be worked on. But first, the future health complications should be considered. And they are real. Take it from another "pretty big guy" who is 6 foot 4, 270 lbs. Not easy to find the clothes that fit and look good at the same time, especially the normal freaking jeans. Or something.
64 cu's is the leak we always had for Navi 48, and the performance was always said to be ~4080, probably between 7900xt and 4080 so I wouldn't be worried.
Playing both sides..... sure, you only have to move a few settings to fix it. Yeah, and that is what we do, what we've always done. Sure I want more VRAM, but if I move a setting down and it fixes things, we all know the picture quality takes a very, very small hit most of the time.
I still remember back in 2020 how ludicrous it was that the RTX 3080 would only get 10GB of VRAM! It just didn't add up, whether you looked at GPUs casually or got into the details!
Given how fast new and, I guess, improved components arrive almost daily, why would a PC Gamer buy anything today knowing it will be obsolete in a few months if that long?
3 things I want to see:
- Lowest amount of VRAM is 10GB/12GB, and that's for the 8500.
- 8800 XT and higher have at least 96/128MB of Infinity Cache and not 64MB.
- Updating the resource files to the latest tech (the .dll), with a program to flash current gen (7000) and newer gens.
And AMD not giving in to your demands is so strange when NVIDIA is a lot easier to mod and easier to update DLLs for... and best of all, it's free for the user to do it. With AMD your freedom is limited: only devs are allowed to use the FSR source code, not modders and users. But Intel shipped theirs as DLLs, which changed AMD's tune last minute with its FSR 3.1 code. Can't wait to see more Intel XeSS 2 DLLs now... more plugins for users, not just for devs, especially since Linux user space loves working with DLLs in vkd3d-proton, and now they've got a LatencyFlex alternative: Intel XeLL.
@JuanPerez-jg1qk Real. The biggest problem is that many games have such poor FSR support. Updating the .dll (I tried) sometimes makes the game unable to launch, or it launches then crashes instantly. Thankfully, most games work flawlessly with the update. But with manual tuning, no frame gen from FSR 3.1.2.
No offense to your chair, but I prefer actual office chairs, the old-style high backs. They are very comfortable and normally made out of material that won't crack, fall apart, or wear out easily like most chairs these days. They are also stupid expensive, but I've found I spend less over the years buying quality.
At this point I believe there may be an understanding among game companies to release poorly optimized games with intent, so gamers continue to drop thousands of dollars on GPUs.
My bet... the 8600 specs are for the non-XT. Also, considering that the 7800 XT is on par with the 6800 XT in performance with 12 fewer CUs and half the Infinity Cache, I'd say that this RX 8800 GPU is likely going to be faster than the on-paper specs suggest. Time, as always, will tell. 😉
Serious question Why does literally EVERYONE leave off the 3090 and 3090 Ti when posting comparisons? Where does my 3090Ti card fit amongst these bars?
The 3090 is a mid-range card (with loads of VRAM) these days (destroyed in performance by the £604 7900 XT), but because of its high release price and no availability in stores anymore (at discount), not many people own it. It is not really relevant anymore because of this.
@@norbert4787 Maybe, but it is still the best non 4090 card, and it stomps the 4060, 70, and 80's. It was at one point THE Nvidia card to have. I bet there are a ton of people wondering if it is worth the dimes for a 5090 or to buy a step-up when the 4090's crash. That is my primary concern. Literally. The 3090Ti can do 99% of what a 4090 can, and my original upgrade plan was to upgrade GFX cards every 2 generations, but at this point I need to also consider that the 5090 will require a complete hardware upgrade, where the 4090 can work with the i9-9900k setup i have currently. Just saying. It was really weird to me that every review site just pretends the 3090Ti does not exist.
That's why I will either stick with 7900XT and its 20GB of VRAM or I'll trade it in for a RTX 5090 if the price is not gonna be ridiculous. Whoever wants to have a solid gaming experience in 2025 with new games should not even consider card under 16GB of VRAM. That's just how it is...unfortunately.
the latency improvement from switching to a monolithic die should make up for any shortcomings from lack of CUs. they are also claiming a big power efficiency increase... AMD is playing the game of the future. they know how certain states are gonna start trying to pass power usage laws(looking at you CA). by focusing on efficiency, they can gain market share by default if they are more efficient than nvidia
If a game requires more than 8GB of VRAM for 1080p, it's not entirely the GPU maker's fault. Devs are becoming lazy because of modern computer specs. I understand that the top-tier graphics cards need more VRAM. But it's not my fault that I have a cheaper graphics card.
If the specs are as leaked for the 8800 XT, there is no way AMD can charge more than $450 for this. If it is $450, then it will sell like hot cakes; even $479 will sell real fast. It's also looking like the 8600 is aiming for the $239-259 market, given that it has no memory uplift or extra compute units. The leaks are not looking bad; it all depends on the price. If AMD decides to go over $500 and $300, it's DOA.
Regarding AMD's GPU RAM, I'd rather see 16, 24 and 32GB options. Having an 8GB card is, frankly, a joke these days, as even much earlier cards from AMD have managed that. I also don't like the CU count and cache on a brand new card being lower than those of a card two generations earlier. Not good. I do wonder if dual RX 6800 XTs in a multi-GPU configuration would be a lot more cost-effective and give better performance... I can see this being the case, at least with games using the Vulkan API.
I think the 5060, 5080, and 8600 will pretty much be DOA thanks to a lack of VRAM. Even the 5070 may be a tough sell if AMD sets the price of the 8800 XT at $499. As for the performance of the 8800 XT, +4 CUs and a +25% clock boost over the 7800 XT have been leaked. This should translate to a big jump in performance over the previous gen, at least on paper.
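As a rough sanity check of that "on paper" claim, one can scale by CU count and clock. This is a naive model that ignores IPC, bandwidth, and architectural changes, and the figures below are the leaked ones, not confirmed specs:

```python
# Naive paper-spec estimate: scale performance by CU count and clock delta.
# Assumes performance ~ CUs x clock, ignoring IPC/bandwidth/architecture.
rx7800xt_cus, rx8800xt_cus = 60, 64   # 8800 XT CU count per the leak
clock_uplift = 1.25                   # leaked +25% clock boost

paper_gain = (rx8800xt_cus / rx7800xt_cus) * clock_uplift
print(f"paper-spec uplift vs 7800 XT: {(paper_gain - 1) * 100:.0f}%")
# -> roughly +33%, before any real-world scaling losses
```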
Check bit.ly/4gcZ2uS to get yourself a comfortable chair today AND use C730 to get $30 off on it.
I want to reply directly to you to ask what are your thoughts on AAA development nowadays?
I bring this up because to be completely honest GPUs “not having enough VRAM” is imo more the fault of AAA studios and developers and less the manufacturers.
I'm NOT saying that I agree with their ridiculous pricing for less-than-enough VRAM, but just looking at the graphical fidelity of games from previous years that needed maybe 4-8GB max to run their highest settings, it's hard to justify the performance drops that occur when you do push a game to the limit. Compare Cyberpunk 2077 with hyper-realism mods vs games like Monster Hunter Wilds or Star Wars Outlaws. I don't know all the technical ins and outs of it, but I can see what crazy well-made graphics look like, and you can run those mods pretty modestly on a 3050 8GB.
I'm sure the 8600 XT will have 16 gigs like the 7600 XT does; the x600 SKU they were speaking of was likely the non-XT version only.
Why do you lip sync your videos? It looks really weird and unnatural and distracts from what you are saying.
I bet they make a 5080 Super Evo Ti TIE to cover the gap. Should become a betting category in Vegas? 🤔
Who would buy a new card with only 8Gb of VRAM?
Under $200 I would consider it for a secondary system. But while I'm OK with my current 8GB 3060 Ti (I don't really play that many demanding games), I wouldn't take less than 10, and frankly, to go down from 12 it would have to be a good bargain on price/performance. For a main system, the more I can get the better, as I do have a few things I occasionally do that can use as much as I can throw at them.
Given it's a good enough price, and you can accept that you'll likely have to stick to medium settings, it could be worth it. But games definitely want more and more VRAM.
The 8GB 4060 is still the best-selling card this month (Europe), so it looks like the answer is the majority. Not everybody knows a lot about GPUs, new series release dates, chips, DLSS, FSR, RDNA, AFMF and its future, bottlenecks, etc.
kids ask their parents to get a gaming pc and 4060 is the best seller...
@@GamerMeld That's fine for a secondary system/backup. And I have two games I have to turn down significantly from max at 1440p, CP2077 and Control. And a few that I only put on high to keep above 60fps.
There is nothing wrong with 8 GB of VRAM, it depends on your expectations. I still play on a 4 GB card today and rarely fill them at 1080p.
The problem with Indiana Jones probably isn't that 8GB of VRAM, or even 12GB, becomes too little. The problem is that developers don't give a fuck about optimizing their games.
You do know that ray tracing is required for Indiana Jones, right? This is why you require more vram. That's also just the start of ray tracing being a requirement. Soon, every new game will require it. AMD better step up, or we are paying lots for cards able to play new games.
Yeah, that's why we are stuck with textures from 2016 in all new games: Nvidia sold too many 8GB GPUs. Btw, the game runs really well: no stutters, good lows, and it is optimized very well.
@@AtteroDominatus check my GRE video. Maximum settings at 1440p on an AMD RX 7900 GRE, the game runs like a dream even with RT. Even at 4K with max settings you get 60fps if you have 16GB of VRAM, and it looks stunning at native 4K.
What if Nvidia keeps making 8GB cards so that developers will optimize their games? If they don't, people won't play.
@ This game has top-notch high-res textures, and that eats VRAM. It's simple; there is no optimizing that can fix that. The only thing you can do is lower the quality to medium or even low, and you will still have textures like the ones you are seeing in games now.
Would not be surprised if NVIDIA and AMD work together with these AAA gaming studios to keep the requirements high to sell more GPUs. All we get is 30 fps, a blurry mess, and high input lag.
It's like game developers aren't even trying to optimize their games anymore
I think it's more the corporate greed pushing deadlines; there are so many games out now, they're all competing for the same consumers.
Money looks better on paper than a fully developed game.
We will keep buying 50-80% finished games as well.
Jensen and Lisa are family. I don't get fanboys, when both CEOs can go to family dinners and laugh at you.
@@hiteshbehera5229 they aren't
@@hiteshbehera5229 they use the same engine, what's there to optimize?
lookin' at my GTX 1080Ti with 11GB of Vram......
hang in there o gal 😢
Go ahead & tuck her in & close her eyes 💀
It still plays my old games great, but my CPU and RAM are 10 years old. Trying to work out which modern CPU released in the last 3 years will drive my 1080 Ti without too much bottleneck.
@@jodiepalmer2404 try going with AM4 and R7 5700X
Not overly expensive, provides fairly good performance
@@LisSolitudinous If you want a decent budget card, you might want to hold off for a little bit and take a look at the new Intel Battlemage cards; they're looking like they're going to be decent GPUs with good price-to-performance. It's completely up to you what you get, but there's another option.
I upgraded from the 1080Ti to a 7900XTX when I decided to switch from 1080p to 4k. My old 1080Ti still lives on in my wife's build and it has plenty of performance for the games she plays like Phasmophobia, Baldur's Gate 3, and SIMS
Lol, the RX 580s I bought brand new at $239 had 8GB of VRAM... 7 years ago. AMD was supposed to be our savior after Ryzen launched. Fkn shareholders.
Exactly!
Ironically, that seems to be intel now 😅
Man I still have a 4gb rx580… just old enough to understand how things work too and eagerly looking for a 4090/5090 so I could be set for hopefully a decade
12 gb 2060 here
@@xenird hey man! I want to get back to gaming; is 12GB of VRAM not enough for playing AAA games nowadays?
I'm not buying any future GPUs with less than 12 GB of vram.
The RTX 5000 series is already in production (and being shipped possibly), so whatever the specs are will also be the final specs on release. The only thing that can change last minute is the price.
"The first Crysis wasn't known for being all that fun" Wut? Damn I'm pretty sure most people thought it was a fun game. I remember putting my suit in speed mode, running in the jungle at the speed of a cheetah from a North Korean copter. I manage to hide in a shanty only for one missile to destroy the entire structure with pieces of sheet metal flying all around me. It was so cool I remember in that moment the one thing in my mind was "this game is TIGHT!". Switching between all the different suit modes added a good layer of strategy to the game too.
I still replay it (as well as 2 & 3) from time to time.. I've always enjoyed the games.
I really enjoyed the Crysis games, but I just know a lot of people looked at the first one more as a benchmark.
I didn't really care for the first game personally, but it was alright. It does surprise me that it still looks good today though.
Compared to current games
Crysis is way more fun
Crysis comes from the generation where devs were putting fun elements and not propagating their personal agenda through games
Yeah, I kinda liked this game, but realistically it was known for its graphics, not because it was a particularly stellar game in itself. At the time there were many more interesting and engaging games. It's kinda heartbreaking that, honestly, if you look at it now in the current market, it actually does sound like quite an original and interesting game... what a time to be alive...
AMD really does manage to always miss opportunities 🙄
Imagine the RX 8600 but with 12GB. It doesn't have to be 16GB but why not make it 12? 8GB of VRAM cost $27 so adding another 4 is like another $14 in production cost. So add $20 to the sales price and you're good with a MUCH better product. But no. I just don't get it. It feels as if they don't WANT to win. They manage to make good competition for NVIDIA at a fraction of a budget but then there's always some god awful design choice that must make Jensen go "😂For a minute I actually thought they'd be a problem😂".
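Taking the commenter's figures at face value, the arithmetic works out. Note the $27-per-8GB price is the comment's number, and the markup multiplier below is an assumption, not AMD's actual bill of materials:

```python
# Back-of-envelope BOM math using the figures from the comment above;
# the spot price and retail markup are assumptions, not AMD's real costs.
price_per_gb = 27 / 8            # ~$3.40/GB of GDDR6 at the quoted price
extra_vram_gb = 4                # going from 8 GB to 12 GB
bom_delta = extra_vram_gb * price_per_gb
retail_delta = bom_delta * 1.5   # assumed ~1.5x markup from BOM to shelf price

print(f"extra BOM cost: ${bom_delta:.2f}")           # ~$13.50
print(f"plausible price bump: ${retail_delta:.0f}")  # ~$20, matching the comment
```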
High production cost ?
especially after Trump banned quality components and material from China which are known to be cheaper.
Because outside of this one game it isn't needed. It's like a car with 60HP and 6 exhaust pipes.
You make certain design considerations, and this one was made that way because it makes sense. Chill out, it's just one badly optimized game. You need an RTX 4090 to play it at max and you think the GPU is at fault...
I bet they will have an 8GB and a 16GB version; that's why they do it like they did with the 7600 and 7600 XT. Because if you make the 7600 XT 12 gigs, then you are basically left with a 6-gig 7600, and that no one would buy.
@@sierraecho884 8GB is not future proof 🙄 Even in 1080p there will be more and more games that exceed an 8GB limit of VRAM.
If the 8600 XT will be a 1080p card, then it won't need 16GB, so why waste more money on a card that will not benefit from it.
Man it sucks seeing AMD not trying to push their GPUs to be a way better product. It's like they're not even trying anymore.
I mean they did say they would be taking a step back this time around and going back to the drawing board
tbf the 6800xt was like $650 so it still matches its price to performance with its 72 CUs
I don't fall for that. They beat the world with top-tier CPUs; they are probably doing the same with GPUs, but they are not gonna tell the world their tactics... They have the resources to do so, so why even tell your competitors? They have turned the tide almost completely already. 🤙
The only thing is, when they "conquer" everything, are they gonna deliver us anything good, or will they assume the same role as Intel, holding back as much of their potential as they can?
When almost nobody buys their GPUs
they probably ran out of technology and all the big brains need more alien tech to reverse-engineer. takes a while... last time they did it, we got plasma TVs!
16GB VRAM or no deal. They are trying to coerce people into early upgrade paths. Scummy!
Also, make the higher-level cards have more RAM. What sense does it make for a 5070 (and probably a 5070 Ti/Super too) and a 5080 to all have the same amount of RAM?
I hate to defend these companies, because they are greedy af, but the problem is actually customer FOMO, not the companies or their practices. They are doing it because people's FOMO gets them to still buy it... If customers would just not buy it on principle, the companies would start to listen. They aren't going to listen as long as people are spending money on their crap.
99% sure 8600 will have 8 gig while 8600xt will have 16
Problem is that they are forcing you to buy a 2k card for 1080p gaming. They want to sell you expensive product.
Not really. The higher-density VRAM chips are going into mass production a bit later than expected, so they are not available, and increasing bus width is expensive.
So the options are: delay 6+ months, or redesign the cards (about a 6-month delay) with wider buses and raise prices accordingly. Either way it's a mid-2025 launch. And what would they do about a mid-gen refresh if they delayed to make wider buses and got the denser modules at the same time? Sell even more expensive chips for even more money.
I wish they would sell cards with increased bus width, but the market spoke generations ago that no one was willing to pay more for a wider bus.
IF Nvidia delays, let's say to wait for mass production of 3GB VRAM chips, and then that gets delayed again (so far it's only Samsung, in that country that just declared then undeclared martial law, who say they are going into mass production next year, so expect delays), well, the 40 series is no longer being made, so there would be no GPU supply (from Nvidia, which IS the market) for 6+ months. Scalped 40-series cards for most of 2025 would be the only option in that case. So I would say it's better for everyone (except scalpers) to just push out another gen of cards without enough VRAM; it is business as usual, after all (see the bus-width sketch below this comment).
And yes, seeing what's happening, I did just buy a 7900 XT with 20GB of VRAM.
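The trade-off described above comes down to simple arithmetic: each GDDR chip occupies a 32-bit slice of the memory bus, so capacity is (bus width ÷ 32) × per-chip density, doubled in a clamshell layout. A quick sketch under those assumptions, with 3GB standing in for the denser GDDR7 modules the comment is waiting on:

```python
# Capacity options for a given memory bus: each GDDR chip uses a 32-bit slice,
# so capacity = (bus_width / 32) * chip_density; x2 if clamshell (both PCB sides).
def vram_options(bus_width_bits: int) -> dict:
    chips = bus_width_bits // 32
    return {f"{d}GB chips": chips * d for d in (1, 2, 3)}  # 3GB = awaited modules

for bus in (128, 192, 256):
    print(f"{bus}-bit bus:", vram_options(bus))
# 128-bit: 2GB chips -> 8GB today; 3GB chips -> 12GB once in mass production.
# Which is exactly why the choices are "wait for 3GB chips" or "widen the bus".
```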
Looks like my 6950 will be with me for a few more years.
I think my 6800 should last till Intel's Druid comes out; hopefully by then their drivers will have improved, especially regarding older games... their GPUs seem to be more power efficient than their counterparts.
no reason to upgrade because there is nothing good to play.
im also running a 6950, its going to stay with me for a LONG time, im really liking it, went from a gtx 1080 to this.
Or upgrade to the run-out, end-of-gen 7900 XTX with 24GB of VRAM!! I mean, it's no 4090, and in RT it basically gets crushed by a 4070 Ti, but in raster it's at 4080 levels!!
But 24GB of VRAM is going to be solid for a long time still!!
For a sec I thought it was the HD 6950 😂
People: Shame on you Nvidia.
That same person: Well, here goes my 5070. Don't have money for higher. And the 5060 sucks. I don't want AMD. Guess I just have to turn on DLSS.
- Human with more copium than oxygen in his veins.
@@FireCestina1200 lmfao, no joke, I hear people crying telling AMD to compete, and when they do, at a cheaper price, no one buys them. 🙃🙃
@@roshawn1111 Yep. Most people want AMD to compete so they can buy Nvidia cheaper. It won't work this time around, fellas. Nvidia simply won't care because they have bigger customers to serve now (the AI crowd), and they have to. Because anyone who has tried building a business (small/big, no matter what size) would know that if you have a chance to get as much profit as you can, then you have to, because it won't last long. Nvidia knows this; that's why they're milking it as much as they can before the others catch up to them.
Sadly new games make you rely on upscaling whether you have an amd card or nvidia card, and dlss being quite a lot better means more people will still gladly pay the nvidia tax, myself included.
@@Crazzzycarrot-qo5wr Hey! Don't forget Intel. Intel GPUs are good too, just less popular.
FSR isn't that much worse than DLSS. The real issue is that most developers are too lazy to do a proper FSR implementation.
Actually (you can check the file version; see the sketch below this comment), there are 0. And I insist: 0 games with the latest FSR. None on Earth. As for those with older FSR: some use lower than 3.0 (which is kinda not great) and some use 3.0 (3.0 has some issues compared to 3.1.0, 3.1.1, or 3.1.2), and as you can see in my comment, the latest FSR is 3.1.2. Even if, let's say, a game featured it (which is not the case), all the other .dll files required for proper frame gen or greater shader lighting are missing or not updated.
I started seeking out the FSR file to update it (the real FSR). Most games on Steam don't have it. And some do, but lack the other one.
Funny enough, the only "game" with the latest XeSS I own is 3DMARK x)
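For anyone who wants to try the file-version check described above, here is a minimal sketch (Windows-only, standard-library ctypes). The DLL path below is a placeholder; games ship the FSR DLL under varying names, amd_fidelityfx_dx12.dll being one FSR 3.1-era example:

```python
# Minimal sketch of the "check the file version" tip above (Windows only,
# stdlib ctypes). Treat the path at the bottom as a placeholder example.
import ctypes

def dll_file_version(path: str) -> str:
    ver = ctypes.windll.version                      # wraps version.dll
    size = ver.GetFileVersionInfoSizeW(path, None)
    if not size:
        raise OSError(f"no version resource found in {path}")
    data = ctypes.create_string_buffer(size)
    if not ver.GetFileVersionInfoW(path, 0, size, data):
        raise ctypes.WinError()
    buf, length = ctypes.c_void_p(), ctypes.c_uint()
    if not ver.VerQueryValueW(data, "\\", ctypes.byref(buf), ctypes.byref(length)):
        raise ctypes.WinError()

    class FixedFileInfo(ctypes.Structure):
        # Leading fields of VS_FIXEDFILEINFO; we only read the version dwords.
        _fields_ = [("signature", ctypes.c_uint32), ("struc_version", ctypes.c_uint32),
                    ("file_ver_ms", ctypes.c_uint32), ("file_ver_ls", ctypes.c_uint32)]

    info = ctypes.cast(buf, ctypes.POINTER(FixedFileInfo)).contents
    ms, ls = info.file_ver_ms, info.file_ver_ls
    return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

print(dll_file_version(r"C:\Games\SomeGame\amd_fidelityfx_dx12.dll"))  # e.g. 3.1.2.0
```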
@@FireCestina1200 yes, DLSS is inherently better than FSR. Of course this is to be expected.
It has nothing to do with the game running the latest FSR version or not, since FSR 2.x looks better than 3.x from what I've seen and tested.
Intel GPUs are good value, but not for higher graphics settings and resolutions, for now at least.
Top performance for $500 is just a dead dream at this point
It's not too bad; 1440p is the sweet-spot resolution. Most people say the jump from 1080p to 1440p is really noticeable, while the jump from 1440p to 4K isn't as noticeable but requires a lot more graphics power. In that case, AMD's offerings at $400-$500 are mostly the RX x700 XT and RX x800 XT, which are great for 1440p and even entry-level 4K, with a decent amount of life in them as well. Look at the RX 6700 XT/7700 XT and RX 7800 XT: for raster performance, they beat even Nvidia's $500-600 offerings in price/performance and simple outright throughput. It's a hard truth to face, but people need to understand that Nvidia really is not making cards in favor of their loyal customer base.
@@sikandernaeem9661 Also, their RT isn't half bad. Sure, Nvidia cards do better, but it's a struggle for both at high res until the 7900/4080 tier.
that makes sense tbh
2 to 3 years ago the net was full of reviews showing how GPUs struggled with modern games on high settings due to low VRAM. We are now seeing the third gen of cards that has, if rumours are correct, done nothing to address this. Instead of providing the actual hardware required, they are all pushing upscaling. In itself there is no issue with upscaling, but not when it's used to skim even more profit out of the consumer for relatively lower performance and reduced hardware.
Hopefully the consumer is waking up to the scammy behaviour, though that was also talked about 2 to 3 years ago and people still back stupid-but-shiny.
The 8600 having only 8GB of RAM is disappointing. I'm definitely going to get the B580 knowing that.
For real. Double digit vram should be the minimum for any graphics card nowadays. The high end needs more too. The 1080ti was fucking loaded with vram back in the day and it STILL kinda kicks ass 8 years later tbh. I think Nvidia is making them like that intentionally so the card struggles within a few years and you have to buy something from their new lineup. Just my theory.
Actually quite disgusting. The flip side, to a degree: Nvidia sells you their 8GB crap, then boosts the VRAM in later models of the same rubbish, making you buy the same crap twice, just with better specs.
AMD doesn't. You buy their rubbish and it is what it is: rubbish, like the 8GB 7600/8600 (I don't think I've ever seen them sell a 16GB or upgraded-VRAM 7600 card or whatever) the way Nvidia does!!
Call it what you want (forcing you to buy better at the start) (BS marketing). But at least with AMD you pay once, with no upgrades!! It kinda puts the sale back on you: if you buy their crap at the start, well, that's on you. 16GB should be the lowest VRAM on a card in 2024.
Personally I would wait for the better Intel cards (I guess B770 would be their name). I've got the Arc A770 16GB paired with a 14600KF for my fun all-Intel build (7800X3D + 7900 XTX is my main), but for the most part that little A770 16GB plays most stuff pretty well with no issues.
@@jrodd13 I'm still using the 1080ti, that thing might last me a few more years at this point. I only started hitting VRAM issues a few years ago, so I might have to consider turning down the details on newer games now. But 1440p and 90fps is still pretty good for an old card.
Idk why the RX 8600 is claimed to be 8GB, because there was just a leak on AMD GPUs saying the RX 8800 XT and 8700 XT are both 16GB, with the 8800 XT having more cores and a 256-bit bus, while the RX 8600 XT should have a 192-bit bus and 12GB. Maybe we're going to get a regular RX 8600 non-XT with 8GB, but it really looks like there's going to be a 12GB version, not 8GB. Overall I'm a bit disappointed if the rumors are true, because I have an RX 7900 GRE and I love this GPU: it's powerful, has enough VRAM, and it was cheaper than the RTX 4070 Super I returned because it had nonstop issues. I hoped I could upgrade from my 5800X3D and RX 7900 GRE to the AM5 platform, to basically a 7800X3D, or go balls deep with a 9800X3D (because why not) and pair it with the 8800 XT, since the first claim was that it would have the raster performance of the RX 7900 XTX. But if this leak is true, the performance is going to be basically worse than my RX 7900 GRE OC version. Well, the RT is better, and if AMD manages to improve RT performance to the level Nvidia has, I'll be happy. Personally I don't use RT at all, even though my GPU can do it quite well; I just don't like playing at 1440p ultra settings with only like 60fps, nah, I want 100+ all the time with low frame times. A huge issue is games based on UE5, which use a shitload of RT baked into the game through the engine, and there this performance could be good. But we will see how it really goes.
ahem... 7600 XT, 6750 XT, 6700 XT, 7800 XT?
Leave the 7700 XT aside for a minute; even the 6700 XT destroying the 3080 is wild. More than double the performance.
A lot of people will be disappointed when buying a card with 8GB. Most people are likely illiterate when it comes to computer components, and Nvidia (primarily but not exclusively) preys on this. There is no future for 8GB anymore; Indiana Jones is just the start.
comment user slams media! for! lack! of! anything! useful!
tl;dr: need a ”disappoint” score in all reviews targeting the unsavvy.
you nailed the hammer right between the eyes with this comment: the perfect word in the right spot.
> disappointed
this is an *extremely* important word, especially in the Wide World With Web Words Where Wiews and Watches Withhold Weight Without Watching, When Wiggling Wasn't Without...
sorry, fuckit, can't do it.
this era of marketing-driven *everything*, where:
a mild disagreement over tipping culture in spain
+
media¹
=
”Socialist Jew Slams Black GOP Superstar
While Elon's Kardashian [advertisement]”
~~! everything below that fold hidden like the titanic ¡~~
reporting live from
Reddit or Twitter or X (read as 'found in a hole filled with water in the ocean ')
by
..¿xxxBACON-PUSSY-BENEFITS-BEEFTITSxxx¿..
Agree. I have been running out of VRAM on RE2. 8GB is not enough for this old game.
well, it was fucking obvious that those 8-12gb cards from nvidia aren't future proof wasn't it? it will only get worse from here.
Unless DLSS 4 helps 🤷🏻♂️
@@mikeramos91 I hate the concept of DLSS. It's a crutch.
@@CountlessPWNZ It's OK as long as it's on quality mode. Yes, it's good, but it still doesn't excuse the VRAM limit.
Xbox Series S: 6GB, PS5: 9GB, PS5 Pro: 10GB, Xbox Series X: 10GB of VRAM. 8GB is enough for this gen.
@@tomaspavka2014 those are consoles that will be completely obsolete the moment the next generation of games comes out.
Game devs: Let's optimize the game so gamers enjoy it. Nvidia: NOOOOO! I need to sell graphics cards!
Dude, my 2080 Ti has 11GB of VRAM, and the settings I run Warzone at to maximize FPS and FoV eat 9.5GB.
My 1080 Ti also has 11GB, it's crazy.
@@Jhintingbot 1080ti is one of the best and most future proof cards of all time. It's heavily outperformed these days obviously because it's not just vram amount that matters, but it's still serviceable for most people at 1080p.
@@allbies I can still play games at 1440p and 100fps + depending on which one. Even max settings 144+ fps on some. Usually removing useless settings helps
You didn't say crap about the 5090's problem.
The price is the problem; that is a kick-ass card!
@@lordfyita2096 It's because this guy is a giant clickbaiter for no reason. His content isn't bad, but he does the annoying clickbait titles as if he really needs to. I wish he would stop.
@@trsskater Thanks mate, have a good one.
Nvidia isn't going to change their mind. They do NOT care about gaming, at least until us gamers boycott them and they lose out on their money. People, we have to force their hand and boycott them. If you do, it's not like you won't have options. AMD will get you by until Nvidia comes to its senses.
Seems you all forget how Nvidia stated, not long ago, that they will prioritize AI over gaming......
Gamers boycotting Nvidia will do pretty much nothing to them. Gamers don't make a lot of money for Nvidia anyway.
As you say, they don't care about gaming anymore, and if it's not profitable anymore then they will stop making gaming cards. They don't care.
I think the AI bubble would have to burst, if it ever will; their market cap is inflated af. As long as they still get huge revenue from AI, they won't care about gamers.
Yeah buddy, this is exactly why I traded my 3080 10GB for a 7900 XTX. More than double the VRAM, and it's freaking amazing, an absolute beast.
This isn't a problem for NVIDIA, only the consumer. RTX 5090 is the true Blackwell, the rest NVIDIA nerfed purposefully on VRAM.
I'm glad I picked up a 7900 XTX for $750, new without the original box.
You think that's new? Oh, silly you. It probably has a few months of cryptomining behind it.
@roastinpeace2320 which doesn't impact performance at all.
@@CrazyAjvar Indeed, but it impacts the lifespan of your GPU. Still a good bargain for $750.
Didn't AMD say they were going back to competing against Nvidia only at the low and mid level GPUs and not competing at the high level? Leaving Nvidia to compete alone at the high level? I seem to remember hearing something about that recently.
It is hard to be disappointed in a product that doesn't exist. I don't spend my life fretting over speculation. After release I will form an opinion.
I didn't get the issue with the RTX 5000. What is the issue?
The 8GB of VRAM on the RTX 5060.
Back then everyone and their mom wanted a 3060 or 3070 with 8GB. I only managed to get a 6800 XT 16GB. They said don't worry about 8GB, it's all about the ray tracing. Times have changed alright!
Also, ray tracing needs VRAM, so 8GB cards could barely do it despite the hardware. Like, the 3070 could have been the best card if not for the memory...
I'll hold judgement until the official announcement and 3rd party benchmarks.
If I have an RTX 4070, would the 5070 be compatible? I'm curious if they will have the same connector and similar enough dimensions, wattage, and so on that it will just work.
6800 XT vs 7800 XT is basically the same card in real-life scenarios. For casual games, around 5 frames' difference.
Upgrading to something that provides less than a 25% framerate improvement is pointless for me, considering the prices.
**5000 series not out yet**
YouTubers: RTX 5000 Has A BIG Problem!
The chair DOES look nice. But it's 500+ USD. That is quite a lot. You can get a full recliner for less. Though I suppose a 10 year warranty IS pretty nice...
Right. Get a La-z-boy recliner, and mount your monitor to the floor. I used a couple of 6x6" steel plates welded to some rectangular tubing, then a standard pneumatic swivel monitor mount on top. Secure everything to the floor with some lag bolts. Keyboard goes in my lap, mouse on the arm rest. If I need to get up or want to watch TV I just swing it out the way up against the wall. Saves a ton of space.
For $300 you will get a very good office chair that is muuuutch better than this blinged-out $100 chair.
@hithere2561 I've gone through too many $300 chairs. My last was $600, and it's more comfortable and still going after several years.
Games companies and gpu companies are ruining pc gaming simultaneously
What do you mean Nvidia was proven wrong? They were proven right. They made you buy another GPU after just 2-3 years. And you'll keep buying them.
Moore's law got replaced by Moron's law: the consumers would buy Jensen's overpriced crap no matter how much they complain about the price.
I'd have gone with:
600 = 10GB
700 = 14GB
800 = 20GB
So it doesn't feel that bad not having a high-end GPU.
Personally it should be:
600 = 12GB
700 = 16GB
800 = 20GB
Even then, I'm of the belief that no card sold from 2023 onward should have less than 16GB when they are all going up in price!!
@@volvot6rdesignawd702 For the price of cards today, they really don't offer enough. I have a water-cooled 1080 Ti that cost me about $1200 to $1400. That same figure today, with inflation, should be about $1800; I'd be fine with $2000 for a top-of-the-line card. But here in Australia, it's now $3500 for a 4090.
@@volvot6rdesignawd702 My 7800 XT was $580 and has 16GB.
@@volvot6rdesignawd702 Tbh the 8800 XT could still have 16GB and it would be great. You only really need more than 12GB for 4K, so cards with 12GB are fine too.
12GB of VRAM should actually be the new baseline by this point in technology... 8GB cards are fine for mobile systems and the like, but not for desktops.
Why do so many people immediately blame GPU manufacturers for poor-performing GPUs when it's devs and studios that have the power to decide what their games need?
Game engines can almost always be better optimized, resources can be better managed, and projects can be delayed to make this happen, but nowadays we more often than not see rushed releases, poor CPU utilization, and graphics that pale in comparison to modded games from 2 years ago but still require double the VRAM.
It feels like nowadays AAA studios are designing their games to be played, at minimum, on the highest-end specs rather than by the largest audience. And when you look at the fact that some studios actually partner with GPU manufacturers to create these spec sheets, it just feels like an ugly game of leapfrog where the only way to play your highly anticipated games is to keep buying better hardware every year.
The thing is, most people aren't going to pay 50% more for a game so that it runs on 30% less VRAM. Also, VRAM amounts have stayed the same for like 8 years now while games only get more complex, so it is mostly the GPU manufacturers' fault. It makes a lot more sense for a GPU manufacturer to spend $15 on 4GB more VRAM than for everyone to pay $30 more per game, plus get fewer game releases, because of all the extra time spent optimizing.
@alextoscano But the thing is, games AREN'T getting more complex; graphics are virtually in the same place, if not a worse state, than they were in 2020-2022. We can see it in the way devs use upscaling and anti-aliasing techniques to hide poorly designed sets; the best example of this is literally any video that talks about how Unreal Engine 5 is affecting the industry.
Just look at frame generation and DLSS. It's absolutely CRIMINAL that a software band-aid such as framegen is appearing on spec sheets for games, especially when frame gen and upscaling absolutely destroy VRAM, and it all started because of TAA (temporal anti-aliasing).
The real issue is the fact that those at the top are pinching pennies at our expense. AAA studios don't HAVE to charge $120 for their games; they don't HAVE to add microtransactions or subscription services, just as much as GPU manufacturers COULD add more VRAM to their cards or COULD make affordable low-end GPUs but don't.
@ It really all comes down to this: we should be buying high-end hardware to push the graphics of our games to the limit, not the other way around.
If I was buying games just to push my GPU to its limits, I wouldn't buy games; I'd just let it run FurMark's 4K stress-testing preset 24/7.
@ And this is before you take into account the quality of games that have been coming out. When was the last time you saw a AAA game come out of a major studio that wasn't a buggy, incomplete mess on release?
@@J4rring Gotta agree with you. DLSS and framegen were meant to help lower-end GPUs compete at higher resolutions. Instead devs have turned them into band-aid fixes for poor optimization. Then they want to charge you $70/$100 for ultimate editions full of bugs, plus 60GB worth of patches on your internet bill to fix it.
AMD has again screwed us. At least Intel is still trying. Damn you NVidia! *fist shaking*
just wait for the GPU to be announced before making a judgement
Sorry, just gonna say it with Indy: those specs are devs being lazy. Telling me I need DLSS on for decent frames... this had better be referring to people who want over 60.
As it stands I think path tracing is garbage: yes, it is easier for devs to implement, but it makes RT even MORE of a drain than it currently is. Hearing about the difference between Cyberpunk's Ultra RT and path tracing modes, it sounds unnecessary and totally not worth it.
Also, this is worse than everything everyone says about Cyberpunk and how unoptimized it is. Everyone, we have a new winner here!
If Intel is smart they will bring out their B770 and B780 given this Indy info, since I have heard they are both 16GB.
@@NetflixForeign I can't play Cyberpunk without path tracing. It is a higher quality image than ray tracing. It's nice that the 4090 can handle it, since the 3090 was basically a slideshow with it turned on.
@@trsskater So you basically want to be married to DLSS for something that maybe improves the RT by like 5-10% at best, probably more like 5%.
@@NetflixForeign DLSS + Frame Gen for Cyberpunk at 4K to use path tracing, yes. It looks way better that way than without. But I also want to have all my knobs set to 11 at the cost of fps, as long as it stays high enough. Otherwise I'll wait until the next-gen top-end card and play it with that one.
Indiana Jones uses forced ray tracing… that's the problem.
Who the fuck wants to play an Indiana Jones game?
yeah, something weird going on here. the mind boggles.
Hanging on to my 6800xt for another cycle i guess.
I have an EVGA 3080 12GB model; can we see those specs, or the 3080 Ti with 12GB?
I'm disappointed, so I'm just going to get the 7900 XTX.
you must love stutter lag
if you get an okay price, I don't think you'll regret it
@@ibot-u3s explain?
@@Bastyyyyyy I was planning to get a 4090 a couple months back, but inflation... and I just started hating Nvidia for their stupidity.
Only if it's like 400/300 dollars after the 8800 XT release.
Indiana Jones:
You give me: NASA-level PC
I give you: Skyrim-with-20-realism-mods-level graphics
What is a NASA level PC? You mean antiquated like the SLS? 😂
@@Chocobollz No, the ones used in labs for material simulations or for genome analysis at Ames ;p
@@Sajgoniarz 😏
I play Indiana Jones on a 4060 without any problem. I don't know what people are complaining about.
I'm playing Indiana Jones and the Great Circle right now on a 3090 and I'm blown away by the performance with everything maxed out (except full RT, which I have off). I am getting a solid, consistent 70FPS at 4K HDR. So definitely no problems here, and it's not using anywhere near all my VRAM.
Installed the new Nvidia app and it isn't working at all, even after a reinstall and a couple of restarts. What's up, N?
Wow, if that 8600 is only going to have 8GB on it, they had better price it at $99 or it will never sell. I wouldn't buy anything with less than 16GB, but according to a recent Hardware Unboxed poll, there are some people who would pay up to $100 for one. Hopefully there are two variants of the 8600, an 8600 and an 8600 XT, with the plain 8600 being the one with only 8GB, while the 8600 XT gets the previously reported 12GB. In the prior generation, the 7600 has 8GB while the 7600 XT has 16GB. Even with 12GB, I'll not be considering the 8600 XT unless it's really cheap.
They're following the Nvidia philosophy, it looks like: pay more, get more; pay less, get less. Nvidia be like, "Hey AMD, let me show you a trick in business."
@@carlr2837 $99? There really are some people that want shit for free lmfao.
One says Intel Battlemage is already dead and another says the Druid GPU is already being built. I am confused.
Nvidia has shot itself in the foot with the 5000 series with its lack of VRAM and appallingly narrow buses. Nvidia's greed is getting the better of it.
They make so much off the AI chips now that I'm surprised they even make gaming GPUs at all.
Hey, I have an AMD Radeon RX 6750 XT and I'm having anxiety over its 12GB of VRAM. Will it be sufficient for 5 years if I run games at presets (mostly FPS games, and sometimes AAA open-world games, at only 1080p)? I want to make use of it for at least 5 years and then upgrade to a new GPU.
Sorry mate but "having to sign in to get driver updates" as you stated is and was totally false. Bad form on your part. Drivers were easily downloadable direct from site.
If you were using the app, you had to sign in.
I mean, I kinda figured that's what he meant...
so don't use the app. I have never used Geforce Experience. I actually rip it out of the drivers before installing them. Like the guy says, just download them from the site.
3:30 when did Intel release an i7 13900K??? Must be a mistake!
I've just built a new system for myself and I've put a second-hand 7800 XT in it as a placeholder for something new next year. At this point I'm so happy with it, I might just keep it for longer. There's no way I'm downgrading memory size, and the Nvidia options with 16GB are likely going to be hideously expensive. I guess if the 8800 XT is exceptional I might be swayed. The Intel cards are definitely looking better, but I'm not prepared to risk the poor compatibility.
I wonder what happened to Infinity Cache??? They hyped it so much with the 6800 XT: 128MB. Now it's 64MB and hasn't changed since AMD downgraded it with RDNA 3.
Look, I LOVE AMD. But they're simply not cheap anymore. 40 quid isn't enough to make me buy the weaker card. May the best card win.
Go Intel. Wait for the specs of the B770 cards; my guess is we will see 16GB to 20GB (maybe 20GB). But forget cheap AMD or Nvidia cards: Nvidia has the mindshare and market share (because people are either stupid or like getting rolled), and AMD will follow suit by just undercutting Nvidia's pricing with better VRAM (obviously not on their crap 8GB junk)!!
Yeah, really disappointing that they would rather screw us customers because they get better margins, and Nvidia is extremely anti-competitive, especially with VRAM. Intel is pretty much our only hope for reasonably priced GPUs.
Listen, I have only had Nvidia GPUs: a GT 720, later a GTX 1050 Ti, after that an RTX 2070 Super, later an RTX 3070 OC, and my last GPU from Nvidia was an RTX 4070 Super, which I had to return because it was just not good and, basically, a badly priced GPU overall, especially in Europe. I bought it for 729.99 EUR, a crazy price for what is basically an entry-level mid-range GPU. The card can run games, but in most new titles I ran into problems multiple times with VRAM allocation and not having enough VRAM at all, plus stupid stutters and bad, bad, bad drivers. I'm sorry, like, I know AMD has good driver support now (in the past it wasn't that great), but I expected better results from Nvidia as the GPU leader. It was a damn shame: even my RTX 3070 OC didn't have as many driver issues as my RTX 4070 Super. Most of the time the drivers don't break the game itself but apps like OBS or other recording programs, and frame gen used to stop working in many games after updating the driver. A damn shit show. The only things I loved about the 40 series were DLAA and DLSS; those were beautiful.
But I chose not to wait, and maybe that was wrong, because I picked an RX 7900 GRE for 630 EUR. Still expensive, but 100 EUR less than the RTX 4070 Super, and that's a big price gap in my book. What I found was nice drivers, plus the Adrenalin software, damn nice stuff and very easy to learn, 16GB of VRAM, and around 10 to 15% better performance at 1440p or 4K. In games that use more VRAM this makes a big lead; in online games the RTX 4070 Super and RX 7900 GRE were on par, though the RX 7900 GRE always delivered better 1% lows, maybe because of that bigger VRAM. Overall I don't regret going with team red, and I don't care that they don't push out as many drivers as Nvidia does; the point is that those drivers work, and that's the main thing for me. Also, I'm done with Nvidia: prices go to the moon, the new gen isn't going to be any different, and for like 10% higher performance I don't care at all, not worth it. The only thing I really miss is DLAA and DLSS, because while FSR 3.2 is really good (with this version AMD shrank the gap between FSR and DLSS quite a bit), quality mode or lower is still better on DLSS for sure, without even thinking.
I just wish, I really wish, that AMD releases that new FSR4 for those of us with the RX 7000 series; I hope it's going to work with our GPUs as well. As for this new AMD 8000 series, in short, it isn't going to be a big improvement in performance, but something like a beta test for new and better RT performance, in my opinion. So who are those GPUs for? Basically people with RX 5000 and 6000 series cards; for them it's very worth it to upgrade to something like an 8700 XT or RX 8800 XT, I think. For me, I don't know. If AMD finally gives us FSR4 with at least 90% of the visuals of DLSS, then I will pay all my money for any AMD GPU, because the only thing holding AMD back is a good upscaler. Many people don't even use RT, but they do use upscalers, and most AMD users use Intel XeSS rather than AMD FSR, because most of the time devs implement an old FSR version or implement FSR wrong, like in Cyberpunk 2077, the worst FSR implementation I have ever seen. Btw sorry for my English, writing on the phone at 4am 😂.
The raw number of CUs isn't the end-all of performance. The 7800 XT has equal or better performance than the 6800 XT with 60 CUs vs 72. People fixate too much on those numbers when to the end user they essentially mean nothing - what will matter is how well it performs and what price they set.
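For what it's worth, you can see that mismatch with a rough back-of-the-envelope sketch (boost clocks approximate, and paper TFLOPs being exactly the kind of spec-sheet number that misleads): the 7800 XT's 60 CUs post a far bigger theoretical figure than the 6800 XT's 72, yet the two land about even in real games.

```python
# Rough peak-FP32 sketch: shaders x FLOPs-per-clock x boost clock.
# RDNA 3 can dual-issue FP32, doubling the paper figure, which is
# precisely why spec-sheet numbers don't predict game performance.

def peak_tflops(cus, flops_per_shader_clock, boost_ghz, shaders_per_cu=64):
    return cus * shaders_per_cu * flops_per_shader_clock * boost_ghz / 1000

rx6800xt = peak_tflops(72, 2, 2.25)  # RDNA 2: FMA = 2 FLOPs per clock
rx7800xt = peak_tflops(60, 4, 2.43)  # RDNA 3: dual-issue FMA = 4 per clock

print(f"RX 6800 XT: {rx6800xt:.1f} TFLOPs")  # ~20.7
print(f"RX 7800 XT: {rx7800xt:.1f} TFLOPs")  # ~37.3, yet similar real-world fps
```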
“Can it run Indy?” Doesn’t have the same ring to it
You're not just a "pretty big guy". There are some other things to say that could be worked on. But first, the future health complications should be considered, and they are real. Take it from another "pretty big guy" who is 6 foot 4, 270 lbs. It's not easy to find clothes that fit and look good at the same time, especially normal freaking jeans. Or something.
Brother, 740 dollars for a fucken chair is insane
64 CUs is the leak we always had for Navi 48, and the performance was always said to be ~4080-level, probably between the 7900 XT and 4080, so I wouldn't be worried.
Playing both sides..... Sure, you only have to move a few settings down to fix it. Yeah, and that is what we do, what we've always done. Sure, I want more VRAM, but if I move a setting down and it fixes things, we all know the picture quality takes a very, very small hit most of the time.
I still remember back in 2020 how ludicrous it was that the RTX 3080 would only get 10GB of VRAM! It just didn't add up, whether you looked at GPUs casually or got into the details!
I'm not buying a new GPU till the VRAM on the 70-class cards hits 32 gigs.
Intel needs an 18-month cycle for a couple of generations while the others do 24 months. They need to catch up properly.
Given how fast new and, I guess, improved components arrive almost daily, why would a PC Gamer buy anything today knowing it will be obsolete in a few months if that long?
I hope Nvidia does not change their graphics specs last minute. More VRAM is part of what is making the prices shoot up.
3 things I want to see:
- The lowest amount of VRAM is 10GB/12GB, and that's on the 8500.
- The 8800 XT and higher have at least 96/128MB of Infinity Cache, not 64MB.
- A way to update the resource files (the .dlls) to the latest tech, with a program to flash current gen (7000) and newer.
And AMD not giving in to your demands is so strange when NVIDIA is a lot easier to mod and easier to update the .dll for... and best of all, it's free for the user to do it. With AMD your freedom is limited: only devs are allowed to use the FSR source code, not modders and users. Intel, though, made a .dll that changed AMD's tune, forcing last-minute changes with its FSR 3.1 code. Can't wait to see more Intel XeSS 2 .dlls now... more plugins should be for the users, not just for devs, especially since Linux userspace loves working with .dlls in vkd3d-proton, and now they've got a LatencyFlex alternative: Intel XeLL.
@JuanPerez-jg1qk Real. The biggest problem is that many games have such poor FSR support. Updating the .dll (I tried) sometimes makes the game unable to launch, or launch and then crash instantly. Thankfully, most games work flawlessly with the update, but with manual tuning there's no frame-gen from FSR 3.1.2.
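The swap itself is simple; the gotcha is exactly that crash-on-launch, so keep a backup of the original file. A minimal sketch of the manual process described above (the paths and the DLL file name here are hypothetical examples; the actual name varies per game and FSR version):

```python
# Back up the game's bundled upscaler DLL before overwriting it, so a
# crash-on-launch can be rolled back. All paths below are examples only.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")        # hypothetical install path
DLL_NAME = "amd_fidelityfx_dx12.dll"         # name varies per game/FSR version
NEW_DLL  = Path(r"C:\Downloads\fsr-update") / DLL_NAME

def swap_upscaler_dll(game_dir: Path, dll_name: str, new_dll: Path) -> None:
    target = game_dir / dll_name
    backup = target.with_name(target.name + ".bak")
    if not backup.exists():                  # preserve the original, once
        shutil.copy2(target, backup)
    shutil.copy2(new_dll, target)            # drop in the newer DLL
    print(f"Replaced {target} (backup at {backup})")

swap_upscaler_dll(GAME_DIR, DLL_NAME, NEW_DLL)
# If the game then refuses to launch, copy the .bak back over the swap.
```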
Amazing video, thank you Gamer Meld.
Why is that a problem for Nvidia? Amd said they stopped aiming at being faster than Nvidia…
No offense to your chair, but I prefer actual office chairs, the old-style high backs. They are very comfortable and normally made out of material that won't crack, fall apart, or wear out easily like most chairs these days. They are also stupid expensive, but I've found I spend less over the years buying quality.
At this point I believe there may be an understanding whereby game companies release poorly optimized games with intent, so gamers continue to drop thousands of dollars on GPUs.
My bet... the 8600 specs are for the non-XT. Also, considering that the 7800 XT is on par with the 6800 XT in performance with 12 fewer CUs and half the Infinity Cache, I'd say this RX 8800 GPU is likely going to be faster than the on-paper specs suggest. Time, as always, will tell. 😉
Impossible that you'll get 4080 raster with the 8800 XT.
I would say around 7900 GRE raster, but with much improved ray tracing.
Serious question: why does literally EVERYONE leave the 3090 and 3090 Ti off when posting comparisons?
Where does my 3090Ti card fit amongst these bars?
The 3090 is a mid-range card (with loads of VRAM) these days (destroyed in performance by the £604 7900 XT), but because of its high release price and no availability in stores anymore (at discount), not many people own it. It is not really relevant anymore because of this.
It's only a SUPERIOR deal for 3D-modeling people like me. I'm planning to buy one used, but not an ex-mining card. @@norbert4787
@@norbert4787 Maybe, but it is still the best non-4090 card, and it stomps the 4060, 70, and 80s.
It was at one point THE Nvidia card to have. I bet there are a ton of people wondering if it is worth the dimes for a 5090, or whether to buy a step up when 4090 prices crash.
That is my primary concern. Literally.
The 3090 Ti can do 99% of what a 4090 can, and my original plan was to upgrade GFX cards every 2 generations, but at this point I also need to consider that the 5090 will require a complete hardware upgrade, whereas the 4090 can work with the i9-9900K setup I have currently.
Just saying.
It was really weird to me that every review site just pretends the 3090Ti does not exist.
The most popular GPU on Steam is still the RTX 3060... why even make a game that most people won't be able to run?
16GB (still) for the 5080 is just ridiculous, and spare me the crap about bandwidth etc. It's shit.
And AMD is following along in this shitshow. Wtf.
RTX performance is not necessary. This is the wrong way to push the RTX case.
I saw a deal on a used EVGA 3X RTX 3060 Ti for $290. Should I buy it right now? I'm confused, please help 🥺👀
no
I don't mind the 5070's 12GB of VRAM on a 192-bit bus with GDDR7. I do mind it, though, when it costs more than $499.
Why are people expecting to play these new demanding games with low budget hardware?
Modern day optimization: buy a better hardware.
I still think everything beyond 6GB VRAM is useless because nearly all modern games only use 4GB VRAM or less
The vast majority of PC gamers use mid-range GPUs. Why make a game where most of your customers can't run it???
The vast majority of PCs use LOW-end GPUs.
@tomaspavka2014 OK, but I said gamers.
That's why I will either stick with 7900XT and its 20GB of VRAM or I'll trade it in for a RTX 5090 if the price is not gonna be ridiculous. Whoever wants to have a solid gaming experience in 2025 with new games should not even consider card under 16GB of VRAM. That's just how it is...unfortunately.
Let me guess: the 12VHPWR connector?
The latency improvement from switching to a monolithic die should make up for any shortcomings from the lack of CUs. They are also claiming a big power-efficiency increase... AMD is playing the game of the future. They know certain states are going to start trying to pass power-usage laws (looking at you, CA). By focusing on efficiency, they can gain market share by default if they are more efficient than Nvidia.
5070 with a 192bit bus is criminal 💀.
If a game requires more than 8GB of VRAM for 1080p, it's not entirely the GPU maker's fault. Devs are becoming lazy because of modern computer specs. I understand that the top-tier graphics cards need more VRAM, but it is not my fault that I have a cheaper graphics card.
The 5080 is coming with 24GB of VRAM after the initial launch at 16GB.
I highly doubt they will launch with only 8GB of VRAM. It would be illogical if they care about sales.
If the specs are as leaked for the 8800 XT, there is no way AMD can charge more than $450 for it. If it is $450, then it will sell like hot cakes; even $479 will sell real fast. It's also looking like the 8600 is aiming for the $239-259 market, given that it has no memory uplift or extra compute units. The leaks are not looking bad; it all depends on the price. If AMD decides to go over $500 and $300 respectively, they're DOA.
Regarding AMD's GPU RAM, I'd rather see 16, 24, and 32GB options. An 8GB card is, frankly, a joke these days, as even much earlier cards from AMD have managed that. I also don't like the CU count and cache on a brand-new card being lower than those of a card two generations earlier. Not good.
I do wonder if dual RX 6800 XTs in a multi-GPU configuration would be a lot more cost-effective and give better performance... I can see this being the case, at least with games using the Vulkan API.
Instead of games being better optimised, they just rely on raw performance.
It has a big problem.... it's too expensive. Prepare your kidney for the cutting....
That chair doesn't make my coffee.
I think the 5060, 5080, and 8600 will pretty much be DOA thanks to a lack of VRAM. Even the 5070 may be a tough sell if AMD sets the price of the 8800 XT at $499. As for the performance of the 8800 XT, +4 CUs and a +25% clock boost over the 7800 XT have been leaked. This should translate to a massive jump in performance over the previous gen, at least on paper.
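Taking the leak at face value, the on-paper math is straightforward. A quick sanity check, assuming (unrealistically) perfect linear scaling with CU count and clock:

```python
# Hypothetical paper-spec uplift from the leaked figures above.
cu_gain    = 64 / 60   # 7800 XT's 60 CUs -> leaked 64 CUs
clock_gain = 1.25      # leaked +25% clock boost
print(f"Theoretical uplift: {cu_gain * clock_gain:.2f}x")  # ~1.33x on paper
```

A ~1.33x paper uplift really would be a big generational jump, which is exactly why the "at least on paper" caveat matters.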
wait till the cards come out