RTX 5000 Has A BIG Problem!

  • Published: Jan 9, 2025

Comments • 881

  • @GamerMeld
    @GamerMeld  Месяц назад +14

    Check bit.ly/4gcZ2uS to get yourself a comfortable chair today AND use C730 to get $30 off on it.

    • @J4rring
      @J4rring Месяц назад +1

      I want to reply directly to you to ask: what are your thoughts on AAA development nowadays?
      I bring this up because, to be completely honest, GPUs “not having enough VRAM” is IMO more the fault of AAA studios and developers and less the manufacturers.
      I'm NOT saying I agree with their ridiculous pricing for less-than-enough VRAM, but just looking at the graphical fidelity of games from previous years that needed maybe 4-8 GB max to run their highest settings, it's hard to justify the performance drops that occur when you do push a game to the limit. Compare Cyberpunk 2077 with hyper-realism mods vs games like Monster Hunter Wilds or Star Wars Outlaws. I don't know all the technical ins and outs of it, but I can see what crazy well-made graphics look like, and you can run those mods pretty modestly on a 3050 8GB.

    • @Prestiged_peck
      @Prestiged_peck Месяц назад

      I'm sure the 8600 XT will have 16 gigs like the 7600 XT does; the x600 SKU they were speaking of was likely the non-XT version only.

    • @quigonlynn1
      @quigonlynn1 Месяц назад

      Why do you lip sync your videos? It looks really weird and unnatural and distracts from what you are saying.

    • @dakoderii4221
      @dakoderii4221 Месяц назад

      I bet they make a 5080 Super Evo Ti TIE to cover the gap. Should become a betting category in Vegas? 🤔

  • @watchthisusa
    @watchthisusa Месяц назад +422

    Who would buy a new card with only 8Gb of VRAM?

    • @kaseyboles30
      @kaseyboles30 Месяц назад +37

      Under $200 I would consider it for a secondary system. But while I'm OK with my current 8GB 3060 Ti (I don't really play that many demanding games), I wouldn't take less than 10, and frankly, to go down from 12 it would have to be a good bargain on price/performance. For a main system, the more I can get the better, as I do have a few things I occasionally do that can use as much as I can throw at them.

    • @GamerMeld
      @GamerMeld  Месяц назад +54

      Given it's a good enough price, and you can accept that you'll likely have to stick to medium settings, it could be worth it. But games definitely want more and more VRAM.

    • @deplasmann
      @deplasmann Месяц назад +21

      The 8GB 4060 is still the best-selling card this month (Europe), so it looks like the answer is the majority. Not everybody knows a lot about GPUs, new series release dates, chips, DLSS, FSR, RDNA, AFMF and their future, bottlenecks, etc.
      Kids ask their parents to get a gaming PC and the 4060 is the best seller...

    • @kaseyboles30
      @kaseyboles30 Месяц назад +1

      @@GamerMeld That's fine for a secondary system/backup. And I have two games I have to turn down significantly from max at 1440p, CP2077 and Control. And a few that I only put on high to keep above 60fps.

    • @Ad_Dystopia
      @Ad_Dystopia Месяц назад +16

      There is nothing wrong with 8 GB of VRAM, it depends on your expectations. I still play on a 4 GB card today and rarely fill them at 1080p.

  • @Samael746
    @Samael746 Месяц назад +259

    The problem with Indiana Jones probably isn't that 8GB of VRAM, or even 12GB, is becoming too little. The problem is that developers don't give a fuck about optimizing their games.

    • @AtteroDominatus
      @AtteroDominatus Месяц назад +25

      You do know that ray tracing is required for Indiana Jones, right? This is why you require more vram. That's also just the start of ray tracing being a requirement. Soon, every new game will require it. AMD better step up, or we are paying lots for cards able to play new games.

    • @roki977
      @roki977 Месяц назад +20

      Yeah, that's why we are stuck with textures from 2016 in all new games; Nvidia sold too many 8GB GPUs. Btw the game runs really well: no stutters, good lows, and it is optimized very well.

    • @roki977
      @roki977 Месяц назад

      @@AtteroDominatus check my GRE video. Max settings at 1440p on an AMD RX 7900 GRE the game runs like a dream, even with RT. Even at 4K with max settings you get 60fps if you have 16GB of VRAM, and it looks stunning at native 4K.

    • @FDHAND
      @FDHAND Месяц назад +1

      What if Nvidia keeps making 8GB cards so that developers will optimize their games? If they don't, people won't play.

    • @roki977
      @roki977 Месяц назад +4

      @ This game has top-notch, high-res textures and that eats VRAM. It's simple, there is no optimizing that can fix that. The only thing you can do is lower the quality to medium or even low, and then you'll still have the textures you're seeing in games now.

  • @fakkel321
    @fakkel321 Месяц назад +61

    Wouldn't be surprised if NVIDIA and AMD work together with these AAA gaming studios to keep the requirements high to sell more GPUs. All we get is 30 fps, a blurry mess, and high input lag.

    • @hiteshbehera5229
      @hiteshbehera5229 Месяц назад +15

      It's like game developers aren't even trying to optimize their games anymore

    • @TheShaunydog
      @TheShaunydog Месяц назад +2

      I think it's more the corporate greed pushing deadlines; there are so many games out now, they're all competing for the same consumers.
      Money looks better on paper than a fully developed game.
      We will keep buying 50-80% finished games as well.

    • @dingickso4098
      @dingickso4098 Месяц назад +3

      Jensen and Lisa are family. I don't get fanboys when both CEO can go to family dinners and laugh at you.

    • @notaras1985
      @notaras1985 Месяц назад +1

      ​@@hiteshbehera5229they aren't

    • @henson2k
      @henson2k Месяц назад

      @@hiteshbehera5229 they use the same engine, what's there to optimize?

  • @blakedmc1989RaveHD
    @blakedmc1989RaveHD Месяц назад +213

    lookin' at my GTX 1080Ti with 11GB of Vram......
    hang in there o gal 😢

    • @504Trey
      @504Trey Месяц назад

      Go ahead & tuck her in & close her eyes 💀

    • @jodiepalmer2404
      @jodiepalmer2404 Месяц назад +6

      It still plays my old games great, but my CPU and RAM are 10 years old. Trying to work out which modern CPU released in the last 3 years will pair with my 1080 Ti without too much bottleneck.

    • @LisSolitudinous
      @LisSolitudinous Месяц назад +1

      @@jodiepalmer2404 try going with AM4 and R7 5700X
      Not overly expensive, provides fairly good performance

    • @danieljones112
      @danieljones112 Месяц назад +1

      @@LisSolitudinous if you want a decent budget card you might want to hold off for a little bit and possibly take a look at the new Intel Battlemage cards; they're looking like they're going to be decent GPUs with good price-to-performance. It's completely up to you what you get, but there's another option.

    • @MohawkNinja636
      @MohawkNinja636 Месяц назад +1

      I upgraded from the 1080Ti to a 7900XTX when I decided to switch from 1080p to 4k. My old 1080Ti still lives on in my wife's build and it has plenty of performance for the games she plays like Phasmophobia, Baldur's Gate 3, and SIMS

  • @daveyboytellem
    @daveyboytellem Месяц назад +106

    Lol, my RX 580s that I bought brand new at $239 had 8GB of VRAM... 7 years ago. AMD was supposed to be our savior after Ryzen launched. Fkn shareholders.

    • @GamerMeld
      @GamerMeld  Месяц назад +12

      Exactly!

    • @d1r3wolf8
      @d1r3wolf8 Месяц назад +8

      Ironically, that seems to be intel now 😅

    • @TheBonneter
      @TheBonneter Месяц назад

      Man I still have a 4gb rx580… just old enough to understand how things work too and eagerly looking for a 4090/5090 so I could be set for hopefully a decade

    • @xenird
      @xenird Месяц назад +1

      12 gb 2060 here

    • @regieregie748
      @regieregie748 Месяц назад

      @@xenird hey man! I want to get back to gaming and play AAA games, is 12GB of VRAM not enough nowadays?

  • @Ya39oub_G
    @Ya39oub_G Месяц назад +13

    I'm not buying any future GPUs with less than 12 GB of vram.

  • @BeckOfficial
    @BeckOfficial Месяц назад +12

    The RTX 5000 series is already in production (and being shipped possibly), so whatever the specs are will also be the final specs on release. The only thing that can change last minute is the price.

  • @HiddenAdept
    @HiddenAdept Месяц назад +38

    "The first Crysis wasn't known for being all that fun" Wut? Damn I'm pretty sure most people thought it was a fun game. I remember putting my suit in speed mode, running in the jungle at the speed of a cheetah from a North Korean copter. I manage to hide in a shanty only for one missile to destroy the entire structure with pieces of sheet metal flying all around me. It was so cool I remember in that moment the one thing in my mind was "this game is TIGHT!". Switching between all the different suit modes added a good layer of strategy to the game too.

    • @craigd9305
      @craigd9305 Месяц назад +4

      I still replay it (as well as 2 & 3) from time to time.. I've always enjoyed the games.

    • @GamerMeld
      @GamerMeld  Месяц назад +13

      I really enjoyed the Crysis games, but I just know a lot of people looked at the first one more as a benchmark.

    • @UltimateGattai
      @UltimateGattai Месяц назад +2

      I didn't really care for the first game personally, but it was alright. It does surprise me that it still looks good today though.

    • @hiteshbehera5229
      @hiteshbehera5229 Месяц назад +6

      Compared to current games
      Crysis is way more fun
      Crysis comes from the generation where devs were putting fun elements and not propagating their personal agenda through games

    • @Lapsio
      @Lapsio Месяц назад +1

      yeah I kinda liked this game but realistically it was known for its graphics. Not because it was particularly stellar game as is. At the time there were many more interesting and engaging games. It's kinda heartbreaking that honestly if you look at it now in current market it actually does sound like quite original and interesting game... what a time to be alive...

  • @kaystephan2610
    @kaystephan2610 Месяц назад +45

    AMD really does manage to always miss opportunities 🙄
    Imagine the RX 8600 but with 12GB. It doesn't have to be 16GB, but why not make it 12? 8GB of VRAM costs $27, so adding another 4 is like another $14 in production cost. So add $20 to the sales price and you're good, with a MUCH better product. But no. I just don't get it. It feels as if they don't WANT to win. They manage to make good competition for NVIDIA on a fraction of the budget, but then there's always some god-awful design choice that must make Jensen go "😂For a minute I actually thought they'd be a problem😂".

    • @arik2216
      @arik2216 Месяц назад

      High production cost ?
      especially after Trump banned quality components and material from China which are known to be cheaper.

    • @sierraecho884
      @sierraecho884 Месяц назад +4

      Because outside of this one game it isn't needed. It's like a car with 60HP and 6 exhaust pipes.
      You make certain design considerations, and this one was made that way because it makes sense. Chill out, it's just one badly optimized game. You need an RTX 4090 to play it at max and you think the GPU is at fault...

    • @Integroabysal
      @Integroabysal Месяц назад

      I bet they will have an 8GB and a 16GB version; that's why they do it like they did with the 7600 and 7600 XT, because if you make the 7600 XT 12 gig then you are basically left with a 7600 6 gig, and that no one would buy.

    • @kaystephan2610
      @kaystephan2610 Месяц назад +5

      @@sierraecho884 8GB is not future proof 🙄 Even in 1080p there will be more and more games that exceed an 8GB limit of VRAM.

    • @kraya6740
      @kraya6740 Месяц назад

      If the 8600 XT is going to be a 1080p card then it won't need 16GB, so why waste more money on a card that will not benefit from it.
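
As a sanity check on the cost math in kaystephan2610's comment above, here is a minimal worked example. It takes the comment's $27-per-8GB figure at face value (real memory spot prices vary), so treat it as an illustration of the reasoning, not a verified bill-of-materials number.

```python
# Rough check of the comment's estimate: if 8GB of VRAM costs ~$27,
# what do 4 more GB add to the bill of materials?
price_8gb = 27.00              # assumed cost of 8GB of VRAM, taken from the comment
price_per_gb = price_8gb / 8   # ~$3.38 per GB
extra_4gb = 4 * price_per_gb   # ~$13.50 for the additional 4GB
print(f"~${extra_4gb:.2f} extra in memory cost for +4GB")
```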

  • @ulisesurquia3197
    @ulisesurquia3197 Месяц назад +94

    Man it sucks seeing AMD not trying to push their GPUs to be a way better product. It's like they're not even trying anymore.

    • @Ziyoblader
      @Ziyoblader Месяц назад +8

      I mean they did say they would be taking a step back this time around and going back to the drawing board

    • @wintersoldier0
      @wintersoldier0 Месяц назад +4

      tbf the 6800xt was like $650 so it still matches its price to performance with its 72 CUs

    • @G4BR0_TV
      @G4BR0_TV Месяц назад +5

      I don't fall for that. They beat the world with top-tier CPUs; they are probably doing the same with GPUs, but they are not gonna tell the world their tactics... They have the resources to do so, so why even tell your competitors? They have turned the tide almost completely already. 🤙
      The only thing is, when they "conquer" everything, are they gonna deliver us anything good, or will they assume the same role as Intel, holding back as much of their potential as they can?

    • @dingickso4098
      @dingickso4098 Месяц назад +2

      When almost nobody buys their GPUs

    • @rynz_2893
      @rynz_2893 Месяц назад +1

      they probably ran out of technology and all the big brains need more alien tech to reverse-engineer. takes a while... last time they did it, we got plasma TVs!

  • @Void-uj7jd
    @Void-uj7jd Месяц назад +57

    16GB VRAM or no deal. They are trying to coerce people into early upgrade paths. Scummy!

    • @Sterno90
      @Sterno90 Месяц назад +2

      Also, make the higher-level cards have more RAM. What sense does it make for a 5070 (probably a 5070 Ti/Super also) and a 5080 to all have the same amount of RAM?

    • @matilija
      @matilija Месяц назад

      I hate to defend these companies, because they are greedy af, but the problem is actually customer FOMO, not the companies or their practices. They are doing it because people's FOMO gets them to buy it anyway... If customers would just not buy it on principle, the companies would start to listen; they aren't going to listen as long as people are spending money on their crap.

    • @Integroabysal
      @Integroabysal Месяц назад +3

      99% sure 8600 will have 8 gig while 8600xt will have 16

    • @sasagrcevic7048
      @sasagrcevic7048 Месяц назад +3

      Problem is that they are forcing you to buy a 2k card for 1080p gaming. They want to sell you expensive product.

    • @notwhatitwasbefore
      @notwhatitwasbefore Месяц назад +1

      Not really. The higher-density VRAM chips are going into mass production a bit later than expected, so they are not available, and increasing bus width is expensive.
      So the options are: delay 6+ months, or redesign the cards (about a 6-month delay anyway) with wider buses and raise prices accordingly. Either way it's a mid-2025 launch, and what would they do about a mid-gen refresh if they delayed for wider buses and got the denser modules at the same time? Sell even more expensive chips for even more money.
      I wish they would sell cards with increased bus width, but the market spoke generations ago: no one was willing to pay more for a wider bus.
      If Nvidia delayed, say, to wait for mass production of 3GB VRAM chips, and then that got delayed again, since so far it's only Samsung (in that country that just declared then undeclared martial law) saying they are going into mass production next year, expect delays. The 40 series is no longer being made, so there would be no GPU supply (from Nvidia, which is the market) for 6+ months. Scalped 40-series cards for most of 2025 would be the only option in that case, so I would say it's better for everyone (except scalpers) to just push out another gen of cards without enough VRAM; it is business as usual after all.
      And yes, seeing what's happening, I did just buy a 7900 XT with 20GB of VRAM.

  • @attila1312
    @attila1312 Месяц назад +61

    Looks like my 6950 will be with me for a few more years.

    • @Epic3032
      @Epic3032 Месяц назад +4

      I think my 6800 should last till Intel's Druid comes out, hopefully by then their drivers improve over time especially regarding older games....their GPUs seem to be more power efficient than its counter parts

    • @Rayu25Demon
      @Rayu25Demon Месяц назад +14

      no reason to upgrade because there is nothing good to play.

    • @Chris-gh6ko
      @Chris-gh6ko Месяц назад +4

      im also running a 6950, its going to stay with me for a LONG time, im really liking it, went from a gtx 1080 to this.

    • @volvot6rdesignawd702
      @volvot6rdesignawd702 Месяц назад +2

      Or upgrade to the run-out, end-of-gen 7900 XTX with 24GB of VRAM!! I mean, it's no 4090, and in RT it basically gets crushed by a 4070 Ti, but in raster it's at 4080 levels!!
      And 24GB of VRAM is going to be solid for a long time still!!

    • @mikeramos91
      @mikeramos91 Месяц назад +2

      For a sec I thought it was the HD 6950 😂

  • @FireCestina1200
    @FireCestina1200 Месяц назад +39

    People: Shame on you Nvidia.
    That same person: Well, here goes my 5070. Don't have money for higher. And the 5060 sucks. I don't want AMD. Guess I just have to turn on DLSS.
    - Human with more copium than oxygen in his veins.

    • @roshawn1111
      @roshawn1111 Месяц назад +8

      @@FireCestina1200 lmfao no joke I hear people crying telling amd to compete and when they do for a cheaper price no one buys them. 🙃🙃

    • @Chocobollz
      @Chocobollz Месяц назад +4

      @@roshawn1111 Yep. Most people want AMD to compete so they can buy Nvidia cheaper. It won't work this time around, fellas. Nvidia simply won't care because they have bigger customers to serve now (the AI crowd), and they have to. Anyone who has tried building a business (small or big, no matter what size) knows that if you have a chance to take as much profit as you can, then you have to, because the chance won't last long. Nvidia knows this; that's why they're milking it as much as they can before the others catch up to them.

    • @Crazzzycarrot-qo5wr
      @Crazzzycarrot-qo5wr Месяц назад +1

      Sadly new games make you rely on upscaling whether you have an amd card or nvidia card, and dlss being quite a lot better means more people will still gladly pay the nvidia tax, myself included.

    • @FireCestina1200
      @FireCestina1200 Месяц назад +1

      @@Crazzzycarrot-qo5wr Hey! Don't forget Intel. Intel GPUs are good too, just less popular.
      FSR isn't that much worse than DLSS. The real issue is that most developers are too lazy to do a proper FSR implementation.
      Actually (you can check the file version), there are 0, and I insist, 0 games with the latest FSR. None on Earth. Of those with older FSR, some use lower than 3.0 (which is kinda not great) and some use 3.0 (which has some issues compared to 3.1.0, 3.1.1 or 3.1.2), and as you can see in my comment, the latest FSR is 3.1.2. Even if, let's say, a game featured it (which is not the case), all the other .dll files required for proper frame gen or better shader lighting are missing or not updated.
      I started seeking out the FSR file to update it (the real FSR). Most games on Steam don't have it, and some do but lack the other one.
      Funny enough, the only "game" I own with the latest XeSS is 3DMARK x)

    • @Crazzzycarrot-qo5wr
      @Crazzzycarrot-qo5wr Месяц назад

      @@FireCestina1200 Yes, DLSS is inherently better than FSR. Of course this is to be expected.
      It has nothing to do with the game running the latest FSR version or not, since FSR 2.x looks better than 3.x from what I've seen and tested.
      Intel GPUs are good value, but not for higher graphics settings and resolutions, for now at least.

  • @Musiclover23324
    @Musiclover23324 Месяц назад +16

    Top performance for $500 is just a dead dream at this point

    • @sikandernaeem9661
      @sikandernaeem9661 Месяц назад +3

      It's not too bad; 1440p is the sweet-spot resolution, as most people say the jump from 1080p to 1440p is really noticeable, while the jump from 1440p to 4K isn't that big but requires a lot more graphics power. In that case, AMD's offerings at $400-$500 are mostly the RX x700 XT and RX x800 XT class, which are great for 1440p and even entry-level 4K, with a decent amount of life in them as well. Look at the RX 6700 XT/7700 XT and RX 7800 XT: for raster performance they are better than even Nvidia's $500-600 offerings in price/perf and simple outright throughput. It's a hard truth to face, but people need to understand that Nvidia really is not making cards in favour of their loyal customer base.

    • @deauthorsadeptus6920
      @deauthorsadeptus6920 Месяц назад

      @@sikandernaeem9661 Also their RT isn't half bad. Sure, Nvidia cards would do better, but it's a struggle for both at high res until the 7900/4080.

    • @utopic1312
      @utopic1312 Месяц назад

      that makes sense tbh

  • @igavinwood
    @igavinwood Месяц назад +6

    2 to 3 years ago the net was full of reviews showing how GPUs struggled with modern games on high settings due to low VRAM. We are now seeing the third gen of cards that have, if rumours are correct, done nothing to address this. Instead of providing the actual hardware required, they are all pushing upscaling. There is no issue with upscaling in itself, but not when it's used so they can skim even more profit out of the consumer for relatively lower performance and reduced hardware.
    Hopefully the consumer is waking up to the scammy behaviour, though that was also talked about 2 to 3 years ago and people still back stupid but shiny.

  • @relentlesslyquirky2904
    @relentlesslyquirky2904 Месяц назад +83

    The 8600 having only 8GB of VRAM is disappointing. I'm definitely going to get the B580 knowing that.

    • @jrodd13
      @jrodd13 Месяц назад +18

      For real. Double-digit VRAM should be the minimum for any graphics card nowadays, and the high end needs more too. The 1080 Ti was fucking loaded with VRAM back in the day and it STILL kinda kicks ass 8 years later tbh. I think Nvidia is making them like that intentionally so the card struggles within a few years and you have to buy something from their new lineup. Just my theory.

    • @volvot6rdesignawd702
      @volvot6rdesignawd702 Месяц назад +3

      Actually quite disgusting. The flip side, to a degree: Nvidia sells you their 8GB crap, then boosts the VRAM in later models of the same rubbish, making you buy the same crap twice, just with better specs.
      AMD doesn't; you buy their rubbish and it is what it is, rubbish, the 8GB 7600/8600 (I don't think I've ever seen them sell a 16GB or upgraded-VRAM 7600 card or whatever) like Nvidia does.
      Call it what you want (forcing you to buy better at the start, BS marketing), but at least with AMD you pay once with no upgrades, kind of putting the sale back on you: if you buy their crap at the start, well, that's on you. 16GB should be the lowest VRAM on a card in 2024.
      Personally I would wait for the better Intel cards (I guess B770 would be their name). I've got the Arc A770 16GB paired with a 14600KF for my fun all-Intel build (7800X3D + 7900 XTX is my main), but for the most part that little A770 16GB plays most stuff pretty well with no issues.

    • @UltimateGattai
      @UltimateGattai Месяц назад +2

      @@jrodd13 I'm still using the 1080 Ti; that thing might last me a few more years at this point. I only started hitting VRAM issues a few years ago, so I might have to consider turning down the details on newer games now. But 1440p and 90fps is still pretty good for an old card.

    • @eikbolha5883
      @eikbolha5883 Месяц назад +1

      Idk why the RX 8600 is claimed to be 8GB, because there was just a leak on AMD GPUs: the RX 8800 XT and 8700 XT are both 16GB (the 8800 XT has more cores and a 256-bit bus), and the RX 8600 XT should have a 192-bit bus and 12GB. Maybe we'll get a regular RX 8600 non-XT with 8GB, but it really looks like there's going to be a 12GB version and not just 8GB. Overall I'm a bit disappointed if the rumors are true, because I have an RX 7900 GRE and I love this GPU: it's powerful, has enough VRAM, and it was cheaper than the RTX 4070 Super I returned because it had nonstop issues.
      I hoped I could upgrade from my 5800X3D and RX 7900 GRE to the AM5 platform, basically a 7800X3D, or go all in with a 9800X3D because why not, and pair it with an 8800 XT, since the first claim was that it would have the raster of the RX 7900 XTX. But if this is true, then performance is basically going to be worse than my overclocked RX 7900 GRE. Well, RT is better, and if AMD manages to improve RT performance to Nvidia's level I'll be happy, but personally I'm not using RT at all even though my GPU can do it quite well; I just don't like playing at 1440p ultra settings with only like 60fps, I want 100+ all the time with low frame times. A huge issue is games based on UE5, which bake a load of RT into the game through the engine; there that performance can be good. But we will see how it really turns out.

    • @gaster2411
      @gaster2411 Месяц назад +1

      ahem... 7600 XT, 6750 XT, 6700 XT, 7800 XT?

  • @buddhirajgautam5731
    @buddhirajgautam5731 Месяц назад +9

    Just leave the 7700 XT aside for a minute; even the 6700 XT destroying the 3080 is wild. More than double the performance.

  • @douganderson1249
    @douganderson1249 Месяц назад +8

    A lot of people will be disappointed when buying a card with 8GB. Most people are likely illiterate when it comes to computer components, and Nvidia (primarily but not exclusively) preys on this. There is no future for 8GB anymore; Indiana Jones is just the start.

    • @kristas-not
      @kristas-not Месяц назад

      Comment user slams media! For! Lack! Of! Anything! Useful!
      tl;dr: we need a "disappointment" score in all reviews targeting the unsavvy.
      You nailed the hammer right between the eyes with this comment: the perfect word in the right spot.
      > disappointed
      This is an *extremely* important word, especially on today's web... sorry, can't keep the bit going.
      It's this era's marketing-driven everything, where a mild disagreement over tipping culture in Spain plus the media equals a rage-bait headline with an advertisement stuffed into the middle, everything below the fold hidden like the Titanic, reported live from Reddit or Twitter or X by an account with an unprintable name.

    • @addydiesel6627
      @addydiesel6627 Месяц назад

      Agree. I have been running out of vram on re2. 8gb is not enough for this old game

  • @A3r0XxLol
    @A3r0XxLol Месяц назад +10

    well, it was fucking obvious that those 8-12gb cards from nvidia aren't future proof wasn't it? it will only get worse from here.

    • @mikeramos91
      @mikeramos91 Месяц назад +1

      Unless DLSS 4 helps 🤷🏻‍♂️

    • @CountlessPWNZ
      @CountlessPWNZ Месяц назад +4

      @@mikeramos91 I hate the concept of DLSS. It's a crutch.

    • @qaesarx
      @qaesarx Месяц назад +1

      @@CountlessPWNZ It's OK as long as it's on quality mode. Yes it's good, but it still doesn't excuse the VRAM limit.

    • @tomaspavka2014
      @tomaspavka2014 Месяц назад

      XSS: 6GB, PS5: 9GB, PS5 Pro: 10GB, XSX: 10GB of VRAM. 8GB is enough for this gen.

    • @heroninja1125
      @heroninja1125 Месяц назад +3

      @@tomaspavka2014 those are consoles that will be completely obsolete the moment the next generation of games comes out.

  • @eemtolga
    @eemtolga Месяц назад +6

    Game devs: Let's optimize the game so gamers enjoy it. Nvidia: NOOOOO! I need to sell graphics cards!

  • @Eidolon5150
    @Eidolon5150 Месяц назад +6

    Dude, my 2080 Ti has 11GB of VRAM; the settings I run Warzone at to maximize FPS and FoV eat 9.5GB.

    • @Jhintingbot
      @Jhintingbot Месяц назад

      My 1080ti also has 11Gb, it's crazy.

    • @allbies
      @allbies Месяц назад

      @@Jhintingbot 1080ti is one of the best and most future proof cards of all time. It's heavily outperformed these days obviously because it's not just vram amount that matters, but it's still serviceable for most people at 1080p.

    • @Jhintingbot
      @Jhintingbot Месяц назад

      @@allbies I can still play games at 1440p and 100fps + depending on which one. Even max settings 144+ fps on some. Usually removing useless settings helps
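
For anyone who wants to check the kind of numbers this thread is quoting, below is a minimal sketch that reads current VRAM usage. It assumes an NVIDIA GPU with the nvidia-smi tool on PATH (the query flags used are standard nvidia-smi options); keep in mind that games often allocate more VRAM than they strictly need, so the "used" figure is an upper bound on what is actually required.

```python
# Print per-GPU VRAM usage by querying nvidia-smi (assumes it is installed and on PATH).
import subprocess

def vram_usage():
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    for line in out.splitlines():
        name, used, total = [field.strip() for field in line.split(",")]
        print(f"{name}: {used} MiB / {total} MiB VRAM in use")

if __name__ == "__main__":
    vram_usage()
```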

  • @lordfyita2096
    @lordfyita2096 Месяц назад +6

    You didn't say crap about the 5090's problem.

    • @Tannercl101
      @Tannercl101 Месяц назад +1

      The price is the problem; that is a kick-ass card!

    • @trsskater
      @trsskater 29 дней назад +3

      @@lordfyita2096 It's because this guy is a giant clickbaiter for no reason. His content isn't bad, but he does the annoying clickbait titles as if he really needs to do that. I wish he would stop.

    • @lordfyita2096
      @lordfyita2096 29 дней назад

      @@trsskater Thanks mate, have a good one.

  • @danielhulan3058
    @danielhulan3058 Месяц назад +21

    Nvidia isn't going to change their mind. They do NOT care about gaming, at least until us gaming people boycott them and they lose out on their money. People, we have to force their hand and boycott them. If you do, it's not like you won't have options. AMD will get you by until Nvidia comes to their senses.

    • @sobatmedhok1306
      @sobatmedhok1306 Месяц назад +7

      It seems you all forget how Nvidia stated that they would prioritize AI over gaming not long ago...

    • @Ad_Dystopia
      @Ad_Dystopia Месяц назад +6

      Gamers boycotting Nvidia will do pretty much nothing to them. Gamers don't make a lot of money for Nvidia anyway.

    • @nicane-9966
      @nicane-9966 Месяц назад +2

      As you say, they don't care about gaming anymore, and if it's not profitable anymore then they will stop making gaming cards. They don't care.

    • @Firestorm-m4m
      @Firestorm-m4m Месяц назад +2

      I think the AI bubble would have to burst, if it ever will; their market cap is inflated af. As long as they still get huge revenue from AI, they won't care about gamers.

    • @MrKrampyHands
      @MrKrampyHands Месяц назад +1

      Yeah buddy, this is exactly why I traded my 3080 10GB for a 7900 XTX. More than double the VRAM, and it's freaking amazing, an absolute beast.

  • @iLegionaire3755
    @iLegionaire3755 28 дней назад +2

    This isn't a problem for NVIDIA, only the consumer. RTX 5090 is the true Blackwell, the rest NVIDIA nerfed purposefully on VRAM.

  • @Ziyoblader
    @Ziyoblader Месяц назад +4

    I'm glad I picked up a 7900 XTX for $750, new without the original box.

    • @roastinpeace2320
      @roastinpeace2320 Месяц назад

      You think that's new? Oh silly you. It probably has few months of cryptomining behind it.

    • @CrazyAjvar
      @CrazyAjvar Месяц назад

      ​@roastinpeace2320 which doesn't impact performance at all.

    • @roastinpeace2320
      @roastinpeace2320 Месяц назад

      @@CrazyAjvar Indeed but it impacts lifespan of your GPU. Still good bargain for 750$.

  • @Skott62
    @Skott62 Месяц назад +3

    Didn't AMD say they were going back to competing against Nvidia only at the low and mid level GPUs and not competing at the high level? Leaving Nvidia to compete alone at the high level? I seem to remember hearing something about that recently.

  • @TheSquirrelsNest
    @TheSquirrelsNest Месяц назад +2

    It is hard to be disappointed in a product that doesn't exist. I don't spend my life fretting over speculation. After release I will form an opinion.

  • @392Hemi-x3
    @392Hemi-x3 Месяц назад +6

    I didnt get the issue with the RTX 5000. What is the issue?

  • @welcomebrother1
    @welcomebrother1 Месяц назад +6

    Back then everyone and their mom wanted a 3060 or 3070 with 8gb. I only managed to get a 6800xt 16gb. They said don't worry about 8gb it's all about the raytracing. Times have changed alright!

    • @deauthorsadeptus6920
      @deauthorsadeptus6920 Месяц назад +2

      Also, ray tracing needs VRAM, so 8GB cards could barely do it despite having the hardware. Like, the 3070 could have been the best card if not for the memory...

  • @darthdadious6142
    @darthdadious6142 Месяц назад +7

    I'll hold judgement until the official announcement and 3rd party benchmarks.

  • @NickDrinksWater
    @NickDrinksWater Месяц назад

    If I have an rtx 4070, would the 5070 be compatible? I'm curious if they will have the same connector and similar enough dimensions, wattage and so on, that it will work

  • @hithere2561
    @hithere2561 Месяц назад +3

    The 6800 XT vs the 7800 XT is basically the same card in real-life scenarios. For casual games, around a 5-frame difference.
    Upgrading to something that provides less than a 25% framerate improvement is pointless for me, considering the prices.

  • @S1nGuLariTY_
    @S1nGuLariTY_ Месяц назад +4

    **5000 series not out yet**
    YouTubers: RTX 5000 Has A BIG Problem!

  • @AskMoonBurst
    @AskMoonBurst Месяц назад +5

    The chair DOES look nice. But it's 500+ USD. That is quite a lot. You can get a full recliner for less. Though I suppose a 10 year warranty IS pretty nice...

    • @peter2liter
      @peter2liter Месяц назад +1

      Right. Get a La-z-boy recliner, and mount your monitor to the floor. I used a couple of 6x6" steel plates welded to some rectangular tubing, then a standard pneumatic swivel monitor mount on top. Secure everything to the floor with some lag bolts. Keyboard goes in my lap, mouse on the arm rest. If I need to get up or want to watch TV I just swing it out the way up against the wall. Saves a ton of space.

    • @hithere2561
      @hithere2561 Месяц назад +2

      For $300 you will get a very good office chair that is muuuutch better than this blinged-out $100 chair.

    • @BenjaminCronce
      @BenjaminCronce 26 дней назад

      @hithere2561 Ive gone through too many $300 chairs. My last was $600 and it's more comfortable and still going after several years.

  • @Sorariel
    @Sorariel Месяц назад +2

    Games companies and gpu companies are ruining pc gaming simultaneously

  • @MihilRanathunga1990
    @MihilRanathunga1990 Месяц назад +3

    What do you mean Nvidia was proven wrong? they were proven right. Made you buy another GPU after just 2-3 years. And you'll buy them.

    • @herobrinecyberdemon8104
      @herobrinecyberdemon8104 Месяц назад +1

      Moore's law got replaced by Moron's law: the consumers would buy Jensen's overpriced crap no matter how much they complain about the price.

  • @erickalvarez6486
    @erickalvarez6486 Месяц назад +10

    I'd have added:
    600 = 10GB
    700 = 14GB
    800 = 20GB
    So it doesn't feel that bad not having a high-end GPU.

    • @volvot6rdesignawd702
      @volvot6rdesignawd702 Месяц назад +11

      Personally it should be:
      600 = 12GB
      700 = 16GB
      800 = 20GB
      Even then I'm of the belief that no card sold from 2023 onward should have less than 16GB when they are all going up in price!!

    • @UltimateGattai
      @UltimateGattai Месяц назад

      @@volvot6rdesignawd702 For the price of cards today, they really don't offer enough. I have a water-cooled 1080 Ti that cost me about $1200 to $1400. That same figure today, with inflation, should be about $1800, and I'd be fine with $2000 for a top-of-the-line card. But here in Australia, a 4090 now goes for $3500.

    • @rynz_2893
      @rynz_2893 Месяц назад

      @@volvot6rdesignawd702 my 7800 xt was $580 and has 16gb

    • @justmatt2655
      @justmatt2655 Месяц назад

      @@volvot6rdesignawd702 tbh the 8800 XT could still have 16GB and it would be great. You only really need above 12GB for 4K, so cards with 12GB are fine too.

  • @johnprinsloo2291
    @johnprinsloo2291 Месяц назад +2

    12gb Vram should actually be the new baseline by this point in technology... 8gb cards are fine for mobile systems and the likes, but not for desktops.

  • @J4rring
    @J4rring Месяц назад +9

    Why do so many people immediately blame GPU manufacturers for poor-performing GPUs when it's devs and studios that have the power to design what their games need?
    Game engines can almost always be better optimized, resources can be better managed, and projects can be delayed to make this happen, but nowadays we more often than not see rushed releases, poor CPU utilization, and graphics that pale in comparison to modded games from 2 years ago but still require double the VRAM.
    It feels like nowadays AAA studios are designing their games to be played, at minimum, on the highest-end specs rather than for the largest audience. And then when you look at the fact that some studios actually partner with GPU manufacturers to create these spec sheets, it just feels like an ugly game of leapfrog where the only way to play your highly anticipated games is to keep buying better hardware every year.

    • @alextoscano6916
      @alextoscano6916 Месяц назад +2

      The thing is, most people aren't going to pay 50% more for a game so that it runs on 30% less VRAM. Also, VRAM amounts have stayed the same for like 8 years now and games are only getting more complex, so it is mostly the GPU manufacturers' fault. It makes a lot more sense for a GPU manufacturer to spend $15 on 4GB more VRAM than for everyone to pay $30 more per game, as well as getting fewer game releases due to having to spend a lot of time optimizing.

    • @J4rring
      @J4rring Месяц назад +2

      @alextoscano but the thing is games AREN'T getting more complex; graphics are virtually in the same place, if not a worse state, than they were in 2020-2022. We can see it in the way that devs use upscaling and anti-aliasing techniques to hide poorly designed sets; the best example of this is literally any video that talks about how Unreal Engine 5 is affecting the industry.
      Just look at frame generation and DLSS. It's absolutely CRIMINAL that a software band-aid such as frame gen is appearing on spec sheets for games, especially when frame gen and upscaling absolutely destroy VRAM, and it all started because of TAA (temporal anti-aliasing).
      The real issue is the fact that those at the top are pinching pennies at our expense. AAA studios don't HAVE to charge $120 for their games, they don't HAVE to add microtransactions or subscription services, just as GPU manufacturers COULD add more VRAM to their cards or COULD make affordable low-end GPUs but don't.

    • @J4rring
      @J4rring Месяц назад +2

      @ It really all comes down to this: We should be buying high end hardware to push the graphics of our games to the limit, not the other way around
      If i was buying games just to push my GPU to its limits i wouldnt buy games, id just let it run Furmark at 4k stress testing preset 24/7

    • @J4rring
      @J4rring Месяц назад +1

      @ and this is before you take into account the quality of games that have been coming out, when was the last time you saw a AAA game that came out of a major studio that wasnt a buggy incomplete mess on release?

    • @roshawn1111
      @roshawn1111 Месяц назад +4

      @@J4rring Gotta agree with you; DLSS and frame gen were meant to help lower-end GPUs compete at higher resolutions. Instead devs have turned them into band-aid fixes for poor optimization. Then they want to charge you $70/$100 ultimate editions for bugs, plus 60GB worth of patches on your Internet bill to fix it.

  • @t0mn8r35
    @t0mn8r35 Месяц назад +30

    AMD has again screwed us. At least Intel is still trying. Damn you NVidia! *fist shaking*

    • @utopic1312
      @utopic1312 Месяц назад +3

      just wait for the GPU to be announced before making a judgement

  • @NetflixForeign
    @NetflixForeign Месяц назад +5

    Sorry, just gonna say it with Indy: those specs are devs being lazy. Telling me I need DLSS on for decent frames... this better be referring to people who want over 60.
    As it stands I think path tracing is garbage; yes, it is easier for devs to implement, but it makes RT even MORE of a drain than it currently is. Hearing about the difference between Cyberpunk's Ultra RT and its path tracing mode, it sounds unnecessary and totally not worth it.
    Also, this is worse than everything everyone says about Cyberpunk and how unoptimized it is. Everyone, we have a new winner here!

    • @NetflixForeign
      @NetflixForeign Месяц назад +1

      If Intel is smart they will bring out their B770 and B780 with this Indy info given I have heard they are both 16 gb.

    • @trsskater
      @trsskater 29 дней назад

      @@NetflixForeign I can't play Cyberpunk without path tracing. It is a higher quality image than regular ray tracing. It's nice that the 4090 can handle it, since the 3090 was basically a slideshow with it turned on.

    • @NetflixForeign
      @NetflixForeign 29 дней назад

      @@trsskater So you basically want to be married into DLSS for something that maybe improves the RT by like 5-10% at best, maybe more like 5%.

    • @trsskater
      @trsskater 29 дней назад

      @@NetflixForeign DLSS + frame gen for Cyberpunk at 4K to use path tracing, yes. It looks way better that way than without. But I also want to have all my knobs set to 11 at the cost of fps, as long as it's high enough. Otherwise I'll wait until the next gen's top-end card and play it with that one.

  • @wiLdchiLd2k
    @wiLdchiLd2k Месяц назад +4

    Indiana Jones uses forced Raytracing… thats the problem.

    • @sasagrcevic7048
      @sasagrcevic7048 Месяц назад

      Who the fuck wanna play Indiana Jones game?

    • @doghous3
      @doghous3 Месяц назад +1

      yeah, something weird going on here. teh mind boggles.

  • @fanofentropy2280
    @fanofentropy2280 Месяц назад +2

    Hanging on to my 6800xt for another cycle i guess.

  • @RoxburysFinest617
    @RoxburysFinest617 Месяц назад

    I have an EVGA 3080 12GB model; can we see those specs, or the 3080 Ti with 12GB?

  • @HiYouTube1226
    @HiYouTube1226 Месяц назад +10

    Im disappointed so im just going to get the 7900 XTX

    • @ibot-u3s
      @ibot-u3s Месяц назад

      you must love stutter lags

    • @Bastyyyyyy
      @Bastyyyyyy Месяц назад +7

      if you get an okay price, i dont think youll regret it

    • @HiYouTube1226
      @HiYouTube1226 Месяц назад +5

      @@ibot-u3s explain?

    • @HiYouTube1226
      @HiYouTube1226 Месяц назад +3

      @@Bastyyyyyy was planning to get a 4090 a couple months back but inflation and I just started hating nvidia for their stupidity

    • @mexicanopdb
      @mexicanopdb Месяц назад

      Only if it's like 400/300 dollars after the 8800xt release

  • @Sajgoniarz
    @Sajgoniarz Месяц назад +3

    Indiana Jones
    You give me: NASA level PC
    I give you: Skyrim with 20 realistic mods graphic level

    • @Chocobollz
      @Chocobollz Месяц назад

      What is a NASA level PC? You mean antiquated like the SLS? 😂

    • @Sajgoniarz
      @Sajgoniarz Месяц назад

      @@Chocobollz No, those used in labs for material simulations or for genome analysis at Ames ;p

    • @Chocobollz
      @Chocobollz Месяц назад

      @@Sajgoniarz 😏

  • @Paco1337
    @Paco1337 Месяц назад +1

    I play Indiana Jones on a 4060 without any problem. I don't know what people are complaining about.

  • @cptairwolf
    @cptairwolf Месяц назад

    I'm playing Indiana Jones and the Great Circle right now on a 3090 and I'm blown away at the performance with everything maxed out (except for full RT which I have off). I am getting a solid consistent 70FPS in 4k HDR. So definitely no problems here and it's not using anywhere near all my VRAM

  • @darkman237
    @darkman237 Месяц назад

    Installed the new Nvidia app and it isn't working at all, even after a reinstall and a couple of restarts. What's up, N?

  • @carlr2837
    @carlr2837 Месяц назад +3

    Wow, if that 8600 is only going to have 8GB on it, they better price it at $99 or it will never sell. I wouldn't buy anything with less than 16gb, but according to a recent Hardware unboxed poll, there are some people who would pay up to $100 for one. Hopefully there are two variants of the 8600, an 8600 and an 8600 XT, and the 8600 version is the one with only 8GB, while the 8600XT has the previously reported 12GB. In the prior generation, the 7600 has 8GB while the 7600 XT has 16gb. Even with 12gb, I'll not be considering the 7600XT unless it's really cheap.

    • @Ziyoblader
      @Ziyoblader Месяц назад +1

      They're following the Nvidia philosophy, it looks like: pay more, get more; pay less, get less. Nvidia be like, hey AMD, let me show you a trick in business.

    • @roshawn1111
      @roshawn1111 Месяц назад

      @@carlr2837 99$ there really is some people that want shit for free lmfao

  • @ToriksLV
    @ToriksLV Месяц назад

    One says Intel Battlemage is dead already and other says druid GPU is already being built. I am confused.

  • @bittripper3530
    @bittripper3530 Месяц назад +6

    Nvidia has shot itself in the foot with the 5000 series with their lack of vram and appallingly slow bus speeds, nvidia's greed is getting the better of it

    • @tonyv1796
      @tonyv1796 28 дней назад

      they make so much off the ai chips now that im surprised they even make gaming gpus at all.

  • @ojasyadav3661
    @ojasyadav3661 Месяц назад

    Hey, I have an AMD Radeon RX 6750 XT and I'm just having anxiety over the 12GB of VRAM. Will it be sufficient for 5 years if I run games at preset settings (mostly FPS and sometimes AAA open-world games, at only 1080p)? I want to make use of it for at least 5 years, then upgrade to a new GPU.

  • @zorbakaput8537
    @zorbakaput8537 Месяц назад +5

    Sorry mate but "having to sign in to get driver updates" as you stated is and was totally false. Bad form on your part. Drivers were easily downloadable direct from site.

    • @AutoCannonSaysHi
      @AutoCannonSaysHi Месяц назад +1

      If you were using the app, you had to sign in.
      I mean, I kinda figured that's what he meant...

    • @milesharris1949
      @milesharris1949 Месяц назад

      so don't use the app. I have never used Geforce Experience. I actually rip it out of the drivers before installing them. Like the guy says, just download them from the site.

  • @faceless_ghost
    @faceless_ghost Месяц назад

    3:30 When did Intel release the i7 13900K??? Must be a mistake!

  • @mrg2039
    @mrg2039 Месяц назад

    I've just built a new system for myself and I've put a second hand 7800xt in it as a placeholder for something new next year. At this point I'm so happy with it, I might just keep it for longer. There's no way I'm downgrading memory size and the Nvidia options with 16Gb are likely going to be hideously expensive. I guess if the 8800xt is exceptional I might be swayed. The intel cards are definitely looking better, but I'm not prepared to risk the poor compatibility issues.

  • @JuanPerez-jg1qk
    @JuanPerez-jg1qk Месяц назад +1

    I wonder what happened to Infinity Cache...??? They had it so hyped during the 6800 XT... 128MB... Now it's 64MB and hasn't changed since AMD downgraded it with RDNA3.

  • @Duarenteed
    @Duarenteed Месяц назад +3

    Look, I LOVE AMD. But they're simply not cheap anymore. 40 quid isn't enough to make me buy the weaker card. May the best card win.

    • @volvot6rdesignawd702
      @volvot6rdesignawd702 Месяц назад

      Go Intel; wait for the specs of the B770 cards, my guess is we will see 16GB to 20GB (maybe 20GB). But forget cheap AMD or Nvidia cards: Nvidia has the mindshare and market share (because people are either stupid or like getting rolled), and AMD will follow suit, just undercutting Nvidia's pricing with better VRAM (obviously not on their crap 8GB junk)!!

    • @Firestorm-m4m
      @Firestorm-m4m Месяц назад

      Yeah, really disappointing that they would rather screw us customers because they get better margins, and Nvidia is extremely anti-competitive, especially with VRAM. Intel is pretty much our only hope for reasonably priced GPUs.

    • @eikbolha5883
      @eikbolha5883 Месяц назад

      Listen, I've only had Nvidia GPUs: a GT 720, later a GTX 1050 Ti, after that an RTX 2070 Super, then an RTX 3070 OC, and my last Nvidia GPU was an RTX 4070 Super, which I had to return because it was just not good and badly priced overall, especially in Europe; I bought it for 729.99 EUR, a crazy price for what is basically an entry-level mid-range GPU. It could run games, but in most new titles I repeatedly ran into VRAM allocation problems and not enough VRAM at all, plus stupid stutters and bad drivers. I know AMD has good driver support now (in the past it wasn't that great), but I expected better from Nvidia as the GPU leader; even my RTX 3070 OC didn't have as many driver issues as my RTX 4070 Super. Most of the time the drivers didn't break the game itself but apps like OBS or other recording programs, and frame gen used to stop working in many games after updating the driver. The only things I loved about the 40 series were DLAA and DLSS, those were beautiful.
      I chose not to wait, and maybe that was wrong, but I picked an RX 7900 GRE for 630 EUR, still expensive but 100 EUR less than the RTX 4070 Super, and that's a big price gap in my book. What I found: the drivers were great, the Adrenalin software is nice and very easy to learn, there's 16GB of VRAM, and around 10-15% better performance at 1440p or 4K; in games that use more VRAM the lead is bigger, while in online games both were on par, though the RX 7900 GRE always delivered better 1% lows, maybe because of the extra VRAM. Overall I don't regret going with team red, and I don't care that they don't push out as many drivers as Nvidia does; the point is that those drivers work, and that's the main thing for me. I'm done with Nvidia: prices are going to the moon, the new gen won't be any different, and for like 10% higher performance I don't care at all, not worth it.
      The only thing I really miss is DLAA and DLSS, because FSR 3.2 is really good and with this version AMD shrank the gap between FSR and DLSS quite a bit, but quality mode and below is still clearly better on DLSS, no question. I just really wish AMD would release the new FSR4 for those of us with RX 7000 series cards; I hope it works with our GPUs as well. As for the new AMD 8000 series, in short it won't be a big performance improvement but more like a beta test for new and better RT performance, in my opinion; it's basically for people on RX 5000 or 6000 series cards, for whom upgrading to something like an 8700 XT or RX 8800 XT is very much worth it. For me, if AMD finally gives us an FSR4 with at least 90% of the visual quality of DLSS, I will pay good money for any AMD GPU, because the only thing holding AMD back is a good upscaler: many people don't even use RT, but they do use upscalers, and most AMD users would rather use Intel XeSS than AMD FSR because devs usually implement an old FSR version or implement it wrong, like in Cyberpunk 2077, the worst FSR implementation I've ever seen. Btw sorry for my English, writing on the phone at 4am 😂.

  • @ElysaraCh
    @ElysaraCh Месяц назад

    The raw number of CUs isn't the end-all of performance. The 7800 XT has equal or better performance than the 6800 XT with 60 CUs vs 72. People fixate too much on those numbers when to the end user they essentially mean nothing - what will matter is how well it performs and what price they set.

  • @richardlennox621
    @richardlennox621 Месяц назад

    “Can it run Indy?” Doesn’t have the same ring to it

  • @jubeykibagame
    @jubeykibagame 28 дней назад +1

    You're not just a "pretty big guy". There are some other things to say, that could be worked on. But first, the future health complications should be considered. And they are real. Take it from another "pretty big guy" who is 6 foot 4, 270 lbs. Not easy to find the clothes that fit and look good at the same time, especially the normal freaking jeans. Or something.

  • @chasecook6073
    @chasecook6073 Месяц назад

    Brother, 740 dollars for a fucken chair is insane

  • @Masaim6
    @Masaim6 Месяц назад

    64 CUs is the leak we always had for Navi 48, and the performance was always said to be ~4080, probably between the 7900 XT and 4080, so I wouldn't be worried.

  • @TUTruth
    @TUTruth Месяц назад

    Playing both sides..... sure you only have to move a few settings to fix. Yeah and that is what we do, what we've always done. Sure I want more VRAM, but if I move a setting down and it fixes, we all know the picture quality takes a very very small hit most of the time.

  • @archelonprime
    @archelonprime Месяц назад

    I still remember back in 2020 how ludicrous it was that the RTX 3080 would only get 10GB of VRAM! It just didn't add up, whether you looked at GPUs casually or got into the details!

  • @Theanalyst-pz1ui
    @Theanalyst-pz1ui Месяц назад

    I'm not buying a new GPU till the VRAM on the 70-class cards hits 32 gig.

  • @2muchjpop
    @2muchjpop Месяц назад

    Intel needs a 18 month cycle for a couple of generations while the others do 24 months. They need to catch up properly

  • @outlet6989
    @outlet6989 Месяц назад

    Given how fast new and, I guess, improved components arrive almost daily, why would a PC Gamer buy anything today knowing it will be obsolete in a few months if that long?

  • @trsskater
    @trsskater Месяц назад

    I hope Nvidia doesn't change their graphics specs last minute. More VRAM is part of what is making the prices shoot up.

  • @FireCestina1200
    @FireCestina1200 Месяц назад +4

    3 things I want to see:
    - The lowest amount of VRAM is 10GB/12GB, and that's for the 8500.
    - The 8800 XT and higher have at least 96/128MB of Infinity Cache, not 64MB.
    - Updated resource files (the .dll) brought up to the latest tech, with a program to flash them onto the current gen (7000) and newer gens.

    • @JuanPerez-jg1qk
      @JuanPerez-jg1qk Месяц назад

      And AMD not giving in to your demands is strange when NVIDIA is a lot easier to mod and easier to update the DLLs for... and best of all, it's free for the user to do it. With AMD your freedom is limited: only devs are allowed to use the FSR source code, not modders and users. Intel, on the other hand, shipped DLLs that made AMD change its tune with last-minute changes to its FSR 3.1 code. Can't wait to see more Intel XeSS2 DLLs now... more plugins for users, not just for devs, especially since Linux user space loves working with DLLs in vkd3d-proton, and now they've got a LatencyFlex alternative in Intel XeLL.

    • @FireCestina1200
      @FireCestina1200 Месяц назад

      @JuanPerez-jg1qk Real. The biggest problem is that many games have such poor FSR support. Updating the .dll (I tried) sometimes makes the game unable to launch, or it launches then crashes instantly. Thankfully most games work flawlessly with the update, but with manual tuning there's no frame gen from FSR 3.1.2.
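
For context on the DLL swap this thread is describing, here is a minimal, hedged sketch of the manual process: back up the upscaler DLL a game ships with and drop a newer one in its place. The file name used below (amd_fidelityfx_dx12.dll) is only an illustrative guess, not a confirmed name; check which DLL your game actually bundles, and keep the backup in case the game refuses to launch, as the comment above warns.

```python
# Sketch: find a game's bundled upscaler DLL, back it up, and replace it with a newer copy.
# The default dll_name is a placeholder example -- verify the real file name for your game.
import shutil
from pathlib import Path

def swap_dll(game_dir: str, new_dll: str, dll_name: str = "amd_fidelityfx_dx12.dll") -> None:
    root = Path(game_dir)
    target = next(root.rglob(dll_name), None)      # locate the DLL the game shipped with
    if target is None:
        raise FileNotFoundError(f"{dll_name} not found under {root}")
    backup = target.with_name(target.name + ".bak")
    if not backup.exists():
        shutil.copy2(target, backup)               # keep the original so the swap is reversible
    shutil.copy2(new_dll, target)                  # overwrite with the newer DLL
    print(f"Replaced {target} (backup at {backup})")

# Example call (paths are hypothetical):
# swap_dll(r"C:\Games\SomeGame", r"C:\Downloads\new_fsr\amd_fidelityfx_dx12.dll")
```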

  • @urazoktay7940
    @urazoktay7940 Месяц назад

    Amazing video, thank you Gamer Meld.

  • @fbCleaner57
    @fbCleaner57 Месяц назад

    Why is that a problem for Nvidia? Amd said they stopped aiming at being faster than Nvidia…

  • @meyatetana2973
    @meyatetana2973 Месяц назад +1

    No offense to your chair, but I prefer actual office chairs, the old-style high backs. They are very comfortable and normally made out of material that won't crack, fall apart, or wear out easily like most chairs these days. They are also stupid expensive, but I've found I spend less over the years buying quality.

  • @nolan6137
    @nolan6137 Месяц назад

    At this point I believe there may be an understanding where game companies release poorly optimized games with intent, so gamers continue to drop thousands of dollars on GPUs.

  • @madguitarist
    @madguitarist Месяц назад

    My bet... the 8600 specs are for the non-XT. Also, considering that the 7800 XT is on par with the 6800 XT in performance with 12 fewer CUs and half the Infinity Cache, I'd say this RX 8800 GPU is likely going to be faster than the on-paper specs suggest. Time, as always, will tell. 😉

  • @sc9433
    @sc9433 Месяц назад +1

    Impossible you'll get 4080 raster with the 8800 XT.
    I would say around 7900 GRE raster, but with much improved ray tracing.

  • @Drakezius
    @Drakezius Месяц назад

    Serious question Why does literally EVERYONE leave off the 3090 and 3090 Ti when posting comparisons?
    Where does my 3090Ti card fit amongst these bars?

    • @norbert4787
      @norbert4787 Месяц назад +1

      The 3090 is a mid-range card (with loads of VRAM) these days (destroyed in performance by the £604 7900 XT), but because of its high release price and no availability in stores anymore (at discount), not many people own it. It is not really relevant anymore because of this.

    • @user-HAYMONjustregyoutuber99
      @user-HAYMONjustregyoutuber99 Месяц назад

      its only a SUPERIOR deal for 3D modeling people like me. I'm planning to buy it used but not mining ​@@norbert4787

    • @Drakezius
      @Drakezius Месяц назад

      @@norbert4787 Maybe, but it is still the best non 4090 card, and it stomps the 4060, 70, and 80's.
      It was at one point THE Nvidia card to have. I bet there are a ton of people wondering if it is worth the dimes for a 5090 or to buy a step-up when the 4090's crash.
      That is my primary concern. Literally.
      The 3090Ti can do 99% of what a 4090 can, and my original upgrade plan was to upgrade GFX cards every 2 generations, but at this point I need to also consider that the 5090 will require a complete hardware upgrade, where the 4090 can work with the i9-9900k setup i have currently.
      Just saying.
      It was really weird to me that every review site just pretends the 3090Ti does not exist.

  • @laszlozsurka8991
    @laszlozsurka8991 Месяц назад

    The most popular GPU on Steam is still the RTX 3060... why even make a game that most people won't be able to run?

  • @匿名希望-s8x
    @匿名希望-s8x Месяц назад +1

    16GB (still) for the 5080 is just ridiculous, and spare me the crap about bandwidth etc. It's shit.
    And AMD is following along in this shitshow. Wtf.

  • @donhu6578
    @donhu6578 Месяц назад

    RTX performance is not necessary. It is the wrong way to push the RTX case.

  • @little__secret_9119
    @little__secret_9119 29 дней назад

    I saw a deal on a used EVGA 3X RTX 3060 Ti for $290. Should I buy it right now? I'm confused, please help 🥺👀

  • @HazardRiot
    @HazardRiot Месяц назад

    I don't mind the 5070 with 12GB of VRAM on a 192-bit bus of GDDR7. I do mind it, though, when it costs more than $499.

  • @Smashplus-117
    @Smashplus-117 Месяц назад

    Why are people expecting to play these new demanding games with low budget hardware?

  • @darthfikus5206
    @darthfikus5206 Месяц назад +1

    Modern day optimization: buy a better hardware.

  • @edelzocker8169
    @edelzocker8169 Месяц назад

    I still think everything beyond 6GB VRAM is useless because nearly all modern games only use 4GB VRAM or less

  • @rdnowlin1206
    @rdnowlin1206 Месяц назад +1

    The vast majority of PC gamers use mid-range GPUs. Why make a game that most of your customers can't run???

    • @tomaspavka2014
      @tomaspavka2014 Месяц назад +1

      The vast majority of pc use LOW range GPU.

    • @rdnowlin1206
      @rdnowlin1206 Месяц назад

      @tomaspavka2014 OK, but I said gamer

  • @MasterCorvus
    @MasterCorvus Месяц назад

    That's why I will either stick with 7900XT and its 20GB of VRAM or I'll trade it in for a RTX 5090 if the price is not gonna be ridiculous. Whoever wants to have a solid gaming experience in 2025 with new games should not even consider card under 16GB of VRAM. That's just how it is...unfortunately.

  • @addydiesel6627
    @addydiesel6627 Месяц назад

    Let me guess : the 12vhpwr ?

  • @Cheeseypoofs85
    @Cheeseypoofs85 Месяц назад

    The latency improvement from switching to a monolithic die should make up for any shortcomings from the lack of CUs. They are also claiming a big power-efficiency increase... AMD is playing the game of the future: they know certain states are gonna start trying to pass power-usage laws (looking at you, CA). By focusing on efficiency, they can gain market share by default if they are more efficient than Nvidia.

  • @TheRealName7
    @TheRealName7 Месяц назад

    5070 with a 192bit bus is criminal 💀.

  • @hadimokaram5391
    @hadimokaram5391 Месяц назад

    If a game requires more than 8GB of VRAM for 1080p, it's not entirely the GPU maker's fault. Devs are becoming lazy because of modern computer specs. I understand that top-tier graphics cards need more VRAM, but it's not my fault that I have a cheaper graphics card.

  • @radicalturkey
    @radicalturkey Месяц назад

    The 5080 is coming with 24gb vram after the initial launch at 16gb

  • @kinsal_9308
    @kinsal_9308 Месяц назад

    I highly doubt they will launch with only 8GB of RAM. It would be illogical if they care about sales.

  • @Integroabysal
    @Integroabysal Месяц назад

    If the specs for the 8800 XT are as leaked, there is no way AMD can charge more than $450 for it. If it is $450 then it will sell like hot cakes; even $479 will sell real fast. It's also looking like the 8600 is aiming for the $239-259 market, given that it has no memory uplift or more compute units. The leaks are not looking bad; it all depends on the price. If AMD decides to go over $500 and $300, it's DOA.

  • @DelticEngine
    @DelticEngine Месяц назад +1

    Regarding AMD's GPU RAM, I'd rather see 16, 24 and 32GB options. Having an 8GB card is, frankly, a joke these days, as even much earlier cards from AMD have managed that. I also don't like the CU count and cache on a brand new card being lower than on a card two generations earlier. Not good.
    I do wonder if dual RX 6800 XTs in a multi-GPU configuration would be a lot more cost-effective and give better performance... I can see this being the case, at least with games using the Vulkan API.

  • @Sphyxx
    @Sphyxx Месяц назад

    Instead of games being more optimised they just rely on raw performance

  • @user-uw3gy5ox8r
    @user-uw3gy5ox8r Месяц назад

    Has a big problem.... Is too expensive, prepare your kidney for cutting....

  • @psylinx
    @psylinx Месяц назад

    That chair doesn't make my coffee.
    George Davis

  • @peter2liter
    @peter2liter Месяц назад

    I think the 5060, 5080, and 8600 will pretty much be DOA thanks to a lack of VRAM. Even the 5070 may be a tough sell if AMD sets the price of the 8800 XT at $499. As for the performance of the 8800 XT, +4 CUs and a +25% clock boost over the 7800 XT have been leaked. This should translate to a massive jump in performance over the previous gen, at least on paper.

    • @utopic1312
      @utopic1312 Месяц назад

      wait till the cards come out
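
A rough way to read the leak peter2liter cites above: if performance scaled linearly with CU count times clock (it doesn't exactly, since bandwidth, cache, and architectural changes all matter), the leaked +4 CUs and +25% clock would put the card roughly a third ahead of the 7800 XT. A back-of-envelope sketch, using the 60-CU figure for the 7800 XT mentioned earlier in the comments:

```python
# Naive scaling estimate from the leaked figures; real uplift will differ.
cu_7800xt, cu_8800xt = 60, 64   # 7800 XT CU count; leaked 8800 XT CU count (+4)
clock_scale = 1.25              # leaked +25% clock boost

naive_uplift = (cu_8800xt / cu_7800xt) * clock_scale
print(f"Naive uplift over the 7800 XT: {naive_uplift:.2f}x (~{(naive_uplift - 1) * 100:.0f}%)")
# -> about 1.33x, i.e. roughly +33% in the best case
```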