Just like everything else
Every year new products will release 😊
I’ll be skipping the 50/60 & 70 series!!!
I just finished playing Assassin's Creed Mirage on my RTX 3070 😌
I feel like that's just technology in general. There's always going to be something better in the future.
It's this way by default; otherwise we wouldn't get new generations.
I’m actually passing down my 4080 to my son’s PC, should last him a long time I hope 🤞.
Also, you can get a big jump in one generation: the 4090 is almost 70 percent faster than the 3090.
Couldn't care less about the 50 series, tbh.
Why obsolete? It's not like the 40 series gets slower 😂 I'm keeping my 4070 Ti Super for more than a year, for sure. I have a white Palit one, love it, paired with a 7800X3D.
The 4070 Ti Super is what I just purchased... my first PC, and it has a Ryzen 7 7700. I hope it lasts for years to come. At least 3 years.
I'll be getting a new PC in a few years; I'll know when the time is right. Still marveling at how fast games load from my internal/external SSD every time I buy a game on PC that I previously only had on Xbox One S.
No rush, enjoy what you've got.
I’m still rocking an RTX 2080 Ti. I can wait another year to upgrade to the 50 series easy.
Question: how is that holding up for you? Like, what res are you playing at, and what sort of frames are you targeting? How does it do in new games??
@@c523jw7 I honestly don’t play a lot of the newer games. So, for me it still works great. But with a lot of developers utilizing the Unreal 5 engine I’m afraid I won’t have a choice but to upgrade sometime soon.
For those on 1080p for whatever valid reason (in my case, esports games), even the old 20 series cards are still good enough, or an RX 6750 XT.
I've just finished Avatar: Frontiers of Pandora at ultra settings, 2nd time through, on a 7800X3D / 7900 XT; GPU typically ~380W. Now I've started Horizon Zero Dawn at ultra settings, ~150W max. I think I'll be sticking with this combo for a good while yet.
If the ray tracing is a lot better in the new AMD card and it's just as fast as my card, maybe I'll get it. Maybe. Depends on price, and on it running all my games maxed out at 1440p at a high refresh rate. It doesn't have to be a 4K card, as long as it can do anything I want at 1440p, 60fps and above.
The rumors hype the stuff up so much that Nvidia doesn't really even need to advertise. Those who watch the rumor videos sometimes want the card years before it even comes out, and by the time it releases they feel like they have to get it whether they need it or not.
Yep. Trust me it's even worse if you are a content creator
@@frogboyx1Gaming I know what you mean. All this YouTube stuff is pretty much a giant advertising machine for these big companies. Not only do they show ads, but many of the videos are also paid for by companies. When a YouTuber gets a sample graphics card to test, that's free advertising, and more people look at it than otherwise would have.
The 50 series won't even drop till sometime next year. The hype is real. A 4080 Super is more than enough. Just game and be happy. Slow down the brain a bit. Everything is moving at a pace that's unattainable these days. I wish PC GPUs would skip a year or two and refine what they release, so people could enjoy the high-tech state of tech we live in these days ✌️
Of course not, man. That's precisely why they do it! They want people who upgrade every generation, to keep their sales figures up. Those YouTubers who don't have to pay for their equipment are going to review the new stuff they're being sent. The closest you'll get to them taking economics into account is when they compare the old cards with the new ones.
Gamers, we need to go back to basics. Turn off the fps counters and just play the games you love. If you're happy with how your game plays and you're having a good time, then there's no need to upgrade. Once your GPU starts having issues handling games, then it's time to upgrade. That's how I need to approach it, at least, because I get the upgrade bug all too often.
Nothing wrong with putting emphasis on good performance. I can't stand 60hz locked games anymore and 30fps is dead on arrival for me. 90hz is the minimum and really 120hz is ideal for me to enjoy a game.
The next gen's midrange is typically the same as the last gen's high end. But as Nvidia has proven, that's not always the case.
Yeah, but that card will probably be 2500 bucks. Yikes, no ty.
I don't mind the content, but please drop the constant 'bus' talk if you don't understand it 😂
Bus width is one metric.
Memory frequency and cache size and speed are others.
The three combined are a far more important measure than bus width alone.
If you look at 4070 Super benchmarks, you'll see its 192-bit bus easily beats 256-bit buses and wider...
If AMD wants to improve efficiency, they will no doubt also stack more cache and reduce bus widths.
To be competitive, AMD may not be able to cut the bus down too much, given the difference between GDDR7 and GDDR6, if the rumors are to be believed. The specs are likely already dialed in... it's just a matter of selling the old stock, as you see them giving deals on their GPUs at Micro Center now.
@@ofon2000 I've no idea when or if AMD will move to GDDR7.
Going by the rumours, definitely not in the 8000 series.
If anything, those appear to point to the slower memory used in the 7900 GRE, stacked with more cache to make up for it.
@@Spratty-wb3is It seems like they aren't...especially since RDNA 4 is supposed to be cheap.
@ofon2000 Yeah, and I get that!
It'll be interesting to see what happens with RDNA 5, though...
@@Spratty-wb3is It should be a bit worse at 4K, but hopefully a good bit better at 1440p, with fewer stuttering problems from the strange overheating issues.
I'm definitely not interested in paying Nvidia 900 dollars for a 5070 Ti just to get 16GB, so I'm likely gonna go with Radeon once again.
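To put rough numbers on the bus-width point from this thread: raw VRAM bandwidth is just bus width times per-pin data rate, and a big on-die cache then cuts how often the GPU has to touch VRAM at all. A minimal sketch, assuming ballpark per-pin speeds (the GDDR7 figure is rumor territory):

```python
# Minimal sketch: raw VRAM bandwidth = bus width (bits) x per-pin rate (Gbps) / 8.
# Per-pin speeds are ballpark/rumored figures, not confirmed specs. Large on-die
# caches (the 4070 Super's big L2, AMD's Infinity Cache) serve many accesses
# without touching VRAM, which this raw number doesn't capture.

def raw_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Raw memory bandwidth in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

configs = [
    ("192-bit GDDR6X @ 21 Gbps (4070 Super)", 192, 21.0),
    ("256-bit GDDR6  @ 18 Gbps",              256, 18.0),
    ("192-bit GDDR7  @ 28 Gbps (rumored)",    192, 28.0),
]

for name, width, rate in configs:
    print(f"{name}: {raw_bandwidth_gb_s(width, rate):.0f} GB/s")
```

On these numbers, the 192-bit card sits within ~12% of the wider 256-bit one before cache even enters the picture, and rumored GDDR7 speeds would put a 192-bit bus clearly ahead, which is the commenter's point about bus width being only one metric.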
You know, I think you're actually starting to think like a PC gamer. It's about working with what you've got and appreciating your hardware. I think PC gamers talk about PC being great while remembering their best experiences... this has led people to believe everything is perfect and every game is perfect. It's not, but gosh, it's good enough, and easily better than consoles.
I think you put perfection expectations on this and it just made things harder, and you've worked your way back to reality.
It was mind-blowing to think you couldn't even enjoy a 4080 for a while there... glad you're enjoying it, man. I've loved owning a high-end PC, man.
Also, I've heard a lot of PC gamers with the cash for the best stuff now say they enjoyed it more when they were working with less capable parts. We have gotten spoiled.
If you upgraded in the last few years and your GPU has no issues with lacking VRAM, you should be good for a number of years of really good-looking gaming.
Why? It takes 3 to 5+ years to create a game, and at the start of development the dev team is, in effect, using high-end hardware.
By the time the game comes out, a number of hardware generations have normally gone by. That's why good GPUs should last around 4 to 5 years.
So there is no need to go get a shiny new one.
As for the Nvidia 5080, I fear some people will get burned by it: it's going to be a cut-down GPU so it can be sold into China.
Now, my thoughts are that AMD is going to go after market share in midrange GPUs.
Nvidia will not worry, as it's making money hand over fist with AI; you'll pay an arm and a leg for a top-end Nvidia GPU.
Intel GPUs are coming of age.
So next-generation vs. last-generation tests will be interesting, as the value for money will be with the last generation of GPUs.
The tech community is fickle, that's for sure. I know people IRL, just gamers, not YouTubers or tech workers, who bought the 2080 Ti, then the 3090, then the 4090. Even if you sell the old GPU, by the time the new one releases everyone is selling their "old" 90-series card, so you won't get much money; the used market floods. If the 5090 gets 50% more performance than a 4090 and costs $1799, and you get $800 for the 4090 you spent $1600+ on, well, you lost $800 and still have to spend about $1k to get the 5090, and for no real need.
The 4090 will be fine for two more gens of high end work, but no one listens.
If you really want to retain value, you sell it a week before they announce the new cards, but then you run the risk of not having a video card for a while.
I got my 3080 for $700 and sold it for $500 a few months after Ada released. I basically spent $200 to rent a 3080 for 2 years. No biggie. The 4090 is still going for ~$1400 on eBay so I could sell it before NVIDIA announces the next generation and get the majority of my money back. If I do upgrade I'll probably give it to a friend for a good price.
4090 is still perfectly fine but I do have a 4K 240hz monitor so the sky is the limit.
@@Phil_529 I get what you mean, but with the way leaks go, I think you have to sell a little sooner, and then you're left without a GPU until you can actually get the next-gen card. But yeah, you can do that to get more value, for sure.
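For what it's worth, the upgrade math this thread keeps doing in prose looks like this as a tiny sketch; every dollar figure is a commenter's hypothetical (the $1799 5090 price is a guess, not an announced number):

```python
# Tiny sketch of the upgrade-cost arithmetic from this thread.
# All prices are the commenters' hypotheticals, not announced figures.

def upgrade_math(old_paid: float, old_resale: float, new_price: float):
    depreciation = old_paid - old_resale    # what owning the old card cost you
    out_of_pocket = new_price - old_resale  # cash still needed for the new card
    return depreciation, out_of_pocket

# 4090 -> 5090 scenario: paid $1600+, resale $800, hypothetical $1799 5090.
dep, cash = upgrade_math(1600, 800, 1799)
print(f"4090 -> 5090: ${dep:.0f} lost to depreciation, ~${cash:.0f} still needed")

# 3080 example: bought for $700, sold for $500 roughly two years later.
dep, _ = upgrade_math(700, 500, 0)
print(f"3080: ${dep:.0f} to 'rent' the card for two years")
```

Selling earlier raises the resale figure and shrinks both numbers, which is exactly the timing trade-off being debated above.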
If everybody got at least a 4070 Ti Super or a 7900 XT, they're still going to have a better gaming experience, even compared to the next consoles, so I wouldn't be too worried.
You really think so? I mean, they gave the PS5 roughly a 2070 Super in 2020, when that was the card people were using in PCs.
I do think the next console will probably be 5070 Super level.
These consoles were impressive at launch but got smashed very quickly.
@@c523jw7 Consider that the 1080 Ti was the same performance as the 2070 Super and came out in 2017.
@hzwinip Interesting. So the 20 series wasn't that good; wasn't it all DLSS for performance? I didn't factor that in. So the PS5 wasn't really 2070 Super level, since it doesn't have DLSS. The 20 series must have been disappointing.
@@c523jw7 Ray tracing was the biggest difference, but it was rarely worth enabling due to the performance hit. The 1080 Ti has 11GB of VRAM vs the 2070 Super's 8GB, with a bigger bus too. Flagship PCs are pretty far ahead of consoles.
@@c523jw7 Pretty sure the PS5 would've been more comparable to a 6600 XT or 6700.
The thing is, you don't buy every time they drop something new. If what you've got is getting the job done well, why upgrade? I'll upgrade when the gen-10 consoles are about to drop.
Hopefully this means this new generation of 50 cards drops in price sooner and forces the price of older cards down as well.
Delusional. Nvidia could never.
The 5000 series is going to be record high prices, Nvidia doesn't care about you.
@@GoonyMclinux The cheapest 50 series card will be the RTX 5050, starting at 550 USD.
@@mariusjohannessen9626 Yeah, that's what I'm saying.
Not obsolete to me lol. Just waiting for 4080 prices to drop! I still use my 3080, no need for a new GPU. I also have a 2080 and I'm not getting rid of them... just won't buy a new GPU. I'm not that spoiled lol.
I'm cool with my FE 3090. No need to upgrade for a few more years.
Just bought a new shed for my pre-ordered 5090 😅
And the 5080 will be obsolete too since it will be only half as powerful as the 5090.
Absolutely, frog, there is absolutely no reason to upgrade your GPU every gen. It's just not a big enough performance jump. Basically, all that happens is the higher-performance cards drop down a tier, so something like the power of a 4080 will be in the new 5070, so it makes absolutely no sense for people to say the 40 series is obsolete lol…. Honestly, frog, it would be cool if you didn't immediately jump to a 50 series card along with all the other channels. Keep making content on how your 4080 and 7900 XT are holding up….. And another thing I've been thinking: people are hyping up the 50 series a bit too much. Nvidia is already dominating the GPU market, so I just don't think they have enough incentive to go all out with the 50 series cards. There have been a lot of rumors about them cutting back on the specs, actually. It seems like they are kinda saving that big jump for when they think they need it.
If I don't jump I lose out on the views and channel growth
@@frogboyx1Gaming Yeah, I guess that's true, man….. I guess just do it if you actually want to, though, bud. It would suck to turn it into too much of a “job”… unless that is what you actually want, I guess lol…. Gaming and tech and stuff like that is my happy place lol. Just don't want you to lose that, buddy.
@@timothysullivan-pm3sl I'm getting a headache thinking about whether I should upgrade to a 4080 Super lol. I'm one click away, cuz I have my doubts about the 5000 series.
@@Fate-qw3rn lol, oh I feel ya, buddy…… what is your current GPU, if you don't mind me asking?
@@Fate-qw3rn I went through the same thing trying to decide whether to get a 4080 or a 4070 Ti Super lol… after a lot (and I mean a lot!!!) of research I picked the 4070 Ti Super. You get very, very similar performance for around 200 bucks less, which left me some money to upgrade my RAM too.
I am happy with my 3070 Ti.
Once the 5090 launches, my 4080 Super PC will refuse to turn on.
I never had fomo and never will 😊
All the technology we have today will be old tech 5-10 years from now…
I like the content. Even if I don't agree, I still watch... most of it, anyway. Just let the ideas come.
Obsolete is such a dumb term these people use. It's not like the performance you're getting now isn't going to be the same when new shit comes out.
Spot on. Like, it's not obsolete, cus you can dial settings down and change resolutions. We really need to get out of that mentality, and Froggy has leaned on it a lot. It's all-or-nothing thinking. Our beloved Froggy's thinking makes it really hard for him to just be happy with his hardware.
I'm going from a 4090 to the 5090 so I can use DisplayPort 2.1 on my monitor, as currently I can't run it at maximum resolution past 120Hz even though it's a 240Hz monitor. The 4090 will be moved into a second living-room PC as a big-picture-mode, console-like experience for when I don't want to sit in my studio after working all day.
This might sound stupid, but try an HDMI 2.1 cable; it solved the problem for me with my 4090 connected to a 4K 165Hz screen.
@i3l4ckskillzz79 Thanks! My monitor is 7680x2160, so HDMI 2.1 only allows 120Hz at that resolution.
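A rough sketch of why the link tops out there, assuming uncompressed 10-bit RGB, a small blanking allowance, and commonly cited effective payload rates (~42 Gbps for HDMI 2.1 FRL, ~77 Gbps for DP 2.1 UHBR20); real monitors at these rates also lean on DSC compression, so treat the output as illustrative only:

```python
# Rough, illustrative check of display-link bandwidth at 7680x2160.
# Assumes uncompressed 10-bit RGB and commonly cited effective link rates;
# real setups use DSC compression, so actual limits differ.

H, V = 7680, 2160
BPP = 30          # 10 bits per channel, RGB
BLANKING = 1.05   # rough allowance for blanking intervals

LINKS_GBPS = {
    "HDMI 2.1 FRL (~effective)":  42.0,
    "DP 2.1 UHBR20 (~effective)": 77.4,
}

def needed_gbps(refresh_hz: int) -> float:
    """Uncompressed video bandwidth in Gbit/s at the given refresh rate."""
    return H * V * refresh_hz * BLANKING * BPP / 1e9

for hz in (120, 240):
    need = needed_gbps(hz)
    print(f"{hz} Hz: ~{need:.0f} Gbps uncompressed")
    for name, cap in LINKS_GBPS.items():
        verdict = "fits" if need <= cap else "needs DSC / lower refresh"
        print(f"  {name} ({cap} Gbps): {verdict}")
```

Even 120Hz already exceeds HDMI 2.1's payload uncompressed, so everything at this resolution runs through DSC; DP 2.1's much higher link rate is what creates the headroom for 240Hz.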
Good vid frogger!!
😂😂😂 Just seeing you struggle with FOMO is so funny, but I feel bad too.
Frog, you should do a doco on PC FOMO and interview other people with the same condition. I'd absolutely watch it.
Well, technically they were always obsolete, because the RTX 4090 exists at 2x the price.
One card being more powerful does not make something obsolete. It's the same technology. A new GPU, yes, is newer tech, possibly with new abilities the previous generation didn't have. But I get what you meant: you could get a better card power-wise.
@@mmremugamesmm What even is obsolete? There are plenty of games coming out that you can still play on GTX 400/500 or HD 7000 / R9 200 series cards.
@@plottipaa_hauki95 True, it's just a matter of what performance you're looking for. A lot of the newer hardware can't play the really old games, actually. The older the software gets, the more compatibility we lose, at least without somebody going in and fixing it.
I got a 780 Ti.
These clickbait titles are getting wild lol. You need to chill
simp for xbox and amd zzzzzz
Dude, what's your problem? If you don't like his content, then unsubscribe and kick rocks. It's as simple as that 👌🏾
Maybe just stick to Shitbox.
Tired of these GPU comparison videos. I like my PS5 and my S23 Ultra phone. Eating my CHOMPS beef sticks.
The 5080 will have 16GB of VRAM and a 256-bit bus 😂😂😂😂
That's the latest rumour. Only the 5090 gets increases.
Don't buy this trash, the 50 series is straight KAKA 😂😂.
Frog, meditate and be happy with your 4080, man, cus these new Nvidia GPUs are gonna be an absolute scam... and that's coming off the 40 series 😂😂😂
Shhhhh. 👌
I wouldn't write it off yet 😂..
I mean, AMD Jesus put up a video the other day showing the performance increase from GDDR7, and that alone was massive, especially in RT.
Until we know the price we know nothing.
@Spratty-wb3is True... if it's 16GB, that would suck though, don't you think?
@c523jw7 I dunno, honestly, it's such a muddled mess.
I mean, what you could say is: if 16GB on the 5080 isn't enough, then 16GB on the 4080 isn't either, right?
The whole VRAM thing has gotten out of hand for me, and I think both Nvidia and AMD are to blame for that.
The 3090 changed everything, when Nvidia, instead of making a separate professional GPU, decided to throw 24GB on a gaming GPU and tell professionals that was their choice.
No gaming GPU needed 24GB of VRAM.
Now AMD has started throwing stacks of VRAM at gaming GPUs, and that's clearly had an impact, as games are using more VRAM. But is that because they need it, or because they can, and it saves developers time?
Should we pay more for more VRAM so a developer can save time, while charging more and more for games?
And where does this leave AMD?
I mean, sure, they threw a ton of the cheapest memory available at the 7000 series, but when they move to the next gen, what happens? Are they throwing 20 or 24GB of GDDR7 at midrange GPUs so they can throw 32 or 40GB of GDDR7 at the high end? Because if people don't see more than the 24 already available, it's not an improvement 🤷♂️.
You get what I'm saying? ...
@@Spratty-wb3is 16GB has been proven to be enough if it's GDDR7