RTX 5060 8GB 🤣
It's good to see The Finals getting the attention it deserves.
Bring back Ampere pricing (MSRP)
5070 500$
5080 700$
5090 1500$
That would require the corporation *not* being greedy scum. Yeah, not happening then.
5090 2000-3000$
Then bring back cheap Samsung 8nm. TSMC 4nm probably costs Nvidia 3 times as much if not more.
@@vigilant_1934 It seems that GDDR7 is produced by Samsung. And the TSMC chips weren't particularly good (although that may be due to the Ada Lovelace architecture). The 30 series also had chips from Samsung, and they are very good.
Hell ya The Finals!
I swear they purposefully put The Finals' cosmetic as the beat goes away lmao
How do you even do the LAN missions? Is it not a thing in Australia?
RTX 5080... in 2025... 16 GB!! OMG, no Nvidia, it can't be true!! 😮😢
at least it is quite cheap, not like the 4090
@@evilsatorii let's hope that it is cheap 💀
@@evilsatorii Just learn to save up, my dude. 2k is pocket money if you started 2 years ago.
@@Guldfisken90 Buying a graphics card that is more than 2 years old and will be replaced by the next gen in a few days for 2k is stupid. I have saved way more than 2k; I can afford to buy a 5090 even if it costs 2K or even 3K, but I refuse to spend that much on just a graphics card that is going to be obsolete in a few years.
@@evilsatorii He never said anything about buying a GPU that's more than 2 years old. He obviously meant save up for 2 years to have 2k to buy that next-gen one in a few days.
And ANOTHER year with my beloved $700 retail 3080 FE 🙂🙂
Well, RTX 30XX is back; now you might as well call Ampere "Blackwell". RTX 3070 Ti had 8 GB of VRAM and RTX 5080 will have 16 GB of VRAM. Nvidia ❌ Ngreedia ✅
A 5050 Ti 10GB would be a nice upgrade from my 8-year-old 1050 Ti. Is this a possible dream, or do I have to wait a lot more years to reach 10GB(+) with low-end cards?
Nvidia will never give low-end cards more than 8GB of VRAM. And btw, what do you mean by low-end cards? Like, in India I bought a 3060 Ti with GST added, which cost me around 600 USD, which is very expensive for me. So nothing Nvidia releases is low end, especially since they completely shut down the GTX line, which was the budget line of their GPUs. RTX GPUs aren't considered low end; they're considered more like mid- to high-range gaming GPUs.
@@BublaBenchmarks1999 They consider the RTX 4060 low end, the 4070 and 4070 Ti mid range, the RTX 4080 high end, and the 4090 an enthusiast GPU.
Buy an Arc B580; I got one and I'm really impressed. It would be a big upgrade from your 1050 Ti, and for around 300€/$ you won't get more horsepower for the money at the moment. And I don't think Nvidia will do anything in this price category 😅
I got the 4090 over a year ago and it’s been great.
So has the event started? Is there no livestream from Nvidia?
Bring us some new information about G-Sync Pulsar, please!!
Yeah, I'm still playing on a GT 640M LE
00:17 Fallout 76 devs are really so lazy they're just re-releasing the Twitch Prime outfits as Nvidia rewards... Also love how there seems to be a Nuclear Winter game mode shot, but that game mode's been gone for like 2-3 years.
I hope Nvidia realises that these unrealistically marked-up GPU prices are very toxic for the overall PC community... The GTX 1080 Ti, even when we use the inflation calculator, was less than $1000 and was great for its time... You need to adjust your prices, otherwise general consumers will know you only for your AI work and greed.
Hey #GeForce, I can barely hit 300 FPS in The Finals with my RTX 4090.
WTH? How would I ever get into The Finals with these FPS?!
Hyped for the 50 series 😁
I hope this is sarcasm 😅
@Crazical Just because you're not hyped doesn't mean nobody else should be.
@@tcdixon90 Don't get hyped for the 50 series: the 5060 still has 8GB, and the performance uplifts across the board are about 10-20%. Also, prices are going even higher, and power consumption is going up more. It's a waste of a generation for gamers, only good for AI or rendering companies that need something powerful like the 5090. Gamers who think Nvidia will give us a good low-end GPU this year should instead turn to the Arc B580, as you can't stay on 8GB of VRAM forever.
0:18 I tried to install the Nvidia app with an AMD GPU just for The Finals reward, which I didn't know what it would be. I was disappointed the app wasn't accessible this way. Now I'm even more disappointed it's just that one mask I never wanted, which you can just buy anytime, so it's not even special.
RTX 5080 for $1499, lol. Maybe in ray-tracing games it will be more powerful than the 4090, but in rasterization it will certainly be 5-10% less powerful than the 4090... (by the way, that's almost the same price as the RTX 4090 at launch...)
It really is almost the same chip at almost the same price, but this time maybe the scalpers won't drive the price up too high...
RTX 5060 nice
RTX 5060 with 8GB of VRAM. Is it true? 🤡
How is that possible? The GTX 1070 of 2016 had 8 GB; for the RTX 5060 of 2025 to have 8 GB again isn't progress, it's marking time.
You're never happy. You forget not everybody can afford a mid-tier to high-end GPU. They have to make them affordable for some people. If you don't like it, go AMD. Lmfaooooooo
@@tcdixon90 The whole point of graphics cards is that you can use them to play games. They compute GRAPHICS. The problem is that graphics are getting higher and higher in resolution and quality as game development progresses. VRAM requirements are shooting through the roof these days; the new Indiana Jones game comfortably consumes 20+GB of VRAM with ray tracing. As software updates, hardware needs to follow, but with how much VRAM games absorb nowadays, 8 GB cards are at the end of their acceptable lifespan. This is why people are so disappointed that the 50 series is gonna have 8 GB cards again.
Let's go #GeForceGreats
I hope that when GTA VI releases, Nvidia GeForce NOW with RTX ON won't be overheating.
50 No 50 No Please
these game choices blowwww
Thanks
How about being more gamer-friendly and not overpricing your GPUs?
cool.
Super hype
#GeForceGreats
And how much will they cost in Russia? :)
Well, I don't think they'll be that expensive.
Back in 2022, when the 40 series came out, it didn't cost much more.
But keep in mind they won't go on sale in Russia directly (i.e., only via parallel import and the like).
At launch, I'd guess around 300-400 thousand [rubles], then after half a year or a year, 150-250.
@@aremamarema No, it's the other way around: you should buy at launch, because prices will only get higher afterwards (the 40 series is an example; it only got more expensive).
I hope nobody buys your cards... The amount of money for a mid-range card is crazy... I won't even say a word about the 5090... it's a joke.
Give some games away for free instead of in-game rewards; that would be more reasonable. #GeForceGreats
Stop bothering Nvidia about VRAM. They ain't gonna put 50GB of VRAM on a $300 GPU because developers don't know how to optimize anything.
Graphics cards are for computing graphics. So you can play games. Here's the fact: modern games easily consume more than 8 GB of VRAM. Indiana Jones comfortably just eats 20 GB of VRAM; there is no excuse for Nvidia to still put only 8 GB of VRAM in their cards. If Intel can release a $250 card with 12 GB of VRAM, so can Nvidia. The reason we don't get more VRAM is profit, and profit only. Yes, no one needs 50 GB of VRAM, but at least 10 GB should be standard by now. These cards are gonna be absolutely terrible for future-proofing and will be VRAM-bottlenecked in pretty much all games within just a few years.
@@NiftyNova GPUs for gaming also won't be a thing at all within 5 years.
@@Zeeves Not true. The number of gamers is growing at a fast rate, and games are getting more and more intensive. People ARE willing to spend hundreds of dollars on cards. Where there's a market, there will always be companies, and this market is a goldmine. It's just that Nvidia has claimed ownership over 85% of the goldmine and can now just do whatever they want with it.
@@NiftyNova GPUs at home won't be needed. Gaming will all be streaming-based subscription services, like Netflix; consumers won't need to own, nor will they be willing to invest in, expensive gaming hardware, not even consoles.
@@Zeeves I don't know a single gamer that actually uses cloud gaming. Because it's reliant on a stable, strong internet connection, without an internet connection you literally just can't play any games. Network down? No games. Slow connection? No games. Unstable internet? Enjoy gaming for 20 seconds before you get kicked off the servers.
Gamers aren't middle-aged moms looking for the most convenient way to watch Squid Game. Building PCs is a hobby for many. Not to mention GPUs are needed for way more than games, like rendering and animating.
Nvidia's cloud-gaming service, GeForce NOW, launched 5 years ago, and I don't know a single person that's EVER even used it, let alone used it day to day. It's also limited to 1080p60, meaning you'd get a WAY worse gaming experience: no more 1440p or 4K, and bye-bye high refresh rates. It's a downgrade. Not to mention it only has 30 available games.
Yes, I believe cloud gaming will increase in popularity, because there are people out there who don't want to buy an expensive PC. But based on current trends, it's not going to take over classic gaming anytime soon, let alone in 5 years. If local-hardware gaming were dying, Nvidia wouldn't be making literally TRILLIONS of dollars selling GPUs.
GeForce NOW is a scam like few others.
Nvidia sucks on Arch Linux 😡👿
Waiting for RTX 5090!
Get a 4070S and you'll be happy.
#GeForceGreats
#GeForceGreats
#GeForceGreats
#GeForceGreats