Do NOT take a shot every time I say herpes in this video.... you will not make it...
@@SiliconSteak I don't mind the herpes. However, what you do with that GPU in bed is going to give me nightmares, so thank you. Good video and merry Christmas btw. 👍
I finally upgraded my 1660 Super to a 3070, and the card is a BEAST! But oh boy, those 8GB of VRAM are completely ridiculous for such a card.... It is so sad to see. The good thing though is that even if I max everything on RE4 Remake, the framerate never drops! I just get hiccups once in a while. I guess that is because of the high memory bandwidth. And in my country it was the same price as a 4060, so, who cares, let's just enjoy these things man!!
I have an ASUS RTX 4060 OC Edition, and I'm perfectly fine with what I have. I can run the Resident Evil 2 Remake (from the Xbox App) at a smooth 4K 90fps (upscaled from 1080p). Sounds like your inferior Steam Version of the Resident Evil 2 Remake started to struggle. That's what happens when you lick the knob of Gabe Newell.
Oh so when my wife said she got it from Nvidia she wasn't lying! Thanks for the confirmation bro, I thought she got it from someone else!!
I have bad news for you.
my partner of 6 years left me for an AMD GPU with 16GB VRAM.. true story.. (the fact we broke up, anyway)
@ sorry to hear that man..
The RTX 3070, and especially the 3070 Ti, only having 8GB is criminal.
I actually have a 3070 and it's a great card, but the 8GB of VRAM really feels like a calculated bottleneck
This def was the most egregious example yea
People paying more than $1000 for those during the mining craze and COVID is even crazier.
The 3070 Ti was supposed to be the 3080, but Nvidia was so embarrassed by it that they lowered the name.
@@adeptuspotatocus6451 They got away with it next gen though
I need space with my 1050, I don't know how to tell her that I've been talking to a 5090.
Well don’t do it on Christmas. But you definitely are going to have to leave her. Tell her there are cards out there with thicker bus widths and it’s just not going to work
The R9 290X had 8GB versions back in 2013; if you think 8GB of VRAM is enough for 4K in 2021 you are legit insane
Wowie, this video is going into my out-of-context YouTubers folder.
Shout out to all the single 8GB GPUs out there
The RX 580 is a lone wolf that ate everything in its time...
I recently bought the 7800xt with 16 gb vram
slowly becoming my favourite tech channel! keep up the banger vids that intro was crazy 😭
Simplex 1 isn't as bad as Simplex 2. Shockingly, I test negative for all that stuff. So far, I'm running an RX 7900 XTX and a 4060 Ti 16GB in my home computers, but they are mostly for AI inference.
I'm glad you didn't catch it..
I've had HSV-1 on my lips ever since I was an infant. Selfish progenitors. I had severe outbreaks until I was around 12 and was bullied relentlessly, all because people couldn't keep their lips off mine
My recent buying history: Upgraded from a 1080 Ti (best NVIDIA card in history) to a 4090 in 2023, and recently sold that card for the same price I bought it for.
My next target: The 5090.
Why? Because I can and everything else in that upcoming lineup seems like a scam to me.
A 512 bit 32GB card is pretty gigachad. If you want to go all out, the 5090 is the card to do it on..
@@SiliconSteak It is. Would also never do it if I didn't sell the 4090 for a very high price.
True. I used my 3090 to pay for my 4080 so it was way easier to upgrade
@@Hexenkind1 Why? Because you have a weak mind and are perfect prey for FOMO tactics.
@@xPhantomxify no
Wow, the quality of your videos has improved so much. I saw you maybe 1-2 months ago and you only had 50 subs. You deserve it man, good work
Thank you. That means so much coming from you, Brain-rot-skibidii. I'm glad my content is reaching its intended audience lol. But seriously, I appreciate it.
@@SiliconSteak Of course bro ♥
That’s one hell of an opener!
Dang that cold opening was cold
The reason I got a Vega 56 rather than a GTX 1070 Ti as a "used 70 dollar emergency card until I can upgrade in the future" was HBM2's bandwidth. In the weakest scenarios with an OC it's a 1070 Ti, but in the best scenarios it was like a 1080, like in Resident Evil, especially at 1440p, due to that bandwidth. 448GB/s was no slouch in 2017; it's basically what the 5060 is gonna have.
I have always thought the vega cards with HBM were really cool. They never made sense for me to pick up though..
@@SiliconSteak yeah, nowadays the drivers are where Vega can be trusted and it runs quite flawlessly. Back when they were released, my god, AMD gave you HIV just trying to avoid the amount of crashes, performance bugs and instability. Couldn't touch it without two double-thick layers of rubber.
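Side note for anyone who wants to sanity-check bandwidth figures like that 448GB/s: effective bandwidth is just bus width times per-pin data rate. A minimal sketch, with illustrative assumed data rates (overclocked HBM2 at 1.75Gbps, GDDR6 at 14Gbps, and the rumored 28Gbps GDDR7 for the 5060, none of which are confirmed specs):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Effective memory bandwidth in GB/s: pins * per-pin rate / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# Assumed, illustrative configurations:
print(bandwidth_gbs(2048, 1.75))  # Vega 56 HBM2 with an OC -> 448.0 GB/s
print(bandwidth_gbs(256, 14.0))   # RTX 3070-style GDDR6    -> 448.0 GB/s
print(bandwidth_gbs(128, 28.0))   # rumored 5060 GDDR7      -> 448.0 GB/s
```

Three very different memory setups landing on the same number is exactly why bus width alone doesn't tell the whole story.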
It's not herpes, it's her PC
current rig has a 1060 6GB, eyeing up the 5080 for my new build
I have a 7900 XT and a 4070 Super. No games use more than 16 gigs of VRAM yet. They will allocate it if you have it but will not actually use it.
That being said, I would get more than you think you need rather than focusing on ray tracing or DLSS. I usually turn ray tracing off even on the 4070 Super; I only use it in Cyberpunk and a few others
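The allocated-versus-used distinction above is worth remembering when reading overlays: most tools report what the driver has handed out, not what the game actively touches each frame. A minimal sketch for taking a reading yourself, assuming an NVIDIA card and the nvidia-ml-py bindings:

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

# "used" is memory the driver has allocated, not bytes the game
# actively reads per frame, so it overstates real VRAM pressure.
print(f"total: {info.total / 2**30:.1f} GiB")
print(f"used:  {info.used / 2**30:.1f} GiB")
print(f"free:  {info.free / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```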
I found out that my dad had been with my GPU....
Hoping I will be set for a while with the 7900 XT. I've had it for one and a half years already
The 3070 in 2025 is going to be a truly heartbreaking experience; it's already dying at 1440p (even with lowered settings and DLSS enabled). This is how bad the VRAM issue has gotten. You're getting a stable 30fps with third-party frame gen software like Lossless Scaling + DLSS + low settings in certain titles (mostly UE5 ones).
Now here's the thing: my card has served me well, but in the last year a switch flipped where the industry just stopped optimizing games. Space Marine 2 is an incredible game, but at 1440p on a 3070 you are suffering so badly. Helldivers 2 works at low settings 1440p w/ Lossless Scaling. Forever Winter cannot run beyond 40fps regardless of the resolution and DLSS. This card is gimped so hard it feels like a scam, so much so that I am moving to AMD for the sake of getting longevity out of my cards. I'd rather have better raster performance in the coming years than have AI features act as a crutch for gimped raster performance. Fundamentally, no amount of AI is gonna solve the problems of limited hardware without making concessions. Those concessions have led to the industry simply not optimizing their titles, and furthermore to general image quality in newer titles resembling a grainy oil spill in the Sahara Desert. Lastly, it has led to an excuse for planned obsolescence as a means to lure back customers every 3 years when previously, graphics cards could last anywhere from 5-7.
In conclusion, I'd rather take my chances with AMD's VRAM and raster performance, because wherever that investment lands, at least I'll have a better chance of having my games look how I want them to, across a longer period of time.
@@gavinferguson2938 I wouldn't play UE5 games unless it's a custom version of UE5. UE5 is a mess with no optimization in mind and is used by people who know nothing about it. It's laughable that you need a mid-range GPU to play Remnant 2, for example; it looks like a 2012 game. Games in 2013 and 2014 looked better than this
Yep, I can fully confirm. I had my RTX 3070 for 3 years until a month ago, at both 1080p and 1440p since 2022, and the 8GB of VRAM really ruined the graphics card. A card that was a high-refresh-rate 1440p GPU is now a low-end 1080p card, and this is only the start. The upcoming RTX 5060 won't reach 3070 performance with its 8GB of VRAM and 128-bit bus (at least the 3070 has a 256-bit bus, which is much better than the 4060 Ti). I changed to an RX 6750 XT, which performs the same; a guy wanted to swap his GPU for mine, so I accepted. I don't care about ray tracing at all, even when I did have my 3070, because what difference does it make if I get under 60fps or under 30fps anyway?
The whole 30 series is dead as of these days, and only the 3090 is good because of the VRAM, but it consumes 500W of power, the same as a 4090 for 70% less performance, and has overheating issues. VRAM is the second most important thing I look for, after the price.
Your intros have been absolutely outta pocket lately, gang. Keep it up tho ong!
I was considering the RTX 5000 series, but Intel's Battlemage is tempting to pick up instead
It's wild that a 3060 Ti was used for 4K; my GPU might just grow legs and jump off my 4th floor window 😭
I was about to do a budget build for a friend with a 6650 XT but decided to up it to a 6750 XT just because of the VRAM. I think the extra $75 will be worth it.
It will
You're telling us you upgraded from a 3060 Ti to a 3090, then a 4080, and then your conclusion tells us there's no need to buy an expensive card?
My final thought is it depends… think about what you need and buy that. I didn't need 24GB of VRAM, for example
You think you can store 100 ultra-scaled frames on an 8GB VRAM card? Just cap your frame rate. It'll run smoother, longer and better without upscaling, or uncap with shadows and anti-aliasing disabled; you're not even noticing half the large shadow resolutions that you're running
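For rough scale on that claim, raw framebuffer math is simple: width x height x bytes per pixel. A quick sketch (4 bytes per pixel, i.e. plain RGBA8, is an assumption; real engines keep many more render targets than finished frames):

```python
def frame_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one uncompressed frame buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

# One 4K RGBA8 frame is ~32 MiB, so 100 of them is ~3.1 GiB --
# a big bite out of 8 GiB before any textures, geometry or shaders.
print(f"{frame_mib(3840, 2160):.0f} MiB per 4K frame")
print(f"{100 * frame_mib(3840, 2160) / 1024:.1f} GiB for 100 frames")
```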
I got it too
Got a 3080 FE thinking it would be good for 4K
It's not, for any game that came out after the 30 series released
Now that is pushing me to want to buy a 5090
rip
An American is talking and the double-standard mode is OFF.
Amazing. I'm proud of people who don't flirt with hardware manufacturers in their videos; you said the bitter truth.
I've had Nvidia cards of multiple generations, well maintained, dust free, with little gaming over their life, still die.
High performance is not always high resilience; factor the cost against at most 5 years of life
Nah, I built my PC to last.
Your mom is built to last
what chair is that
Serta executive chair big and tall i5000. It’s pretty nice
And that's why Nvidia GPUs will sell, and AMD not competing is the worst thing...
One herpes sore on your cheek? Bro, you're not in the league here. I currently have 18 herpes ulcers inside my mouth, at the back near the throat; try to eat normally with that. The sad truth is I often get more than this, and it happens so often I can't really care too much about it anymore
That's why you pick a monitor and build a system around its resolution. Nvidia has no control over third-party manufacturing processes. That thermal pad issue is on whoever makes the Strix model
The reference design had the same issue
@SiliconSteak no sh/t? Fairly certain the board partner cards that GPU year were the reference models... hmmm 🤔 I know those Strix cards have a vapor chamber in them that doesn't like any position other than a traditional mount.
That may be so. I had mine normally mounted; the GPU core was fine at 72-80°C, but the VRAM got up to 110°C.
@@SiliconSteak ya... everyone's cheaping out on production costs. You never upgraded the pads? It was a well-known issue. Anyways, thanks for chatting with me and have a good Christmas
hi, is that an innocn 27" monitor in the background?
Yes
While playing Space Marine 2 my 4080S is running out of VRAM lol. The devs are also at fault for not optimizing their games.
rip
It's not running out of VRAM; if anything it's a memory leak you should report.
@ I get the lag spikes when it switches to system RAM. Really annoying; I didn't have this before the latest patch broke the game for me..
@@TheMichaelparis Memory leaks are always annoying. I had that once in CoD Cold War; the culprit was actually one of the Nvidia filters from GeForce Experience. Updating that fixed it.
@@InnerFury1886 Hmm, maybe it's because I installed the new beta drivers, which were supposed to fix the Nvidia app
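One way to tell a genuine leak apart from normal allocation is to log VRAM over time: a healthy game plateaus after loading, while a leak climbs all session until it spills into system RAM. A minimal sketch, again assuming an NVIDIA card with the nvidia-ml-py bindings (GPU index and poll interval are arbitrary choices):

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes GPU 0

try:
    while True:
        used = pynvml.nvmlDeviceGetMemoryInfo(handle).used
        # Plateau = normal caching; steady climb = suspected leak.
        print(f"{time.strftime('%H:%M:%S')}  {used / 2**30:.2f} GiB used")
        time.sleep(10)  # arbitrary 10-second poll interval
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```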
I had a 960 2GB (instead of 4GB), I had a 1060 3GB (instead of 6GB), and I had a 6600 XT when I needed a 6700 XT. Now I have a 6700 XT and a 7800 XT. I hope I'm good for a while.
You were on the struggle bus there for a little while
So far my 6600 XT with 8GB is fine and has been fine at 1080p. I plan to swap it soon for something higher end and maybe 1440p, but somehow I feel it's more of an itch, because everything online is playable with high fps, and AAA games are more than fine too. I have maybe one game I'd like to play soon that might have issues with it
If you're on 1080p, 8GB is mostly fine; those who say it isn't are running every setting at ultra, which is not normal for 99% of people other than benchmarkers. Now, 1440p is another thing; yes, 8GB will struggle there, but most people use 1080p.
@lucidluhx6088 I agree in the sense that, yes, you can push past 8GB in some contexts, but even there you can fiddle with options for a minimal loss of quality. It doesn't make sense to buy an 8GB card new, but if you have one it's not a big deal (yet)
the bouncing mic is really distracting. fix plz
😂 This was funny...got busy with your 3060!
Doesn’t the 3060ti have 12gb or am I trippin
Nah it’s got 8gb 💀
It's the 3060 that has 12GB of VRAM (and an 8GB version), but the 3060 Ti comes in one version with 8GB
I know it's a personal thing, but may I ask how old you are? Just curious
I will tell you if you guess first. I'm curious how old you think I am
@@SiliconSteak My guess 26 like me
@@waleed-87 24
@@SiliconSteak May god bless you
@@waleed-87 thanks, merry Christmas
4060Ti replaced the 3050
4070-4070Ti replaced the 3060
4070TiS-4080S replaced 3060Ti-3070Ti
4090 replaced 3080-3090Ti.
The sooner y'all understand this, the sooner the current market makes sense.
Note that the 3070 replaced the 2080. The 3070Ti underperformed so badly and Nvidia was so scared of Navi 21 that they called it the 3070Ti instead of the 3080.
The first mistake is buying an Nvidia GPU in the first place.
I only have the gtx 1060 with 6 gb vram
I'm looking to replace my RTX 3090 this coming generation. 16GB will probably not give me 4 good years and I might need to upgrade again in 2 years' time. 24GB+ will be enough for 2 generations. So it's either a 5080 Ti or 5090 that will last 2 generations, or a 5070 Ti / 5080 that will last one generation. Herpes-free…
There will be no 5080 Ti. Pretty sure about that.
@ nothing confirmed but it makes sense for there to be one in the future.
@@Zygmunator It also made sense with the 40 series and they have never done it.
@hexenkind1 Sure, there could be. We have seen leaks of a 5080 with 20 or 24GB of VRAM, and there is a huge gap between the 90 and the 80. I bet we could at least get a full 103-die 80 Ti
@@SiliconSteak yes, but there is never a guarantee. I think it will happen but at least 6 months in or later.
1:25, AYYYYOOOOOOOOOOOOOO😭😭😭😭
Hey if you only play fortnite that's more than enough
I just need more vbucks…
1:25 🤣🤣🤣
Using a 3060 Ti at 4K is an issue in itself. Are you trying to give yourself herps? It was a budget 1440p card at launch, never a 4K card. I bet if you dropped to 1440p you would have a great experience along with much higher fps
I bought it for $770 during the mining boom in 2021
Also, back then it could easily play a lot of games at 4K. That's why I was saying games are getting way too hard to run now
🤣😆 Entertaining. But that's how it is. Also, it seems Nvidia's software is not backwards compatible, while XeSS 2 is, along with its frame generation (FG) and low latency (LL). And it takes a lot of wattage to replace the broken cards. Two different architectures, and I see INTEL winning. Note we are only on the second generation and launch drivers.
Card      VRAM   Bus       Power
4090      24GB   384-bit   450W
4080      16GB   256-bit   320W
4070 Ti   12GB   192-bit   285W
4070      12GB   192-bit   185W
4060 Ti    8GB   128-bit   160W
4060       8GB   128-bit   115W
3090      24GB   384-bit   350W
3080 Ti   12GB   384-bit   350W
3080      10GB   320-bit   320W
3070 Ti    8GB   256-bit   310W
3070       8GB   256-bit   220W
3060 Ti    8GB   256-bit   200W
3060      12GB   192-bit   170W
B580      12GB   192-bit   190W
A750       8GB   256-bit   225W
A770      16GB   256-bit   225W
6500 XT    4GB    64-bit   107W
6600 XT    8GB   128-bit   132W
6700      10GB   160-bit   220W
6750 XT   12GB   192-bit   250W
6800 XT   16GB   256-bit   350W
6900 XT   16GB   256-bit   300W
6950 XT   16GB   256-bit   335W
7600 XT    8GB   128-bit   120W
7800 XT   16GB   256-bit   300W
7900 XT   20GB   320-bit   300W
7900 XTX  24GB   384-bit   420W
I think people are doomposting about the 5080 too much, it's going to be a very powerful card, especially with DLSS 4
Did you watch my video on DLSS 4?
Nvidia making high-end 8 and 10GB GPUs sucked, and the solution is to spend $1000 on a new Nvidia GPU.. Your $1000 RTX 4080 will very likely become obsolete in two months btw.. Neural rendering, DLSS 4, GDDR7. Then the new solution will be to buy another $1200 Nvidia GPU..
It was some ride though… still better than the 3090 that was cooking. Also I got it for $830
No
doomposting. chill.