@@RATechYT It's also CPU limited in Jedi Survivor and Dying Light 2. I can give plenty of other examples as well. Hogwarts Legacy has a huge CPU bottleneck here, not a slight one.
@@HeartOfAdel The GPU usage drops in those titles due to stutters/movement in the frame time. Star Wars Jedi Survivor is a horribly optimized title (you can check the video I added in the card), the GPU usage drops due to stutters, not because the CPU can't keep up. Top end hardware also stutters in this game. Dying Light 2 runs poorly with DX11 on the RTX 3080, the wavy frame time causes a fluctuation in GPU usage, which is why I recommended switching to DX12 and enabling asynchronous compute. With DX12 the GPU usage is basically maxed out. I wouldn't consider 75-90% GPU usage in Hogwarts to be a huge bottleneck, but yeah, it is more than slight. I did mention areas where we're CPU bound and where a better processor would deliver more frames.
@RATechYT These stutters and GPU usage drops are caused partly by the CPU, and not only the game. I did not have such stuttering in Jedi Survivor with a 3080/9900KS OC. In Hogsmeade, even with my CPU, the usage drops to 90-95% on Ultra/DLAA getting around 100-120 frames. There's also plenty of testing on my channel. I just think that the 5600X isn't a perfect pair for that card, but it's decent anyway. You either should've chosen a different CPU for this test, or at least tried less CPU-heavy games.
@@HeartOfAdel The RX 6800 & XT models run flawlessly in Dying Light 2 without any frame time/GPU usage issues in DX11, they're mostly fully utilized. DX12 performance is basically the same as DX11, so the wavy frame times in DX11 are an RTX 3080 problem. Jedi Survivor stutters on top end hardware as well, the game is a mess. Just watch the first 3-4 minutes of this video: ruclips.net/video/d29ht5KuSw8/видео.htmlsi=AXHvCUm7_MKSGPZp Once again, using a higher end processor will maybe result in slightly better frame times in this game, but stuttering will mostly persist, that's for sure. If stutters occur not only on a mid range system but also on a high end one as shown in the video above - it's clearly a game problem. There are plenty of YouTube videos and Reddit posts proving that, yet performance issues might have been fixed by now. I agree, a 5800X3D for example would be more balanced with an RTX 3080, though unfortunately the R5 5600 at 4.57 GHz is what I had when testing the RTX 3080. I just disagree with the part where you said that for half of the video the CPU is heavily limiting the RTX 3080, which it clearly isn't. In God of War, Forza Horizon 5, Warzone 2, Apex Legends, Cyberpunk 2077 and Dying Light 2 (DX12) the RTX 3080 is fully utilized basically at all times. Spider Man Remastered and Hogwarts Legacy are the only titles where we're CPU bound, and once again, *I do mention in the video* that a faster processor will result in better performance in those titles.
The RTX 4070 comes with a warranty, DLSS 3 and Nvidia frame generation. What is the point of this video exactly? Even the 4060 would have better performance than a 3080
If you can find a 4070 that gets better performance, even with FG, than my 3080 Ti, I'll be impressed, because I've not met anyone I know who owns one who gets better fps.
The average wannabe nerd talking nonsense about Vram cause of 3 shitty games over 10y, while 3080 still rocking and owning most competitors in value for money.
@@jayclarke777 VRAM limitations were always a relevant topic for everyone with critical thought, not only the AMD fans. Here are a few GPUs handicapped by lack of VRAM: GTX 780 Ti, GTX 970/980, R9 285/380 2GB, GTX 1060 3GB, R9 Fury/X and R9 Nano, RX 480/580 4GB version, RTX 3070/Ti
EDIT: Turns out I couldn't enable resizable bar due to the old vBIOS, so if you're having the same problem, make sure to update to be able to use ReBar.
Thanks for watching! You can support my work here: www.buymeacoffee.com/ratechyt
discord.gg/PFb9cMstZH instagram.com/ratechyt twitter.com/ratechyt facebook.com/ratechyt
yes, this part was sketchy AF on my card, because the BIOS naming was extremely confusing and I ended up with TWO possible candidate BIOSes to update to. I dived in, made the best possible guess and fired it up, thank god it worked. RIP to Gigabyte Vision OC users LOL
I planned in 2023 to buy a 3080, but I won't pay 400 bucks for a used card or scalper prices.
paid 490 bucks for 4070 with 2 years warranty...happy af
Thanks for the review. I had an Asus TUF 3080 with 10GB, it served me amazingly well on a 3440*1440 monitor. Even when I upgraded to a 4090, my trusty 3080 took an honorable place on display on my bookshelf in my living room. Eventually I sold it to one of my acquaintances for around 410 USD, good for him and good for me.
Just got an evga 3080 upgrading from a 2070. Didn't expect such a huge jump in performance but it's awesome
i mean ofc, the 3080 was a huge jump even for me coming from a 2080 Super, it was an insane jump, probably around a 40-50 fps increase on average.
Great content bro, especially when you actually show live results instead of copy-pasted Excel sheet graphs like other youtubers love to do. Keep up the good work.
Thank you!
But bro, try making the font and color a little different to easily see the graph @@RATechYT
Cost, performance, overclocking, mods to push it even further: the RTX 3080 10GB is one of the best GPUs of all time.
148 MH/s on a modified Founders 3080 10GB with bios trained DNA.
70 MH/s on a crappy one at best; before NITR0 did some mods it was 100 to 110 MH/s
I would have to strongly agree. My 3080 ftw3 was the first higher end card I bought new, and it will most likely end up accompanying me to the afterlife.
Almost every 3080 at this age needs a thermal pad replacement, and that's $50-70 extra and you get no warranty. On the other side, for a bit extra $ you get a brand new 4070 with warranty and DLSS 3 support. For me the DLSS 3 alone is a huge advantage.
I've had it 2nd hand for 2 years. Undervolted so it hovers around 200 to 220W under heavy loads. Lost 5% performance (aka 60 vs 63 fps...) and gained 33% lower power and temps. It's a great card for 1440p high refresh rate gaming. Maxed out Doom Eternal with RT at 180 fps is still cool even in 2024.
Whats your undervolt setting?
@@ultimategohan1551 1800mhz @ 825mV
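For anyone weighing a similar undervolt, the trade-off described above can be sanity-checked with a little arithmetic. This is just an illustrative sketch: 320 W is the 3080's reference TGP, and 215 W is an assumed midpoint of the 200-220 W range the commenter reports.

```python
# Rough sanity check of the undervolt trade-off quoted above.
# Assumptions: ~320 W stock (3080 reference TGP), ~215 W undervolted,
# 63 fps stock vs 60 fps undervolted.

def pct_change(before, after):
    """Percentage change from `before` to `after` (negative = reduction)."""
    return (after - before) / before * 100

perf_loss = pct_change(63, 60)    # ~ -4.8%, i.e. the quoted ~5% performance loss
power_cut = pct_change(320, 215)  # ~ -32.8%, i.e. the quoted ~33% lower power

print(f"performance: {perf_loss:.1f}%  power: {power_cut:.1f}%")
```

Both quoted figures check out: a roughly 5% frame rate loss for a roughly third lower power draw.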
Buying used is good and the 3080 is a beast of a card, but it's a card that came out during the cryptomining boom, so I personally would avoid those used cards.
cards which were used by miners who knew what they were doing are probably the best used cards you can buy, besides the ones which weren't used. Competent miners underclock their cards and clean them, and since they run all the time they don't get damaged by temperature changes.
I had a 3080 12gb EVGA FTW3 Ultra in my second PC. I ended up selling it and upgraded it to a RTX 4070 Super FE that paired amazingly with my 7800x3d. The 3080 in general is a very power hungry GPU. With my 7800x3d along with it I was pulling 560 watts of power. About 80 watts was from the 7800x3d. So that GPU was pushing almost 500watts if not power limited or undervolted.
wow, thats quite a lot
mine pulls 340 max which is still alot, but cant see it pulling 500
@@chrisyep Yeah dude is tripping, no way it pulled 500w
@@chrisyep My 3080 Ti at most has pulled 366w
I got same setup it’s a beast love it. 4070 super does excellent at 1440
I just bought used 3080 for 275$ (it usually goes for 300-400$ in my country)
Where did you buy it at? I've been looking over at ebay but cant find one for that cheap.
@@Grineck I bought it from a Bulgarian website (similar to eBay) as I live in Bulgaria. The prices for 2nd hand GPUs like this vary from $300 up to $500, but sometimes people want to sell things fast and put them up at a low price. So far I haven't had any problems with the GPU, temps are good and everything, so I hope it stays that way
Just ordered one, going from an RX580 😂 hopefully a pretty big boost.
I will eventually get a 7900X3D/7900 XT system to satisfy my OCD but until then I am chilling with 5900X and 3070ti
I kinda wanna build a 5700X3D/RX 5700 crossfire system for fun since 5700X3D chips are so cheap on aliexpress rn
Awesome insight on the 3080! I'm using a 3800X on an ASRock Steel Legend B550M with an FE 3080 10GB. I got Resizable BAR to activate on my setup by updating the 3080 BIOS from the Nvidia web page.
3080fe with EK waterblock from eBay for £400 a few months ago, beast card even compared to a 3070
I bet the 3080 is a beast, been using the 3070 and still do now in 2024 and it rocks, I'm sure the 3080 does way better.
i flipped my 3080 for a 3080ti with a $25 profit. The used market is the way to go always.
I was pretty surprised the other day when I realised my friend's 4-year-old 3080 outperforms my 7900 GRE :D I also upgraded from a 5600X to a 5700X3D and it fixed a lot of weird stutters I was getting
no man, if your 3080 outperforms your 7900 GRE then there's something wrong.
I had a 6800 XT in my girl's PC that pretty much matched my old 3080 ROG Strix; now I bought a 7900 GRE for her, and the GRE is DEFINITELY WITHOUT A DOUBT the faster GPU in 1440p, so as I said, there's something wrong with your rig if that's the case for you.
A 7900 GRE is pretty much neck and neck with a 4070 Super and very close to 3090 performance.
I paid $375 for a 3080 off of facebook, the only thing that holds it back is the low VRam
I don't understand the reasoning behind comparing the price of a second-hand component with the price of a new one - you could have used the price of a 7800 XT found second-hand. Based on what eBay is listing, the second-hand price of the 7800 XT is similar to the 7800 XT price you provided in the video. Even so, you should compare the prices of two components found in a similar situation, even if in this particular case the second-hand and new 7800 XT are similarly priced.
I was thinking between a 3080 and a 6800/6800 XT, because the 3080 is still faster than a 4070 (minus 2GB of VRAM), and the 6800 XT is like 5% slower than the 7800 XT. I had a 3080 Ti until a few months ago, but I was depressed (some personal stuff) and sold it, and decided that I don't need a desktop PC and my 7-year-old ThinkPad X1 Tablet Gen 2 is enough for some smaller projects in Creo (ex Pro/E), and I still have the Xbox.
I bought a used Aorus Xtreme 3080. Died after 5 months… wasted tons of money 😢 . Buying new from now on.
damn, should've repasted though, you never know with 30 series cards because they were the victims of the GPU mining craze during 2020-22
That's unfortunate. I've been extremely lucky with used parts, I only had an AM3+ motherboard die on me out of dozens of used parts that I bought within the last 6 years.
Bought one off eBay March 2023 and it’s still going strong. I got lucky. The seller repasted shortly before selling and it was a 3080 12gb
The LHR versions of the cards were made around 2021 with less mining performance. It's good to buy these ones
It was a software limit, right? Which was removed not long after @@Arejen03
RA Tech: not every 700W PSU is able to deliver 12-13 amps via a single PCIe cable to power this continuous 350-watt GPU.
Actually you do have plenty, and some of them are rated 620w, 650w. You just have to look into the 12v rails and ofc their reputation.
You still need 700W minimum. Just gotta be careful when choosing a unit.
@@rand0mon3-m3k It isn't true. Reputation of the 12V rails and OFC-ing their reputation has nothing to do with it.
Only raw current calculation helps here because the RTX 3000 series is very very PWR hungry compared to the RTX Pascal Refresh 2000 series.
@@RATechYT not every PSU will do but i use a 3080 with one p650b :p
@@lflyr6287 like i said, not every PSU will do but i use a 3080 with one p650b :p so go figure it.
Great videos! Simple, informative, useful. Wish you luck and more interesting hardware to test
Thank you so much!
Remember guys, DLSS and FSR exist! The RTX 3080 is a very solid card, so hold on for at least a couple of years.
I am getting a used one to play on my 1080p 165Hz monitor. In my country this will cost 1/3 of a 4070
09/22/2024 I AM Running a EVGA HYBRID 3080 WITH A 5800X3D?
3080 and a 5800x3d amazing pair great performance.
I bought a used 3080 last month from microcenter with a 2 year 'money back' warranty. If it dies, i get 400$ to spend at microcenter
You're having a problem with stuttering in some games.
Idk why; in God of War the frame time graph on my 3080 is completely smooth, yours has a lot of spikes.
@@dinooxxtuga2771 That's actually fairly stable, it moves up and down just a few frames which is why it looks like that. That happens after installing drivers & launching the game for the first time, and mostly goes away after about a couple of hours of gameplay.
The performance issues in Warzone & Apex don't happen once I pull out the RTX 3080 and install the RX 6800.
I still have my 3080 FTW3 and I can't seem to keep my temps down. I mean it's not too bad but I would like it to be around 65 or 70. I have my setup in a Corsair case! Tips on adjusting fan speed?
@@GorillaBlack4F If you don't have MSI afterburner installed, download it and set a fan curve. You might need to repaste the card as well.
@@RATechYT thanks my guy much appreciated!! Ima see how it goes.
I also bought a used RX 6700 10GB for 115 USD and I'm super happy with it...
Dude ur videos helped me so much. Keep it up!
Glad I could help!
Did you intentionally run cyberpunk without DLSS on 1440p? The game looks great with that, and FPS is more playable.
@@andreskushnir2356 You're right, should've added DLSS results or at least mentioned upscaling. My bad.
i personally play Cyberpunk at native 1440p with all the settings maxed out and "modded" frame generation; with path tracing I get avg 50-60 fps with dips into the 40s, and without it I avg 80-100 fps. 3080 Ti Gigabyte Vision +899 +70
I got a used 6800 XT instead. That 16GB of VRAM is more useful in new titles like Ratchet and Clank, The Last of Us, Horizon etc
With DLSS it's perfectly serviceable. I was just playing The Last of Us remastered in 4K and it played no problem with DLSS; without it though, you're correct.
@@ShadowOfNexxus In TLoU DLSS works correctly. But in Ratchet and Horizon FW enabling DLSS does not reduce VRAM even by a bit.
So to run Ratchet at 1440p/maxed out you need 11-12GB of VRAM, no matter if you use DLSS or not. Horizon needs a bit less VRAM, so at 1440p/maxed out 10GB should be fine.
is the 10 gigs of vram sufficient?
10:14
Yes
Good video. I'm currently gaming on a 3080 10gb. My 3090 & 4090 are making money with AI.
the 3080 is still a great card for 1440p, sad it only had 10GB at first... should've been ONLY 12GB for the 3080 as it's just such a powerful GPU; the 3070 should've had 8GB, the 3070 Ti 10GB, as the 3070 Ti is also WAAAAY too powerful to have only 8GB.
yes the 3080 can still hold up decently well for the coming 1-2 years but man, it's starting to creep up slowly... sad to see. great GPUs overall.
True. 3080 should have had 12GB right from the start. Then it would easily last a couple more years w/o any issues with VRAM capacity.
I just ordered a brand new 3080 for $350
i have b550 itx asrock with a 5700g and 3080... love the combo but resizable bar was enabled through my bios and it says on geforce program that it is enabled...
Unfortunately it turns out my graphics card vBIOS was outdated, which is why I couldn't enable ReBar.
You mean to tell me you don't know how to turn on ReBAR in BIOS? It's like 2 to 3 clicks max...
@@D71Gaming It's not about BIOS, tried enabling and disabling resizable bar plenty of times - didn't work. Played around with different driver versions, registry settings and what not, it just refused to turn on.
Resizable BAR and a better CPU like the 5700X3D would've given you way better performance.
@@chunk3875 Resizable bar wouldn't enable due to outdated vBIOS, I honestly didn't think of that so that's on me.
A better CPU would indeed deliver better performance in titles like Hogwarts Legacy and Spider Man Remastered, as I mentioned we are CPU bound in those titles.
Take a note that Resize bar increases VRAM consumption a little bit. So in some games at certain settings 3080 10GB may run out of VRAM, if ReBar is enabled.
Love your commentary style review ❤
@@bumbeelegends7018 Thank you!
as I'm looking at it, it isn't bad, but I don't think it's worth it in countries where you can only buy from Facebook Marketplace for almost the same price as new
There is no point picking a 3080 over the 6800 XT, as the 6800 XT's VRAM will allow it to run newer games at higher-res textures. I'd even take the 6800 over the 3080.
Definitely. Idk about the non xt version though
@@arbmoneyful VRAM makes a difference; if you look at HUB's video on how much VRAM is used in games today, even at 1080p it can exceed 10GB.
@@mercurio822 There are advantages and disadvantages when going with either the 6800 XT or the 3080. It's up to personal preference.
Meanwhile Wukong consumes 8.4 gigs in 4K Ultra native + path tracing.
16 gigs are useless if the chip is weak.
@@Fantomas24ARM not true. It uses up to 14gb vram
@5:45 Spider-Man is CPU bound?? The test says 70% CPU utilization. How is 70% CPU utilization CPU bound??
@@jons6348 When the GPU isn't being fully utilized, it means that the graphics card is being held back by the processor. With a faster CPU, the graphics card would be fully utilized and the frame rate would be higher.
@@RATechYT This doesn't make any sense. The CPU at this time was only at 70% utilization. If it was at 95-100% utilization then yes, it's a bottleneck, but here it was far below that. How is the CPU a bottleneck if it isn't even fully utilized?
@@jons6348 You can be CPU bound either by single core performance or by core count. For example, a Ryzen 3 1200 will be the bottleneck in the majority of titles tested in the video, because it's a 4C/4T CPU; yet even if it had enough cores to not reach full utilization, we would still be CPU bound, since it's an old CPU with low single core performance and it wouldn't be able to unlock the full potential of a GPU like the RTX 3080.
Simply put, if a graphics card isn't reaching 95-100% usage with a particular processor, it means the CPU is the bottleneck, which can to an extent be mitigated by overclocking the CPU or the memory.
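The rule of thumb described here (GPU consistently below ~95% usage while gaming suggests a CPU limit) can be sketched as a simple check over logged usage samples. This is a hypothetical helper for illustration, not any monitoring tool's actual API:

```python
def likely_bottleneck(gpu_usage_samples, threshold=95.0):
    """Crude heuristic: if average GPU usage sits below the threshold
    during gameplay, the CPU (or game engine) is probably the limiter."""
    avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
    return "CPU-bound" if avg < threshold else "GPU-bound"

# Hogwarts Legacy-style readings of 75-90% GPU usage vs a fully loaded GPU:
print(likely_bottleneck([75, 82, 88, 90]))   # CPU-bound
print(likely_bottleneck([98, 99, 100, 97]))  # GPU-bound
```

Note the caveat from the thread: stutter-induced usage dips can trip this heuristic too, so it's a starting point, not proof.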
What do you mean you couldn't enable rebar? It was super easy on my ASUS TUF card? Pretty sure it was through nvidia too
@@johndough4871 No matter what I did in the BIOS or in the software, it refused to turn on. Unless Nvidia Control panel misreports it of course.
@@RATechYT Update the 3080's BIOS; the first BIOS doesn't support ReBAR.
@@RATechYT LOL DUDE YOU CANT UPDATE A GPU VBIOS FROM YOUR MOTHERBOARDS BIOS YOU HAVE TO UPDATE THE VBIOS ON THE GPU ITSELF
Thank me later
@@RATechYT The 3080 and some other 30 series cards came out right before rebar. There is an update to the vbios on 3080s to enable it.
@@johndough4871 You can enable or disable resizable bar in BIOS, which is what I tried doing in addition to clearing the CMOS. Updating the vBIOS is the only thing I didn't try, so that's gotta be it.
Can you do 1080p too? I suppose many people use the 3080 for 1080p ultra with RT enabled
1080p Ultra with RT won't be an issue for this card in majority of titles.
I think the ryzen 4070 is better price to Performance
Ryzen 4070?
You are cognitively impaired if you think this is funny
@RATechYT I guess ztt hasn't reached this channel yet
@@key3n47 Zach's Tech Turf? Love that channel! Haven't checked his videos out in a while though.
Peak idiot comment
You know ... everyone wants now 4K...
@@truckdriver4387 Not sure if I can agree with that.
if a 3080 can't run 200 fps in any game at 1440p, the coders just can't code properly. games look worse than 2009 sometimes..
I found a 8gb 3080ti for $286 lol
7800 XT is faster than 3080 lol
@@spawnersiak 7800 XT, 3080 and 4070 are all right in line with each other in terms of rasterized gaming performance. Better in some, worse in other titles, but generally the same according to reviews online.
@@RATechYT That's total BS. The 7800 XT is faster than the 3080 and 4070. It's a little faster than the 3080 Ti lol. I have a 7800 XT and a 4070 non-Super in my other PC, and the 4070 always gets its ass kicked by the 7800 XT lol.
@@spawnersiak I'm just saying what other reviewers are reporting. According to these reviews, on average performance isn't massively different between these three cards, they're generally within 5-10% of each other. Of course results will vary from one review to the other depending on the games tested, but so far that's what I personally got out of it.
www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html
www.techspot.com/review/2734-amd-radeon-7800-xt/#Average_1440p-png
www.tomshardware.com/reviews/amd-radeon-rx-7800-xt-review/3
The 7800 XT is a great card - it's better in rasterized performance than both, and its ray tracing performance has improved massively to the point Nvidia basically doesn't have a lead anymore. There are videos out there that show it easily beating the 3080 & 4070 in certain titles like Starfield for example. The point I'm trying to make is, in *general*, an RTX 3080 is capable of delivering near identical performance to the RTX 4070 and 7800 XT at just 360-ish bucks used.
@@RATechYT Dude... In overall performance the 7800 XT is 10-15 FPS faster, so it's A LOT lol.
@@spawnersiak What matters is the percentage increase. If it's from 45 to 60 FPS, then it's a 33% increase, which is a lot. If it's from 150 to 165, it's only a 10% increase.
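To put numbers on the point above: the same absolute +15 FPS is a very different relative gain depending on the baseline. A quick sketch (the function name is just for illustration):

```python
# Relative gain matters more than the raw FPS delta.

def pct_increase(old_fps, new_fps):
    """Percentage increase from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

print(round(pct_increase(45, 60)))    # 33 -> a big, noticeable jump
print(round(pct_increase(150, 165)))  # 10 -> much less significant
```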
the 10 gb version has zero reason to exist
Completely forgot 12 GB RTX 3080 was a thing.
The 3080 is wasted sand, just like the 3070 it lacks the VRAM to stay relevant. Don't buy it, get the 6800 or 6800 XT instead.
Buying a used GPU XD!
@@PCgamerChannel It's great!
U dum or something
Half of this video is basically 5600x heavily limiting the 3080. This cpu isn't good enough for this card to show its true power, at least without some ram tuning.
@@HeartOfAdel Hogwarts Legacy and Spider Man remastered are the only titles where the Ryzen 5 5600 bottlenecks the RTX 3080, with Hogwarts Legacy being slightly CPU bound.
@@RATechYT It's also CPU limited in Jedi Survivor and Dying Light 2. I can give plenty of other examples as well. Hogwarts Legacy has a huge CPU bottleneck here, not a slight one.
@@HeartOfAdel The GPU usage drops in those titles due to stutters/movement in the frame time. Star Wars Jedi Survivor is a horribly optimized title (you can check the video I added in the card), the GPU usage drops due to stutters, not because the CPU can't keep up. Top end hardware also stutters in this game.
Dying Light 2 runs poorly with DX11 on the RTX 3080, the wavy frame time causes a fluctuation in GPU usage, which is why I recommended switching to DX12 and enabling asynchronous compute. With DX12 the GPU usage is basically maxed out.
I wouldn't consider 75-90% GPU usage in Hogwarts to be a huge bottleneck, but yeah, it is more than slight. I did mention areas where we're CPU bound and where a better processor would deliver more frames.
@RATechYT These stutters and GPU usage drops are caused partly by the CPU, and not only the game. I did not have such stuttering in Jedi Survivor with a 3080/9900KS OC. In Hogsmeade, even with my CPU, the usage drops to 90-95% on Ultra/DLAA, getting around 100-120 frames. There's also plenty of testing on my channel. I just think the 5600X isn't a perfect pair for that card, but it's decent anyway. You should've either chosen a different CPU for this test, or at least tried less CPU heavy games.
@@HeartOfAdel The RX 6800 & XT models run flawlessly in Dying Light 2 without any frame time/GPU usage issues in DX11, they're mostly fully utilized. DX12 performance is basically the same as DX11, so the wavy frame times in DX11 are an RTX 3080 problem.
Jedi Survivor stutters on top end hardware as well, the game is a mess. Just watch the first 3-4 minutes of this video:
ruclips.net/video/d29ht5KuSw8/видео.htmlsi=AXHvCUm7_MKSGPZp
Once again, using a higher end processor may result in slightly better frame times in this game, but the stuttering will mostly persist, that's for sure. If stutters occur not only on a mid range system but also on a high end one as shown in the video above, it's clearly a game problem. There are plenty of RUclips videos and Reddit posts proving that, though the performance issues might have been fixed by now.
I agree, a 5800X3D for example would be more balanced with an RTX 3080, though unfortunately the R5 5600 at 4.57 GHz is what I had when testing the RTX 3080. I just disagree with the part where you said that half of the video the CPU is heavily limiting the RTX 3080, which it clearly isn't. In God of War, Forza Horizon 5, Warzone 2, Apex Legends, Cyberpunk 2077 and Dying Light 2 (DX12) the RTX 3080 is fully utilized basically at all times. Spider Man Remastered and Hogwarts Legacy are the only titles where we're CPU bound, and once again, *I do mention in the video* that a faster processor will result in better performance in those titles.
Your 3080 is bad
Why do you think so?
The RTX 4070 comes with a warranty, DLSS 3 and Nvidia frame generation. What is the point of this video exactly? Even the 4060 would have better performance than a 3080.
@@baronoflight Would you like me to delete the video since you don't see a point in it?
The 3080 walks all over a 4060, and frame generation can be modded in (Nvidia locks us out to sell more cards, they could enable it easily).
If you can find a 4070 that gets better performance than my 3080 Ti even with FG, I'll be impressed, because I've not met anyone I know who owns one who gets better FPS.
The average wannabe nerd talking nonsense about VRAM because of 3 shitty games over 10 years, while the 3080 is still rocking and owning most competitors in value for money.
The fact you disliked game A or B doesn't make them shitty.
Cheers.
It's the AMD fans saying that.
They can't get a win in RT, so they resort to talking about rasterization or VRAM.
@@jayclarke777 VRAM limitations have always been a relevant topic for everyone with critical thinking, not only AMD fans.
Here are a few GPUs handicapped by lack of VRAM:
GTX 780ti
GTX 970/980
R9 285/380 2GB
GTX 1060 3GB
R9 Fury/X and R9 nano
RX 480/580 4GB version
RTX 3070/Ti
@@jayclarke777 you’re big on this tribalism thing huh? Over a piece of tech bro?