Is it with HBCC enabled? How much, and with what system memory speed? These things are quite important for an HBM2 GPU. Is Resizable BAR enabled and working? (You can check it in GPU-Z's Advanced tab.)
It was a compute monster for its price, and I was surprised they released a consumer card with such high FP64 perf and memory bandwidth/size. So I wouldn't really call it a flop. It's AMD's Titan (the OG Titan, I mean).
I got the 50th anniversary red version. Never any issues. It still works great, and I still play new games in 4K. Just... this was released before ray tracing was a thing.
Even if I sort of wasted 300 euros on my Radeon VII, man... it looks way *too good*, and honestly it's a very efficient performer: at 1010 mV most games eat only 180 watts, and buggy games that need an FPS cap only eat 70 W 😂 But Cyberpunk is a tragedy! Wow. 1080p Horizon is also *brutal*. It does feel noticeably held back by the cooler; if I used a Morpheus on it, I think I could easily do a stable daily 1950 MHz core on my sample. 😂
Had an overclocked VII on water; it was near 2080 Ti performance in raster when heat wasn't an issue. Really underrated card. I mean, for how old it is, 16 GB of VRAM is pretty incredible.
I sorta wasted money too. Got a new one in box. The card works great, and in games like Helldivers 2 at max settings at 1080p it gets around 50 fps just fine. I also used the auto undervolt in the AMD Adrenalin software, so it pulls about 130 watts most of the time. I think my 11400 is slightly holding it back though.
I bought one of these when it first came out. I still run it to this day. While I'm no longer a hardcore gamer it runs everything I want just fine. I should have sold it during the mining bubble. I see most of them have been used for mining, judging by the sheer number for sale as faulty. But mine, which has never been used for mining, carries on running just fine. I've not bothered undervolting or overclocking it. It's a lovely looking card. For some strange reason I am quite attached to this card and will probably just keep it in a cupboard when I do eventually feel the need to upgrade.
I love how the 3080 12 GB casually wipes the floor with the 4070 at higher resolutions. What the hell were Nvidia smoking when they fitted that POS with a 192-bit memory bus?
Wish this thing were affordable on the used market. I'd love to just get one and add it to my collection (and put it in my dad's 10-year-old Ubuntu desktop PC).
All the tests are unfortunately invalid. The Radeon VII is a DX11 card and will see up to 30% additional performance in DX11 or Vulkan. Additionally, the card's performance climbs with resolution (the deltas close the closer you get to 4K, thanks to its high bandwidth and throughput). VRAM is more beneficial the higher the resolution climbs, obviously. And while not all games use 16 GB of VRAM, titles like RE4 show that games needing more than 12 GB of VRAM are really only serviceable by cards with 16 GB. Finally, the Radeon VII is a superstar in productivity, compute, and AI workloads. It's forgivable for it to be a good but not stellar gaming performer.
Still have my Radeon VII (bought it when it came out, at the same time as the 2080) and to this day no buyer's remorse. No matter what I throw at it, the GPU laughs and wants more. I use it for gaming at 1440p on my 34" ultrawide screen, and the pair is made for each other. Also have her in an ITX case from when I built it (2019, I think). Those who did not buy a Radeon VII really missed out (bought mine from MSI). No rush to really upgrade the GPU either.
Vega 20 is a workstation/compute die that AMD forced into a gaming role. It would be cool if you had tested non-gaming workloads (DaVinci, Photoshop, Lightroom, etc.).
I got a Radeon VII on launch day and it performed pretty well, but the drivers at the time were not the best and it was a pretty loud card under load. I made a good profit reselling it at the height of the mining boom though. I'm still happy with my time with it. Would use again.
Still expensive because it carries all of the following: compute per price, VRAM, hashrate history, little supply, and being the first 7 nm consumer GPU. No other GPU in the history of GPUs carries all those specific tags at once.
Is it wrong that I was actually expecting a Fiji card instead? In fairness, the R9 Nano was at least an early player in tiny GPU builds, and as someone who appreciates super-tiny systems, that was rather nice back then and something I still kinda miss from Radeon cards. They're still always rather high in power consumption nowadays compared to Nvidia's offerings, which is very ironic considering how vastly more efficient modern Ryzen is vs Intel, for example (they're good enough to trade blows with Apple silicon, at least once it's been a while since Apple's last refresh).
Too bad the performance wasn't better, because this was a great-looking graphics card. "It's a train wreck going off a cliff"; don't hold back, Nick, tell us what you really think. :p
A very pretty graphics card, but it fell behind in performance while lacking features. Adding insult to injury, it was as expensive as an RTX 2080, and AMD's "Copeon" Terrible Group dropped support for it in favor of the more efficient RX 5700 XT within half a year. Since the "Copeon" VII, the AMD Radeon brand has been forever bound to 'falling behind in performance while falling short on features' compared with Nvidia GPUs. As for the 'ages like fine wine' saying: why would anyone buy an AMD GPU early instead of waiting for it to become 'fine wine'? People need to think, not cope.
You completely missed the point of this GPU. What makes it great is that you can use it for work and still play some games. I use it for work in SolidWorks and AutoCAD, then play the only game I like, COD, lol. At 1440p.
I bought this brand new from AMD when it came out and I really wasn't disappointed in it, lol. The look was very nice in my build and it played every game I played, so I was very happy with it. Mining killed these things left and right, so getting one used is almost guaranteed to be a bad deal for the buyer.
I used to have a Radeon VII along with my silver Vega 64. I mean, the performance already wasn't great back then, but man, it's still the best-looking GPU I've ever seen; this and my silver Vega 64 just hit different. Oh well, I shouldn't have sold this card when the crypto boom was happening.
It's a joke; the "Flop" architecture is a huge joke. It failed to beat the Titan V/RTX. I mean, why else would AMD have named the card differently from the RX 4/500 series?
The entirety of Vega was a disappointment, and it was Raja Koduri's responsibility. Intel made a mistake hiring him; he's more of a liability than an asset. It was declared end-of-life within months; what an appalling GPU. AMD without Koduri has been nothing but smooth, competitive sailing.
Please put the 6800 into the comparison list as well. No offense, but I'm so amazed by just how many YouTubers out there ignore the 6800... only a few of you actually remember it.
Ouch... anyone else notice that Intel's A770 is getting whacked to the bottom of the stack? That's Intel's current best... they really need to get Battlemage out already.
One of the nicest-looking cards to come out. I would love to add one to my collection, but it's far too expensive.
I was out of PC building around the time of this GPU, but I have to agree. It's still, to this day, unbelievably good looking. HEY EVERYONE! COME SEE HOW GOOD IT LOOKS 😂
I have one in amazing condition with the original packaging if you’re interested 🤷🏼♂️
I have it and it looks great, but I'm forced to keep the rest of my setup red lol 😊
@@dredoesstuffdds5029 how much
You can have mine (:
Dude this thing was a POWERHOUSE in DaVinci 16. The only thing I don't miss is the driver instability around this card's release. The Radeon Pro drivers fixed the productivity stability issues but introduced artifacting in a lot of games. AMD has come a long way ❤
It was also a great mining card; sadly the gaming performance has been all over the place since launch.
The reason it costs so much on the used market is its hashrate in memory-bound cryptocurrencies; it was almost as fast as a 3090 at mining Ethereum.
Had one for a while, but sold it and got a 6900 XT, but if I had known about the hashrate then I'd have kept it and made some money :D
Also, it looks really nice and clean!
The Radeon VII was not a failure per se; it was a workstation card that got pushed to consumers at a time when Nvidia did a huge price hike. Suddenly the card could be released somewhat competitively. It would have been perfectly fine as a workstation GPU otherwise.
It definitely was a gaming failure
This was nowhere near being a "flop"; it was a monster compute card, and it did 120 MH/s on ETH at 96 W with some tweaking, which I'm pretty sure was 3090 performance. Not to mention this was a precursor to the MI300X chip.
If you want to talk about a flop, that would be the Pentium 4, the FX 5900 Ultra, the ATI 2900 XT, or Bulldozer.
It's also essentially the same chip as the MI25
@@Gastell0 I'd say the VII is most similar to the MI50, down to the GPU and shader count. The MI25 uses the Vega 10 GPU while VII and MI50 use Vega 20.
Still using the Radeon VII; it handles all the games I play no problem at 1440p, so those results are surprising. I also use it for graphic design and video editing with Adobe apps, and with Stable Diffusion for AI. Definitely looking to upgrade soon with a 4090, or whatever the latest from Nvidia is when the time comes.
7900 XTX not interesting? I think it's not a bad card; I've been looking into GPUs and current prices.
@@dreamtalk2794 Not for my usage, no. AMD is just so far behind Nvidia in terms of technology. I've been with AMD for many years, but it's time to move forward. Nvidia has far more support for encoding and AI, as well as gaming features.
Yeah, I was surprised by the tests as well; I get a solid 120 fps in Warzone at 1440p where he barely gets 90.
Mine is a piece of crap; it can't run current games without black screens and the drivers constantly breaking. The drivers even get messed up when I'm not playing anything.
I had a Radeon VII a few years ago and the card loves to be undervolted and overclocked. That stock cooler held it back. I water cooled mine and it was a different card afterwards.
Legend has it people are still waiting for the miracle drivers that would add the primitive shaders promised in AMD's own official videos, still up on YouTube.
Primitive Shaders became part of NGGC (Next Generation Geometry Culling), which ended up in RDNA.
Based on tests on Linux, there weren't enough shader stages exposed on Vega, so even if NGGC was used, performance didn't change. The option doesn't really do anything on RDNA1 either, but it's enabled by default for RDNA2 and 3.
There's a reason RDNA2+ are geometry crunching monsters, tessellation included.
I got one at MSRP before the pandemic and the GPU price increases. I still use it, paired with a 5950X, to make live music visualizations with a program called Magic Music Visuals.
The price went up because FP64 became relevant in machine learning. Radeon VII is one of the higher FP64 compute cards available to the general consumer base.
The Radeon VII is literally their server accelerator slapped into a regular GPU form factor with video outputs.
@@tubaeseries5705 No it's not. It has 1/4 FP64 throughput (which is awesome for a consumer card tbh), whereas the server ones are 1/2.
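For anyone curious, that ratio is easy to sanity-check from the commonly quoted specs. This is just a back-of-the-envelope sketch assuming peak boost clocks, which real workloads won't sustain:

```python
# Rough peak-throughput estimate for the Radeon VII (Vega 20).
# Numbers below are the commonly quoted specs, not measured values.
shaders = 3840        # stream processors
boost_ghz = 1.8       # ~1800 MHz peak boost clock
fp32_tflops = shaders * 2 * boost_ghz / 1000  # 2 FLOPs per FMA per cycle
fp64_tflops = fp32_tflops / 4                 # consumer 1:4 FP64 rate
print(f"FP32 ~{fp32_tflops:.2f} TFLOPS, FP64 ~{fp64_tflops:.2f} TFLOPS")
# A server-style 1:2 rate (as on the MI50) would double the FP64 figure.
```

That works out to roughly 13.8 TFLOPS FP32 and 3.5 TFLOPS FP64, which is why the card kept showing up in compute builds.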
Double precision speed is not important in ML or inference. 16 and 8 bit types (both FP and INT) are the most frequently used, i.e. critical in that field.
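The per-element sizes are what make those narrow types attractive: the same memory bandwidth moves many more values. Purely as an illustration, the widths can be checked with Python's standard library (`struct`'s "e" format is IEEE 754 half precision):

```python
import struct

# Bytes per element for datatypes common in ML workloads
sizes = {
    "fp64": struct.calcsize("d"),  # 8 bytes: double precision, HPC territory
    "fp32": struct.calcsize("f"),  # 4 bytes: classic training precision
    "fp16": struct.calcsize("e"),  # 2 bytes: mixed-precision training/inference
    "int8": struct.calcsize("b"),  # 1 byte: quantized inference
}
# The same memory bandwidth moves 8x more int8 values than fp64 values
print(sizes, sizes["fp64"] // sizes["int8"])
```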
So sad to see these prices. A lot of them died off during the mining boom; combine that with low production and these are rare items.
But was it really a flop? They all sold after all.... My Vega 64 LC died last month (probably the dead capacitor issue many of them seem to be developing as they age like corked wine). It did everything I needed it to until then, which coincided well with Phantom Liberty, BG3 and Starfield coming out - didn't need anything better till now. Replaced it with a 7900XTX which leaves the old card in the dust, five years and 3 generations on.
Not watched the whole video yet, but I got an XFX Radeon VII a bit after it came out, before the GPU price hike, for $500. It was by far the best GPU I owned until recently. It blew away the 1080, 1080 Ti, 2080, and even the W6800 (a $2,500 workstation card) in my mixed gaming/game dev workloads. I still have it in one of my older builds and it still does really well for work tasks. Just need to get a newer CPU to pair with it.
Hello, I want to buy a Radeon VII, but everyone on the forums complains about getting a code 43 error that cannot be resolved. Can you recommend this GPU?
@@onurcanbiricik9291, It has been a couple months since I used the computer with it installed. I never saw an error like that though. Do they say it was in gaming or work loads?
Will say only get it now if it's a really good deal, because AMD announced they will no longer be supplying driver support for it and other Vega/Polaris GPUs. They will still work, just no longer get new features through driver updates.
This was my first desktop GPU back when I built my first PC in early 2019. I wanted a GPU with 1080 Ti-like performance, and I didn't like the direction Nvidia was going, charging $1200 for the 2080 Ti (and now we have $1600+ 4090s and intentionally gimped 4060s...). For the time, the Radeon VII was more than enough for me. I now have an RX 6950 XT; IMO the Radeon VII was an underrated GPU. I never cared about ray tracing.
Yep, no one is running ray tracing on 2080 class gpus anymore, even 3080 class is becoming obsolete.
Wow.... I got mine right when it came out and it was $100 less than MSRP. I got it for $600 and it came with three games... one of which was WWZ, which I still play. My kid is using the Radeon VII in his computer and it still runs great. It only performed about 10% less than my 2080 Ti when I undervolted and put a slight overclock on it. It is a really, really good card. I sold the 2080 Ti, but I will never get rid of this card. I love it.
Wow! $100 off MSRP is crazy low for such a beast! The Radeon VII will always be a special card: an excessive amount of HBM2 (now non-existent on new consumer cards; I loved HBM technology as well), the best card for mining up until the 3090 came out, juicy looking, great developing/editing capabilities, power hungry, and rare on top of it. Such a niche piece of hardware. All the components of an upcoming classic already; I'd hold on to it, like, forever. Love the story!
@@artunkansu4461 I still have the box for it too. I wish I would have bought a second one instead of a 2080Ti.
I bought one at release, and even though it was a silicon lottery winner in terms of voltage, it still needed liquid cooling. Either way, it was close enough to a 2080 that you couldn't tell the difference without the fps counter, came with 3 free games (good games too), and was a beast in DaVinci and ETH mining. Made well over $12k off that one GPU over the years. Still have it in my closet somewhere.
I would like to see the Radeon VII under Linux, to see if the drivers there make a difference.
Yeah, linux drivers will get optimized for older GPUs long after windows drivers only get optimized on newer GPUs. It was noticeable in some games on my old RX 480.
Those cards make good FP64 HPC cards; they outperform even the 4090 at double-precision math.
Yikes! A broken rib?! Hope you heal up quickly and don't have to deal with the discomfort for too long, Nick.
My fav card is the 290X; it was my first ever high-end GPU 🤩 That thing was very much held back by its reference cooler; the Sapphire Tri-X I got was just a way better product. Not sure if your round of benchmarks would be viable for a card that old, though.
I had an MSI 290X Lightning. It was pretty awesome.
I had three 290Xs in CrossFireX back in 2013. I loved that setup. People complained about CrossFire, but it worked with no microstutters on my rig, though it was really tuned. Gaming in 4K was new back then and I loved it. I really wish I could CrossFire my 7900 XTX. I have a 7800 XT here I want to try some mGPU benchmarks with; I heard it doesn't work, though.
This was an interesting GPU. It was already kinda behind when they released it. The 5700XT ran circles around it too upon its release.
The Radeon VII was faster, and is still faster, than the 5700; what are you talking about 😂
He is thinking of the R9 Fury cards, or the Radeon Pro VII workstation cards.
@@2000jalebi The R9 Fury/290X is really old; maybe you're confusing it with the Vega 64, which is Gen 1 Vega, while the Radeon VII is actually Gen 2 Vega architecture. Regardless, the only thing going for the 5700 XT was power efficiency; it was slower than the Radeon VII in performance.
I have a 5700 XT, a Radeon VII, and a Vega 64; the 5700 XT is a little quicker than the Vega 64, and the Radeon VII smashes them both.
He ain’t wrong about AMD aging well. The fact that people are still buying the rx580 can confirm this
Those people don't have a choice.
I have this card. It's compute focused, as the whole of Vega was. The architecture still exists as CDNA now, for… compute, as that's where it shines.
Yeah, when people call it really bad at 4K while it's doing 60 fps in a shooter, you know our standards for gameplay performance are through the roof. Side note: for reference, I only like to play Doom and Doom Eternal at 120 fps average or better; it just feels better. So Gear Seekers is clearly validated.
The reason the pricing is crazy is that it mines the Dynex coin more efficiently than any modern (consumer-grade) GPU.
The early Blue/Yellow color scheme Vega workstation cards felt like such a breath of fresh air in a market with nothing but Black/Green and Black/Red
I just wish COD had upgraded to FSR 2.2.1 😅
Edit: I had the Vega 64; it looked great, and with the "balanced" performance profile it didn't draw as much. But some games just acted weird. I played Overwatch actively and got several crashes and almost ended up with a ban... I had no choice but to switch to a GTX 1080 (Zotac AMP Extreme).
Lol, I was so happy gamer reviewers hated this card and didn't do much in the way of "practical" production reviews. It kept demand lower and the price at MSRP (until the crypto dorks understood this card). I was able to get three at launch, which totaled a little more than a single 2080 Ti. I used them mostly for programs/plugins that utilized GPU-based rendering. If you were not memory restricted, two VIIs were on par with a 2080 Ti using CUDA. When memory was an issue, the VII blew by the 2080 Ti. Not until OptiX integration started to mature did the VII start to lose production value, kinda. I must have been lucky with the drivers; I used them as eGPUs with a Mac and internally with a Threadripper system, and I never experienced the issues gaming or working with programs that so many reviewers said they had. I didn't like the noise they made, so I water cooled them, which allowed further tweaking and boosting of the cards' clocks.
Also, for gaming, using Smart Access Memory stabilized frame rates much better than I could with my 2080 Ti. It always seemed Nvidia cards could reach highs, but the drops were always so visibly annoying, which a lot of reviewers really didn't cover at the time.
Great video 😊
I noticed that you kept the original air cooler on it. Well, for all your high-res tests that practically guarantees the card hit its thermal throttling limits. I spent an additional 200 dollars on an AIO for this card, played a lot with airflow, and got the thermals to a level where that no longer happens.
This card has incredibly high FP64 (double precision) performance and shines in productivity: photo or video editing, compute, etc. The Radeon Pro VII version even has 2x that compute performance. There aren't many products capable of both gaming and productivity.
4K with FSR 2.1 on Balanced is quite good, but honestly, Radeon VII is bottlenecking even my Threadripper 1900x (Gen 1 Ryzen).
I wish companies would optimize for older cards. I feel like the horsepower is there. They don’t need to turn into ewaste just yet!
Nah, I don't agree it was a flop, let alone AMD's biggest flop. It was just the most fragile card ever made (code 43 = dead card) and too good at mining and productivity instead of gaming. Also, AMD didn't really make that many of these. If anything, the 5700 XT was a larger flop with its well-documented driver issues; same goes for the 6500 XT's and 7600's trainwreck launches. Anybody who bought one of these at MSRP and mined crypto with it made back what they paid, and probably more than enough to buy a 4090, or sold it for a nice profit during the mining craze. So if their cards didn't get the infamous code 43, Radeon VII owners were very happy with their purchase.
Banger of a video, fam. I like this type of video. Keep up the hard work.
The Radeon VII was my first GPU. I bought it used shortly after it came out back in 2019, and I love the card. I enabled SAM support and the card runs great.
Hello, I want to buy one RADEON VII, but everyone in the forums complains about getting code 43 error and this error cannot be resolved. Can you recommend this GPU?
Once it got to Cyberpunk I knew it would be game over 😭. I was expecting more ngl.
I was as well. It was an absolute trainwreck in Cyberpunk
@@GearSeekers After all the patches, Cyberpunk seems to dislike Vega for some reason. Probably poor GPU occupancy.
How did you get the RAM to run at that speed? I thought the 12900K had a max supported clock.
That's a card that needs an undervolt, or the junction/hotspot temp gets way too high. Great card in my opinion.
Lol at the resale values; someone forgot to tell these people GPU mining has utterly crashed.
Amen
16GB of HBM2 is still melting everything in 4K at a solid 60fps, water cooled of course.
When I use it with DaVinci Resolve for 6K rendering, only GPUs with a 5k-dollar price tag come close to the R7.
Very weird tests; with a 5950X in Warzone at 1440p I get a solid 120 fps.
And it's still working since 2019, so a big win for AMD. This card is a workhorse too; I edit 4K videos and photos with it.
Is that with HBCC enabled? How much, and at what system memory speed? These things are quite important for HBM2 GPUs.
Is Resizable BAR enabled and working? (You can check it in the GPU-Z Advanced tab.)
Vega cards don't support ReBar; AMD dropped driver support for that tech.
@@reiniermoreno1653 It does though, at least the hardware does; the MI25 is the same chip and PCB and has ReBar.
@@Gastell0 Yeah, I mean the reviewer was probably using official drivers, so he couldn't enable ReBar in that case.
It was a compute monster for its price, and I was surprised they released a consumer card with such high FP64 perf and memory bandwidth/size.
So I wouldn't really call it a flop. It's AMD's Titan (the OG Titan, I mean).
I got the 50th anniversary red version. Never any issues. It still works great. I still play new games in 4K. Just... This was released before Ray Tracing was a thing.
Ray tracing was a thing. It's called the 2080 Ti, buddy.
@@Bk_owns I stand corrected... I guess I just didn't care about it. I don't lean one way or the other. I just chose the Radeon VII.
How was it a flop? It was a limited production run and they all got sold. It did not under-perform in sales.
Even if I sort of wasted 300 euros on my Radeon VII, man... it looks way *too good*, and honestly it's a very efficient performer: at 1010mV most games eat only 180 watts, and buggy games that need an FPS cap only eat 70W 😂 But Cyberpunk is a tragedy! Wow. 1080p Horizon is also *brutal*.
It does feel noticeably held back by the cooler. If I used a Morpheus on it, I think I could easily do a stable daily 1950MHz core on my sample. 😂
Had an overclocked VII on water; it was near 2080 Ti performance in raster when heat wasn't an issue. Really underrated card. I mean, for how old it is, having 16 GB of VRAM is pretty incredible.
I sorta wasted money too. Got a new one in box. The card works great, and in games like Helldivers 2 at max settings at 1080p it gets around 50 FPS just fine. I also used the auto undervolt in the AMD Adrenalin software, so it pulls about 130 watts most of the time. I think my 11400 is slightly holding it back, though.
I bought one of these when it first came out. I still run it to this day. While I'm no longer a hardcore gamer it runs everything I want just fine. I should have sold it during the mining bubble. I see most of them have been used for mining, judging by the sheer number for sale as faulty. But mine, which has never been used for mining, carries on running just fine. I've not bothered undervolting or overclocking it. It's a lovely looking card. For some strange reason I am quite attached to this card and will probably just keep it in a cupboard when I do eventually feel the need to upgrade.
I love how the 3080 12GB casually wipes the floor with the 4070 at higher resolutions. What the hell were Nvidia smoking when they fitted that POS with a 192-bit memory bus?
Wish this thing were affordable on the used market. I'd love to just get one to add to my collection (and put it in my dad's 10-year-old Ubuntu desktop PC).
Take mine
I wish there were an Unreal and/or Blender project benchmark comparison for this card, considering the Radeon VII was a pseudo-workstation card.
Hi, can you test it again on a custom driver with SAM enabled, and maybe HAGS?
Can you test the Radeon VII with a 5800X3D??? Thanks.
All the tests are unfortunately invalid. The Radeon VII is a DX11 card and can see up to 30% additional performance in DX11 or Vulkan. Additionally, the card's performance climbs with resolution (the deltas close the closer you get to 4K, due to its high bandwidth and throughput). VRAM is more beneficial the higher the resolution climbs, obviously. And while not all games use 16 GB of VRAM, titles like RE4 show that games needing more than 12 GB of VRAM are really only serviceable by cards with 16 GB. Finally, the Radeon VII is a superstar in productivity, compute, and AI workloads. It's forgivable for it to be a good but not stellar gaming performer.
Still have my Radeon VII (bought it when it came out, at the same time as the 2080), and to this day no buyer's remorse. No matter what I throw at it, the GPU laughs and wants more. I use it for gaming at 1440p on my 34" ultrawide screen, and the pair is made for each other. I also built her into an ITX case (2019, I think).
Those who did not buy a Radeon VII really missed out (bought mine from MSI). No rush to really upgrade the GPU either.
Vega 20 is a workstation / compute die that AMD forced into the gaming role. It would be cool if you had tested non-gaming workloads (Davinci, Photoshop, Lightroom etc).
I got an RVII on launch day and it performed pretty well, but the drivers at the time were not the best and it was a pretty loud card under load. I made a good profit reselling it at the height of the mining boom, though. I'm still happy with my time with it. Would use again.
Hope you get better Nick! Also loved the gameplay.
Still expensive because it carries the following: compute per price, VRAM, hashrate history, little supply, first 7nm consumer GPU. No other GPU in the history of GPUs carries all those specific tags at once.
AMD has new frame gen software / drivers. Would you be able to test that some time please?
Great video! The GTX 1080 Ti should be next!
Is it wrong that I was actually expecting a Fiji card instead? In fairness, the R9 Nano was at least an early player in the tiny-GPU-build space, and as someone who appreciates super tiny systems, that was rather nice back then. It's something I still kind of miss from Radeon cards: they're still always rather high in power consumption nowadays compared to Nvidia's offerings, which is very ironic considering how vastly more efficient modern Ryzen is vs Intel, for example (they're good enough to trade blows with Apple Silicon, at least when it's been a while since Apple refreshed it).
Would love to see an update on FSR3 enabled games to see if it makes this card relevant again.
I wish I could use this cooler design on all my GPUs. I don't care about temps; I want my GPU to look this cool.
I wish they'd at least kept this aesthetic on their GPUs.
It's normal for 3840 stream processors at FP16 to be slower than 4096 at FP16, or their equivalent of 2048 at FP32. Simple math.
My friend replaced the original cooler on his Radeon VII with a Raijintek cooler with 2x 120mm fans. Temps went down 10-15°C, and the noise is also a bit lower.
Too bad the performance wasn't better because this was a great looking graphics card. "It's a train wreck going off a cliff", don't hold back Nick tell us what you really think. :p
A very pretty graphics card, but it fell behind in performance while lacking in features. Adding insult to injury, it was as expensive as an RTX 2080, and the AMD Copeon Terrible Group dropped support for it in favor of the more efficient RX 5700 XT within half a year. Since the Copeon VII, the AMD Copeon brand has been forever bound to 'falling behind in performance while falling short in features' compared with Nvidia GPUs. As for the 'ages like fine wine' saying: why would anyone buy an AMD Copeon GPU early instead of waiting for it to become 'fine wine'? People need to think, not cope.
You completely missed the point of this GPU. What makes it great is that you can use it for work and still play some games. I use it for work in SolidWorks and AutoCAD, then play the only game I like, COD, lol. At 1440p.
I bought this brand new from AMD when it came out, and I really wasn't disappointed in it lol. The look was very nice with my build and it played every game I played, so I was very happy with it. Mining killed these things left and right, so getting one used is almost guaranteed to be a bad deal for the buyer.
I used to have a Radeon VII along with my silver Vega 64. I mean, the performance already wasn't great back then, but man, it's still the best-looking GPU I've ever seen; this and my silver Vega 64 just hit different. Oh well, I shouldn't have sold this card when the crypto boom was happening.
A flop perhaps for gaming, but by GOD were these things sought after for mining! I mean every card was, but this card in particular was amazing.
Best value double precision math on the planet.
"Well.." - Miners.
It's a joke; the architecture is a huge flop. It failed to beat the Titan V/RTX. I mean, why else would AMD have named the card differently from the RX 4/500 series?
It was pushed to the market as a mining card, and that's it.
My Vega 64 LC was getting 30fps at 4K in CP2077, so dunno what was happening there.
That card is still a beast in terms of crypto mining. Maybe that's why; the hashrate on that GPU benefits from the HBM2.
The entirety of Vega was a disappointment, and it was Raja Koduri's responsibility. Intel made a mistake by hiring him; he's more of a liability than an asset. It was declared End of Life within months, what an appalling GPU. AMD without Koduri has been nothing but smooth, competitive sailing.
Please put the 6800 into the comparison list as well.
No offense, but I'm so amazed by just how many YouTubers out there are ignoring the 6800...
There are only a few of you guys who actually remembered the 6800.
I never had one, sorry.
@@GearSeekers Yeah, just some advice for your next videos. It's a good video as always.
Worked great for me from launch to earlier this year. Performed excellently in a hackintosh.
I think the Hackintosh community is keeping the price high.
Biggest flop? I think you mistook the "VII" on the Radeon VII for the HD 7990, which was a flop from the day it was announced.
What about this is 2023?
It would have been cool to throw a 2080 and its Ti version and a 1080 Ti into the mix, and also do Premiere and Blender testing.
Unfortunately I don't have any 2080s or 1080 Tis anymore
I have one of these, and I'd even like to sell it. Why doesn't it work with Adobe programs on my Hackintosh?
Great video, guys. Do more throwback tech, please!
When Scythe Fuma 3 review?
Biggest flop? Nah, I'd give that title to HD 2900 XT.
How about the Nvidia Titan V?
"Vaporware"! That is a new term to me, lol! 😅 I live under a rock and Patrick Star is my roommate! 😄
Funny how people thought this having 16GB of HBM2 was overkill, and well, considering the performance, maybe it is 😅
Still a good GPU for workstation use!
For a card so old, it's still pushing out decent frame rates with some tweaks.
WTF is going on out there? Here the Radeon VII is around 110,000-120,000 HUF, which is around $300 US.
It was like a 2080, but never like a 2080 Ti or Titan RTX like people thought.
It's more of a collector's item at that price.
Ouch... anyone else notice that Intel's A770 is getting whacked to the bottom of the stack? That's Intel's current best... they really need to get Battlemage out already.
If they had waited and released the Radeon VII as an RDNA card, it would have torn the 2080 Ti a new one lol.
Bro, you buggin', your 12900K is outdated and it's bottlenecking your test... JK, this GPU is brutal!
Sorry, I wasn't aware that a 2-year-old top-end CPU paired with a 4.5-year-old GPU would be bottlenecked 🤣
Max the HBM memory clock and keep running it until the crashes stop.
I was told this was a productivity card, because of the VRAM.