Hi guys! I misspoke here at 5:42 - I meant Radeon VII
🔥🔥🔥THE BEST VIDEO CARDS FOR 2024🔥🔥🔥
✅Budget graphics cards:
RTX 4060 8GB amzn.to/4dUE5mH
RX 7600 8GB amzn.to/4hcspyG
RX 7600 XT 16GB amzn.to/40fdy0m
RTX 4060 Ti 8GB amzn.to/48d0rPf
RTX 4060 Ti 16GB amzn.to/4098YRl
✅Mid-range graphics cards:
RX 7700 XT 12GB amzn.to/3C2HD9e
RTX 4070 12GB amzn.to/4eOw5VG
RX 7800 XT 16GB amzn.to/4fg0ZX1
RTX 4070 SUPER 12GB amzn.to/4dTH3YL
RTX 4070 Ti 12GB amzn.to/48hdxek
✅Top graphics cards:
RX 7900 GRE 16GB amzn.to/3YiLxSW
RTX 4070 Ti SUPER 16GB amzn.to/40dpvnp
RX 7900 XT 20GB amzn.to/3YACn4D
When they announced X3D, many said it would fail like HBM. AMD is the only one brave enough to do something like this.
Well, the last time Intel tried doing something unique with memory/storage, it ended up too expensive and was discontinued...
Also, HBM still exists; it's on its 3rd iteration and is being used in AI products due to its massive bandwidth compared to GDDR6 and 7.
@ Yes, it still exists. I'm using an RX Vega GPU with HBM2.
AMD has never been afraid to experiment!)
@@livegamesii What's the )?
It's not a failure, it's just too expensive for consumers.
If I had a nickel for every time AMD made and released a technology where they'd stack memory on their products, I'd have 2 nickels. Which isn't a lot, but it's funny that it happened twice.
They probably took the knowledge gained from developing HBM and baked it into the 3D V-Cache technology.
I absolutely love those generations of AMD and Nvidia cards.
I built my very first gaming PC in winter 2016 with my best friend,
and I bought a Radeon R9 390X Nitro+ from Sapphire, which looked like a spaceship to us.
At the time I had the best GPU in my whole friend circle (when $300 was considered a TON of money for a graphics card),
and that thing was crammed into my super cheap, tiny Sharkoon case :D
And I would definitely love to own a Fury X as well. The integrated water cooler is definitely a good thing,
and I always liked how tiny they made the card itself.
I can just imagine the heat coming out of a case with 2x Fury X cards running in CrossFire!
And with Moore's law no longer holding up, I could totally see CrossFire and SLI making a comeback,
especially since we have NVLink, which is WAY faster, works better, and can effectively pool the VRAM of both cards.
Maybe they will bring out cards that have 2 GPUs on them connected over NVLink, kind of like the epic 295X2 and other cards did back then.
I had a 390X build, a black and red MSI Gaming X with an i5 4690K.
I had the same card with an FX-8350. Blew my mind back then.
@@BatmanandRobinVsLarryHoover I was also an AMD boy. Mostly for money-saving purposes I took the FX 6300 and OC'd it to a healthy 4.7 GHz. Together with the OC'd 390X Nitro I felt like I had a nuclear reactor in my room haha.
But games played so well despite the CPU definitely being the weak spot.
thanks for the interesting story! In 2015-2016 I had an R9 270X, which ran all games at medium and sometimes high settings coupled with the FX 6300 processor. And I also dreamed of an R9 Fury X then)) And now, after so many years, it’s in my hands!
As for the future development of video cards, we will probably see a multi-chip layout like Ryzen processors in the future🤔
@max.racing I managed to overclock my FX 6300 to 4.6 GHz, which also greatly accelerated the system😅
If only they had gone to 8 GB of HBM1.
I'd argue even if it was 6 gigs, it could have done very well
And the price would be lower than Nvidia))
I've got the liquid-cooled R9 290X. And I'm really happy 😊
Yes, a great card for its age, which runs many modern games 😃
To be fair, memory compression is a thing, and the HBM card performed better capacity-wise than other 4 GB cards, coming close to the 6 GB on the 980 Ti.
Yes, this also played a significant role in performance.
It's amazing that they actually made these and the subsequent Vega cards as well.
AMD was struggling financially at that time and this was a big risk for them.
These (and Vega) sadly had more downsides than positives. They were expensive at the time, drew a lot of power, the air cooled models ran hot (especially the VRMs). Reliability in the longer term was also an issue. AMD's drivers were not as good back then as they are today.
Vega was a lot better. Still had issues, but they were competitive with Nvidia for the most part, and endlessly tunable.
Yes, I totally agree with you!
Skipped the Fury; went from a 6970 to a 290, 390, 390X, then a Vega 56 (hated this card), then a 1080, 2080 Ti, 1080 Ti (used, very cheap, 50 USD), sold them all and got a 7900 XTX and an A770.
Why didn't you like Vega 56?
@@ut2k4wikichici I'm sorry to hear your V56 experience was so bad. I've kept my V56 because it hits above its class and its resale value isn't enough to make me want to sell it.
I actually got myself a perfectly working Vega 56 on the side as a backup card, but also as a reminder that AMD had the balls to manufacture an HBM card.
Underrated channel. Good work.
Thank you very much! Subscribe to our channel! :)))
Fury, loved it. It still has a special spot in my hardware heart.
There are too many tests; they could be presented in a more condensed form. Otherwise, I like this kind of approach to using older hardware with the right games.
HBM2 is used in the server segment. Higher bandwidth does not necessarily mean higher TFLOPS, but it has massive throughput that multiple AI machines benefit from. In the gaming segment, you need fast memory with low latency, and GDDR has quite a bit lower latency than HBM. The second problem is the GPU chip itself: it is not fast enough to process over 500 GB/s of data. If you paired HBM2 memory with an RTX 4090, the card would be maybe 25% faster but almost twice as expensive.
Yes, that's exactly it!
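For a rough sense of why HBM's wide bus matters, here's a minimal back-of-the-envelope sketch (peak bandwidth = bus width × per-pin data rate; the figures used are the publicly listed specs of the Fury X's HBM1 and the GTX 980 Ti's GDDR5, purely for illustration):

```python
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

# R9 Fury X: HBM1, 4096-bit bus at a modest 1 Gbps per pin
print(bandwidth_gbs(4096, 1.0))  # 512.0 GB/s
# GTX 980 Ti: GDDR5, 384-bit bus at a much faster 7 Gbps per pin
print(bandwidth_gbs(384, 7.0))   # 336.0 GB/s
```

HBM wins on raw throughput by going wide and slow, which is also why its latency characteristics differ from high-clocked GDDR.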
I loved this card so much because it was a different approach. I still want one.
I almost went and got a non-X several years ago, but that 4 GB of VRAM made me forget about it.
Nice video! I enjoyed the little history lesson on the GPU; very good card for its time :)
Thank you very much, I'm glad you liked it! Subscribe to the channel, there will be a lot of interesting videos🙂
I had 2 laptops before my PC. They had a 950M and a 1050 respectively, and they were both awful, but the upgrade to the 7900 XTX was much needed :)
A video card ahead of its time! The review is very interesting, I love watching things like this 😃👍
Thank you, glad you liked it!
I came just to see benchmarks but damn the history of the gpu was great to see.
That Fury X load temp is lower than my RX 570's idle temp 😂
17:07: NFS Unbound*, not Heat
I own both a Fury X and an R9 Nano. These cards really punch above their weight tbh; not even a 1.55 GHz RX 590 can touch a Fury X in Time Spy.
My guest PC has a Vega 56; it works excellently for playing games.
I remember this one; I was dreaming of buying it one day, but the best I could afford was a GTX 960 STRIX.
what drivers did you use?
I used 2 drivers: 1. The latest official 2. The latest from NimeZ (in new games: Alan Wake, Horizon FW)
@@livegamesii Thanks! I get massive issues in the new cod warzone, could you maybe try to test that next?
@@livegamesii Do you happen to have a link to the nimeZ drivers?
Ha, I have two of these in my closet from an old CrossFire system I built and barely used. I still can't believe I spent over $1500 on them, and I can't even give them away today.
And now they are almost worthless 🥲
@@Al-USMC-RET send them to me! I do micro soldering and I would love to have some HBM tiles to use for... Science. Each chip has 1024-bit bus. 128GB/s bandwidth. Fun stuff to use for projects. Especially AI. Huge inference gain because of the bandwidth.
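Those per-stack numbers line up with first-gen HBM's published layout: each stack exposes a 1024-bit interface at roughly 1 Gbps per pin, and the Fury X places four of them on the interposer. A quick sketch of the arithmetic, using the same illustrative formula as above:

```python
# One HBM1 stack: 1024-bit interface at ~1 Gbps per pin
stack_gbs = 1024 * 1.0 / 8       # 128.0 GB/s per stack
# Four stacks on the interposer give the Fury X its 4096-bit bus
print(stack_gbs, 4 * stack_gbs)  # 128.0 512.0
```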
Such a unique card ngl
I have the Fury X myself, it's a nice card, but I have to say the coolest one was the Nano, it would have been an awesome flex if it was able to beat Maxwell.
Yes, these are great graphics cards. But in my opinion, the reason they didn't find demand was the high price. If they had cost $100 less, they would have become much more popular.
4:38 And 9 years later, we have the RTX 4070 Super, which has even less memory bandwidth than the Fury X.
I have both and I'll take the 4070 Super 😂. It's kind of the argument of: I have this old V8 truck, but now I can only get a V6 twin turbo that's faster and more efficient. The Fury sure can heat your house in the winter though.
Go get a Fury X then. What are you waiting for? Go on, buy.
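The bandwidth comparison in this thread checks out on paper, for what it's worth; using the same formula with the 4070 Super's rated 192-bit, 21 Gbps GDDR6X:

```python
# RTX 4070 Super: 192-bit GDDR6X at 21 Gbps per pin
print(192 * 21 / 8)  # 504.0 GB/s -- just under the 2015 Fury X's 512 GB/s
```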
The biggest problem with old high-end cards is the memory capacity. But the new generation is better, of course: lower TDP, higher performance.
If it weren't for the cursed miners, gaming would be cheap.
It seems that all the games from the BF series here are pirated, because Russian language support has been added, and as we know, the cracked versions of these games were mostly Russian-language.
No, not all. Only BF 4. The rest I bought from EA App.
No overheating for this card. The temp doesn't go higher than 60 degrees 😯
Time proved that R9 390X was a better buy.
But this was clear from the start :)
Imagine current AMD GPUs with HBM3E; it can do over 1 TB/s.
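That figure matches HBM3E's published per-stack numbers; with the same 1024-bit stack interface and pin speeds up to around 9.6 Gbps (the top rate in current vendor datasheets, taken here as an assumption), a single stack alone clears 1 TB/s:

```python
# One HBM3E stack: 1024-bit interface at up to ~9.6 Gbps per pin
print(1024 * 9.6 / 8)  # 1228.8 GB/s -- over 1.2 TB/s from a single stack
```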
I dreamed of one like this... but I only had enough money for a GTX 750 Ti 😁
When it came out, the 750 Ti was a great card for its price👍
Remember when the RX 570 was $99.99 bundled with 3 games worth over $100? Will we ever see something like that again?
@@RoerDaniel i wish
@RoerDaniel Was there really such a deal? 😅
@@livegamesii I bought the card just for the games; I had a GTX 970 at the time. And let me tell you, that RX 570 aged so well, and that price was just months away from the mining nightmare 😔
The Radeon 890M performs basically like this.
Maybe 🤔 It needs to be checked
Every time I see these videos I get reminded of people complaining their top-of-the-line 2024 GPU can't play at 4K 155 fps on ultra graphics with ray tracing and no DLSS or frame gen... and then I look at these cards struggling to get to 60 fps at 900p-1080p.
Yes, that's true. But for some reason, I personally find it more interesting to watch old video cards trying to run games than to watch new ones, where it's already clear they will run all games on high-ultra settings.
That isn't a Radeon 8. It's VII. 5+1+1=7.
Yes, I misspoke. And the script said “VII” 😅
what game is at 4:12? also great video
Cyberpunk 2077. Thank you! Subscribe, there will be many interesting videos about video cards and processors!)
The biggest problem AMD encountered was indeed game developers' political relationship with NVIDIA. This is more apparent than ever if you compare games like Skyrim and Fallout 4 favoring NVIDIA back then with Starfield favoring AMD now.
Engineers lacking the resources to optimize drivers for the generations before RDNA1 led to many AMD GPU owners giving up their cards in favor of NVIDIA, because they couldn't play their favorite games on day 1, and some had to wait months before things got fixed. Square Enix's past ties with NVIDIA, as in FF12, FF13, FF14, and FF15, are a good example.
Now, if AMD hadn't won the PS4, PS5, and Xbox One/Series consoles as their exclusive SoC manufacturer, and hadn't now won over the Linux community, things would not favor AMD today, and it would have remained a niche alternative to NVIDIA.
And yes - I have owned GPUs from both AMD as well as NVIDIA for 3 decades. From Riva TNT2, GeForce 2 MX400, 3DFX Voodoo4 4500, GeForce 9400 GT, ATi HD4850, GT630, GTS 450, RX 460, GTX 1660 Ti, RX 6700 XT.
Yes, you are right, I agree with you!
Wow, you had a lot of video cards😮 By the way, subscribe, there will be a lot of interesting videos about video cards and processors!
Subbed :)
Radeon 8... is Radeon 7
Yes, I misspoke.. But it was written correctly in the script 😅
Sir, did we talk to each other on Discord?
it's unlikely 🤔
What's the difference between the Asus ProArt RTX 4080 Super and the Zotac 4080 Super? Why is the ProArt more expensive than the Zotac? Plz help
The ProArt uses better power delivery than the Zotac and can be pushed harder and faster.
They differ in the cooling system and the power delivery components. They have the same graphics processor, and in use you will not notice the difference between them. The Asus may be quieter under load.
What, Crysis 3 is optimized af
Buying a new GPU these days feels like buying a faster car without fixing the roads first.
Who can enjoy a Ferrari when it's driving on an UGLY ENGINE 5 road?
A flagship in name only; its performance was marginally better than the cheaper air-cooled version, which was considered a step down. And definitely not as good as what Nvidia had out at the time.
Don't get me wrong, I wish it had been better.
Yes, it is. But this AMD graphics card was almost the same in performance as the GTX 980 Ti; the Fury X lagged only slightly behind it.
music is annoying.