Why Would They Make This? 16GB RX 580 From Aliexpress...

  • Published: 19 Jan 2025

Comments • 1.2K

  • @PhoticSneezeOne
    @PhoticSneezeOne 1 year ago +1505

    All jokes aside: it is mind-blowing that a card from 2016 (yeah, that's right) performs like that in 2023. Polaris was/is AMD's masterpiece.

    • @anita.b
      @anita.b 1 year ago +236

      AMD does that a lot, it's not a unique case of amazing performance down the road.
      Three words.
      7970 Ghz Edition.

    • @fgfg655
      @fgfg655 1 year ago +111

      So do the GeForce 10 series GPUs.

    • @jonathank5841
      @jonathank5841 1 year ago +15

      Hey guys, sorry for interrupting, but I'd like to ask if the graphics card you mention would be a good card for beginners? I'd love to play stuff like Hell Let Loose; even on medium that would be awesome. I'm looking for a graphics card that won't break the bank.

    • @leopoldbuttersstotch6060
      @leopoldbuttersstotch6060 1 year ago +25

      I was always kinda confused why people love the rx 580 but shit on the 1060 6gb, in my market they’re the same price, and the 580 uses a lot more power while not being much more powerful (just 2gb of extra vram, but let’s be honest, 6gb is completely enough for the 1060 class of gpu)

    • @wrth
      @wrth 1 year ago

      @@jonathank5841 depends on how much is in your bank, where you live, and whether you're willing to go second-hand or not

  • @Psychx_
    @Psychx_ 1 year ago +921

    The RX580 2048SP is called RX 570 elsewhere in the world.

    • @mijingles8864
      @mijingles8864 1 year ago +42

      The normal 2048SP has 8GB, double the amount of the normal 570.

    • @p0intblank597
      @p0intblank597 1 year ago

      @@mijingles8864 there is an 8GB variant of the RX 570

    • @alsy342
      @alsy342 1 year ago

      @@mijingles8864 some of the RX 570s actually have 8 GB of VRAM

    • @GewelReal
      @GewelReal 1 year ago +45

      no, it is a 2048SP
      570 is a different thing

    • @fuxseb
      @fuxseb 1 year ago +61

      We shall call it 570X then. Or even better, 570 Ti.

  • @Psychx_
    @Psychx_ 1 year ago +987

    The 16 GB version only has 192GB/s of memory bandwidth. If you could push that a little higher (a normal 580 has 256GB/s), it would net you some significant additional performance. Polaris10 allows for adjusting memory timings and BIOS mods, btw.

    • @donciutino7490
      @donciutino7490 1 year ago +31

      polaris is dead bc of new DirectX

    • @upfront2375
      @upfront2375 1 year ago +5

      @@donciutino7490 What is it? the 12.2?

    • @vitinhx794
      @vitinhx794 1 year ago +5

      @@donciutino7490 is it 12.1?

    • @Tetraverse
      @Tetraverse 1 year ago +83

      For some reason they used 6Gbps memory modules instead of the 8Gbps modules that the original RX 580 had. This probably made the performance significantly worse.

    • @xKrikket
      @xKrikket 1 year ago +36

      The memory also runs 500 MHz slower. Everything is nerfed on this card.
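The bandwidth figures quoted in this thread follow directly from bus width times per-pin data rate. A quick sanity check (assuming the stock RX 580's 256-bit GDDR5 bus, and the 6 Gbps vs 8 Gbps module speeds mentioned above):

```python
# bandwidth (GB/s) = bus width (bits) / 8 bits-per-byte * effective data rate (Gbps per pin)
def mem_bandwidth_gbs(bus_width_bits: int, rate_gbps: float) -> float:
    return bus_width_bits / 8 * rate_gbps

# Normal RX 580: 8 Gbps modules on a 256-bit bus
print(mem_bandwidth_gbs(256, 8.0))  # 256.0 GB/s
# This 16 GB card: 6 Gbps modules on the same bus width
print(mem_bandwidth_gbs(256, 6.0))  # 192.0 GB/s
```

That lines up with the 192 GB/s vs 256 GB/s numbers in the comments: the deficit comes entirely from the slower modules, not a narrower bus.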

  • @MenkoDany
    @MenkoDany 1 year ago +295

    The 2048SP thing got me intrigued. Turns out, it's literally just the RX 570 with a small 40MHz factory overclock on the max boost. The RX 570 is the same die as the 580 and the same architecture design as the 590 (which was a die shrink)

    • @MenkoDany
      @MenkoDany 1 year ago +20

      I wish Dawid tested ML stuff; theoretically (if you ignore AMD shenanigans) this GPU could be amazing bang for buck for stable diffusion/LLMs. I mean, if tinygrad succeeds :D

    • @b127_1
      @b127_1 1 year ago +35

      RX 590 is actually a different die. It is made on 12nm instead of 14nm.

    • @pfizerpricehike9747
      @pfizerpricehike9747 1 year ago +8

      @@MenkoDany only if you wanna tinker. gfx803 hasn't been officially supported by ROCm for a while. Thanks to AMD's drivers, ML speed on Windows is blatantly bad. On Linux you'll have to find a way to get ROCm going for these specific cards; then you could actually get halfway decent performance for the price, but it's not really worth the hassle if you can just buy a used 3060 12GB or a more recent 12GB Radeon card for not much more

    • @MenkoDany
      @MenkoDany 1 year ago

      @@pfizerpricehike9747 Trust me, I know. During the early LLaMA days I desperately wanted more ram (4090 is not enough), I was this --->

    • @vinylSummer
      @vinylSummer 1 year ago

      @@pfizerpricehike9747 idk about pricing in your region, but in Russia that 580 16GB costs half the price that the cheapest used 3060s usually go for, while providing 30% more VRAM. For applications that need VRAM that badly, the card seems like a good deal

  • @masiosareanivdelarev562
    @masiosareanivdelarev562 1 year ago +39

    I bought my RX 580 8GB between 2017 and 2018 and it was great for 1080p gaming. I upgraded it a month ago to a Radeon RX 6650 XT. I still recommend the RX 580; if you are on a budget, you can still game with that GPU.

    • @brandonfulstone7628
      @brandonfulstone7628 1 year ago +4

      I have an RX 580 8gb and it runs at least some modern games decently well

    • @itpugil
      @itpugil 9 months ago +2

      I upgraded from an RX 470 just last week to a 7800XT. That thing can still game, especially with FSR enabled. That feature is such a lifesaver!

    • @ItsNeverMe
      @ItsNeverMe 9 months ago +1

      Yes, I bought my RX 580 for 45€ a year ago. I only upgraded to a GTX 1080 two months ago because it was in a PC I got for 20€ used (well, it was free, but I needed a 20€ WiFi card).
      It has an i7-6850K on an MSI X99A Gaming Pro Carbon in an old Lian Li PC-Z70.

    • @itpugil
      @itpugil 9 months ago +1

      I also forgot to mention the 7800XT was sold to me for.....wait for it.....89 USD. Yes, 89 bucks. A friend I build PCs for gave me a free 1000W ASUS ROG Strix PSU and tried to sell his 7800XT to me for 409 USD, hoping that I'd bite. I told him I was using a Seasonic M12II 525W PSU, so I was limited on upgrades, and I was considering the 6650XT since it's good on a 500W PSU. When I declined, he just sold it to me at 89 USD. I've been gaming since then, replayed RE 4 Remake, and everything is at max settings on my 1080p 27" 144Hz curved monitor.

    • @ItsNeverMe
      @ItsNeverMe 9 months ago +1

      @@itpugil Holy $hit, that's cheap!

  • @tricky_hy2892
    @tricky_hy2892 1 year ago +781

    I clicked faster than linus being exposed by gamer nexus

    • @bennoboy97
      @bennoboy97 1 year ago +26

      Wow this is an original comment

    • @Crusader1089
      @Crusader1089 1 year ago +53

      So you clicked after several months of investigation?

    • @JBrinx18
      @JBrinx18 1 year ago +8

      Faster than 44.25 minutes

    • @OppositePengui
      @OppositePengui 1 year ago +1

      Doesn’t seem like they’ve properly set a response on their RUclips yet

    • @Hexanilix
      @Hexanilix 1 year ago +1

      NAWWW 💀💀💀

  • @Sundayinthelife
    @Sundayinthelife 1 year ago +32

    My guess is that the 16gb may help on productivity tasks? Very fun video. I like that when you crank things to unusable, the 16gb is technically 3x better

  • @zloboslav_
    @zloboslav_ 1 year ago +92

    This memory could be very useful for loading larger AI models for the same reasons that the 12GB 3060 is the budget secret weapon for AI enthusiasts.

    • @thorgraum1462
      @thorgraum1462 1 year ago +2

      preach

    • @JohnDoe-cv8iw
      @JohnDoe-cv8iw 1 year ago +1

      yup!! i just learned about the 3060 myself.. just picked up 3 for my AI rig.. really not bad performance for the price!!!

    • @classictechhd2932
      @classictechhd2932 11 months ago

      @GreedIsBad slow, but no memory limitation at a cheap price. Maybe we gotta wait for Stable Diffusion on CPUs; Ryzen 7000 supports bf16 but not fp16

    • @w04h
      @w04h 9 months ago +4

      You are not running any AI on AMD hardware, and not because of tensor cores, but because of awful software support and a lack of driver updates. ROCm is no longer updated for these cards.

    • @Lynnfield3440
      @Lynnfield3440 22 days ago

      Would a Maxwell titan be usable for that? It has 12GB as well and is available for around 100 euro.

  • @czbrat
    @czbrat 1 year ago +113

    Rather than doing the ultra preset or 4K, you could have increased only the texture setting. It wouldn't have hurt performance and would still have used most of the memory the game could use anyway.

    • @12Rosen
      @12Rosen 1 year ago +2

      Increasing textures reduces performance too.

    • @alexanderjohansson2671
      @alexanderjohansson2671 1 year ago +35

      @@12Rosen Not as much as everything on ultra though.

    • @gorky_vk
      @gorky_vk 1 year ago +3

      @@12Rosen only when bandwidth is quite low

    • @TheNightquaker
      @TheNightquaker 1 year ago +9

      @@12Rosen Not nearly as much as increasing everything to Ultra. Especially not with 580's memory bandwidth, which can handle high res textures just fine.

    • @MJ-uk6lu
      @MJ-uk6lu 1 year ago +9

      @@12Rosen No, it doesn't. It was tested so many times. Anisotropic filtering doesn't have any performance cost either. This has been true since like 2002 or 2003.

  • @TonyHarlan
    @TonyHarlan 1 year ago +15

    Why your channel and videos haven't popped up and been recommended for me in so long is infuriating. This is literally the first video of yours I've ever seen, and I have no idea why. You are freaking hilarious, and I enjoyed this video so much. Already liked and subscribed.
    I'm still running an RX 580. Told myself I'd replace it when it wasn't good enough, and it's still doing everything I need it to do.

    • @balsalmalberto8086
      @balsalmalberto8086 9 months ago

      Same, got one inside a computer I built back in 2013; the CPU is bottlenecking it. I rarely game on it as I have a better PC for that.

    • @samholdsworth420
      @samholdsworth420 8 months ago +1

      Davvid is 😎 🫘

    • @MoultrieGeek
      @MoultrieGeek 8 months ago

      Same here. I watch a ton of tech videos and never saw Dawid in my recommendations. I found him through Toasty Bros and immediately subbed.

  • @stephanieamare
    @stephanieamare 8 months ago +4

    The first concerning thing for me, at the 2:15 mark, is the mention of PhysX. That's always been an Nvidia thing; I don't even remember the last game to have that "PhysX" branding. But this 16GB version might be useful in the same way the 3060 12GB was/is. Not the best for gaming, but a lot of video/animation programs take up a lot of VRAM.

  • @HR-wd6cw
    @HR-wd6cw 1 year ago +46

    I still like how some graphics cards, like the 3D Wildcat back in the day (which was a CAD-oriented card), had upgradable RAM slots on the card. Then again, these cards were also not cheap, as I think they ran a few thousand dollars each at the time. That being said, I think we are venturing into rather absurd territory where graphics cards have more RAM than some systems. I used to joke (back when 64GB iPods were around) that it was sad if your iPod had more storage space than your computer.

    • @MJ-uk6lu
      @MJ-uk6lu 1 year ago

      Unfortunately, the Wildcat was a severely flawed card in many ways.

  • @Daygunjim
    @Daygunjim 1 year ago +6

    Hey Dawid, I also have one of these scalpel RX 580s. For overclocking with MSI Afterburner, it's all about the power limit settings. They are running at like 50% power capacity, so slide that bad boy up and it will hold the frequencies.

  • @justbubba4373
    @justbubba4373 1 year ago +37

    I feel like it's not even for gaming at that point. Probably for mining or machine learning that needs "just VRAM"

    • @IdentifiantE.S
      @IdentifiantE.S 1 year ago +1

      It sure is!

    • @gorky_vk
      @gorky_vk 1 year ago

      Memory is cheap now (who knows if they even use new chips), and this is a way to stand out in a market saturated with millions of other 580 cards.
      But it's pointless, just like the 4GB GT 710 cards.

  • @hopoff9968
    @hopoff9968 1 year ago +62

    The 580 was my first gpu, it sure was a huge upgrade from integrated graphics😅

    • @jaskajokunen3716
      @jaskajokunen3716 1 year ago

      @@wildcat002 I got a 4K monitor and still use my 1080 Ti for it. Some games still run fine at 4K with it.

    • @dorayaki5494
      @dorayaki5494 1 year ago +4

      @hawky2k215 Performance-wise, yes. Power-draw-wise, not really. The 580 drew a whopping 100W less than the 390X. If you used it for around 4-5 years at 4h/day, you saved around $100 on the electricity bill alone (depending on the kWh price). It also probably reduced the load on the CPU... If you bought your 580 8GB at the time for $200, you effectively halved its cost. No money wasted there if you ask me, since it's still by today's standards a very solid card and can be resold much more easily, at a higher price, than a 390X.

    • @balsalmalberto8086
      @balsalmalberto8086 9 months ago +1

      My second GPU. Bought it as an upgrade to a GTX 550 Ti, right before GPU prices went insane in 2020.

  • @MRJN23
    @MRJN23 1 year ago +158

    Dawid never disappoints

    • @Xenoray1
      @Xenoray1 1 year ago +3

      dawiddoesneverdissapoint

    • @TheLaurentDupuis
      @TheLaurentDupuis 1 year ago +4

      Except with EPYC CPU.

    • @bregowine
      @bregowine 1 year ago

      God bless him

    • @targz__
      @targz__ 1 year ago +1

      he's using AI generated images. that's very disappointing

    • @IdentifiantE.S
      @IdentifiantE.S 1 year ago

      @@targz__ It's true, unfortunately 🥲

  • @wizardothefool
    @wizardothefool 1 year ago +5

    Now imagine 2 of them, in Crossfire

  • @1337Ox
    @1337Ox 1 year ago +43

    Yeah, when I heard about this version of the card, I just asked myself "why? And WHY the cut-down version?". 8GB is really the sweet spot for the RX 580; more but slower memory will not help you in any case. The RX 580 is such a good card: released in 2017, still relevant for 1080p gaming in 2023, and you can undervolt most of the cards so they are super efficient for this architecture.

    • @BronzedTube
      @BronzedTube 1 year ago +7

      I just finally upgraded from the RX 580; the thing was a champ. It still ran everything I ever threw at it well enough.
      Jumped up to the 7600. This new card runs so cold for me, I dunno how I am going to heat my office this winter.

    • @jameslake7775
      @jameslake7775 1 year ago +10

      My guess would be Ethereum leftovers. Mining ETH required a lot of RAM but wasn't very intense on the core. My understanding was that it did benefit significantly from faster memory, but maybe they were chasing some type of power savings, or maybe this was just what was available, cheap, and worked.

    • @degnartsE
      @degnartsE 1 year ago

      @@BronzedTube it's kinda crazy, I have a 580 and I was planning to upgrade to a 7600 too 💀💀

    • @rustler08
      @rustler08 1 year ago +3

      @@BronzedTube Why, though? You could have literally spent like $40 more for a brand new, on sale 6700 XT or 6750 XT that would have crushed the 7600. Or, even a used, under warranty 3070.

    • @rustler08
      @rustler08 1 year ago +1

      @@degnartsE Please don't. Just spend the small amount of extra money on a 6750XT. Skip eating out like once or twice, you'll have a vastly superior card.

  • @helljester8097
    @helljester8097 1 year ago +7

    I love how, if you stop to look closely at the AI-generated images at the beginning, they are pretty much 90% squiggly lines that don't mean anything, like my drawings of a "mad scientist laboratory" when I was 4, except all colored in and shaded really well.

    • @raven4k998
      @raven4k998 1 year ago +1

      you know, if the circumcised 580 can do that well with 16GB of VRAM, imagine what the regular version would do with the same memory 🤔🤔

  • @Eltezznn
    @Eltezznn 1 year ago +7

    It's a very weird variant. I think it's made for ML (Stable Diffusion, Vicuna-13B...) BUT, as far as I know, these cards are compatible only with a very outdated version of ROCm (AMD's crippled answer to CUDA), so performance will be pretty low these days.
    We need someone to test this card with ROCm to see if "it does something", though.

  • @xDJaack
    @xDJaack 1 year ago +5

    It doesn't matter what the video is about, it could be an intricate look into a cable and you manage to make it entertaining! Well done my guy!

  • @PEKKABEAST
    @PEKKABEAST 1 year ago +3

    And Nvidia is still going to try to sell us 8GB cards.

    • @Safetytrousers
      @Safetytrousers 1 year ago

      This video was showing 16GB was pointless on this model of GPU.

  • @BruceM-i3o
    @BruceM-i3o 1 year ago +4

    I absolutely love Dawid’s Drama Free content. Great video!

  • @stevetheborg
    @stevetheborg 1 year ago +1

    1410MHz, undervolted by 20, with a 16 percent higher power budget, runs stable.

  • @BeardedGinger
    @BeardedGinger 1 year ago +3

    The RX 580 8GB was a beast, probably one of my favorite GPUs of all time. It did 4K on my TV when I had my HTPC setup, played Wildlands at respectable frame rates, and handled my ultrawide in games too. Bought it for $100, sold it for $300.

    • @MarginalSC
      @MarginalSC 1 year ago

      Given it was basically a 480 it remained relevant for a shocking amount of time

  • @TonicofSonic
    @TonicofSonic 1 year ago +1

    My 8GB 580, purchased last year for $120 delivered, was the single best purchase I have made in years. Seems everything else I buy disappoints. From cars to Blu-ray players, a win is hard to come by these days...

    • @happybuggy1582
      @happybuggy1582 1 year ago

      This one will be obsolete this year😢 good card though

    • @TonicofSonic
      @TonicofSonic 1 year ago +1

      @@happybuggy1582 It won't for me. I don't play new games. I stay at least 3 years behind releases so I actually get to play finished games. This PC is used to play backlogged games spanning all the way back to SNES. Currently playing Fallout: New Vegas for the first time, in preparation for Starfield in 4 years 😂

  • @EliezYT
    @EliezYT 1 year ago +44

    Honestly, I'd say the best card for something like this would be an RX 5700 XT; that would make it a budget 1440p powerhouse.

    • @marktheunknown1829
      @marktheunknown1829 1 year ago

      I'm using a 6600xt for 1440p, even though it is said to be a 1080p card, it is performing quite well

    • @Am_Yeff
      @Am_Yeff 1 year ago +3

      My 5700xt is already a 1440p powerhouse

    • @EliezYT
      @EliezYT 1 year ago +4

      @@Am_Yeff Not anymore; my RX 5700 XT is struggling in 1440p gaming because of VRAM.

    • @Am_Yeff
      @Am_Yeff 1 year ago +3

      @@EliezYT Mine isn't; it runs VR DCS great and most things at 144fps. Really still an amazing card.

    • @EliezYT
      @EliezYT 1 year ago +3

      @@Am_Yeff Are you playing newer titles like The last of us and Cyberpunk? I see the vram usage go pretty high playing 1440p high settings.

  • @slackerdc
    @slackerdc 1 year ago +6

    I love Dawid's videos because he doesn't just say why things are a bad idea; he gets the hardware, shows you, and it speaks for itself.

  • @pixelpuppy
    @pixelpuppy 1 year ago +8

    I wonder how this card would do for Stable Diffusion rendering, since the more video memory it has, the bigger the images it can generate.

    • @BadBoy11m
      @BadBoy11m 1 year ago

      That's what I was asking for too

    • @jackielee297
      @jackielee297 1 year ago

      Quality-wise, better to go for an RX 6600 or beyond

  • @GeneralNickles
    @GeneralNickles 1 year ago +12

    The added memory is likely for crypto mining.
    The RX 580 was a mining powerhouse for a while, but only the 8GB version, because mining uses a crap ton of VRAM; 4GB just doesn't cut it.
    So 16GB would probably have a pretty big effect on your hash rate.

    • @Longbowgun
      @Longbowgun 11 months ago

      It's for A.I.

    • @nihadasadli2642
      @nihadasadli2642 3 months ago

      Nope. For one thing, GPU mining pretty much doesn't exist anymore; for another, more VRAM doesn't get you a higher hashrate.

    • @GeneralNickles
      @GeneralNickles 3 months ago

      @@nihadasadli2642 you are completely wrong on both counts.
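The "mining needs VRAM" point in this thread is about the Ethash DAG, which had to fit entirely in VRAM and grew every epoch. A rough sketch using the growth constants from the Ethash spec (the real spec rounds the dataset size down to a prime multiple, so these figures are approximate):

```python
# Approximate Ethash DAG size by block number (ignores the prime-rounding
# step in the full spec, so values are slight overestimates)
DATASET_BYTES_INIT = 2**30    # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23  # +8 MiB per epoch
EPOCH_LENGTH = 30000          # blocks per epoch

def approx_dag_gib(block_number: int) -> float:
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2**30

print(round(approx_dag_gib(0), 2))           # 1.0 GiB at launch
print(round(approx_dag_gib(15_000_000), 2))  # ~4.91 GiB by block 15M
```

This is why 4GB cards aged out of ETH mining while 8GB cards kept going; past a card's VRAM capacity the hashrate collapses, but extra VRAM beyond the DAG size buys nothing, which is consistent with both sides of the argument above.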

  • @tarno_bejo_
    @tarno_bejo_ 8 months ago +1

    Yep, that's what I thought with the 2048SP previously (not this brand though).
    But then I realized that 120 watts max is pretty sweet (compared to the 200+ watt standard RX 580).
    So I'm accepting it as a bitter truth.

  • @an3k
    @an3k 1 year ago +4

    You can use a polarization filter. That will make the information on the chips much more visible.

    • @stephen1r2
      @stephen1r2 1 year ago

      That often won't help in these cases. The cards are constructed from found working parts pulled from dead cards. To conceal the sourcing of the chips they sand off the model/serial numbers

  • @fatrobin72
    @fatrobin72 1 year ago +2

    I remember seeing graphics cards in Maplin back when Dad would go in to buy electronics components... and that GPU box gave me nostalgic feelings about that...

  • @Deja117
    @Deja117 1 year ago +20

    Weird idea for these kind of cards... You could try rendering a video with them, to see if the extra memory can maybe somehow make it a little faster? 😅

    • @FreebooterFox
      @FreebooterFox 11 months ago +1

      Yeah, my first thought is that this is for cryptomining, or rendering or something. I can't imagine why else this FrankenGPU would exist.

  • @KimBoKastekniv47
    @KimBoKastekniv47 1 year ago +1

    Dawid testing 2018 games wondering why they don't use more than 8GB of VRAM.

  • @wizardothefool
    @wizardothefool 1 year ago +15

    WOW, 16gb really brings out the 4k performance. I could only imagine what some games could do with 2 rx580s working together in tandem

    • @redist7201
      @redist7201 1 year ago

      The video shows that it's all in vain; the 16GB goes nowhere.

    • @AtomicSub
      @AtomicSub 1 year ago

      @@wizardothefool Yeah, an RX 580 at 4K is a massive piece of shit, and dual-GPU gaming is dead

    • @jackielee297
      @jackielee297 1 year ago +1

      2 rx580? Might as well get 1 RX6700 XT

    • @101-cesaraugusto7
      @101-cesaraugusto7 1 month ago

      Bro was mad ​@AtomicSub

  • @nexxusty
    @nexxusty 1 year ago +2

    I knew you'd buy one of these when I saw them the other week.
    Nice one brother.

  • @Dorraj
    @Dorraj 1 year ago +15

    Dawid, you gotta understand (and a lot of gamers, and Nvidia, take advantage of this misconception) that resolution is not the only thing that affects VRAM usage. Texture settings are typically the settings that affect it, but even then a lot of games use VRAM for textures "smartly". In many modern games the texture setting simply controls how fast textures are streamed in, not how big they are, RE4R being the best example (annoying how they put a GB size next to the setting despite that not being what it does).
    But that's not all: any RT settings and specific detail settings will affect VRAM usage too; it's not just "pump to 4K to use VRAM". It hurt me to see you playing games at 4K with medium settings; what's the point of testing the VRAM if you don't change the settings that actually scale with more VRAM!

    • @rustler08
      @rustler08 1 year ago +1

      TL;DR

    • @Dorraj
      @Dorraj 1 year ago

      @@rustler08 TLDR: 4K resolution is not the only way to increase VRAM usage. Normal settings do that too. Turning a game up to 4K but keeping the settings low will not use too much VRAM.

    • @jurpo6
      @jurpo6 1 year ago

      I get you're trying to be clever, but not enough to prove him wrong. It ultimately varies based on the method game developers used to make the game. Doom eternal, RDR2 pre-caches everything into vram to prevent stutters. Tiling is used with dx12 and as you say, is a smart way to use vram. It however requires some advanced yet troublesome prediction or scheduling to prevent stuttering. There are many games that simply determine textures based off of resolution, so dawid is completely fine in his video. The game is not going to load higher resolution textures for a lower screen resolution, there is zero point. On the flip side a game might default to 2k textures or higher when set to 4k resolution.

    • @LorikQuinn
      @LorikQuinn 1 year ago

      Shadows also use quite a bit of VRAM
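The thread's point that texture settings, not screen resolution, dominate VRAM can be made concrete with a back-of-envelope: a texture's memory cost depends on its own dimensions, and a full mip chain adds roughly a third on top, regardless of the resolution it is displayed at. A hypothetical sketch:

```python
# Approximate VRAM cost of a single uncompressed RGBA8 texture.
# The mip chain (each level 1/4 the previous) converges to ~4/3 of the base size.
def texture_vram_mib(width: int, height: int, bytes_per_texel: int = 4,
                     mipmapped: bool = True) -> float:
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmapped else base
    return total / 2**20  # bytes -> MiB

print(round(texture_vram_mib(2048, 2048), 1))  # ~21.3 MiB per 2K texture
print(round(texture_vram_mib(4096, 4096), 1))  # ~85.3 MiB per 4K texture
```

Real games use block compression (BC1/BC7), which cuts these numbers by 4-8x, but the scaling is the same: doubling texture resolution quadruples the cost, while changing the screen resolution changes only framebuffer allocations, which are comparatively tiny.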

  • @dontevenlook
    @dontevenlook 1 year ago +1

    I'd love it if he could plop the 16GB of memory into the beefcake RX 580.

  • @jacksongunner7122
    @jacksongunner7122 1 year ago +7

    I’m waiting for the 16GB GT710 review.

    • @SaraMorgan-ym6ue
      @SaraMorgan-ym6ue 8 months ago

      They did it because they could, not because they should.
      Watch Jurassic Park for the reference 🤣🤣🤣🤣🤣

  • @pf100andahalf
    @pf100andahalf 1 year ago +1

    Dawid, you did not buy a 16GB RX 580 from AliExpress. You stole it, and ran and ran like Forrest Gump until you were on the other side of the world, far enough away that you got away with it, you magnificent bastard.

  • @Imkishore_12
    @Imkishore_12 1 year ago +8

    This intro has better cinematics than some games 😂

  • @pandabytes4991
    @pandabytes4991 1 year ago +1

    Always happy when the best TechTuber around releases a new video. I seem to always go out of my way to ensure I get to enjoy the video ASAP.

  • @acepedro12
    @acepedro12 1 year ago +3

    It may be circumcised, but it's still 12 inches long.

  • @virlanmarianbogdan653
    @virlanmarianbogdan653 1 year ago +2

    Dude I love your videos! Thank you for doing your job!

  • @NatInTheHat49
    @NatInTheHat49 1 year ago +3

    This abomination was made for mining. A fair few of these are in the wild; considering the price difference versus other 16GB cards, it made a lot of sense.

    • @aleksazunjic9672
      @aleksazunjic9672 1 year ago

      Correct. 2048 SP was chosen because it used less energy for similar yield.

  • @solson223
    @solson223 1 month ago

    I love the comical ride along on the way to making your point. Good stuff!

  • @TheRealKensai86
    @TheRealKensai86 1 year ago +5

    As soon as I saw the beginning of the video, I immediately thought "the VRAM is not the bottleneck".
    I had a Strix RX 480 (basically the same as your 580) and used it up until a few months ago. A very good card that lasted a lot longer than it should have. It now lives in my daughter's Minecraft PC.
    Great video btw!

  • @poliwharaslah965
    @poliwharaslah965 1 year ago +1

    This is the type of video Nvidia needed to sell a 4060 Ti 30GB version...

  • @pv2xeek
    @pv2xeek 1 year ago +5

    I would be curious to see how it handles generative AI tasks. Some LLMs are massive memory hogs.

  • @MTBScotland
    @MTBScotland 1 year ago +4

    It'd be interesting to see how such a card works in something like DaVinci Resolve. Comparing it to, say, a 570, which is what it's based on, would be a fairer test.

  • @eliasrayan6421
    @eliasrayan6421 1 year ago +2

    Do you think this card is suitable for deep learning?
    I mean, if you accept longer training times, 16GB of memory for less than $250 (I didn't find any mention of the price in the video) might be acceptable for broke computer scientists. The only issue is that AMD cards are much, much slower than Nvidia's, because most DL libraries are built around CUDA...

  • @emilypeters8888
    @emilypeters8888 1 year ago +3

    I love that people are resurrecting old tech with upgrades. In my opinion, manufacturers should make things like RAM upgradable, and CPUs usable in any platform, mobile or desktop. High-end mobile CPUs used to be a killer option for passively cooled home theater PCs, and if you could also get a high-end mobile GPU on a card, you were set for the best experience.

    • @jackielee297
      @jackielee297 1 year ago

      The Chinese market is doing that. They buy old motherboards, RAM, and GPUs and create a Frankenstein out of them

  • @kernsanders3973
    @kernsanders3973 1 year ago +1

    You should totally test Stable Diffusion; people over there are crying for more VRAM. Would love to see if this card can run SDXL where new cards with 8GB/11GB are struggling. If an RX 580 16GB can run SDXL perfectly fine, then I'm certainly getting one.

    • @manaphylv100
      @manaphylv100 1 year ago

      I doubt Polaris is compatible with SD at all. It might run with a custom build, but probably only at 0.5 it/s or something useless.
      I have a 16 GB Radeon Instinct MI50 (modded to Pro VII), and it can hardly hit 2 it/s with the AMD build.
      You really need an Nvidia card for SD, or one of the RX 7000 cards. The cheapest option is the CMP 40HX at ~$80, which is the mining version of the RTX 2060S/2070; it can hit 4~5 it/s using single-precision (its half-precision is crippled), or about 75% of an RTX 2060S. The value-performance option is the RTX 2080 Ti with the 22 GB mod, but make sure you buy your own VRAM chips and find a reputable technician to do it.

    • @kernsanders3973
      @kernsanders3973 1 year ago

      @@manaphylv100 Wow cool thanks for the info, certainly going to look into that!

  • @personalgao
    @personalgao 1 year ago +6

    The memory is useful for generative AI.
    The problem is that many open-source projects target Nvidia, and only little by little are AMD GPUs becoming usable.
    This could be a great card, not for games, but for other applications.
    But... the software is not there yet.

    • @mareck6946
      @mareck6946 1 year ago

      you can run stable diffusion on it.

    • @jomeyqmalone
      @jomeyqmalone 1 year ago

      Are a lot of people actually looking for 7 year old AMD GPUs for this kind of work though?

    • @mareck6946
      @mareck6946 1 year ago +1

      @@jomeyqmalone some may, for budget stuff. Rendering works too, and video editing with e.g. DaVinci Resolve can benefit from this large framebuffer. I mean, the next competitor at that VRAM value is probably the Intel A770, which is quite a gap though.

  • @Dark.Shingo
    @Dark.Shingo 1 year ago +2

    My biggest takeaway is that video memory isn't as important as the upgrade panic made people think. Newer games seem to ask for more but don't really do anything with it.

    • @jackielee297
      @jackielee297 1 year ago +1

      Unless you're super competitive and need a top-of-the-line setup, there's really no reason to rush an upgrade

  • @tibib0ss
    @tibib0ss 1 year ago +4

    Video starts @1:30

  • @snackpacking
    @snackpacking 1 year ago +2

    Aw man, for a second I thought this was a GTX 580 and got excited! But I watched the whole thing anyway because your videos are awesome.

  • @petecoventry6858
    @petecoventry6858 1 year ago +7

    Nothing I buy from there ever arrives lol

  • @JRTheElectronicGuy
    @JRTheElectronicGuy 1 year ago +1

    I would love to test the card for some mining. See if there’s any difference whatsoever being able to use all 16 GB for crypto mining.

  • @Kyle-rv6vx
    @Kyle-rv6vx 1 year ago

    Thank you for the awesome content 🫂

  • @robdunning6285
    @robdunning6285 1 year ago +3

    Great video, very entertaining. My son and I are currently upgrading an old SFF business machine just to see how it plays games at 1080p. Low power, low profile. Thanks again.

  • @Kardall
    @Kardall 1 year ago +1

    I'm actually surprised how well the 5700 XT is doing right now in Starfield. Being minimum spec, I have been tinkering with it and I can get some pretty high settings while still hovering around 60fps without FSR. Quite surprising. Before, when something like a GTX 650 was the minimum, a game would have to run on low settings to get 60fps.

  • @fluffyboom
    @fluffyboom A year ago +4

    what have they done to my precious 580...

    • @MrTefe
      @MrTefe A year ago +1

      lmao nice joke. nobody uses a GPU that bad nowadays

    • @fv101
      @fv101 A year ago +2

      @@MrTefe shut 😔😔😔😔

    • @StayMadNobodycares
      @StayMadNobodycares A year ago +2

      @@MrTefe A lot of people do. I bought one last week, not for my main rig, but I still bought it.

    • @underrated6902
      @underrated6902 A year ago

      ​@@MrTefesurprise surprise, it's close to the 🐐 1060

  • @wizardothefool
    @wizardothefool A year ago +1

    16GB is a lot for one little GPU, maybe if it had 2? GPUs? hmm? just a hunch

  • @iguanac6466
    @iguanac6466 A year ago +5

    You could have reprogrammed the BIOS for a more aggressive fan curve (keeping it ~65C) and bumped the core clock speeds and even improved memory timings. That would have been an interesting experiment. I've always wondered if tightening memory latency helped gameplay.
    One other thing you could do by making your own BIOS is to bump the memory speeds up as well (until you start seeing ECC errors in HWInfo64).
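
The fan-curve idea in the comment above can be sketched as linear interpolation between temperature/duty points, which is essentially what a modded BIOS fan table encodes. The points below are made up for illustration, not taken from any real RX 580 BIOS:

```python
# Hypothetical fan-curve table: (temperature in deg C, fan duty in %).
# A more aggressive curve ramps the fan earlier to hold ~65C under load.
FAN_CURVE = [(30, 20), (50, 35), (65, 60), (80, 100)]

def fan_duty(temp_c: float) -> float:
    """Return fan duty % for a temperature, clamping outside the table."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    if temp_c >= FAN_CURVE[-1][0]:
        return FAN_CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(65))    # 60.0
print(fan_duty(57.5))  # 47.5, halfway between the 50C and 65C points
```

Tools like PolarisBiosEditor expose exactly this kind of point table; the sketch just shows what the firmware interpolates between them.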

  • @Nobe_Oddy
    @Nobe_Oddy A year ago +2

    you ALWAYS start my Saturday morning off right with HILARITY!!! THANK YOU DAWID!

  • @anneneale5032
    @anneneale5032 A year ago +3

    Just love the humour and the way you take nothing seriously about tech. If I want serious results, I will watch Gamers Nexus. But your channel is funny and makes me feel better. Also, I love your Anna, she's so cool, I wish we saw her more often. Cheers!

    • @raven4k998
      @raven4k998 A year ago +1

      are you learning with lenode?🤣🤣🤣

    • @Dark.Shingo
      @Dark.Shingo A year ago

      @@raven4k998 LENOOOOOODE.

  • @relkyagreymoon12
    @relkyagreymoon12 A year ago +2

    Dawid, you should try playing the 7 days to die game. If you have a save with a lot going on, that old game uses a lot of VRAM (10gb is not even enough at 1440p). You can use that to compare GPU VRAM.

  • @Itz_Ciprian
    @Itz_Ciprian A year ago +3

    Day 53 of Ahoy there

  • @denvera1g1
    @denvera1g1 A year ago

    Unless you're running out of VRAM to hold even reduced textures, adding more VRAM doesn't generally increase performance UNLESS you add a wider bus.
    However, adding more VRAM can allow for a farther LOD fade and higher-resolution textures throughout.
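
The bus-width point can be put into numbers: capacity doesn't change bandwidth, only bus width and memory speed do. A back-of-envelope sketch using the RX 580's published specs (256-bit bus, 8 Gbps effective GDDR5):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bytes moved per transfer times transfer rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RX 580: 256-bit bus at 8 Gbps -> 256 GB/s,
# identical whether the card carries 8GB or 16GB of VRAM.
print(bandwidth_gb_s(256, 8.0))  # 256.0
```

So the 16GB mod doubles how much the card can hold, not how fast it can read it.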

  • @warrenmcclure7819
    @warrenmcclure7819 A year ago +2

    I mean, was it really a scam? It truly had 16GB of VRAM, which I didn't think it would really have.

  • @reidster87
    @reidster87 A year ago +1

    Huh! Now I want to fire up my trusty Vega 64 Frontier Edition (16GB HBM2) and see what it takes to fully utilize the VRAM.

  • @Suc-Chii
    @Suc-Chii A year ago +1

    Funny thing, there's also an RX 590 GME, which is just an RX 580 with higher clocks. Which makes it a slightly slower 590 and a slightly faster 580.

  • @Phil8719
    @Phil8719 A year ago

    10:31 - It's interesting that 100% GPU usage is reported here when the framerate is being limited due to VRAM capacity, and it looks like it's not using all 8GB for some reason

  • @kevinwood3325
    @kevinwood3325 9 months ago

    Solid choice for anyone needing a huge VRAM buffer on a budget. Linus Torvalds comes to mind. Back in 2020, his personal workstation included a 3970X Threadripper and an RX 580. Dude doesn't need much GPU power, apparently.

  • @playerofgames7916
    @playerofgames7916 A year ago +2

    Dawid, Hollywood is searching for you

  • @Pendif
    @Pendif A year ago +1

    I still use DVI with an adapter to HDMI

  • @alderoth01
    @alderoth01 A year ago +2

    It's crazy that my 2nd-gen RTX 3060 has more VRAM than most new cards lol. I thought I was just getting in on the ground floor with my single-fan, Amazon-special GPU, but it's actually been a workhorse, and since it's so ridiculous to get 16GB of VRAM or higher on a GPU, I feel extremely fortunate lol.

  • @PopsHowTo
    @PopsHowTo A year ago +1

    I appreciate that when you are done with a video, YOU'RE DONE! It's like BAM! Video over! Go home! I love it! Haha 😂

  • @SabrinaWesker
    @SabrinaWesker 9 months ago

    I have a 4gb version of the 750ti from some internet sale, and it was actually pretty decent back in the day with the extra vram. It overclocked very well! This made me think of it.

  • @peterlarkin762
    @peterlarkin762 A year ago

    The VRAM chips were sanded down because they came from a batch that performed too low or had too many errors. Samsung or a third party would sell these on at a discount to other parties, so they can do more testing (hopefully) and reclaim the good ones. It's really common and not necessarily as sketchy as it looks - but they're probably low spec.

  • @FINNIUSORION
    @FINNIUSORION 11 months ago

    That's actually pretty smart for marketing and sales. Loads of people only pay attention to memory when they're buying a card. "Oh, that's a 16 gig card, wow!"

  • @shanestatham
    @shanestatham A year ago +1

    How many times did you say 'circumcised'? lol

  • @ZeePanzer
    @ZeePanzer 11 months ago +1

    A gigantic number of those 2048SP cards are fake, remarked RX 470s.

  • @JamesRichardsPlays
    @JamesRichardsPlays A year ago

    I have an MSI Radeon RX 580 GAMING X 8GB card. It has served me well all these years. It is retiring at the end of this week when I build my new computer. I find it odd, though; I don't know what I did, but I was getting 60 fps on High at 1080p in Cyberpunk 2077. Secret: the "Ultra" settings in a lot of games, if not all, are really not that great. You might get a 1 or 2% increase in visual fidelity. I would prefer the smoother, higher FPS at High.

  • @ridingnerdy6406
    @ridingnerdy6406 A year ago

    You can talk about sketchy Aliexpress hardware all day, but at around $55-60 those 8gb Chinese RX580s are by far the best value for 1080p gaming. It's an automatic inclusion on any budget build I do now.

  • @dragonsystems5973
    @dragonsystems5973 A year ago

    I used to have so much fun doing this stuff... I recently hit an income level where I can buy basically anything I want tech wise... and it has sorta taken the fun away

  • @joshuascholar3220
    @joshuascholar3220 A year ago +1

    Why did they SAND DOWN the memory chips?

  • @LewdSCP1471A
    @LewdSCP1471A A year ago

    I'm amazed how long my 580 has lasted; unfortunately, more intensive games like Cyberpunk not being able to hit a consistent 60 fps is really bugging me.

  • @fitah47
    @fitah47 A year ago +2

    Please do more AliExpress videos, that's the only place where I can get cheaper prices where I live

  • @yves1926
    @yves1926 A year ago +1

    I love this sort of product, so original

  • @SaintPuppetS
    @SaintPuppetS A year ago

    I also have a Strix 580, it's actually insane how good this card is. I didn't even realize how much better its base clock is compared to the other 580s.

  • @thecheapshot1065
    @thecheapshot1065 A year ago +1

    The CPU core allocation and usage and the memory allocation stats on the screen while playing are something I've always wanted to do with my son's PC. We've been running an FX 8350 processor with a 1660 Super. Recently, unlike what the Geek Squad tells everyone, we actually had a motherboard failure, although it could be that the processor gave out. Either way, I picked up a Ryzen 5 5600 as an open box, and that sorta lends itself nicely to newer RAM and a newer motherboard. We have a Corsair liquid cooler that I hope will mitigate the heat of a slight overclock so that I can get the 5600 to 5600X-plus standards.

  • @kaseyboles30
    @kaseyboles30 9 months ago

    Note they also slightly dropped the clock speeds on the reference 2048SP (just under 100MHz slower), though this kitbashed card could be set to anything.

  • @zerora199
    @zerora199 A year ago

    Dawid is the first YouTuber I know to reuse clips for a sponsorship.

  • @blze0018
    @blze0018 A year ago

    1:03 You sure are, buddy! Good job! You'll be a real boy soon!

  • @nonyabizz9390
    @nonyabizz9390 A year ago

    I actually love the 2048 cards, but that's mostly because, when looking for most performance possible with fewest dollars spent, I don't think anything beats it. Where else are you gonna get a $45-$80 video card that is that powerful?
    While certainly a little mock-worthy, I think the card really shows how capable it still is, and is a clear win for absolute budget gaming PC builds!

  • @keh998
    @keh998 A year ago +1

    I think the reason they Frankensteined it to 16GB is because these used to be used for mining purposes

  • @bluewave2432
    @bluewave2432 9 months ago

    This video just shows how good of a card the rx580 is nowadays. Another timeless piece of silicon!