Radeon HD 7000 Series with Hindsight: Great, or Disappointing?

  • Published: 18 Sep 2024

Comments • 396

  • @juanignacioaschura9437
    @juanignacioaschura9437 2 года назад +294

    Actually, the 7000 moniker has come full circle twice:
    - ATI RADEON 7000 (R100): 2000
    - AMD RADEON HD7000 (GCN1.0): 2011/2012
    - AMD RADEON RX7000 (RDNA3): 2022

    • @ahreuwu
      @ahreuwu 2 года назад +23

      can't wait for the amd radeon xl7000 (rgp3) in 2033

    • @DigitalJedi
      @DigitalJedi 2 года назад +6

      I have one of each of the first 2. Hoping to switch away from Nvidia for the RX7000 series to go full circle.

    • @pauls4522
      @pauls4522 2 года назад +4

      Yup, I found that pattern too. Almost every 10 years AMD loops back to a 7000 series:
      the original Radeon 7000 series, the HD 7000 series, and now the upcoming 7000 XT series.
      I personally owned the awful 32MB ATI Radeon 7000. A card I actually remember fondly, because it was my first GPU in my first ever PC that was all mine and not shared with family. I didn't learn it was a gimped card until years later. Gimped as in a 64-bit memory bus, so even 3dfx cards from late 1998/early 1999 could beat it, despite it having DDR memory instead of SDRAM. For example, my sister had a 16MB 3dfx Voodoo3 3500 TV, a 1GHz Athlon CPU and 384MB of RAM. I had a 500MHz Intel Pentium III, 768MB of RAM, and the 32MB ATI Radeon 7000.
      Back then I thought her PC was faster because she had a drastically better processor and a much nicer monitor providing cleaner visuals, but now I know her Voodoo card was at least 30% faster than mine.
      I also owned an HD 7970 and still fondly remember it as my workhorse GPU for 5.5 years. I would have kept using it if the card hadn't started dying around the 5-year mark, leading me to replace it in February 2019.

    • @albal156
      @albal156 2 года назад +3

      OMG, 10 years apart too lol.

  • @GamingRevenant
    @GamingRevenant 2 года назад +141

    The AMD HD 5870 is literally the GPU which powered the start of my channel, and for more than 10 years it let me record videos and play video games with little to no problems (at 1080p). I consider it the best price-to-performance GPU I have ever owned.
    When I went to buy an HD 6870, the salesperson literally told me it was not worth upgrading, because all AMD had done was shift the previous generation down the model names without really improving anything. I'm glad he was honest about it, and it seems he was right.

    • @e-cap1239
      @e-cap1239 2 года назад +10

      Actually, the 6870 is worse, as it has fewer shaders. But it was sold for nearly half the price. I personally also had an HD 5870 2GB and its performance was quite good, even if nowadays it is comparable to a GTX 1050.

    • @raresmacovei8382
      @raresmacovei8382 2 года назад +6

      But that's not actually correct. Performance was mostly the same, but the card was smaller, more efficient, cheaper, and great at CrossFire.

    • @stevy2
      @stevy2 2 года назад +2

      I had the HD 6950, unlocked to a 6970. That thing was a budget beast back in the day.

    • @Cheesemonk3h
      @Cheesemonk3h 2 года назад +1

      @@stevy2 I used a 6950 up until like 2018, it was severely outdated by that point but i still used it to play overwatch at launch. shame how quickly that game got ruined by identity politics

    • @Annyumi_
      @Annyumi_ 2 года назад

      @GamingRevenant you could have got a GTX 260 to start your 10-year channel, since all the games you play are DX9, so you could have used that instead of a DX11 GPU

  • @BudgetBuildsOfficial
    @BudgetBuildsOfficial 2 года назад +227

    The HD 7770 was my opening into the world of real PC gaming, coming out when both the PS4 and Xbox One were incredibly overpriced while offering comparable quality. I still have plenty of fond memories (and my original, if slightly awful, VTX variant of the card).
    Really liked this little recap into the real world situations of this generation 👌

    • @HappySlappyFace
      @HappySlappyFace 2 года назад +2

      Burger

    • @jaect
      @jaect 2 года назад +4

      @@HappySlappyFace Pretty much same man, had a Gigabyte HD7970 as my starting card and it paved the road forward; fantastic stuff, lasted a damn while too.

    • @Forendel
      @Forendel 2 года назад +1

      @@jaect im still using 7970 lol

    • @tuff_lover
      @tuff_lover 2 года назад

      Eww. Did you overclock it, at least?

    • @HappySlappyFace
      @HappySlappyFace 2 года назад +1

      @@tuff_lover why eww? I'm a 7970 user too and it's really holding up strong

  • @rzarectz
    @rzarectz 2 года назад +54

    Kliks your tech content is top notch. Keep it up!

  • @enricofermi3471
    @enricofermi3471 2 года назад +318

    That performance-efficiency thing explained around seven and a half to eight minutes into the video seems to be in reverse right now, lol. Nvidia is rolling out 350-450 watt furnaces (3090, 3090 Ti) with more to come in the 4000 series, while AMD is keeping it within 300 watts, with a bit over that in their recent 6950 XT.

    • @miknew20
      @miknew20 2 года назад +44

      If current rumours are right, Ada Lovelace is going to have 450W+ TDPs to compete, maybe even one 600W SKU. If true, then history repeats itself in funny ways.

    • @12Music72
      @12Music72 2 года назад +68

      @@2kliksphilip Because it came from a source who's known to be spot on with the rumours?

    • @PMARC14
      @PMARC14 2 года назад +9

      @@2kliksphilip With the release of the next-gen cards, which I hope to see you review, we will be able to tell better. AMD's RDNA2 seems to have a great design, being usable as an iGPU in devices such as the Steam Deck. They also seem to have advanced new designs with multi-chip GPUs, like their ground-breaking Ryzen CPUs. In comparison, NVIDIA seems to be doing the exact thing AMD has been doing, being behind on these new technologies; they hope minor architecture revisions, better nodes and overclocks will pull them forward for the moment. Nvidia is using a custom 4N node from TSMC for its next-gen cards, while AMD seems likely to use a mix of 5nm and 6nm. Die size may also matter less for AMD due to the multi-chip design, as compared to the previous cards discussed. If AMD can even pull level with Nvidia with this, then they will have accomplished something similar to what Nvidia managed with the 600 series GPUs.

    • @Just4Games2011
      @Just4Games2011 2 года назад +21

      @@2kliksphilip Because people who leaked them have very good track records of leaks. RDNA 3 is gonna be an MCM GPU with 3 dies, vs what nVidia will have as monolithic 4090. Moore's Law Is Dead has a good track record on these leaks becoming the reality, and he confirmed both Lovelace for 600W and RDNA 3 for around 400-450W.

    • @pyrophobia133
      @pyrophobia133 2 года назад +14

      both sides take turns making power hungry cards

  • @01chohan
    @01chohan 2 года назад +63

    AMD putting 3GB of VRAM in the 7900 series made such a massive difference as the GPUs aged. Reminds me of the i5 2500k vs i7 2700k comparison

    • @Orcawhale1
      @Orcawhale1 2 года назад +1

      Both of which are completely wrong,
      as the 7970 ran out of performance long before VRAM became the problem.
      The same thing happened to the 2500K.

    • @AFourEyedGeek
      @AFourEyedGeek 2 года назад +15

      Haha, no it didn't, the 3GB was an awesome decision.
      Check out the '7970 gaming in 2020' YouTube video. The card was doing really well for such an old card, helped by the higher amount of VRAM on it. My Vega 64 is still going strong thanks to the 8GB of RAM on it.

    • @Orcawhale1
      @Orcawhale1 2 года назад +1

      @@AFourEyedGeek No, your Vega 64 is still going strong on account of the fact that we've hit a point of diminishing returns in graphics,
      which means there's no longer the same push for an increase in graphics.
      And I'm afraid you're wrong: the 7970 does in fact run out of performance way before the VRAM buffer is full.

    • @AFourEyedGeek
      @AFourEyedGeek 2 года назад +9

      @@Orcawhale1 The video I suggested was evidence to the contrary regarding the 7970: games using just under 3GB of VRAM and performing relatively well. Increasing texture sizes has a minimal hit on performance but does eat into the VRAM. Higher textures can make games look good, though you'll have to lower other settings to maintain decent fps. Different settings stress different areas of a system, so tweaking settings for your specific system can be beneficial; even a mismatch of more VRAM than the relative performance of the GPU can offer a reasonable trade-off.

    • @pauls4522
      @pauls4522 2 года назад +1

      @@Orcawhale1 Incorrect, dude.
      By around 2016 the 3GB of VRAM was a limiting factor of the card more so than its raw performance.
      I have personally noticed this in plenty of games from that era, such as Mirror's Edge Catalyst. With the less commonly used 6GB variant, games from that era can have maxed-out textures and still hit 1080p/60 just fine. With the 3GB variant, however, you would see fps in the 20s if you attempted to max out the textures.
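
      (As a rough back-of-the-envelope illustration of the VRAM point above: a texture-heavy mid-2010s game can overflow a 3GB card while fitting comfortably in 6GB. All figures in this sketch are assumptions for the sake of the example, not measurements from any specific game.)

      ```python
      # Rough, illustrative VRAM math (assumed figures, not data from any real game).
      def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
          base = width * height * bytes_per_pixel          # uncompressed RGBA8
          if mipmaps:
              base = base * 4 / 3                          # full mip chain adds ~1/3
          return base / 2**20

      textures_mib = 200 * texture_mib(2048, 2048)         # ~200 high-res textures resident
      other_mib = 400                                      # render targets, geometry, etc. (assumed)
      total_gib = (textures_mib + other_mib) / 1024
      print(f"~{total_gib:.1f} GiB")                       # ~4.6 GiB: over 3 GB, within 6 GB
      ```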

  • @Peterscraps
    @Peterscraps 2 года назад +20

    I got a 3090 because it was literally the only thing I could get my hands on; the same thing happened back in early 2016 when I got my R9 390X. It seems I got stuck in the upgrade cycle that has the worst power efficiency and market conditions. Don't get me fucking started on the abomination that is the 3090 Ti: double the price, needing a new power supply, with energy costs going through the roof.
    Nvidia is playing the market because they didn't last time. It's like seeing your drunk uncle piss away his inheritance at Betfred.

    • @veryfunnyguy4991
      @veryfunnyguy4991 2 года назад +1

      “Another successful office blunt rotation. pass that shit homie. Actually Norman you're being kicked out of the circle. Oh you can't do this to me I started this sesh! You chief that shit for like 6 puffs before you pass you're out Norman. DO YOU KNOW MUCH MUCH I SPENT ON THIS WEED!? Couldn't be much dog this is mid af. Fuck these opps I'll buy my own weed. I would like to purchase your best loud. Yeah fella try this out it's called goblin gas. WHAT THE FUCK DID YOU PUT IN THIS SHIT!? AH HAHAHAHA FELLA I'M TWEAKIN!!!”

  • @mancavegamingandgardening9901
    @mancavegamingandgardening9901 2 года назад +57

    The real winner? The 7950 owners who bought an 'underwhelming' card that was later BIOS-patched to have a better clock speed, and THEN was one of the best crypto-mining GPUs at the time, so you could sell the card second-hand and pocket more than you paid for it. I think this series is the first time I saw a GPU appreciate in value.

    • @dylon4906
      @dylon4906 2 года назад +3

      same kinda thing with my RX 580, which I sold for more than twice what I bought it for, two years after I bought it

    • @Shiinamusiclyricssubs
      @Shiinamusiclyricssubs 2 года назад

      @@dylon4906 same with my vega

    • @ProperlyPsychotic
      @ProperlyPsychotic 2 года назад

      @@Shiinamusiclyricssubs well that's been the case for nearly every GPU that's remotely competent at crypto mining

  • @mariuspuiu9555
    @mariuspuiu9555 2 года назад +119

    I believe the 7000 series is considered legendary because it improved a lot with driver updates (aka AMD Finewine)

    • @RealMephres
      @RealMephres 2 года назад +3

      AMD's drivers are somehow very good cross-platform in almost every regard. Them unnecessarily increasing prices is annoying, but you've got to give respect to their software department.

    • @Orcawhale1
      @Orcawhale1 2 года назад

      AMD Finewine has never been a thing.
      It was simply down to the architecture being reused over and over again.

    • @mariuspuiu9555
      @mariuspuiu9555 2 года назад +8

      @@Orcawhale1 It was a major thing early on with GCN. The driver updates gave that architecture major improvements over time (especially for those who bought the HD 7000 and 200 series). For example, in Doom Eternal the GCN cards are at times twice as fast as Nvidia's 600/700 series (direct competitors at launch).
      The 7970's competitor should have been the 580, but it can beat the 780 Ti in some titles now.

    • @AFourEyedGeek
      @AFourEyedGeek 2 года назад +7

      The AMD drivers kept improving performance for GCN, while NVIDIA's drivers hurt performance over time and then dropped support early.
      It probably does have a lot to do with AMD using GCN for so long; however, the outcome for 7000 series owners is that the GPU kept improving over time.

    • @Orcawhale1
      @Orcawhale1 2 года назад +1

      ​@@mariuspuiu9555 Doom Eternal is using the Vulkan, which itself is based on AMD Mantle, from 2013.
      So obviously GCN is going to perform better than Nvidia.
      What's more, the increased performance was simply down to the fact that developers became more familiar with AMD's GCN architecture,
      as the PS4 and Xbox One both used AMD hardware.

  • @simpson6700
    @simpson6700 2 года назад +25

    I feel like we need history videos like this whenever a new generation comes out, reminding us of the performance, price, die size, and power draw. I feel like Nvidia and AMD went off the rails this gen with the shortage, and I don't think we will ever go back to normality, because the shortage was profitable.

  • @additivent
    @additivent 2 года назад +91

    They seem to have aged well, and are only failing now because of AMD's shoddy driver support. The 7770 would have been a legitimate budget option for me 2 years ago, if it wasn't for a friend offering me a 1060 3GB for cheap.

    • @RuruFIN
      @RuruFIN 2 года назад +7

      Modified NimeZ drivers give the older cards some extra life support.

    • @cyjanek7818
      @cyjanek7818 2 года назад +8

      Wasn't it shown on the Linus Tech Tips channel that older AMD cards actually work better with newer games (better driver support)?

    • @MandoMTL
      @MandoMTL 2 года назад +5

      @@cyjanek7818 Fine Wine. Yep.

    • @conenubi701
      @conenubi701 2 года назад +16

      "shoddy driver support". This is an ancient card, you can't reasonably expect this card to be supported in this day and age with drivers

    • @peters.7428
      @peters.7428 2 года назад

      Custom drivers?

  • @FatSacks
    @FatSacks 2 года назад +65

    I had a 7970 back in the day that I bought for only $330 a few months after it released! It was legitimately the best GPU I ever owned. I played Crysis 3 on it at pretty high settings, plus Bioshock 3, GTA 5, Wolfenstein and Far Cry 4, and it lasted me until The Witcher 3. My friend let me have his 780 and, being blinded by the shiny new Nvidia card, I sold the 7970 on eBay for a cheeky $150. Two months later the first mining craze struck and the card was suddenly worth $500+, AND the 780 turned out to be a piece of trash with how hot it ran and how loud the cooler was compared to the Sapphire cooler on the 7970, and to top it all off it wasn't even faster (I was also running an 8150 at the time, so I think Nvidia drivers sucked on Bulldozer compared to the GCN stuff).
    After the 780 I went to the 980 (went to Intel with the 7700K here), then the 1070, 1080, 2080 Ti (+9900K), and these days I'm running a 3080, and it's ok I guess. RT is nice and the Founders cooler looks sick and runs pretty cool, but I don't feel the same way I felt about the 7970. Maybe it's just me getting older and more jaded and expecting more out of my hardware.

    • @FatheredPuma81
      @FatheredPuma81 2 года назад +3

      I'd guess that the reason for that is because it was the tail end of the Extreme Settingz era, where you needed a top-of-the-line card to max out a game at 1080p and get a solid 60 FPS, and the start of a tech stagnation that was only really broken when the 10 series came out.
      I'm sure many people feel the same way about the 8800 Gxx cards, actually.
      Having just upgraded from a GTX 1080 to a 6800 XT and then "upgraded" (in my eyes) to an RTX 2080 Ti, I can say I only really noticed a difference in games I don't really play. DLSS will likely make this much, much worse when I upgrade from the 2080 Ti to a 4080/5080 or something.

    • @FatSacks
      @FatSacks 2 года назад +4

      Man, the 2080 Ti was honestly the worst card I've had in a while; it ran so much hotter and louder than my 1080. I don't think I would have gone to it from a 6800 XT.

    • @0Synergy
      @0Synergy 2 года назад +3

      @@FatSacks I actually went from a GTX 1080 to a 6800XT. What a fucking difference lmao. Crazy card.

    • @FatheredPuma81
      @FatheredPuma81 2 года назад +1

      @@FatSacks Tbh I just wanted to go back to playing Minecraft at a decent framerate and watching videos. Was actually aiming for a 3070 or 3080 but ended up finding it for $500.
      It does run hot though. With an OC and raised temps it gets to 84C :\.

    • @FatheredPuma81
      @FatheredPuma81 2 года назад

      @@0Synergy The gaming performance jump is massive. If you only game and don't watch any videos (especially while gaming), don't care about RT, and don't care about OpenGL games (Minecraft) then it's a really good card. Especially if you can get one at MSRP.

  • @thepcenthusiastchannel2300
    @thepcenthusiastchannel2300 2 года назад +22

    It's actually easy to tell how much of the performance uplift going from a 6970 to a 7970 was due to the new architecture and how much was due to the die shrink. A die shrink allows you to pack in more transistors and often allows you to achieve higher clocks. In this case, 28nm didn't allow for that much of a clock speed improvement, going from 880 MHz to 925 MHz. The bump in ALUs ("shader units") also wasn't that large, going from 1536 to 2048, pushing the FP32 performance up from 2.7 TFLOPS to 3.8 TFLOPS.
    The number of ROPs (still the main performance-determining factor) was the same at 32 for both. This meant that the pixel fill rate only went from 28.2 GPixel/s to 29.6 GPixel/s, yet the performance went up significantly between the two cards.
    This is because the fill rate efficiency went up, and the pixel shader efficiency went up as well. The move AMD made was from VLIW to SIMD. So what we're seeing is a boost in efficiency led predominantly by architectural improvements rather than by the node shrink.
    However, GCN was meant to service two markets at once, and this was its downfall in the consumer market. It was meant for the professional workstation market as well as the consumer gaming market. This meant that GCN tended to be extremely powerful compute-wise but lacked pixel fill rate performance. nVIDIA, on the other hand, segmented their architectures: nVIDIA's gaming line had more ROPs than AMD's and less compute performance. nVIDIA basically went the way of the ATI/AMD 4000/5000 series with Kepler and Maxwell: less power usage, more gaming-oriented.
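
    (The figures quoted in this comment follow from the usual rule-of-thumb formulas: FP32 throughput = shaders × 2 ops per clock (an FMA) × clock, and pixel fill rate = ROPs × clock. A minimal sketch, assuming those formulas, that reproduces the numbers:)

    ```python
    # Reproduce the HD 6970 / HD 7970 figures quoted above from rule-of-thumb formulas.
    # FP32 assumes 2 ops per shader per clock (a fused multiply-add).

    def fp32_tflops(shaders, clock_mhz):
        return shaders * 2 * clock_mhz * 1e6 / 1e12

    def pixel_fill_gpixels(rops, clock_mhz):
        return rops * clock_mhz * 1e6 / 1e9

    print(fp32_tflops(1536, 880), fp32_tflops(2048, 925))            # ~2.70 and ~3.79 TFLOPS
    print(pixel_fill_gpixels(32, 880), pixel_fill_gpixels(32, 925))  # ~28.2 and ~29.6 GPixel/s
    ```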

  • @JakoZestoko
    @JakoZestoko 2 года назад +15

    I am one of those people still running a 7770 GHz edition to this day. It only cost $100 CAD, which with today's prices is mind-bogglingly cheap for a mid-range card. I can still run CSGO at my desired graphics settings and consistently get at least 144 fps, so I'm happy with it. Definitely some of the best price/performance we'll ever see in this space I think.

    • @LucasCunhaRocha
      @LucasCunhaRocha 2 года назад

      I had a 7770 GHz and later a 7950 from Sapphire and they both died quite fast; dunno if it was just me being unlucky or a design problem.
      I am using a 980 now, which was a massive upgrade.

    • @jasonjenkinson2049
      @jasonjenkinson2049 2 года назад

      My first real card was a Sapphire HD 7790 2gb and boy did that change my world. Unfortunately I traded it for a gtx 760 three years ago, which used twice the power for a marginal improvement on performance. Luckily I know better now.

    • @lordwafflesthegreat
      @lordwafflesthegreat 2 года назад

      I loved my 7770 GHz. Had it 'till 2017/2018. It had only 1 gb of VRAM, but could still run Watch_Dogs 1 and 2 with no issues.
      It was replaced very recently, when the 3060ti first dropped.

    • @goncalomoura9816
      @goncalomoura9816 2 года назад

      @@jasonjenkinson2049 That was a terrible decision. Nowadays people are finding out that an HD 7790 / R7 260X clearly outperforms a GTX 680 or GTX 770 in newer DX12 and Vulkan titles while costing 4 times less and using half the power, but to be fair nobody knew that back in the day; this video is all wrong to compare early benchmarks. "Tech experts" were advising people on a budget to buy the GTX 650 Ti or 750 Ti paired with dual-core i3 CPUs over an FX 6300 + 260X. Nowadays those cards and CPUs cannot even launch the newest games, while the HD 7790 can still play Elden Ring, FH5 or God of War smoothly at 1080p, and FX CPUs are outperforming older 2nd and 3rd gen Intel ones.

    • @jasonjenkinson2049
      @jasonjenkinson2049 2 года назад

      @@goncalomoura9816 😢

  • @timothypattonjr.4270
    @timothypattonjr.4270 2 года назад +7

    The 7970 aged better than expected because AMD continued to use GCN for far longer than Nvidia used Kepler.

    • @Loundsify
      @Loundsify 2 года назад +1

      It also helped that a lot of game engines on console were built around GCN.

  • @akn8690
    @akn8690 2 года назад +9

    I'm subscribed to both this and the kliksphilip channel, but videos from this channel don't show up on my subscriptions page; I saw this on my home page. Weird.
    Also, I would like to see a video about why AMD drivers were bad, whether they are still bad, and whether we should pay more for Nvidia cards for their better and more stable software support (both DLSS and Nvidia Broadcast kinds of things, and drivers).

    • @unison_moody
      @unison_moody 2 года назад +1

      Same

    • @gennoveus
      @gennoveus 2 года назад

      The video was hidden for me, too...

  • @TrueThanny
    @TrueThanny 2 года назад +9

    Crossfire actually worked very well at the time. The number of games that didn't work well with it was very small.
    I had a pair of 7970's clocked at 1200MHz each, and that was faster than any nVidia card for years afterwards.
    As for efficiency, AMD is well ahead with RDNA 2, and is likely to get further ahead with RDNA 3.
    It's an error to conclude that the difference in power consumption is down to the node. The power efficiency difference between TSMC's 7nm node and Samsung's 8nm node is far lower than the efficiency difference between RDNA 2 and Ampere. Whether or not RDNA 3 takes the absolute performance crown, it's looking like it will utterly destroy nVidia in performance per watt. And that's with nVidia seemingly having a slight node advantage.
    But we'll see once we have actual data rather than just rumors.

  • @dklingen
    @dklingen 2 года назад +5

    Great video and a nice perspective - sadly, the days of a new card matching the prior generation's top performance at a lower price seem to be gone, as we are hitting $2K for the top tier (insanity).

  • @kiri101
    @kiri101 2 года назад +11

    The HD 7850 2GB has aged like a fine wine, mine's still powering forwards with modded drivers on Windows and excellent out-of-the-box support on Linux.

    • @Orcawhale1
      @Orcawhale1 2 года назад

      Then obviously it hasn't aged like fine wine.

    • @kiri101
      @kiri101 2 года назад

      @@Orcawhale1 It received considerable performance improvements in its lifetime, especially in open source drivers, and still continues to benefit from increased functionality and performance with modded drivers. So yes, like fine wine.

  • @hellwire4582
    @hellwire4582 2 года назад +11

    I got a 7950 as a replacement for a 6670 in 2020. I was blown away when I could run Red Dead Redemption 2 at 1600x900 low settings, with an OC'd i7 950 and the 7950, at 40 fps ^^ The game still looked totally amazing and felt smooth.

    • @FatSacks
      @FatSacks 2 года назад +1

      that's badass

    • @Orcawhale1
      @Orcawhale1 2 года назад

      Then I'm afraid you don't know what smooth is.

    • @hellwire4582
      @hellwire4582 2 года назад +7

      @@Orcawhale1 i know your brain is smooth

  • @ODST1I7
    @ODST1I7 2 года назад +1

    I'll always remember the 7850 fondly. I used it for almost 6 years, even though I had to run most games on low by the end of its life. The price point was something else back then; I got an ASUS card for about $250, bundled with $100 in new games. Great video, Philip.

  • @syncmonism
    @syncmonism 2 года назад +8

    A lot of people don't understand just how much of Nvidia's performance advantage has come from games being better optimized for Nvidia hardware. People like to attribute better performance per watt at the same node size entirely to which GPU architecture is better, but that's not a reasonable assumption to make.
    In other words, Nvidia's GPU engineers had an unfair advantage over AMD's.

    • @cyjanek7818
      @cyjanek7818 2 года назад +7

      ​@@huleyn135 I dont know if you dont belive him or if you call these practices shit but it is true that biggest one on the market manipulates it in every possible way.
      For example, it was a known problem that when the laptop Ryzen 4000 series came out and beat everything Intel had, people had trouble finding laptops with those CPUs, because many companies had deals with Intel to take only their CPUs. This is possibly the reason why AMD has the Advantage program - many laptops with AMD were low-end things that made a bad impression.
      Same with optimizing games - it isn't some hot take that games work better with cards from the companies that helped; I think this problem got a lot of attention during the Nvidia HairWorks and Witcher 3 era.
      Whether it's unfair or "it is what it is" depends on your view, but I just wanted to clarify that it really happens.

    • @simpson6700
      @simpson6700 2 года назад

      @@huleyn135 It's true that AMD cards were better at pure computational tasks that weren't games, and some games (Codemasters' GRID, DiRT and later F1) reflected that performance too. But in the end you should buy your GPU based on what you get, not what could maybe be in the future if every game were optimized for AMD, based on a couple of outliers. Just like you shouldn't buy an Nvidia GPU on the premise that every game could be made with GameWorks technologies.

    • @user-fs9mv8px1y
      @user-fs9mv8px1y 2 года назад

      Honestly the only real advantage nvidia has is better raytracing performance as far as I know

  • @kobathor
    @kobathor 2 года назад +5

    I loved my Sapphire HD 7970 Dual-X until 2016, when I got a Sapphire R9 FURY for $250 and sold the 7970. I still sometimes regret selling the 7970. I had treated it so well, having repasted it, replaced a dead fan, things like that. It even played a lot of DirectX "12" games ;)

  • @simonrazer8303
    @simonrazer8303 2 года назад +4

    it felt like you said "in conclusion" 50 times in this video

    • @simonrazer8303
      @simonrazer8303 2 года назад

      @@2kliksphilip Oh okay. Well, it was a good video regardless 👍

  • @tambarskelfir
    @tambarskelfir 2 года назад +4

    It's a nice recap of the situation, but there's some context missing. At the time of the HD 7000 series, Rory Read decided that there was no future in discrete GPUs and that AMD wasn't going to compete in that market going forward. AMD under Read genuinely believed that the future was APUs, and the GCN architecture was perfect for APUs: good enough for graphics, also excellent for GPGPU. This perennial lack of a competing product from AMD in discrete GPUs was in no small way due to corporate policy, which didn't change until after Ryzen.

    • @beetheimmortal
      @beetheimmortal 2 года назад +1

      That's absolutely insane. You just gotta love stupid decisions made by the leaders.

  • @Squeaky_Ben
    @Squeaky_Ben 2 года назад +2

    I remember building my very first gaming PC in 2011.
    8 GB of RAM, a 6 Core Phenom II and a HD 6970.
    All to play Crysis 2 and Battlefield 3 at maximum resolution.
    Those really were the days.

  • @Lyazhka
    @Lyazhka 2 года назад +2

    Nice and informative video! Looking forward to seeing FSR 2.0 at extra low resolutions!

  • @xTheMoegamer
    @xTheMoegamer 2 года назад

    My PC gaming journey started with the Radeon HD 7950... how time flies! It's also still my backup if anything happens to my 1080; it would still more or less run my most played games, so I am happy.

  • @danielmadstv
    @danielmadstv 2 года назад +1

    This was awesome and I really enjoyed the history. I got into gaming with the HD 6950 and honestly didn't know much about the exact history afterwards. This was super interesting and I hope you make more like this. I'm also eagerly awaiting your FSR 2.0 video! Can't wait to hear your take, that's the one I respect the most as you are my designated upscaling and AI expert. Thank you for your excellent work as always.

  • @GD-mt4pe
    @GD-mt4pe 2 года назад +1

    I didn't get this video in my subscription feed, I saw it in my recommended 4 days later.

  • @freedomofmotion
    @freedomofmotion 2 года назад +5

    The 7850 was an overclocking monster. I had one that ran at 1.6GHz or something mad like that. Silicon lottery for sure, but damn that was a good card. MSI Twin model.

    • @Loundsify
      @Loundsify 2 года назад

      So it was literally running twice the compute of the PS4.

  • @bmcreider
    @bmcreider Год назад

    Awesome video on my current rabbit hole of nostalgia I’ve ventured down.

  • @pauls4522
    @pauls4522 2 года назад

    I have seen multiple YouTubers put together the same data and results about the 7000 series, but I have to say your presentation was absolutely phenomenal.
    Fun fact, though: the HD 7970 was actually released in late December 2011, in relatively limited supply. Not that the extra week really matters much in retrospect.
    The original HD 7970 was initially beaten by the GTX 680, but with driver improvements over many years it actually beat the Nvidia card.
    There was also a less commonly talked about, super-overkill-for-the-time 6GB variant of the HD 7970. In the rare case it is mentioned, many of the large YouTubers instantly dismiss it as not mattering because "DERP - the HD 7970 is not powerful enough to fully utilize 6GB of VRAM - DERP" without doing a proper analysis of it, which I find a shame. I personally know quite a few games, such as Mirror's Edge Catalyst among others released around 2016/2017, which were bottlenecked by the 3GB VRAM buffer and could have still seen maximized textures at 60fps on the 6GB variant.
    I bought my HD 7970 when its price dropped to $300 and it came with 3 free games, in September 2013. I used the card until February 2019, and the only reason I retired it then was because it was starting to show signs of dying, such as no longer being able to overclock, GPU driver crashes in a couple of games, and "occasional" artifacts in some games that would go away with a game restart. Most of the time the card still ran great, but I knew it was not sustainable, so I took advantage of the early 2019 mining crash to pick up a dirt cheap

  • @0Blueaura
    @0Blueaura 2 года назад

    the switch to 6000 series at 2:00 got me really good xD

  • @StingrayForLife
    @StingrayForLife 2 года назад

    My HIS 7970 ghz edition just died a couple of weeks ago after almost a decade of service. So many memories!

  • @demonabis
    @demonabis 2 года назад

    My first card was a Radeon HD 7850, paired with an FX-8320 and 8GB of RAM in 2012. Many memories with that card; it accompanied me for many years until it died and I got a 270. I clearly remember debating whether I should get the 7850 or the 660 Ti...

  • @steel5897
    @steel5897 2 года назад +16

    Hot take: ETH mining probably kept AMD GPUs on the map.
    Nvidia has been better for gaming, and especially encoding, for a long, long time. But those RX 480 and RX 580 cards were a hot commodity for miners: insane MH/s per dollar and very power efficient.

    • @Fractal_32
      @Fractal_32 2 года назад

      Plus AMD actually had asynchronous compute, compared to Nvidia's 10 series which didn't, or at least didn't benefit from it as much as AMD.
      I've also heard that Nvidia ripped out parts of their hardware scheduler on Pascal (or was it Maxwell?), although I couldn't find anything proving or disproving this claim. If this claim is correct, it would have made AMD cards better for game developers who want to use more advanced techniques for better images, better performance, or *stability. *Stability as in not dropping frames / frame-time consistency when an explosion occurs and a ton of particle effects have to be rendered.

  • @ffwast
    @ffwast 2 года назад +1

    [gently pats old 7950] "Don't worry buddy I still love you,a home theater pc is an important job you know"

  • @CompatibilityMadness
    @CompatibilityMadness 2 года назад +4

    Great video on an interesting topic, but I have to ask:
    is the performance used in the table (@6:00) based on your own tests, time-of-release data/reviews, or even later tests (done near the end of its driver support, circa 2020)?
    Because this will impact GCN quite a lot (AMD drivers needed to mature before the full performance could be seen). The same happened with Navi and RDNA cards (Polaris/Fury to a lesser extent, since they aren't as radical a change vs. the 7970). In GCN 1.0's case, for example, we only got good frame pacing (1% and 0.1% lows) after NV showed its impact and tools were released that could actually measure it. Anyone remember that whole thing?
    My $0.50 on this topic: multi-generation performance comparisons should always be made on a PC with the highest-IPC CPUs available (today that would be the Ryzen 5000 series or Alder Lake for GCN/Maxwell 2.0 GPUs).
    There is simply no way to effectively/accurately measure the best cards with era-specific hardware (OC'ed or not).
    Furthermore, GCN was made to be a more general-compute-focused architecture vs. TeraScale, as it was meant to rival, or at least contest, Nvidia's CUDA platform on supercomputers and other large-scale compute projects (which can bring in A LOT of money for more efficient compute designs). I think this is why its efficiency at "pure graphics" can be viewed as not as good as the previous generation's cards (the transistor budget was used to make it more versatile, instead of faster in a specific area). Simply put: it's not the GPU's fault that it's not as efficient at pure graphics as previous generations; it's how the GPU market changed in the few years since TeraScale cards were released that made it more profitable to leave some "game performance" behind for a better overall product.
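
    (A toy illustration of the point about high-IPC test CPUs: delivered fps is roughly capped by whichever of the CPU or GPU is slower, so an era-appropriate CPU can hide most of the gap between an old and a new GPU. All numbers below are made up purely for illustration.)

    ```python
    # Toy model: delivered fps is capped by whichever side (CPU or GPU) is slower.
    # Numbers are assumptions chosen only to illustrate CPU-limited testing.

    def delivered_fps(cpu_fps_limit, gpu_fps_limit):
        return min(cpu_fps_limit, gpu_fps_limit)

    era_cpu, modern_cpu = 70, 160          # assumed CPU-side frame rate ceilings
    old_gpu, new_gpu = 60, 120             # assumed GPU-side frame rate ceilings

    # On the era-appropriate CPU the two GPUs look closer than they really are:
    print(delivered_fps(era_cpu, old_gpu), delivered_fps(era_cpu, new_gpu))        # 60 70
    # On a modern high-IPC CPU the real GPU gap becomes visible:
    print(delivered_fps(modern_cpu, old_gpu), delivered_fps(modern_cpu, new_gpu))  # 60 120
    ```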

  • @haven216
    @haven216 2 года назад +1

    My first GPU that I ever owned was the HD 7730. For a little cheap GPU, it did quite well with the games I played at the time like Minecraft and Terraria.

  • @prgnify
    @prgnify 2 года назад +1

    The thing about the HD 7000 IMO was that it was a good buy for so much longer, with ATI/AMD 'rebadging' it with different tech. To me the 280X is the greatest card ever in terms of longevity and being 'fine wine', and if I'm not mistaken it was 'just' an HD 7000 in disguise.
    Honest to god, I bought a Vega 64 (the Red Devil one, because of the cooling) at launch. For years I regretted it, since it was priced between the 1080 and 1080 Ti, and if I had puckered up and paid the extra, the 1080 Ti would have been better - especially since AMD walked back on some features (I can't remember the name, but there was some pipeline/scheduler thing that was supposed to make Vega the best thing ever [tm]). But nowadays, especially after seeing Hardware Unboxed's videos, I'm fine with my decision - I'm still using it daily, never had any issues, and HBM is still nice for OpenCL.

  • @DCNion
    @DCNion 2 года назад

    It's quite annoying that these videos don't show up in the subscriptions feed.. Luckily I caught it in the Home page.

  • @Lenk9
    @Lenk9 2 года назад

    It was only one year ago that I upgraded from my Radeon 7800 to my current GTX 1080. You served me well, Radeon.

  • @JwFu
    @JwFu 2 года назад +1

    Since I moved out (more than a decade ago) and had to pay energy bills myself, I've started to care more about efficiency.
    Good video; somehow it didn't pop up in my sub tab.

  • @Trovosity
    @Trovosity 2 года назад +1

    Love the information, love the music

  • @Catzzye
    @Catzzye 2 года назад +9

    Video idea: upscaling old futuristic graphics card box art

  • @SlapsYourShit
    @SlapsYourShit 2 года назад +6

    I think it's important to acknowledge that the 7000 series is literally the reason frametimes became a standard metric in framerate analysis, as it had horrendous stuttering on DX9 games unless they were specifically patched (like Skyrim).
    It's the reason I went NVIDIA with the 970 and never went back.

  • @Venoxium
    @Venoxium 2 года назад +2

    My first GPU was the Radeon 7790! I had an ASUS one, and I remember arguing with myself over whether to go with the cheaper 7770 GHz Edition and buy an FX-8320, or stick with the FX-6300 and go with the 7790.

  • @bakuhost
    @bakuhost 2 года назад +2

    What, the HD 7000 was 10 years ago? Can't believe it's that old.

  • @luckythewolf2856
    @luckythewolf2856 2 года назад +2

    I’m pretty sure the 6970 was the real replacement for the 5870 because the 5970 was a dual gpu card and the 6990 was the successor to that. This means the naming was shifted up one tier of card.

    • @luckythewolf2856
      @luckythewolf2856 2 года назад

      @@2kliksphilip Ah I see! Yes I agree hopefully the next generation can help us although I’m not super hopeful for the power consumption being great

  • @itmegibby2459
    @itmegibby2459 2 года назад +12

    The 7970 was good because as the drivers matured it kept getting faster, even well after its launch. Its story was similar to the 290X situation: lost to the 780 Ti at launch, matched it a year later, beat it 2 years later while being $200 cheaper. The "fine wine" was a big deal for people who didn't have the money or expectation to upgrade every year. Of course that proves that AMD's drivers were lacking, and trust me, HOLY **** THEY WERE BAD, but at least that improvement came and even surpassed expectations.

  • @auroradynia
    @auroradynia 2 года назад +1

    i used a 7950 for a long time, i got it in late 2015 so already pretty outdated with the GTX 1000 series just around the corner but it served me damn well until about 2021

  • @mikehunt42069
    @mikehunt42069 2 года назад +2

    Still sitting on an HD 7970 and a 2500K. The GPU is starting to artifact; I even tried repasting and completely blowing out the dust, but to no avail. My guess is it's the memory modules dying. The CPU is a bit degraded too; it's not holding 4.5GHz like it used to. I'm still happy with the performance, but if AMD's upcoming RDNA2 APUs can exceed my HD 7970's performance, I'd be happy to upgrade to that platform.

    • @jorge69696
      @jorge69696 2 года назад

      Same. Still on a 2500k with a 380 which performs basically the same as a 7970. Can't wait to upgrade.

    • @estevaoang
      @estevaoang 2 года назад

      I fixed mine with an iron on the die for like 40 minutes lol, and also fixed an HD 7770 just by changing the BIOS, maybe try it

  • @nathanielhawkins6959
    @nathanielhawkins6959 Год назад

    I was using an HD 7850 until the beginning of this year. It worked, got me through most of my games and I had fun with it. The only problem was that it didn't actually support 1080p! I spent many years playing at 1600x1200 and just got used to it over time.

  • @unyu-cyberstorm64
    @unyu-cyberstorm64 2 года назад +1

    My ATI Radeon HD 5870 runs Dreadnought at 1080P 60 at medium settings. It’s very nice, and is DX11 compatible

  • @DDRWakaLaka
    @DDRWakaLaka Год назад +2

    what music do you use in this? the songs are all great

  • @onurekinaydin
    @onurekinaydin 2 года назад

    This was an amazing video, it put everything gpu related that we have been going through into perspective. I wasn't old enough to follow the gpu market back in 2011 so, thank you so much for this video!

  • @cavegamer5989
    @cavegamer5989 Год назад

    I had both a 6750 and a 6970 in 2015-17; they were very capable even then. Ran the new Battlefront from 2015 at over 60fps, so I was happy. Had a 7970 for a few days after that, then just traded it for a GTX 1050, and the experience was just so much better.
    Now I've come full circle and have an RX 6600 XT.
    Great card.

  • @kasuraga
    @kasuraga 2 года назад

    i rocked a crossfire hd 4830 setup for years. upgraded it after like 5 or 6 years to an hd7850 then to the gtx 970 after a few years. great cards for the price imo

  • @vAlcatr4z
    @vAlcatr4z 2 года назад

    The HD 7870 was the first GPU I ever owned; I can say it's still at the top of my GPU list. It ran many games at near-max settings back then with no fps issues, until I started playing Rust, which months later killed the GPU (the image would freeze a few minutes after launching any game). What a great legacy.

  • @TiborSzarvas
    @TiborSzarvas 2 года назад

    I remember my first ATI card after a decade of nVidia reign: a Club3D X800 RX in 2005. Then in 2008, my last VGA card was a Sapphire HD 3850 with 256MB; those were the times!
    I was one of the last to change to PCI-E; around 2013-14 I got a Sapphire R7 250 1GB card which got me through to the end of 2018, when I purchased an 8GB RX 580...
    I still have the RX 580; it has no problems running even the newest games at 1080p (on medium). I was today years old when I learnt it consumes 330 watts, which is way too much for my 550W power supply.
    Thanks for another entertaining and informational video, Philip.

    • @Loundsify
      @Loundsify 2 года назад

      330W is total system power; the RX 580 itself uses about 185W.

    • @TiborSzarvas
      @TiborSzarvas 2 года назад

      @@Loundsify Well, thanks. So one still needs that much power in that "department" of the power supply. My previous supply was also 550W, but delivered less power to the card, and the PC kept freezing/restarting while gaming. Changing to a better brand solved this, but I'm still thinking maybe I don't have enough wattage for the card or some other component. Idk, anyway thanks for the info.
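
      (A rough way to sanity-check this is to sum typical component draws against the PSU rating; the figures below are generic assumptions, not measurements of this particular system.)

      ```python
      # Rough PSU headroom check with assumed, typical component power draws (watts).
      parts = {
          "RX 580 (board power)": 185,
          "CPU under load": 95,
          "motherboard, RAM, drives, fans": 75,
      }
      load = sum(parts.values())
      psu_rating = 550
      headroom = psu_rating - load
      print(f"estimated load ~{load} W, headroom ~{headroom} W on a {psu_rating} W PSU")
      # ~355 W load: a decent 550 W unit is fine; instability usually points at PSU quality,
      # transient spikes, or a weak 12 V rail rather than the headline wattage.
      ```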

  • @leonader9465
    @leonader9465 2 года назад +2

    For some reason I was craving a 2kliksphilip video about hardware. lol Thanks.

  • @parkerlreed
    @parkerlreed 2 года назад +1

    I still have my RX 480 8 GB from 2016 and it's going great. For me that was the last of the great performers at a great price.

  • @SciFiFactory
    @SciFiFactory 2 года назад

    It would be great to have you on the Moore's Law Is Dead podcast!
    Do you know Tom? He can talk about GPUs for ages. That would be a stellar episode! :D

  • @Maupa.
    @Maupa. 2 года назад +1

    Hi 2kliksphilip, do you decide what videos will be visible in the Subscription feed? Because I don't see this video on mine and that's really annoying.

  • @DogeCharger
    @DogeCharger 2 года назад

    Ahhhh, I remember my 7850 that I had to turn on with a paper clip for the second power supply, because I was using a Lenovo prebuilt.

  • @Jackikins
    @Jackikins 2 года назад

    My first computer had a 7870 and the CPU was an FX 8350. Oh boy, did it run HOT. But it served me well until it eventually fell after my father tried to stick another one in there for CrossFire; the case didn't have enough airflow and it basically died because both cards ended up running around 110C on Minecraft's title screen. That computer went through a lot and survived things it shouldn't have, like my dumb young mind trying to blindly plug in the power supply connector and taking a huge jolt of electricity, after which it kind of remained in a "zombie" state. It ran for around 3-4 more years, bluescreening, with 100% disk usage ad nauseam. That computer moved to a GTX 970 sometime shortly after the shock and eventually fell to one last bluescreen around 2018, when I then built another PC myself. The CPU is an 8700K, with 16GB of RAM (now upgraded to 64GB) and my beautiful baby child, the 1080 Ti that I got for around £550, just before they were discontinued.
    I'll never forget that first PC. I made a lot of mistakes, a lot I never repeated again with computers. It annoyed the hell out of me, but I still have a soft spot for it since it was my true introduction to PC gaming.

  • @henrik1743
    @henrik1743 2 года назад

    I remember the max fan test of this card, it was nuts

  • @mordecaiepsilon
    @mordecaiepsilon 2 года назад

    You should probably consider the significance of asynchronous compute in the GCN architecture and GCN's superior API support. You need to consider these limitations of the 600 and 700 series NVIDIA cards and how AMD's 7000 series and GCN pushed NVIDIA to improve their asynchronous compute for the 900 series. As you said, AMD's shiny new architecture powered a new generation of consoles that are still with us today. Otherwise, great video

  • @Fezzy976
    @Fezzy976 2 года назад +1

    The HD 7000 series was amazing. You missed a lot of points here and only focused on the negatives.
    AMD bought ATI only a few years beforehand, and bought them mainly to focus on iGPUs for their upcoming APU series of products. Now factor in that Intel used horrendous marketing tactics (most of which AMD sued them for): Intel got the upper hand in the CPU market and AMD then had much smaller budgets to work with. Yet they still competed with the big boys in Intel and Nvidia.
    The 7000 series was excellent in terms of raw performance and its overclocking potential was insane. I had my 7970 GHz overclocked to 1.25GHz and the memory was up at least 600MHz, from what I can remember. This made it crush the 680, which was already pretty much at its max clocks and had zero room for overclocking.
    Then factor in double the framebuffer (1.5GB vs 3GB), and for games with huge texture mods this made the 7000 series almost double the speed of the 680 in games like Skyrim and Fallout 3.
    AMD included a hardware scheduler within GCN, betting heavily on the next APIs having more low-level access to the GPU, but Microsoft never added this to DX11. Nvidia spent millions on creating a software scheduler in their driver to compensate, which is why Nvidia cards were always better in DX11 games. This is why AMD had to create their own API with low-level access, called Mantle, released for BF4 in a patch.
    Mantle was insane, and the 0.1% and 1% lows made the 680 look pitiful in comparison. AMD gave their Mantle code away to the Khronos (OpenGL) team, who used it as a baseline for Vulkan, and also worked with Microsoft, with the ideas feeding into DX12. This is why these cards started the whole "AMD Fine Wine" thing, where the GCN cards would perform insanely well in newer games using the newer APIs thanks to the architecture used. Now the APIs could take advantage of GCN's built-in hardware scheduler.
    This was also the case with AMD's CPUs. AMD bet heavily on applications becoming more multithreaded, which is why they pushed for more cores and threads in their CPUs such as Piledriver and Bulldozer. But this never came about, as Intel had such a stranglehold on the market, and we got 4-core, 8-thread CPUs for nearly a decade before AMD managed to create Zen and punished Intel for their complacency.
    Now AMD have more money coming in, they have bigger budgets to work with, and that is going to pay off big time, and not just in the CPU space.

    • @Orcawhale1
      @Orcawhale1 2 года назад

      1. ATI was bought in 2006, and continued to operate in their own buildings, under their own name.
      What's more, AMD and Intel settled all of their disputes in 2009, so the notion that this somehow had any effect on the HD 7000 is just dumb.
      2. Now you're just outright lying.
      The 7970 wasn't faster than the GTX 680, hence why the 7970 GHz Edition was released.
      What's more, the GTX 680 literally introduced GPU Boost, which was essentially auto-OC.
      Not to mention the various extreme OC cards, like the 680 SOC or 680 Lightning.
      3. Mods don't count for anything, and especially not on Bethesda games.
      That's a terrible argument.
      4. Nope, not at all.
      You're confusing the Ashes of the Singularity and asynchronous compute debacle with Mantle.
      5. Mantle wasn't "insane"; it wasn't widely supported, and it became irrelevant with DX12's release in 2015.
      Nor is it the reason for "fine wine"; you're lying again.
      6. You can thank the PS4 and Xbox One for holding the PC market back.
      It has nothing to do with Intel or AMD.
      TL;DR: None of your arguments work, or somehow make the HD 7000 amazing.

  • @lmchucho
    @lmchucho 2 года назад

    Philip, this coming generation is the third 7000 generation from ATI. The very first Radeon card was the Radeon 7200, from the year 2000.

  • @ianw9708
    @ianw9708 2 года назад

    I bought my HD 7970 GE second hand in 2014 (coming from a 7770), and I used it until it completely died in 2021. It even went through the oven a few times. Always a real trooper. I miss RAMDACs :(

  • @livingthedream915
    @livingthedream915 2 года назад +1

    I find your comparisons to the Titan GPUs very difficult to stomach - almost no one had those GPUs to begin with, and the PC gaming community at large ignored those products entirely. Halo products at the far high end of product segments are usually overpriced to the point of absurdity by both GPU manufacturers (why buy a $500 7970 GHz when you can buy a $350 HD 7950 and OC it to match the $500 product?), and it was often the value proposition that caught gamers' interest.

  • @AlleyKatPr0
    @AlleyKatPr0 2 года назад +2

    You failed to mention pipelines, as we are now moving into the era of mesh shaders with DX12 Ultimate.
    The reason this is worth mentioning is that there is every chance AMD's performance will be different.

    • @sadmanpranto9026
      @sadmanpranto9026 2 года назад

      I don't understand this stuff...
      good different or bad different (AMD performance)?

    • @AlleyKatPr0
      @AlleyKatPr0 2 года назад

      @@sadmanpranto9026 "different" in that, it might be frames are better or worse, or, that the cards cannot scrub the memory cache as fast, or faster, than nvidia

    • @AlleyKatPr0
      @AlleyKatPr0 2 года назад

      @@2kliksphilip I would research the DX12U capabilities of AMD and Nvidia, to see which of them has an edge... maybe ask them questions about it... maybe ask Valve about it... it's your channel, man - but mesh shaders are new(ish) and here.
      The days of vertex shaders being the dominant force in game/realtime rendering are coming to an end... whoever leads this will lead your next Excel spreadsheet in Calibri 12 regular font when you do your analysis.
      AMD may (or may not) have a better future than Nvidia for ALL of your raised points on power usage and efficiency - purely because their nm scale and chip design might be better for DX12U usage than Nvidia's.
      Time will tell, but research the topics, make some predictions.

    • @AlleyKatPr0
      @AlleyKatPr0 2 года назад

      @@2kliksphilip You made several historical comparisons of the changes that Nvidia and AMD have made to their graphics card designs and where they were focusing their efforts.
      It is a forward projection, as it is of interest to anyone looking at GPU designs - just as you were making predictions about the future and assumptions as to what would have happened in the past if AMD had made different choices.
      This video of yours very much reads as if you are speaking specifically about the future and the hopes of what AMD might be doing regarding power and efficiency.
      Mesh shaders are the compute term AMD are knee-deep in - if they get this right, then Nvidia will be faced with a competitor which has a more efficient design.
      What you are speaking of is, in effect, 'yield': if the GPUs are designed well, they will offer more GPU power for less money, as they would be more efficient - giving a greater yield.
      As GPU 'power' is (basically) running an API and making very specific types of calculations involving real-time rendering in DirectX or Vulkan or whatever, the best GPU is the one which does those types of calculations better than the competitor.
      Mesh shaders are therefore the main focus.
      The performance gain is quite significant.
      If I was under any misunderstanding that you were making a video purely to cover the past and not in any way imply the future, then I apologise.
      microsoft.github.io/DirectX-Specs/d3d/MeshShader.html
      gpuopen.com/directx12-ultimate/

  • @SaltyMaud
    @SaltyMaud 2 года назад

    I've had good timing with my past GPU upgrades. 4890 in 2009, 7950 in 2013 and GTX1080 in 2016, they've all been great cards at a great time. Not so sure about picking up a RTX3070 in 2022, but since I managed to yoink one at MSRP, I just went with it, might not be the best GPU purchase I've made lately if RTX4000 is coming out soon and it's as crazy as it's said to be.

  • @ocudagledam
    @ocudagledam 2 года назад

    My experience with the 7000 series was that I bought a 7950 for a song exactly three years after it launched, kept it for 4 years, during which it fairly happily chewed through what I cared to throw at it at high details at 1080p and even after that I was still able to resell it (for half of what I originally paid for it) as it still noticeably outperformed the RX560, which was AMD's entry level solution at the time.
    Regarding the 3GB, I can tell you that it made a heck of a difference in AC Unity, which, even at 1080p, struggled on the 2GB cards unless the textures were set to low, and, considering how long I'd kept it, I'd wager that it wasn't the only game where I benefitted from the extra gig.
    BTW, I previously owned a Radeon HD 6950 2GB for another four years (that is, more or less, since that card had launched) and while one generation up doesn't seem like much, and in theory there wasn't that much of a difference in raw power, in practice it was enough to give me another 4 years of moderate gaming, so I would call the 7000 series quite successful.

  • @benhinton5475
    @benhinton5475 2 года назад +2

    The 7000 series was pretty fantastic for compute when it came out

  • @Lady_Zenith
    @Lady_Zenith 2 года назад +1

    The drivers. You forgot one important thing: how awful the GCN drivers were for years. The 7900 series grew so much, but it took years; by the time it started to shine (i.e. its performance started reflecting the die size and power), Nvidia was no longer even making GK104, it was GM204 days. The most important thing was how terribly GCN ran in DX11: the draw call performance and CPU overhead were worse than in DX9, while on the NV side it was the opposite. This has changed now, by the way. The AMD 22.10.01.03 May preview alpha driver managed to fix DX11 multithreaded rendering. Yes, it really does, and it runs great. It only came 12 years too late, and is optimized far more for RDNA, but it helps a lot even on GCN. This is such a shame: entire generations hammered by the incompetence of the software driver team. If they had had this driver back then, the 7970 would have stomped the 600 series.

  • @beachbum111111
    @beachbum111111 2 года назад

    The Radeon 7950 was the first PC gaming card I had. Previously I had a laptop with a Radeon 5650 that only lasted me 2 years, but that 7950 survived until this past year, when it seems to have crashed for good, and man did it get me through a lot.

  • @MarcinSzklany
    @MarcinSzklany 2 года назад

    I love the hardware videos you do. Very insightful. Thanks!

  • @goblinphreak2132
    @goblinphreak2132 2 года назад

    I'm glad you didn't show the dishonest reviews from around that time, claiming a 7970 couldn't play Skyrim at 1080p ultra 4/4 (AA/AF) at 60fps... I was totally pissed, because I noticed their lies since I owned the card. Not only did I have a 7970, but my CPU was technically slower, as I was running an AMD CPU instead of Intel (still do). I tested the game at 1080p, ultra, 4/4, and sure enough I pegged 60 the whole game; it never dropped. Only when I changed AA/AF to 16x/8x respectively did I notice my fps dropping to 50, which matched the reviews. Turns out they were testing AMD on max settings and Nvidia on lower settings, to give Nvidia the edge in the review. I even took video proving their bullshit, but fanboys never listen. The 7000 series was god-tier. I loved my ASUS 7970 blower edition (aka reference). After a year I had heating issues, removed the cooler, cleaned everything, and ended up applying Arctic Silver 5, the god-tier CPU thermal paste, and not only did my temps drop, they were even lower than with the stock paste. Insane. So I was able to run the card with a lower fan profile thanks to the better paste. The card is somewhere upstairs on a shelf.
    AMD changes price/performance at specific times. Like when we had the 390X and then later the RX 480, which was basically the same performance as the 390X for half the price: $429 for the 390X on launch day, while the RX 480 launched for $200, which is actually less than half the price... $230 less for the same performance was a huge boon for those who couldn't afford the higher-tier stuff. Which is how things go; AMD always does this. Most generations are faster performance at either the same price or a higher price, and then out of nowhere one generation will offer last-gen performance for a significantly lower price, like my example. I could go into further detail but I'm too lazy; it's happened more than once.

  • @annekedebruyn7797
    @annekedebruyn7797 2 years ago

    I for one was glad I stuck with the 6770 until the RX480 came out.
    I was always jealous, though. The HD 7000 series were stunning cards in terms of looks.

  • @ytszazu2gmail381
    @ytszazu2gmail381 2 years ago

    I have an R5 240, HD 7750, HD 7770 and R9 270 as GCN 1.0 cards. Weirdly, my R9 270 failed first. The others are still working, though. Still using the R5 240 in the family computer.

  • @isaacweisberg3571
    @isaacweisberg3571 2 years ago

    Philip, this was a great video which made me very nostalgic

  • @J0elPeters
    @J0elPeters 2 years ago

    I still have my 7950 which I bought used in 2014. Such a fantastic card

  • @SweepiNetworks
    @SweepiNetworks 2 years ago +2

    Sadly, the HD 7970 was only on par with Nvidia's second-largest Kepler chip (GK104), allowing Nvidia to sell GK104 as a high-end card labeled GTX 680 and to use their most powerful Kepler chip (GK110) to establish a new "Enthusiast" tier with the GTX Titan.

  • @TylerL220
    @TylerL220 2 years ago

    My first card was a 7850, paired with an i7 920. That system impressed the hell out of me for what it was.

  • @PackardKotch
    @PackardKotch 2 years ago

    I know this is a Radeon 7000 series video, but damn does the 6000 series hold a special place in my heart. First dedicated graphics in an AIO PC, and first dedicated graphics in a desktop PC (6770M and 6950 1GB respectively).

  • @TrevorLentz
    @TrevorLentz 2 years ago

    I remember grabbing a used MSI Twin Frozr 7950, upgrading my rig from a brand new 6850. My FX-8320 processor may have been a bottleneck, but it kept my main rig up to date enough to enjoy games at 1080p. I wish I had kept the eBay email receipts to help remember how much I spent back in those days. GCN 1.0 was actually outstanding for used-parts folks at the time. I never saw the 7950 as a disappointment; it was good enough for me (a dude who only wanted to play games on his budget FX-8320 build). Sure, AMD could have pushed the clocks and made it less power efficient, but the market was a whole other world back then. I'll admit I was, foolishly, a bit of an AMD fanboy back then, but I could still enjoy games without breaking the bank.
    Having a better job and more money now, I can afford to upgrade my PC way more frequently (I should say, a few of my PCs). When I was younger and working a minimum-wage job, the inferior products at a reasonable price were a perfect step into a better generation of PC hardware.

  • @benedictjajo
    @benedictjajo 2 years ago

    Ah... I will never forget my 2GB 7870, which I used for 7 years. It still works, but it's getting its well-deserved rest in the storeroom.

  • @exmerion
    @exmerion 2 years ago

    My 7850 1GB had one fan and ran incredibly hot. When it hit 80°C during the summer, it would crash the display drivers.

  • @NekBlyde
    @NekBlyde 2 years ago

    This vid didn't appear anywhere in my subscription box; I only noticed it in my recommendations because I was watching your Iceland Day 2 video, which DID appear in my sub box. I hope 2kliksphilip gets fixed. :'(

  • @SyphistPrime
    @SyphistPrime 2 years ago

    My fondest memory of this era of GPUs was one not POSTing in my PC, so I had to go with a 6850. It makes me kind of sad, because a 7000 series card would have stayed usable for light Linux gaming much longer than the 6850 lasted. It was night and day, though, going from a 6450 to a 6850. I still have both of those GPUs to this day. The 6850 is unused, but the 6450 is a great display adapter for my server PC in case I need to hook up a display for console output.

  • @marsaurelius
    @marsaurelius 1 year ago

    4:52 Oh boy, have we come full circle.
    The RTX 4000 and RX 7000 series generation is more performance for A LOT more money, and the same performance for a little bit more money.

  • @TheXev
    @TheXev 2 years ago

    You can point to two events when it comes to the efficiency of AMD graphics cards. One: David Wang leaves ATI/AMD, and GPU efficiency goes to pot. Two: David Wang returns to AMD, and RTG's GPU efficiency increases dramatically and becomes competitive again. RDNA3 will be the first GPU he has fully had his hands in since his return to AMD, with a full 3 years of development. I expect great things out of RDNA3.

  • @DaboooogA
    @DaboooogA 2 years ago +1

    I had the R7970 Lightning by MSI

  • @jameslewis2635
    @jameslewis2635 2 years ago

    I still have an HD 7870 from my old Windows 7 system, and it lasted me quite a long time. It was powerful enough to be competitive against Nvidia's products at the time. However, AMD's design team seemed to get totally lost when it came to a follow-up series: everything that followed, up to the current 6000 series, trailed behind Nvidia's products, with price being the only place AMD could compete.

  • @anasevi9456
    @anasevi9456 2 years ago +1

    My best friend had a flipping 7870 GHz Edition he got used off Amazon for $130 USD in mid-2013 that made a mockery of my $320 770 by late 2015...
    Kepler may have been a better buy at launch, and perhaps through 2013, but it aged like milk, worse than any other uArch since. 7970 and 7870 owners have every right to laugh at us. But Nvidia learned: and though Pascal technically didn't age as well as contemporary GCN performance-wise, those cards were better buys then and are still solid cards 6 years later.

  • @vibardaniell.5833
    @vibardaniell.5833 2 years ago

    I used to have a 7950 3GB. It wasn't my favorite card, but I really commend it for being able to run Doom Eternal.

  • @mulymule12
    @mulymule12 2 years ago

    Ooo, I had two 7850s in CrossFire. Some games struggled, but otherwise they lasted well up to 2019 without issues.