Are AMD Radeon Graphics in Terminal Decline?!!?

  • Published: 15 Dec 2024

Comments • 984

  • @mm-yt8sf
    @mm-yt8sf 7 месяцев назад +600

    Wanting a company to lose all its competition, leaving us ravaged by tepid improvements and steep prices, is where "fan" turns into "cultist".

    • @13loodLust
      @13loodLust 7 месяцев назад +68

      How DARE you call us God-King Jensen worshippers cultists!

    • @cajampa
      @cajampa 7 месяцев назад +29

      My dude, if you call Nvidia's improvements tepid... what would you call AMD's?

    • @Heathmcdonald
      @Heathmcdonald 7 месяцев назад +53

      He means when there is no competition that's the result

    • @involuntarysoul3867
      @involuntarysoul3867 7 месяцев назад +13

      you are not a true believer, get him!

    • @cajampa
      @cajampa 7 месяцев назад +17

      @@Heathmcdonald But there is hardly any competition already.
      If Nvidia wants to sell lots of hardware they need to offer something, or we will just not buy. Just like when Intel gave us 5% per generation, because AMD had nothing competitive to offer.
      Back then your system lasted for many years, because performance rose so little that you missed nothing by waiting.
      Ngreedia has gotten used to massive, crazy profits. Those will not come from a 5% performance raise gen to gen, because people will not pay 2K again for an extra 5%. They would have no reason to, so Nvidia has to offer more to keep these high levels of profit.
      But I get it, you all want to be doomers.
      I am just not. Still, AMD is not going anywhere. They just will not compete in consumer GPUs at the high end for a long time, because they are not even trying.
      I still have not heard anyone say they want to have only one supplier of GPUs, so OP is just making up this little doomer scenario in his head.

  • @alexguild
    @alexguild 7 месяцев назад +271

    "terminal decline" gets the clicks and reads which is what the publishers need--just like content creators.

    • @JahonCross
      @JahonCross 7 месяцев назад +13

      True

    • @Blingchachink
      @Blingchachink 7 месяцев назад

      BURN

    • @Beerfloat
      @Beerfloat 7 месяцев назад +11

      Yes, the title is clickbait. Daniel is pretty good at that too, right!
      Still, despite the usual flood of comments from people who claim to have just bought a Radeon and are super happy (what bearing do these stories have on any of this?), market realities after three disappointing generations of RDNA chips look pretty grim. Terminal decline? Maybe not. Healthy and/or getting better? Also not.

    • @GewelReal
      @GewelReal 7 месяцев назад +3

      @@Beerfloat wasn't RDNA2 a bit of a success?

    • @farmeunit
      @farmeunit 7 месяцев назад +1

      @@GewelReal It was, as was the 7800 XT being a top seller. The real problem with the 6000 series was supply throughout the pandemic. Graphics is a fraction of the AI and server market; they make a lot more putting silicon towards those parts, which leaves the leftovers for the gaming market.

  • @13loodLust
    @13loodLust 7 месяцев назад +345

    Userbenchmark guy got a job at PC gamer?

    • @SpectreICollateral
      @SpectreICollateral 7 месяцев назад +18

      It's not some crazy UserBenchmark rant; it's a look at their official earnings, which are public information. And the Radeon gaming earnings did especially poorly in Q1 2024: they're 48% lower than Q1 2023, and their leadership expects the second half of the year to be even lower.

    • @PingTPunk-rq9us
      @PingTPunk-rq9us 7 месяцев назад +4

      So much for "Advanced Marketing Device"

    • @Nein99x
      @Nein99x 7 месяцев назад +42

      @@SpectreICollateral It still reads like a rant. Even if they have a year where Radeon cards aren't selling that well, they won't abandon their graphics division. There simply is no evidence that proves this terminal-decline crackpot theory.

    • @zeitgeistx5239
      @zeitgeistx5239 7 месяцев назад +1

      @@Nein99x AMD never specified discrete GPUs; they specified all GPU segments.

    • @ericbagley3613
      @ericbagley3613 7 месяцев назад

      Uh oh! The part of our business that makes us the least amount of money is in a downturn. Pack it up boys!
      Slow news week, obviously.

  • @andrewbroadfort6856
    @andrewbroadfort6856 7 месяцев назад +270

    I switched to AMD because of available memory and performance at a lower price. I'm very happy so far. I don't care about RT because at the resolution I use RT hits FPS too hard to be useable.

    • @caribbaviator7058
      @caribbaviator7058 7 месяцев назад +52

      Don’t forget FSR which still looks like 💩 compared to DLSS.

    • @SaltiestSweat
      @SaltiestSweat 7 месяцев назад +71

      @@caribbaviator7058 no it doesnt

    • @decliningship4208
      @decliningship4208 7 месяцев назад +27

      I play COD and for some reason the RX 6000 series gets more FPS than the Nvidia RTX 3000 series

    • @invisibleevidence1970
      @invisibleevidence1970 7 месяцев назад +15

      @@SaltiestSweat It unfortunately does.

    • @f-22raptor25
      @f-22raptor25 7 месяцев назад +13

      @@SaltiestSweat in some games DLSS Performance looks like FSR Quality

  • @telepop123
    @telepop123 7 месяцев назад +77

    It's more just really bad journalism rather than clickbait.

    • @Son37Lumiere
      @Son37Lumiere 6 месяцев назад +2

      It's both. Sensationalism at its finest.

  • @jumpman1213
    @jumpman1213 7 месяцев назад +312

    I'm on team red, but even I thought RDNA3 was disappointing. At some point AMD needs to realize the "Nvidia at home with a $50 discount" tactic is never going to cut it.

    • @misterPAINMAKER
      @misterPAINMAKER 7 месяцев назад +27

      I think the RX 7800XT was great.

    • @patrickjames3321
      @patrickjames3321 7 месяцев назад +12

      You are clearly mistaken; AMD GPUs are good this gen, the better of the two competitors this time around

    • @22Chad_Reed22
      @22Chad_Reed22 7 месяцев назад +52

      @@patrickjames3321 If not using upscaling or ray tracing, that is. Those are things I don't care about, but the only reason I got the XTX was that it was much cheaper than a 4080; at the current prices I see no reason to take the XTX over the 4080.

    • @elderman64
      @elderman64 7 месяцев назад +6

      ​@@patrickjames3321 We're talking about sales and sales tactics, not performance

    • @elysian3623
      @elysian3623 7 месяцев назад +18

      In fairness, it's not that the 7900 XTX is bad, it's that the 4090 is twice as fast as a 3090. Nvidia went to the most expensive node possible and did everything they could to take the performance crown, and that they did. But it's 2.5x as expensive if I choose a 4090 vs a 7900 in the UK, and it just isn't worth it.

  • @MrRoro582
    @MrRoro582 7 месяцев назад +134

    Competition is good for us, the consumers. I went team RED this time around. No issues with my SAPPHIRE NITRO+ 7900 XTX. I play at 4K and have no driver issues.

    • @PrimeCo129
      @PrimeCo129 7 месяцев назад +6

      Same! Not Sapphire though but definitely a 7900 xtx. Works like a charm.

    • @MrRoro582
      @MrRoro582 7 месяцев назад +1

      @@PrimeCo129 they are good GPUS for gaming.

    • @bigmatth23
      @bigmatth23 7 месяцев назад +10

      @@PrimeCo129 Same for my 6950XT, never had any bugs. Surprised that it is still very good these days. Can't complain about that!

    • @misterPAINMAKER
      @misterPAINMAKER 7 месяцев назад +6

      But if AMD Radeon goes bankrupt with RDNA 5, you won't have any new game drivers.

    • @maskoncr00ked
      @maskoncr00ked 7 месяцев назад +10

      Not too long ago I got a 6950 XT and I am happy with it.

  • @misterbrickest
    @misterbrickest 7 месяцев назад +86

    OK, the internet has broken people's minds. Everything is clickbait now, even beyond the headlines. AMD's "Gaming" segment is down because of many complex factors:
    1) AMD counts consoles (and handheld chips) in this segment. We are mid-generation, with the PS5 Pro and whatever Xbox "Pro" version still to come, so we are far past initial demand.
    2) We are at the end of a GPU generation cycle, and most sales happen around launches. This also applies to the handheld chips (iGPUs like the Z1 Extreme for the ROG Ally etc.), which are approaching the end of their life cycle as well, with the upcoming Strix Point replacement in the next CPU series.
    3) This point affects many companies: the POST-PANDEMIC sales rebound. Many companies had higher-than-average sales because many people were bored at home (some with extra cash) and invested in hobbies and other elective purchases. This REALLY affected GPU markets in particular, as there was a BOOM in crypto mining driving prices and demand up to ridiculously high levels: 2 to 3 times MSRP on some cards, mainly Nvidia, but AMD as well.
    4) Add ALL of this to the fact that AMD didn't exactly nail it this gen (the XTX was great, with good value in the mid-range as well) and has had slower sales in some markets in the actual desktop GPU division.
    So before everyone reads another clickbait headline, go read the stats (and learn what they actually cover) and the transcripts. AMD's main business is doing great, as selling desktop GPUs is more of a "side hustle" for them: record data center GPU sales, and steadily chipping away at Intel's desktop CPU market.
    So everyone calm down, AMD is fine and flush with cash.

    • @Eleganttf2
      @Eleganttf2 7 месяцев назад +4

      you sound like you're doing damage-control PR lol

    • @kathrynck
      @kathrynck 7 месяцев назад +4

      I'm not sure I agree with point 2), but the rest makes sense.
      Also, sales for AMD are pre-AIB-partner, which is pre-retail... so there's a lead time on that. GPUs on the shelf selling today were sold by AMD as chips some 6 months ago. Ones being 'bought' right now will be hitting shelves as finished products around the holiday season, when Nvidia plans to release their new cards... it's kind of at the wind-down stage of current-gen GPU manufacturing.

    • @misterbrickest
      @misterbrickest 7 месяцев назад +5

      @@kathrynck I hear you. There are absolutely sales throughout the generation; I'm just referring to the late-gen liquidation (prices are cut: lower margins) and the new-gen spike in sales.

    • @misterbrickest
      @misterbrickest 7 месяцев назад +7

      @@Eleganttf2 Just to be clear, I'm not saying AMD is doing great. In fact I stated that they didn't exactly nail it this gen. And I just bought a 4060 laptop (got a massive deal over Christmas: a $779 open-box Legion Slim 5), so I'm not trying to be a fanboy either.
      I'm just tired of the hyperbole in EVERY SINGLE ARTICLE and VIDEO (lol, sarcasm... get it?)

    • @stormrider01
      @stormrider01 7 месяцев назад +2

      This comment is gold! Thank you!🙏

  • @Mark-kr5go
    @Mark-kr5go 7 месяцев назад +27

    All these fears seem blown out of proportion. AMD is in a stronger position than it has been for the better part of the last two decades, in both the CPU and GPU segments. They will be fine.

  • @busterspark4410
    @busterspark4410 7 месяцев назад +25

    If anything is in terminal decline, it’s Xbox.

    • @Nicx343x
      @Nicx343x 7 месяцев назад +7

      sad but true

    • @brewergamer
      @brewergamer 7 месяцев назад

      Microsoft is smart enough to understand that software is more profitable than hardware; that's why it would literally take 36+ Sonys to equal the value of Microsoft.

    • @valentinvas6454
      @valentinvas6454 6 месяцев назад

      @@brewergamer We can talk about raw numbers, but Xbox first-party games these past years have often been bad to mediocre, and that's the main problem. Halo Infinite, Redfall, Starfield and now Hellblade 2: I played all of these and they aren't looking so hot.

    • @OC.TINYYY
      @OC.TINYYY 6 месяцев назад

      @@valentinvas6454 Sony hasn't released a first-party single-player game in over 2 years, you realize that, right? Xbox has had quite a few, all from average to great. If anything, Xbox has released tons of games; people are just too ignorant to see it. Sony is in a downward spiral when it comes to single-player games. They are full-on GaaS now.

  • @ionrage
    @ionrage 7 месяцев назад +27

    I would actually say nearly every single consumer segment is in decline, not just the gaming GPU industry -- nearly everything.
    The only thing that's doing fine is business-to-business.
    Your average Joe consumer isn't buying much; Nvidia's gaming GPUs are also on the decline.

    • @anthonylong5870
      @anthonylong5870 7 месяцев назад

      Not 3D printing... it's EXPLODING right now

    • @Son37Lumiere
      @Son37Lumiere 6 месяцев назад

      @@anthonylong5870 Yes that and the AI craze, everything else is in decline though.

  • @MlnscBoo
    @MlnscBoo 7 месяцев назад +17

    I just built a friend a full AMD rig a few days ago. 5900x for $210 & a 7800xt for $500. It's about 20% faster than my own 11700k and 6800. Very Nice

    • @Beerfloat
      @Beerfloat 7 месяцев назад +1

      So?

    • @Greez1337
      @Greez1337 7 месяцев назад +3

      Yes, having a GPU that is around 20% stronger than your GPU would do that...

    • @The_Chad_
      @The_Chad_ 7 месяцев назад

      Your poor friend. I've been building PCs for people for years and I almost never use AMD products in other people's PCs. My own main system is all AMD, but I can fix stuff that goes wrong and I knew I was making a compromise when I built mine. I wouldn't do that to someone else though, unless they insist. And even though I knew what I was signing up for when I built mine, I've regretted it like crazy. At the time I mostly played Ubisoft games and other console ports. Now I mostly play indie games and AMD has been pure garbage, especially now that indie games are using UE5 a lot with ray tracing and DLSS but not FSR. My 2080 Ti runs way better than my 6900 XT in a lot of indie games.

    • @CryingWolf916
      @CryingWolf916 7 месяцев назад +1

      did you also tell your friend that he will never get to experience proper raytracing

    • @MlnscBoo
      @MlnscBoo 7 месяцев назад +1

      @@Beerfloat So I'm jealous lol. I just thought it was funny that people are saying AMD is dying while me and thousands of others are building AMD rigs right now.
      @The_Chad_ I too have been building computers, for 25 years. AMD has no more problems than Nvidia now. Question: do you think ray tracing is going away? And do you think Nvidia should have a monopoly on it if it doesn't? Do you remember the Madfinger debacle on mobile Nvidia chips? I don't trust companies that artificially close-source tech. My G2X was perfectly capable of using those "Nvidia-only features", and when people figured out it was literally just a line of code preventing them from using these "features", and NOT hardware, poo hit the fan. Spending 500 dollars on an Nvidia GPU will get you a fraction of the performance of the AMD equivalent. Nvidia has become the Apple of the GPU market: overpriced and anti-consumer.

  • @pradeepsethi90
    @pradeepsethi90 7 месяцев назад +4

    I'm still on 1080p monitor, upgraded from 1650 to a 7700xt last year. I'm pretty satisfied honestly.

  • @ToxicBenchmarks
    @ToxicBenchmarks 7 месяцев назад +37

    Yeah, I've been hearing about the Radeon "terminal decline" ever since they bought ATI. Guess what, it did not happen.
    Let's not forget that GPUs like the RX 7800 XT actually sold quite well. If the rumors are true and they focus on low- to mid-tier GPUs next gen, and actually have competitive pricing, they could easily win some market share and use that time to improve the software side of things, like FSR upscaling (it seems they actually started to work on this with FSR 3.1). Then they could come back to the high-end tier with a new generation (maybe RDNA 5).
    I have a Sapphire Nitro+ 7900 XTX that I got after returning my RTX 4080 because of fan rattling and high temps. Because I do not tend to use RT in games (although with the overclock I got, it is not that far from the 4080 in RT), I decided to save myself 200 euros and get the 7900 XTX Nitro+, and I will definitely consider going AMD again when I upgrade my GPU, as I have not had any issues with it. It overclocks like a beast: I got a ≈20% increase in performance from UV+OC.
    Edit: Oh, btw, let's not forget that Nvidia also makes less money from its gaming division. Their revenue from gaming is now 12% of their total revenue, down from 28% last year.

    • @bindxpoxt
      @bindxpoxt 7 месяцев назад +3

      It actually happened one time, and it's about to happen again.

    • @ToxicBenchmarks
      @ToxicBenchmarks 7 месяцев назад +1

      @@bindxpoxt What exactly do you mean? What happened?

    • @bindxpoxt
      @bindxpoxt 7 месяцев назад +8

      @@ToxicBenchmarks They left the high-end GPU market and stopped competing, just like it's rumored they will do again next gen. They failed once again, leaving Nvidia a monopoly and us consumers getting screwed by insane prices.

    • @ishiddddd4783
      @ishiddddd4783 7 месяцев назад +2

      The problem is that AMD can barely compete with Nvidia even though Nvidia treats gaming as an afterthought, while it's AMD's main focus.
      Only cult-like fanboys want AMD to fail, but the reality is that they are a galaxy away from Nvidia. Of course it comes down to each person, but it's the reason the XTX is the sole RDNA3 GPU in the surveys: Radeon doesn't sell nor produce enough GPUs to keep up with Nvidia.

    • @kagetora03
      @kagetora03 7 месяцев назад +3

      @@bindxpoxt People will still find ways to justify paying 500 bucks for a 60-class card anyway. Even if AMD has a good generation with good prices, not much will change; people just want AMD to be more competitive so that they can get Nvidia GPUs for cheaper lmao

  • @crzyces1693
    @crzyces1693 7 месяцев назад +14

    Edit: I have been on Steam since the year it launched. I have never, ever gotten an invite to take the survey, and I've had AMD GPUs on 5 different occasions, dating back to the x700m 128 era.
    Intel would be out of their mind to stop making GPUs. After missing the mobile phone chip phenomenon that turned into a worldwide staple, then missing the last two-plus mining booms, if they were to also miss out on the x64 AI mania they may as well start downsizing and calling themselves _"Little Blue"_, because they will go the way of IBM. Even if the AI market is a giant bubble, that does not mean AI chips will not become the next cell phone chips; frankly, I think it will probably be bigger, and the bubble itself is more reflective of the markets as a whole than of the technology behind it.
    Same goes for AMD. The overlap between the tech is so large right now, and AMD's piece of the graphics pie is still such a huge part of their annual income, it would be moronic, possibly *criminal* in the eyes of shareholders, for them to abandon it. As much as I want a high end card from them to be on the market this year for *me* to have on my block of purchasing options, I understand why sticking with midrange cards may be the right play for them. If they know they cannot compete against a 5080 or 5090 at a reasonable price, then from a fiduciary responsibility standpoint it doesn't make sense for them to do so, especially if they feel they can fast track a new, higher price to performance and far superior design to the market 6 months to a year before Nvidia is ready to release their next chips. Also, this could change up the cadence so AMD is not competing directly with Nvidia every other year in two year blocks. Nvidia could run away with the crown one year, then AMD could take it for less money (at least initially), before Nvidia comes back to reclaim the lead, then 10-12 months later AMD leapfrogs them and so on.
    I also assume that AMD will want to bin as many chips as possible that don't quite reach professional level of power to performance demands so they can launch with enough cards after supplying Microsoft and Sony in the console market.
    So does this suck? Well for me, yes, as I have a funny feeling that the rumors of a $900 5080 that just edges out the 4090 in raster while beating it by 50% in RT are rubbish, while the 5090, if the specs are accurate, cannot be less than $2K. Monolithic, 2 chips, it doesn't matter. If the 5080 looks like it will be one 350mm^2 chip with a 256-bit bus and the 5090 comes in at 700mm^2 with a 356-bit bus and 92MB of cache (or 2 325mm dies with a separate IO die on an even bigger substrate, though I doubt that is happening in the consumer discrete GPU market), the thing is going to cost a ton. Twice everything besides cache compared to the 5080... yep, I just don't see any breaks on the pricing front. That said, if the 5080 comes out first with no competition from AMD then there is no reason for them to keep the same price, never mind drop it. If AMD can't match up in ray tracing/path tracing right now for what people paying that much will expect, it's just a waste of silicon for them to divert away from the pro MI chips, forcing them to lower prices more on their 7900 XT and XTX again while probably not doing much to satisfy what $1K+ GPU buyers want.
    Hopefully in a few years I'll be looking at 10900XT/X's from AMD, but for now it looks like I'll be stuck overpaying a little bit for Nvidia if I want a card to do what I expect it to when we are talking about the price of a used 1995 Honda Civic.

    • @testerpt5
      @testerpt5 7 месяцев назад

      That is odd. I have had an account for 15 years(?) and received the invitation at least 3 times; maybe they run some surveys taking market location into consideration

    • @haukionkannel
      @haukionkannel 7 месяцев назад

      I have been in the Steam survey several times... I would say regularly!
      A couple of times with an Nvidia 1070 and a couple more times with an AMD 5700 XT...

  • @ZioGiGi
    @ZioGiGi 7 месяцев назад +35

    "Terminal decline" 🙄 Jeremy Laird needs to do some real journalism. I went with a 7900 XTX and have never been happier. Converted my brothers to Radeon. Like it or not, Radeon is here to stay.

    • @Beerfloat
      @Beerfloat 7 месяцев назад +3

      Wait, you bought a Radeon and Jeremy Laird did not even mention that? He was possibly not even aware of it???
      Clearly this amazing development changes the entire outlook. Pfft. Journalism these days, amirite 🙄

    • @poopoppy
      @poopoppy 7 месяцев назад

      I have a 7800 x3d with a 6950. I like AMD. I wouldn't buy a high end AMD card though, as they just don't have the features like DLSS. They just can't quite compete at the moment. Yes. They are here to stay. But they can't go toe to toe with Nvidia.

    • @shimik11
      @shimik11 6 месяцев назад

      You sound triggered ​@@Beerfloat

  • @ehenningsen
    @ehenningsen 7 месяцев назад +46

    I'm not a fan of any corporation; I've gone with Nvidia recently for ray tracing and the features.
    I'm looking forward to seeing how much of the ray tracing performance and feature gap is closed with RDNA4 and RDNA5.
    I just want the best card.

    • @RadarLeon
      @RadarLeon 7 месяцев назад +3

      4060 Ti > $389-$399, 6800 > $369-$389. The 6800, despite being a generation behind, outperforms a 4060 Ti even when the latter uses DLSS; it also outperforms it in ray tracing with RT turned on for both, and the only way the 4060 is playable is with DLSS, and it's still beaten by 30 fps when ray tracing Spider-Man 2.
      I will take my pure raw-powered frames over the AI-generated and smudged DLSS frames any day.
      That being said, the only thing AMD doesn't have a match for is the $1600 4090. Also, I love my 7900 GRE; as gimped as that card is compared to the 7900 XT, the price is fair.

    • @chase7974
      @chase7974 7 месяцев назад +5

      Exactly. AMD can't compete with DLSS, drivers, and RT performance, and they probably never will.

    • @adlibconstitution1609
      @adlibconstitution1609 7 месяцев назад +6

      I got my RX 7800 XT swapped for an RTX 4070 Super because I've been playing Cyberpunk frequently lately, and my gosh, the ray tracing performance is like a generational upgrade. I've finished 2 DLC endings with path tracing turned on at 1440p, which was out of the question when I was using the RX 7800 XT.

    • @NatrajChaturvedi
      @NatrajChaturvedi 7 месяцев назад +4

      They will call you a Jensen fanboy and an Nvidia cultist for that. Not wanting to put up with losing access to amazing features like DLSS and even RT after spending $600+ (if buying a 4070 or more expensive) is apparently an alien thought to these people (how??!?!?!)
      It makes me wonder if they actually buy graphics cards with real money, or are these Ryzen 3600 + RX 580 owners waving AMD flags online lol????

    • @RadarLeon
      @RadarLeon 7 месяцев назад +2

      @@NatrajChaturvedi I hate DLSS because of the artifacts it puts into your game; it's also not great in FPS games

  • @dremy746
    @dremy746 7 месяцев назад +31

    Hold on, you're saying that an article with a clickbait title is clickbait? The internet would never lie to me like that!

  • @johndelabretonne2373
    @johndelabretonne2373 7 месяцев назад +13

    I would say it seems like PCGamer is in terminal decline!

  • @tylerclayton6081
    @tylerclayton6081 7 месяцев назад +34

    AMD is a $245 billion company that makes the GPUs and CPUs for all the consoles and gaming handheld devices. Even if sales are down, it makes no sense for them to just cancel their gaming GPUs. They're still making huge profits.

    • @Sly_404
      @Sly_404 7 месяцев назад +6

      They don't. Check their earnings reports: the gaming business area generates little profit. The claim isn't that they'll stop all gaming, but that a dedicated desktop GPU line might not be financially viable for them much longer.

    • @pedroferrr1412
      @pedroferrr1412 7 месяцев назад +3

      @@Sly_404 Only an idiot would abandon the consumer GPU market when there are only 2 real players (AMD and Nvidia). AMD is full of money this time; just wait until 2026 and you will see whether they have stopped making GPUs for games.

    • @Beerfloat
      @Beerfloat 7 месяцев назад +2

      @@pedroferrr1412 How often have we heard 'just wait' for some new AMD GPU generation to turn things around?

    • @sulphurous2656
      @sulphurous2656 7 месяцев назад

      They literally aren't. At most they might make $100-$200M on Radeon GPUs, otherwise they're losing money.

    • @The_Chad_
      @The_Chad_ 7 месяцев назад

      @@Beerfloat ALWAYS!!🤣🤣

  • @chuckwood3426
    @chuckwood3426 7 месяцев назад +3

    The main problem this generation has probably been the pricing and the naming. An example: the 7900 XT launched at $900, which was quite expensive for what it delivered compared to Nvidia. I personally bought this card for $707 half a year later and it was a great buy! But that was probably the price it should have had from the beginning if they wanted it to be popular, and at this point it's too late, as it has already been consigned to an also-ran in people's minds.
    Also, it should really have been named the 7800 XT. That would make more sense, as there was already another card named 7900, and it would then launch at the same $700 launch price as the 6800 XT did.

  • @jeffreyrodriguez9009
    @jeffreyrodriguez9009 7 месяцев назад +1

    Wanted a prebuilt for my first gaming PC in a while. Couldn't find any with a 4080 at 2k or less, so I went with an XTX PC for 1850 and couldn't be happier.

  • @Posh_Quack
    @Posh_Quack 7 месяцев назад +26

    Watching this on a 7000 series GPU

    • @Beerfloat
      @Beerfloat 7 месяцев назад

      It seems not very many people are, though.

    • @j_will9
      @j_will9 7 месяцев назад +3

      Unfortunately people would rather pay more for less performance and use DLSS to justify it, or cry about AMD's lack of RT performance

    • @Posh_Quack
      @Posh_Quack 7 месяцев назад +1

      @@j_will9 I'm testing out graphics settings and honestly I can barely tell when RT is on, besides the lack of frames. I just turn RT off and set everything else to ultra and it looks great.

    • @Maxgamingperformance-nl1xc
      @Maxgamingperformance-nl1xc 7 месяцев назад +1

      @@Posh_Quack you're right, RT is too overrated. Just marketing tactics.

    • @Maxgamingperformance-nl1xc
      @Maxgamingperformance-nl1xc 7 месяцев назад

      @@j_will9 They justify frame gen too, I don't know why they think that's good.

  • @akigaming-tv
    @akigaming-tv 7 месяцев назад +1

    I was so excited about my 7900 XTX. I received it, installed it... I launched Cyberpunk and was modifying some graphics parameters inside the game menu to use the full potential of this card, and tada!!!!! ==> back to Windows with the message "Drivers stopped responding..." :(
    OK, no worries. I used DDU and the AMD cleanup utility and re-installed the drivers so many times (drivers-only mode, pro drivers... everything) => same issue. CP2077, Forza 5, whenever I modified in-game graphics options => crash with the message "Drivers stopped responding..."
    I returned it and ordered a 4080 Super... this is the end of the story. AMD, maybe in another life... thanks for everything.

  • @raven80wolfx2
    @raven80wolfx2 7 месяцев назад +5

    I am happy with my 7900 XTX, it's an amazing GPU. Stable, and the performance has gotten better and better!

  • @cael_1303
    @cael_1303 7 месяцев назад +8

    I actually got the steam survey last month so I contributed to that RX6600 number for the first time. Last time I got it I think I had yet to upgrade from my 750TI. But yeah, I think "terminal" was the clickbait word for the article of the day. If it had just said decline then people probably wouldn't have paid attention. I imagine "critical" or "serious" were other words they could have used.

  • @the_alquemist101
    @the_alquemist101 7 месяцев назад +30

    I think that if AMD is able to solve the CUDA problem they can create a massive turning point in the GPU competition market.

    • @WolfChen
      @WolfChen 7 месяцев назад +9

      It's just not possible for them.

    • @LucidStrike
      @LucidStrike 7 месяцев назад +2

      Does the HIP wrapper or whatever not already "solve" CUDA?

    • @Argoon1981
      @Argoon1981 7 месяцев назад +8

      CUDA is an Nvidia tech.
      AMD has a compute language of their own, and even if they didn't, they support OpenCL just fine, which is an open, free CUDA competitor.
      Software makers just need to get off their lazy asses and support OpenCL or AMD's own compute language.

    • @x0Fang0x
      @x0Fang0x 7 месяцев назад +4

      @@Argoon1981 AMD needs to make a tech better than or on par with CUDA; that's why he said AMD has a CUDA problem. Currently AMD's alternative offering to CUDA is worse.

    • @arenzricodexd4409
      @arenzricodexd4409 7 месяцев назад

      @@Argoon1981 It is not about software developers being lazy. Nvidia is willing to spend billions to ensure their CUDA ecosystem grows; years ago Nvidia went around the universities to make sure CUDA was being taught. The thing is, I see the industry wanting to break free from CUDA, but doing all that work needs tons of resources. Do you know why AMD pushed heavily towards open source years ago? One of the primary reasons is that they wanted to save cost by offloading the effort onto the open community. The moment their open-source driver ended up more established on Linux, they ended up firing more people from their Linux division, as reported by Phoronix.

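    (Editor's note on the HIP/CUDA exchange above: the snippet below is a minimal, hypothetical illustration, not taken from any commenter, of how AMD's HIP mirrors the CUDA programming model nearly call for call, which is what "hipify"-style porting leans on. It assumes ROCm's hipcc compiler, which supports the CUDA-style triple-chevron kernel launch.)

      // Hypothetical sketch: a SAXPY kernel written against HIP.
      // Renaming hip* calls to cuda* (and swapping the header) yields valid CUDA.
      #include <hip/hip_runtime.h>

      __global__ void saxpy(int n, float a, const float* x, float* y) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;   // same built-ins as CUDA
          if (i < n) y[i] = a * x[i] + y[i];
      }

      int main() {
          const int n = 1 << 20;
          float *x = nullptr, *y = nullptr;
          hipMalloc((void**)&x, n * sizeof(float));        // cf. cudaMalloc
          hipMalloc((void**)&y, n * sizeof(float));
          // In real code, fill the buffers with hipMemcpy (cf. cudaMemcpy) first.
          saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // same launch syntax under hipcc
          hipDeviceSynchronize();                          // cf. cudaDeviceSynchronize
          hipFree(x);
          hipFree(y);
          return 0;
      }

    (The gap commenters point to is less the kernel language, which is nearly identical, and more the surrounding libraries and ecosystem built around CUDA.)
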
  • @Dipankar89
    @Dipankar89 7 месяцев назад +2

    They need to fix FSR 2; many people avoid AMD due to FSR alone.

    • @Son37Lumiere
      @Son37Lumiere 6 месяцев назад +1

      Which is pathetic because in reality fsr isn't that much worse than dlss. DLSS is primarily only better at lower resolutions and well, if you have to use upscaling at 1080p, you might want to consider just getting a console.

    • @Dipankar89
      @Dipankar89 6 месяцев назад

      @@Son37Lumiere yeah fr

  • @stevenanderson3205
    @stevenanderson3205 7 месяцев назад +21

    Welcome to YouTube, where every thumbnail is clickbait.

    • @PrincessofKeys
      @PrincessofKeys 7 месяцев назад +5

      How is this thumbnail clickbait?

  • @turbobros_online1561
    @turbobros_online1561 7 месяцев назад +1

    Great to see your channel growing Daniel! I appreciate all your hard work!

  • @johndoe7270
    @johndoe7270 7 месяцев назад +4

    There is no Team AMD or Team Nvidia, they are one team behind the scenes, colluding together to jack up prices.

    • @Ori_Ovadia
      @Ori_Ovadia 7 месяцев назад +1

      what are you talking about?

    • @johndoe7270
      @johndoe7270 7 месяцев назад +2

      @@Ori_Ovadia Price collusion.

    • @karmaftw3730
      @karmaftw3730 7 месяцев назад

      Delusional

    • @Ori_Ovadia
      @Ori_Ovadia 7 месяцев назад

      @@johndoe7270 You are suggesting AMD and Nvidia act as a cartel in the GPU market; that is not the case. They are not working together to keep prices high.

    • @CryingWolf916
      @CryingWolf916 7 месяцев назад

      bro is onto something

  • @jamesrock9446
    @jamesrock9446 7 месяцев назад +2

    When I ran CAD benchmarks, the 7900 XTX outperformed the 4090 in many cases. I ended up returning the AMD card but regretted keeping the 4090. The 4090 costs 2x more here in Canada than the 7900 XTX, and for my purposes the AMD card had a lot going for it.

  • @flatfish-d9o
    @flatfish-d9o 7 месяцев назад +7

    PC Gamer: the global authority on clickbait, poorly thought-out, and under-researched articles

  • @txmetalcobra
    @txmetalcobra 7 месяцев назад +2

    7800XT was the best bang for the buck GPU this generation! Some Nvidia fanboy wrote this article. AMD tech keeps getting better and better every generation. I was team green until the 40 series total BS. Nvidia can go shove it until they fix the low vram BS for a $600 card. They force you to spend $800 to get something future proof. HAWK TUUAHH!

  • @JohnSmith-ro8hk
    @JohnSmith-ro8hk 7 месяцев назад +27

    These doom-and-gloom articles are so clueless. AMD shifted production from GPUs, of which they still had a ton of old 6000-series stock to sell, to APUs and laptop dies. They are making more money because they shifted where they put their wafer capacity. I don't think people realize how many products AMD actually makes outside of GPUs that require wafer capacity.

    • @Sly_404
      @Sly_404 7 месяцев назад +2

      The earnings reports disagree with your claim that they make more money. This couldn't be further from the truth.

    • @Beerfloat
      @Beerfloat 7 месяцев назад

      Where are they in laptops though?
      CPUs, yeah, a good few. GPUs and APUs? Near complete absence.

    • @TuanLe-gl4yt
      @TuanLe-gl4yt 7 месяцев назад

      @@Sly_404 ruclips.net/video/PhK2HiXeBWo/видео.htmlsi=REZkW4qnd2mtgNst&t=478
      LOL, it's not that huge of a decline. Clearly that shitty journo didn't do his job.

    • @JohnSmith-ro8hk
      @JohnSmith-ro8hk 7 месяцев назад

      @@Sly_404 A 4% decline last year but a huge surge in Q4, at a time when everyone's revenue is down and in a lull between launches. You sell old stock at a discount and you don't make big profits; that's expected. Console sales are lackluster; that's an industry issue. Want to wager what the Q1 report will say?

  • @adriankoch964
    @adriankoch964 7 месяцев назад +2

    Even RDNA4 could be good for AMD. They don't have to be super fast or perform at the high end. All they have to provide is excellent price-to-performance. That's why the RX 580s are doing so well in the Steam survey: when it released it was priced relative to a 1060 and performed ~20% better.
    If the 'midrange' RDNA4 will offer RX 7900 performance with better ray tracing at

  • @my-yt-inputs2580
    @my-yt-inputs2580 7 месяцев назад +5

    An article on "Motley Fool" is all you need to know. Not something that inspires trust.

    • @Eleganttf2
      @Eleganttf2 7 месяцев назад

      so you don't believe AMD's earnings call? 😂

    • @omgnowairly
      @omgnowairly 7 месяцев назад +1

      @@Eleganttf2 It's more that "Motley Fool" puts out content to pump and dump stocks with the accuracy of a coin flip. Deep insight takes second place to "vibes".

    • @my-yt-inputs2580
      @my-yt-inputs2580 7 месяцев назад +1

      @@Eleganttf2 It's not about the earnings call, it's about the title of the article itself. Also let's wait to see what the Nvidia earnings call will be.

    • @Son37Lumiere
      @Son37Lumiere 6 месяцев назад

      @@omgnowairly I agree but in this case an earnings call is an earnings call, hard to interpret or spin it many other ways. Whereas the pcgamer article was pure sensationalism.

  • @DaveGamesVT
    @DaveGamesVT 7 месяцев назад +2

    They could have cleaned up this gen if they had kept prices reasonable. But no, they wanted to follow Nvidia's BS of raising prices dramatically. They chose to be only SLIGHTLY less garbage. Chasing higher margins has cost them market share.

  • @jerrycurls88
    @jerrycurls88 7 месяцев назад +4

    Imagine being so far up Nvidia's a$$ that you root for the competition to fail. Without competition, the 5060 will have 8GB of VRAM and sell for $800.

  • @devongreen7308
    @devongreen7308 7 месяцев назад +2

    Let's be honest, even if AMD made a GPU generation that beat every single Nvidia GPU at every tier, there would still be people who wouldn't buy them just because they aren't Nvidia. It's sad but true. I just switched to AMD from Nvidia and couldn't be more pleased. Sure, it's behind in ray tracing, and they need to fix FSR3 and Anti-Lag+. But I believe that in due time, with RDNA5, they will gain some market share. They're going to have to do their absolute best to exceed Nvidia in performance and then push the new product line with some incentives, actual advertising, and bundle deals with Ryzen CPUs at discounted prices.

  • @tzad2104
    @tzad2104 7 месяцев назад +6

    Without proper competition, Nvidia will charge $899 for xx50 Ti or xx60 series cards.

    • @Chrissy717
      @Chrissy717 7 месяцев назад

      Which is hilarious to me, because the 50/60 series will be the ones AMD will compete with in the next generation.
      And only those.

    • @haukionkannel
      @haukionkannel 7 месяцев назад

      And so it begins!

  • @EySa7
    @EySa7 7 месяцев назад +1

    Being a long-time Nvidia customer, I bought a 7900 XTX and gave it a chance for 2 months. I was okay with the RT performance not being good, but I encountered a lot of driver issues: Adrenalin not working with any of the latest drivers I tried, and having to use some of their old or beta drivers to even be able to use Adrenalin, etc. That is too much hassle. I really wanted to use that card but returned it in the end. As a side note, I returned it and got a different-brand 7900 XTX to see if it was a problem with the device itself, but nope... it happened to all of them.

  • @lukec1
    @lukec1 7 месяцев назад +4

    Graphics in general are in "decline." Or at least we are running out of improvement space.

    • @onomatopoeia162003
      @onomatopoeia162003 7 месяцев назад +1

      My POV would be to look at the tech industry as a whole.

  • @happybeing777
    @happybeing777 7 месяцев назад +2

    I switched to team green last year, but I hope AMD can polish their features so they can at least compete with Nvidia.

  • @ravijotsingh7831
    @ravijotsingh7831 7 месяцев назад +3

    I am fully convinced to buy the RX 7600, with a difference between the 7600 and the 4060 of Rs. 6000.

    • @Son37Lumiere
      @Son37Lumiere 6 месяцев назад +1

      The 7600 is a far better deal, the 4060 realistically cannot ray trace either.

  • @jedimindtrickonyou3692
    @jedimindtrickonyou3692 7 месяцев назад +1

    I think those of us who watch Daniel are much more likely to buy AMD than the average user. They offer more value at certain price points (sometimes much more value and memory), but they've got to get FSR up to snuff. It's the biggest reason I got a 4070 instead of the more powerful 7800 XT. But I do have a 7840HS mini PC that I use daily for gaming and emulation in my living room.

  • @TheFearChannel
    @TheFearChannel 7 месяцев назад +9

    If you aren't playing a heavy raytracing title like Cyberpunk(and buying one of the higher end cards to go with it), or doing 3D modeling in blender, going Nvidia is just overpaying for a graphics card.

    • @merlingt1
      @merlingt1 7 месяцев назад +4

      DLSS

    • @kevinerbs2778
      @kevinerbs2778 7 месяцев назад +2

      @@merlingt1 which makes RTX cards twice the price of the old GTX cards.

    • @ishiddddd4783
      @ishiddddd4783 7 месяцев назад

      @@kevinerbs2778 Then I wonder why AMD, offering neither DLSS, competitive RT, nor software compatibility, can justify barely undercutting Nvidia RTX cards.

    • @TheFearChannel
      @TheFearChannel 7 месяцев назад +2

      @@merlingt1 If you end up with more silicon for your dollar, you can just render native at high FPS.

    • @Maxgamingperformance-nl1xc
      @Maxgamingperformance-nl1xc 7 месяцев назад

      blender improved amd performance some time ago

  • @lltheFacell
    @lltheFacell 7 месяцев назад

    RDNA2 just made way more sense $/performance-wise up until very recently, which could explain some lackluster Q1 sales figures for RDNA3

  • @viniqf
    @viniqf 7 месяцев назад +9

    AMD is in a complicated situation where they have good products at great price points, but never really manage to outcompete Nvidia.
    They are behind in software features like DLSS, Frame Gen and Ray Tracing, which leaves them to offering more VRAM and other hardware features to compensate, but it seems the market favors those technologies over pure hardware. It just seems like they need to get their feature-partity just right, and offer exactly the same software features, but from their side, to truly become competitive.
    People claim RT doesn't matter, and the same for upscaling, but somehow the company that focuses on both of these solutions outsells AMD by large margins and doesn't seem to be losing any marketshare. I hope RDNA5 is the change they need, and a start of true competition, by directly hitting Nvidia instead of just undercutting them.

    • @WutTheHekk
      @WutTheHekk 7 месяцев назад

      Ahha tiddy, I'll see myself out..

    • @Sp3cialk304
      @Sp3cialk304 7 месяцев назад +7

      It's because the people who say modern features don't matter are a small but very loyal and loud group. If you get outside the "DIY/Techtube" echo chamber and into the actual gaming communities, you see those things really do matter. It's like flat-earthers: they are small in number, but very vocal and loud at pushing their opinion.

    • @arch_..
      @arch_.. 7 месяцев назад

      It also doesn't help that AMD isn't being aggressive in the GPU market. They have good products that can compete with Nvidia in raster, but the prices are too high; if they want to increase their market share they need to focus on budget gamers, sub-$200 and $300 GPUs. Much above that, people expect to play at higher resolutions, DLSS is insanely good for what it offers, and RT is a lot better on the green side. It also doesn't help that Nvidia is always one step ahead in features and AMD is always trying to catch up.

    • @robertmajors1737
      @robertmajors1737 7 месяцев назад

      I really think AMD needs to cut prices at the top of their stack. $100 isn't enough to give up DLSS and superior RT. I'm not a big proponent of either feature, but 4K is probably the best use case for both of them, and that's likely where the majority of these cards will be used, unless they're for super-high-fps e-sports titles at 1440p. But DLSS can definitely be the difference between acceptable framerates and not quite good enough in demanding games at 4K. I hate that 12VHPWR connector, and Nvidia's practices, but honestly at this moment in time I'd probably buy the 4080 Super over the 7900 XTX if I were building a 4K rig.

    • @Sp3cialk304
      @Sp3cialk304 7 месяцев назад

      @@arch_.. I think it's because the "DIY" community focuses way too hard on raster-only native benchmarks. But the average person is going to look at how it performs in a real-world use case. As someone who owns cards from both, if you are at 1440p and up you are using DLSS in every new release: free performance, and it looks better than the native TAA/FXAA that modern games use. Many new releases actually use upscaling by default, and use the upscaler to handle AA. With FSR I did everything possible to avoid using it; it cripples image quality. And with the PS5 Pro being so focused on AI upscaling and RT, the differences are only going to matter more.

  • @walter1824
    @walter1824 7 месяцев назад +2

    Nvidia isn't doing great either: they mentioned not being a gaming company and launched lower-tier cards with false +1-tier naming at higher prices.
    With AMD we're literally stuck with the 6000 series for every single card below the X900 XT/XTX.

    • @haukionkannel
      @haukionkannel 7 месяцев назад

      Does not matter… those +1-named GPUs sell well, much better than AMD…
      But yeah… both companies are down, but no worries! AI is selling so well that gaming GPUs are almost meaningless!

  • @kafgark
    @kafgark 7 месяцев назад +5

    I've been gaming on a 6700XT since June 2022. I'm willing to change it to RX 8800 when it releases. AMD offers a better performance per price at 1080p 144hz.

    • @takik.2220
      @takik.2220 7 месяцев назад

      the rx 8800 xt is most likely going to be a 1440p/4k high refresh rate gaming gpu

    • @kafgark
      @kafgark 7 месяцев назад

      @@takik.2220 that's why i mentioned non-XT.

  • @mad_mario_
    @mad_mario_ 7 месяцев назад

    Daniel, would you please keep the white screens brief while presenting, it is really hard on the eyes

  • @AnjBias
    @AnjBias 7 месяцев назад +2

    Y'all deserve an Nvidia monopoly, I do hope Radeon closes and you get mid-range cards that cost $1000.
    Then again AMD never stopped producing CPUs when they had Bulldozer and only Radeon was selling.

  • @thomasburchsted3287
    @thomasburchsted3287 6 месяцев назад

    Yeah I’ve had trouble trying to get friends to buy what would be a better video card for them vs what they end up getting. One of them mainly plays WZ in 1440p, never uses ray tracing on anything etc. Told him to buy a 7900XTX vs the 4080 he ended up with.

  • @mRibbons
    @mRibbons 7 месяцев назад +3

    I don't get the vindictive hatred either. There's literally nothing to gain by participating besides internet points, maybe. When Sega failed, everyone was sad.

  • @UndyingGhost
    @UndyingGhost 7 месяцев назад +2

    Tech Yes City came to the same conclusion. The article is clickbait.

  • @SiCSpiT1
    @SiCSpiT1 7 месяцев назад +11

    Sega does what Nintendon't, Daniel! Genesis 4life!

    • @infinity2z3r07
      @infinity2z3r07 7 месяцев назад +1

      nintendo don't have blast processing!!1!!1

  • @ging3rbreadtiger618
    @ging3rbreadtiger618 7 месяцев назад +1

    Will you be giving pointers on which of your videos are doing better for the algorithm based on what you put in your description. I am finding this very interesting.

  • @fernandoteacher
    @fernandoteacher 7 месяцев назад +3

    I know I'm not like most gamers by using AMD, but I'm enjoying my 6700xt and don't plan to ever again buy an Nvidia card

  • @thseed7
    @thseed7 7 месяцев назад +2

    My last two PC builds have been all AMD. I have been incredibly happy playing just about every major game release without problems and at incredibly nice image quality. Of course, RT and DLSS are great Nvidia features and Intel may have some excellent productivity capabilities that make them excellent choices for some users. For me, doing light AI video enhancement and playing games, the 7800X3D/7900 XT combo I'm running crushes everything I've thrown at it. Not the best does not mean AMD isn't very good. I feel like media sh!ts on them for being in second place in GPUs and first place in CPUs. That doesn't make sense.

  • @yissnakklives8866
    @yissnakklives8866 7 месяцев назад +3

    Speaking of integrated... I've caught myself playing on a mini PC to save energy, since the entire system consumes ~120 watts max and has no trouble playing what I want to currently (FO4 revival!)

    • @niebuhr6197
      @niebuhr6197 7 месяцев назад

      Underclocked gpus will save you a lot of power and still destroy igpus in performance.

  • @brewergamer
    @brewergamer 7 месяцев назад +1

    Very smart video. I think you hit the nail perfectly on the head. This happens every 2 years because AMD releases GPUs every 2 years, there's always that in-between year where there is a decline in sales vs last year.. which is to be expected due to not releasing a new series every year but every 2 years. Every time we see "AMD IS BANKRUPT" and stupid phrases like that. AMD did not release a new series in 2024, which will cause an expected decline in sales. Simple as that.

  • @medryan9411
    @medryan9411 7 месяцев назад +3

    Just got an AMD GPU and I'm super happy with it

  • @LiLBitsDK
    @LiLBitsDK 7 месяцев назад

    That the 3000 series isn't being replaced by the 4000 series makes sense, since the midrange was basically no upgrade if you had a 3060 Ti

  • @nuruddinpeters9491
    @nuruddinpeters9491 7 месяцев назад +5

    The first and last card I bought from Team Red was the ATI 9600 Pro, just to play Doom 3 on PC back in 2004.
    Then, 3 years later, I wound up buying an 8800 GTX to play F.E.A.R. on PC and never looked back!
    Honestly, team red is powerful, but they really have always been behind ever since the original Rage-era ATI folded and was purchased by AMD.
    I want the red and blue teams to keep Nvidia in check.
    Yes, I have a 4090, but I'm not happy about prices... No one should be happy about prices.

    • @freeisalwaysme
      @freeisalwaysme 7 месяцев назад +3

      Clearly you are fine with the price. If you purchased it.

    • @nuruddinpeters9491
      @nuruddinpeters9491 7 месяцев назад +2

      @@freeisalwaysme No, it's not clearly OK, as I stated -- I can afford it, but I'm not happy about it.
      I buy Nvidia for the performance and feature set, which has been far superior since at least 2004. (The whole reason for my mini flashback rant.)

    • @Eleganttf2
      @Eleganttf2 7 месяцев назад +2

      @@nuruddinpeters9491 Yeah, sadly that's what happens when you have bad competition, with AMD being a disappointment as always.

    • @nuruddinpeters9491
      @nuruddinpeters9491 7 месяцев назад

      @@NineS5 Yes, i want a passive benefit from healthy competition.

  • @EySa7
    @EySa7 7 месяцев назад

    Looking at the list, it seems the 7900 XTX is one of their most-used cards on Steam, so why did they decide not to make a high-end card for the next series, like a competitor to the 4090 or 5090? I would definitely buy that card and try it if that were the case.

  • @saminder_singh
    @saminder_singh 7 месяцев назад +3

    Happy with my 6800 XT and will surely buy the upcoming AMD gen, because you know Ngreedia will be expensive. Let's see.

  • @Monster-Abee
    @Monster-Abee 7 месяцев назад +2

    Recently sold my 4090 because I didn't see much difference outside of ray tracing. Only lost a couple hundred after owning it for a year as the prices held up very well. I borrowed a 6750xt and noticed smoother motion as my oled ultrawide wasn't Nvidia validated in the drivers. So I bought a 7900 xt, the Asus tuf oc model. It's perfect. Massive cooler, beautiful and well built. Half the price of my 4090. I'm enjoying Gears 5 and Horizon forbidden west in HDR which doesn't even have ray tracing yet are some of the best looking games out. Go figure. 🤷‍♂️

  • @TakaChan569
    @TakaChan569 7 месяцев назад +5

    As long as AMD is making GPUs I will keep buying them; unless Nvidia prices come down, I don't see myself getting another GPU from them.

  • @omgnowairly
    @omgnowairly 7 месяцев назад

    These vendors may be focusing on the data center and AI now. Thing is, competition in that space is heating up with offerings better suited to the task. They will have to come back to consumer graphics solutions.

  • @njgaming3322
    @njgaming3322 7 месяцев назад +7

    It really doesn't help that AMD only released mid- to high-tier GPUs in the 7000 series and never released the low-end RX 7500 and RX 7400, whose 6000-series equivalents I'm sure got plenty of sales since they were affordable and performed well at 1080p/720p

    • @arenzricodexd4409
      @arenzricodexd4409 7 месяцев назад

      Problem is that the affordable market did not bring any profit to AMD. The sub-$300 market is cut-throat now with how slim the margins are in that segment. It is the reason why AMD used 6nm for the RX 7600 and not 5nm.

    • @Eleganttf2
      @Eleganttf2 7 месяцев назад +3

      lol, we don't want any more trash GPUs like the RX 6400 or 6500 XT

  • @tomtomkowski7653
    @tomtomkowski7653 7 месяцев назад +2

    AMD is present right now and this changes nothing.
    The xx80 went from $700 to $1200, and what did AMD do? Oh yes, they priced their counterpart (the 7900 XTX) at $1000.
    This is no longer a competition, because AMD just follows Nvidia; if they disappeared from the market, nothing would change.
    If they want to survive they have to have a Zen moment, which means a massive advantage in terms of price/performance, and they have to catch up to Nvidia on features.

  • @cybernd6426
    @cybernd6426 7 месяцев назад +3

    For me, everything PC Gamer wrote there feels like jumping to conclusions, or taking something out of context and using it to support your own opinion rather than first looking at it and then forming an opinion.

  • @mattpulliam4494
    @mattpulliam4494 7 месяцев назад

    I think all the info is pretty accurate. The RDNA tech seems to have been slated for a rewrite and pushed back by a series. RDNA4 is supposed to be monolithic, and whatever the big fix is, it might give a 15% uplift. Which is fine; "sell some MI parts and hang on till RDNA5" seems to be the theme. They had better be ready for that launch with a good upturn.

  • @iancurrie8844
    @iancurrie8844 7 месяцев назад +5

    I’ve had an RX7900XT since launch and it’s been so problematic. I’ve been a pc user since C64 and a pc builder since 386. I know what I’m doing. I’m not saying I can’t work around it, I can. It’s just so problematic. I regret buying it so much.

    • @VanyaYlianov
      @VanyaYlianov 7 месяцев назад

      What problems do you have? I came from a GTX 1080 to a 7900 XTX and didn't feel any downgrade in comfort or stability, except that AMD doesn't save window and folder placement across multiple monitors when you turn off or stand by the monitors.

    • @takik.2220
      @takik.2220 7 месяцев назад

      can you explain what you mean by "problematic"?

    • @PaulSpades
      @PaulSpades 7 месяцев назад

      @@VanyaYlianov Yes it does. I have dual monitors with one in either portrait or landscape, and it reconfigures stuff on the desktop fine when I switch from one configuration to the other, or when the power cuts off (I only keep one monitor on the UPS).
      I do have video tearing issues, though, even if I set the displays to vsync.

    • @VanyaYlianov
      @VanyaYlianov 7 месяцев назад +1

      @@PaulSpades I used vsync for video apps (same on Nvidia) until the December drivers. Since then, no tearing even without it, dunno why.

  • @justinbarnes8834
    @justinbarnes8834 7 месяцев назад

    I'm not on any team other than my own. I buy whatever makes the best sense at the price point I can afford. However I tend to stay away from brand new technology as it tends to have problems at first, these then get sorted out fairly quickly and then I might consider buying. I am also not swayed by gimmicks such as ray tracing on lower end hardware that can't handle it.

  • @josephrunkles4577
    @josephrunkles4577 7 месяцев назад +4

    Um, no, I love AMD; open source and their GPUs are great value. I didn't get a 7000 series because I already had a 6700 XT, which I love. Can't wait for the 8000 series.

  • @Wilsonandcars
    @Wilsonandcars 7 месяцев назад +1

    Just swapped to AMD 7900xtx for my triple screen sim setup, works great!

  • @Bobis32
    @Bobis32 7 месяцев назад +3

    We're also almost a year and a half into this graphics generation. Even Nvidia is seeing a downturn, 17% in revenue Y/Y (Nvidia doesn't split their earnings by division like AMD does), as well as a receding profit margin. AMD expected this, and unlike Nvidia, AMD has seen margin improve as well as being up 2% Y/Y. AMD is in an excellent spot and is likely going to keep pushing in graphics as well as CPUs; this is also not accounting for the current economic downturn.

    • @JohnSmith-ro8hk
      @JohnSmith-ro8hk 7 месяцев назад +5

      Correct. This doesn't even take into account the fact that AMD dedicated wafer capacity to consoles that aren't selling. They had to contractually, whether those consoles sell or not. Then add in the dies for handhelds, APUs and laptops.

  • @Vantrakter
    @Vantrakter 7 месяцев назад

    I quite like my RX 7900 XT MBA that I bought exactly a year ago. No complaints coming from an RX 6700 XT. It has no detectable coil whine, and it works well (now) with OpenCL and for the games I play (no upscaling available; Arthur: A Knight's Tale currently)... and a rough estimate is that 40% of the threads on the forum of my favourite Linux distro are about Nvidia (as in, problems with it), although Nvidia has been making certain noises about open-sourcing their driver bits for Linux for some time. We'll see.

    • @Beerfloat
      @Beerfloat 7 месяцев назад

      Sounds like a lot of Linux users went with Nvidia, then

  • @tomthomas3499
    @tomthomas3499 7 месяцев назад +6

    People want AMD to be competitive so that they can buy cheaper Nvidia GPUs 😂

    • @freeisalwaysme
      @freeisalwaysme 7 месяцев назад

      Yep.

    • @rich2579
      @rich2579 7 месяцев назад +3

      Competition in any industry is good for the consumer, literal economics 101

    • @tomthomas3499
      @tomthomas3499 7 месяцев назад +1

      @@rich2579 That's the idea, but nobody's buying from said competition, so you're going to end up with a monopoly, which is where we're currently heading.

  • @moomeuh1342
    @moomeuh1342 7 месяцев назад

    Just remember what AMD's CPU division looked like just before Zen; it was in far more of a terminal decline then than Radeon graphics is right now.
    But Intel was slowing down and made no improvements gen over gen.
    This is a very different situation: Nvidia is not focused on the gaming market anymore, but they're still making GPUs.
    Maybe AMD will catch up if Nvidia focuses more on AI/servers and uses an older lithography process for its gaming GPUs.

  • @praisetheoak
    @praisetheoak 7 месяцев назад +4

    Ultimately, it boils down to individuals not conducting thorough research on how they utilize their finances, aka: people being dumb. While Nvidia clearly dominates the higher-end range and there's really no argument in that, AMD presents a more advantageous option for the average user or gamer in terms of value for money from lower to mid-range.

    • @whatistruth_1
      @whatistruth_1 7 месяцев назад +2

      If and only if you focus solely on fps per dollar and not real-world usage.
      In real-world usage, DLSS delivers better image quality and performance than what FSR offers (explanation below):
      you can use a lower DLSS preset that still looks better than FSR, and get better performance.

    • @shadynebula6948
      @shadynebula6948 7 месяцев назад +1

      That's what it is, on top of the fact that Nvidia is the best-known GPU brand. Most casual people looking to get a PC likely only know of them.

    • @praisetheoak
      @praisetheoak 7 месяцев назад

      ​@@whatistruth_1 Practically speaking, many of Nvidia's advanced features are simply useless for the average consumer. Not all users will be rendering ray-traced 8k models in Blender, creating thousands of AI images, managing terabytes of complex AI models, or rendering/editing hours of video on a secondary PC while livestreaming at the same time. Such uses are atypical for the average consumer, who nonetheless pays a premium for capabilities they may never utilize.
      DLSS is impressive, but it's questionable to compromise VRAM and performance to play at 720p upscaled to 1440p when one could play at native 1440p with AMD.
      Real-time ray tracing is truly effective on the 4070/4080/4090 series if you aim for over 60fps without upscaling. Below that, both AMD and Nvidia are comparable, with their 7000 series and 3060/3080/4060 cards respectively.
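
      For context on the "720p upscaled to 1440p" point above, here is a minimal sketch of the internal render resolutions the upscaler quality modes imply, assuming the commonly cited per-axis scale factors for DLSS/FSR 2 (the ratios are an assumption for illustration, not taken from either vendor's SDK):

          # Rough internal render resolutions for a 2560x1440 output target.
          # Scale factors are the commonly cited per-axis ratios for the
          # DLSS/FSR 2 quality modes (assumed here for illustration).
          TARGET = (2560, 1440)

          MODES = {
              "Quality":           1 / 1.5,   # ~0.667 per axis
              "Balanced":          1 / 1.72,  # ~0.581 per axis
              "Performance":       1 / 2.0,   # 0.5 per axis
              "Ultra Performance": 1 / 3.0,   # ~0.333 per axis
          }

          for mode, scale in MODES.items():
              w, h = (round(dim * scale) for dim in TARGET)
              print(f"{mode:>17}: renders at ~{w}x{h} -> upscaled to {TARGET[0]}x{TARGET[1]}")

      On those assumed ratios, "720p to 1440p" corresponds to Performance mode, while Quality mode at a 1440p target renders internally at roughly 960p.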

    • @praisetheoak
      @praisetheoak 7 месяцев назад

      @@shadynebula6948 That's sadly the truth.

    • @praisetheoak
      @praisetheoak 7 месяцев назад +3

      @@whatistruth_1 In practical terms, the average consumer is unlikely to utilize 90% of the features Nvidia offers, making it difficult to justify the premium cost. It's improbable that everyone will be rendering fully path-traced 8K Blender models, producing thousands of AI-generated images, handling terabytes of complex AI models, or editing extensive hours of video on a secondary computer while simultaneously streaming. While DLSS is indeed impressive, compromising on native resolution and VRAM to play games at 720p upscaled to 1440p seems unfavorable compared to playing at native 1440p with AMD.

  • @l.i.archer5379
    @l.i.archer5379 7 месяцев назад +1

    There's Team Red, Team Green, and Team Blue, thus giving us RGB.

    • @kerotomas1
      @kerotomas1 7 месяцев назад +1

      Now we need white for ARGB

  • @0x8badbeef
    @0x8badbeef 7 месяцев назад +3

    I'm not optimistic that technology developed for the enterprise sector can cross-pollinate into the consumer sector, because heat is not a problem in a server room.

    • @henryviiifake8244
      @henryviiifake8244 7 месяцев назад

      Heat is _definitely_ a problem in a server room. More heat = more $$$ spent on cooling.

    • @0x8badbeef
      @0x8badbeef 7 месяцев назад

      @@henryviiifake8244 it is not a problem because no one would complain about the noise.

    • @ishiddddd4783
      @ishiddddd4783 7 месяцев назад

      @@0x8badbeef That's why consumer GPUs don't come with blower coolers anymore. The consumer side is pretty safe with enterprise tech coming to us; that's what the 4090 die was, it's the same die as the Tesla Ada GPUs, and Tesla Hopper is barely larger. They simply don't need a big cooler since the servers already come with their own fans, and we don't have to worry since pretty much every cooler available is overkill for a 4090.

    • @0x8badbeef
      @0x8badbeef 7 месяцев назад

      @@ishiddddd4783 That is premature. We haven't seen what consumer NPUs are going to look like. Will it be part of a CPU, a GPU, or a separate card? The closest thing I've seen on AI is an NPU showing up in Task Manager; see "Working Meteor Lake Prototype Spotted At Computex!" on the PCWorld channel, time index 1:23.

  • @RaphaelSwinkels
    @RaphaelSwinkels 7 месяцев назад

    I'm happy I bought last gen ;p. I had the money for a 7800 XT or a 6800 XT, and I was already running an RTX 3070 (at 1080p), but as soon as I bought a 1440p 165 Hz monitor
    I went for the bigger VRAM, at the same price, plus a wider memory bus.

  • @zine_eddinex24
    @zine_eddinex24 7 месяцев назад +3

    We need competition

    • @VASILISPAGOURAS7
      @VASILISPAGOURAS7 7 месяцев назад

      No mate, we don't care about enthusiast-class GPUs.

    • @arenzricodexd4409
      @arenzricodexd4409 7 месяцев назад

      We need people willing to pay high prices first. Then we can talk. Otherwise there is no incentive for competitors to enter the market. Aside from Intel, we still have the likes of ARM, Qualcomm and Imagination, which still have legit GPU IP and are actively advancing it.

    • @JamesBond77
      @JamesBond77 7 месяцев назад

      We need your mom my guy.😬

  • @michaelthomas5433
    @michaelthomas5433 7 месяцев назад

    Time for PowerVR to make a comeback and gobble up that market share.

  • @Juice_VI
    @Juice_VI 7 месяцев назад +11

    SEEEEEGAAAAAAAA

    • @JamesBond77
      @JamesBond77 7 месяцев назад +1

      You want Sega to produce GPUs ?🧐

  • @onomatopoeia162003
    @onomatopoeia162003 7 месяцев назад +1

    Whenever I see or hear articles like this... look at the overall tech market, lol.

  • @KanakaBhaswaraPrabhata
    @KanakaBhaswaraPrabhata 7 месяцев назад +3

    "no one cares about RT" said amd users. if no one cares, amd wouldn't redesign their RT cores for RDNA4 to compete with nvidia and intel. it's just the fact that amd RT is bad, and RT is the future of rendering technique. i believe when amd RT is competing with nvidia, their pricing will match nvidia at same specs. gone are "cheaper" cards.

    • @JohnSmith-ro8hk
      @JohnSmith-ro8hk 7 месяцев назад +1

      No one outside of 4090 users will care about RT for another 10 years. By the time the technology is mature and GPUs are capable of pushing it, we will be several GPU generations ahead of where we are today. But go ahead and keep buying garbage 8GB GPUs because they're made by your favorite corporation. Typical coomsumer.

  • @pem3480
    @pem3480 7 месяцев назад

    AMD will always be around. What's interesting to me is what will happen to the low- and mid-tier GPUs with Intel on their heels. They are getting stronger with every driver update.

  • @invisibleevidence1970
    @invisibleevidence1970 7 месяцев назад +3

    AMD is to Nvidia what Xbox is to the PS5.

  • @jmangames5562
    @jmangames5562 7 месяцев назад

    @DanielOwen You should post a poll for your viewers; let's see how many use an AMD GPU versus an Nvidia one. Also, ALL consoles use AMD, so as much as people think Nvidia sells way more, you would have to factor in console sales and the percentage AMD makes per unit. Have a great week!

  • @MrRoro582
    @MrRoro582 7 месяцев назад +3

    SEGA!!!

  • @CJWall_rott
    @CJWall_rott 7 месяцев назад +1

    I'd say Radeon is gaining market share; the Steam hardware survey has a stick up its rump.

  • @dauntae24
    @dauntae24 7 месяцев назад +3

    Triggered fanboys incoming.

    • @JohnSmith-ro8hk
      @JohnSmith-ro8hk 7 месяцев назад

      woah what a witty comment. i'm sure huang will appreciate your support of his balls

    • @dauntae24
      @dauntae24 7 месяцев назад +1

      @@JohnSmith-ro8hk I buy whatever works best so I have no dog in this fight.

  • @kr00tman
    @kr00tman 7 месяцев назад

    @12:12 mark: the 7000 series is bad for crypto mining because the VBIOS is locked down. If AMD unlocks the VBIOS on RDNA4 or 5, then AMD could be the top dog in terms of GPU mining.

  • @Joric78
    @Joric78 7 месяцев назад

    It's sad AMD isn't more competitive. I switched back to Nvidia this generation, although that was mostly due to VR and a shift to the higher end. The 7800 XT and 7900 GRE would probably have still been my bang for buck choice otherwise.

  • @denvera1g1
    @denvera1g1 7 месяцев назад

    To the people who say Xbox exiting the console market would be bad for AMD:
    only on a spreadsheet of market share, and that's assuming Windows handhelds don't completely fill that hole.
    The margins on consoles are incredibly low to negative; AMD takes a massive opportunity loss for every bit of silicon that ends up as a GPU or a console rather than a desktop CPU or an enterprise CPU/AI card.
    AMD makes more on handhelds, with dies half the size of the Xbox Series S, than AMD makes on the Xbox Series X.
    The 7800 XT at $499 could completely drop the RAM, cooler and board, then be split up into more than 2x 7800X ($329 each), and shipping and packaging would be cheaper for those two CPUs than for a single 7800 XT.
    For reference, for the silicon used in the Xbox Series X, AMD could have sold almost 5x 3800X (not including the dirt-cheap I/O die; that cost increase for the 3800X is offset by the higher defect rate on the larger Xbox Series X die).
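
    A rough way to sanity-check the opportunity-cost argument above is revenue per wafer. The sketch below uses the classic dies-per-wafer approximation and ignores yield, packaging, and the I/O die; the die areas and per-chip prices are assumed round numbers for illustration, not AMD's actual die sizes, contract prices, or margins:

        import math

        WAFER_DIAMETER_MM = 300  # standard 300 mm wafer

        def dies_per_wafer(die_area_mm2: float) -> int:
            """Classic dies-per-wafer approximation (ignores scribe lines and yield)."""
            d = WAFER_DIAMETER_MM
            return int(math.pi * (d / 2) ** 2 / die_area_mm2
                       - math.pi * d / math.sqrt(2 * die_area_mm2))

        # Hypothetical figures, chosen only to illustrate the shape of the argument.
        products = {
            "large console SoC (~360 mm2, assumed $150/chip)": (360, 150),
            "desktop CPU chiplet (~80 mm2, assumed $300/CPU)": (80, 300),
        }

        for name, (area, price) in products.items():
            n = dies_per_wafer(area)
            print(f"{name}: ~{n} dies/wafer -> ~${n * price:,} of product per wafer")

    Under those assumptions the small desktop chiplets return several times the revenue per wafer of the big console SoC, which is the core of the point about where AMD's silicon earns the most.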