B580 Review: Mainstream Ready!

  • Published: Dec 12, 2024

Comments • 117

  • @AlexSchendel
    @AlexSchendel 17 hours ago +8

    I've enjoyed my time with my A750 and A770 (I'm an Intel employee, so the discounts are too good to pass up). I have not had any real issues with game compatibility, and driver updates have helped a lot. My biggest gripe has been the fact that VR is not supported and so you have to jump through a lot of hoops to get it working at all... I just don't want to have to do that to use my headset. I really hope that now that the drivers are in a good spot and Battlemage is out the door, in a much better spot than at launch, the drivers team can pivot to addressing the lack of VR support. Native support for SteamVR and Oculus Link are sorely needed.

    • @allxtend4005
      @allxtend4005 12 hours ago

      The last time I saw so much BS was on Reddit.

  • @Jizousensei-R
    @Jizousensei-R a day ago +25

    I'm curious about a B770 later, tbh.

    • @ac8598
      @ac8598 a day ago +3

      Yeah, that should be more powerful with more VRAM. 20GB?

    • @Jizousensei-R
      @Jizousensei-R 23 hours ago +5

      @ac8598 at least 16GB

    • @chamoferindamencha8964
      @chamoferindamencha8964 21 hours ago +1

      Fingers crossed they even make it, tbh.
      The die in the B580 isn't cut down at all, so a B770 would be a fully different, bigger die. I do hope they designed one and just haven't announced it yet, but that's not a given, given Intel's financial state.

    • @Jizousensei-R
      @Jizousensei-R 17 hours ago +3

      @@chamoferindamencha8964 At least an announcement at CES 2025, if they want to.

    • @allxtend4005
      @allxtend4005 12 hours ago

      There is no B770, and there won't be one.

  • @hvdiv17
    @hvdiv17 15 hours ago +4

    Intel is targeting where most gamers are, at 1080p and 1440p, which is a smart move for them.
    The ones who buy $500+ GPUs are a small group.

    • @Observer168
      @Observer168 8 hours ago

      Cloud companies are buying up thousands of 4090s at a time.

  • @Ale-ch7xx
    @Ale-ch7xx a day ago +31

    Will you be doing any PCIe 3.0 testing for those of us still on PCIe 3.0, such as on B450 and A520 motherboards?

    • @arlynnfolke
      @arlynnfolke 23 hours ago +7

      this

    • @Xero_Wolf
      @Xero_Wolf 22 hours ago +1

      Why are you still on PCIe 3.0? I'm poor, but not that poor.

    • @Ale-ch7xx
      @Ale-ch7xx 22 hours ago +15

      @@Xero_Wolf Because I prefer to save money. I didn't become a homeowner this year by throwing away money. The 5950X is perfectly fine for gaming, editing, and rendering. I picked it up on sale. It does what I need it to do, especially when you're on a tighter budget.
      B580 buyers are people like me, who are on a tight budget or don't want to spend a lot of money on a new build. I currently have a GeForce GTX 1650, which I bought when I first built the computer along with a Ryzen 5 2500X, and I still have it.
      The 5950X will not bottleneck anything up to a 7800 XT, or possibly a 7900 XT, unless you lower your settings in games.

    • @wargamingrefugee9065
      @wargamingrefugee9065 21 hours ago +5

      @@Xero_Wolf I'm not trying to dog pile, just adding my take on staying on PCIe 3. I have a dedicated video editing/upscaling/Stable Diffusion/RAID 5 storage rig with an i9-10850K, 64 GB of RAM and an Arc A770 LE 16GB card. It works well and fits my use case. But, as you pointed out, it does have a limitation -- PCIe bandwidth. The solution, or at least a huge offset to the limitation, is to only use graphics cards with x16 PCIe links. So, that's what I do; Nvidia's RTX 2060 Super and Intel's A750 and A770 all use x16 PCIe links (see the rough bandwidth math below).
      It's basically the same story for my gaming rig. I'm on 10th gen Intel and limited to PCIe 3.
      Making the change to PCIe 4 would mean buying a lot of new stuff to replace stuff that still works and fits my needs. That's something I will not be doing as long as there's a workaround.
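
      For anyone who wants numbers behind the PCIe 3.0 concern, here's a quick back-of-the-envelope sketch. The per-lane rates are the standard theoretical figures, and the B580's x8 link width matches Intel's published spec; everything else here is illustrative, not something tested in this review.

      ```python
      # Rough theoretical PCIe bandwidth per direction (GB/s per lane, after
      # 128b/130b encoding): Gen3 = 8 GT/s, Gen4 = 16 GT/s.
      PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969}

      def link_bandwidth(gen: str, lanes: int) -> float:
          """Theoretical one-way bandwidth of a PCIe link in GB/s."""
          return PER_LANE_GBPS[gen] * lanes

      # An x16 card (A750/A770, RTX 2060 Super) dropped into a Gen3 slot:
      print(f"Gen3 x16: {link_bandwidth('3.0', 16):.1f} GB/s")  # ~15.8 GB/s
      # The B580 is a Gen4 x8 card; in a Gen4 slot it gets about the same:
      print(f"Gen4 x8 : {link_bandwidth('4.0', 8):.1f} GB/s")   # ~15.8 GB/s
      # ...but in a Gen3 slot (B450/A520 boards) it drops to half of that:
      print(f"Gen3 x8 : {link_bandwidth('3.0', 8):.1f} GB/s")   # ~7.9 GB/s
      ```

      That halving is why an x16 card sidesteps the issue on an older board, and why PCIe 3.0 scaling tests for the x8 B580 are worth asking for.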

    • @jakejoyride
      @jakejoyride 21 hours ago

      B450 supports PCIe 4.0; just update your BIOS.

  • @marcasswellbmd6922
    @marcasswellbmd6922 19 hours ago +4

    I am still using an ASRock B550 PG Velocita, a 5900X, and the great 6800 XT. Those 16 gigs have been more than enough. My God, that GPU has future-proofed me to this day. Yeah, yeah, I know there are way faster cards out now, but that XFX 6800 XT has held me down at 4K 60 FPS gaming, and even higher frames in most games. I just played Hogwarts last night above 60 FPS on high settings in 4K, and forget about it at 2K; this card still crushes 1440p gaming. AMD truly does give people more for their money, and AM4 is still the gift that keeps on giving. I am due for a new build, though; it's been 7-ish years. But honestly, I can see myself getting by on this setup for another 5 or 6 years. IDK...

  • @ScottGrammer
    @ScottGrammer 20 hours ago +1

    The issue with DaVinci Resolve is concerning to me as that is a use case I had planned on. Hardware Canucks tested it with Resolve, but they just rendered an edited video with it. In that test, it did better than the 4060, but not as well as the A-series cards.

    • @PelonixYT
      @PelonixYT 18 hours ago

      I believe that is normal, unfortunately; IIRC Intel said those are the expected results.

    • @pcworld
      @pcworld 18 hours ago +1

      I will definitely be chasing this one down as I'm curious too. I'm trying to contact PugetBench to figure out if this is something on the testing side.
      -Adam

    • @KL-pe4kn
      @KL-pe4kn 6 hours ago

      @pcworld Adam, did you get a chance to just try out general editing activities in Resolve, without running the PugetBench benchmark macro itself?

  • @LeesChannel
    @LeesChannel an hour ago

    My mind is always on Gordon, I hope he is able to spend quality time with his loved ones.

  • @hbdude155
    @hbdude155 22 hours ago +3

    Welcome to the cool kids' club, Intel.

  • @AshrafAliS
    @AshrafAliS 22 hours ago

    Sorting those numbers would make it easier for viewers to see which card is doing well. Right now we have to scan all the numbers to find the good ones; if they were sorted, we could just check the top 2 or 3.

  • @burnitdwn
    @burnitdwn 14 hours ago

    Hah, I recently grabbed a used 1080 Ti for my wife's PC. It was about $135 on eBay, and shipping and sales tax brought it to around $165 here. I was going to just install my Radeon 6800 in her PC while waiting for a better $500 GPU. I was tempted to pull the trigger when the 7900 GRE hit $499 a while back, and was tempted by the $620 7900 XT, but I am not in a rush.

  • @ramonzaions7522
    @ramonzaions7522 19 hours ago +1

    Great Video Guys!

  • @SecretlySeven
    @SecretlySeven 9 hours ago

    Looking at my A770, I feel like Intel owes us early adopters a special discount on a B-series card. I feel like if they had stayed on the Alchemist cards longer we'd be in a better place. I'm still dealing with issues in games.

  • @Martin-wj6um
    @Martin-wj6um 3 hours ago

    Hi guys! Good job, thank you! Get well, Gordon!!!

  • @AlexSchendel
    @AlexSchendel 17 hours ago +1

    GN had similar idle power issues, but Level1 got idle power at 7.5 to 13W depending on monitor refresh rate. I'm curious about settings related to:
    - What is the monitor setup between reviewers (resolution and multi-monitor or not)?
    - What are the BIOS and Windows ASPM settings?

    • @lokeung0807
      @lokeung0807 14 hours ago

      Going by experience with Alchemist, Intel's idle power is fine with a 60Hz monitor, but it gets significantly higher above that.

    • @AlexSchendel
      @AlexSchendel 14 hours ago +2

      @@lokeung0807 Yeah. First of all, the power savings require ASPM to be enabled.
      Sadly, I'm running 2x1440p displays, with one at 144Hz, so that means I'm constantly drawing 45W no matter what (rough cost math below).
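
      To put that idle draw in dollars, here's a minimal sketch. The 45W vs. ~13W figures, the 16 idle hours a day, and the $0.30/kWh rate are all assumptions for illustration, not measurements from this review.

      ```python
      def annual_idle_cost(watts: float, hours_per_day: float = 16.0,
                           rate_per_kwh: float = 0.30) -> float:
          """Yearly electricity cost of a card idling at `watts`."""
          kwh_per_year = watts / 1000.0 * hours_per_day * 365
          return kwh_per_year * rate_per_kwh

      # Assumed scenarios: high multi-monitor idle vs. a well-behaved ASPM idle.
      for label, watts in [("45 W idle", 45), ("13 W idle", 13)]:
          print(f"{label}: ~${annual_idle_cost(watts):.0f}/year")
      # ~$79/year at 45 W vs. ~$23/year at 13 W under these assumptions.
      ```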

    • @lokeung0807
      @lokeung0807 12 hours ago

      @@AlexSchendel I am using an A750 with a 100Hz 1080p monitor, and the GPU sits at 39-41W at idle...

    • @AlexSchendel
      @AlexSchendel 12 hours ago

      @@lokeung0807 I'm assuming you mean 100Hz? That sounds like it should be fine if you have ASPM configured properly.

    • @lokeung0807
      @lokeung0807 12 hours ago

      @@AlexSchendel Yes, the power draw drops significantly if I switch to 60Hz,
      but it stays high if I switch to 100Hz.

  • @PCandTech-tr8xt
    @PCandTech-tr8xt 9 hours ago

    User: Any data on LLM inference speed or Stable Diffusion rendering?
    Blogger: Data? The only data I care about is my kill-death ratio.

  • @jaynorwood2
    @jaynorwood2 12 hours ago

    I don't believe Meteor Lake had the matrix hardware, so maybe just Lunar Lake laptops will support the frame generation.

  • @marcasswellbmd6922
    @marcasswellbmd6922 19 hours ago +3

    Still, to this day I couldn't care less about RT; it's just something I don't really notice as I'm running around trying not to die.

  • @IndianCitizen04
    @IndianCitizen04 11 hours ago

    I think the driver is responsible for the higher idle power draw of Intel GPUs.

  • @RhinoXpress
    @RhinoXpress 4 hours ago

    So where was the 6700 in the thumbnail?

  • @henrikg1388
    @henrikg1388 17 hours ago

    I just wish some YT channel could do a more general comparison across mobile, desktop, and even consoles to give a broader view than the niche DIY desktop market and minute FPS comparisons. Non-desktop gamers are actually the majority these days. Will Arc get a mobile version? I think there will be mostly flat packages for the gaming kids this Christmas. Reflect that.
    Writing this on a four-year-old laptop with a 16GB RTX 3080 (mobile) that probably still outperforms this new budget wonder.

  • @Observer168
    @Observer168 8 hours ago

    They need 24GB and 48GB versions to give Nvidia some competition. Cloud hosting companies are buying up all the 4090s.

  • @mattpulliam4494
    @mattpulliam4494 15 hours ago

    OK, it's time to OC these competitors and see how much Intel spreads its wings. At least Tom Petersen has mentioned there's some headroom there to be tapped.

  • @Wild_Cat
    @Wild_Cat a day ago +3

    Intel Arc FTW!

  • @baysidejr
    @baysidejr 12 hours ago

    Meteor Lake doesn't have XMX hardware, just Lunar Lake and Arc. The B580 looks good. I have an A770, so I'm waiting to see if a B770 is released.

    • @kspfan001
      @kspfan001 49 minutes ago

      It has been stuck in development hell and likely won't come out until late 2025/early 2026, if at all. They may just skip it and move on to Celestial.

  • @AjaySensei
    @AjaySensei 4 hours ago

    Proof that the GTX 1080 Ti is the GREATEST OF ALL TIME!

  • @antonlogunov1936
    @antonlogunov1936 20 hours ago

    Please fix the idle power draw. I live in California, and electricity is ultra expensive.

    • @cirozorro
      @cirozorro 8 hours ago

      See KitGuru's review (22-minute mark); they made 2 settings changes and got the idle power draw down from 36W to 15W: 1. Enable ASPM in the BIOS. 2. In the OS, set PCI Express → Link State Power Management to Maximum power savings. (A scripted version of step 2 is sketched below.)
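
      A minimal sketch of step 2 on Windows, assuming the built-in powercfg aliases (SCHEME_CURRENT, SUB_PCIEXPRESS, ASPM) are present on your install; the BIOS step still has to be done by hand, and none of this is from PCWorld's testing.

      ```python
      # Set "PCI Express -> Link State Power Management" to "Maximum power
      # savings" (value index 2) on the active Windows power plan.
      import subprocess

      def set_aspm_max_savings() -> None:
          for flag in ("/setacvalueindex", "/setdcvalueindex"):  # AC and battery
              subprocess.run(
                  ["powercfg", flag, "SCHEME_CURRENT",
                   "SUB_PCIEXPRESS", "ASPM", "2"],
                  check=True,
              )
          # Re-apply the current scheme so the change takes effect immediately.
          subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

      if __name__ == "__main__":
          set_aspm_max_savings()
      ```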

  • @allxtend4005
    @allxtend4005 12 hours ago

    Strange cards you are testing against. Why didn't you pick the RTX 3060 Ti 12GB version? Why didn't you pick the RX 6700 XT? Why test it against the A750 and not against the A770? Why is there a 1080 Ti when you test RT? So many questions.

    • @kspfan001
      @kspfan001 48 minutes ago +1

      Why not just go to one of the many other PC hardware YouTube channels that have tested it against the GPUs you want to see? Nobody has every GPU lying around to test, my guy.

  • @castillo_-_
    @castillo_-_ 13 hours ago

    51:00 You're saying we can run a system with this GPU on a 450W PSU? (Rough power budget below.)
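
    Napkin math on the 450W question: the 190W figure is Intel's published total board power for the B580, while the CPU and platform numbers below are assumptions for a typical midrange build, not a configuration tested in this review.

    ```python
    # Assumed worst-case draws for a midrange B580 build (watts).
    components = {
        "Arc B580 (190 W TBP)": 190,
        "65 W-class CPU at boost": 90,
        "Motherboard + RAM + SSD + fans": 50,
    }
    total = sum(components.values())
    headroom = 450 - total
    print(f"Estimated load: {total} W, headroom on a 450 W PSU: {headroom} W")
    # ~330 W estimated load leaves ~120 W (~27%) of headroom -- workable on a
    # quality 450 W unit, but tighter with a hungrier CPU or transient spikes.
    ```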

  • @brianrobinson3961
    @brianrobinson3961 23 hours ago +1

    0:22 B580

  • @Veptis
    @Veptis 18 hours ago

    It would be a lot of fun to profile the games and jump into the pipeline and draw calls to see what's going on, but I don't think that is easy to explain to your audience. Any idea where such content exists, if at all?

    • @pcworld
      @pcworld 18 hours ago +1

      That would be awesome for sure, but it's way over my head. I can talk to Will about it.
      -Adam

  • @VincentAmadeus
    @VincentAmadeus a day ago +2

    Worth upgrading from my RX 6600? 🙏🏻 Please helppppp

    • @JBrinx18
      @JBrinx18 a day ago +3

      No. You want a 60% or greater boost before upgrading.

    • @darth254
      @darth254 a day ago +1

      This will be a decent boost in performance, and a very hefty VRAM boost.
      If you're not happy with your 6600's performance in games you like to play, then you're not happy. So if you have the $$$, then move on from it. If you are happy with your 6600, or don't have the money to spare, then continue to stick with it.

    • @hammerheadcorvette4
      @hammerheadcorvette4 22 hours ago +1

      Definite upgrade over my 5700XT

    • @Xero_Wolf
      @Xero_Wolf 22 hours ago +1

      I would still wait. Better GPUs are coming next year from all 3 vendors, although it depends on your budget too. Let's see where AMD's lower-end cards land first.

    • @gorky_vk
      @gorky_vk 20 hours ago +2

      Not really. The 6600 is beaten by this card, sure. But the 6600 was never a "playing on ultra" type of card, and in reviews it mostly gets destroyed by pure VRAM limitation. Even in some games where the 4060 gets a pass, most if not all games for some reason use more VRAM on AMD cards.
      At the settings you're more likely to play at, the difference is not that big. If you're buying now, the B580 would be an easy recommendation even with the price difference, but as an upgrade, no.

  • @stephanhart9941
    @stephanhart9941 a day ago

    No more Brad GPU reviews on camera? Where you at homie?

    • @BradChacos
      @BradChacos a day ago +6

      I'll be on Full Nerd today, I've never been based in SF so the video reviews ain't me!

    • @stephanhart9941
      @stephanhart9941 22 hours ago +2

      @BradChacos I know you are a fellow New Englander. Dorchester representing!

  • @blackvq3759
    @blackvq3759 22 hours ago +2

    Okay Nvidia, let's sell the 5060 for $250 😅

    • @zerosam5541
      @zerosam5541 19 hours ago +1

      Nvidia is more likely to increase the VRAM to 12GB or more than to lower the price.

  • @donkey_mediocre7246
    @donkey_mediocre7246 8 hours ago

    My 1650 can finally rest

  • @happybuggy1582
    @happybuggy1582 7 hours ago

    They are about a generation late

  • @robertlawrence9000
    @robertlawrence9000 23 hours ago +5

    I wish they would release a B990. We need more high-end cards. The low-end GPU market is already flooded with new cards and used cards from previous generations at multiple price points. Even as it is, the $250 B580 is sold out on many sites and board partner cards are $100 to $200 more, completely defeating the purpose of a $250 card. Intel really should be aiming higher. There is only 1 high-tier card right now and we all know which one that is. Imagine if they released a massive card to compete with the RTX 4090 or a potential RTX 5090. That would really get people talking, and people might be more eager to get their high-end GPU from Intel if it competed on performance per price versus the competition. Even generations later, people would still be thinking about getting the older high-end card as prices come down and it competes at the lower end some years down the road. I think this is a better strategy.

    • @crymore7942
      @crymore7942 22 hours ago +1

      A 4090/5090 isn't just a 'high end' card, it's the best GPU, performance-wise, ever made. You are underestimating what's involved in bringing a card like that to market. Ask yourself why AMD isn't bringing out a competitor to the 5090. Intel are the new boys; they can't just pull a 5090-type card out of their ass that's also cheaper, with features like great ray tracing, etc. They are trying to gain market share while also improving their development work.

    • @robertlawrence9000
      @robertlawrence9000 21 hours ago +1

      @@crymore7942 It's all about scaling. More processors.

    • @4m470
      @4m470 21 hours ago

      Buy a 7900XTX

    • @Solruc_
      @Solruc_ 21 hours ago

      The die would be impossibly expensive to produce. The B580 is bringing 4060-class performance with the die size of a 4070 Super; rivaling 4090 performance with the current architecture would be unsustainable, and even for a halo product it would be ridiculously expensive. (Rough die-area comparison below.)
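
      For scale, here is the die-size comparison being gestured at, using commonly cited approximate figures; treat the numbers as assumptions rather than official specs.

      ```python
      # Approximate die sizes in mm^2 (commonly reported, not official specs).
      die_mm2 = {
          "Arc B580 (BMG-G21)": 272,
          "RTX 4060 (AD107)": 159,
          "RTX 4070 Super (AD104)": 294,
          "RTX 4090 (AD102)": 609,
      }
      base = die_mm2["Arc B580 (BMG-G21)"]
      for name, area in die_mm2.items():
          print(f"{name}: {area} mm^2 ({area / base:.2f}x the B580 die)")
      # Under these assumptions a 4090-class die is ~2.2x the B580's silicon,
      # before counting the wider memory bus and board costs.
      ```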

    • @robertlawrence9000
      @robertlawrence9000 21 hours ago +1

      @@4m470 That's an upper mid tier card.

  • @kartikpintu
    @kartikpintu 19 hours ago

    Should I upgrade or keep using my RX 6700 XT?

    • @Gorib_Bhai32
      @Gorib_Bhai32 19 hours ago +2

      Keep your RX 6700 XT.

    • @pcworld
      @pcworld 18 hours ago

      Keep riding that, it's still a damn good card.
      -Adam

  • @pastelink6767
    @pastelink6767 19 hours ago

    So they caught up to this gen with a bigger die on a smaller node right before next gen is about to ship from AMD and Nvidia? Doesn't sound competitive to me unless Intel is willing to take the L for possible long term gains.

  • @kennethwilson8236
    @kennethwilson8236 19 hours ago

    Yeah the Indiana Jones game has something weird going on with it

    • @pcworld
      @pcworld 18 hours ago

      So you have seen others reporting problems?
      -Adam

  • @Veptis
    @Veptis 21 hours ago

    Why are your title and description localized? I don't want that, and there is no indicator.
    (Android app.)

  • @ayaanmza
    @ayaanmza 14 hours ago

    Please check Warzone on the B580.

  • @paulboyce8537
    @paulboyce8537 6 hours ago

    00:05:48 Test machine specs. Every review makes this apples-to-oranges mistake: an AMD CPU doesn't give the best result for Intel Arc. The reason is the different architecture. AMD/Nvidia GPUs gain roughly nothing (+-5%) from ReBAR/E-cores, while Intel Arc gains around +30% when optimized for an Intel CPU (hence the Intel Processor Diagnostic Tool). Intel Arc is only as good as the CPU it's paired with; a better CPU pairing gives better performance from splitting the tasks with the CPU. With an AMD/Nvidia GPU, by contrast, the CPU only feeds the GPU to do all the tasks, with queuing and waiting that results in broken frames. So: a good Intel CPU for an Intel GPU, and an AMD CPU for an AMD/Nvidia GPU.
    When you understand the two different architectures, you also see how misleading these tests are. As a rule it's Intel's all-good frames versus double the frames with half of them broken (hence the high wattage to replace the broken frames, which can be half if not more), and this causes stutter and a bad experience on an AMD/Nvidia GPU. In the end these tests are worthless; comparing the experience is what matters. For example, at 4K the A770 beats the 4070 Ti with half the frames.

  • @christophermullins7163
    @christophermullins7163 a day ago +3

    OK, it wasn't a complete failure. I just hope that with how much Intel is losing on these GPUs (a 4070 Super amount of silicon on the B580 😮) they don't just close up shop on dGPUs and drivers. Hope not.

    • @hammerheadcorvette4
      @hammerheadcorvette4 22 hours ago +2

      Xe is gonna be HUGE for Intel because of AI and other components in the Intel ecosystem. It's not going anywhere. TAP said the next gen is already done, and the hardware team has moved to the next gen following it. Intel is in this for real.

    • @Sureflame
      @Sureflame 21 hours ago

      They're not losing money lol. The 4060s are on 50-class dies and the 70s are on 60-class dies 💀. They're just not gimping the card in favor of leaning on AI to make fake frames, despite charging double what it should be.

    • @gorky_vk
      @gorky_vk 20 hours ago

      @@Sureflame Maybe not losing, but they're surely not making any money on them either, with such a big die produced at a third-party foundry. Good for consumers, but that's the path AMD took with the Phenom CPUs a long time ago, and it led the company almost to bankruptcy.

    • @Sureflame
      @Sureflame 20 hours ago

      @gorky_vk It's cheap last-generation sand though, not bleeding-edge hardware. Nvidia is gearing up to use GDDR7, so that's one reason the card is $250 as an asking price.
      Provided they offer respectable VRAM, bus width, and memory speeds on newer cards, they might blow the B580 out of the water, but they'll be asking $100 more than the B580 lol.

    • @gorky_vk
      @gorky_vk 18 hours ago

      @@Sureflame TSMC 5 nm is not cheap. It's not the leading edge anymore, but it's still an advanced node and not cheap at all.
      GDDR7 is completely pointless on entry-level cards, and I seriously doubt that entry-level cards from Nvidia will use it even in the next generation. Even if they do, that will be in 6+ months.

  • @凌云-u5k
    @凌云-u5k an hour ago

    Interesting 😂 both the game and the DaVinci plugin named "black" have problems ❤ Is "black" some key spell word that dispels the magic of Battlemage? ❤

  • @Marshaluranus
    @Marshaluranus a day ago +2

    Apparently everyone forgot the 6700 XT existed.

    • @NolanHayes-b8w
      @NolanHayes-b8w 20 hours ago +3

      Not the same price point man

    • @oneanother1
      @oneanother1 18 hours ago +1

      JayzTwoCents had a 3070 in the benchmarks.

    • @damianabregba7476
      @damianabregba7476 7 hours ago

      Still an interesting point of comparison. It was rather cheap at one point, and it seems like this should have comparable performance and the same VRAM, with much better RT.
      Something like a 6700 XT with better RT for less is quite compelling. @@NolanHayes-b8w

    • @Chinu-w5x
      @Chinu-w5x 2 hours ago

      One question, buddy: here in my country I can get the RX 6700 XT at the same price, or even a little cheaper, than this. Which one should I get??