B580 Review: Mainstream Ready!

  • Published: Dec 15, 2024

Comments • 191

  • @Jizousensei-R
    @Jizousensei-R 4 days ago +55

    I'm curious about the B770 later on, tbh.

    • @ac8598
      @ac8598 4 days ago +6

      Yeah, that should be more powerful, with more VRAM. 20 GB?

    • @Jizousensei-R
      @Jizousensei-R 3 days ago +11

      @ac8598 At least 16 GB.

    • @chamoferindamencha8964
      @chamoferindamencha8964 3 days ago +6

      Fingers crossed they even make it, tbh.
      The die in the B580 isn't cut down at all, so a B770 would be a fully different, bigger die. I do hope they designed one and just haven't announced it yet, but that's not a given, given Intel's financial state.

    • @Jizousensei-R
      @Jizousensei-R 3 days ago +5

      @@chamoferindamencha8964 At least an announcement at CES 2025, if they want to.

    • @allxtend4005
      @allxtend4005 3 days ago +1

      There is no B770, and there won't be one.

  • @hvdiv17
    @hvdiv17 3 days ago +20

    Intel is targeting where most gamers are, at 1080p and 1440p, which is a smart move for them.
    The ones who buy $500+ GPUs are a small group.

    • @Observer168
      @Observer168 3 days ago

      Cloud companies are buying up thousands of 4090s at a time.

    • @4brotroydavis
      @4brotroydavis 2 days ago +1

      A small group that buys 4070s, 4070 Supers, 7700 XTs, and 7800 XTs??

    • @syncmonism
      @syncmonism 2 days ago

      No, they're just pricing it competitively. They can't afford to make enough of them to actually sell them in large volumes, because they are losing money on each one sold at 250 USD.

    • @AnEagle
      @AnEagle 2 days ago +2

      @@Observer168 Cloud companies tend to buy enterprise cards.

    • @irsshill4502
      @irsshill4502 23 hours ago

      @@4brotroydavis In the past, those cards were mid-range. Now they are high-end. Thx Nvidia.

  • @AlexSchendel
    @AlexSchendel 3 days ago +22

    I've enjoyed my time with my A750 and A770 (I'm an Intel employee, so the discounts are too good to pass up). I have not had any real issues with game compatibility, and driver updates have helped a lot. My biggest gripe has been the fact that VR is not supported and so you have to jump through a lot of hoops to get it working at all... I just don't want to have to do that to use my headset. I really hope that now that the drivers are in a good spot and Battlemage is out the door, in a much better spot than at launch, the drivers team can pivot to addressing the lack of VR support. Native support for SteamVR and Oculus Link are sorely needed.

    • @allxtend4005
      @allxtend4005 3 days ago

      The last time I saw so much BS was on Reddit.

    • @AnEagle
      @AnEagle 2 days ago

      I can see that happening. I've also been noticing a huge uptick in support on the Linux side in the last six months, so I think they've started to tackle all that stuff now that the actual image processing is getting good.

    • @AlexSchendel
      @AlexSchendel 1 day ago

      @@AnEagle I've been using Linux on it since day one, and while it was annoying to have to force the use of beta mesa drivers at first, I've had a pretty great experience all around. It never got the same flashy articles about how much better each driver update was, but on the flip side, it is so much easier to update. I do have a Windows install for the once in a blue moon time that I need it, and to this day, the drivers still fail to update on their own. If I try to update drivers from Arc control, it fails to find them. On Linux, any package manager just finds that mesa was updated and installs it nice and easy.

  • @LeesChannel
    @LeesChannel 3 days ago +23

    My mind is always on Gordon; I hope he is able to spend quality time with his loved ones.

    • @threadripper979
      @threadripper979 2 days ago +3

      I only come to check on Gordon. These jokers - not so much.

    • @bradeinarsen
      @bradeinarsen 1 day ago

      @@threadripper979 Rude! Adam and Brad (and friends) are great too!

    • @threadripper979
      @threadripper979 1 day ago

      @@bradeinarsen Sure, slick. Anything you say.

  • @Ale-ch7xx
    @Ale-ch7xx 4 days ago +39

    Will you be doing any PCIe 3.0 testing for those of us still on PCIe 3.0, such as on B450 and A520 motherboards?

    • @arlynnfolke
      @arlynnfolke 3 days ago +9

      this

    • @Xero_Wolf
      @Xero_Wolf 3 days ago +3

      Why are you still on PCIe 3.0? I'm poor but not that poor.

    • @Ale-ch7xx
      @Ale-ch7xx 3 days ago +21

      @@Xero_Wolf Because I prefer to save money. I didn't become a homeowner this year by throwing away money. The 5950X is perfectly fine for gaming, editing, and rendering. I picked it up on sale. It does what I need it to do, especially when you're on a tighter budget.
      B580 buyers are like me: people on a tight budget who don't want to spend a lot of money on a new build. I currently have a GeForce 1650, which I bought when I first built the computer along with a Ryzen 5 2500X, and I still have it.
      The 5950X will not bottleneck anything up to a 7800 XT, or possibly a 7900 XT, unless you lower your settings in games.

    • @wargamingrefugee9065
      @wargamingrefugee9065 3 days ago +7

      @@Xero_Wolf I'm not trying to dog-pile, just adding my take on staying on PCIe 3. I have a dedicated video editing/upscaling/Stable Diffusion/RAID 5 storage rig with an i9-10850K, 64 GB of RAM and an Arc A770 LE 16GB card. It works well and fits my use case. But, as you pointed out, it does have a limitation -- PCIe bandwidth. The solution, or at least a huge offset to the limitation, is to only use graphics cards with 16 PCIe lanes. So, that's what I do; Nvidia's RTX 2060 Super and Intel's A750 and A770 all have x16 PCIe links (rough bandwidth math in the sketch after this thread).
      It's basically the same story for my gaming rig. I'm on 10th-gen Intel and limited to PCIe 3.
      Making the change to PCIe 4 would mean buying a lot of new stuff to replace the stuff that still works and fits my needs. That's something I will not be doing as long as there's a workaround.

    • @jakejoyride
      @jakejoyride 3 days ago

      B450 supports PCIe 4.0; just update your BIOS.
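
A minimal sketch of the nominal PCIe bandwidth math behind this thread, assuming the standard per-lane figures and an x8 connector for the B580; illustration only, not a measurement.

```python
# Approximate one-direction PCIe throughput, in GB/s, after encoding overhead.
GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Nominal bandwidth of a PCIe link for a given generation and lane count."""
    return GBPS_PER_LANE[gen] * lanes

# An x16 card dropped into a PCIe 3.0 slot keeps all 16 lanes...
print(f"PCIe 3.0 x16: {link_bandwidth('3.0', 16):.1f} GB/s")  # ~15.8 GB/s
# ...while an x8 card (the B580 is specced with an x8 link) only ever gets 8,
# so on a PCIe 3.0 board it runs with roughly half the slot's bandwidth:
print(f"PCIe 3.0 x8:  {link_bandwidth('3.0', 8):.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 4.0 x8:  {link_bandwidth('4.0', 8):.1f} GB/s")   # ~15.8 GB/s
```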

  • @robotsix6268
    @robotsix6268 2 days ago +4

    B580 street price: $370. It's all a lie.

  • @Eternalduoae
    @Eternalduoae 2 days ago +3

    This video has it all! B850, GTX 1080 Ti, X3D CPU! It's incredibly long and detailed and has sections covering all the bases.
    Bravo!

  • @guppyduppy1897
    @guppyduppy1897 1 day ago +1

    I think it's amazing what Intel achieved here.

  • @TheGanntak
    @TheGanntak 2 days ago +2

    Looks like a really nice card for the price, and the fact that they aimed at 1440p is awesome, as that's a huge market. I'm in awe of that 1080 though; it's neck and neck with the 3060 in most games and the 4060 in some. Very impressive card.

  • @MrHal900
    @MrHal900 2 days ago +2

    Very well done, Intel!!! Drivers will not be an issue; Intel puts a lot of work into developing them.

  • @hbdude155
    @hbdude155 3 days ago +5

    Welcome to the cool kids' club, Intel.

  • @AshrafAliS
    @AshrafAliS 3 days ago +1

    Sorting those numbers would make it easier for viewers. We need to see which card is doing well, and right now we have to read every number to figure that out; if they were sorted, we could just look at the top 2 or 3.

  • @lavi688
    @lavi688 2 days ago +1

    For a $200 (current value) 1080 Ti, that says how good the card was and is; it will always remain Nvidia's GOAT. It's keeping up with the B580 and even beats it in one of the tests. Not worth 'side-grading' to a B580 for better 1% lows if you have a 1080 Ti.

  • @AjaySensei
    @AjaySensei 3 days ago +4

    Proof that the GTX 1080 Ti is the GREATEST OF ALL TIME!

    • @kdxanatos
      @kdxanatos 9 minutes ago

      My only complaint about that card is that it's given me no reason to upgrade in the 8 years I've had it! I'm finally planning on a new one with a completely new system build. Longest lasting card in my 25ish years of computer gaming!

  • @gameingHyenia
    @gameingHyenia 1 day ago

    Indy works just fine on my A770 after I updated to the new drivers. The B580 I do not have, and probably never will. Not at $400.

  • @laz7777
    @laz7777 2 days ago +1

    I'm hoping they release a mid- or high-end competitor. A B770 would be an immediate buy for me. For now I'll probably have to settle for whatever AMD releases, assuming they don't raise their prices this gen.

    • @Koaloth
      @Koaloth 14 hours ago

      I have the A770 now; great card. I'll get the B770 if they release it.

  • @josiahsuarez
    @josiahsuarez 1 day ago +1

    For the cost this card performs!

  • @marcasswellbmd6922
    @marcasswellbmd6922 3 days ago +5

    Still, to this day I couldn't care less about RT; it's just something I don't really notice as I am running around trying not to die.

    • @4brotroydavis
      @4brotroydavis 2 days ago

      Yeah, but if you were WALKING around and exploring, you'd appreciate it. It's not a deal-breaker not to have, but it's hard not to want if you've ever gotten to play something at a respectable performance level with it on.

  • @ScottGrammer
    @ScottGrammer 3 days ago +1

    The issue with DaVinci Resolve is concerning to me as that is a use case I had planned on. Hardware Canucks tested it with Resolve, but they just rendered an edited video with it. In that test, it did better than the 4060, but not as well as the A-series cards.

    • @PelonixYT
      @PelonixYT 3 days ago

      I believe that is normal, unfortunately; IIRC Intel said those are the expected results.

    • @pcworld
      @pcworld  3 days ago +2

      I will definitely be chasing this one down as I'm curious too. I'm trying to contact PugetBench to figure out if this is something on the testing side.
      -Adam

    • @tylertoptier2353
      @tylertoptier2353 2 days ago

      Probably because of a smaller memory bus than the A-series.

  • @AlexSchendel
    @AlexSchendel 3 days ago +1

    GN had similar idle power issues, but Level1 got idle power at 7.5 to 13W depending on monitor refresh rate. I'm curious about settings related to:
    - What is the monitor setup between reviewers (resolution and multi-monitor or not)?
    - What are the BIOS and Windows ASPM settings?

    • @lokeung0807
      @lokeung0807 3 days ago

      From the Alchemist experience, Intel's idle power is fine with a 60 Hz monitor, but will be significantly higher for anything above that.

    • @AlexSchendel
      @AlexSchendel 3 days ago +2

      @@lokeung0807 Yeah. First of all, power savings require ASPM to be enabled.
      Sadly, I'm running 2x 1440p displays, with one at 144 Hz, so that means I'm constantly drawing 45 W no matter what (rough cost math in the sketch after this thread).

    • @lokeung0807
      @lokeung0807 3 days ago

      @@AlexSchendel I am using an A750 with a 100 Hz 1080p monitor; the GPU uses 39-41 W at idle...

    • @AlexSchendel
      @AlexSchendel 3 days ago +1

      @@lokeung0807 I'm assuming you mean 100Hz? That sounds like it should be fine if you have ASPM configured properly.

    • @lokeung0807
      @lokeung0807 3 days ago

      @@AlexSchendel Yes, the power draw drops significantly if I switch to 60 Hz,
      but it is high if I switch to 100 Hz.
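
A quick back-of-the-envelope on what the idle draws discussed above cost, assuming a roughly $0.35/kWh residential rate (a placeholder figure, not a quoted tariff):

```python
# Monthly cost of a GPU that never drops below 45 W at idle.
idle_watts = 45
rate_usd_per_kwh = 0.35        # assumed rate; plug in your own tariff
hours_per_month = 24 * 30.4

kwh_per_month = idle_watts / 1000 * hours_per_month
print(f"{kwh_per_month:.1f} kWh/month -> ${kwh_per_month * rate_usd_per_kwh:.2f}/month")
# ~32.8 kWh, about $11.50 a month; at the ~13 W figure mentioned above for
# Level1's setup, it would be closer to $3.30.
```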

  • @trumpdonald7838
    @trumpdonald7838 13 hours ago

    Intel's engineering and driver development are at a high level, which is worth looking forward to.

  • @pregorygeck6605
    @pregorygeck6605 5 hours ago

    Comparing the A750 to the B580 like "apples to apples" is nasty!

  • @burnitdwn
    @burnitdwn 3 days ago

    Hah, I recently grabbed a used 1080 Ti for my wife's PC. It was about $135 on eBay; shipping and sales tax brought it to around $165 here. I was going to just install my Radeon 6800 in her PC, but I'm waiting for a better $500 GPU. I was tempted to pull the trigger when the 7900 GRE hit $499 a while back, and tempted by a $620 7900 XT, but I am not in a rush.

  • @RighteousBruce
    @RighteousBruce 1 day ago +1

    Starts at 21:22

  • @SecretlySeven
    @SecretlySeven 3 days ago

    Looking at my A770, I feel like Intel owes us early adopters a special discount on a B-series card. I feel like if they had stayed on the Alchemist cards longer, we'd be in a better place. I'm still dealing with issues in games.

  • @ramonzaions7522
    @ramonzaions7522 3 days ago +1

    Great Video Guys!

  • @Accuaro
    @Accuaro 23 hours ago

    Hi, may I ask if this card can play Halo: MCC and Overwatch? I have looked all over YT and found no results 😭😭

  • @phoenix1971
    @phoenix1971 16 hours ago

    Back in October, I managed to pick up a Palit RTX 2080 Ti Gaming Pro 11GB for £225 UK, which I thought was a bargain. I'd be interested to know how the B580 competes against this type of card.

  • @DamnGamesTV
    @DamnGamesTV 2 days ago

    Here in Indonesia the Intel Arc B580 is around $281 😢, so I decided to buy the Zotac RTX 4060 Ti 16GB AMP model for $381. Is that a good price for an RTX 4060 Ti 16GB?

  • @FordEntertainmentGroupFEG
    @FordEntertainmentGroupFEG 2 days ago +2

    Really? An hour-long video?

  • @mattpulliam4494
    @mattpulliam4494 3 days ago

    OK, it's time to OC these competitors and see how much Intel spreads its wings. At least Tom Petersen has mentioned there's some headroom there to be tapped.

  • @matthiasmartin4355
    @matthiasmartin4355 1 day ago

    Unfortunately, testing from websites like ComputerBase has found the B580 struggling across pretty much all Unreal Engine 5 games. Hopefully Intel can improve that via a driver update.

  • @RhinoXpress
    @RhinoXpress 3 days ago

    So where was the 6700 in the thumbnail?

    • @pcworld
      @pcworld  2 days ago +1

      Whoops, typo! Thanks for the catch, it's fixed now.
      -Adam

  • @BarisYener
    @BarisYener 2 days ago

    I used my Intel Arc A750 with a new Windows 11 install, put in an Nvidia GPU, then went back to the A750, and had the same issue with the Puget DaVinci benchmark. I guess that's something driver/Windows related. Before the Nvidia GPU was installed there was no problem running the DaVinci standard benchmark, but the extended one couldn't be finished.

  • @allxtend4005
    @allxtend4005 3 days ago

    Strange cards you are testing against. Why not pick the RTX 3060 Ti 12GB version? Why not pick the RX 6700 XT? Why test it against the A750 and not against the A770? Why is there a 1080 Ti when you test RT? So many questions.

    • @kspfan001
      @kspfan001 3 days ago +1

      Why not just go to one of the many other PC hardware YouTube channels that have tested it against the GPUs you want to see? Nobody has every GPU lying around to test, my guy.

    • @pcworld
      @pcworld  2 days ago +1

      We picked these particular cards because they were the closest in price to the B580 (including the used GTX 1080 Ti).
      - Adam

    • @kazuviking
      @kazuviking 2 days ago

      The 3060 12GB is 30-40€ more expensive, and the 6700 XT is unobtainium in the EU. The 1080 Ti has the same raster performance as a 4060. The A770 costs more than the B580.

  • @Martin-wj6um
    @Martin-wj6um 3 days ago

    Hi guys! Good job, thank you! Get well, Gordon!!!

  • @jaynorwood2
    @jaynorwood2 3 days ago

    I don't believe Meteor Lake had the matrix hardware, so maybe just Lunar Lake laptops will support the frame generation.

  • @henrikg1388
    @henrikg1388 3 days ago

    I just wish some YT channel could do a more general comparison across mobile, desktop and even consoles to give a broader view than the niche DIY desktop market and minute FPS comparisons. The non-desktop gamers are actually a majority these days. Will the Arc get a mobile version? I think there will be mostly flat packages to the gaming kids this Christmas. Reflect that.
    Writing this on a four year old laptop with a 16GB RTX 3080 (mobile) that probably still outperforms this new budget wonder.

  • @xg320
    @xg320 1 day ago

    Where is the game footage with stats?

  • @PCandTech-tr8xt
    @PCandTech-tr8xt 3 days ago

    User: Any data on LLM inference speed or Stable Diffusion rendering?
    Blogger: Data? The only data I care about is my kill-death ratio.

  • @Observer168
    @Observer168 3 days ago

    We need 24GB and 48GB versions to give Nvidia some competition. Cloud hosting companies are buying up all the 4090s.

  • @docbrody
    @docbrody 2 days ago +1

    I came here to say, “caveat”

  • @4brotroydavis
    @4brotroydavis 2 days ago

    Currently have a 5700 XT and soon upgrading to a 1440p monitor. I think the 5700 XT can hold up long enough until we get the hard truth about what's coming next. But if I can find this card for $270 still in stock, I'll grab her.

  • @baysidejr
    @baysidejr 3 days ago

    Meteor Lake doesn't have XMX hardware; just Lunar Lake and Arc. The B580 looks good. I have an A770, so I'm waiting to see if a B770 is released.

    • @kspfan001
      @kspfan001 3 days ago

      It has been stuck in development hell and likely won't come out until late 2025/early 2026, if at all. They may just skip it and move on to Celestial.

  • @Wild_Cat
    @Wild_Cat 4 days ago +3

    Intel Arc FTW!

  • @IndianCitizen04
    @IndianCitizen04 3 days ago

    I think the driver is responsible for the higher idle power draw of Intel GPUs.

  • @Veptis
    @Veptis 3 days ago

    Why are your title and description localized? I don't want that, and there is no indicator.
    (Android app).

  • @syncmonism
    @syncmonism 2 days ago

    It's not mainstream ready if they can't afford to sell very many of them.

  • @marcasswellbmd6922
    @marcasswellbmd6922 3 days ago +5

    I am still using an ASRock PG B550 Velocita, a 5900X, and the great 6800 XT. That 16 GB has been more than enough. My God, that GPU has future-proofed me to this day. Yeah, yeah, I know there are way faster cards out now, but that XFX 6800 XT has held me down at 4K 60 FPS gaming, and even higher frames in most games. I just played Hogwarts last night above 60 FPS on high settings in 4K. And forget about it for 2K; this card still crushes 1440p gaming. AMD truly does give people more for their money, and AM4 is still the gift that keeps on giving to me. I am due for a new build though; it's been 7-ish years, but honestly I can see myself getting by on this setup for another 5/6 years. IDK...

  • @donkey_mediocre7246
    @donkey_mediocre7246 3 days ago

    My 1650 can finally rest

  • @antonlogunov1936
    @antonlogunov1936 3 days ago

    Please fix the idle power draw. I live in California, and electricity is ultra expensive.

    • @cirozorro
      @cirozorro 3 days ago +1

      See KitGuru's review (22-minute mark); two settings changes got the idle power draw down from 36 W to 15 W [1. Enable ASPM in the BIOS. 2. In the OS, go to PCI Express > Link State Power Management and select Maximum power savings].
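
A minimal sketch of scripting step 2 above with Windows' powercfg tool, assuming the usual SUB_PCIEXPRESS/ASPM aliases on recent Windows builds (verify with `powercfg /aliases`); the BIOS ASPM switch still has to be enabled by hand.

```python
# Sets "PCI Express > Link State Power Management" to Maximum power savings
# on the active Windows power plan. Run from an elevated prompt.
import subprocess

def set_pcie_aspm_max_savings() -> None:
    # Index 2 = "Maximum power savings" (0 = Off, 1 = Moderate power savings).
    for flag in ("/setacvalueindex", "/setdcvalueindex"):
        subprocess.run(
            ["powercfg", flag, "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "2"],
            check=True,
        )
    # Re-apply the current scheme so the new value takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

if __name__ == "__main__":
    set_pcie_aspm_max_savings()
```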

  • @gameingHyenia
    @gameingHyenia 1 day ago +1

    Yes, the 1080 Ti; I will take mine to the grave with me.

  • @Veptis
    @Veptis 3 days ago

    Would be a lot of fun to profile the games and jump into the pipeline and calls to see what's going on but I don't think that is easy to explain to your audience. Any idea where such content exists, if at all?

    • @pcworld
      @pcworld  3 days ago +1

      That would be awesome for sure, but it's way over my head; I can talk to Will about it, though.
      -Adam

  • @Reaper9-4
    @Reaper9-4 2 days ago

    I'll buy an Intel card when it's more widely supported for VR and more games have XeSS 2.

  • @blackvq3759
    @blackvq3759 3 days ago +2

    Okay Nvidia, let's sell the 5060 for $250 😅

    • @zerosam5541
      @zerosam5541 3 days ago +1

      Nvidia is more likely to increase the VRAM to 12 GB or more than to lower the price.

  • @kartikpintu
    @kartikpintu 3 days ago

    Should I upgrade or keep using my RX 6700 XT?

    • @Gorib_Bhai32
      @Gorib_Bhai32 3 days ago +2

      Keep your RX 6700 XT.

    • @pcworld
      @pcworld  3 days ago

      Keep riding that, it's still a damn good card.
      -Adam

  • @brianrobinson3961
    @brianrobinson3961 3 days ago +1

    0:22 B580

  • @VincentAmadeus
    @VincentAmadeus 4 days ago +3

    Worth upgrading from my RX 6600? 🙏🏻 Please helppppp

    • @JBrinx18
      @JBrinx18 4 days ago +3

      No. You want a 60% or greater boost before upgrading.

    • @darth254
      @darth254 4 days ago +1

      This will be a decent boost in performance, and a very hefty VRAM boost.
      If you're not happy with your 6600's performance in the games you like to play, then you're not happy. So if you have the $$$, then move on from it. If you are happy with your 6600 or don't have the money to spare, then continue to stick with it.

    • @hammerheadcorvette4
      @hammerheadcorvette4 3 days ago +1

      Definite upgrade over my 5700 XT.

    • @Xero_Wolf
      @Xero_Wolf 3 days ago +2

      I would still wait. Better GPUs are coming next year from all three vendors. Although it depends on your budget too. Let's see when AMD's lower-end cards land first.

    • @gorky_vk
      @gorky_vk 3 days ago +3

      Not really. The 6600 is beaten by this card, sure. But the 6600 was never a "playing on ultra" type of card, and in reviews it mostly gets destroyed because of the pure VRAM limitation. Even in some games where the 4060 gets a pass, most if not all games for some reason use more VRAM on AMD cards.
      At the settings you're more likely to play at, the difference is not that big. If you're buying now, the B580 would be an easy recommendation even with the price difference, but as an upgrade, no.

  • @warnacokelat
    @warnacokelat 2 days ago

    This card will only get better with driver updates.

  • @stephanhart9941
    @stephanhart9941 4 days ago

    No more Brad GPU reviews on camera? Where you at, homie?

    • @BradChacos
      @BradChacos 4 days ago +7

      I'll be on Full Nerd today; I've never been based in SF, so the video reviews ain't me!

    • @stephanhart9941
      @stephanhart9941 3 days ago +2

      @BradChacos I know you are a fellow New Englander. Dorchester representing!

    • @4brotroydavis
      @4brotroydavis 2 days ago

      @BradChacos I just got hip to you when y'all sat down with Tom... u a real one

  • @castillo_-_
    @castillo_-_ 3 days ago

    51:00 your saying we can run a system with this gpu with a 450w psu?

    • @kazuviking
      @kazuviking 2 days ago +1

      You can run a 7800X3D with a 3070 on a 350 W PSU without issues (rough power math in the sketch after this thread).

    • @renatoramos8834
      @renatoramos8834 2 days ago

      YOU'RE
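
Rough power math behind the 450 W question above, using assumed worst-case figures (roughly 190 W total board power for the B580 per Intel's spec, placeholder numbers for everything else); a sanity check, not a guarantee.

```python
# Back-of-the-envelope PSU headroom estimate; swap in your own parts' numbers.
loads_watts = {
    "Arc B580 (board power)": 190,
    "Mid-range CPU under game load": 120,
    "Motherboard, RAM, SSD, fans": 50,
}
psu_watts = 450

total = sum(loads_watts.values())
headroom = psu_watts - total
print(f"Estimated peak draw: {total} W on a {psu_watts} W unit "
      f"({headroom} W spare, {headroom / psu_watts:.0%} headroom)")
# ~360 W estimated peak leaves roughly 20% margin on a decent 450 W unit;
# transient spikes and overclocking eat into that margin.
```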

  • @northernseeker1822
    @northernseeker1822 1 day ago

    No stock, prices raised by retailers and even more by scalpers; wait for AMD.

  • @pastelink6767
    @pastelink6767 3 days ago

    So they caught up to this gen with a bigger die on a smaller node right before next gen is about to ship from AMD and Nvidia? Doesn't sound competitive to me unless Intel is willing to take the L for possible long term gains.

  • @ozanozkirmizi47
    @ozanozkirmizi47 2 days ago +1

    If this is not going to be a "huge wake-up call for AMD," I do not know what will be...

    • @jonorgames6596
      @jonorgames6596 2 days ago

      Why do you say that? Intel is losing money on each card here; negative margins.

  • @ayaanmza
    @ayaanmza 3 days ago

    Please check Warzone on the B580.

  • @happybuggy1582
    @happybuggy1582 3 days ago

    They are about a generation late.

  • @gameingHyenia
    @gameingHyenia 1 day ago

    $250? Where is it at $250?

  • @kennethwilson8236
    @kennethwilson8236 3 days ago

    Yeah the Indiana Jones game has something weird going on with it

    • @pcworld
      @pcworld  3 days ago

      So you have seen others reporting problems?
      -Adam

  • @kazuviking
    @kazuviking 2 days ago +1

    Small correction: Intel vendor-locked the DP4A path to their own integrated/dedicated GPUs in late 2022. Everyone else can use the Shader Model 6.4 path, which is considerably worse than DP4A in performance but the same in image quality: FLOAT16 vs INT8.
    Any XeSS version released after 2022 has the DP4A path locked, so only the DX12 Shader Model 6.4 path is available.
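
To make the FLOAT16-vs-INT8 point above concrete, here is a toy model of the DP4A primitive (a 4-way INT8 dot product accumulated into INT32) next to the same math in FP16; this only illustrates the data types involved, not XeSS itself.

```python
import numpy as np

def dp4a(a4: np.ndarray, b4: np.ndarray, acc: np.int32) -> np.int32:
    """Toy model of DP4A: 4-way INT8 dot product accumulated into INT32."""
    assert a4.dtype == np.int8 and b4.dtype == np.int8 and len(a4) == len(b4) == 4
    return acc + np.int32(np.dot(a4.astype(np.int32), b4.astype(np.int32)))

a = np.array([12, -7, 33, 100], dtype=np.int8)
b = np.array([-5, 20, 3, 1], dtype=np.int8)

# Hardware with DP4A retires this as one fused multiply-accumulate:
print(dp4a(a, b, np.int32(0)))
# The SM 6.4 fallback does equivalent math in FP16, with no packed-INT8
# shortcut, which costs more per pixel even though the output matches:
print(np.dot(a.astype(np.float16), b.astype(np.float16)))
```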

  • @christophermullins7163
    @christophermullins7163 4 days ago +3

    OK, it wasn't a complete failure. I just hope that with how much Intel is losing on these GPUs (a 4070 Super amount of silicon cost on the B580 😮) they don't just close up shop on dGPUs and drivers. Hope not.

    • @hammerheadcorvette4
      @hammerheadcorvette4 3 days ago +2

      Xe is gonna be HUGE for Intel because of AI and other components in the Intel ecosystem. It's not going anywhere. TAP said the next gen is already done, and the hardware team has moved to the next gen following it. Intel is in this for real.

    • @Sureflame
      @Sureflame 3 days ago

      They're not losing money, lol. The 4060s are on 50-class dies and the 70s are on 60-class dies 💀. They're just not gimping the card in favor of leaning on AI to make fake frames while charging double what it should cost.

    • @gorky_vk
      @gorky_vk 3 days ago

      @@Sureflame Maybe not losing, but they surely aren't making any money on them either, with such a big die produced at a third-party foundry. Good for consumers, but that's the path AMD took with Phenom CPUs a long time ago, and it led the company almost to bankruptcy.

    • @Sureflame
      @Sureflame 3 days ago

      @gorky_vk It's cheap last-generation sand though, not bleeding-edge hardware. Nvidia is gearing up to use GDDR7, so that's one reason this card's asking price is $250.
      Provided they offer respectable VRAM, bus width, and memory speeds on newer cards, they might blow the B580 out of the water, but they'll be asking $100 more than the B580, lol.

    • @gorky_vk
      @gorky_vk 3 days ago

      @@Sureflame TSMC 5 nm is not cheap. It's not the leading edge anymore, but it's still an advanced node and not cheap at all.
      GDDR7 is completely pointless on entry-level cards, and I seriously doubt entry-level cards from Nvidia will use it even with the next generation. Even if they do, that will be in 6+ months.

  • @robertlawrence9000
    @robertlawrence9000 3 days ago +5

    I wish they would release a B990. We need more high-end cards. The low-end GPU market is already flooded with new cards and used cards from previous generations at multiple price points. Even as it is, the $250 B580 is sold out on many sites, and board-partner cards are $100 to $200 more, completely defeating the purpose of a $250 card. Intel really should be aiming higher. There is only one high-tier card right now, and we all know which one that is. Imagine if they released a massive card to compete with the RTX 4090 or a potential RTX 5090. That would really get people talking, and people might be more eager to get their high-end GPU if Intel could make it compete on performance per price versus the competition. Even generations later, people would still be thinking about getting the older high-end card as the prices go down and it competes at the lower end some years down the road. I think this is a better strategy.

    • @crymore7942
      @crymore7942 3 days ago +1

      A 4090/5090 isn't just a 'high-end' card, it's the best-performing GPU ever made. You are underestimating what's involved in bringing a card like that to market. Ask yourself why AMD isn't bringing out a competitor to the 5090. Intel are the new boys; they can't just pull a 5090-type card out of their ass that's also cheaper, with features like great ray tracing, etc. They are trying to gain market share whilst also improving their development work.

    • @robertlawrence9000
      @robertlawrence9000 3 days ago +1

      @@crymore7942 It's all about scaling. More processors.

    • @4m470
      @4m470 3 days ago

      Buy a 7900 XTX.

    • @Solruc_
      @Solruc_ 3 days ago +1

      The die would be impossibly expensive to produce. The B580 is delivering 4060 performance with the die size of a 4070 Super; rivaling 4090 performance with the current architecture would be unsustainable, and even for a halo product it would be ridiculously expensive.

    • @robertlawrence9000
      @robertlawrence9000 3 days ago +1

      @@4m470 That's an upper-mid-tier card.

  • @layoutkimsstudio2341
    @layoutkimsstudio2341 2 days ago

    Why is nobody trying to run PoE 2 on it, bruh? I wanna know XD

  • @凌云-u5k
    @凌云-u5k 3 days ago

    Interesting 😂 Both things named "Black" (the game and the DaVinci plugin) have problems ❤ Is "black" some key spell word that breaks the magic of Battlemage? ❤

  • @Marshaluranus
    @Marshaluranus 4 days ago +2

    Apparently everyone forgot the 6700 XT existed.

    • @NolanHayes-b8w
      @NolanHayes-b8w 3 days ago +3

      Not the same price point, man.

    • @oneanother1
      @oneanother1 3 days ago +1

      JayzTwoCents had a 3070 in the benchmarks.

    • @damianabregba7476
      @damianabregba7476 3 days ago

      Still an interesting point of comparison. It was rather cheap at one point, and it seems like this should have comparable performance and the same VRAM, with much better RT.
      A 6700 XT with better RT for less is quite compelling. @@NolanHayes-b8w

    • @Chinu-w5x
      @Chinu-w5x 3 days ago

      One question, buddy: here in my country I'm getting the RX 6700 XT at the same price, or even a little cheaper than this. Which one should I get??

  • @ZarakiKenpachi81
    @ZarakiKenpachi81 16 hours ago

    What's the point? The title and description are in my native language, but there aren't even subtitles. What's the point of doing this? Idi*ts...

  • @Accuaro
    @Accuaro 23 hours ago +1

    Ah, also for Wukong the engine is the RTX branch for UE5, so it definitely does run better on Nvidia (the game just got another optimisation update for Nvidia, albeit for RT lol)

  • @gameingHyenia
    @gameingHyenia 1 day ago

    The benchmark is your EYES and ears, not data on a screen. When you go down the FPS-and-data-on-a-screen route, you are always going to see things that do not affect your gameplay. Eyes and ears, people.

  • @csh9853
    @csh9853 2 days ago

    Stop saying ah or uh

  • @Manimmakimthisup
    @Manimmakimthisup 1 day ago

    Linux?

  • @markdove5930
    @markdove5930 1 day ago

    No games, no thanks.

  • @paulboyce8537
    @paulboyce8537 3 days ago

    00:05:48 Test machine specs. Every review makes this apples-to-oranges mistake: an AMD CPU doesn't give the best result for Intel Arc. The reason is the different architecture. An AMD/Nvidia GPU with ReBAR/E-cores gains +-5%, zero, zilch, nada; Intel Arc with ReBAR/E-cores gains +30% when optimized for an Intel CPU (hence the Intel Processor Diagnostic Tool). Intel Arc is only as good as the CPU it's paired with: better CPU pairing means better performance from splitting the tasks with the CPU. Understand that with an AMD/Nvidia GPU, the CPU only feeds the GPU, which does all the tasks, with queuing and waiting that results in broken frames. So: a good Intel CPU for an Intel GPU, and an AMD CPU for an AMD/Nvidia GPU.
    When you understand the two different architectures, you also know how misleading these tests are. As a rule it's Intel with all good frames versus double the frames but half of them broken (hence the high wattage to replace the broken frames, which can be half if not more), and this causes stutter and a bad experience on an AMD/Nvidia GPU. In the end these tests are worthless; comparing the experience is what matters. For example, at 4K the A770 beats the 4070 Ti with half the frames.

    • @pcworld
      @pcworld  2 days ago

      We're not here to give Arc the best result, we are here to see how it runs using our testing scenarios. It's important that potential buyers see this GPU will run across different kinds of test systems, which is why it's always good to watch multiple reviews and hope everyone is testing on something different.
      - Adam

    • @paulboyce8537
      @paulboyce8537 2 days ago

      @@pcworld Your testing scenarios are misleading, giving the competition an unfair advantage without pointing it out. To your credit, at least it was mentioned that the experience was somewhat better than and different from the competition, numbers aside.
      Yes, it is good to watch many tests, but YouTube mostly parrots other channels, with few exceptions, and you really need to dig deep to find reviews that take the different architecture into account. It is easy to just show numbers, even though they have very little meaning for real performance.

  • @lmmc7708
    @lmmc7708 2 days ago

    365