I've enjoyed my time with my A750 and A770 (I'm an Intel employee, so the discounts are too good to pass up). I have not had any real issues with game compatibility, and driver updates have helped a lot. My biggest gripe has been the fact that VR is not supported and so you have to jump through a lot of hoops to get it working at all... I just don't want to have to do that to use my headset. I really hope that now that the drivers are in a good spot and Battlemage is out the door, in a much better spot than at launch, the drivers team can pivot to addressing the lack of VR support. Native support for SteamVR and Oculus Link are sorely needed.
The last time I saw this much BS was on Reddit.
I'm curious about a B770 later on, tbh.
Yeah, that should be more powerful with more VRAM. 20GB?
@ac8598 at least 16GB
Fingers crossed they even make it, tbh.
The die in the b580 isn't cut down at all, so the B770 would be a fully different, bigger die. I do hope they designed one and just haven't announced it yet, but that's not a given, given Intel's financial state.
@@chamoferindamencha8964 At least an announcement at CES 2025, if they want to.
There is no B770, and there won't be.
Intel is targeting where most gamers are, at 1080p and 1440p, which is a smart move for them.
The people who buy $500+ GPUs are a small group.
Cloud companies are buying up thousands of 4090s at a time.
Will you be doing any PCIe 3.0 testing for those of us still on PCIe 3.0, such as on B450 and A520 motherboards?
this
Why are you still on PCIe 3.0? I'm poor, but not that poor.
@@Xero_Wolf Because I prefer to save money. I didn't become a homeowner this year by throwing away money. The 5950X is perfectly fine for gaming, editing, and rendering. I picked it up on sale. It does what I need it to do, especially when you're on a tighter budget.
B580 buyers are people like me, who are on a tight budget or don't want to spend a lot of money on a new build. I currently have a GeForce GTX 1650, which I bought when I first built the computer along with a Ryzen 5 2500X. I still have the 1650.
The 5950X will not bottleneck anything up to a 7800 XT, or possibly a 7900 XT, unless you lower your settings in games.
@@Xero_Wolf I'm not trying to dogpile, just adding my take for staying on PCIe 3. I have a dedicated video editing/upscaling/Stable Diffusion/RAID 5 storage rig with an i9-10850K, 64 GB of RAM, and an Arc A770 LE 16GB card. It works well and fits my use case. But, as you pointed out, it does have a limitation: PCIe bandwidth. The solution, or at least a huge offset to the limitation, is to only use graphics cards with 16 PCIe lanes. So that's what I do; Nvidia's RTX 2060 Super and Intel's A750 and A770 all have 16 PCIe lanes.
It's basically the same story for my gaming rig. I'm on 10th-gen Intel and limited to PCIe 3.
Making the change to PCIe 4 would mean buying a lot of new stuff to replace stuff that still works and fits my needs. That's something I will not be doing as long as there's a workaround.
B450 supports PCIe 4.0; just update your BIOS.
I am still using an ASRock PG B550 Velocita, a 5900X, and the great 6800 XT. That 16 gigs has been more than enough. My God, that GPU has future-proofed me to this day. Yeah, yeah, I know there are way faster cards out now, but that XFX 6800 XT has held me down at 4K 60 FPS gaming, with even higher frames in most games. I just played Hogwarts last night above 60 FPS on high settings in 4K. And forget about it at 1440p; this card still crushes 1440p gaming. AMD truly does give people more for their money, and AM4 is still the gift that keeps on giving to me. I am due for a new build though, it's been 7-ish years, but honestly I can see myself getting by on this setup for another 5 or 6 years. IDK...
The issue with DaVinci Resolve is concerning to me as that is a use case I had planned on. Hardware Canucks tested it with Resolve, but they just rendered an edited video with it. In that test, it did better than the 4060, but not as well as the A-series cards.
I believe that is normal, unfortunately; IIRC Intel said those are expected results.
I will definitely be chasing this one down as I'm curious too. I'm trying to contact PugetBench to figure out if this is something on the testing side.
-Adam
@@pcworldAdam Did you get the chance to just try out general editing activities in Resolve, without running the PugetBench benchmark macro itself?
My mind is always on Gordon, I hope he is able to spend quality time with his loved ones.
Welcome to the cool kids' club, Intel.
Sorting those numbers would make it easier for viewers to see which card is doing well. Right now we have to scan all the numbers to find the good ones; if they were sorted, we could just check the top 2 or 3.
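The sorting pass the comment asks for is trivial to do on chart data; here's a minimal Python sketch (the GPU names and FPS numbers are made up for illustration, not taken from the video):

```python
# Hypothetical FPS results for one game (illustrative numbers only)
results = {"B580": 72, "RTX 4060": 68, "A750": 61, "RX 6600": 55}

# Sort descending by FPS so the best performers appear first
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)

# Viewers can then read just the top 2-3 entries
top3 = ranked[:3]
print(top3)  # [('B580', 72), ('RTX 4060', 68), ('A750', 61)]
```

The same one-liner works for any metric (1% lows, frame times with `reverse=False`), which is presumably why most chart tools offer it as a toggle.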
Hah, I recently grabbed a used 1080 Ti for my wife's PC. It was about $135 on eBay; shipping and sales tax brought it to around $165 here. I was going to just install my Radeon 6800 in her PC, but I'm waiting for a better $500 GPU. I was tempted to pull the trigger when the 7900 GRE hit $499 a while back, and was tempted by the 7900 XT at $620, but I am not in a rush.
Great Video Guys!
Looking at my A770, I feel like Intel owes us early adopters a special discount on a B-series card. I feel like if they'd stayed on the Alchemist cards longer, we'd be in a better place. I'm still dealing with issues in games.
Hi guys! Good job, thank you! Get well, Gordon!!!
GN had similar idle power issues, but Level1 got idle power at 7.5 to 13W depending on monitor refresh rate. I'm curious about settings related to:
- What is the monitor setup between reviewers (resolution and multi-monitor or not)?
- What are the BIOS and Windows ASPM settings?
From the experience with Alchemist, Intel's idle power is fine for a 60Hz monitor, but will be significantly higher for anything above that.
@@lokeung0807 Yeah. First of all, power savings require ASPM to be enabled.
Sadly, I'm running 2x1440p displays, with one at 144Hz and so that means I'm constantly drawing 45W no matter what.
@@AlexSchendel I am using an A750 with a 100Hz 1080p monitor; the GPU draws 39-41W at idle...
@@lokeung0807 I'm assuming you mean 100Hz? That sounds like it should be fine if you have ASPM configured properly.
@@AlexSchendel Yes, the power draw drops significantly if I switch to 60Hz,
but it's high if I switch to 100Hz.
User: Any data on LLM inference speed or Stable Diffusion rendering?
Blogger: Data? The only data I care about is my kill-death ratio.
I don't believe Meteor Lake had the matrix hardware, so maybe just Lunar Lake laptops will support the frame generation.
Still, to this day I couldn't care less about RT; it's just something I don't really notice while I'm running around trying not to die.
I think the driver is responsible for the higher idle power draw of Intel GPUs.
So where was the 6700 in the thumbnail?
I just wish some YT channel would do a more general comparison across mobile, desktop, and even consoles to give a broader view than the niche DIY desktop market and minute FPS comparisons. Non-desktop gamers are actually the majority these days. Will Arc get a mobile version? I think there will be mostly flat packages for the gaming kids this Christmas. Reflect that.
Writing this on a four-year-old laptop with a 16GB RTX 3080 (mobile) that probably still outperforms this new budget wonder.
Need 24GB and 48GB versions to give Nvidia some competition. Cloud hosting companies are buying up all the 4090s.
OK, it's time to overclock these competitors and see how far Intel spreads its wings. At least Tom Petersen has mentioned there's some headroom there to be tapped.
Intel Arc FTW!
Meteor Lake doesn't have XMX hardware, just Lunar Lake and Arc. The B580 looks good. I have an A770, so I'm waiting to see if a B770 is released.
It has been stuck in development hell and likely won't come out until late 2025/early 2026, if at all. They may just skip it and move onto celestial.
Proof that the GTX 1080 Ti is the GREATEST OF ALL TIME!
Please, fix idle power draw. I live in California, electricity is ultra expensive.
See KitGuru's review (22-minute mark); two settings changes reduced the idle power draw from 36W to 15W: 1. Enable ASPM in the BIOS. 2. In Windows, set PCI Express → Link State Power Management to Maximum power savings.
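For the Windows half of that fix, the same power-plan setting can also be flipped from an elevated Command Prompt with `powercfg`; a sketch, assuming the standard `SUB_PCIEXPRESS`/`ASPM` setting aliases exist on your system (verify with `powercfg /aliases`):

```shell
:: Set "PCI Express > Link State Power Management" to Maximum power savings
:: (index 2; 0 = Off, 1 = Moderate power savings) on the active power plan,
:: for both AC and battery. Run from an elevated prompt.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 2
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 2
powercfg /setactive SCHEME_CURRENT
```

Note this only takes effect if ASPM is also enabled in the BIOS, per step 1 above.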
Strange cards you're testing against. Why didn't you pick the RTX 3060 Ti 12GB version? Why didn't you pick the RX 6700 XT? Why test against the A750 and not the A770? Why is there a 1080 Ti when you test RT? So many questions.
Why not just go to one of the many other PC hardware YouTube channels that have tested it against the GPUs you want to see? Nobody has every GPU lying around to test, my guy.
51:00 You're saying we can run a system with this GPU on a 450W PSU?
0:22 B580
Adam fail
Would be a lot of fun to profile the games and jump into the pipeline and calls to see what's going on but I don't think that is easy to explain to your audience. Any idea where such content exists, if at all?
That would be awesome for sure, but way over my head. I can talk to Will about it, though.
-Adam
Worth upgrading from my RX 6600? 🙏🏻 Please help!
No. You want a 60% or more boost before upgrading.
this will be a decent boost in performance, and a very hefty VRAM boost.
if you're not happy with your 6600's performance on games you like to play, then you're not happy. so if you have the $$$, then move on from it. if you are happy with your 6600 or don't have the money to spare, then continue to stick with it.
Definite upgrade over my 5700XT
I would still wait. Better GPUs are coming next year from all three vendors. Although it depends on your budget too. Let's see where AMD's lower-end cards land first.
Not really. The 6600 is beaten by this card, sure. But the 6600 was never a "play on ultra" type of card, and in reviews it's mostly destroyed by the pure VRAM limitation. Even in some games where the 4060 gets a pass, most if not all games for some reason use more VRAM on AMD cards.
At the settings you're more likely to play at, the difference is not that big. If you're buying now, the B580 would be an easy recommendation even with the price difference, but as an upgrade, no.
No more Brad GPU reviews on camera? Where you at homie?
I'll be on Full Nerd today, I've never been based in SF so the video reviews ain't me!
@BradChacos I know you are a fellow New Englander. Dorchester representing!
Okay Nvidia, let's sell the 5060 for $250 😅
Nvidia is more likely to increase the VRAM to 12GB or more than to lower the price.
My 1650 can finally rest
They are about a generation late
I wish they would release a B990. We need more high-end cards. The low-end GPU market is already flooded with new cards and with used cards from previous generations at multiple price points. Even as it is, the $250 B580 is sold out on many sites, and board partner cards are $100 to $200 more, completely defeating the purpose of a $250 card. Intel really should be aiming higher. There is only one high-tier card right now, and we all know which one that is. Imagine if they released a massive card to compete with the RTX 4090 or a potential RTX 5090. That would really get people talking, and people might be more eager to get a high-end Intel GPU if Intel could make it compete on price to performance vs. the competition. Even generations later, people would still be thinking about getting the older high-end card as prices come down and it competes in the lower end some years down the road. I think this is a better strategy.
A 4090/5090 isn't just a "high end" card; it's the best-performing GPU ever made. You are underestimating what's involved in bringing a card like that to market. Ask yourself why AMD isn't bringing out a competitor to the 5090. Intel are the new boys; they can't just pull a 5090-type card out of their ass that's also cheaper and has features like great ray tracing. They are trying to gain market share while also improving their development work.
@@crymore7942 It's all about scaling. More processors.
Buy a 7900XTX
The die would be impossibly expensive to produce. The B580 delivers 4060 performance with the die size of a 4070 Super; rivaling 4090 performance on the current architecture would be unsustainable, and even as a halo product it would be ridiculously expensive.
@@4m470 That's an upper mid tier card.
Should I upgrade or keep using my RX 6700 XT?
Keep your RX 6700 XT.
Keep riding that, it's still a damn good card.
-Adam
So they caught up to this gen with a bigger die on a smaller node right before next gen is about to ship from AMD and Nvidia? Doesn't sound competitive to me unless Intel is willing to take the L for possible long term gains.
Yeah the Indiana Jones game has something weird going on with it
So you have seen others reporting problems?
-Adam
Why is your title and description localized? I don't want that, and there is no indicator.
(Android app).
Please check Warzone on the B580.
00:05:48 Test machine specs. Every review makes this same apples-to-oranges mistake: an AMD CPU doesn't give the best result for Intel Arc. The reason is the different architecture. An AMD/Nvidia GPU gains roughly nothing from ReBAR/E-cores (+-5%), while Intel Arc gains +30% when optimized for an Intel CPU (hence the Intel Processor Diagnostic Tool). Intel Arc is only as good as the CPU it's paired with: a better CPU pairing gives better performance from splitting tasks with the CPU. By contrast, with AMD/Nvidia the CPU only feeds the GPU, which does all the tasks, with queuing and waiting that results in broken frames. So: a good Intel CPU for an Intel GPU, and an AMD CPU for an AMD/Nvidia GPU.
Once you understand the two different architectures, you also see how misleading these tests are. As a rule it's Intel with all good frames vs. double the frames but half of them broken, hence the high wattage to replace the broken frames (which can be half or more), and this causes stutter and a bad experience on an AMD/Nvidia GPU. In the end these tests are worthless; comparing the experience is what matters. For example, at 4K the A770 beats the 4070 Ti with half the frames.
Ok, it wasn't a complete failure. I just hope that, with how much Intel is losing on these GPUs (a 4070 Super amount of silicon on the B580 😮), they don't just close up shop on dGPUs and drivers. Hope not.
Xe is gonna be HUGE for Intel because of AI and other components in the Intel ecosystem. It's not going anywhere. TAP said the next gen is already done, and the hardware team has moved to the next gen following it. Intel is in this for real.
They're not losing money lol. The 4060s are on 50-class dies and the 70s are on 60-class dies 💀. Intel is just not gimping the card in favor of leaning on AI to make fake frames while charging double what it should cost.
@@Sureflame Maybe not losing, but they're surely not making any money on them either, with such a big die produced at a third-party foundry. Good for consumers, but that's the path AMD took with the Phenom CPUs a long time ago, and it led the company almost to bankruptcy.
@gorky_vk It's cheap last-generation sand, though, not bleeding-edge hardware. Nvidia is gearing up to use GDDR7, so that's one reason the card's asking price is $250.
Provided they deliver respectable VRAM, bus width, and memory speeds on the newer cards, they might blow the B580 out of the water, but they'll be asking $100 more than the B580 lol.
@@Sureflame TSMC 5nm is not cheap. It's not the leading edge anymore, but it's still an advanced node and not cheap at all.
GDDR7 is completely pointless on entry-level cards, and I seriously doubt entry-level cards from Nvidia will use it even in the next generation. Even if they do, that will be in 6+ months.
Interesting 😂 both the game and the DaVinci plugin named with "black" have problems ❤ Is "black" some key spell word that dispels the magic of Battlemage? ❤
Apparently everyone forgot the 6700 XT existed.
Not the same price point man
JayzTwoCents had a 3070 in the benchmarks.
Still an interesting point of comparison. It was rather cheap at one point, and it seems like this should have comparable performance and the same VRAM, with much better RT.
A 6700 XT with better RT, for less, is quite compelling. @@NolanHayes-b8w
One question, buddy: here in my country I'm getting the RX 6700 XT at the same price, or even a little cheaper than this. Which one should I get??