I'm curious about a B770 later on, tbh
Yeah, that should be more powerful with more VRAM. 20GB?
@ac8598 at least 16GB
Fingers crossed they even make it, tbh.
The die in the b580 isn't cut down at all, so the B770 would be a fully different, bigger die. I do hope they designed one and just haven't announced it yet, but that's not a given, given Intel's financial state.
@@chamoferindamencha8964 At least an announcement at CES 2025, if they want to.
There is no B770, and there won't be.
Intel is targeting where most gamers are, at 1080p and 1440p, which is a smart move for them.
The ones that buy the $500+ gpus are a small group.
Cloud companies are buying up thousands of 4090s at a time.
Small group that buys 4070s, 4070 supers, 7700xts, and 7800xts??
No, they're just pricing it competitively. They can't afford to make enough of them to actually sell them in large volumes, because they are losing money on each one sold at 250 USD.
@@Observer168 Cloud companies tend to buy enterprise cards.
@@4brotroydavis In the past, those cards were mid range. Now they are high end. Thx nvidia.
I've enjoyed my time with my A750 and A770 (I'm an Intel employee, so the discounts are too good to pass up). I have not had any real issues with game compatibility, and driver updates have helped a lot. My biggest gripe has been the fact that VR is not supported and so you have to jump through a lot of hoops to get it working at all... I just don't want to have to do that to use my headset. I really hope that now that the drivers are in a good spot and Battlemage is out the door, in a much better spot than at launch, the drivers team can pivot to addressing the lack of VR support. Native support for SteamVR and Oculus Link are sorely needed.
The last time I saw so much BS was on Reddit.
I can see that happening. I've also been noticing a huge uptick in support on the Linux side in the last six months, so I think they've started to tackle all that stuff now that the actual image processing is getting good.
@@AnEagle I've been using Linux on it since day one, and while it was annoying to have to force the use of beta Mesa drivers at first, I've had a pretty great experience all around. It never got the same flashy articles about how much better each driver update was, but on the flip side, it is so much easier to update. I do have a Windows install for the once-in-a-blue-moon time that I need it, and to this day, the drivers still fail to update on their own. If I try to update drivers from Arc Control, it fails to find them. On Linux, any package manager just sees that Mesa was updated and installs it nice and easy.
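If anyone wants to double-check which Mesa build their Arc card is actually running after a package update, here's a minimal sketch; it assumes glxinfo is installed (from the mesa-utils / mesa-demos package, the name varies by distro) and just parses its output:

import subprocess

def mesa_version() -> str:
    # glxinfo -B prints a short summary; the "OpenGL version string" line
    # ends with the Mesa version the active driver reports.
    out = subprocess.run(
        ["glxinfo", "-B"], capture_output=True, text=True, check=True
    ).stdout
    for line in out.splitlines():
        if "OpenGL version string" in line:
            return line.split(":", 1)[1].strip()
    return "unknown"

if __name__ == "__main__":
    print(mesa_version())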
My mind is always on Gordon, I hope he is able to spend quality time with his loved ones.
I only come to check on Gordon. These jokers - not so much.
@@threadripper979 Rude! Adam and Brad (and friends) are great too!
@@bradeinarsen Sure, slick. Anything you say.
Will you be doing any PCIe 3.0 testing for those of us still on PCIe 3.0, such as on B450 and A520 motherboards?
this
Why are you still on PCIE 3.0? I'm poor but not that poor.
@@Xero_Wolf Because I prefer to save money. I didn't become a homeowner this year by throwing away money. The 5950X is perfectly fine for gaming, editing, and rendering. I picked it up on sale. It does what I need it to do, especially when you're on a tighter budget.
B580 buyers are people like me, who are on a tight budget or don't want to spend a lot of money on a new build. I currently have a GeForce 1650, which I bought when I first built the computer along with a Ryzen 5 2500X, and I still have that 1650 today.
The 5950X will not bottleneck anything up to a 7800 XT or possibly 7900 XT unless you lower your settings in games.
@@Xero_Wolf I'm not trying to dogpile, just adding my take on staying on PCIe 3. I have a dedicated video editing/upscaling/Stable Diffusion/RAID 5 storage rig with an i9-10850K, 64 GB of RAM, and an Arc A770 LE 16GB card. It works well and fits my use case. But, as you pointed out, it does have a limitation -- PCIe bandwidth. The solution, or at least a huge offset to the limitation, is to only use graphics cards with 16 PCIe lanes. So that's what I do; Nvidia's RTX 2060 Super and Intel's A750 and A770 all have 16 PCIe lanes.
It's basically the same story for my gaming rig. I'm on 10th-gen Intel and limited to PCIe 3.
Making the change to PCIe 4 would mean buying a lot of new stuff to replace stuff that still works and fits my needs. That's something I will not be doing as long as there's a workaround.
B450 supports PCIe 4.0; just update your BIOS.
B580 street price: $370. It's all a lie.
This video has it all! B580, GTX 1080 Ti, X3D CPU! It's incredibly long and detailed and has sections covering all the bases.
Bravo!
I think it's amazing what Intel achieved here.
Looks like a really nice card for the price, and the fact they aimed at 1440p is awesome as that's a huge market. I'm in awe of that 1080 Ti though; it's neck and neck with the 3060 in most games and the 4060 in some. Very impressive card.
Very well done, Intel!!! Drivers will not be an issue; Intel puts a lot of work into developing them.
Welcome to the cool kids' club, Intel.
Sorting those numbers would be easier on viewers. We need to see which card is doing well, and right now we have to read every number to figure that out; if the charts were sorted, we could just check the top 2 or 3.
For a $200 (current value) 1080 Ti, that says how good the card was and is; it will always remain Nvidia's GOAT. It's keeping up with the B580 and even beats it in one of the tests. Not worth 'side-grading' to a B580 for better 1% lows if you have a 1080 Ti.
Proof that the GTX 1080 Ti is the GREATEST OF ALL TIME!
My only complaint about that card is that it's given me no reason to upgrade in the 8 years I've had it! I'm finally planning on a new one with a completely new system build. Longest lasting card in my 25ish years of computer gaming!
Indy works just fine on my A770 after I updated to the new drivers. The B580 I do not have, and probably never will get one. Not at $400.
I'm hoping they release a mid- or high-end competitor. A B770 would be an immediate buy for me. For now I'll probably have to settle for whatever AMD releases, assuming they don't raise their prices this gen.
I have the A770 now; great card. I will get the B770 if they release it.
For the cost this card performs!
Still to this day I couldn't care less about RT; it's just something I don't really notice as I'm running around trying not to die.
Yeah, but if you were WALKING around and exploring you'd appreciate it. It's not a deal breaker to not have, but it's hard not to want if you've ever gotten to play something at a respectable performance level with it on.
The issue with DaVinci Resolve is concerning to me as that is a use case I had planned on. Hardware Canucks tested it with Resolve, but they just rendered an edited video with it. In that test, it did better than the 4060, but not as well as the A-series cards.
I believe that is normal, unfortunately; IIRC Intel said those are the expected results.
I will definitely be chasing this one down as I'm curious too. I'm trying to contact PugetBench to figure out if this is something on the testing side.
-Adam
Probably because of a smaller memory bus than the A-series.
GN had similar idle power issues, but Level1 got idle power at 7.5 to 13W depending on monitor refresh rate. I'm curious about settings related to:
- What is the monitor setup between reviewers (resolution and multi-monitor or not)?
- What are the BIOS and Windows ASPM settings?
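For the Windows half of that second question, the current link-state setting can be read straight out of the active power plan. A minimal sketch, assuming powercfg's SUB_PCIEXPRESS and ASPM aliases (they're listed by powercfg /aliases), where the reported index means 0 = Off, 1 = Moderate, 2 = Maximum power savings:

import subprocess

def current_aspm_setting() -> str:
    # Query "PCI Express -> Link State Power Management" for the active plan.
    out = subprocess.run(
        ["powercfg", "/query", "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Keep just the AC/DC "Power Setting Index" lines from powercfg's report.
    return "\n".join(line for line in out.splitlines() if "Power Setting Index" in line)

if __name__ == "__main__":
    print(current_aspm_setting())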
Going by the Alchemist experience, Intel's idle power is fine with a 60Hz monitor, but it will be significantly higher for anything above that.
@@lokeung0807 Yeah. First of all, power savings require ASPM to be enabled.
Sadly, I'm running 2x1440p displays, with one at 144Hz and so that means I'm constantly drawing 45W no matter what.
@@AlexSchendel I am using an A750 with a 100mhz 1080p monitor, and the GPU draws 39-41W at idle...
@@lokeung0807 I'm assuming you mean 100Hz? That sounds like it should be fine if you have ASPM configured properly.
@@AlexSchendel Yes, the power draw drops significantly if I switch to 60Hz,
but it is high if I switch to 100Hz.
Intel's engineering and driver development are at a high level, which makes them worth looking forward to.
Comparing the A750 to the B580 like "apples to apples" is nasty!
Hah, I recently grabbed a used 1080 Ti for my wife's PC. It was about $135 on eBay, and shipping plus sales tax brought it to around $165 here. I was going to just install my Radeon 6800 in her PC, but I'm waiting for a better $500 GPU. I was tempted to pull the trigger when the 7900 GRE hit $499 a while back, and was tempted by a $620 7900 XT, but I am not in a rush.
Starts at 21:22
Looking at my A770, I feel like Intel owes us early adopters a special discount on a B-series card. I feel like if they'd stayed on the Alchemist cards longer, we'd be in a better place. I'm still dealing with issues in games.
Great Video Guys!
Hi, may I ask if this card can play Halo: MCC and Overwatch? I have looked all over YT and found no results 😭😭
Back in October, I managed to pick up a Palit RTX 2080 Ti Gaming Pro 11GB for £225 UK, which I thought was a bargain. I'd be interested to know how the B580 competes against this type of card.
Here in Indonesia the Intel Arc B580 is around $281 😢 so I decided to buy the Zotac RTX 4060 Ti 16GB AMP model for $381. Is that a good price for an RTX 4060 Ti 16GB?
Really? An hour video?
OK, it's time to OC these competitors and see how much Intel spreads its wings. At least Tom Petersen has mentioned there's some headroom there to be tapped.
Unfortunately testing from websites like computerbase has found the B580 is struggling across pretty much all Unreal Engine 5 games. Hopefully Intel can improve that via a driver update.
So where was the 6700 in the thumbnail?
Whoops, typo! Thanks for the catch, it's fixed now.
-Adam
I used my Intel Arc A750 with a fresh Windows 11 install, put in an Nvidia GPU, then went back to the A750 and had the same issue with the Puget DaVinci benchmark. I guess that's something driver/Windows related. Before the Nvidia GPU was installed there was no problem running the DaVinci standard benchmark, but the extended one couldn't be finished.
Strange cards you're testing against. Why not pick the RTX 3060 Ti 12GB version? Why not pick the RX 6700 XT? Why test it against the A750 and not the A770? Why is there a 1080 Ti when you test RT? So many questions.
Why not just go to one of the many other PC hardware YouTube channels that have tested it against the GPUs you want to see? Nobody has every GPU lying around to test, my guy.
We picked these particular cards because they were the closest in price compared to the B580 (including the used GTX 1080 Ti).
- Adam
The 3060 12GB is €30-40 more expensive, the 6700 XT is unobtainium in the EU, the 1080 Ti has about the same raster performance as a 4060, and the A770 costs more than the B580.
Hi guys! Good job, thank You! Get well Gordon!!!
I don't believe Meteor Lake had the matrix hardware, so maybe just Lunar Lake laptops will support the frame generation.
I just wish some YT channel could do a more general comparison across mobile, desktop and even consoles to give a broader view than the niche DIY desktop market and minute FPS comparisons. The non-desktop gamers are actually a majority these days. Will the Arc get a mobile version? I think there will be mostly flat packages to the gaming kids this Christmas. Reflect that.
Writing this on a four year old laptop with a 16GB RTX 3080 (mobile) that probably still outperforms this new budget wonder.
Where is the game footage with stats?
User: Any data on LLM inference speed or Stable Diffusion rendering?
Blogger: Data? The only data I care about is my kill-death ratio.
Need 24GB and 48GB versions to give Nvidia some competition. Cloud hosting companies are buying up all the 4090s
I came here to say, “caveat”
Currently have a 5700 XT and soon upgrading to a 1440p monitor. I think the 5700 XT can hold up long enough until we get the hard truth about what's coming next. But if I can find this card for $270 still in stock, I'll grab her.
Meteor Lake doesn't have XMX hardware, just Lunar Lake and Arc. The B580 looks good. I have an A770, so I'm waiting to see if a B770 is released.
It has been stuck in development hell and likely won't come out until late 2025/early 2026, if at all. They may just skip it and move on to Celestial.
Intel Arc FTW!
I think the driver is responsible for the higher idle power draw of Intel GPUs.
Why are your title and description localized? I don't want that, and there is no indicator.
(Android app).
It's not mainstream ready if they can't afford to sell very many of them.
I am still using an ASRock PG B550 Velocita, 5900X, and the great 6800 XT. That 16 gigs has been more than enough. My God, that GPU has future-proofed me to this day. Yeah, yeah, I know there are way faster cards out now, but that XFX 6800 XT has held me down for 4K 60 FPS gaming and even higher frames in most games. Like, I just played Hogwarts last night above 60 FPS on high settings in 4K. And forget about it at 1440p; this card still crushes 1440p gaming. AMD truly does give people more for their money, and AM4 is still the gift that keeps on giving to me. I am due for a new build though, it's been 7ish years, but honestly I can see myself getting by on this setup for another 5/6 years. IDK...
My 1650 can finally rest
Please, fix idle power draw. I live in California, electricity is ultra expensive.
See KitGuru's review (22-minute mark); they had to make two settings changes to get the idle power draw down from 36W to 15W: 1. Enable ASPM in the BIOS. 2. In the OS, go to PCI Express > Link State Power Management and select Maximum power savings.
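The OS half of that can also be set from a script instead of digging through the power plan UI. A rough sketch, assuming powercfg's SUB_PCIEXPRESS / ASPM aliases and index 2 = Maximum power savings, run from an elevated prompt (the BIOS ASPM toggle still has to be flipped by hand):

import subprocess

def set_pcie_aspm_max_savings() -> None:
    # Set "PCI Express -> Link State Power Management" to Maximum power savings
    # for both AC and DC, then re-apply the active plan so it takes effect.
    for args in (
        ["powercfg", "/setacvalueindex", "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "2"],
        ["powercfg", "/setdcvalueindex", "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "2"],
        ["powercfg", "/setactive", "SCHEME_CURRENT"],
    ):
        subprocess.run(args, check=True)

if __name__ == "__main__":
    set_pcie_aspm_max_savings()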
Yes, the 1080 Ti. I will take mine to the grave with me.
Would be a lot of fun to profile the games and jump into the pipeline and calls to see what's going on but I don't think that is easy to explain to your audience. Any idea where such content exists, if at all?
That would be awesome for sure, but it's way over my head. I can talk to Will about it, though.
-Adam
I'll buy an Intel card when it's more widely supported for VR and more games have XeSS 2.
Okay Nvidia, let's sell the 5060 for $250 😅
Nvidia is more likely to increase the VRAM to 12GB or more than to lower the price.
Should I upgrade or keep using my RX 6700 XT?
keep your rx 6700xt.
Keep riding that, it's still a damn good card.
-Adam
0:22 B580
Adam fail
Worth upgrading from my RX 6600? 🙏🏻 Please helppppp
No. You want a 60% or more boost before upgrading.
this will be a decent boost in performance, and a very hefty VRAM boost.
if you're not happy with your 6600's performance on games you like to play, then you're not happy. so if you have the $$$, then move on from it. if you are happy with your 6600 or don't have the money to spare, then continue to stick with it.
Definite upgrade over my 5700XT
I would still wait. Better GPUs are coming next year from all 3 vendors. Although it depends on your budget too. Let's see where AMD's lower-end cards land first.
Not really. The 6600 is beaten by this card, sure. But the 6600 was never a "play on ultra" type of card, and in reviews it mostly gets destroyed because of a pure VRAM limitation. Even in some games where the 4060 gets a pass, most if not all games for some reason use more VRAM on AMD cards.
At the settings you're more likely to actually play at, the difference is not that big. If you're buying now, the B580 would be an easy recommendation even with the price difference, but as an upgrade, no.
This card will only get better with driver updates.
No more Brad GPU reviews on camera? Where you at homie?
I'll be on Full Nerd today, I've never been based in SF so the video reviews ain't me!
@BradChacos I know you are a fellow New Englander. Dorchester representing!
@BradChacos I just got hip to you when y'all sat down with Tom... you're a real one.
51:00 your saying we can run a system with this gpu with a 450w psu?
You can run a 7800X3D with a 3070 on a 350W psu without issues.
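For the 450W question above, a back-of-the-envelope check; every wattage figure here is a rough assumed typical draw (B580 board power, gaming package power for the CPU, the rest of the system), not a measurement:

def psu_headroom(psu_w: int, parts_w: dict[str, int], margin: float = 0.8) -> float:
    # Spare watts left after the load, keeping the PSU at or below `margin` of its rating.
    load = sum(parts_w.values())
    return psu_w * margin - load

build = {"B580 board power": 190, "CPU package (gaming)": 90, "board/RAM/SSD/fans": 60}
print(psu_headroom(450, build))  # ~20 W spare while staying at 80% of a 450 W unit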
YOU'RE
No stock, prices raised by retailers and even more by scalpers; wait for AMD.
So they caught up to this gen with a bigger die on a smaller node right before next gen is about to ship from AMD and Nvidia? Doesn't sound competitive to me unless Intel is willing to take the L for possible long term gains.
If this is not going to be a "huge wake-up call for AMD," I do not know what will be...
Why do you say that? Intel is losing money on each card here; negative margins.
Pls check Warzone on the B580.
They are about a generation late
$250.00? Where is it at $250.00?
Yeah the Indiana Jones game has something weird going on with it
So you have seen others reporting problems?
-Adam
Small correction: Intel vendor-locked the DP4A path to their own integrated/dedicated GPUs in late 2022. Everyone else can only use the Shader Model 6.4 path, which is considerably worse than DP4A in performance (FLOAT16 vs INT8), though the same in image quality.
Any XeSS version released after 2022 has the DP4A path locked, so only the DX12 Shader Model 6.4 path is available.
OK, it wasn't a complete failure. I just hope that with how much Intel is losing on these GPUs (a 4070 Super amount of silicon in the B580 😮) they don't just close up shop on dGPUs and drivers. Hope not.
Xe is gonna be HUGE for Intel because of AI and other components in the Intel ecosystem. It's not going anywhere. TAP said the next gen is already done, and the hardware team has moved to the next gen following it. Intel is in this for real.
They're not losing money lol. The 4060s are on 50-class dies and the 70s are on 60-class dies 💀. Intel is just not gimping the card and leaning on AI to make fake frames while charging double what it should cost.
@@Sureflame Maybe not losing, but they're surely not making any money on them either, with such a big die produced at a third-party foundry. Good for consumers, but that's the path AMD took with the Phenom CPUs a long time ago, and it led the company almost to bankruptcy.
@gorky_vk It's cheap last-generation sand though, not bleeding-edge hardware. Nvidia is gearing up to use GDDR7, so that's one reason the card is $250 as an asking price.
Provided they offer respectable VRAM, bus width, and memory speeds on the newer cards, they might blow the B580 out of the water, but they'll be asking $100 more than the B580 lol.
@@Sureflame TSMC 5nm is not cheap. It's not the leading edge anymore, but it's still an advanced node and not cheap at all.
GDDR7 is completely pointless on entry-level cards, and I seriously doubt that entry-level cards from Nvidia will use it even with the next generation. Even if they do, that will be in 6+ months.
I wish they would release a B990. We need more high end cards. The low end GPU market is flooded already with new cards and used cards from previous generations with multiple price choices. Even as is the $250 B580 is sold out on many sites and board partner cards are $100 to $200 more, completely defeating the purpose of a $250 card. Intel really should be aiming higher. There is only 1 high tier card right now and we all know which one that is. Imagine if they released a massive card to compete with the RTX 4090 or a potential RTX 5090. That would really get people talking and people might be more eager to get their high end GPU if they could make it compete performance to price vs the competition. Even generations later, people would still be thinking about getting the older higher end card as the prices go down and it competes in the lower end some years down the road. I think this is a better strategy.
A 4090/5090 isn't just a 'high end' card, it's the best GPU, performance-wise, ever made. You are underestimating what's involved in bringing a card like that to market. Ask yourself why AMD isn't bringing out a competitor to the 5090. Intel are the new boys; they can't just pull a 5090-type card out of their ass that's also cheaper, with features like great ray tracing, etc. They are trying to gain market share whilst also improving their development work.
@@crymore7942 It's all about scaling. More processors.
Buy a 7900XTX
The die would be impossibly expensive to produce. The B580 is delivering 4060 performance with the die size of a 4070 Super; rivaling 4090 performance with the current architecture would be unsustainable, and even for a halo product it would be ridiculously expensive.
@@4m470 That's an upper mid tier card.
Why is nobody trying to run PoE 2 on it, bruh? I wanna know XD
Interesting 😂 both the game and the DaVinci plugin named with "Black" have problems ❤ Is "black" some key spell word that breaks the magic of Battlemage? ❤
Apparently everyone forgot the 6700 XT existed.
Not the same price point man
JayzTwoCents had a 3070 in the benchmarks.
Still an interesting point of comparison. It was rather cheap at one point, and it seems like this should have comparable performance and the same VRAM, with much better RT.
6700 XT performance with better RT for less is quite compelling. @@NolanHayes-b8w
One question, buddy: here in my country I am getting the RX 6700 XT at the same price, and even a little cheaper, than this one. Which one should I get??
What's the point? The title & description are in my native language, but there aren't even subtitles. What's the point of doing this? Idi*ts...
Ah, also for Wukong the engine is the RTX branch for UE5, so it definitely does run better on Nvidia (the game just got another optimisation update for Nvidia, albeit for RT lol)
The benchmark is your EYES and ears, not data on a screen. When you go down the FPS-and-data-on-a-screen route, you are always going to see things that do not affect your gameplay. Eyes and ears, people.
Stop saying ah or uh
Linux?
No games no thanks
00:05:48 Test machine specs. Every review makes this apples-to-oranges mistake: an AMD CPU doesn't give the best result for Intel Arc. The reason is the different architecture. An AMD/Nvidia GPU gains roughly ±5% from ReBAR/E-cores, zero, zilch, nada; Intel Arc gains +30% from ReBAR/E-cores, optimized for Intel CPUs (hence the Intel Processor Diagnostic Tool). Intel Arc is only as good as the CPU it's paired with; a better CPU pairing means better performance from splitting tasks with the CPU. Understand that with an AMD/Nvidia GPU the CPU only feeds the GPU, which does all the tasks with queuing and waiting, and that results in broken frames. So: a good Intel CPU for an Intel GPU, and an AMD CPU for an AMD/Nvidia GPU.
When you understand the two different architectures, you also see how misleading these tests are. As a rule it's Intel with all good frames versus double the frames but half of them broken, hence the high wattage spent replacing the broken frames, which can be half if not more, and this causes stutter and a bad experience from an AMD/Nvidia GPU. In the end these tests are worthless; comparing the experience is what matters. For example, at 4K the A770 beats the 4070 Ti with half the frames.
We're not here to give Arc the best result, we are here to see how it runs using our testing scenarios. It's important that potential buyers see this GPU will run across different kinds of test systems, which is why it's always good to watch multiple reviews and hope everyone is testing on something different.
- Adam
@@pcworld Your testing scenarios are misleading, giving the competition an unfair advantage without pointing it out. To your credit, at least it was mentioned that the experience was somewhat better than, and different from, the competition, numbers aside.
Yes, it is good to watch many tests, but YouTube mostly just parrots other channels, with few exceptions, and you really need to dig deep to find reviews that take the different architecture into account. It is easy to just show numbers, even though they have very little meaning for real performance.