NVIDIA just needs to scale back pricing on their lower-end cards and buff them heavily. AMD still prices their products fairly. NVIDIA Blackwell will be legendary only with the RTX 5090 and RTX TITAN AI; the rest of Blackwell NVIDIA nerfed deliberately. NVIDIA does this to push their premium products, and people willingly take the bait. If Intel exits the GPU market, we as consumers will all pay the price.
@@iLegionaire3755 They kinda have to. They can't afford to undercut their datacenter/workstation cards (where most of their money comes from) by releasing a 5080 that can outperform them on a price/performance level. To put it in perspective, a 5090 is expected to retail for $2000, while getting a current-gen Ada card with 32GB of VRAM costs $5000, so even it is going to bite into their workstation market in a huge way. If the 5080 had more VRAM or performance than the leaks show, it would hurt the profits of 95% of their market for the sake of giving 5% a good deal (and of course it would be bad for that 5% too, because you'd never actually be able to buy one; they would all be instantly bought up by datacenters). Even as spec'd, datacenters using cards for image-generation farms will likely snatch up 5080s as fast as they can, since 16GB is enough for SDXL and other high-res generation engines.
@@LordApophis100 Yeah, especially when comparing the silicon to an RTX 4070 Ti, which is only 22 mm² smaller on the same node; that makes this GPU microarchitecture disparity even bigger than the AMD/Nvidia gap of the 2017-19 period. Granted, this is Intel's second ever consumer dGPU, and a hefty improvement over the first.
They priced it at the point necessary for them to have a win. For all the real success they've had with R&D, this could have been way worse for everyone at the wrong price, and they just needed to get people on their side.
@@vicente3k People are buying it, all the positive reviews will definitely help. Whether Intel will make any money from it and commit to the GPU market long-term is another matter entirely.
Or maybe it is an architecture thing. It might be just me, but it seems like Battlemage performs really well in games that mostly rely on raster performance while losing some ground in compute performance. I don't know how compute scales with resolution, but if it is not linear like raster, then it might be that at 1080p the card is mostly compute bound while at higher resolutions this bottleneck diminishes, making it look better. I guess we'll find out with future videos!
@@Chicagoben The RX 590 from 2018 is similar. Doesn't always like low settings and prefers medium or high settings otherwise the cores throttle down due to lack of load.
We don't know how much money Intel is making, so we can't really make statements like that. It would be cool for us if they were cheaper, absolutely, but we can't just say they must be $200 tops, because we don't know if the companies would be able to stay profitable.
Time to upgrade from my RX 580 to the B580. Overall, it looks good. I'm super pleased that there are no significant problems with drivers. In the conclusion where you said the RTX 4060 should go from $300 to $200 I burst out laughing. We all know it won't happen. I hope gamers won't buy overpriced Nvidia GPUs.
@@christophermullins7163 yeah I bought last gen and I both love it and hate it at the same time. Which in my book puts it into pretty good territory, especially considering the fact that I assumed this would be the case.
@@haukikannel Anybody paying $500 for an 8GB piece of junk in 2025 is more than likely the product of a marriage between first cousins. Yes, Nvidia owns the market, but people still want value. Nobody should be paying more than $450 for 12GB cards now either.
@@whiteroserepublic1699 People still want value? Nvidia showed us with Lovelace/40 series that gamers will whine, but end up ponying up the cash anyways. 40 series was the most insane, egregious, greedy move in GPU history, and it was our opportunity to refuse to buy and show we wouldn't take being screwed like that, but people just gave in and bought.
That's what I'm interested in. I'd guess around a 4070 Super level of performance if they can pull it off, but from the rumours it doesn't sound like it's doing well and might be cancelled. If only they could manage 4080-level performance for $400 USD... that would be a must-buy for me.
@@gameurai5701 It's OK; the 6700 XT, with slightly lower performance, draws 45 watts more. The 7600 XT draws about what AMD states, around 5 watts more, and has worse performance.
@@BatteryBee Are there, though? The RX-6600XT to RX-7600XT situation is the same on the Radeon side, even worse is the RX-6800XT to the RX-7800XT. Or are we forgetting that AMD rebadged the actual RX-7800XT to the 7900XTX to sell it for £1000?
European pricing… You take the MSRP, you test the GPU in a few gaming benchmarks, you find the most expensive GPU that's about the same speed, and you set that price as the retail price! Do you remember the time when AMD tried to compete on price? Retailers priced the AMD GPUs close to Nvidia's and kept the difference in their own pocket. AMD learned from that and doesn't compete on price anymore… because it is useless! 😢
This card is the same price as the RX 590 released in 2018 (which I own). So if you account for the fact that its better and crazy inflation over the past few years, this card is being priced much LOWER than it could be. If you are in the market, I would buy it and not look back.
22% VAT magically turning into 32%, even though Intel pays something like 16% VAT thanks to exemptions in Ireland. So they are pocketing an extra 16% on any European sale. Neat.
@jayclarke777 Tariffs would make the card cheaper though, plus Intel is an American company, and local products get cheaper. All of Intel's fabs, last time I checked, are here in the US.
@@xenosarcadius1198 They fab their chips at TSMC, like every other GPU maker... They're not fabricated in the States. Plus the CHIPS Act is pretty much dead from "DEI".
Here in Bulgaria, the price is 320 euros. It’s not 250 dollars, but it’s still the best you can get for that amount of money, plus an extra 4GB of RAM. I think it’s worth it.
Keep in mind that US prices are without tax, while most if not all countries in Europe show prices with tax included, so depending on how much tax Bulgaria has on GPUs, the price might not be that much higher than the US price.
@@MrMeanh Even then the price is not comparable. $249 translates to 280€ at Germany's 19% tax rate, for example, but it's closer to 320€ here for basically no good reason, even though Germany usually has the best prices outside the US.
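The conversion above can be sketched out; the USD-to-EUR exchange rate below is my assumed placeholder, not a figure from the thread:

```python
# Rough US-MSRP to EU-shelf-price conversion. US prices exclude sales tax;
# EU prices include VAT. The 0.95 USD->EUR rate is an assumed placeholder.
def eu_price_estimate(usd_msrp: float, vat_rate: float, usd_to_eur: float = 0.95) -> float:
    return usd_msrp * usd_to_eur * (1 + vat_rate)

german_estimate = eu_price_estimate(249, 0.19)   # ~281 EUR
street_premium = 320 / german_estimate - 1       # vs actual ~320 EUR listings
print(f"Expected: ~{german_estimate:.0f} EUR, street premium: {street_premium:.0%}")
```

With these assumptions the expected German price is about 281€, so the quoted 320€ listings carry roughly a 14% premium over the tax-adjusted MSRP.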
@@ridleyroid9060 Wow, in Romania the low-end cards (Palit, Gainward) are around €300 for the RTX 4060 and €262 for the RX 7600; we don't have the new Intel cards yet. The best-priced card right now is the RX 6750 XT 12GB from XFX at €342. Serbia looks expensive.
I'm so tired of this rhetoric. "I'll wait for Intel to release a good GPU so I can buy Nvidia cheaper." God, it's so annoying. Just support Intel by buying Arc if you think they offer you good value.
Not really. They'll only be pressed if gamers actually buy enough of these to turn them into a real threat, and we all know how that will go. Gamers will ignore this launch, wait for the 5060 and buy that instead regardless of whether it's good or not.
@@BusAlexey People will buy cards that seem good to them at a price they're willing to pay. No one buys a graphics card to "support a company"; it's dumb to think that way. It's not your local mom-and-pop shop that needs support, it's a global powerhouse.
That's basically the same thing as saying it's designed for 1080p. You don't design a GPU for a specific resolution *with* upscaling; that's not a thing. Upscaling just uses a lower resolution. Either way, GPUs aren't really designed for specific resolutions in the first place. As always, a bigger memory bus/bandwidth means something is better at higher resolutions. The MARKETING then gets targeted toward whatever they think is the best market for certain products, but it's not something they're thinking about during architectural design. Game demands are fluid and vary significantly by title.
@@maynardburger You're not exactly correct; remember that when rendering higher resolutions you can have more things running in parallel, so it can matter what resolution they were aiming at.
Hopefully Intel has proper supply and regional pricing all over the world, filling the price range that AMD and NVIDIA have totally abandoned is a big opportunity.
It won't. Small supply with a high percentage going to the USA. The card is already being scalped. Most of the world will never see the US MSRP. Also same for AMD/Nvidia cards.
@@Alcatraz760 Looking through the comments, it's already 320/350 in various parts of Europe; I'm sure the additional taxes wouldn't make up that difference. It's not yet available even for pre-order in my region; even something like the 9800X3D took a week to become available here despite being close to manufacturing and shipping routes. It may still be worth scalping even in the USA, depending on total supply.
Here in Brazil they released the A770 and A750 almost six months after the global launch and they were more expensive than the RX 6650 XT and RTX 3060. So nobody bought those.
@@RafitoOoO Same issue in the Philippines, although we're underserved by both AMD/Nvidia they'll still have more supply than Intel and prices that are same or cheaper.
I'm not sure. It seems the average consumer doesn't even know what the Arc GPUs are. Despite how good the card is, they will pick Nvidia because they recognize it, maybe AMD, but even AMD suffers a bit there. In prebuilt PCs I think they will do well though.
They run the speed test of the GPU and set the price to that. Same reason AMD ended price competition: it is useless. Resellers will harmonize prices for higher profit anyway, so if you sell at a low price, the reseller just gets a fatter profit. Better to set about the same price already and call it a day, because resellers destroy every opportunity for price competition!
It's a win because of how they had to price it, which is at a level that got him fired, because it's not actually competitive unless it's literally a loss leader.
@@rotm4447 To be fair, that's what a new player in an established market always has to do: sell at a loss at the beginning to build a customer base. The problem is that Intel is already hemorrhaging money, so they might not be able to afford that strategy the way they could have before Ryzen came out.
Yeah, if only the guy who negotiated himself a golden parachute before announcing that 15,000 people are out of a job due to his own mismanagement gets to have a sentimental moment...
They shouldn't celebrate this… A product that does not produce a profit is not a win! The B580 would need to be at least 50% more expensive for Intel to call it a win… Would you buy this at $450?
@@haukikannel A working gaming dGPU division isn't built overnight. It's a journey before they can sell at high margins, but this launch shows their graphics solution is working and great.
@@TheReferrer72 The 3060 Ti has no frame gen support and just 8GB of VRAM; even though it's faster because of bus width, it will start falling short more and more over time. The B580 has both of these addressed.
@@psygreg Frame gen is pointless under 60fps, and if you really want it you can use FSR frame gen... Anyway, for 1080p or 1440p with DLSS there is no point. Good luck addressing drivers in older games.
@zbigniew2628 The 4090 can't even do 4K60 without it in Stalker 2 💀. Many companies are making games that almost require it, so it's far from useless even if it's not ideal.
Hope this will give AMD a good incentive to offer really good value low-end products for the 8000 series, or just as well, some juicy discounts on 7800 XTs and downwards. But it remains to be seen if Intel even produced these things in volume that actually threatens the other two. Nvidia probably doesn't care either way, as long as people are buying their actual money makers in the high end, which won't have any competition in the near future.
Better than expected, but not really impressive though. Same performing 6700XT/6750XT cards are on sale for 300€ for well over a year by now. If the B580 really can be had for under 270€ it will be an option worth considering.
@@loganmedia1142 I hope so! But right now the 7600 is at 250€ and it was on sale for 230€ last week. B580 is at 330€ right now. Sadly, it doesn't look promising so far.
The 6700 XT is NOT available in most countries outside the USA. In fact most GPUs are unavailable at this time in Canada, even the 7000 series cards that are worth buying, plus those cards cost 500-700 dollars, while this card will be 400 after tax.
@@Fishy-i2g Prices vary of course. Here the 7600 is 250€ and it was 230€ on sale just two weeks ago. 6750XT is 330€ right now and it is on sale very regularly at or below 300€. B580 is 330€ right now too. So if Intel doesn't lower prices immediately they can't even compete with 3 years old cards. And RDNA4 is only 1-2 months away.
*Space Marine 2* is experiencing issues with DLAA and TAA, failing to deliver the expected FPS performance. The game only achieves the correct FPS when FSR is enabled for anti-aliasing.
Weird. I bought mine like 2 months ago. In ALL the tests I saw, the 6700 XT was on par with or ahead of the 3060 Ti, mainly at rasterization, WAY ahead of the 4060 and 7600, and a little under the 4060 Ti. But these tests are... weird. Nvidia must have done some crazy stuff with the drivers (The Last of Us at 1080p ultra), and they used presets that enable RT, as he said. I don't like DLSS and frame gen. I like to see it native, benchmarks showing the hardware's real power at a REAL resolution with real frames. I think that's why they included BF2042, Warzone and some other tests.
Just doing some napkin calculations, a hypothetical B770 with 28 cores at a clock speed of 3,180 MHz could beat the 4070 in raster and maybe match the 4060 Ti in RT at 1440p. They could charge $375 for it. An 8-core increase corresponds to a 2-render-slice increase, which is the increase from an A580 to an A770. The clock speed difference between those cards was 23.5%, so a 19% increase seems doable. The price difference between the B580 and B570 is $30, which corresponds to $15 per core, so going up 8 cores would put the price at $370. Of course this all assumes linearity, which is unlikely to hold, but it might not be that far off. I tried to account for that by finding a clock speed that would beat the 4070 and match the 4060 Ti at RT, then added $5 to the price.
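The napkin math above can be written out; every input (core counts, prices, the +8-core step, linear scaling) is taken from the comment's own assumptions:

```python
# Reproducing the commenter's linear napkin math for a hypothetical B770.
b580_cores, b580_price = 20, 250   # B580: 20 Xe cores, $250 MSRP
b570_cores, b570_price = 18, 220   # B570: 18 Xe cores, $220 MSRP

price_per_core = (b580_price - b570_price) / (b580_cores - b570_cores)
extra_cores = 8                    # +8 Xe cores = +2 render slices (A580 -> A770 step)
b770_estimate = b580_price + extra_cores * price_per_core

print(f"${price_per_core:.0f}/core -> B770 estimate: ${b770_estimate:.0f} (+$5 fudge = $375)")
```

The $30 gap over 2 cores gives $15 per core, and 8 extra cores lands the linear estimate at $370, matching the comment's figure before the $5 bump.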
@@justinway153 It could very well be the case. The B580 has 2560 cores and 456 GB/s of memory bandwidth, yet manages to be over 20% faster than the A770, which has 4096 cores and 512 GB/s of memory bandwidth. If they make a B770 with the same core count as the A770 but with, let's say, 20 Gbps GDDR6 for 640 GB/s of bandwidth, it would probably be much, much faster. Unless you're talking about the rumor of Intel not making a B770. That would be a big fumble, and a massive one if they actually quit the GPU market after Battlemage.
Glad Intel's cooking. Once the two players in a duopoly don't have a reason to go after each other anymore, they settle on prices that don't benefit consumers. Intel will stick it to them, which is just fantastic.
What an amazing review as always, Steve! I liked that you tested the "worth-using" / "significant visual upgrade" ray tracing games in a separate category! I think the B580 is a decent GPU at launch, but it will face real competition when the RTX 5060 and RX 8600 arrive, **if** their prices don't exceed $300 USD and they have at the very least 12GB of VRAM.
Intel needs a success, and although it's not specifically the product I have been hoping for (I wanted to get myself the B770 instead), it's still a win, especially considering the great scalability at 1440p. I am curious though whether the 1440p scalability is something they planned from the beginning or rather a lucky coincidence; hopefully Tom Petersen from Intel can enlighten us in his very casual style. 🙂 Good job Intel! 👍
It could be that they were careless and the game switched quality settings to medium. Gamers Nexus exposed LTT for the same thing, even though it actually happens at GN and HUB from time to time too.
I use an RTX 3070. Ultra textures in Veilguard fill up the VRAM pretty quickly, and the game starts to run at low 40s/high 30s with stutters in the first big open-world area. Lowering the textures resolves the issue, probably a result of the 8GB of VRAM. Your numbers show pretty high results for the 8GB cards; you're probably using the opening area or not running the game long enough to witness it. As the game runs, VRAM memory cycling becomes worse and worse.
That is true, and as expected they beat competing AMD parts quite comfortably in RT benchmarks. The only issue with that is, it's irrelevant, at least until they release a much higher-end product. At this performance tier RT performance is still meaningless, just like it is for the 4060. In most games where RT actually matters in terms of visual uplift, you wouldn't enable it. A stutter fest of pretty frames at 1080p with upscaling is better than a slide show of them, but still not anything you'd want to play with.
Now I need to see how it handles PCIe 3.0 on these 8 lanes, preferably on some lower-end CPUs like the 5600 etc. Edit: I've got a 5600 with a B450 Tomahawk, which AFAIK doesn't support PCIe 4.0, which makes me wonder if there would be any penalty with x8. Making an educated guess, this shouldn't be a problem; it's not a high-end card, so it should be fine. But nonetheless, I would love to see these sweet benchmarks.
It's just insane that we're almost in 2025 and we're still talking 1080p for new GPUs. Overpricing GPUs due to pure greed from nVidia with AMD just following suit and consistently tripping themselves over got us to this stupid point. Monitors have gotten so far ahead with both high resolutions and high refresh rates being rather easily obtained and with decent image quality, but here we are with (almost) no reasonably priced hardware to run even just some older (but better) AAA games well unless you consider buying used previous gen higher-end GPUs. Intel really is onto something here and we know exactly what they're trying to do, so here's hoping they actually succeed. And if they manage to get the B770 and B780 out, they should absolutely stay below the $500 mark.
Even PS5 games are struggling at 1440p and are dipping to 1080p using DRS. 1080p is super demanding these days because higher quality textures, increased polycounts and now RT effects are also a factor, for instance Indiana Jones requires RT. Not to mention devs don't optimise their games these days like they used to. As much as I wish 1440p was the minimum resolution today, it's also that 1080p is not like it was in 2014, it requires way more compute now to render a 1080p image, otherwise the RTX 2060 would be enough even today to play games at 1080p 60 FPS.
@@NANOTECHYT I get that games are more demanding and less optimized than ever, but when taking a look back at some AAA games from 2017 with Assassin's Creed Origins, for example, being one of my favorite games from the 8th generation of consoles, these mid-range GPUs still aren't powerful enough to run it at like 1440p 120 FPS or even higher and it really is a game that's just so insanely more enjoyable at higher resolutions and frame rates, I was astonished seeing so much more detail the first time I played it at 4K. And it's not that the game isn't optimized well for it to be so GPU demanding, for a Ubisoft game it's actually pretty impressive and can easily run at high FPS with any modern CPU featuring at least 6C/12T, but the mid-range GPUs are just so lacking in both memory bus, bandwidth and processing power and it all comes down to greed of the big 2 GPU manufacturers - AMD and nVidia - being the reason we can't even enjoy older titles at high resolution and frame rates without spending a ton of money on a GPU.
@@garminizator Origins was a hard game to run at the time, I remember running it at like 60 FPS with my GTX 1080 and 7700K when it released and probably because it's an Ubisoft title it has problems scaling. Ubisoft's engine for AC titles was pretty bad after AC Unity.
Seriously, imagine if in 2017 we were getting new GPUs in the $200 range that were only good for 720p gaming. That's basically what's happening right now. 1080p is NOT next-gen. These GPUs are a step forward for consumers, but we're still in a pretty lousy state.
@@Uthleber He also seems to have trouble remembering that everyone hated the 590 when it launched because it was pointless over the 580. But sure, it's us being confused, right? Btw it's not very compelling anywhere in Europe: they are charging an extra 32% over MSRP when the average VAT is 22%, not to mention Intel only pays around 16% VAT in Ireland, so we're literally paying doubled fees.
Nice to see Intel finally get a win. Actually good VRAM amounts and performance at a fair price, something we would never see in a world where only Nvidia dominates. Imagine finally having a competent 1440p card under $250 MSRP. Hopefully Intel gets rewarded for a quality product and can be a viable third dGPU competitor going forward.
In France the B580 is priced around 360 euros... joy. I would have grabbed it for my nephew or myself if it was $250 + VAT, but last time I checked our VAT wasn't 45%. That places it at 6.31 euros per frame while the 7600 XT is at 6.5, so better, especially with better RT and upscaling, but it is a year late for it to really matter, sadly.
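The euros-per-frame figures above come from a simple ratio; the average-FPS inputs below are back-solved placeholders chosen to reproduce the quoted 6.31 and 6.5 values, not real benchmark numbers:

```python
# Cost-per-frame as used in the comment above. FPS inputs are illustrative
# placeholders back-solved from the quoted euros-per-frame figures.
def cost_per_frame(price_eur: float, avg_fps: float) -> float:
    return price_eur / avg_fps

b580_cpf = cost_per_frame(360, 57)      # ~6.3 EUR/frame at the French price
rx7600xt_cpf = cost_per_frame(325, 50)  # assumed price/fps pair giving 6.5
print(f"B580: {b580_cpf:.2f} EUR/frame, 7600 XT: {rx7600xt_cpf:.2f} EUR/frame")
```

Lower is better here, which is why the B580 still edges out the 7600 XT on value despite the inflated European price.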
Ok, colour me impressed. That's way better than I expected and an actually OK product. Certainly compelling value. Now if it only were more consistent. There's still the issue that almost all recent releases in Steve's benchmark selection are underperforming, suggesting that you are still at the mercy of them updating their software stack regularly.
Looks like pretty solid GPU, if you can get it at MSRP. I know for a fact that retailers in my country will just price match it to 4060 to eliminate competition.
Very good effort from Intel here; they've certainly improved a lot since Alchemist! I would be curious to see some driver overhead testing, as there are still a few weird results. I would also point out that in other markets its price isn't nearly as competitive. For example, in the UK a 4060 is ~£250, the same as the B580, and the 7600 is ~£230. Still tempting given the 12GB of VRAM, which otherwise you could only get with a 3060 for ~£240 while there's still stock, but with a decent average performance advantage over the 3060.
There's a €100 difference in Germany to a 6800 or 7700 XT, and it's new. The price of those will drop by around €50 in a couple of months, and then you can't even compare it anymore.
Great review. I'm so pleased Intel managed to improve so much. Now we can all say that they're a true contender. I'll be upgrading out of the xx60 tier in the next few months hopefully, but if I was looking to replace an aged 1060 or 2060, the B580 is exactly where I'd start looking.
7:05 No one plays War Thunder maxed to the nines. You just can't see anything through the damned animated grass, the heat haze and smoke, or your teammates' exhaust.
That's great, but they're not benchmarking games, they're benchmarking GPUs. These should not be used as a guide for what kind of framerate you'll get in specific games.
I'm so happy to hear how good the B580 is! We needed this GPU and I'm eager to see what else Intel has coming this year. I'd love to see Intel commit to, and own this part of the GPU market
EU bros: I got an RX 7600 for 238€ from Germany recently, and that wasn't a particularly rare deal. The B580 costs 320€ there. In Finland, the RX 7600 right now costs 260€ and the B580 is 340€. A 30% premium for 8% more FPS and the same 1% lows @ 1080p. Dead-in-the-water pricing.
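A quick check of the Finnish numbers above; the "8% more FPS" delta is the commenter's figure, not my own benchmark data:

```python
# Value comparison at the quoted Finnish prices (260 EUR RX 7600, 340 EUR B580).
rx7600_price, b580_price = 260, 340
price_premium = b580_price / rx7600_price - 1        # ~31% more money
relative_value = (1 + 0.08) / (1 + price_premium)    # fps-per-euro vs the 7600
print(f"Premium: {price_premium:.0%}, relative value: {relative_value:.2f}")
```

At those prices the B580 delivers about 17% fewer frames per euro than the RX 7600, which is what makes the comment's "dead in the water" verdict add up.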
How about look at it this way. The Intel B580 is being sold at the same price as the AMD RX 590 was in 2018, for better performance. Considering the rampant inflation in Western Countries and that this will perform better than the RX 590 /1060, the pricing is actually TOO LOW.
I don’t know if this is the 5700 XT upgrade I have been waiting for, I was hoping for above 60 FPS on modern titles at 1440p. Maybe I’m asking too much.
I feel as though this card really shouldn't be positioned as a 1440p ultra settings card, however it's been advertised and then tested in that scenario in order to show off the extra VRAM it has, over budget cards.
Yeah, I was sitting on an older card, and a monitor that could only output 144Hz at 1080p with dual link DVI, so I bought a 27 inch 1440p 144Hz monitor early 2023. And then I bought the most expensive card I have ever bought, a 6700 XT, and it really isn't enough to deliver a good experience for modern games with that monitor, so I mostly find myself sticking to older titles still, or indie. Makes me wonder if that is how/why you see so many like GTX 1060 and so on, people going "yeah, am I gonna spend $400+ to still not be able to get 60 frames in newer games, guess ill stick to minecraft and counter-strike and this card for another couple years". Well that card can play more impressive games than that but you get my point :)
I mean, the 5700 XT was double the price if you adjust for inflation, so yeah, the fact this easily beats it and offers more VRAM and features is great already!
Realistically you'd customize your settings to medium-high depending on impact to graphical fidelity with a reasonable cost to performance. E.g. in Cyberpunk, if you follow the HUB guide on optimized settings, you can expect +18% performance vs high preset. At $250 new (in USA), can you find a better deal for 1440p 60 FPS in modern titles? Also, the 5700 XT was ~$450 new in 2019, so you're down two price brackets.
This is definitely a great sign that Intel is doing good work behind the scenes to make this series of graphics cards meet expectations, and not just drop out of the GPU market in a few years!
@@srobeck77 You are not a details guy. The 1060 is/was the most popular budget GPU as per steam. The OP was alluding to the fact that this will be the 1060 of our time.
@Fishy-i2g Once again, you're missing the actual details. This is not a good look, fella. The B580 performs at or above the 4060. No one cares about your retro GPU from 8+ years ago.
It is nowhere near the jump from the GTX 960 to the 1060 6GB... And sales are not going to be great because of uncertain drivers and being close to the RTX 3060 Ti... Anyway, it is great that people fear 8GB of VRAM, because the RTX 3070 can be cheap sometimes and new games are often a pile of trash. All I want is RTX 3070 performance with 12GB of VRAM and good drivers for older games.
I think Intel just walked in and made a name for themselves in the GPU market. A new stock recommendation from me. Just keep those driver updates coming.
It's all about price. If it comes to market cheaper than the RX 7600 and 4060, like $30-50 cheaper, then we can think about it. If the price is the same, I'll skip it. In 3-5 years you're not going to sell it for a good amount of cash, unlike those who bought an RTX 3060 or RX 6700 back then.
This is how I know this channel is honest; finally something for the customers to celebrate, maybe even better if they get better drivers. Not saying it's perfect, but finally we have a new player with a somewhat aggressive price for the mid-range, which was supposed to be low end long ago.
Nothing special really; the 3060 is only one generation old... I am still using an RX 590, which is 3 generations old, soon to be 4, from 2018, and I still get 60-90 FPS.
Now to complement this review we need a video about any issues regarding compatibility with older APIs, emulation and software (drivers). If it's on the same level of AMD it's a major win for consumers.
I'm impressed; in Intel's launch presentation they had tested with XeSS on, and the B580 is around that performance with it off. However, this may be too little, too late. They needed to launch the full lineup, as they aren't going to be able to justify prices once RDNA 4 launches, since AMD undercuts Nvidia anyway. If the B770 landed between the 4070 Super and 7800 XT, it would sell out at $400-$450.
I'm buying this for my backup/living room rig, thank you Intel. Can't believe I'm saying this. I went from the GOAT GTX 1080 Ti at $500 for 1440p to an RX 7900 XT at $650 for 1440p/4K last year. I'll be happy to give $250 for this 1440p card. Shit's already sold out on Newegg...
It's going to be brutal if prices drop when new low-end GPUs launch, if they perform better of course. But they'll launch very late, probably 3 to 6 months from now.
Finally, Radeon is no more! An unsaid thing in the review: this thing actually has functioning, high-quality encoding and fairly decent productivity support, unlike Radeons, so its viability vs GeForce is quite good. The "odd" performance scaling from 1080p to 1440p is very likely due to the memory subsystem. The B580 has by far the most memory bandwidth of it, the 4060 and the 7600, holding it up at 1440p, but it also has the least cache (18MB, vs 24MB on the 4060 and 32MB on the 7600), so it's held back a bit at 1080p because of that.
A few notes about the review: The 12 game average is not an average of ‘raster’ performance, it’s a mix of ray tracing and raster performance. We’re now at a point in time where there are games that exclusively use RT effects, or enable them with higher presets. In this example Star Wars Outlaws is an RT only title and Dragon Age: The Veilguard enables RT with the ‘Ultra’ preset. So the 12 game average is meant to be an overall representation of modern gaming, whereas the RT section is a look at RT performance in games where it really makes a visual difference (based on our testing and opinions).
The review is weighted more on the 12 game average data, which we also use for our value analysis. Moving forward I’m sure we’ll see more games included which use RT effects.
thank god
Even understanding the point, I think the average percentages out of 12 games could be given without mixing in RT results... Maybe another 12-game average with only RT should just be added... Anyway, thanks for the good test!
Why are there Starfield and Star Wars Outlaws (flop game) benchmarks and not ones for No Man's Sky, which has a much larger player base?
Or games like Palworld, when going for more recent IPs?
This is not an insult, just curiosity. Shouldn't games with higher active player bases in a certain genre be more important?
Also a small note: the 4060 and 7600 are not just limited by their VRAM but also by their memory bandwidth. The 4060 has lower bandwidth than the 1660 Super, while the 7600 is the same as the RX 5600 XT.
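For anyone who wants to check that claim, effective memory bandwidth is just per-pin data rate times bus width over 8; the spec figures below are the commonly listed ones for these cards.

```python
# Effective memory bandwidth in GB/s from per-pin data rate (Gbps) and bus width (bits).
def bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

# Commonly listed specs: (data rate in Gbps, bus width in bits)
cards = {
    "RTX 4060":       (17, 128),
    "GTX 1660 Super": (14, 192),
    "RX 7600":        (18, 128),
    "RX 5600 XT":     (12, 192),
}
for name, (rate, bus) in cards.items():
    print(f"{name}: {bandwidth_gbs(rate, bus):.0f} GB/s")
```

Which works out to 272 GB/s for the 4060 vs 336 GB/s for the 1660 Super, and 288 GB/s for both the 7600 and the 5600 XT.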
Imagine telling someone 5 years ago that Intel would be more competitive in GPUs than CPUs
Crazy
It's not competitive. This thing is a 4070 Ti that runs like a 4060. Intel loses so much money selling this 😂
@@alanliang9538 Oh shower us with knowledge, I assume you have insight into the cost of production for Intel that we plebeians do not
What they lose in profit, they gain in market share. Like the early days of AMD with Ryzen @@alanliang9538
@@alanliang9538 When people say competitive they mean cost per frame, nobody cares about die efficiency. That's completely irrelevant for the customer.
The Return of the "580"
We had the GTX580
Then the RX-580
Today, we bring the Arc B580!
Why did I get emotional with this? I think I felt old and also remembered how it felt to be excited by a good midrange release that I could afford
RGB 580
Intel actually doing well at this price point is a big win for people, GPU market has been utter shit for so long now.
Because you said that, I think it would be presumptuous for Arm to name the first gpu 580
I hate the name just because it is close to AMD motherboard names
Big finger to Nvidia, 12GB is for entry cards. Well done Intel.
Vram clearly doesn’t make a difference so the issue is way overblown. Optimization is about way more than that.
@@Deytookourjurrrs Exactly... the 4-year-old 8GB 3070 (and 3060 Ti) are putting up higher numbers than the 2080 Ti (11GB) in some titles.
It beats it in non vram limited scenarios . . .
@@jayclarke777 There are no results for 2080Ti in this video.
@@Deytookourjurrrs VRAM matters a lot and it's been proven time and time again. Why do you think the used market is flooded with 3070s and 3080s?
I hope this is a massive success. NVIDIA and AMD need a kick up their asses.
Just NVIDIA needs to scale back pricing on their lower end cards and buff them heavily. AMD still prices their products fairly. NVIDIA Blackwell will be legendary only with the RTX 5090 and RTX TITAN AI, the rest of Blackwell NVIDIA nerfed deliberately. NVIDIA does this to push their premium products and people willingly take the bait. If Intel exits the CPU market, we as consumers will all pay the price.
@@iLegionaire3755 They kinda have to. They can't afford to undercut their datacenter/workstation cards (where most of their money comes from) by releasing a 5080 that can outperform them on a price/performance level. To put it in perspective, a 5090 is expected to retail for $2000, and to get a current gen Ada card with 32GB of VRAM you need to spend $5000, so even it is going to be biting into their workstation market in a huge way. If the 5080 had more VRAM or performance than what leaks show it to have, it would hurt the profits of 95% of their market for the sake of giving 5% a good deal (and of course it would be bad for that 5% because you'd never actually be able to buy one because they would all be instantly bought up by datacenters). Even as spec'd, datacenters who are using cards for image generation farms will likely be snatching up 5080s as fast as they can, since 16GB is enough for SDXL and other high res generation engines.
Especially Jensen needs a wake-up call that there's indeed still demand for their products on desktops, but at much lower prices.
Intel sells them at a loss, Nvidia and AMD can just wait for them to run out of Money.
@@LordApophis100 Yeah, especially when comparing the silicon to an RTX 4070 Ti, which is only 22 mm² larger on the same node; that makes this GPU microarchitecture disparity even bigger than the AMD/Nvidia one of the 2017-19 period. Granted, this is Intel's second-ever consumer dGPU, and a hefty improvement over the first.
Intel needed a win. And whatever we may think about them historically, gamers needed it too.
They priced it at the point necessary for them to have a win. As much real success as they've had with R&D, this could have been way worse for everyone at the wrong price, and they just needed to get people on their side.
Nobody will buy it and I'd bet the 5060 will be faster
@@markdove5930 What do you mean, nobody? I've seen people preordering it already 😂
@@vicente3k People are buying it, all the positive reviews will definitely help. Whether Intel will make any money from it and commit to the GPU market long-term is another matter entirely.
Oh my god, Steve having a thumbs up thumbnail, sitting down.... Thanks intel Arc
Hahaha, true. I laughed when I saw the thumbnail; so many shitty Nvidia cards these past few gens.
We HAVE to support Intel at least this time! Even if I'll never buy their CPUs.
nvidia needs to fail so bad with their 5000 series. It NEEDs to be a major failure so they stop increasing and charging ridiculous prices.
Thanks Steve
The lack of a performance difference between 1080p and 1440p makes me wonder if some 1080p performance might be sitting on the table.
Exactly, I think this will only get faster with driver updates.
This is similar to the Arc A series. Those cards did well under high, sustained workloads and resolutions.
Yep, they need driver updates
Or maybe it's an architecture thing. It might just be me, but it seems like Battlemage performs really well in games that mostly rely on raster performance while losing some ground in compute performance. I don't know how compute scales with resolution, but if it's not linear like raster, it might be that at 1080p the card is mostly compute-bound, while at higher res this bottleneck diminishes, making it look better.
I guess we will find out with future videos!
@@Chicagoben The RX 590 from 2018 is similar. Doesn't always like low settings and prefers medium or high settings otherwise the cores throttle down due to lack of load.
Intel basically proves that 4060 and 7600 should cost no more than 200, even at launch.
We don't know how much money Intel is making. You can't really make statements like that. It would be cool for us if they were cheaper, absolutely, but we can't just say they must be $200 tops, because we don't know if the companies would stay profitable.
2 years after the fact...
Items are "worth" exactly the price that they sell for. If they do not sell at that price then they are not "worth" it.
@@janbenes3165 Intel is not currently making any money with the GPUs. That is the official statement from Intel.
Intel fab or TSMC?
If it's Intel's own fabs manufacturing it, they probably didn't take much of a hit on profit.
Time to upgrade from my RX 580 to B580.
Overall, it looks good. I'm super pleased that there are no significant problems with drivers. In the conclusion, when you said the RTX 4060 should go from $300 to $200, I burst out laughing. We all know it won't happen. I hope gamers won't buy overpriced Nvidia GPUs.
So long as your CPU/MB have ReBAR support! That is the only thing that keeps this being the ideal upgrade card for older systems.
Intel is truly Schrödinger's manufacturer.
Lolol! Summed it up completely!
It both is and is not a good product. You'll never know until you buy it and play the games you own.
Intel (Smells dead) inside 😅
@@christophermullins7163 yeah I bought last gen and I both love it and hate it at the same time. Which in my book puts it into pretty good territory, especially considering the fact that I assumed this would be the case.
TSMC manufactured GPU lol
Jensen Huang: the new 5060 8GB starts from $350
You mean $500…
And it will still sell tens of times more than the B580…
Nvidia believers buy Nvidia. It is just a fact.
Sadly
@@haukikannel Anybody paying $500 for an 8GB piece of junk in 2025 is more than likely the product of a marriage between first cousins. Yes, Nvidia owns the market, but people still want value. Nobody should be paying more than $450 for 12GB cards now either.
@@whiteroserepublic1699 People still want value? Nvidia showed us with Lovelace/40 series that gamers will whine, but end up ponying up the cash anyways. 40 series was the most insane, egregious, greedy move in GPU history, and it was our opportunity to refuse to buy and show we wouldn't take being screwed like that, but people just gave in and bought.
@@maynardburger Probably a lot of data miners paying it and skewing the sales numbers.
If this is the performance of the B580, imagine the B770 or even a newer B780. If it ever sees the light of day.
That's what I'm interested in; I'd guess around 4070 Super level of performance if they can pull it off, but from the rumours it doesn't sound like it's doing well and it might be cancelled. If only they could manage 4080-level performance for $400 USD... that would be a must-buy for me.
That card draws quite a lot of power for a low-end gpu. I don't expect too many gains to be squeezed out of this arch.
@@gameurai5701 no it doesn't.
@@gameurai5701 It's OK; the 6700 XT, with a bit lower performance, draws 45 watts more. The 7600 XT draws about what AMD states, around 5 watts more than this card, and has worse performance.
@@schwazernebe 7600xt is a nonsensically overclocked card and so is a 7600 LOL.
The fact that the 3060 Ti is still faster than the 4060... and even worse, by that amount... The state of PC gaming is poor indeed.
Even the 4060ti 8GB is not consistently faster than 3060ti.
It's kind of telling that the RTX 3060 Ti used a bigger die than the RTX 4080 Super. The RTX 3050's die was 20% bigger than the RTX 4060's.
Got it just a year ago; every game runs great on it at 1080p.
Only if you're wedded to Nvidia and/or want 4K with RT. Outside of those scenarios, there's quite a few decent GPU options.
@@BatteryBee Are there, though? The RX-6600XT to RX-7600XT situation is the same on the Radeon side, even worse is the RX-6800XT to the RX-7800XT. Or are we forgetting that AMD rebadged the actual RX-7800XT to the 7900XTX to sell it for £1000?
$250 is apparently 330€ now. Prices in europe absolutely wild.
European pricing…
You don't get MSRP…
Retailers test the GPU in a few gaming benchmarks, find the most expensive GPU at about the same speed, and set that as the retail price!
Do you remember when AMD tried to compete on price? Retailers priced AMD GPUs close to Nvidia and kept the difference in their own pocket. AMD learned from that and doesn't compete on price anymore… because it's useless!
😢
This card is the same price as the RX 590 released in 2018 (which I own).
So if you account for the fact that it's better, plus the crazy inflation of the past few years, this card is priced much LOWER than it could be.
If you are in the market, I would buy it and not look back.
22% VAT magically turns into 32%, even though Intel pays something like 16% thanks to exemptions in Ireland. So they are pocketing an extra 16% off any European sale. Neat.
£248 UK. I expected the same price as EU, tbh.
They'll settle down. The Acer A750 has been €180-190 multiple times these past few months.
Rejoice people, it's a great day for budget gamers and PC gamers in general.
in America
@@seeibeUntil the tariffs kick in.
@jayclarke777 Tariffs would make the card cheaper though; plus, Intel is an American company, and local products get cheaper. All of Intel's fabs, last time I checked, are here in the US.
@@xenosarcadius1198 They fab their chips at TSMC, like every other GPU maker... They're not fabricated in the States.
Plus the CHIPS Act is pretty much dead from "DEI".
Tariffs + scalpers. We will never get this card at $250, never.
Here in Bulgaria, the price is 320 euros. It’s not 250 dollars, but it’s still the best you can get for that amount of money, plus an extra 4GB of RAM. I think it’s worth it.
It just launched; prices will get more competitive when more shops get them in.
It's not here in Serbia yet :/, but I expect around the same price, which is great because the 4060 is $450 here!
Keep in mind that US prices are without tax, while most, if not all, countries in Europe show prices with tax included, so depending on how much tax Bulgaria puts on GPUs, the price might not be that much higher than the US price.
@@MrMeanh Even then the price is not comparable. $249 translates to about €280 at Germany's 19% tax rate, for example, but it's closer to €320 here for basically no good reason, even though Germany usually has the best prices outside the US.
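A rough sketch of where a €280 expectation comes from; the USD→EUR exchange rate of 0.945 is my assumption for the period, not a quoted figure.

```python
# Convert a US MSRP (tax-free) to an expected German shelf price (VAT included).
msrp_usd = 249
usd_to_eur = 0.945   # assumed exchange rate
vat_germany = 0.19   # German VAT

expected_eur = msrp_usd * usd_to_eur * (1 + vat_germany)
print(f"expected German price: {expected_eur:.0f} EUR")
print(f"observed street price: 320 EUR (+{320 / expected_eur - 1:.0%} over expected)")
```

With those assumptions the expected price lands around €280, making the observed €320 roughly a 14% markup beyond currency and VAT.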
@@ridleyroid9060 Wow. In Romania, the low-end cards (Palit, Gainward) go for €300 for an RTX 4060 and around €262 for an RX 7600; we don't have the new Intel cards yet. The best price/performance card right now is the RX 6750 XT 12GB from XFX at €342. Serbia looks expensive.
A Steve that sits is always a good sign!
For consumers, Intel releasing first is the best case scenario. Now it means AMD and Nvidia are pressed into matching those prices
I'm so tired of this rhetoric: "I'll wait for Intel to release a good GPU so I can buy Nvidia cheaper."
God, it's so annoying. Just support Intel by buying Arc if you think they offer you good value.
Not really. They'll only be pressed if gamers actually buy enough of these to turn them into a real threat, and we all know how that will go. Gamers will ignore this launch, wait for the 5060 and buy that instead regardless of whether it's good or not.
@@BusAlexey People will buy cards that seem good to them at a price they're willing to pay; no one buys a graphics card to "support a company". It's dumb to think that way.
It's not your local mom-and-pop shop that needs support; it's a global powerhouse.
NVIDIA 100% won't match these prices because it would dilute their margins.
@@BusAlexey No, just buy whatever is the best value overall for you, how about that?
Judging by what that Intel GPU engineer said (forgot his name) the B580 was designed for 1440p gaming WITH upscaling in mind.
That's basically the same thing as saying it's designed for 1080p. You don't design a GPU for a specific resolution *with* upscaling; that's not a thing. Upscaling just renders at a lower resolution. Either way, GPUs aren't really designed for specific resolutions in the first place. As always, a bigger memory bus/bandwidth means something is better at higher resolutions. The MARKETING then gets targeted toward whatever they think is the best market for certain products, but it's not something they're thinking about during architectural design. Game demands are fluid and vary significantly by title.
@@maynardburger You're not exactly correct; remember that when rendering at higher resolutions you can have more work running in parallel, so the target resolution can matter.
Hopefully Intel has proper supply and regional pricing all over the world, filling the price range that AMD and NVIDIA have totally abandoned is a big opportunity.
It won't. Small supply, with a high percentage going to the USA. The card is already being scalped. Most of the world will never see the US MSRP. The same goes for AMD/Nvidia cards.
@@PaulJohn01 A shame; scalping a budget GPU is especially pointless, since people will just buy a more expensive AMD or Nvidia card at that point.
@@Alcatraz760 Looking through the comments, it's already 320-350 in various parts of Europe; I'm sure the additional taxes don't make up that difference.
Not yet available even for pre order in my region, even something like the 9800x3d took a week to become available here despite being close to manufacturing and shipping routes.
It may still be worth scalping even in the USA; it depends on total supply.
Here in Brazil they released the A770 and A750 almost six months after the global launch and they were more expensive than the RX 6650 XT and RTX 3060. So nobody bought those.
@@RafitoOoO Same issue in the Philippines, although we're underserved by both AMD/Nvidia they'll still have more supply than Intel and prices that are same or cheaper.
Intel Arc Battlemage will sell out fast!
inb4 scalpers loot em all.
It already did.
I'm not sure. It seems the average consumer doesn't even know what the Arc GPUs are.
Despite how good the card is, they will pick Nvidia because they recognize it; maybe AMD, but even AMD suffers a bit there.
In prebuilt PCs I think they will do well, though.
Everything is out of stock already
Well, Intel needs something close to a miracle at this point, and these kinds of products might just keep them from being bought outright.
1:37 damn the a580 running at 16000 Gbps
Crazy to think Intel is now the underdog trying to bring it back; I'm rooting for them if they make good-value products like this.
Talking positively! Unheard of, very rare episode indeed.
For some reason, in Poland the cheapest B580 is more expensive than the cheapest RTX 4060.
Same in Italy
They run a speed test of the GPU and price to that. It's the same reason AMD ended price competition: it's useless. Resellers will harmonize prices for higher profit anyway, so if you sell at a low price, the reseller just gets a fatter margin. So better to set about the same price already and call it a day, because resellers destroy every opportunity for price competition!
@@haukikannel Greed destroys the world every funking time.
Big deal. Get Intel, but don't complain when the drivers don't work and the hardware has problems.
If only Pat were still around to celebrate this with his team.
It's a win because of how they had to price it, which is at a level that got him fired, because it's not actually competitive unless it's literally a loss leader.
@@rotm4447 To be fair, that's what a new player in an established market always has to do: sell at a loss at the beginning to build a customer base. The problem is that Intel is already hemorrhaging financially, so they might not be able to afford that strategy the way they could have before Ryzen came out.
Yeah, if only the guy who negotiated himself a golden parachute before announcing that 15,000 people are out of a job due to his own mismanagement gets to have a sentimental moment...
They don't celebrate this… A product that doesn't produce profit is not a win!
The B580 would need to be at least 50% more expensive for Intel to call it a win… Would you buy this at $450?
@@haukikannel A working gaming dGPU division isn't built overnight.
It's a journey before they can sell at high margins, but this launch shows their graphics solutions work, and work well.
Two words: GREAT VALUE!
Go Intel. This looks very promising
*Hopefully gets better with time*
drivers always do
This card will bring joy to so many people
Why? Just get a used 3060 Ti.
@@TheReferrer72 The 3060 Ti has no frame gen support and just 8GB of VRAM; even though it's faster because of its bus width, it will fall short more and more over time. The B580 addresses both.
@@psygreg Frame gen is pointless under 60fps, and if you really want it you can use FSR frame gen... Anyway, for 1080p, or 1440p with DLSS, there is no point.
Good luck addressing drivers in older games.
@zbigniew2628 The 4090 can't even do 4K60 without it in Stalker 2 💀. Many companies are making games that almost require it, so it's far from useless even if it's not ideal.
Excellent piece. This is exactly what we needed right now.
Competition is always good for everyone! Really happy that Intel's B580 is a positive! Can't wait to see how Nvidia and AMD react :)
Hope this gives AMD a good incentive to offer really good-value low-end products for the 8000 series, or, just as well, some juicy discounts on the 7800 XT and down. But it remains to be seen whether Intel even produced these in volumes that actually threaten the other two. Nvidia probably doesn't care either way, as long as people keep buying their actual money makers at the high end, which won't even have any competition in the near future.
Waiting for a B780 that could crush the RTX 4070.
Seeing a smile on your face for a card release, for the first time since I started watching your videos.
Better than expected, but not really impressive either.
Similarly performing 6700 XT/6750 XT cards have been on sale for €300 for well over a year now.
If the B580 really can be had for under €270, it will be an option worth considering.
RX 6750 XT is still 25% more expensive than an RX 7600 here. The B580 is expected to be closer to the RX 7600 price.
@@loganmedia1142 I hope so!
But right now the 7600 is at 250€ and it was on sale for 230€ last week.
B580 is at 330€ right now.
Sadly, it doesn't look promising so far.
The 6700 XT is NOT available in most countries outside the USA. In fact, most GPUs are unavailable right now in Canada, even the 7000 series cards worth buying; plus those cost $500-700, while this card will be $400 after tax.
@@Fishy-i2g Prices vary of course.
Here the 7600 is 250€ and it was 230€ on sale just two weeks ago.
6750XT is 330€ right now and it is on sale very regularly at or below 300€.
B580 is 330€ right now too. So if Intel doesn't lower prices immediately they can't even compete with 3 years old cards.
And RDNA4 is only 1-2 months away.
And here comes Nvidia with a $320 RTX 5060 8GB.
It's also only 10% faster too. Enjoy!
With 8 PCIe lanes and a 128-bit memory bus.
Only gamers will get that joke.
You mean $500…
Awesome!
With the recent early access launch of Path of Exile II, i'd love to see that benchmarked on the B580. Could be the perfect match for me.
*Space Marine 2* is experiencing issues with DLAA and TAA, failing to deliver the expected FPS performance. The game only achieves the correct FPS when FSR is enabled for anti-aliasing.
6700 xt just got dethroned 😮
In what universe
Weird. I bought mine like two months ago. In ALL the tests I saw, the 6700 XT was on par with or ahead of the 3060 Ti, mainly in rasterization.
WAY AHEAD of the 4060 and 7600, and a little under the 4060 Ti. But these tests are... weird.
Nvidia must have done some crazy stuff with the drivers (The Last of Us at 1080p ultra), and they used presets that enable RT, as he said.
I don't like DLSS and frame gen. I like to see benchmarks of the hardware's real power, native, at a REAL resolution with real frames.
I think that's why they included BF2042, Warzone and some other tests.
@@annielittlee9811 This one if you watch the review.
The 6700 XT is like 4 years old.
That's not an accomplishment.
In the end, the 6700 XT was close to the price of the B580... and the B580 isn't going much lower, because they already sell at low margins or at a loss.
I'm impressed, the value on offer is legit, definitely reminds me of the days of the RX580 which was peak competent entry level graphics.
Just doing some napkin calculations: a hypothetical B770 with 28 Xe cores at a 3180 MHz clock could beat the 4070 in raster and maybe match the 4060 Ti in RT at 1440p. They could charge $375 for it.
An 8-core increase corresponds to a 2-render-slice increase, which is the step from the A580 to the A770. The clock speed difference between those cards was 23.5%, so a 19% increase seems doable. The price difference between the B580 and B570 is $30, which corresponds to $15 per core, so going up 8 cores would put the price at $370.
Of course this all assumes linearity, which is unlikely to hold, but it might not be far off. I tried to account for that by finding a clock speed that would beat the 4070 and match the 4060 Ti in RT, then added $5 to the price.
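The napkin math above can be reproduced in a few lines; the B580's ~2670 MHz boost clock and rounded $250 price are the only real-world inputs, and everything about a "B770" is pure speculation extrapolated from them.

```python
# Speculative B770 extrapolation from B580 figures (rounded: 20 Xe cores,
# 2670 MHz boost, $250). Nothing here is an official spec.
b580_cores, b580_clock_mhz, b580_price = 20, 2670, 250

b770_cores = b580_cores + 8                 # +2 render slices, like A580 -> A770
clock_uplift = 3180 / b580_clock_mhz - 1    # target clock vs the B580
b770_price = b580_price + 8 * 15 + 5        # $15 per extra core, +$5 headroom

print(f"clock uplift needed: {clock_uplift:.1%}")   # ~19.1%
print(f"estimated price: ${b770_price}")            # $375
```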
that would be amazing omg
Gamers rejoice!
Now imagine a B770 with 7700XT levels of performance, 16GB of VRAM, better RT performance and actually good upscaling for $350!
It won't.
@@justinway153 It could very well be the case. The B580 has 2560 cores and 456 GB/s of memory bandwidth, yet manages to be over 20% faster than the A770, which has 4096 cores and 512 GB/s. If they made a B770 with the same core count as the A770 but with, say, 20 Gbps GDDR6, for 640 GB/s of bandwidth, it would probably be much, much faster.
Unless you're talking about the rumor of Intel not making a B770. That would be a big fumble, and a massive one if they actually quit the GPU market after Battlemage.
It's gonna be a dream.
That is a bit disappointing; the 7700 XT is already $400. It should be 10% faster to compete with the 8000 series.
there's too many people on the internet telling other people to get hyped for imagined, extrapolated products
Glad Intel's cooking, once the two players on a duopoly don't have a reason to go after each other anymore, they settle on prices that don't benefit consumers, Intel will stick it to them, which is just fantastic.
Costs ~350€ in EU, so much for that.
yeah, but Americans gonna get EU price treatment once the tariffs are instated. 🤷♂
@@Chasm9 Americans already pay that much if you include taxes.
@@Chasm9 american companies don't sell electronics for less in other countries
US electronics prices are usually cheaper than Europe's; they also have Micro Center.
Hmm, the cheapest card in Sweden is 294 EUR (taxes etc included). Though most are at 330.
Will you be doing any PCIe 3.0 testing for those of us still on PCIe 3.0, such as B450 and A520 motherboards?
They did PCIe 2.0 testing with the 8GB and 16GB versions of the 4060 Ti; it wasn't an issue with sufficient VRAM.
Here we go! Been waiting for this one :D
Now go buy one 🙂
If Intel works on the drivers the way they did with Alchemist, these cards may become even faster over time.
This GPU should be the new, most popular budget GPU.
that sparkle looks nice
They actually cooked! Especially in 1440p. Wow! Nice job Intel, glad to see it!
Interesting that Intel let you guys finally release this at midnight in Australia
What an amazing review as always, Steve!
I liked that you tested the "worth-using" / "significant visual upgrade" ray tracing games in a separate category!
I think the B580 is a decent GPU at launch, but it will face real competition when the RTX 5060 and RX 8600 arrive, **if** their prices don't exceed $300 USD and they have at the very least 12GB of VRAM.
Nice, Intel really is in the game right now, hopefully B770 will come pretty soon. As for you, amazing coverage as literally always!
Unfortunately it sounds like B770 was cancelled. I think they're only putting out a single die this Gen.
Intel needs a success, and although it's not specifically the product I've been hoping for (I wanted the B770 instead), it's still a win, especially considering the great scaling at 1440p. I'm curious, though, whether the 1440p scaling is something they planned from the beginning or a lucky coincidence; hopefully Tom Petersen from Intel can enlighten us in his very casual style. 🙂 Good job Intel! 👍
PC gaming will not be saved by graphics companies, but by devs not blindly using upscalers and UE5 lumen/nanite.
08:16 I think someone mixed the 4070 with 3070 results here...
It could be that they were careless and the game switched quality settings to medium. Gamers Nexus exposed LTT for the same thing, even though it actually happens at GN and HUB from time to time too.
Intel being the only ones with a card that makes sense
I use an RTX 3070. Ultra textures in Veilguard fill up the VRAM pretty quickly and the game starts running in the low 40s/high 30s with stutters in the first big open-world area. Lowering the textures resolves the issue, probably a result of the 8GB of VRAM. Your numbers show pretty high figures for 8GB cards; you're probably using the opening area or not running the game long enough to witness it. As the game runs, VRAM cycling gets worse and worse.
Fact: Intel has real RT cores, which AMD lacks.
That is true, and as expected they beat competing AMD parts quite well in RT benchmarks. Only issues with that is, it's irrelevant. At least until they release a much higher end product. At this perf tier RT performance is still meaningless, just like it is for the 4060. In most games where RT actually matters in terms of visual uplift you wouldn't enable it. A stutter fest of pretty frames at 1080p with upscaling is better than a slide show of them, but still not anything you'd wanna play with.
It's crazy how Intel, on only their second-gen GPU, is competitive in RT and destroys AMD GPUs in RT, despite AMD's long experience running a GPU division.
@Hugh_I I hope Intel comes back; I mean, I can do high-level work on it.
@@KenpachiAjax Its because RT is dumb and a waste of development for 90% of users.
@@Fishy-i2g If RT was dumb and stupid, I don't think AMD would be focusing on RT for their RX 8000 series...
Now I need to see how it handles PCIe 3.0 on these 8 lanes,
preferably on lower-end CPUs like the 5600, etc.
Edit: I've got a 5600 with a B450 Tomahawk, which AFAIK doesn't support PCIe 4.0, so I wonder if there would be any penalty at x8. Making an educated guess, this shouldn't be a problem; it's not a high-end card, so it should be fine. Nonetheless, I'd love to see those sweet benchmarks.
I would like that as well. My nephew has a 10700k system that needs a GPU upgrade.
It should be just fine. Just don't use it with PCIe 2.0!
@@monke2361I'd assume he meant 5600g
To clarify, many B450 mobos only have PCIe 3.0.
@@kuksio92 ok makes sense then
"Thank you intel, very cool!"
Can't wait to make a budget build with one of these. Such a great product.
Looks really promising. Now intel needs to release the b780
It's just insane that we're almost in 2025 and we're still talking 1080p for new GPUs. Overpriced GPUs, due to pure greed from Nvidia with AMD just following suit and consistently tripping over themselves, got us to this stupid point. Monitors have gotten so far ahead, with both high resolutions and high refresh rates easily obtainable at decent image quality, but here we are with (almost) no reasonably priced hardware to run even some older (but better) AAA games well, unless you consider buying used previous-gen higher-end GPUs. Intel really is onto something here, and we know exactly what they're trying to do, so here's hoping they actually succeed. And if they manage to get the B770 and B780 out, they should absolutely stay below the $500 mark.
Even PS5 games are struggling at 1440p and are dipping to 1080p using DRS. 1080p is super demanding these days because higher quality textures, increased polycounts and now RT effects are also a factor, for instance Indiana Jones requires RT. Not to mention devs don't optimise their games these days like they used to. As much as I wish 1440p was the minimum resolution today, it's also that 1080p is not like it was in 2014, it requires way more compute now to render a 1080p image, otherwise the RTX 2060 would be enough even today to play games at 1080p 60 FPS.
@@NANOTECHYT I get that games are more demanding and less optimized than ever, but when taking a look back at some AAA games from 2017 with Assassin's Creed Origins, for example, being one of my favorite games from the 8th generation of consoles, these mid-range GPUs still aren't powerful enough to run it at like 1440p 120 FPS or even higher and it really is a game that's just so insanely more enjoyable at higher resolutions and frame rates, I was astonished seeing so much more detail the first time I played it at 4K. And it's not that the game isn't optimized well for it to be so GPU demanding, for a Ubisoft game it's actually pretty impressive and can easily run at high FPS with any modern CPU featuring at least 6C/12T, but the mid-range GPUs are just so lacking in both memory bus, bandwidth and processing power and it all comes down to greed of the big 2 GPU manufacturers - AMD and nVidia - being the reason we can't even enjoy older titles at high resolution and frame rates without spending a ton of money on a GPU.
@@garminizator Origins was a hard game to run at the time, I remember running it at like 60 FPS with my GTX 1080 and 7700K when it released and probably because it's an Ubisoft title it has problems scaling. Ubisoft's engine for AC titles was pretty bad after AC Unity.
I think most don't understand that resolution is not the solution to lifelike graphics; just look at a movie at 480p...
Seriously, imagine if in 2017 we were getting new GPUs in the $200 range that were only good for 720p gaming. That's basically what's happening right now. 1080p is NOT next-gen. These GPUs are a step forward for consumers, but we're still in a pretty lousy state.
I never imagined seeing HU, LTT, AND GN all smiling in their thumbnails for an intel product
In Germany the B580 costs 319 € and the 6750 XT 329 €. It's a failure here; you've been able to get a 6750 XT for this price for months.
The B580 is being sold at the same price the RX 590 was in 2018. You seem confused, especially considering inflation thanks to the forced lockdowns.
@@Fishy-i2g Nobody was talking about "what was in...". For Germany, it isn't a good buy at this price.
@@Uthleber He also seems to have trouble remembering that everyone hated the 590 when it launched because it was pointless over the 580. But sure, it's us who are confused, right?
Btw it's not very compelling anywhere in Europe; they're charging an extra 32% over MSRP when average VAT is 22%, not to mention that in Ireland Intel only pays around 16% VAT, so we're literally paying doubled fees.
Nice to see Intel finally get a win. Actually good VRAM amounts and performance at a fair price, something we would never see in a world where only Nvidia dominates. Imagine finally having a competent 1440p card for under $250 MSRP. Hopefully Intel gets rewarded for a quality product and can be a viable third dGPU competitor going forward.
In France the B580 is priced around 360 euros... joy
I would have grabbed it for my nephew or myself if it was 250 USD + VAT, but last time I checked our VAT wasn't 45%.
That places it at 6.31 euros per frame while the 7600 XT is at 6.5, so better value, especially given the stronger RT and better upscaling, but it's a year too late for it to really matter, sadly.
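The euros-per-frame comparison above can be sanity-checked with a quick calculation. A minimal sketch, where the prices and average FPS figures are illustrative assumptions chosen to roughly match the quoted ratios, not measured values:

```python
# Cost-per-frame comparison; prices and FPS below are illustrative assumptions.
def cost_per_frame(price_eur: float, avg_fps: float) -> float:
    """Euros paid per average frame per second, rounded to cents."""
    return round(price_eur / avg_fps, 2)

# Hypothetical EU street prices and average FPS:
b580 = cost_per_frame(320, 50.7)      # B580: ~320 EUR, ~50.7 fps
rx7600xt = cost_per_frame(330, 50.8)  # 7600 XT: ~330 EUR, ~50.8 fps

print(b580)      # 6.31
print(rx7600xt)  # 6.5
```

Lower is better here; the metric is only as good as the FPS average it is based on, which is why mixing RT and raster results in that average changes the value picture.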
Ok, colour me impressed. That's way better than I expected and an actually OK product. Certainly compelling value.
Now if only it were more consistent. There's still the issue that almost all recent releases in Steve's benchmark selection are underperforming, suggesting you're still at the mercy of Intel updating their software stack regularly.
Looks like a pretty solid GPU, if you can get it at MSRP. I know for a fact that retailers in my country will just price match it to the 4060 to eliminate competition.
But then other retailers can go a bit below that so they get more customers, and then others will need to follow.
@@budgetking2591 Not if they have a deal for the price…
It's on preorder now so if you can, get it.
Very good effort from Intel here, they've certainly improved a lot since Alchemist!
I would be curious to see some driver overhead testing, as there are still a few weird results.
I would also point out that in other markets its price isn't nearly as competitive. For example, in the UK a 4060 is ~£250, the same as the B580, and the 7600 is ~£230. It's still tempting given the 12GB of VRAM, which you could otherwise only get with a 3060 for ~£240 while stock lasts, and the B580 has a decent average performance advantage over the 3060.
Thank you for your service!
24:28 The flowers are still standing!
Except the price outside the US is horrible; in Europe it averages around 330 €, and you can buy a 7700 XT or 6800 for a little more.
Yup, it's DOA outside the USA, sad.
7700xt for 330?😂 Where? It goes for 450 at least
@@enricogiunchi7171 390 in stock at digitec in switzerland.
In italy the 7700xt is 440€ 😭
100€ difference in Germany versus a 6800 or 7700 XT, and it's new. The price will drop by about 50€ in a couple of months, and then you can't even compare it anymore.
Great review. I'm so pleased Intel managed to improve so much. Now we can all say that they're a true contender.
I'll be upgrading out of the xx60 tier in the next few months hopefully, but if I was looking to replace an aged 1060 or 2060, the B580 is exactly where I'd start looking.
7:05 No one plays War Thunder maxed to the nines. You just can't see anything through the damned animated grass, the heat haze and smoke, or your teammates' exhaust.
That's great, but they're not benching games, they're benching GPUs. These shouldn't be used as a guide for the framerate you'll get in specific games.
I'm so happy to hear how good the B580 is! We needed this GPU and I'm eager to see what else Intel has coming this year. I'd love to see Intel commit to, and own this part of the GPU market
Oooh he's sitting down.
The universal love for this card from virtually every relevant tech tuber feels like someone at Intel finally listened to userbenchmark guy lol.
1:20 16k Gbps and 19k Gps(?) what in the hell 😅
EU bros.
I got a rx 7600 for 238€ from Germany recently and that wasn't a particularly rare deal.
The B580 costs 320€ there
In Finland, the rx 7600 right now costs 260€ and the B580 is 340€. 30% premium for 8% more fps and same 1% low @ 1080p
Dead in the water pricing.
I live in Finland. We basically have scalper GPU prices across the board.
It's really sad and disgusting, always has been.
yep the prices are so bad here
How about looking at it this way: the Intel B580 is being sold at the same price the AMD RX 590 was in 2018, for better performance.
Considering the rampant inflation in Western countries, and that this will perform better than the RX 590/1060, the pricing is actually TOO LOW.
@@Fishy-i2g bruh its 350€ in europe. what a rip off
I don’t know if this is the 5700 XT upgrade I have been waiting for, I was hoping for above 60 FPS on modern titles at 1440p. Maybe I’m asking too much.
I feel as though this card really shouldn't be positioned as a 1440p ultra settings card; however, it's been advertised and then tested in that scenario to show off the extra VRAM it has over budget cards.
I say yes! The performance increase is nice for just $300
Yeah, I was sitting on an older card and a monitor that could only output 144Hz at 1080p over dual-link DVI, so I bought a 27-inch 1440p 144Hz monitor in early 2023. Then I bought the most expensive card I've ever bought, a 6700 XT, and it really isn't enough to deliver a good experience in modern games on that monitor, so I mostly find myself sticking to older titles or indies. Makes me wonder if that's how/why you see so many GTX 1060s and the like: people going "yeah, am I gonna spend $400+ to still not get 60 frames in newer games? Guess I'll stick with Minecraft and Counter-Strike on this card for another couple of years." Well, that card can play more impressive games than that, but you get my point :)
I mean, 5700 XT was double the price if you adjust for inflation, so yeah, the fact this easily beats it and offers more vRAM and features is great already!
Realistically you'd customize your settings to medium-high depending on the impact to graphical fidelity for a reasonable cost in performance. E.g. in Cyberpunk, if you follow the HUB optimized settings guide, you can expect +18% performance vs the high preset. At $250 new (in the USA), can you find a better deal for 1440p 60 FPS in modern titles? Also, the 5700 XT was ~$450 new in 2019, so you're down two price brackets.
This is definitely a great sign that Intel is doing good work behind the scenes to make this series of graphics cards meet expectations, and not just drop out of the GPU market in a few years!
Sitting! Sitting again!
I've been waiting all morning for this... thanks for an honest assessment of the B580 ❤
Intel launches a GPU and it becomes the new 1060, what a time to be alive
U mean the new 4060 but with 12gb for only $250. U not a "details guy", huh?
@@srobeck77 You are not a details guy. The 1060 is/was the most popular budget GPU as per steam.
The OP was alluding to the fact that this will be the 1060 of our time.
@Fishy-i2g Once again, you're missing actual details. This is not a good look, fella. The B580 performs at or above the 4060. No one cares about your retro GPU from 8+ years ago.
It is nowhere near the jump from the GTX 960 to the 1060 6GB....
And sales aren't going to be great because of uncertain drivers and it being close to the RTX 3060 Ti....
Anyway, it's great that people fear 8GB of VRAM, because the RTX 3070 can be cheap sometimes and new games are often a pile of trash. All I want is RTX 3070 performance with 12GB of VRAM and good drivers for older games.
@@zbigniew2628 With driver maturity it will probably get quite a bit faster, maybe around 3070-level performance. But we will see.
I think Intel just walked in and made a name for themselves in the GPU market.
A new stock recommendation from me.
Just keep those driver updates coming.
It's all about price. If it hits the market for less than the RX 7600 and 4060, say $30-50 cheaper, then we can think about it. If the price is the same, I'll skip it. In 3-5 years you're not going to resell it for a good amount of cash, unlike those who bought an RTX 3060 or RX 6700 now.
You lose substantially more on those more expensive GPUs.
This is how I know this channel is honest: finally something to celebrate for customers, maybe even better if the drivers improve. Not saying it's perfect, but we've finally got a new player with an aggressive price for the mid-range, which was supposed to be low end long ago.
My 3060 ti is still going strong.
Nothing special really, the 3060 is only one generation old..... I'm still using an RX 590 from 2018, which is three generations old, soon to be four, and I still get 60-90 FPS.
@@Fishy-i2g 60-90 FPS in new games? At what settings, low?
Now to complement this review we need a video about any issues regarding compatibility with older APIs, emulation and software (drivers). If it's on the same level of AMD it's a major win for consumers.
After 8 years my 1060 can finally rest
along with the 60hz, 1080p monitor, with no freesync
My RX 590 is looking forward to its retirement.
I'm impressed; in Intel's launch presentation they tested with XeSS on, and the B580 is around that performance with it off. However, this may be too little, too late. They needed to launch the full lineup, as they aren't going to be able to justify these prices once RDNA 4 launches, since AMD undercuts Nvidia anyway. If the B770 landed between the 4070 Super and 7800 XT, it would sell out at $400-$450.
Let's go Intel! We're rooting for you. We need more competition in the GPU market. 😊
I'm buying this for my backup/living room rig, thank you Intel. Can't believe I'm saying this. Went from the GOAT GTX 1080 Ti for $500 at 1440p to an RX 7900 XT for $650 at 1440p/4K last year. I'll be happy to give $250 for this 1440p card. Shit's already sold out on Newegg.........
Well played Intel. Please keep it up for the B700 series.
Only if they scold sellers in Europe
It's going to be brutal if prices drop when new low-end GPUs launch, if they perform better of course
But they'll launch very late, probably 3 to 6 months from now
8:11 guessing the RTX 3070 and 4070 results are reversed?
B580 is the best value for money gpu right now.
Finally, Radeon is no more! One thing unsaid in the review: this card actually has functioning, high-quality encoding and fairly decent productivity support, unlike Radeons, so its viability vs GeForce is quite good.
The "odd" performance scaling from 1080p to 1440p is very likely due to the memory subsystem. The B580 has by far the most memory bandwidth between it, the 4060 and 7600, holding it up at 1440p, but it also has the least cache (18MB, vs 24MB 4060 and 32MB 7600), so its a bit held back there at 1080p because of that.
These cards are for upscaled 1440p or 1080p native though. IDK if more bus is cost-effective.