Games:
Horizon Forbidden West - 0:00
Cyberpunk 2077 - 1:02
Cyberpunk 2077 | RT - 1:54
Alan Wake 2 - 2:44
S.T.A.L.K.E.R. 2 - 3:41
Ghost of Tsushima - 4:53
Forza Horizon 5 - 6:02
Indiana Jones and the Great Circle - 7:05
SILENT HILL 2 - 7:54
Senua's Saga - Hellblade II - 9:01
God of War: Ragnarök - 10:06
System:
Windows 11
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON - bit.ly/3YWS871
RAM 32GB DDR5 6000MHz CL30 - bit.ly/4e3MqEG
Intel ARC B580 12GB - bit.ly/3OSOhCp
GeForce RTX 4060 8GB - bit.ly/3CRMGqh
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
Run the test again in 4-5 weeks with driver updates. I believe the B580 can gain 10-15% more performance with optimized drivers.
I'm not really convinced by this; the Arc eats that RTX alive.
The B580 doesn't seem to be optimized for Indiana Jones yet,
but at least the B580 should be able to run 1080p max settings,
while the RTX 4060 8GB will immediately crash because its VRAM is too small. 🤡
The B580 is the best budget GPU of late 2024 😎 (driver updates will make it even better)
@@penonton4260 Yeah, it could easily beat the RTX 4060 Ti 16GB with better drivers. (Looking to buy a B580 in a year, but I'll also weigh the state of the drivers against the RTX 5060 (I hope it gets the same VRAM) or the RX 8600/8600 XT.)
Can you test Space Marine 2, Indiana Jones 2024, Marvel's Spider-Man Remastered, and Marvel's Spider-Man 2 (DODI repacks 😁)?
Kinda impressive how quickly Intel has improved to compete with other modern cards.
Now I'm curious about the B7XX cards as a potential upgrade to my 3060ti.
Another plus for Intel is that XeSS is a great match against DLSS. A lot of people avoided AMD budget cards not because of RT, but because FSR just isn’t that great in comparison.
@@graveyarddead753 Absolutely, DLSS has been the ONLY thing tempting me to go Nvidia instead of AMD. Now Intel is a viable option.
Their ray tracing is damn near beating Nvidia... 😮 Wow
@@KhawarJuni The extra VRAM and memory speed definitely help with that quite a bit.
Keep going this way, Intel.
@@mfelicio1 They're losing money on each Battlemage GPU sold. In what world do you think they can keep doing this?
@JakeySurani But still, that's the only way to take back market share, at least in the low end.
@JakeySurani What makes you think they lose money on each GPU? Or rather, where did you get that information?
@@traelkel8789 this guy is crazy
From what I understand, since 2020 both AMD and Nvidia have increased their profit margins on every GPU they sell, so it would seem odd to me for Intel to be losing money. In the worst case Intel would be making a small profit, but I don't think they're losing money.
When a CPU manufacturer can't do its main quest but is very competitive in side quests.
Intel's farming XP to continue the story and beat the final boss (shareholders).
Intel is just changing everything; give it some time.
@@pouyarahimi31 Agreed, they'll be more competitive across the board... just give it some more time.
Inb4 Intel becomes more of a GPU brand and Nvidia goes CPU.
Intel has failed epically in a dozen other side quests at this point. I wouldn't say this is a success either, because you can't find a B580 to purchase. They're losing money on it. It's unsustainable. Intel is an unfortunate dumpster fire in all of their ventures.
For 250 bucks it's a good choice.
Give it two months of driver updates and it will beat the 4060 Ti 16GB.
@@GFE. Nah
@@paokpaok857 It already did in some games, check out Linus Tech Tips.
@@GFE. stop dreaming
@@GFE.Stop dreaming bruh
Nvidia monopoly must end.
Yeah sure, in a month all of this will be obsolete with the 5000 series; keep dreaming.
@@apokalip6 Not going to happen, because NVIDIA launches its 60-class GPU a lot later. Intel has a very wide window to sell its GPUs before Nvidia's new budget GPUs arrive.
Look, I'm all for Nvidia losing to Intel or AMD, but you have to think about customer satisfaction; if you start getting artifacts on an Intel GPU and weird stutters like the old Intel GPUs had, that won't work. @@shaheermansoor2560
@@apokalip6 You keep dreaming about your $400 5060 with 8GB, lmao.
@@gamingedition5165 Sure, sure, keep paying for raster only, with no AI or tensor cores, on your AMD garbage HAHAHA.
Nice move, Intel: more FPS, less power usage. Proud 👍
"Less power usage" for now, since these are early drivers; keep in mind the B580's TDP is over 190W.
So then performance will increase, which is good @@imtrash2812
@@imtrash2812 That power figure doesn't even show the whole picture; it's only showing the power pulled from the board, not the total power usage.
@@rafinaufal1510 It's buggy, just like the 6000 series. 🤧🥲
@@Jakiyyyyy Really? What bugs, bro?
Upgrading from an AMD RX 580 to an Intel Arc B580 sounds good.
Yeah same with 1650 and 1060 owners.
the 580 number naming lives on
580 gang
1050ti owner me 😮
Back in the day I upgraded from an ATI 9600 to an Nvidia 9600 😂 the circle goes round.
Remember it's $300 vs $250
This GPU will give budget gamers a great option for building a $500-600 PC.
Well, until budget gamers use a $250 GPU to play free online games and it performs like shit :/. That said, I'm rooting for Intel though.
And the B580 comes with 12GB of VRAM.
The problem is that next month RDNA 4 is going to destroy it. I wouldn't buy a B580 at least until then.
@@kerotomas1 I would not say it will destroy it. I can't wait for an RX 8600 with 8GB of VRAM. That would be funny.
@@BladeCrew The reason I think it will be destroyed is the "old" die tech used. It's only on a 5nm node and the chip is kinda big; it only has parity with RDNA 3, and RTX 40 is already more advanced. Next-gen RDNA 4 and the 50 series are gonna be so much better on 3nm.
With a couple of driver optimizations, this card will crush the 4060 and become the new budget-king GPU.
Sometimes I don't support the idea that you must wait for something to become viable.
Take the PS5 Pro as an example: in a few games it's better than the PS5 and XSX, but mostly it falls in line with them. And the price? It was pretty high, and I don't recommend it at all.
But the Arc B580? It's the other way around: not only is its price the lowest of this generation and segment, it also delivers satisfying performance in many games that aren't Nvidia-oriented like Indiana Jones or AMD-oriented like The Callisto Protocol. It gives HUGE performance in other titles and is pretty close to the Nvidia 4060 and AMD 7600 (XT?).
To compare with the PS5 Pro:
One urgently needs updates to keep up with the others in games, and often fails.
The other is great from the start, has an extremely accessible price, performs extremely well in so many titles out of the box, and doesn't need urgent driver optimizations/updates. A small price with HUGE performance.
Yes, wait a few years.
@@CarlosEduardoMezaCaro The PS5 Pro makes the Intel Arc B580 look like shit.
It will never happen, Nvidia is the boss.
It's $50 cheaper for similar performance; it _is_ already the budget king GPU...
Intel's biggest disadvantage is not enough collaboration with game developers and game engine makers. Otherwise this card can achieve awesome performance.
Intel CPU❌
Intel gpu✅
I have an Intel Ultra 200; it serves all my needs just fine.
12400❤+4070super
@@tylertoptier2353 damn
Fun fact: Alder Lake i7s and i9s are faster than half of Zen 4.
Steve from Gamers Nexus just recently replaced the 12700K with a 9800X3D for GPU tests.
@@ТимурТимерханов-р1р Bro, a 12400 will bottleneck a 4070S like crazy.
I have a 12400F paired with a 3070 and it's fine.
This looks like a great upgrade from my current 1070.
And maybe performance will get even better in the coming months with driver updates as well.
get a 1440p monitor
As an ex-1070 user, I really advise you to upgrade to a 70-class card or equivalent instead.
@@da3siiew Shit take, don't listen to this bozo; go for 1080p 240Hz 0.5ms.
Wait for the B770; it should come with more VRAM and will probably be competitive with anything from Nvidia/AMD at the same price point.
@@k680B I expect the B770 to be a 4070 16GB for $400.
Preventing 12 year olds from saying first
Second
First
I'd rather see someone commenting first than commenting whatever this is supposed to be
PREGNANT
1st
I see Indiana Jones and God of War Ragnarök need optimization, but overall this performance is amazing. W Intel.
Unreal Engine 5 is the most unoptimized engine. You can see Fortnite, which has FPS drops even on an RTX 4090.
@@grigiobasi2751 Unreal Engine has nothing to do with it; it's the laziness of devs.
Yeah, sadly Unreal Engine games are the future of all games, and it seems the Nvidia 4060 takes the lead here, especially with ray tracing and better upscaling/software. The Nvidia 5060 is coming soon and will have an even bigger jump in ray tracing and Unreal Engine games over the B580. The extra 50 bucks for better RT and upscaling makes Nvidia seem like it's still the way to go over Intel.
@@SemperValor The RTX 5060 may come with far better RT performance compared to the RTX 4060, but from the leaks I have seen it's going to have 8GB of VRAM again, which is total bullshit. Like, really, bruh, 8GB in 2025? Not only that, the 5070 is also going to have only 12GB of VRAM. See, that's the issue with Nvidia: they know gamers will buy their GPUs despite the lower VRAM, higher TDP, and price. I am waiting for the RX 8000 series; let's see what AMD is going to provide.
@@SemperValor
If you're buying a 60 class Nvidia card, you're still not in this for Ray Tracing. Given that the 5080 is going to be half a 90, and the 70 class will likely have 8GB of VRAM until we see ti or Super cards, I don't buy that the performance uplift on a 5060 will be enough to let RT ride while playing. We'll see I guess, but right now, it's basically 4090 or bust to really start to enjoy what the good RT titles have to offer. We may now see that with the 5080, but the further down the stack you go the worse things look.
Nvidia is an AI company now. The consumer segment is secondary. To that end, the entire stack is clearly positioned to upsell you. I wouldn't count on a 5060 doing ray tracing particularly well.
The onus now is on AMD. For $250, and a second iteration, Battlemage is pretty impressive. AMD's the one looking like they're floundering now. RDNA4 better have an impressive feature set, which includes some solid RT capabilities, or they're not just going to continue to lose their pitiful share, but they might get bodied by Intel if they can do this again with Celestial. XeSS already looks like it might prove to be a better implementation of upscaling than FSR, so at least ray tracing needs to perform.
Interesting. Most reviews say more fps than 4060 at higher power draw, but this one shows similar fps but slightly lower power draw for the B580.
More fps to the B580, mate, lol.
It also scales better at 1440p than the RTX 4060. 1440p medium to high is the sweet spot for this card since it has 12GB of VRAM.
That's because the B580 gets stronger and uses more wattage at 1440p. This test is all at 1080p.
That's wrong.
On the Intel GPU that's not the total power draw; it consumes 185W (max).
@@gamingedition5165 This isn't more fps for the B580. Do the math: across these 10 games in this video the 4060 averaged 69.4 fps while the B580 averaged 66.2.
If you remove Indiana Jones from the lineup, since the difference is huge there, the B580 still technically loses, though it's essentially the same. With the 9-game lineup, the 4060 averaged 67.3 while the B580 averaged 67.1.
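To make the averaging concrete, here's a minimal sketch of that calculation; the game names and FPS values below are placeholder assumptions, not the actual results from the video:

```python
# Placeholder per-game average FPS (assumed example values, not the video's data)
rtx_4060 = {"Horizon FW": 75, "Cyberpunk 2077": 62, "Stalker 2": 55,
            "Indiana Jones": 90, "Silent Hill 2": 58}
arc_b580 = {"Horizon FW": 80, "Cyberpunk 2077": 65, "Stalker 2": 52,
            "Indiana Jones": 60, "Silent Hill 2": 55}

def mean_fps(results, exclude=()):
    """Average FPS across games, optionally dropping outliers such as Indiana Jones."""
    values = [fps for game, fps in results.items() if game not in exclude]
    return sum(values) / len(values)

print(mean_fps(rtx_4060), mean_fps(arc_b580))            # averages across all games
print(mean_fps(rtx_4060, exclude=("Indiana Jones",)),
      mean_fps(arc_b580, exclude=("Indiana Jones",)))    # outlier removed, as the comment does
```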
7:23 Least obvious Nvidia-sponsored game.
Yeah, you are right.
The drivers aren't fixed yet for that game.
The game is new, and the Arc B580 isn't as powerful as the RTX 4060, but it's made to go toe to toe with it; considering it's a new game, it will probably take time for Intel to optimize the Arc B580's drivers for it. So yeah, not a sponsored game.
Maybe it's because of the forced RT; Nvidia still wins in that department.
You've got to remember the RTX 4060 has already been running the game since its launch. The B580 can probably get better at it with better drivers.
Intel actually does some nice things.
The power consumption is insanely good; B580 mobile chips would make for the best gaming laptops.
not bad intel🙂
And me with my RTX 3070 crying in the corner with 8GB VRAM 😢
Intel really does a solid job on this GPU.
Never Been so happy In the GPU market before...
Intel did a great job in making this card; they just need to push a few driver updates to make this card perfect for 1080p gaming at this price range.
If they increase RT performance significantly in the 3rd generation, we'll welcome the third big "player".
At least it's viable with XESS
Intel's GPU naming looks like they are designing a war jet
The biggest win for Intel is that people will no longer dismiss them out of hand when looking for a good value GPU. Making your product a viable contender for people's money is half the battle. Let's hope they stick with dGPUs and steadily improve.
The RX 8600 will destroy both of them.
Thanks man i was looking for it
I can't believe it's almost 2025 and we still don't have budget cards that can consistently handle 1080p high.
People should forget about 1080p; it's a very low resolution, especially for 2025 gaming.
@@foxskyful Not everyone is a 4K snob, just saying.
You must be joking, because 1080p is still a very decent resolution for gaming.
It would be reasonable to say this after 2030, but for now 1080p is still good. @foxskyful
@@eddiesneax6754 1080p in some games was barely decent even in 2018-2019 for high/ultra gaming... Thank god for image sharpening; I couldn't enjoy games like RDR2 and AC Odyssey without it, they were pretty blurry even on high/ultra at 1080p for me. In 2025-2026, 1440p should be the new minimum for gaming. Go live in the past and pray technology doesn't advance if that's what makes you happy and you want to play the same way for 15 years.
@@foxskyful Will surely do
Horizon Forbidden West: B580 is faster, uses less power, less CPU usage.
Cyberpunk 2077 no RT: B580 is faster, uses the same power, slightly higher CPU usage.
Cyberpunk 2077 w/ RT: Same speed, B580 uses slightly less power, same CPU usage.
Alan Wake 2: Virtually equal.
Stalker 2: RTX 4060 is faster, same power, same CPU usage.
Ghost of Tsushima: B580 is faster, slightly less power, slightly less CPU usage.
Forza Horizon 5: Virtually equal.
Indiana Jones and the Great Circle: RTX 4060 destroys the B580. Other metrics irrelevant.
Silent Hill 2: RTX 4060 slightly faster, B580 uses less power, less CPU usage.
Hellblade 2: RTX 4060 slightly faster, B580 uses less power.
GoW: Ragnarök: RTX 4060 is faster.
Overall: The B580 is significantly cheaper than the RTX 4060... if I were Nvidia right now, I'd be very scared. Intel has come to play.
Intel has a lot of room to improve with driver updates too.
Remember that Gelsinger got this out of Intel. Gelsinger was GOOD for Intel; he pushed Intel to momentary leads against AMD and brought them into the GPU market.
Do one at 1440p, native and upscaled, because I'm really interested in HUB's finding that the B580 is better at 1440p. HUB's game selection is pretty limited and I want to see more untested games. If it's really better at 1440p, the B580 might be the best banger at $250, because I doubt the 5060 or RX 8600 is going to be that cheap.
The 4060 crushed it with ray tracing and DLSS.
@@jacklamat6315 Not really; in a lot of tests DLSS is a bit worse than the new XeSS. Also, RT isn't the full picture, since the 4060 is 50 dollars more, or in my country even 200 USD more, than the B580.
@xnji Oh, in my country the Intel is $50 more than the 4060, haha; crazy how things look in different countries.
Maybe next generation Intel Arc will manage to beat AMD's RX cards; Arc has already beaten RX in ray tracing, not to mention the 12GB of VRAM. It looks like Nvidia is finally going to get competition worthy of it...
You just said a GIGANTIC load of crap... No Arc beats even an RX 6750 XT, not even close.
@@kissehfodasso Come on, bro, did you even read what I wrote? "Maybe next generation it will beat it": next generation, I didn't say the current generation!! Do you really think Arc won't beat the RX 6750 XT next generation? In ray tracing performance alone Arc already does better than AMD, and in technology they're already quite advanced. Besides, AMD is going to stop manufacturing the RX 6000 and 7000, so the RX 6750 XT won't exist anymore. This Arc already has practically the performance of an RX 7600, with a minimal difference in some games, and with better ray tracing performance and 12GB; a few driver updates and who knows, it may end up better than the 7600.
I want to see the mobile versions of these chips for laptops.
IF SOMEHOW NVIDIA LAUNCHES THE RTX 5060 WITH 12GB VRAM, THE PRICE WILL BE MUCH HIGHER THAN THE ARC B580, NOT BY A SMALL MARGIN BUT NEARLY $100 MORE EXPENSIVE, AS WE ALL KNOW ABOUT NVIDIA'S NEVER-ENDING GREED
They can't do that. By the time the 5060 launches, the Arc B580 will beat the 4060 Ti 16GB and even the 5060, and the price will be even lower; they can't price it above $300, no way. I hope this GPU sells well so the 5060 fails; Nvidia needs huge competition. I bought a 4080 Super at $900 and I still don't think it's worth it compared to the 1080 Ti in 2017 and the 3080 in 2020 at $700.
Nah, Intel will release the B770 soon, at the same price as the 5060 Ti but performing like a 5070 😅. The B770 project was canceled because of a power issue; just hope it will be released soon.
@Thepapoyboyz Let's hope for that.
How did you even think they'd release a 5060 with more than 8GB of VRAM 😂
Caps doesn't equal "validity", but it does make you look stupid and childish.
I get the feeling these cards are gonna climb up that Steam Hardware Survey list pretty quickly.
Is the power consumption reading correct?
Early drivers. ☕
Power consumption strongly depends on screen resolution on the B580; at 1440p consumption averages about 130-140 watts, and at 4K it's even higher.
Thanks for the info, bro. Next, test the B580 with an i5-12400F, thanks @@TestingGames
On the Intel GPU that's not the total power draw; it consumes 185W (max).
@@rlifeh Can you say it in other words? I don't understand what you mean.
Well, in the UK it's 250 vs 250, but I would still 100% buy the 4060 due to it having better ray tracing, better frame generation, and the use of DLSS and Nvidia Reflex. Most people would agree with me, and that's saying something.
But 8GB is dead on arrival with RT-only games like Indiana Jones. And more games will shift to RT-only.
@UmmerFarooq-wx4yo Nvidia obviously doesn't know that and will make the same mistake with the 5060, most likely at $300-350. But like I said, it'll still sell out, and everyone buys the 4060 (almost the most popular GPU on Steam, for example) because of Nvidia features.
What if you wait for Intel prices to come down?!
Intel needs to release a competitor for the 4080 and 4070 next. That will surely end Nvidia.
AMD video cards have been competing with Nvidia for decades, often on equal terms, even in the top segment. But yes, OF COURSE, Intel video cards will ruin Nvidia now.
@@allex_lyric AMD GPUs have always sucked in general, minus gems like the RX 580 8GB version.
@@saricubra2867 "AMD sucks" is just your emotion. In 90% of cases AMD video cards provide the same performance as Nvidia video cards. You may not like AMD as a company, but you can't deny the fact that their video cards work, and work well.
@@allex_lyric Lmao, AMD has the worst RT performance and feature set, and the drivers are way worse than NVIDIA's. They ruined Radeon after they bought ATI.
Even AMD-sponsored games run worse on their own hardware.
@@allex_lyric The Intel B580 is just an A770 with 12GB.
A B770 will be on par with a 4070 at best.
Intel is using the same strategy AMD did for the RX 400-500 series.
Intel will be left a gen behind in fps.
Note: The power consumption reading for Intel's GPU doesn't reflect the card's total power usage, which misleads many people. The card's maximum power consumption is 185 watts.
That is the TDP; the power consumption reading gives actual data on how much it's using.
@@drinkwoter
Yep, and the RTX 4060's reading is TGP (Total Graphics Power), which is better.
The B580's memory clock is so low; that's the reason it consumes fewer watts. Please check.
Can't believe it, good job Intel.
You used DLSS in Cyberpunk and Silent Hill 2 which completely changes the setup and is not comparable to the Intel benchmarking.
People are surprised by this because they often forget that Intel is a HUGE tech company.
True
It's going to be bankrupt soon.
@@Electrobuzz17 Never.
They'd probably cost the same here. The 4060 is $280-290, and I expect the B580 to cost the same after taxes. Intel doesn't have support for most legacy games and, as DF said, I doubt they ever will.
Better off just going with a used/better/faster GPU and learning a little Linux and a Vulkan translation/rendering option for older AAA titles anyway.
You can't find a B580 for under $300, not to mention that hardly any of those GPUs were produced. There is not enough supply to even consider it as an option. This is a joke and just a scam to trick investors.
@@christophermullins7163 Dude, why don't you put this effort into making money??? 😂
@@christophermullins7163 Maybe things will change in a few months if we wait.
It costs 350.
Did you try any old games? Can Arc deliver smooth performance?
Yes. But at that point the CPU bottleneck would be so bad, it's better to use integrated graphics and save a lot of power.
The Intel graphics card is good; I would prefer it, as 12GB of VRAM is better than 8GB. They are the same in terms of power consumption, except that NVIDIA is said to consume less when idle. Greetings from Germany.
The incoming 5060 is only 8GB again, with a 128-bit bus and only x8 PCIe lanes 😂😂😂 lmao
Let them rot on shelves 🤣
The 5060 will be PCIe Gen 5 and also use GDDR7; I'll come back here when it releases and stomps on the B580, haha.
@@ConnorH2111 Nice joke bro
25 years ago, Nvidia's market cap was a fraction of Intel's. Now it's vice versa. Intel's stock price is the same it was 25 years ago. Intel totally disregarded the discrete GPU market and missed opportunities in both crypto mining and now AI.😢
And even after this, NGREEDIA will still release the 5060 with 8GB and 128-bit bus... 🤣
Those RTX 5060 GPUs are gonna be rotting on shelves 🗑
No, a lot of people are going to buy it, you'll see.
@@snox6469 Right, like the 4060 Ti when it released? LOL
Why do I need a card without normal RTX?
@@aviator1472 Why do you need a card without 12GB?
@laszlozsurka8991 Why do I need a card without proper RTX?
Nvidia does not actually offer its customers better stuff for cheaper prices. THANK U INTEL
Yeah :D Less power usage plus technology like DLSS + FG, and games aren't everything. The 4060 clearly beats this card even without its technology.
Bro, please test it with the R5 5600; most budget builders will pair it with the 5600.
Yeah man, I'm also planning to pair it with a 5600.
You said this on behalf of all of us 5600 users.
A budget build needs a budget CPU, not an expensive CPU.
It's a GPU test; they're using a high-end CPU so the CPU won't be the bottleneck and we can actually see the difference.
There would be no difference. He uses a 7800X3D to eliminate any CPU bottleneck, but even a 5600 isn't slow enough to be a bottleneck, as the B580/4060 aren't powerful enough.
No, most budget builders would pair it with a 12400F.
Damn. Really impressive. It will probably only get better as they improve the drivers. Great job, Intel. Might wanna do a budget build with Intel.
Very important issues: Indiana Jones runs at 60 fps compared to 90 on the 4060, and there's lower fps in Stalker 2, Silent Hill, and God of War. So it's important to understand whether these problems are caused by drivers or by something in the card's hardware.
But if the listed problems can be solved in drivers, it will be a good choice.
These games are sponsored by Nvidia, so it makes sense.
I think it is a driver issue; in Hardware Unboxed's review there were a few games with performance that didn't match what was seen elsewhere. Hopefully it's fixed in the near future.
As good as the RTX 4060, and in addition 50 dollars cheaper plus 4GB more VRAM... The Arc B580 is good...
Why is the power consumption on the B580 so low?
That's at 1080p; it will be higher at 1440p or 4K, if I'm not mistaken.
I think it's just the power draw from the board, not the GPU die itself; just a guess.
On the Intel GPU that's not the total power draw; it consumes 185W (max).
@@rlifeh Yes, you're right, but why is the power consumption so low in this video?
@@sayem1960
Maybe Intel wants it that way; we saw the same thing with some AMD cards.
Nvidia and AMD have 30+ years of experience in GPUs; Intel reached the mid-range level in only 2 years.
$250 my ass 😂😂😂
Out of stock literally everywhere, with a minimum price of $400 😆😆😆.
Paper launch.
is your ass $250? can i buy it?
Looking at the improvements from the Alchemist GPUs,
Intel should continue making GPUs.
1. Remember that the power usage you see is not measured the same way as on NV or AMD; Intel measures GPU power only.
2. Not a bad card for the price.
3. If the next-gen 5060/8600 come with only 8GB it will be interesting; if they have more, this card will be forgotten in 6 months.
I'm not only super happy to see another company entering the GPU market, it's also offering appealing products compared to the others.
Sorry, the AMD snobs who have been bashing Intel aren't eligible for this card... it's Intel.
More brands are always good. Competition is always good for the customer.
The B580 has a 190W TDP but only uses about 100W here. Something is not right.
Do you understand what TDP is?
Fr, that's why the fps is much lower than other benchmarks show.
If a card's TDP is 190W, it doesn't mean it will pull 190 watts under max load.
That's not how TDP works... 190W is when you fully load the GPU at 4K with XeSS.
Finally some competition for Nvidia. Now we need a high-end contender.
1080p is in the 4060's favor. 1440p eats it alive.
Does it? I play at 1440p in every game lol
Now turn on DLSS + FG at 1440p lol
2k24:
Intel leads in GPU fps per watt.
2k25:
The Nvidia 5060 8GB "+1%" edition costs $499.
Not bad, but still not where it should be in terms of driver and developer support. These big losses in some games are a pity...
It's $250!! While the drivers are not what I had hoped for, it's $250!! You need Resizable BAR, of course, but I honestly think that compared to the other cards that come in at $250, this thing is a steal.
If you owned an RTX 20 series card or newer, you wouldn't trade DLAA/DLSS for 10-15 more fps of shimmering images. TAA is the worst as well. So the 4060 is still the clear winner.
You are aware Intel has its own upscaler as well, XeSS. TAA is just used to ensure the game is at its native resolution when performing the benchmarks, not because that's what the B580 uses as its anti-aliasing. XeSS is actually a very good upscaler and has very similar results to DLSS.
And Intel crushes Nvidia at 1440p.
Until that crappy 8GB of VRAM bites you, which happens more and more in 2024. Can't imagine 2025...
Indiana Jones and the Great Circle - 7:05
THE RTX 4060 WINS BY 30 FPS!!!!
Similar overall, give or take a few fps.
5:04 take your placenta
Dame tu cosita
The new GPU is not good in Indiana Jones 🫨
I think there's a gap in the early drivers; everything will be fixed in time.
Nvidia-sponsored game.
That sht game
Even an RTX 4090 can't handle it at 4K.
Why couldn't this thing have come out a year earlier? Damn it all to hell.
Maybe one day we'll see an Nvidia CPU.
It's called ARM.
Good 👍
NVIDIA Tegra already exists... for example, the Nintendo Switch.
I purchased the 4070 ti in 2023 after upgrading from my faithful 2060 Super. I'm really rooting for Intel to help gamers shake things up and give us viable options ❤.
Is this the new RX 580? 🤔
Yep, I have an RX 580 and I want to upgrade.
Still stuck on an RX 580 here 😂
There are three GPUs called 580: the GTX 580, RX 580, and Arc B580.
No, it won't be. It will carve out its own niche instead.
Genuinely impressive performance per watt, but if AMD has problems there's no way I'm touching Intel, lmao. Also, idk where people are getting their prices, but you're literally comparing 400 EUR vs 300 EUR...
I don't think it makes sense to buy the new Intel graphics cards right now; you have to wait a few months for driver updates.
Power figures on the Intel side don't mirror anything reported by the likes of Gamers Nexus, Hardware Unboxed, Tom's Hardware, or any other remotely trustworthy tech outlet. I don't want to call this fake though, since the performance and everything else looks about right. Probably MSI Afterburner needs an update to report power consumption on Battlemage properly. The B580 is more of a 160W card with a wider spread of actual power consumption, as it holds the 2850MHz clock perfectly stable and doesn't fluctuate there... so it fluctuates more on power. The RTX 4060 takes about 120W +-5W at all times, but the clocks fluctuate a bit, due to it being generally limited by the designed power limit and not the clock. Two different design philosophies. Intel reminds me of the good old AMD days! ;-)
The B580 should have some OC potential left in the tank, while the RTX 4060 has basically none.
The B580 is nice, but it would surely be nicer without the need for a ReBAR-capable system, and if it showed up with fully fledged x16 PCIe lanes, not just x8.
I didn't know people would go for a high-end CPU and pair it with the most budget-friendly GPU yet 🙄
It's a GPU test. They pair it with a high-end CPU to prevent a CPU bottleneck. I'm tired of seeing this comment under every video, bro.
They are using the Ryzen 7 7800X3D as it rules out any issues of the game putting stress on the CPU, so the fps is entirely dependent on the GPU. This is the same test bench used for all graphics card tests on this channel. If they were to use a lower-end CPU, it would put stress on the CPU instead and lower the fps.
@@winyex ratio'ed
Making this segment of GPUs great. Nice stuff.
FINALLYYYYYYYYYYYYYYYYYYYYYYYY! THE VIDEO WE'VE BEEN WAITING FOR! THE CARD WE'VE BEEN WAITING FOR!
But it's already 2025; go for 2K, don't stick with Full HD anymore.
This thing is already out of stock; I really wanted that extra VRAM and low power consumption.
Please let Intel release some 5070 level cards next year to balance out Nvidia's greed.
It already does better than AMD, but it won't take customers away from Nvidia; nobody is going to buy a low-end tier from two years ago a month before being able to get a 5070...
Maybe the B780 will be like a 4070/4070S.
Even AMD can't do RT better than this; the Intel B580 is truly the king of budget. Now wait a bit for more driver support and it will be the best choice for a budget PC build.
As an AMD enthusiast, my respect for Intel has grown. At least they’re providing options for mid-budget builders. Meanwhile, NVIDIA is just being NVIDIA.
Respect++
💪💪
The big feature of the B580 is its new frame-gen tech; it's like magic, the jump in fps is mind-blowing compared to the others.
I think with some driver updates this GPU is going to hit hard. Well done, Intel.
09:11 WHAT!? I get the "ASMR Tingles" on this one! 😂
Well, the $50 more for the 4060 still saves you on the electricity bill long term. Still, these two cards trade blows in performance, but the 4060 performs better in the recent AAA games. We can see that the 12GB of VRAM isn't that powerful against the 8GB of the 4060, just like with the 3060. Pick your poison.
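For anyone who wants to run their own numbers on the electricity-bill argument, here's a minimal sketch; every input is an assumed example value (a ~20W draw difference, $0.15/kWh, 3 hours of gaming per day), not a measurement from this video:

```python
# Break-even estimate: how long a lower power draw takes to offset a $50 price gap.
# All inputs below are assumed example values, not data from this comparison.
price_gap_usd = 50          # assumed price difference between the two cards
power_diff_w = 20           # assumed average gaming power difference in watts
price_per_kwh = 0.15        # assumed electricity price in USD per kWh
hours_per_day = 3           # assumed daily gaming time

savings_per_hour = (power_diff_w / 1000) * price_per_kwh   # USD saved per gaming hour
hours_to_break_even = price_gap_usd / savings_per_hour
years = hours_to_break_even / (hours_per_day * 365)
print(f"{hours_to_break_even:.0f} gaming hours (~{years:.1f} years) to offset the price gap")
```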
Thanks for all your videos. Good job.
Honestly Intel doesn't need an ultra-high-end competitor. An appealing entry-level GPU to gain market share is all they need for a good start.
Hey, does it really pull only about 100 watts in gaming overall?
??
For anyone confused by the power draw, see the Gamers Nexus review of the B580; the short version is the 4060 uses 126W and the B580 142W under full gaming load.
I guess 190W is only reached while stressing the GPU and VRAM in a synthetic benchmark at 4K.
That's crazy. How did they achieve such an improvement all of a sudden?
12GB of VRAM is a major difference for better performance in new games.
The new budget king is here.
Us budget ballers love it 🥵