Imagine an EVGA Intel GPU; that would sell faster than any card released in the last 5 years
Yep, if they made B570/B580 EVGA cards they'd all be top sellers on Amazon.
one could only hope 🥲
damn 1st hard Rs now P Diddy
Wut
@@_Matt_Matt_365_ 8:45
lol I mean Kesha was never not trash anyways
@@_Matt_Matt_365_ he said "waking up in the morning feeling like P. Diddy."
Ah got it.. where was the hard R tho??
"Dozens of people are playing Starfield" LMAO
ngl, that caught me off guard🤣
he was so generous with that
I'm waiting for it to go on sale as cheap as Fallout 4 gets, meaning
Still garbage. Same slog, same shit. I absolutely hate it. Runs like a crusty dried turd that was shat out the ass of an Unreal 5 game... yes I know it's not UE5, but still
@@HardWhereHero Starfield wishes its engine was anywhere near UE5, instead of actually being forced to run on an engine from 1997 with a thin layer of makeup on 😂
I think this requires a new intel arc challenge
I second this!
I third
I just love the title
Intel should really try to court EVGA.
Nvidia would send the hitman 😂 but yeah, that would be nuts
Amazing idea. That would be crazy. Would make Jay's two cents cry with joy
I REALLY hope this sells extremely well. It’s the first budget gpu worth buying since the GTX 1060 and rx 580. Both nvidia and amd got complacent, Intel has a huge opportunity here to punish them for it, if they can produce enough.
Pat helped design the 486 man. Big respect.
22:50 we shall never forget Intel for finally doing the right thing and Linus for reminding us to never forget who helped steer the ship to actual fair priced graphics cards. glhf fam
p.s. i remember the 1060 6GB... good times.
I do miss the old graphics card videos, but thanks for putting in perspective why they've kind of had to go away for so long.
i built a new pc at the end of 2022 so i absolutely don't need a new pc but I want arc to win so fucking bad
amd won hearts and minds with the x3d lineup and I hope that intel can do the same or similar with arc
came out at the perfect time as I'm building a new PC for my partner
Su did an interview a week or so ago. There was a little line in it I think a lot of people missed: X3D is probably coming to their GPU lineup. Don't know how. Don't know when. But she held a lab product sample they were testing and said it had 3D V-Cache. From the sounds of it, though, it wasn't the generation which will release in a month or two but the generation after. My best guess is they are taking the new stack design, with the cache under the CPU instead of on top, and pushing it into GPUs.
Annual cycles are awful. They teach software devs to push the hardware. They create a constant sense of FOMO. They generate a ton of waste, and they put financial pressure on purchasers. I'd be happy with a 5 year cycle, with well optimised games. GPUs and CPUs were never supposed to be disposable.
5 year cycles are a little bit... much. Every 2 years would be a bit more sensible IMO.
Computer hardware has always been disposable. It's relatively inexpensive and easily replaceable (at least here in the US).
Same, every 3-5 years buy an old card and be amazed it just works!
@@maxstrong1999 except the silica sand isn't, and when it runs out that will make things really interesting
This is very closely related to console generations, and is why devs prefer to develop on consoles -- over time the hardware and the SDKs for it get consolidated, so after 1-2 years it's not a pain to develop for it.
Note that if SDKs are good enough, the difference in hardware implementations becomes less and less important.
So glad we have someone else for the GPU market creating competition.
Gotta give Intel credit for reviving the budget GPU segment. Now I hope the ARC division can make it onto Celestial and Druid and eventually start to compete up the stack!
I’ve been putting off a complete PC upgrade for a bit now; my 1660 Ti and i7 9700 had been doing good enough up until about a year ago. I plan on supporting Intel Arc for my upgrade, not only because it’s about all I can realistically afford to spend on a hobby item, but because I strongly believe in voting with your dollar, and I hope Intel keeps doing what they’ve done with Alchemist and Battlemage.
Based! I did the same when I built my first all new PC last year and jumped on alchemist
Good. BTW I would keep the i7 9700 for now and just buy the B580 (and sell the 1660 Ti). There's nothing significant to be gained from a CPU upgrade unless you're playing competitive tournaments at 1080p.
Legit Intel showing the others that there are intangible but important benefits to cards priced where people can afford them was not on my 2024 bingo card.
Time for a new Arc intel challenge
Second this
Can we get EVGA intel cards?
one (we) can only wish🥲
Linus wakes up covered in baby oil
Scalpers got all the B580s!!!! Reselling for $100 - $150 over retail!!!!!
I like how the graphics card is priced the same as LTT backpack
Intel is doing what I've been wishing AMD would (good raytracing and upscaling) to compete with NVIDIA, but is cheaper.
Mind blown
enter third user (at least on the GPU side)🙃
If you told me when I bought my first ryzen 1600 that I would be rooting for Intel in 2025 I would have probably laughed at you.
"Do you have any idea what it felt like to wake up in the morning feeling like P. Diddy" - Linus, Linus Tech Tips, 2024
I was not expecting 30 min of positivity
It's the new baseline. If your GPU is $100 more but only 10-15% better, people are gonna raise an eyebrow.
I'm thinking about putting an Arc in my NAS for Plex transcoding
well keep thinking because you can't get one
@@Elatenl sure he can, because he only needs an A310 or A380 for that. No trouble getting one of those.
@@ryanspencer6778 yeah, a B580 is way overkill for a NAS; an A380 will handle that handsomely
Idle power draw is too much. Might want to opt for an N100 or similar instead.
@@RawmanFilm How do you even _buy_ an N100? Is that even available for retail, and not just to OEMs?
Excellent title
If you’re looking for content, I'm sure a lot of people would be interested in cool homelab/tinkering stuff, especially since it can run on next to anything, meaning more people can engage with it
clickbait, it's actually 31 minutes of Linus Sebastian from Canada ranting about the company Intel.
Would you say it's intel about...Intel?
New Arc challenge when? 👀
I'm definitely going Intel with my next build. I'm hoping for a 780, but that said I don't need a higher res than 1440p for a while
Also, from 13:00 Linus is doing a great Bill Burr impression
It’s crazy reading “The Nvidia Way” by Tae Kim: Nvidia was the disruptor in the market and created the lower-end cards.
Per GN, Nvidia is notorious for spacing out their launches. I think AMD is now applying that strategy for their CPU launches. String along the announcements, dominate the airwaves by having more news items than the everything-at-once competitor.
wasn't the last decently priced midrange GPU from the RX 480-RX 580 Polaris era? but that's 2016 lol
The only thing I hope isn't the case with these cards is that they're priced this aggressively, perhaps even as loss leaders, specifically to build a large userbase and then slowly creep prices up the same way Nvidia and AMD have.
But at least there will be more competition by then, and they are certainly being competitive now.
Intel sort of needs to establish an install base so that games and graphics applications properly support their cards at launch (which is why the pricing is so competitive). Having a third (proper) GPU developer is definitely good for the market though, duopolies are not much better than monopolies at the end of the day.
They almost certainly are selling these at a loss but I hope they can mitigate the loss by clever logistics and parts sharing or economies of scale. They need to gain market- and mindshare and also really need people buying it en masse to help driver development. Now is the time to be aggressive (even at a loss), profits will come later.
My last GPU purchase was a $400 used GPU. For me, Intel was just a year or so too late. But if they can keep going, I will seriously check them out when it's time to upgrade again.
If i didn't already have an A750, i'd be gettin' a B580
How's it going so far? also, what are you running for a cpu?
@@Vtarngpb Only about a 92% compatibility rate; so far Starfield and TLoU Remaster didn't work. 12600.
1:33 But Starfield is not supported on Steam Deck… I just double checked.
They also need to get these chips into laptops
YES YOU do have power. You are not just a schmoe off the street. YOU are a multi-millionaire with PULL.
Someone forgot RX 480 was a thing. Wow
I am glad Intel has a public win, and more competition is greatly needed in the GPU market. However, it's on a more advanced node with more silicon. It makes me wonder whether, even forgetting the R&D, Intel is breaking even on the hardware at $250.
If Arc is a positive for Intel, why was Pat fired?
Because it's pretty much the only good news that's happened for Intel. Their CPUs are uncompetitive and sales are down, their 18A process is having major issues, and they're running out of money. A good midrange GPU is still only a minor win for Intel, especially because they're probably not going to make good sales on AI applications, which is where NVIDIA is now making nearly all their money.
Their CPU division pretty much collapsed and sh*t itself.
Intel's GPU division is not their moneymaker - it's the CPU. And gen 13 and 14 were totally messed up.
A friend is looking to buy a new GPU soon. She's considering the 6750 XT, a 4-year-old card. I told her if she could find a B580 for about the same price or lower (prices will change after import taxes), that's the one to get: excellent performance for the price, a very modern architecture, and it will only get better with driver updates.
So how is the B580 for video editing, Blender and streaming?
Soon at netflix: Begging is the new ranting
DO NOT START paying $400.00-plus for it; they will up the prices to $600.00-$700.00. Buy only at the best prices or do not buy, or the point of having an excellently priced card will end up costing a small fortune again.
Don't increase the price, update drivers regularly, and get enough stock to buyers. Simple.
I've just upgraded to a full AMD setup, moving on from Nvidia for the next few years, and maybe on my next upgrade I could do a full Intel build. Let's see how the next few generations work out.
The AMD 7800 XT dropped to AUD $700 for 16 GB. I looked at it and went: finally, a price I will pay for a high-end card
how is the B-series Arc on Linux?
Wendell tested that. It's passable, but still rough as of now. Also, you'd need to be on the bleeding edge kernel. So no Debian I suppose.
However, Intel does have a track record of supporting Linux, so we can expect it to get better soon.
Give it a month
fire title
Linus knows people there, not just the janitor. WOW, the episode... I can die now: the Intel fab tour in 2022. Oh yes, vids do not lie.
In 3years I'll buy an Intel card
i remember the 1060 6GB... good times.
I'm still using a 1060
Now if AMD and Intel could actually work together for once to provide meaningful support to developers, and even to the people working on engines, for actual optimization, so that we stop getting stuff like medium settings plus frame generation just to hit 60 fps at 1080p in the ~$300 GPU range. That would be a great way to finally get rid of this era of flicker, temporal smear, noise, and "upscaling looks better than native!" Nvidia wants this to justify prices and AI BS that we shouldn't need. If they actually intended DLSS to be great at increasing fps and making cheaper GPUs feel better, then they would give people the VRAM for such use.
4:00
JUST WIN!
Even if it means you're losing $200 per card sold. At 10 million units sold, you would be losing $2 billion. But hey, at least you made a dent in the market and people will know you exist!
Great advice Linus Sebastian :))))))))))))))))))))))))))))))
Fun fact: I have a 1200 gamerscore in Starfield 😮
Starfield is so badly optimized that at launch it even ran worse on AMD than Nvidia, and it was an AMD-sponsored title
I heard tik tok by kesha on the radio today and the whole line was bleeped out
People fucking hate Nvidia right now and they're not happy with AMD. Gamers are screaming for value. Give it to them and win
No more shill hats with intel gpus
I feel this GPU makes the ps5 pro look bad
we got linus ranting about intel before gta4
gta 4 💀
@alikoneko he's been doing this for a while lol
You’re about 18 years late my boy
@@bullymaguire243 it's ok I'm from the future.
Hopefully they add drivers for Call of Duty, cause that's a huge miss
You know, not every card will play EVERY game 100%, not one, EVER. Been doing this 30+ years and I have never had a card that played every game 100%. It always depends on the CPU, memory, GPU, and 20 other aspects; power, you name it, everything has a price to pay in the build. I love my Intel GPU, and I want the prices to STAY where everyone can get a great GPU. But don't give in to the GREEDY hands of companies.
HAHAHAHAHAHAHHAA 8:45
Intel, please do better and for longer… I need a replacement for my 2070 Super and I REALLY do not want to pay $1000 for it
Linus talking about their game benchmark logic, pretending like they don't just use Tomb Raider and Doom to represent games every chance they get.
Two very bog-standard, boring games. Turn on Rust or something else interesting. Just because some games can't be consistently benchmarked to 0.001% confidence doesn't mean there isn't value in showing them. omg i got 550 frames in doom and tomb raider 2016!!!1
Hard P!
6:00
Asks for Pat Gelsinger back at Intel, who was the guy who made the decisions regarding the Arc GPUs: 2 generations that failed miserably, costing way too much and not providing the required performance. It's literally the reason he got fired ...
Omg!! This guy is absolutely a buffoon ... how can someone with an IQ over 80 watch this shit???
Yup😂
I am a bit skeptical about the B580 because performance in UE5 games is bad, and it's even worse if you use 1080p + quality upscaling. It's only good at 1080p native. The more UE5 games testers had in their suite, the worse the B580 performed.
Everything performs bad in UE5
if it's just UE5 then it sounds like a UE5 issue specifically. I know it is terribly bloated and unoptimized, so it's most likely some kind of compatibility issue, not a hardware fault
I wouldn't buy a product based on a future promise, but I would guess that the performance in UE5 comes from terrible optimization; it could also be driver related, as it's super early. If the B580 performs on par with other GPUs in other applications, I could see how, after some fixes, things might get better, and hopefully one day developers will go back to optimizing games instead of brute-forcing things. Honestly I'm not happy about the movement to UE5 just because it's easy. I liked when developers would actually work within their in-house engine and develop creative things, instead of using what another company made in another engine to brute-force toward the same objective.
(what I mean by creative things is creative optimizations, e.g. light maps: basically cheap raytracing! HALF-LIFE 2 had this, it's just static!)
@@personthing7733 totally agree about UE5 and buying based on a promise, but given the performance numbers on other engines, I think it's fair to say that it's a driver and/or UE5 bug. Sure, I guess you could say that fixing a bug is a promise and not a deliverable, but I don't think that's being fair to the Battlemage team. I really hope we'll move to a point of engine devs actually optimizing instead of using upscaling as a crutch
Xe = Eksy
And REVIEWERS are partly to blame for HIGH prices. "Oh yes, this card is sooo good." Then sellers say, "well, I can sell it for $200.00 over and they will buy it anyway."
So
The most popular game on Steam, Counter-Strike, doesn't matter ... because you don't test it. Gotcha!
Very useful review of the Intel B580
Also
Can you get it anywhere at MSRP? I only saw it at like $400-500
Fourth probs
you were first my friend.... however.. i was the 4th hahah
You got first according to my iPad. About 4 seconds faster than 2nd and 12 seconds faster than me 😂❤
I like your pessimism but Merry Christmas, you’re #1 this time 😊
Ray tracing? Who the hell cares. No CUDA? Hard pass... Even with the 12 GB it's not a diffusion card... sadge...
Linus reading the market is like my old grandma reading the street signs: short-sighted... Intel got popular with this (as they burn money trying to cope with the recent CPU fiasco), and that means they are "competitive" now. The next Intel Arc card will be the last price-accessible one they release; if they're gonna be a big player in the GPU market from now on, they are gonna price their stock like the rest, not lower (like every other big tech company; when has anything ever gone down in price in the history of monopoly parties? it's always about the "big 3" in almost every main industry), and in no way or shape are Nvidia or AMD gonna lower their prices because "wow, a new competitor". Wake up, gaming is a dead hobby for low to mid salary people. Also, Linus got really humble about being a simple youtuber, kudos man
nope im fourth hahaha
Intel may have done a good job on the graphics card, but the credit doesn't go to Pat.