Tbh the die size does not shock me. If you go by that metric, the 4060 is actually a 4050, and yet the 3060 Ti, which has similar performance to the B580, was on a 392 mm² die. Nvidia simply shifted the needle; it's that simple. In all fairness, the cards perform where Nvidia sets them to perform, and nobody else typically bothers with that.
Another reason they had to lower the price this much to be able to compete: even though it has more bandwidth and VRAM, it seems to perform worse than a 4060 in rendering, in Blender at least. Of course the market for gaming (and especially the AI bots) is way more profitable than 3D work, but it's a point to consider. Personally I was going to try to get a B580, but that was the main reason I now see a 4060 as, unfortunately, the only way right now to get a decent card for gaming AND rendering.
If Intel or AMD could release a well-priced equivalent for each model Nvidia produces in the current generation, that would really give Nvidia a run for their money. Something that really needs to happen, too.
I think a lot of people are on copium thinking that the B580 is making money and blaming Nvidia for having a large margin. Well, you don't have to compare it to Nvidia; compare it to the RX 6600. The RX 6600 is a lower-bin chip, it is on TSMC 7nm, has a 237 mm² die and 8 GB of VRAM, and right now it is being sold at around $200. The B580 is on TSMC 5nm, has a 272 mm² die and 12 GB of VRAM, and it is being sold at $250. From the die size alone it should be a lot more expensive; 5nm is expensive. And this is using TSMC and not Intel's own foundry, so you can't expect Intel to get a foundry discount. Also remember that we are comparing the lower-bin Navi 23 against what is probably the full G21. I think at best the product has a very tiny margin, so overall it is a loss if you consider the R&D cost. At worst Intel is selling the B580 at a loss. Whatever it is, I don't think Intel is making money from the B580, and I believe an Intel rep has said as much.

For me, I don't mind Intel selling this at a loss. The main issue I have is that AMD and Nvidia probably don't want to sell their GPUs at a loss. For example, AMD might have a card in a similar performance tier to the B580 (maybe their next GPU, the RX 9060 XT, assuming it has similar VRAM) and price it at $300 to $350 max. No doubt reviewers will portray that GPU as bad value because the B580 is only $250, but in reality, if AMD wants to actually make money, they can't charge lower than $300, which makes the comparison a bit unfair.

FYI, in Indonesia the cheapest B580, an ASRock model, is $340. You can buy an RX 7600 XT 16GB at $315 and an RX 7600 8GB at $265. As far as I'm concerned, $250 is a fake price. Probably because production is so low and they prioritized the big markets, only a few units came to my country, so it became expensive. FYI, in the US you can buy the RX 7600 at $250 and the cheapest 7600 XT is $315. Basically AMD has enough margin on the RX 7600 series that they can probably sell those GPUs in my country at a lower margin, while the B580 has no margin, so the prices actually end up higher. And yes, those prices include tax, which should be close to 20% for VAT plus import tax.
Almost everyone is in agreement that BMG is losing money. The question is whether the per-unit cost has a positive gross margin, which makes a big difference. If they're net negative gross margin per unit, every sale makes them lose more money. If they're gross positive per unit, every sale makes them lose less money.
No, compare it directly to the 4070. Everything is the same. Sure, 294 mm² vs. 272 mm², but EVERYTHING ELSE is EXACTLY THE SAME. Take the 4070 MSRP = $550 and correct for channel margin (15%) and published NVidia 2022 margin (61%) and you get $297 to build the 4070. Maybe $10 less for the B580 to correct for the very minor die size difference. So, $287 to build the B580. They lose $37 on every B580 they sell at $250. Intel has forgotten how to design VLSI. All their 2024 designs are horribly inefficient.
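As a sanity check, here is a minimal sketch that just reproduces that commenter's arithmetic; the 15% channel margin and the 61% figure are their assumptions, not confirmed BOM data, and note that dividing by 1.61 treats 61% as a markup rather than a gross margin:

```python
# Reproduces the commenter's back-calculation; the 15% channel margin and the
# 61% figure (Nvidia's published FY2022 company-wide gross margin) are their
# assumptions, not confirmed BOM data.
MSRP_4070 = 550.0
CHANNEL_MARGIN = 0.15   # assumed retailer/AIB share
NVIDIA_MARGIN = 0.61    # treated as a markup here (divide by 1.61)

implied_4070_cost = MSRP_4070 / (1 + CHANNEL_MARGIN) / (1 + NVIDIA_MARGIN)
b580_cost_estimate = implied_4070_cost - 10   # commenter's die-size adjustment
loss_per_card = b580_cost_estimate - 250      # against the $250 B580 MSRP

print(f"Implied 4070 build cost:   ${implied_4070_cost:.0f}")   # ~ $297
print(f"Estimated B580 build cost: ${b580_cost_estimate:.0f}")  # ~ $287
print(f"Implied loss per B580:     ${loss_per_card:.0f}")       # ~ $37

# Caveat: a true 61% gross margin would instead imply
# cost ~= (MSRP / 1.15) * (1 - 0.61) ~= $187, i.e. a much lower build cost.
```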
Honestly, we're still in the "beta" stage of Intel GPU development. The drivers still aren't up to par with Nvidia or AMD, nor is the feature set the GPU offers. As of right now, I think it's more about market saturation than anything; get the product into users' hands, learn as much as you possibly can from them, make a better product next time. Nvidia and AMD have a MASSIVE head start on Intel in this arena. Anything Intel can do to disrupt the market is a huge win, even if there is a bit of a loss. Intel is playing the long game on this one.
My guess is break even or slightly positive gross margins per unit, with overall BMG being a loss once fixed costs are included. And BMG will never hit the volumes to recoup the NRE, so it will be a loss. But it's all speculation
ASSUME: a TSMC N5 wafer costs $18,000 and you have 100% perfect yields (you won't). That means the 613 mm² chip for the RTX 4090 only "costs" $200 in raw silicon. The 310 mm² chip on the Navi31 GCD only "costs" $97 in raw silicon. The 276 mm² die on the B580 only "costs" $86 in raw silicon. There is a mountain of R&D that must happen before you can make even one chip of course. And a ton of expense to get a chip off the wafer and running on your machine. I simply don't see how Intel could be making money on B580.
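A minimal sketch of that back-of-the-envelope math, using a standard dies-per-wafer approximation; the $18,000 wafer price and 100% yield are the commenter's assumptions, not published figures:

```python
import math

WAFER_COST_USD = 18_000      # assumed N5 wafer price from the comment, not an official figure
WAFER_DIAMETER_MM = 300

def gross_dies_per_wafer(die_area_mm2: float) -> int:
    """Classic approximation: wafer area over die area, minus an edge-loss term."""
    radius = WAFER_DIAMETER_MM / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

for name, area in [("AD102 / RTX 4090", 613), ("Navi31 GCD", 310), ("BMG-G21 / B580", 276)]:
    dies = gross_dies_per_wafer(area)
    print(f"{name}: ~{dies} dies/wafer, ~${WAFER_COST_USD / dies:.0f} raw silicon each at 100% yield")
```

Run as-is this lands near the commenter's $200 / $97 / $86 figures, which only covers raw silicon before yield loss, packaging, memory or board costs.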
Question: is die size as important for a company that has its own foundries versus companies that contract die production to another business (e.g. to TSMC)?
I don't even know why I bother posting this, but the transistor density of a 4070 is DOUBLE that of a B580. I don't set prices at TSMC, but I can't imagine it doesn't affect the pricing.
TSMC usually sends out dev advisors to their larger clients, such as Apple, Nvidia, the works. This is to help their more important clients work as efficiently as they can with their nodes. Intel is a competitor, so they were probably not shown this courtesy, not to mention the recent friction between them. Intel is on an expensive subfamily of nodes, without much guidance and/or benefits from that node. All things considered, even with the possibility of them sandbagging their transistor density, it isn't the worst outcome.
@@lolilolplix Transistor density usually reduces pricing, unless you're on a node specifically developed for your own product. More transistors in the same space means you can make a similar chip in a smaller area, thus saving costs, since you're paying per wafer of a node, not for the density your designs achieve.
Kicking out Pat just before Christmas speaks for itself... Not that he is to blame; Intel was already behind, having lost at least a decade thinking they would be the market leader forever. Catching up takes money, however, and he was also unfortunate that money is now also short. Intel Arc turned out to be too little, too late, but hopefully they will continue trying.
Intel stagnated because the accountants were in charge. For the first time in a long time they had a CEO with an engineering background, and he just got fired. My guess is the accountants will be put back in charge.
The logic that this card can't be profitable because it offers much of what the 4070 does at a lower price implies that the 4070 is priced with a thin margin and this undercuts it. Obviously that's not the case, so I'm not too sure I agree with the assessment.
I think the B570, with performance between the RX 6650 XT and the vanilla RX 6700 and priced at $220, might be MAKING money for Intel, and factory-overclocked models for $230-$270 with performance on top of the B580 would help make Intel and vendors more money. If I had to guess, the total card cost for the B580 is $205-$220, and the total card cost for the B570 is going to be $150-$170... so Intel and vendors are going to make more money off the B570 cards.
It's basically a 1060, which with tax was 300 bucks, and including inflation I think the margins are on the lower end, considering that they need more telemetry data to optimize their driver even further and to gain market share and reputation. If this goes on, just like with XeSS being better than FSR, then AMD really should consider becoming competitive again in that space instead of relying on being the main supplier for new-gen consoles.
Given that AMD is barely making any money in their graphics division it seems likely Intel is losing some money making graphics cards once all dev costs (silicon engineering, driver programming etc) are counted. They could still make some profit on the hardware offsetting some of the costs.
So how did you arrive at the cost price, profit margins, etc.? And how much profit should anybody be making: 10%, 20%, 30%, 100% or more? Could you state what the failure rate is when making these chips, and what they would need to do to be profitable?
I heard budget graphics cards don't make decent returns until they sell in big numbers. The cost remains more than the profit until they pass the break-even point.
It's pretty odd to me that, for Intel, there's concern that a GPU with a die size comparable to the 4070 Ti at that price isn't profitable, while at the same time people harp on Nvidia for their steep prices. You really can't have it both ways, I'm afraid: either Intel is losing money on their GPUs, in which case Nvidia's pricing (and AMD's; they're not better there, sorry) is A-okay, or Intel is making money and Nvidia's pricing is over the top.
I think that Intel is making money per unit but has no chance to actually recoup any R&D costs. For NVIDIA and AMD that would be counted as losing money. If they priced it accordingly, they would not get any market share, would not recoup R&D, and would lose even more money. They took the opportunity to gain market share, as that is the best outcome, so they have to keep the price low. I think the big question is: is Intel going to have money to fund R&D for the next generation of Arc cards, given that they are not recouping the "investment" on the current gen?
What these guys don't mention, which absolutely plays into the cost, is that Intel is rebounding off an awful CPU launch that didn't go how they planned. So yes, I think the $250+ price point is a way for them to take the spotlight off the CPUs and put it back on Intel, casting the company in a better light. BTW, I picked up an ASRock Steel Legend B580 on the 13th of December, and it's a fantastic card for the money. Well done Intel!
I don't think there's any realistic chance they are selling those cards at a loss; that just doesn't make any sense. It's not like there isn't a huge amount of wiggle room to compete against their competitors' margins; hell, AMD is usually operating at a profit margin of around 50% and Nvidia far higher than that. There may have been talk of Intel losing money on GPUs, but that almost certainly refers to the Arc division as a whole losing money, which means including all of their R&D costs (which will be pretty enormous).
Nvidia resale prices will be reduced by resellers by an amount that Intel cannot beat anyway. It is a shame to root out ANY competition. The prices are already coming down for most cards JUST TO SELL them off, and they STILL make a killing on them. WAKE up, people.
They are not overpricing. But the "sponsored cards" had to run dry at some point. Now you can buy them at the actual price, not at the fake MSRP. You fell for Intel's lie, because Intel is the scalper here, not the resellers.
Don't blame Intel for scalpers. You can always pre-order the Sparkle B580 for £259.99 on OCUK. Also in stock on CCL for £299.99. That's overpriced but still way short of £350.
In order for Intel to make money, they need to make back the money they've put into developing them... They are nowhere near that point. They would need to sell an eye-watering number of $250 GPUs to start making money on this generation. It's not going to happen. Either they become a successful GPU manufacturer, or the tens of billions already spent go up in smoke.
Unfortunately, when a company establishes a benchmark of profit, they always have to surpass it. It cannot be less. For NO other reason than to appease shareholders. It is a natural consequence of capitalism and the reason prices will rise unsustainably until collapse. At the cost of consumers.
Wafer cost: 10k. A 75% yield is what TSMC claims for the 6nm node (which they call 5nm). Die size, some maths done ✅ (want my spreadsheet?). Jesus, this is a disappointing video. Even if it really is a 5nm wafer, the Intel margins are still 15-20%, though I doubt Intel pays that... but let's say they do. They ARE NOT selling at a loss.
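A quick sketch of that per-die math, plugging in the commenter's assumed $10k wafer price and 75% yield (neither figure is an official TSMC number):

```python
import math

WAFER_COST_USD = 10_000   # commenter's assumed wafer price, not an official figure
ASSUMED_YIELD = 0.75      # commenter's assumed yield
DIE_AREA_MM2 = 272        # BMG-G21 die area per TechPowerUp
WAFER_DIAMETER_MM = 300

radius = WAFER_DIAMETER_MM / 2
gross_dies = int(math.pi * radius**2 / DIE_AREA_MM2
                 - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * DIE_AREA_MM2))
good_dies = int(gross_dies * ASSUMED_YIELD)

print(f"~{gross_dies} gross dies, ~{good_dies} good dies, "
      f"~${WAFER_COST_USD / good_dies:.0f} per good die")
```

Under those assumptions it comes out to a few hundred good dies per wafer and a per-die cost well under the $250 MSRP, which is the point the comment is making; the real number depends on the actual wafer price and yield, which aren't public.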
No, they don't lose money, because what would be the point of releasing them otherwise? Production of those cards runs around $60-90. Intel is actually only still at a loss until they start to cover the investment in research and similar things.
I'm buying a B580 as soon as the less expensive ones are back in stock. I could get one for 350 right now (which is still a decent deal IMO), but I'll wait. After that, I'm buying an AMD GPU. Nvidia has gone nuts with their pricing and specs. "Less is the new more" - Nvidia.
The B580 is not on a more advanced node; it's on N6, like the generation prior to it, Alchemist, while the entire RTX 4000 series is on N4X and RDNA3 is on N5 (sans the 7600/7600XT).
I do not think Intel is losing money with their Arc B580 at all. Maybe they are not making that much of a profit, but damn sure Nvidia and AMD are overpricing their GPUs, that's a guarantee... and I am not an Intel supporter; my PC and laptop are both Nvidia and AMD.
I can easily justify that they are losing money. The MSRP of the 4070 is $550. The channel margin is 15% (published when EVGA left the market). The NVidia 2022 margin is 61% (published by NVidia in 2022). So $550 / 1.15 / 1.61 = $297. So the BOM costs $297 for the 4070. Now compare the 4070 to the B580. All the parameters are virtually the same: 5nm, TSMC, same power, same cooling, same GDDR6 - EXACTLY. The B580 is a tiny bit smaller (272 mm² vs. 294 mm²), so a TINY bit cheaper to make (maybe $10 cheaper). Therefore, the cost of making a B580 = $297 - $10 = $287. Intel is losing $37 on every card they make. EVERY SINGLE ONE.
Pat Gelsinger was fired THE DAY BEFORE B580 WAS ANNOUNCED. Why? Because he was launching products that lose money, just to keep his promises to the board and keep his job. This clown was a useless CEO. The board finally realized it after Gelsinger had made his 12th mistake as Intel CEO !!
There is only ONE type of market share you gain by selling GPUs below cost. MARKET SHARE AMONG BANKRUPTCY LAWYERS. You will definitely need them if you give away free money too often, like Intel has been doing with Arc & Battlemage !!
No, they paid TSMC to produce their cards. They were using TSMC's nodes because Intel didn't have the fab space, since it was being used to produce Arrow Lake. There was even news of them losing their 40% discount because of what Pat G. said about TSMC vs the USA.
Intel is for sure losing money just based on R&D. At best they're barely breaking even on build cost, but take into account staff and R&D and it's a massive billion-dollar loser.
Maybe, just maybe, Nvidia sells their cards at a huge profit?
'muh infwation, guys'
You're saying that Nvidia, the company that not only greatly increased prices but also significantly lowered the specs of each card tier? Nah, they wouldn't do that to their consumers, it must be inflation!!!
Massive, massive profit margin for Nvidia. It doesn't cost anywhere near $1600 to make a 4090; it probably doesn't even cost half that. Margins will be lower for cheaper cards, but still crazy. AMD follows by pricing at the highest margin they can get away with. The 7900 GRE selling for $600, presumably still at a profit, while the 7900 XTX using the same chip sells for $1000, is proof of that.
Nvidia and AMD have gotten too happy with price gouging after the two crypto booms and covid. A top end card could totally still be $700-$800 like they used to be (adjusted for inflation). AMD and Nvidia just wouldn't be making their 50-70%+ margins like they've gotten used to.
@@ryanspencer6778- Nvidia gets those 50%+ margins, not AMD.
@@ezg8448 AMD has profit margins of around 50%, Nvidia's profit margins are way higher
Intel has already admitted that Battlemage isn't about making money; it's about laying the foundation for future Arc graphics cards to be profitable.
and then fired the CEO that was the champion of the project?
ARC is no more after this, back to focusing on APUs
Unfortunately a business can't sustain itself like that, so I hope it works out for them. Eventually they will need to raise prices or simply make very few of them to take less of a dent. On top of that, most people who are happy about the price of low-end GPUs tend to be people who aren't going to personally buy them, so it only generates hype and good perception, meaning Intel probably shouldn't even produce many $250 GPUs.
@@TJones4L What's the official source for this information? According to Intel, Celestial has already been planned out.
@@Mcnoobletintel use this contra revenue strategy with their mobile SoC before and it end up really bad for them. It is the root that caused their current crisis.
Hardware Unboxed debunked this already. They are making a profit on the card but losing money on the R&D; hopefully in time they will claw it back if sales continue upwards.
The R&D effort so far has been massive and there's still a lot to do; there is no reality where they could recoup all the sunk cost in just two generations.
@@XGD5layer I think that's what Steve & Tim said.
@@JohnDavidSullivan I didn't notice that, maybe this video format doesn't really work for me
@@XGD5layer I agree. If I was Intel, I'd keep them out of stock and take the free hype and noise, then release the higher end and get some profits back.
@@Mcnooblet That's not what I said. Some generations down the line maybe, but not now when they need to build market share and recognition.
R&D costs are insane; they need economies of scale in order to profit. They lost TSMC's large-order discount, which makes it worse for Intel when the competition has it. They need to scale up in order to profit. It's business 101.
There is no large-order discount. Not even Apple gets it anymore. TSMC said before in an official statement that they no longer give such discounts. And those discounts were not big anyway; they were in the range of 1 to 3 percent. That's why the story of Intel getting a 40% discount before is not true at all.
@@arenzricodexd4409 1 to 3% is a lot. People talking about a 40% discount failed business school. In the corporate world, a percent could be a few million.
On a pure component-cost basis they may be making money, but we have to factor in the thousands of salaries required to keep their GPU product division operational.
Their GPU division is tiny.
The bright spot there for them is that R&D and COGS are treated differently for P&L purposes.
Considering AMD is making 45% profit, and nVidia is making 70%, I seriously doubt Intel is losing money on units sold.
Where did you get those profit metrics? 😂
@@willwunsche6940 from the companies' public disclosures every quarter laugh laugh
@@willwunsche6940 yeah it’s definitely not that high, and it also depends on which gpu
@@akathenugget it doesn't depend on the gpu, these companies announce their profit margins company-wide.
@@akathenugget the latest figures are 50% for AMD and 75% for nVidia, so yeah, it's definitely that high.
B580 is likely making some money.
Comparing it to the insane profits of Nvidia isn't a good comparison.
With Intel you pay for the hardware; with Nvidia you pay top dollar for the software and brand recognition.
Well, with the Founders Editions, Nvidia also took a page out of Apple's playbook. So with that Top Dollar™ you are also buying _aEsThEtIc sTyLe_
A leaker has already talked about this and has an Intel source who said the GPU is being sold at a loss. Multiple retailers also told him they have no stock of the GPU and no idea if or when Intel will send more. Odds are Intel is only paper-launching the B580, with single-digit units being sent to a small number of retailers to keep up appearances until they can get back on track with something profitable.
@@TRINVIDEO This would make the most sense to do, and if it is the case, how can this be proclaimed a victory for gamers beyond a surface-level presentation?
It probably is, it's just that the margin is pretty slim. Normally, since GPU development is expensive, you need the margins to cover development.
Intel probably is not going to recoup R&D but will probably sell at a profit, and that is all they could hope for.
Still, this situation is not sustainable. Intel needs to narrow the gap significantly to be in a good position here, or maybe even use its own nodes. Given the jump from their first to second gen, they still have a very long way to go; I personally have my doubts they will make it.
No, the calculation would make sense if the 4070's price made sense. But it does not.
The prices of those cards exploded during a time of limited chip supply and extra-high demand for GPUs because of crypto mining.
Chip production is now fine again and the crypto-mining hype is over.
But prices for those cards have only gone up, with inflation cited as the reason. That means they are extremely overpriced.
On the contrary, the prices of the current-gen RTX 4000 cards have nothing to do with inflation.
They arrived 3 years ago, BEFORE the inflation explosion.
If anything their prices are about the same or lower than when they were released, which is very, very surprising to me.
@GameslordXY By inflation I was referring to the expected prices for the 5000 series, as that is what the B580 is really competing against.
Even at a 100% profit margin, which it is not, the RTX 4070 should cost $275 to produce, meaning that Intel is selling a chip from a more expensive node, at a bigger size, for a lower price than even NVIDIA's break-even; there's no way they are making a profit. Battlemage was doomed from the beginning: too expensive to produce for a given performance level, and the only chance to gain traction is to sell it at break-even or at a loss just to get a good review.
I wouldn't be surprised if the B580 becomes unavailable for the rest of its life span and ends up being the highest-end GPU of the generation, though leakers are unsure whether the B770 is cancelled or not.
Chip supply is still limited, and it's even worse due to AI & ML. The US-China chip war never ended and will get even worse with protectionist policy under Trump, with tariffs and bans. Plus crypto still exists; not like before, but it's still big in the US and China. Inflation is still happening. And that's just the political and economic reasoning, without even tapping into how Moore's law is nearing its end, which will gradually raise chip prices while delivering smaller gains in performance per watt.
@Javier64691 I don't know if the material + build costs are really that high.
If you add in R&D costs, sure, then I agree the cost gets much higher than the raw material and build cost, even more so for Intel, as this is a brand-new product line for them.
But that cost can be spread out over multiple generations.
Maybe looking at a PS5 gives a good idea of the real cost. The PS5 has not been sold at a loss since mid-2021. (Before that it was.)
So the build cost for a PS5 is not more than $500.
That includes a GPU that is most similar to the RTX 3070. And the build cost for a 3070 is most likely similar to a 4070.
Then there are all the other components in the PS5.
Given how far behind Intel is in establishing themselves in the GPU market, I can see the logic behind selling cards at a small loss in order to establish themselves.
It also gives more chances to get user feedback on drivers
The issue is that Intel is struggling at the moment in general, and I can see the shareholders getting twitchy, shall we say. I mean, Pat was already (allegedly) forced into retirement due to Intel's poor financial performance. I can't imagine those same shareholders being happy about them selling these cards at a loss (if they are truly selling them at a loss).
They had ZERO choice. They've already jebaited you all. WHERE CAN YOU BUY A B580 for $249 or $269? WHERE?
@@Machistmo $249 and $269 is pre-tax. I'm finding the $269 one for 270€ with the VAT removed in the Finnish store. It's only $10 more than the advertised USD price, and that's after imports.
Maybe a few years ago when their cpu division also wasn't collapsing
What I love about the Arc cards is that they're not just punching up with raw performance like AMD (which does actually make amazing cards for COMPETITIVE GAMES); they're actually competing in the *premium* card segment with very good ray tracing performance and upscaling tech. It actually competes with (and wins against) the 4060 and 4060 Ti, not just in raster.
LOL clueless cope
294 mm² isn't a massive die in any shape or form. The RTX 2060 had a die size of 445 mm²; it's just that Nvidia started selling small dies as xx70 cards because why not, they can. After the crypto boom and now the AI boom, people are willing to pay. It's a legal scam. The RTX 3080 Ti I have has a 628 mm² die; that's a big die.
That's mostly because of Samsung wafers being cheaper. Older cards on TSMC nodes like the 1660 Super or the 960 series had much smaller dies. The 1660 Super's die is about the same size as the B580's, while the 960's is even smaller. On top of that, TSMC prices are way higher than before: the 1660 Super is on 12nm, and wafer prices from 12nm to 5nm are like 4x higher, maybe even more.
It's almost as big as the GPU used in the 4070 Ti. That's absolutely enormous for a GPU die on a 5nm node, released two years AFTER the 40 series and RX 7000 series, and it's still slower than an RTX 3070 or 6750 XT, which are both nearly two generations old now.
It's a good graphics card for the price... IF you can find one for near MSRP, at least for now.
Are you seriously comparing 2060 ti wafer cost to TSMC's newer nodes wafer cost?
Nvidia's architecture is also good enough that they can get away with a smaller die for the performance target.
Absolutely bizarre that people care more about the die size they're paying for than they do about the actual performance of their cards. My 4070 Ti Super has a die size that's roughly half that of my 3080 while delivering far more performance at the same price point. I guess by YouTube comment logic I should feel scammed, but I also follow lithography and know newer processes are getting far more expensive per square mm. VRAM is more expensive, but people want it doubled for the same price. Capacitors are more expensive, but they want the same price, etc, etc, etc.
I'm happy Intel is adding competition, but at the same time we will never see a $500 flagship GPU again, and it's not just due to greed.
I don't believe they are losing money on each unit, because that would be dumb. However, they may be losing money on Arc as a whole (or they have not yet made back the money on their investment), because R&D is very expensive.
Or maybe Nvidia is charging a lot of money.
Are we assuming that Nvidia’s total net profit margin of 55% is anything other than exploitative?
Nvidia's high margins come mostly from data centre revenue, not dGPUs.
Die size isn't cost (per chip); you have to factor in yield rate and overall process cost as well. And even packaging contributes a bit, especially the "advanced" types. Right?
Yes, but as far as we know the yields are high enough on those GPU nodes so we can estimate comparative costs directly.
And packaging is also similar.
If we tried to compare the 7900 GRE with its chiplets against the 4070 Ti it would be a lot harder, but the 7600, 4060 and B580 are all monolithic with similar enough RAM topologies.
I'm still hoping for competition, and for Intel making enough money on each GPU to keep making them, but availability and price hikes are not painting a bright picture.
It's obvious that Intel is paying a huge upfront cost to try and build up a good reputation and market share. Having market share is so crucial in the GPU space since that means developers of games and game engines are going to prioritize even further optimization and implementing features that are specific to your hardware. For example the push for raytracing by NVIDIA was enabled by their market leadership, and that further cemented their position at the top since they now had a huge feature that AMD didn't, and they could also charge a premium for that feature.
They also probably want to reach the status that NVIDIA has, where people start buying their brand of GPU and stay in their ecosystem forever because it's what they're familiar with.
What now? This is all like pie in the sky
People often overestimate the cost of fabbing chips. At a defect density of 0.1/cm², a 300mm wafer can yield 150+ good 280 mm² dies, more if they're not fully enabled.
It’s possible that each die costs around $100. The final margin would be pretty thin but they won’t lose money.
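A minimal sketch of how that defect-density figure translates into yielded dies, using the simple Poisson yield model; the 0.1/cm² defect density and 280 mm² die area are the commenter's assumptions:

```python
import math

DEFECT_DENSITY_PER_CM2 = 0.1   # commenter's assumed defect density
DIE_AREA_MM2 = 280             # commenter's assumed die area
WAFER_DIAMETER_MM = 300

# Poisson yield model: probability a die has zero defects.
die_area_cm2 = DIE_AREA_MM2 / 100
yield_fraction = math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_cm2)

radius = WAFER_DIAMETER_MM / 2
gross_dies = int(math.pi * radius**2 / DIE_AREA_MM2
                 - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * DIE_AREA_MM2))
good_dies = int(gross_dies * yield_fraction)

print(f"Yield ~{yield_fraction:.0%}, ~{gross_dies} gross dies, ~{good_dies} good dies per wafer")
```

Under those assumptions the yield lands around three quarters and the good-die count in the 150-160 range, which is consistent with the comment's "150+ good die" claim.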
It's not impossible that they have a 25 buck margin.
Thin but enough to tide them over for now
At least Celestial shouldn't need much money for R&D, if Intel's claims about that architecture being "already done" are to be believed. So they'll just need money to pay for TSMC's services, printing and packaging, and then deliver the GPU dies to their partners (who knows if Intel subsidizes part of their partners' costs or whatever).
They'd be losing even more money on every unit not sold if it had flopped
The problem is that this effort is funded by CPUs and their CPUs are on shaky ground. Worst time to sink money in this for Intel.
I say it's the best time. Their CPUs are stagnant. Intel not seeing the demand for Arc is their biggest fault. Alchemist's biggest fault, apart from drivers, is power consumption, especially at idle.
Their CPUs are only shaky in the gaming sector. Most average office PCs are equipped with Intel (albeit lower-end Intel), and those users aren't running into issues and fleeing to AMD to save them. It's never a bad time to compete with Nvidia on pricing, especially when Intel is being realistic with their GPU costs.
Nvidia fanboys would rather pay $1500 for 8GB gpus
Yeah, they will do it for sure.
I'm a long-time Nvidia user, and their video cards have always been great, but you know what? I ordered a B580 because the Nvidia 4000 series, and the 5000 series too, are and will be overpriced jokes.
I've seen this argument before in other industries. It usually starts as a bad faith argument with intent to mislead, and various people will fall for it. It usually comes up with products that have low volume, where the low volume cannot make up for overall cost of development, but then the bad faith people twist that into losing money on every one sold, because overall expenses are currently more than revenues from current sales. But this isn't an unusual situation, it's how all products start. The solution is to increase volume and continue to make production more efficient over time. You can feel the bad faith because it would have been a much stronger argument with the A series, but the B580 is actually competitive, so now haters have to step up their game.
It's more that AMD and Nvidia are overcharging; there is a good margin on those cards.
People just got used to paying more for less; we see this in game pricing as well.
The crypto bubble drove prices up and people accepted it. This is what happens when consumers say yes when they should be saying no.
Intel is definitely not losing money if you only look at the per-unit BOM cost. That would be insane. They may have underpriced it, since I just noticed that Newegg had a small batch go on sale today and sell out at $299 ($50 over MSRP). Intel could have easily priced this at $279 and made $20 more per unit. Now it's Newegg making an extra $50 per unit.
This is not true; it has been debunked already. They are profiting off each card sold, but their entire Arc division is losing money thanks to development and research costs.
They don't lose money on manufacturing. Nvidia has like a 70% or higher margin on their GPUs; their $500 GPU costs like $150 or less to make. They may lose money on R&D if they don't make enough and don't gain market share.
The 4070 Ti has a die size that belongs to a 60-tier Nvidia card. The 4070 is a rip-off, and the 4060 8GB should cost the same as the B580.
Maybe Nvidia is not able to digest the Arc B580's success.
Nvidia doesn't want the tide to shift in the under-$300 GPU market; that's why they're spreading these concerns and doubts.
Intel is losing money, but not on the raw material cost per board. Most likely it's when R&D + marketing is added that the losses occur.
Not to mention the overall TSMC price rises and the loss of TSMC's large-order discount.
@@MarkedStriker No, they are losing $37 on every board they sell, look at my calculation in another note a few minutes ago ...
Why would standard N5 used by BMG be more expensive than Nvidia's custom "4N" node from TSMC???
Amazing video, thank you Digital Foundry, happy new year everyone! :D
XeSS with that AI branch in the latest DLL is really helping as well.
where have B580's been in stock?
I love my B580 Steel Legend. Skipped Alchemist since it was their first gen, but I knew I was def getting Battlemage. Now I have it and I'm very impressed by this card so far.
You don't know if they are selling the cards at a loss. Hardware Unboxed just made that point, but I've been making it for a while, since "that other" channel has been bringing up die size left and right. Second, most consumers won't care about how much profit a company makes, or how they get to a level of performance, if the value is good. Expecting Intel to have a 60% margin at this point isn't reasonable.

I wrote a similar comment about this 2 weeks ago. Most of their costs are R&D, and they are doing that work anyway for AI. They just need to make a small profit, and they probably are. They've threaded the needle with the RAM and in some of the manufacturing. Even if they make 20 dollars a card, if they can sell millions that isn't nothing. They are not stocked because they probably didn't think, or know, it would be this popular, and it's the holidays. It's hard to immediately restock and fill all of the demand at once without a lot of risk. Why not bring this up as well? It seems more nuanced than saying it's a big die, so it must not be profitable. No offence by the way, but I've been hearing this from others too, and I keep effectively writing this comment. I'm glad Hardware Unboxed finally said something.
By the way, maybe they are losing money on the margin. I'm not saying that I know. But I don't understand why we're reporting on things we don't know. Your guess doesn't mean much.
Even the comments in here saying people are wrong and that they are losing money: I say back to them, prove it, and based on what? If you bring R&D into it, it's not relevant; a lot of those costs, even the software, are ones they are incurring anyway. It's not that simple.
Die:
- Model: BMG-G21
- Process: TSMC N5
- Die Size: 272 mm²
- Transistors: 19.6 billion
- Launch Date: December 12, 2024
- Source: TechPowerUp
My guess: 50-100 dollars. Can you get this price exactly? That would be more effective than continuing to talk around it.
Memory
- Type: GDDR6
- Capacity: 12 GB
- Speed: 19 Gbps
- Bus Width: 192-bit
- Bandwidth: 456 GB/s
- Source: Tom's Hardware
Price: $60-80
PCB
- Includes: Printed circuit board, power delivery components, capacitors, resistors, and other necessary electronics.
Price: $30-50
Cooling
- Details: Dual-fan cooling system with a heatsink to manage the 190W TDP.
Price: $30
Labor
- Details: Labor and equipment costs associated with assembling the card and ensuring quality control.
Price: $10-20
Packaging
- Includes: Retail box, manuals, cables, and other in-box accessories.
Price: $10
Total, "about": $175 to $300
I don't know if they are getting discounts on anything. Where they are the least competitive though is accounted for in the 50-100 dollars (the die cost) we keep bringing that up anyway. They are only about 25 dollars off say nvidia or amd on costs there, and nvidia is using an even more expensive node which nobody brings up (n4). It's just one part of the GPU. So we keep talking about how dire this is; it's not great that it's big, but it's only one piece of the costs.
If you want to bring in driver development and R&D, then sure, yes, it's not profitable. I don't think they are doing it for this GPU alone, though. They are doing it for laptops, desktops, AI, and to stay relevant. How is Intel expected to ignore AI at this point?
I've put more thought into this YouTube comment than there is in the entire video.
-15% margin, by my calculations. They lose $37 on every $250 card they sell ..
I love what Intel is doing with their ARC GPUs. I don't want their dedicated GPU division to close because of any reason
It's nearly impossible to lose money per sold chip. The actual silicon is cheap, but the engineering cost is huge.
I hope Intel's sales hold up for the rest of the generation.
I guess you don't know that batching out silicon at scale is very cheap per unit; it's the other costs that drive the final price most, not the die size itself.
For the people here in the comment section who are just guessing "No, Intel is not selling at a loss"...
The die is, according to TechPowerUp, 272 mm². For a 5nm die, that would mean the chip alone costs around $130. We still have to add costs for tooling, chassis, VRAM, board, power delivery, etc. We haven't even mentioned the "other stuff" (like engineers, labor, packaging, logistics, etc.). TSMC has also announced that they will raise wafer costs in 2025 by up to 10%. And now tell me again that Intel is not selling at a loss... (at the price point of $250).
Even if they're losing money on each unit sold, I'd say they're probably happy to see a noticeable increase in market share that has seemed impossible to take away from Nvidia. Each Arc B580 sold is a foot in the door to selling a more profitable GPU down the road.
B770 is definitely that, but I don't expect them to make a profit for a while from that R&D cost.
Their operating income in 2023 was $93 million. Their operating margin for 2024 was 1.4%. They are losing money.
They are not happy; they have some hard decisions ahead of them about what to cut and what to sell.
I'm assembling a multi-GPU machine using Intel Arc A770s and it's a pain, but they're putting a lot of effort into their extensions for popular frameworks.
If Intel wasn't being aggressive they could have just put less memory in the GPU, but they actually have a really generous amount for a card of this class.
I think it's more [they couldn't get 4070 performance, so they didn't charge a 4070 price] than anything else.
This is the biggest battle for any PC builder this generation. Nvidia got extremely greedy, to the point that no PC builder wants to use their cards; they are forced to. AMD always matches their pricing to Nvidia's, just lower because they can't match Nvidia's performance. Intel seems to be the new kid on the block. With a budget price but mid-range performance (and a proper VRAM size) they are the best choice currently. And a hope for the future, with very large VRAM that can power 4K gaming (or 1080p 240/480 fps in some games) that isn't just for the deepest wallets.
Too bad the B580 is nowhere to be found. Every new delivery sells out fast. I haven't managed to buy one yet.
I am betting the "deliveries" are rather small. Sure there is demand but I feel this is a supply issue, and perhaps a purposeful one.
Supply is tiny. My local Microcenter didn't have any on launch day; there were 5 the following week but they sold instantly. They then had the Sparkle model the week after that at $270, but only 3 in stock, which sold instantly. On Newegg you see them at $400+ from price-gouging third-party sellers. They may have better supply of the B570 if bad B580s are being repackaged. The main thing the B580 is doing at Microcenter, I think, is that most people who can't get one will end up getting an in-stock $249 3060 12GB or a $300 4060 8GB. The RX 7600 is still a bad card at $269.
Tbh the die size does not shock me. If you go by that metric we know the 4060 is actually a 4050, and yet the 3060 Ti, which has similar performance to the B580, was on a 392 mm² die. Nvidia simply shifted the needle; it's that simple, because in all fairness the cards perform where Nvidia sets them to perform, and no one else typically bothers with that.
Considering where the company as a whole is at right now, I don't know if they can afford to sell products at a loss.
Why was Pat Gelsinger (the project's champion) just fired?
Another reason they had to lower the price this much to be able to compete is that, even though it has higher bandwidth and more VRAM, it seems to perform worse than a 4060 in rendering, in Blender at least. Of course the market for gaming, and especially the AI bots, is way more profitable than 3D work, but it's a point to consider. Personally I was going to try to get a B580, but that was the main reason I see getting a 4060 as, unfortunately, the only way right now to get a decent card for gaming AND rendering.
Could just be the drivers; they are focusing on improving their game drivers quickly.
If Intel or AMD could release a well-priced equivalent for each model Nvidia produces in the current generation, that would really give Nvidia a run for their money. Something that really needs to happen, too.
It costs 440 euros on Amazon here in Europe. It's cheaper to buy the Nvidia or the AMD.
I think a lot of people are on copium thinking that B580 is making money and blaming Nvidia for having a large margin. Well, you don't have to compare it to Nvidia, compare it to RX 6600. RX 6600 is a lower bin chip, it is on TSMC 7nm and has 237mm2 die size and has 8GB VRAM. Right now it is being sold at around $200. B580 is on TSMC 5nm and has 272mm2 die size and 12GB VRAM. It is being sold at $250. Already from the die size itself it should be a lot more expensive. 5nm is expensive. And this is using TSMC and not Intel own foundry, thus you can't expect Intel to get that foundry discount. Also remember that we are comparing the lower bin Navi23 vs probably the full G21.
I think at best the product has a very tiny margin thus overall it is a loss if you consider the R&D cost. At worst Intel is selling B580 at a loss. Whatever it is, I don't think Intel is making money from B580 and I believe Intel rep has said that much.
For me, I don't mind Intel selling this at a loss. The main issue I have is that AMD and Nvidia probably don't want to sell their GPUs at a loss, so, for example, AMD might have a card in a similar performance tier to the B580 (maybe their next GPU, the RX 9060 XT, assuming it has similar VRAM) and price it at $300 to $350 max. No doubt reviewers will portray that GPU as bad value because the B580 is only $250, but in reality, if AMD wants to actually make money, they can't charge lower than $300, which obviously makes the review a bit unfair.
FYI, in Indonesia the cheapest B580, an ASRock model, is $340. You can buy an RX 7600 XT 16GB at $315 and an RX 7600 8GB at $265. As far as I'm concerned, $250 is a fake price. Probably because production is so low and they prioritized the big markets, only a few came to my country, so it became expensive. FYI, in the US you can buy the RX 7600 at $250 and the cheapest 7600 XT is $315. Basically, AMD has enough margin on the RX 7600 series that they can probably sell those GPUs in my country at a lower margin, while the B580 simply has no margin, so the prices actually end up higher. And yes, those prices include tax, which should be close to 20% for VAT plus import duty.
Most everyone is in agreement that BMG is losing money.
The question is whether the per-unit cost has a positive gross margin, which makes a big difference. If they're net negative gross margin per unit, every sale makes them lose more money. If they're gross positive per unit, every sale makes them lose less money.
No, compare it directly to the 4070. Everything is the same. Sure, 294 mm² vs. 272 mm², but EVERYTHING ELSE is EXACTLY THE SAME. Take the 4070 MSRP of $550 and correct for channel margin (15%) and Nvidia's published 2022 margin (61%) and you get $297 to build the 4070. Maybe $10 less for the B580 to correct for the very minor die size difference. So, $287 to build the B580. They lose $37 on every B580 they sell at $250. Intel has forgotten how to design VLSI. All their 2024 designs are horribly inefficient.
Honestly, we're still in the "beta" stage of Intel GPU development. The drivers still aren't up to par with Nvidia or AMD, nor is the feature set the GPU offers. As of right now, I think it's more about market saturation than anything; get the product into user's hands, learn as much as you possibly can from them, make a better product next time. Nvidia and AMD have a MASSIVE head start on Intel in this arena. Anything Intel can do to disrupt the market is a huge win, even if there is a bit of a loss. Intel is playing the long game on this one.
Is that based on the marginal cost per unit, or the total price per unit with fixed costs amortized?
My guess is break even or slightly positive gross margins per unit, with overall BMG being a loss once fixed costs are included.
And BMG will never hit the volumes to recoup the NRE, so it will be a loss.
But it's all speculation
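A minimal sketch of that distinction, with invented placeholder numbers (the $40 per-unit margin and $500M fixed cost are not reported figures): the sign of the per-unit gross margin decides whether each extra sale shrinks or deepens the overall loss.

```python
# Toy illustration of per-unit gross margin vs. fixed costs.
# Both numbers below are invented placeholders, not Intel figures.
def net_result(units_sold: int, gross_margin_per_unit: float,
               fixed_costs: float) -> float:
    """Net result = per-unit contribution minus fixed (R&D, drivers, NRE) costs."""
    return units_sold * gross_margin_per_unit - fixed_costs

FIXED = 500_000_000  # hypothetical sunk R&D / driver / NRE spend

# Positive gross margin per unit: every sale shrinks the hole.
print(net_result(1_000_000, +40, FIXED))  # -460,000,000
print(net_result(5_000_000, +40, FIXED))  # -300,000,000

# Negative gross margin per unit: every sale deepens it.
print(net_result(1_000_000, -40, FIXED))  # -540,000,000
print(net_result(5_000_000, -40, FIXED))  # -700,000,000
```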
ASSUME: a TSMC N5 wafer costs $18,000 and you have 100% perfect yields (you won't). That means the 613 mm² chip for the RTX 4090 only "costs" $200 in raw silicon. The 310 mm² chip on the Navi31 GCD only "costs" $97 in raw silicon. The 276 mm² die on the B580 only "costs" $86 in raw silicon. There is a mountain of R&D that must happen before you can make even one chip, of course. And a ton of expense to get a chip off the wafer and running on your machine. I simply don't see how Intel could be making money on the B580.
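For anyone who wants to reproduce those raw-silicon figures, here is a rough sketch under the same assumptions (an $18,000 N5 wafer, 300 mm wafers, 100% yield, and a standard dies-per-wafer approximation; real wafer pricing and yields will differ, and the outputs land within a few dollars of the numbers quoted above):

```python
import math

WAFER_COST = 18_000   # assumed N5 wafer price from the comment above (USD)
WAFER_DIAMETER = 300  # mm

def gross_dies_per_wafer(die_area_mm2: float) -> int:
    """Classic dies-per-wafer approximation: wafer area over die area,
    minus an edge-loss term. Assumes 100% yield and no scribe-line overhead."""
    r = WAFER_DIAMETER / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * WAFER_DIAMETER / math.sqrt(2 * die_area_mm2))

for name, area in [("AD102 (RTX 4090)", 613),
                   ("Navi31 GCD", 310),
                   ("BMG-G21 (B580)", 276)]:
    n = gross_dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer -> ~${WAFER_COST / n:.0f} of raw silicon per die")
```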
Question: is die size as important for a company that has its own foundries versus companies that contract die production to another business (e.g. to TSMC)?
Yes.
Both the 4070 and B580 are made in exactly the same factories - TSMC 5nm process @ TSMC.
I don't even know why I bother posting this, but the transistor density of a 4070 is DOUBLE that of a B580. I don't set prices at TSMC, but I can't imagine it doesn't affect the pricing.
Intel is still new to working with TSMC. They will get better at it or, even better, switch to their own foundries.
@ Not the point I was trying to make. Double transistor density = cost
TSMC usually sends out dev advisors to their larger clients, such as Apple, Nvidia, the works. This is to help their more important clients work as efficiently as they can with their nodes.
Intel is a competitor, they were probably not shown this courtesy. Not to mention the recent friction between them. Intel is on an expensive subfamily of nodes, without much guidance and/or benefits from that node.
All things considered, even the possibility of them sandbagging their transistor density, it isn't the worst outcome
@@lolilolplix transistor density usually reduces pricing, unless you're on a specifically developed node for your own product
More transistors in the same space means you can make a similar chip in a smaller space, thus saving costs since you're paying per wafer of a node, not for the density your designs achieve
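For what it's worth, a quick density check: the B580 figures are the ones quoted earlier in the thread, while the 4070/AD104 numbers (roughly 35.8 billion transistors on about 294 mm²) are an outside assumption pulled in here and worth double-checking.

```python
# Transistor-density comparison. The B580 numbers are quoted earlier in this
# thread; the RTX 4070 / AD104 numbers are an outside assumption to verify.
chips = {
    "RTX 4070 (AD104, TSMC 4N)":   (35.8e9, 294),  # transistors, die area in mm²
    "Arc B580 (BMG-G21, TSMC N5)": (19.6e9, 272),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # million transistors per mm²
    print(f"{name}: ~{density:.0f} MTr/mm²")

# Prints roughly 122 vs 72 MTr/mm²: a big gap, though closer to ~1.7x than 2x.
```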
Nvidia went through the same thing, during their TNT/TNT 2 days. And look where they are now
Kicking out Pat just before Christmas speaks for itself... Not that he is to blame; Intel was already behind, having lost at least a decade thinking they would be the market leader forever.
Catching up takes money, however, and he was also unfortunate that money is now short. Intel Arc turned out to be too little, too late, but hopefully they will keep trying.
Intel stagnated because the accountants were in charge. For the first time in a long time they had a CEO with an engineering background, and he just got fired. My guess is the accountants will be put back in charge.
The logic that this card can't be profitable because it offers much of what the 4070 does at a lower price, implies that the 4070 is priced with a thin margin and this undercuts it. Obviously that's not the case, so I'm not too sure I agree with the assessment.
I don't agree that they are losing money on actual units sold, but they definitely are on R&D.
If die size sets the final price, the 4090 must be a meter long.
I think the B570, with performance between the RX 6650 XT and the vanilla RX 6700 and priced at $220, might be MAKING money for Intel, and factory-overclocked models at $230-$270 with performance on top of the B580 would help make Intel and vendors more money.
If I had to guess, the total card cost for the B580 is $205-$220, and the total card cost for the B570 is going to be $150-$170... so Intel and vendors are going to make more money off the B570 cards.
It's basically a 1060, which with tax was 300 bucks, and including inflation I think the margins are on the lower end, considering that they need more telemetry data to optimize their driver even further and gain market share and reputation.
If this goes on, just like with XeSS being better than FSR, then AMD really should consider becoming competitive again in that space instead of relying on being the main supplier for new-gen consoles.
The B580 is just a NEW 2080 Ti at this point lmao.
Not really. My regular 2080 is still like 25% faster
A big part of what you pay Nvidia for is the software; that also needs a lot of money, and then there's the profit on top.
Given that AMD is barely making any money in their graphics division, it seems likely Intel is losing some money making graphics cards once all dev costs (silicon engineering, driver programming, etc.) are counted. They could still make some profit on the hardware, offsetting some of those costs.
So how did you arrive at the cost price, profit margins, etc.? And how much profit should anybody be making: 10%, 20%, 30%, 100% or more?! You could also state what the failure rate is when making these chips and what they would need to do to be profitable!
I heard budget graphics cards don't make decent returns until they sell in big numbers. The costs remain higher than the profit until they pass the break-even point.
It's pretty odd to me that, for Intel, there's concern that a GPU with a die size comparable to the 4070 Ti at that price isn't profitable, while at the same time people harp on Nvidia for their steep prices. You really can't have it both ways, I'm afraid: either Intel is losing money on their GPUs, in which case Nvidia's (and AMD's; their pricing is no better, sorry) pricing is A-okay, or Intel is making money and Nvidia's pricing is over the top.
I think that intel is making money per unit but has no chance to actually recoup any R&D costs. For NVIDIA and AMD that would be counted as losing money.
If they priced it accordingly they would not get any marketshare, not recoup R&D and lose even more money.
They took the opportunity to get marketshare as that is the best outcome, they have to keep price low.
I think the big question is, is intel going to have money to R&D the next generation of ARC cards? as they are not recouping the "investment" on the current gen.
No, the chip is small, so a lot of chips per wafer. Also, Nvidia was making a profit on the 970 being $300. Everything is pure profit for these guys.
What these guys don't mention, which absolutely plays into the cost, is that Intel is rebounding off an awful CPU launch whose performance didn't go as they planned. So yes, I think the $250 price point is a way for them to take the spotlight off the CPUs and cast the company in a better light. BTW, I picked up an ASRock Steel Legend B580 on the 13th of December, and it's a fantastic card for the money. Well done Intel!
I don't think there's any realistic chance they are selling those cards for a loss, that just doesn't make any sense. It's not like there isn't a huge amount of wiggle room to compete against their competitors margins, hell AMD are usually operating at a profit margin of around 50% and Nvidia far higher than that. There may have been talk of Intel losing money on GPUs but that almost certainly refers to the Arc division as a whole losing money, which means including all of their R&D costs (which will be pretty enormous).
We can't get the Intel B580 GPU for less than $400, thanks to scalpers selling the GPU now and Intel not having big stock.
The 24GB variant will be great for AI.
The resale prices of Nvidia cards will be reduced by resellers by an amount that Intel cannot beat anyway. It is a shame to root out ANY competition. The prices are already coming down for most, JUST TO SELL them off and STILL make a killing on them. WAKE up, people.
8gb of vram is literally the goddamn quad core cpu bs all over again..... let it dieeeeee.
Nvidia: time to make those new RTX cards $1,600 and up.
Intel would not get to fund their GPU R&D if they continue with this unprofitable strategy.
The resellers are overpricing the B580 already; the $250 card is $400. Where are the $250 GPUs?????? Amazon has the B580 at $399. WTF.
They are not overpricing. But the "sponsored cards" had to run dry at some point. Now you can buy them at actual price, not at fake MSRP.
You fell for Intel's lie, because Intel is the scalper here, not the resellers.
How is the B580 cheap? Everyone keeps saying it's only $250, yet in the UK the cheapest version is £350.
Don't blame Intel for scalpers. You can always pre-order the Sparkle B580 for £259.99 on OCUK. Also in stock on CCL for £299.99. That's overpriced but still way short of £350.
@Groovy-Train I am looking at Amazon; those say pre-order.
@@xpmon wrong site
@@XGD5layer ??? Wdm
@@xpmon Overclockers UK is cheaper than Amazon
Well it costs 400-450 euros in Europe :)
At most 350€ in Europe
@@XGD5layer Correct. 360€ was the highest offer I saw.
In order for Intel to make money, they need to make back the money they've put into developing these cards... They are nowhere near that point. They would need to sell an eye-watering number of $250 GPUs to start making money on this generation. It's not going to happen. Either they become a successful GPU manufacturer, or the tens of billions already spent go up in smoke.
Unfortunately, when a company establishes a benchmark of profit, they always have to surpass it. It cannot be less. For NO other reason than to appease shareholders. It is a natural consequence of capitalism and the reason prices will rise unsustainably until collapse. At the cost of consumers.
Wafer cost: $10k. A 75% yield is what TSMC claims for the 6nm node (which they call 5nm). Die size, some maths done ✅ want my spreadsheet? Jesus, this is a disappointing video. Even if it really is a 5nm die, the Intel margins are still 15-20%, and I doubt Intel pays that... but let's say they do. They ARE NOT selling at a loss.
Calculating material costs totally ignores R&D, software and other things.
@ I’m talking about production costs gross margin. They have long since expensed the R&D.
How big is the wafer?
@@GreyDeathVaccine 300 mm
@@skyyefinance it's 272 mm² for the die
No, they don't lose money, because what would be the point of releasing them otherwise?! Production of those cards runs $60-90! Intel is actually only still at a loss until they start to cover the investment in research and similar things.
That is going to be a while
For now Intel will use someone like TSMC, but when Intel's fabs are up and running, Intel will make bank.
Not surprised. They are all sold out, so all the hype really was just that.
I'm buying a B580 as soon as the less expensive ones are back in stock. I could get one for $350 right now (which is still a decent deal IMO), but I'll wait. After that, I'm buying an AMD GPU. Nvidia has gone nuts with their pricing and specs. "Less is the new more" - Nvidia.
The guy who's concerned is probably a huge Nvidia fan 😂
Does it matter, given that Intel isn't selling them?
Im sure. But to our benefit.
The demand for this card is insane, there's no way they are losing money.
The B580 is not on a more advanced node; it's on N6, like the generation prior to it, Alchemist. Meanwhile the entire RTX 4000 series is on N4X and RDNA3 is on N5 (sans the 7600/7600XT).
As per intel official site it’s on tsmc N5
@@eduardoleiva4959 still doesn't make it a more advanced node
Alchemist dies were much bigger and sold more cheaply.
Alchemist is produced using TSMC N6. It's a cheaper node.
I do not think Intel is losing money with their Arc B580 at all. Maybe they are not making that much of a profit, but damn sure Nvidia and AMD are overpricing their GPUs, that's a guarantee... and I am not an Intel supporter; my PC and laptop are both Nvidia and AMD...
B580 performance is indeed closer to 4060Ti than 4060. 20% faster than 4060 and 5% slower than 4060Ti. (Source: TechSpot and Tom's Hardware).
I can easily justify that they are losing money. The MSRP of the 4070 is $550. The channel margin is 15% (published when EVGA left the market). The Nvidia 2022 margin is 61% (published by Nvidia in 2022). So $550 / 1.15 / 1.61 = $297. So the BOM costs $297 for the 4070. Now compare the 4070 to the B580. All the parameters are virtually the same: 5nm, TSMC, same power, same cooling, same GDDR6, EXACTLY. The B580 is a tiny bit smaller (272 mm² vs. 294 mm²), so a TINY bit cheaper to make (maybe $10 cheaper). Therefore, the cost of making a B580 = $297 - $10 = $287. Intel is losing $37 on every card they make. EVERY SINGLE ONE.
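That back-of-envelope, spelled out as a sketch; note the 15% channel margin and 61% figure are the commenter's inputs, not confirmed per-product numbers, and the comment applies the 61% as a markup (dividing by 1.61) rather than as a true gross margin, which would give a much lower implied build cost:

```python
# Reproduces the back-of-envelope above. The 15% channel margin and the 61%
# "Nvidia 2022 margin" are the commenter's inputs, not confirmed figures.
MSRP_4070 = 550
CHANNEL_MARGIN = 0.15
NVIDIA_MARGIN = 0.61

# The comment treats both percentages as markups (divide by 1 + margin):
bom_4070 = MSRP_4070 / (1 + CHANNEL_MARGIN) / (1 + NVIDIA_MARGIN)
print(f"Implied 4070 build cost (comment's method): ${bom_4070:.0f}")  # ~$297

bom_b580 = bom_4070 - 10  # the comment's ~$10 die-size adjustment
print(f"Implied B580 result at $250: {250 - bom_b580:+.0f} dollars")   # ~-37

# If 61% were instead a true gross margin (cost = price * (1 - margin)),
# the implied 4070 build cost would be considerably lower:
alt = MSRP_4070 / (1 + CHANNEL_MARGIN) * (1 - NVIDIA_MARGIN)
print(f"Implied 4070 build cost if 61% is gross margin: ${alt:.0f}")   # ~$187
```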
Pat Gelsinger was fired THE DAY BEFORE B580 WAS ANNOUNCED. Why? Because he was launching products that lose money, just to keep his promises to the board and keep his job. This clown was a useless CEO. The board finally realized it after Gelsinger had made his 12th mistake as Intel CEO !!
There is only ONE type of market share you gain by selling GPUs below cost. MARKET SHARE AMONG BANKRUPTCY LAWYERS. You will definitely need them if you give away free money too often, like Intel has been doing with Arc & Battlemage !!
I doubt the whole die is manufactured at TSMC; maybe they added chiplets with some components produced on their own process.
No, they paid TSMC to produce their cards. They were using TSMC's nodes because Intel didn't have the fab space, since it was being used to produce Arrow Lake. There was even news of them losing their 40% discount because of what Pat G. said about TSMC vs the USA.
Intel is for sure losing money just based on R&D.
At best they're barely breaking even on build cost, but take into account staff and R&D and it's a massive billion-dollar loser.
I think the case is Nvidia making too much money