I'm glad Intel is actually bringing some good competition; hopefully their GPUs do well in the low-to-mid range market. They need to succeed.
My guess is it will be short-lived once Intel establishes a loyal customer base in this market.
If only they were actually for sale at anywhere close to $250....
Still expensive, almost £300 for the ASRock.
@@antondovydaitis2261 Now you will see which retailers are scalpers... most of them. If only I could get one directly from the manufacturer to skip the retail mafia.
People will still complain, and many of you still won't buy them. Same thing with AMD: many people only want them for competition, not because they are actually interested in buying their products.
"they are not lying, they just have a different calculation" Nvidia calculation probably was: if with the original price we were making 80% margin on the card, adding 8 GB extra of vram (same PCB and same bus size), cost us 20$ more, we have to charge an extra 100$ so it keeps on the same 80% profit margin as the original variant.
I just bought a few A770 16GB cards for $229.99. What a steal for content creation and gaming, with 16 GB of VRAM.
The overall design of the last-gen Intel GPUs is not that good. Battle Image is a lot better.
@@moto6981 It's "Battlemage", not "Battle Image"!
why would you buy the first gen model???
@@moto6981 For $230 it's still an amazing deal compared to the Nvidia offering at that price, which is like a 1660, lol. Most 3060s are $250+. The A770 isn't always the fastest, but in many titles it still gets close to a 4060 Ti, and it can beat or compete with a 6700 XT fairly often. For 230 bucks you aren't getting a bad deal.
I had one for a few months and it truly was great for VRAM-intensive workloads. Gaming was meh; I mean, it works, but it obviously underperforms. Still, there is no other way to get 16 GB anywhere close to that price.
I'm currently playing Dying Light 2 at 1440p with max ray tracing on an RTX 3080 10G. For the first 10-15 hours it ran pretty well, but once I got to the big city I kept experiencing bursts of stutters. It was indeed a VRAM issue, and it's annoying. The 3080 is definitely a capable card, but Nvidia intentionally handicapped it, all to force people to upgrade their GPU in just one generation. They know exactly what they're doing. This is a calculated decision made by a multi-trillion-dollar company to squeeze every last penny out of us consumers. I'm glad Intel is fighting this unnecessary problem that Nvidia started. I hope Arc stays relevant in the market for a long time. Nvidia needs to be humbled.
Same card user here; that 10 GB is bugging me too. Such a capable card, but limited by VRAM. I wonder how long it will last at 1080p high in the years to come.
I also have a 3080 10GB, and you're right: this GPU is too fast to have just 10 GB of VRAM, and in some cases it can be a limiting factor. I play at 1440p and usually only have issues when I turn RT on, because (ironically) it uses more VRAM; without it, it's perfect. That said, it is possible to have a good experience with 10 GB of VRAM by lowering textures from ultra to high (or shadows), tweaking some RT-related setting, or just using DLSS, which looks quite good at 1440p. I think 10 GB is also more than enough at 1080p (but who plays at 1080p with a 3080??). So yeah, you can play well by lowering some settings, and most of the time you won't even notice the difference, but it's annoying that it has to be done on such a capable GPU. At least I bought it used for just $350; I would never have paid full price for it.
The RTX 3080 10G, like many Nvidia GPUs, is a waste of sand. A good GPU should have proper VRAM; therefore, half the Nvidia lineup that fanboys claim to be good GPUs is trash.
It's all a conspiracy to get YOU... yes, a multi-billion-dollar company planned for decades just to affect YOU.
@@1vaultdweller I mean, most Nvidia GPUs don't have a lot of VRAM, that's true, and I wouldn't buy anything with only 8 GB of VRAM at this point, but maybe you are exaggerating just a little bit. Changing textures from ultra to high is not a big deal most of the time, and it's the only thing you need to do with a 3080 at 1440p (most of the time you can max out everything if you don't use RT). I think the 3080 should have 16 GB of VRAM, but that doesn't mean it's a trash GPU; in fact it can run any game without issues (on rare occasions you lower textures a bit). Nvidia has a VRAM issue in general, but AMD GPUs have issues too: FSR is currently the worst upscaler, and RT performance on all AMD GPUs is a bit disappointing. Yes, in most games you can play without RT, but in the near future games are going to have RT-only (or PT-only) rendering; also, if you buy a high-end AMD GPU (for example the 7900 XT) you'll most likely never use RT because it's too heavy at 1440p or 4K, and in some games it really makes a difference. Like it or not, upscaling and RT are the future of gaming, and with FSR 4 being an AI upscaler, old AMD generations will most likely be left behind, just like Nvidia did with its older cards. I'm not saying AMD, Nvidia, or Intel are bad; I'm just saying every GPU has flaws and we should buy what fits us best without the hating. A big factor is also price: most of the time AMD GPUs are cheaper, but it's not always like that, especially on the used market. If you could buy, for example, a used 3080 for $350 or $300 (the price my 3080 had), would you refuse just because of the 10 GB of VRAM? Every GPU is good at the right price.
Currently having endless stuttering with my 3070 due to the 8 GB of VRAM; I'm never buying Ngreedia ever again after this shit. The chip itself can run games even at 4K without breaking a sweat, but the small VRAM kills it.
Planned obsolescence at its finest.
Haha, no. Next time you'll just buy Nvidia again and fall for the sales talk, as always.
So you have had about half a decade with it so far and are now mad that it is starting to have performance issues. Right. You made the decision to buy 8 GB; you made the decision not to go AMD, where you wouldn't have to pay for RT cores and tensor cores and would get VRAM instead. "Ngreedia" didn't strong-arm you into buying their specific product. You chose to, YOU DID. Hardware Unboxed has warned people about 8 GB for years, and you dummies do what you want and cry later like a child whose bottle wasn't filled to the absolute top. Try AMD next time, and we will see you back in another 5 years crying about that GPU no longer performing as well as it used to.
Finally, someone who gets it. Nvidia baseball-bat-chair'd 'em.
What games are you experiencing stuttering with?
@@tottorookokkoroo5318 Farming Simulator 25
I predict Intel's GPU department will be a lot more popular in a few years, possibly more so than their CPU department. The B580 is a really good GPU; I'm glad to see the competition. And if the 5060 truly has 8 GB again, that's the best thing that could happen for Intel, because every decent reviewer would recommend the B580 instead. Also, I really love the communication with Tom Petersen here!
The only issues it has are with drivers. Games like Spider-Man stutter like crazy for no reason in Digital Foundry's testing. Also, frametimes in general are worse.
@@stealthhunter6998 Drivers can be fixed, VRAM cannot. It's Intel's fight to lose if they don't come around and make their drivers more robust.
@@stealthhunter6998 I mean, it's improved so much that it's now down to pointing at a few games. That means, especially if people start buying Arc GPUs, that this could probably be fixed soon, because a large part of the problems may well be on the game-developer side.
I heard they shut it all down... 'MLiD Tom loves himself more than anyone else ever could' said so... but I mean, what about drivers if they shutter it? Intel pushed their CEO out the back door, then backdated his departure. Who does that, man? I have never heard of that, ever. Word was always that Gelsinger was the only thing keeping this project alive. He is gone now.
So any news on a B780 or some higher-end Battlemage card that's not just budget tier?
Those were canceled
@@ksolo614 Well, that's not too reassuring for the future. The Intel guy was basically talking about how Battlemage is essential for their iGPU and APU development, so it looks like they're gonna ditch Arc eventually.
@@ksolo614 When was that announced?
@@nimrodery Never, he made it the fck up.
Glad we got RTX ON in the glasses 🤣
The fundamental issue is that the consoles aside from XSS have ~12GiB for graphics, so that's what games are designed around.
Please, if Nvidia again releases a 5060 with 8 GB and 16 GB for €400-500 and a 5070 with 12 GB for €600, rip them a new one.
Pretty please 😊❤
@@KryssN1 Nope, people justify this... again and again.
NEVER. GOING. TO. HAPPEN. Nvidia has been below 132 most of this morning. If it closes below 132, I think it's got 15-30 points to drop before it soft-lands. I think Nvidia is in serious trouble, and with the new tariffs, the smart money is leaving. The billionaires all sold at record highs already. The smart money jumps in in the morning and exits in the afternoon. Let's see...
Side note: man I miss being young and dumb and full of hope.
The 3060 addressed the 12 GB issue 5 years ago, but then Nvidia had a greed stroke.
3:30 "they are not lying" lol
Corpo is a corpo regardless.
Oh come on, the 3060 came with 12 GB and the cost was not a big factor. If all new cards had 16 GB, they could keep the cost down thanks to volume buying of one kind of chip.
You can't generalize this because the Ada architecture is more expensive than Ampere, especially the process node.
@@dampflokfreund is the Battlemage architecture more expensive than Ampere though?
1GB costs under $3 right now, though that's just the VRAM chip/module
TAP can talk cost analysis all he wants: the memory, interconnect, bus, etc. genuinely cost nothing compared to the price. It's all about the profit obtained through market segmentation.
Currently using an A770 and very happy with performance at 1440p. Glad to see Battlemage improve on it but I'll probably wait for Celestial to upgrade.
I have an RX 7900 XTX, and seeing the B580 makes me wanna get one too 😅
Yeah, it's a cool little GPU 😅
Why
@@newearth9027 Cute 😅
@@cocobos cause the design is cute? Lol
why? huge downgrade.
With Nvidia and AMD avoiding the bottom end, there is definitely a market for Intel above APUs and below discrete Nvidia and AMD cards. The thing I don't get is why Intel doesn't have a laptop GPU, since that's where they still get most of their sales. They could have bundled it like they used to with Centrino, when you had an Intel CPU + chipset + Wi-Fi; they could have done the same but with CPU + GPU + Wi-Fi.
No... all Intel APUs suck; no handheld gaming developer wants an APU from Intel. And Nvidia never makes any CPU or APU, so AMD is still top-ranked in handheld APUs.
@@kurttis8512 I never said anything about an Intel APU, I said a laptop GPU. And you're also wrong about gaming on an APU: I have the GPD Win Max with an Intel 1035G7, and its Iris graphics can play everything.
Right, especially with this efficiency; that laptop would run cold as fuck.
Nice. Thanks Tom.
Seeing how the B580 turned out, I'm curious how the 700-tier cards would turn out. My early prediction is the B770 could land between the RTX 3070 and 3070 Ti in performance, with 24 GB of VRAM on a 192-bit bus. My understanding is that the B580 uses 6x 2 GB 32-bit chips for 12 GB over 192-bit (with one chip removed, the B570 gets 5x 2 GB 32-bit chips for 10 GB over 160-bit). The next step up in chip size is 4 GB, so 6x 4 GB 32-bit chips would give 24 GB over 192-bit, and one fewer chip would give 20 GB over 160-bit; that could be a B750 landing somewhere between the RTX 2080 Ti and 3070.
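For anyone who wants to sanity-check that chip math, here's a tiny Python sketch; the B750/B770 rows are the comment's speculation, not announced Intel specs:

```python
# Each GDDR6 chip has a 32-bit interface, so bus width = chips * 32
# and capacity = chips * GB per chip. B750/B770 rows are speculative.
def memory_config(chips: int, gb_per_chip: int) -> tuple[int, int]:
    return chips * gb_per_chip, chips * 32

for name, chips, gb in [("B570", 5, 2), ("B580", 6, 2),
                        ("B750?", 5, 4), ("B770?", 6, 4)]:
    capacity, bus = memory_config(chips, gb)
    print(f"{name}: {capacity} GB over a {bus}-bit bus")
```

This reproduces the 10 GB/160-bit and 12 GB/192-bit configs of the B570/B580 and the speculated 20 GB and 24 GB options.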
Supposedly Intel has canceled the B770
Hope they do well, we really need strong alternatives at a good price
True, I'm getting sick of seeing crappy new budget GPUs.
@@mrbobgamingmemes9558 And this card is $400 now, making it MEH x2.
They aren't lying, they're just not telling the truth.
Truth is the new orange.
Where supply?
Thanks, Tom!
Let me put an NVMe drive directly onto my GPU as VRAM.
"I don't think they're lying to you..." You sure about that??
I don't care about 1440p, I'm just glad we're getting more competition!
We need GPUs with more VRAM for LLMs
I got a new prebuilt earlier this year with a 4060 (coming from a 1650 Super), and man, I kinda want this one, since more than 8 GB of VRAM is really becoming necessary.
If your 4060 plays all your games at the settings you like, you're probably good for the time being
You didn't get a '4060', you got a _4050_.
I don't understand how an HD 7970 could have 2 gigs on a 384-bit bus, but nowadays it's impossible for 8-gig cards to have a 160-bit bus.
The HD 7970 had 3 GB of VRAM.
Basically, memory chips come with a fixed bus width: to populate a 128-bit memory bus you need 4x 32-bit memory chips. To populate a 160-bit bus you end up with 5 chips, totalling 10 GB (the configuration the Intel Arc B570 has). To make a 160-bit 8 GB card you would need custom 40-bit memory chips, which no manufacturer would want to make, because they would be incompatible with the industry standard.
@Zbychoslaw666 Then explain to me how a card from more than a decade ago had a wider memory bus than current-gen cards.
@@rhythmmandal3377 Because the HD 7970 has 12 memory chips combined for 3 GB of total memory. Each memory chip must be 256 MB with a 32-bit interface.
@cronos1675 OK, then how can the 4060 Ti have 16 gigs of memory without requiring a bus increase? Or, if you think it's the other way around, why doesn't the 8-gig version have a 64-bit bus?
@rhythmmandal3377 That's because the memory controller can address more memory chips on the same channels, by adding chips to the same connections on the other side of the PCB ("clamshell" mode). The only downside is that those extra chips provide no additional bandwidth.
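Here's that trade-off as a quick Python sketch, using the 4060 Ti's published memory specs (128-bit bus, 18 Gbps GDDR6, shared by both the 8 GB and 16 GB versions):

```python
# Clamshell mode: two chips share each 32-bit channel, doubling capacity
# while the bus width, and therefore the bandwidth, stays the same.
BUS_BITS = 128             # 4060 Ti bus width
GBPS_PER_PIN = 18          # GDDR6 data rate per pin
CHANNELS = BUS_BITS // 32  # one 32-bit channel per memory controller

bandwidth = BUS_BITS * GBPS_PER_PIN / 8   # Gbit/s across the bus -> GB/s
for chips_per_channel in (1, 2):          # 2 per channel = clamshell
    capacity = CHANNELS * chips_per_channel * 2   # 2 GB GDDR6 chips
    print(f"{capacity} GB model: {bandwidth:.0f} GB/s")
```

Both lines print 288 GB/s, which is why the 16 GB 4060 Ti is no faster than the 8 GB one unless the 8 GB card actually runs out of memory.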
The beauty of faster iGPUs is that there's no VRAM limit there either. I don't think AMD is going to make another 8 GB card, but Nvidia probably will. We'll see iGPUs in laptops overtake the RTX 5050; Nvidia will basically have to abandon that market entirely.
All rumors point to Navi 44 being 128-bit, so we'll likely get 8 GB and 16 GB again.
@@defeqel6537 Yeah, Navi is a generational leap in ray tracing performance, so the rumors say. I don't know that I care about RAY TRACING. No, I know I don't care.
@@Machistmo for perception reasons, RT performance is probably important, but I also don't really care about it
It would be good if they could test in programs such as Stable Diffusion, ComfyUI, and similar.
12 GB of VRAM seems like the sweet spot. I have an RX 6800 and an RX 6800 XT 16GB in 2 PCs...
These days VRAM should be 12 GB or above; less than that is in fact an issue, unless you spend all your time playing small games, most of which I'd call retro-style 2D games.
Here ... Some Tom love ❤
Side note: I have seen this channel make the opposite argument about 8 GB not that long ago, right? Or am I smoking too much of the local shrubbery? Well, the main channel; they have branched out to HUB Clips and whatever.
Why not 16GB?
It would require either a wider or a narrower bus; neither is a good option, the former because of cost and the latter because of performance loss.
@@defeqel6537 The 4060 Ti has 16 GB of RAM on a 128-bit bus.
Why not just put 16GB on the card?
A 192-bit bus, is that a problem?
@@kimmyksbro3116 Each memory controller, which connects to 1 or 2 memory modules/chips, is 32 bits wide, so you have 4 of them for a 128-bit bus. Each GDDR6 memory module can be 1 or 2 GB, so you get 4 or 8 GB of capacity; if you want 16 GB, you need to connect 2 modules to each memory controller. With a 192-bit bus you have 6 memory controllers, so either 6 GB, 12 GB, or 24 GB (with 2x 2 GB modules per MC).
In theory you could connect 2 memory modules to only some of the memory controllers and a single module to the rest, but that makes things much more difficult in terms of memory bandwidth and data access.
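Enumerating those options makes the constraint obvious; a minimal Python sketch, assuming only the standard GDDR6 module sizes mentioned above:

```python
# All capacities a given bus width allows: one 32-bit memory controller
# per 32 bits of bus, each driving 1 or 2 modules of 1 GB or 2 GB.
def capacities(bus_bits: int) -> list[int]:
    mcs = bus_bits // 32
    return sorted({mcs * modules * size
                   for modules in (1, 2)    # 2 per MC = clamshell
                   for size in (1, 2)})     # GDDR6 module sizes in GB

print(capacities(128))  # [4, 8, 16]
print(capacities(192))  # [6, 12, 24] -- no 16 GB option on 192-bit
```

That's why "why not 16 GB?" has no good answer on a 192-bit card: the next step up from 12 GB goes straight to 24 GB.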
I hope Intel GPUs can be competitive this time.
I want Intel to release a card that competes with the upcoming Nvidia 5080/5070.
thanks
It does cost more. About 30 dollars or so more, because that's what the 3060 cost compared to the 4060 if we compare MSRP pricing.
1 GB (8 Gb) of VRAM costs under $3 at the moment (and has for quite some time), plus a bit of additional die space when talking about a wider bus, and some extra traces on the PCB. It's really not very expensive.
@@defeqel6537 Yeah, I mean, 12 GB at $250 shows that higher-capacity VRAM is not that expensive.
Another paper launch, but people keep milking the $250 price point.
The ability of AMD and Nvidia to ignore the insatiable need for more VRAM is not just confounding but also a huge missed opportunity, not just for sales but for developers, who would be freed to be more creative instead of being confined to 8 GB.
AMD had that space to themselves. Now here comes Intel with a $400 card that no one realizes is $400 yet, with the performance of a 6600 XT. Come on, people. STOP BEING GOLDFISH.
Nvidia does this on purpose so the only options are 3090/4090
If cards now had 24 GB instead of 12 GB, game developers would not be any freer in their creativity. The majority of players have inexpensive, somewhat older cards with little VRAM, and game developers are guided by this: they want the largest possible buyer base. The market is moving so slowly that the steps up in VRAM will come very slowly too.
Let's go Intel. We need you to thrive in the GPU industry. We're very dissatisfied with AMD and NVIDIA. We need a new competitor. Please make those cringe ads that show how you're beating AMD and NVIDIA. Make sure you deliver your advertised performance. 😊
And what do we see happening right now? Google "BUY INTEL B580". How much are they again? Now compare this hunk of crap to a $400 card. The 4060 Ti is a no-brainer. Buy an MSI Nvidia 4060 Ti right now, before those prices go up. Are you people goldfish?
Keep shilling ray tracing and 12 GB won't even be enough for the shit games they put out these days.
Worst of all, there are a few games that force ray tracing, despite the fact that Steam's most popular GPU is 💩 at ray tracing unless you use ultra-performance upscaling.
Now it would be interesting to see if UserBenchmark makes the B580 perform higher than a 5090 in some made-up measurements.
I still don't know how that site is even still around, lol.
Nvidia's lying to you. Tom just wants to keep his options open for when the shit hits the fan :D
The problem with this "entry-level" GPU is that it uses as much silicon as a 4070 Super to produce 4060 performance. Intel is probably selling these at a loss even BEFORE factoring in R&D. I hope they can stick around long enough to actually make money. The discount on TSMC wafers they lost because of their idiot CEO running his mouth probably ate the entire profit margin of Battlemage.
It looks like he just avoided answering it, actually. "We need more RAM, 8 GB is not enough, our calculation is correct, but Nvidia says otherwise, yet they are not wrong..." Oh FFS, come on man, don't do that clown dance and tiptoeing around. Be straight with us; have a spine and we can respect you.
YouTubers hyping this non-existent product endlessly 😆
THIS. THIS RIGHT HERE. Its garbage anyway.
@@Machistmo *It's
@@GrainGrown thanks auto correct. By the way you’re late and fired
Team blue, baby. I'm tired of Ngreedia scamming us gamers. I've seen the Arc B580 mop the floor with the RTX 4060, 4060 Ti, 3070, and 3060, and AMD's RX 6700 XT and 7600 XT.
The answer is no. They wanted this GPU to fall in line at about the 4070 level and compete nicely there for $400 with a standard amount of VRAM. Performance fell short, so now it's spun as "12 GB for the entry level". Not that it's a bad move or a bad look. Good for Intel.
No, this unit has only 20 Xe cores (5 "render slices") compared to the previous gen's 32 (8 "render slices"); the increased efficiency over last gen means a higher-tier GPU is possible, there's just no release date set and Intel hasn't said much. Intel's marketing also didn't compare the B580 to the top-tier previous gen; even the naming suggests a cut-down version.
Dude, 4070? What now? It's barely on par with, and looks like shit compared to, a 4060. I hate Nvidia passionately, but I would get an Nvidia 4060 8GB before this card.
@@nimrodery Can you imagine if Intel, with their infinite wisdom and knowledge of its 4060-like performance, had decided to compare the GPU to a 4070? Yeah, the fact that they didn't do that means they were aiming for the low end from the start. Riiight. Not. The number of cores is completely meaningless as well; different architectures means they cannot be compared. Remember the leaks and rumors suggesting Intel is aiming for 4070 Ti to 4080 Super performance at the top end? Well, that is supposedly the 256-bit GPU. Just like with Nvidia, if the 256-bit GPU is aiming for 4080 performance, it stands to reason that the 192-bit GPU aims for the 4070 (or 4070 Ti, because the B580 is the full die, like the 4070 Ti is). And the B580 is the full die; it is not a "cut-down" GPU like you say. The B580 has literally identical silicon costs to the 4070 Ti and only 60% of that GPU's performance. And you really think that was Intel's plan?
@@christophermullins7163 Basically everything you said is wrong. "Sources" have been saying for a long time that you shouldn't expect a 4080-tier GPU, and if you speculate on the performance of a 32-Xe-core part based on this one, the facts seem to concur. Maybe you were hoping for a faster B770, but there have been literally zero figures divulged by Intel; we never knew, beyond speculation, that such a GPU even exists. As for the B580, it's clearly a cut-down GPU. Intel made improvements in every area, including heat; it's just ridiculous to suggest this is somehow the best they could do. If we don't see a B770, it's because Pat Gelsinger opened his mouth, not because it's too hard to make. The hard part was getting a dGPU on the market in the first place, and they've released 7, not including collabs with other companies. I don't know about you, but I expected 0.
@@christophermullins7163 And you compare stuff based on what else is on the market at a similar price. I thought that was pretty basic.
AMD is still better, dude. He already spilled the beans and said they have already made the next 2 generations of GPUs, meaning after 2 years these cards get put on the back burner.
Tom Petersen, you're the man.
I see a lot of people hyping this up, like "ah, I want Intel to release a B770" and stuff like that. I want to ask you: how many of you are ready to buy a $450 GPU from Intel that's on par with a 4070/4070S instead of actually going with Nvidia? Not many. I see a lot of people still buying the i5-12400F for their rigs despite the fact that the Ryzen 7500F/7600 platform is like $20-30 more expensive and far more future-proof. People are stuck buying what they owned 10 years ago, and if 10 years ago they had an i5-6400 they will gladly buy a 12400F, maybe even a motherboard that supports BCLK overclocking, get it to 4.8-5 GHz, and be happy for the next 10 years. You are all acting here like the majority of people do generation-to-generation upgrades; people usually buy a PC for a ~5-year period and then upgrade, and the number of people doing generational upgrades is less than 10%. If Intel wants market share they should be even more aggressive at this price point; maybe a B590/B750 at around $300 with the same amount of memory but 20% more Xe cores would do it even better than a B770, because at the $400 price point there is already a lot of competition from current-gen and past-gen GPUs.
I would, without hesitation. Give me an Intel card that is fairly priced and does 165-200 fps at 1440p, and Intel will get my money. I'm still here rocking an old 6700 XT, waiting for the 8000 series next year.
I would definitely get the B770. The only problems I have are the taxed prices, the stock levels, and the huge hole in my bank account after buying it (going into adulthood).
I'm kinda new to the whole PC building situation, so maybe I just have a fresh perspective. But the build I'm planning involves a Ryzen 7600/7600X for the CPU and the Intel B580 GPU. Hopefully, I can scrounge together the money for it soon.
@@Rose.Of.Hizaki That isn't this Intel card. The 8000 series might be.
@@cjmunnee3356 Don't buy components one at a time for a system. Save up the chunk, then decide how to spend it once you have enough for what you wanted. Never buy a computer in pieces, unless your President is about to enact pointless tariffs that will do nothing but weaken your nation globally; then, whatever. Also, start with used parts you can source locally. Start with a used gamer's PC: watch your local sellers, Nextdoor, FBMP, and the like. I just sold a banger system for 760 in a brand-new Zalman case; the dude walked off with a HUGE deal. Sold it via the Nextdoor app my GF uses.
12gigs is still pathetic lmaooo
Not in the $250 price range; there it's actually pretty good.
For 1080p, 8 GB is plenty if games are properly optimized for low-end users; sadly, this will only encourage more laziness.
This is just wrong. I wish you were right but you are DEAD wrong. Encourage more laziness? What in The FUCK are you on about?
I am still gonna buy an RTX 4060.
The B580 is amazing at its price point; you are making a mistake.
@@ComfyShortz I know, but it's just not available in India.
@@ComfyShortz It's literally going to be out of stock until January in most countries. I couldn't blame someone who can't be arsed to wait and just wants to play some games. I'd get a 7600 XT in that price range for new GPUs, but hey.
Lol... 😂😂😂😂😂
@@Bargate But the RTX 4060 is good in 2D animation, faster than the RX 7600 XT; that's why I am getting it.