While no one knows how a B770 card will end up, their budget B-series offerings being this much of an improvement even over the A750 is great news. It means a B770 could end up being a high-end 1440p to budget 4K card, and that should make Nvidia and AMD scared that Intel is making this much of a leap from last gen to this gen. Competition is a beautiful thing, isn't it.
The biggest concern I have is the performance charts they showed. There's nothing solid or concrete about them, so I get the feeling Intel isn't confident enough to show more. These will probably be OK, but the 8600 and 5060 will undoubtedly mop the floor with these.
The 5060 will come in with 8GB of VRAM for $350+ though. So getting a card with more VRAM for ~30% cheaper that still beats a 4060 (remains to be seen) is pretty nice, especially if it has feature parity.
@@corey2232 That is NOT worth it by any means. Anything below the 5080 in the 5000 series just seems to be the previous gen's overclocked GPUs with PCIe 5 support that only pushes them to an extra few frames.
@ Yeah, but there's still the issue of Intel really supporting these things for a few years down the line. What's the point of buying a $250 GPU if games stop supporting it within the next few years? That's a legitimate concern that everyone should have, especially since Intel has said that Arc is winding down. It's sad, but at the same time that's why we really need AMD to step it up. RDNA 4 and eventually UDNA sound interesting, but AMD needs to finally take some market share back from Nvidia.
Hate the clickbait, love the content. This card is more of a "threat" to AMD than it is Nvidia, and that fact should already occur to those who can think a little critically.
I am straight up going to get one. Their last offering came in as a joke... it wasn't badly priced, but nobody wanted into the driver mess that we all knew was coming. They've had 2 years to mature the drivers, and they've finally given us a completely workable alternative to Nvidia at a lower cost.
I see a lot of people trying to get out of Intel 13th and 14th gen chips, many of which are still perfectly fine. Get a 13th or 14th gen i5 in an ITX case and slap this in there. Solid portable system.
@@robertmorrison107 I have a 13700K, and putting that chip into an ITX case is asking for trouble. I have issues cooling it with a 360 AIO; I can't imagine it in an ITX case with an air cooler. Believe me, I considered it, but there would be major issues cooling those chips in an SFF case.
Thank you for this video. Please test at 1440p & 4k the following games: Cyberpunk 2077, The Witcher 3, A Plague Tale: Requiem, COD Modern Warfare III, The Callisto Protocol, Spider-Man 2, and a driving game to round it out.
Don't give a flying fly until actual benchmarks come out; all companies cherry-pick. We need proper benchmarks from HW Unboxed. Also, comparing the 12GB card against 8GB variants is hilarious. Why didn't they compare against the Ti and XT versions? We all know why.
@@farasty7371 The 3060, 7600 XT, and 6750 XT are all above 8GB; cards like the 6700 and 6700 XT were as well. It's the first sub-$300 MSRP card with more than 8GB, but idk if that even matters.
Intel did well comparing this against the 8GB 4060. One of their biggest selling points is 50% more VRAM than the competition; why the hell wouldn't they point that out? Nowadays a good amount of VRAM is enough to sell a card.
Dude... I was saying that at $250, this would be a slam dunk. I'm really happy about that. Gotta' see what the competition comes out with, but I honestly might try Intel this generation if the other tiers are anything to go by. GREAT point about the performance per dollar though, they're probably going with the $299 MSRP for the 4060, so it's barely faster... But that still means the next card up will be a generation improvement, so that may not be bad. So you'd still be getting "more for the same price" as it should be..?
From Niktek's and your videos, my idea is that it's an Intel RX 6700 XT/6750 XT. But for £250? Alright. Sounds neat. Edit: oh wow, so the B570 is an RX 6700 10GB? Finally, a GPU around those specs and that price point (used). So around an RTX 3060 Ti in performance?
Except it actually has usable upscaling at 1440p thanks to hardware XeSS, unlike FSR, and it also has a ton more OC headroom because Intel allows much more voltage and power limit adjustment. It managed +350MHz on average, which should give you about 15% more performance. In other words, with an overclock you're getting a 12GB 3070 Ti for $250.
@وليدحسيناشتيوي I've seen their slides giving the exact numbers, genius, and they had the exact same headroom on the A770, but memory clocks were locked on that one, which limited OC headroom; although you still got nearly +15% performance out of it too.
Intel GPUs are insane at video editing, what? Have you heard of Quick Sync? They are also insane for AI; it's AMD that's useless for anything other than gaming.
Competing against Nvidia's slowest last-generation GPU won't concern Nvidia. The bias Daniel Owen expresses for Nvidia is shameful; if anyone is cooked, it's AMD.
He's definitely not biased, especially not towards Nvidia; he's just giving Intel praise for the continued improvements, which is deserved. Yes, Intel doing well hurts AMD, not Nvidia, but that's because AMD decided to play the same pricing game as Nvidia. lol
If it had matched the 4060 Ti, then maybe. Otherwise the B580 needs to be at most $200 to make a dent in the market. The 7600 XT offers 16GB and will drop in price to knock out Battlemage.
1440p @ 60fps should be the new standard in regards to resolution and framerate. I couldn't currently care less about ray tracing tech; I prefer visual clarity and smooth framerates over anything else (just as most consumers do). 1080p is nearly two decades old, and 30fps is even older. We had the PS2 (which, if I'm not mistaken, ran @ 60fps) that revolutionized what gaming tech could provide for the user. It's time for GPU makers to take this into account, and maybe not charge $1000 for their products as well. And maybe game devs could also quit making games that look washed out and pixelated @ 1080p. I just recently played "The Evil Within 2" @ 1080p... and I have to say, it's one of the best looking 1080p games I've played on a 70" 4K TV. (no jaggies, etc.)
One good thing that happened because of the Battlemage pricing announcement: now NVIDIA and AMD will not overprice their GPUs at launch like they always do.
The 5060 and 8600 are probably going to be in that performance and price tier or better. For 250€, you're probably going to get an 8600 that performs like a 7700 XT with better RT, or a 5060 that sits between the 4060 Ti and 4070. 25-32% over last gen's competition isn't really hard to beat.
@@b3as_t Nvidia has more money than god from the ai rush at this point, I genuinely don't think they care anymore. I hope Intel and AMD can chip away some of their gaming gpu market share though.
AMD is already fine; their cards actually work, and the 7600 XT has the same performance level on an older node with a smaller die. AMD can kill these Intel cards by just lowering prices on RDNA 3 while keeping comfortable profit margins.
I have a 12600K CPU. Theoretically this CPU could be paired with up to a 4080-level card, but by that tier I'll upgrade my whole system, so any card on a performance tier similar to the 4070/5070 is what I'll look at for upgrading my current PC.
Counterpoint: Nobody should be buying an 8GB card to play today's games at 1440p, and nobody should be playing at 1440p with settings dialed down to make an 8GB card usable. If you can't get the FPS you want at 1440p native, play at 1080p.
According to Intel's own claims the B580 is around 5% slower than RX 6700 XT 12GB. It will probably be even slower in independent reviews. But the 6700 XT has been around $270 for half a year now. So if someone wanted that level of performance for that money they could have had it a long time ago and didn't have to wait for Intel.
Do you just go around copy-pasting your answer everywhere? On Amazon the cheapest RX 6750 I found is 330 euros. On Newegg there is no 6700 XT for 270, only 300, and that one is refurbished. So I'm not sure where you find these cards, but it seems to me that you are a bit full of it. Also, the 6700 XT launched at like $450 MSRP or something like that. Edit: cheaper and nearly as fast as the 6700 XT seems reasonable to me; not sure what you expect.
@@_qwe_fk_1700 Did you even see the video or did you just come here to criticize others? 9:48 The 6750 XT is even 10% faster than the B580 and is only $300 now. Anyway, I'm not your buying guide. I'm just saying that everyone had plenty of time to buy the same performance for the same money. Intel's nothingburger.
@@JackJohnson-br4qr Yeah, a 4-year-old card that had an MSRP that was double... what stupid logic from you. Based on that, there should only ever be high-end cards. No more 50/60/70 series then, just 80 and 90 series... yes, they offer the same performance as older cards, but at a smaller price.
I'd like to see it benchmarked with just an AMD Ryzen 5 and 16GB of DDR4. So many people out there have these, and they will be the ones looking at this card.
I was hoping to upgrade my 1080 Ti. I think this is still too close to it if you don't use frame gen magic. Maybe the B770 comes in at a 380 price point or something.
It's nice that Intel is in the game (albeit late again for X-mas). Earlier this year, I down-upgraded to a RX 6750xt from a RX 7600 to support my 1440p monitor. I picked up the 6750 (new) during Amazon Prime wk for $280. No need to grab another GPU any time soon.
As far as I'm concerned with the Intel Battlemage GPUs, I'd like to see how well they perform in older DX9 & DX10 titles; this was a blind spot when the Alchemist GPUs initially launched. While it's fun to play the latest and greatest titles, sometimes I like to kick back with an oldie like Oblivion or C&C3: Tiberium Wars.
Really appreciate the breakdown of what Intel could be spoofing. Looking forward to reviews from you and everybody. My cousin might not go for Intel, but I am on the lookout for when my 1080 Ti dies.
This is the right move by Intel. Good stuff. I'm not looking for a GPU, but this is a great thing to see. Even if it ends up being slightly slower than a 4060 in non-starved situations, it will still age better, and the price helps too.
I don't think Nvidia even cares about this price tier anymore tbh
Making gaming GPUs is just a side dish for them, not really their main focus; they don't care as much whether people like it or not.
@@imtrash2812 good for AMD and Intel
Then why does nshitia bother making GPUs for gamers? ;)
The RTX 4060 costs $300 and is rapidly on its way to being the most successful GPU of this generation. Nvidia dominates at the low end; the 4050 and 4060 laptop GPUs are also extremely popular.
@@Wobbothe3rd They barely make any profit off of those cards though, compared to high-end cards and especially server/AI stuff anyway. They probably make more profit on a single 4090 (which are still selling like hot cakes) than on 10-20 of these low-tier cards.
AMD's Reaction: Lower price of RX-7600 to MSRP of $199.99 and lower the price of the RX-7600XT to $249.99
nVidia's Reaction: Raise the price of the RTX-4060 to $349.99 and raise the price of the RTX-4060-Ti to $429.99.
I love your projection of Nvidia's move. 🙂
Almost right. But it will be 5060 and the Ti at more than those prices.
It looks like Nvidia wants out of gaming; they want to quit their gaming business and focus 100% on AI stuff. I think this is the only way to understand their responses.
NVidia: releases the RTX 5060 with 8gbs at $400 and the 5060ti at $550
@@J3f3R20n Exactly you get it.
If Intel's performance claims are relatively accurate and the drivers are up to snuff day one, this could be a really compelling entry-level GPU that is actually affordable to a lot more people. It's good for the GPU market all around.
Sad part is that even if it is a good value people will still buy Nvidia.
@@richardwolfangel4724 Not necessarily. One of the big advantages NVIDIA has is that they dominate with OEMs, but DIY is a lot more fluid. There hasn't been a REALLY good budget option since like the RX580. These will sell well if they actually perform like they are supposed to.
I was thinking about the 4060, now forget about it... the Intel B580 is a BUDGET BEAST... I WILL SURELY BUY IT NOW... 😊
@@Exiztential Yes, I would buy it, but looking back there have been issues with drivers and stuff, sadly... great hardware but horrible software, as far as I understood it??
Hopefully they also fixed the idle power draw which was pretty high for Alchemist first gen GPUs.
The way he moved his little camera image to point at things makes me crack up
True teacher vibes 😂
Pointer Daniel is a true classic
Well he is a teacher haha
Aphex Twin mentioned
You don't need a pointer visualisation layer when you have fingers!
I think it's very reasonable to test even mid-range graphics cards at 1440p. This should now be the standard.
1440p, sure, but 1440p ultra with RT enabled probably lands those cards at 10-20 FPS in some games, making it completely irrelevant.
@@metamon2704 1440p is the new standard, face it. Also, you'd want to test GPUs at higher resolutions to negate any CPU bottlenecks that might occur.
@@metamon2704 What are you smoking? lol This isn't CPU testing but rather GPU tests that we're talking about. If you can't be bothered to watch the video then keep your BS opinions to yourself.
@@Boofski No it's not; the Steam survey says 1080p is the standard, so you're the one who needs to face reality. At this price range 1080p is the best option, though with upscaling techniques I could be wrong; hard to say until you try it. But you can't change the fact that 1080p is the most common among players.
@ Steam survey is overcrowded by a bunch of Brazilians in the favela playing gta San Andreas or CS Source on old DELL workstations with gtx1660 swaps.
The average person that lives in a first world country (aka the demographic for newly released video cards) plays on 1440p minimum
Nvidia's mid range getting hit on both sides with a strong looking 8800XT at the relatively high end and Arc battlemage at the low end.
If 8800XT delivers 7900XTX raster performance with a sizable (20% or more) RT performance uplift for no more than $600, that will be an insane card.
Yeah, I'm excited about the 8800XT the most; that card might be insane if it's about $550. Which is possible, because I think they will be aggressive with their pricing next gen; they've been saying they want a bigger GPU market share, and that's the best way to get customers to switch sides. Nvidia can't keep getting away with their bs
Peeps should keep their hype in check imo lol. From the supposed spec leaks so far I highly doubt it will be 4080/7900XTX raster; more than likely it will be 7900XT raster / 4070 Ti Super ray tracing. If it's $500 or $550 it will be a good product. Wish I were wrong on this, but 7900XTX raster seems like too much with the numbers that have leaked so far.
@@mort996 I feel like Nvidia can do whatever and people will default to graphics card == Nvidia. I had a chat with a colleague today who games; a year ago he bought a used 1070 instead of a more modern AMD or Intel card merely because "I've always used Nvidia", and today he expressed surprise at the Intel reveal, though positively, because "Good for the market with three manufacturers". Five minutes later he said "I hope the 5060 series comes with a decent amount of VRAM because then I'll get one of those", because to him it seemingly does not matter at all what any competition does. GPU === Nvidia
7900XTX? Please cool down for a second and think!
8800xt will have 6% more cores, 3% faster memory, and ~24% faster clocks than the 7800xt.
The 7900XT is 30% faster than the 7800XT, so it will be good if the 8800XT matches the 7900XT, and you are dreaming about the 7900XTX? C'mon... be real.
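To make that scaling argument concrete, here is a quick back-of-the-envelope sketch. The spec deltas are the rumored figures quoted above, not confirmed numbers, and linear scaling is an optimistic assumption:

```python
# Naive scaling estimate for the rumored 8800XT vs the 7800XT.
# Assumes performance scales linearly with compute (cores x clocks),
# which real GPUs never quite achieve: memory bandwidth, power limits,
# and architecture changes all get in the way. The deltas below are the
# rumored figures from the comment above, not confirmed specs.

cores_gain = 1.06   # ~6% more cores (rumored)
clock_gain = 1.24   # ~24% higher clocks (rumored)
mem_gain   = 1.03   # ~3% faster memory (rumored)

compute_bound = cores_gain * clock_gain   # if purely compute-limited
memory_bound  = mem_gain                  # if purely bandwidth-limited

print(f"compute-bound estimate: +{(compute_bound - 1) * 100:.0f}%")  # ~+31%
print(f"memory-bound estimate:  +{(memory_bound - 1) * 100:.0f}%")   # ~+3%
# The 7900XT is ~30% faster than the 7800XT, so even the optimistic
# compute-bound case only just reaches 7900XT territory, not 7900XTX.
```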
We calling an 8800XT mid-tier now?
That's the very end of mid tier or the beginning of high-end cards if it actually is about on par with a 4080S.
I'm surprised that $20 is enough to warrant two products.
It's 30 dollars, but because they are already budget cards that's literally more than a 10% difference in price, which is still sizable.
That’s how binning works, otherwise all the B570 chips which failed to reach B580 levels would be thrown in the trash bin.
@@ZeroUm_ That makes a lot more sense. It's a result of the manufacturing process.
@@ZeroUm_ Still, they should've been cheaper. No one is looking at the B570 because of its price compared to the 580.
@@Boofski Maybe they're just being assertive with the MSRP and the street price will be a bit more discounted.
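On the binning point above, a toy Poisson yield model shows why a cut-down salvage SKU like the B570 exists at all; the defect density here is an illustrative assumption, not Intel's real number:

```python
import math

# Toy Poisson yield model for die binning (illustrative numbers only).
# A die with zero defects ships as the full SKU (B580); a die whose lone
# defect lands in a block that can be disabled can often ship as the
# cut-down SKU (B570) instead of being scrapped.

die_area_cm2   = 2.72   # ~272 mm^2, the widely reported B580/B570 die size
defect_density = 0.5    # defects per cm^2 (assumed, illustrative value)

lam    = die_area_cm2 * defect_density   # expected defects per die
p_zero = math.exp(-lam)                  # P(0 defects): full-SKU candidate
p_one  = lam * math.exp(-lam)            # P(exactly 1 defect): salvage candidate

print(f"perfect dies (B580 candidates):    {p_zero:.1%}")  # ~25.7%
print(f"one-defect dies (B570 candidates): {p_one:.1%}")   # ~34.9%
# Without a cut-down SKU only the perfect dies earn revenue; with one,
# a large share of the flawed dies do too, which is why the B570 exists.
```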
People still throwing money at Nvidia is beyond me. You literally get the worst value per dollar and you still encourage their behavior. Perplexing
I have a 4090 yeeeeeeeee🎉🎉🎉
Nvidia is the only option for 4K gaming. I personally can't do 1440p; I prefer playing on large screens, and 75" 1440p displays don't exist. They would look terrible if they did.
It's like socialism..
For content creation and 3D design, their consistency, reliability, and compatibility are unmatched. A reputation is built, not given, and they have earned theirs. The other options are still interesting, but when you're buying a GPU for business and not pleasure, you want the least amount of problems and issues with anything you may throw at it. It's a similar thing with Dell or HP and the business PC market: there are many cheaper options, but they've built a reputation, and in that field they are for the most part reliable.
Anything over $500 got real muddy last gen. A 10% "Nvidia tax" isn't much considering all the benefits
NoVideo doesn't care AT ALL about budget-oriented cards anymore.
It's been very clear ever since the 4000 series released. They just make the 60-class cards because every OEM in existence will shove a 60-tier card in their prebuilts.
Novideo? Never heard that one. Doesn't really make sense either. Sounds like something that would apply to AMD GPUs due to their history with drivers.
Nvidia's budget cards are last year's GPUs. I've never purchased a budget GPU in my life... and I'm poor.
Why would I buy a budget card when I could buy last year's/gen's mid to high-end card for nearly the same price?
@@DingleBerryschnapps AMD's drivers have generally been pretty good contrary to the FUD spread by nvidia fanboys. On the other hand, nvidia have had some history with both failing GPUs and artifacting problems.
This is good but late, next gen gpus coming out a couple months later will likely destroy it
AMD will have true competitors for sure. Unless Nvidia stops being stingy in areas such as VRAM capacity, it's Intel vs AMD for budget to mid-range options. We'll see what the 5060/Ti cards look like in the future, but I still predict the 5060 will have 8GB of VRAM for some reason. Would like to be wrong tho.
Speaking personally, as a former technician of approx 10 years who also does some small dev work, etc.: I would not recommend buying an Intel Battlemage card until they show much better compatibility and performance across a wide range of games.
What hurts the current Intel Arc series: even the bigger channels have run tests with, for example, hundreds of games from recent releases to 3 years back or so. The Intel Arc only did pretty well in approx 80-85% of the games they tested. I don't have the exact number, but it was in the 80% range; in the other 15-20% or so the Arc series either did really poorly or didn't run the game at all.
What I am saying: DO NOT BUY until you see reviews and people actually testing across a wide range of games.
THIS. The gaslighting I'm seeing from some of these "tech" sites regarding Intel GPUs is 95% HYPE.
However, that's due to the weird architecture choices Intel made; it's also why it doesn't do as well as it should, especially for a GPU with as big a die as it currently has.
I'm still taking it over an Nvidia card, easily, as an upgrade from my 2080.
It was a Hardware Unboxed video: "I Tested Every Game I Own on an Intel Arc GPU". Tim tested 250 games.
You’re overstating the problems. Go watch hardware unboxed’s video where they tested 250 games. 87% of the games worked no problem and it was up to 92% after disabling the integrated gpu. Of the remaining 8% games several are extremely difficult to run and it wasn’t intel-specific, just not realistic for a non-high-end card. In the end it was only about 5% of the games that had real driver problems, and that number is likely to continue decreasing over time like it has for the past several years since Arc launched.
The only consistent issue is lack of day 1 driver support, intel tends to not get support for a bit after a game comes out. Imo this is no big deal since so many games suck nowadays that we have to wait for reviews first to see if it’s even worth buying, so the small delay is not a problem. Others may feel differently though.
@@jgig1329 They didn't factor in a lot of early access, betas, alphas, playtests and UE5 betas. On early betas the arc GPUs won't even run over 50% of the time.
I'd like to see some UE5 performance tests, as Intel has mentioned that Battlemage has better hardware acceleration for features like Nanite. So perhaps it performs disproportionately better than the previous Arc GPUs in UE5 titles.
They've also added hardware support for Indirect Write + more, whereas prior they tried to do it in software, which killed UE5 performance.
That is the pricing I was expecting Intel to land at. Now just hope their drivers aren't absolute trash and that performance holds up. This could be a very interesting generation of releases.
Intel is in a better place now vs 1st gen in terms of drivers, since they have references they can look to for improvement. All they have to do is match or beat where the current Arc drivers are now.
That pricing is amazing if it really does perform well; a 12GB card under $300, wow. Hope its drivers are better. Dec 13th can't come soon enough for reviews.
Not amazing at all. According to Intel's own claims the B580 is around 5% slower than RX 6700 XT 12GB. It will probably be even slower in independent reviews. But the 6700 XT has been around $270 for half a year now. So if someone wanted that level of performance for that money they could have had it a long time ago and didn't have to wait for Intel.
@@JackJohnson-br4qr Yeah, and it's not available anywhere anymore, has awful RT and upscaling, cannot do any AI workloads, and is a lot less efficient. Also, I have never seen a 6700 XT under $300.
@@JackJohnson-br4qr Not seeing 'new' RX6700XTs on Newegg or Amazon for that price, new I'm seeing $500-700 and coming from China, $300 for refurbs that I wouldn't trust for a minute (also from China). The amazing part I think is the RX6000s are out of production and the Arc is giving consumers an unused graphics card (new in a box) option for near similar performance for under that $300 mark. The price to performance over Nvidias 4060 is where it shines (8 gig card hovering around the $300 mark)
@@JackJohnson-br4qr In my region the cheapest 6700 XT is at an insane price of $600. 6750 XT is actually cheaper, but it still costs over $400.
Short Answer: No
Long Answer: Nooooooooooooooooooooo
Exactly
Funny to read it from user with 177013 prof pic
It isn't a strong enough GPU for high FPS at 1440p with ray tracing and high quality settings. The benchmarks are NONSENSE
If Intel doesn't have coil whine they already have a point in my book
Please do us a favor in the next video, could you add more arrows pointing to things?
This is 272 mm2 in die size.
That's 44% bigger than the 4060ti, while being 10% slower than a 4060ti. (188 mm2)
That's 71% bigger than the 4060 while being only 10% faster than a 4060. (159 mm2)
Luckily for them it's 5nm, not 4nm, so there is some cost saving, and a small part of that much larger size is excusable. If it's actually true that it is on a very slightly worse node than Nvidia, that is.
Ummm on Tech power up it shows the die is 406mm2 for the B580 and B570. Also kind of irrelevant to say this die size or that die size because TSMC is making them for all 3 players. And its 6nm not 5 nm that's probably why the cost is so low.
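For anyone checking the percentages in the die-size comparison above, here is a quick sketch. Note the B580 area is disputed (272 mm² in the original comment vs the ~406 mm² TechPowerUp figure cited in the reply), so this just works through the original comment's inputs:

```python
# Die-area ratios from the die-size comment above. The B580 figure is
# disputed in the replies, so these are sketches over the quoted inputs,
# not settled facts.

b580, rtx4060ti, rtx4060 = 272, 188, 159  # die areas in mm^2, as quoted

print(f"B580 vs 4060 Ti: {(b580 / rtx4060ti - 1) * 100:.0f}% bigger")  # ~45%
print(f"B580 vs 4060:    {(b580 / rtx4060 - 1) * 100:.0f}% bigger")    # ~71%
```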
intel is better at making gpu's than cpu's 😭
Both bad
@@Zazie900 Arc is decent; it's definitely come a long way
@@Zazie900 Have you ever checked which CPU brands are still dominating the Steam charts?
@@angnhan9946 popular = good is a crazy mindset
As a 4080 user, I'm absolutely grabbing one of these just to see what I can get out of it. 250 is crazy affordable
The answer is NO. Saved you having to watch
😂😂
Exactly. The B580 is around 5% slower than RX 6700 XT 12GB. And the AMD card has been around $270 for half a year now.
Why are you spamming this in every arc video
@JackJohnson-br4qr Are there still RX 6700 XTs around? Cuz in my country there are none brand new.
@@JackJohnson-br4qr The 6700 XT isn't even available in so many places now, as stock for the GPU finishes. Not to mention, the B580 has multiple other advantages over the 6700 XT. And even at that $270 price tag, the B580 is still $20 cheaper than the 6700 XT, while being just 5% slower in pure raster.
No mention of hardware transcoding performance. Many of us don’t play games but instead use GPUs for hardware video transcoding. Disappointing.
How did I not see the giant "per dollar" at the top before? LOL
That moves it down to 10% faster in raster, 4% faster in RT
I did already catch that they cooked it by forcing the 4060 to hit its VRAM limit on 1440p max settings RT
Idk why they have to mislead like that. Just saying "same performance, cheaper" is already a good selling point
@@KeiGGR probably so they can add a chart where Intel's bar is bigger than the competition's bars lol
@@KeiGGR cos people only read the headline. Only those really into tech will catch the details.
I think that's fair; Nvidia has been neglecting VRAM for generations, to the point where it is already a bottleneck in a lot of real-world scenarios. I also don't think the VRAM aspect was misleading, since they had a whole slide talking about it; it's not like they tried to be sneaky
@@KeiGGR cause the VRAM bottleneck deserves to be highlighted; it's one of the biggest issues with almost the entire Nvidia lineup.
I appreciate the fact they tested on the 14900k. Lends some credibility.
Big if true, but we'll see. Biggest concerns are:
-Drivers
-Scalping
As good as I think this card is, and I think it’s looking (tentatively) *_GOOD_* , who’s going to bother scalping a $250, not really limited production, relatively obscure product? Like, it’s not like this has some name that scalpers are going to recognize attached to it.
@@levygaming3133 We will see; you raise a good point, but never underestimate scalpers' greed lol
@@praisetheoak Scalpers were only really an issue during the crypto mining hype and for founders edition Nvidia cards, wouldn't worry about scalpers here.
I hope it does get scalped. It's signs of a successful product.
@@thelazyworkersandwich4169 do you have intel stocks or something??? Why would you ever want that lmao
If we do the math to find the real performance percentages, we get that in raster Intel is 10% faster on average, and in ray tracing approximately 4% faster... But again, what you said is correct about the VRAM possibly having a big impact here, and we also have to take into account the difference between DLSS and XeSS
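The conversion being done in these comments works like this: multiply the advantage from the per-dollar chart by the price ratio to recover the raw performance gap. A sketch, assuming the $249/$299 launch MSRPs; the per-dollar chart values below are inferred back from the 10%/4% results quoted here, not read off Intel's slide:

```python
# Convert a "performance per dollar" advantage back into a raw
# performance advantage by multiplying the prices back in.
# Prices are the launch MSRPs; the per-dollar ratios are inferred from
# the 10%/4% raw figures quoted in the comments, not from Intel's slide.

price_b580, price_4060 = 249, 299

def raw_advantage(perf_per_dollar_ratio: float) -> float:
    """perf/$ ratio -> raw perf ratio."""
    return perf_per_dollar_ratio * (price_b580 / price_4060)

raster_ppd = 1.32  # ~+32% perf/$ in raster (inferred)
rt_ppd     = 1.25  # ~+25% perf/$ in ray tracing (inferred)

print(f"raster: +{(raw_advantage(raster_ppd) - 1) * 100:.0f}%")  # ~+10%
print(f"RT:     +{(raw_advantage(rt_ppd) - 1) * 100:.0f}%")      # ~+4%
```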
Yes, Intel Arc is absolutely cooked.
Look at the die size of the B580 compared to the competition:
B580 - 270mm^2
7600 - 200mm^2
4060 - 160mm^2
Why does this matter? It demonstrates that Nvidia and AMD are able to manufacture more GPUs per silicon wafer relative to Intel. Additionally, Intel's B580 uses a more advanced process node than AMD's 7600, which further increases manufacturing cost. A higher bill of materials and fixed input costs combined with a lower MSRP indicates that Intel's profit margin is significantly smaller than the competition.
Intel's disadvantage in GPU manufacturing expenses will be exacerbated in the coming months with the release of RDNA 4 and Blackwell series GPUs. For example, Navi 44, the chip that will be used in the lowest RDNA 4 SKUs (including the likely direct competitor to the B580--the RX 8600) is rumored to be around 150mm^2 (I've seen 130-170). Even Navi 48 is rumored to be somewhere between 250-300mm^2! AMD's RDNA 4 will utilize a more expensive process node, but the cost differential between TSMC's N5 and N4P is small relative to the factor of 2 difference in chips per silicon wafer. This disparity in fixed input costs necessarily means that AMD will have plenty of headroom in their profit margins to significantly undercut Intel's price to performance.
Intel is already in dire straits financially. They can't afford to continue dedicating resources to their discrete GPU division without significant return on investment. The fact that Intel's CEO "retired" one day before the announcement of their new GPU lineup is not a coincidence.
I will be curious to see how many of these GPUs actually ship. To me, this screams "paper launch". It allows them to report to investors that they met their goal of delivering Battlemage in Fiscal Year 2024 (just disregard that it came in the final days of the year and failed to even mention their higher end B770 model--which is rumored to be cancelled).
All that said, I think the B580 has potential to be a win for consumers. It appears to be a compelling budget option for 1440p with a strong feature set, but we will have to wait for 3rd party reviews to know for sure. At the very least, it might prompt AMD to get more aggressive with their entry level pricing.
Competition is always a good thing, but unfortunately all evidence points towards this being an unsustainable situation for Intel Arc discrete GPUs. I will be happy if they prove me wrong!
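One rough way to quantify the wafer argument in the comment above is the standard first-order dies-per-wafer approximation. It ignores yield, scribe lines, and edge exclusion, so treat the outputs as relative rather than absolute:

```python
import math

# First-order dies-per-wafer approximation for a 300 mm wafer:
#   DPW ~= pi * (d/2)^2 / A  -  pi * d / sqrt(2 * A)
# The second term approximates partial dies lost at the wafer edge.
# Ignores yield, scribe lines, and edge exclusion: relative numbers only.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

for name, area in [("B580", 272), ("RX 7600", 200), ("RTX 4060", 160)]:
    print(f"{name} ({area} mm^2): ~{dies_per_wafer(area)} dies per wafer")
# B580 ~219, 7600 ~306, 4060 ~389: roughly 1.8x more 4060s than B580s
# per wafer, which is the cost gap the comment is pointing at.
```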
To add onto this, the TDP on a 5nm GPU is pretty high for only being 10% better than a 4060; it matches the 7600 XT's TDP, which is on an older node. It does worry me that this is unsustainable as well. More VRAM, a bigger die, and a need for larger coolers is just not a good recipe
They are just starting. If you think they can enter the market and instantly compete with Nvidia on all technical aspects it is not realistic.
Their best option to be competitive is by price. They need to get market share first. And then better products will come over time.
If the claimed numbers are true the improvements over their first gen are exceptional. And we pretty much know they are true because Battlemage is already doing fantastically in Lunar Lake notebooks.
The A750 is 400mm².
Intel is being shy. They should have gone for a ~400mm² chip (about +$40 on the chip) with 16GB (+4GB, about +$12) at a $450 MSRP (+$200), for a profit of roughly +$193.
A 400mm² Intel chip would perform about +50%, i.e. 4070 level ($500), and Intel would still be cheaper at $450.
Rough BOM guesses (chip + memory + board + cooler = build cost; then shipping and the retail cut; whatever is left is margin):
A770: MSRP $350 | chip $65 + 16GB $48 + board $30 + cooler $14 = $157 | shipping $20, retail $20 | Intel nets $153 🤑 (+97%)
B580: MSRP $250 | chip $85 + 12GB $36 + board $30 + cooler $14 = $165 | shipping $20, retail $20 | Intel nets $45 🤣 (+27%)
7600: MSRP $270 | chip $32 + 8GB $24 + board $30 + cooler $14 = $100 | shipping $20, retail $20 | AMD nets $130 🤑 (+130%)
A750: MSRP $230 | chip $65 + 8GB $24 + board $30 + cooler $14 = $133 | shipping $20, retail $20 | Intel nets $57 🤣 (+43%)
6700 XT: MSRP $330 | chip $60 + 12GB $36 + board $30 + cooler $14 = $140 | shipping $20, retail $20 | AMD nets $150 🤑 (+107%)
6950 XT: MSRP $600 | chip $86 + 16GB $48 + board $40 + cooler $14 = $188 | shipping $30, retail $50 | AMD nets $332 🤑 (+177%)
4060: MSRP $300 | chip $50 + 8GB $24 + board $30 + cooler $14 = $118 | shipping $20, retail $20 | Nvidia nets $142 🤑 (+120%)
4060 Ti: MSRP $400 | chip $66 + 8GB $24 + board $30 + cooler $14 = $134 | shipping $20, retail $30 | Nvidia nets $216 🤑 (+160%)
4060 Ti 16GB: MSRP $500 | chip $66 + 16GB $48 + board $30 + cooler $14 = $158 | shipping $20, retail $30 | Nvidia nets $292 🤑 (+184%)
4070: MSRP $500 | chip $100 + 12GB $36 + board $30 + cooler $14 = $180 | shipping $20, retail $50 | Nvidia nets $250 🤑 (+140%)
4070S: MSRP $600 | chip $100 + 12GB $36 + board $30 + cooler $14 = $180 | shipping $20, retail $50 | Nvidia nets $350 🤑 (+194%)
Chip cost basis: a 300mm TSMC 4nm wafer runs ~$19,000; 117 600mm² dies fit per wafer (4090-sized), so $19,000/117 = $162 per die, and at 80% yield $162/0.8 = ~$200 per good die.
Per TechPowerUp's specs: B580 = 270 mm² on 4nm (wafer price roughly 200%), A770 = 406 mm² on 6nm (price 100% baseline).
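Spelling out the wafer math from that comment; the $19,000 wafer price, die count, and 80% yield are the commenter's own assumptions, not published figures:

```python
# Cost per good die, using the comment's assumptions: a ~$19,000 TSMC
# 4nm wafer, 117 usable 600 mm^2 (4090-class) dies per wafer, 80% yield.
# None of these inputs are published figures; they are the comment's own.

wafer_cost = 19_000   # USD, assumed
dies       = 117      # 600 mm^2 dies per 300 mm wafer, as quoted
yield_rate = 0.80     # assumed

cost_per_die      = wafer_cost / dies           # ~$162
cost_per_good_die = cost_per_die / yield_rate   # ~$203

print(f"cost per die:      ${cost_per_die:.0f}")       # $162
print(f"cost per good die: ${cost_per_good_die:.0f}")  # $203
```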
AMD lacks dedicated RT hardware and doesn't have as much hardware dedicated to ML acceleration. If the 7600 had that, the die would be a fair bit larger.
And it's not like Nvidia has made it a secret that they're using tiny chips and enjoying huge margins.
Really, all the die size says is how far ahead Nvidia is over AMD and Intel, Intel is closer to AMD than it seems.
Thank you as always. I'd like to see some productivity testing: Blender, DaVinci Resolve, etc... anything in that realm.
If that GPU had released when Nvidia released the 4060, it would have shattered the market.
Right now? There's a new generation of AMD and Nvidia GPUs around the corner that will potentially affect the pricing of the current gen, and that's what Intel goes up against... In other words, not so hot.
The 5060 will be 4060 Ti-ish performance for $300-350, which is 10% faster than the B580. Okay price, but people will probably pay a little bit more and go for the 5060 even if it's 8GB. (For 1440p it's not worth buying any 60-class GPU anyway.)
@@Krisz94HU That's an interesting way to put it. If the 5060 is 12GB, the $100 cheaper B580 still sounds alright. If the 5060 12GB is $350 you could save 33% of your money and get something only 10% slower with the same VRAM.
@@Krisz94HU The 4060 is fine at 1440p. Obvs not the same experience as $1000 graphics, but 1440p is certainly viable on a 4060.
@@Krisz94HU 5060 is going to be between 4060 TI and 4070 most likely, the target performance is going to be around 7700XT, which is where I expect the 8600 to also be at, for 250€. You can buy a 4060 TI for that money already, it wouldn't be worth releasing a card this expensive and slow for next gen.
The fact it has more than 8GB is great and all, but I wonder how much of these performance gains exist just because the card is not choking on VRAM, rather than being actual rasterization gains. When they say it's 10% faster than a 4060, I have my suspicion that's mostly because the 4060 is being strangled. If you tested on "high" instead of "ultra", I wouldn't be shocked if they performed identically on average.
But the difference will be that the B580 or B570 will perform better at higher quality settings than both the 4060 and 7600. What's the issue? That's a great thing. This will force AMD and Nvidia to reconsider their new mid-range cards if they don't bump up the VRAM.
It's not just RAW performance; it's also how the upscaling works, future compatibility, and even things like AI Playground. I'm thinking of buying it and selling my 2080 Super, since NVIDIA isn't supporting the 20 series with new DLSS software and such; I can't even use Nvidia frame gen on this card.
So 250 bucks for slightly higher specs than a 2080 Ti in RAW performance, combined with all the new tech, makes for a very happy me.
There do be frame gen mods for games like elden ring, forza, Metro and so on.
Yes, I'm waiting for independent reviews, but if XeSS2 is close to offering DLSS levels of quality and performance it means that Intel have made a massive leap towards bridging the gap that nvidia have opened, something that AMD have failed to deliver on in the past few years. They have lagged far behind in raytracing and FSR is noticeably inferior to DLSS. If Intel have done better in just two years that will hopefully give both AMD and nvidia a much needed kick up the ass.
Until you find out you can't run your games due to driver bugs. LOL. Good luck with that.
@@Rufnek2014 Both AMD and Nvidia have driver bugs too. That's why some people get an AMD card instead of Nvidia, or vice versa. Nothing is perfect, and that includes the monolith that is Nvidia.
I love that you are thinking about how people will actually use the card such as high vs ultra settings and upscaling enabled.
I'm just glad people will have access to functionally good low-end cards. It wasn't that long ago that a low-end card could barely manage 720p 30fps at a similar price.
Too little too late, we needed a GPU faster than an RTX 4060 and RX 7600 for $250 at least a year ago.
No, we don't need much more speed; games don't move forward in graphics anymore. Besides, 33% faster than a 4060 in Cyberpunk is a crazy result!
"Too late" is subjective.
@@lucaslod2731 33% faster for sub-$300 is what the 4060 should have been from the start.
@@lucaslod2731 What? Have you seen the latest disastrous titles?
Regarding Games to Review
If Intel is claiming it's a 1440p card at ultra settings, I think you should also test games like Total War: Warhammer 3 or Pharaoh. Total War doesn't have RT, but it will be interesting to see how it handles those benchmarks. The 12GB will hopefully put it in good stead.
City Skylines 2 and Homeworld 3 would also be interesting.
Hopefully it will hit 60FPS
A 1080p RT-capable card, or a 1440p (not max settings) light-RT-capable card, with 12GB for $250... to me that looks decent.
The alternatives for now are the 4060 at $300 with 8GB, or the 7600 XT, but with worse RT.
RT at this tier is a meme
I would avoid 8GB of memory these days. Games are starting to utilize more than that.
All Intel had to do was be competitive in both raster and RT, staying close to the competition, unlike Radeon cards, which lagged severely behind in RT even on the flagship 7900 XTX. If Intel's numbers are to be believed, they have decent options in both this time around; and if more games adopt XeSS and Intel doesn't fall behind on driver support, then Intel somehow made the biggest gains this generation relative to its competitors.
@@FrowningTaco RT is handled by separate processing units from raster. As long as there are enough of them to handle the workload, it shouldn't completely tank performance when the feature is turned on.
RT is out of the question at that price range.
The 6750 XT was one of the fastest rises in the Steam hardware survey last month. The world listened to your recommendation to buy it, Daniel. 😄
So this is what I expect: Intel ruling the low end, AMD ruling the mid range, and Nvidia ruling the high end.
Reality: NVIDIA ruling everything
@@SalimShahdiOff Sad but true 😔😔
I don't think Nvidia cares about entry level anymore. They're not cooked until someone offers a flagship that outperforms a current 90 series for $500.
AMD is gonna be the one who feels the most pressure from this, not NVIDIA.
Nvidia is the richest company on this planet! I doubt they're worried when they're selling hardware the government won't even allow them to sell to China!
If these GPUs continue to be great value in the future, I'll definitely consider getting one when my RTX 3060 Ti isn't good enough anymore. $250 for a 1440p GPU is great value.
If the claims are true, I want to see this GPU in laptops. I'm sick of 8GB VRAM cards in 2024... and we're almost at 2025.
Then I hope you go buy this card as soon as it's available, right?
I’m stuck on 4gb 😳
This will be even rarer than AMD's mobile GPUs; no OEM is gonna bother with this. If Intel still had the NUC, this would shine in the X15. Otherwise, expect Lenovo to reuse some old Legion chassis and release like 100 units.
Them showing XeSS performance is good; it shows the heavy CPU/driver overhead has been resolved, as it's hitting higher frame rates. Also of note: seeing games that previously sucked on Arc do better, such as God of War, The Last of Us, and Horizon Zero Dawn, is another good sign. Enjoyed the video.
You see how good we had it with the 6700 and 6700 XT: better than this at almost the same price 2-3 years ago...
The 6700 XT was $399 MSRP.
@@EldenLord00 No, it was $479 USD.
@ That's right. I remember getting my Hellhound for $499.
Can you compare the B580 vs the A770, the 4060, and the 4060 Ti? These are all of interest!
No one is cooked, but Jesus, am I happy we have three companies in the GPU race. In the end it all benefits us, the consumer.
Where's the successor to the A770 16gb? How does the performance of these two gpus compare to the A770?
While no one knows how a B770 will end up, their budget B-series offerings being this much of an improvement even over the A750 is great news. It means the B770 could end up being a high-end 1440p to budget 4K card, and that should make Nvidia and AMD scared that Intel is making this much of a leap from last gen to this gen.
Competition is a beautiful thing, isn't it?
B580 will be a very good gpu for budget builds
I want to see this compared to all the old Intel cards. I personally own an A580, have been happy with it, and am excited for this next gen.
The biggest concern I have is the performance charts they showed. There's nothing solid or concrete about them, so I get the feeling Intel isn't confident enough to show more. These will probably be okay, but the 8600 and 5060 will undoubtedly mop the floor with them.
2:43, nothing concrete? Uh...
The 5060 will come in with 8GB of VRAM for $350+ though.
So getting a card with more VRAM for 30% less that still beats a 4060 (remains to be seen) is pretty nice, especially if it has feature parity.
@@corey2232 that is NOT worth it by any means
Anything below the 5080 in the 5000 series just seems to be the previous gen's GPUs overclocked, with PCIe 5 support that only adds an extra few frames.
Most of my games require 10-14GB of VRAM; funnily enough, some are sponsored by Nvidia, who claims you don't need more than 8GB for gaming.
@ Yeah, but there's still the issue of whether Intel will really support these things a few years down the line. What's the point of buying a $250 GPU if games stop supporting it within the next few years? That's a legitimate concern everyone should have, especially since Intel has said that Arc is winding down. It's sad, but at the same time that's why we really need AMD to step it up. RDNA 4 and eventually UDNA sound interesting, but AMD needs to finally take some market share back from Nvidia.
Once again here to request a look at the history of generational GPU price:performance improvement over the years 🙏
Just Google it
AIUI not much has changed - the graph is still going up but at a slightly reduced incline. (Contrary to much public opinion).
Hate the clickbait, love the content. This card is more of a "threat" to AMD than to Nvidia, and that should already be obvious to anyone who can think a little critically.
Nvidia was a "threat" to themselves when they made the RTX 3070 an 8GB card: a holy grail washed down the toilet because of its VRAM size.
AMD? The 6700 XT is faster and has been selling for $250-300 for years now.
This card will just fail again.
Lol, the way you moved your camera to point at the screen, which moved your face, really made me laugh 🤣
I am straight up going to get one. Their last offering came off as a joke... it wasn't badly priced, but nobody wanted into the driver mess we knew was going to happen. They've had 2 years to mature the drivers, and they've finally given us a completely workable alternative to Nvidia at a lower cost.
I see a lot of people trying to get out of Intel 13th and 14th gen chips, many of which are still perfectly fine. Get a 13th or 14th gen i5 in an ITX case and slap this in there. Solid portable system.
Why didn't you get the 6700 XT? It's faster and has been selling for $250-300 for 2 years now.
@@robertmorrison107 I have a 13700K, and putting that chip into an ITX case is asking for trouble. I have issues cooling it with a 360 AIO; I can't imagine an ITX case with an air cooler. Believe me, I considered it, but there would be major issues cooling those chips in an SFF case.
@@NeVErseeNMeLikEDis I have one. Well... the kids' machine has one.
Never did I think I would ever see Daniel Owen use gen alpha words...
I've never played or bought based on price to performance. My all-time GPU ownership: 970, dual 970s, 980 Ti, 1080 Ti, 2080 Ti, and 4080.
Not everyone can afford these cards, so people calculate to get the best bang for their buck.
That's very fortunate for you; 80-class cards aren't feasible for a lot of gamers.
Not everyone has money like that, so price to performance is important.
Thank you for this video. Please test at 1440p & 4k the following games: Cyberpunk 2077, The Witcher 3, A Plague Tale: Requiem, COD Modern Warfare III, The Callisto Protocol, Spider-Man 2, and a driving game to round it out.
Don't give a flying fig until actual benchmarks come out; all companies cherry-pick. We need proper benchmarks from HW Unboxed. Also, comparing 12GB vs 8GB variants is hilarious; why didn't they compare against the Ti and XT versions? We all know why.
Uhhh, you know the Ti still has 8GB, right? Like, there is a 16GB version, but that's in a whole 'nother price bracket.
Because they are the only ones offering over 8GB in the $250-$300 range.
@@farasty7371 The 3060, 7600 XT, and 6750 XT are all above 8GB; cards like the 6700 and 6700 XT were as well. It's the first sub-$300 MSRP card with more than 8GB, but idk if that even matters.
Intel did well comparing this against the 8GB 4060. One of their biggest selling points is 50% more VRAM than the competition; why the hell wouldn't they point that out? Nowadays a good amount of VRAM is enough to sell a card.
Dude... I was saying that at $250 this would be a slam dunk, and I'm really happy about that. Gotta see what the competition comes out with, but I honestly might try Intel this generation if the other tiers are anything to go by. GREAT point about the performance per dollar, though: they're probably comparing against the $299 MSRP of the 4060, so it's barely faster... But that still means the next card up will be a generational improvement, so that may not be bad. You'd still be getting "more for the same price", as it should be..?
$250 for less performance than a 6700 XT, which has sold for $270 for years now.
Yea sucks
From Niktek's and your videos, my takeaway is: it's an Intel RX 6700 XT/6750 XT. But for £250? Alright. Sounds neat.
Edit: oh wow, so the B570 is an RX 6700 10GB? Finally, a GPU around those specs and that (used) price point. So around an RTX 3060 Ti in performance?
Except it actually has usable hardware-accelerated upscaling at 1440p, unlike FSR, and it also has a ton more OC headroom because Intel allows much more voltage and power-limit headroom. It managed +350MHz on average, which should give you about 15% more performance.
In other words, with an overclock you're getting a 12GB 3070 Ti for $250.
@@Frozoken Believe me, if they could, they would. Nobody leaves free performance on the table, but keep dreaming.
@وليدحسيناشتيوي I've seen their slides giving the exact numbers, genius, and they had the exact same headroom on the A770, but memory clocks were locked, which limited OC headroom; although you still got nearly +15% performance on that too.
@Frozoken They also benchmarked at 1440p. THEY KNEW THE 4060 WOULD BOTTLENECK.
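A rough sanity check on the "+350MHz means about 15% more performance" claim above; a minimal sketch assuming Intel's advertised 2670MHz B580 clock (the exact stock clock is an assumption here) and that performance scales at most 1:1 with core clock:

```python
# Sanity check on the "+350 MHz ~= 15% more performance" claim.
stock_mhz = 2670       # Intel's advertised B580 clock (assumption)
oc_offset_mhz = 350    # the average overclock quoted in the thread

clock_uplift = oc_offset_mhz / stock_mhz
print(f"Clock uplift: {clock_uplift:.1%}")  # ~13.1%

# Games rarely scale 1:1 with core clock (memory bandwidth and CPU limits
# also bind), so ~15% is optimistic; ~10-13% is the more conservative read.
```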
Is nvidia cooked?
Is nvidia cooked?!
iS nViDiA cOoKeD?!?!?!
No
No one will touch Nvidia's market share until AMD and Intel GPUs can do anything other than just play games well.
Intel GPUs are insane at video editing, what? Have you heard of Quick Sync? They're also great for AI; it's AMD that's useless for anything other than gaming.
Why is it that people who only play video games don't buy AMD then?
@@razpaqhvh7501 How are they "insane at AI"? Isn't basically all ML software optimized for, or built on, CUDA?
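For context on the CUDA question: CUDA is still the default target for most ML software, but it isn't the only backend. A minimal sketch, assuming a recent PyTorch build (2.4+) with Intel XPU support compiled in; whether the "xpu" device is actually available depends on your install and drivers:

```python
import torch

# Pick the best available accelerator: Nvidia (CUDA), Intel Arc (XPU), or CPU.
if torch.cuda.is_available():
    device = "cuda"  # the path most ML software is tuned for
elif hasattr(torch, "xpu") and torch.xpu.is_available():
    device = "xpu"   # Intel GPUs in recent PyTorch builds
else:
    device = "cpu"

x = torch.randn(1024, 1024, device=device)
print(device, (x @ x).shape)  # the same matmul runs on any backend
```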
Great video. The new cards from Intel are a welcome addition to the GPU market. Can you test Star Citizen with the Intel cards? Thanks.
I hope Intel keeps pushing for higher and higher-end cards; we need more competition in the GPU market.
Give them money and they will do it
Just wait for the actual comparison tests.
Competing against Nvidia's slowest last-generation GPU won't concern Nvidia. The bias D. Owen expresses for Nvidia is shameful; if anyone is cooked, it's AMD.
Yep, the 5060 will probably be 20 to 30% faster in most games, and it comes out in a few weeks.
He's definitely not biased, especially not towards Nvidia; he's just giving praise to Intel for the continued improvements.
Which is deserved.
Yes, Intel doing well hurts AMD, not Nvidia, but that's because AMD decided to play the same pricing game as Nvidia. lol
Which AIB is going to make the "Limited Edition"?
If it had matched the 4060 Ti, then maybe. Otherwise the B580 needs to be at most $200 to make a dent in the market. The 7600 XT offers 16GB and will drop in price to knock out Battlemage.
Please, Intel, get a win with this launch.
1440p @ 60fps should be the new standard for resolution and framerate. I couldn't currently care less about ray tracing; I prefer visual clarity and smooth framerates over anything else (just like most consumers). 1080p is nearly two decades old, and 30fps is just as dated. The PS2 (which, if I'm not mistaken, ran @ 60fps) revolutionized what gaming tech could provide for the user. It's time for GPU makers to take this into account, and maybe not charge $1000 for their products as well. And maybe game devs could quit making games that look washed out and pixelated @ 1080p. I just recently played "The Evil Within 2" @ 1080p... and I have to say it's one of the best-looking 1080p games I've played on a 70" 4K TV (no jaggies, etc.).
1440p monitors are now as cheap as 1080p ones used to be, but 1080p is still the standard for now, at least per the Steam survey.
One good thing about the Battlemage pricing announcement: now NVIDIA and AMD may not overprice their GPUs at launch like they always do.
Never ever listen to Intel's benchmarks.
Never listen to any first-party benchmarks. Always look at third-party reviews.
The 5060 and 8600 are probably going to be in that performance and price tier or better. For 250€, you're probably going to get an 8600 that performs like a 7700 XT with better RT, or a 5060 that sits between the 4060 Ti and 4070. 25-32% over last gen's competition isn't really hard to beat.
Nvidia cooked? No. AMD? Yes.
Nvidia is cooked lol; AMD will be fine once they release RX 8000.
@@b3as_t Nvidia has more money than God from the AI rush at this point; I genuinely don't think they care anymore. I hope Intel and AMD can chip away at some of their gaming GPU market share though.
AMD is about to release an APU that is faster than this lol
i feel sorry for anybody buying a $300 gpu
AMD is already fine; their cards actually work. The 7600 XT has the same performance level on an older node with a smaller die. AMD can kill these Intel cards just by lowering prices on RDNA 3 while keeping comfortable profit margins.
Can't wait to pair it with my budget Alder lake 12400😁
I have a 12600K CPU. Theoretically this CPU could be paired with up to a 4080-level card, but by that tier I'll upgrade my whole system, so any card on the performance tier of a 4070/5070 is what I'll look at for upgrading my current PC.
Counterpoint: Nobody should be buying an 8GB card to play today's games at 1440p, and nobody should be playing at 1440p with settings dialed down to make an 8GB card usable. If you can't get the FPS you want at 1440p native, play at 1080p.
Lil bro, settings on PC are there so you can play how you want. Don't enforce how you wanna play onto others.
@@abz7800 Just my opinion. I have no power to enforce it, so you're safe.
VRAM fascist lol
Will this card support Intel's own frame generation?
No, but Intel sure is, and I bet AMD GPUs won't make a dent in the market owned by Ngreedia next generation either.
Will the B580 be a good upgrade from an RTX 2070? It seems to be the same performance level with more memory, which is the only thing I have problems with on that PC.
According to Intel's own claims the B580 is around 5% slower than RX 6700 XT 12GB. It will probably be even slower in independent reviews. But the 6700 XT has been around $270 for half a year now. So if someone wanted that level of performance for that money they could have had it a long time ago and didn't have to wait for Intel.
do you just go around copy pasting your answer everywhere?
On Amazon the cheapest RX 6750 I found is 330 euros. On Newegg there is no 6700 XT for $270, only $300, and that one's refurbished.
So I'm not sure where you're finding these cards, but it seems to me you're a bit full of it.
Also, the 6700 XT launched at like $450 USD MSRP or something like that.
Edit: cheaper and nearly as fast as the 6700 XT seems reasonable to me; not sure what you expect.
@@_qwe_fk_1700 Did you even watch the video, or did you just come here to criticize others? 9:48: the 6750 XT is even 10% faster than the B580 and is only $300 now.
Anyway, I'm not your buying guide. I'm just saying that everyone has had plenty of time to buy the same performance for the same money. Intel's nothingburger.
@@JackJohnson-br4qr So 10% faster but 20% more expensive; seems like a decent deal for the Intel card.
@@_qwe_fk_1700 Yeah, a brand-new card "competes" with a 4-year-old card that's at the end of its lifetime.
@@JackJohnson-br4qr Yeah, a 4-year-old card that had an MSRP that was double...
What stupid logic from you; based on that, there should only ever be high-end cards. No more 50/60/70 series then, just 80 and 90 series...
Yes, they offer the same performance as older cards, but at a lower price.
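To put numbers on this exchange: a minimal break-even sketch using the thread's own figures (a $250 B580 vs a $300 RX 6750 XT assumed to be about 10% faster); these are the commenters' numbers, not measured results:

```python
# Break-even price for the 6750 XT vs B580 value debate above.
b580_price = 250
rx6750xt_price = 300
perf_ratio = 1.10  # 6750 XT assumed ~10% faster (thread's figure)

break_even = b580_price * perf_ratio
print(f"Value-neutral 6750 XT price: ${break_even:.0f}")  # $275

value_gap = (perf_ratio / rx6750xt_price) / (1.0 / b580_price) - 1
print(f"6750 XT value vs B580 at $300: {value_gap:+.1%}")  # about -8%
```

So at these numbers the 6750 XT would need to drop to roughly $275 to match the B580 on performance per dollar.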
I'd like to see it benchmarked with just an AMD Ryzen 5 and 16GB of DDR4. So many people out there have these, and they will be the ones looking at this card.
I was hoping to upgrade my 1080 Ti, but I think it's still too close to this if you don't use frame-gen magic. Maybe the B770 will still come in at a $380 price point or something.
It's nice that Intel is in the game (albeit late again for X-mas). Earlier this year I "down-upgraded" from an RX 7600 to an RX 6750 XT to support my 1440p monitor. I picked up the 6750 (new) during Amazon Prime week for $280. No need to grab another GPU any time soon.
As far as I'm concerned with the Intel Battlemage GPUs, I'd like to see how well they perform in older DX9 & DX10 titles; this was a blind spot when the Alchemist GPUs initially launched.
While it's fun to play the latest and greatest titles, sometimes I like to kick back to an oldie like Oblivion or C&C3: Tiberium Wars.
Really appreciate the breakdown of what Intel could be fudging. Looking forward to reviews from you and everybody else. My cousin might not go for Intel, but I am on the lookout for when my 1080 Ti dies.
This is the right move by Intel. Good stuff. I'm not looking for a GPU, but this is a great thing to see. Even if it ends up being slightly slower than a 4060 in non-VRAM-starved situations, it will still age better, and the price helps too.
Sounds good on paper, as they always do; let's hope the drivers follow up better than they did with the A series 😅
So, AMD and Intel are now slightly better than Nvidia cards from 2 years ago, but Nvidia is cooked? That's a funny take.
Intel is definitely going to get some market share with this if there are no major drawbacks
They need to crank out more performance to go with that VRAM to completely eliminate Nvidia from this price tier.
In the end it comes down to the drivers. If the drivers are the same as last gen's, it won't matter what charts you show me.
Betteridge's law of headlines...
The only thing I'm sure of: AMD will miss the catch no matter what.
I really like how Intel is highlighting how Nvidia is cheaping out on VRAM.
The fact that the 4060 costs $300 makes me sad, because that's $300 USD, which translates to roughly $450 in my country.
They don't care, but we should be happy; competition is better for us.
40% yap, 10% actual good data, 50% repeating what he's already said... garbo video
Hi Daniel, I'd really like to see Topaz Video AI tested on the B580/B570, thanks.