Can I ask why the comparison between the older cpus vs newer, instead of rebar on and off? I was thinking its already "discovered" and tested, that the problem is rebar, not the cpu itself being old, so I was expecting expanding on that specific topic, or maybe even ask a question if that is something somebody should be worried about, and how old platform need to be not not even have rebar.
I'm curious about performance with raytracing. I would like to see testing done on Indiana jones. If games are going to start requiring raytracing I want to know how the intel cards perform in an environment where its the only option.
Review for low-end GPU almost irrelevant on your chanell. People that buy in this budget don't care about 4k and even 1440p resolutions, same for ultra-settings in AAA-games or RT perfomance. They didn't play games at 28-32 fps. And they don't care how perfomance of 3060 stack up to 4090. They care about getting 60 fps in 1080p to play with vsinc enabled, without drops and spiky frametime, with good input lag. Or play a bunch of competitive games (CS, Fortnite, Pubg, DOTA, WoT etc.) with low settings at higher fps possible. So you get all wrong about how benchmarking this products. In single-player games - 60 fps is a ground, the question is - with what settings you can get it? In competititve games - what input lag and consistency you get at the same low "competitive" settings?
??? are the brackets in the benchmarks (12/24) or (1/25) dates they were tested or mean something else?? because its 1/16, just wondering if its a typo and supposed to be (1/15) as the day tested with drivers as I'm confused what those numbers mean..... unless this video came from the future then I apologize for the stupid question
No, just no. You can’t keep dropping new shiny reviews every other day-that’s straight-up against RUclips’s Terms of Service Policies. WTF, bro? Seriously? 😳😭
As a owner of the 1080Ti... its getting so much harder to choose whether to stay or upgrade, whilst thinking the whole GPU economy is so incredibly overpriced.
One of the if not the greatest! I ask every 1080ti owner this.. Did you buy it new at launch for 6-700 USD and if so why not buy another 6-700 USD card that is a significant upgrade to the 1080ti now all these years later? Adjusted for inflation that 6-700 is even more. Of course if you got one used for 200 USD or less yes it is going to be tough to beat for similar cost today.
I'm willing to bet the 5070 Ti or 5080 should be a pretty decent upgrade if you're looking for something in a similar price range to the 1080 Ti when it launched. If you're looking at budget stuff then ehh you might wanna wait another GPU generation unless there's a game you want to play that you can't run. But the B580 or B570 looks to be a decent option.
The ball is in their court. And bank. How far can they fund coming up short? Cause idk a single person who uses an Intel card due to these videos. The community that frequents the store all say that they don't wanna invest in bs problems they'll have to troubleshoot and with Intel being so new, they won't commit to getting any products that aren't cpu's
I'm not holding out much hope. Their own road map promised "enthusiast" tier cards for Battlemage launching in 2023-2024. Yet, all we got was one entry level card in December of the second year. Doesn't instill confidence
@@SansAppellation one incredibly well received, well selling mid range card undercutting the current market which is inflating in price. Intel is seeing success. The GPU branch if anything will branch off
Thank you for all of this work. The sheer amount of configurations that could affect a product are insane. AMD vs Intel, x3d vs not, more graphics cards, resolution, etc. Completely mind blowing...
@@GamersNexus you need THE MEGACHART with DROPDOWN MENUS :D, mabye a tech reviewer scene team effort. Especially now that AMD tries to switch to sku overlord confusion tactic with different categories accel in different thing. If others try to push this we and especially you the tech reviewer scene will have a problem. We already have the problem to figure out and pick the right hardware synergy without having big gap in bottleneck and not pay unnecessary when we have literal see of hardware choices and combinations, somehow this needs to be alleviated somehow.
@@GamersNexus I really really appreciate your methodology, and your considerations at the first part of the video. Thank you. This is at the heart of why some of us have criticized your (and other reviewers') methodology. Checking with a few tests to at least confirm that your usual assumptions about performance scaling behaviour still hold before going on to do standard tests gives a huge boost to the confidence we can have in your results in the end. We understand the combinations are mind blowing, we do, but this level of cross-checking is extremely valuable to us. Again, thank you.
9:50 i just wanna take a moment to give a big shout out to the editor for guiding the viewer to the benchmark Steve is talking about. As a viewer it can get hard to follow along especially when looking at data and trying to interpret them on the fly. So major props to you.
I think it's pretty impressive for Intel in merely two generations of GPU to largely have caught up to the two giants in this space given their performance tiers and pricing.
Caught up? Only if you ignore how big die and power needs of this card is. It's a flop, good for US consumers as this fake MSRP is there somewhat available but Intel obviously used that only to get good reviews and expect for rest of the world to pay for difference between what they need to make this profitable (or at least to be at 0).
Wrong. Their old iGPU before Iris-Xe can't be apart of this "GPU" thing as it technically was meant as a display output and nothing more. Technically, Intel started up properly on the GPU business when they came out with Iris-Xe, before going into Xe Max and that was their .5 of a Generation. Intel Arc Alchemist came out and that was their first proper start
They haven't *really* caught up - they're using much bigger dies than AMD and Nvidia for those cards, despite using similar lithography nodes. Most likely that's the reason they can't scale to high-performance cards yet - they'd be way too big and power-hungry to make sense
i concur. idle wattage for CPU and GPU is probably just as important since half the time, our computers are idle or doing low task like watching youtube or typing up papers.
I dont think they have ASPM L0/L1 enabled in BIOS and power plan set to maximum power saving for PCIE as they should. B580 idles around 13W what I saw in different review.
@PunzL For B580 it's pretty good, I think I saw one store always ran out of stock at the end of the week and have them restock by the next monday. But since the review about the overhead problem seems like they sold it less and less. For the B570, the stock is also good, but I think it's because people don't want to buy it yet, since there was no review available
@@ShroudedWolf51Because some people are willing to deal with the issues it has. Your friend with their A750s have had issues but me with my MSI Claw never had any issues besides stuttering, which is a trait of Alchemist
@@GamersNexus you guys are crushing it and the standard has only gone UP since I subbed many moons ago. I'm so glad to see GN rocketing into the position of broad respect and reference that you have worked so hard to build towards.
I'm still daily driving a 6900XT from 2021! It's a great GPU, and I upgraded to 1440p/165 mid last year (1080p/60 before, I needed the horsepower for non-gaming tasks). It's a bit of a stretch asking it to do that, but in a lot of games it can reach it perfectly with great quality!
Me over here with my i5-8400 and rx580 4gb..lol. I'm gonna upgrade soon just don't know what to do first. Prolly gonna buy a 6700xt to get me buy till I get a whole new mobo
I have to say thank you so much for the bars on the sides to indicate how long sections are. Also i cant wait to get this years disappointment build shirt ill have '21 '22 '23 and '24!
dude you are collecting the Infinity stones of Dissapointments haha in shirt from, personally i think 24" takes it tho as the Reality stone, ya know, since ppl out here making it whtevr they want
As someone who's still on a 1070, it's much appreciated! I'm eyeing the Battlemage cards, but I've had some hesitations... Scalping being one of them, haha.
As a A770 user I think Intel really outdone themselves. B580 firmly beating the A770 as a mid-range card, with the $220 B570 getting really close. B570 might very much be the best value card this generation. I might not upgrade anytime soon but it's nice to see this much improvement and keeping up with the competition!!
I had a look this morning in Australia. The B580 is back in stock in shops with the release of the B570. However the price has gone up $40 and the B570 has launched at the price the B580 launched at. Glad I got one from the launch batch.
I had a look and there is still one store in Victoria selling for the original price but yeah disgusting that every other store would take the chance to pricejack the 580 before even listing the 570 it would make fairly comparing the two on value impossible to the average consumer who doesn't know historical pricing charts exist. You want to support local business but then they do this cartel tier price fixing collusion garbage so ordering dodgy stuff off china direct looks more and more attractive despite the risks, story of Australia I guess.
This is a very useful video to show how so many people fundamentally misunderstand the words "bottleneck" and "limited". They are not the same thing. Thank you GN!
I really don't think they need to compete at the current high end, as to me Nvidia has lost the plot in how far they have pushed the high end. Almost nobody really should be wanting to put a 600 ish Watt GPU of that price in their system! It is just too hot, too expensive to run, and the performance uptick it brings really isn't worth it for most folks when compared to the lower-mid range Nvidia or the Intel/AMD competitors that are targeting that part of Nvidia's range... The 4090 and now 5090 are insanely impressive in some ways, but so expensive, so power hungry they just don't make much sense for the general public level of gamer.
@@john_in_phoenix Knowing Nvidia, I doubt it. Ray tracing and Frame Generation has its own downsides, and developers not having option to turn it off unnecessarily raises the hardware requirements
Same, and I am happy with the purchase. It out performs my 3070 in most games that don't favor dlss, or other rtx specific features, and it does so on an 3700x where the 3070 is on a 5700x. I don't regret this purchase as the other options in this price range are used or just plain suck.
If intel makes a super tiny Battlemage card, such as single slot, I'd love to buy it to put in a tiny slim PC case I'm designing to look like a 1990s Thinkpad tablet. I'd use it for FFXIV.
IIRC it was possible to swap a A310 single slot cooler onto a A380, maybe that will happen again, I would really like to see a low profile B380, as would millions of SFF office PC owners, especially as the 8th and 9th gen PCs are hitting the bargain bin, and 10th Gen systems are popping up on the surplus market for under $100
It would be great for people that want the ultra low power home lab, since Intel arc cards were really good in encoding and decoding. A next gen one with maybe even lower power draw would be awesome
I can't imagine the extra hours it took to retest the GPU with last-gen (or earlier) CPU's, but it's appreciated, especially by someone still on AM4 and not planning on upgrading any time soon.
A decend, brand-new GPU that doesn't cost more than twice the rest of the PC... didn't really thing we'd get those. My mind cannot process how Intel wound-up being to GPUs what AMD was to CPUs a long time ago in a galaxy far away...
Are there many other youtubers who avoid teleprompters? It's been years since I've heard "pun unintentional" from a host, and that's because it's a really unlikely thing to say when reading off a prompter. Anyway I love it, it's really enjoyable to get carried away in Steve's flow and follow his train of thought. And being only loosely scripted obviously brings the review itself up a notch. I hope you guys are around for a long, long time.
I really appreciate how, when you open a video with a question, ie, "Does a poorer CPU make this video card worse?" you don't leave anyone hanging and just immediately go "No, not by much". I do want the details, but if I didn't, you wouldn't be wasting my time at all! Thank you!
I feel like in the Twilight Zone. Everyone keeps saying the B580 is such a good deal, meanwhile after a month the card still can't be had at MSRP in any market, while AMD's direct competitor cards are actually available at cheaper prices for the same performance.
It's a very challenging industry to get into, due to software/driver support and complexity. I think it is impressive that Intel is able to get into this successfully. One can only hope they improve and start becoming a competitor to Nvidia and AMD, we need the variety
@ I had Asus P5G41T-M-LX 775 board with dual-core Celeron and gaming on the chipset graphics Intel ICH7. This guy doesn't know that Intel has been doing this for over 20+ years on graphics.
Production volume for GDDR7 is very low and expensive. Even AMD couldn't get them for RX 9070XT , so they run on 20Gbit/s GDDR6 . So , what you're asking for Battlemage is too much.
Thanks for testing lower end CPUs, that is real world scenario for a low end graphics card. Also comparing them to old gpus to show the real world gains is great.
I just got a $250 3060 12 gig and feel fine with my purchase, right in the middle of the two ARC cards and I was able to test in my PC, with my intended games before handing over cash. Also met some other automotive enthusiasts in the process, funny how the used market can be great some times.
@@llothar68 Mate, $350 is incredibly cheap for a GPU nowadays. Sure, $350 is a pretty hefty chunk of money and if the card is worth it is a whole other story, but it's still cheap. 'Cheap' and 'expensive' are both relative terms in relation to the alternatives. Obviously scalpers upping the price by two-fold isn't helping anyone, either.
You can get a 4060 for that price, so no it's not cheap. Or if you say it like this, it's not as cheap as it should be compared to the competition. @@avananana
Intel GPUs should NOT be going into some noobs build, it'll only end badly. There are way to many issues for someone new to the scene to troubleshoot constantly.
Nvidia: Budget gamers don't deserve to game. Intel: i got you budget gamers. Clearly intel still have lots of work to do if they really wants to compete in the GPU market but B570/B580 is a good start. Hope they continue to get better in the future.
also Intel - we've finally caught up to where the budget range was 4 years ago, just as the new generation of Nvidia/AMD cards are set to make us look really bad
@@MaximéGodModeI have 2 Toyota trucks, Komatsu backhoe, a ranch and house and I use 1050ti. Some people not poor, they just have a life outside playing games. Or they are poor, but smart enough to not spend on something unnecessary. Do you buy a pc with your own money or asking mommy or daddy for your 4090 ?? Mister rich polish man here poor shaming people on the internet like some kind of a big shot….
I was able to get B580 Asrock Steel Legend for 340€ (this includes crazy taxes in my country) and I am really happy with it. It's price is currently 380€, so still cheaper than 4060 Ti
As someone that is still running an i5-10400 and an RX 570 8GB, thank you for including those older cards so I have a frame of reference for what an upgrade would look like for me. Keep up the awesome work!
Man, I miss when this was midrange GPU pricing. My first one I remember buying myself was around this price (8800 GT?GTS?). Keep up the good work Steve n crew!
@@_etwas The A770 is in the middle of the list with the top tier CPU, and at the bottom of the list with the 5600X (which is no slouch, tbh). That difference was drastic.
@@Retro-Iron11 exactly, i bought an x1950 pro for £95 back in 2006, I'd expect to spend at least 3x that now for GPU before we even get into how much better entry level gaming is now. You can run pretty much anything if you are prepared to turn some settings down. Back then, entry level couldn't. (see Crysis lol)
I can see two classes of benchmarking: one where everything not tested is as good/high powered as possible to best eliminate bottlenecks and show the tested part in the best case scenario, and two, where test bench parts are chosen "budget category" specific. If you're buying a $150 GPU, you're likely not buying a $700 CPU or a $400 motherboard. It would take a lot of editorial choice in part selection at different budgets, but I expect that is what people want out of a review: "How good is this part?" and "How good is this part actually going to be with what I already have, or can afford to build up/replace?".
Thank you for this. I think it's really helpful to have different CPU testing with GPUs since it gives consumers a better idea of what they might actually benefit from in their situation. It is a lot more work, though, so I understand why it's not something that you can do all the time.
8:06 "There's no substantial change to the B580's performance across these three CPU's with this test configuration" So.. we're just going to ignore the 1% lows?
Gotta say I'm loving the additional CPU's in this review. The combination of the no CPU bottleneck high end CPU and the very relevant "this is what people buying this GPU will likely be paring it with" data for budget to midrange CPUs.
you know what would be hilarious to show, just simply to demonstrate how far CPU technology has really come do all of these tests but with a really old CPU like an AMD FX8350 or something
I'm just waiting to buy a B580 for under $300 rather than a 4060. Intel is doing a good job with these cards. I am currently running a 3060 that I seriously overpaid for during covid. Just upgraded my 5600 (no x or x3d) to a 5700x3d and 32 gig memory (newegg bundle). Since the monitor connected is only 1080p, I can wait until the price gets closer to MSRP.
It was really cool to see that you guys did the work to test the CPUs. I know you guys do your best to detail games, but Adobe also has benchmarks, and they can be drastically different. Any way to include photo/video rendering speed compares to? I bet more of your audience would care about that than you think.
7:59 there's a big difference in 1% lows and 0.1% lows with the lower-end CPUs with a B580, which is where such driver overhead issues are most likely to manifest anyway. But you don't even mention that, and just say that there's no substantial difference.
Excuse me, but what you are saying at 09:00 goes contrary to the chart: Neither 4060 nor 7600 gets any damage to 0.1% lows (when CPU gets downgraded), but B570 and B580 gets stutterstruggle™. Average FPS is good, but average FPS is (sadly) not the best gaming experience metric.
Great video! I'd love to see the 3080 in the current efficiency tests to compare with the B580 and 7800XT. I believe it's still a good reference point as I think Nvidia pushed that card quite hard when it came out.
The suggested prices are a joke. In germany b580 costs close 350eur while rtx 4060 costs less than 300eur not even mentioning rx6800 used that can be had for 250eur. So this cut down version will be close to 300eur.
@mightychev bought reference 6800 which is still very nice in Germany for 260eur shipped. I guess that's very low but you can get one easy for 280 which I still think is a good price for such a nice gpu that will be still usable for few years.
14:28 never mind the Arc cards, look at those lows on the 4060 in BG3 at 1440p 😮, they absolutely tanked on the slower chips and those cpus aren't even that slow.
While these aren't for me, I'm glad intel is getting into the lower end market to help drive low\mid tier costs down. My Factory Tour skeleton shirt should arrive tomorrow 😃
Agree, I own an A770, and am considering replacing my 6950xt, if the price is right on a B770 or equivalent I plan to pick one up. I am fortunate to be able to pick up something like that out of curiosity, it will be a tougher decision for people who only buy a single GPU every four or five years.
I like how steve respects his viewers time and gives us the main result upfront and then if we want to see the process we can watch the whole video. ❤ thanks for not holding us hostage through your videos.
Really happy to see the B570/B580 cards being affordable and with a decent amount of vram. They make it seem possible to make a backup/smaller lan PC without breaking the bank completely.
it is a live-service game, which means continuous updates, which means long-term test consistency is impossible. So, bad idea. (But, videos testing a popular game on a bunch of GPUs may be a useful idea)
While hats off to Intel for these GPUs, especially considering the price point, 1% lows and 0.1% lows are painful. I have a setup with a 7900XTX and an i5 11600K....yeah, I know, I'm waiting for 9800X3D stock atm...My fps would be awesome most of the time, but 120 fps averages are quite hampered by 70 fps 1% lows and such in my case, for example. This means that I'm CPU bottlenecked half the time, while the other half my 1% lows are abysmal, and knowing this, similar average performance of the B570 does not hold a candle to stable 1% lows of other cards in my book. Don't get me wrong, I am all for Intel shaking up the market with ARC, and if they can fix these stability issues, they can prove to be a worthy competitor. These are still very much interesting products, regarding generational improvements over Alchemist, Battlemage is nuts, so looking forward to the future!
They both make really good cards. Saying the company sucks doesnt mean their products suck. Amazon is one of the worst companies to work for in the us and you keep buying everything from them.
"Good" and "bad" are a function of both performance and price (perhaps energy consumption/heat depending on preferences), if performance is comparable and the price is lower, it's a good product.
As always good job GN , These tests are getting complicated. I'm still fine with my 5600 6650xt, maybe when I get a 1440p monitor I'll consider upgrading. Thanks Steve.
Thanks for including the extra CPU's! Would like to see some of this for other products, even if it's just to a lesser degree for checking if say, RTX 5000 is scaling correctly on lower end hardware. The normal testing methodology still holds up for using the highest end available CPU, but compatibility and overhead are good things to at least check, right?
@@GamersNexus Sure, but the fact that there's a legitimate concern whether or not a specific game is going to run terribly or not is certainly not an ideal customer experience
@@vince2051 I think so too. Honestly, if I was in the market for one of those cards, I would probably hesitate to buy the 580 because I'd rather have a more consistent performance rather than sometimes better than the competition, sometimes worse with frame stutters. So then I would much rather buy 4060 or 7600 personally for more consistency.
Grab our Disappointment Tour 2024 T-Shirts here (cotton)! store.gamersnexus.net/products/disappointment-pc-2024-oxidized-design-cotton
Find our prior Intel B580 review here: ruclips.net/video/JjdCkSsLYLk/видео.html
Watch our thermal engineering video about the RTX 5090 here: ruclips.net/video/-p0MEy8BvYY/видео.html
Can I ask why the comparison between the older cpus vs newer, instead of rebar on and off? I was thinking its already "discovered" and tested, that the problem is rebar, not the cpu itself being old, so I was expecting expanding on that specific topic, or maybe even ask a question if that is something somebody should be worried about, and how old platform need to be not not even have rebar.
I'm curious about performance with raytracing. I would like to see testing done on Indiana jones. If games are going to start requiring raytracing I want to know how the intel cards perform in an environment where its the only option.
Review for low-end GPU almost irrelevant on your chanell. People that buy in this budget don't care about 4k and even 1440p resolutions, same for ultra-settings in AAA-games or RT perfomance. They didn't play games at 28-32 fps. And they don't care how perfomance of 3060 stack up to 4090. They care about getting 60 fps in 1080p to play with vsinc enabled, without drops and spiky frametime, with good input lag. Or play a bunch of competitive games (CS, Fortnite, Pubg, DOTA, WoT etc.) with low settings at higher fps possible. So you get all wrong about how benchmarking this products. In single-player games - 60 fps is a ground, the question is - with what settings you can get it? In competititve games - what input lag and consistency you get at the same low "competitive" settings?
??? are the brackets in the benchmarks (12/24) or (1/25) dates they were tested or mean something else?? because its 1/16, just wondering if its a typo and supposed to be (1/15) as the day tested with drivers as I'm confused what those numbers mean..... unless this video came from the future then I apologize for the stupid question
No, just no. You can’t keep dropping new shiny reviews every other day-that’s straight-up against RUclips’s Terms of Service Policies. WTF, bro? Seriously? 😳😭
As a owner of the 1080Ti... its getting so much harder to choose whether to stay or upgrade, whilst thinking the whole GPU economy is so incredibly overpriced.
The 1080 Ti truly is the GOAT!
9070xt is going to be phenomenal.
Lol. You must not be picky about frame rate. ..lol
One of the if not the greatest! I ask every 1080ti owner this.. Did you buy it new at launch for 6-700 USD and if so why not buy another 6-700 USD card that is a significant upgrade to the 1080ti now all these years later? Adjusted for inflation that 6-700 is even more. Of course if you got one used for 200 USD or less yes it is going to be tough to beat for similar cost today.
I'm willing to bet the 5070 Ti or 5080 should be a pretty decent upgrade if you're looking for something in a similar price range to the 1080 Ti when it launched. If you're looking at budget stuff then ehh you might wanna wait another GPU generation unless there's a game you want to play that you can't run. But the B580 or B570 looks to be a decent option.
Just hope Intel pushes on and doesn't shut Arc down, we need competition.
The ball is in their court. And bank. How far can they fund coming up short? Cause idk a single person who uses an Intel card due to these videos. The community that frequents the store all say that they don't wanna invest in bs problems they'll have to troubleshoot and with Intel being so new, they won't commit to getting any products that aren't cpu's
Edit: Intel GPU being so new, i meant, obviously
buying > hoping
I'm not holding out much hope. Their own road map promised "enthusiast" tier cards for Battlemage launching in 2023-2024. Yet, all we got was one entry level card in December of the second year.
Doesn't instill confidence
@@SansAppellation one incredibly well received, well selling mid range card undercutting the current market which is inflating in price. Intel is seeing success. The GPU branch if anything will branch off
Thank you for all of this work. The sheer amount of configurations that could affect a product are insane. AMD vs Intel, x3d vs not, more graphics cards, resolution, etc. Completely mind blowing...
There can be literally millions of combinations of hardware! Pretty wild.
@@GamersNexus you need THE MEGACHART with DROPDOWN MENUS :D, mabye a tech reviewer scene team effort. Especially now that AMD tries to switch to sku overlord confusion tactic with different categories accel in different thing. If others try to push this we and especially you the tech reviewer scene will have a problem. We already have the problem to figure out and pick the right hardware synergy without having big gap in bottleneck and not pay unnecessary when we have literal see of hardware choices and combinations, somehow this needs to be alleviated somehow.
This is exactly why optimizing a PC game can be so difficult for devs.
@@GamersNexus I really really appreciate your methodology, and your considerations at the first part of the video. Thank you. This is at the heart of why some of us have criticized your (and other reviewers') methodology. Checking with a few tests to at least confirm that your usual assumptions about performance scaling behaviour still hold before going on to do standard tests gives a huge boost to the confidence we can have in your results in the end. We understand the combinations are mind blowing, we do, but this level of cross-checking is extremely valuable to us. Again, thank you.
9:50 i just wanna take a moment to give a big shout out to the editor for guiding the viewer to the benchmark Steve is talking about. As a viewer it can get hard to follow along especially when looking at data and trying to interpret them on the fly. So major props to you.
I think it's pretty impressive for Intel in merely two generations of GPU to largely have caught up to the two giants in this space given their performance tiers and pricing.
Caught up? Only if you ignore how big die and power needs of this card is. It's a flop, good for US consumers as this fake MSRP is there somewhat available but Intel obviously used that only to get good reviews and expect for rest of the world to pay for difference between what they need to make this profitable (or at least to be at 0).
@@gorky_vk Flop is right! It was ludicrous launch!
Only halfway true tho, they have been in the GPU space 27 for years now.
Wrong. Their old iGPU before Iris-Xe can't be apart of this "GPU" thing as it technically was meant as a display output and nothing more.
Technically, Intel started up properly on the GPU business when they came out with Iris-Xe, before going into Xe Max and that was their .5 of a Generation. Intel Arc Alchemist came out and that was their first proper start
They haven't *really* caught up - they're using much bigger dies than AMD and Nvidia for those cards, despite using similar lithography nodes. Most likely that's the reason they can't scale to high-performance cards yet - they'd be way too big and power-hungry to make sense
I can't believe Steve didn't point out that FPS/W is the same as Frame per Joule!
Yeah, feels like a first.
Hahaha
yay, he/they understood that clear communication is more important that cancelling out as many units as possible.
@@deepspacewanderer9897 i thought we were going to get it as frames/pizza from now on. So disappointed.
The Power Consumption charts (Idle & Efficiency) are so good! Keep it going, please ✌
Thank you!
@@GamersNexus Any chance you add lock to 60fps efficiency in the futur ?
i concur. idle wattage for CPU and GPU is probably just as important since half the time, our computers are idle or doing low task like watching youtube or typing up papers.
I dont think they have ASPM L0/L1 enabled in BIOS and power plan set to maximum power saving for PCIE as they should. B580 idles around 13W what I saw in different review.
Totally agree, the idle power in my use case would steer me to the 4060.
We need a B770!!
I hope we get one
Christ tell me about it!
Yeah fr
The longer Intel is waiting and not announcing whether or not B770 is even a thing, the more I'm inclined to just buy AMD's RX 7900 GRE or RX 9070.
with 40cus, and 32 gigs of ram for 700 dollars
Funny how this GPU released about a week ago here in indonesia without review and/or a usable driver lol
ha! Saw that online!
How's the stock availability? (asking from a close neighbor nation)
Berapa harga indonya bro?
and the price , B570 at $280 and B580 at $340 is insane
@PunzL For B580 it's pretty good, I think I saw one store always ran out of stock at the end of the week and have them restock by the next monday. But since the review about the overhead problem seems like they sold it less and less. For the B570, the stock is also good, but I think it's because people don't want to buy it yet, since there was no review available
Jensen saw this $220 card with 10GB of vram and wiped a single tear away with a $100 bill.
Why would he use his toilet paper on his face? He’d be using a leather handkerchief, like a real gentleman! /s
@@xKB616 The finest monogrammed lambskin chamois, of course.
I am installing drivers for it right now ! super excited !
Who cares
@@MaximéGodMode 6 people
....why not watch the review and THEN order once you know it's actually worthwhile?
@@ShroudedWolf51 I didn't want to wait
@@ShroudedWolf51Because some people are willing to deal with the issues it has.
Your friend with their A750s have had issues but me with my MSI Claw never had any issues besides stuttering, which is a trait of Alchemist
Thanks for doing these videos GN.
Thank you for watching!
@@GamersNexus you guys are crushing it and the standard has only gone UP since I subbed many moons ago. I'm so glad to see GN rocketing into the position of broad respect and reference that you have worked so hard to build towards.
Steve: we will test it with an older 12400 CPU.
Me: looks in awe to my 10700k, just hanging the 6900XT in there…
The 10 series was great! So is the 6900 XT!
still rocking my 10700k too :))
The 6900xt is still very good man
I'm still daily driving a 6900XT from 2021! It's a great GPU, and I upgraded to 1440p/165 mid last year (1080p/60 before, I needed the horsepower for non-gaming tasks). It's a bit of a stretch asking it to do that, but in a lot of games it can reach it perfectly with great quality!
Me over here with my i5-8400 and rx580 4gb..lol. I'm gonna upgrade soon just don't know what to do first. Prolly gonna buy a 6700xt to get me buy till I get a whole new mobo
I have to say thank you so much for the bars on the sides to indicate how long sections are. Also i cant wait to get this years disappointment build shirt ill have '21 '22 '23 and '24!
I'll let Andrew know! He introduced those bars probably almost 10 years ago now! They've been awesome.
dude you are collecting the Infinity stones of Dissapointments haha in shirt from, personally i think 24" takes it tho as the Reality stone, ya know, since ppl out here making it whtevr they want
@@GamersNexusThanks, Steve.
@@GamersNexus you GN folks are great
No one will ever know I'm shtting at work rn. Even Steve won't find out. Edit: shit...... this comment blew up. no pun intended.
🤣
How many Courics?
same
Same
Same I get paid to watch Steve with my pants down.
It is nice to see older cards on the charts like the 1070. Still a lot of people using those cards.
As someone who's still on a 1070, it's much appreciated! I'm eyeing the Battlemage cards, but I've had some hesitations... Scalping being one of them, haha.
Yeah only get it at MSRP @@imjust_a
@@imjust_apre-order one.
As a A770 user I think Intel really outdone themselves. B580 firmly beating the A770 as a mid-range card, with the $220 B570 getting really close. B570 might very much be the best value card this generation. I might not upgrade anytime soon but it's nice to see this much improvement and keeping up with the competition!!
Some say they are much more silent and cooler. But as aA750 user I will not upgrade yet
@@FastTcXas a fellow a750 user, I'm not gonna upgrade till am6 releases 😭💀🙏
I may just buy one for the wifeys system, as the a770 16gb bifrost is more than enough for me
I might even consider for a cheap build for my nephew
Thanks steve, for testing with the Intel CPU.
Loving this dedication amongst everything else going on in the industry and world as a whole
Thank you! It's crazy right now.
I had a look this morning in Australia. The B580 is back in stock in shops with the release of the B570. However the price has gone up $40 and the B570 has launched at the price the B580 launched at.
Glad I got one from the launch batch.
I had a look and there is still one store in Victoria selling for the original price but yeah disgusting that every other store would take the chance to pricejack the 580 before even listing the 570 it would make fairly comparing the two on value impossible to the average consumer who doesn't know historical pricing charts exist.
You want to support local business but then they do this cartel tier price fixing collusion garbage so ordering dodgy stuff off china direct looks more and more attractive despite the risks, story of Australia I guess.
There was a perfect "thanks Steve" opportunity after talking about 3x the work, could have been poetic in an intel review
This is a very useful video to show how so many people fundamentally misunderstand the words "bottleneck" and "limited". They are not the same thing.
Thank you GN!
more competition = more options for consumers, hope AMD and Intel both start competing in other areas than the low end of consumer GPUs
I really don't think they need to compete at the current high end, as to me Nvidia has lost the plot in how far they have pushed the high end. Almost nobody really should be wanting to put a 600 ish Watt GPU of that price in their system! It is just too hot, too expensive to run, and the performance uptick it brings really isn't worth it for most folks when compared to the lower-mid range Nvidia or the Intel/AMD competitors that are targeting that part of Nvidia's range... The 4090 and now 5090 are insanely impressive in some ways, but so expensive, so power hungry they just don't make much sense for the general public level of gamer.
We need serious competition in the higher end too. Intel or AMD won't bother tbh, DLSS is too good.
Once you get the lower end right, the high end will follow (I hope).
@@john_in_phoenix
Knowing Nvidia, I doubt it. Ray tracing and Frame Generation has its own downsides, and developers not having option to turn it off unnecessarily raises the hardware requirements
@@Aereto I was speaking of Intel, just FYI.
Efficiency testing is great to see, thanks!
I bought the B580 to get a piece of tech history ❤
Same, and I am happy with the purchase. It out performs my 3070 in most games that don't favor dlss, or other rtx specific features, and it does so on an 3700x where the 3070 is on a 5700x. I don't regret this purchase as the other options in this price range are used or just plain suck.
If intel makes a super tiny Battlemage card, such as single slot, I'd love to buy it to put in a tiny slim PC case I'm designing to look like a 1990s Thinkpad tablet. I'd use it for FFXIV.
Wait b380, b310 realese
@@eyersadul I'm not sure if those will release this cycle, from what I understand the A310 was a way to reuse binned chips.
IIRC it was possible to swap a A310 single slot cooler onto a A380, maybe that will happen again, I would really like to see a low profile B380, as would millions of SFF office PC owners, especially as the 8th and 9th gen PCs are hitting the bargain bin, and 10th Gen systems are popping up on the surplus market for under $100
It would be great for people that want the ultra low power home lab, since Intel arc cards were really good in encoding and decoding.
A next gen one with maybe even lower power draw would be awesome
@RinoaL say what you will about the gt1030, its better thank any integrated graphics out there.
Thanks Steve
Back to you Steve
I can't imagine the extra hours it took to retest the GPU with last-gen (or earlier) CPU's, but it's appreciated, especially by someone still on AM4 and not planning on upgrading any time soon.
To be fair, a 5700/5800X3D we'll probably have plenty of performance to prevent CPU bottlenecking in games.
A decend, brand-new GPU that doesn't cost more than twice the rest of the PC... didn't really thing we'd get those. My mind cannot process how Intel wound-up being to GPUs what AMD was to CPUs a long time ago in a galaxy far away...
I wouldn't be surprised if Intel then took the route that AMD and NVIDIA did during the pandemic once their GPUs take hold in the market. 😓
Are there many other youtubers who avoid teleprompters? It's been years since I've heard "pun unintentional" from a host, and that's because it's a really unlikely thing to say when reading off a prompter. Anyway I love it, it's really enjoyable to get carried away in Steve's flow and follow his train of thought. And being only loosely scripted obviously brings the review itself up a notch. I hope you guys are around for a long, long time.
When Intel said they weren't aiming for high end, I hope they didn't mean the B770 isn't going to come out ever...
I’d assume that a B770 would just be way too power hungry and too big/costly of a die to be economical based on the B580’s die size
I really appreciate how, when you open a video with a question, ie, "Does a poorer CPU make this video card worse?" you don't leave anyone hanging and just immediately go "No, not by much". I do want the details, but if I didn't, you wouldn't be wasting my time at all! Thank you!
one of the best gpu lunch in the last couple of years for sure
"Gpu lunch" 😋
I feel like in the Twilight Zone. Everyone keeps saying the B580 is such a good deal, meanwhile after a month the card still can't be had at MSRP in any market, while AMD's direct competitor cards are actually available at cheaper prices for the same performance.
@ good for you at our place some amd gpu are way more fking expensive then even the nvdia alternative
@ 6900xt cost $1500-1800 at our place just an example💀
@@seeibesame thing happened with the 3000 series lol Intel should boost production or raise their prices
Liquid metal scrying sounds dangerous
Only for those untrained.
@@GamersNexus But can I learn this art from the Jedi?
depends on the metal. Gallium melts at 30°C
"Don't get scalped" is great advice. I just wish we could avoid being scalped by Nvidia and AMD themselves.
It's a very challenging industry to get into, due to software/driver support and complexity. I think it is impressive that Intel is able to get into this successfully.
One can only hope they improve and start becoming a competitor to Nvidia and AMD, we need the variety
Intel has been in the graphics industry and supporting graphics drivers for well over 20 years.
did you game on chipset graphics before on Intel like 15 years ago?
@@chronyk743 I did, on laptops XD
$250 for a 2k/1440p card... pretty darn competitive.
Possibly you mean high end segment competitive.
@ I had Asus P5G41T-M-LX 775 board with dual-core Celeron and gaming on the chipset graphics Intel ICH7. This guy doesn't know that Intel has been doing this for over 20+ years on graphics.
3x the work but only 2x the charts? That's efficiency!
Around 5:00 regarding testing is "remember kids, the only difference between screwing around and science, is writing it down"
I think Battlemage is the best thing that's happened to the GPU market in a while.
Can't wait to buy the B570!
you can pre order it now and its actually in stock
@@ployth9000 where?
after almost a decade since 10-series thsi actually got me use a PSU calculator and check benchmarks
Update: I just bought it lol.
@ im waiting for local retailers to restock here. id be the same next month.
Appreciate the extra work put into this!
That's very affordable indeed, but I really want to see them (All Battlemage GPUs) with GDDR7 for that price and it would be fantastic.
Production volume for GDDR7 is very low and expensive. Even AMD couldn't get them for RX 9070XT , so they run on 20Gbit/s GDDR6 . So , what you're asking for Battlemage is too much.
Thanks for testing lower end CPUs, that is real world scenario for a low end graphics card. Also comparing them to old gpus to show the real world gains is great.
I'd like to see a mid-range CPU test, since that's what will likely be paired with this GPU.
I just got a $250 3060 12 gig and feel fine with my purchase, right in the middle of the two ARC cards and I was able to test in my PC, with my intended games before handing over cash. Also met some other automotive enthusiasts in the process, funny how the used market can be great some times.
Man this thing is priced so well. I wouldn’t hesitate building someone a budget box with this and say an i5 or Ryzen 5.
$350 is not cheap and the scalpers already 100% control the 580 market 😢
@llothar68 less comment section, more job.
@@llothar68 Mate, $350 is incredibly cheap for a GPU nowadays. Sure, $350 is a pretty hefty chunk of money and if the card is worth it is a whole other story, but it's still cheap. 'Cheap' and 'expensive' are both relative terms in relation to the alternatives. Obviously scalpers upping the price by two-fold isn't helping anyone, either.
You can get a 4060 for that price, so no it's not cheap. Or if you say it like this, it's not as cheap as it should be compared to the competition. @@avananana
Intel GPUs should NOT be going into some noobs build, it'll only end badly.
There are way to many issues for someone new to the scene to troubleshoot constantly.
Thank you for the work you folks do, we greatly appreciate it❤
Nvidia: Budget gamers don't deserve to game.
Intel: i got you budget gamers.
Clearly intel still have lots of work to do if they really wants to compete in the GPU market but B570/B580 is a good start. Hope they continue to get better in the future.
also Intel - we've finally caught up to where the budget range was 4 years ago, just as the new generation of Nvidia/AMD cards are set to make us look really bad
@@mightychev They'll catch up if the continue to be less greedy than the competition.
"There's a lot of charts". If Steve is saying that, we better buckle up, because this is going to be a ride.
finally a worthy card to replace my 1050ti lol
1050ti tired boss 1050ti needs a break
@@MaximéGodMode cry harder
@@MaximéGodModeI have 2 Toyota trucks, Komatsu backhoe, a ranch and house and I use 1050ti. Some people not poor, they just have a life outside playing games. Or they are poor, but smart enough to not spend on something unnecessary.
Do you buy a pc with your own money or asking mommy or daddy for your 4090 ?? Mister rich polish man here poor shaming people on the internet like some kind of a big shot….
@@MaximéGodMode Your dad must hate you.
Ah nothing like benchmarks for breakfast.
That's probably what the GN team will be saying for the next couple weeks with all the GPU reviews to do.
I was able to get B580 Asrock Steel Legend for 340€ (this includes crazy taxes in my country) and I am really happy with it. It's price is currently 380€, so still cheaper than 4060 Ti
As someone that is still running an i5-10400 and an RX 570 8GB, thank you for including those older cards so I have a frame of reference for what an upgrade would look like for me. Keep up the awesome work!
Man, I miss when this was midrange GPU pricing. My first one I remember buying myself was around this price (8800 GT?GTS?).
Keep up the good work Steve n crew!
14:35 *that A770 difference is WILD!!* 😳
It wasn't mentioned on that slide for some reason
Also noticed that
It might actually be driver overhead or just missing optimization, if I had to guess
@@_etwas The A770 is in the middle of the list with the top tier CPU, and at the bottom of the list with the 5600X (which is no slouch, tbh).
That difference was drastic.
Back in my day, $220 was a lot of money to spend on a GPU. Now it's the "low end." Insanity.
That $220 now is the same buying power of $75 back in the day.
@@Retro-Iron11 exactly, i bought an x1950 pro for £95 back in 2006, I'd expect to spend at least 3x that now for GPU before we even get into how much better entry level gaming is now. You can run pretty much anything if you are prepared to turn some settings down. Back then, entry level couldn't. (see Crysis lol)
groceries today are almost twice as expensive as it was 7 years ago.
Look up the word "Inflation"
Thank me later.
idk what your back in the day is but 1660ti launched at 280 in Q1 2019 (that's 6 years ago)
I can see two classes of benchmarking: one where everything not tested is as good/high powered as possible to best eliminate bottlenecks and show the tested part in the best case scenario, and two, where test bench parts are chosen "budget category" specific. If you're buying a $150 GPU, you're likely not buying a $700 CPU or a $400 motherboard. It would take a lot of editorial choice in part selection at different budgets, but I expect that is what people want out of a review: "How good is this part?" and "How good is this part actually going to be with what I already have, or can afford to build up/replace?".
Go intel. We need the competition. Now get that B780 worhing.
You will get competition on all of the new AMD/ Nividia cards once they make it over to Ebay for repricing.
Thanks for the efficiency tests. I would love some more idle efficiency variants. Multi monitor setups, video playback, etc.
Thank you for this. I think it's really helpful to have different CPU testing with GPUs since it gives consumers a better idea of what they might actually benefit from in their situation. It is a lot more work, though, so I understand why it's not something that you can do all the time.
Again, thousand thanks for the idle consumption info. I also do text based work and life off-grid. In this case a low idle draw is damn important
8:06 "There's no substantial change to the B580's performance across these three CPU's with this test configuration" So.. we're just going to ignore the 1% lows?
Pay no attention to the man behind the curtain.
Man didn't wait until 8:40
It was worded oddly, for sure, but it was not ignored
@@KPalmTheWise Maybe he should've led with that instead of saying "nothing to see here"
Maybe you should wait for the whole explanation for each game to be finished instead of jumping on the apparent mistake. :P
Gotta say I'm loving the additional CPU's in this review. The combination of the no CPU bottleneck high end CPU and the very relevant "this is what people buying this GPU will likely be paring it with" data for budget to midrange CPUs.
Should use Arma 3 to test CPU overhead, its renderer is very CPU-heavy so if an overhead issue exists, you'll certainly find it with that game
you know what would be hilarious to show, just simply to demonstrate how far CPU technology has really come
do all of these tests but with a really old CPU like an AMD FX8350 or something
I'm just waiting to buy a B580 for under $300 rather than a 4060. Intel is doing a good job with these cards. I am currently running a 3060 that I seriously overpaid for during covid. Just upgraded my 5600 (no x or x3d) to a 5700x3d and 32 gig memory (newegg bundle). Since the monitor connected is only 1080p, I can wait until the price gets closer to MSRP.
It was really cool to see that you guys did the work to test the CPUs. I know you guys do your best to detail games, but Adobe also has benchmarks, and they can be drastically different. Any way to include photo/video rendering speed compares to? I bet more of your audience would care about that than you think.
7:59 there's a big difference in 1% lows and 0.1% lows with the lower-end CPUs with a B580, which is where such driver overhead issues are most likely to manifest anyway. But you don't even mention that, and just say that there's no substantial difference.
Great data. You guys are awesome. Thanks Steve.
Excuse me, but what you are saying at 09:00 goes contrary to the chart: Neither 4060 nor 7600 gets any damage to 0.1% lows (when CPU gets downgraded), but B570 and B580 gets stutterstruggle™. Average FPS is good, but average FPS is (sadly) not the best gaming experience metric.
He talks specifically about the lows if you actually pay attention.
Was about to go to sleep but this popped on the notifications. Guess I'm watching this now
5600 cpu is insane value for money tbh.
Great review. Total pro, well done.
Nah, b570 and Switch 2 on the same day is wild
Great video! I'd love to see the 3080 in the current efficiency tests to compare with the B580 and 7800XT. I believe it's still a good reference point as I think Nvidia pushed that card quite hard when it came out.
The suggested prices are a joke. In germany b580 costs close 350eur while rtx 4060 costs less than 300eur not even mentioning rx6800 used that can be had for 250eur. So this cut down version will be close to 300eur.
people still want £400-£600 for a 6800 2nd hand in the UK. I don't know what they're smoking :)
@mightychev bought reference 6800 which is still very nice in Germany for 260eur shipped. I guess that's very low but you can get one easy for 280 which I still think is a good price for such a nice gpu that will be still usable for few years.
@ yeah still a great card, you got a good deal
@mightychev 500eur that's 7800xt territory so what I paid I feel is actual market value. And that is OK of course.
Germans are smarter than Americans to not buy this waste of sand anyways lmao
14:28 never mind the Arc cards, look at those lows on the 4060 in BG3 at 1440p 😮, they absolutely tanked on the slower chips and those cpus aren't even that slow.
If you started up your own GPU company. Everything would be flawless for this industry ;)
No thanks! When everyone is mining gold, sell leather jackets.
@@GamersNexus would you be open to wearing a leather jacket for the 5090 review
no shit id love to see that
@@MrRaftingGOAT idea.
@@MrRafting No please no! Last thing we need is for that to become a thing, and if you really HAVE to GN use fake leather at least.
I appreciate everything you do :)
Ohhh finally another interesting GPU
While these aren't for me, I'm glad intel is getting into the lower end market to help drive low\mid tier costs down. My Factory Tour skeleton shirt should arrive tomorrow 😃
Super excited for what Intel has in store for the higher end battlemage cards, priced right they could be absolute beasts.
Agree, I own an A770, and am considering replacing my 6950xt, if the price is right on a B770 or equivalent I plan to pick one up.
I am fortunate to be able to pick up something like that out of curiosity, it will be a tougher decision for people who only buy a single GPU every four or five years.
@@cecilb7927 your 6950XT is still about 50% faster than the B580 4 years later. I'd be surprised if the B770 is anywhere near it
@@mightychev I know, I just really want to check out a B770, I already have an A770 LE
Doing all of these benchmarks must have been a fucking nightmare
I like how steve respects his viewers time and gives us the main result upfront and then if we want to see the process we can watch the whole video. ❤ thanks for not holding us hostage through your videos.
Remember though that if everyone leaves the video fast, steve channel make way less money, so do try to watch it completely if you can.
Really happy to see the B570/B580 cards being affordable and with a decent amount of vram. They make it seem possible to make a backup/smaller lan PC without breaking the bank completely.
Getting Path of Exile 2 into your testing regiment would be sweet, since no one else is doing it.
it is a live-service game, which means continuous updates, which means long-term test consistency is impossible. So, bad idea.
(But, videos testing a popular game on a bunch of GPUs may be a useful idea)
@@deepspacewanderer9897 You got a point, but I really wish someone would benchmark PoE 2 properly.
Looking forward to the 5090 vs the 4090 raster, dlss, raytracing benchmarks.
This is mostly selling me on the lows you get from 9800x3d. Even on slower hardware.. sheesh what a beast.
While hats off to Intel for these GPUs, especially considering the price point, 1% lows and 0.1% lows are painful.
I have a setup with a 7900XTX and an i5 11600K....yeah, I know, I'm waiting for 9800X3D stock atm...My fps would be awesome most of the time, but 120 fps averages are quite hampered by 70 fps 1% lows and such in my case, for example.
This means that I'm CPU bottlenecked half the time, while the other half my 1% lows are abysmal, and knowing this, similar average performance of the B570 does not hold a candle to stable 1% lows of other cards in my book.
Don't get me wrong, I am all for Intel shaking up the market with ARC, and if they can fix these stability issues, they can prove to be a worthy competitor. These are still very much interesting products, regarding generational improvements over Alchemist, Battlemage is nuts, so looking forward to the future!
I have overwhelming feeling that it's not how good B580/B570 are. But how extremely bad AMD 7000 and NGreedia 4000 are.
They both make really good cards. Saying the company sucks doesnt mean their products suck. Amazon is one of the worst companies to work for in the us and you keep buying everything from them.
"Good" and "bad" are a function of both performance and price (perhaps energy consumption/heat depending on preferences), if performance is comparable and the price is lower, it's a good product.
@@MichaelOfRohanjust because there isn't enough competition to solidify nvidia/amd price to performace ratio sucks, doesn't mean it doesn't suck.
Nvidia and amd cards are good gpus but with an awful price/performance ratio (mostly nvidia)
As always good job GN , These tests are getting complicated.
I'm still fine with my 5600 6650xt, maybe when I get a 1440p monitor I'll consider upgrading.
Thanks Steve.
But can it run Lichdom Battlemage?
Let's see what price we get in Europe :-/
Ah! A marginally impressed Steve face!
Thanks for including the extra CPU's! Would like to see some of this for other products, even if it's just to a lesser degree for checking if say, RTX 5000 is scaling correctly on lower end hardware. The normal testing methodology still holds up for using the highest end available CPU, but compatibility and overhead are good things to at least check, right?
thank you gamer snexus
Fantastic showing in the budget category. I'm not in the market at the moment, but I may jump to an Intel card once my current card reaches its end.
Rememeber guys, its not enough for Intel to make these cards. We, the consumer, have to buy them if we want Intel to keep this going.
Thank you thank you for putting out a lwoer cpu benchmark!!! im on ryzen 5 3600 and not planning on uprgrading
That overhead bug need to be sorted. Ain't no way a 200 dollar gpu should require 7800x3d level CPU to get everything out of it.
It doesn't, though. You can see that in our tests. There are situations where that may be true, but it is clearly not all of them.
@@GamersNexus Sure, but the fact that there's a legitimate concern whether or not a specific game is going to run terribly or not is certainly not an ideal customer experience
@@vince2051 Isn't that just PC gaming as a whole?
@@vince2051 I think so too. Honestly, if I was in the market for one of those cards, I would probably hesitate to buy the 580 because I'd rather have a more consistent performance rather than sometimes better than the competition, sometimes worse with frame stutters. So then I would much rather buy 4060 or 7600 personally for more consistency.