I'm really happy to see a second-gen Intel card coming out after so many were speculating that the first gen wasn't good enough for a follow-up. Not sure I'm gonna upgrade just yet, my 1080 Ti is still hanging in there, but when I do (and I plan to go more towards that end of the price spectrum, I don't game nearly enough to justify high-end GPUs anymore) I hope there is a compelling Intel offering to at least consider seriously.
Good call! Hopefully one of the tech channels will cover this aspect. Granted, with WMR discontinued, I doubt I'd ever see support for my Samsung headset anyway.
@@TheoHiggins because VR usually isn't at 1080p, it's that simple. High resolution requires so much processing, it's exponential, and it sucks. Then add that you don't want to play VR at 60 fps, and this card wouldn't cut it.
If this generation bombs too (can hardly call Alchemist a success), then that's it for Intel's GPU department. Not that I want it to bomb, but the launch of Alchemist left a very bad impression, which led to an adoption rate so low that it might as well not exist. The number of games using XeSS is OK-ish, but not comparable to Nvidia's DLSS. The problem with that $200-300 price range is that those cards typically don't make a lot of profit, if any. And this is about profit, nothing else. Do not for a second think that Intel just wants to give gamers the best bang-for-the-buck card out of the goodness of their heart.
Good video, makes me hopeful for the new Battlemage launch! Only part of the video I found jarring was at 8:20 with that stock foot(heh)age insert. That insert doesn't quite feel like LTT, though it is good to see the editors come out a bit with their own flairs and ways to edit
I am looking forward to XeSS2 because XeSS already looks way better than FSR3 in most games it's in. But a lot of people don't even know about its existence as it was not really advertised by Intel in any way
Remember that FSR is getting a 4.0 version with the 8000-series launch, which is also going to use an AI-based upscaler instead of the filter-based one AMD has used so far. So XeSS 2 is not competing against FSR 3. Also, how many games will actually implement XeSS 2? Adoption of the first version was quite low.
@wta1518 Better take off the tinfoil hat before you end up on the list. I've got no reason to shill; I'm just stating the facts so people can make more informed decisions.
@@Freestyle80 I made no claim about quality. I simply stated that comparing XeSS 2 to FSR 3 is going to mislead people about the experience they'll have for most of the useful life of the card. Meaning that if you use the quality of those upscalers as the deciding factor, there's potential for buyer's remorse.
For those who want more details: "32% less" than the Arc (Battlemage) number would imply Nvidia has 68% of Arc's performance, but 100% (the Nvidia number) out of 132% (the Arc number) is actually 100/132 ≈ 75.8%, i.e. 24.2% less. While the main point still stands, the difference was exaggerated.
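If it helps, here's the same arithmetic as a quick Python sketch (the 132 and 100 are Intel's first-party chart numbers as quoted above, so treat them as marketing claims):

```python
# Intel's chart numbers, normalized to the RTX 4060 = 100%.
b580 = 132.0     # Arc B580 raster score (Intel's claim)
rtx4060 = 100.0  # baseline

# Wrong: subtracting gives percentage POINTS, not a relative gap.
points_gap = b580 - rtx4060  # 32 points, not "32% slower"

# Right: relative change is a ratio against the card you compare FROM.
deficit = (1 - rtx4060 / b580) * 100  # 4060 vs B580: ~24.2% less
uplift = (b580 / rtx4060 - 1) * 100   # B580 vs 4060: +32.0% more

print(f"4060 is {deficit:.1f}% slower than the B580")  # 24.2
print(f"B580 is {uplift:.1f}% faster than the 4060")   # 32.0
```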
This is the first time in years that an early adopter of a next gen product doesn't have to break the bank. You can get started with the fastest Intel GPU for just $250, which will get you above an RX 7600 at $270 and an RTX 4060 at $300.
Definitely. But that's not really out of the ordinary. Lower-tier GPUs have smaller profit margins, which is made up for with your 4090s and 7900 XTXs. Intel just needs to break into the enthusiast tier, and then they'll have the opportunity to make more profit.
I recently picked up an A750 for $125 (used) and am blown away by it. Saying that the drivers have come leaps and bounds doesn't quite do it justice. Some slight tweaking, and it's performing on par with, or better than, the 3060 12GB I was using for the same budget machine. I'm ridiculously excited for their next entries, as they're definitely zeroing in on the "budget" or "casual" sector harder than Nvidia or AMD ever have.
I can't wait for the full review. I am still loving my A770, although it does have its weaknesses. For $250, if it offered just a little more stability, or basic shader fixes like running Spacebase Startopia without green lines, or MK1 without bright white lights whenever Chameleon is doing anything, it could be a solid buy-it-to-support-them-until-the-B770-comes-out solution. Please, if you test games again on the B580 like when Arc first came out, make sure you try out these two games somehow, because these issues are insane! :)
@@kmieciu4ever yeah, the sacrifice of supporting the newcomer: running BloodRayne 2 at 30 fps because DX9 was a nightmare (fixed), and the white flashes of regret xD
@@AskemoX DX9 is only emulated on Intel cards, and that is the reason for the poor performance. People like to shit on the RTX 4060, but mine runs Stalker 2 at about 100 fps thanks to framegen, and it uses 100 watts max. It's like having a laptop card in your PC :-)
@kmieciu4ever yeah, I know it uses a translation layer for anything under DX12. I think performance is really relative: some people accept low settings at 1080p or heavy upscaling, while others want pure native performance :)
Finally decided to upgrade my old 1060 this week. Was planning a 4060, but maybe I'll hold off another couple of weeks to see how this performs in the wild...
Love the editor doing both the YTP-style "In a nutshell, you'll find nuts" (listen closely, Linus didn't say that), AND the subtle foreshadowing trend (the awfully cut segue). Bravo
It makes a lot of sense to make low-end cards these days. Gamers are playing older games, or rather not moving off older titles, and many folks are gaming at 1080p, so there's no real need to get the latest-gen hardware. And because platforms like the Xbox Series S and Switch have pretty low-end specs, developers have to optimize for the lowest common denominator anyway.
But they... don't do much optimization at all? Most larger games now just use Unreal Engine, then throw DLSS and frame gen on it and hope it's good enough.
I got an A770 Limited Edition just 'cause I thought it would be cool to own a first-gen card. In the beginning it sucked noodles at almost ANYTHING, so I put it on the shelf to just look pretty as an art piece for a good long time. I pulled it back out recently to use in a Linux machine (and to do stuff in Windows on the same machine), and I gotta say it has VASTLY improved. Night and day. If the boasts Intel is making about the upcoming cards are true, and if their driver team keeps pushing, then this could be THE go-to budget card when it comes to price-to-performance. Side note: these cards are still my favorite in terms of appearance. So sleek and elegant, not all sharp-edged and pretending to look dangerous for "Gamers."
That indeed was the worst segue ever.
Best*
@@Justakatto wondering if Ridge is going to love it or hate it. Don't need another Oura Ring fiasco.
@@Justakatto yeah, but Intel had to fall, since they'd been an aggressive player in the CPU market for years. So I guess the new AMD takes the crown in the CPU market, and maybe Intel takes the crown in the low-end to entry-mid-range GPU market, AMD takes the price-per-performance 1440p mid-range GPUs, and Nvidia is again going to be on top with the 5090, while the rest of their products are again just upsells: more expensive than the competition but less good, just like the RTX 4060, RTX 4060 Ti, or vanilla RTX 4070. I'm guessing the 50 series is going to be the same again, but maybe we'll see a more aggressive approach from Jensen and they'll drop prices. Most likely not though; Nvidia likes money, and they hate losing it, so I think they'll figure something out, like saying the higher-end GPUs are for professionals, just like they did with the 4090 (and the 5090 will be the same). Just don't be surprised when Nvidia says every GPU from the RTX 5070 Ti up is for professionals lmao, that would be a crazy kick to the balls hahah.
“It’ll be the worst year for Intel, like our worst segue to our sponsor”
@@jared3174 probably gonna love it. that was memorable as hell
2017: The best budget gaming rig is an Intel CPU with an AMD GPU
2024: The best budget gaming rig is an AMD CPU with an Intel GPU
how the turn tables
The tables have been adequately rotated
They talk to each other for more $$$$$$$$$$$$
Eh, that isn't really true. Intel has mostly lost in the top end. An i3-12100, 13100, or 14100 is basically unbeaten in gaming in terms of price. A 12400F also kills in that regard. AMD mostly wins on the higher-end X3D chips, but those aren't really budget.
@@Blackfatrat that is technically true, but when you take into account that you have to upgrade Intel motherboards every 2-3 gens, it's more expensive in the long run. AM4 lasted from 2016 to 2024. A whole 8 years.
The colour grading on this video feels so nice. It’s warm and soft. Just really chill. I like it. Please give the colour grader a slap on the back and a “good job, mate” in an Australian accent.
I feel like it's worse than usual. Big chroma noise, and I feel like there were some hue-vs-saturation changes made, especially to his face. Not 100% sure.
Could also be since this was filmed on Colin and Samir's set as mentioned in a recent community post.
The warmer color grade likely comes from them not using their traditional cameras, they’re visiting Colin and Samir. LMG usually uses Sony FX6s or FX3s. This was shot on a Canon C70. The lighting of Colin and Samir’s setup here is also probably why it’s warmer. Just wanted to point out the camera change too.
I agree with you, but it seems a bit green to me...
I had color filter set to orange the entire time
"in a nutshell, you'll find nuts" LINUS IT'S TOO EARLY FOR THIS
I mean he already has 3 kids..
I had to pause the video to recover from that one
And it's done by the editor, YTP style!
@@AMalas It's Pelle, he went away for a bit, but his editing has always been a little like that
It's December tho
Intel realized there are a lot of gamers who want good cards but don't want to spend handheld-PC money on them. I'm all for the sub-$300 card making a comeback
@@DaBigCheeso puts pressure on the used market. It's good all around
A $300 card that's actually good would be amazing, but I'm not convinced
Do you people actually believe this stuff? Intel's "targeting" the low-to-midrange because their GPUs aren't good enough to do anything else. Same reason AMD isn't competing in the high end this upcoming generation. It's not some noble sacrifice made because they really, really care.
@jamesbyrd3740 ok buddy some of us are upgrading from a 2080 or a 3060 or whatever... Holding back what I want to call you!!! Lol
@@jamesbyrd3740 it's a pretty effective marketing strategy: target the customers the big two aren't. It's Intel, they make chips. If they wanted more powerful GPUs, they'd build them
This is HUGE.
I build around 80 "gaming" PCs per year, and most of my customers want EXACTLY that.
They want a $300-or-less card with enough power to play small-to-medium games.
High-end PCs with high-end graphics cards are less than 10% of my builds.
The 4060 was the only option; now there are more. This is competition where there was very little. This is good for all of us.
Buy some B580 cards 🙏
yeah, honestly I think of xx70 range as 'high end' already, xx80/xx90 is just gratuitously high spec to support 4k gaming. I've never had an issue running lower resolution, 1440p is pretty good for me. At this point what I'm looking for is incremental improvements in performance with bigger gains in power efficiency, although I'm sure we'll never see an actual decrease in TDP sadly.
But you might get into a tech support nightmare. Intel's GPU driver still has many wrinkles that need to be ironed out.
Those who only play mainstream games might be fine, but if they play indie games, they better be tech-savvy.
@@chrisdt2297 I agree, but they are getting much better. I built a few A750 systems and was surprised how well they handled most games I threw at them. I sadly don't own one personally, but recent videos using the latest drivers seem to suggest most major issues in popular games were addressed. Still not at Nvidia or AMD level, but getting there. Having competition is good too.
@@emissarygw2264 Makes sense, I feel like even a 2080 can often hit 1440p/60 fps on modern games these days, if you tweak a few settings.
If Intel can get the drivers in check and price it to disrupt the market like AMD did with the RX 480, Battlemage could be a budget PC gamer's dream.
idk man. i need a dollar.
@@lucasrem nvidia has DLSS 🤷♂️
@@ninjason57 not everyone cares about DLSS. I'm an old-school Quake gamer, and nothing frustrates me more than fake frames and fake resolution resulting in latency. I still live by running your fps at double your high refresh rate to minimise input lag. For 1080p gamers, a budget option like this could be great.... if you don't care about DLSS like me
Yes please. I want something better than team red and team green and I'm quite tired of Nvidia cutting hardware down and using dlss as a hardware crutch rather than a booster.
...not that I'm keeping my hopes up. But dammit Intel, Celestial is a great name. Don't you kill your Arc before then!
@@licksludgee Quake is such an old game that what GPU you have doesn't matter at all
Not the subtle foreshadowing segue 💀
💀
💀
💀
💀
💀
Not gonna lie, as a long-time AMD GPU user I don't feel compelled at all to switch to an Intel card. It only brings back memories of driver bugginess from some of the AMD cards I've used in years past. Drivers have been more robust in the past 6 or so years, and I am not really looking forward to experiencing the same issues again with a different brand. I'm a huge tinkerer, but I kinda need my computer to also just work.
Well, to be fair, not EVERYONE should rush to the other side of the boat, anyway, or else we'll have the same problem in 3-5 years.
Drivers for my 6800XT and later my 7800XT have been consistently abysmal. I can't wait to switch.
Back in the old days, games would come with a README.TXT file with a list of issues that the game may have with various graphics cards.
Nvidia rarely had anything listed, and AMD always had a huge list of models and driver versions that had problems. I always hated AMD for that, and basically stopped buying their cards.
I wasn't going to upgrade but $249 is tempting, that's about the price point I got my GPUs ten years ago
ikr?
you can also just trade 6 ltt pro hats for it
@@XGD5layer I got a 750 Ti for $200-ish when it launched, I miss that
Tempting to upgrade the 1050 Ti in my server
@@averywellsand888 Arc should support AV1 handling (but Alchemist already did, too)
Although I'm an AMD fan, I don't want to see Intel left in the dust, as competition is always a win for the end user.
Yeah, my worst nightmare is AMD becoming the careless, non-innovative monstrosity Intel is. We need Intel to exist solely to move the market forward. Hopefully they can turn it around, but it ain't looking good over there. They can't find their ass from a hole in the ground.
I'm not too worried. I was, until I realized they still have an ARM license and are ramping up, uh, er, "other category" chips. AMD has oddly become more monolithic than Intel in many ways, the same thing that caused Intel's earlier decline.
@@dataanalystbynight4375 Intel is a government asset and will never go under. Plus, with it building foundries in America that will be operating in 2028, Intel has a very exciting future, just not in the near term. They will become the de facto CPU maker again in 3-4 generations, due to all that government money coming in.
Consumers should NEVER be a fan of corporations.....your only loyalty should be to products which offer the best value for your money.
@@dennissdigitaldump8619 yeah, but losing the one company that literally started this whole x86 architecture, leaving it monopolized by a single company (AMD), doesn't sound good
hopefully it's like what AMD had back in the 2014 era before Ryzen (i.e. Intel finds its own "Ryzen" that won't dominate the market but keeps AMD in check). I'm afraid AMD is becoming Sandy Bridge-era Intel with Intel's decline
Why be disappointed it's not GDDR6X? (3:25) It's only been used in higher-end Nvidia cards and doesn't make much difference. Also, GDDR7 is a thing, so who cares. New Radeon and GeForce cards are rumored to be using GDDR7, so that is the best-case scenario. The 3070 Ti with GDDR6X was faster than the 3070 with GDDR6, but the 6800 with GDDR6 was faster in raster than both. It's just a weird thing to say.
New Radeon cards are rumored to be GDDR6 still, not 7.
Also edging the segue is CRAZY
Em What the Sigma.🤓
fr
nah thats crazy
u care more about the segue than the horrible data 😂😂
Gelsinger got handed the worst hand possible: staring down a company that had been gutted by penny pushers and greedy shareholders only looking at next quarter's returns. Intel deserves to be humbled; they were complacent for a good decade where they just kept releasing the same thing over and over until AMD caught them napping. But seeing them slip this far, it's not even a competition with AMD anymore, they can't even compete with themselves, ouch. We need competition.
We will know if he was on the right path... 2 years from now.
TSMC is who they were fighting, not AMD
I know it will likely never happen (for legal reasons), but imagine:
AMD buys Intel.
@@JellyLancelot Forcing out Pat was ultimately disrespectful; he deserves credit for what he did for the company. Turning around a company that was neglected for almost 2 decades is not easy. He needed more time. I was really rooting for Intel to roar back with Pat, but now I don't care for the company anymore.
@@ivonakis IFS foundry will be spun out as a separate entity in 2 years. It's built into the CHIPS Act that they are allowed to do so.
As someone who spoke with my wallet and bought an Arc A770 at launch: for a first-gen product, I expected it to have issues, along with what seems like a nearly impossible task of optimizing not only for all the new games but also for the insane number of older games out there that people still play. I've been super impressed with the team working behind the scenes on the drivers and software. They were given a mountain to climb and they've put in a hell of an effort! I don't even need a new GPU, but I am super excited to try Battlemage for myself and will be purchasing a B770, B780, or whatever they call the flagship top model this time.
Same, A770 has done the job and driver updates have been frequent. I'll buy the B770 on launch.
Pot of Greed is a banned card. You're going straight to the Shadow Realm for playing it, Linus
also only 2 segues not 3
At least we finally know what it does.
@@n.henzler50 Don't be ridiculous! Pot of Greed's card text is completely incomprehensible, that's why it was banned.
"Draw 2 cards"? Is that even English? What on Earth could it possibly mean?
@@Spartan-551 It's from that VRChat meme or whatever 😭 YouTube it. "I summon Pot of Greed" should get you there. It's so good.
At 07:17, it's not 25% less performance. It's 20% less performance.
Adding 25 cents to a dollar is an increase of 25%. Taking them away again is only a 20% decrease.
You also can't simply subtract the percentage difference (132-106=26) between the 7600/B580 at 06:35... that's not how it works. The increases are with reference to the baseline 4060.
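The dollar example, spelled out as a tiny sketch of why percentage changes aren't symmetric:

```python
dollar = 1.00
up = dollar * 1.25               # +25% takes $1.00 to $1.25
down = (up - dollar) / up * 100  # coming back down is only a 20% drop
print(f"{down:.0f}% decrease")   # 20% decrease
```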
LMG should hire an 8th grader to check their math for them.
I honestly just mentally added "points" at the end to validate it. 25 percentage points less
such a silly thing to try and nitpick
@@samo9288 No, no its not.
@@samo9288 Technically correct is the best kind of correct, especially on a tech channel :)
6:34 Intel is not claiming a 26% uplift in rasterization performance. Somebody did the math wrong; Intel is actually claiming a 24.5% uplift.
You shouldn't subtract percentages...
Same goes for 6:45, but the other way around: there the uplift is actually around 44%.
If, say, subject "A" has 75% performance and subject "B" has 150%, the difference between them is not 75% but 100%.
You should divide performance B by performance A, not subtract them.
12GB of VRAM and that level of performance, for $249, is nuts.
For the same price as a 4090, you could build an enclosure with multiple B580s in it, and it would probably have the combined VRAM and performance of an enterprise chip.
For gaming and other realtime use cases, probably not worth doing, but for productivity, for video editing, for 3D renders, for AI training, that would be awesome.
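Rough napkin math for that comparison, assuming the RTX 4090's $1,599 launch MSRP (street prices vary) and ignoring the cost of the enclosure, PSU, and host system:

```python
# Napkin math only: combined VRAM helps workloads that shard across
# GPUs (rendering, some AI training), not single-GPU gaming.
rtx4090_price = 1599              # assumed launch MSRP, USD
b580_price, b580_vram = 249, 12   # USD and GB, from the video

cards = rtx4090_price // b580_price  # 6 cards for the same money
print(cards, "x B580 =", cards * b580_price, "USD,",
      cards * b580_vram, "GB VRAM combined")  # 6 x B580 = 1494 USD, 72 GB
```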
Would be immensely funny if these new cards work on the Apple Pro.
@@Demopans5990what is an "Apple Pro"?
Yeah, I paid like 450€ for my 12GB RTX 3060 at the plateau just before prices actually dropped completely. But at least I got one of the faster 3060s; heard Nvidia started selling shart cards after that.
@@jojo_da_poe The monitor
The A770 is even more insane: 16 GB of VRAM for $229 right now!
Kicking out Pat Gelsinger is such a classic incompetent board move
We want money now!!!
Trust me, this level of greed is very widespread. I see it all over the place. People are drunk on instant gratification.
@Nobody-vr5nl yes
I couldn’t agree more. He had a lot to turn around from the last incompetent guy so of course it was going to take a lot of time to fix.
As he said in our last company meeting, he was too optimistic, constantly making claims he couldn't deliver on. This led to multiple rounds of layoffs while he enjoyed a 45% raise. He cost my colleagues their jobs and didn't care. I'm glad he's gone.
As someone who is finally getting back into constructing my own desktop / tower after years of using a laptop, I've been genuinely excited about the Arc cards. Alchemist felt a little too "early adopter / beta tester"-ish, so I've been holding out hope for Battlemage. As long as it's faster than the GTX 1650 mobile I'm running, I'll be happy.
Most of the Alchemist issues got fixed over time, so hopefully (keep your fingers crossed) the Battlemage launch should be a lot smoother. It's hard to say what happens long term, but let's also hope that Intel's GPU team remains intact (or even gets more attention/funding, ideally) regardless of what happens with shakeups at executive level.
@zivzulander I agree, at this point in time I'll take more GPUs in the market
Who would have thought 10 years ago that AMD would be the CPU king and Intel would become a GPU company 😂😂
Kinda sad :/
@@PumpiPie This is what happens when big companies get complacent and assume they will stay on top even if they don't make any advances... Nvidia needs to watch out.
Probably anyone old enough to remember AMD being the CPU king 10 years before 10 years ago.
AMD: I'll take the more profitable CPU business, please.
Intel: No worries, I'm going after the much less-profitable GPU business, now!
Well, if I remember right, Intel did try making GPUs back then too, so I'm not even surprised, what with AMD having been better than Intel through the 2000s.
the editor was really cooking with that sponsor segway
Surely you mean segue?
@@aleksjenner677 Surely you know they do
idk
@@cpmc5400itz okei tu spel ikorektly if ze reeder knaus vat yu meen?
RIDGE
The question is: am I buying into a dead ecosystem with these cards? What's the future for Intel graphics? As of now, nobody can tell.
I have an A380 in a SFF side rig. It handles itself nicely for a 75 watt card now since the drivers matured. The lower budget, and Low Profile market need continued support and happy to see Intel doing alright here.
I like the way they look
Is it half height?
less than $250? dude I'm sold, I don't mind paying that much for a GPU
Considering I got my RX 6650 XT for that price, I'm gonna keep an eye on this
@@levibull6063 can you give me your rx 6650xt once you finish using it
Dude just grab that $250, go to the casino and QUADRUPLE your money and buy yourself a 3080ti🤑📈🔥🔥
@@FlightReactsFan911 winners mindset 🤑🤑🤑
@@FlightReactsFan911hol up, what about a 4080 super, caught you lackin son
Linus going thru puberty at 5:40
Let's hope this launch prematurely cancels the 8GB 5060.
You mean Nvidia "unlaunching" another card?
I very much doubt it, otherwise they wouldn't have a good way to upsell their GPU to the much more expensive 5070...
@@JohnWilliams-gy5yc rename it RTX 5050
I used 2 Arc A380s in my Unraid server, one for transcoding and one for VMs. For what I use them for, they're unbeatable, especially since the 2 cards were only $100 each. Incredible budget cards
1:02 best segue from LTT in their existence so far.
Come on guys, you can't just subtract percentages. The figures for the Arc B580 and RX 7600 are in reference to the RTX 4060. Raster performance is less than 26% better than the RX 7600 and ray tracing is more than 37% better.
They picked the most hated card and tried to say we do not suck as much as them 😂
@@ohboyd flashback to the Gamers Nexus criticisms
@@DragonHuman00 Since when did LTT choose the 7600 and 4060 to compare cards to? Those are Intel charts - the only mistake LTT made is the wrong wording for the % difference. Grow up.
They meant to say percentage points, I assume.
Yeah that hurt to watch. Tbf tho, the claims aren't important. Hopefully their maths is accurate in the benchmarks video
I'm angry at how the board of Intel just ousted Gelsinger like this... wtf guys
Makes me wonder how long those board seats have been held, maybe they’re the consistent factor of intel’s struggles.
What? I don't get it.
@@SwirlingDragonMist What?? Noooo. That can't be it, it's the customers' fault! /s
I am pretty sure he made enough money in those three years to be set for life lmao.
I think u missed the point @@BladeEXE67
Hey, viewer from Brazil here! I gotta say, this is one of my favorite tech channels to watch, if not my top one of all time! You, "fireship" and "Coisa de Nerd" are the best!
How much did the CEO actually matter in this situation? He's not the one that told the engineers to make bad products. Kicking him out just seems like the board shoving away blame from their own cost saving measures that led to these issues.
Yep. We're gonna get a replacement that further guts the company for a quick profit at the expense of long term viability again. They really didn't learn their lesson.
@@slunasaurusrex It's a publicly traded corporation. They basically have a fiduciary responsibility to generate as much short-term gain for the shareholders as possible without thinking of the consequences. Late-stage capitalism is the point where none of this is sustainable anymore. Boeing doors are flying off and Intel sells faulty CPUs.
He was the CEO. He wasn't innocent 😂 stop being a contrarian just to seem smart or something.
It's very easy to almost always find someone better for a job that pays a bunch of money. That person is out there. Even if he wasn't "that bad," there's someone better.
It's the CEO's job to push the company's initiatives forward. If said initiatives involved ignoring feedback from engineers or being short-sighted, lost them a partner, and caused a massive scandal, then the shareholders of Intel are going to kick them out.
@Ichibuns I assume if the CEO hadn't posted that useless tweet that lost Intel its 40% discount at TSMC's fabs, he would still be working. Also, most of the blame is on previous CEOs who wanted to keep their jobs and didn't invest in more research, instead putting more money in investors' pockets. At least this CEO tried something different. Not sure who their next, better CEO option is. Let's hope there won't be more layoffs until Intel finally gets its ... together. I'm not sure I'll ever buy Intel again; maybe only when AMD becomes just as greedy.
0:27 another announcement: Pat Gelsinger has stepped down as CEO of Intel. This is tragic; idk who would be best suited to replace Pat, as he was one of the veteran Intel engineers who came up through the ranks. Intel needs a miracle to stay in the game
They will ask Elon Musk.
@@kravenfoxbodies2479 Knowing Elon, he would probably fire everyone including the engineers, come up with a terrible naming scheme, then complain that the company is unable to do better.
@@kravenfoxbodies2479 I hope not, he already screwed with Twitter. I don't need an "xtel 42069x premium blue checkmark" CPU
IFS Foundry will spin out as a separate entity. Private investors won't care about the bleeding edge like 18A or 14A, just like GlobalFoundries abandoned 7nm. Intel Products will be highly lucrative since they can outsource to IFS and TSMC. Investors win, and "national security"? Nobody truly gives a shit about national security at the end of the day; that's just a pretext for (some) free cash.
@@kravenfoxbodies2479 they'll go bankrupt 😂
1:26 do it again
1:25 please don’t ever do that again k thanks
They got you to watch the sponsor you normally double tap to skip.
Metrics will say it was good. But it wasn't.
You can't always just trust the numbers. Common sense will always be important.
This is our only hope in the battle against the AI.
@@gasracing5000 lol
I thought it was kinda funny
@@gasracing5000 It's more about the stupid editing trend people on TikTok do, where they cut the future into past parts of the video. It's horrible. I got PTSD
I liked it, it was fun
Been rocking my A770 for a while now. I could play anything at max settings at 1080p, and now that I've upgraded to a 1440p high-refresh-rate display..... it still runs everything at high settings just fine. I might not be getting my full 144 fps, but its ability to render videos with Resolve makes me more than happy!
Any games it won't play?
@@veilmontTV AC7 is a stuttering mess even in the hangar.
@@veilmontTV Anything that runs UE4 but has a lot of extras on top of the engine can be pretty rough without DX12, and Elite Dangerous used to refuse to start on Arc but I haven't tested it in ages. Most games tend to need more delving through the settings to find an "ideal" setup than AMD or Nvidia cards though
I was all set to buy an A770 until the reviews landed.... I'm really hoping Battlemage is in a better position....
Running an ARC A770 16GB VRAM paired with a 12900K CPU. LOVE IT! Waiting on the B770 to release. Hopefully they bump up the VRAM from 16GB to 20GB or 24GB.
It's quite sad that they are competing with a card that is more expensive because there are no cheaper cards worth mentioning...
These corporations know where the money is. Sadly, people playing video games don't bring that much revenue. We all know these GPU manufacturers can put these chips on "professional grade" products and slap humongous price tags on them.
Please Intel! Let battlemage win! Nvidia cannot be allowed to take over the market anymore 😫
We don't need a third GPU company, and we've already got competition, so Intel's not needed. You can't use that flimsy excuse so many others keep repeating.
the sad truth is they will take almost no market share from Nv.
It's going to go from like 95% Nv, 5% AMD to 94.5% Nv, 4% AMD, 1.5% intel.
Nvidia will rule most of the market until Intel actually makes good GPUs, which is unlikely
@@R_E_D22 AMD already makes good gpus.
@@jamesbyrd3740 As a double AMD GPU owner, cap
Cautious because of the first gen's driver issues. Hope you'll do another video on them to see how it's changed.
It also has a massive advantage over the other two teams for media servers - these are amazing encoding cards.
Planning to get an A310 to replace my RTX 3080 Ti for encodes.
sayyy whattt...
@@AffectionateLocomotive The Arc GPUs have QuickSync encoders, which you find in most Intel CPUs.
These hardware video encoders are better-supported than AMF and even NVENC, largely because they have been in Intel iGPUs for so long, and coincidentally, they’re now also the best quality-wise in pretty much every major video codec, including AV1.
If you want to add hardware encoding/transcoding to a server, the Arc A310 is the best GPU for that.
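For anyone curious what that looks like in practice, here's a minimal sketch of an AV1 hardware transcode through ffmpeg's QSV path. It assumes an ffmpeg build with QSV/oneVPL support, and the file names are placeholders; check `ffmpeg -encoders | grep qsv` on your box first, since flag availability varies by build.

```python
# Minimal sketch: AV1 hardware transcode on an Arc card via ffmpeg's
# QuickSync (QSV) encoder. Requires an ffmpeg build with QSV enabled.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",        # placeholder source file
    "-c:v", "av1_qsv",        # Arc's AV1 hardware encoder
    "-global_quality", "30",  # quality target; lower = higher quality
    "-c:a", "copy",           # pass the audio through untouched
    "output.mkv",             # placeholder output
], check=True)
```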
@fujinshu I got like everything except all those encoder names.
@@fujinshu yep, I've found that encoding on my A380 creates higher-quality videos at a smaller size vs AMD or NVENC hardware. Not as good as CPU encoding, but at 3x the speed, it's totally worth it
I went in on an A770 in part because of your shill hat video. I've been really happy with it and was late enough to avoid all the year-one issues (other than automatic driver updates never working). Been eyeing the Battlemage news for a while, as it was what I was originally waiting for, but upgrades needed to happen. Hope they have a successor to the higher end of their cards coming soon.
Same here. I have an A750 and picked it up in year 2, sidestepping many of the earliest issues. It has been a good card and I will not blink at picking up another once the B770 comes out.
B770 was cancelled due to budget cuts
@@satyampatel491 I've seen the rumor that it hasn't taped out for the 2025 roadmap yet, and that has some insider-type folks worried that it has been cancelled.
It makes sense that these sorts of rumors are floating around given the disruptions and chaos at Intel as of late. Perhaps there will turn out to be something to it, but "B770 was cancelled" isn't accurate. At least not yet.
@@satyampatel491 there's no confirmation about it being cut, whether or not it turns out to be the case. Stop lying.
@@satyampatel491 I've seen recent shipping manifests with the G31 die, which is the B750/B770.
The foreshadowing intro was juuuuust something else❤
1:35 Well that was NUTS
7:20 that is not how percentages work. The B580 delivers 32% more performance than the 4060, but that is not the same as what is said in the video. That number would be 100/132 = 0.758, which means the 4060 has (1 - 0.758 =) 24% less performance than the B580
For raster performance specifically. For the ray tracing number on the right, it would be even lower (when phrased the way it is in the video).
*performance per dollar.
But everything else you wrote is correct. Happy to see many people pointing this out in the comments.
Must have meant percentage increase, writers probably rushed that one out lol.
With all the issues Intel has had, I want to give them props for accepting my 14900KS claim with no issues at all. Brand new and it was failing badly (crashes daily). Got a new one in 48(ish) hours.
I know you guys read these - include some hardware encoding tests (HEVC & AV1) if you can please, they're very useful (for video people).
I think it's a bit early in the day to start testing the cards, mostly since nobody will have them yet and if Intel doesn't feel like doing some encoding tests...
I agree. And maybe some compute-focused tasks, given Xe2 has FP64 support. Given the compute characteristics of Alchemist, I don’t think Intel will nerf it like Nvidia does with GeForce.
7:15 Linus, that is not how percentages work, unfortunately
Yes, what is the review team doing?
@@sam35551 This is obviously a last-minute video that didn't have time to be as thoroughly checked. Things slip by sometimes, it's okay. The point still stands, whether the percentages he speculates about are completely accurate or not. It's just a matter of illustrating how the price-to-performance might be really good.
It's not like the percentages Intel gave are 100% correct anyway. Just a bit off is not that big of a deal.
Yes it should be - costs 20% more while delivering 24.24% less raster performance per dollar.
Ooh, double-sided note:
Intel:
- I got screwed HARD, invested in about 100x 13th/14th gen and oof...
LTT:
- The lighting got really good on this channel!! Love it! The improvement in lighting especially is impressive, and I'm really happy to see it!
1:24 ok I giggled
Not me
I don’t think it was a glitch
1:03 i fr thought my phone was lagging hard 😭
Sameee 💀💀💀
The B580 with a used 5600X and some RAM could be a decent choice for people who want to save a buck or two.
Would love to see you guys do another extended stream testing a variety of programs and games like you did with the Intel GPUs!
Yesss, but give them a few weeks to cook.
Sacking Pat was a bad move. But I understand the situation. He wasn't a good salesman.
There's no rulebook that says a CEO needs to present product launches, etc. But I'm also pretty sure they didn't spend any time considering alternatives, like hiring a marketing exec, because sacking the CEO is the run-of-the-mill, expected knee-jerk reaction of the times.
I took a huuuuge gamble about a year after the launch of the Arc A770, and it has not let me down one bit. Suuuper solid card and damn near whisper quiet most of the time. It wasn't necessarily a huge upgrade over my previous card, but it was still an upgrade. I love it: aside from the Arc Control Panel that refuses to update drivers automatically, it's great.
Also... the set you're in needs some better sound deadening. There are some pretty noticeable echoes.
As a previous a750 owner (I just moved to 4060 ti) I'm really happy to see some hype for this. The first gen had a HORRIBLE launch but after a few months they fixed everything and for $250 I got an insanely good card that I was really happy with. But nobody bought it cause the launch ruined its reputation. I hope battlemage can turn it around.
Technically they didn't fix everything. In the last Intel GPU review/check I saw several months ago, there were still issues, though overall it was much better than at the start.
moved to a 4060 ti
damn, my condolences
@@cz35vk 4060ti performs similar to AMD's flagship GPU(AMD Ryzen 9 7950XTX). With DLSS and RTX on it can easily get triple the performance too. One of the best budget GPU under $2000.
8:20 lol Editor Pelle Gustavs, that is not a foot
🤣🤣🤣
The Yu-Gi-Oh references warmed my heart. Thanks for that guys.
Having worked with marketing people, I'm sure XeSS and XeLL are 100% supposed to be pronounced "Excess" and "Excel".
Maybe they shouldn’t have named it a dumb unpronounceable name if they wanted us to say it "correctly"
You saying I can have either 5 LTT Store hats (hats + delivery) or this banger of a card?
A mad Yu-Gi-Oh reference. Bravo
👏🏾👏🏾👏🏾
Funny thing about those 4060 MSRPs: every 4060 on Newegg is over $319 except for one model.
Good video and we'll see how Intel performs.
Also, I don't usually call out quality, but there was some quite noticeable grain/pixelation in the video background.
I'm sure that them pivoting to a new CEO during a complicated, long-term transition will go great, assuming of course the board makes sure the incoming CEO prioritizes tech and engineering progress rather than the stock price. Right???
Right, right 😂
The math in this was painful to listen to
Dude, I count on their math, because otherwise I'd spend hours working it out on my own and having to search up formulas.
U listen to the segue tho?
segue makes up for it tho ngl
As a Linux user, I'm actually really interested in an Intel discrete GPU, because they have the best drivers over here. Not needing the notoriously painful Nvidia drivers installed would make setups that much easier (Intel's drivers are open source and preinstalled on pretty much every Linux distro). I'm also curious how it will play with an eGPU setup, and whether the drivers will be able to make use of both my laptop's internal Iris chip and the additional horsepower of the dedicated GPU.
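A tangent on checking that from userspace: on any Linux box you can list which GPUs the kernel is exposing as render nodes, which is handy for eGPU experiments. A minimal sketch using standard sysfs paths (0x8086 is Intel's PCI vendor ID, 0x10de Nvidia's, 0x1002 AMD's):

```python
from pathlib import Path

# Every DRM render node under /sys/class/drm corresponds to a GPU the kernel
# driver has picked up; its PCI vendor ID identifies the maker.
for node in sorted(Path("/sys/class/drm").glob("renderD*")):
    vendor = (node / "device" / "vendor").read_text().strip()
    print(node.name, vendor)
```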
Intel best bring their A-game!
thanks for the insight sir jimmy savile
yes!
Clever joke about it being the B(attlemage) series.
@@lucasrem it's a quote from Joey Wheeler, referencing the Yugioh memes in the video.
More like their B game
I'm really happy to see a second-gen Intel card coming out after so many were speculating that the first gen wasn't good enough for a follow-up.
Not sure I'm gonna upgrade just yet, my 1080 Ti is still hanging in there, but when I do (and I plan to go more towards that end of the price spectrum; I don't game nearly enough to justify high-end GPUs anymore), I hope there is a compelling Intel offering to at least consider seriously.
I love that your content is not just for high end. Thanks for looking out for the little guy.
Thought my phone was lagging when the segue came up. 1:10
i realize VR gaming is niche, but it's the biggest support hole in the Arc cards at this point. a brief mention/heads-up would be appreciated thx
i don't think a $250 card is gonna let people play VR games at a decent frame rate...
@@ForeverHobbit And why do you think that?
@theo it's a 1080p card
Good call! Hopefully one of the tech channels will cover this aspect. Granted, with WMR discontinued, I doubt I'd ever see support for my Samsung headset anyways.
@@TheoHiggins because VR usually isn't at 1080p, it's that simple. High resolution requires so much processing it's exponential, and it sucks. Then you add that you don't want to play VR at 60 fps, and this card wouldn't cut it
If this generation bombs too (Alchemist can hardly be called a success), then that's it for Intel's GPU department. Not that I want it to bomb, but the launch of Alchemist left a very bad impression, which led to an adoption rate so low that it might as well not exist. The number of games using XeSS is OK-ish, but not comparable to Nvidia's DLSS.
The problem with that $200-300 price range is that those cards typically don't make a lot of profit, if any. And this is about profit, nothing else. Do not think for a second that Intel just wants to give gamers the best bang-for-the-buck card out of the goodness of their heart.
10:11 oh god indeed
@05:17 - Dude.. You're getting a " X e L L"
What does that mean?
It's a reference to the "Dude, you're getting a Dell!" commercials from Dell.
Intel really said, 'New CEO, new GPUs, who dis?'
This is the earliest I've ever been to an upload
What's wrong with the video?
Dim light and high ISO, and then noise reduction?
It certainly doesn't look as good as the other videos.
Yes, there is also a certain weird red noise over the phone
Likely it's a rushed video
the video was likely rushed to spread the news faster than other channels
LTT's Bluesky account posted that the video wasn't recorded in-house but at another YouTuber's studio
Good video, makes me hopeful for the new Battlemage launch! Only part of the video I found jarring was at 8:20 with that stock foot(heh)age insert. That insert doesn't quite feel like LTT, though it is good to see the editors come out a bit with their own flairs and ways to edit
I am looking forward to XeSS2 because XeSS already looks way better than FSR3 in most games it's in. But a lot of people don't even know about its existence as it was not really advertised by Intel in any way
Remember that FSR is getting a 4.0 version with the 8000-series launch, which will also use an AI-based upscaler instead of the filter-based approach AMD has used so far. So XeSS2 is not competing against FSR3. Also, how many games will actually implement XeSS2? Adoption of the first version was quite low.
@@xYarbx Found the AMD bot
@wta1518 Better take off the tinfoil hat before you end up on the list. I've got no reason to shill; I'm just stating the facts so people can make more informed decisions.
@@xYarbx you should remember that AMD made grand claims about FSR3 and it was still crap
@@Freestyle80 I made no claim about the quality. I simply stated the fact that comparing XeSS2 to FSR3 is going to mislead about the experience you will have for most of the useful life of the card. Meaning that if you use the quality of those upscalers as the deciding factor, there is potential for buyer's remorse.
Watching this video on my powerful Intel Arc A730M, ah no wait, blue screen 💀
Congrats on 16M subs!🎉🎉🎉
4:23 lmao that was kinda brutal 😂
Go intel go.
We need you 🎉😊
9:04 I absolutely love the Yu-Gi-Oh reference please do more of that 😀👍
@7:15 That is NOT how percentages work.
For those who want more details: "32% less" for Arc Battlemage would imply Nvidia has 68% of Arc's performance, but 100% (the Nvidia number) out of 132% (the Arc number) is actually 100/132 ≈ 75.8%, i.e. 24.2% less. While the main point still stands, the difference was overstated.
@@Stars-Mine I came to mention that and saw your comment.
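The general identity behind all of these corrections, written out once: if A leads B by a fraction f, then B trails A by f/(1+f), not by f itself:

\[
1 - \frac{1}{1+f} \;=\; \frac{f}{1+f}, \qquad f = 0.32 \;\Rightarrow\; \frac{0.32}{1.32} \approx 24.2\%
\]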
This is the first time in years that an early adopter of a next gen product doesn't have to break the bank. You can get started with the fastest Intel GPU for just $250, which will get you above an RX 7600 at $270 and an RTX 4060 at $300.
They are probably selling at a subsidized price.
Definitely. But that’s not really out of the ordinary. Lower tier GPUs have less profit margin which is made up with your 4090s and 7900XTXs. Intel just needs to break in to the enthusiast tier and then they’ll have the opportunity to make more profit.
I recently picked up an A750 for $125 (used) and am blown away by it. Saying that the drivers have come leaps and bounds doesn't quite do it enough justice.
Some slight tweaking, and it's performing on par with, or better than, the 3060 12GB I was using for the same budget machine. I'm ridiculously excited for their next entries, as they're definitely zeroing in on the "budget" or "casual" sector harder than Nvidia or AMD ever have.
Finally the Rx 580 has a worthy successor, fingers crossed.
I remember when people wanted 400 bucks for an RX 580 a few years ago lol
@@Daniel-dj7fh Yeah, that was insane. I could have sold my PC four years after I built it for more money than I spent on it.
I can't wait for the full review. I am still loving my A770, although it does have its weaknesses. For $250, if it offered just a little more stability, or basic shader fixes like running Spacebase Startopia without green lines and MK1 without bright white lights whenever Chameleon is doing anything, it could be a solid buy-it-to-support-them-until-the-B770-comes-out solution. Please, if you test games on the B580 again like when Arc first came out, make sure you try out these two games somehow, because these issues are insane! :)
Same! I've got the ASRock PG 16GB card. If the Battlemage flagship is still in that $300 price class, I'm definitely upgrading
you'll get 25% more green lines and bright white lines than on the RTX 4060 :-)
@@kmieciu4ever yeah, the sacrifice of supporting the newcomer: running BloodRayne 2 at 30 fps because DX9 was a nightmare (fixed), and the white flashes of regret xD
@@AskemoX DX9 is only emulated on Intel cards, and that is the reason for the poor performance. People like to shit on the RTX 4060, but mine runs Stalker 2 at about 100 fps thanks to framegen, and it uses 100 watts max. It's like having a laptop card in your PC :-)
@kmieciu4ever yeah, I know it uses a translation layer for anything below DX12. I think performance is really relative: some accept low settings at 1080p or heavy upscaling, while others want pure native performance :)
Finally decided to upgrade my old 1060 this week. Was planning a 4060, but maybe I'll hold off another couple of weeks to see how this performs in the wild...
Love the editor doing both the YTP-style "In a nutshell, you'll find nuts" (listen closely, Linus didn't say that) AND the subtle foreshadowing trend (the awfully cut segue)
Bravo
It makes a lot of sense to make low-end cards these days. Gamers are playing older games, or rather not moving off older titles, and many folks are gaming at 1080p, so there's no real need to get the latest-gen hardware. And because platforms like the Xbox Series S and the Switch have pretty low-end specs, developers are having to optimize for the lowest common denominator anyways.
And a lot of people just play esports titles
But they... don't do much optimization at all? Most larger games now just use Unreal Engine, throw DLSS and frame gen on it, and hope it's good enough
Let's hope for an Arc B770 with a 256-bit bus and 16GB of RAM that beats the fake 4070, for just $319.
9:06 that's a banned move
He has four jewels!
No wait! He has five jewels!!
6:09 !!!
I got an A770 Limited Edition just 'cause I thought it would be cool to own a first-gen card. In the beginning it sucked noodles at almost ANYTHING, so I put it on the shelf to just look pretty as an art piece for a good long time. I pulled it back out recently to use in a Linux machine (and to do stuff in Windows on the same machine), and I gotta say it was VASTLY improved. Night and day. If the boasts Intel is making about the upcoming cards are true, and if their driver team keeps pushing, then this could be THE go-to budget card when it comes to price-to-performance.
Side note: These cards are still my favorite in terms of appearance. So sleek and elegant. Not all sharp edged and pretending to look dangerous for "Gamers."
Wasn't supersampling the opposite: render more pixels, then downscale to the native resolution for better AA?
It’s a little confusing because SSAA is called supersampling, and XeSS is also called supersampling.
Intel's day was not as WILD as the total misunderstanding of how percentages work in this video.
I love LTT but politics aside, math is math.