How Are Intel GPUs in 2024? Games, Drivers, and MORE!
- Published: 24 Apr 2024
- Wendell takes the Intel ARC A770 and A580 through their paces, a year and a half later!
**********************************
Check us out online at the following places!
linktr.ee/level1techs
IMPORTANT: Any email lacking “level1techs.com” should be ignored and immediately reported to Queries@level1techs.com.
-------------------------------------------------------------------------------------------------------------
Intro and Outro Music By: Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
creativecommons.org/licenses/b...
Intel has to cover years of black-magic optimizations done in AMD and Nvidia drivers for every single game. The fact that they're doing it at this speed is really impressive.
I mean, it's definitely impressive, but hasn't the software side of things always been what Intel focuses on? Between that and their decades of experience working on integrated graphics, I guess I kind of expected Intel to have almost caught up by now.
Hope they take a bit of an L on the CPU side and focus on stability and efficiency rather than beating AMD on benchmarks.
@@Rushil69420 no one was playing these games on igpus
I realize I'm a bit late to the conversation but has Intel improved things for pre-dx9 games? I play a lot of 90s AAA classics.
@@Rushil69420 simply put, naaah. AMD and Nvidia drivers are ahead, and Intel had more bugs; that's why I don't feel confident buying an Intel card.
But I'm currently thinking about Intel for my next video card, maybe their next gen, if they level up their drivers and don't cheat.
Things have improved a lot on the Linux side too. In the past couple of years, ARC on Linux has gone from "nightmare to get it working properly" to "just works right out of the box". My main (Linux) desktop currently has an ARC card in it, and I'm happy with it.
I had a lot of problems getting video encoding and decoding working on my A380. That was across Arch and Ubuntu. I'm not saying it's not better, but it's still not quite plug and play.
this is relevant to my interests :D
Depends on the distro, AIUI. Debian stable is still on kernel 6.1 and you really need a newer kernel than that. I mostly care because I want to stay on Debian stable, and Jellyfin supports AV1 encoding on Intel hardware from 6.2.
good to know!
@@samiraperi467 Yeah, I suppose I should've qualified that with "assuming you're on a recent kernel". I'm currently running Ubuntu 22.04.4 LTS; the .4 point release brings 22.04 up to the 6.5 kernel. FWIW ARC also seemed decent with the 6.2 kernel (22.04.3 LTS) before that.
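Since the usable cutoff in this thread keeps coming back to "kernel 6.2 or newer", here's a minimal sketch of a version check along those lines. The 6.2 threshold is taken from the comments above, not from official Intel documentation, and the parsing only looks at the major.minor part of the release string.

```python
# Minimal sketch: check whether the running kernel is new enough for
# Intel Arc AV1 encode support (6.2+, per the discussion above).
import platform

MIN_KERNEL = (6, 2)  # assumed threshold from this thread

def kernel_at_least(release: str, minimum=MIN_KERNEL) -> bool:
    """Parse a kernel release string like '6.5.0-27-generic'."""
    numeric = release.split("-")[0]           # drop '-27-generic' suffix
    parts = tuple(int(p) for p in numeric.split(".")[:2])
    return parts >= minimum

if __name__ == "__main__":
    rel = platform.release()
    status = "OK" if kernel_at_least(rel) else "likely too old for Arc AV1"
    print(f"kernel {rel}: {status}")
```

On Debian stable's 6.1 this would report "too old", matching the experience described above; Ubuntu 22.04.4's HWE 6.5 kernel passes.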
Old boomer here, was an early adopter of Arc cards. Living on a fixed income and needing a new computer during the pandemic, I bought a Mac mini M1 due to the cheap price, though I have always built my own. Finally, after prices settled down, I built a PC with a Ryzen 5 3600G and an Intel A380. I'm not a gamer, and at first I left the integrated graphics on. Intel drivers were really bad, as 4K HDR on YouTube would just white out. Now everything runs smooth and driver updates are painless. I also do some video encoding, and the Arc is amazing for that. I finally turned off the integrated graphics in the BIOS. Good job, Intel! I was not gonna pay the Nvidia tax, with even a 1650 costing way over $200.00 a little while back.
Yep, that's why I chose AMD. Sadly Intel wasn't an option yet at the time, but it's been nice to see them finally release a GPU lineup and turn it into another very good alternative to the insane prices Nvidia pushes on GPUs that are not as good as an AMD or Intel equivalent on the lower end. The silly argument for Nvidia is ray tracing, but the truth is, _NOBODY_ is getting good RT on lower-end SKUs from any company. You want an RT PC? You'd better be buying the LATEST _and_ GREATEST top-tier SKUs, and that means you're willing or able to spend $3,000 on a PC every year. Meanwhile, I eagerly await Battlemage.
The A770 LE has been one of the best gifts I've ever bought for somebody. Through no doing of my own, it keeps on giving. My brother uses it in an SFF editing rig with an i7 13700 (non-K). This card was the best value dual-slot card at the time, and now it's just a great GPU in general. The goals initially were to have a solid editing rig that could play a few games here and there, and now the Xbox has lost its spot on his desk.
Can't wait for Battlemage
The more competition in the GPU space, the better.
@@DragunBreath Only if people will actually _buy_ the competition.
@@benjaminoechsli1941 lots of us did, I know I did and I can't wait for battlemage
Hopefully they don't neglect to include a healermage too, or at least a supportmage
I'll be buying into Arc when battlemage drops
I hope they'll have some sensible choices, but irrespective of the offer of the competition I'm ready to give intel Arc a try
Love it when Wendell is hyper-active. More videos, yes!
Get him and Patrick from STH together and we get pandemonium in the server room, in a good way... 😊
I actually gave up on the Challenger A580 about a month ago, pulled it and sold it. The machine it was in is used for watching video streams, Discord, and occasional light 1080p gaming. I added the A580 primarily for Quick Sync (so I could offload some encoding jobs from my power-hog gaming machine) and maybe turn up some detail when gaming. It never got along well with Windows Update in my machine, but two or three months in I was getting daily driver update failures from the Intel updater, and when I finally did more than just dismiss the notifications I found two instances of the updater, along with a non-existent Intel Bluetooth adapter in Device Manager (?). I removed the card, DDU'd the drivers, cleaned up Device Manager, and reinstalled. It took less than a week to start giving me the update failures again. The updater had again installed itself twice, and Device Manager was again showing the phantom Intel BT adapter. Personally I could live with minor bugs in some games, but the Arc Control update software, in my experience, was crap.
I find it amazing how well these things compete with their contemporaries, being a completely novel line of GPUs.
There needs to be a shirt with "Gaming.... For science!" for these occasions!
Hardware can be amazing, but it won't matter if the software isn't efficient... super cool to see how much they've gained over the past two years.
Thanks Wendell! - Love the reviews as always
4x of A770 could be a beast for AI. 64GB of VRAM for less than the price of a 4090. Too bad the support isn't there.
would love a vid from l1 reviewing intels ai performance, if it runs at all
hopefully many more will buy it and developers will have no choice but to support it
Interesting, but for now 2x 3090 is more than capable for about the same price.
@@userblame632 I use my single A770 16GB on Arch and it's pretty solid for Llama 3. oneAPI is stupid easy to configure.
@@disco.volante Consider the power usage of 3090s vs intel though.
I have the A380 as my host gpu for dual 1440p monitors and it's working with no issues.
I am absolutely a fan of my A770. I'm running it with a Ryzen 5700 on Fedora 40 as of this comment and it's just wonderful. Everything supported out of the box, stable and fast! I'd dare even say an AMD CPU and an Intel GPU is a dark horse these days. No, it's not the top of the line performance monster, but bang for your buck, this just can't be beat.
I could see myself getting an AMD GPU in the future if the price was right, but if I need a new card in the next year, I'll strongly be considering Battlemage. The level of support and quality Intel has given on Linux is setting the bar way high.
I'd love for you to test these on Linux, since I'm considering going Intel GPU for my next build and I like to use Fedora. Good video tho :3
I second this. I'd really like to know more about people's experience with various Linux distros out of the box with Arc, plus Wayland on Arc, KVM, etc.
Arch-based user here, it's a pretty smooth experience. Obviously the A770 is about 10-15% behind my old 3070 Ti at 2K, but most of what I play handles great. Currently on a Helldivers 2 kick and it handles it like a champ.
As noted in another thread, my experiences with Arc on Linux have been positive. You do need to be on a reasonably recent kernel, 6.2 or later, or you may be in for a bad time. Any currently supported version of Fedora should be fine since it's pretty bleeding edge. Ubuntu 22.04.3 and 22.04.4 (with the updated HWE kernels) have worked for me.
Very good vid, thanks.
I would buy an ARC GPU immediately if they get proper SR-IOV support.
Yeah igpu SR-IOV for a server build would be pretty compelling.
Can you make customer GPUs / builds comparison for LLM applications?
This is a pretty impressive uplift from drivers alone
People still seem to forget the insane amount of silicon Arc GPUs have.
Yes, they have moved mountains like Wendell rightfully states.
But! That's in comparison to the abysmal state the software stack was in when it launched. And it's still painfully weak when compared to the sheer amount of silicon an A770/750/580 has.
If I had paid €400 when the A770 launched, I would've literally thrown it out of the window.
It's been a wild ride, from figuring out firmware updating to moving away from D3Don12. The SR-IOV stuff has always been in the driver, I'm curious what more recent stuff you've seen that makes you think it might go public. I'll stay tuned.
For my part, I've been in it since Odyssey in SF. I was there when it launched, right behind Raja. And I'm still on their beta program with an A770 as my daily. I can give nothing but the greatest praise to the Intel folks for responding to issues and working them out since launch; it really gives you an appreciation for all the reproduction work they have to do.
I am running an A770 with an i7-11700F and so far it's performed pretty well. I do run games at 1440p, using an MSI 120 Hz 1440p monitor. My biggest issue is no sound over DisplayPort when viewing YouTube. So I plugged my speakers back into the motherboard audio. Fixed.
Aside from gaming, the media engine included in the Arc cards is very nice. I replaced my Quadro P400 with a Sparkle A310 single-slot card in my Jellyfin/Plex server and I'm very satisfied. Even the PCIe passthrough and the driver installation worked like a charm.
Good to have good news.
The 0.1% low changes are insane. Incredible work!
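For anyone wondering what a "0.1% low" actually measures, here's a small sketch of one common convention: turn the slowest slice of frame times into an equivalent FPS number. Note that benchmarking tools differ in exactly how they define this (mean of the worst N% vs. a single percentile frame), so treat this as an illustration, not the definition any particular tool uses.

```python
# Sketch: compute "N% low" FPS as the mean of the worst N% of frame
# times, converted back to frames per second. Definitions vary between
# benchmarking tools; this is just one common convention.
def percent_low_fps(frame_times_ms, percent=1.0):
    worst_n = max(1, int(len(frame_times_ms) * percent / 100))
    worst = sorted(frame_times_ms, reverse=True)[:worst_n]  # slowest frames
    avg_ms = sum(worst) / worst_n
    return 1000.0 / avg_ms

# Mostly 60 fps (16.7 ms) with ten ~30 fps (33.3 ms) spikes: the average
# FPS barely moves, but the 1% low drops to ~30, which is why lows are a
# much better proxy for perceived stutter than averages.
times = [16.7] * 990 + [33.3] * 10
print(round(percent_low_fps(times, 1.0), 1))
```

This is also why the 0.1% low improvements shown in the video matter more than the average-FPS bars: a handful of slow frames dominates the metric.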
I’m no gamer and love my ARC 770 LE. I even have an ARC 750 LE still sealed in a box as a spare card. The Intel design of the LE cards looks great (to me).
Do you use multi monitors with Intel card or no? Would like to know how they run
It's spaceship grade
@@manuelhernandez2017 No, I just use one 1920x1080 60 Hz monitor.
@@a.j.haverkamp4023 I've read of issues with multi-monitor setups even without gaming... just asking for that reason. I'd like to buy one and try it
I've been running the A770 for over a year now and I love this card.
This video drops just as i got the driver update.😁
Excited for what's in store for these cards, maybe even lasting until Celestial, which might be when I upgrade. I'm perfectly fine with my A750, more so thanks to these driver updates over the past handful of months. In a sea of green and red, I love my little pond of blue.
I have an Arc A750 and I bought it as a "temporary" GPU (an upgrade for my finally-long-in-the-tooth Titan Xp, while building a new PC and keeping the old one together), waiting for the 50 series to buy one of those or a 4090, depending on pricing. I'm very impressed with the A750 for the price. I do still get occasional but consistent stutters in some games.
ngl I want an Intel card. I could gun for an A750, but I want to see what Battlemage brings to the table. I hope the hardware side has moved as much as the software side,
and I hope it doesn't cost an arm and a leg either. I'm happy if we can get decent $200 cards again that aren't three generations old.
Was on the fence before but I'm seriously looking forward to Battlemage.
Hey Wendell, awesome video, love it! One thing though: Intel and well-structured software development? Well, sorry to burst your bubble; they seem to do some good, but please also look up the topic "MeshCentral" and what happened with the devs.
Wendell, How is the A770 doing with Topaz Video AI 5? I know that Arc GPU's were unusable with Video AI about a year ago. Has this been fixed? Thanks!
I'm glad there is a third option out there now. I have 4 of the 8 A770 16GB cards that exist, and they all vary in their max power setting; so far the Sparkle A770 Titan OC is the highest one I have, at 276W. For me they are great bang for the buck on my systems, since I don't need much a lot of the time, but when I need the extra power they work great. The drivers also impress the heck out of me; Intel is backing their project massively, and I don't mind fully supporting something so neat. I have high hopes for these cards and can't wait to see them become more than a "hobbyist" card.
I had my complaints in the beginning, but they have smoothed out the experience quite well. Oh, btw, hot pot in Taipei around Computex time? I know a super great place.
Any updates on sr-iov for the a770?
So with intel making enterprise VDI solutions... will we see any of that support trickle down to the desktop cards?
I have had virtualization issues with dual monitors on my 11th-gen i7. At first I had to pull an HDMI cable out to get past the POST screen, but it seems a microcode update in Ubuntu fixed the issue randomly one day. Note I tried enabling virtualization on the board, and I was also running bare metal.
100% hope Intel continues with their GPUs and Drivers. I am never an early adopter of PC technology because of the cost of parts, but if a competitive hardware or software options proves itself, I am happy to whole-heartedly embrace it.
Nice test video...
Sir... Does ARC A770 support video Ai FPS enhancement software? (Like : Flowframes / FrameGUI)
If you ramp up the fans, you can get the card temperature down real low. It's always running at max power, so it won't matter. 😊
I love my A770, super impressed with its price/performance, and it just keeps getting better.
They released new hardware? Please do a segment on the importance of drivers versus hardware
Very much looking forward to seeing what Battlemage is capable of! Hopefully they can improve on efficiency. The main thing I'm curious about with Intel ARC is creative, productivity, and LLM tasks.
Fingers crossed that Battlemage isn't too much more expensive than this gen and has a lot more performance! I was waiting to see the 2nd gen of Intel GPUs before jumping on board, but it's very exciting to see the strides Intel is making on their GPUs! The more competition the better for everyone, and hopefully they knock the green giant off their high horse!
Competition Woot!
Intel Arc FTW
In the graph at around 9:30, I believe you used outdated results for the Arc A770 (from Feb 2023), which could explain why there is so small a difference between that card and the A580.
I lament to inform you that I was unable to read the white letters on the light blue background of the bar graphs at the viewing distance I had with this on my phone while out and about. A black outline could help. Also, the far-left text was far too small for me to see even when I made a special effort to look closely.
As a listener-viewer, I wasn't able to discern what the 9 bars were each associated with. I assumed the top 3 were all the new driver results, but upon zooming in, I could discern different dates among those 3. Maybe a little bit of description of the graphs, or fewer at once, would be favorable. Maybe…
But don't sweat it, you do you. I would, though, promote enlarging the text and giving a black outline for the white on light blue.
Thanks for the follow-up on Arc. I also lament that I still have not produced my "Intel Arc: the Anime" series. But with AI I hope to someday achieve such heights of art.
How are these doing on Linux now? Wayland support? Game performance? AV1 encoding?
Should be fine. Intel has always had the best Linux drivers and has contributed to the kernel for decades, far longer than any of the other big corps.
Obviously, make sure you use a bleeding-edge kernel.
Runs pretty well; the only thing I can't do is run some newer games like Starfield.
Arch-based Hyprland user here, Wayland is great! Gaming performance is a bit behind the 3070 Ti I had before, but solidly playable (2K 165Hz for reference).
The A770 has been mostly great, but I'm starting to need more juice to run games at the resolution (3440x1440) I'm using. BG3 is fine at 60fps, but games like Hunt: Showdown and Helldivers 2 need more than that. Let's hope they drop Battlemage soon, because I don't want to go back to Nvidia.
I feel that, A770 has been good to me using 4K res, but if BM can get near 4070 to 4080 performance like speculated then we are in for a treat if the price is right like $400
@@thetheoryguy5544 The cheapest 4070S is about 680€ in my country, and if Battlemage's high end (B770?) will be close to that performance, I'm happy to pay even in the 500-600€ range. LE of course, if possible.
What I can actually say: using an arc380 in my proxmox server for a windows vm is crazy good. Low power, great encoding.
10:50 Was it with Intel XeSS XMX or XeSS DP4A? Those are two different code paths; XeSS XMX is specific to Intel hardware. The difference is huge.
I love my Arc A770 LE. Amazing on Linux, and it looks fantastic. Intel did a great job on it. Can't wait to see Battlemage and see Intel excel in the GPU space.
I knew drivers make a difference, but this is pretty amazing.
On Linux it's plug and play. No driver issues, nothing. I love it. I don't care that the drivers may be worse than on Windows.
This is very exciting news, trying to not get my hopes up for battlemage
I had the same thing with asus amd 6600xt. Windows 10 no issues. Windows 11 would put in a generic driver. I spent months trying to fix it.
Love my A770 but idle power consumption still sucks despite so many driver upgrades
Multiple monitors? All high refresh rates?
@@joniqst It's a known and published bug that Intel accepts cannot be fixed without a hardware revision.
I'm not a technical person and I don't understand 0.1% lows etc., but running a 5800X3D with an A770 playing Modern Warfare 2 feels smooth as butter. It's not a super high frame rate, but it just feels incredibly smooth.
Same combination and I have to agree 100%.
It is television broadcast smooth for me as well.
Not a single stutter.
Not a single in-game crash to date...and I've owned mine since December 2022.
Had three driver installers crash but never the game itself.
I have an Intel A750 for my Jellyfin server; it shattered my previous 3050 Ti in terms of transcoding speed and, of course, AV1 encoding and transcoding. Bought it for $200 including shipping, and I have loved it ever since. It was quite a pain to set up the passthrough in Proxmox, and it sadly only has support in Ubuntu and Windows VMs, but for what I paid and what it delivers, it truly is a marvel. If Intel responds with Battlemage being at least at an AMD level of performance and driver support and development, I might switch from Nvidia to Intel.
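For reference, the usual Proxmox PCIe passthrough recipe looks roughly like the sketch below. Everything here is an assumption to adapt to your own system: the PCI address 0000:03:00 and VM ID 100 are placeholders (find your card's address with `lspci`), and the kernel cmdline line goes in `/etc/default/grub` on GRUB systems.

```shell
# Hedged sketch of typical Proxmox PCIe passthrough steps for an Arc card.
# 1. Enable IOMMU via the kernel cmdline (GRUB shown; adjust for systemd-boot):
#    GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"
#    then run update-grub and reboot.
# 2. Load the VFIO modules at boot:
printf 'vfio\nvfio_iommu_type1\nvfio_pci\n' >> /etc/modules
# 3. Attach the GPU to VM 100 (q35 machine type is recommended for PCIe):
qm set 100 --machine q35
qm set 100 --hostpci0 0000:03:00,pcie=1
```

Passing the whole `0000:03:00` device (rather than a single function) keeps the card's audio function attached, which some guest drivers expect.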
Can you please make a video reporting on pre-dx9 games? That's an area I believe Intel needs to work on.
I bought an Arc A380 for $120 a month ago, and while the Windows experience is okay, there's no GPU selection like Nvidia Optimus for laptops or AMD's PowerXpress. You're stuck with Windows' High Performance/Power Saving GPU selector, which only covers DirectX; Vulkan games have their own GPU selector, and OpenGL games always use the GPU of the primary monitor, unless your primary monitor is a duplicate between two monitors, in which case Windows will use the DirectX setting to select priority.
I use my Arc as the output device for all my monitors (DP), and I have the HDMI input on my primary monitor plugged into my Nvidia GPU. Performance is lacking on Windows, but on Linux with PRIME offloading performance is amazing, Wayland works flawlessly, everything is offloaded onto the Arc except the game I want to play, and I have AV1 encoding.
Battlemage hype
So my experience with my Sparkle A770 has been all over the place, but currently great! I'm running a 5700G system with 32 gigs of DDR4-4000 memory and an ASRock X570 mobo, and my Windows experience was not good. Like, I couldn't have Discord and Steam open at the same time. It was not that bad when it worked, but man, I spent a lot of time trying to get it working right.
But I moved over to Ubuntu 23.10, and oh man. The only bug I ran into was some weird reflection issues in GTA 5, and that was fixed a long time ago. On some distros the A770 can be problematic, especially if a lot was changed under the hood; Pop!_OS, for example, was not a fan. But I've been daily driving the A770 with Ubuntu 23.10 for months and have had no issues whatsoever. I do have an issue where the card doesn't always pick up both monitors at boot, but I've read that it's a hardware issue.
I use Ubuntu out of the box with whatever drivers it auto-installs. I know Intel has Linux Arc drivers, but I haven't found them necessary. Can't wait to try out 24.04!
Bought an ASRock A380 strictly for doing AV1 encoding, and it's been working like a champ, except I just noticed... it's locked at PCIe 1.0 x1 speeds no matter what motherboard I plug it into. That was a rather shocking discovery, given it requires a full x16 slot, though it's only spec'd to use x8. The ASRock also requires external PCIe power, though it's not actually needed. Kind of a bummer. Confirmed with a colleague with the same card and different hardware: same thing. Guess I should have figured ASRock would scrape the bottom of the barrel for 'supports but doesn't utilize' specifications. Oh well, none of the A-series cards support SR-IOV, and I'd really like that feature, so *crosses fingers* here's to hoping the next-gen Intel supports it.
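A downtrained link like the one described above shows up in `lspci -vv` as a mismatch between the `LnkCap:` line (what the card can do) and the `LnkSta:` line (what it negotiated). Here's a small sketch that compares the two; the sample text is illustrative, not captured from a real A380.

```python
# Sketch: spot a downtrained PCIe link by comparing LnkCap vs LnkSta
# fields from `lspci -vv` output. The sample below is made up for
# illustration, not real A380 output.
import re

def parse_link(lspci_text: str, field: str):
    """Return (speed_gts, width) from a 'LnkCap:' or 'LnkSta:' line."""
    m = re.search(rf"{field}:.*?Speed (\d+(?:\.\d+)?)GT/s.*?Width x(\d+)",
                  lspci_text)
    return (float(m.group(1)), int(m.group(2))) if m else None

sample = """
LnkCap: Port #0, Speed 16GT/s, Width x8, ASPM L0s L1
LnkSta: Speed 2.5GT/s (downgraded), Width x1 (downgraded)
"""
cap = parse_link(sample, "LnkCap")
sta = parse_link(sample, "LnkSta")
print("downtrained" if sta < cap else "full speed")
```

In practice you'd feed this the output of `lspci -vv -s <address>` for the card; 2.5GT/s is PCIe 1.0 signaling, matching the behavior the comment reports.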
I bought an A770 LE last year and it is mind-bogglingly good for a first-gen product. Driver updates are regular and keep getting better. If you mainly want a productivity GPU that is also becoming pretty good at gaming, and you don't feel like remortgaging your house, Intel has you covered.
Has the elevated idle power draw on intel Arc cards been fixed?
No, it's hardware level and can only be fixed with a hardware change, so Battlemage (hopefully) won't have this problem.
At 9:35, that lackluster CP2077 performance on the A770 is using the older Feb 2023 driver's performance numbers. He got much better results with the April 2024 drivers just a few minutes back, at 5:58.
Industry-wise, Intel's GPUs are doing exciting work. Assuming they hang tough and keep up the improvement, they are going to ship Battlemage with a rep that, while the drivers may not be A1, their effort to get to A1 is A1.
I was of the mindset not to buy Intel discrete GPUs due to bad drivers. I think that's largely resolved now, so I'd probably be fine picking one up.
It's not quite right to say Intel came from nowhere; in truth, their GPU and iGPU hardware and software have been around a long time. But to level up against the high-end players in the market, to this degree, in this kind of time frame, is highly impressive. And I am thankful for it, because a market where midrange cards exceed $500/£500, instead of the older $200/£150 arena, is batshit, and something needed to come break that impasse.
Did the drivers improve AI performance? With the ass games that have been out lately (not into multiplayer shooters), I'm deep-diving AI more than gaming. Would an Intel card outperform the $400-450 used 3080 Tis in Stable Diffusion or LLMs? I know they have a bit more memory, but speed is more important than memory in my usage.
I've been running a test rig on an AMD/AM4/B550 platform with the Ryzen 7 5700X and a Sparkle A770 Titan video card; some of us are on AMD platforms, and the user experience could be different.
Wendell, please test using B450/X470 in PCIe 3.0 with ReBAR on for the Intel Arc A580/A770. It should work, since AMD can do what Intel cannot on PCIe 3.0: AMD can enable ReBAR for Nvidia or Intel cards on the X470 chipset. I tested the RTX 3070 on X470 with ReBAR and it works.
Wendell what’s the encode/decode performance like? I’m wanting one of these for Plex to ditch Nvidia.
I rock an A730M 12GB VRAM laptop GPU and it owns!
Windows auto-updating has been such a pain. My solution is to just download the driver manually and do a clean install; it's been relatively good since.
*Question...*
Any thoughts on MACHINE LEARNING being involved to fix drivers for various games?
I'm very interested in the idea of ML being involved. Maybe initially some non-ML version that's simply automated to try all sorts of combinations of game settings while monitoring FRAME TIMES? Step through every game in the Steam catalogue? Then ML having access to specific driver code locations so it can try to rewrite them? (feedback loop for crashes/worse performance/better performance...) Maybe move on to rewriting game code to be more efficient? Basically there are TONNES of older games. How much can we fix them with what might be eventually LOW HANGING FRUIT in terms of cost as ML gets better?
Hope that with Battlemage, Intel can put enough pressure on AMD and Nvidia for their mid- and low-end cards to have reasonable VRAM and prices; that would be awesome for the gamer market. Nvidia is supposedly alone in the ultra high end coming fall/winter, but hey, 3 players is better than 2.
I'm happy with my current GPU, a Sapphire GPRO X080 (a mining version of the RX 6700); it was extremely cheap and is fine with my 1440p 75Hz ViewSonic ;)
What about on linux? How's proton?
I'd be curious how these cards perform rendering video in DaVinci Resolve, AV1 vs MP4 (render time and file size), and Windows vs Linux (since there are versions for both platforms)...
And ideally whether things have improved over driver updates. I'm rooting for Intel Arc and looking to escape the Windows grip. I wonder if anyone else cares for this test?
When Battlemage comes, I can't wait to see how they've implemented the drivers on top of better hardware.
Great video!
It seems, for now at least, Intel is doing what every new actor in a new market should do: reach out to users for feedback and have enough people working on solutions and optimizations. My question is, once they get a solid foothold in the discrete GPU market, will they continue to be this "nice"?
I hope they learn the lesson that with decent prices, a good product overall, and good and respectful communication, you can get a lot, really a lot. As long as you don't get greedy, of course.
Does arc work in Plex for encoding?
What’s this about SR-IOV on the regular A770?
Wish ReBAR didn't need to be a thing, because I would slam an A380 LP into an old Dell right now.
OK but what about on Linux Mint??
Not really in the game atm but I really want to see intel to become a stable 3rd player.
Frame generation is nice, but the image quality of XeSS is so much better and I'm not even using the XMX version. If anything, I feel like FSR is the technology that's lagging behind and desperately needs to improve.
I love to see what they're doing but I feel like linux is so neglected on Arc, we don't even have a GUI yet.
You don't need a GUI for it.
@@lost-prototype yes, and?
@@jddes Ummmm... Eat at Joes?
Hello, I have the ASRock Arc A770 16GB and it's always at 90°C while gaming. How is yours running so cool? Even when I set the fans to 100%, I'm still getting mid-to-high 80s.
Check your thermal pads/paste, or ask a friend who is experienced with that kind of thing?
is it Shady is he back?
How are the Intel GPUs on Linux? Do they even work on Linux?
Wooooo intel GPUs! SR-IOV baby!!!!!!
When you find an issue where the A580 performs similarly to the A770, that is more an issue in the game than in the drivers or hardware. I'm saying that as a software developer with a background in microprocessor development. You already have proof the drivers work appropriately: other games show performance differences that effectively match the hardware difference between the cards.
That means the drivers make full use of what is provided to them. It should not be required to write special drivers just to make a game work.
That is 100% an issue with the games optimization and design.
One of the biggest problems the Arc GPUs still have is idle consumption; all of them draw around 40W just doing nothing on the PC.
A770 or rtx 3070? Same price in my country
I wonder if that Code 43 is from Microsoft essentially running Windows in a VM these days, with all the security sandboxing they include.
I missed out on a good deal on an A580 a week or two ago. God, it stings.
Mesa maintainers should allow Intel developers to replace the old LLVM compiler with the in-house IGC compiler.