I've been using an A750 since launch. I replaced a 2070 Super with it, so not exactly an upgrade, but it's done everything I've asked of it quite well.
That's more of a side grade. The A770 would have been an upgrade in titles where the drivers/software play well (a lot of stuff with DXVK, for example), but a side grade in stuff that still hasn't been optimized in the drivers yet.
After testing his top 3 games, my buddy pulled the 3080 out of his main rig and sold it to his brother for 2.5x the price of the Acer Predator BiFrost A770 16GB cards we all grabbed before Xmas. He bought a second of these Arc cards and tossed his old 1070 Ti AMP Extreme into the system as his secondary video card; it's his fallback card anyway. Being able to just change the primary display adapter is nice when checking whether a game will launch on one card but not the other. (I found a couple of cases where I thought that was happening, but Arc just crashed faster than Nvidia. I checked, and the games are bugged for a lot of people. *smacks forehead*)
No regrets so far, this card's been fun.
@@AshenTechDotCom My one and only regret actually... Not shelling out the extra dough for the A770.
@@Spenlard eh. if it was close to $100 more, not worth it.
Not much better. I'd pay $50 tops.
@@user-wq9mw2xz3j At the time it was $50, and I wasn't exactly strapped for cash. I just thought it was gonna be a worse experience than I ended up getting. Knowing what I know now, I'd have grabbed the 770 any day.
@@Spenlard That's how I felt about the Acer A770 vs the LE card: better cooler, no pigtail for RGB, etc. I went from dual 1070s to this and have no regrets. I wanted 16GB of VRAM, and so far it just keeps getting better and better.
Surprisingly in depth for such a small channel.... keep up the great work!!!
ya lol i was thinking exactly the same .
Hope Intel gets the issues worked out soon. The world can use some less expensive GPUs, since it seems like Nvidia is going to stop selling the 3050/3060 range.
I'd say it's already here. The driver that dropped today makes DX9 playable, which was the last real problem.
Interesting that the Arc A750 only draws 10 watts less than the A770. I'd expect a card with fewer ROPs, Xe-cores, ALUs and TUs to consume much less power. My A770 16GB never exceeds 190 watts.
Yeah, though I crank the power on it and get a nice FPS boost at 230 watts. The fact that this card hates 1080p and 4K but loves 1440p makes me think Intel needs CPU-side driver optimizations to lower overhead and make 1080p faster. For 4K, the way it handles memory access and shaders needs to be optimized. But a perfect first gen nonetheless.
@@phoenixrising4995 Intel's notification tool informed me today about a new driver, 4032, dated December 21st. Haven't found a changelog yet, and no improvement to the Arc Control center as far as I can tell.
You can call me impressed; the detail in this video is really amazing. You put the card through everything and explain it really well.
This must be the most detailed review I have seen on the Arc cards! Impressive!
I bought one today for $249, seeing as the 3060 V2 and 3060 Ti are way overpriced.
I play SC2 and WoW: WotLK at 1440p, so I'm very confident in this purchase. More people should buy these; they're really not bad for the price, especially since Intel released a new driver last week that, from what I've read, improves performance significantly.
PCIe 3 vs 4 means virtually nothing; we tested it, as have others. ReBAR is all that matters in real terms. These are getting closer and closer to 3070 Ti performance with driver updates and DXVK (and other translation layers). The Acer A770 is amazing fun, a great card for the money so far ;)
Before I watch the video, I'll say it'll be a very good card for its price in 2023 when it gets more driver updates
Agreed. But by the time the A750 becomes stable, Battlemage will be out. So I'd buy that one instead, because I don't want to be Intel's beta tester while paying full price for it.
@@brownjonny2230 Battlemage seems promising with the amount of improvement Alchemist has seen in such a short period. I’ll be looking into getting one myself.
@@livedreamsg Sure it is. If nothing else I'll just buy one to express my unadulterated hate toward Nvidia. AMD is out of question btw because they don't support RT cores for Blender.
@@brownjonny2230 I wonder if we’ll get a flagship B9 series. Performance equivalent to a 3090 would be very good for me.
@@livedreamsg Same. I'm not planning on playing above 1440p, so 3090-level raster is more than enough.
It's also worth noting that Intel has better encoders than Nvidia (the A770 beats the 4090 in AV1), and also better ray tracing when you take into account that game devs optimize for RTX: not for ray tracing as a whole, but specifically for the Nvidia implementation. So Nvidia really isn't that good at RT when you see how both AMD and Intel have caught up to them even in games that aren't optimized for their hardware.
I've been underpowering the card (95 W) since I got it, and its average temp is 62°C. The framerate in the games I've tried is generally 10-20 fps less than at its stock power (155 W). For me that's impressive.
If you know how to take advantage of the AV1 feature on the Arc GPUs, PLEASE let me know. I want to edit smaller-file-size videos in Adobe Premiere, but AV1 isn't being accepted, and I really want to use it since it's simply better for video.
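For what it's worth, outside of Premiere you can reach Arc's AV1 hardware encoder through ffmpeg's QSV path. A minimal sketch, assuming an ffmpeg build with `av1_qsv` support; the filenames and quality value are placeholders, and Premiere itself may still refuse AV1 on import, so this fits the final delivery file rather than the edit timeline:

```python
# Hedged sketch: shrink a finished Premiere export using Arc's AV1
# hardware encoder via ffmpeg's QSV path. Assumes ffmpeg was built
# with av1_qsv support; filenames are placeholders.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-i", "master_export.mp4",   # the export out of Premiere
    "-c:v", "av1_qsv",           # Arc's hardware AV1 encoder
    "-global_quality", "28",     # lower = better quality, bigger file
    "-c:a", "copy",              # keep audio untouched
    "small_av1.mkv",
]
print(" ".join(cmd))
# run it for real with: subprocess.run(cmd, check=True)
```

Printing the command first makes it easy to sanity-check before handing it to `subprocess.run`.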
I like your channel; your videos are easy to understand with clear graphics, and you have a nice voice to listen to.
Yeah we need a video without Resizable Bar on and underpowering the card! More Proceu videos are always valuable!!
Good video sir.
It's hard to find a channel about the Arc series cards that people haven't shit on.
I got one before Christmas and love it.
I'm sure software updates will improve this card's performance.
But I have to say my OCD is kicking in: please, my friend, dust your PC and workstation. Lol.
Look forward to more videos🙏
I got the Arc A770 LE 16GB and it's pretty good. I find it funny that the 4070 Ti comes with less memory and a narrower memory bus than a card priced at $349.
I've had one since launch. Drivers are improving at a faster rate than I thought they would. It's quiet and runs games okay. The only game I wish it performed better in is Darktide, though that game needs optimization overall. It feels good for 60-100 fps depending on the game at 1080p/1440p. CoD MW2 for me at 1440p, XeSS Balanced, ultra settings is good for 60-100 fps with a 12700 NUC system.
I had an A750 for a month or so after launch, and it did everything I wanted: played games well and was quiet. Regret selling it on, tbh. Seriously considering buying another, or preferably an A770.
I have no idea what resiazable bar is
You can run ReBAR on PCIe 3 with AMD; a Ryzen 3000-series CPU is needed. It worked with a 3700X and an RTX 3070 on X470 for me, and after the 3700X was replaced with a 5700X, ReBAR is still up and working for the RTX 3070 on X470. Nice video.
Ryzen 3000 non-APU chips support PCIe 4.
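On Linux you can sanity-check whether ReBAR actually took effect by looking at the GPU's prefetchable BAR size in `sudo lspci -vv -s <gpu.address>` output: with ReBAR active, the size roughly matches VRAM (e.g. 8G/16G) instead of the legacy 256M window. A sketch parsing a captured sample line; the `16G` figure and the line itself are illustrative assumptions:

```python
# Hedged sketch: extract the [size=...] field from an lspci "Region" line.
# Replace `sample` with real output from: sudo lspci -vv -s <gpu.address>
import re

sample = "Region 2: Memory at 7c00000000 (64-bit, prefetchable) [size=16G]"

def bar_size(lspci_line):
    """Pull the [size=...] field out of an lspci Region line, or None."""
    m = re.search(r"\[size=([^\]]+)\]", lspci_line)
    return m.group(1) if m else None

print("prefetchable BAR size:", bar_size(sample))  # 16G suggests ReBAR is on
```

A small 256M result on a modern card would suggest ReBAR is off in firmware.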
I personally love NO LEDs; far more elegant and grown-up. I want everything I spend going to the screen. I am so PO'd at AMD and Nvidia that unless I have to, I will not give them another dime. We truly need a third great company providing great ideas and products. I just hope they stay in the game long enough to compete with and exceed the others. Good first try; keep up the good work. We are waiting in the wings for a fair deal on a good-to-great card.
I am interested in undervolting and its effect! Please do a video on that!
I originally bought the A770 when it was released to try it out. I put it in an Intel-based computer and really liked it despite the drivers being potato. But as the drivers matured, and that happened much faster than it did for AMD, the A770 got really good. I now have a partner card from ASRock, their version of the A750, in a modern AM5-based computer. They overclock it to 2,000 MHz out of the box, and it works well. I'm getting 98% of the performance I've seen from the A770, and that's amazing for the cost. I'm keeping both of these cards around to see where Intel takes them as the drivers mature. Competition is a good thing!
Wait, what? The ASRock A750 is almost on the level of the A770 LE?
@@jelly8594 Yep, it's factory overclocked 200 MHz over even the A770. It's pretty darn fast!
@@dragon2knight dang 😄
You earned a Sub well done & Thank you, great info very helpful
The real question is, can it compute?
Retest with the latest driver, 4146. The A750 is more on par with the 3060 now, and the A770 now ties and even beats a 3070 in some games.
Currently testing, and I am finding some interesting results!
A video on underpowering the card? I'm down.
Will definitely be putting the card back into my main build then!
Problem is it's a $200 low-end card for $400, and the fact it's glued together doesn't leave much room for renewing the heatsink in a few years; they clean-sealed the board. And why would a modern card on a level comparable with an RX 6600 draw so much more power? Its power draw for its performance is on par with the two-generation-old GTX 1070.
This guy knows what he is doing.
My A750 refuses to play Left 4 Dead 2. I can start the game, but it won't load into a map. It just crashes.
nice review, I find these Intel GPUs fun and interesting
I've heard they're gonna make 'em 250 bucks 😳
Around $200-220 now. I picked up an Arc A770 for $200.
Does the ARC run better with an Intel system?
I would generally suspect so, however any AMD chips supporting Smart Access Memory and PCIe 4.0 should theoretically perform just fine!
Do you feel like the software is holding it back??....considering how strong the actual hardware is
It could be a mix of both tbh.
Hoping for an underpowering-the-card video!
Btw, is there any platform to report bugs or errors to Intel?
I am using an A750, and I am happy with it.
However, there are still some bugs/errors/optimization issues that it would be great to report back to their team.
Great video , thanks 👍
I picked up the A770; it's okay. I switched from a 2060. I'm not a big gamer, so for me it's good since I mostly play CoD, which has not given me any problems.
What version did you get, 8GB or 16GB?
@@BrunoFerreira-fp1vb 16. I returned it lol 😂. After using it for a while I realized it's lacking; my 2060 performed better. Instead I ordered a 4070 Ti.
So where is the comparison benchmark with Nvidia and AMD?
The Arc A750 is so expensive here in the PH, around $400. Idk what to say.
How are you getting the bottom two graphs in the Intel telemetry?
Hit ‘Alt + i’, and it will bring up the Intel Arc overlay. Once there, navigate to the telemetry tab, and bring up the advanced/more options (I can’t remember the exact wording in the UI atm), and it should allow you to check whether or not it’s included in the telemetry overlay in the first column, and then the second column has a check mark that will allow you to graph it over time!
*Does it have DLSS enabled in games?*
No, but it has Intel’s XeSS!
@@ProceuTech *It is similar in terms of adding FPS in terms of frame generation?*
No unfortunately
Amazing video. Btw, don't forget to timestamp it to help viewers see what you cover and offer.
Done!
Incredible that you can get great 1440p gaming performance for as little as $230!
You said at one point "7-core CPU". I have never in my life heard of a 7-core CPU. Other than single-core chips, the number of cores is always an even number.
Yeah, I was more illustrating a comparable configuration in terms most people would be familiar with. I agree that there has never been a 7 core CPU, but the point was to show that disabling a render slice doesn’t make the card incapable! Hope you understand!
@@ProceuTech Fair enough.
It's not always an even number. The AMD Phenom X3, for example, had 3 cores. IBM made a 3-core PowerPC CPU. The PlayStation 3 had 7 usable cores (the 8th was disabled to increase yield).
Hi all. Users of the Arc A750: I bought the A750 and all was good. I installed the card in my PC, connected the DisplayPort cable, and bam, it worked. But after I installed the drivers, boom, no signal on DisplayPort. I tried a lot of things, but nothing worked. HDMI works but limits my monitor's Hz. I decided to send it back. Did I do the correct thing?
What GPU are you going to go with instead?
@@ProceuTech Idk. Maybe an RX 6650 XT? Is it okay?
@@ThePushuk The 6650 XT would offer a better plug and play experience!
@@ProceuTech But won't the RX 6650 XT be slow with only a 128-bit bus, versus the Arc's 256-bit?
The 6650 XT also has a large L3 (Infinity) cache to supplement its memory bandwidth. The 6650 XT will struggle more at 4K, but these cards are targeted more at 1080p.
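On the 128-bit vs 256-bit point: raw bandwidth is just bus width times per-pin data rate, which is why the cache matters so much for the 6650 XT. A quick back-of-envelope sketch; the spec figures are approximate and quoted from memory:

```python
# Hedged sketch: raw memory bandwidth from bus width and per-pin data rate.
# Spec figures below are approximate, from memory; cache hit rates are not
# modeled, and that's exactly what narrows the real-world gap.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Raw bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

arc_a750  = bandwidth_gbs(256, 16.0)   # 256-bit GDDR6 @ ~16 Gbps
rx_6650xt = bandwidth_gbs(128, 17.5)   # 128-bit GDDR6 @ ~17.5 Gbps + L3 cache
print(arc_a750, rx_6650xt)  # → 512.0 280.0
```

The wider bus wins on paper, but at 1080p the 6650 XT's cache absorbs much of the traffic, which fits the "struggles more at 4K" observation.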
The 3060 actually uses more power despite any numbers stated.
why do you use 2 cables for the GPU?
It's a power-delivery thing! Each 8-pin is only rated for 150 W total, and that stays true even if you are using a daisy-chained 6-pin. Using both cables ensures the card gets the most power possible!
What was the Intel driver version tested in this video ?
Driver version 31.0.101.3959
Heyy, can I pair this with a Ryzen 5 5600X and an Asus ROG Strix B550-F Gaming WiFi II motherboard? I am planning to buy it, please reply.
Yes you can! Just make sure to enable Smart Access Memory!
What do you mean "in 2023"? It came out four months ago.
And the video is about how it performs in 2023; what's the issue?
@@OethRust That it's too new to perform differently? Except for drivers, which he does not include?
Yup, for the price. I just got my RX 6700.
I wonder what was the reasoning behind calling desktop graphics cards “discrete” - those are big and clunky, sometimes noisy, there is nothing discrete about them 🤔 Integrated gpu is more discrete than this 😅
I put the ARC A750 up against an RTX 4080.... and I'm blown away by the results.
The 4080 crushed the A750, 200 fold 💁♂
I must say testing on Medium Settings is kind of ...meh. Nobody buys a new card and then sets it on medium (except for competitive games)
The main reason I chose to run at medium settings is because this card just doesn’t have the horsepower to run with settings absolutely maxed. Some settings it can run maxed, like textures, geometry, AA, and Anisotropic Filtering- but shader quality and particle effects had to be lowered to provide somewhat consistent performance. I 100% should have mentioned this in the review!
@@ProceuTech I see where you're coming from, but comparison with other benchmarks is quite impossible with medium-graphics fps, as almost everybody uses high or ultra. But nevertheless, thanks for the hard work. Much appreciated! 👋 Subbed
wonderful video
RTX 3060 vs Arc A750: Minecraft RTX and Java SEUS PTGI comparison
Who wants to play bf2042 😂
first
second
@@ProceuTech do i get a free gpu now?😅
I wish 😭
@@ProceuTech 🤣
Arc is behind and buggy; a death knell.
Funny, I've been using one for over a month now. The launch drivers were pretty bad, but the current drivers haven't given me any real problems. I use DXVK with my Nvidia cards anyway, so I didn't even have to install anything extra; titles that might have shown issues just worked.
The A770 is getting closer and closer to 3070 Ti performance in more and more things. In a couple of games a buddy of mine plays a lot, it's within spitting distance of his 3080, which caused him to sell the 3080 for 2.5x the cost of the Acer A770 16GB, and let him run two of those games at settings higher than his 3080 could, since the A770 has the VRAM to allow it when the 3080 doesn't.
The hardware seems very solid so far; drivers are a work in progress.
Still glad I got this over the equally priced AMD and Nvidia options; it's already outgrown them in the titles I play. The higher power limit is nice as well: 2,500-2,800 on the core with a simple dirty 20/45/305 overclock gets amazing results in my experience.
These seem to be selling well, despite the fact that anybody buying one knows they're going to deal with driver teething issues.