Arc A770 & A750 Performance After 6 Months

  • Published: 22 Aug 2024

Comments • 372

  • @titan_fx
    @titan_fx Год назад +460

    Glad that Intel keeps optimizing this card. At least they are trying to make a competitive product in the not-so-competitive GPU market we have today.

    • @toby1248
      @toby1248 Год назад +12

      They are optimising the driver rather than the card specifically. Most of it will carry over to the Battlemage cards

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat Год назад +21

      Battlemage is gonna be fun.

    • @Antagon666
      @Antagon666 Год назад +1

      *fixing

    • @frankwainwright7826
      @frankwainwright7826 Год назад +4

      @@Antagon666 Not fixing, technically; it was never broken. The driver is just new, and specifically for DX9 stuff they found a better way to translate it to DX12.

    • @KEYBEATZ
      @KEYBEATZ Год назад +1

      They still have a little ways to go, but they're almost there

  • @CNC295
    @CNC295 Год назад +282

    My A750 paired with a Ryzen 5600 and 32 gigs of RAM has been doing just fine. Driver updates help immensely.

    • @lclnbm
      @lclnbm Год назад

      PCIe 3 is holding you back

    • @kodakblack531
      @kodakblack531 Год назад +21

      @@lclnbm The 5600 supports PCIe 4.0, but he needs a motherboard with it; the 5500 and below are PCIe 3.0 only. Edit: also, CPUs and GPUs that lean on system RAM see big gains in fps with faster RAM, even on Ryzen; 3600MHz CL14 with B-die is going to offer more fps on the CPU and GPU. HP V10 is good B-die, or you can buy used to save more; 3200 CL14-14-14 is good RAM and can do 3600 CL14 / 4000+ CL15.

    • @dicknr1
      @dicknr1 Год назад

      Imagine the stupidity of a person skipping superior AMD cards for an Intel card in an AMD build... AMD launched all these GPU-CPU compatibility modes and options if you pair them, and your ignorance decides to buy an Arc. Imagine this stupidity. A 6000 series is cheaper and better, with even more gains. This proves how stupid buyers are nowadays.

    • @bocahdongo7769
      @bocahdongo7769 Год назад +2

      @@lclnbm The 5600 supports PCIe 4

    • @dennisjungbauer4467
      @dennisjungbauer4467 Год назад +1

      @@lclnbm I don't think so, at least that's not the case for other GPUs - is this something special with Arc? I only know of the ReBar benefits/requirement, but that's something else. Also, like the previous commenter said, his CPU does support PCIe 4.0 - without mainboard info, you don't know if his system is PCIe 4.0 capable or not.

  • @ElladanKenet
    @ElladanKenet Год назад +136

    I love seeing how much the Intel cards have improved. I had already bought a 3060ti when these came out, and I really had no reason or need to buy the cards, then or now. But I have an upgrade window coming up in a few years, and I'm eager to see how Battlemage turns out! I just might make an all-Intel machine!

    • @PoulWrist
      @PoulWrist Год назад +19

      Unless Intel CPUs get more efficient and less power hungry we might end up with builds with AMD CPUs and Intel GPUs...

    • @VaginaDestroyer69
      @VaginaDestroyer69 Год назад +3

      @@PoulWrist Ironic.

    • @Iheartlailussy
      @Iheartlailussy Год назад

      Good luck with the Vram my friend!

    • @ElladanKenet
      @ElladanKenet Год назад +1

      @Wetymanys I'm hoping I can hold out. HL was the first game I had to turn down settings for. Pretty sure Star Wars Jedi will also

    • @Curiouzity_Omega
      @Curiouzity_Omega Год назад +1

      @@VaginaDestroyer69 A poetic irony that could be beautiful.

  • @TerribleatTechnology-fn3qz
    @TerribleatTechnology-fn3qz Год назад +76

    I just installed the A770 in my system. Purchased at Newegg for $349 with a bundle of free software and games. I am impressed atm. I would love to see Iray support. The AV1 was important to me, and the synchronized partnership with Blackmagic and Magix was also important to me. I am a content creator and see those partnerships helping me along. I would recommend this GPU to others atm. FYI, MSI Afterburner will OC this card now.

    • @hotdogsarepropaganda
      @hotdogsarepropaganda Год назад +9

      Oh shit, I didn't even realise any of this and I've been following Arc for a while! Great comment, thanks for the info!

    • @700mobster
      @700mobster Год назад

      Does MSI Afterburner have an auto-overclock function? I kinda miss that from ASUS GPU Tweak when I had my 1060.

    • @terribleatfishing
      @terribleatfishing Год назад +1

      @@700mobster They do, but not for the Intel cards.

    • @700mobster
      @700mobster Год назад +1

      @@terribleatfishing dang...

  • @ScreamingTc
    @ScreamingTc Год назад +38

    I never, in a million years, ever thought I'd be rooting for an Intel product, but this is the strange, strange world we live in.

    • @lordshitpost31
      @lordshitpost31 Год назад +4

      I never thought I'd be this impressed with a new product that had crap drivers on launch, but here we are :D

    • @maxjames00077
      @maxjames00077 Год назад +1

      May i ask why you didn't think you would be rooting for an Intel product? (just curious)

    • @resik4866
      @resik4866 Год назад

      @@maxjames00077 Before AMD Ryzen released in 2017, Intel pretty much had a monopoly over the CPU market. They made absolutely no progress for years and overcharged for their CPUs by a tonne since there were no good alternatives. It wasn't until Ryzen that Intel actually bothered to innovate and improve their processors.

    • @maxjames00077
      @maxjames00077 Год назад +1

      @@resik4866 Yeah ofc, no company is going to put billions into R&D if you own the market anyway. Simple business logic.

    • @resik4866
      @resik4866 Год назад

      @@maxjames00077 Yeah fair enough, but people were rooting for every underdog that there was so there would finally be competition.
      Think of it like the GPU market, except there's no AMD or Intel, just Nvidia. Imagine Nvidia kept on increasing prices like they are now, but didn't innovate at all. Look at how much people dislike Nvidia right now, then imagine they didn't improve their GPUs at all. At least with Nvidia graphics cards are getting better.

  • @LouisKleiman
    @LouisKleiman Год назад +87

    Your charts should be restructured to put the before/after values directly next to each other. The comparisons are difficult to track.

    • @miweneia
      @miweneia Год назад +2

      True, I had to pause for like half a minute to figure out how tf to read this chart properly...

    • @shamadooee
      @shamadooee Год назад

      I read it fine honestly, but I think a good alternative would be to combine the avg and 1% lows into one multi-coloured bar, which is how I see most tech YouTubers do it.

    • @eddyr3691
      @eddyr3691 Год назад +1

      Yep, you have to consciously divert your natural tendency to compare directly adjacent bars. It is effectively misleading (unintentionally ofc).

    • @dennisjungbauer4467
      @dennisjungbauer4467 Год назад

      I'm fine with the low and avg order, that's common (and a different color is used), but I would like the launch results to come first and the newest ones after.^^ This is about showing progress, not checking for regression, I'd say, so the order shown is a bit unintuitive.
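
A minimal sketch of the layout these comments describe, using matplotlib with made-up placeholder games and FPS numbers (not PCWorld's results): each game's launch-driver and current-driver bars sit directly next to each other, with the 1% low drawn as a darker segment inside the average bar.

```python
# Sketch of the suggested chart layout; all values are placeholders.
import numpy as np
import matplotlib.pyplot as plt

games = ["Game A", "Game B", "Game C"]
launch_avg,  launch_low  = [60, 75, 48], [42, 55, 30]
current_avg, current_low = [78, 90, 66], [58, 70, 45]

x = np.arange(len(games))   # one bar group per game
w = 0.35                    # bar width

fig, ax = plt.subplots()
# Launch-driver results: the 1% low is overlaid on the average bar,
# so both metrics share one "multi-coloured" bar.
ax.bar(x - w / 2, launch_avg, w, color="lightgray", label="Launch avg")
ax.bar(x - w / 2, launch_low, w, color="gray", label="Launch 1% low")
# Current-driver results sit right next to the launch bars.
ax.bar(x + w / 2, current_avg, w, color="skyblue", label="Current avg")
ax.bar(x + w / 2, current_low, w, color="steelblue", label="Current 1% low")

ax.set_xticks(x)
ax.set_xticklabels(games)
ax.set_ylabel("FPS")
ax.set_title("Launch vs. current drivers (placeholder data)")
ax.legend()
plt.show()
```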

  • @jonathanrodriguez6161
    @jonathanrodriguez6161 Год назад +25

    So far I love my ARC 770 ❤

  • @bignishspcinsights1762
    @bignishspcinsights1762 Год назад +37

    Thanks for the last six months, Keith. From my experience of using an A770 since launch, I think the Arc driver is in great shape and XeSS is being added to an increasing number of titles, which is great news, but Intel Arc Control desperately needs some development horsepower thrown at it. I have given up installing it since it stopped being an overlay, as basic features like the ability to turn off specific performance tiles do not work... I just install the driver updates for now. Fingers crossed Arc Control will be given some much-needed love and attention in the months ahead.

  • @Jacob-hl6sn
    @Jacob-hl6sn Год назад +18

    At least the A770 has a 16GB version, which has enough VRAM for max settings in new games, unlike most Nvidia GPUs.

    • @vasopel
      @vasopel Год назад

      hm...are there actually people that play at max settings?

    • @Jacob-hl6sn
      @Jacob-hl6sn Год назад +4

      @@vasopel yes everyone with a good pc

    • @vasopel
      @vasopel Год назад

      @@Jacob-hl6sn ? A good PC can't handle max settings in new games! Only a VERY good gaming PC can handle it... and no one (in his right mind) is gonna spend the amount of money needed for that very good gaming PC ;-)
      so who cares about max settings since (almost) no one uses them? :-)

    • @Jacob-hl6sn
      @Jacob-hl6sn Год назад +4

      @@vasopel whatever you want to believe man

    • @Im_SSJay
      @Im_SSJay Год назад

      cant even break 100fps on mw2

  • @kurtwinter4422
    @kurtwinter4422 Год назад +31

    I have a 5800X3D with an A770 and it's absolutely glorious.

    • @Trololol3973
      @Trololol3973 Год назад

      You're doing it wrong, mate.
      Buy an Intel CPU and Nvidia GPU.

    • @0xBlez
      @0xBlez Год назад

      ​@@Trololol3973 You're stuck in marketing bs. Open your eyes

    • @INTJ791
      @INTJ791 Год назад

      @@Trololol3973 Bad attention-seeking troll; there you go, one attention.

  • @Obie327
    @Obie327 Год назад +33

    I'm very happy with my Arc A770 LE 16GB. Plays all my games with the next-level DX12 Ultimate/Vulkan and XeSS features. Can't wait for DX11 and improved performance updates. Thanks for keeping Intel's option on your radar.

  • @RANDOMNATION907
    @RANDOMNATION907 Год назад +19

    The thing I like most about the A770 LE is its appearance. It just looks grown up, refined and mature. I sincerely hope that Battlemage GPUs retain this design feature.
    The performance exceeds my current needs. The 16GB of RAM will likely serve its owners well as time marches on.
    Yes! I very much like the A770 LE.

  • @Hughesburner
    @Hughesburner Год назад +5

    I worked for Intel for 10 years in the fab, pumping out wafers, from Prescott to Haswell. It's great to see them in the game finally. I have been a lifelong Nvidia fan.

  • @Starwing700
    @Starwing700 Год назад +19

    I got an Intel Arc A770 a week ago, and I was honestly pretty surprised by how well it performs in most games I have tested. Compared to my last GPU, I can now actually use max settings in games, and sometimes ray tracing, and get more than 20 fps. Overall, I would say it performed way better than I expected and that I made a pretty good choice.

    • @maxjames00077
      @maxjames00077 Год назад +1

      Exactly same story here, don't regret buying the A770 at all!

    • @whoisanarnb
      @whoisanarnb Год назад

      Does it do well with older games like hl1 and tf2?

  • @robertlawrence9000
    @robertlawrence9000 Год назад +24

    I'm happy to see Arc improving, and the technology is really promising given what it has accomplished so far. I really hope they keep it up and make new generations, as it's giving gamers more options. If they improve little by little every generation, they will have gamers more and more interested in using Intel GPUs. If building a lower-end system, I definitely would consider them. I hope they start aiming higher-end as well, with great features that are important to gamers at very compelling prices. Nice job, Intel team! Keep it up!

  • @AdamNeal
    @AdamNeal Год назад +2

    Three months in with my Arc A750 and i have no complaints. I've cut and rendered 4k videos, worked on countless large Photoshop files, and it plays all of my games without any problems. I can't recommend Intel enough

    • @user-xg2zh6hk7e
      @user-xg2zh6hk7e Год назад

      What GPU did you have prior and what was your experience with them?

    • @AdamNeal
      @AdamNeal Год назад +1

      @@user-xg2zh6hk7e I had an Nvidia 1650 Super and it was okay, but I would occasionally notice dips in some games and the occasional Photoshop update bug complaining about drivers. I don't hold that against Nvidia though, because Adobe has become lazier with their coding over the past few years.

  • @greenman8
    @greenman8 Год назад +21

    I really hope Intel continues with a second generation, improving on the initial attempt. It would be great if Intel was able to improve to the point of making it a viable option.
    Regardless of how great they make it perform, one thing would hold me back from jumping on board:
    Make the next set of cards serviceable. Expect the enthusiast to clean and re-paste the GPU heatsink.

    • @ovemalmstrom7428
      @ovemalmstrom7428 Год назад +3

      You also got cards from Acer and ASRock that are much easier to service.

    • @lordshitpost31
      @lordshitpost31 Год назад +1

      You don't want to re-paste a new GPU but when you do, it's not that hard to teardown.

    • @orenJF
      @orenJF Год назад +2

      That's why I bought the Acer Predator version. It's very easy to open (see Gamers Nexus), and it was the same price as the Intel version here in Ireland, direct from Acer.

    • @greenman8
      @greenman8 Год назад

      @@lordshitpost31 This particular one is molded plastic, no screws.

    • @lordshitpost31
      @lordshitpost31 Год назад

      @@greenman8 What? Is that a thing? I took off my back plate hoping it'd help dissipate some of the vram heat and there were screws just like the card on Gamers Nexus

  • @Themanthatisi
    @Themanthatisi Год назад +7

    I am using an Arc A770 with a 13th-gen i7, DDR5, and a Gigabyte Z790 mobo... I have been very happy with its performance, and so far it's been stable.

  • @Saabjock
    @Saabjock Год назад +6

    I love that A770 16GB card I bought in mid-December.
    The only racing title I could not run at installation was rFactor 2. It simply would not launch.
    Every other title was fine... very smooth and stable.
    With the 4090 beta driver release, rFactor 2 started working. It too has been flawless.
    My only other issue encountered was that I could not use the .exe to install drivers.
    The installer would start compiling a list of hardware, then crash, requiring driver installs from a .zip archive.
    That has now been fixed with the 4311 release.

  • @gamesushi
    @gamesushi Год назад +7

    I have the Intel A770 LE. Upgraded from a GTX 970 and it's been great. Even though I usually play on my 144Hz monitor at 2560x1440 in most games on high/ultra (Last of Us High 80fps, Atomic Heart Ultra over 90fps, Cyberpunk High 96fps average, Hellblade High 115-120fps), in well-optimized games like Atomic Heart or Hogwarts I can play on my 32:9 at 5120x1440 high/ultra and still get over 60fps. With 16GB of VRAM the A770 is already aging like fine wine.

    • @TheDado512
      @TheDado512 Год назад

      Whats your cpu?

    • @gamesushi
      @gamesushi Год назад

      @@TheDado512 5600X so nothing crazy. If you want to see the Cyberpunk and Hellblade examples check out my youtube. Haven't made videos of the other ones yet. But I will get to them eventually.

    • @GUNNYCANUCK
      @GUNNYCANUCK Год назад

      @@gamesushi I'd love to know how Star Citizen and Stalker GAMMA run with the A770 if you perhaps ever play them.

  • @lc9245
    @lc9245 Год назад +7

    Compared to the 6600 XT, the A770 has some cool trade-offs. It has on-par performance in newer games, less performance in older games, and much better RT performance at much higher power draw. The Limited Edition also has 16GB of VRAM, so there's headroom for the future. Its improvements have made the midrange choice very intriguing. While it is slightly pricier than the 6600 XT in some places, the 16GB of VRAM, better RT, ability to run AI applications, and AV1 make it quite future-proof. If you game all the time then the power draw might be a bit much, but the aforementioned advantages might mean it can last for up to 5 years for new RT games at 1080p and 3 years at 1440p. It's not that bad. The 3060 Ti has much more raw performance but only 8GB of VRAM, which means it might not be that good for future games. In any case, Nvidia card prices have not been dropping that much, so its realistic competitor is the potentially much cheaper 6600 XT. For my part, I went with the 6600 XT because I frequently game on older titles, so something quiet and efficient is preferred. Still, I look forward to their next generation.

  • @SPG25
    @SPG25 Год назад +5

    Just got my A770 and gotta say, it's a huge improvement over my 2060 6GB. Did test runs in all the games I play and no issues. I love this card, and I'm excited for Battlemage. Hopefully Intel's efforts bring some stability to the GPU market, but either way, always go for the best deal that works for what you want to do. As for me, I love this A770.

  • @UNDERGROUNDHITRADIO
    @UNDERGROUNDHITRADIO Год назад +1

    Awesome! I just got an A750, I get it tomorrow! Can't wait!

  • @tosc5277
    @tosc5277 Год назад +9

    Would have liked to see more benchmarks of older titles, since this was the main point of Arc's struggles, but otherwise a good video.

  • @bryanwages3518
    @bryanwages3518 Год назад +3

    My biggest request for them is to add an fps counter to the overlay like AMD has.

    • @lordshitpost31
      @lordshitpost31 Год назад

      I think there is one, enable the gaming overlay from Arc Control

    • @bryanwages3518
      @bryanwages3518 Год назад +1

      @@lordshitpost31 There isn't. The overlay is nearly identical to the AMD one, but AMD added an fps counter to theirs.

    • @lordshitpost31
      @lordshitpost31 Год назад

      @@bryanwages3518 Yeah well, hopefully with the next driver pack.

  • @TechGuyBeau
    @TechGuyBeau Год назад +3

    What's up with those Cyberpunk numbers though? They did very recently massively improve it (see my XeSS vs FSR video).
    I suspect you tested just before the fix, as now it runs closer to 60 fps at 1440p ultra even in the worst areas?
    Probably that. Hard to sneak in every huge improvement when they are putting out bi-weekly updates that can't totally fix performance in one game.

    • @pcworld
      @pcworld  Год назад

      Yeah, these numbers were run right after the Overdrive patch.
      -Adam

  • @MrJava1593
    @MrJava1593 Год назад +1

    My A750 just got here yesterday and it works just fine. Playing COD at 100+ FPS, paired with a Ryzen 5700... The hardware is great, and the drivers will take it to where it needs to be! 100% satisfied!

  • @falloutnewvegasboy
    @falloutnewvegasboy Год назад +5

    Very excited to see what Battlemage has to offer.
    I've always been a fan of unique/niche hardware and the A770 is enticing. So much VRAM for such a cheap card! I'm heavily considering upgrading to an Arc if Battlemage is worth it.

  • @svexee
    @svexee Год назад +3

    Thanks bud, gonna get one for my new rig. Should be decent enough until 5 series comes out.

  • @legrangedylandlg
    @legrangedylandlg Год назад +7

    Seeing these driver optimizations gives me hope for the next generation. They may shake up the budget-to-midrange market. Anything to derail Nvidia is good.

  • @arvindashok
    @arvindashok Год назад +2

    One thing I've noticed is that performance is pretty stable as long as you stay within the VRAM limit. If you try overloading it and pushing the VRAM, it tends to stutter. It also gets pretty hot.

  • @JagHiroshi
    @JagHiroshi Год назад +2

    We just got XeSS on CP77 ... my A750 is giving me a smooth 60fps. Love this card.

  • @ChrisP872
    @ChrisP872 Год назад +14

    I like my Arc 770 LE 16GB. Especially at the price I paid for a 16GB card.

    • @Strongholdex
      @Strongholdex Год назад +4

      Nvidia listen up.

    • @volvot6rdesignawd702
      @volvot6rdesignawd702 Год назад +1

      100%, I see the Arc A770 as the new budget king. Bought mine a few weeks back with plans to match it with a 13600K (currently paired with some leftover parts, a B550I and 5600X),
      but it works a treat, and eventually it'll be an all-Intel build, just as a fun build (my main is a 7800X3D and 7900 XTX).
      I can see real potential for a lot of budget gamers who refuse to (and personally shouldn't) buy 8GB rubbish!!

    • @ilnumero1234
      @ilnumero1234 Год назад

      I mean yes, but Arc is still not powerful enough to utilize its 16GB of VRAM.

    • @buggerlugz6753
      @buggerlugz6753 Год назад

      Whats this "Value for money" in GPU's in 2023? nah...impossible!!!

  • @Cipotalp
    @Cipotalp Год назад +3

    I have an Intel 13900K and Arc A770 16GB LE :) And I'm happy with it. A few years from now I will buy only Intel cards.

  • @joller79
    @joller79 Год назад +3

    It's actually getting better and better. Exciting future for the next-gen release; there is a small hope that price-to-performance for gaming will get better...

    • @ovemalmstrom7428
      @ovemalmstrom7428 Год назад

      @@istealpopularnamesforlikes3340 Has he used DDU to start fresh with the drivers? Maybe old drivers are causing trouble for him?

  • @Valthonia
    @Valthonia Год назад +2

    The latest update for Cyberpunk is incredible for the Arc series. Great XeSS 1.1 support is one thing, but even without it performance has jumped by more than a quarter or so.

  • @Devilion901
    @Devilion901 Год назад +19

    This card will probably age better than a 4070

    • @buggerlugz6753
      @buggerlugz6753 Год назад +3

      Probably more Intel cards have sold than the 4070 already! :)

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator Год назад +1

      No possible way, it doesn’t have the horsepower.

    • @tvdootman27
      @tvdootman27 Год назад +3

      @Cybernetic Argument Creator Yes possible way.
      16GB of VRAM (instead of 12), and the drivers still have improvements to be made.

    • @82_930
      @82_930 Год назад +1

      @@CyberneticArgumentCreator it does though

    • @fathammy
      @fathammy Год назад

      It doesn't have the compute but I'm getting this as a stopgap because of the garbage 4070

  • @DJ-fw7mi
    @DJ-fw7mi Год назад +18

    For some reason I really want to build a 13500 and a770. Seems like a reasonable pair for 1440p.

    • @ovemalmstrom7428
      @ovemalmstrom7428 Год назад +4

      @@xCDF-pt8kj Well: XeSS, AV1, better RT support, better for content creation, and of course 16GB of fast GDDR6 VRAM. Raw rasterisation isn't all to look for. Don't get me wrong, the RX 6750 XT is a very good gaming card, but not the card for everyone.

    • @kissehfodasso
      @kissehfodasso Год назад

      LOL

    • @leonro
      @leonro Год назад +1

      @@xCDF-pt8kj The A770 in your country is too expensive then, in the end the best GPU changes depending on the market you're in.

    • @volvot6rdesignawd702
      @volvot6rdesignawd702 Год назад +2

      I'm in the process of doing just that. Bought the Arc A770 a few weeks back, and at the moment it's paired with a B550I and 5600X, but my goal is a B660 with a 13600K (just as a fun build, as my main is a 7800X3D and 7900 XTX).
      I'm quite impressed with the Arc A770, quite a solid, nice-looking little GPU for the price.
      I would highly recommend it over any of the old and upcoming 8GB rubbish!!

    • @ovemalmstrom7428
      @ovemalmstrom7428 Год назад

      @@volvot6rdesignawd702 couldn't agree more 👏

  • @Lancaster604
    @Lancaster604 Год назад +1

    That moment when you realize Linus Tech Tips' pivot towards more data-driven videos is just catching up to PCWorld in a way lol

  • @ThatChannel48
    @ThatChannel48 Год назад +1

    I've been happy with my A750 as my daily driver. A few months ago I still had a couple of games that wouldn't run, but they've been fixed and now I barely think about it.

  • @LucidStrike
    @LucidStrike Год назад +1

    Ironically, the fact that reviewers keep focusing on the same games hides the main concern with ARC: Older games. How are the drivers for DX11, DX 9 titles?
    Constantly testing Cyberpunk and such when they were never REALLY a problem for it is a lot less interesting.

  • @lordshitpost31
    @lordshitpost31 Год назад +3

    I for one didn't have ANY problems with the A750, but I know some people did, and from the Reddit sub I know their problems were being fixed with the utmost speed. Best bang-for-buck GPU out there on the market; would definitely recommend.

  • @ARH0101
    @ARH0101 Год назад +5

    I have a 5800X3D and an A770 16GB LE and I have no issues at all. A 256-bit bus card for under $400 is a nice change.

    • @damara2268
      @damara2268 Год назад +1

      And most importantly, 16GB.
      This card already performs better than the RTX 3080 in games that need more than 10GB lol

  • @arky9002
    @arky9002 Год назад +2

    Intel needs to invest more in XeSS and its adoption in the industry! For me it is the most underrated feature of the cards (since so few games support it).

  • @FinneyDale
    @FinneyDale Год назад +2

    Thanks for your comment on ReBar. I own a 'vintage' PC that still mostly does what I need. For that reason I'm looking for a budget GPU, since my system will likely bottleneck a more expensive card. Seems like my best option for a new card is an RX 6600, although I guess I'll take a small hit on the PCIe lanes, and I was interested in Intel for that reason. Suggestions?

    • @pessoaanonima6345
      @pessoaanonima6345 Год назад +1

      No, if it's old it probably doesn't support Resizable BAR, which hugely impacts the performance of Intel GPUs. If you have PCIe 3.0, the performance impact on a 6600 will be really small. If you have PCIe 2.0 or 1.0, get an older AMD GPU or an Nvidia one.
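
Since Resizable BAR keeps coming up in these threads: a rough sketch of one way to confirm it's active, assuming a Linux system with lspci installed (the full capability list usually needs root to be visible). On Windows, Intel's Arc Control also reports ReBar status.

```python
# Rough sketch: look for an active Resizable BAR capability by parsing
# `lspci -vvv` output on Linux. Assumes lspci is installed; run as root
# so the capability list isn't hidden.
import re
import subprocess

def rebar_report() -> None:
    out = subprocess.run(
        ["lspci", "-vvv"], capture_output=True, text=True, check=True
    ).stdout
    # lspci prints one block per device, separated by blank lines.
    for block in out.split("\n\n"):
        lines = block.splitlines()
        if not lines:
            continue
        first_line = lines[0]
        if "VGA compatible controller" in first_line or "3D controller" in first_line:
            has_rebar = "Resizable BAR" in block
            # On a ReBar-enabled GPU the capability lists large current BAR
            # sizes, e.g. "BAR 2: current size: 8GB".
            sizes = re.findall(r"BAR \d+: current size: ([^,\n]+)", block)
            print(first_line)
            print("  Resizable BAR capability:", "yes" if has_rebar else "not reported")
            if sizes:
                print("  Current BAR sizes:", ", ".join(sizes))

if __name__ == "__main__":
    rebar_report()
```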

  • @mattvanelli817
    @mattvanelli817 Год назад +1

    Awesome video, man. I've been starting to look into hardware for a new PC. My old 970, while still kicking ass, is starting to show its age. I want so badly for Intel Arc GPUs to work out and be great; I've been looking into the A770 as my new machine will be a little more of a budget build. This video helped lots.

  • @itsdeonlol
    @itsdeonlol Год назад +6

    A770 is way better now!

  • @jaredbush1866
    @jaredbush1866 Год назад +1

    A770 16GB LE /w i5-12600k absolutely screams.
    Intel gained a fan here!

  • @ResidentWeevil2077
    @ResidentWeevil2077 Год назад +1

    I have an Arc A770 LE 16GB card as my daily driver, and am patiently waiting for Battlemage to drop later this year. Intel has made great strides to make their cards better, and I look forward to future GPUs from Team Blue. 😊

  • @ryanspencer6778
    @ryanspencer6778 Год назад +2

    There are still some bugs that keep Arc from being perfectly fine for normal users, at least on mobile. But performance, and especially stability, has come a long way, even since December.

  • @jpikspiks2067
    @jpikspiks2067 Год назад +2

    Wonder how the Arc A770 would do in Lumion 11?

  • @bookmark2846
    @bookmark2846 Год назад +2

    Need more competition in the GPU space. I hope someone brings Voodoo back to life!

    • @jacobkyle4573
      @jacobkyle4573 Год назад +1

      Nvidia bought all of their assets and IP back in 2001 so this is very unlikely.

  • @ojassarup258
    @ojassarup258 Год назад +3

    Honestly this is great, looking forward to Battlemage and I hope it shakes up the market more, Nvidia and AMD definitely need it.
    A750 and A770 need to still see price cuts though, it's a hard sell if AMD outperforms them in the same price bracket.

  • @JakeTaco83
    @JakeTaco83 Год назад +3

    I have an A770 16GB, and the only real complaint I have is the idle power draw. It spikes up or sits near or above 40W at idle, causing the GPU to heat up and the fans to kick on just a bit. It sits at about 48-50C at idle. I tried their recommended BIOS settings and various power consumption settings and nothing worked (although I read that it does work for the A750). Also tried using a 1080p monitor as opposed to a 4K TV and still the same issue. Under load, it stays very cool (under 70-75C), which is great. Just a small complaint. I don't leave this PC on when not gaming, but it could be an issue for some. This card has been crushing 1440p for me and does great with 4K in story games (especially in XeSS-supported games like Shadow of the Tomb Raider). I will likely never spend $500 or more on a GPU and it's nice to see a mid-tier card that is affordable and can handle higher resolutions.

  • @kmg501
    @kmg501 Год назад +1

    People and especially gamers should be ecstatic that Intel is pushing into this space.

  • @anasevi9456
    @anasevi9456 Год назад +1

    CDPR has since done updates that significantly improved Cyberpunk 2077 and Witcher 3 Enhanced performance on Arc cards, and also added XeSS. In the DX12/Vulkan era, as people stated way back in 2015-2016 but seem to have forgotten recently, with low-level APIs the ball of optimisation, even uArch-specific optimisation, is no longer firmly in the court of the GPU maker; game engine devs also now share responsibility. The performance of Arc cards will improve, even drastically, over time as more game developers take into account the fact that they exist too, not just GeForce and Radeon.

  • @SandyWhitmore
    @SandyWhitmore Год назад +2

    Thanks for the video series. I ended up buying an A750 last month based on it, for a 2nd computer using a SilverStone SG13 case I had sitting around. Fits perfectly and the performance is quite nice for the price. QoL-wise, I have some small issues like the screen flickering black when alt-tabbing while DRM video content is playing, but that could just as well be me installing the card on a non-fresh copy of Windows (although I did run both the previous card's built-in driver uninstaller and DDU).

  • @FlorinArjocu
    @FlorinArjocu Год назад +3

    I don't need 4090 levels of graphics performance, so in 3-4 years when I will probably upgrade my laptops (1 year old now), I hope that the next ones have Intel GPUs inside. I am sick of the current market leaders. PS: I expect feature and performance parity on Linux.

  • @ryue5626
    @ryue5626 Год назад +3

    You should do this for AMD and Nvidia cards so we can see if their drivers also improve performance over time.

  • @lastblueride5
    @lastblueride5 Год назад +3

    Don't get why Intel gets so much hate. They are trying to compete in a market where one greedy corporation just sets whatever price they want and people have no choice but to deal with it.

    • @ericp631
      @ericp631 Год назад +2

      Fanboyism. People don't just buy products as commodities anymore; instead they project onto the things they spend money on as self-identifiers. It's not just a card that lets you play games at higher fps, it's a lifestyle choice/gang affiliation/who you are pledging your unyielding loyalty and allegiance to. And so Intel or any newcomer is going to get hatred/backlash, as they are threatening people's "way of life" and the success of their "team".

    • @kazi1
      @kazi1 Год назад

      @@ericp631 fr

  • @hubtubby
    @hubtubby Год назад +1

    Really looking forward to see their future cards

  • @felixindahouse1
    @felixindahouse1 Год назад +5

    Hows the performance in CAD applications?

    • @user72974
      @user72974 Год назад +1

      Interested in this too. I game a bit, but I also like how good GPUs make my desktop environments feel fluid when doing things like lots of web browsing (e.g. YouTube), coding in IDEs, etc.

  • @peytonhoward
    @peytonhoward Год назад +2

    I almost bought one of these. I'm glad I didn't. My current 1080 Ti is a better performer, but I'm glad that Intel is continuing to optimize the card.

  • @hotdogbusinessman1172
    @hotdogbusinessman1172 Год назад +1

    Average frame time graphs and numbers would be a very helpful metric to track since there were a lot of stuttering issues on launch with these

  • @jamezxh
    @jamezxh Год назад

    Pretty impressive for an out the gate discrete GPU. I love the clean look of it too.

  • @Code_String
    @Code_String Год назад

    The one thing Intel needs to nail would probably be VR support. There is an opportunity to get market share with those cards. The A770 is such a wonderful card at its price point.

  • @Wild_Cat
    @Wild_Cat 11 месяцев назад +1

    Can you do another check-in right now? It's September 2023.

  • @rarigate
    @rarigate Год назад +1

    This card has insane hardware for its price; will soon get one of their next gen.

  • @Spookophonicstudios
    @Spookophonicstudios Год назад +1

    I like the improvements they've made, but when I was using mine I noticed it ran a tad too warm.

  • @WhiskeyInspekta
    @WhiskeyInspekta Год назад +2

    I have an A770 and I had to go back to my 5700 XT.
    I couldn't play my favorite games for shit. Halo MCC played like garbage, DX12 Battlefield 1/V was very jittery, forget Battlefield 4... I had to switch back. MW2 ran like a champ, but it wasn't worth giving up my gaming library for literally the 3 new games I own.

  • @shuntao3475
    @shuntao3475 Год назад +1

    I do not buy a new graphics card to play DX11 and older games; why upgrade when your old card is probably just fine? So I am happy that Intel built the card for DX12. As for my daughter's Arc A770, we could not be any happier. I scored a 16GB model for under $300 in February, and it has been close to flawless since. My daughter's focus is on Hogwarts at 1080p, and the game shines.

    • @ironman-fg3jh
      @ironman-fg3jh Год назад

      What are you smoking? Backwards compatibility is the main reason we play on pc !!

  • @l.i.archer5379
    @l.i.archer5379 Год назад

    I bought and tested the ACER Predator Bifrost 16GB Arc A770 awhile back, and it ran hot while benchmarking with Heaven - it got all the way up to 89 deg C and was still climbing when I stopped the app. I just bought an ASRock version yesterday and ran Heaven on it. This card has 3 big fans (something that the Bifrost doesn't), and never went over 79 deg C running Heaven. And with the latest Intel drivers, it absolutely beat my RX 6700 XT. I predict this card is going to be on par with the RTX 4070 if drivers keep improving!! AND, at only $325 (after tax from MicroCenter), it's certainly got the best bang for the buck, in my honest opinion.

  • @user-du8yt9gv1m
    @user-du8yt9gv1m Год назад

    Go Intel!!! Very hyped for your Battlemage...

  • @pazitor
    @pazitor Год назад +2

    The price and the memory make the Arc A770 16GB my next upgrade, that or an upcoming Battlemage.

    • @lordshitpost31
      @lordshitpost31 Год назад

      An A770 plus is also rumored

    • @clayman71497
      @clayman71497 Год назад +1

      Yeah, the refresh of Alchemist is coming later this year with the Arc A770+, A750+, etc., but Battlemage is coming in early 2024, and the flagship Battlemage GPU is supposedly on par with the 4070 Ti to 4080, at least according to leaks. I'd save my money and wait for Battlemage, but it'll be interesting to see how Alchemist+ performs; the A770 is pretty equal to a 3060 Ti, so the plus model might be 3070 tier, possibly 3070 Ti.

  • @notrixamoris3318
    @notrixamoris3318 Год назад +1

    Older games, sir, like StarCraft, Diablo 2, Fallout, I Have No Mouth and I Must Scream, etc. And please do this for one whole year, because we need to monitor Arc, because the 4070 is overpriced with or without inflation...

  • @rickjames9866
    @rickjames9866 Год назад +1

    I bought an Arc A770, it's a great card.

  • @ishantripathi9707
    @ishantripathi9707 Год назад +1

    The reason I chose an AMD GPU over an Intel GPU is power consumption. It's the one thing pulling Intel down for me right now.

  • @DarksurfX
    @DarksurfX Год назад +1

    Did they ever fix quicksync transcoding? How well does it work on AMD systems?

  • @MayaPosch
    @MayaPosch Год назад +1

    1:15 - You mean i740, not i750. Regards, a former i740 GPU owner :)
    (Yes, even S3 cards blew it out of the water)

  • @JellyYeen55
    @JellyYeen55 Год назад +1

    I have an RTX 3060 and I love it but I honestly wish I could have waited to upgrade and get the A750 because I absolutely love the design of Intels cards and I believe they will improve over time and it would have matched my Intel build lol

    • @lordshitpost31
      @lordshitpost31 Год назад

      For long term you should consider A770, also LEDs look sick.

  • @vladislavkaras491
    @vladislavkaras491 Год назад

    Thanks for the news!

  • @medievalfoot
    @medievalfoot Год назад +1

    I'm always cheering for the underdog. Hope they will be able to hit the high end in the near future. Nvidia is on a run with the 4090 and it's not looking good with no competition.

  • @pippin123ify
    @pippin123ify Год назад

    I bought one of these recently and have had no problems. It leaves my old (well, one-year-old) 3050 in the dust and is much cheaper than a 360/370. I'm impressed, though I do wish MSI Afterburner would introduce full support. Guess that will happen, and with driver updates I expect this to prove a great-value buy longer term.

  • @ScottGrammer
    @ScottGrammer Год назад

    I recently bought an Acer Predator BiFrost (A770/16GB) for $329. It's great for video encoding, OK for gaming, but it's having problems with Topaz Video AI version 3.2. Topaz crashes every time it runs. Version 3.1 ran just fine. I've been talking with Topaz tech support, and they're working with Intel to fix the issue. I sold my 2080 Ti because (1) I am growing very unhappy with the way Nvidia is treating the gaming community, and (2) I wanted the AV1 encoding of the Intel card. I began by buying a used A380 to add to my machine to test the encoding ability, and was blown away by it. But I needed that card slot back, so I sold the A380 and the 2080 Ti and got the Acer. Let's hope the driver situation continues to improve.
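
For anyone curious what tapping the card's AV1 encoder looks like outside of Topaz or Resolve, here is a rough sketch that shells out to ffmpeg's Quick Sync AV1 encoder (av1_qsv). It assumes an ffmpeg build compiled with QSV support, and the file names and bitrate are placeholders.

```python
# Rough sketch: hardware AV1 encode on an Arc card via ffmpeg's Quick Sync
# encoder (av1_qsv). Assumes ffmpeg was built with QSV enabled; the file
# names below are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "input.mp4",        # source clip (placeholder name)
    "-c:v", "av1_qsv",        # Quick Sync AV1 encoder exposed by Arc GPUs
    "-b:v", "6M",             # target video bitrate; tune to taste
    "-c:a", "copy",           # pass the audio through untouched
    "output_av1.mkv",
]
subprocess.run(cmd, check=True)
```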

  • @lincorecybergnida
    @lincorecybergnida Год назад +1

    Going to upgrade from my 1650 to an A750 in the next 2 months. Wish me luck.

  • @michaelthompson9798
    @michaelthompson9798 Год назад +1

    Good to see Intel improved its drivers and overall performance... but I feel that since a couple of months back (after the major driver performance update) improvements have been slower, based on reviewers' feedback.
    I'm just hoping Intel can get up into the ~$500 GPU market with Battlemage and offer a better-performing alternative, as Nvidia and AMD seem to have all but left this market, forcing customers to buy higher-priced models.

  • @redslate
    @redslate Год назад +2

    Intel's been 'Johnny-on-the-spot' with ARC driver updates. 😃

  • @UpTheAnte1987
    @UpTheAnte1987 Год назад

    I've had an A770 since December and have been waiting for VR support for the last 6 months. If it doesn't arrive in the next couple of months then a switch back to AMD or nvidia is on the cards.

  • @techieg33k
    @techieg33k Год назад +1

    I think if I needed a backup GPU I would snag one of these at these prices

  • @rj-zero1453
    @rj-zero1453 Год назад

    With my new computer Windows 10 Pro setup, I was waiting for the Intel Arc A770 to be released, and once it was I purchased one Jan 11th and put it in my system. It worked really well right off the bat with the newest drivers. But... some of the latest driver updates have been causing my system to lock up occasionally. Also, when using MS Teams for my work conference calls, the system would lock up during these meetings, and me having to reboot really quickly to get back to the meeting. I had a lock up today, even when not running the Arc Control Application during my Teams meeting and decided to do the long-awaited Windows 11 Pro upgrade. The Windows 11 update has been flashing on my system every week or two for some time now and I figured I would give it a try and see if my system works better with these latest drivers on Windows 11 rather than on Windows 10 Pro with this system setup.

  • @thomashasenecker5728
    @thomashasenecker5728 Год назад +1

    My first impressions of the GPU were good, but now I've noticed some problems. In Jedi: Survivor, shadows and ray tracing aren't displayed correctly. Also, in emulators the card doesn't work as it should.

  • @Em4gdn1m
    @Em4gdn1m Год назад +4

    is this the 16GB or 8GB A770? I'm torn between the A770 16GB and the RX 6750 XT

    • @unknownname3189
      @unknownname3189 Год назад +1

      Same here

    • @Em4gdn1m
      @Em4gdn1m Год назад

      @@unknownname3189 lol I'm still running my GTX 970, need to finally bite the bullet, but these prices are insane.

    • @pcworld
      @pcworld  Год назад

      Both are the Intel Limited Edition models.
      -Adam

    • @unknownname3189
      @unknownname3189 Год назад +1

      @@Em4gdn1m I know these prices are crazy, I want to get the A770 for the 16 gig and it's cheaper.

    • @unknownname3189
      @unknownname3189 Год назад

      @@pcworld Do you have a 6700 that you could compare against the A770?

  • @billchildress9756
    @billchildress9756 Год назад

    When I bought my A770 I did experience problems with DX9, until the update that fixed that issue. This card has only improved with further driver updates since. If these cards always get above 50 fps then it is still playable in my opinion, as long as the lows are not too bad. Most of the titles I play are around 145 fps, so I consider that good. My system is an MSI Gaming Plus X570 with a 5800X3D and 32GB of Corsair 3600 memory. ReBar has been enabled and the card has worked very well for me! I await Battlemage! At least Intel listens to us!!

  • @arielis78
    @arielis78 Год назад

    Battlemage can't come soon enough!

  • @soldiersvejk2053
    @soldiersvejk2053 Год назад

    Very happy with my A750.

  • @kolack100
    @kolack100 Год назад +1

    Arc A770 16gb wouldn't be so bad if it didn't cost the same as a RX 6700 XT.

  • @geekmachine666
    @geekmachine666 Год назад

    Hopefully ARC will live on, we need more players in the GPU market.

  • @cem_kaya
    @cem_kaya Год назад

    My friend bought an A770; we had to update everything from the terminal, since the Epic Games Store refused to launch, saying unsupported GPU, and reinstalling it didn't fix it. But after that, 99% of the games worked without a problem.

  • @NobbeChika
    @NobbeChika Год назад

    I love that the GPU has native drivers in the Linux kernel. No more need to be afraid that the system stops working after an update, like with Nvidia for example.

    • @dutchsailor6620
      @dutchsailor6620 Год назад

      Adding native PCI passthrough so Qemu/KVM VM's could run bare metal accelerated graphics would make this card stellar. They did this with their CPU's (only not for Linux).

  • @takashy87
    @takashy87 Год назад

    What about older titles? Can't recall if it was/is a DX thing, but most if not all reviews pointed out that the Intel cards are pretty good for recent/new games, while being pretty bad at running older games?