The Intel Arc A770 16GB - Budget 16GB GPU?

  • Published: 15 Jul 2024
  • // Join the Community Discord! ► / discord
    Is it worth considering the Intel Arc A770 if you're building a desktop computer in 2024? Well, both yes and no, but there seem to be some identifiable trends that may help you decide whether or not it's the right card for your machine!
    The Intel Arc A770 is a performance-segment graphics card launched as part of Alchemist, Intel's first generation of discrete graphics cards! Surprisingly though, Intel has been in the GPU business for decades, and you have probably used one of their GPUs integrated into a CPU. Counted by units shipped, Intel graphics make up the majority of graphics solutions sold each year. But can that integrated-graphics expertise translate to the high-performance, high-power-budget desktop discrete market? Well... surprisingly, yes and no...
    Test System Specs:
    - Intel Core i7-14700k, 5.6GHz-P, 4.3GHz-E
    - Corsair H150i 360mm AIO
    - Gigabyte Z690 Aorus Elite AX DDR4
    - 64GB 3600MT/s G.Skill DDR4
    - Intel Arc A770 16GB Sparkle Titan OC
    - Timetec 1TB PCIe 4.0 for games
    - Samsung 970 Evo 1TB PCIe 3.0 for OS/Boot
    - 650W EVGA PSU
    - Resizable BAR Turned On for Testing
    A GPU is a separate bank of ALUs and FPUs designed to offload heavily parallelizable calculations from the main CPU onto a dedicated chip. The graphics card refers to the entire sub-system: the PCB, the memory, and the cooling solution.
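    To make the offload idea concrete, here is a minimal sketch of handing a parallel workload to the GPU, assuming Python with PyTorch installed (my choice of illustration, not something used in the video). "xpu" is the device name PyTorch uses for Intel GPUs such as the A770, and the snippet falls back to the CPU if no Intel GPU stack is present.

    import torch

    # A heavily parallelizable workload: one large matrix multiply.
    a = torch.rand(4096, 4096)
    b = torch.rand(4096, 4096)

    # On the CPU this runs across a handful of cores.
    c_cpu = a @ b

    # On a GPU the same work is spread across thousands of ALUs/FPUs.
    # "xpu" targets Intel GPUs (requires a PyTorch build with XPU support);
    # "cuda" would be the NVIDIA equivalent.
    device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"
    c_gpu = (a.to(device) @ b.to(device)).to("cpu")

    # Same numbers, different silicon.
    print(torch.allclose(c_cpu, c_gpu, atol=1e-3))
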
    Building a Budget PC can be tough. Not only are GPUs and CPUs incredibly expensive, they can also be hard to find on a budget... But there are tips and tricks to finding your dream Budget GPU and pairing it with a CPU that will give you the performance you want!
    Also, if you're reading this far -babooshka B)
    Have a Great Day!
    - Proceu
    Timestamps:
    0:00 Intro
    0:30 Preface
    1:01 Test Bench Specs
    1:58 Apex Legends
    3:01 Crysis Remastered
    3:54 Cyberpunk 2077
    4:53 Doom Eternal
    6:09 Fortnite
    7:28 Helldivers 2
    8:29 MW3 (2023)/Warzone 2
    9:56 Red Dead Redemption 2
    11:26 XDefiant
    12:50 Additional Thoughts
    18:31 Guinea Pig Cam
    #intel #arca770 #intelarc #a770 #arc #gpu #budgetgpu
  • Science

Comments • 81

  • @jannemann_17g90
    @jannemann_17g90 5 days ago +7

    Bought a 2-week-old Arc A770 Sparkle for 180€ and I'm completely happy with it. Using it at 1440p with 0 problems.

  • @WyTDeViL86
    @WyTDeViL86 8 days ago +8

    I am running one in my current system. Excellent card for the price, I have no problems with mine at all, runs everything I've thrown at it without even a hiccup.

  • @JayzBeerz
    @JayzBeerz 11 days ago +29

    $279.99 for a 16GB card is a no brainer buy.

    • @formerlycringe
      @formerlycringe 10 days ago +5

      Should've been there when it used to be $230~250

    • @p_mcg
      @p_mcg 6 days ago +2

      As an owner of an A770 LE, the 16GB makes no difference in most cases anyway

    • @Saymriy
      @Saymriy 4 days ago +4

      @@p_mcg longevity and future proofing bro

    • @8bitbender495
      @8bitbender495 3 days ago +1

      people are waiting on Battlemage

  • @rzkrdn8650
    @rzkrdn8650 11 days ago +13

    If the Battlemage equivalent of the A770 is 20-30% faster than this and keeps the current (reduced) pricing, I can honestly see it selling some.

    • @Cake-ql8lh
      @Cake-ql8lh 8 days ago +2

      it's gonna be much more than that, their next gen laptop gpu (battlemage in lunar lake laptop) is 50% stronger
      So 50% minimum better, i would expect it around 75% for the desktop

    • @rzkrdn8650
      @rzkrdn8650 8 days ago +2

      @@Cake-ql8lh but that's made by tsmc, battlemage is rumored to be made in-house and slightly different node, we don't know

    • @Cake-ql8lh
      @Cake-ql8lh 8 days ago +1

      @@rzkrdn8650 it's TSMC 4

    • @_TrueDesire_
      @_TrueDesire_ 6 days ago +4

      The 7800XT has set the bar. Intel needs to match it, imo at a lower price. I got a 4080S FE in one computer but I'm done supporting nvidia 😅
      So in my collection/builds, I now got 4080S FE, RX 7900 GRE and starting today an A770 👍🏻

    • @Cake-ql8lh
      @Cake-ql8lh 6 days ago

      @@_TrueDesire_ The 7800 XT was good, but the thing is we had the 6900 XT for $500 for a long time before the RDNA3 launch.
      We need a good GPU price right from the start.
      I believe the Battlemage flagship will be $400.

  • @ShamoaKrasieski-xm4ze
    @ShamoaKrasieski-xm4ze 12 days ago +11

    The Arc graphics drivers have come a long way in unlocking this card's potential for gaming, but I wonder how much more Intel's engineers can squeeze out with driver updates? This might be close to the best we are going to get. Not bad, priced right, but not competitive enough for core gaming. Best suited for folks who like to experiment. Intel has open source drivers for Linux, so it might be great for Linux gaming.

    • @ProceuTech
      @ProceuTech  11 days ago +7

      Yeah, I keep seeing the whole “the drivers will improve”, but tbh idk how much more they can squeeze from the cards without a drastic rewrite/rearchitecting of their hardware and drivers. It’s been almost 2 years since these cards launched; at what point do we sort of “call it” and accept this is what we’ve got? The cards are ridiculously powerful if you know how to program for them, but without specific optimizations on the part of games they probably won’t stand out significantly, which stinks because I like the card a lot…

    • @ItsDeeno69
      @ItsDeeno69 11 days ago +6

      @@ProceuTech I think Intel said that the A770 hardware is 3070 level, so when it reaches that it will be done

    • @p_mcg
      @p_mcg 6 days ago

      It may never reach that due to hardware limitations and hardware design. Alchemist struggles with occupancy issues @@ItsDeeno69

  • @USArmyVet91
    @USArmyVet91 7 days ago +4

    It is a great card. Almost weekly driver updates have helped this card become quite a good 1440p gaming card. 👍👍

  • @muhammadyasser4362
    @muhammadyasser4362 6 days ago +5

    Wait this thing can run 4k?

    • @dbod4866
      @dbod4866 4 days ago +2

      All my gaming is 4k. Diablo4 80-100 fps

  • @Obie327
    @Obie327 10 days ago +9

    The Intel A770 is a decent performing GPU. Been happy with the recent driver updates and use it for 1440p gaming. Thanks for the review.

    • @animalyze7120
      @animalyze7120 5 days ago +1

      The card is stupidly underrated by snobby enthusiasts. Got an A770 in a Microcenter deal for a customer and was totally shocked to see how far Intel had come, a total game changer. $100 less than a 4060, it came with 16GB of VRAM instead of 8 and blew the doors off it in tests at 2K.

    • @Obie327
      @Obie327 5 days ago

      @@animalyze7120 Agreed. The Intel ARC drivers have been getting better and better with each update. I would buy another ARC GPU but will wait to see what Battlemage improves on when it comes out. Happy gaming! Peace!

    • @willhutton1516
      @willhutton1516 3 days ago

      What settings do you use? All I see people saying is 1440p ULTRA or Highest settings. Like, why would you play a game at highest settings for the lowest fps? I’m mainly a console gamer so I’m used to 60 fps, but the difference between 1440p and whatever consoles do with variable resolution is honestly negligible.

  • @CNC295
    @CNC295 12 days ago +7

    Good review. I have had a good run with my A750. I just regret not buying the A770 16g version. :( My only regret.

    • @_TrueDesire_
      @_TrueDesire_ 6 days ago +1

      @@CNC295 well you saved some money for the next generation gpus 😁

  • @rnrbishop
    @rnrbishop 11 days ago +5

    Picked an A770 up from a local place for 220 bucks on special offer, quite an upgrade from my 1660 super 😅

  • @ennio5763
    @ennio5763 11 days ago +4

    Direct benchmark comparisons against similarly priced GPUs would have been useful to answer the final question: "is it worth buying in 2024?"

  • @snakeplissken1754
    @snakeplissken1754 5 days ago +3

    Honestly, I think the A770 is an often overlooked value package. Decent native 1440p performance that will benefit a lot from FSR (especially FSR3 + frame generation) going forward, helping it stay relevant, especially with its 16GB of VRAM.
    Surely not a good option for anyone who doesn't have a motherboard supporting ReBAR, and maybe still a bit of a gamble for anyone mainly interested in older DirectX games. Older niche titles in particular are unlikely to be a focus of attention for the driver developers when it comes to fixing potential issues (graphical glitches or stuttering).
    For anyone who only (or at least mainly) plays modern esports titles and modern triple-A titles, and who also has a modern CPU/motherboard combo, the card can be a real value option. Even more so if you get a good deal/discount.

  • @owomushi_vr
    @owomushi_vr 9 days ago +5

    Pico allows you to use it in vr. I gave up on my Nvidia 4060ti

  • @brynduffy
    @brynduffy 1 day ago +1

    Got the bifrost a770 16 GB for $279 a month ago and I feel like I "won". Great performance on baldur's gate, Witcher 3, Fallout 4 and Fallout 76.
    Zero complaints but I will admit that I'm not a big time player, just a few hours a week.

  • @punkbutler88
    @punkbutler88 10 days ago +3

    I have an 5600x rx6600 setup. I really wanted to try the a770 just for shits and giggles. Im gonna need a card soon anyways when i build a pc for my kiddo so.... 🤷

  • @cHaMp630
    @cHaMp630 5 days ago +3

    Personally I don't really consider 60fps that great outside of the most casual of casual games, most likely 3rd-person games played with a controller. Otherwise 90fps minimum, or ideally 120, should be the target framerate any GPU worth its salt should aim for in most decently well-optimized games at medium/high settings at 1440p. I'll only cut them some slack if they can barely get that 90 with their upscaler set to the best quality setting at 1440p. This would mean Intel would be looking at something in the ballpark of a $500-600 A790-type card to get those numbers.

  • @rdsii64
    @rdsii64 11 days ago +3

    I'm not a gamer. With that said, I would like to experiment with open source LLM's on my own hardware. This card being significantly less than $300.00 with lots of vram makes it an obvious choice for me.

    • @ProceuTech
      @ProceuTech  11 days ago +5

      Then this card makes 100% sense, especially with the fp16 compute perf it offers!
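      A rough sketch of what that can look like in practice: loading a small open-source model in fp16 and running it on the Arc card. This assumes the Hugging Face transformers package and a PyTorch build with Intel XPU support (for example via intel_extension_for_pytorch); the model name is just a placeholder, so treat it as a starting point rather than a verified recipe.

      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      # Placeholder model id; any fp16-friendly open-source LLM that fits in 16GB of VRAM works.
      model_id = "microsoft/phi-2"

      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

      # "xpu" is the PyTorch device name for Intel GPUs like the A770;
      # fall back to the CPU if the XPU stack is not available.
      device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"
      model = model.to(device)

      inputs = tokenizer("The Arc A770's 16GB of VRAM helps because", return_tensors="pt").to(device)
      output = model.generate(**inputs, max_new_tokens=64)
      print(tokenizer.decode(output[0], skip_special_tokens=True))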

  • @HelminthCombos
    @HelminthCombos 1 day ago +1

    my rig is OLD: a Ryzen 5 1600k and a 1060 6 gig
    I'll just grab this and stretch my CPU till I get enough to upgrade everything else

  • @KillerSneak
    @KillerSneak 2 days ago +1

    Seems the 7600xt is still the better choice, but they are doing a great job with their tech/drivers.

  • @CyberClu
    @CyberClu 7 days ago +2

    I'm curious, does the Intel control panel have a custom color control temperature slider like the AMD Adrenalin software? I'm using an old HDTV for my PC monitor, and it's just way easier to calibrate a cooler or warmer temp on AMD, than it is on Nvidia's control panel. Nvidia's control panel for color calibration sucks.

    • @ProceuTech
      @ProceuTech  7 days ago +2

      It does, but it’s kind of limited. If you joined the Discord I could send screenshots of it

  • @Arejen03
    @Arejen03 11 days ago +3

    I'm wondering about upgrading my old 1060 6GB; my board is limited to PCIe 3.0 x16. Considering a 3060, 3060 Ti, or Arc A580/750.

    • @ProceuTech
      @ProceuTech  11 days ago +3

      I was worried that PCIe 3.0 would bottleneck the A750/A770. After testing on a Z390 board with an i7-9700K, the only thing that affected performance was whether or not Resizable BAR was turned on (called "Above 4G Decoding" in the BIOS).

  • @Tanner_SD
    @Tanner_SD 2 days ago +2

    Does anyone have the intel arc a770 that would be willing to test the game dead by daylight from Steam and tell me how it runs as of 07/2024? Preferably when playing at high/ultra preset?? Heard that you might have to go into game properties and add "-DX12" to make it run better, but that info is a year old now, so I'm not sure if that is still true.

  • @sc9433
    @sc9433 8 days ago +3

    Why would you show us medium presets? Could it be that this card can't handle Ultra?

    • @l.i.archer5379
      @l.i.archer5379 5 days ago

      It's not an Ultra card. It will play Halo Infinite at Medium/High settings in 1440p at 80+ fps.

  • @sorousha.s9002
    @sorousha.s9002 11 days ago +3

    How does it perform in Unreal Engine 5?
    I'm thinking about buying the next-gen Intel cards; 16GB of VRAM for that price is really appealing.

    • @ProceuTech
      @ProceuTech  11 days ago +2

      I used it for UE5.4 experimentation. Everything ran pretty well on it! If you want the 16GB of memory then it makes sense, but if you are spending around $300, the GPUs found in competing Nvidia or AMD cards (RTX 4060 and RX 6700 XT) are more powerful in terms of raw FPS.

    • @sorousha.s9002
      @sorousha.s9002 11 days ago +2

      @@ProceuTech Yes, I want it for the VRAM; I've got a 1070 Ti, so a 4060 8GB isn't really worth the hassle. I need a card with the performance of a 3070 and 16GB of VRAM. I'll wait for the new gen though, this gen's performance is too inconsistent.
      Thanks a lot

    • @BLYAThus
      @BLYAThus 9 days ago

      @@sorousha.s9002 The drivers do need more work, maybe wait for Battlemage? I've heard leaks they're as fast as RTX 4070's or even the unreleased RTX 5080 haha. But again, prices for the a770 might drop when newer gen releases.

    • @W5529RobloxGameDevelopment
      @W5529RobloxGameDevelopment 6 days ago

      @@sorousha.s9002 The drivers are good now. The only limiting factor is the architecture.

  • @BigT5
    @BigT5 8 days ago +3

    In Europe they want 350€ for it.

    • @ProceuTech
      @ProceuTech  8 days ago +2

      Yeah, makes it kind of a tough sell when it’s encroaching on the 4060ti

    • @ZForZamorak
      @ZForZamorak 1 day ago +1

      Remember the prices you see in USD don't include taxes, it's typically 7-10% more depending on where you live

    • @ProceuTech
      @ProceuTech  1 day ago

      @ZForZamorak oh it’s way more where I live 😭

  • @badass6300
    @badass6300 11 days ago +4

    How does it compare to the rx 6700(XT) and rtx 3070?

    • @rzkrdn8650
      @rzkrdn8650 11 days ago +3

      6700xt and 3070 are 20% and 35% faster respectively

    • @ProceuTech
      @ProceuTech  11 days ago +4

      ^^^ can be even wider or closer depending on the title though.

    • @badass6300
      @badass6300 11 days ago +6

      @@rzkrdn8650 You are looking at outdated numbers, as the 6700 XT nowadays is about as good as the 3070, and the A770 at launch was on par with the 6600 non-XT but nowadays is close to the 3070.

    • @rzkrdn8650
      @rzkrdn8650 11 days ago +3

      @@badass6300 Nope, it's not an outdated number. Look it up on YouTube, sort by upload date; a benchmark as recent as 3 weeks ago confirms my statement. The 3070 only ties with the 6700 XT when it is running out of VRAM (ultra textures/4K). It wins in pretty much everything else.
      And everything else I said is also the same.

    • @rzkrdn8650
      @rzkrdn8650 11 days ago +3

      @@ProceuTech everything is

  • @Ornal64
    @Ornal64 12 days ago +2

    My only issue with A770 is that I can’t set up 120hz refresh rate in windows.

    • @JayzBeerz
      @JayzBeerz 11 days ago +2

      Why? I can.

    • @Ornal64
      @Ornal64 11 days ago +1

      @@JayzBeerz It might have something to do with the HDMI cable, which works perfectly with the 3090. I will try DisplayPort.

    • @JayzBeerz
      @JayzBeerz 11 days ago +2

      @@Ornal64 Have you gone into the advanced display settings in Windows?

    • @Ornal64
      @Ornal64 11 days ago

      @@JayzBeerz yes, it will just revert back to 60hz. Same hdmi cable with Radeon and Nvidia works without problem.

    • @Aegie
      @Aegie 9 days ago +1

      @@Ornal64 Use DisplayPort instead of HDMI, maybe it will help.
      Intel cards can be tricky sometimes

  • @Monkeymehrab
    @Monkeymehrab 5 days ago +2

    😲😲😲😲

  • @animalyze7120
    @animalyze7120 5 days ago +2

    What on Earth are you talking about? On any monitor 60Hz and higher, 70-ish fps will be butter smooth to the human eye and there will be no stuttering or tearing AT ALL. What is this new-age pompous attitude, thinking folks can really detect a difference without a horribly inaccurate fps counter app? Seriously, at 60fps you're in the sweet spot; there is nothing slowing you down competitively, otherwise consoles would never have been popular and would be totally unplayable by your reasoning. I have a 4090 and another system with an older 2070 and a 240Hz 32" monitor. I have done this test with literally dozens of people, and with graphics maxed not one person could tell me which was which. I mean the guess is 50/50 of course, but the point is unless you are Data from Star Trek, give it up, you aren't going to see anything different. If you think fps numbers impress, then you'd be wrong again; unless you can actually see that difference, the extra $1200 is wasted.