1 YEAR WITH INTEL ARC A770

  • Published: 11 Jul 2024
  • Timestamps:
    0:00 - Intro
    1:13 - Honest Take on my Experience with an Intel Graphics Card / Intel GPU
    1:53 - My Wattage and Experience in Intel Arc Control Settings for Intel Arc
    4:58 - Average Experience with Intel Arc A770 in Gaming / intel arc fps
    5:50 - Outro
    My computer Specs:
    CPU - AMD Ryzen 5 5600X 3.7 GHz 6-Core Processor (Paid 330)
    CPU Cooler - Noctua NH-D15 82.5 CFM CPU Cooler (Paid 90)
    Motherboard - Asus TUF GAMING X570-PLUS ATX AM4 Motherboard - (Paid 170)
    Memory/RAM - Corsair Vengeance LPX 16 GB (4 x 8 GB) DDR4-3600 CL18 Memory (Paid 150)
    Storage - Western Digital Black SN850 1 TB M.2-2280 PCIe 4.0 x4 NVMe Solid State Drive (Paid 200)
    Video Card/GPU - Intel Arc A770 Limited Edition 16GB PCI Express 4.0 Graphics Card (Paid 380)
    Case - Corsair 275R Airflow ATX Mid Tower Case (Paid 70)
    Power Supply - ADATA XPG CORE Reactor 850 850 W 80+ Gold Certified Fully Modular ATX Power Supply (Paid 150)
    Monitor - LG 27GL850-B 27.0" 2560 x 1440 144 Hz Monitor (Paid 500)
    Keyboard - EVGA Z20 RGB Wired Gaming Keyboard (Paid 90)
    Mouse - Razer Basilisk V2 Wired Optical Mouse (Paid 35)
    Headphones - Sennheiser HD6XX Headphones (Paid 270)
    Webcam - Elgato Facecam - 1080p60 Full HD Webcam (Paid 10)
    #gpu #intelarc #a770 #gaming #computer #parts #kushikush

Comments • 8

  • @TommyMoreels77 4 months ago +1

    I got myself an Acer BiFrost Arc A770 last month. It's really great, and since I got it for €300 due to sales on Acer Belgium, it's a steal for 16 GB of GDDR6. I get around 70 fps at QHD in most of my games, and it's incredibly good for video editing.

  • @AshenTech 7 months ago +5

    Multiple driver updates a month most months; 2 or more seems to be the pattern, more if a new title needs a day-1 driver.
    Boost is inconsistent (came to link a buddy to this video so he could relay it to his bro, who's got a card on the way).
    Some titles will benefit more from higher wattage headroom; given some time, I have a feeling the drivers will get better at drawing what they need for most titles without pulling excess wattage.
    Also monitor your frametimes with framecapx (a better version of Intel's latency monitoring and GPU Busy monitoring tool); you may find some games' 1% and 0.1% lows suffer when you limit wattage. A rough sketch of that math is just below.
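    (Quick Python sketch of the 1%/0.1% low math, if you want to check your own logs. The "msBetweenPresents" column name is my guess at a PresentMon-style CSV export, not framecapx's documented format, so adjust it to whatever your capture tool actually writes:)

        # Compute average fps plus 1% / 0.1% lows from a frametime log (ms per frame),
        # using the "average of the slowest N% of frames" convention (others exist).
        import csv

        def low_fps(frametimes_ms, pct):
            """Average fps over the slowest pct fraction of frames."""
            worst = sorted(frametimes_ms, reverse=True)
            n = max(1, int(len(worst) * pct))
            return 1000.0 / (sum(worst[:n]) / n)

        with open("capture.csv", newline="") as f:
            # Assumed column name; check the header of your actual capture file.
            times = [float(row["msBetweenPresents"]) for row in csv.DictReader(f)]

        print(f"avg fps:  {1000.0 * len(times) / sum(times):.1f}")
        print(f"1% low:   {low_fps(times, 0.01):.1f}")
        print(f"0.1% low: {low_fps(times, 0.001):.1f}")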
    As for temps: this is quite normal. I know 90C sounds hot, and it is, actually... but you can go back to the 8800 GT cards to find 112C as their throttle temp, and those cards ran for YEARS, many still working today, with a single-slot cooler, even while hitting 112C under gaming loads. 112C is "too-f'ing-hot"; 90C is also the throttle temp for most modern CPUs, AMD and Intel alike, and both companies like to push those limits to get the most out of their benchmark scores... and push the benefit of overclocking toward irrelevance.
    In this case 90C is quite normal as the throttle temp for a modern GPU (and modern CPU, for that matter), and you won't hit it on these cards without pushing them above stock.
    Oh, early on boost was a lot more lenient; I think the drivers making heavier use of the GPU is part of why that limit got backed off. I can manage an 8-12 boost with a 45 mV offset now; I used to be able to manage around 26 with a 75 mV offset and not have it throttle below the 2600-2650 MHz range. I've also been told the scaling for boost has changed, so I may have to test with Time Spy and see what clocks it actually hits with an 8 boost... if it's close to a 2600 MHz minimum at full load, then the scale has changed a lot.
    Anyway, we all love this card. It's far from perfect, but for the money... the alternatives at the time I ordered were a 3060 12 GB or a 66*0 XT; you could get a "ships in 4-6 weeks" 6700 XT prone to fan failure for around the same price, but honestly I don't feel at all bad that I went with this. The Acer card keeps its RAM cooler than the LE cards and has FAR better hardware encoding than pre-7000-series AMD chips, which use shaders rather than dedicated hardware encoders/decoders to do the same job... with worse performance and quality.
    Oh, and our buddy who got a 16 GB 4060 Ti laughs, because in the 3 betas we're in, the Acer A770 just wrecks the 4060 Ti 16 GB cards due to how restricted that card's memory bandwidth is and how heavily those games use the VRAM you have. All 3 devs built smart systems to use VRAM as efficiently as possible, be it small VRAM budgets and keeping use down, or large budgets and loading more into VRAM so there are fewer calls for streamed textures, etc. That also means WAY more consistent frame times since all 3 got their systems ironed out...
    Oh, and part of why all 3 decided to stick with the engine they're using, rather than UE5: after some testing, UE5 ran so much worse for one easily imported test scene, even after a lot of hand optimization, that despite how good the tools are, and how good it can look using said tools, it's just not worth it. They can get 98% of the way there on the engine they're on. Funny enough, the companies aren't related or connected, but they happen to be using the same engine in very similar ways, and they ended up sharing tools and resources in a cross-licensing agreement for the dev and optimization tools they came up with. It's also why, when they found out a lot of us were using dxvk-async for the games on all vendors, all 3 games within a week had a beta patch you could enable in the launcher that dropped a hand-optimized version of dxvk into the folder with the game's main exe. They removed a LOT of code they didn't use and optimized a lot for how the shared engine handled things. Sadly the NDA means I can't name the games, but 2 are MMOs and one is sort of like Rocket League with a more sci-fi skin...
    All 3 games now have a full VK path, mind you. Oh, and they contributed back to dxvk-gplasync and mainline dxvk after optimizing several code paths for their own uses and finding, with testing, that the changes seem to work on every game tested to date, even though they were really written to boost fps by 20-30% by simplifying how the translation was done in ways that mattered for how all 3 were using the same engine. (I gather this happened because they all shared employees and got lucky that they were able to get their former bosses into the same room and suggest licensing the tools and solutions each had made, since they aren't competing products in general; in fact, they found people play 2-3 of the games regularly once they allowed anyone in any of the 3 betas to merge-link accounts for testing. We're also told it helped track down a few engine-specific flaws/bugs they were able to patch, because all 3 titles had the same bug even if it expressed slightly differently in each game.)
    Oh, they were all using Scaleform, but the UI got totally replaced when one of the 3 hired a couple who hated the perf hit of Scaleform and wrote a UI framework that can be added to, or even baked into, any game using a similar version of the engine they started with. That's been licensed by more than a dozen other companies so far, I'm told, because no perf hit from the UI being up is a huge benefit for most games that have a HUD/UI...
    Oh, and the companies actually got workstations from Acer that came with the A770 16 GB preinstalled. They contacted Acer about ordering a store case worth of cards and ended up with a case of cards and more than 2 dozen full workstation systems that Acer doesn't list anywhere as a product they sell... but apparently the workstations were a good enough deal that it convinced the bosses to grab them, and to also upgrade a lot of the older workstations that support ReBAR to A770s. Why?
    1. The hardware AV1 allows them to store more data at higher quality than AMD and Nvidia cards of similar price.
    2. The VRAM allows working with higher-quality images/video (bandwidth matters here as well: 8K raw footage runs horribly on a 16 GB 4060 Ti, showing nearly no benefit over 8 GB, while 8-16 GB A770s show benefit depending on how much VRAM the app uses, and it's always better at higher res). In fact, 1440p raw already starts to show memory-bandwidth issues on the 16 GB 4060 Ti in the 3 apps we tested; not true for the A770. 4K encodes at 300+ fps for me in quality mode; at fast it's over 500, but quality and size are worse... so quality mode is the way to go!!!
    3. They were able to set up a system over the network that uses the Arc cards to transcode their old MPEG-2/H.264 archival footage into AV1 and dump the results to the new server array they built to replace the aging array of servers holding all that MPEG-2/H.264 video (some lab stuff, some security, but all of it required to be kept for over a decade due to contracts with the govt and other companies). So, yeah: shrinking the stuff to a fraction of the size, and converting what's taking up a whole double-sided row of racks into what will take up about 4-6 racks, with 3-way redundant copies of everything and room to store years more footage. Each server's also got an Arc card now that they were able to get single-slot cards from Intel... well, got/is getting. Rough sketch of that batch job just below.
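    (Rough sketch of what that batch job could look like. The paths, extensions, and quality value are made up, and you should check your ffmpeg build actually lists av1_qsv before trusting this:)

        # Walk an archive tree and re-encode MPEG-2/H.264 files to AV1 on an Arc
        # card via ffmpeg's Quick Sync encoder (check: ffmpeg -encoders | grep qsv).
        import subprocess
        from pathlib import Path

        SRC = Path("/archive/old")   # hypothetical source tree
        DST = Path("/archive/av1")   # hypothetical destination tree

        for src in SRC.rglob("*"):
            if src.suffix.lower() not in {".mpg", ".mpeg", ".mp4", ".ts"}:
                continue
            dst = (DST / src.relative_to(SRC)).with_suffix(".mkv")
            dst.parent.mkdir(parents=True, exist_ok=True)
            subprocess.run([
                "ffmpeg", "-y",
                "-hwaccel", "qsv",         # decode on the media engine too
                "-i", str(src),
                "-c:v", "av1_qsv",         # Arc's hardware AV1 encoder
                "-global_quality", "28",   # quality target; tune for the footage
                "-c:a", "copy",            # leave the original audio untouched
                str(dst),
            ], check=True)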
    Oh, they've been doing the encodes during off hours, and they even have several systems running compare streams to check for encode errors without needing people to manually review every file. It catches even fairly minor issues and flags the files; if a re-convert gets the same result, a person has to review the problem files. Something like the little check sketched below.
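    (The compare step could be as simple as scoring each re-encode against its source with ffmpeg's ssim filter and flagging anything under a threshold. The 0.98 cutoff is a made-up starting point, not a standard, and this assumes source and re-encode share resolution and framerate:)

        # Score a re-encode against its source and flag low scores for human review.
        import re
        import subprocess

        def ssim_score(original, reencode):
            """Run ffmpeg's ssim filter and parse the overall 'All:' score from stderr."""
            out = subprocess.run([
                "ffmpeg", "-i", str(original), "-i", str(reencode),
                "-lavfi", "ssim", "-f", "null", "-",
            ], capture_output=True, text=True).stderr
            m = re.search(r"All:(\d+\.\d+)", out)
            return float(m.group(1)) if m else None

        score = ssim_score("/archive/old/tape01.mpg", "/archive/av1/tape01.mkv")
        if score is None or score < 0.98:
            print("flag for manual review:", score)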
    And to do this with Nvidia cards would have cost 2-3x as much, minimum, and wouldn't have used the Quick Sync GPU features to boost encode rates drastically. Oh, they also put 4x A770s into several old X299 workstations and set them up to just sit and re-encode footage, since a 9990XE is more than capable of feeding 4 of these cards, all using all 4 media engines to transcode video, directly to an 8x 1.92 TB SSD array that then dumps to the server over the 10G NIC built into the mobo. (I want the board/chip/RAM they're using in those... nice upgrade over what I'm using!!!)

  • @megurinemiku8201 5 months ago +2

    I undervolt my A750 CLD as well: 2251 MHz @ 0.71 V using the Arc tool; the actual applied clock and voltage is 2252 MHz @ 0.925 V. Temps never exceed 60C. For ray tracing and XeSS the GPU pulls around 130 W with temps around 65C; for non-ray-traced and non-upscaled games the GPU hovers around 100 W. Arc definitely undervolts better than it overclocks, thanks to TSMC N6.

  • @AshenTech 8 months ago +5

    I grabbed an Acer Predator BiFrost A770 16 GB about a year ago; 3 friends quickly followed me, mostly from 1070/Fury-era products, and a few others have since grabbed Arc cards... dxvk and dgVoodoo2 tend to sort out any issues I have with the vast majority of games... :)

  • @s2mann 7 months ago +4

    Currently running driver v4953, which brought (DX11) stability back after driver v4676. I use Blender mostly, but I do play some old and new games (mostly FPS). Zero complaints, except that when I get used to the speed, I just want more. Thanks for the review. I think Intel has made a solid card.

  • @nassa0815 7 months ago +4

    Thank you very much for the nice review

  • @KushiKush 8 months ago +4

    pog