Intel Arc A770 16GB Benchmarks - Worth It?

  • Published: Jul 15, 2024
  • Science

Comments • 98

  • @luizarthurbrito
    @luizarthurbrito 6 months ago +83

    I swear if Intel keeps going I'll seriously consider them for my next GPU. Nvidia has become too expensive and AMD isn't as value-oriented as it once was.

    • @mikepawlikguitar
      @mikepawlikguitar 5 months ago +6

      AMD is great in that its cards perform very well for the price + the generous amounts of VRAM. However, it needs to get on Nvidia's ass when it comes to proper competition for DLDSR and DLSS.

    • @moaa042
      @moaa042 5 months ago +7

      AMD seems content to just slightly undercut nvidia at every sub-top-tier price point whereas Intel tried very hard to go right for the throat (they just haven't quite succeeded)

    • @klevesmith
      @klevesmith 5 months ago +3

      I am about to build my first PC around the Arc A770 16GB paired with an i7-13700

    • @mikepawlikguitar
      @mikepawlikguitar 5 months ago +1

      @@klevesmith nice GPU! Why Intel though, if you don't mind me asking? With such a powerful CPU, you could easily go RTX 4070 Super or even Ti Super. That CPU is a lot of power for just an Arc card. If you're adamant about Arc, you may be better off with an i3-13100 in that case for 1080p gaming. i7-13700 would never see its full potential with that card, so it may be wasted $$.

    • @klevesmith
      @klevesmith 5 months ago +2

      @@mikepawlikguitar You're exactly right, especially with me running AutoCAD 3D and SOLIDWORKS

  • @rangersmith4652
    @rangersmith4652 6 months ago +57

    Just bought an ARC A770 for $280 and installed it into a system that will mostly be used to teach myself Resolve and OBS. So far, so good. I suspect that driver support will keep getting better because Intel has the resources to maintain such a focus.

    • @ProceuTech
      @ProceuTech  6 months ago +14

      That’s what I’m hoping. They’ve made huge strides in the past year+ since launch. Hope they can continue the improvements until and after their next gen comes out!

    • @fuzzylumpkins6034
      @fuzzylumpkins6034 6 months ago +5

      Our youngest son had a hand-me-down RX 5700 XT in his rig. We gave him a £300 GPU budget. Personally I thought he would grab a 6700 XT, but he picked up the Sparkle Arc A770. I am so impressed with how far the drivers have come; he just chops about in Minecraft and the Unreal store building his own game, and it seems flawless. I know I am speaking on behalf of what a 5-year-old wants and does, but it has me wanting to try it in my server PC.

  • @ChipboardDev
    @ChipboardDev 6 months ago +22

    I've got the A750, it hasn't let me down!

  • @wagnonforcolorado
    @wagnonforcolorado 6 months ago +21

    A lot of content creators and production users like the 16GB of VRAM on a wide memory bus. The 770 does cater to a specific market that AMD and Nvidia have ignored. Intel's driver development is also helping with their iGPU drivers. Future looks positive for Intel keeping up on their ARC line.

  • @82_930
    @82_930 6 months ago +29

    I have the exact same version of the A770 you have and I love it, sniped for $279 USD new for my secondary system. Intel has come a very, very long way in terms of drivers, and the instability seems to be fixed for the most part. In terms of performance the card is still all over the place, sometimes losing to a 3060 and sometimes mopping the floor with a 4060 Ti. Either way I think this card will age like fine wine, and over the next 4-5 months it'll get more consistent

    • @lorsheckmolseh3345
      @lorsheckmolseh3345 6 months ago +4

      The higher the resolution, the better the A770 runs compared to the 4060 and 7600. In many cases stepping up from 1080p to 1440p only costs a few FPS, which is a sign that driver optimization is still lacking.

    • @82_930
      @82_930 6 months ago +6

      @@lorsheckmolseh3345 Yup, that's the A770's superior hardware showing its true colors; it has better specs on paper than an RTX 3070 Ti and has the potential to outperform one as drivers get better

    • @_gr1nchh
      @_gr1nchh 3 months ago +1

      @@82_930 Yep. I've seen a few people ask why you would buy Intel cards right now. It's simple: you're not buying them for RIGHT NOW, you're buying them for the future, when VRAM is at a premium and you'll have to pay $600+ for a card of this nature instead of the $250-$300 you can get it for now. Intel definitely future-proofed this release, and I'm heavily leaning towards upgrading to an A770 from a 1050 Ti. It's a massive jump in performance, and it looks like the best price-to-performance card right now with some future-proofing as well.

    • @dramatyst5661
      @dramatyst5661 2 months ago

      @@_gr1nchh I just bought one for my 2nd PC build

  • @the_desert_dweller
    @the_desert_dweller 6 months ago +11

    I just recently switched to an Intel Limited Edition A770 16GB after my 3080 died. Even though the framerates weren't crazy fast, it runs butter smooth, and personally I like the way Intel renders the colors in COD.

  • @GroundGame.
    @GroundGame. 6 months ago +10

    Such an exotic 3rd contender card. I've seen a few at my microcenter here. Keep up the good work as always. Always love the die breakdowns.

    • @almisami
      @almisami 1 month ago

      I got the Sparkle ORC as a media card and it is a video editing monster.

  • @ovemalmstrom7428
    @ovemalmstrom7428 6 months ago +7

    I believe that many gamers buying the Arc cards do so to support the new platform and to be there to see it evolve and grow, not to max out performance. Gaming heroes, in a way!

    • @Miley_00
      @Miley_00 5 months ago +1

      That's me! I never run anything on ultra, and I have built about 15 PCs for my family/friends. Excited to see how the Ryzen 7500F does with the A770

    • @pauloazuela8488
      @pauloazuela8488 3 months ago +1

      The only Arc cards I would buy are the A380 and A310, simply because of the form factor; they're basically better entry-level options

  • @BlueLightning
    @BlueLightning 6 months ago +8

    As an enthusiast, I LOVE what this card represents. For a first-gen product (yes, they have done integrated graphics for years, but integrated was never meant for running games) this is a very good start. I have also heard from multiple sources that the bigger reason the performance is so low compared to cards with similar power draw is a slight flaw in the base architecture, leading to lower performance (the A770 was originally supposed to be at or above a 3070 in terms of performance). I have the Acer Predator A770, mostly because I just think the blower and flow-through hybrid fan tech is super cool, and it performs quite well (although I did go back to my 4080 just because its performance is way higher, lol). I just hope the flagship Battlemage card blows us away; I'm hearing rumors of a possible 4070 Ti or even 4080 competitor, and if it does indeed turn out that way, I'll 100% get one, especially if it is around or even under 700 bucks.

    • @ProceuTech
      @ProceuTech  6 months ago

      I just wish I knew what the problem with the architecture is. I've been doing some SYCL programming on the A750 and A770 over the past year, and there are TONS of ways to tweak the performance of code, so much so that I got a raw ray tracer computing intersections on the shader cores working at 1080p at between 75-90 FPS depending on how much geometry is on screen. The cards perform well, but they need optimization; I just wish I could figure out exactly what. (A minimal SYCL intersection-kernel sketch follows this thread.)

    • @BlueLightning
      @BlueLightning 6 months ago +1

      @@ProceuTech I mean the actual silicon used has a slight flaw from what I have heard, so no matter what they do on the software side, it will never reach the potential they wanted. The only thing I am confused by is how well Arc runs in synthetics like 3DMark; it is basically the exact same as a 3070 in Time Spy. I have zero coding knowledge, I just love PC hardware.
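
A minimal, hypothetical sketch of the kind of shader-core intersection kernel the comment above describes (not ProceuTech's actual code; the data layout, the single-sphere scene, and all names are assumptions). It targets the SYCL 2020 API as shipped with the oneAPI DPC++ compiler; the default selector normally picks the Arc GPU when one is present.

```cpp
// Hypothetical example: test N rays (all starting at the origin) against one sphere.
#include <sycl/sycl.hpp>
#include <iostream>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

int main() {
    constexpr size_t n = 1 << 20;              // one million rays
    sycl::queue q{sycl::default_selector_v};   // prefers a GPU when one is available
    std::cout << "Device: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    // Unified shared memory keeps the sketch short; buffers/accessors also work.
    Vec3*  dir  = sycl::malloc_shared<Vec3>(n, q);
    float* hitT = sycl::malloc_shared<float>(n, q);
    for (size_t i = 0; i < n; ++i) dir[i] = Vec3{0.0f, 0.0f, 1.0f};  // rays along +Z

    const Vec3  center{0.0f, 0.0f, 5.0f};      // sphere 5 units in front of the origin
    const float radius = 1.0f;

    q.parallel_for(sycl::range<1>{n}, [=](sycl::id<1> i) {
        // Solve |t*d - c|^2 = r^2 for t (ray origin at 0, d assumed unit length).
        Vec3  d    = dir[i];
        Vec3  oc   = {-center.x, -center.y, -center.z};
        float b    = 2.0f * dot(oc, d);
        float c    = dot(oc, oc) - radius * radius;
        float disc = b * b - 4.0f * c;
        hitT[i] = disc < 0.0f ? -1.0f : (-b - sycl::sqrt(disc)) * 0.5f;
    }).wait();

    std::cout << "First hit distance: " << hitT[0] << "\n";  // expected: 4
    sycl::free(dir, q);
    sycl::free(hitT, q);
}
```

Built with something like `icpx -fsycl ray_sphere.cpp` (filename assumed), this is the sort of kernel where work-group sizing and memory-layout choices give the tuning headroom the comment refers to.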

  • @Punk2k11
    @Punk2k11 6 months ago +9

    The A770 is constantly on sale for $250-260, so it's better price-to-performance than the 4060

    • @IntelArcTesting
      @IntelArcTesting 6 months ago +3

      Not in the EU unfortunately; it's been rock solid at €399 for over a year now. The A750 is €250 or so.

    • @ProceuTech
      @ProceuTech  6 months ago +2

      @IntelArcTesting That's sad to hear. In the US, Alchemist cards are getting very cheap, probably because nobody really wants to buy them

    • @IntelArcTesting
      @IntelArcTesting 6 months ago

      @@ProceuTech I got one for 250 on the used market, luckily, but new doesn't make sense to buy imo.

    • @nonamenameless5495
      @nonamenameless5495 5 months ago +1

      EU here as well, on the fence about buying an A770 despite the potential hassle. I've been through the journey from 1st-gen Ryzen to today's chips, and while it was interesting, I'm not at all sure I really want to, or can, invest as much time as I did on workarounds and learning what works and what doesn't when it comes to graphics cards. I believed AMD's promise of a long-lasting platform and also saw the cost benefits over time; my entire family/friends switched to Ryzen thanks to me spearheading it and spending time doing my own research from many online sources. It ended up paying off, but you don't remotely get a similar synergy or ROI with a graphics card. And yeah, here the prices for the A770 have been between 350-400 EUR lately, which simply kills the deal when you can get a 6700/6750 XT for the very same price. If the A770 was around 250-270 EUR it'd still be a questionable risk (resale value will be low anyway), but that could lure me into getting one.

    • @sandornagy967
      @sandornagy967 2 months ago

      Try looking at buying from German websites. Here in my local Saturn the A770 goes for 299 and sometimes 269 on discount, and the A750 for 199.

  • @easydayez
    @easydayez 6 months ago +8

    Works perfectly with emulators, at least the Arc A750 in my experience

  • @linklickz
    @linklickz 6 months ago +9

    If you play VR, do not buy this

  • @owomushi_vr
    @owomushi_vr 4 months ago +1

    Fully working with Pico Connect in VR, love this GPU now 😁

  • @IntelArcTesting
    @IntelArcTesting 6 months ago +6

    It all depends on the price. In my country A750 is €250 and A770 is €399. Performance uplift is like 5% at most.

  • @yasunakaikumi
    @yasunakaikumi 6 months ago +8

    I got my A770 for around 250 here in Japan, and that 16GB is worth every penny, especially for future games that need more than 8GB. For competitive games you don't need that much, since most of those are optimized for 8GB or lower specs. Since AAA and AA single-player games will be aimed at PS5/XBSX RAM, 16GB is definitely the way to go if you want more geometry and texture data for those sweet visuals. Also, for those starting to use Blender to make 3D stuff, or just DaVinci Resolve, it's quite fast for what it is. So if you can spot one under 300 bucks in the wild, I'd say it's a steal.

    • @Miley_00
      @Miley_00 5 months ago +1

      I just ordered a Sparkle Titan A770 for $280 from Newegg. I'm excited for 1440p 144Hz on a Ryzen 7500F with 6000MHz RAM, and excited to see how it does

  • @exitar1
    @exitar1 6 months ago +5

    New driver released today 1/10/24

  • @Mr-Clark
    @Mr-Clark 4 months ago +3

    I just bought the A770 Sparkle and waiting for delivery. Hoping the experience is good. 🤞🏻

    • @brahimsaad6287
      @brahimsaad6287 7 days ago

      hey man
      how is your experience so far?

    • @Mr-Clark
      @Mr-Clark 7 days ago

      @@brahimsaad6287 Wonderful. Drivers are very mature. A770 performance falls between a 4060 and a 4060 Ti, especially the 16GB variant. If you can get it for $300 or less, it's well worth your money.

  • @scredeln
    @scredeln 6 months ago +7

    Have you enabled Resizable BAR? I heard that if it is disabled, the card loses a lot of performance

  • @shannonrhoads7099
    @shannonrhoads7099 5 months ago +1

    I have been using an Arc A770 that I got used during the tail end of the pandemic. When it comes to really old games, I have some subjective observations. In City of Heroes: Freedom, the A770 had serious issues. Running the 'Ultimate' preset (the game wants to default to this) rendered this old game beautifully, but with a very distinct stutter when rendering large fights, crowds, and fights with lots of particle effects. On Recommended settings the previous issues go away, but the game looks somewhat 'flat' in comparison. In the Cryptic Studios games Champions Online and Star Trek Online, there are some issues with what the game wants to use as default settings. In the former, anti-aliasing is turned to the max, which causes harsh black lines to appear on characters and objects; in the latter, bloom had to be manually adjusted in space scenes, as planets were brighter than stars and almost painfully bright on the monitor I am using. Again, adjusting these specific settings fixes the issues. I am sorry I don't have hard FPS data for these titles. I've paired it with an i7-11700F machine with 32 GB of 3200MHz RAM. Does it matter? I dunno. 🤷‍♂

  • @user78405
    @user78405 5 months ago +4

    Intel GPUs can expose flaws in game devs' code: the 1% lows drop even when the highs reach 300 FPS. Crysis is a better-made game in its remastered version than the original because the devs didn't cut corners on the maps; they built them in one pass with their tools, whereas the average dev patches a map over multiple passes in the texture editor rather than redoing the whole map from scratch, which nobody wants to do again. That approach produces FPS drops in certain spots on the map if the user hits them during GPU rendering. Devs are getting super lazy these days, whether from time constraints or low pay.

  • @afti03
    @afti03 6 months ago +4

    good video!

  • @kahaneck
    @kahaneck 6 months ago +4

    Apex Legends is also available in DX12. Borderlands 3 is a UE4 game; disable texture streaming and the stuttering will go away (a sketch of the usual config tweak is below).
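
For anyone trying that tip: in UE4 titles, texture streaming can usually be turned off either with the -NoTextureStreaming launch argument or by overriding the cvar in the game's Engine.ini. A minimal sketch, assuming Borderlands 3 follows the stock UE4 config layout (verify the exact Engine.ini location for your install before editing):

```ini
; Hypothetical Engine.ini override; the [SystemSettings] section applies console variables at startup.
[SystemSettings]
r.TextureStreaming=0
```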

  • @l.i.archer5379
    @l.i.archer5379 3 months ago +1

    I had issues 3 months ago with the Arc A770 16GB in my PC and had to swap my RTX 3070 Ti back into it. This Sunday, I actually decided to fix my PC by reinstalling the Arc A770 and resetting my Windows 10 22H2, with a clean install of the Intel drivers. Reinstalled Halo MCC and Infinite, and FPS maxes out at my monitor's 165 Hz refresh rate in both titles with smooth, very playable graphics. I'm really loving my ASRock Phantom Gaming card paired with a Ryzen 7 5800X and 64 GB of 3200 MT/s CL16 RAM now. The latest Heaven benchmark tonight also yielded a 361 FPS max and 209 FPS average framerate, and temperatures never went above 74°C. Now I'm thinking of upgrading to a 5800X3D CPU for this PC.

  • @reezymfk8
    @reezymfk8 2 months ago +1

    I got my hands on the Asia-exclusive A770 Photon in white. One of the best-looking cards I've seen in person.

    • @ProceuTech
      @ProceuTech  2 months ago +1

      Saw one on eBay a few weeks ago here in the US- interesting screen on it

  • @lorsheckmolseh3345
    @lorsheckmolseh3345 6 months ago +4

    The A770 has a 256-bit bus (560 GB/s) and 16 GB, which should make it a good bit faster than the cut-down 4060 or 7600, which are connected by only 128 bits and 8 PCIe lanes. Ray-tracing capabilities are outstanding in its class. The high power draw can easily be reduced to around 140W by undervolting, setting the max clock to 2100MHz, and using Optimus (video out over the motherboard), which costs around 15-20% performance. The driver improvements in one year were massive, and there is still a good portion of extra performance to expect from further optimization.

    • @ProceuTech
      @ProceuTech  6 months ago

      While I agree 100%, the A770 and 4060/7600 are in completely different performance classes. I should have included the specifications in this video instead of pushing them to a separate video, because I could have made this much clearer.

    • @lorsheckmolseh3345
      @lorsheckmolseh3345 6 months ago +1

      @@ProceuTech, yes, but this was only true before the newest driver versions. The "PC Games Hardware" channel on YouTube ran several tests a few days ago that clearly show the A770 dominating the 4060/7600. No wonder: the higher the resolution of meshes, textures, and screen, the higher the risk of cache misses on those models.

    • @lorsheckmolseh3345
      @lorsheckmolseh3345 6 months ago

      @@ProceuTech ruclips.net/video/ZkTYGTsbBvI/видео.html

  • @Flaviosantos-d8h
    @Flaviosantos-d8h 6 months ago +3

    It's a shame that in Brazil it costs 431 euros for a card like this; our politicians rob us blind. But I opted for an Arc A770 because I believe in Intel more than AMD

  • @lacie5915
    @lacie5915 1 month ago +1

    I average 144 to 158 FPS at 1080p in COD MW3, running Corsair Vengeance DDR5-6400 set at 5400, an MSI B760 Gaming Plus WiFi motherboard, and an i7-12700KF, not overclocked.

  • @quatreraberbawinner2628
    @quatreraberbawinner2628 5 months ago +2

    I'm rooting for Intel, but their next GPU either needs to perform a lot better or be a lot cheaper before I consider buying an Arc card

  • @AotO_DJ
    @AotO_DJ 5 months ago +5

    You are not using XeSS. You should be more knowledgeable in the area of electronics, my friend… XeSS is literally a blessing for this already powerful GPU.
    Edit: You know about XeSS, but you didn't show the differences with XeSS. These are only native settings.

    • @curious5887
      @curious5887 4 months ago +1

      For an honest comparison, native resolution is better

  • @Mr-Clark
    @Mr-Clark 4 months ago +1

    I got this card and unfortunately my PC won't boot. I have an i7-9700F and it won't POST. I don't have onboard graphics, so I made sure Resizable BAR was on before I swapped GPUs. I made sure to uninstall the Nvidia drivers before swapping in my A770.
    When the computer turns on I only see a few white dots on the screen. I thought my GPU was dead, so I put it in my son's computer. He has an i5-10400 and it works fine.
    I know Intel recommends a 10th gen, but some people have made an Arc GPU work with an 8th gen as long as Resizable BAR is on.
    Any help would be appreciated.

  • @mikepawlikguitar
    @mikepawlikguitar 5 months ago +2

    Great video! Thanks for sharing. So the A770 is a great card for 720p ultra, 1080p medium to high, or 1440p low to medium, depending on the game of course. No thanks. Lol

  • @lflyr6287
    @lflyr6287 6 months ago +3

    Proceu Tech: The problem with this i5-13600K is that the first CPU's P-cores only hold those turbo clocks for a few seconds and then clock down, and the same goes for the second CPU's E-cores. This i5 is a bottleneck for many new GPU models that are mid-tier and above. So, to truly test the potential of this Intel Arc A770 16GB you would have to use a Ryzen 7 5800X3D (which is faster than this i5-13600K as well as the i7s and i9s) or a Ryzen 7 7800X3D (which is faster than the i9-14900K). You can go look it up; certain reviewers have already proved that (not that YouTube reviewers are always legit, but in certain general-knowledge scenarios they couldn't hold the truth back even if they wanted to).

    • @ProceuTech
      @ProceuTech  6 months ago +1

      While this behavior is true out of the box, I've uncapped the power limit and set the tau to over 300 seconds. In games, and I can prove this by showing either Intel's XTU or MSI Afterburner, the chip is able to stick to 5.5GHz on P-cores 0-3 and 5.4GHz on the rest. All of the E-cores were also held at 4.0 GHz. I do agree that there were CPU-bound instances in this video; I just wonder how much of it is due to the Arc cards and how much is due to the bottleneck

    • @lflyr6287
      @lflyr6287 6 months ago +1

      @@ProceuTech Cool to everything you just said... but bro, 230 watts at stock clocks with no OCing, and unlocking the time limit only increases the power consumption to 290 watts?!
      A Nissan Skyline GT-R R35 can get 2500 HP and beat every car on the planet... while eating an abnormal amount of fuel, breaking down every 100 km, needing its parts replaced every week, and so on... you get my drift.
      The 5800X3D and 7800X3D are, at their stock clocks and stock extremely high efficiency, the fastest CPUs on the planet. In doing so they draw very little power. And only stock clocks with PBO enabled is what counts, because that is the most realistic scenario for every gamer.
      These two AMDs are faster even at stock, limited power-consumption levels than your 300-watt i5. That's why you should use either one of them for every GPU test, because that way you get the fewest CPU-bound scenarios, if any at all.
      5800X3D = max peak of 111 watts
      7800X3D = max peak of 82 watts

  • @michaelgunterman2620
    @michaelgunterman2620 4 months ago +1

    I'm curious, where are you finding a 4060 cheaper? Everywhere I look, the 4060 is the same price for the 8GB and hundreds more for anything above 8GB

  • @domepop
    @domepop 6 months ago +9

    Your standards are a bit weird; what is an acceptable frame rate? Seems like anything under 200 FPS at 1440p is not playable in your opinion

    • @dvolutionz6950
      @dvolutionz6950 6 months ago +8

      I agree, he said he cringed at the fact that the card ran Crysis at 87 FPS at 4K, like wtf?

    • @raresmacovei8382
      @raresmacovei8382 5 months ago +1

      "The card ran acceptible at 1080p. We got 250 fps"
      Roflmao.

  • @RenatoG1848
    @RenatoG1848 6 months ago +4

    What about Skyrim?

  • @Saabjock
    @Saabjock 6 months ago +3

    I really like my A770 LE.
    Sadly, I just put it up for sale last evening due to an issue in my favorite title generated by an Intel firmware update.
    Intel's support seems lacking at this point.
    Not sure if they factored in the amount of resources they'd need to address issues.
    The standard answer for every issue is asking you to submit an SSU report... that is typically followed by them closing the case without any attempt at a proper resolution.

  • @APersonCalledAdam
    @APersonCalledAdam 4 months ago +1

    Does this card pair well with the 5600x?

  • @Mr11ESSE111
    @Mr11ESSE111 6 months ago +3

    The Arc A750 is a much better deal, because the performance difference is under 10% and the price difference is 50%

    • @Miley_00
      @Miley_00 5 months ago +1

      But in a year or two 8GB won't be enough, and I like to upgrade every 3+ years, so I went with the 16GB A770 Titan, $280 on sale

    • @Mr11ESSE111
      @Mr11ESSE111 5 months ago +1

      @@Miley_00 It will be fine for 1080p, and the Arc A770 simply isn't that much stronger, so 16GB will be enough but the card's performance will again be too weak

  • @9klincoln
    @9klincoln 5 months ago +2

    I have an A380 as a dedicated video encoder

  • @WarMicrowave
    @WarMicrowave 6 months ago +2

    Thinking about buying one, and I wanted to ask if it gets bottlenecked by an i3-12100F, and if yes, to what extent?

    • @ProceuTech
      @ProceuTech  6 months ago +1

      It definitely will, but only in incredibly CPU demanding tasks. In gaming, it should be fine and you shouldn’t notice any performance degradations.

  • @VanceLP
    @VanceLP 5 months ago +3

    Only two games shown are part of my gaming niche. A card like this with 16GB of VRAM for competitive games is money wasted; most of these games are optimized to hell for a GPU with 8GB of VRAM and don't need more than that!
    Now, the performance of this card in the single-player games leaves something to be desired; Cyberbug is a mess... although the performance of this card in Red Dead Redemption was somewhat satisfactory.
    Anyway, the card looks good, but the performance is a little low for the price it offers!

    • @ProceuTech
      @ProceuTech  5 months ago +3

      Couldn’t agree more!

    • @Miley_00
      @Miley_00 5 months ago +2

      @@ProceuTech But I'm guessing by the time Battlemage comes out, drivers might be on par with AMD, which still isn't the best for drivers.... I just ordered a $280 A770 Titan 16GB and am hoping they will pull through for all of us!

  • @mullewap6670
    @mullewap6670 5 months ago +1

    Not running it with AMD CPUs?

  • @Linkhunter
    @Linkhunter 4 months ago

    DOSBox is used in some older games from GOG oof

  • @Abess9
    @Abess9 5 months ago +2

    What is the name of the game in the background?

    • @ProceuTech
      @ProceuTech  5 months ago +2

      Black Ops Cold War, Firebase-Z zombie map!

    • @houstonastrorider713
      @houstonastrorider713 4 months ago +1

      Had me wondering too. Looks great... I haven't played any COD since OG Black Ops, and recently got back on MW3 for zombies. But definitely going to try that one.

  • @leomiles7263
    @leomiles7263 5 months ago +2

    My guy thinks 140 FPS is barely high refresh rate 😂 3:49

  • @jounikyy7715
    @jounikyy7715 17 days ago +1

    Buy Intel if you cry about high prices and think prices will drop when there's competition

  • @chrismail0914
    @chrismail0914 3 months ago +1

    CAUTION: If you want to use an A750/A770 to play Zelda TOTK or other Switch games with Yuzu/Ryujinx, stay away. It will crash and there is no fix!

  • @w-games5674
    @w-games5674 4 months ago +1

    Bullshitter

  • @domepop
    @domepop 6 months ago +1

    You mentioned the a750 price and performance 50 times in this video but never provided any evidence

  • @curtisoncuffy8114
    @curtisoncuffy8114 6 months ago +4

    Watch, your, tone, Boy

    • @ProceuTech
      @ProceuTech  6 months ago +4

      Unfortunately that’s what this comment section seems to want 😭

  • @w0lfnutz115
    @w0lfnutz115 5 months ago +2

    Too many graphs, too little gameplay footage

  • @LinuxCity79
    @LinuxCity79 6 months ago +1

    I have an A750 and the fact that it doesn't run DOSBox is a huge turn-off. There is no reason in 2024 why you cannot run DOS games. Intel made a huge blunder there. Welp, back to my 5700 XT till I upgrade to something else.

  • @mrcrunch8000
    @mrcrunch8000 6 months ago +1

    What is with these benchmark settings? It basically makes them completely useless and not comparable to any other trusted benchmarks.

  • @user-wq9mw2xz3j
    @user-wq9mw2xz3j 6 months ago +2

    boycott Intel

  • @ko9wdmhnc
    @ko9wdmhnc 4 months ago +1

    If you want AV1 encoding, this card is king for the price. It definitely has a niche.