Intel Arc Goes Where NVIDIA Won't: A580 GPU Benchmarks & Review vs. A750, RX 6600, & More

  • Published: 3 Aug 2024
  • Sponsor: Thermaltake SWAFAN EX 120 3-Pack on Amazon geni.us/ardZR
    This benchmark & review of the Intel Arc A580 GPU (using the Sparkle A580 Orc) tests the card vs. the Intel A750, AMD RX 6600, and several other more expensive cards, like the AMD RX 7600 and NVIDIA RTX 4060. These GPU benchmarks introduce sub-$200 cards to the charts, bringing a much-needed breath of fresh air to the world of video cards & PC building. Interestingly, we noticed that several A750s dropped to around $190 to $200 alongside the A580 launch at $180, complicating things further. AMD's RX 6600 is similarly priced, likely making for the closest alternative. These A580 tests explore whether the card is worth it and comment on the Intel driver situation for Arc GPUs.
    The best way to support our work is through our store: store.gamersnexus.net/
    Like our content? Please consider becoming our Patron to support us: / gamersnexus
    RELATED PRODUCTS [Affiliate Links]
    Sparkle Intel Arc A750 Orc OC on Amazon: geni.us/NAfOfq
    Sparkle Intel Arc A580 Orc OC on Amazon: geni.us/YwSM
    XFX RX 6600 SWFT 210 on Amazon: geni.us/MAC32Z
    ASUS RTX 4060 on Amazon: geni.us/LzJb1dg
    XFX RX 7600 on Amazon: geni.us/cvZ2dp
    TIMESTAMPS
    00:00 - Intel Arc A580 GPU Review
    02:50 - Intel A580 Overview, Specs & Pricing
    05:48 - Baldur’s Gate 3 GPU Benchmarks (1080p)
    06:48 - Baldur’s Gate 3 (1440p)
    07:34 - Starfield 1080p GPU Benchmarks
    08:10 - Total Warhammer 3 1080p Benchmarks
    09:21 - Total Warhammer 3 1440p A580 Benchmark
    10:06 - Tomb Raider 1080p GPU Comparison
    11:05 - Tomb Raider 1440p A750 vs. A580
    11:56 - FFXIV 1080p & 1440p Best GPUs
    13:26 - Horizon Zero Dawn 1440p Benchmarks
    13:58 - F1 2022 1080p & 1440p Framerate
    14:38 - Ray Tracing - Tomb Raider - 1080p & 1440p
    15:26 - Ray Tracing - F1 2022 - 1080p
    15:48 - Sparkle A580 Orc Thermals
    16:46 - Load Power Consumption (Intel A580)
    18:00 - Idle Power Consumption (Intel A580)
    18:54 - Conclusion (A580 vs. RX 6600, A750)
    ** Please like, comment, and subscribe for more! **
    Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
    Follow us in these locations for more gaming and hardware updates:
    t: / gamersnexus
    f: / gamersnexus
    w: www.gamersnexus.net/
  • Games

Comments • 1K

  • @GamersNexus
    @GamersNexus  9 months ago +144

    FOR THE REST OF THIS WEEK: We are still giving 10% of our store sales revenue to Cat Angels, the local cat shelter charity we like to support. If you want a GN item, now's a great time to help us AND them! store.gamersnexus.net/
    Find our Baldur's Gate GPU benchmarks here! ruclips.net/video/AAjQXdwkgEE/видео.html

    • @Dom_Mason
      @Dom_Mason 9 months ago +1

      I like having another player in the GPU market. Keeps competition on their toes and prices down. The A580 is not too bad of a GPU for some entry-level gaming, or basic desktop 4K stuff. Does anyone know if the Linux / Debian kernel supports this line of Intel GPUs? Open source like AMD?

    • @gintozlato1880
      @gintozlato1880 9 months ago

      Do you guys ship to the UK?

    • @BattleChicken-ij2qs
      @BattleChicken-ij2qs 9 months ago +3

      Yo! I have an A770 Arc 16GB and it plays Starfield NP. wtf?

    • @Winnetou17
      @Winnetou17 9 months ago

      Why isn't there a single mention of the RX 6500? I know it has problems, and it's most likely not very relevant anyway, quite a different class and pricing (I think, haven't checked). But at least a mention of why it's not included would've been nice. You know, if it's not literally the bottom of the barrel, show what's below too, even if it's 40% lower.

    • @mikkodoria4778
      @mikkodoria4778 9 months ago

      Please keep benchmarking Total Warhammer 3, just ignore the new Total War game

  • @user-ku7qj9of4c
    @user-ku7qj9of4c 9 months ago +1387

    Seeing a $180 entry level GPU come equipped with dual 8 pins for power is just absolutely wild to me.

    • @ulasht1
      @ulasht1 9 months ago +161

      Intel's still working on the overclocking software, and every other driver update is a firmware update. Honestly, I feel like Intel has a rare opportunity that Nvidia doesn't: they could use the on-chip graphics processing of some of their CPUs to help with loads to the GPU, in a weird pseudo-3DFX version of an SLI setup, as 11th, 12th, 13th, even 14th Gen all have similar architecture to Arc for their integrated graphics.

    • @unnoticedhero1
      @unnoticedhero1 9 months ago +145

      Good, AMD and Nvidia need to have competition in this tier of the GPU market because they often severely limit GPU performance in this price class. Most PC gamers have GPUs in the $150-300 price range and there's not much improvement from the last couple generations of AMD/Nvidia in that range. I'm hoping Intel's 2nd-gen Battlemage cards really shake up the low-mid end market.

    • @infernaldaedra
      @infernaldaedra 9 months ago +76

      @@unnoticedhero1 yeah, the low-end GPU market disappeared with the mining craze

    • @dylanherron3963
      @dylanherron3963 9 months ago +45

      @@unnoticedhero1 Good take, I've been VERY comfortable tailing 3-ish years behind the curve and allowing marketing trends to die and liquidation sales to happen ;) My card 5 years ago was a 230 dollar 1660ti, and my upgrade last year was a 6650XT on Cyber Monday for $240. I'm off the GPU market for the next 3-4 years with that lol, considering I'm a 1080p 144hz gamer anyway.

    • @hondajacka2
      @hondajacka2 9 months ago +8

      yea. but performance is ~2060.

  • @CNote929
    @CNote929 9 months ago +616

    The fact that the 1080 Ti is still competitive on these charts is absolutely insane

    • @rundown132
      @rundown132 9 months ago +71

      goated GPU

    • @YetiMusicCity
      @YetiMusicCity 9 months ago +32

      @@rundown132 I have the 11 GB one, still excellent 👍

    • @WCGwkf
      @WCGwkf 9 months ago +38

      Used prices make a 1080ti probably the best value for a budget build

    • @4Wilko
      @4Wilko 9 months ago +27

      I love my 1080 Ti. It's so good I'm always considering grabbing a spare one as a backup just in case. If I could just jam more VRAM into it I'd be more than set waiting for bad GPU trends to taper off.

    • @MrGr8productions
      @MrGr8productions 9 months ago +65

      Nvidia will never make the mistakes that were the 1000 series cards again :/ true power house my 1060 and 1080

  • @wildfire5181
    @wildfire5181 9 months ago +528

    Seeing the GTX 1080 TI still going strong all these years later brings a tear to the eyes.

    • @m.streicher8286
      @m.streicher8286 9 months ago +89

      The last great Nvidia card. They have never, and will never, match it.

    • @WCGwkf
      @WCGwkf 9 months ago

      @@m.streicher8286 The 3080 is the card you're looking for: same launch price, huge generational gains. Best used price-to-performance values are a 1080 and 3080

    • @MegaManNeo
      @MegaManNeo 9 months ago +31

      Pascal remains one of the best generations across all vendors.

    • @damasterpiece08
      @damasterpiece08 9 months ago +36

      it's being gimped to oblivion unfortunately, in some games it's running much worse than it should

    • @panathaninf
      @panathaninf 9 months ago +4

      My card: the boss

  • @CZPC
    @CZPC 9 months ago +859

    I always knew Intel could do it. They have the resources to be that 3rd player the gpu industry needs to remind the other two they actually have to compete with each other.

    • @MistahGamah
      @MistahGamah 9 months ago +65

      @@WhiteG60 I'm cautiously optimistic. Gelsinger has shown he's perfectly willing to cut or divest large Intel projects (Mobileye, Altera, Optane...), so the fact that Arc has lived this long is, I think, a positive signal. Plus he's on the record saying (paraphrasing) he wants Intel GPUs to be one of his lasting legacies.
      Mostly just worried that the order comes from someone above him...

    • @visitante-pc5zc
      @visitante-pc5zc 9 months ago

      @MistahGamah yeah. The pressure for quick results

    • @youreright7534
      @youreright7534 9 months ago +14

      They aren't doing it. They're making $200 cards... they still can't even keep up with amd in terms of making mid range cards and amd can't keep up with nvidia in terms of high end cards. It'll be 5+ years before Intel has a good high end card, if even.

    • @thefreewayoctopus
      @thefreewayoctopus 9 months ago +41

      @@youreright7534 The fact that their first GPU series (which was significantly delayed) is even relevant is impressive; they're having to catch up to years of advances by AMD and Nvidia all at once

    • @tuckerhiggins4336
      @tuckerhiggins4336 9 months ago +1

      They straight up don't have the resources

  • @Sonic6293
    @Sonic6293 9 months ago +489

    It's been interesting to see Intel's progress in the dGPU space, I'm optimistic for Battlemage. I hope they can crush it in the low end space, I'd love to see what they can do with 75W of headroom(especially if they can do something to outshine low profile single slot options like the RX 6400/GTX 1650).

    • @GamersNexus
      @GamersNexus  9 months ago +253

      Battlemage is where Intel will really need to prove itself. At that point, they'll have learned all the necessary lessons on the driver side -- it'd be good to keep NVIDIA (in particular) in check, although AMD could use the check too.

    • @quittessa1409
      @quittessa1409 9 months ago +27

      Here's hoping. I'm running an A380 as an AV1 encoder and quite happy with its performance there, but have a 2nd-hand 3090 as my gaming card @@GamersNexus

    • @ResidentWeevil2077
      @ResidentWeevil2077 9 months ago +19

      I switched to Team Red with a 7800 XT in an all-AMD build, but I still have interest in Intel Arc - in fact I rebuilt my Intel-based rig with the A770 16GB card I have, and I'm keeping it as a lower-powered rig.

    • @Wifelover_
      @Wifelover_ 9 months ago +8

      There already is a 75-watt low-profile 2-slot card that far outshines those: the RTX 4000 SFF Ada. Even its previous gen, the RTX A2000.
      However, as workstation GPUs their cost is often not worth the performance.

    • @visitante-pc5zc
      @visitante-pc5zc 9 months ago +17

      @Wifelover_ I'll never buy anything from ngreedia ever

  • @Majima_Nowhere
    @Majima_Nowhere 9 months ago +33

    The pricing is mindblowing. In a good way. A year ago, you couldn't find a 1650 for less than 200 bucks. I bought an A770 for a build a while ago and was impressed with the price/performance of that one too. Intel is absolutely on the right path here.

    • @DuBstep115
      @DuBstep115 9 months ago +2

      And if that price drops to $150 as Steve said, that's best in class for $/fps

    • @davidrenton
      @davidrenton 8 months ago +1

      I bought a computer 3 years ago with a 1650, quite decent; the whole thing cost 330

  • @toufusoup
    @toufusoup 9 months ago +279

    Props to Intel for keeping at this and not just abandoning it like Microsoft with their Zune. It’s really impressive what they’ve been able to do, not to mention good workstation performance!

    • @DoNotFitInACivic
      @DoNotFitInACivic 9 months ago +27

      I hope they keep at it, as Intel has a history of getting pissy if they aren't the top.

    • @sahara8771
      @sahara8771 9 months ago +25

      Zune did get a second gen, it was more a matter of timing... and the fact that gen 1 was literally just a rebrand of a Toshiba mp3 player. Anyway, the iphone came out shortly after and within a few years mp3 players were a mostly niche product because smartphones existed.
      Not entirely comparable on how much investment they'd be throwing away if they stopped supporting it, and it's also not a market that's just going to completely evaporate in 3 years.

    • @BTechUnited
      @BTechUnited 9 months ago +4

      Unironically miss the zune software, thanks for reminding me of that.

    • @vn_loc7316
      @vn_loc7316 9 months ago +18

      @@Failure_Is_An_Option Zune was as good a music player as the iPod. Problem was the lack of apps. Same with Windows Phone; both had great UIs.

    • @hashbrown777
      @hashbrown777 9 months ago +5

      Yeah my Win10 phone was amazing. But no apps and MS gave up :( they tried so hard too with making those android->uwp apis

  • @RepsUp100
    @RepsUp100 9 months ago +11

    Elf, Orc, Titan... I love Sparkle's Arc GPU names

    • @liquidcorundum6568
      @liquidcorundum6568 2 months ago

      They go well with Intel's fantasy-inspired codenames.

  • @WildkatPhoto
    @WildkatPhoto 9 months ago +222

    I'm just happy someone is making GPUs that are normal-sized. It's amazing to see a 1-ish slot card.

    • @Derkiboi
      @Derkiboi 9 months ago +10

      Ok, I'm not the only one thinking about this

    • @legendaryhero90
      @legendaryhero90 9 months ago +30

      Gpu that don't take up half my pc case!? What an outrage! /s

    • @mechanicalmonk2020
      @mechanicalmonk2020 9 months ago +23

      It's not the GPUs that are big. It's the cooling. Weaker cards need less cooling hence the compactness

    • @visitante-pc5zc
      @visitante-pc5zc 9 months ago +37

      And game developers need to relearn to code properly.
      At this rate we'll need a datacenter to run the next Bethesda game

    • @MistyKathrine
      @MistyKathrine 9 months ago +3

      And less than $200!

  • @c99kfm
    @c99kfm 9 months ago +23

    I bought the RX 6600 on release as an emergency stop-gap, and I'm increasingly aware I lucked into an extremely good purchase.

    • @SIPEROTH
      @SIPEROTH 9 months ago +2

      On release I'm not so sure, because it was like $330. I guess you still got your money's worth since you can still use it just fine, and that is a good few years for a $330 card, but I bought it at the start of this year just because it was crazy cheap at $200.
      And I have to say I don't regret it. It plays everything at 1080p, which is what my screen is. For only $200 it is more than capable.

    • @c99kfm
      @c99kfm 9 months ago +1

      @@SIPEROTH Thing is, this was during the Covid price insanity. I was just about to bite the bullet on a $500+ 6600 XT, when the 6600 launched and I managed to pick one up for $299.

    • @c99kfm
      @c99kfm 9 months ago +2

      Or maybe that should be "crypto price insanity". Well, the consumer-side price bubble which preceded (and arguably led to) the current-gen vendor-side price inflation.

    • @marlon.8051
      @marlon.8051 9 months ago

      @@SIPEROTH I got a 6600 for $150 off AliExpress and it ran anything I wanted, and the power consumption of the card is amazing; definitely a big upgrade from a GT 1030 for me

  • @ericwright8592
    @ericwright8592 9 months ago +36

    The real story here is the GTX 1080ti solidly hanging on in most of the charts.

  • @AlexSchendel
    @AlexSchendel 9 months ago +71

    It's wild that there's a $190 A750 right now... Over a 33% price drop since launch, and it was already decent value back then! Between the massive driver improvements and the 33% drop in price, it's become wayyy better

    • @User9681e
      @User9681e 9 months ago +4

      If people keep not purchasing the card it will turn free lol
      Just needs some guy to unlock the disabled units on the chips and we have decent ray tracing for nothing

    • @nathangamble125
      @nathangamble125 9 months ago

      It's not wild at all. It's completely normal for GPUs to get cheaper a few years after launch. You can buy RX 6600s for less.

    • @HunterTracks
      @HunterTracks 9 months ago +4

      @@nathangamble125 The RX 6600 has been on the market for twice as long, and its MSRP isn't that much higher ($330 vs $290).
      Things aren't really going great for Arc cards at the moment.

    • @lordhellriser1
      @lordhellriser1 9 months ago

      43 watts of idle power draw; I wouldn't touch that card with a stick 😂, nobody gives me electricity for free

    • @SIPEROTH
      @SIPEROTH 9 months ago

      @@lordhellriser1 Yeah that is an issue that Intel really needs to just sit down and solve as fast as possible.

  • @geraldh.8047
    @geraldh.8047 9 months ago +7

    Thank you for reporting on idle power. Please include it in other reviews as well! 👍

  • @nebraskarooster9244
    @nebraskarooster9244 9 months ago

    Great video - thanks as always for putting this together and breaking things down like this!

  • @A-Monkman
    @A-Monkman 9 months ago +120

    Still blows my mind that the 1080 ti is still on the charts, what a great card.

    • @NoGoodNoob
      @NoGoodNoob 9 months ago +24

      That also shows how lacking the improvements of the newer Gen cards have been. I guess it's good to look at the charts with optimism though 😂

    • @Niiwastaken
      @Niiwastaken 9 months ago +8

      Tbh the 1080 ti was a top dog when it came out so im not really surprised.

    • @fayis4everlove
      @fayis4everlove 9 months ago +3

      It's a $399 card with 11gb .. come on..

    • @m.streicher8286
      @m.streicher8286 9 months ago +6

      The 1080 ti isn't a very good card, it's just that progress has stagnated since 2016.

    • @NothingXemnas
      @NothingXemnas 9 months ago

      ​@@m.streicher8286 Ok, let's not be contrarians, here. All in all, even older 8GB AMD cards can still be used for gaming at 1080p, as demonstrated well by Optimum Tech, and the 1080ti is above even that.
      In fact, if such old hardware tech still works, I'd not argue these are the ones stagnating; it is the software that isn't getting more demanding.

  • @MayaPosch
    @MayaPosch 9 months ago +6

    Happy to see idle power usage. Since most of the time the system will be sitting at the desktop, being used for browsing and watching videos, it's really useful to know which components are the most efficient during those periods.
    As I'm trying to put together a new rig that doesn't idle at ~100 Watt like my 2015 Skylake 6700K, GTX 980 Ti one, this is of great interest to me. Sadly, it's also the kind of metric that seems surprisingly hard to find, somehow.

  • @aaronriggs4430
    @aaronriggs4430 9 months ago +61

    1080ti officially made it all the way through the crypto craze, GPU shortage, and pandemic. Fkn legendary.

    • @Rspsand07
      @Rspsand07 9 months ago +12

      Two gpu shortages and mining crazes. There was one in 2018 as well

    • @aaronriggs4430
      @aaronriggs4430 9 months ago +9

      @@Rspsand07 in my eyes it was one long period with inflated prices :(. I just wanted a 1060 for msrp forever.

  • @codyjohnson9321
    @codyjohnson9321 8 months ago +2

    Picked up an A580 on Black Friday from Newegg via TikTok for $109! Should arrive today. Will be great for my son's PC for Christmas. Thanks for this informative video.

    • @mikehank2896
      @mikehank2896 7 months ago

      How is the A580 GPU? I'm considering it.

  • @Tschacki_Quacki
    @Tschacki_Quacki 9 months ago +2

    Sparkle back with GPUs????
    What a time to be alive.

  • @Blaquegold
    @Blaquegold 9 months ago +8

    Remember how AMD cards used to be? Look at them now. I think Intel will eventually get there too. We need this competition to keep both Nvidia and AMD in line.

  • @Zapdos0145
    @Zapdos0145 9 months ago +58

    let’s go, some nice intel GPUs. crazy surprise drop

    • @visitante-pc5zc
      @visitante-pc5zc 9 months ago +9

      You should buy one to support intel, bro. We need a 3rd guy competing. Jensen and his leather-jacket needs to be humbled.
      So tomorrow you go to a PC store and fetch one

    • @Zapdos0145
      @Zapdos0145 9 months ago +10

      @@visitante-pc5zc i own an A750 for my secondary gaming rig, got it day 1. pleasantly surprised. however as steve says a lot i don’t think it’s up to us to support the multi billion dollar company that is intel.

    • @russellg1473
      @russellg1473 9 months ago

      @@visitante-pc5zc Yes, because Intel has proven to never get their hands dirty with the type of manipulation Nvidia and AMD have been pulling recently… /s

  • @saucyg6371
    @saucyg6371 9 months ago +1

    You guys' efficiency is insane. A 24-hour turnaround for this review is just mind-boggling to me.

  • @jimbodee4043
    @jimbodee4043 9 months ago

    Good job on the quick 24-hour turnaround on the review after the delay in delivery.

  • @DustyCruz
    @DustyCruz 9 months ago +18

    I'm using the A770 16GB LE. And man, I gotta say this card rips through everything at 1080p. However, like what has been said in numerous videos (including this one), do not get these cards if your motherboard doesn't support ReBAR or if you're playing older games. DX8 in particular is hit or miss and some OpenGL games have weird graphical bugs.

    • @coatlessali
      @coatlessali 9 months ago +1

      Have you tried DXVK for DX9 titles? Intel seems to have their Vulkan driver in order and it might provide a boost.

  • @LuigiGodzillaGirl
    @LuigiGodzillaGirl 9 months ago +41

    I applaud Intel about as much as I can when it comes to big tech mega corps. Arc has come a long way since launch, reaching a point where I almost feel comfortable recommending it to friends. I hope they continue the good work.

  • @jgorres
    @jgorres 9 months ago

    I'm extremely happy your review unit got lost in the mail so you could test with launch day drivers! I was kinda pissed at other reviewers that tested with the "wrong" drivers just to get their videos out on launch day, even though they knew a significant driver update was right around the corner.

  • @zachknell8125
    @zachknell8125 9 months ago +2

    I like the look of that card. That blue-ish color is pretty nice!

  • @WRXnumberSeven
    @WRXnumberSeven 9 months ago +3

    Having more competition is only good news for us consumers - excellent!

  • @DKTD23
    @DKTD23 9 months ago +16

    The A750 hit $199 again for Prime Day. It's still the better buy for sure. Great to see the continued strides!

    • @snowhawk04
      @snowhawk04 9 months ago

      Newegg has had all the A750 models for 199 and the Sparkle A750 for 189.

    • @tortellinifettuccine
      @tortellinifettuccine 9 months ago

      It's not a better buy over literally anything, a gtx 1070 would literally blast it out of the water for half the price used

    • @eli72481
      @eli72481 9 months ago +1

      @@tortellinifettuccine where is a 1070 blasting an a750 out the water

    • @tortellinifettuccine
      @tortellinifettuccine 9 months ago +1

      @eli72481 In every possible metric. Again, you wanna be a corpo rat, go ahead; let them test outdated tech on you for the price of new tech. A GTX 970 could literally blow it out of the water. I don't have anything to prove, I'm not sponsored by Intel... now these creators... that's another story.

    • @leolandi3852
      @leolandi3852 9 months ago

      A used 1070? A 6-year-old card that went through the mining craze? You're really feeling lucky if you're willing to flip that coin. @@tortellinifettuccine

  • @andreaslegomovies
    @andreaslegomovies 9 months ago +2

    Thanks for including the idle power, much appreciated :)
    I'd really like to see proper power efficiency rather than just max power draw. I.e. a measure like W/fps (=J/frame).
    Or energy per unit of compute, total energy (J or kWh) consumed during a well defined task, e.g. a blender rendering task. In this case also reporting the duration of the task would be nice.
    (Phoronix includes this type of results once in a while, but mostly for CPUs)
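    The W/fps request above works out to joules per frame, since watts are joules per second and fps is frames per second. A minimal sketch of the arithmetic, using hypothetical numbers rather than measured values:

    ```python
    # W/fps = J/frame: power is joules per second, fps is frames per second,
    # so dividing cancels the seconds and leaves joules spent per frame.
    def joules_per_frame(power_w: float, fps: float) -> float:
        return power_w / fps

    # Hypothetical example (not measured data): a 185 W card at 100 fps spends
    # 1.85 J per frame, while a 130 W card at 90 fps spends ~1.44 J per frame,
    # so the slower card is the more efficient one per frame rendered.
    print(joules_per_frame(185.0, 100.0))  # 1.85
    print(joules_per_frame(130.0, 90.0))
    ```

    The same division applied over a whole fixed task (total kWh consumed during a render, plus the duration) gives the energy-per-task figure the comment asks for.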

  • @vincentvanrijn7469
    @vincentvanrijn7469 9 months ago +2

    Those blue PCI-E 8 pin connectors look fire 0:40

  • @Kapitaen_Flauschbart
    @Kapitaen_Flauschbart 9 months ago +4

    I hope Intel really gets their Battlemage lineup goin, I see much potential!

  • @QuantumConundrum
    @QuantumConundrum 9 months ago +3

    This is great. I don't expect battlemage to have all the issues sorted out from the start, but I am hopeful of what Intel brings in the future. I just think that software/driver maturity takes way longer and some more long term patience will be required.

  • @lexzbuddy
    @lexzbuddy 9 months ago

    Great review, thanks

  • @SurelyYewJest
    @SurelyYewJest 9 days ago

    NICE. Bringing out the ol' 90s cleanroom sparkle crew.

  • @bob_ohms2low75
    @bob_ohms2low75 9 months ago +4

    Extremely interested to see how Battlemage goes! I've got a 3080 Ti in my main system, but would love to build another all-Intel build.

  • @2K8Si
    @2K8Si 9 months ago +13

    The 1080ti... Wow... Still belting out some good numbers in 2023...😁

  • @totallynotmyeggalt6216
    @totallynotmyeggalt6216 9 months ago

    Interesting to see the 7700XT/7800XT Stock vs. OC data in the power draw graphs...can't wait for that video!

  • @sashacrossi
    @sashacrossi 9 months ago +1

    Honestly, gotta say I love that you include games like FFXIV in these reviews! Before I bought my RX 7600 I watched your video about it, and I didn't expect it, but it helped me immensely in choosing the card. I couldn't be happier with it! Got it reduced, and raiding is a whole new experience compared to the 17-40 fps I had before! Thanks ♥

  • @JoJoJenkns
    @JoJoJenkns 9 months ago +3

    I'm sure a lot of people worldwide watch this who may need the idle power consumption number, as in some places that idle draw may make a difference in a home lab or for a person starting out. It's also great we can get all the numbers from a reputable 3rd party. Thank you, Gamers Nexus team, for the work you guys do!

  • @EnvAdam
    @EnvAdam 9 months ago +6

    I can't wait for next-gen Arc, although I don't plan on replacing my GPU for the next 5 maybe more years, mainly for a couple reasons: 1. hopefully they fix the idle power usage, which I think is down to the memory bus and related parts never clocking down unless the card is literally off; 2. hopefully we'll see less of Starfield, although it runs fine on my A770, but at around 50ish FPS regardless of graphics settings, and the most recent driver did fix the light flickering issue.
    Anyway, as said, happy with my A770 16GB LE; had it since last year around this time.

  • @Frendh
    @Frendh 9 months ago

    Idle power consumption is very interesting to me. Thanks for including.

  • @KerbalLauncher
    @KerbalLauncher 9 months ago +2

    For the idle power draw issue, you need to turn on ASPM in the motherboard bios so that the OS can control the low power states on the GPU.
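    On Linux, the ASPM state the comment describes can be inspected through sysfs; a minimal sketch, assuming a kernel built with `pcie_aspm` support (the firmware/BIOS toggle still has to be enabled for the deeper states to be reachable):

    ```shell
    # Show which ASPM policy the kernel is currently using
    # (the active policy is printed in [brackets]).
    cat /sys/module/pcie_aspm/parameters/policy

    # Request the most aggressive link power states; needs root, and only
    # takes effect if the platform firmware has ASPM enabled at all.
    echo powersupersave | sudo tee /sys/module/pcie_aspm/parameters/policy
    ```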

  • @pyroslev
    @pyroslev 9 months ago +55

    Intel is still trying and playing the game. They're putting in the work. Their cards are still punching up in ways that are surprising. If this does drop to that $160 range with a good driver update, AMD better pay attention to the budget segment.

    • @andersjjensen
      @andersjjensen 9 months ago +5

      I hate to say this, but given the ever-increasing wafer prices (and the A580/A750/A770 use TWICE as much N6 as the RX 7600) due to the decade delay on EUV lithography, Intel is just trying to break even at these prices.

    • @martinkrauser4029
      @martinkrauser4029 9 months ago +10

      @@andersjjensen that's an awfully confident statement for an outsider to make about a firm's accounting

    • @Fullduplex-hp1uj
      @Fullduplex-hp1uj 9 months ago +2

      @@martinkrauser4029 It's not. The same economics apply to these GPUs as to AMD's or Nvidia's. The GPUs are manufactured by an external party, and since Intel needs a bigger die and bus width for the same performance, their costs will be higher. AMD's low end is already competing with Nvidia's marketing through pricing, so it does not bode well for Intel that they have to manufacture a mid-range-spec card (die size, bus width, VRM, etc.) that can only compete with the low end.

    • @andersjjensen
      @andersjjensen 9 months ago

      @@martinkrauser4029 "to make on a firm's accounting for an outsider"
      Uhh, well: publicly traded companies must publish quarterly public financial statements. And they must be accurate. AMD lumps dGPUs and console chips into the same category, labelled "Gaming products". Last quarter they reported a 24% gross margin. Observe that gross margins and "real" margins are very different. Gross margin means "if we sell a chip for $100 then, when we're done paying TSMC, our Malaysian packaging plant, and for shipping to our partners, we have $24 left". That is, gross margin does not cover R&D, foundry bring-up costs, driver development, technical support for board partners, etc., etc. For reference, AMD has a company-wide gross margin of 46% (driven up by much higher margins in datacenter). Their gaming segment is, by far, their worst-earning segment in terms of margin.
      Add to this: Intel's graphics use not just almost twice the chip real estate for the same performance, but also more power (A380 is 406mm2/175W vs RX 6600 at 237mm2/132W), which means that Intel cards need a ~33% beefier VRM, which obviously means more base cost for a final board. Which again means that Intel sees less money in hand when a partner sells a board for $180 than the performance equivalent from AMD brings at $210. When you're at a triple disadvantage like that (must be cheaper because of brand stigma, uses more chip area, and needs more support components) in a market where the competition is already absolutely fierce, then you WILL make shit margins. If any at all.
      AMD has had periods where their gross gaming margin was 9-12% for several quarters straight despite having comparatively (to Intel) competitive products, and they have run into the red several times due to minor snafus like product launch delays.
      TL;DR: Just because you don't personally look into the financial statements of publicly traded companies doesn't mean others can't gauge the situation to a reasonable degree of accuracy. If that weren't possible, then investing in stocks would pretty much be pure gambling.

  • @habitualoffender4957
    @habitualoffender4957 9 months ago +2

    So glad to see them doing better. I know they still have some growing to do, but looking at where they were, they're doing great. Won't be long before NVIDIA and AMD have to do a lot more.

  • @Kerosyn
    @Kerosyn 9 months ago +2

    this was always by far the most interesting arc card to me since the initial announcement, so it's nice to see that it finally exists and pretty much lines up with what I expected, even if the power efficiency is hilariously bad. when these drop to $150 or lower (which they really should be considering the fact that the a750 can be priced nearly the same), then a used 5700XT might finally get dethroned on the value front. I want one

  • @sixteenornumber
    @sixteenornumber 9 months ago +2

    The new Starfield drivers are fantastic. Just a couple weeks ago I had trouble playing at 1080p; now even 4K is playable on an A750

  • @tj3495
    @tj3495 9 months ago +5

    Paul's Hardware's review pointed out a huge drop in performance from simply opening the in-game menu in Starfield, which most of the time did not recover unless you reloaded the game. Did you encounter this as well? Others have mentioned it with different hardware configurations too.

    • @kravenfoxbodies2479
      @kravenfoxbodies2479 9 months ago +6

      I am not going to hold Starfield against Arc; it also ran like trash on my RTX 3070 / RX 6700 10GB / RX 6600, and DLSS/XeSS still hasn't been added via patch.

  • @RebelSapph
    @RebelSapph 9 months ago +9

    I got an A770 after my 3070 died. I needed something quick and didn't have a ton of money, so I decided to give Intel a chance; I like weird and not super popular hardware after all. And honestly? Not disappointed. I've been with it for about a month now, and the only issues I had were with Starfield; everything else is fine. I'm really looking forward to Battlemage. As you said in another comment, it's where Intel REALLY needs to prove themselves, and I hope they can pull it off; we need someone to keep Nvidia in check ASAP.

  • @martini668
    @martini668 9 months ago

    Great to see another company in the mix

  • @Mike-wo2dk
    @Mike-wo2dk 9 months ago +1

    Please review touch screen monitors!
    It's amazing seeing a feature that was pushed so hard in the early 2010s being almost completely dropped. There are few to no genuine reviews out there now. An objective analysis would be amazing.

  • @lurkii_721
    @lurkii_721 9 months ago +4

    Would be funny to see an A580 vs. RX 580 vs. GTX 580 test.

  • @endurofurry
    @endurofurry 9 months ago +4

    The biggest thing to me, and the reason I got one, is that it's an extremely cheap way to get AV1 encoding, which is why I put it in my media server. If you have a media server, I HIGHLY recommend it for that AV1 support.

    • @killingtimeitself
      @killingtimeitself 9 months ago

      Depending on the codecs you use and the hardware you prefer, Intel's modern QSV support is really good. And the recent CPUs are power efficient as well. Overall, 12th gen is a great homelab platform for those looking for a simple media server build.
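For reference, Arc's hardware AV1 encoding is exposed through FFmpeg's Quick Sync encoder (`av1_qsv`). A minimal sketch of the invocation, assuming an FFmpeg build with QSV support and hypothetical file names and bitrate:

```python
# Sketch: assemble an FFmpeg command for hardware AV1 encoding on Arc
# via Quick Sync (av1_qsv). File names and bitrate here are hypothetical;
# requires an FFmpeg build with QSV (libvpl/libmfx) support.
import shlex

def av1_qsv_cmd(src: str, dst: str, bitrate: str = "4M") -> list[str]:
    return [
        "ffmpeg",
        "-i", src,          # input file
        "-c:v", "av1_qsv",  # Intel QSV AV1 hardware encoder
        "-b:v", bitrate,    # target video bitrate
        "-c:a", "copy",     # pass audio through untouched
        dst,
    ]

print(shlex.join(av1_qsv_cmd("input.mkv", "output.mkv")))
```

The list form can be handed straight to `subprocess.run` on a machine that actually has the hardware and FFmpeg build.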

  • @sworddice
    @sworddice 8 months ago

    Seeing the 1080 Ti not at the bottom surprised me. The old man still packs a punch!

  • @iwsfg
    @iwsfg 9 months ago

    It looks cool in blue with those two white stripes going across it.

  • @gio2vanni86
    @gio2vanni86 9 months ago +38

    It's crazy to me that the 1080 Ti is still pulling its weight 7 years later.

    • @pirobot668beta
      @pirobot668beta 9 months ago +1

      The design of the ten-speed bike is over 100 years old, but it just keeps on going.
      A good design that fills a need will never go out of style.

    • @moonasha
      @moonasha 8 months ago

      I have a 1070 (about to replace it with a 4060) and it runs pretty much every game to this day perfectly fine. Games honestly just haven't increased in computing demand that much in the past 10 years. Alien: Isolation is going on 10 years old and still looks utterly amazing. People say cards are stagnating a lot, but honestly, computer graphics are kind of stagnating. The only real big leap we've had is ray tracing, and it was such a big leap that Nvidia had to put AI in the cards to make it even playable.

  • @Slane583
    @Slane583 9 months ago +3

    I bought an Arc A770 back in July to play around with, and so far it has been very promising. It wasn't really a replacement for my RX 5700 XT per se, more of wanting to try something different. I don't follow the newest AAA games as none of them really pique my interest, Starfield falling into that category. But everything that I play runs quite well on the A770; the Black Mesa remake of Half-Life runs at high fps. My most played game, Sniper Elite 5, has the settings maxed out in quality and the internal resolution scaler set to 140% for more visual fidelity at 1080p. It runs like melted butter. :)

  • @jasonyoung3070
    @jasonyoung3070 7 months ago

    Really happy to see a third brand choice in graphics cards. Hopefully they can keep it up and sell enough to keep their GPU development going.

  • @Kules3
    @Kules3 9 months ago

    Thanks for the video!

  • @flamingscar5263
    @flamingscar5263 9 months ago +40

    This GPU proves Intel is in it for the long haul. No one expected this card to come out; we all assumed it was canceled, but Intel still kept their word and released it, completing their trifecta of 3, 5, 7 branding:
    A380
    A580
    A750
    A770
    I can only hope a couple generations down the line we see a 9 series to compete with at least the top end, probably not enthusiast level (90 series), but it would be nice to have more competent competition with Nvidia at the 80 class range. AMD isn't doing badly; I'm even considering a 7900 XTX, but it is limited compared to Nvidia's GPUs of the same class:
    Worse ray tracing, little to no AI cores, so no DLSS.
    Intel has embarrassed AMD in that they nailed ray tracing on the first try. The fact that a $180 GPU can handle ray tracing on Arc at 1080p when AMD's $200 offering barely does is embarrassing; AMD has been in the game long enough that they should have crushed it.

    • @Greenalex89
      @Greenalex89 9 months ago +7

      Ray tracing is too inconsistent, buggy, and experimental for me to care, let alone buy from ngreedia, BUT AMD hasn't been innocent in that case either.
      I usually avoid Intel CPUs because I think they've got to step up their technology and not just the clock speed, but the upcoming Battlemage would make me upgrade if it's a higher-end card for a criminally cheap price. Like a 1080 situation revived..

    • @draxrdax7321
      @draxrdax7321 9 months ago +6

      @@Greenalex89 Not to mention ray tracing on entry-level cards brings all of them to their knees. So unless you want to play games at 30fps with RTX on a $200 card, better give up on using it. For me RTX is just a marketing gimmick. Giving up 50% performance for an improvement you can barely notice, and only in certain situations, is not worth it.

    • @raresmacovei8382
      @raresmacovei8382 9 months ago +1

      There's also the A310, lol.

    • @aerosw1ft
      @aerosw1ft 9 months ago

      @@draxrdax7321 I shared that same "RTX is a marketing gimmick" mindset about a year ago when I purchased my 6700 XT, but seeing how more games are utilizing it and how it can genuinely be a game changer graphics-wise in some games (*cough* Cyberpunk *cough*), and also the fact that Intel is doing a pretty nice job on RT compared to AMD, who's on their 2nd gen of RT-enabled cards, it's really starting to look like the way forward for gaming graphics. Like it or not, AMD is neglecting this feature, and they might be caught with their pants down by Intel if they don't start getting more serious about it.

    • @DKTD23
      @DKTD23 9 months ago

      The 7900 XTX is a great video card. Pretty easy to undervolt as well.

  • @olnnn
    @olnnn 9 months ago +12

    It would be really useful to see a test of these more budget cards combined with a budget CPU as well, to see how driver overhead/scheduling implementation comes into play when the CPU is closer to the limit. Other channels like HUB have demonstrated significant differences between AMD and Nvidia there in the past, and I've seen anecdotal reports of Arc cards being somewhat sensitive to CPU-limited scenarios too.
    While it makes sense to eliminate any CPU bottleneck to evaluate raw performance, it's also very useful to evaluate how the cards perform in a more realistic setup; these cards won't usually be paired with a top-of-the-line CPU.

    • @sijedevos2376
      @sijedevos2376 9 months ago +1

      Yes please. I have an A750 paired with an R5 5600 and I'm seeing like 30-40fps on the same settings he used in Horizon Zero Dawn, with CPU usage at 90% or so. Even lowest settings at 1080p with FSR Ultra Performance gives me like 60-70fps. This was tested with ReBAR on, dual-channel RAM, and the Frozen Wilds DLC. Also, Baldur's Gate 3 suffers really badly inside the city on Arc (even in Vulkan) while a similar Nvidia card doesn't on the same system.

  • @superdbsfan8670
    @superdbsfan8670 9 months ago

    Just received my modmat and I love it; it makes a great surface for teardowns. I'm liquid cooling my PC and didn't have to worry too much about damaging solder pins on the back of the PCBs of my graphics card and motherboard.

  • @djvidual8288
    @djvidual8288 9 months ago

    The blue PCB on the Intel cards looks awesome!

  • @Yudginklg
    @Yudginklg 9 months ago +3

    I use an Arc A750, play almost only WoW, and I like it. For me it was the most reasonable purchase, since I don't play modern games. This card allows me to play with RT if I want, and I like Intel's XeSS technology.

  • @martinzhang5533
    @martinzhang5533 9 months ago +9

    The second gen of Intel cards should be great. Really looking forward to more competition in the GPU space.

  • @solreaver83
    @solreaver83 9 months ago +2

    I'm excited to see where they go, especially in a generation or three. The potential they have starting from scratch means they aren't stuck in a culture of "this is how a card is built" like Nvidia and AMD, who have their preconceptions and the bones of previous cards to build on.

    • @haukionkannel
      @haukionkannel 9 months ago

      Up in price… No profit now, 80% profit in the future.

    • @solreaver83
      @solreaver83 9 months ago

      @@haukionkannel OK....

  • @MrDavidRicketts
    @MrDavidRicketts 9 months ago +1

    I have an A770 for my dedicated racing sim PC setup and it does great.

  • @T0beyeus
    @T0beyeus 9 months ago +10

    I am really hopeful for Intel Arc. I just bought an A380 for my mom's mobile workstation; it was nice to know I could get a GPU for $100. If Intel can get their drivers working smoothly and keep the price-to-performance competitive, I think they will do really well. I have recently converted to Team Red for GPUs just because Nvidia cannot provide price-to-performance, and I would be just as open to Intel for my gaming GPU. Intel overall needs to work on their power usage though; it is one of the things starting to push me to AMD for their CPUs.

    • @DeltaSierra426
      @DeltaSierra426 9 months ago

      AMD's latest APUs about keep up with any GPU around $100. Honestly, I think the only way to get decent value on low-end graphics going forward will be CPUs with good iGPUs. That is, unless one is looking at GPUs of $150 and higher, where iGPUs just aren't at that performance level. I don't think Intel will let themselves remain far behind AMD, either.

  • @zdspider6778
    @zdspider6778 9 months ago +5

    3:39 So Intel can make a 256-bit card for $180, but Nvidia can't.
    The "4070" and "4070 Ti" both have 192-bit buses, when ALL 70-class cards sold in the last DECADE (since the GTX 670) came with 256-bit. Nvidia can get bent.

  • @notthedroidsyourelookingfo4026
    @notthedroidsyourelookingfo4026 9 months ago

    I'm really looking forward to Battlemage (and Celestial). If they keep their prices this low, I'm definitely buying one.

  • @damasterpiece08
    @damasterpiece08 9 months ago +2

    Thank you for having a few old cards on the charts! Great work.

  • @TheyCallMeMrMaybe
    @TheyCallMeMrMaybe 9 months ago +14

    I'm continuing to root for Intel, here. As soon as they release a comparable and stable-enough enthusiast card, I am for sure switching from Nvidia.

    • @ResidentWeevil2077
      @ResidentWeevil2077 9 months ago +2

      The only reason I can see anyone sticking with Nvidia at this point is CUDA for productivity. If you don't care about creativity/productivity or RT, Arc and Radeon are better cards. I have an A770 in an all-Intel low-power rig, and a 7800 XT in an all-AMD rig as my main PC.

  • @matthew_ferguson
    @matthew_ferguson 9 months ago +7

    I found a Gigabyte RX6600 Eagle open box at Micro Center for $118 a few months ago. I even got Starfield for free with it. That card is still a real competitor in the budget world. And now with the FMF beta? It's shocking.

    • @masterluckyluke
      @masterluckyluke 9 months ago +2

      But that's not the normal price for the card; you just got lucky. ;) The Arc card provides modern technology like AV1 encoding and much better RT performance, plus the drivers are getting better and better, and in several games it already outperforms the 6600 easily. If Intel continues the good work, there is a lot to gain with an Arc card. But of course, if you are able to get a used 6600 that cheap, including a modern game, that's a great offer. You can get lucky with every card out there.^^

    • @gabsol7397
      @gabsol7397 9 months ago

      @@masterluckyluke you aren't going to play with RT on those cards tho...

  • @ImpmanPDX
    @ImpmanPDX 9 months ago +1

    I like how stable the card is under load. The variance between the mean and 1% lows seems much tighter on the Arc. It's a shame that it seems to come at the cost of jamming current.

  • @Shiny_Dragonite
    @Shiny_Dragonite 9 months ago +1

    If I hadn't built my most recent PC with the intention of playing a brand new game (Armored Core 6), I probably would have gone for an Arc. One of my friends is building a PC, but he's not the most tech-savvy person, and Intel's finally getting the drivers to a point where I can recommend the cards. This makes me happy, particularly since he asked me about them. Outside of the first gaming PC I built 15 years ago (god, I feel old saying that lol), I've tended to be more on the budget end, and a solid card from Intel that I could trust would be a godsend.

  • @__aceofspades
    @__aceofspades 9 months ago +3

    Bought my girlfriend an A750 for gaming (and for me to use for AV1 encoding); it's been great in 95% of the games we've played, with only a couple of issues. For $190 I'd absolutely recommend it compared to buying the more expensive 7600 and 4060 cards, which it trades blows with from my research. Also, I'd rather support Intel, as AMD and Nvidia have proved they will milk consumers with their duopoly.

  • @shadowr2d2
    @shadowr2d2 9 months ago +5

    Competition is always good for the consumer. I hope that Intel will keep on making their GPUs so you have a choice of Intel, AMD, & NVIDIA. I don't want to sell my body organs just to play a video game 🎮 on my PC.

  • @spankyham9607
    @spankyham9607 9 months ago

    I appreciate the idle power numbers.

  • @HermSezPlayToWin
    @HermSezPlayToWin 9 months ago

    Thanks Steve!

  • @Kikker861
    @Kikker861 9 months ago +7

    Please do encode/decode benchmarks! Gaming is one use, and AI is another, but encode/decode stuff is used everywhere from cloud gaming to video rendering. Intel's video encoder might shine in other applications

    • @stoneymahoney9106
      @stoneymahoney9106 9 months ago

      Wrong channel, not relevant. You're also looking at it from the wrong perspective: hardware performance is less important than the software support as coded by the developers of each individual editing/encoding application, so testing specific applications with different encoders would be a much more useful comparison. Editors should be selecting the hardware according to the workflow, not the other way around, and there are other channels out there that can help do exactly that.

  • @Djeddozo
    @Djeddozo 9 months ago +4

    I wonder if there is, or will be, an advantage similar to AMD's Smart Access Memory if you run both an Intel CPU and GPU. Also, love the GPU color design. Gives me playful older-gen console vibes.

    • @kravenfoxbodies2479
      @kravenfoxbodies2479 9 months ago +1

      With three different platforms (AM4 / AM5 / Intel), there is a chance it could act differently with a CPU that has no iGPU, like a 5700X on B550, vs. a 12700K.

    • @bagasfabianmaulana
      @bagasfabianmaulana 9 months ago

      There's an advantage, but not in gaming performance. If you have an Intel CPU with an iGPU and an Arc GPU, Deep Link is enabled to boost AI and encode/workstation performance. Sadly, Arc really needs ReBAR, so old low-end systems cannot utilize full performance with Arc (unless the buyer knows how to add ReBAR to their UEFI).
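On Linux, one way to confirm Resizable BAR is actually in effect is to look at the GPU's BAR sizes: each line of a device's `/sys/bus/pci/devices/<addr>/resource` file holds hex start/end/flags values, so a BAR's size is just end − start + 1. A sketch with a hypothetical sample line (the device address and values are illustrative):

```python
# Sketch: compute a PCI BAR's size from one line of the Linux sysfs
# "resource" file (format: "start end flags", all hex). If the GPU's
# largest memory BAR matches the full VRAM size (e.g. 8 GiB on an
# A580/A750), Resizable BAR is in effect; a small 256 MiB BAR means it isn't.
def bar_size_bytes(resource_line: str) -> int:
    start, end, _flags = (int(x, 16) for x in resource_line.split())
    return 0 if end == 0 else end - start + 1

# Hypothetical line describing an 8 GiB BAR:
line = "0x0000006000000000 0x00000061ffffffff 0x000000000014220c"
print(bar_size_bytes(line) // 2**30, "GiB")  # -> 8 GiB
```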

  • @jjjuhl
    @jjjuhl 9 months ago

    Thanks for including power consumption under load AND idle. Both are important numbers for people in Europe where the price of electricity matters. 🙂

  • @WardenOfSouls
    @WardenOfSouls 9 months ago +1

    I really, really like how this all came along. I went for an A770 LE 16GB a few months ago, and experiencing it getting better and better is pretty cool! I am playing Starfield with it at 1440p and it is absolutely playable. Can't really complain about much, so I hope they keep the driver updates coming into the next generation. Intel did a good thing this time.

  • @dan_loup
    @dan_loup 9 months ago +13

    Starfield is a bit complicated: according to people trying to make it run well on DXVK, the game is basically broken, with the Nvidia drivers patching around it rather than the game behaving like regular software.

    • @Neatrior
      @Neatrior 9 months ago +5

      Exactly. There are workarounds for this issue with DXVK, but in order to truly fix it, it's gotta be fixed by Bethesda. Will they do it or not? Probably not. I wish more tech-focused channels like this one would talk about this issue. It's clearly deeper than just hardware and drivers; Bethesda has gotta be called out for this or else nothing will be fixed.

    • @pikkyuukyuun4741
      @pikkyuukyuun4741 9 months ago +9

      @@Neatrior nah bro, Bethesda fans have been brainwashed to accept that the only correct way to fix a game is to have modders do it.

    • @raresmacovei8382
      @raresmacovei8382 9 months ago

      VKD3D* for Starfield, not DXVK.

  • @user-to1su2iy4d
    @user-to1su2iy4d 9 months ago +3

    Having a full Intel build would be awesome. I've had too many problems with AMD in the short time I've owned their hardware, and Nvidia doesn't seem like they're gonna play ball anytime soon.

    • @staggerdagger
      @staggerdagger 9 months ago +1

      ^^^ Nvidia is pricey but works out of the box. AMD is like Linux in hardware form: it takes work to make it, well, work.
      Intel is one of the most reliable for the price rn.

    • @m.streicher8286
      @m.streicher8286 9 months ago +2

      If you've had problems with Ryzen, it's most likely your fault.

    • @OmniDuck
      @OmniDuck 9 months ago

      @@m.streicher8286 I love my Ryzen, but I had an Asus mobo with the 3.3V bug. No visible damage, but it has not been smooth sailing. Of course now it's rock solid, but I bought it near launch and it was pretty buggy.
      Not the CPU, I think, but the NB stuff, like the integrated Wi-Fi blue screening and instability when using DDR5...
      Also, I don't think I'll ever get used to a 90°C operating temp. Lucky the eco mode is so good!

    • @zihechen3111
      @zihechen3111 9 months ago +2

      @@m.streicher8286😅ryzen has all kinds of problems. Just when ppl trying to point those problems out online. U fanboys defend amd like it is ur king or ur masters😅

    • @zihechen3111
      @zihechen3111 9 months ago

      @@m.streicher8286u amd shills are live in a world amd can’t be criticized whatsoever. Those issues amd has doesn’t go to u, bc ppl bringing those criticisms up got attacked by u crazy fkers

  • @thedanyesful
    @thedanyesful 9 months ago

    I'm just happy I have an option to buy from a company with a long history of stable open source commits and drivers up-streamed into the kernel.

  • @DivorcedGooseRat
    @DivorcedGooseRat 9 months ago +1

    brooo i haven’t seen those little intel plush things since the 2nd grade when intel went to my school (i used to live next to Ronler Acres)

  • @peterjohnson5872
    @peterjohnson5872 9 months ago +3

    Intel's GPU naming is going to get confusing fast. The A/B/C is simple enough to follow, but then you have the numbering mess. It's like cross-generation numbering inside one generation: comparing the A580 with the A770 is similar to 3080 vs. 4070. Imagine soon we'll have the B750 etc. thrown into the mix too.

    • @gaetanoisgro6710
      @gaetanoisgro6710 9 months ago +1

      I still have no idea which card is stronger 😅

    • @ryanspencer6778
      @ryanspencer6778 9 months ago

      It's similar to the CPU naming scheme. First number is market tier, second number is performance class in that tier.
      For example, the A380 is the fastest card in their low end tier of cards, A580 is the mid range card, and so forth.
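The scheme described above can be sketched as a tiny parser (a toy illustration of the convention, not anything Intel publishes):

```python
# Toy parser for the Arc naming scheme described above:
# leading letter = generation (A = Alchemist, B = Battlemage, ...),
# first digit = market tier, remaining digits = performance class in that tier.
def parse_arc_name(name: str) -> dict:
    gen, digits = name[0], name[1:]
    return {
        "generation": gen,
        "tier": int(digits[0]),         # 3 = entry, 5 = mid-range, 7 = high end
        "performance": int(digits[1:]), # higher = faster within the tier
    }

print(parse_arc_name("A580"))
# {'generation': 'A', 'tier': 5, 'performance': 80}
```

So an A770 (tier 7) sits above an A580 (tier 5) within the same generation, while an A580 vs. a future B580 would be a cross-generation comparison at the same tier.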

  • @hongluzhang7771
    @hongluzhang7771 9 months ago +8

    A suggestion: as the chart numbers get crowded, I personally would prefer the three numbers displayed in their colors on one line instead of cramming into each other.

  • @hardtruth603
    @hardtruth603 9 months ago

    Steve, thanks for this thorough review. Saw Paul's yesterday and was seriously considering going for either the A580 or A750 depending on price, but after seeing the in-game power draw and idle power draw, the RX 6600 would end up being the cheaper option for me; the price difference would be negated in less than 3 months because of how much we pay for electricity here.

  • @Noliving
    @Noliving 9 months ago +1

    Steve referring to the Radeon 6600 as the "Arc 6600" at the 19:36 mark just lets you know how much of a rush they were in.

  • @FieryMeltman
    @FieryMeltman 9 months ago +6

    That is a beautiful graphics card. We really need more colors and design variety in the GPU space.

  • @bleeb1347
    @bleeb1347 9 months ago +5

    Intel is the value king right now. We needed another RX 480/580-type card, and Intel is the only one delivering. Got one in my Plex rig, and my god does this thing beat the crap out of Nvidia's CUDA. (QuickSync)

    • @Deliveredmean42
      @Deliveredmean42 9 months ago

      Funny enough, they have the same numbers!

    • @endurofurry
      @endurofurry 9 months ago +1

      Not only that, you get AV1 encoding really cheap compared to the other cards that have it!
      A media server is where I am using an Arc as well.

    • @bleeb1347
      @bleeb1347 9 months ago +1

      In my Plex server I was previously using a Quadro RTX 4000, and before that a Quadro P4000. I'm using an Arc A380 right now, and I'm getting triple the simultaneous transcodes that the RTX 4000 was getting. Triple... QuickSync is a beast for transcoding.

    • @Noliving
      @Noliving 9 months ago

      The Radeon 6600?

    • @bleeb1347
      @bleeb1347 9 months ago +1

      @@Noliving the MSRP of the 6600 is still $330. That is too high to be a value card. The MSRP is the indicator of manufacturer intent, not the street price.

  • @tzuyd
    @tzuyd 9 months ago +1

    Oh man, Sparkle wasn't a name I was expecting to see again. They made my first ever GPU! A GeForce3 Ti.

  • @SinisterSlay1
    @SinisterSlay1 9 months ago

    As an entry card, it seems pretty good. I can see this being thrown into refurb office machines, assuming it doesn't have the funny PCIe issue. The refurb-office-to-gaming-machine market is pretty good, so it would be great to have a new entry in that space.

  • @michaelstanley5215
    @michaelstanley5215 9 months ago +4

    The A580 is going to be an OEM product where a discrete card is essential for the marketing material. At OEM pricing it might make sense, as it makes no sense at retail.

  • @NPzed
    @NPzed 9 months ago +3

    Thank you very much for including the 1080 Ti in the benchmarks!
    As a 1080 Ti owner, it's extremely helpful to see how the card has aged against the newer generations and when it might make sense to jump to the next card!

    • @PhAyzoN
      @PhAyzoN 9 months ago +1

      Humanity has yet to truly evolve past the 1080Ti. Best card ever

  • @razormarkz
    @razormarkz 9 months ago +1

    Honestly, this makes me more excited for the Battlemage Intel GPUs (hopefully) coming out next year.

  • @a1rg3ar31
    @a1rg3ar31 27 days ago

    I would consider getting one of these for a streaming computer.