Intel Arc A580 vs. Radeon RX 6600 vs. GeForce RTX 3050

  • Published: Sep 11, 2024

Comments • 1K

  • @PSXman9
    @PSXman9 11 months ago +1131

    I really hope Intel doesn't give up.
    Arc has become so much better, and the GPUs are actually really decent.

    • @oppos727
      @oppos727 11 months ago +64

      I keep seeing posts from people saying they "hope Intel sticks it out", but I've seen maybe literally two comments between YT and Reddit from people actually buying them. Go buy one if you want them to stick around!

    • @musikalniyfanboichik
      @musikalniyfanboichik 11 months ago +97

      @@oppos727 he doesn't owe them money for saying things on the internet, lmao

    • @JanM2
      @JanM2 11 months ago +37

      @@MrNPC for what do you need 24GB of VRAM?

    • @JanM2
      @JanM2 11 months ago +3

      Intel Battlemage is on the rise, so it doesn't seem like they're giving up

    • @JoeNokers
      @JoeNokers 11 months ago +15

      @@MrNPC A 7900 XTX has 24GB of VRAM, what possible use would a mid-tier card have for that much haha

  • @rodwelpolanco9493
    @rodwelpolanco9493 11 months ago +251

    Intel, don't give up. That RT performance at that price... you are on the brink of making a killer GPU. I will wait patiently for the next generation.

    • @cosmic_gate476
      @cosmic_gate476 11 months ago +10

      Yes please Intel, I don't want to buy a card like my 4070 Ti again 😂 it has less memory bandwidth than this GPU 😂

    • @samarkand1585
      @samarkand1585 11 months ago +6

      @@cosmic_gate476 Then don't buy it?

    • @cosmic_gate476
      @cosmic_gate476 11 months ago +4

      @@samarkand1585 As big a meme as the 4070 Ti is, the 7900 XT is an even bigger one.

    • @samarkand1585
      @samarkand1585 11 months ago +2

      @@cosmic_gate476 At its initial price, sure, when did you buy it?

    • @Vorexia
      @Vorexia 11 months ago

      @@cosmic_gate476 Yeah, and they'll presumably do it for a much lower price, making it a possibly attractive RT upgrade for the massive number of people who passed on the insanely overpriced 4070 Ti. You're not the intended customer, sorry dude.

  • @totopogito7434
    @totopogito7434 11 months ago +572

    The power draw of the Intel A580 is quite concerning. It consumes almost 50 percent more power than the RX 6600.

    • @Reitaliation
      @Reitaliation 11 months ago +64

      Reminds me of RX 580 performance/power draw

    • @nigelo92
      @nigelo92 11 months ago +29

      The other Arcs had high power draw upon release, which I believe was largely resolved.

    • @lharsay
      @lharsay 11 months ago +47

      Intel has a major architectural disadvantage compared to AMD and Nvidia right now; the Arc A770 uses twice as much TSMC 6nm silicon as the RX 7600, yet they are still losing.

    • @t1e6x12
      @t1e6x12 11 months ago +19

      @@nigelo92 No. They use as much power under load as they did at launch.

    • @zalankhan5743
      @zalankhan5743 11 months ago +10

      Considering the performance inconsistency, the price should be $160 or $170.

  • @CataclysmZA
    @CataclysmZA 11 months ago +727

    I hope Intel is reading the comments: you guys are doing an amazing first run. Those ray tracing results are incredible.

    • @bradhaines3142
      @bradhaines3142 11 months ago +51

      Intel is going to be the next 'fine wine' driver-update company. They just need to get the cards more stable, like their CPUs, and they'd easily end up a great budget option

    • @brownjonny2230
      @brownjonny2230 11 months ago +7

      AI acceleration is greatly important. Intel really needs to improve raw rasterization performance with Battlemage. As long as Intel's cards are useful for rendering and AI work, that's an instant buy for me. But right now they are not.

    • @PadaV4
      @PadaV4 11 months ago +31

      @@bradhaines3142 "fine wine" drivers is just a code phrase for sh*tty release drivers. Maybe Intel should release good drivers at launch instead

    • @Kuriketto
      @Kuriketto 11 months ago +40

      @@PadaV4 You're way off base. Fine wine means starting with decent drivers that get optimized over time to squeeze that extra little bit of performance out of a GPU. Notice he said Intel would be the "next" company to release fine wine drivers. They're not there yet.

    • @Blackfatrat
      @Blackfatrat 11 months ago +5

      @@PadaV4 they likely will for Battlemage or, at the latest, Celestial. It takes time though. The first Arc drivers were basically the integrated graphics drivers. They will of course reuse the already massively better Alchemist drivers for future GPUs.

  • @jasonspain3554
    @jasonspain3554 11 months ago +240

    The RX 6600 is such a good value card. It can handle almost every game at 1080p high settings above 60 fps. Paired with a Ryzen 5 5600, you can build a decent 1080p PC for about $550-$600.

    • @BladeCrew
      @BladeCrew 11 months ago +14

      My RTX 3050 was a good value card when I got it. I bought it when crypto mining was still a thing and got my card for 250 dollars new. At that time the RX 6600 was over 400 dollars in my country, so I had to go with the RTX 3050. I mean, I'm happy with my card; I don't run the latest or most popular titles and still get 60 fps on high to ultra settings without DLSS. It also pairs nicely with my Ryzen 5 1600 AF clocked at 3.5GHz at 1.1870v. I undervolted the RTX 3050 to 0.925v and kept the boost clocks the same. I also cap my fps in RivaTuner Statistics Server, and power consumption never goes over 80w for my GPU and 44w for my CPU, even when both run at 100% usage, with temps in the mid-to-high 40s in a 25 degree Celsius room. (A power-monitoring sketch follows this thread.)

    • @Akab
      @Akab 11 months ago +28

      The RX 7600 is getting better though, as its pricing is getting closer to the 6600 (in my region at least), so watching out for a deal might be a good decision there 👍

    • @aghilesl.4040
      @aghilesl.4040 11 months ago +18

      In Canada the 6650 XT is cheaper than the 6600, so that's probably the best value GPU on the market

    • @krzysztofpaszkiewicz1274
      @krzysztofpaszkiewicz1274 11 months ago +5

      If you're on a tight budget, a Ryzen 3600 would be more than enough for this GPU.

    • @BladeCrew
      @BladeCrew 11 months ago

      @@krzysztofpaszkiewicz1274 I'm saving for my last upgrade, a Ryzen 7 5800X3D.

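A quick way to sanity-check power figures like the ones @BladeCrew quotes above is to poll the GPU through NVML. A minimal sketch, assuming an NVIDIA card and the nvidia-ml-py package; the device index and the ten-second polling window are arbitrary choices, not anything from the video:

    import time
    import pynvml  # NVML bindings: pip install nvidia-ml-py

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the card is device 0
    try:
        for _ in range(10):
            watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
            temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
            print(f"power: {watts:5.1f} W  temp: {temp} C")
            time.sleep(1.0)
    finally:
        pynvml.nvmlShutdown()
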
  • @kamil.g.m
    @kamil.g.m 11 months ago +200

    Results look pretty good, honestly. It's still a first-gen product, unfortunately, so there are issues, but considering Intel is new to the graphics card market, this is absolutely a good release for competition in the market.

    • @mukkah
      @mukkah 11 months ago +7

      Yup. I can't be an early-adopter type, but for those who don't mind, Intel is pulling some weight. For an entry into the GPU market, I wouldn't expect many positive results, tbh. They have a lot of things to work on / improve and a ways to go, but it's been a good start. Hope they keep it up, not for them (couldn't care less about corporations' interests), mainly for us, the consumers.
      Lower-tier GPUs need better options.

    • @toivopro
      @toivopro 11 months ago +5

      What was Intel's DG1? And how long have they had integrated graphics?

    • @justacomment1657
      @justacomment1657 11 months ago +6

      yeah, that whole first-gen BS is just that: BS... what you see is basically Vega all over again... trying too much and failing hard...

    • @andersjjensen
      @andersjjensen 11 months ago

      @@justacomment1657 Funny that you compare it to Vega. Vega was absurdly compute-heavy and sported some impressive TFLOPS numbers; heck, even the TFLOPS/watt numbers were good, but it didn't translate into gaming performance. Both Vega and Arc were brainchildren of Raja Koduri. Both were the end of his career at the respective companies.
      That AMD has split their architecture into CDNA and RDNA tells me they have realized that having a compute-focused architecture AND having a hardware scheduler/front-end seems to be mutually exclusive.
      Nvidia ditched the scheduler a long-ass time ago and is doing fine in both compute and gaming performance, at the expense of falling completely apart for people who think their 6700K (or similar) will last for yet another GPU upgrade.
      Unless Battlemage dials back the compute focus, we're going to see "Vega again", with excessive power draw for the actual gaming performance even at node parity.

    • @VoldoronGaming
      @VoldoronGaming 11 months ago

      Not a bad showing for Intel's first-gen product, meeting (AMD RX 6600) or besting (Nvidia RTX 3050) cards from companies that have been making GPUs for decades, while this is Intel's first year.

  • @KoeiNL
    @KoeiNL 11 months ago +37

    Really hope Intel is going to continue on this path. This all looks very promising.

  • @TalonsTech
    @TalonsTech 11 months ago +53

    Intel's Arc GPUs continue to improve with each driver release. What Intel has managed with their GPU division on a first release is pretty impressive. Hoping Battlemage is very successful.

    • @TecoProductions
      @TecoProductions 11 months ago

      Battlemage will be my next GPU

    • @recon1673
      @recon1673 11 months ago

      I might buy a Battlemage card to replace my hefty 2080. I can only play on a 4K60 TV, and despite DLSS I still get frame drops and less-than-ideal frame rates sometimes, thanks to the lack of VRAM.

    • @selohcin
      @selohcin 10 months ago

      I really don't think Intel will continue to make GPUs beyond this generation. People just aren't buying their cards, so it doesn't make sense to put hundreds of millions of dollars of investment into something that doesn't even return a profit. Gamers will have no one but themselves to blame.

  • @svs2136
    @svs2136 11 months ago +150

    The RX 6600 is just too good of a value, especially when you can get used ones for $100

    • @kravenfoxbodies2479
      @kravenfoxbodies2479 11 months ago +3

      He is using the wrong GPU from Radeon; the RX 6500 is what Intel was targeting with the A580.

    • @Zapdos0145
      @Zapdos0145 11 months ago +36

      @@kravenfoxbodies2479 yeah, but the 6500 shouldn't have existed, and if this can go toe to toe with a 6600, that's even better for Intel

    • @kravenfoxbodies2479
      @kravenfoxbodies2479 11 months ago

      @@Zapdos0145 The RX 6600 had to scale down in price to what you see today; my RX 6600 cost $279 new one year ago, while the A580 launches at MSRP and never scaled up in price.

    • @tringuyen7519
      @tringuyen7519 11 months ago +30

      Regardless of gamers' preferences, the RTX 3050 is pure crap in value.

    • @KontrolStyle
      @KontrolStyle 11 months ago

      yet the 3050 sells because people can't wait an extra week and buy a decent GPU (sigh) @@tringuyen7519

  • @mukkah
    @mukkah 11 months ago +13

    3:43 Honestly, the 3-card test pool is actually kind of nice for a change (and yo, I LOVE me some data graphs n' shit hehe). A straight comparison through one particular budget range. I did a budget build that came down to the 3050 or 6600 in Nov '22, so I know this path well too hehe ^_^
    Great content as always for us consumer folk.
    Merci!
    ~a random canadian dude

  • @Archer_Legend
    @Archer_Legend 11 months ago +13

    It is finally a GPU which has both adequate specs (VRAM, bus, etc.) and a decent price for those specs. I truly hope Intel does not give up, even though Battlemage may be late to the party: nothing has been heard of it so far, and it is supposed to come out before the end of H1 2024.

  • @vladdyboy2612
    @vladdyboy2612 11 months ago +50

    The amount of power this GPU consumes, the superb RT performance, and a game or two where its performance is an entire tier above the competition all indicate to me that there is a lot of untapped potential in these Arc cards.
    I hope Intel truly commits to these cards even when the B and C series cards come out

    • @MrBaltonic
      @MrBaltonic 11 months ago +3

      Is the amount of power a good thing? It consumes 80W more than the other 2 cards

    • @vladdyboy2612
      @vladdyboy2612 11 months ago +4

      @@MrBaltonic No, it's not a good thing, at least not right now.
      That's why I said it's an indicator of untapped performance; it's software-crippled rather than simply being architecturally inefficient

    • @samir2zk135
      @samir2zk135 11 months ago

      @@vladdyboy2612 I agree. Hoping to see their open-source drivers do way better with Mesa compatibility and bring interesting results on Linux.

    • @MrBaltonic
      @MrBaltonic 11 months ago +11

      @@vladdyboy2612 what makes you believe it's software-crippled and not actually architecturally inefficient?

    • @PainterVierax
      @PainterVierax 11 months ago +6

      @@MrBaltonic if it's hardware, it would only be a lack of logic to shut off unused modules, because Intel is not bad at all at idle on CPUs.
      For the high-load scenario, it's clearly software optimization, as the results are inconsistent and even drop more than a tier below.
      Intel knows how to make non-gaming iGPUs; they just lack the gaming part, which is mostly driver optimization. And it shows when their driver revisions finally allow the Arc cards to show their full potential. From past experience in low-spec gaming, the more performant Intel iGPU of my previous Skylake system was more buggy and slow than a potato GT 610 (frame drops, artifacting, missing textures, etc.).

  • @shadowlemon69
    @shadowlemon69 11 months ago +52

    The Arc A750 is already at $190, so this should be $150 or $140, definitely not $190.

    • @madgodzilla12465
      @madgodzilla12465 11 months ago +9

      It'll drop in price too, without a doubt. It would be interesting to see how close it gets to $100. Almost disposable at that point.

    • @madgodzilla12465
      @madgodzilla12465 11 months ago +5

      But tbh I'd still just get an RX 6600. The drivers are way more consistent and it consumes considerably less power.

  • @arthurcuesta6041
    @arthurcuesta6041 11 months ago +31

    Damn, the progress in Arc GPUs is really impressive
    Keep it up Intel

  • @sir.fender6034
    @sir.fender6034 11 months ago +13

    Still using the 6600 XT today and it's been an excellent card. I will upgrade when the 8600 XT becomes available.

  • @whitehavencpu6813
    @whitehavencpu6813 11 months ago +41

    Wow! Raytracing on a $180 card? And look at those Ratchet & Clank results! Intel's GPU division keeping it real!

    • @ThunderingRoar
      @ThunderingRoar 11 months ago +12

      you ain't ray tracing shit with entry-level GPUs, 24 fps in Cyberpunk lol

    • @LlywellynOBrien
      @LlywellynOBrien 11 months ago +4

      @@ThunderingRoar Apart from Res 4 and Spider-Man? I get what you are saying, but this video literally has ray tracing looking very much playable in two games.

    • @ThunderingRoar
      @ThunderingRoar 11 months ago +5

      @@LlywellynOBrien those are fake ray tracing games, where you can barely notice the difference and the shadow/reflection resolution is so low

    • @whitehavencpu6813
      @whitehavencpu6813 11 months ago +5

      @@ThunderingRoar That's not the point; the point is that it can ray trace better than the competition at that price.

    • @shadow_force
      @shadow_force 11 months ago +2

      RT shadows only, no lighting or reflections

  • @Djoki1
    @Djoki1 11 months ago +80

    Intel's RT tech seems quite impressive for a 1st-gen product from a company which has never made such a product for the masses before.
    I'm still rooting for Intel too, as we clearly need a third competitor in the GPU market.
    Nvidia has been Ngreedia for a while now, and AMD seems to still be missing its marketing brains.
    The power consumption is quite bad though. Wondering if it will get better over time.

    • @arenzricodexd4409
      @arenzricodexd4409 11 months ago +2

      In regards to RT, Intel is simply doing something similar to Nvidia; that's why their RT performance is quite impressive even for a first gen. AMD still goes with the idea that they don't want to spend too much die space on RT functionality, and it is reflected in their performance. As much as we really need more competitors in this market, consumers in general don't reward competition. Heck, they didn't even reward heavy competition between two players; hence Nvidia was able to grab almost 90% of the discrete GPU market for themselves.

    • @SwingArmCity
      @SwingArmCity 11 months ago +9

      It's a polished turd. Ray tracing at this level is useless.

    • @Knowbody42
      @Knowbody42 11 months ago +1

      It seems like Nvidia just keeps giving AMD plenty of opportunities due to their own greed, and AMD just never takes them.

    • @Apollo-Computers
      @Apollo-Computers 11 months ago

      @@arenzricodexd4409 that's because AMD only just started to actually compete, with RDNA...

    • @t1e6x12
      @t1e6x12 11 months ago +1

      RT was good at its price point. In an absolute comparison between AMD, Intel, and Nvidia, Intel would be behind.

  • @MarcoGPUtuber
    @MarcoGPUtuber 11 months ago +21

    Too late. Already bought an A750 for $149 US.

    • @thepunic5261
      @thepunic5261 11 months ago +11

      Congrats, that's a good find!

    • @elirantuil5003
      @elirantuil5003 11 months ago +5

      Holy shit that's a good deal

    • @rahuloberoi9739
      @rahuloberoi9739 11 months ago +14

      My friend returned his A750 after tons of driver issues (in the latest triple-A games) and bought a 6650 XT instead for $200

    • @MarcoGPUtuber
      @MarcoGPUtuber 11 months ago +4

      @@thepunic5261 Thanks! A guy here in the local classifieds won a gaming PC with an A750 and wanted to upgrade so he sold me the GPU so he could buy something better for his PC.

    • @MarcoGPUtuber
      @MarcoGPUtuber 11 months ago +2

      @@rahuloberoi9739 Yeah, but I wanted cheap AV1 encoding.

  • @ianthomas1955
    @ianthomas1955 11 months ago +8

    I really miss the Hunt: Showdown benchmark.
    Thanks for the A580 video, but it would have been cool to see the A750 in there as well.

    • @1Grainer1
      @1Grainer1 11 months ago +1

      Christmas is soon, so performance will probably tank with the introduction of festivities

  • @singular9
    @singular9 11 months ago +32

    Nice to see some OC testing too.
    The 7600 can OC to at least 15% above stock performance, and some samples hit 20%.

  • @tomallan5000
    @tomallan5000 11 months ago +21

    Shows just how good the 6600 was against its competition when everything is considered.

    • @Dragoon710
      @Dragoon710 11 months ago +1

      The 6600 is nearly double the price of the A580

    • @TheGabriel959
      @TheGabriel959 11 months ago +9

      @@Dragoon710 You can buy an RX 6600 for around $200, not double the price.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 11 months ago +4

      @@Dragoon710 lol, who told you that?
      The 6600, not the 6700

    • @edouard9867
      @edouard9867 11 months ago +2

      @@Dragoon710 I can get a 6600 for 210€ including taxes in France.
      So, same price...

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 11 months ago +1

      Well, not really. Intel has much better RT, better productivity performance, and an AI upscaler, all while being cheaper. I'm only waiting for them to improve Starfield and The Last of Us performance.

  • @M00NM0NEY
    @M00NM0NEY 11 months ago +2

    1 Million Subs! Woot woot! 🙌🙌🙌 Congrats Steve & Tim

  • @EbonySaints
      @EbonySaints 11 months ago +3

      Pretty good review. I kind of have to agree on the price point, where for $30 extra you can have either the 6600 for slightly better consistency in older titles, or the A750 for raw performance. It's kind of a problem for Arc that they've adjusted pricing so much, and bunched it up so close, that you have three tiers of cards within $60 of each other with relatively little difference between them. Granted, while a lot of people harp on power consumption, given the never-ending recommendations for the RX 580 and older power-hungry used cards, despite the former being completely outclassed in today's titles, I usually find it to be a moot point for most gamers outside of certain markets.
      Also, just a heads-up for Arc testing, Steve. In case you weren't aware, just about every driver Intel releases also updates the firmware. While this isn't an issue for users 99% of the time, if you plan on doing any sort of comparison between launch drivers and current ones anytime soon, realize that rolling back to the old drivers in case you forgot something won't be an apples-to-apples comparison, since the firmware doesn't downgrade. There's a thread on TechPowerUp's Intel Arc forum explaining the deal. Don't worry about having trouble finding it; it's pretty dead in there.
      Also, the A380 is at $100 as of me writing this and regularly hits that price point, and it also has an LP model from ASRock without the 8-pin. Cut it some slack, man; it's probably the best ultra-budget option you can buy new from the manufacturers. It ain't perfect, but it beats the 6400 and 1650.
      Can't wait for you to slaughter the A310, since that launched at $100 (see the first point; it needs to be $75 max as a glorified display adapter, though at least it is one) and is probably ready for at least one TechTuber's cheap-seats review.

  • @SkeeziOS
    @SkeeziOS 11 months ago +16

    Bravo to Intel: with AMD's FSR 3 and their own amazing ray tracing performance, I can't wait to see what their better cards have in store for us, especially with the lackluster market we are in today.
    For the first time since the 3000 series dropped, I actually feel excited to see new GPUs.

  • @iseeu-fp9po
    @iseeu-fp9po 11 months ago +4

    I think this was uplifting. It brings hope for later Intel GPUs.

  • @Bigbacon
    @Bigbacon 11 months ago +7

    I really want to see Intel succeed in the GPU market.

  • @OriginalBrett610
    @OriginalBrett610 11 months ago +7

    This is great news! The latest driver updates are making Arc a real competitor out there. I hope they keep it up, as I've been really impressed with my A750 so far, and it has only been improving over time with better drivers.
    Even though it is disappointing that the latest fixes weren't included in the A580 review driver, they are working really hard on fixes. The most recent Starfield fixes make the game more than playable, and there are likely more tweaks to come.

  • @Xbox360mIRC
    @Xbox360mIRC 11 months ago +2

    The sad thing is that the 3050 is extremely common and many people still buy them. Prebuilts also account for a large portion of gamers, who won't ever do price-to-performance comparisons or look through hundreds of different prebuilts for the best value one. More PC gamers buy a prebuilt/laptop than actually build their own. Look up prebuilts for each price bracket, and laptops too: nearly all are 4060, 4060 Ti, 3050, 1660 Ti, 1660 Super, 1030, etc. Virtually no Intel GPU prebuilts and barely any AMD.

  • @imglidinhere
    @imglidinhere 11 months ago +14

    Damn... that's impressive! Way to go, Intel! This is what I was hoping for, and I hope Battlemage is equally impressive.

  • @VitisCZ
    @VitisCZ 11 months ago +1

    Was the idle power consumption measured with or without Intel's recommended BIOS settings? I suppose without, because Intel literally has an article on their website stating you have to allow OS-controlled ASPM and enable PCIe L1 substates on the PCI Express root port. Just by searching for "intel arc aspm" you should get their article as the first result. That makes idle power consumption a ton better
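
On Linux, the ASPM state @VitisCZ mentions can be checked from sysfs before taking idle measurements (Intel's article itself targets BIOS and Windows settings). A minimal sketch, assuming a recent kernel that exposes the standard ASPM attributes; the PCI address is a placeholder to be replaced with the GPU's address from lspci:

    from pathlib import Path

    # Global ASPM policy; the active entry is shown in [brackets],
    # e.g. "default performance [powersave] powersupersave".
    policy = Path("/sys/module/pcie_aspm/parameters/policy")
    print("policy:", policy.read_text().strip())

    # Per-device ASPM attributes (kernel 5.5+); placeholder PCI address.
    link = Path("/sys/bus/pci/devices/0000:03:00.0/link")
    for attr in ("l0s_aspm", "l1_aspm", "l1_1_aspm", "l1_2_aspm"):
        f = link / attr
        if f.exists():
            print(attr, "=", f.read_text().strip())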

  • @trailduster6bt
    @trailduster6bt 11 months ago +4

    I like that there is at least one decent GPU under $200. I hope to see Nvidia and AMD try to compete in this space once again as well. Maybe the A580 or its Battlemage successor will spur team green and team red to respond with something more than a cut-down RTX 3050…

  • @DragonBane299
    @DragonBane299 11 months ago +1

    You guys are so close to 1 mil!
    Here's an early congrats on hitting 1 mil after such a long time!

  • @GiSWiG
    @GiSWiG 11 months ago +13

    RT performance is pretty darn good comparatively. With Battlemage just around the corner (sorta), I might wait to see how their offerings stack up against the RX 7800 XT and the RTX 4070.

    • @RobBCactive
      @RobBCactive 11 months ago +4

      RT performance is totally moot in this tier

    • @nigelo92
      @nigelo92 11 months ago +4

      @@RobBCactive I disagree. In SP titles I'm fine with 30fps if the visual overhaul is drastic. And in this case XeSS is on par with DLSS, so at 1080p (where FSR is bad) using the quality preset is okay for me too. Some games like Spider-Man have high performance with RT anyway.

    • @RobBCactive
      @RobBCactive 11 months ago

      @@nigelo92 ah yes, so many people want to upscale to 1080p to reach 30fps in the few titles offering visual overhauls with RT that also support XeSS, on their ReBAR-capable systems, while not minding brokenness in other games; it's a hugely popular use case.

    • @HunterTracks
      @HunterTracks 11 months ago +1

      @@nigelo92 One big issue with XeSS is game support. For instance, out of the last 5 "big" games I played -- Cyberpunk 2077, Starfield, Baldur's Gate 3, Horizon Zero Dawn and Lies of P -- only one had XeSS support. I suspect that few devs implement it because it performs so poorly on non-Intel cards that it may as well be Intel-only.
      So yeah, most of the time it's FSR or bust, and Intel cards don't exactly always work perfectly with FSR.

  • @MisterFoxton
    @MisterFoxton 11 months ago +1

    Props HUB! This might be the first GPU review in a long time where a card has soundly beaten its Nvidia counterpart and you haven't gone too deep into "but [insert feature]" as a counterpoint.

  • @syncmonism
    @syncmonism 11 months ago +5

    There are already sales on the A750 that go as low as $180, and the A580 is significantly slower than the A750. Also, for a new system, most people will get better value from something more powerful, like a 6750 XT or 6800 XT. Furthermore, this isn't a great upgrade for a lot of older systems, because many of them don't support ReBAR, and Intel Arc cards also have higher CPU overhead than Radeon cards (such as the RX 6600), which is also likely to be an issue on a lot of older systems.

    • @naamadossantossilva4736
      @naamadossantossilva4736 11 months ago

      Yeah, this seems to be only for system builders, who don't have to care about power consumption and can't really take advantage of sales.

  • @Randy1124pr
    @Randy1124pr 11 months ago +1

    If you can pick one up for $150-160 this will be a hit; there really isn't anything good in that price range. The 6600 continues to be the go-to for a $200 budget, and the 7600 is a beast for $250.

  • @falcie7743
    @falcie7743 11 months ago +4

    Now this might be just what I was looking for in a Plex card. 4K transcoding usually calls for at least 1070-level performance, and AV1 is a huge plus. (A transcode sketch follows this thread.)

    • @Dick_Valparaiso
      @Dick_Valparaiso 11 months ago

      In best-case scenarios the A580 should perform closer to a GTX 1080 right now. Atm, TechPowerUp has the A580 sandwiched right between the 1070 (+1%) and the 3050 (-1%). TechPowerUp also shows the 6600 with an 8% lead over the 1080. Just going by their stats (which aren't gospel), along with the A580's best showings in this video, it's reasonable to believe that with better drivers the A580 could actually perform closer to a 1080 Ti. The bus/bandwidth and PCIe 4.0 x16 advantage could help this card hugely in the long term.

    • @mjc0961
      @mjc0961 11 months ago +1

      I dunno if you want that in a Plex server with that power draw, though. Are there no better options?
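
For the Plex use case @falcie7743 describes above, the usual route is ffmpeg's Quick Sync path, which also drives Arc's hardware AV1 encoder. A hedged sketch, assuming a recent ffmpeg build with QSV support; the filenames and the quality value are placeholders, not settings from the video:

    import subprocess

    # Transcode an H.264 source to AV1 on an Arc card via Quick Sync.
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",        # decode on the GPU as well
        "-i", "input.mkv",        # placeholder source file
        "-c:v", "av1_qsv",        # Arc's hardware AV1 encoder
        "-global_quality", "28",  # quality target; lower = better quality
        "-c:a", "copy",           # pass audio through untouched
        "output.mkv",
    ]
    subprocess.run(cmd, check=True)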

  • @Dht1kna
    @Dht1kna 11 months ago +1

    I'm quite surprised that you didn't include fps-per-dollar stats
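
The calculation is easy to run yourself. A minimal sketch; the prices echo figures mentioned elsewhere in the comments, and the average-fps numbers are placeholders rather than results from the video:

    # Placeholder inputs: swap in the video's average fps and current street prices.
    cards = {
        "Arc A580": {"price": 180, "avg_fps": 70},
        "RX 6600":  {"price": 200, "avg_fps": 72},
        "RTX 3050": {"price": 230, "avg_fps": 58},
    }

    for name, c in cards.items():
        dollars_per_frame = c["price"] / c["avg_fps"]
        frames_per_100 = 100 * c["avg_fps"] / c["price"]
        print(f"{name}: ${dollars_per_frame:.2f} per fps, {frames_per_100:.1f} fps per $100")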

  • @bzdtemp
    @bzdtemp 11 months ago +4

    Good to see continued coverage of the Intel GPU offerings, and nice to have a review that covers all the bases.
    On this specific offering, I find the power usage somewhat more important than with mid/top-end GPUs, my theory being that one reason to go with a lower-end GPU is that gaming isn't the primary use of the PC it goes into. Something like 80 watts of extra power usage is not something to just overlook, especially if an AC is needed to keep the room temperature down, since the AC uses a lot more than 80 watts to cool away 80 watts of heat. (Worked numbers follow this thread.)

    • @LlywellynOBrien
      @LlywellynOBrien 11 months ago

      Unless you have reverse cycle, where efficiencies go crazy and 80 watts of heat could be displaced for probably 10-20 watts, depending on some variables.

    • @mlsasd6494
      @mlsasd6494 11 months ago +2

      if your AC uses more electric power than the heat it removes, your AC is really bad or you live in very, very bad conditions
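
For what it's worth, the replies have the physics right: an air conditioner moves heat rather than converting electricity into cooling one-for-one, so its coefficient of performance (COP) is well above 1. Assuming a typical consumer COP of about 3:

    P_{\text{AC}} = \frac{Q_{\text{extra}}}{\text{COP}} = \frac{80\,\text{W}}{3} \approx 27\,\text{W}

So the extra 80 W of GPU heat costs roughly 107 W in total (80 W at the PC plus about 27 W at the AC), not several times the original 80 W.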

  • @j340_official
    @j340_official 11 months ago +2

    I thought a certain YouTube leaker said Arc is dead? Good showing from the blue team.

  • @ElZamo92
    @ElZamo92 11 months ago +6

    The thing about Intel's offering is that they released their cards almost two generations late, so Intel's Arc A580 was actually made to compete with older cards like the RTX 2060 and the RX 5700. But to me it doesn't really matter as long as the price to performance is on point, and at $180 Intel DEMOLISHES the competition.

    • @kravenfoxbodies2479
      @kravenfoxbodies2479 11 months ago +1

      You missed the point. They called it A580, not A570, to slap AMD in the face over the full RX 580 8GB, and look at what they added to the RX 580's design to make it compete: GDDR6 instead of GDDR5, AV1 encoding, a 6nm die, support for FSR 1/2.2/3 plus XeSS, and PCIe 4 with ReBAR support, all while still priced at what my PowerColor Red Dragon RX 580 8GB cost me new in 2019: $189.
      You see people buying cheap cut-down knockoffs of the RX 580 from AliExpress, and this is a market for Intel to tap into with a better-made product, IMO.

  • @DaboInk84
    @DaboInk84 11 months ago +2

    Paul's Hardware retested Starfield with the new driver, and he was getting better results to begin with; not sure why there is such a difference between his results and HUB's for that game.

    • @Hardwareunboxed
      @Hardwareunboxed  11 months ago +9

      Not exactly sure, but we do test a much more demanding section of the game, so that could be it.
      Edit: Ohh wait, Paul explained that there was a game update, and he went back and tested that. We finished all of our testing for this content on Saturday, and Starfield was benchmarked on Friday. So if a game fix came after that, we didn't test it.

    • @Chrissy717
      @Chrissy717 11 months ago

      @@Hardwareunboxed Can you maybe pin a comment under this video addressing this issue? Would be nice for the viewers

    • @Hardwareunboxed
      @Hardwareunboxed  11 months ago

      I talked about the issue in the video.

  • @TheDaswilhelm
    @TheDaswilhelm 11 months ago +4

    The RX 6600 is still the best value GPU we've seen since the 1080 Ti.
    Scooped them up for as low as $150. Can't go wrong.

    • @gametestinglab8861
      @gametestinglab8861 11 months ago

      If that card is so good, why did its price drop like a stone? It's a little slower than a 1080 Ti and sells a little cheaper than the 1080 Ti, a 7-year-old card. If you're happy and it works for you, I'm happy for you. But in the end the market and the gamers are the ultimate judge, and the judge has spoken through purchasing decisions, regardless of your feelings or mine. Enjoy your card.

    • @zaidlacksalastname4905
      @zaidlacksalastname4905 11 months ago +1

      @@gametestinglab8861 it's good now. It wasn't at $300.

    • @gametestinglab8861
      @gametestinglab8861 11 months ago

      @@zaidlacksalastname4905 Value is a subjective term. It's not a bad card, but for the same price I would take a 1080 Ti any day and night: wider bus, 11GB vs 8GB, and a little bit faster.

  • @lyoneel
    @lyoneel 11 months ago

    I already bought an A770 from Sparkle, but one thing is missing everywhere: no one is tearing down this GPU. I would like to see how good or bad the design is inside, and how good or bad its cooling capability is

  • @WeencieRants
    @WeencieRants 11 months ago +6

    Honestly, I made a sidegrade from a 1080 Ti to a 6600 and it's an amazing card. And it just sips power

    • @t1e6x12
      @t1e6x12 11 months ago +6

      That's more like a downgrade

    • @WeencieRants
      @WeencieRants 11 months ago

      @@t1e6x12 I mean, I sold it for $500 and bought the 6600 last year at Micro Center for $230. It also uses less than half the power. And it plays all the same games at the same or similar performance.

    • @zodwraith5745
      @zodwraith5745 11 months ago

      That's every bit a downgrade. Inferior memory, inferior bandwidth, inferior overall performance. Why would you buy that unless your 1080ti died?

  • @IntelArcTesting
    @IntelArcTesting 11 months ago +2

    Can you please do a CPU scaling test on Arc? I know that Baldur's Gate 3 with an R5 5600 performs pretty badly in the city

  • @DrearierSpider1
    @DrearierSpider1 11 months ago +19

    Wow, I didn't expect another 1st-gen Intel GPU to be slotted into the lineup after Nvidia and AMD basically launched their full 4000/7000 series lineups, respectively. Prior to watching, I'm expecting this'll have a tough time competing in the very competitive ~$200 segment. There are plenty of options, particularly if you consider used.

    • @ca9inec0mic58
      @ca9inec0mic58 11 months ago +8

      The $200 segment isn't really competitive; there is just one card worth the price, and that is the RX 6600. Nothing more, nothing less

    • @DrearierSpider1
      @DrearierSpider1 11 months ago +4

      @@ca9inec0mic58 Like I said, I'm including the used market as a qualifier. And while the RX 6600 might be the only new option at or below $200, the options open up as you head towards $250 (RX 6650 XT, RX 7600, etc.) More money, but potentially better value depending on the game as Steve showed.

    • @bradhaines3142
      @bradhaines3142 11 months ago +2

      The used market shouldn't be considered competition for new. Too many different variables to compare: you can get an old Titan, a last-gen card, or whatever in between, and then there are the prices that really depend on the person selling them

    • @ryanspencer6778
      @ryanspencer6778 11 months ago +4

      For a lot of people, the risk of buying used is just too much beyond the $100 price point.

    • @anasevi9456
      @anasevi9456 11 months ago

      @@DrearierSpider1 Discrete GPUs are one of the hardest things to get right. Before Intel, back in the mid-2000s when the bar was vastly lower, SiS, massive and somewhat experienced in GPUs, tried with the XGI Volari line and crashed and burned hard. First-gen Arc may lose to Radeon in raster, but its ray tracing and special hardware features like XeSS are proven superior even to 7-series Radeons, transistor for transistor. So yeah, Arc is smelling of roses in the actual brass-tacks long-term outlook, even if that doesn't mean much to most consumers just in it to game now.

  • @Cryptic_Chai
    @Cryptic_Chai 11 months ago +1

    Thanks for the update. I would love to see how the A750's new drivers are doing in some random titles.

  • @nikzel
    @nikzel 11 months ago +6

    Six months ago I bought the RX6600 instead of the RTX 4080. It's good enough for current games, and I'll save my money for when the value is better at the high end.

  • @mapsofbeing5937
    @mapsofbeing5937 11 months ago +1

    Considering Arc is Intel's first-gen dedicated GPU, it's pretty amazing, showing that Intel is properly learning to walk before it runs.
    Competing at the low end within one generation is not a small achievement

    • @justacomment1657
      @justacomment1657 11 months ago

      DG1 was first gen..

    • @mapsofbeing5937
      @mapsofbeing5937 11 months ago

      @@justacomment1657 DG1 counts as much as the GT 1030 DDR3

  • @olnnn
    @olnnn 11 months ago +6

    IMO it would be beneficial to also test GPUs in this range with an older CPU and motherboard. As you have shown in the past, there are notable differences in behaviour in CPU-limited scenarios, at least between Nvidia and AMD, which are relevant for the systems these GPUs are likely to go into; plus, the 6600 only has 8 PCIe lanes.

  • @mroutcast8515
    @mroutcast8515 11 months ago +1

    I hope Intel sticks with the GPU market. We need a 3rd competitor AF.

  • @Starfishtroopers
    @Starfishtroopers 11 months ago +3

    Not having the drivers makes it almost pointless... still, the 3050 is a paperweight.

  • @Didymuss1
    @Didymuss1 11 months ago +1

    I like that the PCIe power sockets are blue; it's a nice touch.

  • @zodwraith5745
    @zodwraith5745 11 months ago +6

    It's really good to see Intel soldiering on, delivering good value and ever-improving drivers. It's just too bad they couldn't have released a year earlier like they wanted to; they really could have looked like heroes.
    I kinda want an A770 or A580 just to play with, if I didn't already have faster cards. But I will be watching Battlemage very closely given how quickly and how much their drivers have already improved. Keep in mind how many years it took AMD's drivers to not suck. At the rate Intel's drivers are improving, they'll catch AMD within a year or 2

  • @MrDgt66
    @MrDgt66 11 months ago +1

    Congrats on the one million subscribers boyz! 🥳

  • @LlywellynOBrien
    @LlywellynOBrien 11 months ago +4

    I think I am going to grab an A770.
    So impressed with the results (bar the A380, but going ultra-budget was probably just too ambitious for a first gen).
    Even more impressed by their eagerness to improve.

    • @EbonySaints
      @EbonySaints 11 months ago +2

      Honestly, at $100, even the A380 is a much better buy than the 6400 ($135) or the 1650 ($150), much less the 1630. The 6500 XT beats it in raw performance, but it's more gimped all-around unless you get the 8GB version. Even then, you don't have hardware encoding, and you need a 4.0 slot to get the most value out of it anyway, at which point you either buy the A580 in two months when they inevitably cut the price by $20 or go with the A380.
      The A310 is at a very silly price right now though, at $100.

  • @gatsusagara6637
    @gatsusagara6637 11 months ago +1

    It's much better than expected. It's probably pretty usable as an entry-level card if the drivers continue to improve. Only time will tell, I guess. Maybe in a year or two, if Intel doesn't abandon their driver team.

  • @konstantinlozev2272
    @konstantinlozev2272 11 months ago +3

    Conclusion: RTX 3050 is such garbage for its price that even Intel's recent first go at discrete GPUs wipes the floor with it.

  • @BitZapple
    @BitZapple 11 months ago +2

    When all the bug fixing, optimization (on the game-dev side, too) and price adjustments are done, this will be the go-to value option!

    • @andrewsnows
      @andrewsnows 11 months ago

      Then came the "price adjustment" 150%

  • @qlum
    @qlum 11 months ago +4

    If I were on Windows (I am not), I would make the following performance calculus at equivalent pricing:
    AMD (100%)
    Nvidia (95%)
    Intel (80% in the realistic worst case)
    Here is why I see it that way: I mostly care about being able to get good framerates, 80fps or so at 1440p, while keeping an image with limited artifacting / ghosting.
    The difference between AMD and Nvidia is small because of my personal distaste for the way Nvidia is pushing a closed, vendor-locked ecosystem while being by far the biggest vendor (those are very much personal considerations). But I will still weigh the obvious benefits: where DLSS can be acceptable, FSR2 is not, and I may even prefer FSR1, as its problems are less distracting. Better RT performance has so far been a minor thing, as I put more emphasis on looking decent at a good framerate than on looking good at a decent framerate.
    As for Intel, because of its hit-and-miss nature I would require all the games I want to play to at least run decently, so the averages are not as important, as long as the lower bound is decent. If I wanted to play Starfield and found out it is broken on Arc, or even some older game, that would just not be acceptable as long as there are other options. One or two broken titles would be acceptable, but the problems extend to more than that.
    Of course, I am not running on Windows; as it stands, Nvidia's Linux drivers are just a pain to deal with, and the basic standard of having multi-monitor variable refresh rate work with mixed refresh rates is just not met on Linux with Nvidia. Because of that, it's really a non-starter.
    Intel I would have to evaluate a bit more if it came to it, as Linux performance is a bit different from Windows here.

  • @shaneeslick
    @shaneeslick 11 months ago +1

    G'day Steve,
    Paul ran the new Intel driver for Starfield in his A580 review, and it helped increase performance to match the 3050

  • @aggies11
    @aggies11 11 months ago +4

    The driver issues in the launch driver (missing game-specific fixes) are really telling. It just goes to show that unless Intel is giving specific attention to your game, performance has the potential to be a disaster. And if they can't even prioritize games for reviews of their own card releases (whose deadlines they themselves set), then it doesn't bode well for the myriad of titles that a PC gamer in 2023 might want to play.
    Unless you only play a few specific titles and you know Arc performs well on them, I'd be hard pressed to recommend an Intel GPU yet. Real shame they botched the launch this bad. It's understandable given how hard making a modern video card is, but that means there was no excuse for them to attempt a full launch with a top-to-bottom stack. A single GPU, budget priced, while they work on their drivers; only once they've shipped a few generations and have that down would you start to branch out into more high-end cards. They could have hopefully kept their spending more in line too, so the AXG didn't have to be gutted, wasting lots of time and resources.
    Typical eyes being bigger than your stomach.

    • @Diegorskysp17
      @Diegorskysp17 11 months ago

      To be fair to them, it's two games out of tons that have such issues, and both games were pretty much broken at launch for Intel GPUs (so much so that the devs had to release vendor-specific patches, so it wasn't just on Intel's side), and Starfield in particular didn't even run well on Nvidia GPUs. Hell, it wasn't too long ago that similar stuff happened with AMD, where they were basically unsupported at launch. So, while they could (should?) have waited to have those optimized drivers, it's still worth noting that pretty much no game but those two had issues, so I don't think it's fair to call this a "botched" launch.

    • @HKNotch
      @HKNotch 11 months ago

      I wouldn't call this launch "botched" compared to the A700 and A300 series. They have also shown they've been hard at work on driver issues, so that's good.
      Starfield in particular ran badly on both Nvidia and Intel GPUs.

    • @RobBCactive
      @RobBCactive 11 months ago

      I guess you justify a large software budget with optimistic sales forecasts and many models. This cut-down SKU must be about flogging stock still held from the 2022 Q1 production run; both Koduri & Shrout have left Intel, and I cannot see the appeal, come 24 Q1, of Arc-branded Meteor Lake laptops.
      The problem is that Arc inherited a large media engine from a previous project, so cost-effective production of the low end would be an issue.
      Intel had to develop something new because their old iGPU designs weren't scaling, so they've suffered from past neglect of GPUs while AMD's APUs became ever more capable.

    • @aggies11
      @aggies11 11 месяцев назад

      @@Diegorskysp17 Yeah I wouldn't (and didn't) call this launch botched. But I think it is illustrative of the problem Intel has, and the potential issues if you have an Arc cards. There is no guarantee that the game you want to play won't have these issues. If the game you like works fine, then you are great. But these are high profile titles and even Intel couldn't get reviewers a performant driver for the card's launch. That should be a no brainer. So it just shows how hard it is. With an Nvidia and AMD gpu you have a pretty safe bet that your game will run decent enough, the vast majority of the time. With Intel, there is a bit more uncertainty. So unless you are very aware of what you are getting into, it'd be a tough recommendation.

    • @Diegorskysp17
      @Diegorskysp17 11 months ago +1

      @@aggies11 you kinda did: "Real shame they botched the launch this bad". But never mind that; you are right. Although I guess it doesn't matter that much whether the drivers are there for launch day, even if they miss the review embargo, it's true that Arc GPUs can't be blindly recommended to anyone. It's the kind of product where you should know what you are getting into and be willing to face/endure its problems. I just hope Intel manages to squeeze out more performance (which these cards seem capable of) and stability soon enough. We really need that reliable third competitor.

  • @102728
    @102728 11 months ago +2

    I bought an RX 6600 in May 2022. Had this card been out at the same $300-ish price I paid for the 6600 back then, with the performance shown here, I'd likely have bought it even with the still-maturing drivers and higher power consumption.
    The issue is that it's only out now. It's a gen behind and simply not really worth it. I'm glad about Intel's efforts and I hope they can close the gap within the next 2 gens, but right now I just can't help but wonder why it was even launched.
    Battlemage is rumoured to come out only next year and will face the same late-release issues, but it will at least have the advantage of much more mature drivers from the get-go. Still likely not very compelling, but I hope they power through, keep improving and get to a point where they can really compete. Fingers crossed

  • @jaydeepmohile
    @jaydeepmohile 11 months ago +6

    People don't realise what an incredible job Intel has done with the Arc GPUs. This is the 1st iteration and they are already competing with AMD & Nvidia. Yes, there are driver issues, and there are bound to be. Heck, AMD has driver issues with new cards even after being in the GPU market for over a decade. I really want Intel to continue with their GPUs. It will increase competition and put pressure on the other two brands.

  • @SalamaAhmed-pj3jv
    @SalamaAhmed-pj3jv 11 months ago +1

    When I first learned that Intel's GPU silicon would be made by TSMC at 6nm, I thought the cards would be very power-efficient. Now I'm skeptical about the 6nm claim: there's no way for us to verify it, so they can say whatever they want

  • @pascaldifolco4611
    @pascaldifolco4611 11 months ago +4

    The main takeaway from this benchmark is that the 3050 really is a piece of crap^^

  • @RurouTube
    @RurouTube 11 months ago +1

    One important thing that Intel supporters tend to forget is that Intel priced it this way because the GPU is NOT competitive with the competition, at least in terms of transistor usage, so the current pricing is not something that is actually desirable for Intel; it is something they were forced into because otherwise it wouldn't sell. This thing should compete with at least the 3060 Ti~3070 Ti and 6700~6700 XT. I say "at least" because the chip used for the A5~A7 series has a few more transistors than the other GPUs I mentioned. Those hoping Intel can keep this pricing for their next gen while also hoping for relatively great price to performance are probably going to be disappointed. This fine wine thing can happen for Alchemist because it is underperforming in the first place. If the GPU actually performed as well as its transistor count would suggest, it would be priced not far from AMD or Nvidia. Remember that for Alchemist they use TSMC, so they can't really get whatever discount might be possible by using their own foundry, and afaik their next-gen GPU will also be made at TSMC, so they really need to get their performance per transistor on par with AMD and Nvidia; otherwise it will be hard for Intel to actually make money on it. The A770 should perform like a 3070 Ti and the A580 like a 3060 Ti, but instead Intel compares the A770 with the 3060 and the A580 with the 3050.
    Personally, I'm not surprised that the A580 can beat the 6600 and 3050 in some games, because it should: it has 60%+ more compute and roughly double the VRAM bandwidth.

  • @StretchDattass
    @StretchDattass 11 months ago +5

    This just makes me more hyped for Battlemage! If they can hit their target of matching the 4080, or hell, even if they're 10% off, that's still near-4080 performance for probably $400-450! If it's not a buggy mess like Arc was at launch, it'll sell out day 1.

    • @defeqel6537
      @defeqel6537 11 months ago +1

      If they could actually hit that early next year, then I'd definitely go with Intel, but I think they will try to charge more

    • @ryanspencer6778
      @ryanspencer6778 11 months ago +4

      Don't expect 4080 performance for $450, even in their current state while they're trying to establish a market presence; they aren't a charity. If the top Battlemage card can get close to the 4080, expect $500-$600, which is still more than fair.

    • @zalankhan5743
      @zalankhan5743 11 months ago

      Don't ever assume that these companies are here for the consumer.
      As soon as Intel has a flagship-capable GPU, they will bump the price to match AMD and Nvidia.

    • @jaxonswain3408
      @jaxonswain3408 11 months ago

      Battlemage will NOT be as fast as a 4080, no way in hell. And you people are utterly delusional if you think Intel would sell 4080-like performance for $450. Lay down the crack pipe and learn to be rational.

    • @ryanspencer6778
      @ryanspencer6778 11 months ago +1

      @@jaxonswain3408 Why not? It's really more of a 4070 Ti. The A770 has specs roughly equal to a 3070 Ti, though the drivers don't let it use that hardware to its full capability. BMG-G10 is also supposed to be 56 to 64 Xe cores, up to twice as many as the A770. Then there are the architectural improvements. I don't think we'll see 4080 performance most of the time, but I do think the top Battlemage card will beat the 4070 Ti. As it is, the A770 already hangs with the 4060 a lot of the time.
      Nvidia really left the door open with everything but the 4090 by delivering such a terrible generational leap in the lower cards. And AMD dropped the ball with RDNA3. So Intel was basically gifted a free generation to play catch-up, which they desperately need, as they're about 1.5 generations behind in raw performance.

  • @Kneedragon1962
    @Kneedragon1962 11 months ago +1

    ONE MILLION SUBS! How cool is that!? Carn morn Ozzie!

  • @vealck
    @vealck 11 months ago +4

    The RX 6600 is the unquestionable budget king. And it has drivers that actually work.

  • @BertieJasokie
    @BertieJasokie 11 months ago +2

    Underwhelming by most standards, but with a low release MSRP it should come down to a price range similar to the 1650/6500 XT, which would make it sell well in countries like India, given a low enough price.

    • @givemeajackson
      @givemeajackson 11 months ago +2

      there's so much silicon on this thing that they're most likely already selling these at a loss. absolute dud of a card.

  • @danieloberhofer9035
    @danieloberhofer9035 11 months ago +3

    Conclusion A: The 3050 was horrible to begin with, is still a waste of silicon, and the fact that people still buy that thing is laughable. Only it's Nvidia's bean-counters doing the laughing.
    Conclusion B: AMD and Nvidia seriously need to stop cheaping out on PCIe and memory bandwidth. The fact that an x16 interface and a 256-bit memory controller become the A580's saving grace speaks for itself.
    Conclusion C: If Intel doesn't get the driver stack working far better (and, most importantly, consistently) by the time Battlemage launches, and it had better launch soon, it'll probably be game over for Arc. They simply won't be able to sell another generation of GPUs below cost. The A580 has the same ACM-G10 die as the A770; that's 406 mm² worth of TSMC N6 silicon. The 7600 uses half of that and pretty much roflstomps the A580 as it sees fit. And TSMC's N6 yields are excellent; these dies can't all be salvage.
    They have to get their sh*t together or Pat Gelsinger will have no choice left but to axe the project.

    • @EbonySaints
      @EbonySaints 11 months ago

      Conclusion C.2: Arc as a discrete product for consumers is dead/sidelined, but they're continuing development because it's going to be the iGPU architecture for Arrow Lake and subsequent CPUs, and thus they can't just throw it all away. Plus, Intel knows that AI is a thing, and Arc, despite the relatively crappy gaming performance, actually does great at that and at certain productivity tasks for a first shot. Regardless, they're going to have to continue whether they like it or not, because the old iGPU architecture was anemic at best and parallel workloads are the future, at least in regard to what's hot right now.

  • @venomtailOG
    @venomtailOG 11 months ago +1

    Didn't brand new drivers come out an hour after this video that net some extra performance?

  • @MuhammadAhmad-sw1tr
    @MuhammadAhmad-sw1tr 11 months ago +3

    3050 sucks.

  • @PineyJustice
    @PineyJustice 11 months ago +1

    The real joke is you can get a 6700 XT for $250 used right now. The most expensive $70 saved in history.

  • @elirantuil5003
    @elirantuil5003 11 months ago +3

    This is actually a winner. 63 fps in CP2077 at high settings at that price is insane.

    • @givemeajackson
      @givemeajackson 11 months ago +3

      the 6600 has provided that level of performance at the same price point for about a year now, without any of the issues plaguing Arc to this day. How exactly is that a winner?

    • @elirantuil5003
      @elirantuil5003 11 months ago

      @noir-13 sales are region-dependent; without them we didn't really have this level of performance at this price.

  • @h0stile420
    @h0stile420 11 months ago +1

    Congratulations on 1M subscribers :D

  • @marcnashaat4192
    @marcnashaat4192 11 months ago +1

    I think graphics card manufacturers are missing out on the massive number of gamers who are eager to upgrade to just a good value GPU right now.

  • @ilhamshobriantoro311
    @ilhamshobriantoro311 11 months ago

    Good start. The lackluster performance in some games might be caused by poor optimization and should be fixed in a later driver update

    • @zodwraith5745
      @zodwraith5745 11 months ago

      They literally addressed those specific games, among others, in their latest drivers. That's why Steve was saying it's weird they didn't have those updates in the review drivers they sent him, when you _know_ there are already better drivers out there.

  • @bzdtemp
    @bzdtemp 11 months ago

    Ads are one thing, but when the lines are blurred and the ad includes praise as if it were a review, it becomes questionable on several levels!
    I've worked in print media, and there was a very clear rule that ads and editorial content were not allowed to mix. It may all be innocent, but mixing them just opens up lots of questions about the editorial content.

  • @Venom012
    @Venom012 11 months ago

    It's nice to see Sparkle again (dating myself); they are a board partner from the '90s. They used to be a lot more common here in Europe, along with Elsa, Diamond, Orchid, etc.

  • @cam42704
    @cam42704 11 months ago

    You guys are so close to 1 million subs, 1k left.

  • @ElecTricKTitaN
    @ElecTricKTitaN 11 months ago

    I'm just glad they're giving it a bash amongst the chaos at the moment.
    Hats off to 'em, especially since they're shipping continuous updates.

  • @HDJess
    @HDJess 11 months ago

    999K subs, here we go. In the next video you'd better be showing us the 1M YouTube plaque.

  • @DaffyDuckTheWizzard
    @DaffyDuckTheWizzard 11 months ago +1

    I think this is all so crazy.
    The Intel Arc division is improving at a fast enough pace that I might change my mind completely by the time I think it's time to buy my new GPU.
    I'm genuinely impressed; this is one of those few things that gives me a positive outlook on the future of a market that just doesn't offer much excitement anymore

  • @amnottabs
    @amnottabs 11 months ago +1

    As someone who doesn't give a fiddle about ray tracing, I find it funny how it steamrolled the RTX card in the ray tracing results (not that the 3050 is an RT beast, but still funny nonetheless). Also, power consumption isn't something I usually take into account unless it causes heat dissipation problems for the whole system, but I wasn't expecting this card to consume THAT much power

  • @pyron674
    @pyron674 11 months ago +2

    I'd be keen to see the results for these GPUs, including the RX 7600, once FSR3 gets implemented in some of these titles. The new upscaling tech from AMD looks promising; nowhere close to DLSS, but still promising.

  • @megurinemiku8201
    @megurinemiku8201 11 months ago +1

    The Arc A580 would make a good pairing with an i3-12100F and an H610M motherboard...
    Edit: It's time for me to build a PC, I guess.

  • @CannedMan
    @CannedMan 11 months ago +1

    Considering your examination from a couple of years back, how would this GPU be affected by running in the kind of lower-tier system it is likely to be paired with, say something like the 13500 or maybe the 7600(X)?

  • @CalgarGTX
    @CalgarGTX 11 months ago +1

    Why does it come so late... Intel is fully capable of having 40-50 different same-gen CPU SKUs in their lists for the server market, but they couldn't release 5 GPU variants anywhere close to the same time, 4 of them using the same chip... hello?

  • @fynnplayz6501
    @fynnplayz6501 11 months ago +1

    Congrats on 1 million subscribers

  • @gabe4142
    @gabe4142 11 months ago +1

    This thing shits on the 3050 for 180 bucks; already a banger in my book

  • @Hybred
    @Hybred 11 months ago +2

    Intel today released a driver that improves performance across many games. Maybe it wasn't the launch driver, but it was very close to launch, so I feel like the comparisons, especially in games where Arc is borked, should've been redone.

  • @evergaolbird
    @evergaolbird 11 months ago

    999k subs; advance congratulations on a huge milestone for the channel.

  • @Hugs288
    @Hugs288 11 months ago +77

    It's very promising to already see Arc GPUs beating their AMD and Nvidia counterparts in some titles. Keep it up, Intel!

    • @Revenant-oq9ts
      @Revenant-oq9ts 11 months ago +5

      The tech is there; it's just the drivers that hold it back. I'm happy with my A750, and I show support in order to diversify the market. But I do hope Battlemage incorporates whatever lessons they've learned from Alchemist.

    • @givemeajackson
      @givemeajackson 11 months ago +12

      @@blk13 and their counterparts are two years old, use half the power, run on an older, cheaper manufacturing node, and don't have crippling driver issues in many games. Very hard for me to get excited about any of this, and I'm sure their shareholders think the same.

    • @Hugs288
      @Hugs288 11 months ago +6

      @@blk13 I mean, considering that it's a first-gen product and that GPUs are very complicated devices, it's not bad imo, especially since they are constantly improving the drivers.

    • @toivopro
      @toivopro 11 months ago +4

      @@Hugs288 it's their 3rd discrete run. Pat Gelsinger was the lead for their first GPU...

    • @maverickvgc4220
      @maverickvgc4220 11 months ago +2

      "Counterparts"? The die is the size of GA104; the actual counterpart would be the 3060 Ti

  • @Tigermania
    @Tigermania 11 months ago

    Intel needing driver updates for so many games is a problem, but that idle power draw is one problem they really need to fix asap.

  • @Gluosnis9
    @Gluosnis9 11 months ago +1

    Congratulations on 1M subs!

  • @fracturedlife1393
    @fracturedlife1393 11 months ago

    That memory bus is killing the others.

  • @tim3172
    @tim3172 11 months ago +1

    Intel has already stated that idle power consumption is an architectural limitation that cannot be fixed with a driver update.
    You'd be a straight fool to buy a GPU that tanks in performance unless the manufacturer hand-crafts drivers for every single game. How long do you think Intel will provide this level of support for these GPUs?
    I don't think I've ever had performance just crater on a reasonably spec'd AMD/Nvidia GPU just because I failed to install the driver that specifically makes it playable. That goes all the way back to the GeForce2 days.