The Great Blue Hope | Intel Arc A770 16GB

  • Published: 22 Aug 2024

Comments • 544

  • @IcebergTech 1 year ago +251

    CLARIFICATION: Resizable BAR was on 😊
    ERRATA: Time Spy 13034 graphics score, Firestrike 35252
    Thanks again to @HardwareLab for the loan of the Arc A770. Keep an eye on their channel for an upcoming video using my RTX 3070!

    • @IvanIvanov-ni4rs 1 year ago +6

      I love it when you say soixante neuf frames per second! 😃 I thought it was spelled Swason Neuve, but the French always surprise me with their spelling...

    • @markthompson4567 1 year ago +3

      In Jedi Survivor you need to hit the spacebar to accept changes, not the return key, even though that's the button it shows you to press.

    • @weltsiebenhundert 1 year ago +2

      Why not pinned?

    • @reinhard7164 1 year ago

      Thanks!

    • @IcebergTech 1 year ago +9

      @@weltsiebenhundert could have sworn I did! Thanks

  • @yasu_red 1 year ago +658

    It's impressive how well Intel has done for their first generation of desktop GPUs in over 20 years.

    • @neon_arch 1 year ago +38

      They are the only company we can have hopes for right now in the GPU (and CPU) space.

    • @raresmacovei8382 1 year ago +24

      This is disingenuous, given Intel has been making iGPUs for over 20 years.

    • @neon_arch 1 year ago +90

      @@raresmacovei8382 He meant discrete GPUs that can do much more than just drive a display output.

    • @raresmacovei8382 1 year ago +17

      @@neon_arch I know what he meant. I still believe Intel should've gotten MORE FLAK for their poor Arc launch, given it's not their first GPU.

    • @jacobwebb8818 1 year ago +24

      @@raresmacovei8382 This was exactly my first thought. They have been making integrated GPUs in many forms, from Intel Atom mobile processors to even their Core series. I'm not surprised their first launch went as poorly as it did; you have to give the engineers some credit. They are the top of their class at Intel, and if you expected a GPU to come out of the gate swinging as hard as it could at launch, you're simply delusional.
      This is a tight-knit market with not much deviation. If Intel had waited a few more months to sell the card, that would have been countless sales lost. You have to realize that when you get a "high-mid tier" card for $350 it's not going to be a rocket ship, and you can expect to wait for the best performance, and there's nothing wrong with that.
      Hell, the Radeon 7000-series launch was really poor with driver support; I think even the 6000-series cards were outperforming the 7000-series for a little while.

  • @sutripanchaudhuri2246 1 year ago +68

    We now live in the RGB Universe :D What a time to be alive

    • @Null_4045 1 year ago +4

      Just need to wait for an Nvidia CPU, then it'll be perfect.

  • @mtaufiqn5040 1 year ago +321

    Hope the next iteration will deliver great performance at little cost.

    • @jacobwebb8818 1 year ago +41

      The price will only go up. I don't believe by much, but I'm sure it will. I feel like this card was released as a loss leader to build up the market for Intel cards.

    • @prateekpanwar646 1 year ago +14

      @@jacobwebb8818 I agree, if we consider the new production line for a new product, low quantities, new drivers, the support team, the architecture. All of that can't beat years of mature production from Nvidia and AMD.

    • @eclipsegst9419 1 year ago +2

      @@jacobwebb8818 Probably a bit. But I also would not put it past Intel to undercut for the win. This has worked wonders for them several times in the past. When you own your own fabs and have a huge net worth, you can pinch the fat and pull such a move off. I know Battlemage is still on TSMC, but I'm pretty sure from Celestial forward it will be 18A.

    • @shepardpolska 1 year ago +5

      The card was released at a good price because it has many issues, and performance at release was not even worth the price. They didn't raise the price because they don't really make them anymore, and that would be bad PR.
      The A770 was supposed to be a 3070/6700 XT competitor and uses more silicon than both, yet it performs closer to a 3060. That's why the good price. It was a flop, and priced as such. It still NEEDS ReBar on to even be usable.

    • @DragonOfTheMortalKombat 1 year ago +3

      @@shepardpolska Why is everybody so pissed off about the ReBar requirement? My 5-year-old B450 motherboard supports it. Also, Intel is new to the market; they won't even dream of making GPUs as profitable as AMD or Nvidia until they get a decent, stable market share.

  • @DJ-fw7mi 1 year ago +16

    This can be Intel's Ryzen moment for GPUs. Aiming for the low to mid range with fair pricing from $200-400 and 12-20GB, plus a strong feature set, they could take a chunk out of the market. I might pick up an A770 16GB for around $300.

  • @ArthurD 1 year ago +172

    I think the A770 still has some power in it to be squeezed out. Massively good card from Intel.

    • @JynxedKoma 1 year ago +41

      On the contrary, it has LOTS more power to be squeezed out yet. It's essentially a 3070.

    • @McLeonVP 1 year ago +11

      @@JynxedKoma or 3070ti xd

    • @shepardpolska 1 year ago +15

      @@JynxedKoma It was supposed to be, but it's clearly flawed. If the architecture is at issue, you can only fix so much in drivers, so it's not a given that they can patch it to be much faster. There is a reason why they cut the higher Alchemist models from the market completely.

    • @sleepy9932 1 year ago

      Mate, should I buy the A770 or the RTX 3060?

    • @shepardpolska 1 year ago +8

      @@sleepy9932 Look at benchmarks for the games you like to play. At that price I would also consider AMD cards, since especially now they can be very good value.

  • @TechDweeb 1 year ago +34

    "Its main appeal is simply that it existed"
    Same. Same.

  • @BudgetBin 1 year ago +112

    What I have noticed in my experience with the Arc cards is that no one person's experience will be exactly the same, due to how random the outcome gets per PC. We literally uploaded our polarizing videos within minutes of each other.
    Very nice stuff! Glad you ran into fewer problems than I did!

    • @rockstar-5934 1 year ago +4

      That goes for all computer builds. What’s your point?

    • @anasevi9456 1 year ago +4

      @@rockstar-5934 Amen, and often people blame the GPU first when it's Radeon, or now with Arc. Yeah, their drivers are spotty, but 95% of the time the big issues are caused by something else in the system, in particular stability or performance deviation.

    • @CaptainKenway 1 year ago +17

      @@rockstar-5934 His point is that Arc is completely unreliable and all over the place in terms of performance from one game to the next, and that Intel's drivers are still garbage as of now, no matter how many improvements they've made. But then you know that, which is why you and your buddy there immediately got all defensive about it. Trying to pass that off as 'the same for all PCs' and 'perfectly normal' is just Intel shill disinfo, of course. I don't have to worry about MY GPU performing like a GTX 1050 in one title and an RTX 3070 in the next, or simply refusing to launch a certain game at all, because I don't own Arc and won't until Intel get their crap together. Nevertheless, you brave soldiers who paid Intel hundreds of dollars to beta test their product have my thanks.

    • @GewelReal 1 year ago +3

      @@CaptainKenway well the 770 offers some extra features, not just gaming

    • @cyjanek7818 1 year ago +4

      @@rockstar-5934 not to that extent

  • @USArmyVet91 1 year ago +30

    Shocked at how far Intel has come after all the issues at the initial launch. Although still not perfect, it is nice to see so much work being done to improve the drivers. Great video as always. 💪👍

  • @Azureskies01 1 year ago +56

    I watch HWB for graphs of current/high end stuff, I watch IBT for the mid range and older stuff and it is just as good.
    Nice video as always. Keep it up man.

  • @406Steven 1 year ago +22

    I bought an A380 out of curiosity and to support competition, bonus points for AV1 encoding (400+ FPS re-encoding my library). The A750 at $200 right now is, in my opinion, the best bang-for-buck card and if they drop the A770 to $270-300 it'll make the recent "budget" cards from teams green and red pointless. As an added bonus, Intel is releasing drivers seemingly every month and continually improving performance so you can count on them running even better as time goes on.

    • @frostcate2546 1 year ago +3

      Getting close now to that price mark. ASRock Phantom cards are currently going for $280 for the 8GB A770 model and $310 for the 16GB A770 (a current promo code knocks $10 off).

    • @406Steven 1 year ago

      @@frostcate2546 Good to know. I've been watching the prices a bit, and it seemed like $320-ish is where they were settling, with 16GB models still being too much for what you get. I'm not sure if I want to hold out for $180-200 for a 750 or go for a 770. It's not going to be in my main gaming rig, or even my spare, but rather a gaming machine I put together just because. From a hardware-collecting standpoint the 770 seems like the obvious choice, but I don't know that the extra performance is in line with the extra cost.

    • @frostcate2546 1 year ago +1

      @@406Steven Well, it's happened. The A770 16GB is sitting at $300 now.

    • @406Steven 1 year ago

      @@frostcate2546 Good to know, last I looked they were still going for $320. For the money I really don't know if you can do better for a new card. I'm very tempted to pick one up, though I already have 2 gaming PCs.

    • @frostcate2546 1 year ago +2

      @@406Steven Kinda wanted to pick one up myself lol, but idk what use I would have for it, since I already have an A750 and an RTX 3090.

  • @Zansilveira 1 year ago +27

    Your channel is everything I've ever dreamed of, and I hope to do it myself someday. Great job!

  • @ProjectPhysX 1 year ago +22

    Drivers were rough at launch, but are much better now. For the money you get 16GB of VRAM on a solid 256-bit bus. None of that 128-bit garbage that the other vendors are selling for a higher price. Looking very good for Intel!

  • @quinncamargo 1 year ago +23

    Finally, Intel enters the conversation. I can't wait for the next card they make. Now that Nvidia has really given up on PC gamers, I hope Intel comes in and saves the day.

    • @haukikannel 1 year ago

      Does Intel want to have 200%+ profit margins from AI GPUs, or 80% profit margins from gaming GPUs... hmm... or to sell GPUs at a loss or no profit like they do now?
      Intel will double the prices if the next gen is any good!

    • @NathanPlays395 1 year ago

      @@haukikannel They have promised that it will stay the same. Don't expect a huge price increase like Nvidia's.

  • @atomicskull6405 1 year ago +6

    Arc's D3D8/9 improvements have come from Linux and Wine. Instead of trying to get it to run D3D8 and D3D9 natively, they stopped trying and baked DXVK (Wine's DirectX-to-Vulkan translation layer) directly into the drivers. This allows them to have a D3D12/Vulkan-optimized GPU while maintaining good performance in D3D8 and D3D9 (on my RX 6700 XT, many D3D8 and D3D9 games actually run better in Linux under Wine than they do in Windows 10).
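
The DXVK-based translation path described in that comment can be reproduced by hand on Linux. A minimal sketch, assuming DXVK's replacement DLLs are already installed into the Wine prefix; the game path is a placeholder:

```shell
# Prefer the native (DXVK) DirectX DLLs over Wine's built-in ones, so that
# D3D8/D3D9/D3D11 calls are translated to Vulkan - the same idea Intel's
# Windows driver applies internally for Arc.
export WINEDLLOVERRIDES="d3d8,d3d9,d3d10core,d3d11,dxgi=n"

# Optional on-screen overlay: frame rate plus the D3D feature level in use.
export DXVK_HUD=fps,api

wine /path/to/game.exe
```

The same override can also be made permanent per prefix via the Libraries tab in winecfg.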

  • @Carstuff111 1 year ago +27

    It is nice to see that Intel is putting in the work on their drivers to make their cards better. That said, I am also glad that I got my hands on an RTX 2080 Super to replace my GTX 1070, and now an AMD Ryzen 5 5600X to replace my AMD Ryzen 5 1600X. The Intel cards have decent price and performance now, and will likely get somewhat better over time, but I can't find a new card that matches the performance per dollar I got my 2080 Super for. And the 2080 Super, or cards in that performance range, seems to be a perfect match for the 5600X.

    • @youtubeisgarbage900 1 year ago +2

      Cope

    • @youtubeisgarbage900 1 year ago

      @Gustav0 He is in buyer's remorse mode and trying to cope. Nothing will beat a brand-new RX 5700 XT for $180 in 2019.

    • @onepcgamer6810 1 year ago +1

      @@youtubeisgarbage900 Buyer's remorse mode?

    • @mrpiratefox4497 1 year ago

      @@youtubeisgarbage900 Those AMD cards didn't even have any DLSS equivalent, and didn't even have RT cores; buying AMD that gen was a waste of money.

    • @youtubeisgarbage900 1 year ago

      @@mrpiratefox4497 $180, and AMD FidelityFX is a thing; ray tracing is a meme that doesn't matter.

  • @kadian666 1 year ago +17

    You should do the A750; it has 90% of the performance of the 770 and can be had for $200. It is the best budget card right now. I have one paired with an i3-12100F, a 2TB NVMe drive, and 16GB of DDR4, and I was able to build that for about $650.

    • @LanciaSiluri 1 year ago

      A poor man that is bragging lol

    • @jenous736 1 year ago +8

      @@LanciaSiluri his CPU performance is a million times better than mine though 💀

    • @NathanPlays395 1 year ago +6

      @@LanciaSiluri a rich guy coping on value lmao

    • @AhmadWahelsa 1 year ago

      @@LanciaSiluri And what is wrong with it? I got myself a decade-old Core i5-3470 and a five-year-old RX 560 4GB in an ASRock B75 mobo, with dual-channel Team Elite DDR3 8GB RAM, a Samsung 870 EVO 1TB and a WD Blue 1TB as storage devices, and an office Logitech mouse and keyboard, of course with an ancient, prehistoric 1366x768 monitor.
      How's that sound?

    • @LanciaSiluri 1 year ago

      @@AhmadWahelsa another clown that brags about old crap, no one gives a flying f what you bought/have

  • @falloutnewvegasboy 1 year ago +10

    This GPU drop has been the most fascinating since the Radeon VII. I love misfit cards

  • @RafaleC77th 1 year ago +8

    I have one as well... on a 13900K with 64GB DDR5-5600. NGL, there were some growing pains with this card. And yes, the 16GB of VRAM was a major deciding factor in choosing it. My ultrawide display is limited to 1080p (I might update that at some point), so something like the A770 was not a stretch. The card is honestly an 'interim card'. The rough-around-the-edges parts of the experience begin with older titles like Star Trek Online. The card sings with great performance in newer titles like Sons of the Forest. I feel and see the difference between Intel's card and the other two. I was going to get a 4090, but none were ever available.

    • @jeffreygrindle6396 1 year ago +5

      The A770 is great for 1080p for sure, and it's an amazing capture card as well. In truth, I think it's going to age a lot better than a lot of the GPUs priced around the same.

  • @revx2664 1 year ago +56

    Considering the specs of this GPU, it's a bit sad to see how much the driver is holding it back.

    • @RoCkShaDoWWaLkEr 1 year ago +6

      That was the story of my life when I had an 8GB RX 580 a few years back.

    • @damazywlodarczyk 1 year ago

      How would you know? You think you know better than high-IQ tech engineers with MBAs who work at Intel? You don't.

    • @IvanIvanov-ni4rs 1 year ago +1

      @@damazywlodarczyk Intel fanbois never cease to amaze me with their stupidity.

    • @KyleRuggles 1 year ago +1

      @@damazywlodarczyk WTF is wrong with you?

    • @KyleRuggles 1 year ago +2

      Agreed!

  • @anasevi9456 1 year ago +17

    CDPR also made general optimisations for Arc during the XeSS integration in Witcher 3 and CP2077. With low-level APIs, both DX12 and Vulkan, a lot of the onus for performance is also on the game engine developers, not just driver teams. Expect Arc performance to improve significantly as game engines take Arc's existence into account over time, too. Great video, and glad things are improving. Try XeSS: it's already proven really good, visually way better than FSR2 even in software mode if you use a GPU newer than RDNA1, and it's faster too in Arc's hardware mode.

  • @Omonarc 1 year ago +19

    Interesting you brought up Spider-Man, I was finishing up Miles Morales after a recent patch and it was behaving very similarly on my 1700/5700XT and my Steam Deck 🤔

  • @dbod4866 1 year ago +4

    I have an A750 and play a ton of older games: Pillars of Eternity, Divinity, and even BG2. No glitches. No artifacts. I run at 4K on an 11700K. The A750 is the best deal in GPUs. Notice that when people complain, they never list the broken titles; they just throw it out there. Breaking hearts. 🎉

  • @ricodo1244 1 year ago +9

    Intel was the first with AV1 encoding, if I remember correctly, which would count as "innovation", even if pretty much no software supported it.

  • @SwissMedi 1 year ago +8

    I bought an Arc A750 some time ago, and other than two times when it wouldn't run a game, I had no performance issues whatsoever. So experience may vary for anyone, and I'm confident that the driver is the thing holding back these GPUs.

    • @njm12 1 year ago +1

      I love Iceberg, but as an A750 owner myself, he definitely did or didn't do something that hindered the performance. Resizable BAR, maybe.

  • @eclipsegst9419 1 year ago +3

    For a first gen this is a solid product. 30+ years of backwards compatibility is one HUGE hurdle that many who have dissed this card overlook. I bought a 6700 XT this year because, well, it's better. But I would love to see Intel keep up the good work, and maybe I'll buy a Celestial or Druid from them in a few years.

  • @Tony_Re-imagined 1 year ago +2

    RGB: red, green, blue. I feel like this is the beginning of huge rivalries, which is good for us, because better GPUs.

  • @daviddesrosiers1946 1 year ago +8

    Just ordered an Acer Predator Bifrost A770 OC. My primary use case is creation and editing. Very little 3D. Looking forward to using its features.

    • @KadiusFTW 1 year ago

      Supposedly coming in a month and a half

    • @daviddesrosiers1946 1 year ago

      @@KadiusFTW Mine shipped this morning.

    • @KadiusFTW 1 year ago +1

      @@daviddesrosiers1946 saying mine is

  • @ReetinEntertainment 1 year ago +16

    From what I have seen the A750 performs pretty well compared to the A770. The A750 can be had for around $200 in the USA as well. I use mine in my streaming PC in the hopes that someday Twitch will start supporting AV1 encoding.

    • @SIPEROTH 1 year ago +2

      It does. If you look at the hardware specs it's pretty close as well. It's only the 8GB vs 16GB that is the bigger difference, but considering the performance of these cards I don't know how much it matters.

    • @ReetinEntertainment 1 year ago +1

      @@SIPEROTH Yeah, the software is the real issue with these cards. Once they get better drivers they're gonna be killer. They have already improved performance by a lot.

  • @flamingscar5263 1 year ago +6

    One reason to go for the A770 that I'm surprised you didn't mention is AV1 encoding.
    Even if all you do is clip your gameplay, AV1 makes everything look better, and if you stream on YouTube then you can stream with AV1 to save some bandwidth, or make your stream look better at the normal bandwidth you stream at.

    • @swiiishgin1515 1 year ago +3

      All 3 Arc cards have AV1.

    • @redslate 1 year ago +1

      Definitely looking forward to more AV1 support in the future.

  • @CardCaptorKaren 1 year ago +4

    Your conclusion is about the same as I always think when I'm looking at the Arc, though the fact that a year later the A770 looks better than it did at launch and still like a valid option has me hopeful for the Battlemage cards.

  • @jqwright28 1 year ago +11

    I'd enjoy seeing you review the A750 8GB sometime. Now that it's $200 USD for a new one, I think that particular card is hard not to recommend. I have the A770 16GB; it just hasn't got the rasterization power it should based on its specs, and it really doesn't outperform the A750 like it should. I actually think you'd be impressed with the A750, for the price.

    • @100500daniel 1 year ago +1

      The Arc A770 is held back more by driver optimization.

    • @jqwright28 1 year ago

      @@100500daniel Perhaps, in some games, but I think with a lot of DX12 games, where it ought to perform like a 3070 or better, it just doesn't have the power, and I don't think they will get the drivers to better that.

  • @discopatw 1 year ago +1

    4:38 "starts to look cinematic... in a bad way"😂😂😂

  • @slippedsquid2311 1 year ago +2

    For some reason I just like your channel more than other review channels. Keep it up

  • @AurioDK 1 year ago +4

    A strategy for Intel could be to avoid competing in the upper-class segments; going up against the likes of the 4090 at some point in the future might not be a good idea. What they could do is invest massively where all the money is, in the mid and low tier segments, followed up by affordable prices; that means going hard at the 1080p/1440p market where most people still find themselves.
    1% of Steam users are on 4K, and people like me are not going to transition anytime soon; I have other priorities, like most people. For me personally, the temptation to upgrade only comes in the form of a significant performance improvement while also maintaining a sane price. Intel could actually do that if they focus.
    While I don't believe Intel should compete to be the top dog - it's simply a risky and very expensive undertaking - it doesn't mean they can't work on it while developing something most gamers can actually benefit from.

    • @redslate 1 year ago

      A halo card can be a customer-draw, but Intel is focusing on the core market (low to mid tier). This is a smart move early on.
      If the next few gens find their footing, I'm sure they'll begin pushing into the high tier market.

  • @Zxanonblade 6 months ago +2

    I'm using ASRock's 16GB A770 at 1440p and it's an amazing card. Definitely glad I decided to go with Arc!

  • @84jdgregory 1 year ago +3

    I bought one of these late last year and was super excited. It did awesome in Time Spy, but when I started playing MW2 the frame rate was way below the 3060 Ti I was using at the time, and for some reason I was having problems getting into lobbies. A couple of times the game would flat-out freeze, or my system would crash.
    Maybe some of those bugs have been worked out, but for $350 I was expecting at least a stable gaming experience.

  • @cristianroth8524 1 year ago +3

    I love everything about this card. The looks, the performance, the potential: all are off the charts, considering Intel is still a rookie in the GPU world. Already owning an RTX 3070 is the only thing stopping me from buying an Arc for the time being.

  • @t_z1030 1 year ago

    I've only just stumbled across your channel, and I have to say that what you're doing here is excellent. It's rare to see a hardware review video that is so incredibly well produced. I hope the YouTube algorithm is kind and merciful to you; you deserve to get a lot bigger.

  • @arranmc182 1 year ago +9

    I think in the near future the A770 will start to perform on par or better in new games, as the 8GB on the RTX 3070 is going to hurt it, and worst of all, I own an RTX 3070 🤦

    • @lordshitpost31 1 year ago +4

      As an Arc owner, I'd say the A770 is the equivalent of a 3060 Ti, while the A750 is the equivalent of a 3060. In computing power Intel exceeds the competition, but it's really hard to turn all that into gaming performance.

    • @phoenixrising4995 1 year ago +3

      @@lordshitpost31 The spec should put it close to a 3070 going forward, but it has 16GB of VRAM while a 3070 has only 8GB.

  • @ausfoodgarden 1 year ago +3

    I got an A770 16GB slightly used from a frustrated gamer, for about half the new price, before the drivers got usable.
    I use it mainly for video work but have recently tried a couple of games, and it's pretty good.
    I hope Battlemage will be another big step up, and I'm planning on buying one new next time.
    Oh, and I'm kind of locked in to Nvidia for my main box because of Blender. I didn't buy this generation.

    • @calummckinney 1 year ago

      I just bought one used, primarily for video editing. Very curious to see how it goes... I was strongly compelled to go with this over Nvidia for some reason.

  • @The_Fat_Turtle 1 year ago +2

    I'm excited for Battlemage; the second generation of dGPUs has real potential, especially with RT performance.

  • @JeredtheShy 1 year ago +2

    Hmm, looks like the drivers still need more massaging, but the performance is there. The 770 seemed like a real disappointment at first, but Intel is at least in the ballpark. It's a solid first showing after all, though I think we'll all be more interested in the follow-up. It really is too bad that Intel didn't get a piece of the "price no object" GPU market; that would have encouraged them a lot. But hopefully they intend to give the competition a proper run and don't give up over weak sales for their first cards.

  • @Scitch87 1 year ago +2

    I would love to build a system with an Arc A770. It's absolutely impressive what Intel has achieved with their first line of dedicated GPUs.
    I'm really excited about how they will perform in the future.
    But Intel's prices don't make it easy. The Arc A770 costs as much as the RX 6700 XT where I am... so... yeah, that's a pass for most people at the moment.

  • @FireWoreal 1 year ago +2

    "The CPU has potential to be a bottleneck with this class of GPU" - I'm really happy for jokes like that.

  • @atoo7yt490 1 year ago +5

    wake up babe new iceberg tech video

  • @GoldenGyroBalls 1 year ago +1

    Using a clip from Tomorrow Never Dies in the opener is an easy way into my heart, thank you.

    • @IcebergTech 1 year ago +1

      You're lucky, I was almost going to use a clip from Austin Powers

  • @robosquidz 1 year ago +3

    I kinda want to get one for my first build, because I've heard Blender performance is pretty good on Arc.

  • @oktc68 1 year ago +12

    Getting good performance in ray-traced games at 1440p isn't cheap; tbh only the RTX 4070 Ti or higher GPUs meet my standards. It's great to see Intel enter the fray. At this point I'd rather give them my cash than Nvidia; fingers crossed for Battlemage. 👍🏻

  • @SB-pf5rc 1 year ago +1

    congratulations on 30k subs!

  • @BNBPhotofr 11 months ago +1

    I honestly believe that it's more a matter of software optimization than raw hardware performance. I think it should age quite nicely.

  • @arthurbrax6561 8 months ago +1

    I am an old guy.
    I was there when the Intel i740 failed big time.
    I was also there when Larrabee failed as well.
    Glad that Intel finally got it together on their third try.

  • @Map71Vette 1 year ago +1

    I've been very tempted by the Intel cards, but have a few major hangups still. The main reason I was even thinking about a new card was to give me better performance for VR, but VR and ARC don't get along all that well just yet. I was also looking at a 3D scanner for fun, but the one I am most interested in requires an Nvidia card, so that's more rain on my parade. Hoping Intel keeps up the pressure for future releases though as they look to be really good performance for the money on games that work well with them.

  • @Tainted79 1 year ago +2

    Oh wow, I've never had or seen freezing like that before in Spider-Man Remastered.
    My rig: AMD 5600X @ 4.8GHz, 32GB CL18 RAM, RX 6700 XT 12GB, 8TB 7200RPM HDD.
    Also, I would've loved to see 1080p benchmarks. Cheers

  • @adinnugroho6544 1 year ago +10

    I remember when Intel Arc was criticized for its bad drivers. Almost like Radeon in the past, but then it got better and better, maybe as Intel got feedback from their users.

    • @abdullahnaeem7275 1 year ago +7

      Yeah, the drivers are improving at a great pace.

    • @redslate 1 year ago

      Yeah, even out of the gate, I recall watching Gamers Nexus (whom I respect immensely) figuratively ripping into Arc GPUs and thinking to myself, "That's all just software issues? This is their _first_ foray. They can patch *all* of these faults. The hardware is incredible!"
      Intel's driver support has since been second to none.

  • @jeffreygrindle6396 1 year ago +3

    I have a feeling Battlemage is going to be a ball-buster for teams green and red. Intel is really bringing the value.

  • @oscarfiala2104 1 year ago +1

    Red, Green, Blue. Perfect, especially for GPUs that produce color.

  • @jernaugurgeh451 11 months ago

    This was a good video - informative and entertaining, and I really liked the intro and the potted history of Arc. Nice to see a UK-based TechTuber doing good stuff like this.
    I just picked up an ASRock Phantom Gaming Arc A770 for just £190, new from a big tech retailer via eBay. That's a premium 3-slot card with 3 fans. I don't like the design of it so much, and I really wanted an Intel Arc Limited Edition card because they look so cool and are a slice of GPU history, but the cheapest I found was £230, and only for an A750.
    Even though I know it's not going to match my RTX 4070 FE for performance or compatibility, I'm really looking forward to having some fun tinkering. I'll probably build a separate system to put it in eventually, though I've got to pay off the interest-free credit for my 4070 first!

  • @terrieterblans7027 1 year ago +1

    My first graphics card was an Intel i740. Overall a very good card, but I had to change it for a 3D card. I have been with Nvidia for a long time, but had one ATi card. I am on the cusp of buying a new GPU. I am going to buy Intel again.

  • @maxsettings2906
    @maxsettings2906 1 year ago

    Your voice makes me think of the intro to the early-90s show Beyond 2000

  • @Alphawolf2325
    @Alphawolf2325 7 months ago

    Would love to see a do-over! I've been doing some looking and it would seem that driver updates have helped a LOT

  • @soldiersvejk2053
    @soldiersvejk2053 6 months ago +1

    I love my A750 purchased about a year ago. The design is just beautiful.

  • @okuyashoe_official
    @okuyashoe_official 1 year ago +1

    I love how the first couple of games he chooses have known performance issues

  • @Ri_Penguin
    @Ri_Penguin 1 year ago +1

    I have the Acer Intel® Arc™ A770 16 GB Predator BiFrost OC graphics card with an AMD CPU, and this card is amazing. I've tested it with all the games above apart from Warzone and I can't say one bad thing about the card. I'm impressed

  • @GewelReal
    @GewelReal 1 year ago +3

    Such a shame such a promising GPU is held back by its drivers. Come on Intel!

    • @Unethical.Dodgson
      @Unethical.Dodgson 1 year ago +1

      Not exactly a new story. Happens with AMD on every generation and Nvidia has had their own share of major driver issues in the past.
      But at least the stability is still better than I've ever experienced with AMD... the amount of times a game would crash due to "AMD Driver has stopped responding" across no less than EIGHT different cards... fucking hell. That was bad.

    • @TheMasterOfSafari
      @TheMasterOfSafari 1 year ago +5

      @@Unethical.Dodgson Seems like something is wrong with your setup and not AMD then.
      Did you DDU, for instance?

    • @WyattOShea
      @WyattOShea 1 year ago

      @@TheMasterOfSafari I agree. Half the GPUs I've ever had have been AMD, and most major issues were due to my own ignorance in those days (when I was starting out with PC gaming and didn't know about stuff like DDU or not installing new drivers over the top of old ones).
      Most major issues have been ironed out on Nvidia and AMD cards over the years in my experience, with almost exclusively minor issues remaining from time to time.
      The vast majority of the time it's down to user error or Windows needing a fresh install rather than the hardware itself being bad. Also, Nvidia and AMD drivers are basically the same from a stability standpoint.

  • @styles234
    @styles234 1 year ago +4

    I wonder if the major stutters could be related to resizable BAR not being enabled? I know that's a huge bottleneck for Intel cards.

    • @Enkirex
      @Enkirex 1 year ago +1

      Resizable BAR is a requirement for Intel cards; I mean, if it were disabled you would see worse performance than that.

    • @IcebergTech
      @IcebergTech  1 year ago +3

      ReBAR was enabled

  • @LegendaryDJ
    @LegendaryDJ 1 year ago +1

    Honestly the real winner here is the 1080ti. Over 6 years old and still keeps up with the mid range today. That's of course no praise specifically to the 1080ti, but in fact more proof of the stagnation of the midrange that's occurred for far too long now.

  • @eccodreams
    @eccodreams 1 year ago +1

    Very excited to see the A770 on your channel!

  • @griffinhigh6646
    @griffinhigh6646 1 year ago

    How tf is your channel so small? Your content is so good

  • @billchildress9756
    @billchildress9756 1 year ago

    I think I posted here earlier about this card, and I can say that it has impressed me with each driver update. The last update included a firmware update that corrected a boot-up issue I was having with this card, and no more problems! Intel listens and responds to those who are having issues very quickly! On the Heaven benchmark with max settings I am getting even better frame rates: avg 50 low and 150 high. With an AMD Ryzen 5800X3D and 32 GB of 3600 memory, it was a very good investment last year. I don't care for buying new parts every year, so I'm looking for long-term usability. As expensive as new cards are getting, I'm going to wait and see what Battlemage is going to be in price and performance. Between scalpers, miners and general greed at the start of 2020, it's nice to see some sanity emerge from all of this.

  • @lordshitpost31
    @lordshitpost31 1 year ago +2

    I think it's your CPU that causes the bottleneck in Spider-Man; I've never seen another Arc review with that

  • @hrmd3537
    @hrmd3537 1 year ago +1

    My next card will probably be from team blue. Hopefully they keep making these improvements. I see Intel taking 30% of the market share within 3 years, if they keep improving the drivers and AMD and Nvidia keep prices reasonably high. Though the Radeon group has dropped their prices. At the same time, Intel has some great features. What keeps a lot of gamers from getting one is the bugs. Crush the bugs, offer a solid 1440p gaming experience at sub-€400, and you will have >10% within a year.

  • @wjack4728
    @wjack4728 7 months ago +2

    The Intel Arc cards are very nice, but since DOSBox doesn't work with them, they are useless to me. Edit: Intel finally fixed the Dosbox Windows issue with their latest drivers, so all is good now.

  • @mofstar8683
    @mofstar8683 1 year ago +2

    I've been waiting for this one!!!! Just started watching, but for anyone that sees this: Spider-Man Remastered and Miles Morales have both just been updated to XeSS 1.1, and in future the DLLs can be swapped out version to version on any single-player title, like DLSS

  • @realfeils4655
    @realfeils4655 1 year ago +1

    As for the Spider-Man issue, I had it on my i5-10400F and RX 5700 XT when the game was on my HDD.
    It was unplayable; then I transferred it to my NVMe and it only happened once through the whole game.
    Not sure why it's happening for you, but for me transferring it to a faster drive fixed the issue

  • @Nick_R_
    @Nick_R_ 1 year ago +1

    The most striking conclusion here is how good the GTX 1080ti still is. Oh for a manufacturer to still be making their top card available at a relatively reasonable cost.

    • @redslate
      @redslate 1 year ago +1

      An elegant weapon, for a more civilized age.

  • @vogonp4287
    @vogonp4287 1 year ago +1

    I hope that Intel's cards do as well as possible. GPU prices need to be brought down.

  • @johnm9263
    @johnm9263 1 year ago

    This is actually amazing for a gen-1 competitor
    They just don't have the existing infrastructure that their competitors do

  • @WSS_the_OG
    @WSS_the_OG 1 year ago

    Your writing skills are superb, my good man. Your prose always amazes me. Terrific video. A definite soixante-neuf out of soixante-neuf from me (not that my door opens that way ... you understand).

  • @micahottaway8455
    @micahottaway8455 1 year ago +1

    It's not available for delivery on Newegg in the UK. Hmm... I consistently get better prices for new PC hardware from Newegg even when delivered to the UK from the US.

  • @gecko2000405
    @gecko2000405 1 year ago

    I can't believe we're still using the 1080ti for benchmarks five years later; regardless of price.

  • @Deathdemon65
    @Deathdemon65 1 year ago +1

    Most of the games that underperformed were Sony ports, see the trend? So on their side Sony are not optimizing games for Arc; it's all the drivers. So imagine the future ports that do support Arc

  • @alderwield9636
    @alderwield9636 1 year ago +1

    Can't wait for an RGB trio, one for each team respectively

  • @darthwiizius
    @darthwiizius 1 year ago +1

    C'mon Intel stick with it because you know that 3 is the magic number.

  • @harlstrum833
    @harlstrum833 1 year ago

    I've only ever had those Spider Man remaster issues when running my game from a hard drive. Once I switched it to an ssd it was BUTTERY smooth. Wicked good optimization after the cpu patches they did.

  • @zaliasis1366
    @zaliasis1366 1 year ago +2

    Hey Iceberg, you should add Cities: Skylines as a game for CPU testing, because that game really likes powerful CPUs. I don't see many PC hardware youtubers use that game, so I would really like to see Cities: Skylines

    • @cristod1904
      @cristod1904 1 year ago

      Just to add: the city for the benchmark scene should have 150k or more population, since small cities don't cause much trouble for the CPU. IIRC I've got a city like that where there are 2 very crowded subway stations connected by a pedestrian bridge, filled to the brim. Zoom into that and the FPS really bogs down, to under 30 or so. Running on a 5700X with PBO.

    • @zaliasis1366
      @zaliasis1366 1 year ago +1

      @@cristod1904 Yeah, and your CPU is good; mine is an i5-10400F and I have a big city with a lot of mods and assets :) With this setup I get 15 FPS in the busy area XD. But at least I'm planning to upgrade for C:S 2. I have 2 options: upgrade to the AMD 5000 series or to an i7-10700K and OC it.

    • @cristod1904
      @cristod1904 1 year ago +1

      @@zaliasis1366 IMO maybe avoid the 10700K; it's still quite expensive, isn't it? For AMD maybe have a look at the 5800X3D, or Intel's 12600KF or 13600KF

    • @zaliasis1366
      @zaliasis1366 1 year ago

      @@cristod1904 I was planning to buy the i7-10700K used for about 170 euros. But I was also thinking of AMD's AM4-platform 5000 series because of their low power draw relative to performance. Intel's 12th and 13th gen power draw is so crazy that I didn't even consider Intel. Until the 12th and 13th gen Intel CPUs I liked Intel, but now I'm almost certain that I'm going to transition to AMD.

    • @zaliasis1366
      @zaliasis1366 1 year ago

      @@cristod1904 Does the extra cache help for Cities: Skylines?

  • @alexalex5121
    @alexalex5121 1 year ago +6

    Just not a good idea compared to RDNA 2; hopefully Battlemage can improve the situation and maybe offer 16 GB of VRAM at a sub-$300 MSRP? 👀

  • @nicholasbrooks7349
    @nicholasbrooks7349 4 months ago +1

    A better comparison would be an RX 7600, RTX 4060, RTX 3060 and RX 6650 XT, similarly priced at 280 to 300 USD for me.

  • @Complextro93kg
    @Complextro93kg 1 year ago +2

    For a first gen it's great.

  • @notfunny3397
    @notfunny3397 9 months ago +1

    Considering the A770's die size and power consumption despite its somewhat small node, I think this GPU will carry on the legacy of AMD's FineWine.
    Maybe I'm just stupidly hopeful though

  • @filipborch-solem1354
    @filipborch-solem1354 1 year ago +1

    2:57 I can’t believe Intel has “memory utilization” in their overlay. Nvidia doesn’t with GeForce Experience.

  • @bighersh14
    @bighersh14 1 year ago +4

    Once Intel eventually gets the drivers figured out for their cards they will definitely be a contender for value gpus in the future.

  • @oswaldmosley6179
    @oswaldmosley6179 1 year ago

    This card should be the next progression of the A750 for the refresh and a BIG driver update.

  • @acconboy
    @acconboy 1 year ago

    Running the Arc with an AMD proc hobbles its capabilities right out of the gate. Several of its capabilities come into play when paired with a 12th gen or newer i7 or i9

  • @PyromancerRift
    @PyromancerRift 1 year ago

    ARC released right at the end of shortages and when the second hand market was flooded with good deals. They missed their train.

    • @Finckelstein
      @Finckelstein 1 year ago

      I mean, even among new cards it's performing quite badly. Iceberg tested it against a 6700, not against the A770's direct competitors, namely the 6700 XT and 6750 XT, both of which offer a way better value-to-performance ratio.
      I hope they stay in the game, since competition is good for us consumers. But the current gen certainly can't compete with AMD.

  • @6TheBACH
    @6TheBACH 1 year ago

    The only hope for the blue team is they start selling these kind of cards at 250$ or even lower.
    Low/Medium market is still struggling, and whoever taps it, will win the race in the long run.

  • @Wild_Cat
    @Wild_Cat 1 year ago +1

    Intel Arc FTW!

  • @saschapurner9579
    @saschapurner9579 1 year ago

    Last week I opened one of my Arc cards and instantly closed the backplate again, because I've never seen so many test points on an electronic device. She is a beauty and most people will never see her naked. ^^

  • @Posh_Quack
    @Posh_Quack 8 months ago +2

    More competition is more better. I look forward to the next generation.

  • @FatheredPuma81
    @FatheredPuma81 1 year ago

    You forgot to test older games. I've heard a lot of DX9 titles outright won't launch.