Switching to Intel Arc - Conclusion!

  • Published: 7 Oct 2024

Comments • 2.9K

  • @Sakosaga
    @Sakosaga Год назад +5108

    I hope Intel is taking notes. This is probably the perfect series of videos for them to get feedback and bug-fix everything, which is good to hear Linus confirming as well.

    • @MihkelKukk
      @MihkelKukk Год назад +29

      surely they must be!

    • @robotredkitten817
      @robotredkitten817 Год назад +122

      They are basically super users. Companies use people like that to make their products better. They listen to that kind of user more because those users have enough knowledge to report problems clearly.

    • @benwu7980
      @benwu7980 Год назад +13

      Let TP run it, get rid of RS. Possibly the most cringeworthy set of videos I've watched in recent years was seeing Tom Petersen, with his superior engineering qualifications and enthusiasm, next to Ryan Shrout being a complete corporate marketing lackey shutting down the most interesting topics.
      I get that RS has a job to do, I just can't give him any respect when in every Arc promo video he was in, he came across as a complete sell-out. Every. Single. Video.

    • @GOWvMatchstick
      @GOWvMatchstick Год назад +19

      They absolutely are. I really hope they make a dent in the market. Options never hurt. It's also worth mentioning how cheap the A770 can be.

    • @pirojfmifhghek566
      @pirojfmifhghek566 Год назад +42

      Oh, they ABSOLUTELY are taking notes. Videos like these are some of the most valuable beta testing they can get outside of the lab. The problem with products that go out into the wild is that very few customers actually write back to them with their gripes and complaints. Most who feel irritated enough to say anything do so in a one-star review with "this card doesn't work" as their critique. Usually the only chance to start a dialogue happens when a customer tries to RMA the card, but the vast majority of people simply shrug at the minor inconveniences or replace the device with something else. "I knew what I was getting into buying the first gen" they'll say, not realizing that they're perpetuating the problem by not saying anything.
      You can tell Intel is listening, because they were in close communication after each episode. One day we'd be hearing complaints from Linus on his show, then by the next show he'd be saying that Intel reached out to him immediately with driver fixes, patches, vr alpha drivers to test out. And paying attention to the feedback is honestly what they have to do to prioritize how they put out the neverending brushfire of problems that come with delivering a GPU to market. They're hopelessly behind the curve in this generation of cards, but it gives me hope that the next release will be much more stable and not have quite so many quality-of-life problems.

  • @Jsteeeez
    @Jsteeeez Год назад +5836

    Knowing that Linus wrote the script for him and Luke makes this funny. You can listen to Luke talk and it sounds like something Linus would say.

    • @vectious2237
      @vectious2237 Год назад +729

      You can see Linus mouthing along at some points when Luke is speaking, funny lol.

    • @YannickBo
      @YannickBo Год назад +240

      As I understood it Linus only compacted Luke's points so the information would be more dense

    • @cirion66
      @cirion66 Год назад +256

      @@vectious2237 Yeah - after watching the WAN Show I have been wondering whether this was THE video... It sure is subtle, almost as if Luke were Linus's ventriloquist puppet :D

    • @evilmarc
      @evilmarc Год назад +61

      That explains why the vid flows so well. I like it a lot

    • @buddhabrot
      @buddhabrot Год назад +85

      tell me you are spending 24/7 at youtube without telling me that you are spending 24/7 at youtube

  • @upstartmyst3734
    @upstartmyst3734 Год назад +433

    Realistically, THIS is what it takes for this stuff to improve. For anyone new to a market, getting heavy usage, and direct feedback and support from any of the biggest, best, or smartest companies/people in the field will lead to massive improvements much faster than normal, because the root of issues can be found and fixed faster. The fact that they were working, essentially hand-in-hand with you guys the entire time, is really awesome to see, and gives me hope that they will continue to improve and succeed

    • @Splarkszter
      @Splarkszter Год назад +2

      100% my brother

    • @VC777
      @VC777 Год назад

      Haven't thought of that. Thanks for the viewpoint

    • @woopygoman
      @woopygoman Год назад

      They need to do the same type of challenge but with retro games. DX8 and below. So many good games were made pre-DX9.

    • @flintfrommother3gaming
      @flintfrommother3gaming Год назад +2

      @@woopygoman I really can't think of many pre-DX9 games that didn't use OpenGL, and OpenGL is OK on Arc cards.

    • @Splarkszter
      @Splarkszter Год назад +1

      @@woopygoman LinusTech is not the right channel for that anyway. This is a channel directed for normies.

  • @xmine08
    @xmine08 Год назад +396

    Seeing the story of DXVK is truly impressive. It, almost by itself, got Linux from a "Some games work every once in a while" to "If it wasn't for crazy anti cheat you can play about 99% of all games at good performance".

    • @drek9k2
      @drek9k2 Год назад +16

      GOG is now getting DRM'd shit too 😡 They didn't have the Mafia soundtrack either; I had to download it and put it in the music and audio folder manually. I hate third-party clients so damn much.

    • @HXunbanned
      @HXunbanned Год назад +6

      ... it is impressive IF DXVK is working fine... On my Fedora 38 (FC38) system, Doom Eternal does not launch, and Crysis, Crysis Wars, and Crysis Warhead have resolution bugs...
      If getting an API trace weren't a ten-step copy-paste-install-configure journey, it probably wouldn't be such a hassle to file quick bug reports.

    • @sohiearth
      @sohiearth 2 месяца назад

      @@HXunbanned Doom Eternal's id Tech 7 supports only Vulkan, of course it won't work with DXVK.

    • @TX2015
      @TX2015 2 месяца назад +1

      ​@@sohiearthIs it really that obvious though? If DXVK turns D3D9/10/11 calls into Vulkan API calls, shouldn't it also let the Vulkan calls through directly? It's just dumb to design a program that translates for another API, but then block the calls and not launch the app when they don't need translation--

  • @World_Theory
    @World_Theory Год назад +309

    Paving the way for an almost entirely new type of product line is going to be tough. I'm just glad they're doing it. It's still a hiking trail at this point, but at least it reaches the destination, and it's being smoothed out.
    Really looking forward to seeing what it turns into.

    • @WeicherKeks
      @WeicherKeks Год назад

      Intel has developed integrated GPUs for decades, I would not really call them newcomers.

    • @XLR8bg
      @XLR8bg Год назад +22

      @@WeicherKeks Yeah, but those iGPUs were basically meant for business use, ie just to have something to drive a couple of monitors. This is the first time they have to design GPU hardware and software for high-performance workloads.

    • @TheAyanamiRei
      @TheAyanamiRei Год назад +8

      @@WeicherKeks Those integrated GPUs were something that competed on Low End GPUs. At best. NEVER competed with Medium End. With a TRUE Dedicated GPU. It's like the difference between Gas Powered Scooters and getting into Mopeds or Motorcycles. Still not car or truck level, but waaaaaaay more than they used to be.

    • @micmccond7
      @micmccond7 Год назад

      Hmm, considering how good a relationship Intel has with Microsoft's operating systems, and Microsoft having ChatGPT, I can see Intel putting a DaVinci/Codex-style model on the GPU. That $10B from the CHIPS and Science Act is just a registration fee. AI-integrated chips are now a strategic resource commodity. What better way in than through the demanding GPU enthusiast sector?

    • @TTech321
      @TTech321 Год назад +1

      I really am rooting for Intel to keep at it and compete with Nvidia and AMD. It's just disgusting how much Nvidia is price gouging, and AMD isn't helping gamers either.

  • @corndog9482
    @corndog9482 Год назад +1197

    If you consider the magnitude of what they are trying to accomplish, I'd say it's pretty awesome how fast things are improving on their first attempt.

    • @Megalomaniakaal
      @Megalomaniakaal Год назад +31

      Well, not their first first attempt, but the first attempt where they have brought to market something at least marginally competitive. On paper, anyway.
      There's a whole webpage dedicated to Intel's past attempts. I don't think I ever bookmarked it though, and I can't be bothered to google it right now.

    • @dan_loeb
      @dan_loeb Год назад +35

      @@Megalomaniakaal Aside from Arc, Xe graphics a couple of years ago, and the first Intel graphics solution, the i740 in 1998, they've never launched dedicated graphics, especially not commercially available ones. It's always been integrated, either with Intel Extreme Graphics and Intel GMA on the motherboard northbridge chipset, or built into the CPU on everything since. Larrabee never officially launched (and it wasn't aimed at home users anyway, it seems). So there's really only the i740, Intel Xe, and now Arc as Intel dedicated graphics products.

    • @Megalomaniakaal
      @Megalomaniakaal Год назад +2

      @@dan_loeb Yes, the things in between those never were publicly released. Larrabee perhaps came the closest.

    • @lasskinn474
      @lasskinn474 Год назад +1

      Aagh, not this "first attempt" again, after 25 years of attempts.

    • @apetogetherstrong6600
      @apetogetherstrong6600 Год назад

      They have had integrated GPUs all along. Of course there are a lot of differences, but it's not that unimaginable.

  • @AurioDK
    @AurioDK Год назад +2853

    Intel just announced they have discovered "the huge bottleneck" in the ARC drivers and will be releasing new drivers soon which would eliminate this "bottleneck".

    • @sujimayne
      @sujimayne Год назад +106

      But don't get your hopes up. There is only so much that drivers can do for fundamentally weak hardware.

    • @mythicalducky
      @mythicalducky Год назад +1009

      @@sujimayne That's the thing, I don't know if I'd call Arc fundamentally weak hardware-wise.

    • @Breakfast_of_Champions
      @Breakfast_of_Champions Год назад +40

      It's the bottleneck their engineers' whiskey flows through?

    • @_-TC
      @_-TC Год назад +348

      @@mythicalducky Exactly. It's kind of like having a super powerful engine, but it's not optimised and therefore lacks a lot of horsepower. Then when optimised, both in software and physically, it's suddenly competitive. Don't buy into a future promise, but honestly? It DOES look promising for Intel. First release and they can be in the same ballpark as some Nvidia cards, that's no small feat.

    • @MacaroniLove
      @MacaroniLove Год назад +116

      @@sujimayne It's actually quite believable. On paper the card is strong enough; in practice it is much slower than it should be in many games. For games that already work well it probably won't make much of a difference, but I'm expecting the same thing they already did with DX9 games by bundling DXVK in the driver. DXVK also works for DX10 and DX11 games, but for some reason that's not enabled in Intel's drivers. I have not tested many games, but in specific cases in Final Fantasy XIV (super crowded areas) I got almost double the FPS by adding DXVK myself (which can be done easily by copying a few DLL files into the game's folder).
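
A minimal sketch of the manual DXVK drop-in @MacaroniLove describes, assuming a DXVK release has already been downloaded and extracted; the paths here are hypothetical, and the DLL names match what recent DXVK releases ship for 64-bit D3D9/10/11 titles:

```python
import shutil
from pathlib import Path

# Hypothetical paths -- point these at your own DXVK extract and game install.
DXVK_X64 = Path(r"C:\tools\dxvk-2.0\x64")      # 64-bit DLLs from a DXVK release
GAME_DIR = Path(r"C:\Games\FFXIV\game")        # folder containing the game's executable

# DLLs DXVK provides; copy only the ones matching the game's DirectX version if you prefer.
for name in ("d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"):
    src = DXVK_X64 / name
    if src.exists():
        shutil.copy2(src, GAME_DIR / name)     # the game now loads these instead of the system DLLs
        print(f"copied {name}")
```

Deleting the copied DLLs restores the stock DirectX path, so the change is easy to undo.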

  • @bigun89
    @bigun89 Год назад +728

    It's clear Intel has been taking notes and making strides during this entire process. I honestly think this will need more research after some hiatus. Honestly deserves a part two - probably near end of this year. Excellent jobs - both of you guys.

    • @lyra_tcm1953
      @lyra_tcm1953 Год назад +16

      Maybe soon after Battlemage releases, to see how both the new GPU and the drivers are doing.

    • @ubermidget2
      @ubermidget2 Год назад +10

      You read my mind - They should re-run the challenge with 6 or 12 months of new driver updates

    • @ShadowSnipero7
      @ShadowSnipero7 Год назад +2

      @@ubermidget2 Literally said what I was gonna say LMAO

    • @TechGuyBeau
      @TechGuyBeau Год назад +2

      I have been testing their cards on my channel, in various community-requested games. I show REAL gameplay footage and don't make subjective comments.
      I want people to see how well this card really runs.
      I have found that the major YouTube channels are largely missing this (I get it, it's a lot of work, I know personally), but even so.
      The Arc GPUs are an IMMENSE development in personal computing, arguably a bigger one than ANY other topic these channels are covering... yet it's mostly ignored.

    • @TechGuyBeau
      @TechGuyBeau Год назад +4

      @@ubermidget2 They barely did the challenge in the first place. They also didn't document it, or explain in depth which games were played and on which drivers.
      It was a nice little opinion piece of tech-tainment, but it lacked the objectivity of something like GN or related channels. Heck, I sold my system and bought an Arc system JUST so I could make videos myself, out of frustration at the lack of coverage.

  • @zackkoukios8123
    @zackkoukios8123 Год назад +373

    Man, I'm reminded just how good of a presenter Luke is. It's been nice getting a nice dose of Luke content outside of the WAN Show lately with the challenges and recent Shadow tour/showcase

    • @bobdylan6454
      @bobdylan6454 Год назад +3

      The WAN show is everything to me.

    • @ethanperez4774
      @ethanperez4774 Год назад

      What recent challenges?

    • @grn1
      @grn1 Год назад +2

      @@ethanperez4774 Linux, Arc, and AMD. This video is the conclusion to the Arc challenge, they previously did a challenge where they ran Linux on their main PCs and more recently (I've not actually watched it yet) they've done or started to do an AMD GPU challenge where, like the Arc challenge they use AMD GPUs instead of NVidia.

  • @santobell
    @santobell Год назад +442

    I've been surprisingly happy with my 770. I had plenty of issues early on, but have been quite impressed since the driver roll-out near Christmas.

    • @Mumbolian
      @Mumbolian Год назад +9

      What led you to buy one?

    • @Dr.WhetFarts
      @Dr.WhetFarts Год назад +36

      @@Mumbolian Price, they are cheap in some places.

    • @WindFireAllThatKindOfThing
      @WindFireAllThatKindOfThing Год назад +27

      Curiosity and experimentation, on my part. And I was building an El Cheapo Special project for my dad anyway, so it's not like he was going to be a performance snob trying to hit 30% extra frames to max out a 144Hz 4K display. So he got to be my guinea pig. As long as Intel doesn't take their ball and go home, they could make a run here, even if they're just fighting for second place in the mid/low tier.

    • @ShadeKoopa
      @ShadeKoopa Год назад +19

      That's good to hear. If they can do good with their driver updates, Intel may be the choice for budget users. And we NEED one given how expensive Nvidia and AMD are getting with their GPU this current gen.

    • @geremysantos4203
      @geremysantos4203 Год назад +2

      Really good to hear. We need more updated reviews as drivers continue to get rolled out and make it better.

  • @notsparktion
    @notsparktion Год назад +664

    FYI: For me, the DisplayPort message in Windows actually comes from a USB C Hub that I have. That might be the case for you guys too.

    • @DriantX
      @DriantX Год назад +33

      Yup, same. Got a "cable matters" hub that does that but works flawlessly otherwise.

    • @myrealusername2193
      @myrealusername2193 Год назад +3

      Same with a usb hub I have that has HDMI out

    • @nicholascopsey4807
      @nicholascopsey4807 Год назад +7

      I get the same message when I plug my laptop into my monitor's integrated KVM, using the USB-C cable for data and an HDMI cable for video.

    • @stili774
      @stili774 Год назад +6

      RTX 4090 here, with a Dell monitor. Never had Arc installed before. The message comes up when a device is connected for charging over the monitor's USB-C port.

    • @lebowski3748
      @lebowski3748 Год назад +10

      @@stili774 oh cmon stop braggin...

  • @HealthyCriticism
    @HealthyCriticism Год назад +744

    3:50 Linus agrees so much with Luke, he is mouthing along the script with him.

    • @123cez123
      @123cez123 Год назад +51

      Ahahahahaha. Laughed so hard at this 😅

    • @josiwhitlock2156
      @josiwhitlock2156 Год назад +178

      they mentioned it in WAN and i cant stop looking at it

    • @crazyeyez1502
      @crazyeyez1502 Год назад +45

      @Josi Whitlock I wonder, tho, if they hadn't mentioned it on WAN, would it have been as noticeable? I noticed it because i was already aware of it and looking for it...

    • @CesarinPillinGaming
      @CesarinPillinGaming Год назад +11

      Don't they use a teleprompter ?

    • @arnistein6493
      @arnistein6493 Год назад +18

      Same at 12:53 :D

  • @Effectlife
    @Effectlife Год назад +808

    To be honest, format wise, this has been one of my favorite episodes in the last few months, really like the dual host setup you've got going here

    • @GameTimeWhy
      @GameTimeWhy Год назад +25

      Linus and Luke are two peas in a pod.

    • @BassRacerx
      @BassRacerx Год назад +14

      it's all scripted but made to seem like it isn't .... still a good episode..

    • @Bureaucromancer
      @Bureaucromancer Год назад +3

      Dual host is great, and so is the crunchier tech content. Really hope that this becomes something of a model for Labs. As much as I hear all the stuff on the WAN Show about LTT making what it does for solid business reasons, the actual tech content has always been really good when you decide to damn the algorithm and get serious.

    • @ghostpeel
      @ghostpeel Год назад +4

      Especially with Linus and Luke, but I totally see this working out with others

    • @Matthew-dp3hf
      @Matthew-dp3hf Год назад +4

      I really wish that Luke would be featured in more content similar to this, with the dual-host layout, because it really brings out the dynamic they have.

  • @SwaggerMeister72
    @SwaggerMeister72 Год назад +50

    Just the fact that the ARC team is communicating and actually taking criticism to heart and making solutions makes me want to buy one of these.

  • @noelchristie7669
    @noelchristie7669 Год назад +90

    Went from a GTX 1060 3GB to the A770 LE in the beginning of December, paired with an i5-12600K and a Z690 mobo. Been having a great time past my first week's struggles. There are definitely other cards to consider in the price range, but I'm having fun and playing more games than ever!

    • @oliversmith2129
      @oliversmith2129 Год назад

      ​@@desmasic Lemme know how it goes, I have a 3060 Ti and I'm tempted to try arc 770 before my shift to next generation cards by 2024

  • @dragon2knight
    @dragon2knight Год назад +737

    As an owner of both an A770 and an ASRock Challenger A750, I can say that the latest driver updates really make them more than just usable. I expect these cards to just keep on improving as time goes on. They're a good buy now!

    • @iHadWaterForDinner
      @iHadWaterForDinner Год назад +38

      AMD is way ahead and nobody buys their gpus. Nobody is gonna buy these either and thats fair.

    • @peik_haikyuu2265
      @peik_haikyuu2265 Год назад +58

      @@iHadWaterForDinner yeah if a 6650 xt is selling for 300 brand new and a 3060ti is over 400 for 5% more fps its clear what people are buying lol

    • @renchesandsords
      @renchesandsords Год назад +69

      @@iHadWaterForDinner honestly for friends' budget builds, I'm personally recommending AMD stuff to pretty much everyone that doesn't need CUDA

    • @7evive
      @7evive Год назад +23

      The 770 is a really good buy

    • @eric-.
      @eric-. Год назад +1

      @MichaelLivote , do you play forza horizon? if so, how's the game performance on latest arc drivers?

  • @_boux
    @_boux Год назад +516

    The "spiky" lag might be caused by their use of DXVK for older DirectX games; it's probably Vulkan shader compilation, an issue that has been addressed heavily in the next version of DXVK.

    • @luisortega8085
      @luisortega8085 Год назад +40

      This affects Linux as well, including on their more mature laptop iGPUs. Hopefully they'll also deal with that on the native Vulkan side.

    • @Reflect-y7c
      @Reflect-y7c Год назад +3

      Can you not just pre-compile the shaders? I do it on all my Linux machines.

    • @AndRei-yc3ti
      @AndRei-yc3ti Год назад

      @@luisortega8085 vulkan has been updated to do away with shader compilation stutters

    • @luisortega8085
      @luisortega8085 Год назад

      @@AndRei-yc3ti huh really? weird. i get stutters a lot when i play a new game... tested it many times with nfs rivals

    • @AndRei-yc3ti
      @AndRei-yc3ti Год назад

      @Luis Ortega what distro you on? I'm on arch so I get latest updates.
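
For the manual DXVK setups mentioned in this thread, DXVK's built-in HUD can show shader compiler activity, which makes it easy to check whether a stutter coincides with compilation. A minimal launch-wrapper sketch, with a hypothetical game path:

```python
import os
import subprocess

# Hypothetical game executable; the same environment variables apply whether the game
# runs with DXVK DLLs dropped in on Windows or through Wine/Proton on Linux.
GAME_EXE = r"C:\Games\Example\game.exe"

env = dict(os.environ)
env["DXVK_HUD"] = "fps,compiler"    # "compiler" lights up while DXVK is compiling shaders
# env["DXVK_LOG_LEVEL"] = "info"    # optional: more verbose DXVK logging

subprocess.run([GAME_EXE], env=env, check=False)
```

If the HUD's compiler indicator flashes at the same moment the frame time spikes, the stutter is shader compilation rather than the GPU itself.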

  • @5plurge
    @5plurge Год назад +93

    This is a complete aside to the subject matter, but, having two presenters sat chatting is a refreshing change to some of the norms of; single talking head, lead and support, lead and many many supports (oh hey engineering project vids didn't see you there) etc. Brings more of the WAN feel (obviously) while keeping the tight production of a regular LTT video. More of this please as and when the opportunity arises

  • @The_oli4
    @The_oli4 Год назад +91

    It would be interesting to see how the Arc GPUs are doing a year from now; it could be a good video to compare how much better they've become. It's good that Intel is actually working hard to fix things. They are learning a lot, and the next GPU line might already be a serious competitor.

  • @FilthyCasualRacing
    @FilthyCasualRacing Год назад +16

    You guys should revisit this. I'm very curious about where it's at now. It's time to upgrade and that 16GB on a 256 bit bus for 350ish bucks is very compelling.

  • @spyrule
    @spyrule Год назад +328

    I'd be interested in seeing you both review it again in 3, 6, and 12 months, just to see the growth and the fixes they are deploying. It does seem that the majority of their issues are entirely driver/software based, so they're highly likely to get fixed sooner rather than later at the stage they are in.

    • @lucasRem-ku6eb
      @lucasRem-ku6eb Год назад

      Will Arc ever get support as good as the ATI tech in AMD cards now?
      Do we need more parties, or do they need to work together on projects? We all use a car, but one of them does the job in a boat; same results?

  • @sanguinesomnambulist
    @sanguinesomnambulist Год назад +188

    It would be interesting to see another challenge in six to eight months to see how things have progressed. Not necessarily Linus and Luke this time, maybe someone else can suffer 🤣.

    • @Clove_Parma
      @Clove_Parma Год назад +9

      they should definitely do an update when next gen Arc comes too

  • @bob32qwerty
    @bob32qwerty Год назад +201

    The improvement over the course of the one month gives me a lot of hope. I'm excited to see you guys try this challenge again with the 2nd-gen Arc release. I'm personally waiting for Intel to have their "Zen 2" moment (and hopefully they make it that far).

    • @drewdane40
      @drewdane40 Год назад +2

      Alchemist is at least their 2nd generation. They had DG1 last generation and decades of integrated GPUs. Seems like they've had an opportunity to work on graphics drivers before asking the general public to pay to participate in this massive beta test.

    • @DuvJones
      @DuvJones Год назад +2

      @@drewdane40
      For Arc, you would be wrong. DG1 was a development sample that never really hit the market, well, not as a retail discrete GPU anyway. It is basically the Iris tech Intel has for their integrated GPUs put on a single card, so yeah.
      The other thing is that some of these issues were unavoidable. Take the elephant in the room that is DirectX 9 support, which covers roughly a 4-to-10-year lifespan, three API revisions, and A LOT of quirks from game to game. Intel has been rather honest about the effort DirectX 9 support takes, which is to say... it's going to be a lot of work. What makes this worse is that DirectX 12 doesn't have a built-in emulation layer for 9... neither does 11, or 10. It's one of the things that separates it from OpenGL (though to be fair, that has caused a lot of issues for OpenGL, to the point of Vulkan existing because of them): new versions of the API are NOT backward compatible, and that continues to be an issue because games TODAY still release using DX9. I guess that is why they are forced to rely on the Wine-adjacent DXVK driver, which... look, Wine is VERY, VERY good, but you are not replacing a Windows install with it, and the same goes for DXVK and DX9, Windows or not.
      A reasonable point this video makes is that we, the consumers, have to gauge our expectations... even for a multinational, billion-dollar corporation the size of Intel.
      The way I have been looking at Intel's jump into discrete GPUs, they get two generations, tops, to figure things out. After that, either they hit the ground running or this is a dud. Right now, I think they are doing OK.

    • @drewdane40
      @drewdane40 Год назад +1

      @@DuvJones You are incorrect. DG1 discrete GPUs absolutely were sold to the public through OEM PC manufacturers. It was widely publicized at the time so you'd have to have been trying really hard to have missed that news. You can buy them today on eBay for about a hundred bucks.

    • @DuvJones
      @DuvJones Год назад

      @@drewdane40
      No, it wasn't. The DG1 never made production, for reasons know only to Intel. There was no way that Intel made them available to the public, the fact that you find these samples to buy on eBay is really irreverent.

    • @drewdane40
      @drewdane40 Год назад

      @@DuvJones Intel's own Ark page lists DG1 as "launched," not cancelled. If you Google "DG1 cancelled" the top result is an article titled "Intel officially confirms DG1 GPU is now shipping to OEMs, and DG2 has taped out." It appears you're getting your news from an alternate timeline (or orifice.)

  • @whalemonstre
    @whalemonstre Год назад +42

    I'm glad Intel is listening and making constant improvements. I'm long-time Nvidia user but I really want Arc to succeed.

  • @Chris-wt1eq
    @Chris-wt1eq 7 месяцев назад +4

    It's been a year, should do an ARC Challenge 2

  • @somemediocregamer
    @somemediocregamer Год назад +81

    As long as intel keeps making these great improvements, I can see myself switching from team green to team blue in the future when I am ready to upgrade. Seriously rooting for intel.

    • @dschwartz783
      @dschwartz783 Год назад +16

      Really seems like they’re already killing the competition at price/performance, at least in games where there aren’t major issues. If they can work out their remaining driver showstoppers, they could be the go-to for budget builds.

    • @Bramble20322
      @Bramble20322 Год назад

      @@dschwartz783 They're losing money in these cards, plus saying "it works fine in the games that work" says a lot about the quality of these cards, lol.

    • @dschwartz783
      @dschwartz783 Год назад

      @@danieloberhofer9035 well, at the very least it’ll get people to consider them, a very new player to the market. Even if the prices go up a bit, I suspect that once they wack enough driver bottlenecks, it’ll still be the best budget option. I hope they manage to make this a success. The market can definitely use a third player, as it seems AMD has left the budget market.

  • @YounisHajeer
    @YounisHajeer Год назад +39

    3:54
    Linus also reading the prompter is hilarious lol

    • @coreycarpenter2489
      @coreycarpenter2489 Год назад +1

      Did you pick up on it or watch them call it out on the WAN show?

    • @Mopsie
      @Mopsie Год назад +2

      I’d never have known if it wasn’t for the WAN show

  • @karola.7908
    @karola.7908 Год назад +227

    Damn, I thought they forgot about this series.

    • @aayaan1935
      @aayaan1935 Год назад +4

      But they did forget about the $1 computer 😂

    • @technosaber9281
      @technosaber9281 Год назад +5

      @@aayaan1935 they did not, it has been sold

    • @colinstu
      @colinstu Год назад +1

      I was thinking that too.. it sounded like they swapped to other GPUs like a month or two ago… and I thought this thing just started lol.

  • @MarkBarrett
    @MarkBarrett Год назад +1

    It amazes and surprises me sometimes, how I prepared repair procedures in advance.
    I dropped my e-cig (again) and it broke the glass tank.
    I smoked 2 real cigarettes, that I saved for that contingency.
    I was preparing to go on a mission to buy a new part.
    After taking a nap, I remembered, I had put a spare tank in my parts box, exactly for this contingency.
    Amazes me, that I had planned for exactly this.
    My e-cig works now.

  • @RyanKarolak
    @RyanKarolak Год назад +16

    As problematic as the Arc GPUs have been I am glad to see progress. I am rooting for them to do well. The GPU market is in definite need of shakeup.

  • @RealLifeTech187
    @RealLifeTech187 Год назад +80

    Sources told ComputerBase another big driver update is on the way in February that will further improve performance substantially. Hope you guys keep testing Arc 👍

    • @K543
      @K543 Год назад

      Link?

    • @Bramble20322
      @Bramble20322 Год назад

      @Steve Sherman The hopium train is strong around these parts. "Buy our GPUs, guys! It doesnt work in most competitive games, it also is not capable of rendering reflections without glitching everything, but i swear in (n+2) weeks it will be close to a 3060ti!"

  • @GStreetEntertainment
    @GStreetEntertainment Год назад +40

    It would be interesting to dig through old web archives / forum posts about early AMD and Nvidia cards and compare whether they had similar bugs and lacking support for older games, etc. It's great to see Intel improving the drivers for their cards regularly and that they are listening to their customers for feedback.

    • @stuartthurstan
      @stuartthurstan Год назад +15

      Can't remember the specific card now, but years ago I had an ATi radeon card and the drivers were pretty janky. Inexplicably poor performance in certain games, even instability, then good performance elsewhere. The drivers did slowly improve the card over the two or three years I had it, but the experience was enough to ensure that my next card was an nvidia. I like the commitment we're seeing from Intel on these drivers though and I feel we desperately need a new runner in the GPU race, so I've taken the plunge and ordered an A770.

    • @1981AdamGs
      @1981AdamGs Год назад +5

      As someone that was in the game when Nvidia and AMD (ATI at the time) entered the arena, I can say that they definitely had bugs. But they weren't really that similar to what Intel has now. It was a long time ago and things have changed so much that it's really not even a fair comparison. The biggest bugs were mostly due to the growing pains of 3d rendering. But I will say this. Nvidia had a few issues with their drivers on very early cards. But they got the driver's sorted out fairly quickly. Nvidia was (and still is) good at drivers. AMD/ATI drivers have been pretty hit or miss for me. Some generations they were fine. Others they were horrible. And that horribleness lasted for years and years sometimes. AMD has come a long way since then though. As far as older games not working? That was fairly common. But that was mostly due to games being specifically designed for use on a specific card and architecture. So you had to find workarounds or settle for lackluster ports.
      I do remember those days mostly fondly. Being there at the birth of the modern PC gaming community and then watching what it has become is something I'm glad I experienced. Having said that, I wouldn't want to relive it from a user/gamer perspective. It was a major pain in the ass most of the time.

    • @MLWJ1993
      @MLWJ1993 Год назад

      Probably not software API support problems (since there simply weren't as many back then), but issues with basic things not always working, absolutely.

    • @jesusbarrera6916
      @jesusbarrera6916 Год назад +1

      every GPU brand back then had different support and even API for a number of games

  • @DMS3TV
    @DMS3TV Год назад +24

    I jumped on the Arc train (keeping a 30-series on the side) and honestly it's been pretty solid. The only problems I've had were no driver update notifications, and the new Dead Space remake getting like 5 FPS. Other than that I'm pretty happy with it.

    • @eric-.
      @eric-. Год назад +2

      do you play forza horizon? if so, how's the game performance on latest arc drivers?

    • @DMS3TV
      @DMS3TV Год назад +4

      @@eric-. Not really, sorry; mostly games like Destiny 2, Cities: Skylines, Supreme Commander, Warzone, modded Skyrim, Crysis Remastered, Hyper Light Drifter, the Mass Effect series, Cyberpunk, etc.

    • @MacaroniLove
      @MacaroniLove Год назад +2

      @@eric-. Searching "Forza Horizon A770" on YouTube, there seem to be a few videos showing how the game runs, and it seems to be fine. Focus on the most recent videos first if possible; the drivers have improved a lot (and still need to improve a lot...).

  • @fridaycaliforniaa236
    @fridaycaliforniaa236 Год назад +18

    I truly think that having Intel as a new entry in the GPU market can only be a good thing. I don't care if their first GPU is not perfect. If it allows average users to get better-priced GPUs in the near future, it's already a win for me.

    • @davidt8087
      @davidt8087 Год назад +1

      Not for long if everyone keeps buying AMD or Nvidia regardless. Intel won't keep pumping out Arc just to help you get better prices (which won't happen anyway, as Nvidia and AMD don't and won't take Intel seriously, since they know most people won't buy Intel) if no one buys their GPUs. If Arc didn't have the DX issues it would have been somewhat worth it.

  • @LinusTechTips
    @LinusTechTips  Год назад +121

    Check out the products featured in our video at the links below!
    Buy an Intel Arc A770: geni.us/9IhkN5b
    Purchases made through some store links may provide some compensation to Linus Media Group.

    • @breadsoup3708
      @breadsoup3708 Год назад

      :)

    • @mitlanderson
      @mitlanderson Год назад +2

      No pinned comment?

    • @wta1518
      @wta1518 Год назад +2

      Haha, forgot to pin this

    • @1337eratur
      @1337eratur Год назад +1

      Bold move to post an affiliate link for an Intel Arc GPU after ripping into it for 15 minutes lmao

  • @cpt.doomwolf3794
    @cpt.doomwolf3794 Год назад +8

    I went from a GTX 1060 6GB to the Arc A770. I'd say from my experience it's a card for people who have had a PC for a while and know how to work around bugs and problems, not for someone who is just starting out building a PC. But performance-wise I couldn't be happier. Intel has been dishing out amazing FPS updates, and they are not leaving it to rot like game devs do with some games nowadays. Something else: I run the Arc A770 with a 9th-gen CPU, an Intel i9-9900K, so even if you don't have a 10th-gen CPU you're good to go.

  • @jtx1059
    @jtx1059 Год назад +107

    I went from a GTX 1650 and switched to Arc A770 and I'm proud of it damn it

    • @MrTurbo_
      @MrTurbo_ Год назад +25

      I'm proud of you as well. I hope you do a couple hundred bug reports so in 4 years I too can buy an Intel GPU... except by then one that's actually good and not a broken, underperforming mess.

    • @Spenlard
      @Spenlard Год назад +15

      @@MrTurbo_ I have used one since launch with only one issue. It's easy to look in from outside and think it's a super buggy experience. But for many users, it's been awesome including myself. They've come a long way in just the 3-4 months since release. Depends on your games as well.

    • @chronometer9931
      @chronometer9931 Год назад +8

      @@MrTurbo_ Your expectations are way off. I have the a770 and it's a lot better than you seem to think it is

    • @MrTurbo_
      @MrTurbo_ Год назад +1

      @@chronometer9931 Well, it's not fast, that's for sure, and clearly it still has a ton of issues, as you can see in the video. And the fact that I mainly play indie games and a lot of older games, but at really high resolution and framerate, can't help much in making the experience any better, I imagine.

    • @ruxandy
      @ruxandy Год назад +1

      One thing I genuinely don't really understand (and maybe Arc buyers can explain it to me), is why you felt comfortable spending a decent amount of money on an Intel Arc A770, when an AMD RX 6650 XT is actually cheaper and better in every possible way (and, before you say it, no, "faster ray tracing" is not a valid reason for this performance tier). I mean, I would totally give Intel my money if they made a good product (we NEED the competition), but this first generation... is really not it. Spending my hard earned money on something that "mostly works... for newer games" is not something that I would ever consider.

  • @DoorvleRanger
    @DoorvleRanger Год назад +45

    Even after a decade of content on this channel, I still enjoy these two on camera together.

  • @ImperialScoundrel
    @ImperialScoundrel Год назад +2

    The limited DisplayPort message probably refers to the card having DisplayPort 2.0 but the monitor using 1.4.

  • @andrewjmit
    @andrewjmit Год назад +16

    My A770 works great. My first mistake was putting it in my i7-5960X system; the performance was very poor due to no ReBAR. Once I put it into my 10600K system the performance almost tripled. Now everything works great at 1440p. Very happy with it.

    • @b0ne91
      @b0ne91 Год назад +3

      Fyi, you can now add ReBarUEFI to your BIOS on X99 and have ReBar work just fine with Intel GPUs.

    • @andrewjmit
      @andrewjmit Год назад +1

      @@b0ne91 that's interesting, I just looked it up and never knew a hack was out to do this on X99. Thanks.
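
If you go the ReBarUEFI route, one way to confirm Resizable BAR actually took effect is to check the PCI capability the kernel reports. A small sketch for Linux, assuming lspci is available (run with root for the full capability dump):

```python
import subprocess

# Dump verbose PCI info and pull out the GPU's Resizable BAR lines.
out = subprocess.run(["lspci", "-vvv"], capture_output=True, text=True).stdout

in_gpu = False
for line in out.splitlines():
    if line and not line.startswith(("\t", " ")):        # a new device header starts at column 0
        in_gpu = "VGA compatible controller" in line or "Display controller" in line
        if in_gpu:
            print(line)
    elif in_gpu and ("Resizable BAR" in line or "current size" in line):
        print("   ", line.strip())                        # e.g. "BAR 2: current size: 16GB"
```

On Windows, GPU-Z and Intel's Arc Control both report whether Resizable BAR is active, so no scripting is needed there.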

  • @memberberries7669
    @memberberries7669 Год назад +79

    You guys should do this as a yearly thing: every January, switch to Arc and see how it's improved.

    • @josephoverstreet5584
      @josephoverstreet5584 Год назад +6

      I think 3-6 months would be much better lol

    • @memberberries7669
      @memberberries7669 Год назад

      @@josephoverstreet5584 That's far too much torture.

    • @josephoverstreet5584
      @josephoverstreet5584 Год назад +2

      @@memberberries7669 Lmao let’s settle at 6 months July this year and start the year rotation next year In January

  • @Acre00
    @Acre00 Год назад +54

    Heads up: Arc Challenge pt 2 and 4 are in the GPU playlist but 1 and 3 don't appear to be

    • @mycelia_ow
      @mycelia_ow Год назад +1

      Intel removed them

    • @Acre00
      @Acre00 Год назад +4

      @@mycelia_ow I just watched the first

  • @thebaldnerd
    @thebaldnerd Год назад +40

    Give it 6 months and run them through some tests again, it will be interesting to see how much things have improved

  • @MarkBarrett
    @MarkBarrett Год назад +1

    I built T1 to be a resilient monster. It effectively has 12 hard drives.
    Today, the audio amplifier that drives the speakers repeatedly said it was overheating.
    I confirmed it was hot.
    It happened to be physically next to T1; I tried and couldn't find another fan.
    T1 donated a case fan (120mm).
    It works now.

  • @solidreactor
    @solidreactor Год назад +14

    This has been a great adventure. I so want Intel to become the third option for my next purchase decision.
    I hope you, Linus and Luke, do these kinds of videos somewhat regularly after new releases, where you try the cards you review for a month or so, even for the other brands, with updated first-hand overall user experience.

  • @stephenjones8645
    @stephenjones8645 Год назад +17

    For Arc, all I would really care about is video encoding performance on Linux. I suspect that it is a solid performer at this.

    • @stephenjones8645
      @stephenjones8645 Год назад

      @@leeroyjenkins0 yeah, new hardware can be difficult. I'd be curious if it's up and running on pop yet. I would imagine it would be fine in Arch by now.

  • @GeorgeJFW
    @GeorgeJFW Год назад +33

    I am really happy to see Intel is still interested in the project, it’s basically the only bright spot in the gpu market right now.

    • @gorkskoal9315
      @gorkskoal9315 Год назад

      lol other than crypto crashing? Or wait the dumpster fires of AMD and Nvidia must be pretty bright. Being big dumpsters. :P

  • @pranavp386
    @pranavp386 Год назад +4

    Thank you Linus. Been following you guys since 200K subs. Since then I've passed high school... got a degree in electrical engineering... and am now halfway through my postgrad in VLSI design, still watching you guys make tech content. Keep 'em coming.

  • @MichaelWilliams-xs1cf
    @MichaelWilliams-xs1cf Год назад +2

    I like this two-host, playing-off-each-other format, giving a really measured, down-to-earth perspective.
    Great stuff.

  • @EnvAdam
    @EnvAdam Год назад +45

    I committed to an Arc A770 and I was surprised how well it does in Blender. Besides a couple of problem games that weren't too hard to fix, like not running BFV in DX12 (it runs fine in DX11) and remembering to launch BeamNG in Vulkan mode (no longer needed, it runs fine in DX11 mode), it's been quite solid.

    • @UnderageBeerHere
      @UnderageBeerHere Год назад +12

      Yep, I know most people here are probably focused on the gaming aspect of GPUs, but the productivity suite abilities of these 1st Gen ARC cards is seriously overlooked. In fact, I'd go so far as to say that I'd call it a great card for productivity-focused buyers given its price point!

    • @kwizzeh
      @kwizzeh Год назад +4

      @@UnderageBeerHere Well it ain't a surprise since they do review for gamers in mind first.

    • @Kuli24000
      @Kuli24000 Год назад +2

      but my 3080 doesn't run BFV in DX12 properly either.

    • @EnvAdam
      @EnvAdam Год назад +2

      @@UnderageBeerHere Yeah, it's absurdly good for video encoding speed; I've never seen H.265 encode at 800 fps for a 1080p60 video.

    • @UnderageBeerHere
      @UnderageBeerHere Год назад +2

      @@kwizzeh I'm not saying it's a bad thing, I know it's not LTT's focus, I'm just saying the ARC productivity performance has been flying under the radar.
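
For anyone curious how that kind of encode is typically driven, Arc's media engine is exposed through Quick Sync in FFmpeg. A minimal sketch using the hevc_qsv encoder, with hypothetical file names (av1_qsv works the same way on a recent FFmpeg build):

```python
import subprocess

# Assumes an FFmpeg build with Intel Quick Sync (QSV) support on a system with an Arc GPU.
cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",               # decode on the GPU where possible
    "-i", "input_1080p60.mp4",       # hypothetical source file
    "-c:v", "hevc_qsv",              # Arc's H.265 hardware encoder; swap in "av1_qsv" for AV1
    "-global_quality", "24",         # quality-based rate control for the QSV encoders
    "-c:a", "copy",
    "output_hevc.mp4",
]
subprocess.run(cmd, check=True)
```

FFmpeg prints the achieved encode speed (fps) while it runs, which is where figures like the 800 fps quoted above come from.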

  • @ralph4370
    @ralph4370 Год назад +9

    I have two Intel A750s. One is in my Blue Iris security NVR; it is a beast at recording and transcoding HD feeds from 8+ cameras and barely gets above 5% GPU utilization. With the previous machine, which had an Intel 6th-gen iGPU, the GPU was capping out in the high 90s. The second one is for my Plex server in a Docker container on OMV, and it runs great on the Linux side.
    From all the articles I have read, Intel cards are great for production transcoding, video editing, and OBS. Gaming I have not tried. I only have AMD and Intel GPU cards in my home; I removed Nvidia. I took a chance on Intel, and Nvidia had better get their stuff together.

  • @Jytra
    @Jytra Год назад +44

    I swapped my 3060 12GB for an Arc A770 for my recording and editing rig, and from a productivity standpoint it isn't too bad when paired with a 13th gen i7. I will say however that I've run into more than my fair share of annoyances, most of which seem to be software based rather than hardware. I'm willing to give them a chance in the long run but they really need to ramp up the driver team to get some of those basic fixes out...like the UAC prompt for Arc Control for one.

    • @jothain
      @jothain Год назад +1

      Isn't 3060 about the same performance card? Why did you change?

    • @houssamalucad753
      @houssamalucad753 Год назад +12

      @@jothain for AV1 support probably..

    • @sgredsch
      @sgredsch Год назад +18

      @@jothain AV1, H.265 10-bit 4:2:2, 16GB of memory, Intel Hyper Encode, an actually more powerful GPU... there are some reasons. The A770 isn't exactly a 3060 in blue; it's actually quite a step up in die size and transistor count, and also in raw teraflops. The Alchemist cards probably have a lot of untapped power sitting in them; it's just not clear how much of those resources can actually be unlocked by removing software bottlenecks. Considering this is Intel's first gen, there's probably a lot of room for improvement.

    • @jothain
      @jothain Год назад +4

      @@houssamalucad753 Oh yeah. Just remembered that Intel has been quite good with some hardware video stuff even in some of their integrated graphics before. That would make sense in editing then.

    • @JustinMacsMoon
      @JustinMacsMoon Год назад +5

      @@jothain I ran a 3DMark benchmark and it's already close to an RTX 3070. The 3060, even the 3060 Ti, is nothing compared to the A770.

  • @mlotis
    @mlotis Месяц назад +3

    When’s the redo?

  • @Bogeyatyour6YT
    @Bogeyatyour6YT Год назад +1

    The purple lights and moiré effect that you guys saw: I've seen both happen to me with an RTX 3070 (updated drivers, Nvidia Studio driver) in the last few months, with Call of Duty and War Thunder.
    As I never had any kind of Arc card or even CPU-based integrated graphics in this build (and the Windows install also wasn't originally from another machine), I can only imagine it is something in the Direct3D/DirectX layer that is super rare and is just more likely to happen with Arc than with other brands. But as I said, I saw the exact same issues at the end of last year and the beginning of this one in Call of Duty (purple/rainbow reflections and layers over the screen) and War Thunder (extreme moiré patterns).
    So maybe it is an implementation problem in the driver, or a conflict with DirectX, Windows, who knows.
    This happened to me on a 43" 4K LG monitor (while a second 24" Samsung monitor was running) as well as on a 32" Acer curved monitor (1080p) running solo (with the same system).

    • @Bogeyatyour6YT
      @Bogeyatyour6YT Год назад

      ps: I might have some screenshots if you guys need them, I know I have at least 1, I would have to dig my backups for more though.

  • @VespaManInKorea
    @VespaManInKorea Год назад +17

    Been using the A770 for a month. I have had issues, but I went in knowing about all of them. Then again, I don't really game that much, so I'm not really affected in my Lightroom and Photoshop use.

  • @Keyan9
    @Keyan9 Год назад +72

    i love that both luke and linus wrote their script....you can tell the difference in other videos when someone else writes the script. they feel so natural here.

    • @JohnSmith-us4pj
      @JohnSmith-us4pj Год назад +13

      pretty sure linus said he wrote luke's part on the last WAN show

    • @tbunreall
      @tbunreall Год назад +14

      Really? Feels really forced and cheesy. Like when luke goes to swear but linus cuts him off. Barf lol

    • @Bramble20322
      @Bramble20322 Год назад

      @@tbunreall Right? jfc barf defines it well.

  • @KayJay01
    @KayJay01 Год назад +37

    That limited displayport functionality notification is not caused by Arc. I get it on my 4090 as well with an ASUS PG42UQ.

    • @ELMtreeProductions
      @ELMtreeProductions Год назад +1

      Yep I get it with a 3080 Ti and Dell U2720Q too

    • @doughy041
      @doughy041 Год назад

      That's why I went from my 3070 Ti back to a 3060 Ti I got in 2020: this DisplayPort madness. I have two HDMI cables to my monitors and no notifications!

  • @spamcatcher6486
    @spamcatcher6486 Год назад +1

    This feels like the same issues people seemed to have while trying to tweak settings for Guild Wars 2, like turning off reflections because, for some reason, there's a reflective texture way below where you are (water under everything). Or users using DXVK to get stability, not necessarily raw performance.

  • @souljunkee
    @souljunkee Год назад +5

    Lots of respect to Intel for working their butts off to push out a GPU as fast as they did to try to ease the shortage. I just hope they can continue to improve and compete with AMD and Nvidia.

  • @kevin65731
    @kevin65731 Год назад +139

    this is a certified intel arc moment

    • @commiequiz
      @commiequiz Год назад +3

      I Am Such A Confused Person!

    • @kevin65731
      @kevin65731 Год назад +5

      @@commiequiz …

    • @thom9106
      @thom9106 Год назад +2

      @@kevin65731 he is confused it seems, maybe we need to keep the sharp objects from him so he doesn't hurt himself

    • @kevin65731
      @kevin65731 Год назад

      @@thom9106 no we give him all the sharp objects hehe

    • @thom9106
      @thom9106 Год назад

      @@kevin65731 Heh

  • @Iothisk
    @Iothisk Год назад +8

    I'm not in the gpu market right now, but I wanna thank Luke and Linus for the light they shed and any early adopters for going through these pains, so that when I get back into the market I'll have some great options available thanks to the competition heating up!

  • @GONAVYCHIEF
    @GONAVYCHIEF Год назад +10

    Thank you for sharing. My pal and I both bought ARC A770s. The games and the programs we are running work great. Hopefully Intel will clear up the issues you identified.

  • @WarriorsPhoto
    @WarriorsPhoto Год назад +2

    That was the best ending, "The future where I tell you about our sponsor." LOL
    I am glad Intel is listening and making changes to their ARC GPU. I know it's about time the GPU market got some serious competition.
    Does anyone know if the ARC GPU drivers are working in OBS fully?

  • @gremmi7098
    @gremmi7098 Год назад +4

    The fact they're doing this at all shows how crazy Linus thinks the pricing of flagship GPUs is at the moment. Kudos.

    • @ryo-kai8587
      @ryo-kai8587 Год назад

      Yeah, I think it's very good that influential voices in gaming are constantly expressing distaste for Nvidia's pricing. If they bring the prices a little closer to earth, they'll get a lot of money from me when I build my 4090 behemoth.

  • @LyamWitherow
    @LyamWitherow Год назад +14

    Running Intel A770 for some time now. Very happy with the performance for the price.

  • @ThatBeTheQuestion
    @ThatBeTheQuestion Год назад +4

    Man, the energy of this episode is amazing. It was clear they were reading off a script, as usual, and there's nothing wrong with that, but Luke and Linus were able to vibe off each other and kind of just hang out a la WAN Show, it seemed, and it was so fun to watch.

  • @niceguy7270
    @niceguy7270 Год назад +55

    I am excited for the next generation of cards. Not just for the competition, but for trying something new.

    • @Grand1Admiral
      @Grand1Admiral Год назад +3

      Exactly. I think I will build a new computer just to experience something where real changes are being made, versus just handing over tons of my money for what? (Nvidia's pricing, and even AMD's.)

  • @cNcFalc
    @cNcFalc Год назад +8

    I really hope Intel doesn't give up on ARC even if some financial indicators are rather bad at the moment. Keep it up!

  • @Optimus_Bull
    @Optimus_Bull Год назад +2

    Linus, the "Display connection might be limited" message might actually be related to your Asus PG42UQ, as I also experienced it when I was testing a PG42UQ that I decided to return.
    I didn't get these messages on the two Acer monitors that I'm daily driving.
    And just wait until you actually try using HDR on that Asus monitor, which has washed-out colors on firmware versions V32 and V33 due to some tone-mapper problems.

  • @tylerdoerrer3576
    @tylerdoerrer3576 Год назад +9

    Really hope they keep improving, more options are always appreciated

  • @wielku
    @wielku Год назад +66

    I really hope they'll release a next gen and continue, because the driver development is just amazing.

    • @jubies6286
      @jubies6286 Год назад +2

      Fr, all they need to do is pay attention to their own mistakes and keep making refinements, and in a couple years they'll have a solid competitive product line on their hands. The hardware itself seems to be good, it's just the software side holding Arc back.

    • @drunkhusband6257
      @drunkhusband6257 Год назад

      My guess is they will drop making GPUs entirely.

  • @j.s.2767
    @j.s.2767 Год назад +13

    gives me hope for a second generaton arc. could be my next upgrade if the price is right

  • @kadehowells2136
    @kadehowells2136 Год назад +10

    I love the concept behind the scripting in the video, I'm excited to see the improvements you guys can make to the style moving forward

  • @jierenzheng7670
    @jierenzheng7670 Год назад +13

    Intel is also working on new Linux drivers for the Xe series iGPUs and up. 2023 might be really interesting for Arc I hope.

  • @okojijoko
    @okojijoko Год назад +6

    That glare bug happened a lot with the HD 615 in the Surface Go. I could play MHR (with small tweaks and eons of time to allow shaders to compile), but I would get a particular ray of sunshine that would blow out the screen.

  • @TechGuyBeau
    @TechGuyBeau Год назад +4

    I’ve been putting out intel arc a770 test videos on my channel, showing game play and various settings. Objective game testing, real footage, no opinions.
    I have been PLEASANTLY surprised by both its performance and phenomenal driver updates.
    problem is, I make a video and they put out an update the next day. I mean, it’s a good problem to have 😂😂

  • @Russeljrjs
    @Russeljrjs Год назад +1

    The good thing about videos like this is that Intel will get all the details from an end-user point of view, and HOPEFULLY things will be improved in the next iteration.

  • @LordOfNihil
    @LordOfNihil Год назад +6

    i suspect by the time they get to their second or third gen arc, its going to be a pretty good option.

  • @lukeshaddick4105
    @lukeshaddick4105 Год назад +11

    i've got intel arc and it hasn't been too bad; i just use intel driver updater

  • @unpotat7672
    @unpotat7672 Год назад +41

    Out of interest, what issues did you guys have in Minecraft Dungeons? I play it quite a bit with my partner and I've found it alright on the A770! Frame pacing is definitely not as smooth though. That said I did notice FPS drops in the menus at one point but I moved over DXVK files to the game directory and it seemed to get rid of that! It may have just been a driver update though! Hope you do a 6 month or 1 year review down the line!

  • @legendcat8913
    @legendcat8913 Год назад +5

    I want the Arc + Linux Challenge (mainly bc this is what I want to run and I'm finding shockingly little commentary about it)

  • @imagine7408
    @imagine7408 Год назад +1

    After watching the Linus recommended Blackberry 30 day challenge video, Luke is right - the new LTT reaction channel should definitely be just the LTT staff reacting to their old videos 😂😂😂

  • @archerboy2714
    @archerboy2714 Год назад +5

    You guys should run the Arc performance benchmarks again so you can quantify the gains Intel has achieved. It would be really interesting to keep following the improvements of Alchemist until the next gen arrives.

  • @theskyblockman
    @theskyblockman Год назад +8

    Hello, I currently see reflection glitches on my Nvidia GPU (a GTX 1650) in a lot of games, so I quickly made a demo with Vulkan in C and saw that the problem is REALLY weird. Sometimes it works really well, sometimes the glitch happens, and sometimes it fixes itself after a short or long wait. The conclusion I reached was that some Vulkan features were having trouble with memory addresses and memory overflow. I think the glitch actually happens constantly, but the end user only sees it occasionally, so maybe the GPU attempts a background optimization which results in this bug, and the Intel GPUs do this optimization more to compensate for the lack of performance in slow games. So I am not entirely sure that the problem is caused essentially by Intel; maybe another API in the process causes this issue.

  • @DocProctor
    @DocProctor Год назад +9

    I recently switched from an RX 5700XT to the A770 LE and it's a world of difference. I went from near constant crashes and an almost unusable GPU to something that actually runs smoothly (for the most part).
    I actually couldn't be happier with the new hardware.

  • @Afistrife
    @Afistrife Год назад +15

    I hope there's an update to this, considering that Intel just announced a bottleneck fix the other day. Not sure how impactful, if at all, it will be, but it's still worth mentioning.

    • @tzuyd
      @tzuyd Год назад +1

      Sounds like something they'd be likely to cover in a TechQuickie

  • @hecate6834
    @hecate6834 Год назад +4

    I can't say enough how happy I am that Intel is using DXVK in this way. I hope this means more funding and work for that project :)

  • @00SNIVY00
    @00SNIVY00 Год назад +2

    I'm hoping things continue to improve. I'm fine with "okay" performance; I mostly just want things to work smoothly. One thing I miss from the Nvidia GPU in my previous laptop was the control panel: it was simple and let me choose which app ran on which GPU without having to use the Windows Settings app. When the GPU works as intended, it works just fine for me. It's just some road bumps getting it up and going.
    One specific problem I've had is BTD6 not using either GPU, and the Windows settings don't help fix it. Neither GPU shows any utilization in Task Manager after opening it; it all seems to happen on the CPU instead, and I don't have any way to troubleshoot it.

  • @ahmedghoraba2153
    @ahmedghoraba2153 Год назад +24

    Watching Linus while Luke is talking is quite funny 😂😂

  • @DKKatano
    @DKKatano Год назад +9

    Thanks for some great episodes on Intel Arc. I hope you will follow up in 6 months to a year to see how much better things are by then. I really want Intel Arc to be a success, but because my gaming station is also my workstation, I won't buy a GPU like Arc at the moment; I have to be sure I don't run into any issues. I do hope they will get better and can challenge Nvidia and AMD in a few years, because the GPU market really needs another competitor.

  • @Neoxon619
    @Neoxon619 Год назад +68

    I'm surprised that VR wasn't a consideration for Intel from the start when making their GPUs. Granted, the closed alpha does begin to address it, but it's one of those things I wish had been a consideration at launch.

    • @CheezMonsterCrazy
      @CheezMonsterCrazy Год назад +19

      The VR userbase is still incredibly small, and I don't think it's growing all that fast at the moment.

    • @Funky_Brother
      @Funky_Brother Год назад +5

      I agree. Especially because I think the people who do buy Intel Arc GPUs are going to be those who are at least somewhat enthusiastic about tech, so there's a higher chance than usual that they would own a VR device.

    • @AnEagle
      @AnEagle Год назад +4

      @@Funky_Brother Well no, because no one in their right mind would spend as much on their GPU as they would on a gaming accessory that can be used in 3 decent games.

    • @Funky_Brother
      @Funky_Brother Год назад

      @@AnEagle You realise that things like the Valve Index aren't the only VR headsets available? There's stuff like the Quest series that's much more affordable.

    • @forferdeilig
      @forferdeilig Год назад +1

      Honestly, VR performance is pretty much the only consideration I personally have now; a mid-range card runs anything 2D fine, and if you use vorpX you can run most games in VR. I just find it so much more immersive.

  • @masterkil371
    @masterkil371 Год назад +1

    That Luke finger crack! 12:45

  • @himenaaa3565
    @himenaaa3565 Год назад +1

    Welp, they could probably do this again when the labs are fully working and can be used extensively to test Arc after many updates, to see what improvements have been made and to find the mysterious problems that don't even get listed in the event log.

  • @ConeJellos
    @ConeJellos Год назад +4

    I really hope Intel continues to improve on Arc and goes for the same price-to-performance they did with this generation. When it works, the price-to-performance of the A770 looks damn competitive.

  • @eitnorbert
    @eitnorbert Год назад +5

    Full vs. limited color range (grey blacks) - I had the exact same issue with an Intel HD 620 on an LG C9... it seems to be due to the driver recognizing the TV as a "TV" and forcing HDTV standards (limiting it to 8-bit YCbCr) instead of treating it as a PC monitor - the moment you switch to HDR, it switches to 10-bit RGB. (The range math sketched just after this thread shows why limited range turns black into gray.)

    • @TheMelihTube
      @TheMelihTube Год назад +1

      I had the same issue with my very old Intel HD 4000 once I connected my laptop to my TV. Searching online, I noticed that this bug has existed across so many Intel graphics generations and still hasn't been properly resolved even now. I'm willing to bet money that they still have the same issues on their Arc GPUs.
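
    A quick sketch of why a range mismatch produces those gray blacks (standard 8-bit video-range math, nothing Intel-specific): limited/TV range squeezes 0-255 into 16-235, so pure black becomes code 16, which a display expecting full range renders as dark gray.

      # Full-range vs limited-range (TV/video levels) 8-bit quantization.
      # If the GPU outputs limited range but the display expects full range,
      # black (0) arrives as 16 and is shown as dark gray.

      def full_to_limited(v: int) -> int:
          """Map a full-range value (0..255) to limited/video range (16..235)."""
          return round(16 + v * 219 / 255)

      def limited_to_full(v: int) -> int:
          """Expand a limited-range value back to full range, clamping to 0..255."""
          return max(0, min(255, round((v - 16) * 255 / 219)))

      for label, v in [("black", 0), ("mid gray", 128), ("white", 255)]:
          lim = full_to_limited(v)
          print(f"{label:>8}: full {v:3d} -> limited {lim:3d} -> back to full {limited_to_full(lim):3d}")

    When both ends agree on the range, the round trip is essentially lossless; the washed-out look only appears when one side sends limited and the other interprets it as full (or the reverse, which crushes shadows instead).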

  • @ktwei
    @ktwei Год назад +4

    The black values showing as gray may have been a quantization (color range) setting; set it to full.

    • @SianaGearz
      @SianaGearz Год назад +1

      They know it's that, but there's no option to switch it!

  • @One_Guy
    @One_Guy Год назад

    11:15 COD Warzone 2.0 has some of these lighting/reflection issues too. Sometimes these are game issues, not card issues. Running an Nvidia 3080 here.

  • @cptdumplin
    @cptdumplin Год назад +12

    I bought a rig with an Arc A770 recently, and it hasn't let me down once yet. It's kept a consistent 40-50 fps in Cyberpunk at max settings with ray tracing. Great buy.
    Update: Since the recent addition of XeSS to Cyberpunk, I've gotten a consistent 120 FPS at the same settings.

    • @dracer35
      @dracer35 Год назад +4

      I was also really surprised by the ray tracing performance in Cyberpunk. Their first try at ray tracing is already way ahead of AMD, and I think future generations will rival Nvidia's ray tracing ability.

    • @gorkskoal9315
      @gorkskoal9315 Год назад +3

      My condolences on Cyberpunk.

    • @Akens888
      @Akens888 Год назад +4

      @@gorkskoal9315 I don't know why reviewers even mention this game; it is such a turd of a game. Why would anyone care about the RT performance in this title when no one wants to play it?

    • @gorkskoal9315
      @gorkskoal9315 Год назад +2

      @@Akens888 bingo!

    • @dracer35
      @dracer35 Год назад +7

      @@Akens888 Who pissed in your cereal? 😂

  • @yellowcrescent
    @yellowcrescent Год назад +1

    That "DisplayPort connection might be limited" message I get all the time with my RTX 3090 FE. In my experience, it happens when you connect USB-C to a monitor that allows DP over USB-C functionality, but you're only using the USB-C connection for data (aka the connection is hooked up to a USB host bus and not a GPU).

  • @johnnymaynard299
    @johnnymaynard299 Год назад +4

    Thank you both for doing this challenge. I know you both suffered a lot for us, but could you come back to this again in the future when Intel improves the drivers more? Thanks again... awesome stuff!