Ray Tracing Performance On An Intel GPU Is Weird...

  • Published: 16 Dec 2024
  • Video Sponsor: Get some Corsair Dom Plat Epeen here: amzn.to/3SXWC7q
    I try some ray tracing on Intel's new A770 graphics card, and it was a bit of a weird experience.
    Some Micro Center links for your viewing pleasure:
    -Shop Intel Arc A770 Graphics Card: micro.center/msg
    -Shop Top Deals on ALL Processors: micro.center/j7f
    Get some AWESOME Dawid T-shirts, Mouse pads and more here: dawiddoesmerch...
    Help us buy weird computers and tech to make videos on by 'Joining' the channel on RUclips here: / @dawiddoestechstuff
    Or you can support the channel on Patreon: / dawiddoestechstuff
    Follow me on whichever Social media you don't hate
    Discord: / discord
    Twitch: / dawiddoestechstuff
    Twitch bits on RUclips: / @dawiddoestwitchstuff1128

Comments • 1.2K

  • @DawidDoesTechStuff
    @DawidDoesTechStuff  2 years ago +195

    Hey! For the testing in this video, I did not have ReBar turned on. So I did a follow-up video looking at how much of a difference this single setting made to the airbus GPU's gaming performance. Go check it out here: ruclips.net/video/4itS-I_Xtlw/видео.html

    • @ftchannel21
      @ftchannel21 2 years ago +6

      It's still private, lel

    • @DawidDoesTechStuff
      @DawidDoesTechStuff  2 years ago +14

      @@ftchannel21 20 more minutes. 😃

    • @Megatholis
      @Megatholis 2 years ago +4

      Yeah, in this video it seems to be slightly better than a GTX 960, except in Cyberpunk

    • @muneelminhas6076
      @muneelminhas6076 2 years ago +1

      if you reply ill send you a pokemon card

    • @godslayer1415
      @godslayer1415 2 years ago +4

      So no ReBar - which Intel has stated needs to be enabled... Maybe you will try testing in Win95 next...

  • @Lyander25
    @Lyander25 2 years ago +1821

    Honestly, describing the Intel ARC GPU as a very smart drug addict was genius. Never change, Dawid, your commentary never ceases to brighten up murky days.

    • @Kardall
      @Kardall 2 years ago +13

      If this was a YLYL video, I would have lost it multiple times... so funny. lol

    • @ArmadaAsesino
      @ArmadaAsesino 2 years ago +21

      His random analogies are the reason I subscribed haha

    • @tyleyden8695
      @tyleyden8695 2 years ago +6

      I have to admit his cheeky comments brighten my day as well 🍻

    • @DawidDoesTechStuff
      @DawidDoesTechStuff  2 years ago +51

      😂

    • @Middleseed
      @Middleseed 2 years ago +3

      Yeah, I totally agree with this.

  • @swayy0809
    @swayy0809 2 years ago +1652

    I like how Dawid just names the GPU "Intel Airbus" and sticks with it throughout the video 😂

    • @UnL1k3
      @UnL1k3 2 years ago +9

      I was just going to comment that 😂😂

    • @50H3i1
      @50H3i1 2 years ago +8

      It's a thing he does in most videos

    • @ElTeeger
      @ElTeeger 2 years ago +11

      His consistency is impeccable

    • @AtomSquirrel
      @AtomSquirrel 2 years ago +31

      And A380 instead of A770 at the beginning of the video

    • @Pearloryx
      @Pearloryx 2 years ago +14

      I'm an avgeek and I approve of his jokes

  • @Elios0000
    @Elios0000 2 years ago +977

    Did you double-check you had Resizable BAR on? The Intel GPUs lose a ton of speed if it's not active.

    • @L4ftyOne
      @L4ftyOne 2 years ago +47

      yes he did

    • @kidShibuya
      @kidShibuya 2 years ago +26

      @@L4ftyOne You know how?

    • @wargamingrefugee9065
      @wargamingrefugee9065 2 years ago +32

      @@kidShibuya It's a BIOS/UEFI setting on supported hardware. The CPU, motherboard and GPU must all support the feature in order to use Resizable BAR.

    • @konrad999999
      @konrad999999 2 years ago +205

      Dawid did not mention anything he changed in the BIOS to enable Resizable BAR, so it's possible he didn't have it on

    • @FakeMichau
      @FakeMichau 2 years ago +6

      @@konrad999999 12th gen *should* have it enabled by default, so if he didn't change anything then it should be enabled
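For anyone who wants to verify the setting actually took effect rather than guessing from the BIOS menu: on Linux, `lspci -vv` reports the size of each PCI memory BAR, and with Resizable BAR active the GPU's prefetchable BAR typically spans the whole VRAM (e.g. 16 GB on an A770 16GB) instead of a 256 MB window. A minimal parsing sketch; the sample lines are an assumption based on typical lspci output, not captured from an actual A770:

```python
import re

def gpu_bar_size_mb(lspci_vv_output: str) -> int:
    """Return the largest prefetchable memory BAR (in MB) found in
    `lspci -vv` output for a device. With Resizable BAR active this is
    typically the full VRAM size; without it, usually 256 MB or less."""
    sizes = []
    # Lines look like: "Region 2: Memory at ... (64-bit, prefetchable) [size=16G]"
    for m in re.finditer(r", prefetchable\) \[size=(\d+)([MG])\]", lspci_vv_output):
        n, unit = int(m.group(1)), m.group(2)
        sizes.append(n * 1024 if unit == "G" else n)
    return max(sizes, default=0)

sample = (
    "Region 0: Memory at 80000000 (64-bit, non-prefetchable) [size=16M]\n"
    "Region 2: Memory at 4000000000 (64-bit, prefetchable) [size=16G]\n"
)
print(gpu_bar_size_mb(sample))  # 16384 -> a full-VRAM-sized BAR, i.e. ReBAR is live
```

On Windows, the Intel Arc Control software and GPU-Z both report Resizable BAR status directly, so no parsing is needed there.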

  • @baroncalamityplus
    @baroncalamityplus 2 years ago +271

    If the game has an option for Vulkan, try that, as it's better supported than DirectX at the moment. People have been using a DX11-to-Vulkan wrapper and getting large performance gains.

    • @luisortega8085
      @luisortega8085 2 years ago +24

      yeah, dxvk

    • @harrytsang1501
      @harrytsang1501 2 years ago +59

      So Intel Arc GPUs benefited most from the work done on the Steam Deck and Linux gaming; everyone wins with open source

    • @TheJackiMonster
      @TheJackiMonster 2 years ago +6

      To be honest, you also get performance gains with DirectX-to-Vulkan translation on other GPUs as well. I think there are still 2-year-old benchmarks showing that you can get around 20% more performance in World of Warcraft, for example... even comparing DX12 with DXVK.

    • @lillee4207
      @lillee4207 2 years ago

      I just run Pop!_OS with Vulkan. I started using it a little more than a year ago when it was barely supported, and now every game supports it, and every game is on Proton

    • @SomeRandomPiggo
      @SomeRandomPiggo 2 years ago +1

      as a Linux user it sounds so strange hearing about DXVK on Windows XD, some of my friends who run Windows did get huge performance gains in games that use old DX versions

  • @shadowswithin702
    @shadowswithin702 2 years ago +319

    I really want these Intel graphics cards to do well; having another competitor for Nvidia's shenanigans would be good. And these cards don't seem to be far off. With better drivers and optimizations, the second or third generation could be very compelling.

    • @TheFool12-12
      @TheFool12-12 2 years ago +7

      At least pre-COVID GPUs were intended to get the price drops they had, but scalpers ruined it. I still agree tho.

    • @raven4k998
      @raven4k998 2 years ago +4

      Well, Intel's doing pretty good considering they are just starting out. Think about it: AMD bought ATI, while Intel is starting from scratch, and they are already able to not just game but also do ray tracing to some degree. It's rather impressive when you think about it.

    • @crylune
      @crylune 2 years ago +10

      Everyone wants these cards to do well. Only idiot fanboys don't. Competition is good for everyone.

    • @raven4k998
      @raven4k998 2 years ago

      @@crylune I wish they were out sooner, but better late than never🤣

    • @michaelrayguitartech
      @michaelrayguitartech 1 year ago +3

      @@raven4k998, that's not a fair comparison considering Intel used a person from AMD to design the Intel GPU. Along with that, it's not fair to compare Intel or AMD to NVIDIA in ray tracing, because the DXR code is programmed for NVIDIA, requiring very complex drivers to compensate. This will not be the case with games developed on real DirectX 12 engines like Unreal Engine 5 that are not some ghetto-rigged DirectX 11 engine with the "DirectX 12 feature" added non-natively. The major purpose of DirectX 12 was to eliminate all this "driver" nonsense and simplify the process by natively supporting software and hardware RT within the engine code itself. NVIDIA will still have to deal with drivers for every game because the code has to travel outside of the engine's workspace to utilize dedicated cores. AMD and Intel have the advantage of allowing developers to utilize them easily because the allocated cores are within the same silicon. There is also the fact that AMD's shader cores have become so fast and efficient that software RT like Lumen can be handled without the need for dedicated hardware. The differences between Lumen and dedicated ray tracing have recently been proven to be minor, and it will only get better. NVIDIA still doesn't hold a candle to AMD's pure shader-core performance.

  • @steveleadbeater8662
    @steveleadbeater8662 2 years ago +594

    As a few others have said, Intel's cards really need ReBar. It's a decent card at the price, if they can sort out the drivers.

    • @tagesvaterpatrick8780
      @tagesvaterpatrick8780 2 years ago +42

      Right. Especially if you acknowledge that this is the 1st generation with 1st gen drivers!!!

    • @raptorhacker599
      @raptorhacker599 2 years ago +3

      Wtf is rebar?

    • @bennyweimer2345
      @bennyweimer2345 2 years ago +52

      @@raptorhacker599 Resizable BAR; basically it lets the CPU access the GPU's memory as one large window instead of in small 256 MB chunks.

    • @shaneeslick
      @shaneeslick 2 years ago +8

      G'day Steve, at $699AUD the A770 16GB isn't priced very well down here in Australia, as you can get an RX 6700 XT or 3060 Ti for the same $$$

    • @stamasd8500
      @stamasd8500 2 years ago +41

      @@raptorhacker599 Steel bars used to reinforce concrete. :P

  • @playstation1freak26
    @playstation1freak26 2 years ago +251

    All the stuttering in these games makes it look like ReBar is disabled. Arc really likes ReBar; it's a setting in the BIOS that really needs to be enabled.

    • @SirSleepwalker
      @SirSleepwalker 2 years ago +28

      Very likely. I've seen tests of this card and it was looking like a competent 1440p card in new titles, so performance like this at 1080p looks strange.

    • @Derpynewb
      @Derpynewb 2 years ago +13

      @@SirSleepwalker Intel GPUs seem weird. They perform better at higher resolutions for some reason.

    • @IPendragonI
      @IPendragonI 2 years ago +3

      @@Derpynewb I saw Digital Foundry's review of the Arc GPUs, and ray tracing seems a lot better in their testing; they did it for both 1080p and 1440p. I guess Intel cards are a little gimmicky right now.

    • @jesusbarrera6916
      @jesusbarrera6916 2 years ago +2

      it was on, he clarified it

    • @lUnderdogl
      @lUnderdogl 1 year ago

      @@Derpynewb Well, it's built from scratch and new. They are well optimized for newer tech.

  • @curvingfyre6810
    @curvingfyre6810 2 years ago +75

    Bottom line, Intel succeeded in sending a mid-tier GPU to market at (somewhat) reasonable pricing, something that neither Nvidia nor AMD really does anymore. Some of us are still on the Nvidia 10 series, and are just excited to see more variety entering the second-hand market, and competition on price efficiency from here.

    • @ole-martinbroz8590
      @ole-martinbroz8590 2 years ago +4

      The 6600 XT performs about on par without RT and is just a lot cheaper, so?
      This card costs about the same as a 3060 and performs like one, just with more issues, so I don't find it good value: the performance isn't better, the price isn't better, and the driver quality is worse.
      Nothing else to be said really.

    • @curvingfyre6810
      @curvingfyre6810 2 years ago +9

      @@ole-martinbroz8590 They do compete a lot better for professional applications. Their encoder is insane, and they seem to perform better in professional rendering than game rendering. They don't have to be the best in their first generation, so long as they have a market. Though I will say, I wasn't aware of just how little difference there was between AMD's low-end stuff. I'm happy with the card I have for the next few years, so I didn't do that much research.

    • @WaterZer0
      @WaterZer0 2 years ago

      They've already laid off the graphics division, I'm afraid.

    • @jesusbarrera6916
      @jesusbarrera6916 2 years ago +1

      man, you guys claim to know so much and can't even look for 5 seconds at Newegg to see even RX 6700 XTs are now found at $350....

    • @curvingfyre6810
      @curvingfyre6810 2 years ago +1

      @@WaterZer0 Wait, actually? So they've given up after the first generation? They can't have expected to profit on the first go, that's insane.

  • @davewagler1092
    @davewagler1092 2 years ago +231

    I checked out my local Micro Center and they had one Arc GPU, an Arc 750. On the web page showing Intel GPUs (only the one) was this line:
    "When it comes to purchasing a GPU, the first decision you need to make is between an NVIDIA graphics card or an AMD graphics card."

    • @weberman173
      @weberman173 2 years ago +59

      tbf, as funny as that is, it's more likely they haven't yet gotten around to updating their site templates than it being meant as a jab. These blurbs above product categories (like GPUs, even for specific brands) are part of the template for the product category, and it's not often you get a new contender in a space like GPUs, so this isn't necessarily something you specifically check or even treat as high priority.

    • @crem44
      @crem44 2 years ago +8

      Living anywhere near a Micro Center is a blessing…

    • @kobolds638
      @kobolds638 2 years ago +4

      In future laptops you will only need to choose either an Intel CPU/GPU or an AMD CPU/GPU.

    • @louiesatterwhite3885
      @louiesatterwhite3885 1 year ago

      @@kobolds638 hopefully Nvidia goes the way of Voodoo then

  • @johnb3513
    @johnb3513 2 years ago +39

    Other reviews had Intel somehow more competitive at 1440p. Like 1/2 the performance at 1080p, but 3/4 the performance at 1440p.

  • @craig71686
    @craig71686 2 years ago +33

    The Intel A7XX cards do seem to be more optimized for 1440p. These results probably would have been more interesting at that resolution.

    • @Beveyboygames
      @Beveyboygames 1 month ago +1

      It's because of the bus speed, which I believe is better than even the 3080's, so it does really well at not dropping as much FPS at 1440p

  • @busterscrugs
    @busterscrugs 2 years ago +77

    I really hope these cards continue to get better with driver updates. Nvidia/AMD need serious competition!

    • @franciscocornelio5776
      @franciscocornelio5776 1 year ago

      You wish, but you won't buy one, will you?

    • @KiotheCloud
      @KiotheCloud 1 year ago

      ​@@franciscocornelio5776 Actually, considering their price vs. similar-speed and -spec cards from AMD and Nvidia, I would buy one if it gets close to its competitors. If I were to buy a GPU that's $50 lower than Nvidia's but only around 10 FPS slower, that's a real good product.

    • @SMCwasTaken
      @SMCwasTaken 1 year ago

      ​@@KiotheCloud I only want one for Minecraft

    • @cjpowretired
      @cjpowretired 9 months ago

      @@SMCwasTaken If you want a graphics card for Minecraft you aren't going to need much. Just get a low-profile card.

  • @thatguycalledaustin4206
    @thatguycalledaustin4206 2 years ago +17

    I want Intel to keep creating GPUs and increasing support for older games. I might not buy one of their GPUs this generation, but I definitely would consider buying their 2nd or 3rd generation if they continue. For a first attempt at a GPU, everyone seems to expect them to be perfect. As a computer engineer myself, I just want to say that it's hella impressive that they got the game support and performance that they did on their first attempt! I'm excited to see them continue in this space.

    • @EustaH
      @EustaH 1 year ago

      Yes, but rather than being impressive from an engineering standpoint, it needs to be actually much better than AMD/Nvidia cards at the same price point in practical day-to-day use to get reasonable sales, since it doesn't have any brand recognition. I don't see that coming, so let's hope Intel's plan for the 1st gen is to be a sort of beta test, with only the 2nd gen and above being a commercial success.

  • @alexlexo59
    @alexlexo59 2 years ago +18

    The FPS that you saw in DX9/DX10 was a result of the card not having native drivers for DX9, DX10, and I think DX11 too.
    Intel focused on creating DX12 and Vulkan drivers first.
    You are still able to play those games because they added a translation layer that translates from DX9/10/11 to DX12.
    Translating is also somewhat CPU-intensive, which can be seen in the graphs.
    What I personally would do is use DXVK instead of translating to DX12,
    because DXVK seems to be more mature thanks to its wide use on Linux.
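For anyone wanting to try that, DXVK is installed per game by dropping its replacement DLLs (taken from the x64 folder of a DXVK release archive) next to the game's executable. A rough sketch of the copy step; the directory layout here is a hypothetical example, not a specific game's path:

```python
import shutil
from pathlib import Path

# DLLs DXVK ships as drop-in replacements for Microsoft's D3D runtimes.
DXVK_DLLS = ["d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"]

def install_dxvk(dxvk_x64_dir, game_dir):
    """Copy DXVK's 64-bit DLLs into the game folder, backing up any
    existing files first so the change is easy to undo. Returns the
    names of the DLLs that were installed."""
    dxvk_x64_dir, game_dir = Path(dxvk_x64_dir), Path(game_dir)
    installed = []
    for name in DXVK_DLLS:
        src = dxvk_x64_dir / name
        if not src.exists():          # a release may not ship every DLL
            continue
        dst = game_dir / name
        if dst.exists():
            shutil.copy2(dst, dst.with_suffix(".bak"))  # keep the original
        shutil.copy2(src, dst)
        installed.append(name)
    return installed
```

Deleting the copied DLLs (or restoring the `.bak` files) reverts the game to the native DirectX path, which makes before/after benchmarking easy.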

  • @sakura2646
    @sakura2646 2 years ago +80

    ARC GPUs are smexy

    • @zero-ej6rt
      @zero-ej6rt 2 years ago +4

      True

    • @startedtech
      @startedtech 2 years ago +13

      now that's a word I haven't seen in ages

    • @Gatorade69
      @Gatorade69 2 years ago +3

      ARC GPUs are smelly.

    • @trr4gfreddrtgf
      @trr4gfreddrtgf 2 years ago

      @@Gatorade69 you're smelly

    • @muufle
      @muufle 2 years ago +3

      @@Gatorade69 tbh I'd rather take smell over the extreme "gAmEr" look a little green company uses.

  • @buggerlugz6753
    @buggerlugz6753 2 years ago +20

    Love how the A770 is the price the top-end Nvidia cards should be. Good move there, Intel.

    • @michaelbrindley4363
      @michaelbrindley4363 2 years ago +3

      Yeah, from the early 2000s. I miss those days.

    • @buggerlugz6753
      @buggerlugz6753 2 years ago +2

      @@michaelbrindley4363 - Me too, Michael, when the quality of games was great and it didn't cost the earth. Now we've got remakes generally, poor innovation, and GPU pricing which is a joke.

    • @psychologicalFudge
      @psychologicalFudge 1 year ago

      But then doesn't that mean the Intel GPU is overpriced? Considering the performance doesn't match the top Nvidia GPUs

    • @auritro3903
      @auritro3903 11 months ago

      @@psychologicalFudge I think he means that it competes with the 3060 and is a much better deal

    • @psychologicalFudge
      @psychologicalFudge 11 months ago +1

      @auritro3903 I mean, the 3060 is 40%+ more powerful, so idk if it really competes with that. That's not even counting DLSS. Either way, OP did say high-end, so I assumed he meant the 40 series... which are even more powerful.

  • @teamtechworked8217
    @teamtechworked8217 2 years ago +2

    I started watching when you had 5k subs. Now within days you will reach 500k. Congrats!!

  • @benputhoff9898
    @benputhoff9898 2 years ago +16

    I really hope this architecture takes off and matures well. My next PC might be all Intel!

  • @DigitalJedi
    @DigitalJedi 2 years ago +11

    Finally a Dawid video where I actually own the product. So far I've been loving mine. It absolutely crushes AV1 and video transcoding, and it renders quite well too.

    • @tvHTHtv
      @tvHTHtv 2 years ago +1

      I've been wondering how well it handles video rendering for live streaming. Thank you for your comment!
      All the big tech reviewers only review GPUs for their gaming performance. I wish they would cover video rendering too.

    • @DigitalJedi
      @DigitalJedi 2 years ago +2

      @@tvHTHtv For streaming it is more than enough. Right now mine is used to encode an 800p 60fps Twitch stream, and record the gameplay and my webcam at 1600p and 1080p respectively. It seems to do quite well for that in OBS.

    • @wargamingrefugee9065
      @wargamingrefugee9065 2 years ago

      @@tvHTHtv Are you familiar with EposVox? His video, "Intel GPUs are NOT what anyone expected", covers video performance starting at 7:56.

    • @dracer35
      @dracer35 2 years ago +1

      I agree. I used my A770 to stream 1080p60 for a RUclips stream test while playing Scum the other day and it worked great. A few friends said the video quality looked excellent. The only problem I had was that I think I picked the wrong audio settings, so it was only picking up my mic audio. My fault.
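For anyone curious what that AV1 workflow looks like outside OBS: ffmpeg can target Arc's hardware AV1 encoder through Quick Sync, provided the ffmpeg build includes QSV/oneVPL support. A minimal command-builder sketch; the filenames and bitrate are placeholder examples:

```python
def av1_qsv_cmd(src, dst, bitrate="6M"):
    """Build an ffmpeg command line that transcodes `src` to AV1 using
    Intel's Quick Sync hardware encoder (av1_qsv) on an Arc GPU."""
    return [
        "ffmpeg",
        "-hwaccel", "qsv",     # decode on the GPU too, where supported
        "-i", src,
        "-c:v", "av1_qsv",     # Arc's hardware AV1 encoder
        "-b:v", bitrate,       # target video bitrate
        "-c:a", "copy",        # pass audio through untouched
        dst,
    ]

print(" ".join(av1_qsv_cmd("gameplay.mkv", "gameplay_av1.mkv")))
```

Run the printed command in a shell (or via `subprocess.run`) on a machine with an Arc card and a QSV-enabled ffmpeg; `ffmpeg -encoders | grep av1` confirms whether `av1_qsv` is available in your build.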

  • @aku2dimensional
    @aku2dimensional 2 years ago +11

    I like that Intel has entered the GPU market, because most if not all cards are built with NVIDIA or AMD chips. Looks like they need to sort things out overall, but it's a promising start. Intel also has great packaging; in 2016 or so they used an IR-triggered soundboard to play the Intel jingle every time you opened a CPU box, just like those musical birthday cards. I cut them out and put them in unsuspecting places such as a cabinet or by a door; walk into the room and you'd hear the jingle.

  • @Groovy-Train
    @Groovy-Train 2 years ago +25

    Haha, the first thing I thought of when I saw the naming of the ARC GPUs was Airbus too! If they can optimize the drivers better then it might be a great buy.

  • @David-yx3bd
    @David-yx3bd 2 years ago +5

    I actually got to build a system with an A770, and honestly, in the stability testing I did before handing it off to the customer, it was surprisingly good in the RT department. Optimization was horrible, requiring a ton of tweaking and workarounds, but the results were actually worth it.
    Properly dialed in, it was about even with a 3060, maybe a 3060 Ti in some titles, from a straight FPS standpoint. From an RT standpoint, though? It was closer to a 3070 when properly dialed in. For a first gen? That's not bad, but you have to like tinkering if you're going to buy this one in its current state. It has a lot of the "teething issues" Radeon had early on, but a much better architecture, so it looks promising if they can get the optimization side of it handled.

  • @kyles8524
    @kyles8524 2 years ago +12

    I saw the ray tracing videos a few months back; it even did as well as the 3060 Ti in some cases

    • @XiaOmegaX
      @XiaOmegaX 2 years ago +4

      On the Linux drivers there are a few tests where it hit *3080* levels. Currently outliers, but a sign of how far these might stretch with heavy driver optimizing.

    • @Lolwutfordawin
      @Lolwutfordawin 2 years ago +1

      @@XiaOmegaX That would make them incredible for Blender rendering workloads, since they rely so heavily on ray tracing now.

  • @grimmpickins2559
    @grimmpickins2559 2 years ago +38

    I'm sure you've read enough comments about DXVK and ReBar at this point... I've been watching a number of videos about these cards, and yes, the driver situation needs help too... But I've seen better, much better, performance from these cards. I'm very happy to see competition to Nvidia/Radeon, which feels really odd to say since it's Intel... LOL.

    • @Grayfox01
      @Grayfox01 2 years ago

      Oh no, Intel is let down by the performance. They wanted more, though at least they priced it appropriately for the market. But this first gen is putting the whole project in tepid waters.

  • @Tempora158
    @Tempora158 2 years ago +1

    2:00 Yeah, I know you were attempting to be funny, but that cardboard is saying that YOU are authorized to use the logo (via the sticker) on your computer as long as it has that graphics card installed, not that Intel is authorized to use their own logo (because OF COURSE they are authorized to use their own logo).

  • @bhgemini
    @bhgemini 2 years ago +8

    Wait. Did you swap back and forth between the 770 and a 380? I thought you made a verbal oopsie, but the Airbus pic also had 380 on it.
    I picked up the A380 for my HTPC and it is an AV1-compressing Blu-ray beast as well. No crushed blacks, and it fully encodes the 1-2 gig AV1 copy in around 10-18 minutes. Going to check out its HDR performance soon.

  • @Eazon_
    @Eazon_ 2 years ago +3

    Did you use ReBAR btw? HUGE impact. Pretty sure Petersen even said to buy a 3060 if you don't have ReBAR.

  • @Thelango99
    @Thelango99 2 years ago +5

    Did you remember to enable ReBAR?
    The performance is crippled without it.

  • @dmagik8
    @dmagik8 2 years ago +7

    Great as always. I look forward to new videos with that Intel also-ran. I hope they stay in the GPU game and offer a real 3rd option.

  • @WorldLoveGaming
    @WorldLoveGaming 2 years ago +3

    The great Control thumbnail is back!

    • @Groovy-Train
      @Groovy-Train 2 years ago +1

      I couldn't resist that clickbait! 😄

  • @carrioncrow8191
    @carrioncrow8191 2 years ago +2

    Random video idea:
    Buy one of those screens that they sell on Amazon or other sites, and play games on it solely for a week or longer.
    These screens are the sizes made for cars or other devices, cut from the material left over after larger screens are made. It is my understanding that a screen is made as a larger surface and then cut to size, so any leftovers can be manufactured as displays, or they are waste. It sounds like it could be a cool video.

  • @jboogie4701
    @jboogie4701 2 years ago +4

    Please do an SLI Arc build - the fun part is finding compatible hardware ;)
    Also, I've heard development for DXVK (DirectX over Vulkan) on the Arc series is going pretty well.

  • @prainmantis
    @prainmantis 2 years ago +2

    Did he turn on ReBAR or not? I couldn't find it if he did. I keep hearing that it needs to be turned on for better performance.

  • @matasa7463
    @matasa7463 2 years ago +6

    Holy shit, this thing's potential is insane. If they can work out the driver problems... damn, I might grab one myself later.

    • @aurelia8028
      @aurelia8028 1 year ago

      Yeah... It's actually reasonably priced as well

  • @lighthawk95
    @lighthawk95 2 years ago +27

    If ReBAR was not being used, all of these benchmarks mean nothing.

    • @theaveragecactus
      @theaveragecactus 2 years ago +2

      It doesn't look like it was. To be honest, this is a fantastic card with it on, but a surprising number of people don't know it needs to be used.

    • @FakeMichau
      @FakeMichau 2 years ago

      12th gen should turn that on by default

    • @OGPatriot03
      @OGPatriot03 2 years ago +1

      He said it was enabled in a reply to one of these comments

  • @filipeligabue
    @filipeligabue 2 years ago +6

    It was impressive how the box had a blue glow as he opened it!

    • @scottcol23
      @scottcol23 2 years ago +1

      That wasn't the box.

  • @alessio54321
    @alessio54321 2 years ago +2

    Did you enable ReBar? It's an absolute necessity for these Arc GPUs to work properly, as dumb as that is.

  • @nfg_racing7968
    @nfg_racing7968 2 years ago +4

    I have high hopes for Intel's GPUs. I hope they get them together; we need some more competition in the GPU space.

  • @PhoenixKeebs
    @PhoenixKeebs 2 years ago +2

    Honestly thinking of upgrading to this from my 3060 once the drivers get good. But I don't know if that's really an upgrade?

    • @SSY90
      @SSY90 2 years ago

      Complete lateral move. Not sure why you'd even consider that.

  • @Toma-621
    @Toma-621 2 years ago +3

    Someone should make a dedicated ray tracing card. It would go in the second PCIe slot alongside the GPU, and it would let you max out ray tracing without adding even a single percent of GPU usage, meaning even people with lower-end cards that don't have RT capabilities would get to experience ray tracing. Also, having a dedicated ray tracing card would let developers go crazy with ray tracing.

    • @spht9ng
      @spht9ng 2 years ago +1

      There might be PCIe limitations regarding bandwidth. RT cores are integrated into the GPU, so the bandwidth is high and the latency is low.

    • @ummerfarooq5383
      @ummerfarooq5383 2 years ago

      Like a photonic chip card. Then ray tracing would be instant.

    • @ILoveTinfoilHats
      @ILoveTinfoilHats 1 year ago +1

      Bad idea. RT cards are already basically doing that, because they have dedicated cores for it, so it's not hurting shader performance anyway. All you're doing is adding latency and complication while reducing bandwidth.

  • @ninjasiren
    @ninjasiren 2 years ago +1

    Soon there will be a collaboration between Intel and Airbus, and soon Airbus will also compete in the GPU scene lol

  • @fastgecko5799
    @fastgecko5799 2 years ago +2

    DX12 Battlefield probably stutters because it's compiling shaders. If you had kept playing that map it probably would have evened out after a bit.

    • @DawidDoesTechStuff
      @DawidDoesTechStuff  2 years ago

      That's what I was thinking too. But after 15 minutes it was still stuttering like that.

  • @beardalaxy
    @beardalaxy 2 years ago +3

    It seems like this is a really great mid-level GPU that just lets you use ray tracing, whereas most mid-level GPUs would not take it as well. Very interesting!

  • @repatch43
    @repatch43 2 years ago +2

    Could the stutters be a result of not enabling Resizable BAR?

  • @ricthehalfrican
    @ricthehalfrican 2 years ago +28

    "It's like living with a very smart drug addict"
    Sir, you earned yourself a like

  • @IntziGG
    @IntziGG 1 year ago +1

    Sometimes you got like 40 to 60 fps and it feels like when I used to have 10: too much stutter. Aside from that, very decent fps in all games.
    What is the difference in RTX settings on this card compared to a real RTX card, if there is any?
    Is it just branding, or do the extra RT cores do something?
    I'm confused by the fact that you get RTX and keep the same fps.

  • @Shriram_767
    @Shriram_767 2 years ago +3

    What do you think about its price-to-performance ratio 🤔

  • @cerescop
    @cerescop 2 years ago +1

    Did you turn on Resizable BAR? The A770 requires it to be on to run well.

  • @demrasnawla
    @demrasnawla 2 years ago +9

    It's funny that ray tracing is so unimportant that we often really struggle to tell the difference, except in games where the rasterized lighting is super basic (Quake RTX, Minecraft, etc.), and we have to rely on looking at the frame rate

    • @Takisgr07
      @Takisgr07 2 years ago +2

      If you can't tell the difference between ray tracing on and off, you really need to check your eyes :)

  • @nonchalanto
    @nonchalanto 2 years ago +1

    Just out of curiosity - did you make sure Resizable BAR was enabled?

  • @jakestocker4854
    @jakestocker4854 2 years ago +7

    Remember PhysX cards? I think they should make RT cards as an option. I think that would be awesome.

  • @Madblaster6
    @Madblaster6 2 years ago +1

    It's odd that DX12 performance was not great. Was Resizable BAR enabled in the BIOS?

  • @Karti200
    @Karti200 2 years ago +5

    I know you won't do it, but... I would love to see someone finally testing that card on Linux. With all due respect, it might be the same thing as with Tiger Lake (G5/G7 APUs), where it runs much better with the Linux Intel drivers than on Windows.
    *cat eyes stare* pweety pweeseee

    • @dillonnnnnn
      @dillonnnnnn 2 years ago +1

      I've heard it doesn't have driver support for Ubuntu yet

    • @Karti200
      @Karti200 2 years ago

      @@dillonnnnnn Not really, there actually is support.
      Kernel 6.0 has the drivers implemented officially by Intel,
      BUT
      Mesa3D already updated their packages with the Intel drivers. So really, even if you take, for example, Mint with kernel 5.15, you will still run that GPU without any issue, and if you use Mesa you will have everything updated to the latest no matter what :D

    • @ahmetrefikeryilmaz4432
      @ahmetrefikeryilmaz4432 2 years ago

      proper neckbeard linux.

    • @Karti200
      @Karti200 2 years ago

      @@ahmetrefikeryilmaz4432 mhm, ok xd

  • @jlagraba
    @jlagraba 1 year ago +1

    I have the same card running 4K resolution at 45-55 fps. I've never seen something so beautiful on my monitor

  • @Zionn69
    @Zionn69 2 года назад +3

    The only reason I wanna get an ARC A770 is because of the looks

    • @stephanhart9941
      @stephanhart9941 2 года назад +7

      I want it for the Glue!

    • @Zionn69
      @Zionn69 2 года назад

      @@stephanhart9941 i dont get it, can you explain

    • @Fly_By_Gaming
      @Fly_By_Gaming 2 года назад +2

      @@Zionn69 Gamers Nexus who reviewed one
      of the models said the pcb was tacked with so many thermal pads, acting like glue.

    • @stephanhart9941
      @stephanhart9941 2 года назад +3

      @@Zionn69 check out the Gamers Nexus teardown. There are some questionable assembly methods employed with the reference A770/750, including glue. Not great for maintenance or reassembly.

    • @OCDyno
      @OCDyno 2 years ago

      There's some tape on the backplate

  • @RealSiViX
    @RealSiViX 2 years ago +1

    I'm sure if you give the drivers for that GPU a few solid revisions, most of that stutter will get worked out...

  • @medokn99
    @medokn99 2 years ago +2

    Do you even ray trace bro?

  • @thecarrot4412
    @thecarrot4412 2 years ago +1

    Great video, but to be honest, if you can't tell when or whether ray tracing has been activated with a settings change, then it's a moot point and just a way to have less fps

  • @tvHTHtv
    @tvHTHtv 2 years ago +3

    I've been leery of considering an Intel GPU for my system but Dawid's tests make it look like a decent option...especially for the price. Ngl...I dont hate it! Maybe someday when my EVGA FTW3 3060ti Ultra kicks the bucket, I'll consider replacing it with an Intel Arc because NVIDIA flat out said they are going to price gouge and AMD will probably be jealous of NVIDIA's revenues so they'll start raising prices too.
    It's also worth noting that these are 1st-gen Intel cards...I would like to think they'll improve in the future.

  • @rtyler1869
    @rtyler1869 2 years ago +1

    quick question - did you activate RBAR? Intel ARC needs this to get stable performance

  • @MarcNorris
    @MarcNorris 2 years ago +1

    Any chance of doing a video using this for video editing/rendering? It would be interesting to see how it compares to the competition.

    • @DigitalJedi
      @DigitalJedi 2 years ago

      My A770 16Gb does pretty well for Davinci Resolve exports to AV1. My only comparison is an RX6800M laptop, which is understandably slower.

    • @aaronjones4529
      @aaronjones4529 2 years ago +1

      @@DigitalJedi Yeah, I think LTT did a video about putting the lowest-end Intel Arc in a system secondary to an NV/AMD card just for its AV1 encoder, as AV1 will be the new standard across the board since it gives much higher quality output for the same bitrate

    • @DigitalJedi
      @DigitalJedi 2 years ago

      @@aaronjones4529 I only went with the A770 for the Vram. Being able to load up huge chunks of a file or the entire file at once makes things go really fast.

    • @aaronjones4529
      @aaronjones4529 2 years ago

      @@DigitalJedi oh nice, it loads into vram and doesn't just stream from SSD.... Clearly I'm not knowledgeable regarding vid editing, and nice to learn a second new thing today

    • @DigitalJedi
      @DigitalJedi 2 years ago +1

      @@aaronjones4529 Some renderers will load up the vram and others just stream from storage. Mine loads into system memory first and then the GPU copies whatever it's working on over.

  • @bikeboy6674
    @bikeboy6674 2 years ago +3

    "That is an odd smell for a graphics card"
    You know you're in for a thorough review of this card when you even get a report on its olfactory status 😅

  • @SMGJohn
    @SMGJohn 2 years ago

    Makes sense that it has ray tracing performance advantages, considering Intel's GPU attempt 12 years ago, Larrabee, was a ray tracing GPU that ran fully ray-traced games at 60fps at 1080p back then - mind-blowing for something so old. Makes sense they would reuse some of that technology in their second production GPU launch.

  • @peterparker_hu
    @peterparker_hu 2 years ago +4

    despite all the issues this card have it looks awesome 😄 i'm more like an aesthetic guy than performance guy 😅

    • @MasterN64
      @MasterN64 2 years ago +1

      It's too bad, if you look at a teardown video, to see that it's held together with glue and tape. Literally. Sacrificing practical parts of the design, like it being easy to work on, for aesthetics is a terrible practice for a company. Even if most of your consumers won't tear it apart, there is quite often a need for some to do so.

    • @dumbvixen3776
      @dumbvixen3776 2 years ago

      I agree 100% the card is so pretty

    • @Gatorade69
      @Gatorade69 2 years ago

      Looks are superficial.

    • @wargamingrefugee9065
      @wargamingrefugee9065 2 years ago

      @@MasterN64 Uh, yeah, that solid, cast base plate is just for show. Get real. They taped some wires out of the way and used some adhesive to secure the back cover. It is not "literally" held together with glue and tape. If anything, the card is overengineered.

    • @MasterN64
      @MasterN64 2 years ago +1

      @@wargamingrefugee9065 Look guy, every single major manufacturer of cards avoids tape and glue at practically all costs because they are an absolute nightmare to deal with and put back together in a way that looks decent. The first step to taking it apart is to peel off the big sheet of flimsy metal on the back of the card, held there with double-sided tape, to get access to the screws. A metal sheet that you will -never- get back on in a way that will look decent again. If you want to understand why, just watch the Gamers Nexus teardown of the card and why they say so.

  • @repatch43
    @repatch43 2 years ago +1

    Haha, had to laugh every time you called it an airbus! :)

  • @thomasbetts822
    @thomasbetts822 2 years ago +1

    Did you have Resizable BAR on? It is a must for the Intel GPUs, otherwise they are extremely unstable.

    • @thomasbetts822
      @thomasbetts822 2 years ago

      Probably explains the stutter if you didn't check the bios to make sure it was on.

  • @raphi_sch
    @raphi_sch 2 years ago +1

    Usually the threaded holes at the back of a GPU are there to mount a bracket that holds the GPU in servers.

    • @aaronjones4529
      @aaronjones4529 2 years ago

      Really? Well you learn something new every day

  • @DJSekuHusky
    @DJSekuHusky 2 years ago +1

    Hmmm....curious. I'm running the A770 on a 9700k with 16GB DDR4 4000 and I didn't experience any such issues.
    For instance, your GTA V framerate was what mine does in 4k. Idk what could be causing that.

    • @dracer35
      @dracer35 2 years ago

      I'm seeing the same thing with mine, except I'm running a 12900ks. My A770 framerates are pretty much double what he's getting.

  • @AncientGameplays
    @AncientGameplays 2 years ago +1

    Why is Dawid actually "ignoring" the poor frametimes?

  • @OceanSky159
    @OceanSky159 2 years ago

    At 08:04 which software did you download? - DirectX Properties - the official thing from microsoft?

  • @IcebergTech
    @IcebergTech 2 years ago

    "Non-saggy mounter thing" - ahh, you've seen my Tinder profile

  • @xanderblackstar8236
    @xanderblackstar8236 2 years ago +1

    YES! FINALLY! Someone mentions the Airbus naming scheme. Intel has so clearly lifted it from Airbus that I honestly cannot believe nobody has mentioned it yet😁

  • @christowepener2698
    @christowepener2698 2 years ago

    Dawid, are you sure the GPU was actually in use (and not the one on the CPU)? That external GPU setting might have needed to be set. The other day I was struggling with Fallout 3 in Epic and had to force the game to use the external GPU in Windows' settings.

  • @demigodgamez
    @demigodgamez 2 years ago

    What were the specs for the test bench?
    I was really looking for them, since we do need to rule out any possible bottlenecks during testing.

  • @fantasypvp
    @fantasypvp 2 years ago

    2:47 IKR I have one of those old cards lying around somewhere and my first thought when I saw that arc GPU was, oh wow it looks a lot like that evga card

  • @probablemisclick
    @probablemisclick 2 years ago

    Tough day, really needed some Dawid snark to bring up my mood.
    I did not leave disappointed.

  • @spentcasing3990
    @spentcasing3990 2 years ago

    I saw about half a dozen Intel Arc's for sale at Memory Express at their Broadway location here in Vancouver. They have the A770 and A750

  • @richardpastoor9336
    @richardpastoor9336 2 years ago +2

    I hear you say A380 a few times....but you have an A770 installed...
    And by the way, the Intel Arc performs better at 1440p than at 1080p.

  • @Kim-fm2vo
    @Kim-fm2vo 2 years ago +1

    So when it runs well, it runs *very* well! Thanks for the review! Great video

  • @digidopetech
    @digidopetech 2 years ago

    Great video as usual. We are located in Ontario, and actually getting hands-on with an A770 was just luck. I had been calling Canada Computers and Memory Express regularly, but neither could confirm when they would have stock. Memory Express took a pre-order for me on an A750 but said they could not do it for an A770. The Canada Computers site showed stock of the A750 but still "Sold Out" for the A770; it wasn't until I actually checked the per-store stock checker, as you displayed, that it showed an A770 at 3 of the Ontario locations. My pre-order at MemEx still stands, but they haven't received any stock.

  • @someweirdo9129
    @someweirdo9129 2 years ago +2

    Wow, the raytracing on the a770 in quake 2 pulls twice the fps that my 6700xt does on the same settings 😅

  • @janemba42
    @janemba42 2 years ago +2

    RandomgaminginHD did a video showing that Vulkan is better with the Intel GPUs in a fair few cases, especially GTA 5. You should do a video checking it out too; it's called DXVK.

  • @diamondev1
    @diamondev1 2 years ago +2

    Wonder how well this would compare to the mobile 3060, seeing the power draw of this card

  • @poeticsilence047
    @poeticsilence047 2 years ago

    Ahhh, Dawid's classic Control thumbnail. You know he is doing a GPU test.

  • @vailpcs4040
    @vailpcs4040 2 years ago +1

    So..... Intel built a $350 RTX 2060-class GPU in 2022? OK. Not bad, not great.

  • @shawnrobinson03
    @shawnrobinson03 2 years ago +2

    Sad to realize all the computer nerds don't know that when Dawid says A380 he means the airplane, not the card.

  • @jonathandavis9507
    @jonathandavis9507 a year ago

    How did you get raytracing to show up in Control? It was completely greyed out for me when launching DX12

  • @Pearloryx
    @Pearloryx 2 years ago +1

    As an avgeek and a tech geek, I approve of your jokes about the A380 GPU plane lol
    Also, the term “GPU” exists in aviation too, meaning “Ground Power Unit”

  • @raffitchakmakjian
    @raffitchakmakjian 2 years ago

    dang, this is one of the best ARC reviews I've seen. Kind of makes me want one.

  • @cgd07
    @cgd07 2 years ago +1

    i'm not sure about setting the game to medium to give room for ray tracing, it's like putting the cart before the horse, but would love to know what u guys thought about this

  • @marcinsobczak2485
    @marcinsobczak2485 2 years ago

    There is one Dawid sentence in this video that pretty much sums up the ray tracing: 'I think it looks different'

  • @HanCurunyr
    @HanCurunyr 2 years ago

    I love that Ray Tracing was subtitled as rage racing at 0:09 =)

  • @Cxntrxl
    @Cxntrxl 2 years ago +2

    seems like a lot of this stuff is driver/compatibility related, so it sorta makes sense. Plus at that pricepoint it's definitely an impressive card

  • @Gokul_Yt
    @Gokul_Yt 2 years ago

    5:31 Rockstar finally fixed that, now no stutters, but when you reach 189 fps the audio will desync :(

  • @gotscroogled
    @gotscroogled 2 years ago

    Crysis Remastered actually uses software raytracing unless you have an RTX card. You can try to force it on with an autoexec.cfg in the base directory, with the following:
    con_restricted=0
    r_AllowNonRTXHWRT = 1
    You can check if it's toggled in game by typing r_AllowNonRTXHWRT in the console in game, but you would have to do testing to see if it actually does anything. Hard to tell on my 6700xt.
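
    For reference, those two cvars can be dropped into an autoexec.cfg from a shell - a minimal sketch, assuming a placeholder install path (point CRYSIS_DIR at your actual Crysis Remastered directory):

```shell
# Write an autoexec.cfg that unlocks restricted cvars and tries to
# force hardware ray tracing on non-RTX cards.
# CRYSIS_DIR is a placeholder, not the real install location.
CRYSIS_DIR="${CRYSIS_DIR:-/tmp/CrysisRemastered}"
mkdir -p "$CRYSIS_DIR"
cat > "$CRYSIS_DIR/autoexec.cfg" <<'EOF'
con_restricted=0
r_AllowNonRTXHWRT = 1
EOF
cat "$CRYSIS_DIR/autoexec.cfg"
```

    As the comment above notes, whether the flag actually changes anything on a given card still needs checking in the in-game console.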

  • @Galiant2010
    @Galiant2010 2 years ago +1

    Control DOES prompt for DX11 or DX12 on a "proper" setup. I just started it up literally yesterday to test how well my new 3060ti would do at 4k with ray tracing lol. (For the curious: not well. It averaged around 10 FPS. I don't understand why people laugh at CP77 when that runs better on my system than Control lol).

  • @rgott1234
    @rgott1234 2 years ago

    What software is he using for testing? I looked in earlier videos but never heard it mentioned. Thanks in advance.

  • @itsmarvin6999
    @itsmarvin6999 2 years ago

    GPU: I fear no man
    Dawid's nephew: But that thing it scares me.

  • @v0styr0-u4n
    @v0styr0-u4n 2 years ago +1

    yoooo dawid you probably won't see this but i live in the uk and by pre-ordering 2 days early i was able to instantly get an a380