Ray Tracing Performance On An Intel GPU Is Weird...
- Published: 16 Dec 2024
- Video Sponsor: Get some Corsair Dom Plat Epeen here: amzn.to/3SXWC7q
I try some ray tracing on Intel's new A770 graphics card, and it was a bit of a weird experience.
Some Micro Center links for your viewing pleasure:
-Shop Intel Arc A770 Graphics Card: micro.center/msg
-Shop Top Deals on ALL Processors: micro.center/j7f
Get some AWESOME Dawid T-shirts, Mouse pads and more here: dawiddoesmerch...
Help us buy weird computers and tech to make videos on by 'Joining' the channel on RUclips here: / @dawiddoestechstuff
Or you can support the channel on Patreon: / dawiddoestechstuff
Follow me on whichever Social media you don't hate
Discord: / discord
Twitch: / dawiddoestechstuff
Twitch bits on RUclips: / @dawiddoestwitchstuff1128
Hey! For the testing in this video, I did not have ReBAR turned on. So I did a follow-up video looking at how much of a difference this single setting made to the Airbus GPU's gaming performance. Go check it out here: ruclips.net/video/4itS-I_Xtlw/видео.html
It's still private, lel
@@ftchannel21 20 more minutes. 😃
Yeah in this video it seems to be slightly better than a gtx 960 except in cyberpunk
if you reply ill send you a pokemon card
So no ReBAR - which Intel has stated needs to be enabled... Maybe you'll try testing in Win95 next.
Honestly, describing the Intel ARC GPU as a very smart drug addict was genius. Never change, Dawid, your commentary never ceases to brighten up murky days.
If this was a YLYL video, I would have lost it multiple times... so funny. lol
His random analogies are the reason I subscribed haha
I have to admit his cheeky comments brighten my day as well 🍻
😂
Yeah, I totally agree with this.
I like how Dawid just names the gpu "Intel Airbus" and sticks with it throughout the video 😂
I was just going to comment that 😂😂
It's a thing he does in most videos
His consistency is impeccable
And A380 instead of A770 at the beginning of the video
I'm an avgeek and I approve of his jokes
Did you double check you had Resizeable Bar on? the intel GPUS lose a ton of speed if its not active.
yes he did
@@L4ftyOne You know how?
@@kidShibuya It's a BIOS/UEFI setting on supported hardware. The CPU, motherboard and GPU must all support the feature in order to use Resizable BAR.
Dawid did not mention anything he changed in the bios to enable resizable bar, so it's possible he didn't have it on
@@konrad999999 12th gen *should* have it enabled by default so if he didn't change anything then it should be enabled
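For anyone wanting to verify rather than guess, here's a rough sketch of checking Resizable BAR status on Linux (assumes lspci from pciutils; the exact capability text varies between lspci versions, and "8086" is Intel's PCI vendor ID):

```shell
# Rough sketch: check whether Resizable BAR is active for an Intel GPU on Linux.
# Needs root for the full capability dump; output format varies by lspci version.
sudo lspci -vvv -d 8086: 2>/dev/null | grep -B1 -A4 "Resizable BAR"
```

If it's enabled, the BAR line should report a "current size" matching the card's full VRAM (e.g. 16GB on the A770) rather than 256MB. On Windows, GPU-Z and Intel's Arc Control both show a Resizable BAR status field.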
If the game has an option for Vulkan, try that, as it's better supported than DirectX at the moment. People have been using a DX11-to-Vulkan wrapper and getting large performance gains.
yeah dxvk
So Intel arc GPU benefitted most from the work done on Steam deck and Linux gaming, everyone wins in open source
To be honest you also get performance gains with DirectX to Vulkan translation on other GPUs as well. I think there are still 2 years old benchmarks, showing that you can get around 20% more performance in World of Warcraft for example... even comparing DX12 with DXVK.
I just run popos with vulkan. I started using it a little more than a year ago and it was barely supported, and now every game supports it, and every game is on proton
as a linux user it sounds so strange hearing about dxvk on windows XD, some of my friends who run windows did get huge performance gains in games that use old dx versions
I really want these Intel graphics cards to do well, having another competitor for Nvidia's shenanigans would be good. And these cards don't seem to be far off, with better drivers and optimizations, the second or third generation could be very compelling.
At least pre-COVID, GPUs were intended to get the price drop they had, but scalpers ruined it. I still agree tho.
Well, Intel's doing pretty well considering they're just starting out. Think about it: AMD bought ATI, while Intel is starting from scratch, and they're already able to not just game but also do ray tracing to some degree. It's rather impressive when you think about it.
Everyone wants these cards to do well. Only idiot fanboys don't. Competition is good for everyone.
@@crylune I wish they were out sooner but better late than never🤣
@@raven4k998 That's not a fair comparison, considering Intel hired a designer from AMD for the Intel GPU. Along with that, it's not fair to compare Intel or AMD to NVIDIA on ray tracing, because the DXR code is programmed for NVIDIA, requiring very complex drivers to compensate. This will not be the case with games developed on real DirectX 12 engines like Unreal Engine 5, rather than some ghetto-rigged DirectX 11 engine with the "DirectX 12 feature" added non-natively. The major purpose of DirectX 12 was to eliminate all this "driver" nonsense and simplify the process by natively supporting software and hardware RT within the engine code itself. NVIDIA will still have to deal with drivers for every game because the code has to travel outside the engine's workspace to utilize dedicated cores. AMD and Intel have the advantage of letting developers utilize them easily because the allocated cores are within the same silicon. There is also the fact that AMD's shader cores have become so fast and efficient that software RT like Lumen can be handled without the need for dedicated hardware. The difference between Lumen and dedicated ray tracing has recently been shown to be minor, and it will only get better. NVIDIA still doesn't hold a candle to AMD's pure shader-core performance.
As a few others have said, Intels cards really need ReBar. Its a decent card at the price, if they can sort out the drivers.
Right. Especially if you acknowledge that this is the 1st generation with 1st gen drivers!!!
Wtf is rebar?
@@raptorhacker599 Resizable BAR; basically it lets the CPU access all of the GPU's memory at once.
G'day Steve, at $699 AUD the A770 16GB isn't priced very well down here in Australia, as you can get an RX 6700 XT or 3060 Ti for the same $$$
@@raptorhacker599 Steel bars used to reinforce concrete. :P
All the stuttering in these games makes it look like ReBAR is disabled. Arc really likes ReBAR; it's a setting in the BIOS that really needs to be enabled.
Very likely. I've seen tests of this card and it was looking like a competent 1440p card in new titles, so performance like this at 1080p looks strange.
@@SirSleepwalker intel gpus seem weird. They perform better at higher resolutions for some reason.
@@Derpynewb I saw Digital Foundry's review of the Arc GPUs, and ray tracing seems a lot better in their testing; they did it at both 1080p and 1440p. I guess Intel cards are a little gimmicky right now.
it was on, he clarified it
@@Derpynewb Well, it's built from scratch and new. They are well optimized for newer tech.
Bottom line: Intel succeeded in bringing a mid-tier-performing GPU to market at (somewhat) reasonable pricing, something that neither Nvidia nor AMD really does anymore. Some of us are still on the Nvidia 10 series, and are just excited to see more variety entering the second-hand market, and competition on price efficiency from here.
The 6600 XT performs about on par without RT, and it's a lot cheaper, so?
This card costs about the same as a 3060 and performs like one, just with more issues, so I don't find it good value: performance isn't better, price isn't better, and driver quality is worse.
Nothing else to be said really.
@@ole-martinbroz8590 They do compete a lot better in professional applications. Their encoder is insane, and they seem to perform better in professional rendering than game rendering. They don't have to be the best in their first generation, so long as they have a market. Though I will say, I wasn't aware of just how little difference there was between AMD's low-end stuff. I'm happy with the card I have for the next few years, so I didn't do that much research.
They've already laid off the graphics division I'm afraid.
Man, you guys claim to know so much and can't even look at Newegg for 5 seconds to see that even RX 6700 XTs can now be found at $350...
@@WaterZer0 Wait, actually? So they've given up after the first generation? They can't have expected to profit on the first go, that's insane.
I checked out my local MicroCenter and they had one Arc GPU, an Arc 750. On the web page showing Intel GPU's (only the one) was this line:
"When it comes to purchasing a GPU, the first decision you need to make is between an NVIDIA graphics card or an AMD graphics card."
tbf, as funny as that is, it's more likely they haven't yet gotten around to updating their site templates than it being meant as a jab. These "blurbs" above product categories (like GPUs, even for specific brands) are part of the template for the product category, and it's not often you get a new contender in a space like GPUs, so this isn't necessarily something you specifically check, or even something with high priority.
Living anywhere near a microcenter is a blessing…
In future laptops you will only need to choose either an Intel CPU/GPU or an AMD CPU/GPU.
@@kobolds638 hopefully nvidia goes the way of voodoo then
Other reviews had Intel somehow more competitive at 1440p. Like 1/2 the performance at 1080, but 3/4 the performance at 1440.
The Intel A7XX cards do seem to be more optimized for 1440p. These results probably would have been more interesting at that resolution.
It's because of the bus speed, which I believe is better than even the 3080's, so it does really well at not dropping as much fps at 1440p.
I really hope these cards continue to get better with driver updates. Nvidia/AMD need serious competition!
You wish, but you won't buy one, will you?
@@franciscocornelio5776 Actually, considering their price vs. similarly specced cards from AMD and Nvidia, I would buy one if it gets close to its competitors. If I were buying a GPU that's $50 cheaper than Nvidia's but only about 10 fps slower, that's a really good product.
@@KiotheCloud I only want one for Minecraft
@@SMCwasTaken If you want a graphics card for Minecraft you aren't going to need much. Just get a low-profile card.
I want Intel to keep creating GPUs and increasing support for older games. I might not buy one of their GPUs this generation but I definitely would consider buying their 2nd or 3rd generation if they continue. For a first attempt at a GPU, everyone seems to expect them to be perfect. As a computer engineer myself I just want to say that this is hella impressive that they got the game support and performance that they did for their first attempt! I’m excited to see them continue in this space.
Yes, but rather than being impressive from an engineering standpoint, it needs to be actually much better than AMD/Nvidia cards at the same price point in practical day-to-day use to get reasonable sales, since it doesn't have any brand recognition. I don't see that coming, so let's hope Intel's plan for the 1st gen is to be a sort of beta test, with only the 2nd gen and above being a commercial success.
The fps that you saw in DX9/DX10 games was a result of the card not having native drivers for DX9, DX10, and I think DX11 too.
Intel focused on creating DX12 and Vulkan drivers first.
You are still able to play those games because they added a translation layer that translates from DX9/10/11 to DX12.
Translating is also somewhat CPU-intensive, and that can be seen in the graphs.
What I personally would do is use DXVK instead of translating to DX12,
because DXVK seems to be more mature thanks to its wide use on Linux.
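For anyone curious what "using DXVK" actually involves on Windows: you extract a DXVK release and drop its DLLs next to the game's executable, so the game loads them instead of the system Direct3D libraries. A rough sketch, where both directory paths are hypothetical placeholders for your own install locations:

```shell
# Rough sketch: installing DXVK's D3D-to-Vulkan DLLs into a game folder.
# Both paths are hypothetical placeholders; get the real DLLs from a DXVK
# release archive (x64 subfolder for 64-bit games, x32 for 32-bit).
GAME_DIR="./GTAV"
DXVK_DIR="./dxvk-2.x/x64"

mkdir -p "$GAME_DIR"
for dll in d3d9.dll d3d10core.dll d3d11.dll dxgi.dll; do
  # Copy whichever translation DLLs the release ships; the game picks up
  # any that sit next to its .exe in place of the system D3D libraries.
  [ -f "$DXVK_DIR/$dll" ] && cp "$DXVK_DIR/$dll" "$GAME_DIR/" || true
done
```

To undo it, you just delete the copied DLLs from the game folder.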
ARC GPUS are smexy
True
now that's a word I haven't seen in ages
ARC GPUS are smelly.
@@Gatorade69 you're smelly
@@Gatorade69 tbh I'd rather take smell over an extreme "gAmEr" look a little green company uses.
Love how the A770 is the price the top end Nvidia cards should be. Good move there intel.
Yeh from the early 2000s, I miss those days.
@@michaelbrindley4363 - Me too Michael, when the quality of games was great and it didn't cost the earth. Now we've got remakes generally, poor innovation and GPU pricing which is a joke.
But then doesn't that mean the Intel gpu is overpriced? Considering the performance doesn't match the top nvidia gpus
@@psychologicalFudge I think he means that it competes with 3060 and is a much better deal
@auritro3903 I mean, the 3060 is 40%+ more powerful, so idk if it really competes with that. That's not even counting DLSS. Either way, OP did say high-end, so I assumed he meant the 40 series... which are even more powerful.
I started watching when you had 5k subs. Now within days you will reach 500k. Congrats!!
I really hope this architecture takes off and matures well. My next PC might be all Intel!
Finally a Dawid video where I actually own the product. So far I've been loving mine. It absolutely crushes AV1 and video transcoding, and it renders quite well too.
I've been wondering how well it handles video rendering for live streaming. Thank you for your comment!
All the big tech reviewers only review GPUs for their gaming performance. I wish they would cover video rendering too
@@tvHTHtv For streaming it is more than enough. Right now mine is used to encode an 800p 60fps twitch stream, and record the gameplay and my webcam at 1600p and 1080p respectively. It seems to do quite well for that in OBS.
@@tvHTHtv Are you familiar with EposVox? His video, "Intel GPUs are NOT what anyone expected" covers video performance starting at 7:56.
I agree. I used my A770 to stream 1080p60 for a RUclips stream test while playing Scum the other day and it worked great. A few friends said the video quality looked excellent. The only problem I had was that I think I picked the wrong audio settings, so it was only picking up my mic audio. My fault.
I like that Intel has entered the GPU market, because most if not all cards are built with NVIDIA or AMD chips. Looks like they need to sort out things overall, but it's a promising start. Intel also has great packaging; in 2016 or so they used an IR soundboard to play the Intel jingle every time you opened a CPU box, just like those musical birthday cards. I cut them out and put them in unsuspecting places such as a cabinet or by a door; walk into the room and you'd hear the jingle.
Intel has a jangle?
Haha, the first thing I thought of when I saw the naming of the ARC GPUs was Airbus too! If they can optimize the drivers better then it might be a great buy.
Fr, same. Especially the a380 one.
I actually got to build a system with an A770, and honestly in the stability testing I did before handing it off to the customer, it was surprisingly good in the RT department. Optimization was horrible requiring a ton of tweaking and work arounds, but the results were actually worth it.
Properly dialed in, it was about even with a 3060, maybe a 3060 Ti in some titles, from a straight FPS standpoint. From an RT standpoint though? It was closer to a 3070 when properly dialed in. For a first gen? That's not bad, but you have to like tinkering if you're going to buy this one in its current state. It has a lot of the "teething issues" Radeon had early on, but much better architecture, so it looks promising if they can get the optimization side of it handled.
i saw the ray tracing videos a few months back; it even did as well as the 3060 Ti in some cases
On the linux drivers there's a few tests where it hit *3080* levels. Currently outliers, but a sign of how far these might stretch with heavy driver optimizing.
@@XiaOmegaX that would make them incredible for blender rendering workloads since they rely so heavily on ray tracing now.
I'm sure you've read enough comments about DXVK and 'Rebar' at this point... I've been watching a number of videos about these cards, and yes, the driver situation needs help too... But I've seen better, much better, performance from these cards. I'm very happy to see competition to Nvidia/Radeon - which feels really odd to say since it's Intel... LOL.
Oh no, Intel is let down by the performance. They wanted more, but at least they priced it appropriately for the market. Still, this first gen is putting the whole project in tepid waters.
2:00 Yeah, I know you were attempting to be funny, but that cardboard is saying that YOU are authorized to use the logo (via the sticker) on your computer as long it has that graphics card installed, not that Intel is authorized to use their own logo (because OF COURSE they are authorized to use their own logo).
Wait. Did you swap back & forth between the 770 and a 380? I thought you made a verbal oopsie but the Airbus pic also had 380 on it.
I picked up the A380 for my HTPC and it is an AV1-compressing Blu-ray beast as well. No crushed blacks, and it fully encodes the 1-2 gig AV1 copy in around 10-18 minutes. Going to check out its HDR performance soon.
Did you use ReBAR btw? HUGE impact. Pretty sure Petersen even said to buy a 3060 if you don't have ReBAR.
Did you remember to enable reBAR?
The performance is crippled without it.
Great as always. I look forward to new videos with that Intel also-ran. I hope they stay in the GPU game and offer a real 3rd option.
The great Control thumbnail is back!
I couldn't resist that click bait! 😄
Random video idea:
Buy one of those screens that they sell on Amazon or other sites, and play games on it solely for a week or longer.
These screens are the sizes made for cars or other devices, cut from the material left over after larger screens are made. It is my understanding that a screen is made as a larger surface and then cut to size, so any leftovers can be manufactured into displays, or they are waste. It sounds like it could be a cool video.
please do an SLI Arc build - the fun part is finding compatible hardware ;)
Also I've heard DXVK (DirectX over Vulkan) development for the Arc series is going pretty well.
Did he turn on ReBAR or not? I couldn't find it if he did. I keep hearing that it needs to be turned on for better performance.
Holy shit, this thing's potential is insane. If they can work out the driver problems... damn, I might grab one myself later.
Yeah... It's actually reasonably priced as well
If ReBAR was not being used, all of these benchmarks mean nothing.
It doesn't look like it was. To be honest, this is a fantastic card with it on, but a surprising number of people don't know it needs to be used.
12th gen should be turning that on by default
He said it was enabled in a post on one of these comments
It was impressive how the box had a blue glow as he opened it!
That wasn't the box.
Did u enable ReBAR? It's an absolute necessity for these Arc GPUs to work properly, as dumb as that is.
I have high hopes for Intel's GPUs. I hope they get them together; we need some more competition in the GPU space.
Honestly thinking of upgrading to this from my 3060 once the drivers get good. But I don't know if that's really an upgrade?
Complete lateral move. Not sure why you'd even consider that
Someone should make a dedicated ray tracing card. It would go in the second PCIe slot along with the GPU and would let you max out ray tracing without adding even a single percent of GPU usage, meaning even people with lower-end cards that don't have RT capabilities would get to experience ray tracing. Also, a dedicated ray tracing card would allow developers to go crazy with ray tracing.
There might be PCI-E limitations regarding bandwidth. RT cores are integrated into the GPU so the bandwidth is high and latency is low.
Like a photonic chip card. Then ray tracing would be instant.
Bad idea. RT cards are already basically doing that, because they have dedicated cores for it, so it's not hurting shader performance anyway. All you're doing is adding latency and complication while reducing bandwidth.
soon there would be a collaboration between Intel and Airbus, and soon Airbus will also compete in the GPU scene lol
DX12 Battlefield probably stutters because it's compiling shaders. If you had kept playing that map it probably would have evened out after a bit.
That’s what I was thinking too. But after 15 minutes it was still stuttering like that.
It seems like this is a really great mid-level GPU that just lets you use raytracing with it, whereas most mid-level GPUs would not take it as well. Very interesting!
The stutters could be a result of not enabling resizable bar?
"It's like living with a very smart drug addict"
Sir you earned yourself a Like
Sometimes you get like 40 to 60 fps but it feels like when I used to have 10; too much stutter. Aside from that, very decent fps in all games.
What is the difference in RT settings on this card compared to a real RTX card, if there is any?
Is it just branding, or do the extra RT cores do something?
I'm confused by the fact that you turn on ray tracing and keep the same fps.
What do you think about its price to performance ratio 🤔
Did you turn on Resizable BAR? The A770 requires it to be on to run well.
It's funny that ray tracing is so unimportant that we often really struggle to tell the difference, except in games where the rasterised lighting is super basic (Quake RTX, Minecraft, etc.), and we have to rely on looking at the frame rate.
If you can't tell the difference between ray tracing on and off you really need to check your eyes :)
just out of curiosity - did you make sure resizable bar was enabled?
Remember PhysX cards? I think they should make RT cards as an option. I think that would be awesome.
It's odd that DX12 performance was not great. Was Resizable BAR enabled in the BIOS?
I know you won't do it, but... I would love to see someone finally testing that card on Linux. With all due respect, it might be the same as with Tiger Lake (G5/G7 APU), where it runs much better with the Linux Intel drivers than on Windows.
*cat eyes stare* pweety pweeseee
I've heard it doesn't have driver support for Ubuntu yet
@@dillonnnnnn Not really, there actually is support.
Kernel 6.0 has the drivers implemented officially by Intel.
BUT
Mesa3D has already updated their packages with the Intel drivers, so really, even if you take, for example, Mint with kernel 5.15, you will still run that GPU without any issue, and if you use Mesa you'll have everything updated to the latest no matter what :D
proper neckbeard linux.
@@ahmetrefikeryilmaz4432 mhm, ok xd
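If anyone does try the card on Linux, a quick sanity check that the kernel and Mesa pieces are in place might look like this (assumes pciutils and mesa-utils are installed; the exact driver and renderer strings vary by distro and kernel):

```shell
# Rough sketch: sanity-checking Arc support on a Linux box.
uname -r                                # kernel version; Arc support landed upstream in 6.0
lspci -k | grep -A 3 -E 'VGA|Display'   # shows "Kernel driver in use:" for the GPU (expect i915)
glxinfo -B | grep 'OpenGL renderer'     # Mesa renderer string for the active GPU (needs mesa-utils)
```

If the kernel driver line shows the card unbound, or the renderer string falls back to llvmpipe (software rendering), the driver stack isn't picking up the Arc card yet.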
I have the same card running 4K resolution at 45-55 fps. I’ve never seen something so beautiful on my monitor
The only reason I wanna get an ARC A770 is because of the looks
I want it for the Glue!
@@stephanhart9941 i dont get it, can you explain
@@Zionn69 Gamers Nexus, who reviewed one of the models, said the PCB was tacked with so many thermal pads they were acting like glue.
@@Zionn69 check out the Gamers Nexus teardown. There are some questionable assembly methods employed with the reference A770/750, including glue. Not great for maintenance or reassembly.
There's some tape on the backplate
I'm sure if you give the drivers for that GPU a few solid revisions, most of that stutter will get worked out...
Do you even ray trace bro?
Great video, but to be honest, if you can't tell when or whether ray tracing has been activated by a settings change, then it's a moot point and just a way to have less fps.
I've been leery of considering an Intel GPU for my system, but Dawid's tests make it look like a decent option... especially for the price. Ngl... I don't hate it! Maybe someday when my EVGA FTW3 3060 Ti Ultra kicks the bucket, I'll consider replacing it with an Intel Arc, because NVIDIA flat out said they are going to price gouge, and AMD will probably be jealous of NVIDIA's revenues so they'll start raising prices too.
It's also worth noting that these are the 1st gen of Intel cards... I would like to think they'll improve in the future.
Quick question - did you activate ReBAR? Intel Arc needs this for stable performance.
Any chance of doing a video using this for video editing/rendering? It would be interesting to see how it compares to the competition.
My A770 16GB does pretty well for DaVinci Resolve exports to AV1. My only comparison is an RX 6800M laptop, which is understandably slower.
@@DigitalJedi Yeah, I think LTT did a video about actually getting the lowest end Intel arc to go in a system secondary to NV/AMD cards just for its AV1 encoder as AV1 will be the new standard across the board as it gives much higher quality output for same bitrate
@@aaronjones4529 I only went with the A770 for the Vram. Being able to load up huge chunks of a file or the entire file at once makes things go really fast.
@@DigitalJedi Oh nice, it loads into VRAM and doesn't just stream from the SSD... Clearly I'm not knowledgeable about video editing; nice to learn a second new thing today.
@@aaronjones4529 Some renderers will load up the vram and others just stream from storage. Mine loads into system memory first and then the GPU copies whatever it's working on over.
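For anyone wanting to try the AV1 encoder outside of Resolve or OBS, a hardware encode on Arc through FFmpeg's Quick Sync (QSV) path looks roughly like this. The filenames are placeholders, and it assumes an FFmpeg build with the av1_qsv encoder enabled and an Arc GPU present:

```shell
# Rough sketch: hardware AV1 encode on an Intel Arc card via Quick Sync.
# input.mp4 / output.mkv are placeholder filenames; -c:a copy passes the
# audio through untouched while the video is re-encoded to AV1 in hardware.
ffmpeg -y -i input.mp4 \
  -c:v av1_qsv -preset medium -b:v 8M \
  -c:a copy \
  output.mkv
```

Running `ffmpeg -encoders | grep qsv` first is a quick way to confirm your build actually ships the QSV encoders.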
"That is an odd smell for a graphics card"
You know you're in for a thorough review of this card when you even get a report on its olfactory status 😅
It makes sense that it has ray tracing performance advantages, considering Intel's GPU attempt 12 years ago: Larrabee was a ray tracing GPU and ran fully ray-traced games at 60fps at 1080p back then, which is mind-blowing for something so old. It makes sense they would reuse some of that technology in their second production GPU launch.
Despite all the issues this card has, it looks awesome 😄 I'm more of an aesthetics guy than a performance guy 😅
It's too bad if you look at a teardown video and see that it's held together with glue and tape. Literally. Sacrificing practical parts of the design, like it being easy to work on, for aesthetics is a terrible practice for a company. Even if most of your consumers won't tear it apart, there is quite often a need to do so.
I agree 100% the card is so pretty
Looks are superficial.
@@MasterN64 Uh, yeah, that solid, cast base plate is just for show. Get real. They taped some wires out of the way and used some adhesive to secure the back cover. It is not "literally" held together with glue and tape. If anything, the card is overengineered.
@@wargamingrefugee9065 Look, guy, every single major manufacturer of cards avoids tape and glue at practically all costs, because they are an absolute nightmare to deal with and put back together in a way that looks decent. The first step to taking it apart is to peel off the big sheet of flimsy metal on the back of the card, held there with double-sided tape, to get access to the screws. A metal sheet that you will -never- get back on in a way that looks decent again. If you want to understand why, just watch the Gamers Nexus teardown of the card and why they say so.
Haha, had to laugh every time you called it an airbus! :)
Did you have resizable Bar on? It is a must for the Intel GPU's otherwise they are extremely unstable.
Probably explains the stutter if you didn't check the bios to make sure it was on.
Usually the threaded holes at the back of a GPU are there to mount a bracket that holds the GPU in servers.
Really? Well you learn something new every day
Hmmm....curious. I'm running the A770 on a 9700k with 16GB DDR4 4000 and I didn't experience any such issues.
For instance, your GTA V framerate was what mine does in 4k. Idk what could be causing that.
I'm seeing the same thing with mine, except I'm running a 12900KS. My A770 framerates are pretty much double what he's getting.
Why is Dawid actually "ignoring" the poor frametimes?
At 08:04, which software did you download? "DirectX Properties"? The official thing from Microsoft?
"Non-saggy mounter thing" - ahh, you've seen my Tinder profile
YES! FINALLY! Someone mentions the Airbus naming scheme. Intel has so clearly lifted it from Airbus that I honestly cannot believe nobody has mentioned it yet😁
Dawid, are you sure the GPU was in use (and not the one on the CPU)? That external GPU setting might have needed to be set. I struggled with Fallout 3 in Epic the other day and had to force the game to use the external GPU in Windows' settings.
What were the specs for the test bench?
I was really looking for it since we do need to rule out any possible bottlenecks during testing.
2:47 IKR, I have one of those old cards lying around somewhere, and my first thought when I saw that Arc GPU was: oh wow, it looks a lot like that EVGA card.
Tough day, really needed some Dawid snark to bring up my mood.
I did not leave disappointed.
I saw about half a dozen Intel Arc's for sale at Memory Express at their Broadway location here in Vancouver. They have the A770 and A750
I hear you say A380 a few times....but you have an A770 installed...
And by the way, the Intel Arc performs better at 1440p than at 1080p.
@El Cactuar you want a cookie now?
So when it runs well, it runs *very* well! Thanks for the review! Great video
Great video as usual. We are located in Ontario, and actually getting hands-on with an A770 was just luck. I had been calling Canada Computers and Memory Express regularly, but neither could confirm when they would have stock. Memory Express took a pre-order for me on an A750 but said they could not do it for an A770. The Canada Computers site showed stock of the A750 but was still "Sold Out" of the A770; it wasn't until I actually checked the per-store stock checker, as you displayed, that it showed an A770 at 3 of the Ontario locations. My pre-order at MemEx still stands, but they haven't received any stock.
Wow, the raytracing on the a770 in quake 2 pulls twice the fps that my 6700xt does on the same settings 😅
RandomGaminginHD did a video showing that Vulkan is better with the Intel GPUs in a fair few cases, especially GTA 5. You should do a video checking it out too; it's called DXVK.
Wonder how this would compare to the mobile 3060, given the power draw of this card.
Ahhh Dawids classic Control thumbnail. You know he is doing a GPU test.
So..... Intel built a $350 RTX 2060-class GPU in 2022? OK. Not bad, not great.
Sad to realize all the computer nerds don't know that when Dawid says A380 he means the airplane, not the card.
How did you get raytracing to show up in Control? It was completely greyed out for me when launching DX12
As an avgeek and a tech geek, I approve of your jokes about the A380 and the GPU plane lol
Also, the term "GPU" exists in aviation too, meaning "Ground Power Unit"
dang, this is one of the best ARC reviews I've seen. Kind of makes me want one.
I'm not sure about setting the game to medium to give room for ray tracing; it's like putting the cart before the horse. But I'd love to know what you guys think about this.
There is one Dawid sentence in this video that pretty much sums up the ray tracing: 'I think it looks different'
I love that Ray Tracing was subtitled as rage racing at 0:09 =)
seems like a lot of this stuff is driver/compatibility related, so it sorta makes sense. Plus at that pricepoint it's definitely an impressive card
5:31 Rockstar finally fixed that; no stutters now, but when you reach 189 fps the audio will desync :(
Crysis Remastered actually uses software raytracing unless you have an RTX card. You can try to force it on with an autoexec.cfg in the base directory, with the following:
con_restricted=0
r_AllowNonRTXHWRT = 1
You can check if it's toggled in game by typing r_AllowNonRTXHWRT in the console in game, but you would have to do testing to see if it actually does anything. Hard to tell on my 6700xt.
Control DOES prompt for DX11 or DX12 on a "proper" setup. I just started it up literally yesterday to test how well my new 3060 Ti would do at 4K with ray tracing lol. (For the curious: not well. Averaged around 10 FPS. I don't understand why people laugh at CP77 when that runs better on my system than Control lol.)
What software is he using for testing? I looked in earlier videos but never heard it mentioned. Thanks in advance.
GPU: I fear no man
Dawid's nephew: But that thing it scares me.
yoooo dawid, you probably won't see this, but I live in the UK and by pre-ordering 2 days before I was able to instantly get an A380.