Missed the stream, but I'm glad that Minecraft was not only suggested but that the Java version was tested. Performance on the Arc GPUs, which work best with DirectX 12, was a big concern for me, so it's good to see that it's not only playable but gets decent framerates (admittedly vanilla, not modded).
Amazed that it seems to have decent OpenGL performance. That's something AMD has only begun to fix over the last couple of months on their GPUs. So the fact that Intel isn't horrible there is actually pretty promising.
A770 is looking absolutely fantastic at its price point, unless the retailers hike it up by 100%. Better RT than the RX6000 series is a huge plus, but I would only get the 16GB edition myself and it's going to be a limited edition. 😞
Having Hairworks on makes that an invalid test; it's an NVIDIA-specific technology tuned to run well only on their own GPUs, so on systems without an NVIDIA GPU it runs pretty badly because it's only really accelerated there.
🤔 In The Witcher 3 (45:00) there is a scene where we escape Kaer Morhen by horse, with a bunch of enemies, people fighting, and fireballs raining from the sky. At the time my GTX 860M couldn't handle that even at 720p; the sound started cracking and FPS dropped to 10 😅 ... I am very keen to see how the Arc A770 handles it.
An interesting case idea would be a flat design that can be rack mounted. Removable rack-mount frame, and glass top for visuals if left on a desk. Airflow is very easy with rack-mount cases, and the ability to use on desk and in rack would accommodate everyone's needs.
There's some sort of gamma/contrast issue on the Arc, but seeing the Witcher 3 result reminds me of Nvidia having driver defaults that gave them an advantage in the past. The creases on Geralt's clothes don't get any shadowing on the Nvidia card, but on Intel they are nicely shaded. I'm also not sure Hairworks with 64x tessellation is viable on Ampere cards, seeing as it's a much more compute-oriented architecture (Kepler had a bit of an advantage there, and it was used as a point to bash AMD's GCN cards, which were bad at tessellation). Maybe the Nvidia driver default caps it to a lower amount that overrides the game setting, so it's probably worth forcing the max tessellation value with Nvidia Inspector.
Would be interesting to see if there is a performance drop with AMD processors. They should build a comparable AMD system with the Intel card, and one all-Intel system, just to see what it brings.
Riley saying "this is irresponsible" about aiming at the hostages and not wanting to shoot the guard in GTA V is just 👌 1:49:16 I feel you Riley, I feel you :((
It would be really interesting to see this card performing in SteamOS or another Linux distro. There's basically no native DirectX there, so it would be nice to see how this thing performs.
I was actually expecting a Star Citizen comparison, given the fact that it's a really hard game to run and really CPU demanding. I'm disappointed, honestly.
@@PaulDillinger Where did they play it? I didn't see it, and 90% of games are trash AF IMO, so I didn't want to watch hours just for the 5 mins of Star Citizen.
@@NEOgeek402 They didn't. That's why I said it. They even said "We're going to do it later", but never did. Riley mentioned at one point "We don't know what state Star Citizen is in right now, so I don't know", but then again, I am barely able to play Star Citizen with a 3080 and a 10700K, so I was really looking forward to seeing if an Intel GPU would get better results, or at least "stable" ones.
The way Linus explains the 4K-paywall-thingy absolutely makes sense. Now I don't know anything about the matter and I don't care too much but from what he's saying I think it's the creators who should pay for 4K uploads, not the viewers who want to watch in 4K. After all, the creators use youtube to make money so they should be the ones paying. Just like vimeo makes the uploader pay for premium, not the viewer.
Honestly… I'm surprised with how well the Arc did in these tests. I'm even more impressed with how a processor company like Intel has done this well 🤯
Disappointed that they never tried Star Citizen, even though Linus promised it. It would have been interesting to see whether the Arc could handle it, seeing that Star Citizen runs on one of the most advanced, constantly updated engines in gaming right now. The Arc excels in new engines vs old engines, so this would have been an interesting test, and though they played many games, they never hit the Star Citizen genre; it's a category all its own, and very important to the kind of people who are likely to buy an Arc GPU to test and troubleshoot with, to help get a new line of products off the ground, as Linus begged people to do in his previous video. The kind of people who would buy an Arc GPU for that purpose are exactly the kind of people who play Star Citizen in its current state. With respect, someone who is worried about maximum frames in Rocket League so they can perform at their e-sports best is not likely to be interested in testing and troubleshooting a new video card lineup on the vague hope that it becomes something good in the future, and thus is unlikely to be considering an Arc GPU in the first place...
Yep, one of the reasons I play Star Citizen is I don’t mind being a ‘tester’-type individual. I love finding bugs/reporting them and working to get things running smoothly. I’d’ve grabbed one of these ARC GPUs if they’d come out ~4-5 months ago (even at $400-$500).
Advanced engines? Is this satire? They are running on CryEngine (I believe v3), from 2009, now Lumberyard; the same engine with the addition of AWS features. It is a heavily modified engine, yes... but let's not pretend it's the most advanced. I suggest you look at some UE5 stuff. CE/LY is notorious for being difficult to use. That said, yes... I wish they had tested it as well, but I'm not about to tell everyone it's using the most advanced engine ("Star Engine"). It's not, and that is why CIG is struggling to get it working how they want; otherwise we would already see more progress.
Slight error when you guys talk about the translation layer at 32:38. It converts DX9 and DX10, and I believe it has native DX11 support. But it converts those calls into Vulkan, as it uses DXVK, which is how Linux gamers enjoy gaming.
Getting ready to do my first build in about 10 years. As someone who doesn’t know a lot about all this stuff it seems a safer option to pay for Nvidia. The price difference would have to be a lot more to try this.
The question is, why the RTX 3060? Ray tracing sucks on it as well. An RX 6600 XT would be cheaper and has better rasterization performance than the RTX 3060. Or, for the same price, the RX 6700 XT, which has about the same ray tracing performance but will run circles around an RTX 3060 in older games.
@@DravenCanter In my market the 6650 XT costs around 380 USD, or 50 USD more than a 6600 XT, and is closer in price to a 6700 XT, which is cheaper than a 3060 Ti (cheapest 3060 Ti: 460 USD, cheapest 6700 XT: 420 USD). That's why I didn't have the RX 6650 XT on the radar.
Alex complaining about the in-game physics and momentum in The Witcher got a bit of a chuckle out of me. The added half step makes it a bit more immersive, given that IRL you don't dead-stop out of a sprint, and if you walk at a fast trot you likely slow yourself with a half step too.
Linus talking about arc being worse in Satisfactory, right as Riley's testing shows that Arc might actually be just as good, if not better. Arc was around 100-120 in the intro/drop, then up to 160-200 once it landed and loaded the area more. Very impressed with how much better Arc is managing now
Due to the huge difference in timezones I missed it, but I wanted to request one thing, and maybe someone here will listen and check it out, as it might boost the Intel GPU's performance. You've checked only games with DirectX, but what about OpenGL? OpenGL is used by a lot of old titles, and some of them may really benefit from that change. Some Source titles can be run with OpenGL just by using the `-gl` parameter. Another thing is ReBAR. There are still people using hardware that does not support that feature. Should we even consider buying Arc? I'm really disappointed by the attitude of AMD and NVIDIA, so I wanted to "vote with my wallet", but should I even consider an Intel GPU without ReBAR capability?
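(For anyone who wants to try the `-gl` experiment themselves: in Steam it's right-click the game -> Properties -> General -> Launch Options, and add `-gl` there. No guarantee every Source title still honors that flag on Windows, so treat it as exactly that, an experiment.)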
If your system doesn't support ReBAR or the AMD equivalent, don't buy Arc; even their own developers have said the same thing. Get something like the 6600 XT if you're looking for something in that price range.
It would be great if those power banks in the trunk could charge your car battery when the system detects it's under 12.5 V, and charge it until full at 12.8 V. Just bump it to 14.5 V, wait for 10 h, and test again whether it stays above 12.5 V. You could forget about manually charging your lead-acid battery with this kind of system (normally it should be charged twice a year, before and after the winter season), and more than 10 years of battery life would not be any problem :)
Yo, so I wanted to shed some light on why ER is locked at 60. The main reason is that there are certain actions in the game (rolling, for instance) that are directly tied to FPS, meaning the higher your FPS, the fewer "invincibility frames" you have, resulting in you still receiving damage even if you time your roll perfectly. Fallout 4 has a similar issue, resulting in lockpicking becoming impossible at high FPS. This is what my friend, who is pretty much an expert on Dark Souls and Elden Ring, said, so take it with a grain of salt, etc.
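To put rough numbers on that (purely illustrative, not the game's actual values): if invincibility is counted in rendered frames rather than in real time, a roll that grants, say, 12 invincibility frames covers 12/60 = 200 ms at 60 fps but only 12/120 = 100 ms at 120 fps, so the exact same input timing suddenly gets you hit. Locking the frame rate sidesteps that whole class of bug.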
Unfortunately the drivers are only available in the latest kernel, which isn't available in any LTS versions yet (as far as I know), so we'll have to wait a bit for it to be tested properly.
Does anybody else notice Linus' bossiness? He told Dan not to talk because he doesn't have a mic, but when Jake does it, he doesn't mind and engages in conversation without any concern that Jake's voice isn't audible. Strange 🤔
The cost of storing 4K video will be greatly reduced with the introduction of AV1 encoding to the platform, though... so his argument about that is kinda pointless. Maybe lock 8K behind a paywall; it's new and not a ton of videos are 8K, but it's an option for people who want to buy it. 4K is waaaay too commonplace nowadays to lock behind a paywall, especially for a gaming audience.
As a video encoding engineer: no, that's not the case at all. AV1 is a great standard, but thinking this will be a non-issue after it's mainstream is a comically bad take. Everybody outside the video coding field thought the same things about HEVC, but what you saw on the ground was companies working moderately hard just to get the same compression ratio at a given PSNR as AVC. It's not because it lacks the tooling, it has it, but using all these features of the new specs, especially in hardware-accelerated GPUs where you can only support certain features, and supporting them well enough that you're still working in real time while increasing your compression ratio? Not an easy nut to crack. AVC has had a decade+ to get the shit optimized out of it.

It's easy to say that these new standards like AV1 are going to revolutionize the industry when you have a clip cherry-picked for the test running in an AV1 reference encoder with all the bells, cranking away at 1 frame per 10 seconds; it's another thing entirely when you have to try and stuff all of those new tools into a fixed-function GPU pipeline. AV1 will have moderate improvements over VP9, probably practically larger than the gap between VP8 and VP9. But the idea that CDN costs aren't an issue after it's rolled out? No, not a chance. CDNs are stupid expensive to run, and the gap between VP9 and AV1 is not even a hint at lowering that.

The truth of the matter is that we've made BY FAR the largest gaps in lossy compression technology with MPEG-2 and AVC. Everything since then has been super incremental in comparison, and most of the foundation of what we do today boils down to optimizations on the same techniques; the main benefit is more flexible blocking for retaining small details in large frames. The underlying technology hasn't changed much. The other benefit has been better segmentation for multiple fixed-function pipelines on the same GPU, but that benefit is speed, not compression.

As for 4K being "way too common": you summarized it best in the same line, "especially for a gaming audience." The gaming audience for YT is a tiny percentage of consumption. There is no need to add it just to satisfy them. If they want it badly enough they'll pay for YT Red. And even among the gaming audience, I wouldn't be surprised if 50% of them didn't even select 4K manually and just left it on whatever YT gave them. I'm not a gamer, but the only time I check the quality at the bottom is if it looks like shit.

After I was part of the team that rolled out one of the standards above on the first cards that supported them, for one of the companies in the video, I worked at a start-up. The start-up didn't make video cards like my previous job, but we did still work in the video space and we had a small CDN for distributing video. The start-up failed, but with only a few hundred users the CDN costs for 4K video were astronomical. The cloud services charge a shit-ton for those transfers; they are *expensive*. It really did not take long before we were dropping tens of thousands on CDN costs per month. You kinda have to see the GCloud or AWS billing to appreciate how hard it sucks. Now granted, Google gets a great deal on this, but the prices aren't high for the rest of us because of price fixing; they're high because it's expensive for them too.

Anyway, sorry for the long-winded reply. Linus is right though, this makes sense: 4K and 8K are expensive as fuck for Google, and 97% of YouTube viewers likely will never notice.
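To put very rough numbers on the delivery side (purely illustrative figures, not YouTube's actual bitrates or prices): if a 4K stream goes out at ~16 Mbps and a 1080p stream at ~4 Mbps, one hour of watch time is roughly 7.2 GB vs 1.8 GB per viewer (16 Mbps x 3600 s / 8 ≈ 7.2 GB). At a hypothetical $0.05/GB egress rate that's ~$0.36 vs ~$0.09 per viewer-hour, and an incrementally better codec doesn't close a gap like that.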
AV1 will be a drop in the bucket for costs for YT.
@@MrSlowestD16 Nah, it's fine. Google isn't hurting for money; no need to gouge the consumer further. They literally have every possible data point about their customers up for sale. As for AV1, I don't think it's going to take that long to become decent. YouTube already uses it by default on some videos, and you can go into your account and ask it to prefer AV1 when available. And the quality remains the same while the bandwidth is reduced by like 40%. Also, the gaming audience for YouTube is actually their largest viewership next to music. Just because you don't watch gaming videos doesn't mean it isn't popular. It's literally one of the primary reasons for their growth over the last few years. Don't make shit up.
@@Bry.89 They're not hurting for money, but they have to justify it as a business decision. They're not a charity. They have to explain to their shareholders why they make decisions; there needs to be a clear-cut reason they can relay to them.

You misunderstood everything I wrote about AV1. It has nothing to do with "becoming decent" or "already in use": you are NOT getting the same quality with a 40% reduction in bandwidth between VP9 and AV1. That's a cute promotional slogan, but it's simply not true in normal test cases. The same claims were made about VP9 when it was new and it didn't deliver even a fraction of that, and the same claims were made about HEVC and it was the same deal. AV1 doesn't offer any groundbreaking technologies; it's an incremental improvement. The standard simply isn't that revolutionary. If you're getting a 40% reduction, I can guarantee you there's a lower PSNR to go with it. I've been working in encoder/decoder development for over a decade, I'm not making shit up here, lol. But claiming 40% improvements in general use cases is pure fantasy, I promise you this.

If the gaming audience is as big as you claim (and I don't think it is, but let's say you're right for a second), and they all really do require 4K, as you claim, then that's all the more reason for Google as a company to make this change. Those CDN costs are astronomical; that's a huge expenditure. They're very likely not (or barely) breaking even on that alone. YT Premium is a great place for them to make that up, as a business. Gotta get it out of your head that YT is a charity and realize how many hours of content they serve and how expensive that is.
Ya know, thinking about it, putting 4K behind a paywall actually does make a lot of sense... YouTube is barely profitable as it is. They should have done this from the start; it would have gone down a lot easier with the general public. But to be honest, consider what other alternatives YouTube might start taking, which can only really be worse...
For Control, when using Arc you have to go into Control's game files and run the Control DX12 .exe directly. Then Arc can run ray tracing. The A770 actually runs it great, like 90 fps maxed out at 1440p. And I'm using a Z370 Aorus Gaming 7 with ReBAR and a golden 8700K at 5.3 GHz core / 5 GHz ring.
Still really want to see concrete numbers from these cards in Cyberpunk. Seems like I'm not alone, and I'm a little curious why it wasn't done here and why there were no FPS figures displayed in your earlier video.
@@zZzabi Yeah, agreed. TBH, after Linus' last video I wanted to give the A770 a go as long as it worked decently in RL. Hopefully there will be some early adopters willing to give it a go.
HDR in Elden Ring on PC has been a mess since launch. When I first got the game, I had HDR enabled, but it wouldn't let me turn it off. Then I disabled HDR in Windows, and now Elden Ring will never let me re-enable it, no matter whether it's enabled or not in Windows. It's not surprising one system has HDR issues.
@@Varangian_af_Scaniae ya but ltt has people who write scripts so they don't need to be overly knowledgeable, besides it doesn't even need to be tech related personally, maybe something like CSF.
Riley, rendering engine isn't comparable to any "Apples to Apples" argument. Use the most stable one for your platform that is available. DLSS is "Fake (AI generated) frames" and RTX is "better lighting". Rendering engine is more analogous to "run this game in ARM emulated" or "run this game natively"
The previous video about Arc not really working well with DirectX 9 is the reason why I took mine out of my "pre-order" basket. The game I primarily play requires DirectX 9.0c support and will never be upgraded to DirectX 11 or higher. I would love to play with higher graphics, but not with more lag.
I mean, yes, it doesn't work well with DX9 titles, but it will surely get more than 144 FPS since it's an older title. Sure, if you're a top-tier professional in certain games you need like 240 FPS on a 240 Hz monitor, but it really doesn't matter whether you have 700 FPS or 150 FPS like 98% of the time, as long as the 1% and 0.1% lows are good.
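For anyone wondering what those "1% low" / "0.1% low" numbers actually are, here's a rough sketch of the usual convention (benchmark tools differ in the details, so treat this as illustrative rather than any one tool's exact formula):

    # Made-up sample frame times in milliseconds; the two ~40 ms frames are "stutters"
    frame_times_ms = [6.9, 7.1, 7.0, 35.0, 7.2, 6.8, 40.0, 7.0]

    def percentile_low_fps(frame_times_ms, fraction):
        # Average the slowest `fraction` of frames and convert that to FPS
        worst_first = sorted(frame_times_ms, reverse=True)
        n = max(1, int(len(worst_first) * fraction))
        avg_ms = sum(worst_first[:n]) / n
        return 1000.0 / avg_ms

    print(percentile_low_fps(frame_times_ms, 0.01))   # 1% low
    print(percentile_low_fps(frame_times_ms, 0.001))  # 0.1% low

The point being: average FPS can look great while those numbers tank, and that's exactly the stutter you feel.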
Well I'm not the Timestamp guy but I wanted to help a bit:
1:50 Elden Ring
14:25 Crysis
27:41 Cities: Skylines
33:31 Forza Horizon 5
37:13 The Witcher III
48:00 Team Fortress 2
59:50 Anno 1800 (lol, I'm Spanish so this sounds way too funny)
01:08:23 Rocket League
01:21:20 BeamNG
01:27:00 Satisfactory
01:36:00 GTA V
01:52:30 Skyrim
01:55:25 Half-Life 2
01:59:30 Halo: The Master Chief Collection
02:12:00 DOOM Eternal
02:23:22 PUBG
02:23:44 Valheim
02:33:11 Destiny 2
02:50:50 Minecraft
03:01:13 Descenders
03:05:30 Linus "dropping" some controversial news
03:10:44 Fallout 4
03:14:19 The guys just comparing sizes
03:15:14 Why is it so big? (that's what she..)
03:17:00 Still Fallout 4
03:20:05 Control
03:33:03 Battlefield V
03:46:53 They just start talking about the prices of the 6600 XT, 3060, and the Arc A770
03:47:39 The drivers can only improve, they can't get worse (u sure about that mate?)
And that's it.
mr timestamp guy would be proud
😂 thanks
Thanks man!
Forza *Horizon* 5
Minecraft*
they can't* get worse?
So helpful thanks
Major props to the team watching for things to mention.
Like anti-aliasing being set differently and mentioning if it was saving, etc.
Can’t wait for the time stamp guy to do this one xD
That's exactly what I was thinking. Lol! He may need to get approved for overtime.
i'm waiting too. XD
He should put a PayPal link at the end of that one, I'd send him a buck or two for it lol.
He should hire the timestamp guy. Missing timestamps are kind of a dealbreaker for me.
Minecraft at 9889
Linus: "We will play ANY game on Intel ARC for you."
Also Linus: "What are these games you guys are asking us to play? Play an Esport game like Dota or GTA V."
Lines like these make me love these videos.
TIMESTAMPS 😮
01:50 Elden Ring
12:35 Can it run CRYSIS????
27:44 Cities: Skylines
33:43 Forza 5
37:16 Witcher 3
46:50 Team Fortress 2
58:40 ANNO
1:08:27 Rocket League
1:19:20 BeamNG
1:26:45 Satisfactory
1:33:35 GTA V
1:52:20 Skyrim
1:55:22 Half Life 2
1:59:56 Halo: The Master Chief Collection
2:11:57 Doom Eternal
2:23:13 PUBG
2:23:35 Valheim
2:33:40 Destiny 2
2:50:43 Minecraft
3:01:45 Descenders
3:05:35 Linus’ anti-consumer speech
3:10:40 Fallout 4
3:20:05 Control
3:35:05 Battlefield V
Here’s the rest
TIMESTAMPS AND RATING
Elden ring (Good) 1:00-12:45
Crysis (Good) 21:45-27:45
Cities: Skylines (Meh) 31:00 - 33:30
Forza Horizon 5 (Arc Fail) 35:15 - 35:45
Witcher 3 (arc win w/o hair) 40:30 - 45:45
TF 2 (Good enough) 54:30 - 58:30
Anno 1800 (Bad for both) 1:03:45 -1:07:45
Rocket League (Bad Arc) 1:12:00 -1:19:15
BeamNG (playable, Riley bad) 1:21:30 - 1:24:00
Satisfactory (Good enough) 1:29:45 - 1:33:30
GTA V (fucky at best) 1:41:00 - 1:50:00
Refusing to do star citizen 1:50:00 -1:53:30
Skyrim (hits fps cap) 1:53:30 - 1:55:00
HL2 (good) 1:57:44 - 1:59:15
Halo MCC (chugs) 2:05:00 - 2:08:30
Doom Eternal (Okay) 2:16:00 - 2:20:15
PubG (Both crash) 2:23:00
Valheim (Better than 3060, then worse) 2:26:00 - 2:32:45
Destiny 2 (good) 2:38:00 - 2:45:00
Minecraft (Good) 2:50:00 - 3:00:00
Descenders (Unplayable) 3:02:00 - 3:05:30
Linus makes a statement 3:05:30
Fallout 4 (meh) 3:12:30
Control (better w/raytracing than 3060) 3:24:00
Battlefield V (struggle bus😅😅) 3:38:00
@@ntindle you barstard 😂
Y'all timestamp guys are killin it
@@Pax.RUclips you’re all timestamp guys are killing it?
you all timestamp guys are killing it
The fuck? 🤨
Absolute legend
From this testing it feels like Intel DOES have the raw power, but there's always SOMETHING that holds it back. Especially telling in Witcher 3.
Just needs drivers which just needs time.
Witcher 3 was a win for Arc though? Am I missing something?
@@TankSenior That's his point, Witcher 3 proves that Arc can be good but it's currently being held back by other issues (mostly drivers).
@@MistyKathrine Ah, I thought they meant there was *something* holding it back in Witcher 3 - was wondering what it was, because I saw no issues (except with Nvidia Hairworks, but that seemed very unnecessary).
I think they mentioned in their video about it that it only runs DirectX 12 natively; it just translates everything that isn't DirectX 12, which uses a lot of its resources.
Really good stream. Very casual, LTT folks listening to chat, just a chill fun gaming geek hangout. Please do more low-key stuff like this.
LTT should do all graphics card releases this way... LTT crew playing a bunch of different games on different GPUs for comparison. I'd rather sit through three hours of this than 3 minutes of benchmark graphs.
I agree. I love this setup!
I imagine these don't pay them as much and with so many things Linus & Yvonne are putting the money into, this is a once in a blue moon situation.
Sure, but handling those cameras is heavy work :( Unless they use smaller ones, but then quality will be impacted.
Finally, a genuine time for the use of the word(s) "low-key".
I'm actually pretty impressed with Arc so far. Even though performance does tank, it's still way above 100 fps. If they fix the game-breaking problems and squeeze out a bit more performance, I'd actually be very compelled to buy the card.
Yeah it's not bad at all and with time it will probably get better as the drivers improve.
Seems very hit or miss, but in general for the price I think it's a good thing.
I have an A770 16GB and it's really, really good. When it comes to some classic games there might be issues, but still, a very solid first attempt and tons of VRAM for future games.
Did you miss the part about having 120+ fps but still getting stutters? Just go get an RX 6600 :D
@@friendlyreptile9931 Depends what you need.
From what I can tell, a lot of these games can't even tell that Arc is a dedicated GPU. I bet some of these games think Arc is integrated processor graphics, which means the drivers need a ton more work. I expect the A770 will get better performance in a few months.
I bet Intel will just realize they're sinking money into something which will never give them any returns, and they'll drop the discrete GPUs and pretend they never existed.
Let's be honest, I want Intel in GPUs, but that's like rolling the dice; they are not a proven GPU manufacturer like Nvidia and AMD. I expect Celestial will be competitive.
Arc is a disappointment for me; I kind of wanted an AMD CPU and an Intel GPU.
@@Kocan7 don't expect intel to come up with their first GPU and be on par or better than Nvidia or AMD right out of the gate
@@aerosw1ft They could be way better. They were supposed to be on the market in Q1 '22, and instead of developing good drivers they tried to do every bit of tech and introduced RT and a DLSS competitor.
Intel is not a small plucky company; they are giants that dwarf AMD.
I expected that after an almost 6-month delay they would introduce a product that fights Nvidia and AMD in the low and mid-low tiers, but what I can see is a bit of a mess.
I'm sorry, but let's be honest: would you recommend Arc to a friend? I can recommend AMD/Nvidia GPUs no problem, and I hope Battlemage will be stable enough for enthusiasts, but I just can't see anyone wanting to play on Arc.
@@Kocan7 Dwarf AMD? Mate, what? The 3000 and 5000 series absolutely slayed, man, you're nuts rn; the 5800X3D is the best CPU on the planet currently.
You guys have to test it with DXVK! DXVK works on both Windows (unofficially) and Linux (officially), and translates DirectX < 12 to Vulkan.
Is it just me feeling ecstatic when the opening of Crysis shows the Nvidia and Intel Core 2 Extreme logos? Who would have thought one day we would be testing their GPUs side by side. Great time we are living in.
Almost happened about 10 years ago with Project Larrabee.
This speed-gaming format is kinda entertaining, and made me want to get back into general gaming. Also, good to know the Arc stuff isn't 100% ready for prime time. Thanks, guys!
Q++a
@@jmorriss9519 lol wut
In TF2, the performance difference they mentioned at 57:05 is due to them being in a different area.
When Linus went to an area with a similar view at 58:00, he also got the same FPS.
Exactly. In fact, the 3060 dipped to almost 100 fps.
Was looking for this comment. Linus' "definitely worse" conclusion was a bit annoying, I'm surprised no one in the room or in one of the chats said something.
@@technicolourmyles The chat spammed it the whole time; they just ignored it.
I've seen other people say this, but the A770 system was also running the server and simulating everything, whereas the 3060 system was just rendering visuals. And while Minecraft looks simple, it can be heavily demanding when simulating at max distance.
The Minecraft test wasn't fair because the system with the A770 was running the server at max settings, and the Nvidia system only had to render the visuals.
I wish benchmarking Minecraft was more commonplace. It can be very demanding on your CPU to run a server with a huge amount of entities/players/redstone contraptions and there are some shaders, like the SEUS ones, that can absolutely wreck your GPU if configured properly.
Running Minecraft can mean a whole host of things and is not something to take for granted. It’s also massively popular, I really don’t understand the neglect.
Imagine a benchmark where they use one of the big world downloads from big creators, like a Hermitcraft Shopping District. Now that's something that would stress-test a computer!
@@Respectable_Username boatum s8
@@d313m5 They don’t know how to play it haha
@@michaelerekson9912 Or better yet, Mumbo's S6 base. Xisuma flew around it to performance test his new PC at the time because it was so laggy 😂
Riley wins the stream for “Know what else sucks? Locking 4k behind a paywall” at 3:07:00 😂😂😂
I am very interested in seeing Arc's computational performance in tasks like AI, ML, rendering, folding, etc.
AI stuff would be great. Rendering might be poor due to an (understandable) lack of support from rendering software, but yeah, we need an array of testing on those.
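If anyone wants to poke at the AI side on their own Arc card, here's a minimal sketch of how I'd check it from PyTorch, assuming Intel's oneAPI drivers and the intel-extension-for-pytorch package are installed (the package and "xpu" device names are Intel's, but treat the exact calls as a starting point, not gospel):

    import torch
    import intel_extension_for_pytorch as ipex  # importing this registers the "xpu" device for Intel GPUs

    print(torch.xpu.is_available())       # should be True if the Arc card is picked up
    print(torch.xpu.get_device_name(0))   # e.g. the A770

    # Tiny matmul as a smoke test on the GPU
    a = torch.randn(4096, 4096, device="xpu")
    b = torch.randn(4096, 4096, device="xpu")
    c = a @ b
    torch.xpu.synchronize()
    print(c.shape)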
Check the reviews from Hardware Canucks and EposVox, they did some of the non-gaming benchmarks in their reviews.
@@AbyNeon You can't be sure of that, back in the 90's I had a few graphics cards for no other reason than to evaluate their architecture based efficiency within a specific set or type of 3D CAD calls. The different ways they approached the execution of the call within the simulated space and the relative efficiency of different methods was what I was curious about. Hundreds of hours spent analyzing different methods VS each other using different fixed parameter "tick speed" to simulate computational cost as it were between different approaches in the direction of a solution.
This is a very long way of me saying "I don't agree - there is some problem that Arc is currently the best scalpel to dissect, I just don't know yet what it is :-)"
@@AbyNeon I do see advantages with how Arc does things, in particular when it comes to analyzing the relative 'cost' of one approach compared with another, in terms of compute resources used. And yes, it does matter, a lot actually, even to those who haven't the faintest clue what I'm talking about. To try to elucidate what I mean a bit more in a general sense: every kind of computational analysis science ends up needing its own language, or at the very least terminology and a set of agreed-upon concepts, problem types, and solution types. These evolve over time to be more and more exact and efficient in their function, be it to define a problem, define a solution, build the road from one to the other, or in some cases demonstrate or prove something new, like what was defined as a problem not actually being one at all but rather a convoluted solution in search of a fitting problem.
It's amazing. The harder it gets, the better it gets in relation to other cards.
It seemed to me that Intel's first wave of GPUs is doing well, all things considered. I'm sure that updates to this generation and their subsequent series will only improve performance. Nvidia and AMD have been in the GPU game for many years, so I feel we should give props to Intel for jumping in so late while performing somewhat decently with their first generation. Also, having a third competitor in the GPU market could help yield more innovation via competition. Let's see how Intel does. I hope they do well, for our benefit, not just theirs.
Decent first try, Intel, but you've also got a long journey ahead. Chin up, boys and gals! 👍
Been on Arc, myself, for a while. The performance improvements have been pretty damn big.
I bought mine when, where I am, it was priced so far lower than the closest Nvidia had to offer and my 1080 had died.
It's a great entry to gaming, in my opinion. Hoping the future drivers iron out more kinks and that Battlemage cards continue the trend of steady improvement.
It's alright, but I still chose the RTX with the AMD kit.
update: it was a great choice!
The reason why the Intel card got a lower FPS in Minecraft is (probably) because you're hosting the world. Doesn't make a lot of sense, but whenever I play in single player, I get 80-100 fps, and in multiplayer servers I get 150-200 FPS. CPU usage is always at 20%.
If CPU usage is only at 20%, something else is probably getting maxed out.
Minecraft hosting requires that you dedicate RAM for just the server. So you could be running out of that ~
By hosting I mean running a single player world (single player is still a server, but no console)
Opening to LAN just allows other people to join your server (that's why opening to LAN is just a click of a button, and doesn't require you to re-join your world, etc)
My RAM usage is 4 out of 16 GB when I'm in single player tho
One thing that comes to mind is that your render distance could be higher in singleplayer. Servers default to 10 (previously 8, I think). What was happening on the stream was obviously the CPU maxing out, though.
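For reference, the render distance a dedicated server simulates is a server-side setting (single-player/LAN worlds use the client's video setting instead), and the server only gets the RAM you hand the JVM. Roughly, with the common defaults as an example (not what LTT used):

    # server.properties on a dedicated server
    view-distance=10
    # newer versions also have: simulation-distance=10

    # launching the server with 4 GB dedicated to it
    java -Xmx4G -Xms4G -jar server.jar nogui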
I have an RTX 2060 and get FPS in the 200s in Minecraft... but I usually run non-fancy mode.
@@researchandbuild1751 I have a GTX 860M (on Linux) and I also forgot to mention I'm using shaders, without them I get up to 900 fps
In the future, for Team Fortress 2 (and for anyone reading this), change your FOV setting to 90°. 90 is the default for most games, except for TF2 where it's a measly 70 degrees. First thing to change in that game.
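For anyone who'd rather set it once and forget it, it can go in a config file. A minimal sketch (fov_desired and viewmodel_fov are standard TF2 console variables; tweak the viewmodel value to taste):

    // tf/cfg/autoexec.cfg
    fov_desired 90       // world FOV; TF2 clamps this to 90 max
    viewmodel_fov 70     // optional: pushes the weapon model out a bit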
56:29 Linus spending like 10 entire seconds solely on murdering a friendly RED Heavy with the syringe gun without batting an eye is literally one of the funniest things I've seen in some time lmfao. God, I love Team Fortress 2.
I'm pretty sure Linus is blue team at that point
@@BIGAPEGANGLEADER A friendly in TF2 is a player who doesn't attack anybody. Including enemies.
The Intel Arc is looking pretty good tbh. I think with some time/driver updates it will be a real contender.
Also, we're already "paying" for 4K video with the INSANE AMOUNT of ads YouTube is shoving down our throats currently.
3:05:35 YouTube putting 4K behind a paywall. If they raise YouTube Premium rates to keep 4K, I'm done with YT Premium and I'll go back to using ad blockers.
nobody cares about 4k
@@Jeremy-ho9cr tell that to basically everyone lmao
@@Jeremy-ho9cr Not on a phone maybe, but on a big 4K TV they do. 1080p YouTube looks like garbage. The bitrate makes any fast motion look worse than high-bitrate 720p.
I doubt that they'll do that. I hadn't actually even thought about it, but if they raise the rates or make it a higher tier, I will also end the subscription I've had for many years now.
I'm confused as to why they don't limit 4K to channels above a certain number of subscribers. Also, I'd be in favour of abolishing 4K for less-compressed 1080p. 1080p video looks great; 1080p YouTube video looks like steamed turds localized entirely within your kitchen.
DXVK is such a good patch for Intel Arc.
You should do a good comparison between normal DX and DXVK in the games that struggled the most.
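For anyone who wants to try that comparison at home: the usual (unofficial, use-at-your-own-risk on Windows) approach is to grab a DXVK release and drop its DLLs next to the game's executable so the game loads them instead of the system DirectX ones. Roughly:

    dxvk-x.y.z/x64/  ->  d3d9.dll, d3d10core.dll, d3d11.dll, dxgi.dll  (copy beside a 64-bit game .exe)
    dxvk-x.y.z/x32/  ->  same files, for 32-bit games

Deleting the copied DLLs puts the game back on the stock DirectX path, so A/B testing is pretty painless.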
Thank you for getting around to GTA V, I know it's a pain in the rear to benchmark properly. Ars Technica tried it at 4K Max and could barely get 20fps from either Intel card (8GB or 16GB) and their 3060 was in the 40s. Seems like at 1080p and high/very high settings the A770 falls in between my current GTX970 (mid 50s) and the GTX 1080 my step son is giving me soon (mid 70s). The stuttering was likely from hitting the engine's breaking point. No need to spend $329 for an Arc GPU here in my house then. When the 1080 renders its last rend, there are plenty of other $125-$200 used cards. I could probably get away with as little as a 6GB 1060 at this point, and would likely settle on a pile of 1070/1070ti's to last me until death. Of the game of course, I plan to live forever, so far, so good.
You should have used conventional screen capture, maybe per camera, to reduce errors (brightness, fullscreen, crashes, etc.).
Loved the idea of the livestream and would like to see it again, with everything you learned here used to improve the 2nd round.
TF2 is not detecting any vendor stuff from Intel; the Source engine has very optimized pathways specifically for NVIDIA, AMD and Intel. Also, FPS drops are very common, as the game is not very well optimized at the moment, CPU-wise.
The Minecraft test was not balanced, since the Arc was hosting the server while the 3060 just rendered. I'm not sure if the server is more CPU dependent though.
I also wish they had tried Bedrock with RTX.
I really think that as long as Arc performs as well as systems from a game's release date, it's approved in my book. Or above 144 fps on recommended settings.
Intel GPUs should be perfect for non gamer professionals who can't exactly afford the Quadros or the 40 series of the world. (The % of CAD designers throughout the world who actually use "professional" GPUs is very low).
A lot of CAD programs still rely significantly on CPUs anyway. If Intel solves the reliability on those applications then they can build up on top of that for revenue to develop next gen cards.
Alex: dlss looks awful at 1080p
Also Alex: "I can't tell the difference"
Every other time I've used it that was the case but Control does DLSS soooooo well
Installing and trying so many games with so many problems just seems to prove how high-friction it can be to get into PC gaming, no matter what GPU you use. If the game doesn't crash, at best you're greeted with launchers, installers, and relentless EULAs. Once you manage to get into the game, you're still not even playing most of the time, because you're funneled into some kind of minimally fun tutorial.
Would be nice to see the power consumption level for comparison as well, because that can be a factor here. Sometimes not only price vs performance matters, but also power consumption vs performance.
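For a rough sense of scale (the board-power specs are real, the FPS number is made up purely for illustration): the A770 Limited Edition is rated around 225 W and the RTX 3060 around 170 W. If both averaged ~100 fps in a given game, that's roughly 100/225 ≈ 0.44 fps per watt versus 100/170 ≈ 0.59 fps per watt, so the 3060 would be about 30% more efficient even in a game where the raw FPS ties.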
Oh God stop.
It bothers me that they didn't keep the monitors as a constant variable rather than using two completely different monitors, but I digress...
3:05:33 As long as 4k doesn't raise rates for ALREADY Premium Users
Premium gang
Watching Riley's gamer side come out was pretty enjoyable to witness. I wonder what the LMG rankings are for Rocket League, and how many people actually prefer m&kb over controller
This would be a great card for someone who likes to play casual games once or twice a week, like me. But then I'm also a huge fan of technology and always want the best of the best, even though I can barely use it these days. So it's complicated 😂
I would have really loved Minecraft + Sodium + Iris + a shader (and preferably playing on a server).
That way it would actually be GPU limited.
3:05:33 Linus has his spicy take on YouTube 4K quality being a Premium-only feature.
@@n0k0m3 maaaaaan try explaining this to them x100 and the only thing that will go through their head is “NoT FrEee”.
Obviously I like the 4K feature, but I have it defaulting to 1440p, which still looks amazing.
I would be okay with it if ads were reduced for non premium content.
On Floatplane they charge $5 extra per month just to watch LTT content in 4K instead of 1080p. Meanwhile on YouTube, 4K is free. Shows where he's coming from.
@@motokid6008 Could be an experiment, I usually get 2 ads, 3 ads maximum. Some are on experiments without consent and apparently get 8-9 ads in videos.
@@sfgrgfsf212 The difference is that Floatplane has always been a paid platform. 4K has been a free YouTube feature for years now, and they want to take that away and paywall it. People aren't going to accept that with open arms, especially right on the heels of them stripping dislikes and attempting to dump us with 10 unskippable ads.
We need a supercut of this. Or a written summary with the numbers for each game.
They should redo the Minecraft test, as the Arc system was hosting the world, acting as the server. Have both do single player, then try both hosting.
Why don't they ever benchmark watching youtube videos while scrolling up and down through your steam library for 4 hours before going to bed?
This is hopefully valuable information for Intel driver developers. Godspeed you guys! You got 10-15 years of work cut out for you
I'd wanted to see Transistor, like even though that should in theory be a really easy game to run, but I missed the stream. Still frustrated about that. He might not have played it even if I did suggest it but I'd have still liked the chance. At least there's like one game on here that I do play, and another I find fun to watch.
Pretty sure you guys single handedly made Team Fortress 2 the #1 game on steam today by playing it lol
or it could be the halloween update the devs just released...
@@TheyCallMeContra *dev. Singular.
@Engineer Gaming Clueless
The issue in older games is really the 0.1% and 1% lows. Some games that you can mod and force to run on Vulkan will end up much better on Arc with something like DXVK. Forcing Vulkan will also help other GPUs, just not as much as it helps Arc.
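If anyone wants to try that themselves, the usual manual route on Windows is dropping the DLLs from a DXVK release next to the game's executable so its D3D calls go through Vulkan. A rough sketch of that copy step, with placeholder paths:

```python
# Rough sketch of a manual DXVK install on Windows: copy the DLLs from an
# extracted DXVK release next to the game's .exe so D3D9/10/11 calls get
# translated to Vulkan. Both paths below are placeholders -- adjust for your
# own game folder and DXVK download.
import shutil
from pathlib import Path

dxvk_dir = Path(r"C:\Downloads\dxvk-2.0\x64")      # extracted DXVK release (placeholder)
game_dir = Path(r"C:\Games\SomeOldDX9Game")        # folder containing the game .exe (placeholder)

for dll in ("d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"):
    src = dxvk_dir / dll
    if src.exists():
        shutil.copy2(src, game_dir / dll)
        print(f"copied {dll}")
```

Removing those DLLs again restores the game to the normal DirectX path, so it's easy to A/B test.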
alex with 90 fps : this is unplayable
me with 30 fps : this is fine
It's the frametime consistency that matters. The 1% lows on Arc were brutal a lot of the time.
Not only that, but 1% lows of 18 fps? That actually sucks.
@JacobTech tell me you’re a pc snob without telling me you’re a pc snob
Yeah, if you use triple-buffered vsync and lock to a lower frame rate like that, it basically feels like an older console experience: technically low FPS, but you don't notice it as much. At least in my experience. The monitor also matters, though; 30 fps on a 240 Hz 4K panel will feel much worse.
No, 90 is unplayable for many fast-paced games for me. I need 144 for Doom Eternal; my brain couldn't parse everything going on at 100. (My old laptop's screen was a binned 120 Hz panel, so I could push it to 100 Hz even though it was sold as "60".)
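Since 1% lows keep coming up in this thread: roughly, it's the average FPS over only the slowest 1% of frames, which is why a high average can still feel awful. A small sketch with made-up frame times (different tools compute it slightly differently):

```python
# Illustrative sketch: one common way "1% lows" are derived from a frame-time log.
# Take the slowest 1% of frames and report the average FPS of just those frames.

def one_percent_low(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # worst 1% of samples
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                   # convert back to FPS

# Mostly-smooth 10 ms frames (~100 fps) with a few 50 ms hitches (made-up data):
frames = [10.0] * 297 + [50.0] * 3
print(f"avg fps: {1000 * len(frames) / sum(frames):.0f}, "
      f"1% low: {one_percent_low(frames):.0f} fps")
```

With that data the average is still around 96 fps, but the 1% low is 20 fps, which is exactly the "looks fine on paper, feels like stutter" situation people describe on Arc.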
That BF V test was weird. I mean, you guys didn't do anything wrong and benchmarked it correctly, but the graphs Intel showed were totally different; on theirs the A770 was basically on par with the 3060. They may have tweaked things a bit, but I doubt Intel would literally double the FPS just so the graph looks equal, especially after openly showing games that run faster on the 3060.
They would have to compare their setup against whatever Intel did. They had several setup issues on the Arc side and ran several DX12 games in DX11.
Man.... stuff like this just makes me love LTT.
I really feel like they should have taken a bit of extra time to ensure the cables and monitors were the same between the two systems.
Hey Sage, it's so crazy that every video I watch I see you in the comments 😂, hey man loved your SE videos
I would've loved to see how badly it does in PlanetSide 2. Big fights are already a slog, and since the game runs on DX11, the DX11-to-DX12 translation would come into play.
Oh and some DX9 and OpenGL games, not everything is DX12 or Vulkan.
LTT is actively avoiding comparing Arc to the 6600 XT and 6650 XT, which are priced about the same as the Arc GPUs, around $320 and below.
@@JC_BOYplease don't post this multiple times
@@crazybeatrice4555 just report for spam
@@JC_BOY They're going to compare with the card most people are going to buy. My circle of friends are PC gamers and none of them considered buying a 6600 XT or a 6650; only one thought about it for a moment because it was the only card available, but literally two hours later there was a 3060 restock on the online shop we mostly use in Mexico. Most people are in the market for a 3060, so they're going to compare against the card most people have or want to buy.
Also curious about Planetside 2 performance, hope somebody does some testing with it
It seems like playing anything on any FPS with Riley is just pure fun.
I wish I could hire him on some on-demand teammate site. Not to boost my rank, but my mental wellbeing.
Missed the stream, but I'm glad that not only was Minecraft suggested, the Java version was tested too. Performance on the Arc GPUs, which work best with DirectX 12, was a big concern for me, so the fact that it's not only playable but gets decent framerates (admittedly vanilla, not modded) is great.
Amazed that it seems to have decent OpenGL performance. That's something AMD has only begun to fix over the last couple of months on their GPUs, so the fact that Intel isn't horrible there is actually pretty promising.
So glad they played Java. Really wanted to know how OpenGL would run. 500FPS, completely unmodded, is... great!
A770 is looking absolutely fantastic at its price point, unless the retailers hike it up by 100%. Better RT than the RX6000 series is a huge plus, but I would only get the 16GB edition myself and it's going to be a limited edition. 😞
Too many driver issues to even consider buying it for gaming.
@@andyastrand Plus, if you have a $300 budget, it's better to buy something used.
Having HairWorks on makes that an invalid test; it's an NVIDIA-specific technology that they made sure would only run well on their own GPUs, so on systems without an NVIDIA GPU it performs pretty badly.
Thanks Jeeves.
🤔 In The Witcher 3 (45:00) there's a scene where we escape Kaer Morhen on horseback, with a bunch of enemies, people fighting, and fireballs raining from the sky. At the time, my GTX 860M couldn't handle that even at 720p; the sound started crackling and FPS dropped to 10 😅 ... I'm very keen to see how the Arc A770 handles it.
An older game that is surprisingly GPU-heavy is Conan Exiles. It still challenges my RTX 2080.
That's just how Funcom does (or doesn't do) things.
An interesting case idea would be a flat design that can be rack mounted. Removable rack-mount frame, and glass top for visuals if left on a desk. Airflow is very easy with rack-mount cases, and the ability to use on desk and in rack would accommodate everyone's needs.
Too bad they didn't try Titanfall 2 :(
There's some sort of gamma/contrast issue on the Arc, but seeing the Witcher 3 result reminds me of NVIDIA having driver defaults that gave them an advantage in the past. The creases on Geralt's clothes didn't have any shadow, but on Intel they're nicely shaded. I'm not sure HairWorks with 64x tessellation is even viable on Ampere cards, seeing as it's a much more compute-oriented architecture (Kepler had a bit of an advantage there, and it was used as a point to bash AMD's GCN cards, which were bad at tessellation). Maybe the NVIDIA driver default caps it to a lower amount that overrides the game setting, so it's probably worth forcing the max tessellation value with NVIDIA Inspector.
Would be interesting to see if there's a performance drop with AMD processors. They should build a comparable AMD system with an Intel card, plus an all-Intel one, just to see what it brings.
Riley saying “this is irresponsible” in regard to aiming at hostages and not wanting to shoot the guard in GTA V is just 👌
1:49:16 I feel you Riley I feel you :((
I feel old for anticipating the Slim Shady reference.
It would be really interesting to see this card performing in SteamOS or another Linux distro. There's basically no native DirectX there, so it would be nice to see how this thing performs through translation.
I was actually expecting a Star Citizen comparison, given the fact that it's a really hard game to play and really CPU demanding.
I'm disappointed honestly.
That's what I was looking forward to as well
@@PaulDillinger Where did they play it? I didn't see it, and 90% of the games are trash AF IMO, so I didn't want to watch hours just for the 5 minutes of Star Citizen.
@@NEOgeek402 They didn't. That's why I said it.
They even said "We're going to do it later", but never did it.
Riley mentioned at one point, "We don't know what state Star Citizen is in right now, so I don't know," but then again, I can barely play Star Citizen with a 3080 and a 10700K, so I was really looking forward to seeing whether an Intel GPU would get better results, or at least stable ones.
I am really hoping intel works on driver updates to make arc become the next 3dfx voodoo card
The way Linus explains the 4K-paywall-thingy absolutely makes sense. Now I don't know anything about the matter and I don't care too much but from what he's saying I think it's the creators who should pay for 4K uploads, not the viewers who want to watch in 4K. After all, the creators use youtube to make money so they should be the ones paying. Just like vimeo makes the uploader pay for premium, not the viewer.
Aha, this sounds way too much like the old broadcasting platforms. Oh, you want to watch in HD? Pay more!
8K would make sense... charging for older, well-established tech is a dirtbag move.
@@BruceKarrde I'm not saying "wanna watch 4K? pay more!"
I'm saying "wanna provide your audience 4K? pay more!"
Huge difference.
1:15:30
Riley : Are you winning son?
Linus : No dad, but there is still 2:20 minutes left in this game. Can you help me?
Riley : Hold my LTT bottle.
Can Linus build a PC that is running anything that "Let's Game It Out" is doing smoothly?
Honestly… I'm surprised by how well the Arc did in these tests. I'm even more impressed that a processor company like Intel has done this well 🤯
Disappointed that they never tried Star Citizen, even though Linus promised it. It would have been interesting to see whether Arc could handle it, given that Star Citizen runs on one of the most advanced, constantly updated engines in gaming right now. Arc excels in new engines versus old ones, so this would have been an interesting test, and though they played many games, they never touched the Star Citizen genre, which is a category all its own. It also matters to exactly the kind of people who are likely to buy an Arc GPU to test and troubleshoot with and help get a new product line off the ground, as Linus begged people to do in his previous video. The people who would buy an Arc GPU for that purpose are exactly the people who play Star Citizen in its current state. With respect, someone who cares about maximum frames in Rocket League for their e-sports best is not likely to be interested in testing and troubleshooting a new video card lineup on the vague hope that it becomes something good in the future, and thus is unlikely to be considering an Arc GPU in the first place...
@@IRMacGuyver
They all already have accounts.
Yep, one of the reasons I play Star Citizen is I don’t mind being a ‘tester’-type individual. I love finding bugs/reporting them and working to get things running smoothly. I’d’ve grabbed one of these ARC GPUs if they’d come out ~4-5 months ago (even at $400-$500).
Pretty sure other people will do it more in depth.
Advanced engines? Is this satire? They're running on CryEngine (I believe v3), from 2009, now Lumberyard: the same engine with the addition of AWS features. It's a heavily modified engine, yes, but let's not pretend it's the most advanced. I suggest you look at some UE5 stuff. CryEngine/Lumberyard is notorious for being difficult to use.
That said, yes... I wish they had tested it as well, but I'm not about to tell everyone it's using the most advanced engine (*Star Engine*). It's not, and that's why CIG is struggling to get it working how they want; otherwise we would already see more progress.
Slight error when you guys talk about the translation layer at 32:38.
It converts DX9 and DX10, and I believe it has native DX11 support.
But it converts those calls into Vulkan, as it uses DXVK, which is how Linux gamers enjoy gaming.
Getting ready to do my first build in about 10 years.
As someone who doesn’t know a lot about all this stuff it seems a safer option to pay for Nvidia.
The price difference would have to be a lot more to try this.
The question is, why the RTX 3060? For ray tracing it sucks as well. An RX 6600 XT would be cheaper and has better rasterization performance than the RTX 3060. Or, for the same price, the RX 6700 XT, which has about the same ray-tracing performance but will run circles around an RTX 3060 in older games.
@@Elkarlo77 4K VR in older titles would be great, like Half-Life 2 VR. They'll be updating the graphics soon.
@@Elkarlo77 True, raster performance is still the go-to. Ray tracing won't be a big thing for another 6 to 10 years.
@@Elkarlo77 6650 XT at $285
@@DravenCanter In my market the 6650 XT costs around 380 USD, or 50 USD more than a 6600 XT, and is closer in price to a 6700 XT, which is cheaper than a 3060 Ti (cheapest 3060 Ti: 460 USD; cheapest 6700 XT: 420 USD). That's why I didn't have the RX 6650 XT on the radar.
Alex complaining about the in-game physics and momentum in The Witcher got a bit of a chuckle out of me. The added half step makes it a bit more immersive, given that IRL you don't dead-stop out of a sprint, and if you're walking at a fast trot you slow yourself down with a half step.
Great stream. Watched most of it live. It would be nice if they posted a video with all the other games that were not played using the same set-up.
Been digging these long videos/streams.
Crazy idea... but if it has problems with pre-DX11 titles, wouldn't a wrapper like dgVoodoo2, which turns old APIs (DX8/9, Glide) into DX11, be a fix?
Linus talking about arc being worse in Satisfactory, right as Riley's testing shows that Arc might actually be just as good, if not better. Arc was around 100-120 in the intro/drop, then up to 160-200 once it landed and loaded the area more.
Very impressed with how much better Arc is managing now
Due to the huge timezone difference I missed it, but I wanted to request one thing, and maybe someone here will listen and check it out, as it might boost the Intel GPU's performance. You've only checked DirectX games, but what about OpenGL? OpenGL is used in a lot of old titles, and some of them might really benefit from the change. Some Source titles can be run with OpenGL just by using the `-gl` parameter.
Another thing is ReBAR. There are still people using hardware that doesn't support that feature. Should we even consider buying Arc? I'm really disappointed by the attitude of AMD and NVIDIA, so I wanted to "vote with my wallet", but should I even consider an Intel GPU without ReBAR capability?
If your system doesn't support ReBAR (or AMD's equivalent), don't buy Arc; even Intel's own developers have said the same thing. Get something like the 6600 XT if you're looking for something in that price range.
It would be great if those power banks in the trunk could charge your car battery when the system detects it's under 12.5 V, and charge it until full (12.8 V): just bump it to 14.5 V, wait 10 hours, and test again whether it stays above 12.5 V. You could forget about manually charging your lead-acid battery with that kind of system (normally it should be charged twice a year, before and after the winter season), and getting more than 10 years out of a battery wouldn't be a problem :)
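A toy sketch of that charge logic, using the thresholds from this comment (not a real battery-management implementation, just the decision flow):

```python
# Toy sketch of the described top-up logic: if the resting voltage is below 12.5 V,
# hold ~14.5 V for a fixed absorption window, then re-check against 12.8 V.
# Thresholds come from the comment above; the callbacks are placeholders.

LOW_V, FULL_V, CHARGE_V = 12.5, 12.8, 14.5
ABSORB_HOURS = 10

def charge_cycle(read_voltage, apply_charge_voltage, wait_hours):
    if read_voltage() >= LOW_V:
        return "battery ok, no charge needed"
    apply_charge_voltage(CHARGE_V)
    wait_hours(ABSORB_HOURS)
    apply_charge_voltage(0.0)            # stop charging before re-measuring
    return "full" if read_voltage() >= FULL_V else "still low, check battery health"

# Quick fake run: battery reads 12.3 V before charging and 12.9 V after.
readings = iter([12.3, 12.9])
print(charge_cycle(lambda: next(readings), lambda v: None, lambda h: None))
```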
I'd be curious what GPU utilization looks like instead of just FPS alone
Yo, so I wanted to shed some light on why ER is locked at 60: the main reason is that certain actions in the game (rolling, for instance) are directly tied to FPS, meaning the higher your FPS, the fewer "invincibility frames" you have, resulting in you still taking damage even if you time your roll perfectly. Fallout 4 has a similar issue, with lockpicking becoming impossible at high FPS. This is what my friend, who is pretty much an expert on Dark Souls and Elden Ring, said, so take it with a grain of salt, etc.
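To illustrate why tying the roll window to a frame count breaks at high FPS (purely hypothetical numbers, not actual game code):

```python
# Illustration only: if an invincibility window is counted in frames instead of
# seconds, its real-world duration shrinks as the frame rate rises.

IFRAMES = 13  # hypothetical number of invincibility frames granted by a roll

for fps in (30, 60, 120, 144):
    window_ms = IFRAMES / fps * 1000
    print(f"{fps:>3} fps -> roll protects for {window_ms:.0f} ms")

# At 60 fps the window is ~217 ms; at 144 fps the same frame count covers only
# ~90 ms, so a "perfectly timed" roll can still eat a hit. Counting in seconds
# (delta time) instead of frames avoids the problem.
```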
Now I'd be really curious how dxvk runs on these compared to translating DX9/10/11 to DX12. Seems promising for the linux crowd.
Unfortunately the drivers are only available in the latest kernel, which isn't in any LTS release yet (as far as I know), so we'll have to wait a bit for it to be tested properly.
@@EeziPZ DXVK doesn't need Linux. It works on Windows as well.
Does anybody else notice Linus' bossiness? He told Dan not to talk because he doesn't have a mic, but when Jake does it, he doesn't mind and engages in conversation without any concern that Jake's voice isn't audible. Strange 🤔
The cost of storing 4K video will drop greatly with the introduction of AV1 encoding on the platform, though... so his argument about that is kinda pointless.
Maybe lock 8k behind a paywall. It's new and not a ton of videos are 8k. But it's an option for people who want to buy it.
4k is waaaay too commonplace nowadays to lock it behind a paywall. Especially for a gaming audience.
Yeah, Linus has started getting on my nerves over the past few months. He is slowly losing touch.
@@psychoticapex Yeah, he just sees it as another way for YouTube to profit off people, and thus another way for him to profit off YouTube.
As a video encoding engineer: No. That's not the case at all.
AV1 is a great standard, but thinking this will be a non-issue once it's mainstream is a comically bad take. Everybody outside the video coding field thought the same things about HEVC, but what you saw on the ground was companies working moderately hard just to get the same compression ratio at a given PSNR as AVC. It's not that it lacks the tooling, it has it, but using all the features of the new specs, especially in hardware-accelerated GPUs where you can only support certain features, and supporting them well enough that you're still working in real time while increasing your compression ratio? Not an easy nut to crack. AVC has had a decade-plus to get the shit optimized out of it.
It's easy to say that new standards like AV1 are going to revolutionize the industry when you have a clip cherry-picked for the test running in an AV1 reference encoder with all the bells and whistles, cranking away at 1 frame per 10 seconds; it's another thing entirely when you have to stuff all of those new tools into a fixed-function GPU pipeline.
AV1 will bring moderate improvements over VP9, probably a practically larger gap than the one between VP8 and VP9. But the idea that CDN costs won't be an issue after it's rolled out? No, not a chance. CDNs are stupid expensive to run, and the gap between VP9 and AV1 isn't even a hint at lowering that. The truth is that we made BY FAR the largest gains in lossy compression with MPEG-2 and AVC. Everything since then has been incremental in comparison, and most of the foundation of what we do today boils down to optimizations of the same techniques; the main benefit is more flexible blocking for retaining small details in large frames. The underlying technology hasn't changed much. The other benefit has been better segmentation for multiple fixed-function pipelines on the same GPU, but that helps speed, not compression.
As for 4k being "way too common" - you summarized it best in the same line: "Especially for a gaming audience." The gaming audience for YT is a tiny percentage of consumption. There is no need to add it just to satisfy them. If they want it bad enough they'll pay for YT Red. And even among the gaming audience, I wouldn't be surprised if 50% of them didn't even select 4k manually, and just left it on whatever YT gave them. I'm not a gamer but the only time I check the quality at the bottom is if it looks like shit.
After being part of the team that rolled out one of the standards above on the first cards that supported it, for one of the companies in this video, I worked at a start-up. The start-up didn't make video cards like my previous job, but we still worked in the video space and ran a small CDN for distributing video. The start-up failed, but with only a few hundred users the CDN costs for 4K video were astronomical. The cloud services charge a ton for those transfers; they are *expensive*. It really didn't take long before we were dropping tens of thousands per month on CDN costs. You kind of have to see the GCloud or AWS billing to appreciate how hard it sucks. Now granted, Google gets a great deal on this, but the prices aren't high for the rest of us because of price fixing; they're high because it's expensive for Google too.
Anyway, sorry for the long-winded reply. Linus is right, though; this makes sense. 4K and 8K are expensive as fuck for Google, and 97% of YouTube viewers will likely never notice. AV1 will be a drop in the bucket for YT's costs.
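For anyone wondering what PSNR is in this context: it's the distortion metric codecs get compared at ("same compression ratio at a given PSNR"). A minimal sketch of how it's computed, using fake numpy frames rather than real decoder output:

```python
# Minimal PSNR sketch: the quality metric referenced when comparing codecs
# "at a given PSNR". The frames here are synthetic arrays, not real video.
import numpy as np

def psnr(reference, decoded, max_val=255.0):
    mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10 * np.log10((max_val ** 2) / mse)

ref = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)                 # fake source frame
noisy = np.clip(ref + np.random.normal(0, 3, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, noisy):.1f} dB")   # roughly 38-39 dB for sigma=3 noise
```

The point of the comment above is that a bandwidth-reduction claim only means something if the PSNR (or another quality metric) is held constant while you compare the two encoders.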
@@MrSlowestD16 Nah, it's fine. Google isn't hurting for money; no need to gouge the consumer further. They literally have every possible data point about their customers up for sale.
As for AV1, I don't think it's going to take that long to become decent. YouTube already uses it by default on some videos, and you can go into your account settings and ask it to prefer AV1 when available. The quality stays the same while the bandwidth drops by something like 40%.
Also, the gaming audience is actually YouTube's largest viewership next to music. Just because you don't watch gaming videos doesn't mean it isn't popular. It's literally one of the primary reasons for their growth over the last few years. Don't make shit up.
@@Bry.89 They're not hurting for money, but they have to justify it as a business decision. They're not a charity; they have to explain their decisions to their shareholders, and there needs to be a clear-cut reason they can relay to them.
You misunderstood everything I wrote about AV1. It has nothing to do with "becoming decent" or "already in use"; you are NOT getting the same quality with a 40% reduction in bandwidth between VP9 and AV1. That's a cute promotional slogan, but it's simply not true in normal test cases. The same claims were made about VP9 when it was new and it didn't deliver even a fraction of that, and the same claims were made about HEVC, same deal. AV1 doesn't offer any groundbreaking technologies; it's an incremental improvement. The standard simply isn't that revolutionary. If you're getting a 40% reduction, I can guarantee there's a lower PSNR to go with it. I've been working in the encoder/decoder development space for over a decade, I'm not making shit up here, lol. Claiming 40% improvements in general use cases is pure fantasy, I promise you.
If the gaming audience is as big as you claim (and I don't think it is, but let's say you're right for a second), and they really do all require 4K as you claim, then that's all the more reason for Google, as a company, to make this change. Those CDN costs are astronomical; that's a huge expenditure, and they're very likely not (or barely) breaking even on it alone. YT Premium is a great place for them to make that up, in a business sense. You've got to get it out of your head that YT is a charity and realize how many hours of content they serve and how expensive that is.
You know, thinking about it, putting 4K behind a paywall actually does make a lot of sense... YouTube is barely profitable as it is. They should have done this from the start; it would have gone down a lot easier with the general public. But to be honest, consider what other alternatives YouTube might start taking, which could only really be worse...
DXVK/Steam Play performance on A770 please...
For Control on Arc, you have to go into Control's game files and run the Control DX12 .exe directly; then Arc can do ray tracing. The A770 actually runs Control great, like 90 fps maxed out at 1440p. And I'm using a Z370 Aorus Gaming 7 with ReBAR and a golden 8700K at 5.3 GHz core / 5.0 GHz ring.
Any one got time stamps or a list of titles they played?
Still really want to see concrete numbers from these cards in Cyberpunk. Seems like I'm not alone, and I'm a little curious why it wasn't done here and why no FPS were displayed in your earlier video.
appreciate the stream, was considering risking an a770 for rocket league but probs gonna give it a pass now
Any DX12 or Vulkan title should be fine. If you're looking to play a wide variety of games, better go for something tried and tested.
@@zZzabi Yeah, agreed. TBH, after Linus' last video I wanted to give the A770 a go as long as it worked decently in RL; hopefully there will be some early adopters willing to give it a shot.
Hopefully we see more people do things like this. I would love to see modded Skyrim still hitting the 60 fps cap.
but 4k and 8k have been free for so many years already...
HDR in Elden Ring on PC has been a mess since launch. When I first got the game I had HDR enabled, but it wouldn't let me turn it off. Then I disabled HDR in Windows, and now Elden Ring will never let me re-enable it, no matter whether it's enabled in Windows or not. It's not surprising one system had HDR issues.
I'd pay good money for the Alex and Riley show what a duo
They have great camera presence but their knowledge about games and their settings feels lacking.
@@Varangian_af_Scaniae Yeah, but LTT has people who write scripts, so they don't need to be overly knowledgeable. Besides, it doesn't even need to be tech-related, personally; maybe something like CSF.
Riley, the rendering engine isn't really an "apples to apples" thing; just use whichever is most stable and available for your platform. DLSS is "fake (AI-generated) frames" and RTX is "better lighting", whereas the rendering engine is more analogous to "run this game ARM-emulated" vs. "run this game natively".
2:10:02 That Part Cracked me up! XD
The previous video about Arc not really working well with DirectX 9 is the reason I took mine out of my "pre-order" basket. The game I primarily play requires DirectX 9.0c support and will never be upgraded to DirectX 11 or higher. I would love to play with higher graphics settings, but not with more lag.
I mean, yes, it doesn't work well with DX9 titles, but it will surely get more than 144 FPS since it's an older title. Sure, if you're a top-tier professional in certain games you need like 240 FPS on a 240 Hz monitor, but 98% of the time it really doesn't matter whether you have 700 FPS or 150 FPS, as long as the 1% and 0.1% lows are good.
Linus was unpleasant to his employees in this video
I think it's funny that my 1080 Ti would still trade blows with modern cards 5-6 years after it was released.