@@bastaudio I got all the way through it while it was public, even commented on it... There were about 29 comments total at that point but those are all gone now.
@@pradeepkumar-qo8lu Or just aggressive Nvidia Marketing. Total B.S. on Nvidia's part and I hope there is some decent competition to Nvidia next year so they can pull their socks up.
"G-Sync Compatible" means Freesync - why would you say "as long as you have an Nvidia graphics card"? It will work with any graphics card that supports variable refresh rate (except of course the GTX 900, 700, and 600 series cards, which only support classic G-Sync)
he means the core tech: CPU, motherboard, RAM. Using/replacing the other components makes sure they wouldn't bottleneck anything else, thus getting the maximum performance out of the old CPUs/motherboard. He is effectively testing the motherboard here.
@@CrimsonRegalia there was no incentive; they were making buttloads of money off what they had, no need at all to make something new. There's still really nothing new, just more cores, same old shit otherwise...
I love how Linus straight up ignores the fact that "G-Sync compatible" is just Freesync that was certified by Nvidia. No word about compatibility with AMD. Nvidias marketing trick seems to have worked quite well.
Hehe, it seems you have not owned a Samsung Freesync monitor before; they simply don't work with Freesync most of the time. So the certification is deffo worth it.
Exactly, bad test by Linus. He should have used a 10-year-old GPU with that system, if he has one, to see how well that old system can run today's AAA titles. The test was ruined when he used new hardware.
Damn, I thought I was hallucinating for a sec yesterday when I searched for this video and couldn't find it, cause I'm so damn sure I noticed Linus uploaded it
"32GB was a lot back then". Well, it still is a lot.
My first PC build (in 2011) had 8GB of DDR3 and I thought for sure I would be set forever, given I came from 2GB of DDR2 (Dell OEM). So yeah, 32GB of DDR2 was insanity. Currently running a 16GB DDR4 Ryzen-based system and loving every minute of it.
@Sean Price 25fps is playable.
Sean Price 40 FPS is very annoying to deal with, and most people in the gaming community don’t play on those kinds of frame rates; I think you’re out of touch, if anything.
@Sean Price 40fps is super unplayable, considering 60 is the standard. Even 60 can be a pain in the ass.
@Sean Price kids these days... had 25 growing up and was happy with it lol
"CPU does not have popcnt" - for those interested, POPCNT is a hardware instruction (usually exposed as a compiler intrinsic) that calculates the number of set bits in a number. It's often used for heavy micro-optimizations, and it's supported by pretty much any modern CPU, though it may be referred to by a different name. So, two options: programmers under enough stress to forget fallback options, or a really bad instruction set on the CPU.
Option 1 sounds like a subset of option 2.
Appreciate the info!
Skulltrail is Core 2 arch (specifically, Yorkfield XE). POPCNT first appeared in Nehalem, which is the successor to Core 2.
It's the 2nd option; I've had Apex throw many errors when AVX instructions weren't supported
Programmers don't bother telling their compiler what specific assembly instructions to use or not use unless it's already become an issue. The performance improvement usually isn't worth the tradeoff of messing with assembly, instruction set fallbacks, and feature flags, and if you do need the performance then it's usually easier and more impactful to audit your code.
Thanks was about to look that up
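For the curious, the fallback idea discussed above can be sketched in a few lines. This is a hypothetical illustration, not the game's actual code: use the fast native bit count where the runtime provides one, otherwise take a portable bit-twiddling path.

```python
def popcount_portable(x: int) -> int:
    """Kernighan's trick: each iteration clears the lowest set bit."""
    count = 0
    while x:
        x &= x - 1
        count += 1
    return count

# Prefer the native popcount when available (Python 3.10+ exposes it
# as int.bit_count); otherwise fall back to the portable version -
# exactly the kind of fallback the comment above says got forgotten.
if hasattr(int, "bit_count"):
    popcount = int.bit_count
else:
    popcount = popcount_portable

print(popcount(0b10110101))  # 5
```

Native compilers do the same dance with feature flags: emit the POPCNT instruction only when the target CPU is known to have it, and call a helper otherwise.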
"Can you believe the CPU running this is 11 years old"
*my 10 year old CPU's sweating intensifies*
SAME
My i7 950 is still going strong with all games except the new cod 😀
Same dude. AM3 platform here.
i7 920 been running @ 3,6ghz for idk how many years 😅 Using pc as a space heater is also pretty nice during the winter
I still have an i3 3240 with a gt 210
F
When the pc from 2008 gets better frames than your pc from 2019
well he does have 2 rtx graphics cards on it
sorry. 1
I think we would all be a lot better off if we had 2 RTX-capable graphics cards sitting around
@@MrSpannners right!
@@MrSpannners Linus is the reason the price is so damn high and supply so low.
I’m guessing some where in that sponsorship script there’s a part where it says “must say: clear competitive advantage”
Twice, even!
it's weird how advertisement scripts have this stink on them that is so easy to detect
**smashes L key**
Pay to win has moved to hardware as well, according to Samsung. Nice.
Teh dck wz sckd
but it is a clear competetive advantage haha
14:33
LTT: Ctrl + C, Ctrl + V
15:28
lmao
Lmaooooo
😂😂😂😂😂😂
Great catch
LMFAO
LTT: We Overclocked this $5000 PC from 10 years ago.
Me: My PC running as a $500 Non Overclocked PC from 10 years ago.
And Me: My laptop running as a $150 ( does it have a clock ? ) laptop also from 10 years ago.
Phil Dodd i doubt you’re being honest, if you are, get a fucking job. If it was $150 ten years ago, you desperately need a job.
Bob chill
@@callaco3176 yeah that's gotta be like a P2 400 or something
Even worse: my Lenovo S10e netbook from 10 years ago. It has a 576p display and starts chugging with just Task Manager and YouTube running.
Linus: 32 GB ram is lot back then
Me: cries in 4 GB ram
Cries in 8 Gigs... and find a good laptop for a reasonable price with 16 gigs...
@@acmenipponair Cries in chromebook
cries in 1 gig
@@acmenipponair Don't find one, buy an 8GB one and add or replace a memory stick
Isn't 32GB Ram more than average even today?
Famous for dropping things but somehow manages to hold on to the hot ram.
He's been going to rehab.
@S S yeeeet. another strong man and tech fan. How about arm wrestling?
r/unneccessaryyeet
SO now he'll be a dropsie with no fingertip control. Great! Hardware is now really doomed.
@@AG.Floats wait. Theres necessary yeets???
This is more like a demo of Vulkan's potential.
one of my friends is working on a project called d9vk which is a directX 9 to vulkan translation layer, could improve performance in non vulkan titles ?
@@Jambu5ter1 A bit but not that much.
DirectX 9-11 and OpenGL are in general not "thread safe" APIs: basically one thread is declared the "sacred thread allowed to talk to the GPU", and it is the only thread that can send draw calls etc., while all the other threads can do is pre-process stuff to make the life of the sacred thread easier.
Now on Vulkan, everyone can talk to Vulkan, everyone can issue draw calls.
So D9VK can in theory give DirectX 9 the ability to be thread safe, but no game will use it because, as far as they know, you can't do it.
Vulkan is the future
And about how high fps feels smoother?
@@dan_loup League of Legends runs better under Wine in Linux with D9VK compared to OpenGL. So D9VK rocks. Also older games (both on Windows and Linux, it's not OS-bound) can run better with D9VK.
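The "sacred thread" model described above can be sketched with ordinary threads (a toy illustration, not real graphics code): any worker may prepare commands, but only one thread is ever allowed to submit them.

```python
import queue
import threading

draw_queue = queue.Queue()  # workers put pre-processed draw commands here
submitted = []              # stands in for "commands the GPU received"

def worker(object_ids):
    # Any thread may *prepare* work for the GPU...
    for oid in object_ids:
        draw_queue.put(f"draw(object_{oid})")

def sacred_thread(expected):
    # ...but only this one thread ever "talks to the GPU".
    for _ in range(expected):
        submitted.append(draw_queue.get())

workers = [threading.Thread(target=worker, args=([2 * i, 2 * i + 1],))
           for i in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()

sacred_thread(expected=8)
print(len(submitted))  # 8
```

In the Vulkan model, by contrast, each worker would record into its own command buffer and submit directly, with no single funnel thread.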
14:33
15:28
Whoa. Deja vu.
YAOMTC It’s a script from Samsung.
The first time he said it, it was OK, I believed HIM. The second time he says the same quote, I was like "WTF? Samsung told you what to say!!!!" and it loses some of its credibility
@@tranceonline It's proven high fps/refresh rates give advantages, so not really a credibility issue.
Yikes
@@tranceonline Yeah, there is no doubt whatsoever about that.
yes Linus, there is no doubt whatsoever that high refresh rate gives you a clear competitive advantage and when you have the FPS to back it up, obviously the whole concept works a little bit better.
Yeah but have you considered the fact that there is no doubt whatsoever that high refresh rate gives you a clear competitive advantage and when you have the FPS to back it up, obviously the whole concept works a little bit better?
Found this comment as Linus were saying it. Thanks for the subtitle 👍
@@akirafan28 Same exact thing, just happened to me. Lol
@@theaveragegamer9632 🤜🤛
"It's Doom running on a 10-year-old computer!"
Has two Titans.
@Thunder_Arch Well yeah but for the price of those 2 gpus, you could have a new great gaming pc with a good monitor xD
I was playing Doom back in 90s so its not a big deal that it runs on old hardware.
hubertnnn im sure he meant Doom (2016)
Twotans.
r/whoosh
*benchmarks at 240fps* "Are you guys seeing this frame stuttering?"
*films at 30fps, so no, we can't*
To be fair, they didn't go much higher than 40FPS
But no, we still didn't
He asked not just said it.
You can clearly see the game stuttering even in the youtube video. It was noticeable before he even mentioned it.
It has everything to do with framerate.
@@Zachx_618 stuttering is literally frame drops
“Minecraft is not realistic”
Also plays a game where there are zombies with energy shields
It's the mainstream snob gamer in him speaking or at best that's his youtuber persona. His real problem is with the blocky aesthetic, he has a pitiful fixation with photo realistic graphics.
what... you don't run into zombies with energy shields on your way to work? must be nice!
@@kalmtraveler We don't live in remote areas. We live in the big cities where zombies fly to their workplace.
it's just like giving a fat dumb American an energy shield. More realistic than one might think.
Chaz titan actually Minecraft is better
Plot twist: The whole video was actually just an ad for Vulkan.
👎
14:34 - 14:46
Ctrl + C
Ctrl + V
15:28 - 15:39
John- 117 Marketing Shill lol 😂 and Gsync and Nvidia so monnney
Yeah copy paste, no doubt whatsoever.
Bruh
Straight from the sponsor script
Probably what Samsung told them to say for sponsoring the video.
i like how he casually says
"lets throw in an RTX 2080"
that graphics card is more expensive than my dad's car
@@fuantei7283 I think a lot of the viewers here will be aha. Unless Linus' average viewer makes 40k a year lol
@@halowars4000 I make more than that and still don't want to spend that much on a GPU. I mostly watch this stuff for the fantasy while using cheaper previous-gen AMD stuff.
@@Chompski_ car
@@sebagomez4647 what type of car is only 1k?
Linus: The stupidest configuration ever....
Most everybody: Lets take it easy.....
Linus: Lets overclock this thing...
ruclips.net/video/KnsiZOJjfUg/видео.html
@@cheif10thumbs I don't get it
"So smooth" gives us video in 30fps.
"Can you believe this is running on 10 year old computer?" Yes, it was freaking 5000 dollars...
You can get similar performance with an oc core 2 quad for like $200
It was 5000 then, and now LTT casually drops in TWO Titan XP's that are $2200 apiece. "Oh, they were sitting on the shelf collecting dust, let's use these!" 😲 Boosts PC's price back up to $5000 now.
@@pault151 True true. Still around 5K $
Yeah, my CPU is a 7 year old non-overclocked i5, and it runs modern titles fine.
My 150 dollar Dell T7500 with a pair of X5687 xeons isn't exactly slow and it's 11 years old.
CPU Time and RAM: _is available_
Google Chrome: *”It’s free real estate”*
These comments are the real deja vu. And look, it's got an anime profile pic, definitely begging for attention........
and edge too ... wait a sec. whats edge?
one of the moments when an IE user sees a Chrome user not being so overly proud of their "so awesome" browser
IE isn't that bad if you actually set it up well; it needs some settings switched though, and then it's even more user-friendly than Chrome at a fraction of the system requirements
a mistake that never should have been made
A bunch of other windows background processes: "Hey, we want some too."
... I'm Paul from Ontario, and I can confirm I sent no RAM to LTT, must be an imposter out there
You should sue for personality copyright infringement.
It was this Paul from Ontario. Perhaps you're the imposter!
@@hitnrun7 Paul from Ontario, California, I'll be seeing you two in court
Hi, I'm the real Paul from Ontario, and the RAM I sent to Linus was dropped onto the floor, borked, and replaced by RAM that Linus downloaded from the interwebz.
Hey I'm not Paul from Ontario and I will still sue you for personality copyright infringement
"this is doom running from a 10yr old computer"
no this is doom running from a 10yr old computer that you upgraded heavily
I mean, it's really unfair to have a "10yr old computer" with dual Titans in SLI. To be really fair, Linus' should've used whatever high end card they had back in 2008 lol
@@MrNalettinho That would mean they had to use DX9 and 10 titles only, as DX11 support didn't come until 2009 with HD5000.
@@Orcawhale1 You're right
@@MrNalettinho well, the point of the video was whether the motherboard, CPU, and RAM could actually do it. I mean, if they did that and only played titles compatible with the old DX versions from back then, it would just be "can this high-end 10-year-old PC run 10-year-old games", and the answer is obviously yes. So him putting in new graphics cards makes sense, because 10-year-old hardware alone obviously can't run new games.
@@CutTheVioletWire yeah, I hadn't considered the software side of things. I mean, obviously 10-year-old hardware (the GPU) wouldn't be able to run Doom even if there was DX support for it
"There's no doubt whatsoever that high refresh rate gives you a clear competitive advantage and, when you have the FPS to back it up obviously the whole concept works a little bit better."
hey i dont think he mentioned this in the video. we should tell him
"There's no doubt whatsoever that high refresh rate gives you a clear competitive advantage and, when you have the FPS to back it up obviously the whole concept works a little bit better."
lmao yeah he said that more than 2 times
"There's no doubt whatsoever that high refresh rate gives you a clear competitive advantage and, when you have the FPS to back it up obviously the whole concept works a little bit better."
"if you love bottle necks" was expecting an ltt store straight after
@@HelenGPitts STFU you BOT, bet you have a -3 k.d.
@@edragyz8596 lmaooooo
@@edragyz8596 Just report it as 'unwanted commercial content or spam'. Don't bother replying to it.
@@hauntedshadowslegacy2826 forgot you could report comments
Most of the stuttering was probably because you’re running 2 gpus, removing one probably would've mostly fixed it and you would’ve probably gotten more FPS too
I used to have Nvidia driver problems on my old PC. Turned out to be one of the least likely things: Nvidia audio drivers conflicting with Realtek!? Maybe that's why things run better on Vulkan than DirectX?? Ever since then, when I update drivers I do a clean install with the audio drivers ticked off. Seriously, it took me like 2 weeks to find what caused the stutters etc.
Got one gtx1080 with i7 980 4ghz
Smooth even in Assassin's Creed Valhalla
@@jofert015 that's a great card
@@RemoWilliams1227 but the cpu is from 2011
@@phate2706 higher base clocks tend to be better. This turbo/throttling stuff is mid
They posted this yesterday then took it down when I was halfway through it lol
i was lucky and removed it last second
@Zeno they tweeted something about testing new RUclips features so they probably just accidentally pressed publish.
@Zeno definitely was, there was RGB strobing in the room and the lights were going on and off
like 2 weeks ago on Floatplane. so classic
I figured they just messed up it happens ya know
Samsung: “when’s he going to actually showcase the monitor?”
Linus: “can’t wait to break out the Epic after we finish recording”
"POPCNT"
*DEMONETIZED*
I genuinely lol’d
Lol with my E5450 i understood the joke quite well
Too bad
POPCNT = Population Count in SSE4
That was seriously the single most hysterical moment LTT has ever had. I genuinely couldn't help myself from lmao. Cheers, Linus!
"Can you believe the CPU running this is 11 years old"
When your system cost over 5000$, I would expect it to do so
it's over 9000 with these ram sticks
0:22 a cop car went wailing by me exactly as Linus paused and looked around and lemme tell ya it FREAKED me out
0:20
I'm moving my browser window from my lesser display, over to my better one, and Linus just stops and looks up at my nonsense.
it'd be funny if it actually happen
yet yours doesn't
And I'm a jesus
@Salt Salt LOL mad
@Salt Salt HAHAHA YES
"There's no doubt whatsoever that high fps gives you a competitive advantage.."
Yeah I know Lin-
" *There's no doubt whatsoever that high fps gives you a competitive advantage* "
There's no doubt whatsoever that high fps gives you a competitive advantage.
@@CaveyMoth There's no doubt whatsoever that high fps gives you a competitive advantage.
Yeah, I know that *There's no doubt whatsoever that high fps gives you a competitive advantage*
@@hollow8194 Yeah, but *There's no doubt whatsoever that high fps gives you a competitive advantage.*
There's no doubt whatsoever that he read his script twice.
The people who built that mobo were brilliant. It's amazing how well that tech holds up after a decade of modern advancement. 10 years in computer tech is like 50 years of anything else.
Thats a really cool pfp you have there
0:21
My Linus is a moth theory gains more evidence by the day
My respect for Linus shines brighter.
LOL
Omg I'm dead of laughter
Linus: "So let's start by getting our monitor cracked open."
Samsung: "Wait, what?!"
@@blisphul8084 Running a 3D plasma from Samsung from back in the PS3 days, still works fine
@@blisphul8084 How many Samsung products have you owned, and if any how long did they last for you? I have seen several older Samsung products work just fine years later..I have a tv that was one of their first flat screen models and it still works perfectly fine.
0:20 bro that light just overclocked itself
So you’re telling me, if I send in a super old computer, you’ll put 2 titans on it?
Yes please.
Me: Hey I’ve seen this it’s a classic!
A lot of people: How? It just came out.
What's a rerun?
Sebastian Perez Nice somebody knows what I’m referencing.
Chill yep
I know right reruns always bring back good memories
Very dated hardware.
Me, on my eight year old laptop: Sure.
most of my steam library can run on an Athlon II X4 from a system originally built 10 years ago, sitting out in my shed... any system's good enough as long as it plays what you want to play, right?
Nicholas Fabian hear hear, my daily driver is a Dell Latitude E6330 running Arch Linux ☺️
Me on an HP laptop with a 4:3 screen.. yeah it can run New Vegas lol
99 likes on your comment!!!!
Yeah me on my 12 year old laptop: *SURE! VERY DATED INDEED!!!*
11:55 sing it with me
"Doofenshmirtz Evil Incorporated!"
After hours
Doofenshmirtz malvados y asociados
I didnt realise his building was there xD
😂😂
I find it very hilarious how I was just watching this video a minute ago at full screen and got a BSOD right after Linus said "Windows stability is a whole other story" at 5:02. I legit thought this was some kind of editor prank for a sec due to how crazy the timing was, and I'm still laughing about it since I hadn't had any BSODs for a very long time.
"Let's start by getting our monitor cracked open" and with Linus you know that's a 50/50 shot as to what he means...
"Who binds run to Control?!"
THANK YOU.
Me. Im serious. I use shift to prone.
That was so annoying and no one mentioned it even. Lol
Me? What do you use control for? I bet you double tap w.
Before I started playing Source games, I used the "shift to crouch" configuration, but then it got annoying to play Minecraft, so now I only use shift for running. It's better. Make the switch if you haven't.
@@user-qq6si7zv3t I've been playing Minecraft since the alpha days so i'm so used to double-tapping w it's hard to change.
FYI, the "G-Sync" here is adaptive sync/Freesync, not Nvidia's closed hardware implementation. Important distinction, considering one works with AMD and Nvidia whilst the other works exclusively with Nvidia.
This video made me realize what a beast my old rig actually is. With my 24 gigs of ECC DDR3, a Xeon W3550, and a GTX 1060, it still plays everything I've tried without any problems. And yes, the W3550 was released in 2009.
14:17 "Let's see what the experience is supposed to be like"
*gets only 120fps*
bruh
1440p 120 fps. What's he running? A 2060 Super?
That triggered me so hard !
It's a 1080p monitor... I get 1440p 150+ fps on Vega 64...
@@imadecoy. no you don't, i have a i7 9700k and rtx 2080 super and don't get that much on most games at max settings.
@@Zain-tr2vo I guess you've never played Doom then. Moron.
Great to see 32GB of RAM working on this platform! Bit sad about the lack of time for proper overclocking though. For benchmarking in Cinebench R15 I've had my QX9775s at 4.2GHz with about 1.425V, so there was definitely more performance left to be had here. Enabling 'enhanced power slope' would have helped as well. In any case, it's good that newer APIs can now start to favour multithreading rather than singlethreading which really helps when dealing with an architecture of this era. From my personal testing I've seen performance gains up to 50% in Crysis 3 when going from single Core 2 Quad to 2x Core 2 Quad which is substantial.
wow
I think the system needed a bit more FSB instead of multiplier. A single socket-modded X5460 with the FSB at 1600 and clocked to 4GHz goes over 400cb with decent DDR2 RAM.
@@stijnbagin as long as you stay below the max FSB of the 5400 chipset (around 430MHz) it is not relevant to system stability. This was simply a matter of putting some time into overclocking and running a clean version of W10 without stuff in the background. I've reached well over 800 points at 4GHz on my own Skulltrail setups.
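For anyone following the numbers in this thread: on these quad-pumped FSB platforms the core clock is the FSB base clock times the multiplier, and the "1600" figure is the transfer rate (four transfers per base clock, so a 400MHz base). A quick sanity check, with the 10x multiplier as an illustrative assumption:

```python
def core_clock_mhz(fsb_transfer_rate: float, multiplier: float) -> float:
    # Quad-pumped bus: 4 transfers per base clock cycle,
    # so "FSB 1600" means a 400MHz base clock.
    base_clock_mhz = fsb_transfer_rate / 4
    return base_clock_mhz * multiplier

print(core_clock_mhz(1600, 10))    # 4000.0 -> the 4GHz figure above
print(core_clock_mhz(1600, 10.5))  # 4200.0 -> the 4.2GHz QX9775 result
```

This is also why the 430MHz chipset FSB ceiling mentioned above matters: it caps the base clock, so past that point extra speed has to come from the multiplier.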
Linus "Hopefully that will be enough to get it stable"
PC : *instant BSOD*
I've been subbed for over a decade and I'm kind of amazed that you're STILL able to come up with such compelling content. Thanks LTT
When your PC in 2019 is much worse than that one from video...
lol
Big oof.
Tell me brother, I feel your pain, when a consistent 20 fps is a good day.
I assume you're using an hp stream laptop lol
Doom running on these CPUs is more a credit to how great Doom was coded and optimized.
I prefer to take the view its more of an indictment of how badly intel has been shortchanging their customer over the past 11 years
Everyone: uses one CPU
Linus: there is another
"we've got 16 gigs of ram in there"
" b u t I w a n t m o r e e e e e "
>Gsync compatible (so actually freesync)
"So as long as you've got an Nvidia graphics card"
That's market dominance for you..
Yup. Instant video pause, rage, comment scan for camaraderie.
Edit: Oh man, just scanned Samsung's amazon page there. They also NEVER say "Freesync". Entirely "G-Sync Compatible"
@@thomasmcoscar681 that was the part of the deal for gsync compatible label, freesync is gone
@@kumbandit Why didn't they just call it "adaptive refresh" and screw all the trademark/marketing hype? That would have made sense :)
Thomas McOscar would it still work with a FreeSync monitor?
Who saw this yesterday??? :)
I did
Me
I did, said it was private
yup
Me
0:21, "GOD" has entered the chat
Harharhar
hahah wp sir
Linus: "So let's start by getting our monitor cracked open."
Me: *DON'T*
Ram: *ddr2*
Also ram: *32gb in 4 sticks*
Everyone: "That's cheating!"
This is why I liked the times back then when it was the Mobo/chipset that controlled what RAM types you could use. A Q6600 and Q9400 really shined with DDR3 so you could go 1:1 with RAM and use more than 2GB sticks. Imagine buying a 9900k now that you could upgrade to DDR5 when it comes out
It made me cringe when he put the memory in. That board flexed a lot.
@@yourposer after all these years, everybody has to flex their muscles ;)
@@yourposer Motherboards can take more than you would think. I tried breaking a really old one for fun one time before I threw it out, and it was a major challenge.
DDR2 RAM 8 GB modules - oh my. I would kill for that back in those days.
"This is Doom running on a ten year old computer!" So?... Oh, THAT Doom. :D
"mating_call.ogg"
The editors were definitely high !!!
or creative
*are
The mouse issue making it choppy was likely related to a bad Windows update around the end of summer. It affected quite a few games in the first-person shooter genre.
"How to bottleneck your monitor"
Linus: not running well, *20-40 frames*
Me: That’s my computer when playing the dinosaur game...
Lol. That game is terrible.
Just casually watching this video on my profusely sweating ten year old gaming rig. That didn't cost 5k back then.
Some of the old CPUs can still keep up pretty fine. Got a 2500K i5 from 2011. Thing has absolutely no trouble running new games in 1080p resolution.
I’m guessing this was uploaded by accident yesterday then made private 😂
They reset it. I commented on the video last night and even got two like notifications before it was nuked.
I was WATCHING IT before it got made private :(
@@bastaudio I got all the way through it while it was public, even commented on it... There were about 29 comments total at that point but those are all gone now.
I think they deleted it and then reuploaded it today; otherwise it would mess with the algorithm's things
@@robertnees9781 I commented on it and got about five minutes into the video when it got deleted
0:22 **God whispering to Linus that he should get himself a nice slice of pizza after this**
"g-sync compatible" "need a GeForce GPU"
So Jim was right about Freesync dying
Thought the same...
More like being killed by aggressive nvidia policies
Nvidia pulling those strings.
@@pradeepkumar-qo8lu Or just aggressive Nvidia Marketing. Total B.S. on Nvidia's part and I hope there is some decent competition to Nvidia next year so they can pull their socks up.
And not all Nvidia GPUs that use VRR will work on it, whereas every AMD (or Intel) GPU that supports VRR does
"I dont need to explain the advantages of 240hz monitor"
Explains.
Me: Sees the LTT water bottle, he's gonna sa-
Linus: "LTTstore.com"
Me: sees free online video entertainment
Linus: *makes it free*
Instant setup and payoff; it's a great way to advertise, and feels more natural than a full-fledged commercial.
@@tobiwonkanogy2975 not for people who feel entitled and refuse to even allow RUclips ads 😂
Gotta fuel the Lambo somehow.
SLI and bonus points in 2019?
I love Architects
blegh
"So this is the new Wolfenstein II... "
*BSOD*
Yeah that's pretty good.
Linus struggling to play Minecraft gives me life
"Let's start by getting our monitor cracked open" said no one ever
"G-Sync compatible" means FreeSync, so why would you say "as long as you have an Nvidia graphics card"? It will work with any graphics card that supports variable refresh rate (except, of course, the GTX 900, 700, and 600 series cards, which only support classic G-Sync)
Says it’s 10 year old hardware
Uses SLI Titan XPs and a 240Hz display
He means 10-year-old core technology: CPU, motherboard, RAM. He's using/replacing the other components to make sure they don't bottleneck anything else, thus extracting the maximum performance from the old CPUs/motherboard. He is effectively testing the motherboard here.
@@jeffvella9765 this
shut up kyle
There's no doubt whatsoever that a high refresh rate gives you a clear competitive advantage, and when you have the FPS to back it up, obviously the whole concept works a little bit better
"Who binds run to ctrl?"
I feel personally attacked.
@@K0LBIE it's the shift key
Elder In some but not all. Personally, i like control for running
Kolbie Smith I don’t know a single person who uses ctrl as sprint, and can’t think of any recent games that default ctrl to sprint.
Nope, he's right, I personally use the grave key (left of "1")
SHIFT is for RUN, CTRL is for CROUCH
Missed chance to add dubstep and airhorns when the RGB lights came on.
10:00
You got me with that subtitle
That's how Linus is still a virgin
"You can turn on your xhair if you're a big fat cheater"
15:01
15:40
Yeah. You nailed it, Linus!
"That core count looks pretty modern."... This statement is so depressing
Why?
@@Stevenx01 It's more of a jab to Intel for refusing to add more cores until they felt the pressure from AMD.
@@CrimsonRegalia I don't see that level of subtlety... It's just that multicores (4+) became a common thing just recently. AMD or Intel.
@@CrimsonRegalia there was no incentive, they were making butt loads of money off what they had, no need at all to make something new, still really nothing new, just more cores, same old shit otherwise...
@@Jimster481 AMD is still way behind Intel on clock speed and just starting to edge up on IPC...
"The next Video Sponsor is * Google Chrome*. "
You messed that up.. *Google Chrome*
Linus 2029: We OVERCLOCKED this $15k DUAL RTX PC from 2019!!!
Virtual Linus in the Matrix in 2050: We overclocked our Universe.....
@@KrotowX 😂
4:13 Wow, I really liked your cooling solution for the RAM.
New challenge. Every time Linus says “clear competitive advantage” take a shot.
Samsung def made him say that as much as possible
You mean he chose to say it after Samsung requested it. It's also at best a subjective opinion, at worst a straight-up lie. Sellout.
I love how Linus straight up ignores the fact that "G-Sync compatible" is just Freesync that was certified by Nvidia. No word about compatibility with AMD. Nvidias marketing trick seems to have worked quite well.
Hehe, it seems you have not owned a Samsung FreeSync monitor before; they simply don't work with FreeSync most of the time. So the certificate is deffo worth it.
‘This is doom running on a 10 year old computer’
More like old except the GPU
What I'm saying is that the GPU is not old or bad
I can run Doom on a 30 year old computer.
Exactly, a bad test by Linus; he should have used a 10-year-old GPU with that system,
if he has one, to see how well that old system can run today's AAA titles.
The test was ruined when he used new hardware.
4:01 Slowly pushing, bro.
You push so hard, I'm scared you will break the motherboard :D
Thanks for the video, sent from 2020.
Mr Sebastian, how much RAM would you like
Linus: yes
Wrong, the correct answer is: MORE!!!
The Titan XP's aren't 10 years old. Just thought you should know.
0:22 God has blessed this build
Tolis Den Paei Kala or the light 💡 went on inside Linus's head and went ding, "I have an idea" lol 😂
@@jayhollowayii2 Hahahahahaha niceee
Thanks for this! I can tell you had fun, so did I!
"We got a new PC"
Linus: overclock it
Damn, I thought I was hallucinating for a sec yesterday when I was searching for this video but couldn't find it, 'cause I'm so damn sure I noticed Linus uploaded it
he did - it was private tho :/
1:04 wait, where's the intro?
Linus dropped the intro.
So much better imo :D
12:35 Minecraft's way of punishing someone when they say the game isn't realistic.
14:16 look at the reflection