Can confirm the Celeron 300A overclocking. It was my second gaming computer and the first one I built myself. It was also my first foray into overclocking.
Yup, and that small 8086 was a nightmare for memory management because its address space was so small. Intel held off on multicore for consumers because they were pushing it primarily on the server Xeon chips. They held off for a long time until they gave in and released it to the consumer market.
I switched from AMD to Intel in 2006 with Intel's dual core processors, and haven't looked back since, sadly. What killed, or is killing, AMD was the push for GHz and more cores without significantly improving IPC, L1-L3 cache, and memory performance. AMD's shared resources, the now-slow DDR3 standard RAM, slow APU CPU performance, and lack of innovation in new instruction sets have four of their top CPUs being beaten by Intel's dual core solutions in gaming. Application performance is still OK, and I still use AMD 8320-50s for PC builds when not building for gamers. Going forward, after this year I will make the switch to Intel's 6600K CPUs to get some of my clients onto the new DDR4 standard for future proofing.
***** Not bad for typical productivity use, but it bottlenecks professional power users and gamers. I've used AMD's 6300 before, but since late 2013 I just go with either the 8320 or 50 for typical PC build requests with DDR3-1866. For gaming builds, I'm using the i5 4590 S/T for my low end to midrange (4690K) builds. For high end gamer builds, the 5820K or 5930K are my typical picks. I usually talk people down from buying the 5960X and have them invest in higher end video cards (980 - 980 Ti) instead of AMD 290 - Nvidia 970 for $1500-1700 builds.
Despite Intel's overclocked Coppermine 1GHz issues and instability, AMD was the first company to release a stable 1GHz processor, two days before Intel's release. Do your homework, and sorry for my bad English. en.m.wikipedia.org/wiki/List_of_Intel_Pentium_III_microprocessors en.m.wikipedia.org/wiki/List_of_AMD_Athlon_microprocessors
I hope AMD uses HBM as a sort of L4 cache for their Zen chips, would be a blast 'cause RAM's so slooooow compared to every cache on die, and it would be a nice bridge between those technologies...
Idk, but Intel processors are so reliable. Last week, after a sentimental goodbye, I just retired my Intel Pentium D 820 (circa 2005) bundled with the solid Asus P5VD2-X. Up to now it's still working, can still play games and Photoshop 2016, runs well on Win 7 but sluggish on Windows 10, hence the need to upgrade now. But it survived numerous MS Office versions, from Windows XP to Vista etc. I also think it outlived 3 hard disks, 4 video cards, 2-3 RAM (533MHz) changes, and PSU failures. I was a heavily addicted gamer back then, with the explosion of online games, from Ragnarok, MU, Ran, Perfect World, Flyff etc., and used it overnight for school work. My ATX casing remained, and I installed my new rig into it; though rusted, I repainted it, and it still has the vintage 3.5" floppy disk drive. I tell my sons this PC is older than you, so you'd better respect it and take good care of it, but they don't care about PCs anymore, they only know iPads and iPhones.
John Drix Reyes Most chips I've seen were "made" in Malaysia, some in Philippines. The AMD chip shown on the video was made in Malaysia as well. Chip companies put manufacturing plants there. A big portion of hard disk drives is made in Thailand, at least it used to be like that. It's like how most things are made in China these days, while they were designed/engineered elsewhere.
Hard to believe they left out the 486DX2-66 and 486DX4-100, where they first introduced the separation of the front-side bus speed from the core CPU clock speed. That was also the first real consumer overclocking opportunity, since you could raise the bus speed to run your CPU faster!
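The bus/multiplier split mentioned above is simple multiplication: core clock = front-side bus speed × internal multiplier. A quick sketch with illustrative numbers (not taken from any specific board manual):

```python
# Core clock = FSB speed x internal multiplier (the DX2 "clock doubling" idea).
def core_clock(fsb_mhz: float, multiplier: float) -> float:
    return fsb_mhz * multiplier

# A 486DX2-66 ran its 33 MHz bus internally doubled:
print(core_clock(33.0, 2))  # -> 66.0

# Early overclocking: raise the bus speed and the core clock follows.
print(core_clock(40.0, 2))  # -> 80.0
```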
I am disappointed you didn't mention that Pentium 4 (Prescott) ran so hot, they demanded a bed of dwarf gold and had to be returned to fires of Mount Doom when it came time to upgrade.
0:24 The 4004 would not be "uselessly slow"... It could actually still be used for a household appliance, for a scientific calculator, or even a simple industrial robot. _It was _*_never_*_ intended for computers._
He almost skipped the 386 as well. This was the processor that Intel thought was so good that they broke with AMD as "second source" for IBM, and AMD had to go it alone. This is the CPU that introduced the paging-based virtual memory used by every modern OS. The key invention of the 486, I would say, was the clock multiplier.
How about an "As fast as possible..." explaining the difference between 4, 16, 32 and 64 bit processors (and why they're incompatible)? Also why we don't have 128-bit CPUs (yet). That's something I'd love to see.
@Devlover Nibir Then we'd be many, many years behind, just like Intel is still releasing 14nm CPUs five years after its first 14nm release, because it underestimated AMD.
@@chakra_x Yes, the usual dumb comment. I don't have that luxury though, running a server company; you tend to go for whatever is the best option. Intel's history is boring, and I think most people agree: compare this video's views to the AMD video's, and it doesn't even have half as many.
Intel stopped worrying half as much about performance because they started focusing more on power usage. Their central focus these days is achieving the highest efficiency in terms of power consumption that they can while still delivering at least the same processing power as before, which, in combination, makes the processor perform better overall. Additionally, it comes down to not just processor cycles, but also what else the chip can do. Right now, for instance, Intel is focusing on adding high-powered video processing to consumer grade units, with the intent of making budget-model video cards no longer effectively necessary.
+Toxik Gaming Na, he's said a few times on the WAN show that he can't, but he's going to take the opportunity of teaching his son to also teach himself.
Intel is absolute proof that when you have no competitors, you have no reason to have major upgrades. Like it's said in the video, when Amd was a major competitor, both companies were coming out with huge improvements every couple of years. If the new Zen is able to compete with Intel's high end chips, only then will you see another major release from them.
That’s absolute BS. Dennard scaling ended in the early 00’s. That’s why performance growth ground to a halt, and that’s why the Pentium 4 was a bad design in hindsight. Intel made a huge bet on Itanium; that was going to be the desktop 64-bit architecture; they were sure of it for years and years. Pentium 4 was baked in the cake, they couldn’t change it. So during the time AMD was at its most competitive, Intel fucked up on all fronts, and it took half a decade to fix. The fix was to regress to the P6 architecture, which had hibernated in low power laptop applications, and tweak that a bit with multicore.
So is AMD. After winning the Athlon 64 war, they made shit CPUs which overheated, and they had to drop prices like crazy just to stay competitive; a dual core Intel would beat their highest end 8 core one even in MULTICORE. These wars keep on happening; Zen 2 is just another one.
You need to do another correction: the 8086 was already a 16-bit chip (it was the 8088 that had the 8-bit external bus), the 80386 is what moved everything to 32 bits, and 2006 saw both the 32-bit Core Duo and the 64-bit Core 2 Duo, not 2007.
No, it was originally a statement about the number of transistors you could fit on an integrated circuit, or how fast that number grows. It has nothing to do with processors or performance per se. That's just the usual narrow minded recentism.
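As the comment above says, Moore's observation is about transistor count growth. A back-of-envelope sketch of that doubling (real chips only loosely track this curve):

```python
# Transistor count doubling roughly every two years (Moore's observation).
def projected_transistors(start_count, start_year, year, doubling_years=2.0):
    return start_count * 2 ** ((year - start_year) / doubling_years)

# The 4004 (1971) had ~2,300 transistors; a pure-doubling projection to 2000
# lands in the tens of millions, the right ballpark for chips of that era.
print(round(projected_transistors(2_300, 1971, 2000)))
```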
What's the difference between the old i7 (2-and-some-numbers-here) that some people are still using (the i7 that still runs modern games well) and the newer i5/i7 CPUs?
+bastard Mostly? Some changes in IPC (Instructions Per Cycle - how much it can do per MHz), some additions to the instruction set (it can do certain tasks much more efficiently), a notable improvement in energy efficiency, and better graphics performance from the integrated GPU. But comparatively, the difference between the i7-2600K and the i7-6700K is much smaller than the difference between the i7-2600K and the C2Q Q6600, despite the same span of time (~4 years) between their releases. It's kind of as if Intel had looked at the Sandy Bridge generation (Core 2XXX CPUs) and thought 'yep, those are fast enough for anything a consumer could want. Let's concentrate on other things now.'
+Steamrick Has the same happened with i5 processors? I'm thinking of buying an i5 4690K, but I thought it would be outdated already and wouldn't be able to run games properly after a few years.
+bastard As a 4690k user myself, I can say for certain it's still the bleeding edge. I just watched a gaming benchmark comparison (can't remember by who) and it showed the 4xxx CPUs keeping up with and sometimes outperforming the 6xxx CPUs.
mandaloin Have you overclocked? I'm getting a good mobo so I can get a good OC going, and an aftermarket air CPU cooler. How many GHz did you get on your OC?
Actually, the Pentium architecture did not initially include MMX instructions. There were several iterations, and several years after the Pentium was introduced, before MMX arrived. What the Pentium was most famous for was the introduction of superscalar architecture to x86: the ability to execute multiple instructions per clock tick. This was massive; it allowed the Pentium to be nearly 2x as fast per clock tick compared to its predecessor, the 486. It was way ahead of its time, and it took the competitors AMD, Cyrix, and others quite a while to catch up. Also, to add to things you missed or got wrong: the Pentium Pro. This CPU had out-of-order execution, which further enhanced the superscalar ability, and it moved the L2 cache, normally installed on the motherboard, to the CPU itself (albeit on a separate die), where it now ran at the same frequency as the CPU, i.e. 200 MHz vs. 66 MHz on the shared bus.
@@oofboi114 Would not be surprised if Zen 5 is 128 cores/256 threads... with maybe SMT 4 or 8. With enough stacked memory, wide SMT is feasible. So imagine 128 cores × 8 threads.
??? No, because the average joe is just a term for whatever is average. If there are a bunch of rich fucks, then they are the average joe. We would have better computers if there was more competition from AMD. Hell, we'd have WAY better computers if there was an entirely new company that started making a new type of CPU to compete with both Intel and AMD.
Even in 2021, my i7-960 from 10 years ago can still play a lot of modern games. It's not my main system anymore, as it bottlenecked my GPU, but it's still set up in the house for when the kids want to play online stuff. I can't say I was still rocking our early 90s 386 in the year 2000.
What happened at 5:47.18 (approx)? Is this us seeing into the matrix that is Linus Tech Tips?! Or a minor edit glitch? '-) xXx Nice to know I'm not the only one who publishes with (ahem) deliberate mistakes to make sure the end user is watching. Keep up the excellent and very informative videos, and apologies for being that guy that noticed.
Finally someone has mentioned that CPUs have stopped improving! A CPU today in 2016 is only about 3 times faster than a CPU from 2002 for single thread usage, 15 years ago! And many programs are not designed for multi-thread or multi-core use, since it takes more work. In 2002 CPU speed was at 3GHz, and 15 years later in 2016 it is STILL at 3GHz!!! WTF??? Yes, there is more than just clock speed, but it shows how things have stopped improving. In 2002 the NSA mentioned that consumers have TOO MUCH COMPUTING POWER, and that is about when CPUs stopped doubling in speed/power and barely improved at all! Now 6 - 9 % per year at most, that is TERRIBLE!!!!!!

And benchmarks always show multi-core, not single thread, which is the better comparison, since you don't have separate areas of RAM to go with each core, so a 4 core isn't 4 times as fast. It may be much faster at times and no faster at other times if it needs to use the RAM frequently. At least he mentioned it at 6:00, but he should have gone into more detail.

We should have CPUs that run at 384GHz by now, with about 256GB of RAM and a 16 - 32TB hard drive. But RAM and hard drives have also almost stopped improving! In 2007 we had 4 - 8GB of RAM, and most computers now have 8 - 16GB of RAM, 8 years later! It should be about 256GB of RAM! Linus, go into detail about this! And how Intel and other companies have transistors that run at 1000GHz, a TERAHERTZ, but CPUs can't even run 1% of that? 1% would be 10GHz, 10% would be 100GHz; we should be able to have that easily! And with carbon nanotube strips or copper strips, and by spacing the chips and integrating heat pipes, the heat wouldn't be a problem either. It would just slow down if needed, just like now.

And 2TB hard drives have been around since 2009 and are STILL BEING SOLD in most computers!!!!! They should be: 2009, 2TB; 2011, 4TB; 2013, 8TB; 2015, 16TB; 2017, 32TB. So at least 16TB hard drives!!!! And they still sell 1TB hard drives in many computers too!
Today, Aug 2016, Staples sells two computers, for $499 and $699, with a 1TB hard drive and 8GB of RAM! The same as in 2007!!! This is progress? And those are their fastest computers, not the cheapest slow ones either! Even the one for $800 with an i7-6700 has 8GB of RAM and a 1TB hard drive! The same RAM and hard drive size as in 2007!!!!!
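The 384 GHz figure in the comment above is just compound doubling from 3 GHz in 2002. The arithmetic of that (admittedly physics-defying) projection checks out:

```python
# If clocks had kept doubling every ~2 years from 3 GHz in 2002,
# seven doublings would have passed by 2016:
ghz = 3.0
for _ in range(7):
    ghz *= 2
print(ghz)  # -> 384.0
```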
+JJop123 60fps is a huge increase in processing on the camera, in post, and in uploading. I'd prefer more frequent Techquickies over 60fps. Plus, 29.97fps looks nicer and streams much better for content like this. :)
Jarryd Hall Hmmmmm, I don't know about that. Like I said, other tech channels I'm subbed to started going 60fps for similar styles of content and it makes a world of difference. Plus, LMG always show off all their "high end" stuff; if other tech channels with less can do it, I think they are more than capable of doing it.
Correction to 2:25 : Celeron 300A was the first retail CPU with integrated on-die L2 cache (clocked at full speed), not Pentium 3. That's why it was on par with way more expensive Pentium II (its 512KB cache was clocked at half speed).
The Athlon64 x2 was faster than the Pentium dual core (similar to a c2d but with less cache). I had a 3800+ Toledo core (2000MHz) that I ran 24/7 at 2750MHz, barely upping the vcore at all. Generally, the Toledo core seemed to clock higher than the Manchester core x2 by AMD. I think the A64 (single core) from 3000+ (1800MHz) to the 4000+ were introduced in 2006? Had a 939 socket 3500+ 2200MHz which clocked to 3280MHz (WR on air), or at least it used to be when ripping.org worked. Have the cpu-z link of the OC on my channel if anyone is interested.
The Mendocino Celeron and Dixon Pentium II were actually the first Intel processors to have on-die L2 cache. The Katmai Pentium III had half-speed external L2, but the later Coppermine and newer Pentium III’s had on-die L2
What I know for sure is that I went shopping for a computer around the time when the 1Ghz barrier was broken, and those on offer were all AMD. And I foolishly let the salesman talk me into getting a 733 Mhz Pentium instead. But, hey, I was in the eighth grade, and it was my first computer since my 486.
So disappointing the earlier CPUs were just skipped over, no mention of the 8088 or the Z80. I guess it's just too far in the past to get his head around.
Not sure what the edit was exactly, but there's a cut at 5:48 where you can clearly see he was about to say something but it was removed. He "lost" his watch after that cut, so it looks like they re-recorded it. Must have said something incorrect. Oddly enough, he isn't wearing his watch during the sponsor as well.
linuscorrecttips
no
yes
Maybe
+Paul Albao noyesmaybeabit?
Possibly
Linus in 2015 doesn't know what a fantastic processor war we've got in 2021
How did you know that in 2020 ?
@@Heliocentric I know what will happen in 2025
@@jopeckxpress will I ever find love ?
@@Heliocentric Damn yeah!
@@jopeckxpress Will there be CPUs and GPUs on stock?
I just color corrected my P6 chip the other day, now I can see it better when removing it from the computer.
Lol you are a verified YouTuber but you only have 2 likes and no replies
richez explanations!! Are you 8
@@jackargie1092 I'm 13
@@richezexplanations2524 That makes sense.
@@jayfaraday1176 hey don't lump me in with this broken grammar kid
If only 2015 Linus could see what AMD could cook up in 2019.
Or in 2021.
@@randy206 Or in 2022
@@Iwitrag or in 2023
Or in 2024… probably
@@KaRaTeLoRd11PS3 or in 2025 definitely
linus pls make a AMD version of this.
he did. it's on vessel.
+Vinay Vyraveraja I agree
+Scoopta it will be out in like. two days. it's been on vessel for 5
+Theb0red YUSH. Thanks for the info.
Theb0red thanks fam.
I sooo wish AMD's Zen will be competitive. Intel needs a kick up their ass
Lol "I wish the opposition was good, intel's too good"
+MinecraftBysup69 Yeah, believe it or not there are a lot of us waiting for a good reason to upgrade. Intel being the only ones in the high end CPU market is not good for the consumer in any way.
yeah. there really has been no real improvements that makes a good reason to upgrade..
Good console killer cpu tho.
+MinecraftBysup69 You dumb shit. Competition drives down prices and drives forwards innovation.
Can lynda.com overclock my brain to 4.35Ghz?
Only if you upgrade your cooling system, sure it can.
You would become really dumb and slow if your brain worked like that.
Union of Earth Soviet Socialist Republics yeah I’m sure if your brain can store a petabyte of data, 4.83 ghz would be nothing
Your brain is way faster than a CPU. You really underestimate the human brain. And I'm not even talking about the memory!
NetTubeUser r/woooooooosh
what was the correction about?
+Ronnie Slowinski yeah... what was corrected ?
+Ronnie Slowinski The first release claimed that Conroe / the Core 2 Duo had an integrated memory controller, whereas it was Nehalem (the first iX chips) that did that.
+opethfan333 I wish they'd put the correction in the description so you don't have to re-watch the 99.9% you've already seen.
+PrimaPunchy I didn't get a reply to my smart-alec knowitall tweet either, so perhaps they're just keeping hush-hush to avoid drawing attention to it.
+opethfan333 Except for when they put *CORRECTED* In the title
Man, he skipped a lot of my favorite details: 8088, math coprocessors, 286, etc... those were the good old days!
SSE instructions (and their later versions), which were very, very important for floating point math in all sorts of media workloads (music production, video encoding etc)...
Finally. A more accurate description of Moore's law.
A Moore accurate description
Hehe I'll go now
As fast as possible= AFAP
Y E S
@@qwertyiuwg4uwtwthn *_Y E S_*
Y E S
lemme do AFAP fap
Y E S
My Core 2 Quad is still chugging along just fine.
My Q8300 2.5GHz works perfectly with 3.2GHz overclock.
I've never actually seen an Intel CPU made in the last 10 years that's stopped working. Even ones I've given a pretty hard time voltage- and clock-wise have always still run at stock; they may not pull as high an OC as when new, but the drop is not much, and they still stress test fully stable at stock clocks and voltages.
I've got a Q9450 that's still goin'
Got a QX6700 going strong... The 1st Quad Core!!
A few days ago I replaced the Q6600 in my system, as the motherboard was bad, and I figured if I'm replacing components I might as well upgrade. The Q6600 was OC'd to 3.2 GHz, and I have to say it was really hard to let it go. Now I have an i5 that I had to undervolt to get the same temp at boost clock as I had on the overclocked Q6600.
Watching in 2020, AMD finally got competitive again.
They are ferociously digging into Intel once again. Hope they can end up practically killing em in the end
Accessing your home feed from 2022; the world is nuts right now, but the processor world is waawaay more nuts right now than ever previously predicted.
I got deja vu. Where have I seen this before? ^^
The 4 GB physical ram limit didn't last twenty years on the x86, it was lifted with the Pentium Pro nine years later, which had a 36-bit physical address space due to Physical Address Extension.
True, but it was still only 4 GB linear addressing. Just as the 8086/286 only had 64KB linear addressing, despite their 20/24 address lines (via special segment registers).
And no consumer operating system could address that either. Windows 98, for example, only allowed 512 megabytes maximum.
It wasn't until the 64-bit operating systems that we could go beyond 4 gigs (or even 3 gigs, since there was a Windows limit for that too).
And that was in the mid 2000s..
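For anyone curious, the addressing limits this thread covers come straight from the bit widths. A rough sketch of the real-mode segment:offset math and the later limits:

```python
# 8086 real mode: physical address = segment * 16 + offset.
# 16-bit registers, but 20 address lines -> a 1 MB physical space,
# while any single segment only gives 64 KB of linear addressing.
def real_mode_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 20 bits (8086)

print(hex(real_mode_address(0xFFFF, 0x000F)))  # -> 0xfffff (top of 1 MB)

# The later limits discussed above:
print(2**32)  # 4 GB linear addressing (386 and later, 32-bit)
print(2**36)  # 64 GB physical space with PAE (Pentium Pro's 36 bits)
```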
I'd love to see the update on this, with Intel now using all the numbers for their 10th gen CPUs!
1:50 Hmm.. a card with "Philippines" printed on it.. Interesting.. O_O And yeah, I remember my family's old Pentium 4 PC and the good old days of Red Alert 2 and Yuri's Revenge. :)
but everything changed when 2008 attacked
There was NO dispute - AMD was first to 1GHz, and AMD created the 64-bit instruction set.
Intel sponsorship ruins the integrity of this video.
Daddy_Daughter is correct.
Uh, it has nothing to do with fanboyism... it's to do with innovation and engineering. Facts are facts in science and technology.
@@anthonybane1567 So why is it called AMD64?
First FOR INTEL, not the world
Ikr
Would have liked to have seen more discussion of the earlier chips; the 286 and the 486 were omitted. The 486 especially was instrumental in the success of the Win boxes.
1:43 The 80386 wasn't only capable of addressing "system RAM" (whatever that is), but four gigabytes of *_any_* semiconductor memory, any mix of ROM, EPROM, SRAM, DRAM, NOR Flash, etc.
You left out the 8008 and 8080 processors, the instruction set and architecture of both of these can be found buried in the x86 even today.
The early Celeron processors could actually be modified to work on a dual processor motherboard!
teehee 8008ies
AT 0:06, THE LEGENDARY SOUND EFFECT. (FEELS NOSTALGIA.)
Yes
1:34 The 80386 wasn't called "DX" in 1985 (or 1986, when you could actually buy one). "DX" was a much later suffix that was added to the 386 name only when the newer and cheaper version with only 16 bits databus (called "386 SX") was introduced.
That is actually incorrect. The 386 DX came first, the SX variant came later and was a cut down DX chip essentially.
@@johnhpalmer6098 Read again. All I said was that it wasn't *called* "DX" (only 80386) the first years. That was a marketing idea that came much later, when the "SX" variant was introduced.
@@herrbonk3635 True, however, they were the SAME CHIP, UNCHANGED. The DX came first, the SX not until 1988. The SL was the mobile version and was not introduced until 1990. The real difference was the bus: the 386DX had a full 32-bit external data bus, while the SX was cut down to 16 bits (and each paired with its own separately sold coprocessor, the 387DX or 387SX). My Dad bought a Packard Bell desktop in 1991 that was a 386SX/20MHz based machine, our first Windows machine, though it didn't have Windows installed initially, but had DOS.
Again, thanks for making me feel old (j/k)
My first x86 PC was a Tandy 1000 with an 8088. It was a hand-me-down from my dad after he got a Commodore 64. Technically, my first PC was a TI-99/4A.
After the 8088, I got a 286. Then I skipped the 386 and 486 and got a Pentium 75MHz. Upgraded that to a 100MHz. It was then that I performed my first overclock. I ran the 100MHz at 133MHz and never had an issue. Back then, CPU speed was determined by the pattern you set on a block of jumpers, usually a few rows of three pins. The jumpers either connected pins 1-2 or 2-3, and the combo of connected pins across the rows set your clock speed. For the most part, you could never set the jumpers for any speed other than what the CPU was marked as. I actually set the pins to a pattern not shown in the manual. At the time, I never knew I was actually overclocking. I wouldn't truly overclock again until I got my Athlon 800MHz, when I used a pencil (YES, A PENCIL) to connect points on the top of the CPU's PCB. I got it to a stable 900MHz. The points were actually the socket pins that went all the way through. Very similar to the Pentium overclocking, the Athlons had a set of 4 pairs of tiny points on the top of the PCB. The speed was determined by how these were connected. You could use a pencil to connect unconnected points to change the clock speed, but using a rear window defroster repair kit was the best (basically metal paint).
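The jumper block described above is essentially a hardware lookup table. A hypothetical sketch — the pin patterns and table here are invented for illustration, not taken from any real motherboard manual:

```python
# Hypothetical jumper rows: each row connects pins "1-2" or "2-3";
# together they select a bus speed and a multiplier (values made up).
JUMPER_TABLE = {
    ("1-2", "1-2", "1-2"): (50, 1.5),   # -> 75 MHz (Pentium 75 style)
    ("2-3", "1-2", "1-2"): (66, 1.5),   # -> ~100 MHz
    ("2-3", "1-2", "2-3"): (66, 2.0),   # -> ~133 MHz, the "accidental" OC
}

def cpu_speed_mhz(jumpers):
    fsb, multiplier = JUMPER_TABLE[jumpers]
    return fsb * multiplier

# An undocumented pattern could quietly run the chip faster than its label:
print(cpu_speed_mhz(("2-3", "1-2", "2-3")))  # -> 132.0 (~133 MHz)
```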
Do you still like, use this account?
@@Triickld Odd question
@@GiSWiG I know, I'm just trying to really comprehend how old youtube is.
Yeah, once Sandy Bridge came out it's been pretty incremental performance-wise. Kind of a good thing; it doesn't force as many people onto the latest hardware.
Brings back memory lane days with the 386 and 486. My i7 6800K reminds me of the long road of Intel's path. Many missed the chance to bootstrap paper tape to start a minicomputer with a 17.5 MB disk the size of our file cabinet.
5:48 missed frame in editing..
don't worry I still love you, Linus
No love for the 8008, 8080, and 8087?
Kidding of course, this is is a great summary. Good job!
Guys there is a new LinusCatTips video, go watch it now!
Yep
holy shit thanks mate
Pentium Pro had its L2 cache inside the CPU package (on a separate die, part of the Socket 8 design).
I love that not a whole lot has changed in the last couple of years. I am still using my 2012 i5-3570K clocked at 4.2 GHz and there is basically nothing I cannot do, especially for games. Sure, the more demanding applications (because of bad programming, e.g. ARK:SE) do not run perfectly, but they run reliably, and thanks to my GTX 970 also at pretty good graphics settings and performance.
An episode on the technology behind bone conduction headphones would be pretty sweet, and potentially full of innuendo.
I'm not satisfied with the MMX part, and again, no matter what Intel wants people to believe, the K7 was the first 1GHz processor and the Athlon 64 was the first with an integrated memory controller. Why is it so hard to say so?
Don't assume anything about me by my picture or name, but Intel beat AMD by a couple of days with an OC'd chip. AMD did, however, have the first CPU with a base clock of 1GHz+.
Could you tell me your source for this information?
He said it was disputed... Didn't he?
lubu920 Enough Wikipedia mining and it would be fine; give me a sec.
lubu920 OK, I've found that the first was the Athlon 800MHz in late 1999, don't ask...
The Pentium 4 were CPUs? I thought they were space heaters?!
What tha fuck?
@@Ecological_Disaster they ran REALLY hot, up to 100 Celsius on the stock cooler.
1:52 WTF, did I really see that right? My country, which is known to be late to tech in the early days???
It's crazy to think that just 2 short years after this video, AMD came to smoke Intel into a corner they won't escape for years to come
And then 2 years after that, AMD clapped Intel's cheeks like a DK F-smash
Long live Ryzen 3
@@shanez1215 and then 1 year after that, AMD would beat intel further into the dirt with Ryzen 5 just for them to rise from the ashes with 12th gen
@02:15 _The Pentium III introduced speedstep... was the first CPU from Intel to include an on-die L2 cache._
A little correction: in 1995, 4 years earlier than the PIII (even before the PII), the *Pentium Pro* included an on-package Level 2 cache (similar to on-die, as it also ran at the same speed as the processor).
please do AMD Processor Generations
oh i see it thanks
No mention of the 286 or the 486? The Pentium Pro? Or the inclusion of the FPU on the die? Hm...
Corrected? What was wrong with the first video?
The first release claimed that Conroe / the Core 2 Duo had an integrated memory controller, whereas it was Nehalem (the first iX chips) that did that.
Can confirm the Celeron 300A overclocking. It was my second gaming computer and the first one I built myself. It was also my first foray into overclocking.
the 4004 looks SO ADORABLE HOLY SHIT
Yup, and that 8086 was a nightmare for managing memory because the address space was so small.
Intel held off on multicore consumer CPUs because they were pushing multicore primarily on the server Xeon chips. They held out for a long time until they gave in and released it to the consumer market.
I switched from AMD to Intel in 2006 with Intel's dual-core processors, and sadly haven't looked back since. What killed, or is killing, AMD was the push for GHz and more cores without significantly improving IPC, L1-L3 cache, and memory performance. AMD's shared resources, the now-slow DDR3 standard RAM, slow APU CPU performance, and lack of innovation in new instruction sets have four of their top CPUs being beaten by Intel's dual-core solutions in gaming. Application performance is still OK, and I still use AMD 8320-50s for PC builds when not building for gamers. Going forward, after this year I will switch to Intel's 6600K CPUs to get some of my clients on the new DDR4 standard for future-proofing.
+hababacon Exactly like that, but still, IMO their FX-6300 and lower CPUs aren't *that* bad.
***** Not bad for typical productivity use, but it bottlenecks professional power users and gamers. I've used AMD's 6300 before, but since late 2013 I just go with either the 8320 or the 8350 for a typical PC build request with DDR3-1866. For gaming builds, I'm using the i5-4590S/T for my low-end builds and the 4690K for midrange. For high-end gamer builds, the 5820K or 5930K are my typical choices. I usually talk people down from buying the 5960 and have them invest in higher-end video cards (980 - 980 Ti) instead of an AMD 290 or Nvidia 970 for $15-1700 builds.
hababacon Yeah, it's good for budget systems. So you have a little business building PCs? Damn, that's my dream job.
***** I do it on the side; I probably build 5-8 PCs a year. Only built 3 PCs this year, with the 4th pending when the guy has enough money.
***** Yeah, I'm looking at Intel's 6600K paired with DDR4 at 3.2 GHz as my new gaming standard.
Despite the issues and instabilities of Intel's overclocked 1GHz Coppermine, AMD was the first company to release a stable 1GHz processor, two days before Intel's release. Do your homework, and sorry for my bad English.
en.m.wikipedia.org/wiki/List_of_Intel_Pentium_III_microprocessors
en.m.wikipedia.org/wiki/List_of_AMD_Athlon_microprocessors
I hope AMD uses HBM as a sort of L4 cache for their Zen chips; it would be a blast, 'cause RAM's so slooooow compared to every on-die cache, and it would be a nice bridge between these technologies...
5800x3d says yes, kinda :) but years later.
From the description: "[...] and sonic logo are trademarks of Intel Corporation." Wait, what?
they gotta give dem credits to dem trademark owners
+Philip Johansson Here you go! bfy.tw/39q2
+Philip Johansson the "sonic "logo" "is the intel jingle thing
+Philip Johansson
Intel gotta go faaast
SEEEGA
Imagine what specs we would have today if they had 2021 performance back in the 70s.
Well, physics has its (real) laws.
What's with the 'Philippines' label on the first picture of a Pentium at 1:53?
IDK, but Intel processors are so reliable. Last week, after a sentimental goodbye, I retired my Intel Pentium D 820 (circa 2005) bundled with the solid ASUS P5VD2-X. Up to now it's still working and can still run games and Photoshop 2016; it runs well on Win 7 but is sluggish on Windows 10, hence the need to upgrade now. It survived numerous MS Office versions, from Windows XP to Vista, etc. I also think it outlived 3 hard disks, 4 video cards, 2-3 RAM (533MHz) changes, and PSU failures. I was a heavily addicted gamer back then, with the explosion of online games, from Ragnarok, MU, Ran, Perfect World, Flyff, etc., and used it overnight for school work. My ATX case remained, and I installed my new rig in it; though rusted, I repainted it, and it still has the vintage 3.5" floppy disk drive.
I tell my sons this PC is older than you, so you'd better respect it and take good care of it, but they don't care about PCs anymore; they only know iPads and iPhones.
1:52
do i see the word
"philippines"
....what and how
I'm from Bohol btw...
+RJ Cifra AMD Athlon64 from Malaysia :D
John Drix Reyes Most chips I've seen were "made" in Malaysia, some in Philippines. The AMD chip shown on the video was made in Malaysia as well. Chip companies put manufacturing plants there. A big portion of hard disk drives is made in Thailand, at least it used to be like that. It's like how most things are made in China these days, while they were designed/engineered elsewhere.
For cheaper manufacturing.
Hard to believe they left out the 486DX2-66 and 486DX4-100 and 133, where they first introduced the separation of the front-side bus speed from the core CPU clock speed, which was also the first real consumer overclocking opportunity as you could change the FSB multiplier to run your CPU faster!
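The bus/multiplier relationship that comment describes is simple enough to sketch. The helper below is purely illustrative (the ~33.3 MHz bus and the x2/x3 multipliers are the stock DX2-66/DX4-100 settings mentioned above; any other values are just examples):

```python
# Core clock = front-side bus speed x internal multiplier.
# On the 486DX2/DX4 the multiplier was fixed in silicon; later
# boards exposed it via jumpers, which is what made the classic
# FSB/multiplier overclock possible.

def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    """Effective CPU clock for a given bus speed and multiplier."""
    return fsb_mhz * multiplier

# Stock settings from the comment above:
print(round(core_clock_mhz(33.3, 2), 1))  # 486DX2-66  -> 66.6
print(round(core_clock_mhz(33.3, 3), 1))  # 486DX4-100 -> 99.9 (sold as "100")

# A hypothetical FSB overclock: same x2 multiplier, 40 MHz bus:
print(round(core_clock_mhz(40.0, 2), 1))  # -> 80.0
```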
Brings back memories of days of Windows 3.1 :) thank you!
PLEASE DO A HISTORY OF OVERCLOCKING!!!!!
@@leroymanoza What? Early 00's? Oh, I know what you mean, the early 2000s.
the 80386DX66 was still my favorite generation :D
now explain amd's :)
also talk about motorola's and powerpc's
very good video :) history of what we all love
do this for AMD!
I am disappointed you didn't mention that Pentium 4 (Prescott) ran so hot, they demanded a bed of dwarf gold and had to be returned to fires of Mount Doom when it came time to upgrade.
Proud to be Malaysian (I'm Malaysian) that CPUs are made in Malaysia, but the price is still very high......
0:24 The 4004 would not be "uselessly slow"... It could actually still be used for a household appliance, for a scientific calculator, or even a simple industrial robot. _It was _*_never_*_ intended for computers._
the 386 was about 10 years before the pentium, not a few :D
can't believe you skipped the 486 - first processor to have an onboard math co-processor.
Jethro Rose Needed a 486 for DooM II
Azzalack nah, it did RUN on a 386 but was definitely better on a 486...
He almost skipped the 386 as well. This was the processor that Intel thought was so good that they broke with AMD as "second source" for IBM, and AMD had to go it alone. This is also the CPU that introduced the paging-based virtual memory that is used by every modern OS.
The key invention of the 486 I would say was the clock multiplier.
How about an "As fast as possible..." explaining the difference between 4, 16, 32 and 64 bit processors (and why they're incompatible)? Also why we don't have 128-bit CPUs (yet). That's something I'd love to see.
Meh, Intel's history is boring compared to AMD's.
rickster4k If you didn't know, AMD was working as a second source for Intel; they got the x86 instruction set license and then started making their own CPUs.
lil amd basic fanboy lol😍
@@chakra_x what if they like Intel but they just find their history boring?
@Devlover Nibir Then we'd be many, many years behind, just like Intel is still releasing 14nm CPUs 5 years after its first 14nm release, because it underestimated AMD.
@@chakra_x Yes, the usual dumb comment. I don't have that luxury though, running a server company; you tend to go for whatever is the best option. Intel's history is boring, and I think it's obvious most people agree when you compare this video's views to the AMD video's; this one has not even half the views.
Intel stopped worrying half as much about raw performance because they started focusing more on power usage. Their central focus these days is achieving the highest efficiency in power consumption they can while still delivering at least the same processing power as before, which, in combination, makes the processor perform better overall. Additionally, it comes down to not just clock cycles but also what else the chip can do. Right now, for instance, Intel is focusing on adding high-powered video processing to consumer-grade units, with the intent of making budget-model video cards effectively unnecessary.
Just a quick question: can you code at all, Linus? Just curious.
I heard him talking about learning how to code with his son on the Wan show. So probably not.
+Toxik Gaming Nah, he's said a few times on the WAN Show that he can't, but he's going to take the opportunity of teaching his son to also teach himself.
3:22 AMD GPU with an SLI bridge. Nice job, Linus.
Intel is absolute proof that when you have no competitors, you have no reason to make major upgrades. Like it's said in the video, when AMD was a major competitor, both companies were coming out with huge improvements every couple of years. If the new Zen is able to compete with Intel's high-end chips, only then will you see another major release from them.
Sakash52 Zen beat the crap out of Kaby Lake now, and we are going to see the first 6-core i7 after YEARS.
That’s absolute BS. Dennard scaling ended in the early 00’s. That’s why performance growth ground to a halt and that’s why the pentium 4 was a bad design in hindsight. Intel made a huge bet on itanium; that was going to be the desktop 64-bit architecture; they were sure of it for years and years. Pentium 4 was baked in the cake, they couldn’t change it. So during the time AMD was at its most competitive intel fucked up on all fronts and it took half a decade to fix, and the fix was to regress to the P6 architecture which had hibernated in low power laptop applications and tweak that a bit with multicore.
So is AMD. After winning the Athlon 64 war, they made shit CPUs which overheated; they had to drop prices crazily just to stay competitive, and a dual-core Intel would beat their highest-end 8-core one even in MULTICORE. These wars keep on happening; Zen 2 is just another one.
@Sosuke Aizen after like 6 years, only because ryzen happened
you need to do another correction.
The 8086 was a 16-bit chip (the 8088 variant had the 8-bit external bus); it was the 80386 that moved everything to 32 bits.
Also, 2006 marked the release of the 32-bit Core Duo chips; the 64-bit Core 2 Duo chips came out later that same year, in mid-2006.
My sister just said programmers are "pro gamers" facepalm*
Moore's Law actually states that CPU performance doubles roughly every TWO years, not EACH year
en.wikipedia.org/wiki/Moore%27s_law
No, it was originally a statement about the number of transistors you could fit on an integrated circuit, or how fast that number grows. It has nothing to do with processors or performance per se. That's just the usual narrow minded recentism.
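For what it's worth, the transistor-count version of the observation is easy to sketch. The two-year doubling period and the roughly 2,300-transistor 4004 starting point are the commonly cited figures; the projection below is purely illustrative, not a claim about any actual product roadmap:

```python
# Moore's observation as a transistor-count growth model:
# count doubles roughly every `doubling_years` (the rate itself
# was revised over time, from ~1 year in 1965 to ~2 years in 1975).

def transistors_after(years: float, start_count: int,
                      doubling_years: float = 2.0) -> float:
    """Projected transistor count after `years` of doubling."""
    return start_count * 2 ** (years / doubling_years)

# Starting from the Intel 4004's ~2,300 transistors (1971),
# 20 years of two-year doublings gives 2^10 = 1024x:
print(round(transistors_after(20, 2300)))  # -> 2355200, i.e. ~2.4 million
```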
Both Nvidia and Intel need some competition from AMD
Can we get your opinions on how they’ve done?
I think it is safe to say you got what you wanted
You skipped over the 8080 and the 8085 chips, they were the 8 bit data, 16 bit address, 64 kilobyte cpus between the 4040 and the 8086.
What's the difference between the old i7 (2-and-some-numbers-here) that some people are still using (the i7 that still runs modern games well) and the newer i5/i7 CPUs?
+bastard Mostly? Some changes in IPC (Instructions Per Cycle - how much it can do per MHz), some additions to the instruction set (it can do certain tasks much more efficiently), a notable improvement in energy efficiency, and better graphics performance from the integrated GPU.
But comparatively, the difference between the i7-2600K and the i7-6700K is much smaller than the difference between the i7-2600K and the C2Q Q6600, despite the same span of time (~4 years) between their release.
It's kind of as if Intel had looked at the Sandy Bridge generation (core 2XXX CPUs) and thought 'yep, those are fast enough for anything a consumer could want. Let's concentrate on other things now.'
+Steamrick Has the same happened with i5 processors? I'm thinking of buying an i5-4690K, but I thought it would be outdated already and wouldn't be able to run games properly after a few years.
+bastard It has a lower IPC and uses more power, but that's true of pretty much any older CPU.
+bastard As a 4690k user myself, I can say for certain it's still the bleeding edge. I just watched a gaming benchmark comparison (can't remember by who) and it showed the 4xxx CPUs keeping up with and sometimes outperforming the 6xxx CPUs.
mandaloin Have you overclocked? I'm getting a good mobo so I can get a good OC going, plus an aftermarket air CPU cooler. How many GHz did you get on your OC?
Actually, the Pentium architecture did not initially include MMX instructions; there were several iterations, and several years passed after the Pentium was introduced before MMX arrived. What the Pentium was most famous for was the introduction of superscalar architecture to x86: the ability to execute multiple instructions per clock tick. This was massive; it allowed the Pentium to be nearly 2x as fast per clock tick compared to its predecessor, the 486. It was way ahead of its time, and it took the competitors AMD, Cyrix, and others quite a while to catch up. Also, to add to things you missed or got wrong: the Pentium Pro had out-of-order execution, which further enhanced the superscalar ability, and moved the L2 cache normally installed on the motherboard into the CPU package itself (albeit on a separate die), where it now ran at the same frequency as the CPU, i.e. 200MHz vs. 33MHz on the shared bus.
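The "nearly 2x per clock tick" point boils down to the rough relationship throughput ≈ IPC x clock. Here's a toy sketch; the IPC figures (1 for the 486, 2 for the Pentium's dual pipelines) are idealized best-case numbers for illustration, not measured values:

```python
# Idealized throughput model: instructions/sec = IPC x clock.
# The Pentium's U and V pipelines could retire up to 2 simple
# instructions per cycle, vs roughly 1 on the scalar 486.

def mips(ipc: float, clock_mhz: float) -> float:
    """Millions of instructions per second for a given IPC and clock."""
    return ipc * clock_mhz

i486_66 = mips(ipc=1.0, clock_mhz=66)     # scalar pipeline (idealized)
pentium_66 = mips(ipc=2.0, clock_mhz=66)  # superscalar (idealized)

# Same clock, double the per-cycle throughput:
print(pentium_66 / i486_66)  # -> 2.0
```

In practice real code never sustains the ideal IPC (pairing rules, stalls, memory latency), which is why the comment says "nearly" 2x.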
Imagine the processors 10-20years from now😍
In 2020 we have 64 core / 128 thread cpu's from AMD
@@hammerheadcorvette4 true
@@oofboi114 In 2022 AMD have 96 core processors...
@@hammerheadcorvette4 maybe soon the day will come when we get 128 core 256 thread lmao
@@oofboi114 Would not be surprised if Zen 5 is 128 cores/256 threads... With maybe SMID 4 or 9. . . With enough stacked memory, SIMD 9 is capable. So imagine 128 cores *9 for threads.
Even I, who was born too soon, understood the information. He put it into clear layman's terms; thank you.
I wish the average Joe didn't exist; we would have better computers these days if it wasn't for them.
??? No, because the average Joe is just a term for whatever is average. If there were a bunch of rich fucks, then they'd be the average Joe. We would have better computers if there was more competition from AMD. Hell, we'd have WAY better computers if there was an entirely new company making a new type of CPU to compete with both Intel and AMD.
Dude you nailed it!
Sean Ramey True, I really kind of want a company to rival both of them
Cyrix did this in the mid-90s.
You wish poor people didn't exist?
Even in 2021, my i7-960 from 10 years ago can still play a lot of modern games. It's not my main system anymore, as it bottlenecked my GPU, but it's still set up in the house for when the kids want to play online stuff. I can't say I was still rocking our early-90s 386 in the year 2000.
Ah yes, the 1970s... back when Intel manufactured microprocessors. Today they are in the space heater industry.
especially since they mostly market for overclocking
What happened at 5:47.18 (approx)? Is this us seeing into the matrix that is Linus Tech Tips, or a minor edit glitch? ;-) Nice to know I'm not the only one who publishes with (ahem) deliberate mistakes to make sure the end user is watching.
Keep up the excellent and very informative videos, and apologies for being that guy that noticed.
Finally someone has mentioned that CPUs have stopped improving! A CPU today in 2016 is only about 3 times faster for single-thread usage than a CPU from 2002, 15 years ago!
And many programs are not designed for multi thread, or multi core since it takes more work.
And in 2002, CPU speed was at 3GHz, and 15 years later in 2016 it is STILL at 3GHz!!! WTF???
Yes, there is more to it than just clock speed, but it shows how things have stopped improving. In 2002 the NSA mentioned that consumers have TOO MUCH COMPUTING POWER, and that is about when CPUs stopped doubling in speed/power and barely improved at all!
Now 6 - 9 % per year at most, that is TERRIBLE !!!!!!
And benchmarks always show multi-core, not single-thread, which is the better comparison, since you don't have separate areas of RAM to go with each core, so a 4-core isn't 4 times as fast. It may be much faster at times and no faster at other times, if it needs to use the RAM frequently.
At least he mentioned it at 6:00, but he should have gone into more detail.
We should have CPUs that run at 384GHz by now, with about 256GB of RAM and a 16 - 32TB hard drive. But RAM and hard drives have also almost stopped improving !
In 2007 we had 4 - 8GB of RAM and most computers now have 8 - 16GB of RAM, 8 years later ! It should be about 256GB of RAM !
Linus, go into detail about this! And how Intel and other companies have transistors that switch at 1000GHz, a TERAHERTZ, but CPUs can't even run at 1% of that? 1% would be 10GHz, 10% would be 100GHz; we should be able to have that easily!
And with carbon nanotube strips or copper strips, plus spacing out the chips and integrating heat pipes, heat wouldn't be a problem either. It would just slow down if needed, just like now.
And 2TB hard drives have been around since 2009 and are STILL BEING SOLD in most computers!!!!! They should have gone: 2009, 2TB; 2011, 4TB; 2013, 8TB; 2015, 16TB; 2017, 32TB.
So at least 16TB hard drives !!!!
And they still sell 1TB hard drives in many computers too !
Today, Aug 2016, Staples sells two computers, for $499 and $699, with a 1TB hard drive and 8GB of RAM! The same as 8 years ago in 2007!!!
This is progress? And these are their fastest computers, not the cheapest slow ones either!
And even the one for $800 with an i7-6700 has 8GB of RAM and a 1TB hard drive!
The same RAM and hard drive size as in 2007 !!!!!
James, you are aware that hard drives have simply reached their physical limits, don't you? Linus even has an episode on it.
The main thing is also that there is currently no need for 256GB of RAM or 32TB hard drives, you dumbass.
10-15% HAH more like 0-8% these days.
I really wish they'd start recording in 60fps... like some of the other tech channels I'm subbed to. It looks like crap at 30 :/
It's more cinematic
+JJop123 60fps is a huge increase in processing on the camera, in post, and in uploading. I'd prefer more frequent Techquickies over 60fps. Plus, 29.97fps looks nicer and streams much better for content like this. :)
Jarryd Hall
Hmmmm, I don't know about that. Like I said, other tech channels I'm subbed to started going 60fps for similar styles of content and it makes a world of difference. Plus, LMG is always showing off all their "high end" stuff; if other tech channels with less can do it, I think they are more than capable of it.
+jesse reed now u sound like ubisoft
I believe they did test it out, but I forgot the videos :/
And then 2017 arrived; now we have 4-core i3s, and Pentiums are just dual-core i3s.
AMD was first to 1GHz. Boo intel
@techquickie there needs to be a video on the best portable solar panel chargers out there!!
WAY WAY BACK IN THE 1980S
Correction to 2:25 : Celeron 300A was the first retail CPU with integrated on-die L2 cache (clocked at full speed), not Pentium 3. That's why it was on par with way more expensive Pentium II (its 512KB cache was clocked at half speed).
Man, this is outdated, the whole part about AMD being uncompetitive.
The Athlon64 x2 was faster than the Pentium dual core (similar to a c2d but with less cache).
I had a 3800+ Toledo core (2000MHz) that I ran 24/7 at 2750MHz, barely upping the vcore at all.
Generally, the Toledo core seemed to clock higher than the Manchester core x2 by AMD.
I think the A64 (single core) from the 3000+ (1800MHz) up to the 4000+ was introduced in 2006? I had a Socket 939 3500+ at 2200MHz which clocked to 3280MHz (WR on air), or at least it used to be when ripping.org worked.
Have the cpu-z link of the OC on my channel if anyone is interested.
The Mendocino Celeron and Dixon Pentium II were actually the first Intel processors to have on-die L2 cache. The Katmai Pentium III had half-speed external L2, but the later Coppermine and newer Pentium III’s had on-die L2
I thought it was the PPro that first had on-die L2.
What I know for sure is that I went shopping for a computer around the time when the 1Ghz barrier was broken, and those on offer were all AMD.
And I foolishly let the salesman talk me into getting a 733 Mhz Pentium instead. But, hey, I was in the eighth grade, and it was my first computer since my 486.
Cyrix had x86 compatible CPUs with onboard graphics years ago, the MediaGX.
So disappointing that the earlier CPUs were just skipped over, with no mention of the 8088 or the Z80. I guess it's just too far in the past for him to get his head around.
Wow didn't realize I got here exactly as this was uploaded
No 8088 love? I thought you would give it an honorable mention, as it was the CPU in the first IBM PC.
In 2020... It's the Core Wars ! AMD has a 64 core 128 thread cpu
Not sure what the edit was exactly, but there's a cut at 5:48 where you can clearly see he was about to say something but it was removed. He "lost" his watch after that cut, so it looks like they re-recorded it. Must have said something incorrect. Oddly enough, he isn't wearing his watch during the sponsor as well.
Insane I woke up and dreamed about the intro music lol