Why Moore’s Law Matters

  • Published: 20 Sep 2024
  • I finished this video before the passing of Gordon Moore. Moving it up now. RIP to a legend
    Links:
    - The Asianometry Newsletter: asianometry.com
    - Patreon: / asianometry
    - Twitter: / asianometry

Comments • 441

  • @Asianometry
    @Asianometry  1 year ago +1363

    RIP, Gordon Moore.

    • @ggboss8502
      @ggboss8502 1 year ago +52

      And his law too

    • @nomadhgnis9425
      @nomadhgnis9425 1 year ago +6

      Whatever happened to Wang Computer Corporation? I remember my school having one: the ones with those little monitors and a large system unit.

    • @vrclckd-zz3pv
      @vrclckd-zz3pv 1 year ago +9

      This is how I found out

    • @dongshengdi773
      @dongshengdi773 1 year ago +10

      @@ggboss8502 only God's Law exists

    • @dougdimmadoodahdaay7887
      @dougdimmadoodahdaay7887 1 year ago +2

      an omen

  • @AlanTheBeast100
    @AlanTheBeast100 1 year ago +657

    "...a lucky guess that got a lot more publicity than it deserved."
    - Gordon Moore.

    • @matsv201
      @matsv201 1 year ago +9

      Well, with a span between 1 year and 2 years it was a pretty wide guess. Also, they did throttle their development during periods of the 80s and 90s. Basically cheating.

    • @AlanTheBeast100
      @AlanTheBeast100 1 year ago +24

      @@matsv201 It was a very good guess IMO.

    • @Connection-Lost
      @Connection-Lost 1 year ago +2

      @@matsv201 Their*

    • @Connection-Lost
      @Connection-Lost 1 year ago

      @@AlanTheBeast100 You not correcting him means you must be low IQ as well

    • @AlanTheBeast100
      @AlanTheBeast100 1 year ago +3

      @@Connection-Lost No, it means I know that a prediction based on the very small data set Moore had at the beginning has stood the test of time well. It's never going to be perfect. Indeed, look at the graph in the video ( @21:30 ): a pretty straight line on a log graph? Hmm? You do _know_ what that means, right? I mean you passed grade 9 math? Right?
      So, a little algebra:
      1) 1971: about 2,500 transistors
      2) 2020: about 35B transistors
      Based on that, if the number of transistors goes up 1.4x every year (on average), then you get close to the 2020 count.
      Of course you can adjust the limits as you will and get slightly different numbers. For example, if I make the end number 50B, then the factor would be 1.41 every year.
      So before accusing people of a low IQ, maybe you should do some basic high school math first and see if your own IQ is up to a level from which to throw cheap shots.
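The growth-factor arithmetic in the comment above can be sanity-checked in a few lines of Python (the 2,500 and 35B transistor counts are the comment's own round figures, not exact product specs):

```python
def annual_factor(start_count, end_count, years):
    """Average yearly multiplier implied by smooth exponential growth."""
    return (end_count / start_count) ** (1 / years)

# Intel 4004 era (1971, ~2,500 transistors) to a ~35B-transistor chip in 2020
print(round(annual_factor(2_500, 35e9, 2020 - 1971), 2))  # 1.4
# Moving the 2020 endpoint to 50B only nudges the factor, as the comment notes
print(round(annual_factor(2_500, 50e9, 2020 - 1971), 2))  # 1.41
```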

  • @tomhalla426
    @tomhalla426 1 year ago +138

    Another issue is that Moore’s Law only applies to microchips. Some politicians act as if similar advances apply to solar cells or batteries.

    • @PainterVierax
      @PainterVierax 1 year ago +18

      Yet it barely applies to integrated circuits, as most of the improvement during the last decades has come from the design of the chips/systems and from smarter, finer-tuned algorithms. And the raw computational power gain has mainly been used to ease software programming and portability (e.g. cache levels, multicore, SMT, interposers, ASICs, DSPs, FPGAs, GPGPU, higher-level compiled programming languages, HALs, APIs, a metric ton of interpreted languages, and cloud/server/Web-based apps).

    • @julioguardado
      @julioguardado 1 year ago +3

      One of my pet peeves. There will be some spillover of 300mm wafers to solar cells, which run mostly on 200mm because the equipment is cheaper and often second-hand. The switchover to 300mm claimed a cost advantage based on the wafer size ratio. The same should apply to solar, but I am not a solar manufacturing guy. The same could apply to LEDs, perhaps...

    • @tomhalla426
      @tomhalla426 1 year ago +6

      @@julioguardado My understanding is that solar cells are already at a high percentage of their theoretical performance (as are windmills), so only more efficient manufacture is possible.

    • @julioguardado
      @julioguardado 1 year ago +2

      @@tomhalla426 Same here. Polysilicon is king and its efficiency hasn't changed from around 20% iirc. They're still looking for that high efficiency material that can be manufactured cheaply. Don't see any breakthroughs there.

  • @Palmit_
    @Palmit_ 1 year ago +229

    A forecast of a decade, based on a few years' data, made just to fit sales targets... is now more than legendary. Mythological, as it were. Amazing. Thanks Jon. Really interesting stuff.

  • @chengong388
    @chengong388 1 year ago +467

    Camera lenses got better because designers could simply let the software run random variations of the lens over and over again to find a better design; the more compute power, the more variations you can try, and the more likely you are to find a better design. Modern smartphone camera lenses are so ridiculously complex: each element basically has a radial wavy surface that makes no sense, but somehow focuses the light just the right way at the end, with minimal aberrations.
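As a toy illustration of the "run random variations" idea in the comment above (not real lens design: the merit function here is a made-up stand-in for a ray-traced aberration sum), a random-variation search keeps any perturbation that lowers the error:

```python
import random

random.seed(42)  # reproducible sketch

def merit(params):
    # Stand-in for an optical merit function (lower is better);
    # real lens software would ray-trace and sum aberrations instead.
    return sum(p * p for p in params)

# Start from a random "lens prescription" of five surface parameters
params = [random.uniform(-1, 1) for _ in range(5)]
best = start = merit(params)

# More compute = more random variations tried = better odds of improvement
for _ in range(10_000):
    trial = [p + random.gauss(0, 0.05) for p in params]
    score = merit(trial)
    if score < best:  # keep a variation only if it lowers the merit function
        params, best = trial, score

print(best < start)  # True: the search only ever accepts improvements
```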

    • @WaterZer0
      @WaterZer0 1 year ago +105

      Ah the ole brute force approach.

    • @kylinblue
      @kylinblue 1 year ago +13

      Could you show us an example?

    • @seventhtenth
      @seventhtenth 1 year ago +81

      Not entirely true; a lot is materials science and a lot is microfabrication costs

    • @Laundry_Hamper
      @Laundry_Hamper 1 year ago +58

      Also important is no longer needing to produce a geometrically correct image at the focal plane. Digital image corrections allow you to trade distortion for sharpness

    • @musaran2
      @musaran2 1 year ago

      @@kylinblue Search "polynomial optics" and get ready for a headache.

  • @antman7673
    @antman7673 1 year ago +68

    Hearing the story of Moore's Law, it is a self-fulfilling prophecy:
    We wouldn't have the current level of tech without the ambitions of Moore's Law.

  • @bloqk16
    @bloqk16 1 year ago +3

    I recall back in the 1990s [US] when, for PCs, Moore's Law was entering the lexicon of the engineering professionals who used PCs in their work, as the rapidly growing power of succeeding Pentium chips was rendering 18-month-old PCs obsolete. It was an amazing era, with the processing power of PCs increasing on an annual basis.
    The engineering company I worked at, as a means to unload the _obsolete_ Pentium PCs it had [less than two years old], was selling them to employees for around $100 [US]. Yet those Pentium PCs had been purchased for around $2.5K each, when new, two years prior.

  • @hushedupmakiki
    @hushedupmakiki 1 year ago +63

    21:00 - when mythology is so ingrained you create a global industry of adherents. Gordon Moore's passing really struck something in us, even in very adjacent semiconductor research/industries.

  • @jordanwalsh1691
    @jordanwalsh1691 1 year ago +138

    Really interesting video. One small quibble with 2:19 - 2:49, "That 'roughly' is doing some serious heavy lifting": not really, in my opinion. As you point out, 50*2^10 is only 51,200, but the exponent is so large that you only need to increase the base by 2.5% to 2.05 to make up the difference: 50*2.05^10 = 65,540.
    Personally I think 2.05 falls comfortably within the neighbourhood of "roughly" 2.
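The quibble's numbers check out in a quick sketch (65,000 is assumed here as the ballpark target being matched, per the comment's "make up the difference"):

```python
print(50 * 2 ** 10)            # 51200: "roughly 2" undershoots 65,000
print(round(50 * 2.05 ** 10))  # 65540: base 2.05 lands almost exactly
# The base that hits 65,000 on the nose after ten doublings:
print(round((65_000 / 50) ** (1 / 10), 3))  # 2.048
```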

    • @woofcaptain8212
      @woofcaptain8212 1 year ago +2

      That was my thought

    • @spodule6000
      @spodule6000 1 year ago +1

      I came here to make that comment. Thanks for saving me the trouble!

  • @sambojinbojin-sam6550
    @sambojinbojin-sam6550 1 year ago +20

    Thanks for telling us Moore's Lore, not just the over-quoted "Law".

  • @afterSt0rm
    @afterSt0rm 1 year ago +85

    Well, I'll take advantage of the fact that creators generally see the initial comments to thank you a lot for the amazing content you've been putting out. I wish you only the best! Hugs from (just) another one of your Brazilian viewers ❤

    • @gordonfreeman9965
      @gordonfreeman9965 1 year ago +1

      Nice to see that I'm not the only Brazilian that knows this amazing channel kkkkkkkkkk

    • @OgbondSandvol
      @OgbondSandvol 1 year ago +1

      @@gordonfreeman9965 Now there are three of us.

    • @DanielLavedoniodeLima_DLL
      @DanielLavedoniodeLima_DLL 1 year ago +3

      It was nice to see as well that a Brazilian was involved in the last paper that he presented in the video

    • @zerotwo7319
      @zerotwo7319 1 year ago

      GG izi shrink brazil to a micro size so we can be more efficient also. Fit more brazils inside brazil

  • @mrrolandlawrence
    @mrrolandlawrence 1 year ago +59

    rip GM. a real visionary & titan of semiconductors!

    • @chrisbova9686
      @chrisbova9686 1 year ago +1

      Humanity would be immeasurably better off without tech, or those who would enrich themselves from the death of humanity.

    • @Vysair
      @Vysair 1 year ago

      @@chrisbova9686 you mean extinction? Without tech, you are back to middle age/caveman

    • @---------c5741
      @---------c5741 1 year ago +10

      @@chrisbova9686 ironic u need to spread this wisdom using technology 😅

    • @chrisbova9686
      @chrisbova9686 1 year ago

      @@---------c5741 indeed. Smoke signals aren't dependable, but won't ruin the entire life experience.

    • @wyw201
      @wyw201 1 year ago

      @@chrisbova9686 Wouldn't you say the transistor is one of mankind's greatest inventions?

  • @covert0overt_810
    @covert0overt_810 1 year ago +15

    We need Moore transistors….

  • @johnhorner5711
    @johnhorner5711 1 year ago +27

    Thank you for yet another educational video. The timing of its release is uncanny. RIP Gordon Moore indeed. A parallel technology trend which doesn't get as much attention is magnetic storage (disk drives). It has been on a similar trajectory as integrated circuits, and has been every bit as important to the development of technology. Data storage is now so plentiful and cheap that we don't even think about it. YouTube allows anyone to upload unlimited video content for the world to watch. That is thanks to the magnetic storage revolution. Maybe you could do a video on that topic at some point.

    • @InvictraX
      @InvictraX 1 year ago

      I did notice a big gap in the development of data storage. Capacity growth has slowed down dramatically.

  • @hareTom
    @hareTom 1 year ago +20

    Good info.
    However, I just want to point out that "DRAM" should be pronounced "DEE-RAM",
    like SRAM is "S-RAM".
    The same logic should apply when you pronounce the word.

    • @matthiaskamm
      @matthiaskamm 1 year ago +7

      yes please, I cringe every time I hear "dram" instead of "Dee-ram" :-) Great video.

    • @shanent5793
      @shanent5793 1 year ago

      Whose prescription was that?

    • @seanwieland9763
      @seanwieland9763 1 year ago +1

      Same with people who say “oh-led” instead of “O-L-E-D”.

    • @juhotuho10
      @juhotuho10 1 year ago +2

      Not the first time he's done it; most likely he's saying it like this intentionally.
      Don't know why, though.

  • @niosanfrancisco
    @niosanfrancisco 1 year ago +29

    Excellent presentation. RIP Dr. Moore.

  • @ColeL88
    @ColeL88 1 year ago +4

    Was a nice surprise to see a picture of mine used as the thumbnail and another used part way through the video!
    Also thanks for listing the source :D

  • @bgop346
    @bgop346 1 year ago +5

    Good video on the history. I think this is also one of my favourite quotes, from "The Intel Trinity":
    "Gordon more than anyone else understood that it wasn't really a law in the sense that its fulfillment over the years was inevitable, but rather that it was a unique cultural contract made between the semiconductor industry and the rest of the world to double chip performance every couple of years and thus usher in an era of continuous, rapid technological innovation and the life-changing products that innovation produced.
    That much was understood pretty quickly by everyone in the electronics industry, and it wasn't long before most tech companies were designing their future products in anticipation of the future chip generations promised by Moore's Law. But what Gordon Moore understood before and better than anyone was that his law was also an incredibly powerful business strategy. As long as Intel made the law the heart of its business model, as long as it made the predictions of the law its polestar, and as long as it never, ever let itself fall behind the pace of the law, the company would be unstoppable. As Gordon would have put it, Moore's Law was like the speed of light. It was an upper boundary. If you tried to exceed its pace, as Gene Amdahl did at Trilogy, your wings would fall off. Conversely, if you fell off the law's pace, you quickly drew a swarm of competitors. But if you could stay in the groove, as Intel did for forty years, you were uncatchable."

  • @Freak80MC
    @Freak80MC 1 year ago +17

    Listening to this, almost makes me wonder if Moore's Law was more of a self-fulfilling prophecy. Something that motivated people to push harder for technological advancement, which ended up making it come true.

    • @m_sedziwoj
      @m_sedziwoj 1 year ago

      Look at Ray Kurzweil's predictions; they are more interesting than the limited Moore's Law.

  • @StevieFQ
    @StevieFQ 1 year ago +7

    I would never argue that we don't need improvements in compute performance, but you can make the related statement that improvements in computing power (along with a seemingly never-ending thirst for more SW developers) have led to less efficient SW being developed, rather than SW that takes adequate advantage of compute performance.

    • @PainterVierax
      @PainterVierax 1 year ago +5

      True. But this lack of code efficiency also comes with many advantages, like ease of writing, prototyping, debugging, reviewing, correcting, improving, porting, or even installing programs. All of the heavy lifting is done by a few software building blocks now (compilers, interpreters, OSes, HALs, APIs, game engines, Web browsers).
      Even in embedded, it has become far more practical to restrict ASM or RTOS usage to where it's imperatively required.

    • @son_guhun
      @son_guhun 1 year ago

      This claim is sort of absurd on its face. If it were more profitable to produce more efficient software, then that's what companies would make. However, the increasing complexity of business domains and infrastructure, and the sheer number of different platforms a piece of code must be compatible with, make it extremely inefficient (in terms of development costs) to attempt to squeeze every last bit of performance from a chip by writing code in low-level languages.
      Simply put, there's nothing stopping you from putting out highly optimized software today. But you would simply get out-competed unless you were working in a very specific domain or platform. So it's not that powerful hardware leads to less efficient software, but that less efficient software is usually more competitively produced and priced. Hardware performance simply dictates the minimum point at which software is not usable due to its inefficiency. For most applications, "adequate" advantage of compute performance IS the ability to produce less efficient software that is still usable, because this means you can produce MORE software or tackle problems that are more complex.
      If a browser already opens a webpage in less than 2 seconds, nobody realistically needs it to be faster. It (the browser or the webpage) just needs better features, bugfixes, or increased stability. Or maybe just less costly maintenance.

    • @PainterVierax
      @PainterVierax 1 year ago +2

      @@son_guhun It still really depends on the application.
      Sure, Linux got rid of its old ASM pieces to be plain C (and now Rust), but in the meantime Android mostly got rid of the Java runtime to compile software during install like BSDs do, despite the vast improvement of ARM SoCs.
      Similarly, developing production software for microcontrollers is not done with extremely inefficient/portable code like MicroPython or Arduino. And sometimes ASM is still used for critical timing functions.
      Same thing with desktop applications: developing small tools and games with high-level languages and APIs can be done without taxing too much of the increasing computing resources (even on laptops and embedded), but that's never how AAA games or production applications that want to use every available resource to speed up execution are developed.

  • @RoyAntaw
    @RoyAntaw 1 year ago +15

    Gordon Moore, a true pioneer. RIP.

  • @AlokRayadurga
    @AlokRayadurga 1 year ago +4

    Over the past weekend, I've been thinking of Moore, the traitorous 8, the last AMD video that you had out recently, and the impact these men made on the industry. Thanks for this video (or tribute?)

  • @stephanhart9941
    @stephanhart9941 1 year ago +5

    The Broken Silicon episode you were on was 🔥!!!

  • @ciCCapROSTi
    @ciCCapROSTi 1 year ago +5

    I fucking love how this guy does not separate jokes and information. You have to actually pay attention to distinguish.

  • @tonyv8925
    @tonyv8925 1 year ago +9

    Wow, incredible history. I remember my first computer, a VIC-20, then upgrading to the C64, then the 8086. The first computer I programmed was with Hollerith cards on a Univac that used magnetic core memory and large magnetic tape drums. It was a simple employee hours/wages program. So many things have changed since then. My cell phone has more computing power than the biggest computer our local college had at that time. Amazing!

    • @milantrcka121
      @milantrcka121 1 year ago

      Which Univac? The 1108? Yes, those were the days of dropped card stacks...

    • @Agent-ie3uv
      @Agent-ie3uv 1 year ago +1

      Obviously a grandma, but then where on earth did the notion come from that boomers can't use computers? 🤔🧐

  • @acolyte1951
    @acolyte1951 1 year ago +2

    I appreciate your take on what you called technological nihilism: that improving the speed (among other things) of electronics is a good and necessary thing. Even if the average consumer doesn't see it, the track record of these developments has indeed transformed the lives of many humans. Seemingly, for the better.

  • @lucidmoses
    @lucidmoses 1 year ago +3

    And here I thought this was going to be some nonsense about how accurate it's been and how it will never end. Nice that you actually looked up the info first. Nicely done.

  • @stachowi
    @stachowi 1 year ago +23

    How do you pump out so much great content?

  • @nickbensema3045
    @nickbensema3045 1 year ago +4

    A few years ago I saw a Philosophy Tube video in which, for the first time, I heard Moore's Law referred to as a marketing term, as opposed to a venerable guideline for technological progress. This video illustrates that the label wasn't just left-wing cynicism, but kind of accurate: it demonstrably instilled a confidence in progress that drove sales.

  • @rollinwithunclepete824
    @rollinwithunclepete824 1 year ago +15

    Excellent video, Jon! Among other things, it explains why I was never sure what Moore's Law predicted. Was it a doubling every year, every 2 years, every 18 months? The thing itself, Moore's Law, has been redefined to fit the data: the doubling over a span of time.

  • @samueltan9279
    @samueltan9279 1 year ago +4

    So basically Moore's Law was a self-fulfilling prophecy. The industry followed it not because of some universal physical law, but because everyone in the industry tried their best to follow it, for various reasons.

  • @1998awest
    @1998awest 1 year ago +15

    Awesome video, Jon. I worked on Intel's 65nm node - you got most of the products of the time (Cedarmill, Yonah, Tukwila). Great summary of the main drivers keeping Moore's Law alive (advancements in design and lithography).
    From 2000-2010, Intel's biggest worry was being categorized as a monopoly and broken up. Since then, Intel has lost its once-massive competitive advantage and been surpassed by TSMC and Samsung. The company was a juggernaut with Moore, but after his retirement, hubris crept into the company culture and mismanagement became the norm, IMO.

    • @razorbackroar
      @razorbackroar 1 year ago +4

      Sick dude that's awesome we on 4nm now

    • @m_sedziwoj
      @m_sedziwoj 1 year ago

      @@razorbackroar Is anybody naming their node 4nm? Intel 4, TSMC N4... maybe only Samsung, but they're lying anyway, so who cares.

    • @大砲はピュ
      @大砲はピュ 1 year ago

      @@m_sedziwoj tsmc on 2nm soon bruh

    • @rkan2
      @rkan2 1 year ago

      @@大砲はピュ Even TSMC doesn't call it "nm"...

  • @thepenguin11
    @thepenguin11 1 year ago +5

    I disagree that regular people do not really care about it anymore. Software expects systems to advance. Take the web alone: try using the web on a 10-year-old device and you will rage like crazy at how slow everything is. Productivity goes up as well with more powerful units.

    • @WaterZer0
      @WaterZer0 1 year ago +1

      That's because programmers are less and less efficient.

    • @thepenguin11
      @thepenguin11 1 year ago +5

      @@WaterZer0 Clearly you don't know how programming works, then. The same software these days is way more efficient than old software doing the same task on the same machines. Software, like cars, advances and needs more supporting functions to achieve more efficient and more advanced processes. Badly optimized things these days are, in the majority of cases, not the fault of programmers, but the fault of leadership, who push unrealistic time frames and cost constraints.

    • @WaterZer0
      @WaterZer0 1 year ago +1

      @@thepenguin11 capitalism bad? loud and clear

  • @jaymacpherson8167
    @jaymacpherson8167 1 year ago +8

    FYI…NOAA is typically called “no-ah.”
    Thank you for great documentation that Moore’s Law is passé.

    • @EndOfLineTech
      @EndOfLineTech 1 year ago

      FOR YOUR INFORMATION you don’t need to be a dick

  • @evinoshima9923
    @evinoshima9923 1 year ago +2

    I remember in the late 70s in Manila we were making a K&S 478 add-on that turned that manual wire bonder into one controlled by a microprocessor. Zilog, AMD, and Intel were our customers... what an amazing time... our computer had no monitor, used a trackball with a billiard ball in it, and had 12 KB of RAM!

  • @samueladams2340
    @samueladams2340 1 year ago +2

    Always look forward to your videos. Well researched and in depth.

  • @Nor-tc8vz
    @Nor-tc8vz 1 year ago +5

    Moore's law is dead, long live Gordon Moore.

  • @matthewvenn
    @matthewvenn 1 year ago +3

    Another great video, Jon! The last few minutes were especially interesting for me - I'll read that paper. I'd like to know what other industries rely on increasing compute density. I think that mobile phones are an example: the industry plans around people needing to buy new phones to keep up with the latest software. But if phones stop getting more powerful, then that industry needs to rethink its financial plans. I'm also guessing that AI (especially the training side) will want more and more compute going forward.

    • @m_sedziwoj
      @m_sedziwoj 1 year ago

      Many people buy clothes each season, so I think the phone industry will find a way ;)

  • @gljames24
    @gljames24 1 year ago +1

    Moore's Law, like every exponential in nature, is bounded by physical optimization limits and more accurately follows a sigmoidal curve: the technology hits an inflection point and then gives only diminishing returns as the curve flattens out. The only way to advance is to switch to a new technology. Adding cores, gate geometry, and 3D stacking have helped in many areas, but we'll likely have to switch from silicon MOSFETs to gallium nitride or other chemistries to get any sort of frequency improvement at this point.
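The S-curve point above can be seen numerically: a logistic curve's year-over-year growth ratio starts out constant (indistinguishable from a pure exponential) and decays toward 1 past the inflection point. A minimal sketch with made-up constants, not fitted to any real node data:

```python
import math

def logistic(t, limit=1e12, rate=0.5, midpoint=40):
    """Logistic (sigmoid) growth curve saturating at `limit`."""
    return limit / (1 + math.exp(-rate * (t - midpoint)))

# Far before the inflection point: ratio ~ e^rate, steady exponential growth
print(round(logistic(1) / logistic(0), 3))    # 1.649  (~e^0.5)
# Far past it: ratio ~ 1, i.e. diminishing returns
print(round(logistic(61) / logistic(60), 3))  # 1.0
```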

  • @douginorlando6260
    @douginorlando6260 1 year ago +2

    Shrinking transistor technology also led to lower power consumption, particularly valuable for battery-powered devices.

    • @ttb1513
      @ttb1513 1 year ago +2

      Yes. Shrinking a transistor's area by 50%:
      1) Allows twice as many transistors on a chip of the same area.
      2) The same area implies basically the same cost, for twice as many transistors (far from costing twice as much).
      3) A transistor with 1/2 the area consumes 1/2 the power. With twice as many at 1/2 the power each, total power consumption stays basically unchanged, for a chip with twice as many transistors.
      Neither the power nor the cost doubles when the number of transistors doubles in the same unit of area. This scaling phenomenon is the really important thing.
      If transistors stopped shrinking, compute that needs twice as many transistors would start costing twice as much and using twice the power. Do this for a few generations and it costs 8x as much, consumes 8x the power, and takes 8x the area... that will make you appreciate the transistor shrinkage advances we've had in the past.

  • @truefan1367
    @truefan1367 1 year ago

    That truck backing up really sells this video.

  • @supremebeme
    @supremebeme 1 year ago

    This is probably one of my top 10 channels. Keep it going!

  • @paulmuaddib451
    @paulmuaddib451 1 year ago +12

    I love the little bits of humor, wit and charm you add to each video, "...technically correct, the best kind of correct (from Futurama)", and that "Whomp Whomp".
    We see you, @Asianometry we see you. 😘

  • @RobBCactive
    @RobBCactive 9 months ago

    There ought to have been discussion of Dennard scaling, which was a key driver of rapid processor improvement: node shrinks gave not only smaller and cheaper but also faster transistors at a constant energy cost, meaning there weren't yet the heat and power wall issues which later halted frequency scaling.
    I was a bit disappointed that this channel failed to mention that, as most people focus on the number of transistors, when multi-core is a response to physical limitations on uni-processor performance.

  • @mrlucasftw42
    @mrlucasftw42 1 year ago +3

    Modern SSDs have really revolutionized computing for the base user: boot in under 2 minutes, and even after Windows updates it can still often be under 5 minutes.

    • @WaterZer0
      @WaterZer0 1 year ago +8

      TWO MINUTES?!
      JESUS CHRIST
      What are you doing?!
      There's no way it should take more than 30 seconds *maximum* to load the OS.

  • @billfargo9616
    @billfargo9616 1 year ago +1

    Since "Moore's law is the observation that the number of transistors in an integrated circuit (IC) doubles about every two years," all that is required to keep it valid forever is to make bigger ICs.

  • @tengkualiff
    @tengkualiff 1 year ago +24

    Moore's Law will never truly die. :(

    • @benc3825
      @benc3825 1 year ago +10

      Moore's Law, or Moore's observation, is dead, simple as that. We don't get to rename and redefine it so it's still correct.

    • @RonnieMcNutt666
      @RonnieMcNutt666 1 year ago +1

      @@benc3825 AMD and Intel's future server CPUs want to know your location; also GPU density.

    • @benc3825
      @benc3825 1 year ago +1

      @@RonnieMcNutt666 Which one specifically? The Turin family, Venice, Emerald Rapids, Granite Rapids, Diamond Rapids, Sierra Forest and/or Clearwater Forest. :-P

    • @KekusMagnus
      @KekusMagnus 1 year ago +1

      It's been dead for a while now

    • @RonnieMcNutt666
      @RonnieMcNutt666 1 year ago +1

      @@benc3825 3090 to 4090 having nearly 3x the transistors in a smaller die

  • @polka23dot70
    @polka23dot70 1 year ago +6

    In the past decade there was no improvement in CPU frequency and almost no improvement in single-thread performance and cost. The term "3 nanometer process" has no relation to any physical feature (such as gate length, metal pitch, or gate pitch) of the transistors. In other words, the "3 nanometer process" is a marketing term. According to International Roadmap for Devices and Systems published by IEEE Standards Association Industry Connection, the 3 nm process is expected to have a contacted gate pitch of 48 nanometers and a tightest metal pitch of 24 nanometers.

    • @elizabethwinsor-strumpetqueen
      @elizabethwinsor-strumpetqueen 1 year ago

      I like your style - reality rather than hype...thanks

    • @tinayoga8844
      @tinayoga8844 1 year ago +3

      My current computers have CPUs from 2011, and a comparable CPU today is only about twice as fast per core.

    • @shanent5793
      @shanent5793 1 year ago +2

      Yet throughput has managed to increase, so maybe those things don't matter

    • @supersat
      @supersat 1 year ago +3

      CPU frequency is more about Dennard scaling, which has been dead for a while. Every time we think Moore's law is dead, we come up with another breakthrough to extend it, although I wouldn't be surprised if it was the end of the line soon.

  • @deem1819
    @deem1819 1 year ago +4

    Never thought I'd see an overlap between all the Dead Space lore channels I follow and my semiconductor manufacturing interests

  • @nielsdaemen
    @nielsdaemen 1 year ago +5

    *Moore predicted Moore and Moore complex chips*

  • @atanumaulik7093
    @atanumaulik7093 1 year ago

    Brilliant, as always. The world needs more compute. Long live Moore's Law!

  • @julioguardado
    @julioguardado 1 year ago

    There's another wave coming in semiconductor manufacturing - the maturing of the industry. Optical scaling has a physical limit and wafer size is not going to go beyond 300mm. What we'll see is all chips becoming much cheaper, particularly complex ones as industry laggards catch up. I think the best is yet to come.

  • @Jalae
    @Jalae 1 year ago +1

    A trillion-fold increase in compute for 2 degrees of accuracy seems like a gross misuse of energy.

  • @glennmcgurrin8397
    @glennmcgurrin8397 1 year ago +2

    If you take a technology that's developing that rapidly and is that early in its lifecycle, give me a ten-year roadmap into the future, and at year 10 you are actually at what you said would be year 9, that's amazingly accurate. It wasn't perfectly accurate, but it's incredibly rare to see anything close to that, as far as I can tell.

  • @Edward135i
    @Edward135i 1 year ago +1

    It's hard to think of a single person who affected more human lives than Gordon Moore. RIP Gordon, and thank you. I hope Pat doesn't run your company into the ground completely.

    • @shorerocks
      @shorerocks 1 year ago

      Yeah. Now think about the inventor, or one of the godfathers, of AI: Geoffrey Hinton. There is a CBS morning interview that is super interesting. And... I am not able to predict how much life will change in the next 10, 20 years.

    • @Noqtis
      @Noqtis 1 year ago

      When you fart, you change the life of more bacteria than there are humans on the planet. Never forget: your asshole is a planet.

  • @colebevans8939
    @colebevans8939 1 year ago +2

    With ChatGPT-4 this week, and with how much programmers are already saying it's helping them code, I can't help but feel we are just at the start of another massive burst of exponential growth. Soon we will be able to code 10x more things, 10x faster, for a fraction of the cost. That alone will be a massive boost to efficiency. As AI models grow, it's only going to get better. Add to that how quickly quantum computing has grown these last few years. Sure, it's an entirely different field, but we have no idea what quantum computers could be capable of 20 years from now, because until very recently we haven't had the tools to start playing with them.

  • @AgentSmith911
    @AgentSmith911 1 year ago +1

    Moore's law today is ~15% yield improvement per year

  • @ruperterskin2117
    @ruperterskin2117 1 year ago

    Right on. Thanks for sharing.

  • @JoshuaC923
    @JoshuaC923 1 year ago

    Did not expect to see Dead Space in an Asianometry video😂😂👍🏻👍🏻

  • @AdityaMehendale
    @AdityaMehendale 1 year ago +1

    The actors at 11:05 haven't actually soldered a goddamn thing in their entire lives.

  • @Schroinx
    @Schroinx 1 year ago

    Great video. If you are running out of topics, then the Wintel and x86 story could be one.

  • @d00dEEE
    @d00dEEE 1 year ago +2

    Moore's Law should be rewritten as an Old English epic poem, say in the style of "Beowulf".

  • @HellishPestilence
    @HellishPestilence 1 year ago

    There are of course business applications for more computing power. But for the first time in history, consumer products today are not limited by computing power but by things like network speed. The market for more advanced chips is a lot smaller if you're developing for a few HPC clusters at companies or universities rather than smartphones, which perform just fine with a 7nm chip for the vast majority of people.

  • @m_sedziwoj
    @m_sedziwoj 1 year ago +1

    Personally I would look at "Kurzweil's law," i.e. the cost of computing as calculations/s per $1000, because it extends back to the era of punched-card machines and ignores the technical aspects, which makes for an interesting story of how it changed over time.
    About the ending, I would give a different argument: "you don't need more computing for today's software, but only today," because to enable something new you sometimes need a few orders of magnitude more compute before it becomes available (real-time ray tracing in games, really smart AI, etc.). There are limits to human perception in resolution, refresh rate, or interaction, but quality and complexity still have a long way to go.
    The same goes for autonomous cars, robots (humanoid or not, but ones that can deliver stuff to your door), and many, many more. The AI revolution is only at its beginning, and the von Neumann architecture will not be the best fit for it.

  • @grzegorzkapica7930
    @grzegorzkapica7930 1 year ago +3

    Moore's law is flawed. The true law is Wright's law: every doubling of cumulative production brings a steady percentage decrease in cost per unit.
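
The learning-curve relationship this comment describes can be sketched in a few lines of Python. The 20% learning rate and $100 first-unit cost below are hypothetical illustrations, not figures from the comment or the video:

```python
import math

def wright_cost(first_unit_cost, cumulative_units, learning_rate=0.20):
    """Unit cost under Wright's law: every doubling of cumulative
    production cuts cost per unit by `learning_rate`."""
    doublings = math.log2(cumulative_units)
    return first_unit_cost * (1.0 - learning_rate) ** doublings

# With a $100 first unit and a 20% learning rate, each doubling of
# cumulative output multiplies the unit cost by 0.8.
for units in (1, 2, 4, 8):
    print(units, round(wright_cost(100.0, units), 2))
```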

  • @sunroad7228
    @sunroad7228 1 year ago

    "In any system of energy, Control is what consumes energy the most.
    Time taken in stocking energy to build an energy system, adding to it the time taken in building the system will always be longer than the entire useful lifetime of the system.
    No energy store holds enough energy to extract an amount of energy equal to the total energy it stores.
    No system of energy can deliver sum useful energy in excess of the total energy put into constructing it.
    This universal truth applies to all systems.
    Energy, like time, flows from past to future".

  • @danielsuguwa746
    @danielsuguwa746 1 year ago

    Interesting video, and thanks for the content! I just learned today that Dr. Moore passed away on Friday... RIP, legend...

  • @hsharma3933
    @hsharma3933 1 year ago

    I’m so glad you addressed the elephant in the room right at the beginning.

  • @jrherita
    @jrherita 1 year ago

    Appreciate the deep respect for Moore!
    Only comment is that the node charge didn't really end in 2006 for Intel; they executed pretty well until about 2014-2015.

  • @piethein4355
    @piethein4355 1 year ago +1

    I still need more compute for my gaming rig. My 380 can just barely drive current-gen VR titles, and I will need several orders of magnitude more compute before even being able to drive something currently high-end like an XR3 at native resolution. If I also want high refresh rates (at least 144 Hz, but preferably in the 200s) without reprojection, plus ray casting for improved lighting, then we are still many, many orders of magnitude away from what is needed.

  • @jamesmorton7881
    @jamesmorton7881 1 year ago +1

    1978, the Motorola 6800: the application uses exploded. The rocket was just launched.
    I loved all that was CMOS, like the RCA 4-bit; now we are at around 2000 MIPS. The IBM System/360 was about 16.6 MIPS in 1970.
    Self-heating now increases with operating temperature due to the higher leakage currents of smaller geometries, around 90 nm.

  • @yashsanghvi5956
    @yashsanghvi5956 1 year ago

    Argument against the claim that Gordon Moore's math was wrong: yes, 50 times 2^10 would be 51,200 transistors. But Gordon Moore also said that the rate of growth was roughly 2x per year. Assuming the rate of growth was 2.05 instead of 2, we get 50 × 2.05^10 ≈ 65,540 transistors, which is the figure Gordon Moore gave in the whitepaper.
    TL;DR: I wouldn't say his math was wrong; with exponentials, even a small change in the rate of growth can lead to very different answers.
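
The commenter's arithmetic checks out, and a short Python sketch shows just how sensitive a ten-year projection is to the assumed growth factor (the 50 starting components and the 2.0 vs 2.05 factors are taken from the comment above):

```python
base = 50  # components on the starting chip, per the comment

# Ten years of annual doublings: a tiny change in the growth factor
# compounds into a noticeably different final count.
for rate in (2.0, 2.05):
    projected = base * rate ** 10
    print(f"growth factor {rate}: ~{projected:,.0f} components")
```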

  • @alpaykasal2902
    @alpaykasal2902 1 year ago +3

    RIP Doctor Moore. Your fingerprints are all over everything, forever.

  • @stuartmacintosh4868
    @stuartmacintosh4868 1 year ago +2

    Plot twist: it was a log curve

  • @EricJorgensen
    @EricJorgensen 1 year ago +1

    The less popular corollary to Moore's Law is the one about how Moore's Law will be redefined every 3-4 years in order to argue that it's not really meaningless.

  • @Aarkwrite
    @Aarkwrite 11 months ago

    I was not expecting a Dead Space reference bravo

  • @luke144
    @luke144 1 year ago +1

    We need to adapt and change the way we compute itself. We need to quit looking for unicorn farts with dark matter detectors and tackle problems like P vs. NP. We need to make our computing more effective and efficient. Right now we run A LOT of flawed, bloated programs. I think things like gallium arsenide will surely help, but we need to go back to the basics. We need to change the geometry and architecture of processors. The advent of the ARM processor is a perfect example.

    • @buzzsaw838
      @buzzsaw838 1 year ago

      Go back to the basics? So basically we've reached the "local maximum" for the current technological paradigms in place for compute. Sounds like some fundamental revolution is needed at the underlying architectural level to continue anything like Moore's law level growth.

    • @luke144
      @luke144 1 year ago

      @@buzzsaw838 there are many ways the current model of computing can change.

  • @pierQRzt180
    @pierQRzt180 1 year ago

    I am a simple man: I see an interesting article cited, and I upvote.

  • @lydierayn
    @lydierayn 1 year ago

    Moore'sLawIsDead is one of the best GPU Rankings on the internet

  • @accutronitisthe2nd95
    @accutronitisthe2nd95 1 year ago

    It can't go on forever...

  • @user-jp1qt8ut3s
    @user-jp1qt8ut3s 1 year ago +8

    I realize now how lucky I am to have met this guy, and many other great inventors and scientists. I think being a scientist is one of the most beautiful lives one could have. What's your job?

  • @FogelsChannel
    @FogelsChannel 1 year ago +1

    Amazing analytical history of transistors and chip design.

  • @ps3301
    @ps3301 1 year ago

    We need 4K games running at a 120 Hz refresh rate. That is another 6 years away.

  • @Venkat2811
    @Venkat2811 1 year ago +1

    As your narration and explanation are unbeatable, it would be great if you could make a video on the evolution of technology from 1400 AD (the printing press) to date, and how humans were worried about jobs being replaced. This would be very relevant to the current GPT/AI revolution.

  • @ku0n349
    @ku0n349 1 year ago

    I absolutely adore your conclusion. I hate when people say things that ignore everything that technology could be, in favour of what we already have.
    Recently at a party I heard someone who is an engineer at Bosch saying that most things are already invented and that there is so little innovation yet to come. When I heard that from an actual engineer, I felt disgusted, tbh.

    • @ceeb830
      @ceeb830 1 year ago

      I don’t understand how someone could say that today

  • @bighands69
    @bighands69 1 year ago

    There is probably 60 years remaining of Moore's law.

  • @VedranCro
    @VedranCro 1 year ago

    Moore's Law reminds me of Hubble's Law, which describes the expansion of the Universe. Edwin Hubble used only a few data points and boldly drew a line connecting them to prove that galaxies farther from us are receding faster. The values on Hubble's chart were inaccurate, as were his predictions based on it (that the Universe was 2 billion years old). Nevertheless, Hubble's Law inspired others to replicate his work and refine their measurements, which ultimately led to plausible theories and predictions.

    • @VedranCro
      @VedranCro 1 year ago

      And I love egg fried rice :)

  • @matthagy86
    @matthagy86 1 year ago

    Thanks!

  • @timdere
    @timdere 1 year ago +2

    "Moore's Law didn't get tossed into the trash like that two-day-old white rice in the fridge. Instead, the industry modified it to make delicious fried rice." Nice touch on the commentary! 😆

  • @saricubra2867
    @saricubra2867 1 year ago +1

    He died with his law; after 2013, stagnation.

  • @awesommee333
    @awesommee333 1 year ago +5

    While certainly in the weather example the additional compute power has helped, it seems that these massive investments in compute only improve the prediction by so much, and, given the increasing cost of transistors, you almost have to wonder whether the next step would be worth it.

  • @gp.gonzales
    @gp.gonzales 1 year ago

    Rest in Peace, Gordon Moore (March 24, 2023)

  • @twilightknight123
    @twilightknight123 1 year ago +1

    His estimate of 65,000 components in 10 years wasn't that far off from a (roughly) factor of two. It's a bit unfair to say the word "roughly" was "doing a lot of heavy lifting" when the 65k-component figure comes from a factor of 2.05 instead of 2.

  • @TeodorLojewski
    @TeodorLojewski 1 year ago +1

    RIP G. Moore

  • @timwildauer5063
    @timwildauer5063 1 year ago +6

    I don't argue with your thesis that Moore's Law matters, but I'd like to push back a bit. There are industries where the extra compute is absolutely necessary. That being said, there are countless ways that code can be improved to make better use of compute. One major machine learning model was improved by I believe 16% because they realized that one method of squaring in Python was faster than another (x*x is different from x**2 is different from pow(x,2), three different functions that give exactly the same result, one is always faster than the others). Not that we should stop pushing forward with compute, but we should equally emphasize efficiency in coding and compiling. New programmers are really bad at coding, and throwing more compute at the problem is most often the solution, it's an easy way to turn it into "not my problem." "Just buy a better CPU/GPU" is the most common philosophy of gaming companies. I don't think we should get rid of Moore's Law, and your summary at the end was a perfect explanation of why, but everyone needs to take ownership and responsibility for improving the things that are under their control. Manufacturers/compilers can't give up and say "write better code," and coders equally can't give up and say "give us more compute." Everyone should have a relentless drive to improve.
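
The squaring claim in the comment above is easy to measure yourself. This is a minimal sketch of such a micro-benchmark; the 16% model-level figure is the commenter's, and the relative timings will vary by Python version and machine:

```python
import timeit

# Three equivalent ways to square a number. Which is fastest depends on
# the interpreter, but they all produce the same value.
for expr in ("x * x", "x ** 2", "pow(x, 2)"):
    t = timeit.timeit(expr, setup="x = 3.0", number=1_000_000)
    print(f"{expr:>10}: {t:.3f} s per million evaluations")

# Sanity check: identical results.
assert 3.0 * 3.0 == 3.0 ** 2 == pow(3.0, 2) == 9.0
```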

  • @allthethings3071
    @allthethings3071 1 year ago

    The problem is the speed of light and needing new substrates; heat dissipation and current leakage are the main barriers. Multi-core has reached a kind of dead end; we need a return of single-threaded performance and increased frequency if computing is going to get better. I don't see any major material-science advancements on the horizon for a long time. With current tech, all Intel and AMD are doing is basically optimization at this point; fragility of gates and electromigration become massive issues as parts get smaller, not to mention thermal wear on contacts. I don't see how we're going to solve the heat, leakage, and memory bandwidth/latency issues anytime soon.
    The biggest issue now is that RAM is infinitely slower than L1/L2 cache. So we need new memory technologies, new materials, and new fabrication techniques, because heat and leakage are the biggest issues and they aren't going away anytime soon.

  • @marianmarkovic5881
    @marianmarkovic5881 1 year ago +1

    I wouldn't call multicore a failure of Moore's law or anything; yes, it was a forced change (pretty much by hitting the practical limit on the frequency of a single CPU), but an extremely effective one. And the industry had been using multiprocessor units for decades by then; now it was just integrated into one chip.

  • @JimCareyMulligan
    @JimCareyMulligan 1 year ago +1

    It's just a trend line. You can make your own right now. Moore himself (or probably some analyst in the company) revisited the data. And if you're famous enough you can call it the Gelsinger-Moore law. Or the RayJacket-Moore law. RIP

  • @CalgarGTX
    @CalgarGTX 1 year ago +1

    We're still a few gens away from not needing more compute for gaming, imo. We have been trading more performance for equally more power too often these last few gens. I'd be glad to go back to the days of an actual 50 W TDP CPU and a sub-200 W max-TDP GPU that can handle all the latest games at 100-140 fps without needing over-the-top cooling solutions and noise. Not to mention AMD and Nvidia seem to be abandoning the low and mid range as much as they can right now. So still a way to go, if you ask me.

  • @krakhedd
    @krakhedd 1 year ago +1

    5:23 - the guy who discovered ALUMINUM named it "aluminum", and it was only subsequent Brits who thought it needed more flourish and changed it to "aluminium".
    However, the correct pronunciation and spelling are still "aluminum".
    Especially lacking any hint of the King's English, you should be pronouncing it "aluminum".

    • @Kieselmeister
      @Kieselmeister 1 year ago

      Technically he originally proposed Alum-ium, but people complained it wasn't classical enough, so he changed it to the actual Latin alumin-um (lit. meaning "from-alum")...
      Then the same complainers claimed that the "ium" ending "sounded" more classical, and wanted "alumin-ium", even though the "um" ending was LITERALLY CORRECT CLASSICAL LATIN, and they started calling it that even though he kept using "alumin-um".
      TLDR, the original invented name for the element was "Alum-ium", which is thus correct ENGLISH...
      The proper declension LATIN name given to the element is "Alumin-um", which is thus correct LATIN...
      And "alumin-ium" is only correct if you are speaking the language of ignorant sore losers.