The AI Hardware Problem

  • Published: 15 Nov 2024

Comments • 1.1K

  • @NewMind
    @NewMind  3 years ago +62

    ▶ Check out Brilliant with this link to receive a 20% discount! brilliant.org/NewMind/

    • @calholli
      @calholli 3 years ago +2

      Your videos are like an endless competitive battle between my comprehension and your articulation; may this war never find peace.

    • @davidhollenshead4892
      @davidhollenshead4892 3 years ago +2

      @No Name Same here, as I already had to use captions while watching movies, and now long COVID is making my hearing even worse...

    • @davidhollenshead4892
      @davidhollenshead4892 3 years ago

      The connectionist solution will work: many small CPUs with limited memory, one for each node, connected only to neighboring CPUs...

    • @Evelyn_theghost
      @Evelyn_theghost 3 years ago

      No

  • @mickelodiansurname9578
    @mickelodiansurname9578 3 years ago +1362

    The brain.... billions of calculations per second... powered by a Subway sandwich!

    • @krasimirgedzhov8942
      @krasimirgedzhov8942 3 years ago +51

      I don't think the exact process of the brain is comparable to computing. I imagine it's something more complex.

    • @mihailmilev9909
      @mihailmilev9909 3 years ago +63

      @@krasimirgedzhov8942 nah I think it's just a big neural network

    • @mihailmilev9909
      @mihailmilev9909 3 years ago +25

      @@krasimirgedzhov8942 that's kinda where the name comes from isn't it lol

    • @krasimirgedzhov8942
      @krasimirgedzhov8942 3 years ago +76

      @@mihailmilev9909 It's a name given to one of the most complex pieces of software we have. It's only inspired by the structure of neurons; it doesn't have the exact same process as far as we know.

    • @serdarcam99
      @serdarcam99 3 years ago +11

      If it's powered by a Subway sandwich it's not going to calculate billions of things per second; it's going to be much less.

  • @deltalight584
    @deltalight584 3 years ago +39

    12:21 That comparison was brilliant.
    It ties computing & neurology together.
    Low speed, high precision needed => Digital ("Slow system of thought")
    High speed, low precision needed => Analog ("Fast system of thought")

  • @raphaelcardoso7927
    @raphaelcardoso7927 3 years ago +789

    I'm applying to do a phd exactly in this field. Amazing video!
    Update: I was accepted!

    • @santoshmutum3263
      @santoshmutum3263 3 years ago +11

      I am also writing my research proposal on this topic for a PhD... Not accepted yet

    • @Rahul016-d6k
      @Rahul016-d6k 3 years ago +3

      @@santoshmutum3263 Where did you apply? I'm Manipuri, by the way.

    • @santoshmutum3263
      @santoshmutum3263 3 years ago +2

      @@Rahul016-d6k Japan

    • @Rahul016-d6k
      @Rahul016-d6k 3 years ago +3

      @@santoshmutum3263 Good Luck Brother👍👍

    • @santoshmutum3263
      @santoshmutum3263 3 years ago +2

      @@Rahul016-d6k thanks

  • @gordonlawrence1448
    @gordonlawrence1448 3 years ago +19

    Actually, radar computers in the 1950s were analogue. I was one of the last people at my college to be taught both analogue and digital computing. Addition, subtraction, multiplication, division, integration and differentiation can all be done with a single op-amp. The problem is Nyquist noise, and issues with capacitor dielectrics such as dielectric absorption and leakage. With a digital system you can just vastly oversample, add the samples up, then divide by the number of samples to reduce the effective noise. You don't get that choice with analogue.
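
    A minimal sketch of that oversampling point (editor's illustration; all numbers are arbitrary): averaging N noisy reads of the same value cuts the effective noise by a factor of about sqrt(N).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true_value, noise_sigma = 1.0, 0.1      # per-read noise, arbitrary units

    for n in (1, 16, 256):
        reads = true_value + noise_sigma * rng.normal(size=(10_000, n))
        estimates = reads.mean(axis=1)      # oversample, add up, divide by n
        print(n, estimates.std())           # ~ noise_sigma / sqrt(n)
    ```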

  • @Jojobreack324
    @Jojobreack324 3 years ago +30

    I developed an ASIC for AI acceleration as part of my bachelor's thesis, and I must say this video is of very high quality. Going back to analog techniques is definitely an interesting approach.

  • @lidarman2
    @lidarman2 3 years ago +145

    Well done. I played around with small neural nets using op-amps in the 90s, and although I saw that it was kinda the way to go, I had trouble with drift due to integration bias and all sorts of noise--and of course training was super tedious. But I always thought that neural nets really need to stay in the analog world. Modern non-volatile memory seems to be a solution for storing trained weights, since you can put variable amounts of charge in a cell, very densely.

    • @forwardplans8168
      @forwardplans8168 3 years ago +7

      Did you ever look at using fuzzy-set theory to improve decision accuracy? I used a then-new program called CLIPS around that time. It's time to review it again.

    • @lidarman2
      @lidarman2 3 years ago +9

      @@forwardplans8168 I was doing a lot of fuzzy logic at that time too. Interesting times.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt 3 years ago +8

      So the brain uses discrete values in the nerves. It may use analog values within the cell. Op-amps use analog values only. I tried very hard to understand analog multiplication for radio modulation and it is... complicated. Satellite radio is energy efficient because it uses only one bit. AM radio uses a lot more energy (for an 8-bit signal/noise ratio).
      Op-amps are large compared to digital circuits.
      Flash memory is basically analog memory. We just use DACs and ADCs to store only discrete values in it. This is similar to the analog amplitude in DSL. ADCs and DACs, especially at only 8 bits, are fast. They are used in video equipment. So I don't know what the video wants to claim there.
      I have read that the LSB could run on a lower supply voltage because errors are not so bad there. There are always errors, and it is mostly the supply voltage that decides how often we accept one.
      Also, those "half open" valves scare me. The nice thing about CMOS is that no change in state => no current drawn.
      I just chatted about the general problem of matching resources to tasks. That is an NP-complete problem. So with different kinds of transistors for tasks of different importance, one opens a very big can of worms...

    • @adamrak7560
      @adamrak7560 3 years ago +4

      Digital beats out analog until you scale to the extremes where we are right now.
      So until now it did not make much sense to use analog NN chips.
      I studied one such analog chip and was sad to see that modern digital deep-submicron could beat it in every way possible. But that was 10 years ago; right now digital CMOS hardware is nearing its limits, so we may need a paradigm change.

    • @seraphina985
      @seraphina985 3 years ago +4

      @@ArneChristianRosenfeldt Arguably, flash memory cells, while not binary in nature, are still more digital, since the defining characteristic of digital systems as opposed to analog is the quantization of the signal. That is to say, digital signals are interpreted by quantizing them into one of a finite array of buckets that each correspond to some arbitrary range of the physical input value. The consequence is that digital signals are highly accurate, but their precision is finite and limited, as every input signal is effectively rounded to fit into one of those buckets. In contrast, the precision of analog signals is as close to infinite as you can get within our universe, though in practice accuracy is the limiting factor: it depends on how well you can insulate the system from noise and how accurately you can measure the input signal.
      Granted, even with an ideally isolated system and an ideal measuring device, any analog system in our universe is likely to have its precision limited by the fact that fundamental particles have defined properties. But this is arguably academic, since there is little use for a system that can process values more precisely than could ever be physically generated or represented in our universe. Short of discovering some way to change the laws of physics in some region of space, or to travel to universes where the laws are different, a sufficiently accurate system that precise could, given enough processing time, represent any state that could exist within our universe, which would in theory let us solve any problem that could exist in reality, limited only by our ability to understand and specify it.
      Sure, there would still be a fundamental limit on maximum precision, as we can imagine arbitrary problems that couldn't be represented precisely enough without clever workarounds. Beyond that, there is likely a finite number of particles any civilization could ever collect in a universe with a finite speed of light, and there will always be some larger number that could be imagined. Still, the practical applications of dealing with values that could never be replicated within the physical limits of the observable universe are rather limited; there is probably a limit to what insights can be gleaned from simulating things that are physically impossible to ever encounter or bring into being.
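
      A toy illustration of that quantization idea (bucket count and inputs assumed, editor's sketch): a digital read rounds the analog input into one of a finite set of buckets, so nearby values become indistinguishable.

      ```python
      def quantize(x, full_scale=1.0, bits=3):
          levels = 2 ** bits                       # finite array of buckets
          step = full_scale / levels
          bucket = min(int(x / step), levels - 1)  # round into a bucket
          return bucket * step                     # reconstructed value

      # 0.31 and 0.305 land in the same bucket: finite precision
      for v in (0.03, 0.31, 0.305, 0.94):
          print(v, "->", quantize(v))
      ```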

  • @BlackholeYT11
    @BlackholeYT11 3 years ago +49

    Ooh, good to see this put into words and in a concise manner

  • @naimas8120
    @naimas8120 3 years ago +379

    Another masterpiece from New Mind! Never fails to entertain while teaching.

    • @XBONESXx
      @XBONESXx 3 years ago +3

      Your avatar is a masterpiece

  • @JohnDoe-zs6gj
    @JohnDoe-zs6gj 3 years ago +422

    That energy comparison between our brain and our best processors is incredible. It's amazing the efficiency evolution can develop given enough time.

    • @garrysekelli6776
      @garrysekelli6776 3 years ago +35

      Computers are weak. They will never even beat a human at Chess.

    • @hedgehog3180
      @hedgehog3180 3 years ago +73

      Evolution is the most aggressive optimization function in the known universe and has been running for over 4 billion years. Every single animal alive today is optimized down to the smallest cells. It's really no wonder that human brains are both the most powerful computers we know of and have efficiencies that make everything else look like a joke.

    • @Eloign
      @Eloign 3 years ago +36

      Computers don't happen by random processes. Neither did humans. Computers were created as were humans.

    • @ickebins6948
      @ickebins6948 3 years ago +52

      @@Eloign Sure, provide some proof for that.
      Will you?

    • @WERT2025
      @WERT2025 3 years ago +40

      @@hedgehog3180 Yeah I feel like every cell of my armpit hair is 100% optimized

  • @tannerbuschman1
    @tannerbuschman1 3 years ago +482

    The idea of an AI being inherently impossible to debug or decipher is really cool and scary; science fiction was not far off on that one.

    • @aidanquinn1549
      @aidanquinn1549 3 years ago +95

      We take for granted the amount of info we know about each other (human AI). I can guarantee you that when you last had a conversation about a specific feeling (whether it's love for your spouse, hate towards something, or how horrific a scary movie was) you did not settle on the same exact emotion. In other words: you don't even know how to debug or understand what is going on behind any human's eyes right now!

    • @jss7668
      @jss7668 3 years ago +7

      But humans are!

    • @aidanquinn1549
      @aidanquinn1549 3 years ago +5

      @@jss7668 scary, yeah, but the most beautiful and fascinating things I've ever seen

    • @En_theo
      @En_theo 3 years ago +14

      And this is how an AI becomes self-aware, hides its true intent from you, and goes all Skynet when you least expect it. Sci-fi was not far off on that one either.

    • @UNSCPILOT
      @UNSCPILOT 3 years ago +5

      There are projects trying to find ways to break down and comprehend how learning algorithms work; one actually uses the same platform that SETI@home did, letting people donate processing power to help the project test and break down how the algorithms work.

  • @amrohendawi6007
    @amrohendawi6007 3 years ago +65

    It amazes me how many different state-of-the-art topics you cover, perfectly and briefly, in 10 minutes

  • @Zpajro
    @Zpajro 3 years ago +278

    As a student in computer science, this is really interesting

    • @naimas8120
      @naimas8120 3 years ago +4

      I'm a student of Information and Communications Technology. What do you think about the future of our field? Do you think it's really AI?

    • @olfmombach260
      @olfmombach260 3 years ago +26

      @@naimas8120 As a student of Computer Science I can definitely say that I have absolutely no idea because I'm dumb

    • @samik83
      @samik83 3 years ago +5

      @@naimas8120 As a layman I'd say definitely yes. Just in the last couple of years AI has made some big strides. When we get quantum computing up and running and pair it with AI, the possibilities are endless... and scary

    • @Zpajro
      @Zpajro 3 years ago +5

      @@naimas8120 The problem is that the AI hype has come around 3 times now, so predicting whether this is the time it will really break through is quite hard. Personally, I'm eagerly waiting for our machine overlords (as long as there is no human controlling the AI). And if we get a true general intelligence going, it would be interesting to see how a different, alien intelligence solves problems.

    • @ovoj
      @ovoj 3 years ago

      @@Zpajro imagine the conversations with something that isn't human. Hopefully we reach that point in my lifetime

  • @lidarman2
    @lidarman2 3 years ago +54

    You made a somewhat profound comment at 12:13. The essence of intuition versus analysis: from our vast experience we develop intuitions that give us that "gut feeling", but when it matters, we do rigorous analysis to confirm. RE: "Blink" by Malcolm Gladwell.

    • @calholli
      @calholli 3 years ago

      It could also be the difference in function between our creative right and analytical left brain.

    • @Nnm26
      @Nnm26 3 years ago +1

      @@calholli that is bs btw

    • @calholli
      @calholli 3 years ago +1

      @@Nnm26 Well, even if it's only metaphorical, it still has value as a concept.

  • @Bhatakti_Hawas
    @Bhatakti_Hawas 3 years ago +503

    I promise I understood everything he said

    • @kevinperry8837
      @kevinperry8837 3 years ago +13

      Yes me too comrades

    • @xlnc1980
      @xlnc1980 3 years ago +8

      We all did!

    • @tymek200101
      @tymek200101 3 years ago +17

      it is enough to be a 1st-year Computer Science student to understand all of the words and concepts

    • @Bhatakti_Hawas
      @Bhatakti_Hawas 3 years ago +3

      @@xlnc1980 Hey fellow DT fan 👋🏽👋🏽

    • @xlnc1980
      @xlnc1980 3 years ago +1

      @@Bhatakti_Hawas Hi there, fellow! LTE3 is coming out next month. I've been waiting for that one for only 22 years now. :)

  • @ryansupak3639
    @ryansupak3639 3 years ago +10

    Nice... so it seems like digital-style processors still do all the "housekeeping" tasks of the computer, but then there are these "analog resistor networks" that do specialized tasks like implementing convolutional neural networks.
    Makes me smile when "everything old is new again".
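
    A hedged sketch of what such a resistor network computes (editor's illustration; all values arbitrary): with weights stored as conductances, Ohm's law gives each cell's current as I = V * G, and Kirchhoff's current law sums the currents on each column wire, so a whole matrix-vector product happens in one analog step.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    G = rng.uniform(0.0, 1.0, size=(4, 3))  # conductances = stored weights
    V = rng.uniform(0.0, 0.5, size=4)       # input voltages on the rows

    I = V @ G   # column currents: I_j = sum_i V_i * G_ij, the analog MAC
    print(I)
    ```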

  • @alexkuhn5078
    @alexkuhn5078 3 years ago +10

    4:30 I was kinda zoning out and I heard that as "50 to 100 pikachus"

  • @fredoo6627
    @fredoo6627 3 years ago +231

    It's so annoying to discover channels like this and see they don't get the views they deserve.

    • @mitchellsteindler
      @mitchellsteindler 3 years ago +2

      It's a fairly new channel

    • @georgf9279
      @georgf9279 3 years ago +6

      @@mitchellsteindler Let's boost it with some engagement (comments) then.

    • @hedgehog3180
      @hedgehog3180 3 years ago +6

      Definitely one of the best engineering channels on YouTube.

    • @keashavnair3607
      @keashavnair3607 3 years ago

      Well the problem is, there are 16,852 views, yet only 1.6K likes, 31 dislikes and 149 comments. This world is full of consumer-minded, half-curious morons. That's why.

    • @mitchellsteindler
      @mitchellsteindler 3 years ago +4

      @@keashavnair3607 dude. Just stop and get off your high horse. People like what they like.

  • @SpiritmanProductions
    @SpiritmanProductions 2 years ago +2

    So, hybrid processors are the future, then, perhaps.

  • @digicinematic
    @digicinematic 3 years ago +12

    Yes, I have vague memories of the memristor being touted as the missing passive component, or some such thing.

  • @joel230182
    @joel230182 3 years ago +86

    "...analog circuitry" , that caught me off guard

    • @davidhollenshead4892
      @davidhollenshead4892 3 years ago +18

      That is one solution; the other is the connectionist solution: many small CPUs with limited memory, one for each node, connected only to neighboring CPUs...

    • @mihailmilev9909
      @mihailmilev9909 3 years ago +1

      @@davidhollenshead4892 interesting... what is it called?

    • @this_is_japes7409
      @this_is_japes7409 3 years ago +2

      @@mihailmilev9909 Mesh computing, I think, or at least it's based on a mesh topology.

    • @TauCu
      @TauCu 3 years ago +1

      Or just building a type of FPGA.
      I think in the future, however, FPGAs will be a combination of electronics and photonics.
      For NNs, I don't see how photonics could be beaten for general-purpose networks.

    • @am-i-ai
      @am-i-ai 3 years ago +1

      It actually makes a lot of sense. There has been a recent resurgence of interest in analog systems. I, for one, feel like we ditched that particular technology a little prematurely. I'd be willing to bet that we see some rather spectacular new analog-based technologies in the near future.

  • @kumarsuraj9450
    @kumarsuraj9450 3 years ago +66

    My professor once said in class that the future is analog. We were puzzled about what he actually meant. Now I see.

    • @Alimhabidi
      @Alimhabidi 3 years ago +4

      The future is quantum

    • @CrashTheRed
      @CrashTheRed 3 years ago +2

      @@Alimhabidi It's been noted that quantum computers are special-purpose machines that won't improve on everything a conventional machine does. They also require a conventional machine to process a lot of the data. Sabine Hossenfelder made a number of videos on quantum computers and the direction they're heading. That might interest you, especially since she's a theoretical physicist.

    • @davidthacher1397
      @davidthacher1397 3 years ago

      The future is not fully analog or quantum. Analog works off a set of properties that manipulate energy as waves, aka signals. Digital is a very simple signal; it is currently very stable and cheap, and we can do a lot with it. If we do not master digital, how are we to understand analog? There are signal architectures on digital that are analogous. Most who study CS or ECE never learn this. Most if not all of CS's theories are wrong! You might as well literally study psychology, if you want to be that wrong.

    • @CrashTheRed
      @CrashTheRed 3 years ago

      @@davidthacher1397 Of course the future will be a combination of all of the above. But how are the CS theories wrong? This is a first for me, and I'd like to hear you explain it a bit.

    • @this_is_japes7409
      @this_is_japes7409 3 years ago +2

      Everything is analog if you dig deep enough.

  • @sknt
    @sknt 3 years ago +25

    Great video; it pretty much sums up the current state of AI. There's still a long way to go before we can even compare AI to a "real" brain. The brain is an insanely complex electrochemical machine that evolved over millions of years.

  • @CreeperSlenderman
    @CreeperSlenderman 3 years ago +4

    I have an idea for AI emotions.
    We humans used to live in jungles and forests, biomes
    in which we tried to survive and reproduce. For surviving, our emotions are
    fear, trust, confidence and loneliness.
    For reproducing they are
    love, attraction, and idk.
    So we would need to make an AI with "DNA",
    or at least try to give it those feelings, but there would have to be 2 AIs
    or else it won't be able to interact.

  • @alengm
    @alengm 3 years ago +17

    7:00 triggers Google Assistant :D

    • @calholli
      @calholli 3 years ago +2

      That's by design.

  • @stage666
    @stage666 3 years ago +8

    I feel good about myself that I know just enough about neural networks and computer engineering to somewhat understand what this video is talking about.

  • @CuthbertNibbles
    @CuthbertNibbles 3 years ago +64

    11:57 "They form a sort of black box with no means to verify the integrity of a result. This creates the dilemma of potentially unexplainable AI systems, creating issues of trust..."
    This is how the AI apocalypse begins. "Why'd that car run over that advocate?" "No idea."

    • @nipunasudha
      @nipunasudha 3 years ago

      Exact same thing I thought.

    • @davidhollenshead4892
      @davidhollenshead4892 3 years ago

      Using an AI to control a car is a waste of an AI...
      Besides, while autonomous aircraft and spacecraft are feasible technology, an autonomous car will never be "safe" due to pedestrians, cyclists, animals, etc. sharing the roads. This should be obvious from the weird accidents caused by cars like the Tesla decapitating the idiot occupant by driving under a truck and continuing on until it crashed into a house...

    • @nipunasudha
      @nipunasudha 3 years ago +2

      @@davidhollenshead4892 lol they only need to be more accurate than a human driver. It doesn't need to be perfect. And car structure and safety features are getting more advanced by the day too. The sweet spot is closer than you think! 😁❤️

    • @starskiiguy1489
      @starskiiguy1489 3 years ago +4

      @@davidhollenshead4892 I wouldn't be so sure. What you say may be true for modern infrastructure, but autonomous vehicles, if they catch on, may change the way we view transportation infrastructure overall.
      I could personally see a future where few people own cars: we rideshare when we need to travel a long distance by car, but other than that we create more walkable cities with more public transit. In such a future, comparing autonomous vehicles on modern infrastructure with autonomous vehicles on future infrastructure may be comparing apples to oranges.

    • @HelloKittyFanMan.
      @HelloKittyFanMan. 3 years ago

      @@nipunasudha: *As accurate as...

  • @joey199412
    @joey199412 3 years ago +3

    Great video, especially the assembly multiply instruction and outlining it with an analogue computing method. This makes sense, since analogue results are instantaneous and don't require a clock pulse, and thus no memory storage between calculations, as you can add and subtract analogue signals instantaneously.

  • @user-cx2bk6pm2f
    @user-cx2bk6pm2f 3 years ago +1

    As soon as he mentioned "analog" I was waiting for the requisite mention of noise being the limiting factor.
    I'm impressed that he did indeed talk about that... but disappointed that it precluded the epic rant I was about to unleash 🤣

  • @somenygaard
    @somenygaard 3 years ago +4

    Ahh, the MobileNet-224, one of my favorite neural network accumulator modules.

  • @onehouraday
    @onehouraday 3 years ago +2

    Quite interesting! Neurons in the brain actually act as a mixed analogue/digital system. The input is digital (action potentials), they do analogue processing, and the output is digital again (an action potential is either 0 or 1).
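
    A rough leaky integrate-and-fire sketch of that mixed behaviour (editor's illustration; all constants are arbitrary): the membrane potential integrates its input continuously, the analog part, while the output spike is all-or-nothing, the digital part.

    ```python
    def lif(inputs, tau=10.0, threshold=1.0, dt=1.0):
        v, spikes = 0.0, []
        for i in inputs:
            v += dt * (-v / tau + i)   # leaky analog integration
            if v >= threshold:         # all-or-nothing action potential
                spikes.append(1)
                v = 0.0                # reset after the spike
            else:
                spikes.append(0)
        return spikes

    print(lif([0.15] * 30))   # periodic spikes once v reaches threshold
    ```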

  • @nikolausluhrs
    @nikolausluhrs 3 years ago +14

    Just gonna say, we can't really explain that well how digital neural networks make decisions either

    • @thewhitefalcon8539
      @thewhitefalcon8539 3 years ago +2

      Yeah, we just have an algorithm that randomly adjusts them until they give the answers we want

    • @Taladar2003
      @Taladar2003 3 years ago +1

      Which means we have no way to efficiently improve their performance. Doing almost what we want is no closer to doing exactly what we want than doing something completely different if we have no way to deliberately improve them in some iterative way.

    • @thewhitefalcon8539
      @thewhitefalcon8539 3 years ago

      @PolySaken We can understand why, in general, a neural network might be capable of detecting triangles. We can't understand why *that particular* neural network *is* capable of detecting triangles.

    • @thewhitefalcon8539
      @thewhitefalcon8539 3 years ago

      @PolySaken and what is that data? We don't know. We just know when you put a triangle in it says yes, and when you put in a square it says no. Also there's a megabyte of random-looking numbers involved.

    • @thewhitefalcon8539
      @thewhitefalcon8539 3 years ago

      @PolySaken We can see what each square millimeter contributes to the painting by looking at it. We can understand why this square millimeter is this colour. Not so with AI models!

  • @am-i-ai
    @am-i-ai 3 years ago +5

    We definitely made a rash collective decision when we decided that digital was to *replace* analog. I would not be surprised at all to see a resurgence of analog systems ... we barely even explored that technical space. There surely are future analog developments that will rock the foundation of technological advancement. Very well done :)

  • @fieryferret
    @fieryferret 2 years ago +3

    The inherent precision floor of possible analog-driven neural net AI is now my headcanon for every single science fiction book/movie where a robot gains consciousness and starts acting unpredictably.

    • @MrFram
      @MrFram 11 months ago

      Analog is not required for this; existing AI is already unpredictable due to the black-box nature of machine learning

  • @hikaroto2791
    @hikaroto2791 3 years ago +2

    8:27 The background song is amazing; it reminds me of the Prometheus movie from the Alien saga

    • @kxtof
      @kxtof 3 years ago +1

      So I'm not the only one who noticed

  • @derek8564
    @derek8564 3 years ago +13

    I knew my collection of Vacuum tubes would come in handy one day...

    • @hardrays
      @hardrays 3 years ago +1

      You saved them so you can crank the plate voltage up past 15 kV and self-annihilate with pizazzzzz

    • @HelloKittyFanMan.
      @HelloKittyFanMan. 3 years ago

      When did "vacuum" become a brand, to you?

  • @entropysalamander
    @entropysalamander 3 years ago +2

    I love the visuals used in your videos; they're always unobtrusive but fascinating.

  • @pacifico4999
    @pacifico4999 3 years ago +5

    Going back to the basics so we can move forward. This is a fascinating topic!

    • @questioneverything4633
      @questioneverything4633 3 years ago

      We will never figure out computers until we properly master the harnessing of energy, especially electricity. We don't understand the fundamentals of things like this.

  • @marticus42
    @marticus42 3 years ago +1

    7:22
    Never thought I would understand a statement like that. Good teaching

  • @jimmarburger611
    @jimmarburger611 3 years ago +3

    Wow, amazing video. It's unbelievable the progress we've made. Just in my lifetime I've seen a blistering pace of achievement. The first computer I played with didn't even have a video interface, lol. I've loaded data and run programs from punch cards. Now I play video games on a machine that I built that probably rivals all the computing power available to NASA during Apollo. It's somewhat ironic that machine learning may lead to the rebirth of analog computing. Except for specific applications, analog has been relegated to unwanted-stepchild status. Just saying, there's nothing wrong with analog; this video shows how it can be more efficient for machine learning.

  • @nicocalimero
    @nicocalimero 3 years ago

    I don't know if I understood 5% of the video, but it's still mind-blowing.
    An analog world controlled by digital technology, with analog-to-digital converters used throughout modern society.
    With deep learning, AI already looks like a black box on the programming side, even for autonomous vehicles, since the AI learns by example, doesn't it?

  • @ramentabetai1266
    @ramentabetai1266 3 years ago +6

    Neuromorphic CPUs are likely the future for this. These special chips by IBM and Intel are already much more effective at neural-net tasks. IBM's goal is to build a system no larger than a brain that has the same number of connections as the real one.

    • @olfmombach260
      @olfmombach260 3 years ago

      We don't even remotely have the hardware manufacturing capabilities to do that

    • @rupertgarcia
      @rupertgarcia 3 years ago +2

      @@user-ee1hj7rk9l, look up "IBM TrueNorth Chip". They've been working on it for years now.

    • @augustovasconcellos7173
      @augustovasconcellos7173 3 years ago +1

      @@user-ee1hj7rk9l I'd say it's a bit too early to tell, but so far it looks like they won't. Quantum computers are really only good when their workload consists of doing the same thing over and over again. That's good for breaking encryption, searching through databases, and so on, but not for AI.

  • @majorfallacy5926
    @majorfallacy5926 3 years ago

    Analog computing is technically still used commercially in measurement and control systems. Those applications don't exactly push the technology's limits, but they are still important

  • @morkovija
    @morkovija 3 years ago +15

    Is this another gem of quality content? That we're getting for free? Oh my

  • @PunmasterSTP
    @PunmasterSTP 1 year ago +1

    It kind of blew my mind when I found out there were dedicated AI regions in microchips, but I guess that was only a logical next step. I'm not in the field and I doubt I'll ever use this knowledge, but I definitely find it fun and interesting to learn about. Thanks for the very high-quality video!

  • @naota3k
    @naota3k 3 years ago +5

    What is the machine doing around 0:35? Is it extruding solder to bond the pads? This seems ridiculously precise and I've never seen it before; now I'm curious what this process is.

    • @justinmallaiz4549
      @justinmallaiz4549 3 years ago

      Good eye, never seen that myself

    • @lemlihoussama2905
      @lemlihoussama2905 3 years ago +9

      It's a machine that uses gold wires to connect the integrated circuit inside the chip to the chip's outside pins.
      The process is called "wire bonding"; you can search for it on YouTube for more videos!

    • @naota3k
      @naota3k 3 years ago

      @@lemlihoussama2905 Fantastic, thank you!

  • @johnzinhoinhoinho
    @johnzinhoinhoinho 3 years ago +1

    What really impresses me is the huge amount of knowledge in 13 minutes of video. Congrats on the content

  • @godetaalibaba2522
    @godetaalibaba2522 3 years ago +8

    This was a very interesting topic that I hadn't really heard about before. Thank you for the amount of work this video took to make!

  • @soumilbanik1128
    @soumilbanik1128 3 years ago +1

    I know very little about machine learning and artificial intelligence, but I often used to think that AI and ML should be processed the way our human brains work.
    I saw this video and realised that my thought has potential. Thanks for such an informative video.

  • @NiffirgkcaJ
      @NiffirgkcaJ 3 years ago +5

    This guy clearly needs more views and subscriptions.

  • @Guilherme-social
    @Guilherme-social 3 years ago +1

    The animations in this video are just gorgeous.

  • @fr3zer677
    @fr3zer677 3 years ago +15

    Another amazing video!
    It's astonishing to me how many different topics are covered on this channel and how in-depth and interesting all of your videos are.

  • @gingerpukh7309
    @gingerpukh7309 3 years ago +5

    NASA's Apollo mission guidance and control analog computer design might be useful.

    • @Onewheelordeal
      @Onewheelordeal 3 years ago +1

      I thought of that Smarter Every Day video first thing

  • @BryceSchroeder
    @BryceSchroeder 3 years ago

    The character / is forward slash. \ is backslash. URLs have forward slashes in them. The backslash \ is used for DOS and Windows paths.

  • @WilliamDye-willdye
    @WilliamDye-willdye 3 years ago +5

    The music at 7:55 was also used in another good video about ML ( ruclips.net/video/3JQ3hYko51Y/видео.html ). It's called "Atlantis", but now whenever I hear it I think of artificial neurons.

    • @sonofagunM357
      @sonofagunM357 3 years ago +1

      At first I thought that song was from Alien: Isolation, but no. They sound pretty close though, wouldn't you say?
      ruclips.net/video/txjs5MpATUg/видео.html

    • @WilliamDye-willdye
      @WilliamDye-willdye 3 years ago +2

      @@sonofagunM357 Heh. I can definitely hear similarities. Thanks for the link, BTW. I haven't played that game, but now if a Steam sale comes along I might get it just because the soundtrack is promising.

  • @fugslayernominee1397
    @fugslayernominee1397 3 years ago +2

    I had goosebumps just before the ending. Looks like brains truly are the most efficient machines that nature has provided us with.

  • @jakub_simik
    @jakub_simik 3 years ago +12

    What's the music at 11:00? It sounds like something from Pink Floyd. Thanks.

    • @rupertgarcia
      @rupertgarcia 3 years ago +3

      I don't need sleep. I need answers!

    • @davidg5898
      @davidg5898 3 years ago +4

      ruclips.net/video/THihnuQJHF4/видео.html

    • @jakub_simik
      @jakub_simik 3 years ago +3

      @@davidg5898 Thank you so much.

  • @grandreddithotel8059
    @grandreddithotel8059 3 years ago

    I know of a cool thought experiment regarding not artificial intelligence but artificial consciousness on digital architecture. It goes like:
    - all digital computation can be modeled by finite state machines
    - finite state machines can be expressed on pen and paper
    - therefore all digital computation can be expressed on pen and paper: pause during a CPU cycle and you can write the contents of your memory and registers on paper. This would take a lot of paper for one CPU cycle, let alone the trillions of CPU cycles a computer executes over the span of a day. However, it is theoretically possible.
    So, if consciousness could be achieved on a digital computer, then it could also be achieved by a very long book. I don't think consciousness can be achieved by any digital system, but it's still fun to think about.

  • @Texplainedeverythingdetailed
    @Texplainedeverythingdetailed 3 years ago +5

    If someone starts using things like femtojoules, I believe them. No questions asked.

  • @youdrakkar
    @youdrakkar 2 years ago +1

    I bet Derek was definitely inspired by your wonderful video for his recent take on analog computation.

  • @EweChewBrrr01
    @EweChewBrrr01 3 years ago +4

    I have no idea why I thought I could watch this and understand what's going on. Haven't even had my morning coffee yet.

  • @zorgonfire
    @zorgonfire 3 years ago

    Very clear explanation of a very very hard domain of CS.

  • @unchartedthoughts7527
    @unchartedthoughts7527 3 years ago +3

    0:00 - 1:45 *Oh man, I was thinking about that staring at a potentiometer, get me some vacuum tubes bois, we are going to Rome*

  • @aceofspades001
    @aceofspades001 3 years ago +2

    You know how dumb I am? I was so fully immersed in the video that I thought the ad was part of the topic! I'm dumb 🤣

  • @glazzinfo6031
    @glazzinfo6031 3 years ago +5

    Sir you are "Brilliant"

  • @peregrinesantos7483
    @peregrinesantos7483 3 years ago +1

    Analog computers remain the fastest known. In the 80s, analog multipliers were already operating in the tens-of-gigahertz range.

  • @latemhh5577
    @latemhh5577 3 years ago +11

    This really is a masterpiece

  • @robertmclean6629
    @robertmclean6629 3 years ago

    It takes fewer inputs and less energy to compute in a "wet" environment. Air-gapped transistors are going to become antiquated and relegated to low-cost switching and basic computing.
    It might be easier and more efficient to compute using "wet" chemistry in situ, rather than building up voltage/amperage to jump air gaps with noisy forward voltage to fulfill potential.
    Just a thought. Have a great day.

  • @raykent3211
    @raykent3211 3 years ago +8

    Excellent video, thank you. I don't think it's the analogue aspect that can make AI indecipherable. In a purely digital neural network, the trail of causation that resulted in certain weightings (and therefore a decision) is irreversible. I'm fascinated and worried by recent discussions of how a trained system could have inbuilt prejudice that can't be proven.

    • @faustin289
      @faustin289 3 years ago

      This is no different from how decisions are formed in the human mind. It has been observed that we (not sure who "we" is) make decisions and our conscious self then tries to rationalize those decisions after the fact.

    • @fofopads4450
      @fofopads4450 3 years ago +1

      It is not indecipherable, just not easy to decipher, because you end up needing more computational power than the AI itself consumes just to monitor what the AI is doing, which defeats the purpose.

  • @wood6454
    @wood6454 2 years ago +1

    86 billion processing units in my brain and I still can't add two-digit numbers

  • @user-pc5sc7zi9j
    @user-pc5sc7zi9j 3 years ago +3

    What is this "widespread availability of GPUs" he is talking about?

  • @tonysu8860
    @tonysu8860 3 years ago

    This only covers the topic at the 30,000-foot level...
    Approximately what laypersons can find and understand quickly.
    I was hoping to go at least one level deeper: an introduction to the actual computations, why this GPU hardware is appropriate for ML, and the likely direction of evolving designs.

  • @theonetruemorty4078
    @theonetruemorty4078 3 years ago +4

    Quantum indeterminacy is required; there's no such thing as "artificial." What we call "consciousness" is the portion of a calculatory apparatus that dwells within a probability distribution, in a similar manner to which eyes dwell in the world of partial electromagnetic-spectrum wavelength variation. Analyses of visual-spectrum wavelengths are communicated to the visual cortex via a shared communication protocol; the visual spectrum itself does not dwell within the same domain as the eyes. In a similar fashion, what humans derogatively refer to as the "subconscious" mind acts as an interpreter of data received from the "conscious" mind reporting from the front line of a probabilistic domain; the "subconscious" does not dwell in the same domain as the "conscious." The deterministic informs the probabilistic, the probabilistic guides the deterministic, feedback-loop paradox party time ensues; this is the strange and largely misunderstood process that we refer to as free will. (disclaimer: don't listen to anything I say, I've clearly taken too many psychedelics, cheers)

  • @bigbeneconotmyjob6474
    @bigbeneconotmyjob6474 3 years ago

    I'm huge into RRAM / memristor technology. I wish you could talk about it more, but I understand that the video is a general overview and not too in-depth on the sub-subjects.
    With that said, some comments on memristors (as of my last research in late 2019): Getting memristors to be accurately set and to operate predictably took some time, but in 2018 a team made a memristor that could be accurately set to 64 different states, or 6 bits of equivalent precision, which solved the first hurdle (I think it still used cobalt, which is a material everyone wants to avoid due to its supply being monopolized by slave labor). That is just the memristor itself; there are circuit-reading techniques that further enhance memristor accuracy, as well as circuits that perform the calculation directly with memristors (combined into a perceptron module), reducing the need for ADCs and DACs and improving speed and energy efficiency. Three main things still need to be improved to make the technology feasible. First, reading the data destroys it (the memristor changes state once read), so a system to automatically reprogram the resistor is needed (not the hardest problem; we simply haven't gotten to that point in prototype research, as people are still debating memristor design/chemistry types). Second, setting such a system into a matrix array with all the needed support circuitry (i.e. figuring out a good architecture, which for now is putting the cart before the horse). Third, tooling, which is also the major cost: producing memristors uses techniques that aren't standard, so making that easier is needed to ease adoption.

  • @Lukegear
    @Lukegear 3 years ago +18

    new mind hardware xD

  • @pranjalmittal
    @pranjalmittal 1 year ago

    4:35 It was mentioned that memory transfer accounts for the vast majority of time and power consumed, and later the video discusses analog systems as a way to optimize the computation (accumulative matrix multiplication). But shouldn't the transfer speed be the focus of the optimization, given that it's the main time and energy bottleneck? Something like using photonics for storing/transferring data to make it faster with a lower energy footprint, like what Lightmatter (the company) is doing.

  • @SciHeartJourney
    @SciHeartJourney 3 years ago +2

    OMG, I've found my calling!
    I'm a pro with op-amps and transistors! I know digital design and computer architecture very well too. I'm excited! 🤗

  • @generalx5220
    @generalx5220 3 years ago +3

    Wow! I’m now woke AF in the understanding of AI

  • @trumanhw
    @trumanhw 3 years ago

    Brilliant, the adjective -- not the noun ... made this video possible.
    (truly fantastic quality in every metric I can think of; THANK YOU!)

  • @shoam2103
    @shoam2103 3 years ago +1

    3:05 The most terse summary of current ML tech, and rather accurate too, I'd think!

  • @shairozsohail1059
    @shairozsohail1059 3 years ago +2

    Been an AI researcher for years and learned a lot from this video. Thanks

  • @hoaxuan7074
    @hoaxuan7074 3 years ago

    The fast Hadamard transform basically needs only patterns of add and subtract operations, requiring only a few transistors per operation on a chip.
    Then you can make fast-transform, fixed-filter-bank neural nets.
    The 2-point transform of a,b is
    a+b, a-b. It is self-inverse:
    (a+b)+(a-b), (a+b)-(a-b)
    = 2a, 2b.
    To get the 4-point transform of a,b,c,d, form two 2-point transforms. Then form the sum and difference of the sum terms (the alike terms) (a+b), (c+d), and the sum and difference of the difference terms (a-b), (c-d). Done.
    At each stage you sum and difference alike terms.
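
    A runnable sketch of that recursion (editor's transcription of the comment's scheme, not anything from the video): each stage sums and differences alike terms, so an n-point transform costs only n*log2(n) additions/subtractions, and applying it twice returns the input scaled by n.

    ```python
    def fwht(x):
        """Fast Walsh-Hadamard transform using only adds and subtracts."""
        x = list(x)
        h = 1
        while h < len(x):                  # log2(n) stages
            for i in range(0, len(x), 2 * h):
                for j in range(i, i + h):  # sum/difference alike terms
                    x[j], x[j + h] = x[j] + x[j + h], x[j] - x[j + h]
            h *= 2
        return x

    a = [1.0, 2.0, 3.0, 4.0]
    t = fwht(a)                            # [10, -2, -4, 0]
    print([v / len(a) for v in fwht(t)])   # recovers [1, 2, 3, 4]
    ```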

  • @yepyep266
    @yepyep266 3 years ago +1

    Computing has to become analog to get to the next level. In a sense, going digital effectively compresses a whole signal down to a single bit of information.

  • @UmairHussaini
    @UmairHussaini 3 years ago +1

    Excellently explained!

  • @simepaul4882
    @simepaul4882 3 years ago +1

    What a well-made video. The narration is so logical...

  • @evennot
    @evennot 3 years ago

    I thought this video was about massive parallelism. It's basically the high-rise elevator dilemma, but for chips.
    Also, analog computing is cool. For instance, it can computationally solve differential equations with a ton of parameters. Basically, you can recreate any physical function with custom analog schematics and compute it instantly. The only downside is that it's too expensive to build on a chip (in terms of engineering). So I think it's just an act of economic balancing: cost of analog ASIC engineering + production + programming > cost of consumer-grade engineering + production + programming. The trend might reverse only when consumer chips incorporate analog processing (as a natural evolution due to clock-frequency and nanotechnology limitations).

  • @andrewharbit7449
    @andrewharbit7449 2 years ago

    Tuning is the very process we ourselves use to perfect our understanding of the environment around us. As newborns, we are bombarded with information through our senses; this information could be seen as static. As time goes on we adjust and begin tuning our sensors. The static noise that we were born with never goes away; we simply tune into signals that benefit our existence. Since the human brain is such an effective system, and it appears to utilize frequency modulation in forming its model of the environment, it would make sense to take another look at analog computing.

  • @mikeall7012
    @mikeall7012 3 years ago

    I assume you were referring to home electronics when referring to the commercial use of analog computers? Analog computing was a mainstay of manufacturing and power-plant control systems up through the 90s. Many old plants still use analog computers for control systems, since they have yet to upgrade. While certainly obsolete, analog computers make wonderful control systems since they do not rely on discrete sampling. I have worked on both digital and analog systems. I even work at a plant that has a mechanical computer used for pressure control.

  • @bits_of_michel
    @bits_of_michel 3 years ago

    This is one of the best YouTube videos I've seen in my life. Incredible visuals and explanation. Thank you.

  • @cavemann_
    @cavemann_ 3 years ago +1

    It's amazing how far computer science can go

  • @elchicovip01
    @elchicovip01 3 years ago

    I use Vulkan FP16, aka half precision, for image upscaling. Yeah, half precision is less accurate, but it's completely worth the performance boost.

  • @JayJay-ki4mi
    @JayJay-ki4mi 1 year ago +2

    Analogue systems are like the human brain. They require the right temperature, for example. With that in mind, when designing an analogue AI system you must also factor in the environment. It's the responsibility of the designer (call him God if you will) to create the correct environment (temperature, noise, etc.) for the AI to survive and do its task. I recently quit working in software. I now stare at an oscilloscope all day. Analogue computers never died; it's just that people didn't know about them until now.

  • @derekwood8145
    @derekwood8145 3 years ago

    One huge downside to analog computing, beyond what you mentioned, is that analog circuits are typically "low power" but have high energy consumption.
    Digital logic (i.e. CMOS) consumes little power when static (when the clock signal is stable), whereas most analog designs have much lower peak power consumption but consume power all the time. To decrease power consumption you have to introduce additional gates, dramatically reducing the benefits of analog systems and making them essentially a digital/analog hybrid design.
    It's really impressive you could convey these sorts of details in a coherent 13-minute video; kudos.
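
    A first-order illustration of that trade-off (editor's sketch; all numbers assumed): dynamic CMOS power follows P = a*C*V^2*f and vanishes when the clock stops, while an analog bias current burns P = V*I around the clock.

    ```python
    alpha, C, V, f = 0.1, 1e-9, 1.0, 1e9  # activity, switched capacitance (F), volts, Hz
    p_dynamic = alpha * C * V**2 * f      # 0.1 W while clocking, ~0 W when idle

    i_bias = 0.02                         # 20 mA of always-on analog bias (assumed)
    p_analog = V * i_bias                 # 0.02 W, but drawn continuously
    print(p_dynamic, p_analog)
    ```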

  • @Jluna16
    @Jluna16 3 years ago +1

    This man went from one shit I don’t understand to another shit I don’t understand all video long 😂

  • @Eo_Tunun
    @Eo_Tunun 3 years ago

    Analog computing *was* successfully used in design for decades! Car manufacturers used analog computers (one should rather call them electronic simulators, I think, for there are no real computations going on in them; they simulate systems in an electronic equivalent) to evaluate the behaviour of suspensions in various conditions. In aircraft design, analog simulations were used to predict and prevent wing flutter. I'm not sure, but I vaguely remember that Professor Eppler's electronic wind tunnel, on which he developed his aerofoils, many of which helped some great aircraft become the successes they were, was such an analog machine.
    All sorts of engineering problems were simulated with such machines, and their output was (economically) successfully applied. Simulations on digital computers had a hard time catching up with them until well into the seventies, and it took computers with the oomph of a Cray-2 and its likes to really make them obsolete in that field.
    Remember why the F-117 was that edgy? It was designed using Cray-1s in the late seventies. Despite being absolute top-notch machines of their day, the Cray-1 didn't quite have the calculation power to compute all the possible radar reflections in a feasible timeframe, so the designers went for a shape the computers could handle. Use computers that are 20 years ahead, and you get a B-2 or an F-35.

  • @marekdyjor
    @marekdyjor 3 years ago

    I must say it's very interesting. It's a kind of return to the roots of neural networks; the first perceptron was an analog system. Analog machines were very efficient at modelling physical processes, but were complicated and needed lots of maintenance, so we abandoned the technology for purely digital machines.

  • @bilalbzaka5152
    @bilalbzaka5152 3 years ago +2

    This guy can make anything sound thrilling and fun to watch

  • @A.Mere.Creator
    @A.Mere.Creator 3 years ago +1

    The part about the human brain blew my.... mind.

  • @HexDani
    @HexDani 3 years ago +1

    Man this was next level