Comments •

  • @samkelosibongakonke5003
    @samkelosibongakonke5003 2 years ago +13

    Great Overview👌, I never get tired of watching these videos. There's always something new to learn and gain

  • @gasasira
    @gasasira 2 years ago +3

    Excellent explanation Dr. Nicole Saulnier. Proud to work with such a brilliant team of Research Scientists like you.

  • @rock3tcatU233
    @rock3tcatU233 9 months ago +2

    This was a great explanation of in-memory computing.

  • @kartikpodugu
    @kartikpodugu 11 months ago

    Excellent.
    I had heard about in-memory compute before, but I learned the details here.

  • @helsonkumar8173
    @helsonkumar8173 2 years ago +4

    Such a detailed explanation. Thank You.

  • @Flankymanga
    @Flankymanga 2 years ago +4

    From my understanding: you need to design physical chips that behave similarly to brain neurons, where each neuron has its computational part and its memory colocated?

  • @newdar-ff5bz
    @newdar-ff5bz 2 months ago

    great explanation❤

  • @satheeshan
    @satheeshan 2 years ago

    Nice description!

  • @edbertkwesi4931
    @edbertkwesi4931 3 months ago

    The lady is clearer in her explanation than my lecturer.

  • @vivekpandya2163
    @vivekpandya2163 4 months ago +1

    How can the registers, i.e. the W weight values, be changed dynamically? Something similar to an FPGA?

    • @jachymfibir
      @jachymfibir 2 months ago

      Nah, this can be done using flash memory, where electrons/charge are stored in a potential trap; instead of using discrete levels like in 3D NAND, you use continuous amounts of charge to represent varying values of G.
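
      To make that idea concrete, here is a minimal Python sketch of an idealized crossbar, assuming the weights are simply stored conductances (the Crossbar class and its values are illustrative only, not IBM's hardware or any real programming interface): the multiply-accumulate is just Ohm's law plus Kirchhoff's current law, and "reprogramming" means rewriting the stored G values in place.

      ```python
      import numpy as np

      # Idealized crossbar: each weight lives in the array as a conductance G[i][j].
      # Applying input voltages V and summing the column currents performs the
      # multiply-accumulate I_j = sum_i V_i * G_ij, with no weight fetch at all.
      class Crossbar:
          def __init__(self, conductances):
              self.G = np.asarray(conductances, dtype=float)

          def program(self, new_conductances):
              # "Reprogramming" = changing the stored charge/state of each cell,
              # loosely analogous to reconfiguring an FPGA.
              self.G = np.asarray(new_conductances, dtype=float)

          def multiply(self, voltages):
              return np.asarray(voltages, dtype=float) @ self.G

      xbar = Crossbar([[0.1, 0.3],
                       [0.2, 0.4]])
      print(xbar.multiply([1.0, 0.5]))   # [0.2 0.5]
      xbar.program([[0.5, 0.1],
                    [0.0, 0.2]])         # weights updated in place, same chip
      print(xbar.multiply([1.0, 0.5]))   # [0.5 0.2]
      ```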

  • @cybergame.
    @cybergame. 2 years ago +2

    Nice video

  • @carniv0t
    @carniv0t 1 year ago

    But how are the G variables with their different values set into the matrix? I guess they need to be modified when training the neural network, so they can't be set by lithography when making the chip. But how is it done afterwards, when working with the chip?

    • @jachymfibir
      @jachymfibir 2 months ago

      This is done using flash memory, where electrons/charge are stored in a potential trap; instead of using discrete levels like in 3D NAND, you use continuous amounts of charge to represent varying values of G.

  • @kimisaacbuelagala1314
    @kimisaacbuelagala1314 2 years ago +3

    analog computers are back

  • @loveyourneighbors4406
    @loveyourneighbors4406 2 years ago

    It would be great if there were a video that explains stateless vs stateful.

    • @IBMTechnology
      @IBMTechnology 2 years ago +1

      Thank you for the suggestion! We're looking into it. Don't forget to subscribe so you'll know when we publish "Stateless vs Stateful explained."

  • @mayaq8324
    @mayaq8324 2 years ago +1

    What do you mean by “memory” (at the beginning of the clip)? Give an example... RAM?

    • @thelonespeaker
      @thelonespeaker 2 years ago

      Yes, it is RAM. During the forward propagation pass (and likewise the backward pass), computations are performed, as usual, by the ALU, so it implies fetch-decode-execute steps that inevitably rely on the buses mentioned in the first diagram. Some of these limitations can be overcome by GPUs thanks to heavier data loads and parallel buses between memory and the streaming multiprocessors. Still way slower than doing everything locally, though.

    • @jachymfibir
      @jachymfibir 2 months ago

      @@thelonespeaker Actually, RAM uses more power than flash, as flash can store the weights (G) with no power at all.
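
      To put rough numbers on the data-movement point in this thread, here is a toy Python count of bus transfers (the two functions and their counts are a hypothetical back-of-the-envelope model, not measurements of any real CPU, GPU, or IBM chip): in the von Neumann case every multiply-accumulate first moves a weight and an activation to the ALU, while an in-memory array only moves the inputs in and the results out.

      ```python
      import numpy as np

      # Hypothetical transfer counts for one matrix-vector multiply.
      def von_neumann_mac(weights, inputs):
          transfers = 0
          out = np.zeros(weights.shape[1])
          for i in range(weights.shape[0]):
              for j in range(weights.shape[1]):
                  w = weights[i, j]   # fetch weight over the bus
                  x = inputs[i]       # fetch activation over the bus
                  transfers += 2
                  out[j] += w * x     # execute in the ALU
          return out, transfers

      def in_memory_mac(weights, inputs):
          # Weights stay in the array; only inputs and column outputs move.
          return inputs @ weights, len(inputs) + weights.shape[1]

      W = np.random.rand(512, 512)
      x = np.random.rand(512)
      print(von_neumann_mac(W, x)[1])   # 524288 transfers
      print(in_memory_mac(W, x)[1])     # 1024 transfers
      ```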

  • @randomcraft2345
    @randomcraft2345 1 year ago

    I can't understand what is stored in such memory. Is V or G the variable?

    • @randomcraft2345
      @randomcraft2345 1 year ago

      OK. From the site in the description, I understand that G is the variable.

  • @geeknerd763
    @geeknerd763 6 months ago

    How does it work for negative weights?

    • @jachymfibir
      @jachymfibir 2 months ago

      As far as I understand, using flash memory for this you can only use positive charge to increase the resistance, so you would need to normalize to remove the negative values.
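
      One way to realize that "remove the negatives" step is sketched below in Python, assuming a simple offset encoding (real designs often use a pair of cells per weight instead, but the offset trick keeps the sketch to a single array, and none of this is confirmed to be what IBM's chips do): shift the signed weights so every stored conductance is non-negative, then subtract the offset term digitally after the analog multiply-accumulate.

      ```python
      import numpy as np

      # Assumed offset encoding of signed weights into non-negative conductances.
      def encode(weights):
          offset = max(0.0, -weights.min())   # shift so all stored values are >= 0
          return weights + offset, offset

      def mac_with_offset(g_nonneg, offset, x):
          raw = x @ g_nonneg                  # what the analog array would return
          return raw - offset * x.sum()       # undo the shift digitally

      W = np.array([[ 0.5, -0.2],
                    [-0.4,  0.3]])
      x = np.array([1.0, 2.0])
      G, off = encode(W)
      print(mac_with_offset(G, off, x))       # [-0.3  0.4], same as x @ W
      ```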

  • @velo1337
    @velo1337 2 years ago

    How far along is IBM with in-memory computing?

  • @monstercameron
    @monstercameron 2 years ago

    I wish I was smart enough to enter the AI field, this is the future.

    • @icekk642
      @icekk642 1 year ago

      it's never too late bud

  • @Abu_Shawarib
    @Abu_Shawarib 2 years ago +1

    Using analog logic to compute on memory. Interesting.

  • @ritheshofficial
    @ritheshofficial 10 months ago +1

    Wait, is she actually writing in reverse, or is it being reversed in real time or something? My apologies if it's something common; there were no digital boards back when I was in school.

    • @IBMTechnology
      @IBMTechnology 10 months ago +1

      See ibm.biz/write-backwards

    • @ritheshofficial
      @ritheshofficial 10 months ago +1

      @IBMTechnology Wow, that's actually clever and simple. It only reinforces my belief that the best solutions are often the simplest ones.

  • @pushpagautam9173
    @pushpagautam9173 1 year ago +2

    How is she writing that way?

    • @quentinpolet4788
      @quentinpolet4788 9 months ago +1

      Just write as usual, then flip the video

  • @SR_M0L1NA
    @SR_M0L1NA 2 years ago +1

    🤔 I am thinking that maybe it is possible to replace those resistors with digital potentiometers in rapid-prototyping circuits to experiment with them, as a hybrid between an FPGA and resistors, or to include that part of the circuit (the transistors plus these resistors) already pre-designed. The rest of the circuit (the memory, demux, D/A, A/D, etc.) would be designed by the engineer or student.

    • @SR_M0L1NA
      @SR_M0L1NA 2 years ago

      A digital potentiometer typically has 5 to 8 bits, controlling 31 to 255 resistor steps respectively; that is 0.3922% resolution for 8 bits, while analog potentiometers offer about 2%. In this way AI systems could be designed for home use or small applications at very low cost. A 10-bit part would control 1023 steps, giving a resolution of 0.0978%.
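
      Those resolution figures follow directly from the step count of an n-bit potentiometer (2^n − 1 steps); a quick Python check:

      ```python
      # Step size of an n-bit digital potentiometer as a fraction of full scale.
      for bits in (5, 8, 10):
          steps = 2**bits - 1
          print(f"{bits:>2} bits: {steps:4d} steps, resolution {100 / steps:.4f}%")

      #  5 bits:   31 steps, resolution 3.2258%
      #  8 bits:  255 steps, resolution 0.3922%
      # 10 bits: 1023 steps, resolution 0.0978%
      ```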

  • @balmytsunami
    @balmytsunami 2 years ago +9

    Is she writing backwards?

    • @nand3kudasai
      @nand3kudasai 1 year ago

      Probably not. A nice trick is having her write the normal way and then flipping the video horizontally. If you notice, it *seems* she's using her left hand, which is statistically improbable, but if the video is flipped horizontally then she's writing with her right hand.
      In this other IBM video this person also "seems" to be using his left hand, but if flipped it would be his right hand. So...
      ruclips.net/video/b61DPVFX03I/видео.html

  • @josequinton6940
    @josequinton6940 9 months ago

    Watched a doctor explain sugars & honey just an hour ago, using clear glass(?) & various colored markers, writing backwards like this Chancellor lady... man, I want to know what the trick is.

  • @TwentyNineJP
    @TwentyNineJP 3 months ago

    Customer service chatbots aren't a good example to evoke a positive reaction 😑

  • @beautifulsmall
    @beautifulsmall 1 year ago

    Looks like we did indeed move on. Elon Musk and others are saying stop the llamas.

  • @aryansharma-wf9eh
    @aryansharma-wf9eh 8 months ago

    Did she learn how to write backwards just for this video, or am I missing something?

    • @TwentyNineJP
      @TwentyNineJP 3 months ago

      Very late reply, but the video is probably mirrored left to right. You can see that her ring is on what would be her right hand if the video isn't flipped, and that she also appears to be left-handed if it isn't flipped.
      Both of those are possible, without a doubt, but it's more likely that the video is mirrored.