Tensor Processing Units: History and hardware

  • Published: 19 Nov 2024

Comments • 31

  • @saiviswanth450
    @saiviswanth450 4 months ago +2

    Excellent explanation of the TPU. The example with the letters really made the difference between CPU, GPU & TPU stick in my mind.

  • @arbeen
    @arbeen 3 years ago +17

    Here because of the Pixel 6 announcement. 😂

  • @cristher5685
    @cristher5685 3 years ago +20

    Great analogy of CPU, GPU and TPU =)

    • @_zeeblo
      @_zeeblo 11 months ago

      Yeah, when I heard the analogy I was like "aaah"

  • @HeavyMetal56339
    @HeavyMetal56339 4 years ago +4

    very awesome explanation

  • @thehazeofficial6038
    @thehazeofficial6038 2 years ago +4

    Very nice explanation for beginners like me.

  • @rosylisa1997
    @rosylisa1997 2 years ago +1

    thank you :) but
    where's the link plz?

  • @luizcarlosv.b.das.junior7477
    @luizcarlosv.b.das.junior7477 2 years ago +1

    Thanks for the video!

  • @zilog1
    @zilog1 5 months ago +1

    There is a deep bass that can be heard with nice headphones. AC?

  • @kralslosh
    @kralslosh 1 year ago

    nice explanation, thanks

  • @ncknelson
    @ncknelson 3 years ago +2

    Who's watching here after Google announced they will be using a custom-designed Tensor SoC?

  • @PunmasterSTP
    @PunmasterSTP 2 years ago

    TPU? More like "Totally great information for you." Thanks for sharing!

  • @TheSireeshKumar
    @TheSireeshKumar 3 years ago +1

    I saw it last year and came here again for the Pixel 6.

  • @vtrandal
    @vtrandal 1 year ago

    Please explain the quantization technique Google used to map 32-bit values down to 8 bits in TPU v1.
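
    A rough sketch of the general idea (not from the video; the symmetric range, zero point of 0, and function names below are illustrative only, and TPU v1's exact scheme may differ): linear quantization picks a scale that maps a tensor's float32 range onto the 256 levels of an int8, so the matrix multiplies can run on 8-bit integer hardware and be dequantized afterwards.

    import numpy as np

    def quantize_int8(x, scale):
        # Map float32 values onto the int8 grid and saturate to [-128, 127].
        return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

    def dequantize_int8(q, scale):
        # Recover approximate float32 values from the int8 representation.
        return q.astype(np.float32) * scale

    # Illustrative weights in [-1, 1]; symmetric quantization with zero point 0.
    weights = np.random.uniform(-1.0, 1.0, size=8).astype(np.float32)
    scale = np.abs(weights).max() / 127.0
    q = quantize_int8(weights, scale)
    print(weights)
    print(q)
    print(dequantize_int8(q, scale))  # close to the originals, small rounding error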

  • @sarbajitg
    @sarbajitg 2 years ago +1

    So is the TPU a replacement for only the GPU, or for both the CPU and GPU?

    • @PunmasterSTP
      @PunmasterSTP 2 years ago

      I think a TPU is more like a piece of hardware geared towards certain applications such as deep learning. As there should still be a need for both general computation and computer graphics, I think CPUs and GPUs should be here to stay.

    • @ConsciusVeritasVids
      @ConsciusVeritasVids 2 years ago +1

      TPUs are geared toward neural network machine learning.
      I use Google's TPU cloud computing to process multimodal AI image generation.

  • @stackexchange7353
    @stackexchange7353 4 years ago +1

    Thanks!

  • @jominksimon9296
    @jominksimon9296 3 years ago +1

    My Pixel 6😍

  • @abdontroche
    @abdontroche 4 years ago +1

    Amazing! Which other tasks could be targets for new specialized processors?

  • @shahvaze12
    @shahvaze12 3 years ago +1

    Pixel 6 will use it, yes!

  • @wowepic2256
    @wowepic2256 4 years ago +1

    any higher precision?

  • @someghosts
    @someghosts 8 months ago

    Why are there weird sub-bass / low notes in this video?

  • @hullopes
    @hullopes 3 years ago +1

    Which do you think is best for training TensorFlow models: GPU or TPU?

    • @stasgurevich7786
      @stasgurevich7786 1 year ago +1

      Well, you can try this yourself in a Colab notebook: you can select GPUs like the T4, V100 or A100, or a TPU. I tried this experiment (but with torch, not tf, and for inference only) and got pretty disappointing results. The TPUs have only 16 GB of memory and were slower than the slowest GPU. Maybe TPUs use fewer watts, maybe the Colab instances have a suboptimal config, maybe torch performance on TPU is bad; anyway, an interesting experience :)
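
      For anyone who wants to repeat that kind of check, here is a minimal sketch. It assumes a Colab runtime with either a CUDA GPU or a TPU attached (the TPU path needs the torch_xla package installed), and the matrix-multiply loop is only a placeholder workload, not a real model.

      import time
      import torch

      # Pick whichever accelerator the runtime provides.
      if torch.cuda.is_available():
          device = torch.device("cuda")              # T4 / V100 / A100 runtimes
      else:
          try:
              import torch_xla.core.xla_model as xm  # TPU runtimes need torch_xla
              device = xm.xla_device()
          except ImportError:
              device = torch.device("cpu")           # fallback so the sketch still runs

      # Placeholder workload: batched matrix multiplies as a rough inference proxy.
      x = torch.randn(16, 1024, 1024, device=device)
      w = torch.randn(1024, 1024, device=device)

      start = time.time()
      for _ in range(10):
          y = x @ w
      y = y.cpu()  # force execution on async/lazy backends before stopping the clock
      print(device, round(time.time() - start, 3), "seconds")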

  • @joehsiao6224
    @joehsiao6224 3 years ago +1

    Apple silicon joining the room.

  • @shahvaze12
    @shahvaze12 3 years ago +1

    and 6 pro

  • @wffff2
    @wffff2 11 months ago

    This video is getting too few likes and views given the AI hype right now.

  • @Showmetas
    @Showmetas 1 year ago

    I was hoping to actually see some, but no, just boring graphs. How does this video get 44,000 views while our videos get 230 views over 3 years?