I think a TPU is more like a piece of hardware geared towards certain applications such as deep learning. As there should still be a need for both general computation and computer graphics, I think CPUs and GPUs should be here to stay.
Well, you can try this yourself. Use a Colab notebook; you can select GPUs like the T4, V100 or A100, or a TPU. I tried this experiment (but with torch, not tf, and for inference only) and got pretty disappointing results. The TPUs have only 16 GB of memory and they were slower than the slowest GPU. Maybe TPUs use fewer watts, maybe Colab instances have a suboptimal config, maybe torch performance is bad; anyway, an interesting experience :)
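A comparison like the one above needs consistent timing. Here is a minimal, hedged sketch of a benchmark harness in pure Python (the dummy workload is just an illustration; on Colab you would time a real model's forward pass instead, and the `warmup`/`runs` counts are arbitrary choices):

```python
import time

def benchmark(fn, *args, warmup=3, runs=10):
    """Time a callable: a few warmup calls first (to exclude one-time
    setup costs such as compilation or cache fills), then return the
    average wall-clock seconds per call over `runs` timed calls."""
    for _ in range(warmup):
        fn(*args)
    start = time.perf_counter()
    for _ in range(runs):
        fn(*args)
    return (time.perf_counter() - start) / runs

# Hypothetical usage with a stand-in workload:
avg_seconds = benchmark(lambda: sum(x * x for x in range(10_000)))
```

Warmup matters especially on accelerators, where the first call often triggers compilation and can be orders of magnitude slower than steady-state calls.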
Excellent explanation of the TPU; the example with the letters really fixed the difference between CPU, GPU & TPU in my mind.
Here because of Pixel 6 announcement. 😂
Me 2!
Great analogy of CPU, GPU and TPU =)
yeah when I heard the analogy I was like "aaah"
very awesome explanation
very nice explanation for beginners like me
Glad to hear that
thank you :) but
where's the link plz?
Thanks for the video!
There is a deep bass that can be heard with nice headphones. AC?
nice explanation, thanks
Who's watching here after Google announced they will be using a custom-designed Tensor SoC?
TPU? More like "Totally great information for you." Thanks for sharing!
I have seen it last year and came here again for Pixel 6
Please explain the quantization technique where Google mapped 32-bit values down to 8 bits in the TPU v1.
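Roughly, that kind of quantization maps a range of floating-point values onto 8-bit integers via a scale and a zero point. Below is a generic sketch of affine int8 quantization in pure Python; it is an illustration of the general idea, not Google's exact TPU v1 scheme (the function names and the clamping range are my own choices):

```python
def quantize_int8(values):
    """Affine quantization: map floats onto the int8 range [-128, 127].

    Returns (quantized_values, scale, zero_point). `scale` is the float
    step each int8 level represents; `zero_point` is the int8 value that
    corresponds to 0.0 in float space.
    """
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255.0 or 1.0          # guard against a zero range
    zero_point = round(-128 - lo / scale)      # int8 code for float 0.0
    q = [max(-128, min(127, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map int8 codes back to (approximate) floats."""
    return [(qi - zero_point) * scale for qi in q]
```

The payoff is that 8-bit multiplies are far cheaper in silicon than 32-bit float multiplies, at the cost of a bounded rounding error (at most about one `scale` step per value), which neural network inference usually tolerates well.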
So is the TPU a replacement for only the GPU, or a replacement for both the CPU and GPU?
TPUs are geared toward neural network machine learning.
I use Google's TPU cloud computing to process multimodal AI image generation.
Thanks!
My Pixel 6😍
Amazing! Which other tasks can be a target for a new specific processor?
Pixel 6 will use it, yess
any higher precision?
Why are there weird sub-bass / low notes in this video?
Which do you think is the best for TensorFlow training models: GPU or TPU?
Apple silicon joining the room.
and 6 pro
This video is getting too few likes and views, given the AI hype right now
I was hoping to actually see some, but no, just boring graphs. How does this video get 44,000 views while our videos get 230 views over 3 years?