Tensor Products are just Matrix Multiplication, Seriously.

  • Published: 30 Jun 2024
  • Tensor products are the first step toward a theoretical framework for tensorial data, that is, scalars stored in arrays and grids. We define them beginning with matrix multiplication, deriving their universal property.
  • Science

Comments • 17

  • @leonjones7120
    @leonjones7120 9 months ago +1

    Thank you, guys. I am looking into particle physics and need a stronger comprehension of tensors. This has helped me! Thanks!

  • @Dhruvbala
    @Dhruvbala 8 months ago +1

    This seems profound but I lack the background knowledge to make sense of it. Senior undergrad here, just recently introduced to the tensor product in the context of quantum entanglement. I look forward to revisiting this video at some point when I’m ready 😁

  • @tarsioonofrio
    @tarsioonofrio 2 months ago

    James Wilson, I LOVE YOU!!!!

  • @rationalbelief4451
    @rationalbelief4451 3 months ago

    Thanks a lot master...

  • @user-sv5vb1mj1q
    @user-sv5vb1mj1q 2 years ago

    Also, what do you think about Clifford algebra? Could it be a more general case than tensor algebra?

    • @shaunticlair
      @shaunticlair A year ago

      Late reply, but for anyone curious (as a result of some googling just before this), it turns out tensor algebra is actually the more general one: Clifford algebra is less general (but that gives it lots of applications!)

    • @user-sv5vb1mj1q
      @user-sv5vb1mj1q A year ago +1

      @@shaunticlair Thanks a lot. Could you please make a video about it?

  • @DarioOliveri
    @DarioOliveri 3 months ago

    So if I have a tensor of size 5x6x7x3x2 and a tensor of size 5x6x7x2x4, the output should be a tensor of size 5x6x7x3x4, and to compute that I should just line up the 3x2 matrices of the first tensor along the 5x6x7 axes and multiply them with the 2x4 matrices of the second tensor... Correct???

    • @algeboy
      @algeboy  3 months ago

      Many configurations are possible with these data. One has to decide which axes are being contracted. Say 5 x 6 x 7 x 3 x 2 is to be contracted with 5 x 6 x 7 x 3 x 4 by matching the first three axes of each. Then the 5 x 6 x 7 axes contract and are no longer part of the output. This leaves 3 x 2 and 3 x 4, for a total of 3 x 2 x 3 x 4 as the resulting tensor.
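As a sketch of the two conventions in play in this exchange (the shapes come from the thread; neither snippet is from the video), NumPy makes the choice of contracted axes explicit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Batched matrix multiply: 5x6x7 are batch axes, the trailing matrices pair up.
A = rng.standard_normal((5, 6, 7, 3, 2))
B = rng.standard_normal((5, 6, 7, 2, 4))
batched = np.einsum('bcdij,bcdjk->bcdik', A, B)  # same as A @ B

# Full contraction of the first three axes, as described in the reply:
C = rng.standard_normal((5, 6, 7, 3, 4))
contracted = np.tensordot(A, C, axes=([0, 1, 2], [0, 1, 2]))

print(batched.shape, contracted.shape)  # (5, 6, 7, 3, 4) (3, 2, 3, 4)
```

The first line answers the original question (yes, batched matmul gives 5x6x7x3x4); the second shows the alternative contraction from the reply, which consumes the 5x6x7 axes entirely.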

    • @DarioOliveri
      @DarioOliveri 3 months ago

      @@algeboy I'm interested in machine learning. The basic step is the forward pass, so a weight matrix W, an input vector x, and a bias b add together as Wx+b. Of course I'm having difficulty generalizing to N-dimensional input. Say the layer has 17*17 neurons and the input is a 28*28 image; then I think I need some kind of operation that starts, I guess, from a 17*17*28*28 W tensor multiplied by a 28*28 x input, added to a 17*17 bias. I sort of miss the general algorithm for that. Thanks for the quick answer. Very good video.
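A minimal sketch of the generalized forward pass the comment describes (the 17x17 and 28x28 shapes come from the comment; all names are illustrative, not from the video). The key point is that contracting both input axes of W against x is the same dense layer one would get by flattening everything:

```python
import numpy as np

rng = np.random.default_rng(0)

W = rng.standard_normal((17, 17, 28, 28))  # one 28x28 weight grid per output neuron
x = rng.standard_normal((28, 28))          # input image
b = rng.standard_normal((17, 17))          # one bias per output neuron

# y[a,b] = sum_{i,j} W[a,b,i,j] * x[i,j] + bias[a,b]
y = np.einsum('abij,ij->ab', W, x) + b     # shape (17, 17)

# Equivalent flattened view: an ordinary dense layer with a (289, 784) matrix.
y_flat = W.reshape(17 * 17, 28 * 28) @ x.reshape(28 * 28) + b.reshape(-1)
print(np.allclose(y, y_flat.reshape(17, 17)))  # True
```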

    • @algeboy
      @algeboy  3 months ago

      You may do better with this video: ruclips.net/video/pQcIrO7eyW0/видео.htmlsi=ocPdVDnm1oodlLmy

  • @gaboqv
    @gaboqv 2 years ago +1

    A tensor product is a product taking two vector spaces to a third one that doesn't care about choices of bases in the inputs. This tensor product can be constructed intelligently in a basis-agnostic way by using properties of a particular basis. But it's not only that we can intelligently find a way to present the product: in fact we construct the tensor product formally by saying it is the partition of all the equivalent products between these two spaces. Mind-bendingly, all these products are exactly the kernel of one function with mappings BxV->K as inputs, and something that still isn't clear to me as outputs (maybe also other mappings?). It seems that with your table construction you could establish which products would be exactly in the kernel, and this could now be written with the special circled-x product.

    I don't know why I didn't watch this video before; maybe it was a little convoluted, but I appreciate that you published it. I came back to it after 5 or 6 months, getting into AI and wanting to grasp a little more theory, to at least know when people are using tensors in a deeper way than np arrays.
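For reference, the standard formalization this comment gestures at (not stated in the video itself): the tensor product is the quotient of the free vector space on V x W by the subspace of bilinearity relations, and it is characterized by a universal property.

```latex
% V \otimes W := F(V \times W) / R, where F(V \times W) is the free vector
% space on pairs and R is spanned by the bilinearity relations:
(v_1 + v_2, w) - (v_1, w) - (v_2, w), \quad
(v, w_1 + w_2) - (v, w_1) - (v, w_2), \quad
(\lambda v, w) - \lambda (v, w), \quad
(v, \lambda w) - \lambda (v, w).

% Universal property: for every bilinear map \beta : V \times W \to U
% there is a unique linear map \tilde{\beta} : V \otimes W \to U with
\beta(v, w) = \tilde{\beta}(v \otimes w).
```

The "kernel" the commenter mentions is exactly the relation subspace R above.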

  • @rs-tarxvfz
    @rs-tarxvfz A year ago +2

    Dear friend, tensor products are a special kind of product called *Kronecker products*, not plain matrix multiplication.

    • @algeboy
      @algeboy  A year ago +3

      That's a common misconception, and we mathematicians are to blame for the confusion. The tensor product in this video is how Whitney defined the universal distributive product on two spaces (abelian groups/modules), turning spaces A and B into a new space A(x)B. Once you have that notion, you can ask to combine two linear transforms f:A->C and g:B->D into one map f(x)g : A(x)B -> C(x)D. If these are vector spaces and you choose bases, you can express that using Kronecker products. Think about it this way: an element of A(x)B becomes an a x b matrix M, and you have a c x a matrix F and a d x b matrix G. Then the map M -> FMG^t lands in the c x d matrices. This map is linear, and it is F(x)G (try it on the E_ij matrices and you recover Kronecker's formula). But all of that rests on having first built the Whitney tensor product, so to say "Kronecker product" = "tensor product" is an inadequate summary, though I completely agree this confusion is easy to make.
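A quick numerical check of the relation in this reply (a sketch, not the video's code): with NumPy's row-major flattening, the Kronecker product of F and G implements exactly the map M -> F M G^t.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c, d = 2, 3, 4, 5

M = rng.standard_normal((a, b))  # element of A (x) B, written as an a x b matrix
F = rng.standard_normal((c, a))  # linear map f : A -> C
G = rng.standard_normal((d, b))  # linear map g : B -> D

# f (x) g acting directly on M, landing in C (x) D as a c x d matrix:
direct = F @ M @ G.T

# The same map via the Kronecker matrix acting on the flattened (row-major) M:
via_kron = (np.kron(F, G) @ M.flatten()).reshape(c, d)

print(np.allclose(direct, via_kron))  # True
```

So the Kronecker matrix is a coordinate presentation of F(x)G, consistent with the reply's point that it presupposes the tensor product construction.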

  • @user-sv5vb1mj1q
    @user-sv5vb1mj1q 2 years ago

    Ha ha, I knew that :)

  • @TheAstroVideos
    @TheAstroVideos 2 years ago +1

    The educational value of the video is zero!

    • @positivobro8544
      @positivobro8544 A year ago +1

      Go watch an introductory video and then come back.