Tensors for Beginners 13: Tensor Product vs Kronecker Product

  • Published: 18 Nov 2024

Comments • 75

  • @zhiiology
    @zhiiology 5 years ago +30

    Hi eigenchris, just wanted to say that while the video is clear, the Kronecker product is usually computed by shoving the tensor on the right into the tensor on the left, which is how it's usually done, at least in quantum computing.

    • @KleineInii
      @KleineInii 5 years ago +4

      Yeah, I also found that confusing when I read a paper by Tamara Kolda (epubs.siam.org/doi/pdf/10.1137/07070111X, cited 5000 times). I think right-to-left is more common, but it's still explained brilliantly in this video :)

    • @sathasivamk1708
      @sathasivamk1708 3 years ago +2

      @@KleineInii There is a math book called Abstract Algebra by Dummit and Foote. Take a look; the Kronecker product is just the tensor product with respect to a different ordering of the basis.

  • @danieldanieldadada
    @danieldanieldadada 5 years ago +134

    i need a joint

  • @connorbeaton8375
    @connorbeaton8375 3 years ago +2

    This channel is a goldmine

  • @anthonym316
    @anthonym316 3 years ago +5

    My man this is exactly what I needed thank you

  • @underratedPie
    @underratedPie 1 year ago +1

    I told my colleague to study tensor calculus on this channel only

  • @zzzoldik8749
    @zzzoldik8749 5 years ago +4

    Thank you, eigenchris. I never understood tensors when I tried to learn them before; you explain it from the start, for beginners. This is really helpful.

  • @vassillenchizhov290
    @vassillenchizhov290 1 year ago

    Your notation with the array of arrays suggests that the Kronecker product produces arrays of dimension higher than two. However, the Kronecker product of matrices is always a matrix; for instance, $u \otimes v \otimes w$ will not produce an object with three indices, it will still be a matrix. Of course you can relate this to the object made from arrays of arrays, but technically they are not the same thing.
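
A small worked example of the point above (not from the video, just an illustration, using the Wikipedia entry ordering): for column vectors $u, v, w \in \mathbb{R}^2$, the Kronecker product keeps flattening everything into a single column, so even a triple product still has only two indices:

    $$
    u \otimes v
    = \begin{pmatrix} u^1 v \\ u^2 v \end{pmatrix}
    = \begin{pmatrix} u^1 v^1 \\ u^1 v^2 \\ u^2 v^1 \\ u^2 v^2 \end{pmatrix}
    \in \mathbb{R}^{4 \times 1},
    \qquad
    u \otimes v \otimes w \in \mathbb{R}^{8 \times 1},
    $$

which is still a (column) matrix, whereas the nested "array of arrays" picture gives a genuinely three-index $2 \times 2 \times 2$ object.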

  • @dennisbrown5313
    @dennisbrown5313 5 years ago +3

    Just amazingly clear and good

  • @alessandrogardini5469
    @alessandrogardini5469 4 years ago +6

    Hi! It seems to me that at 2:49 the new vector components should be represented by a Latin letter and upper indices (like [w^1 over w^2] rather than [omega_1 over omega_2]). Thank you again for the videos.

  • @mathgeek43
    @mathgeek43 3 years ago

    Something seems off at 1:36. It is my understanding that the tensor product takes a (p,q)-tensor and an (a,b)-tensor and produces a (p+a,q+b)-tensor. In the example shown, we have the vector e_i, which is a (1,0)-tensor, and eps^j, which is a (0,1)-tensor. This means that the tensor product of e_i and eps^j is a (1,1)-tensor. What's confusing about 1:36 is that the tensor resulting from the tensor product of e_i and eps^j should take a covector and a vector as input. That would make the output of the tensor a scalar and not a vector.
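
For reference, a short way to reconcile this (my own summary, not a statement from the video): a (1,1)-tensor like $e_i \otimes \epsilon^j$ can either eat a covector and a vector and return a scalar, or eat just the vector and return a vector:

    $$
    (e_i \otimes \epsilon^j)(\alpha, v) = \alpha(e_i)\,\epsilon^j(v) \in \mathbb{R},
    \qquad
    (e_i \otimes \epsilon^j)(v) = \epsilon^j(v)\, e_i \in V .
    $$

Feeding in only the vector "partially evaluates" the tensor and leaves a vector behind, which is the linear-map picture the video uses.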

  • @jasonbroadway8027
    @jasonbroadway8027 5 years ago +2

    I am hazy now. I followed you up until this video and the previous one.

    • @eigenchris
      @eigenchris  5 years ago +1

      Which part is confusing you?

    • @jasonbroadway8027
      @jasonbroadway8027 4 years ago +2

      @@eigenchris I get it. Thank you, Eigenchris.

  • @Timelaser001
    @Timelaser001 6 years ago +7

    Hi, I'd just like to clarify something. At 2:21, is a summation sign implied for v^j e_i? I understand that Einstein notation is used when the same letter appears on top and on the bottom, but in this case the letters on top and bottom are different.

    • @A.Arbabinezhad
      @A.Arbabinezhad 5 years ago +1

      The tensor product of e_i and epsilon^j is a stack of vectors all parallel to e_i, and the result of this linear map acting on a vector is a linear combination of these parallel vectors. So the result is a vector parallel to e_i (which is v^j e_i).

    • @abstractnonsense3253
      @abstractnonsense3253 2 years ago

      It's not a sum. v^j e_i is the *number* v^j multiplying the *vector* e_i. The *number* v^j is the jth component of the *vector* v.
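
Writing the step near 2:21 out in full (my own expansion of what the video does): the only implicit sum is over the dummy index $k$ that appears when $v$ is expanded in the basis; once the Kronecker delta removes it, $i$ and $j$ are both free indices, so $v^j e_i$ is a single term, not a sum:

    $$
    (e_i \otimes \epsilon^j)(v)
    = \epsilon^j(v^k e_k)\, e_i
    = v^k\, \epsilon^j(e_k)\, e_i
    = v^k \delta^j_{\;k}\, e_i
    = v^j e_i .
    $$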

  • @小江-j1i
    @小江-j1i 4 years ago +2

    If we could download all the slides, it would be perfect, since we could print them out and study without electronic devices, and also jot down notes for deeper understanding.

    • @eigenchris
      @eigenchris  4 years ago +8

      I have the slides uploaded here: github.com/eigenchris/MathNotes/tree/master/TensorsForBeginners

  • @jacobshin4279
    @jacobshin4279 1 year ago +1

    Would the array at 3:02 be a rank-3 tensor?
    Am I correct in guessing that the rank = m+n, where m is the number of covariant indices and n is the number of contravariant indices?

  • @auvski5903
    @auvski5903 4 years ago +1

    I'm actually pretty sure the Kronecker product as used in this video (and the previous one) is backwards. Both in my university's coursework and on the Wikipedia page, it seems like you're actually supposed to distribute the array on the right over the one on the left, that is, the left array is meant to be used as a "template" and the right array is copied for each block. You can take a look here:
    en.wikipedia.org/wiki/Kronecker_product

    • @eigenchris
      @eigenchris  4 years ago +1

      Yeah, I noticed that. I'm not sure if it's just two different conventions or if the wikipedia one is fundamentally correct.
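
To make the two conventions concrete (a summary, not a ruling on which is "correct"): the Wikipedia definition of $A \otimes B$ scales a copy of $B$ by each entry of $A$, while the video scales a copy of $A$ by each entry of $B$; for vectors the results contain the same numbers in a different order:

    $$
    A \otimes B =
    \begin{pmatrix}
    a_{11} B & \cdots & a_{1n} B \\
    \vdots   & \ddots & \vdots   \\
    a_{m1} B & \cdots & a_{mn} B
    \end{pmatrix}
    \ \text{(Wikipedia)},
    \qquad
    \begin{pmatrix} 1 \\ 2 \end{pmatrix} \otimes \begin{pmatrix} 3 \\ 4 \end{pmatrix}
    = \begin{pmatrix} 3 \\ 4 \\ 6 \\ 8 \end{pmatrix}
    \ \text{vs.}\
    \begin{pmatrix} 3 \\ 6 \\ 4 \\ 8 \end{pmatrix}.
    $$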

  • @geneyi8703
    @geneyi8703 4 years ago +1

    I didn't get the definition of the Kronecker product. Is it just "we distribute the array on the left into the array on the right"?

  • @chenlecong9938
    @chenlecong9938 1 year ago

    Hey, as far as I know, isn't the Kronecker product a form of the tensor product?

  • @IntegralMoon
    @IntegralMoon 6 years ago +7

    Awesome! Thanks for this :)

  • @dirrelito
    @dirrelito 4 years ago +3

    Typically, you also flatten when using the Kronecker product. The tensor product increases tensor order, but the Kronecker product does not. This is quite an important distinction both in theory and in practice. The Kronecker product as you explain it here is not how it works in e.g. NumPy or PyTorch.

    • @sathasivamk1708
      @sathasivamk1708 3 years ago

      The Kronecker product is just the tensor product with respect to a different ordering of the basis
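
A minimal NumPy sketch of the distinction being made here (my own example; np.kron and np.tensordot are standard NumPy functions): the Kronecker product flattens into a matrix, the tensor (outer) product raises the number of indices, and one is just a reshuffled reshape of the other:

    import numpy as np

    A = np.arange(4).reshape(2, 2)        # a 2x2 matrix
    B = np.arange(6).reshape(3, 2)        # a 3x2 matrix

    K = np.kron(A, B)                     # Kronecker product: block matrix, still 2D
    T = np.tensordot(A, B, axes=0)        # tensor (outer) product: order goes up

    print(K.shape)                        # (6, 4)
    print(T.shape)                        # (2, 2, 3, 2)

    # Same data, different ordering/grouping of the indices:
    print(np.array_equal(K, T.transpose(0, 2, 1, 3).reshape(6, 4)))   # True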

  • @trendytrenessh462
    @trendytrenessh462 2 years ago

    Same thing, different context
    Got it, cheers

  • @jamescook5617
    @jamescook5617 5 years ago +1

    Hold a second... now a row vector is an array. Good.

  • @PaulWintz
    @PaulWintz 5 years ago +1

    At 1:56, when the j-th basis covector of V* acts on the k-th basis vector of V to become the Kronecker delta, why does the tensor product operator disappear? Is it a valid operation to take the tensor product between a vector and a scalar? And does it merely equal the scalar times the vector?

    • @eigenchris
      @eigenchris  5 years ago +1

      It probably should have disappeared in the 2nd line, since epsilon(v) is a scalar, not a vector.

  • @ahmadfaraz9279
    @ahmadfaraz9279 3 years ago

    Is the dot product a kind of tensor product?

  • @thedorantor
    @thedorantor 2 years ago

    1:57 So I could just as well put a covector in as the argument; the covector "eats" the e_i of the (1,1) tensor, and the result would be a covector?

  • @no-one-in-particular
    @no-one-in-particular 1 year ago

    2:07 The "circle times" shouldn't be there from the second row onwards as epsilon acting on v is a number

  • @srtghfnbfg
    @srtghfnbfg 3 years ago

    Since the Kronecker product can act on nested arrays, is there an index-notation formula for it that I can find anywhere? I've scoured Google and can't find anything related to this specifically.

    • @eigenchris
      @eigenchris  3 years ago

      I think the Kronecker product, as seen in most math classes, doesn't use "nested arrays". If you have a 2x2 matrix Kronecker-product'ed with a 3x1 matrix, you will just get a 6x2 matrix, without any "nesting". The "nested arrays" thing is something that I came up with myself. I was desperately trying to make "array multiplication" work for arrays beyond 2D matrices. But my view now is to "give up" trying to multiply arrays bigger than 2D and just use the tensor index notation instead.
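
A quick shape check of that 2x2 and 3x1 example in NumPy (just a verification snippet, not from the video):

    import numpy as np

    A = np.ones((2, 2))                  # 2x2 matrix
    b = np.ones((3, 1))                  # 3x1 matrix (a column)

    print(np.kron(A, b).shape)           # (6, 2) -- a plain matrix, no nesting
    print(np.kron(b, A).shape)           # (6, 2) -- same shape with the factors swapped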

  • @jasonbroadway8027
    @jasonbroadway8027 5 years ago +1

    Linear map...how?

  • @AJ-et3vf
    @AJ-et3vf 1 year ago

    great video. thank you

  • @quantabot1165
    @quantabot1165 3 years ago

    If I want to do a Kronecker product between a one-column array and an nxn matrix, do I distribute the column to each row?

    • @eigenchris
      @eigenchris  3 years ago +1

      You distribute the column to each element of the matrix. If the column is m elements tall, you'll end up with an nxnxm array.

    • @quantabot1165
      @quantabot1165 3 years ago

      @@eigenchris I see, Thank You eigenchris!

    • @sathasivamk1708
      @sathasivamk1708 3 years ago

      Nothing fancy here.
      The Kronecker product is just the tensor product with respect to a different ordering of the basis

  • @minecrafthowtodude
    @minecrafthowtodude 3 years ago

    so.... I understood a little bit of that :)

  • @paradigmshift03
    @paradigmshift03 3 years ago

    Hey eigenchris, at 2:04, I'm a bit confused about what v^j e_i actually is. Should this not result in a vector? Therefore, shouldn't the top and bottom indices end up being the same?

    • @jonathanchippett4036
      @jonathanchippett4036 3 years ago

      v^j e_i is a vector. It is the ith basis vector of V scaled by the jth component of v.
      In regards to your last question, the top and bottom indices should not be the same because we are not summing over anything. We do not need to do an implicit sum to get a vector.

    • @paradigmshift03
      @paradigmshift03 3 years ago

      @@jonathanchippett4036 Ahh ok, I now understand :) Thank you!

  • @sdu28
    @sdu28 3 years ago

    Is a Kronecker product between two same-size square matrices valid?

    • @eigenchris
      @eigenchris  3 years ago +1

      Yes. You can do the Kronecker product of any two matrices.

    • @sdu28
      @sdu28 3 years ago

      @@eigenchris thanks
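
More generally (a standard fact, not stated in the video): the Kronecker product of an $m \times n$ matrix with a $p \times q$ matrix is an $mp \times nq$ matrix, so two $n \times n$ matrices give an $n^2 \times n^2$ matrix.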

  • @mithsaradasanayake3211
    @mithsaradasanayake3211 3 years ago +1

    Thanks eigenchris

  • @kimchi_taco
    @kimchi_taco 1 year ago

    It looks like:
    the tensor product is an operation on vectors and covectors themselves, while
    the Kronecker product is an operation on vector and covector components.

  • @ncolyer
    @ncolyer 1 month ago

    and this is meant to be the easy part

  • @shenyuan9810
    @shenyuan9810 4 years ago

    Aren’t they just the same thing but in different forms?

    • @sathasivamk1708
      @sathasivamk1708 3 years ago

      The Kronecker product is just the tensor product with respect to a different ordering of the basis; you are right

  • @RaoBlackWellizedArman
    @RaoBlackWellizedArman 4 years ago +1

    Thanks a million!

  • @rs-tarxvfz
    @rs-tarxvfz 2 years ago

    This is so confusing. When you did the first Kronecker product you distributed the RHS into the LHS, but when you did the second one you distributed the LHS into the RHS. This is like, WTF!!

  • @Mathymagical
    @Mathymagical 1 month ago

    You know, one thing I never liked about this at 1:52 was that we go from the linear map as a tensor product to then taking the tensor product with the Kronecker delta. This step is completely skipped over in most discussions. Taking the tensor product with scalar quantities is not defined.

    • @eigenchris
      @eigenchris  1 month ago

      I probably should have removed the tensor product after the first line if I wanted to be technically correct.

    • @Mathymagical
      @Mathymagical 1 month ago

      @@eigenchris Actually, I was doing some additional reading, and for wedge products and 1-forms it appears to be generally defined that a k-form wedged with an r-form is a (k+r)-form. When r is a zero-form (a scalar form?), the wedge product is just a k-form.
      I know I've seen the same logic elsewhere for the tensor product, so I suppose it would be defined. This is the sort of brain hurt that always bothers me about math. After a certain point it stops being 'intuitive' for me. Everything through differential equations, linear algebra, and abstract algebra is fairly 'easy' for me. But these new algebraic structures go from being intuitive and obvious to being 'oh yeah, it actually totally makes sense to define this operation between what are otherwise completely different objects.'
      As a comparison to something more familiar to me: we learn vectors and the standard stuff in classical mechanics, and then my professor just shows us one day that, oh yeah, you just add time to the displacement vector. Like adding a scalar to it is fine. Why? Well, why not? It doesn't come from a place of derivation but from a place of arbitrary creation.
      It feels more like art than science. I guess it's just matrix multiplication all over again, where it doesn't feel like it comes from a place of intuition but is made out of whole cloth, yet it can be used elsewhere and its uses become apparent over time.

    • @eigenchris
      @eigenchris  1 month ago +1

      @@Mathymagical I think the correct thing to do is remove the tensor product after the first line, like I said. As soon as a linear map eats a vector, it should become a vector as a result.
      You're right that taking the wedge product of a scalar and vector does sound a bit fishy. But that wedge product is usually defined in the "exterior algebra", which is a "mega" vector space that contains scalars, vectors, bivectors, trivectors, and so on. The exterior algebra for 3D space is an 8D vector space that contains the 8 basis vectors {1, x, y, z, xy, yz, zx, xyz}. In this 8D space, it's sensible to have a wedge product defined between 1 and x, for example, because these are both treated as "vectors" in the 8D space. I have a more recent video on Clifford Algebras (from my spinors series) that goes into some detail about this if you are interested.

  • @mythic3187
    @mythic3187 6 years ago

    Hey eigenchris, can we do some problems with some of these? I'm kinda getting a grasp of what tensors are... what are these good for besides computer programs and/or GR?

    • @eigenchris
      @eigenchris  6 years ago +2

      I'd like to get another 2-3 videos out of the way just to finish the series, but after that I can make a video or two on applications. In addition to GR, there are applications in quantum mechanics, electricity & magnetism, and continuum mechanics. Unfortunately I am less experienced with the physics compared to the pure math, so I don't have a ton to say on those topics right now.

    • @mythic3187
      @mythic3187 6 years ago

      Thank you very much

  • @debendragurung3033
    @debendragurung3033 3 years ago +2

    Kinda lost here.

  • @sathasivamk1708
    @sathasivamk1708 3 years ago

    I guess it's probably intended for physicists or engineers. As a mathematician, I think it's not a good way to learn it.
    The Kronecker product is just the tensor product with respect to a different ordering of the basis

  • @farriskft4847
    @farriskft4847 3 years ago +2

    Same as the previous video: when, why, and how can this information be used...? Start with some practical approach, please.

  • @MrSidTaylor
    @MrSidTaylor 4 years ago +1

    This is hopeless - it's supposed to be for beginners!!

    • @eigenchris
      @eigenchris  4 years ago

      Which parts are you struggling with? Have you watched the rest of the series up until this point?

  • @shone7064
    @shone7064 1 year ago

    I think your definition of the Kronecker product is reversed.
    The definition on Wikipedia says that A⊗B distributes B into A, but you distribute A into B.