Tensors for Beginners 12: Bilinear Forms are Covector-Covector pairs

  • Published: Nov 18, 2024

Comments • 118

  • @eigenchris
    @eigenchris  6 years ago +35

    Around 6:45-7:04 there are mistakes in the 3rd and 4th lines... Some of the B indexes are reversed.

    • @jasonbroadway1608
      @jasonbroadway1608 5 years ago +1

      It is true, but that is not what threw me off anyhow. I can usually handle small errors.

    • @soumyabratahazra7723
      @soumyabratahazra7723 4 years ago +2

      I think at 6:45 the 2nd line should be [[B11 B21] [B12 B22]]... please take a look

    • @soumyabratahazra7723
      @soumyabratahazra7723 4 years ago

      I am having difficulty with this tensor-product-of-two-covectors point, please explain

    • @sudiptoborun
      @sudiptoborun 4 years ago

      @@soumyabratahazra7723
      You're right bro. Seems like you've understood it but aren't aware of it.

    • @МуратХоконов-г4н
      @МуратХоконов-г4н 4 years ago

      Yes, there is a mistake at 6:56 etc

  • @Hermis14
    @Hermis14 5 years ago +15

    As an aircraft engineer, I was only used to the length-preserving transformation of three-dimensional vectors. But I recently started to think that tensors, which I had only a taste of long ago, may be a good tool for solving control problems of complex nonlinear systems. I decided to learn what they are, and these videos are amazing to me in many aspects. Everything explained is so easy to understand. Thank you!

  • @jamesbaxter8914
    @jamesbaxter8914 8 months ago +2

    This series is the best explanation of tensors and their notation; you make everything very clear.

  • @Charlesstrahan
    @Charlesstrahan 6 years ago +40

    I almost never comment on videos, but I just have to say: this series is awesome. Thanks a ton for putting your time and energy into producing these!

    • @eigenchris
      @eigenchris  6 years ago +11

      Thanks. They do take a lot of time, so I'm glad it was worth it.

  • @TheAnton4life
    @TheAnton4life 5 years ago +9

    I was so confused about how to formulate the metric tensor as being sorta double covectorish and now I finally understand! It was my reason for starting this series but now I'm gonna watch the whole thing

    • @eigenchris
      @eigenchris  5 years ago +2

      It's interesting that you say that because when I look back on these videos, I see the whole "row of rows" thing as being a bit ridiculous and convoluted. But I'm glad you got something out of it.

  • @haibojiang5320
    @haibojiang5320 2 years ago +1

    I tried to build a constitutive model for geotechnical material and extend it to general stress space, which requires a lot of tensor knowledge. I have been looking through different books and videos. These videos really inform me about what a tensor is and how it transforms. Your video is highly appreciated!!!

  • @Nickelicious7
    @Nickelicious7 3 years ago +3

    I’ve literally been searching for an explanation like this for an entire week and now I’ve finally found it! Thank you so much!

  • @erikengheim1106
    @erikengheim1106 4 years ago +5

    Amazing job you have done here. This must have taken a long time to create. You put a lot of attention into using colors in a helpful manner in showing the math. Also you have been very thorough in explaining all the concepts, so that those of us with patchy memory of our school math can still follow the video series. Very thorough and carefully planned.
    Good pacing, editing, clear voice and everything. Very well done videos.

  • @dhwaneelkapadia3265
    @dhwaneelkapadia3265 10 months ago

    I feel like I had my first Eureka moment while watching this video. Thank you for the amazing content.

  • @awakening5967
    @awakening5967 2 years ago

    I like the idea of "row of rows", which reveals an awkwardness I had never noticed before. Thanks.

  • @rlicinio1
    @rlicinio1 6 years ago

    I could not understand a single word for the last three or four videos but strangely I am still enjoying them. Thank you!

    • @eigenchris
      @eigenchris  6 years ago

      Is that because of the audio quality, or just because of the nature of the math? Is there anything I can do to make it easier to understand?

    • @rlicinio1
      @rlicinio1 6 years ago

      eigenchris No, the sound is very good and the way that you explain it is great too. Actually, the math is hard for me. I need to watch the whole series from the beginning again. Thank you for taking your time to answer my silly comment and thank you for the classes. Obrigado.

    • @eigenchris
      @eigenchris  6 years ago +1

      I'm a little worried my tensor product videos haven't been as good as the previous ones... I'm just going to keep making them for now. Feel free to comment about anything that's unclear or confusing.

    • @Suav58
      @Suav58 6 years ago +1

      @@rlicinio1 And how thick is your exercise book on this series? You do understand that one viewing might not be enough and recalculating all the stuff along might not be sufficient either. Invent your own examples and do fill some pages with them. It might help as well to have some software that works with tensors (there are free packages). Good luck!

  • @Drull76
    @Drull76 1 year ago

    Thanks a lot for the video. It's literally a godsend for me, as I'm struggling with tensors in my general relativity course

  • @danielribastandeitnik9550
    @danielribastandeitnik9550 5 years ago +1

    The last part on the row of rows was really nice!

  • @81546mot
    @81546mot 6 years ago

    Really enjoy your videos. Had the same question as below (at 6:45)... glad you said it was a typo. I thought I was very dense for about 30 minutes trying to figure out what I was missing. I often feel I am dense, but not that dense...

    • @eigenchris
      @eigenchris  6 years ago +1

      I apologize. That was a pretty bad typo. I'll pin a comment about the mistake so it's more visible. I wish YouTube hadn't removed the ability to make annotations.

    • @81546mot
      @81546mot 6 years ago

      Thanks for your response. I am a somewhat slow learner, and your explanations are so well presented and clear that it is helpful to me. At some point, will you cover how the metric tensor fits into Einstein's general relativity equations and how all this ties into curved-space geometry with changing derivatives and different bases? I am trying to understand some of this but it is very difficult for me...

    • @eigenchris
      @eigenchris  6 years ago +1

      My long-term plan is to talk about tensor calculus, curved spaces, and general relativity. But it will take a lot of time to get through all this... I have recently started posting videos on tensor calculus, and I mention the metric tensor and Einstein's equations briefly. If you like you can try asking me questions, but I am not very experienced with relativity yet.

    • @81546mot
      @81546mot 6 years ago

      Thank you very much. I will start studying your new videos.

  • @isaacAdam
    @isaacAdam 6 years ago +1

    Thank you sir, I love your work. I understood the meaning of tensors from you for the first time.

  • @adarshchaturvedi3498
    @adarshchaturvedi3498 5 years ago +2

    Around 6:50, we have a row of rows; that's a 1x2 matrix, right? When multiplied by a vector (a 2x1 matrix), it gives a 1x1 scalar, which is what we get when a covector acts on a vector. After that, the scalar is multiplied again by a vector (w1, w2); how is that giving a scalar?

  • @takumimatsuzawa1774
    @takumimatsuzawa1774 4 years ago +2

    These are truly amazing.

  • @zeotex2851
    @zeotex2851 9 months ago

    Bro... this feels like completing my basic Linear Algebra knowledge from undergrad math. How did we learn none of this :(

  • @likeplayinghello04
    @likeplayinghello04 2 years ago +1

    Around 6:40, I cannot understand when you start to multiply B and the column vector v. There is a + sign that disappears on the next line. Could you explain more? Thanks a lot.

  • @syedzaheerabbas4691
    @syedzaheerabbas4691 6 years ago

    Thanks, Professor. Keep it up; a very natural way of explaining.

  • @subrotodatta7835
    @subrotodatta7835 2 years ago +1

    I am struggling with this derivation between lines 2 and 3 where the BijVk products are being matrix multiplied and regrouped as a row "vector", with two entries of row vectors. I am aware that there is a typo in the B12V2 term which should be B12V1 and the B21V2 term should be B22V2.

  • @RealLifeKyurem
    @RealLifeKyurem 4 years ago +2

    If Bilinear forms are covector-covector pairs, is there an equivalent for vector-vector pairs?
    e.g. e₁ ⊗ e₂

    • @eigenchris
      @eigenchris  4 years ago

      I'm not sure there's a special name for them. There's a related concept called a "bivector", but that's made by combining two vectors using the wedge product, not the tensor product.

  • @jasonbroadway1608
    @jasonbroadway1608 5 years ago +3

    I have my Homer the poet moments, and I have my Homer Simpson moments. Homer Cross Product Homer. Maybe my mental battery will last longer this time.

  • @FantasmasFilms
    @FantasmasFilms 6 years ago

    Fantastic series of videos. Do you have anything on the affine connection? The symmetric, antisymmetric part... weyl vector and non-metricity? Thank you for your work!

  • @steveparsley3010
    @steveparsley3010 8 months ago

    I have been trying to see how the bilinear form, written as a row of rows, can work with the two forward transforms written as square matrices, to yield the "new" bilinear form. I can never get the same result as I get when I use the formula at 4:25. Using the formula, for B11~, the F terms would both have '1' in their lower index, but I always get some terms with a '2' there. Am I just being clumsy?

  • @jasonbroadway1608
    @jasonbroadway1608 5 years ago

    Your videos are excellent! I can be dense at times! Notwithstanding, the logical leap between 5:47 and 7:30 was huge! Your mistake did not particularly bother me; it is getting the correct stuff right that is hard.

  • @_maniplant
    @_maniplant 1 year ago

    Thanks a lot for the courses.
    Can we say that linear maps themselves form a vector space over V x V*, and bilinear forms over V* x V*?

  • @devalapar7878
    @devalapar7878 1 month ago

    There is an error at 6:48. The third line in the second equation must be...
    [B_11*v1 + B_21*v2, B_12*v1 + B_22*v2]
    which can be proven with the first equation:
    v*B = [v1*B_11 + v2*B_21, v1*B_12 + v2*B_22]
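The corrected row-times-matrix line above can be checked numerically. A minimal sketch with arbitrary made-up component values, assuming numpy conventions (B[i-1][j-1] stores B_ij):

```python
import numpy as np

# Arbitrary bilinear form components B_ij and a vector v (hypothetical values)
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # B[i-1][j-1] = B_ij
v = np.array([5.0, 6.0])

# Left-multiplying by v gives the row [v1*B11 + v2*B21, v1*B12 + v2*B22]
row = v @ B
expected = np.array([v[0]*B[0, 0] + v[1]*B[1, 0],
                     v[0]*B[0, 1] + v[1]*B[1, 1]])
assert np.allclose(row, expected)
```

With these values, `row` comes out to [23, 34], matching the term-by-term expansion in the comment.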

  • @RockRadioBlog
    @RockRadioBlog 4 years ago

    Around 3:04 you say something along the lines of "obviously linear maps are matrices", but I think you meant the converse (at least in absolute, general terms)? I mean, it's true that every matrix is, and can be represented as, a linear map, but the converse is not always true

  • @jasonbroadway1608
    @jasonbroadway1608 5 years ago

    You do not contradict yourself at 5:47 and at 7:43, but it can seem that way if you don't catch the transposing nature that takes place between B12 and B21 when vector v swaps sides. The mistakes aren't bad....it's the correct stuff that took me a while to understand.

    • @jasonbroadway8027
      @jasonbroadway8027 5 years ago

      Never mind... no contradiction... but you did leap a bit.

  • @alegian7934
    @alegian7934 4 years ago

    Can someone explain how L(x) becomes Lee(x)? That is, why does the function variable in parentheses move specifically to the epsilon?

  • @PeterH1900
    @PeterH1900 6 years ago

    Hi
    I'd like you to make a short video with a small example of its application.
    Whether physics or some other area is not important.

  • @Oh4Chrissake
    @Oh4Chrissake 2 years ago +1

    Am I right in thinking that, in general, when you add two row vectors together, you simply add the corresponding terms to create a single row vector? In other words, is it entirely analogous to adding two column vectors together?

    • @eigenchris
      @eigenchris  2 years ago +1

      Yes. All array addition works element-by-element.
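As a quick sanity check of the element-by-element rule, a tiny numpy sketch (values arbitrary), showing that adding two rows is entirely analogous to adding two columns:

```python
import numpy as np

# Two row covectors, stored as 1x2 arrays
a = np.array([[1.0, 2.0]])
b = np.array([[10.0, 20.0]])

# Addition sums corresponding entries, exactly as for column vectors
s = a + b
assert np.allclose(s, [[11.0, 22.0]])
```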

    • @Oh4Chrissake
      @Oh4Chrissake 2 years ago

      @@eigenchris Much obliged, Chris.

  • @chick3n71
    @chick3n71 1 year ago

    In quantum mechanics we tend to distribute the latter matrix into the former for the Kronecker product; how come here it's reversed?

    • @eigenchris
      @eigenchris  1 year ago +1

      I didn't think too hard about which matrix distributes into which when I made this 5-6 years ago. Just follow whichever convention your textbook uses.

  • @rubixtheslime
    @rubixtheslime 3 years ago +1

    Eigenchris: a matrix is basically a row of columns
    Me, a computer scientist: you mean I've been doing it backwards my whole life?
    Jokes aside, would a column of rows and a row of columns be the same thing? ie, vⁱ ⊗ αⱼ = αⱼ ⊗ vⁱ? I'm guessing the tensor product doesn't commute vector with vector or covector with covector, but I feel like it would make sense that a covector and vector would commute in the tensor product.

    • @eigenchris
      @eigenchris  3 years ago +1

      The tensor product doesn't technically commute, although the spaces V⊗W and W⊗V are "very nearly the same"... the only difference is that the indices for the tensors are reversed. In a certain sense, one has the "transposed matrices" of the other.
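That transpose relationship is easy to check on component arrays. A small numpy sketch (component values made up), representing the tensor product of two 2-component objects as an outer product of their components:

```python
import numpy as np

# Components of two covectors (values arbitrary)
alpha = np.array([1.0, 2.0])
beta = np.array([3.0, 4.0])

# (alpha ⊗ beta)_ij = alpha_i * beta_j, as an outer product of components
ab = np.outer(alpha, beta)
ba = np.outer(beta, alpha)

# Swapping the factors transposes the component array...
assert np.allclose(ab, ba.T)
# ...so the two products are generally NOT equal: no commutativity
assert not np.allclose(ab, ba)
```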

    • @rubixtheslime
      @rubixtheslime 3 years ago

      @@eigenchris wow that was a very fast response, thank you! I think that helps a bit. I think one thing that keeps holding me back here is I'm coming from a computer science background so I subconsciously assume a vector is an array.

    • @eigenchris
      @eigenchris  3 years ago +1

      @@rubixtheslime Yeah, normally CS people see "vectors" as just a data structure like a list. For mathematicians, vectors are more closely related to geometry, because they need to keep track of how to change coordinates. If you like, you can continue thinking of a vector as a column, but for it to also be left-multiplied by a row of basis vectors. This way, each basis vector gets matched with one of the numbers in the column, giving the "sum"-style vectors you see in my videos. So you can think of the vector-as-column as still being sort of true, but it's only half of the truth; the other half of the story is the basis vectors.

  • @EW-mb1ih
    @EW-mb1ih 2 years ago

    Everybody seems to enjoy this video but I don't get so much from it:
    What are bilinear forms useful for?
    Why are bilinear forms linear combinations of covector-covector pairs (and not vector-vector, for example)?

    • @eigenchris
      @eigenchris  2 years ago +1

      This is largely a pure math video, explaining the mathematical concept of a bilinear map in pure math terms. I'm talking about any mathematical function that eats two vectors and outputs a scalar. The metric tensor, or dot product (which lets you measure lengths and angles in space), is one example of a function that eats two vectors and outputs a scalar. In Hamiltonian mechanics, there is a bilinear map called the "Symplectic Form" which helps you write out relationships between position vectors and momentum vectors.
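The dot-product example from the reply above can be sketched numerically; this assumes the identity metric on R^2 and arbitrary vector components:

```python
import numpy as np

# Metric components g_ij = delta_ij: the ordinary Euclidean dot product
g = np.eye(2)
v = np.array([3.0, 4.0])
w = np.array([1.0, 2.0])

# B(v, w) = sum_ij g_ij v^i w^j: eats two vectors, outputs a scalar
B_vw = np.einsum('ij,i,j->', g, v, w)
assert np.isclose(B_vw, v @ w)   # agrees with the usual dot product

# Bilinearity: scaling one input scales the output by the same factor
assert np.isclose(np.einsum('ij,i,j->', g, 2 * v, w), 2 * B_vw)
```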

  • @zhangjin7179
    @zhangjin7179 6 years ago

    At the 6:27 mark, in the last line... the operation is mixing the tensor product with matrix multiplication, right?

    • @eigenchris
      @eigenchris  6 years ago

      Yeah, there should probably be a "kronecker product" operator between the column vectors.

    • @zhangjin7179
      @zhangjin7179 6 years ago

      A tensor product between columns yields a 4-entry column; then the product combining the row and the column into a real number... that is an inner product (and a tensor product as well)... am I right?
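That reading can be verified on components. A minimal numpy sketch (values arbitrary), using np.kron for the Kronecker/tensor product of component arrays:

```python
import numpy as np

alpha = np.array([1.0, 2.0])   # covector components (a row)
beta = np.array([3.0, 4.0])
v = np.array([5.0, 6.0])       # vector components (a column)
w = np.array([7.0, 8.0])

# Kronecker product of the two columns: a 4-entry column
vw = np.kron(v, w)
assert vw.shape == (4,)

# Pairing the 4-entry row kron(alpha, beta) with it gives a real number,
# equal to alpha(v) * beta(w): the defining property of the tensor product
scalar = np.kron(alpha, beta) @ vw
assert np.isclose(scalar, (alpha @ v) * (beta @ w))
```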

  • @flavioaugusto4159
    @flavioaugusto4159 2 years ago

    I derived all these formulas considering "e" and "epsilon" as covariant and contravariant basis vectors. I'm not completely aware of the benefits of the covector notation.

    • @eigenchris
      @eigenchris  2 years ago

      Are you asking what covectors are for? I'm not clear on the question.

    • @flavioaugusto4159
      @flavioaugusto4159 2 years ago

      @@eigenchris No. It is just a different notation that I saw in a book, which did not consider covectors as a "new" mathematical entity but as another kind of basis vector, called covariant. So they are also considered simply as vectors.
      www.seas.upenn.edu/~amyers/DualBasis.pdf

  • @timreisinger7078
    @timreisinger7078 6 years ago

    I have a question. When a basis covector acts on a basis vector (say (1 0) and (1 0)^T), you get the Kronecker delta, like you defined somewhere in the beginning of the series. But when these two get multiplied by the tensor product, you get a matrix with the rows (1 0) and (0 0), and that is not the Kronecker delta, right? I am a little confused about the distinction between these two things, especially when the notation seems to be the same.

    • @eigenchris
      @eigenchris  6 years ago

      I'm starting to regret writing it the way I did. Anytime vectors and covectors are written next to each other without function notation (using the brackets), you should imagine a circle-times symbol between them. I'm hoping the next video (#13) might clear some of this up. Let me know if you're still confused.
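The difference between the two pairings in this thread can be made concrete in components; a small numpy sketch, assuming standard-basis components:

```python
import numpy as np

eps1 = np.array([1.0, 0.0])   # basis covector epsilon^1 as a row of components
e1 = np.array([1.0, 0.0])     # basis vector e_1 as a column of components

# Function application epsilon^1(e_1): contract the row with the column,
# giving the Kronecker delta value delta^1_1 = 1 (a scalar)
assert np.isclose(eps1 @ e1, 1.0)

# Tensor product e_1 (x) epsilon^1: an outer product, giving a 2x2 array
M = np.outer(e1, eps1)
assert np.allclose(M, [[1.0, 0.0], [0.0, 0.0]])
```

So the same two ingredients produce a scalar under function application but a matrix under the tensor product; the notation is what distinguishes them.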

    • @timreisinger7078
      @timreisinger7078 6 years ago

      OK thanks!

  • @nawsherjahanparvez8085
    @nawsherjahanparvez8085 4 years ago

    The right one is then
    [[B11 B21] [B12 B22]]
    Ok. But there is still a problem.
    When the bilinear form is multiplied with vectors v & w, I found that the result is
    B11v1w1 + B12v2w1 + B21v1w2 + B22v2w2
    The problem is with the two middle terms. It seems the indexes don't match up.
    Shouldn't they be B12v1w2 + B21v2w1 ??

    • @warrenchu6319
      @warrenchu6319 3 years ago

      Yes, your answer agrees with @Aliyu Bagudu.
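The index pattern in this thread can be confirmed numerically. A small numpy sketch (component values arbitrary), expanding v^T B w term by term:

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # B[i-1][j-1] = B_ij
v = np.array([5.0, 6.0])
w = np.array([7.0, 8.0])

# In v^T B w, the index on v matches the FIRST index of B and the index on w
# matches the SECOND, so the cross terms are B12*v1*w2 + B21*v2*w1
lhs = v @ B @ w
rhs = (B[0, 0]*v[0]*w[0] + B[0, 1]*v[0]*w[1]
       + B[1, 0]*v[1]*w[0] + B[1, 1]*v[1]*w[1])
assert np.isclose(lhs, rhs)
```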

  • @Musiclover5258
    @Musiclover5258 3 years ago

    Great tutorial, thank you so much for your kindness🙏. I am new to tensors, but have decent general mathematical intuition (or so I think), and this is a perfect fit for me...
    I did have a slightly difficult time digesting this video. I couldn't quite internalize the logic at 04:50. Again, at 6:23, wouldn't a "row of rows" be the same as a row? I finally visualized bilinear form in the following manner, not sure if it makes sense.
    B(v, w) = (B_ij ε^i ⊗ ε^j) · (v ⊗ w)
    The first term of the dot product is a weighted sum of tensor products of 2d row vectors (co-vectors) giving the 4d co-vector [B11 B12 B21 B22]. And the second term is the tensor product of column vectors v and w giving a 4d column vector. This inner product in 4d would give the same result as that of the earlier quadratic operation using 2d matrix. i.e., the quadratic operation in n-dimensional space seems to be equivalent to a corresponding linear operation in n^2 dimensional space.
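The "quadratic operation in n dimensions equals a linear operation in n^2 dimensions" observation above checks out on components; a minimal numpy sketch with arbitrary values:

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([5.0, 6.0])
w = np.array([7.0, 8.0])

# The 2x2 bilinear form acting on (v, w)...
quadratic = v @ B @ w

# ...equals a single linear pairing in 4 dimensions: the 4-entry row
# [B11 B12 B21 B22] dotted with the 4-entry column kron(v, w)
linear_in_4d = B.flatten() @ np.kron(v, w)
assert np.isclose(quadratic, linear_in_4d)
```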

    • @Musiclover5258
      @Musiclover5258 3 years ago

      Watching the next video, I am not sure if I was mixing up tensor products and kronecker products above. In fact I am still not really sure what the difference is..

  • @drlangattx3dotnet
    @drlangattx3dotnet 6 years ago

    Am I correct that wherever I see two basis vector symbols together, I should envision the circle cross tensor multiplication symbol?

    • @eigenchris
      @eigenchris  6 years ago

      For these videos, yes. This is a notation I made up, however, and you probably won't see it anywhere else.

    • @drlangattx3dotnet
      @drlangattx3dotnet 6 years ago

      I guess I find that a little confusing. I will go back and envision the circle product symbols so I get used to it as well as knowing your symbol.

    • @eigenchris
      @eigenchris  6 years ago

      Sorry you find it confusing. I'll have to reconsider how I write in my future videos.

    • @drlangattx3dotnet
      @drlangattx3dotnet 6 years ago

      If two basis symbols are next to each other in any other situation... a situation that is NOT a tensor product... that would be confusing. I am still getting used to this. Sometimes the "e's" are paired with brackets and sometimes not, etc. I need to work on this. I DO like how you keep reviewing with summaries after each portion.

    • @drlangattx3dotnet
      @drlangattx3dotnet 6 years ago

      As I look over my notebook where I transcribe your lecture notes, I see ee or e(e) or e(e)e. (I cannot insert epsilon symbols here.) Is there a difference between these notations? Is this where the circle-multiplication symbol should be? I am trying here but finding that I am unsure as to whether I am confused. That sounds strange, I realize, but I wish I could see a clarification of this, with a translation between your notation and conventional notation. Thanks for all your work.

  • @64Watasi
    @64Watasi 5 years ago +1

    Previous mic was much better

  • @doctormeister4566
    @doctormeister4566 8 months ago

    At 6:22, why are these 3 objects not associative?

    • @matejmizak7585
      @matejmizak7585 2 months ago

      Because of matrix multiplication. We multiply by rows and columns (in order), not in any other way.

  • @MrPhilipe711
    @MrPhilipe711 6 years ago

    Good Saga!
    Any upcoming intuitions on physics applications? Especially on electromagnetism and the whole vector product?
    Missing more good examples in the vids.
    ty!

    • @eigenchris
      @eigenchris  6 years ago +1

      I honestly wasn't planning on covering much physics in the near future... partly because I don't feel qualified/educated enough to cover it properly. Is there anything in particular you wanted to know about? I can maybe direct you to other places online to learn about it.

    • @dennisbrown5313
      @dennisbrown5313 6 years ago +1

      General Relativity to explain its main formula - maybe also show using geo examples on how the metric tensor can be applied to a few simple examples.

  • @baguduak
    @baguduak 6 years ago

    @7:04 the second line after "awkward" is wrong, and then the subsequent lines are correct. Is that true?

    • @eigenchris
      @eigenchris  6 years ago

      I believe I wrote that line the way I wanted... The first element of the row is [B11 B12] and that gets multiplied by the first element of the column V1. The second element of the row is [B21 B22] and that gets multiplied by the second element of the column V2.
      Do you disagree? This is sort of something I made up myself. I apologize if it's confusing.

    • @baguduak
      @baguduak 6 years ago

      How would you then produce the subsequent line from this?

    • @eigenchris
      @eigenchris  6 years ago

      Distribute v1 to get [B11v1 B12v1] and distribute v2 to get [B21v2 B22v2]. After that, add the vectors together.
      I see now that I probably should have included an extra line to make the steps clear. I apologize for that.

    • @baguduak
      @baguduak 6 years ago

      sorry actually @7:04

    • @baguduak
      @baguduak 6 years ago +1

      Look at it carefully: you made a mistake in the indices. The 3rd line after "awkward" should have [(B11V1 + B21V2) (B12V1 + B22V2)]

  • @kamaellewis1545
    @kamaellewis1545 5 years ago

    You sound younger than in the last video.

  • @likeplayinghello04
    @likeplayinghello04 2 years ago

    Mi n

  • @one7-1001s
    @one7-1001s 6 years ago

    Thank you but we want more examples

    • @eigenchris
      @eigenchris  6 years ago

      I plan on including more examples in my future videos. I also plan on uploading another video on the tensor product in the next month or two to help explain it better.

  • @farriskft4847
    @farriskft4847 3 years ago +1

    This is the least comprehensible video of the series so far for me. Even the previous ones at least started with a problem, not that practical, but at least a clear one... but here it's just the formal blah-blah. OK, I managed to get most of it, but how should I exploit my newly acquired knowledge? Please, if possible, start with a practical approach, after that let's have a numerical example, then we can go formal.

  • @anatheistsopinion9974
    @anatheistsopinion9974 5 years ago

    Great series but dude, you're making so many mistakes!

    • @eigenchris
      @eigenchris  5 years ago +1

      Yeah, I'm sorry for that. I try to correct them in the video description.

  • @hengzhou4566
    @hengzhou4566 2 years ago

    From your video, it seems that a tensor is only a notational simplification, a tool that facilitates memorizing, or just a personal notational preference. But a tensor is more than that! For example, tensor contraction (a very important tensor concept, but you did not mention a word about it) can be used to prove that the trace of a linear transformation does not depend on the choice of basis. Tensors can also be used in (nonlinear higher-order) label spreading in machine learning. But how can your video help in these use cases? It can do nothing but waste the time of the audience. Don't harm the audience. I suggest you delete your videos, which is perhaps the best thing these videos can do for learners of tensors.
