Tensors for Beginners 1: Forward and Backward Transformations (REMAKE)

  • Published: 4 Oct 2024
  • Tensors for Beginners playlist: • Tensors for Beginners
    Leave me a tip: ko-fi.com/eige...
    I made a mistake in the original version of this video that has been confusing people for years. Super late, but trying to make amends.

Comments • 115

  • @thedorantor · a year ago · +175

    The fact that after so many years you still bother to answer questions from comments under old videos and even remake a 4 year old video... You're an amazing teacher, nothing but respect for you!

  • @Jonas-Seiler · a year ago · +35

    I have never seen the visualisation of vectors you show at the very end of the video before. This is basically an epiphany for me. Taking the basis to be rows while taking the components to be columns suddenly makes everything I have heard about vectors and covectors make sense. This video has already been truly invaluable to me.

    • @eigenchris · a year ago · +20

      I didn't actually realize that's what you're supposed to do until half-way through making this video series. I think a viewer pointed it out to me. Part of the reason I did a "REMAKE" of this video was to state that fact. It's a bit shocking it's not more common.
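
      The row/column bookkeeping described above can be sketched concretely. Below is a minimal plain-Python illustration; the F and B entries follow the video's example basis, while the function and variable names are mine, not the video's.

      ```python
      # A plain-Python sketch of the bookkeeping described in this reply:
      # basis vectors sit in a row, components sit in a column, and the
      # two transform oppositely ("covariantly" vs "contravariantly").

      def mat_mul(A, B):
          """Multiply two matrices stored as lists of rows."""
          return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
                   for j in range(len(B[0]))] for i in range(len(A))]

      E = [[1, 0], [0, 1]]   # columns are the old basis vectors e1, e2
      F = [[2, -0.5],        # forward matrix: column j holds the old-basis
           [1, 0.25]]        # components of the new basis vector e~j
      B = [[0.25, 0.5],      # backward matrix, B = F^(-1)
           [-1.0, 2.0]]

      E_new = mat_mul(E, F)       # basis row transforms with F on the right
      v_old = [[1], [1]]          # component column in the old basis
      v_new = mat_mul(B, v_old)   # components transform with B = F^(-1)

      # The invariant vector: (basis row) * (component column) is unchanged.
      assert mat_mul(E, v_old) == mat_mul(E_new, v_new)
      print(v_new)  # [[0.75], [1.0]]
      ```

      The asymmetry is the point: the basis row is multiplied by F, while the component column is multiplied by F inverse, so their product (the vector itself) never changes.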

  • @linuxp00 · a year ago · +19

    I'd like to thank you for your care, even on such early lessons like that. Personally, your lectures helped me in a presentation about Schwarzschild's solution to Einstein's Field Equations

  • @atzuras · a year ago · +5

    This remake is very welcome. One of the best introductory courses I found on this platform.

  • @nstrisower · a year ago · +4

    WHAT THE FUCK HOW DID I NEVER MANAGE TO LEARN ABOUT THAT WAY OF MULTIPLYING MATRICES
    that alone makes this video a literal godsend tbh thank u so much

  • @dalisabe62 · a year ago · +3

    Excellent presentation. It assumes that the student knows very little or nothing. This shows teaching experience and good common sense on the part of the instructor.

  • @Mikey-mike · a year ago · +7

    Thanks, EigenChris.
    Actually, your original video on this subject with your mistake was a good pedagogic device for pointing out the matrix entries of the individual equations.
    You are one of a kind, an excellent teacher.

  • @etiennecameron7783 · a year ago · +5

    Thank you Chris. You answered the exact question I had yesterday in your tensor calculus series. I didn't understand why the Jacobian matrix was not transposed. Basis vectors are covariant. Covariant vectors are covectors. Covectors multiply from the left. Thank you.

  • @phijiks · a year ago · +7

    I was reading quite a few books on tensors for beginners, and then I came across your Tensors for Beginners series (many years ago, when you had just started making those initial videos)... Initially it felt like the best source for learning tensors, but after the error in this particular video, and then the correction video you made earlier, it all felt so confusing that I tried many times, on multiple occasions, to clear things up and make coherent sense of all the books I have and your (this particular) video, but I couldn't... Ultimately, each time I had to drop this topic and move on, only to get stuck later because I didn't know the basics well...
    I really hope that this time, giving it a new try with this video, I really can make sense of all the books and your videos and finally understand the proper basics...
    Meanwhile, all these years, I watched you move ahead with your series all the way up to general relativity, and it made me wonder whether you could revisit your older videos and make a correction for people like me, so I could reach the advanced videos with you as well... In hope, I always kept your notifications ON... And wow, you finally did that today... I am so, so grateful to you for that... I'm a high-school physics teacher, but tensors have always been something I really, really want to understand and learn properly... and you are the best source I have found to date. Again, thank you for this correction video.

    • @eigenchris · a year ago · +6

      Yeah, I'm really sorry about the error. I was pretty inexperienced when I made this and had no idea so many people would watch it. I thought the correction video would help, but I realized it just confused some people more. I'm late to the party fixing this, but hopefully it will help future people who watch.
      Also, my relativity series covers tensors from the beginning in a much more understandable way (I think). I'd suggest watching the Relativity 102 videos if you are interested.

    • @phijiks · a year ago · +2

      @@eigenchris Woah, this is the second time I've gotten your reply 🥺🥺... And yes, the correction video created more confusion 😅... I understand that you were still learning at the time, but even then your videos made much more sense than mere mathematical definitions of tensors. And yes, better late than never; a huge shoutout to your effort in remaking this video... I will watch your other series as well (on relativity), but first I'm going to resume this series and see if I can get through it this time... A heartfelt thank you for this remake, Chris 🙂

  • @markneumann381 · 20 days ago

    Wonderful work. Greatly appreciated. Very helpful. Thank you so much.

  • @MrMilesfinn · a year ago · +1

    Thanks from an experimental physicist--great work.

  • @michaelzumpano7318 · a year ago · +5

    I watched your series on general relativity. It was awesome! You make these subjects so navigable. I hope you never stop.

  • @jonpritzker3314 · a year ago · +2

    Heaping on the deserved laudations. Your effort is noble and nearly brings me to tears. Thank you.
    I've studied row and column multiplication, and I started Arfken's Math for Physicists, where one of the opening problems is rotating a 2D basis, and I've heard people say "never mind that torque is a pseudo-bivector," but I did not know that linear algebra and that Arfken stuff were baby tensor math. Your presentation has got me excited, because it might not be as scary as I thought :D
    Second smiley for emphasis :D

  • @robertlouis9083 · a year ago

    Wow, I just found this video for the first time and haven't watched it yet, but I'm sure glad I missed whatever mistake was in the original one, because by the sound of these comments it was a doozy.

  • @johnsimspon8893 · a year ago

    Kudos, man. That mistake caused me much difficulty for years.

  • @msontrent9936 · 7 months ago

    Excellent explanation. Thank you for taking the time ...

  • @manaayek8091 · 3 months ago

    You're defining linear algebra terms better than when I actually took linear algebra. It's all clicking now.

    • @dydx_mathematics2 · 3 months ago

      Is this video good for me as a mathematician? I mean, the symbols in physics are not like the symbols in math.

  • @Michallote · a year ago · +1

    Thank you for the remake!

  • @mMaximus56789 · a year ago

    Hopefully you update/continue this series!

  • @quantumspark343 · a year ago

    Excellent quality as always

  • @jonnymahony9402 · a year ago

    To understand all of this you really need to have this notation and these linear algebra manipulations under your belt; the same is true for quantum field theory. I'm often lost in notation 😂

  • @tomgraupner171 · a year ago

    Thanks a lot. Still hope to see you doing stuff on Dirac spinors, QM and QFT!

    • @tomgraupner171 · a year ago

      That's so cool - I wished - You made it .... Wonderful life

  • @omnipotentpotato2436 · 10 months ago

    Great video chris!

  • @sdsa007 · a year ago

    Awesome! I get that they are inverses of each other... it seems like it would be easy to derive one directly from the other, just by reversing the left-right vertical and flipping the sign on the right-left vertical, but I'm not sure how to prove that.

    • @eigenchris · a year ago · +2

      If you google "2x2 matrix inverse formula", you'll find a formula for converting any 2x2 matrix into its inverse.
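
      For reference, that formula says: for M = [[a, b], [c, d]] with ad − bc ≠ 0, the inverse is (1/(ad − bc)) · [[d, −b], [−c, a]]. A quick sketch applying it to the forward matrix from the video's example (the helper name is mine):

      ```python
      # The standard 2x2 inverse formula, applied to the video's forward matrix.

      def inverse_2x2(M):
          """Invert a 2x2 matrix given as a list of rows."""
          (a, b), (c, d) = M
          det = a * d - b * c
          if det == 0:
              raise ValueError("matrix is singular")
          return [[d / det, -b / det],
                  [-c / det, a / det]]

      F = [[2, -0.5],        # forward matrix from the video's example
           [1, 0.25]]        # det(F) = 2*0.25 - (-0.5)*1 = 1
      B = inverse_2x2(F)     # backward matrix
      print(B)  # [[0.25, 0.5], [-1.0, 2.0]]
      ```

      Note that the sign flips and swaps in the formula match the pattern the question describes, up to the overall 1/det factor.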

  • @aditya_a · a year ago · +3

    Hey, thanks for this series! Just a question - In the very beginning, you say that the matrix representing the "forward transformation" eats vectors in the old, blue basis and outputs vectors in the new, red basis. But this doesn't quite sit right with me. If I feed the vector v with components (1, 0) into this matrix via F*v, the result is (2, 1). So clearly, what seems to be happening is F eats vectors in the red, what you call "new" basis, such as (1, 0), and outputs vectors in the blue, what you call "old" basis...

    • @jongraham7362 · a year ago

      I think you are correct, if you are multiplying from the right, Aditya. To go from the new basis to the old you write the "new basis elements" in terms of the old basis. So the first column is [2,1] the second column is [-1/2, 1/4]...again these are the coordinates for the new basis elements {e˜1, e˜2} in the old basis {e1, e2}. These coordinates make up the columns of the matrix. This will take you from the new basis to the old basis. Plugging in something on the right with coordinates in the new basis...this will give you that point in the old basis. For instance plugging in (1,0) from the new basis will give you (2,1) in the old basis. Plugging in (0.1) in the new basis will give you (-1/2, 1/4) in the old basis. He is not multiplying on the right, he is multiplying on the left. It makes much more sense to me multiplying on the right as a transformation, it is not clear to me why he is multiplying on the left. I am hoping though that this will not cause issues further along though, because he is someone who fleshes out issues with tensors that I struggle with, that others seem to skim over.

    • @jongraham7362 · a year ago

      In fact, if you go on to the next video, you will discover that Chris understands that F goes from the new basis to the old basis...plug in something in with new basis coordinates and you get that point in the old basis coordinates, and B goes from the old to the new. This makes sense for B because it is the old basis elements (1,0), (0,1) written in the new basis coordinates. He describes it as being backwards... but that is because he has it backwards in this video. No worries though... I think his videos are well done.
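
      The point made in this thread can be checked numerically. This is a small sketch under the assumption (stated above) that F's columns are (2, 1) and (-1/2, 1/4); the helper name is mine.

      ```python
      # Multiplying F on the right by a component column converts
      # NEW-basis components into OLD-basis components.

      def mat_vec(M, v):
          """Multiply a matrix (list of rows) by a column vector (flat list)."""
          return [sum(m * x for m, x in zip(row, v)) for row in M]

      F = [[2, -0.5],   # column 1: e~1 = (2, 1) in the old basis
           [1, 0.25]]   # column 2: e~2 = (-1/2, 1/4) in the old basis

      # (1, 0) are the NEW-basis components of e~1; F returns its
      # OLD-basis components (2, 1), and similarly for e~2.
      assert mat_vec(F, [1, 0]) == [2, 1]
      assert mat_vec(F, [0, 1]) == [-0.5, 0.25]
      ```

      So as a map on component columns, F really does go new → old, which is exactly why it pairs with multiplying the basis row by F on the other side.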

  • @peronianguy · 6 months ago

    Great video! Two questions/comments from a real beginner: in the linear algebra video series by 3b1b, the vector is on the right and the transformation on the left, echoing f(x). In matrix multiplication, that means we first apply the transformation on the right and then the transformation on the left. Why is the order different here?
    The second comment is that although building the new basis vectors out of the unit vectors looks visually pretty intuitive, the opposite is not true. In fact, it is not clear at all whether you're applying the same "visual" procedure, or why you get, for example, a -1.

    • @eigenchris · 6 months ago · +1

      The way 3b1b does it involves the vector components (I show this quickly near the end of the video). In this video I'm dealing with basis vectors, which must be written as rows if you want the multiplication rules to work out properly.
      This style of writing basis vectors in a row is pretty unique to me. I haven't seen it anywhere else on youtube or textbooks.
      As for figuring out which vectors belong in the linear combination, I agree it can involve some trial and error.

    • @peronianguy · 6 months ago

      @@eigenchris I see! Thank you very much for your answer :)

  • @manfredbogner9799 · 9 months ago

    very good

  • @martin2ostra · 5 months ago

    Thanks

  • @thevegg3275 · a month ago

    Comparing four differing Cartesian coordinate systems:
    1) x grid is 1 inch long, y grid is 1 inch long, x basis vector is 1 inch long, y basis vector is 1 inch long
    2) x grid is 2 inches long, y grid is 1 inch long, x basis vector is 2 inches long, y basis vector is 1 inch long
    3) x grid is 1 inch long, y grid is 1 inch long, x basis vector is .5 inch long, y basis vector is .5 inch long
    4) x grid is 2 inches long, y grid is 1 inch long, x basis vector is 1 inch long, y basis vector is 1 inch long
    Would the metric tensors all have the same values?

    • @eigenchris · a month ago · +1

      You might be interested in watching video 9 in this series, which covers the metric tensor. The diagonal components of the metric give the squared basis vector lengths. So if e_x is 2 units long, then g_xx = 4. The off-diagonal elements of the metric give the angles between each pair of basis vectors.
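
      As a sketch of the rule in this reply (diagonal metric components are squared basis-vector lengths), here is a toy computation with a made-up orthogonal basis where e_x is 2 units long and e_y is 1 unit long:

      ```python
      # Metric components as dot products of basis vectors: g_ij = e_i . e_j.
      # The basis values below are illustrative, not from the video.

      def dot(u, v):
          return sum(u_k * v_k for u_k, v_k in zip(u, v))

      e_x = [2.0, 0.0]   # basis vector of length 2
      e_y = [0.0, 1.0]   # basis vector of length 1

      g = [[dot(e_x, e_x), dot(e_x, e_y)],
           [dot(e_y, e_x), dot(e_y, e_y)]]

      print(g)  # [[4.0, 0.0], [0.0, 1.0]] -- g_xx = 4 because |e_x| = 2
      ```

      The off-diagonal zeros reflect the basis being orthogonal; a skewed basis would give nonzero g_xy = g_yx.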

    • @thevegg3275 · a month ago

      So then, a grid of rectangles of constant dimensions 3 in the y and 4 in the x, whose basis vectors are exactly half the dimensions of the grids, will have a metric tensor of 8 for gxx and 9 for gyy?
      So it sounds like the metric tensor values only depend on the lengths of the basis vectors and not on the size of the grid upon which these basis vectors live?

  • @redrosin99 · a month ago

    It's a pedagogical error to equate tensors with vectors and matrices. Those are only specific examples of particular cases.

  • @matthewsarsam8920 · 4 months ago

    Aren’t these matrices just change-of-basis matrices if I multiply from the right, but in the opposite direction?

    • @eigenchris · 4 months ago

      Yes. The idea is that if you change the row of basis vectors with a matrix F, then you change the column of vector components using the inverse matrix B = F^-1.

  • @pratik9056 · 5 months ago

    But e_i = \sum_i e_i? Then the summation here only represents the values of i taken into consideration?

  • @rameshsimkhada9742 · 2 months ago

    As e1 and e2 are perpendicular, are the new basis vectors e1 tilde and e2 tilde also perpendicular? Also, thank you for the video.

    • @eigenchris · 2 months ago

      There is no guarantee the new basis will be perpendicular.

  • @fredrickcampbell8198 · 4 months ago

    3:10
    Using the forward matrix:
    [1, 0][2, -1/2; 1, 1/4] = [2 1]
    Since the forward matrix is supposed to transform the basis vectors from the one without a ~ to the one with a ~, isn't this actually the backwards matrix since it transforms the basis vectors from the one with a ~ to the one without a ~?

  • @kilianwilhelm3184 · 8 months ago

    Why do we notate basis vectors as row vectors? Is it because basis vectors transform like covectors? Would you then notate basis covectors as column vectors?

  • @Benjatastic · a year ago · +1

    Is there a deep reason this video series multiplies vectors from the left like xA instead of from the right? Or is it just the convention you felt was pedagogically superior?

    • @Benjatastic · a year ago · +1

      To elaborate on what I mean by a "deep reason," are the vectors rows because they represent linear functionals or come from a dual space?

  • @viaprenestina3894 · 11 months ago

    shouldn't the backward transformation be the inverse of the forward one?

  • @vincenzocotrone4370 · 6 months ago

    Good morning,
    Could you please suggest a book which explains the subject in a similar way that you do?
    With exercises as well.
    Thank you very much, in advance.

  • @thevegg3275 · a year ago

    Is there something fundamental about the blue, old basis? It seems like they are unit basis vectors.
    For the sake of argument, what if the old basis was the red one and you wanted to transform to the new blue basis... then you'd still need to use the forward transformation.
    It almost seems like you can always just use the forward transformation, as long as you switch which one is old and which one is new.

    • @eigenchris · a year ago · +2

      You're right: the one you label "old" and the one you label "new" is arbitrary. I just needed to call them something.

    • @thevegg3275 · a year ago

      @@eigenchris Thank you, I’ve been waiting a long time to ask this question.
      I cannot seem to make a connection between a graphical representation of contravariant and covariant components, and how they relate to tensors as indices. If I have a vector with contravariant components of five and three, and this can be represented with covariant components of, say, 13 and seven, how do these numbers actually populate anything in the tensor? I’m just trying to understand the connection between the graphical representation of contravariant and covariant components and tensors. Thanks for any help you can give.

  • @ganondorfchampin · a month ago

    Aren’t these just vector operations? What additional power do tensors give?

    • @eigenchris · a month ago

      Tensors give you a way to combine vectors and covectors into more complicated objects, like linear maps, the metric tensor, and the curvature tensor from general relativity. These are represented with multi-dimensional arrays instead of 1D arrays.

  • @eqwerewrqwerqre · a year ago · +1

    If you made a textbook I would buy 10.

  • @Vickipirate12 · 6 months ago

    How did you change the summation signs at 8:47?
    Is there a property that allows this?
    IDK

  • @thevegg3275 · a year ago

    At minute 6:50 you say how to determine the coefficients for a summation.
    I do not understand your choice of linear-combination coefficients to express e1 tilde and e2 tilde.
    You have a myriad of terms to select the components from, but you pick F12 and Fn1. I do not understand why those two. Thanks

    • @eigenchris · a year ago

      I was just using those two as an example to explain what the subscripts mean. The 1st F subscript is attached to an "e" basis vector on the right of the = sign, and the 2nd F subscript is attached to an "e~" basis vector on the right of the = sign.

  • @sethapex9670 · a year ago

    Is the fact that tensors are coordinate system invariant the reason they are used in relativity, since then it would not matter what frame of reference we are operating in?

    • @eigenchris · a year ago · +1

      Yes. All important equations in relativity should be written with tensors, so that they are the same for all reference frames.

  • @anangelsdiaries · 7 months ago

    Is there a specific reason why we write the vectors as row vectors and multiply from the left instead of column vector multiplied on the right?

    • @eigenchris · 7 months ago · +2

      It's arbitrary. But almost everyone writes vector components as columns, so I stick to that convention.

  • @abhisheksuretia · a year ago

    Sir, can you recommend a reference book for tensors?

  • @danieleba.9924 · a year ago

    If I understand correctly, the backward matrix is equal to the inverse of the forward matrix? (I do not speak English so well, and if I made a mistake I'm sorry 😅)

  • @सुदय-त8थ · a month ago

    A 10th grader from India here; I got completely lost from 8:40. Is there a nice matrix-table demonstration instead of these random sigmas?

    • @eigenchris · a month ago

      I have another explanation with matrices. You can take a look at videos 102a and 102b from this playlist: ruclips.net/video/_Il-aQ8RY6Y/видео.html
      The sigmas are for "summation notation". Normally by the time someone wants to learn tensors, they have seen summation notation before. Can I ask why you want to learn tensors?

  • @yoavboaz1078 · 6 months ago

    What was the mistake?

  • @StefanoBusnelliGuru · a year ago

    Does writing vectors as row vectors mean that those are not vectors but 1-forms?

    • @eigenchris · a year ago · +3

      1-forms have components written as rows and basis-covectors written as columns.
      The fact that basis vectors are written as a row just means that the basis vectors are covariant.
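
      A tiny numeric sketch of the convention in this reply, with made-up component values: a 1-form's row of components acts on a vector's column of components to give a single number.

      ```python
      # Covector (1-form) components form a ROW; vector components form a
      # COLUMN; their product is a scalar. Values are illustrative only.

      def row_times_col(row, col):
          """Contract a row of covector components with a column of
          vector components."""
          return sum(a * b for a, b in zip(row, col))

      alpha = [3, -1]   # covector components (a row)
      v = [2, 5]        # vector components (a column)

      print(row_times_col(alpha, v))  # 3*2 + (-1)*5 = 1
      ```

      This keeps the two conventions distinct: components-as-a-row marks a covector, while basis-vectors-as-a-row just marks covariant transformation behavior, as the reply says.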

  • @thevegg3275 · a year ago

    As far as vector spaces go, by the definition it seems like all vectors in the entire universe are in the same vector space, since all vectors share the same (V, S, +, ·) collection. Is that correct? Based on that, it seems a vector space never has a vector outside of the one and only vector space.

    • @eigenchris · a year ago · +2

      The main thing that differentiates one vector space from another is the dimension (how many independent directions you can define).

    • @thevegg3275 · a year ago

      @@eigenchris Chris, I will pay you if you can help me go from the graphical representation of contravariant and covariant components to how they get attached to a tensor. It would probably have to be done through a phone call. Let me know if you’re willing to do this and I will somehow give you my contact information. This is super important to me.

  • @dhruvsharma8430 · a year ago

    Bro, @8.48 how did you change the position of the sigmas without checking the limits for the variables?

    • @eigenchris · a year ago

      I don't understand the question. Can you explain what you mean?

    • @roadtogod6556 · a year ago

      Doesn't matter, those are finite sums.

    • @dksmiffs · 8 months ago

      @dhruvsharma8430, the swapped sigmas @8.48 also weren't intuitive to me, so I took the time to expand the sums written both ways. I concluded that associativity of addition allows this swap, @eigenchris please correct me if I'm wrong.
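
      For what it's worth, the swap is justified because a finite double sum is just a finite list of terms added up, so reordering them (commutativity plus associativity of addition) cannot change the total:

      ```latex
      \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij}
        = (a_{11} + \dots + a_{1n}) + \dots + (a_{m1} + \dots + a_{mn})
        = (a_{11} + \dots + a_{m1}) + \dots + (a_{1n} + \dots + a_{mn})
        = \sum_{j=1}^{n}\sum_{i=1}^{m} a_{ij}
      ```

      Summing row-by-row and summing column-by-column both add up the same m·n terms; only for infinite sums does the swap need extra justification.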

  • @edwardlee7898 · 11 months ago

    Fkj? Do you mean Fki ?

  • @tirtahadith · 2 months ago

    Wow, it clicks now.

  • @sampson4844 · a year ago · +1

    Is [e1 e2] a covector? Or is it informal to write it this way? Thx

    • @eigenchris · a year ago · +2

      It's a pair of basis vectors written as a row. They transform covariantly, but they are not a covector. It's a pretty informal way to write it--I haven't seen anyone else write them like this.

    • @sampson4844 · a year ago

      @@eigenchris thx,😃

  • @JL-jc5fj · 8 months ago

    Sir, how do we know that we need 1/4 of e1 tilde to construct e1?

    • @eigenchris · 8 months ago

      It's just from looking at the picture and doing the measurement.

  • @maryamhasan2618 · a year ago

    Please clear up one confusion for me: here, does "inverse" mean that when we go from the old basis to the new basis we have to take the inverse of the new basis? Am I correct?

    • @eigenchris · a year ago

      Yes. If the F matrix takes us from the old basis to the new basis, then F-inverse takes us from the new basis to the old basis.

  • @wjrasmussen666 · a year ago

    is there going to be a playlist for this?

    • @eigenchris · a year ago

      There already is a playlist from 5 years ago. Just search "Tensors for Beginners" in the search bar and it should pop up.

  • @sourabhborkar8167 · 2 months ago

    10:10 kronecker delta

  • @ShadowZZZ · a year ago

    well that's unexpected to see

    • @eigenchris · a year ago · +5

      I still get people in the comments confused about it, so I just had to bite the bullet and upload a new one, even if it's an old video.

    • @abstractnonsense3253 · a year ago

      @@eigenchris Thank you for doing that

  • @Oylesinebiri58 · a year ago

    👍

  • @sachinmohan6935 · 4 months ago

    It may sound silly... but was this video made with an AI voice generator or a Tacotron model? Or maybe it's just me 😂😂 Lol, I have been skeptical of every video since @bycloudAI ( www.youtube.com/@bycloudAI ) revealed all his videos use an AI voice. Also, thanks for the great content!! It's really helping with a Mechanics exam that I have tomorrow.

    • @eigenchris · 4 months ago

      Nope. I'm just naturally monotone and reading from a script, which makes me even more monotone.

  • @thesigmaenigma9102 · a year ago

    Tensors are just objects that behave like tensors

  • @b43xoit · a year ago

    Leopold Kronecker, German.

  • @DavidSartor0 · a year ago

    The audio sounds worse. People usually get better audio, not worse, so I'm probably doing something wrong.

    • @eigenchris · a year ago

      I figured the audio sounded much better. What's wrong with it?

    • @DavidSartor0 · a year ago

      @@eigenchris Your voice sounds deeper. Probably it just deepened in real life.

  • @mohsinshah6857 · a year ago

    Sir, why are you not regular in uploading the videos?

    • @linuxp00 · a year ago

      These topics are not easy, man. Chris was learning while teaching. In fact, he has already finished this course; now this is just a correction.

    • @jonpritzker3314 · a year ago

      I like that format. Sir, why u not explain me everything pls?

    • @francisherman8982 · a year ago

      You're assuming he's got a queue of videos finished and ready to post. More likely he posts as he finishes them, and makes them as time allows. I don't know if he's monetized these videos at all, but even if he has, I doubt it's enough to quit his day job. A bit less entitlement, a bit more appreciation!

  • @wrog268 · a year ago

    Fji

  • @edwardlee7898 · 11 months ago

    Sorry Fkj is right

  • @seriktabussov5892 · 9 months ago

    It's more like bragging, not teaching.

  • @angeldude101 · a year ago

    What do you mean by "summarizes ... nice and simply" when that just looks like an ordinary matrix multiplication? It doesn't seem to be saying anything that isn't simpler than ẽᵢ = eⱼF. (The fact that the shorter version is completely representable in a RUclips comment just hurts the more verbose version more.)
    Similarly, δ is a lot harder to type than I, is introduced much later, but seems to mean the exact same thing. Iv = v is one of the first things that you learn about matrices, alongside MM¯¹ = M¯¹M = I. I'm just failing to see the reasoning for using δ instead of just the identity matrix, or using explicit summation over the matrix (and probably later tensor) product. (Actually, I'm pretty sure the tensor product isn't a direct extension of the matrix product like how the matrix product is an extension of the dot product, but that just begs the question of why there isn't such an extension to begin with.)

  • @ilikegeorgiabutiveonlybeen6705

    thanks