Tensor Transformation Laws: Contravariant, Covariant, and Mixed Tensors

  • Published: 25 Nov 2024

Comments • 40

  • @borokaolteanpeter9150
    @borokaolteanpeter9150 4 years ago +10

    Honestly, these videos are so valuable that most teachers' face-to-face lessons can't compare to them. Thank you!

  • @CrucemDomini
    @CrucemDomini 4 years ago +5

    I cannot stress enough how helpful your videos are. Thank you sooo much!

  • @jackiewhitt4551
    @jackiewhitt4551 3 years ago +1

    Best (and clearest) explanation of Tensor Transformation Laws of Mixed Tensors yet, and I have seen MANY MANY MANY

  • @blzKrg
    @blzKrg 4 years ago +11

    I cannot thank you enough for making my semester so easy!❤

  • @Astro-X
    @Astro-X 4 years ago +12

    Frantically revising this playlist on the day before my Quantum Field Theory exam lol

  • @青雲浮遊
    @青雲浮遊 5 years ago +5

    So happy to see such a useful video has just been posted haha

    • @青雲浮遊
      @青雲浮遊 5 years ago +3

      Monsieur Café mathematical physics :)

  • @henrikrynauw244
    @henrikrynauw244 4 years ago +5

    Thanks for the awesome video! First time looking at the mathematics behind tensors, and now I can't wait to do it in my course, hopefully.

    • @FacultyofKhan
      @FacultyofKhan  4 years ago +1

      Best of luck!

    • @blackhole3298
      @blackhole3298 3 years ago

      @@FacultyofKhan Very nice video, but shouldn't there also be a space between the indices in mixed tensors, so that the subscript and superscript are not directly above each other?

  • @hippophile
    @hippophile 5 years ago +3

    Nothing in this was difficult. That means we can easily understand it and can move on, so good work!
    On the proofs, I also looked back to the geometric examples in the earlier lecture Contravariant and Covariant Vectors 1/2, to provide a geometric outline proof to help my intuition. I hope that is sensible!

  • @TalathRhunen
    @TalathRhunen 5 years ago +2

    7:50 small correction, T should also use the i and j indices instead of k and l (as is done in the proof right after)

  • @KhaosTy
    @KhaosTy 4 years ago +4

    Excellent video as always. I'd love to be a patron but money is tight. If you add the ability to make subtitles, I'd love to help put them on.

  • @dandan1364
    @dandan1364 3 months ago +1

    My head hurts. All I wanted was to understand how to multiply 3-rank tensors. But first you have to understand Einstein notation then understand covariant etc… and all the rules, THEN we can understand how to multiply? That said, this is the BEST series to actually understand tensors! Try reading the Wikipedia article … if you don’t slip into a coma immediately I’ll buy you a coffee.

  • @jackjiang714
    @jackjiang714 4 years ago +4

    Hi Faculty of Khan, thanks for the video!
    One question: I'm a bit confused when you call them contravariant and covariant "tensors". As you mentioned, a tensor is invariant.
    Wouldn't it be better to call them contravariant and covariant "components of the tensor", since they are just the magnitude components in front of the basis vectors? A tensor should be described by both the components and the basis vectors.
    Or is there some misunderstanding on my side?

    • @FacultyofKhan
      @FacultyofKhan  4 years ago +3

      Yes, you're absolutely right! It's more accurate to call them contravariant and covariant components, as I alluded to when I initially introduced contravariant and covariant vectors/tensors: ruclips.net/video/vvE5w3iOtGs/видео.html
      Tensors themselves are invariant, but their components individually can transform in either a contravariant manner or covariant manner.
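
      To make this concrete, here is a small numerical sketch (an editor's addition, not from the video, assuming an arbitrary invertible linear change of basis A): the contravariant components transform with the inverse of A, so the assembled vector is unchanged.

```python
import numpy as np

# Sketch (assumed example, not from the video): under a change of basis
# e'_j = A^i_j e_i, contravariant components transform with the inverse
# of A, so the assembled vector v = v^i e_i is unchanged.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])              # change-of-basis matrix (assumed invertible)
e = np.eye(2)                           # old basis vectors as columns
e_new = e @ A                           # new basis: e'_j = A^i_j e_i

v_comp = np.array([1.0, 4.0])           # contravariant components in the old basis
v_comp_new = np.linalg.inv(A) @ v_comp  # components transform "contra" to the basis

print(np.allclose(e @ v_comp, e_new @ v_comp_new))  # True: the vector is invariant
```

      The same check works for any invertible A; a singular matrix would not define a valid change of basis.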

  • @tes8771
    @tes8771 4 years ago

    Very clarifying, thanks a lot!

  • @sayanjitb
    @sayanjitb 2 years ago

    Is this proof (8:00) applicable to this statement as well? "Consider a mixed tensor of contravariant rank m and covariant rank n; show that this transforms linearly under a general coordinate transformation."

  • @vincentproud6589
    @vincentproud6589 2 years ago

    How can a covariant tensor be unchanged under a coordinate transform if its elements transform the same way as the basis?

  • @alapandas6398
    @alapandas6398 4 years ago

    For a vector, invariance can be written as v¹e_1 + v²e_2 + v³e_3 = V. How can we represent tensors of higher rank and show their invariance?
    What are the bases for a higher-rank tensor? Is it like the following?
    ds² = g_ij (dx^i)(dx^j), which is invariant. If g_ij are the components of the metric tensor, can we think of (dx^i)(dx^j) as the bases of this coordinate system?
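
    On the ds² example in this comment, here is a small numerical check (an editor's sketch, using the standard Cartesian-to-polar change of coordinates): the metric components transform covariantly through the Jacobian, but the scalar g_ij dx^i dx^j does not change.

```python
import numpy as np

# Sketch (assumed example): ds^2 = g_ij dx^i dx^j is a scalar invariant.
# In Cartesian coordinates g is the identity; the polar-coordinate metric
# follows from the covariant rank-2 transformation law g' = J^T g J.
r, theta = 2.0, 0.7
# Jacobian of (x, y) = (r cos(theta), r sin(theta)) with respect to (r, theta)
J = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])

g_cart = np.eye(2)
g_polar = J.T @ g_cart @ J                  # covariant transformation of g_ij
print(np.allclose(g_polar, np.diag([1.0, r**2])))  # True: ds^2 = dr^2 + r^2 dtheta^2

du = np.array([0.01, 0.02])                 # displacement components (dr, dtheta)
dx = J @ du                                 # contravariant components use J itself
print(np.allclose(du @ g_polar @ du, dx @ g_cart @ dx))  # True: same ds^2
```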

  • @umedina98
    @umedina98 3 years ago

    Just one question: how can it be proven that when a field obeys the transformation rules, it remains invariant? It's intuitive for a vector, but how can it be proved mathematically?

  • @janoortje
    @janoortje 5 years ago

    How long is this series going to be?

  • @TheTechnicalEducator
    @TheTechnicalEducator 3 years ago

    Very nice 👍

  • @underfilho
    @underfilho 4 years ago

    Ok, but can a vector field be considered a tensor?

  • @Zxv975
    @Zxv975 5 years ago

    Wow, I was literally thinking about something you covered in this video today. I knew that not all matrices were tensors, but I didn't know what you'd call something that wasn't n×n and wasn't a tensor. The answer is an array, which is painfully obvious considering I have a comp sci degree.
    Anyway, the question I wanted to ask is: how do you ensure that something transforms as a tensor under *all* basis transformations? I understand that you defined that components must transform according to the Jacobians, but doesn't that just correspond to that particular choice of transformation? How can I be sure that given a vector v = v^j e_j, that v will always be invariant under any basis change? Do I have to define v^j in a particular way, like as components of a directional derivative or something?

    • @danmccarron0
      @danmccarron0 5 years ago +2

      Array. Yes. You are forgiven this time ;)
      I think I can answer your question qualitatively and provide you a direction for more rigorous investigation.
      1. The problem is that you're raising a chicken-and-egg dilemma. If you start with a 'matrix field,' as they call it here, in this case an n×n matrix of scalar fields which are functions of however many variables, say m (usually 3), that's simply a matrix of functions whose values change at each point in the (x_1, x_2, ..., x_m) space. THEN you define the transformation to a new set of m variables according to the Jacobian, which you absolutely have to specify, and you have now taken your n×n matrix and made it a tensor of rank 2, in that it transforms according to the condition set forth (two derivatives).
      2. There is no need to select special elements to ensure invariance under general transformations. Once you call it a tensor, so long as the appropriate transformations to new coordinates are specified and the transformation is able to be applied correctly, it IS one - because that is how it is defined in the first place. Each element of the tensor (in this case of rank 2) is made up of a scalar "response" term and a pair of basis vectors 'glued' together (one for vectors, and a trio for a tensor of rank 3 for good measure).
      3. If you quibble about the fact that you only demonstrate for specific cases, it can be done in general curvilinear systems. Taking those derivatives element-by-element transforms everything (both scalar multiple and basis vectors smashed together) correctly so that what was invariant in the old coordinate system will also be invariant in the new one. The general formalism to get these values, however, involves a thing called the metric tensor (usually "g") in flat space and then more generally Riemann Curvature tensor in curved space (yup - generalized curvilinear coordinates IN a curved space as well) which encapsulates and extends how we do the operations to take these invariants (e.g. the scalar product) but they can be shown invariant in general without assuming anything about the coordinates or curvature.
      4. As a simple example, going down one rank, for well-behaved functions, a vector field is only a gradient of some 'potential function' if its curl is zero. If so, we can work backward to find this function, but not every vector field is a gradient obviously. Going the other way, you can write down ANY scalar function and take the gradient given its coordinate system. There are no rules telling you WHAT functions are potentials. You simply make one up and take the gradient and you are good. All you have to do is take the gradient as defined in the coordinate system given the curvature of space. The curl, for example, of this, despite being perhaps ungodly to look at, is still zero. The divergence at any point is, as expected, the same and that can be shown in general. See the appendix at the end of D. Griffiths electrodynamics book for the simplest, most illustrative example of this.
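
      Point 4 above can be checked numerically (an editor's sketch, with a made-up scalar f(x, y) = x²y): the gradient components transform covariantly, i.e. with the Jacobian ∂x^i/∂u^j contracted on the upper index.

```python
import numpy as np

# Sketch (assumed example, checking the gradient remark above): the gradient
# of a scalar f transforms covariantly, df/du^j = (dx^i/du^j) df/dx^i,
# verified for f(x, y) = x^2 * y with (x, y) = (r cos t, r sin t).
r, t = 1.5, 0.4
x, y = r * np.cos(t), r * np.sin(t)

grad_cart = np.array([2 * x * y, x**2])      # (df/dx, df/dy), analytic
J = np.array([[np.cos(t), -r * np.sin(t)],   # Jacobian dx^i/du^j for u = (r, t)
              [np.sin(t),  r * np.cos(t)]])
grad_polar_law = J.T @ grad_cart             # covariant transformation law

# Cross-check against direct central-difference derivatives in (r, t)
f = lambda r_, t_: (r_ * np.cos(t_))**2 * (r_ * np.sin(t_))
h = 1e-6
grad_polar_num = np.array([(f(r + h, t) - f(r - h, t)) / (2 * h),
                           (f(r, t + h) - f(r, t - h)) / (2 * h)])
print(np.allclose(grad_polar_law, grad_polar_num, atol=1e-6))  # True
```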

  • @Nickesponja
    @Nickesponja 5 years ago

    If a mathematical object obeys the tensor transformation rules, can we assert that it is a tensor, or do we have to prove that it is invariant under a change of coordinates?
    Also, can you sum two tensors of the same total rank even if they have different covariant and contravariant ranks? For example, if we added a covariant and a contravariant vector, would the result be a tensor?

    • @Zxv975
      @Zxv975 5 years ago

      For your first question, do you mean to ask that if we see that a tensor is invariant under one basis change, can we assume it's invariant under all basis changes? Or do you mean that we see that the tensor *components* change according to the rules listed, therefore can we assume the associated tensor (i.e. components + basis) will be invariant?
      For the second question, it doesn't really make sense to add tensors with different covariant and contravariant ranks. In the simplest case, consider v = v^j e_j to be a (1, 0) tensor (a column vector) and w = w_j e^j to be a (0, 1) tensor (a row or dual vector). Since v and w belong to different vector spaces entirely, their sum is not defined. A hint as to why this doesn't work is that the sum v^j e_j + u^j e_j can be simplified to (v^j + u^j)e_j where we factor out the basis elements. The fact that we can factor the basis elements means that the result is an element of the same vector space. You can't do this with v and w. The same argument holds for tensors of any rank. You can also look at the multilinear map interpretation of tensors and see that it breaks down if you allow for sums of tensors of different cov/contra ranks.

    • @Nickesponja
      @Nickesponja 5 years ago

      @@Zxv975 Oh, thank you very much!
      I meant the second thing: if we see that the tensor components change according to the tensor transformation rules, can we assume that the associated tensor is invariant under a change of coordinates?

    • @Zxv975
      @Zxv975 5 years ago

      @@Nickesponja I want to say no, but I am not 100% confident in this answer. If you know the basis you are working with is a differential form / dual of a differential form, then yes, but otherwise my instinct tells me no.
      If you look on the Wikipedia article for tensor and scroll down to "Definition > As Multidimensional Arrays" (I don't think I can post links in YT comments), then it has the most general form for the rules a tensor should follow. Here, the transformation is generalised as a change of basis matrix R--in contrast with this video, where the change of basis matrix is specifically given as a Jacobian. If your tensor components follow the generalised rule on Wikipedia, then it definitely will be associated with a tensor.
      Basically, if you can prove or disprove the statement: "the matrix representation of a change of basis is given by an associated Jacobian", then you have the answer to your question. I am a little hazy on the subject, so I cannot definitively say whether or not this is true right now.
      However, given that every treatment on the topic of tensors always uses differential forms and their duals as the bases, in all practical applications the answer to your question is yes. That's why the rules in this lecture were given with respect to the Jacobian instead of the more general change of basis matrix on Wikipedia.

    • @danmccarron0
      @danmccarron0 5 years ago

      @@Zxv975 It's a moot point.
      A tensor is defined explicitly by how it transforms under a coordinate change. That's it. There is not a special subset of matrices in a matrix field which are tensors (of rank 2, let's say) because they fit some mathematical "tensor form." All of them are tensors ONCE YOU DEMAND that they transform according to the rule, and the recipe explicitly provided by the Jacobian matrix (assuming it is not singular) guarantees that the invariants the tensor represents are preserved under any transformation. It is vitally important that the transformation matrix (the Jacobian) be correct, of course. But in general you DEFINE a tensor in this way, so you are stuck having to believe in its general invariance in this way. And yes, this can be proven in general.
      It's like asking whether all well-behaved scalar functions are potential functions... yes, of course. Just take the gradient in the coordinates in which the function is expressed, and you can show that its curl is zero even under a coordinate transformation. Obviously not all vector fields come from gradients, but you can take the gradient of any well-behaved function and there you have one; that's no surprise, right? You used the definition of the gradient to make the gradient, so the function is a potential by default. By analogy, not every matrix is a tensor resulting from products of other tensors, but I can take one and MAKE it transform as per the definition.

  • @francescopiazza4882
    @francescopiazza4882 4 years ago

    Great!

  • @ctroy38
    @ctroy38 4 years ago +1

    Dude nobody dies if you give some examples here and there that contain numbers. That would make it much easier to understand...

  • @er4255
    @er4255 5 years ago +1

    Man, the classes are starting to go too fast. Also, the material in this video isn't yet an alternative to a textbook, since it covers pretty much the same content. I missed having specific examples. Just leaving a tip.

  • @omgpullmyfinger
    @omgpullmyfinger 5 years ago

    Thanks for breakfast!

  • @vikramnagarjuna3549
    @vikramnagarjuna3549 5 years ago

    Animation-based teaching, please, instead of board teaching..... Humble request...