Basis vectors and the metric tensor

  • Published: 25 Nov 2024

Comments • 54

  • @scottydscottd · 6 years ago +20

    Hell yesss. Clearest and most logical exposition on YouTube. Reasonable definitions, etc. This is a gold mine. Thank you!

  • @abcdef-ys1sb · 6 years ago +7

    I was looking for this kind of explanation for a long time

  • @logansimon6653 · 4 years ago +3

    Honestly, a very competent run through. Thanks!

    • @TensorCalculusRobertDavie · 4 years ago +1

      Hello Logan and thank you for your comment. Much appreciated!

    • @logansimon6653 · 4 years ago +1

      @@TensorCalculusRobertDavie Hi, you are very welcome. I browsed through some of your other titles just now, and I am excited to see a rich source of mathematics of my most favorite type. Would you mind if I cite you as a source in the text I am writing on general relativity (with a rigour in tensor calculus and differential/Riemannian geometry) -- especially for any instances that I am inspired to add to my work because of your content? If you would like, this is a hyperlink to my document. drive.google.com/open?id=1-MU7daeZ0Q8TefNOwzImcGD2uZIhktvZl3FO13R5UkQ
      Thank you for the content!

    • @TensorCalculusRobertDavie · 4 years ago +1

      You are welcome to cite my material, and good luck with your efforts.

  • @marinajacobo3550 · 5 years ago +6

    Thank you Robert! I really enjoyed this video.

  • @g3452sgp · 7 years ago +1

    The images at 1:59 and at 3:20 are good.
    They are well organized and help us get the whole picture of the underlying concept.
    Excellent!
    Thanks a lot.

  • @marinajacobo3550 · 5 years ago +6

    Thank you! I really enjoyed this explanation :)

  • @theboombody · 3 years ago +1

    I like the ad placements on these videos. "Are you struggling with calculus?" If you're watching a video on curvature and differential geometry, then no, you're not struggling with calculus. You're struggling with something far beyond.

    • @TensorCalculusRobertDavie · 3 years ago

      Yes, a bit ironic. I hope there aren't too many ads?

    • @theboombody · 3 years ago +1

      @@TensorCalculusRobertDavie No, it's not too bad. That's the price of posting stuff on YouTube. They can put ads in your stuff and there's nothing you can do about it except not post videos. But I think it's a small price to pay for the freedom of being able to post mathematical content. I'm pretty grateful for YouTube, both as a viewer and as a poster.

  • @dansaunders6957 · 4 years ago +2

    What happens to the position vector when working with a manifold? How does one typically define a basis without a position vector?

  • @benedekjotu266 · 5 years ago +2

    Excellent presentation.
    In general, what is the punch line for working with both covariant and contravariant coordinates? They both represent the same objects, and the metric tensor is usually at hand anyway. At first it seems an unnecessary complication on the way to general relativity. How come they didn't just go with one or the other, and leave the other as a fun-fact side note?
    Thanks

    • @TensorCalculusRobertDavie · 5 years ago +1

      Hello Benedek and thank you for your question. Wikipedia discusses this issue in the quote below and further in the link below that.
      "The vector is called covariant or contravariant depending on how the transformation of the vector's components is related to the transformation of coordinates.
      Contravariant vectors are "regular vectors" with units of distance (such as a displacement) or distance times some other unit (such as velocity or acceleration). For example, in changing units from meters to millimeters, a displacement of 1 m becomes 1000 mm.
      Covariant vectors, on the other hand, have units of one-over-distance (typically such as gradient). For example, in changing again from meters to millimeters, a gradient of 1 K/m becomes 0.001 K/mm."
      www.wikiwand.com/en/Covariance_and_contravariance_of_vectors

    • @dsaun777 · 5 years ago +2

      @@TensorCalculusRobertDavie So it doesn't matter whether you use contravariant or covariant components; you just use whichever is most convenient for the transformation at hand?
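
    A compact way to see the distinction quoted in the reply above (a sketch in standard notation, not taken from the video): under a change of coordinates x -> x', the two kinds of components transform with mutually inverse Jacobian factors,

        v'^i = \frac{\partial x'^i}{\partial x^j} v^j    (contravariant, e.g. a displacement)

        w'_i = \frac{\partial x^j}{\partial x'^i} w_j    (covariant, e.g. a gradient)

    For the unit change in the quote, x' = 1000x (meters to millimeters), the factor \partial x'/\partial x = 1000 sends a displacement of 1 m to 1000 mm, while the inverse factor 1/1000 sends a gradient of 1 K/m to 0.001 K/mm. Both component sets describe the same geometric object, and the metric converts between them, so which one you work with is largely a matter of convenience.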

  • @한두혁 · 3 years ago +2

    Is linear algebra needed (I mean in a rigorous way, starting from defining vector spaces, dual spaces, and so on)
    to fully understand tensors and general relativity? Some textbooks were pretty hard to read, since they start from a very abstract point of view without even mentioning differentials or the chain rule from calculus.
    I really enjoyed the video, by the way; I really appreciate it.
    Thank you!

    • @TensorCalculusRobertDavie · 3 years ago +1

      Hello and thank you for your comment. The answer is no, because this video provides you with a basic introduction to basis vectors and one-forms (the objects with raised indices). However, the more you learn the better, so do continue to study linear algebra if you can.
      Thank you for the feedback and good luck with your studies.

    • @한두혁 · 3 years ago +2

      @@TensorCalculusRobertDavie Thank you!

  • @nicolecui3214 · 4 years ago +1

    Hi, thanks for the video, but why is every vector written with covariant components against a contravariant basis, and vice versa? Intuitively, I thought the components and the basis would be consistent with each other.

    • @TensorCalculusRobertDavie · 4 years ago +1

      The two bases are distinct, hence the upper and lower indices, and they behave in different ways, unlike in Euclidean space, where they really are the same thing and there is no reason to raise or lower indices. Sorry for the short answer. Have a look at this article: en.wikipedia.org/wiki/Covariance_and_contravariance_of_vectors
      and this video: ruclips.net/video/CliW7kSxxWU/видео.html
      In General Relativity we use a metric to raise and lower these indices that is not the same as the Euclidean metric.

    • @nicolecui3214 · 4 years ago +1

      @@TensorCalculusRobertDavie Thank you for the reply, will take a look! :)
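
    The index conventions behind the reply above, written out (standard identities, not specific to the video): the metric and its inverse convert one set of components into the other,

        v_i = g_{ij} v^j,    v^i = g^{ij} v_j,    g^{ik} g_{kj} = \delta^i_j,

    and a vector can be expanded against either basis,

        \mathbf{v} = v^i \mathbf{e}_i = v_j \mathbf{e}^j,    where \mathbf{e}^i \cdot \mathbf{e}_j = \delta^i_j.

    Pairing upper-index components with lower-index basis vectors (and vice versa) is what makes each expansion invariant. In Cartesian coordinates on Euclidean space g_{ij} = \delta_{ij}, so v_i = v^i and the two bases coincide, which is why the distinction never shows up there.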

  • @davidprice1875 · 7 years ago +1

    Very clear and precise summary.

    • @TensorCalculusRobertDavie · 7 years ago

      Thank you David.

    • @TensorCalculusRobertDavie · 7 years ago +2

      Hello Sjaak, the content covered here does assume some prior knowledge of vector calculus. The main point of the video is the two forms of basis vectors that can be constructed, so could I suggest that a good starting point would be to focus on the meaning of the diagrams before moving on to the notation and what it is trying to express. Hope that helps?

  • @parthvarasani495 · 4 months ago +1

    At 12:24, u \cdot v = g_{ij} u^i v^j, not the square root of it, I think (in the numerator).

    • @TensorCalculusRobertDavie · 4 months ago +1

      You are right. Thank you for spotting that.

    • @parthvarasani495 · 4 months ago +1

      @@TensorCalculusRobertDavie Thank you for all your efforts, highly appreciated 👍👏👏
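
    For the record, the corrected expression (assuming the formula at 12:24 is the usual angle between two vectors): the square roots belong only in the normalizing denominator,

        \cos\theta = \frac{g_{ij} u^i v^j}{\sqrt{g_{kl} u^k u^l}\,\sqrt{g_{mn} v^m v^n}},

    with \mathbf{u} \cdot \mathbf{v} = g_{ij} u^i v^j appearing unrooted in the numerator.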

  • @박용석-n8y · 4 years ago +2

    5:12 thank you so much

  • @hariacharya5533 · 6 years ago +1

    Good presentation. You explain nicely.

  • @garytzehaylau9432 · 4 years ago +1

    Excuse me, what is \nabla_i u at 12:53, actually?
    The notation is not clear.
    Why does g^{ij} \nabla_j u e_j = n?
    Could you explain it to me?
    Thanks for your great videos;
    I would recommend them to other people.

    • @TensorCalculusRobertDavie · 4 years ago

      Hello Gary and thank you for your question.
      The inverse metric is the g^{ij} part, and \nabla u is the derivative, giving us the direction of maximum increase of the scalar u in each of the directions j. The inverse metric raises the j index on the result of \nabla u so that we obey the Einstein summation convention and don't end up with two j's down below.
      We CANNOT have (\nabla u)_j e_j, but we can and must have (\nabla u)^j e_j.
      Hope that helps?
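
    The notation in the exchange above, spelled out under the usual conventions (a sketch, not a transcription of the slide): \nabla_j u are the covariant components of the gradient,

        \nabla_j u = \frac{\partial u}{\partial x^j},

    and the inverse metric raises the index so that the expansion against the basis pairs one upper index with one lower index:

        n = g^{ij} (\nabla_j u)\, \mathbf{e}_i = (\nabla u)^i \mathbf{e}_i.

    The j's are summed (one up in g^{ij}, one down in \nabla_j u), leaving a free upper index i to pair with the basis vector \mathbf{e}_i; writing \mathbf{e}_j there would produce two lower j's, which the summation convention forbids.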

  • @zoltankurti · 5 years ago

    At the beginning of the video, you have to assume that the coordinate transformation and its inverse are also differentiable.

    • @TensorCalculusRobertDavie · 5 years ago

      Thank you Zoltan, that is a good point about differentiability; I should have mentioned it at the beginning.
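
    Zoltan's condition, stated explicitly (the standard requirement): the coordinate change should be a diffeomorphism, i.e. x'^i = x'^i(x^1, \dots, x^n) and its inverse are both differentiable, with

        \det\left(\frac{\partial x'^i}{\partial x^j}\right) \neq 0,

    so that the Jacobian matrices \partial x'^i/\partial x^j and \partial x^j/\partial x'^i both exist and are inverse to each other.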

  • @rontoolsie · 7 years ago +1

    At 11:45, line 3 should end up as u (covariant) v (contravariant). Otherwise this is an excellent presentation.

    • @TensorCalculusRobertDavie · 7 years ago +1

      Hello Ron, thank you for your comment, and you are correct. However, in this case we have u (covariant) v (contravariant) = u (contravariant) v (covariant), which was the point I was trying to show across lines 3 and 4: there are four different-looking ways to get the same result. At the time I umm-ed and ahh-ed about whether I should write it in the form you have pointed out, but my goal took precedence in the end.
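
    The four different-looking forms mentioned in the reply above, side by side (standard identities):

        \mathbf{u} \cdot \mathbf{v} = g_{ij} u^i v^j = g^{ij} u_i v_j = u_i v^i = u^i v_i,

    since raising an index on one factor while lowering it on the other leaves the contraction unchanged.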

  • @vicentematricardi3596 · 6 years ago +1

    Your videos are very good!!!!!

  • @abhishekrai1204 · 5 years ago +1

    Thanks sir

  • @anthonysegers01 · 6 years ago +3

    GREAT JOB!!!

  • @rontoolsie · 7 years ago

    Correction... line 4.