Derive Black Holes Yourself | Penrose’s Singularity Theorem (Nobel Prize 2020)

  • Published: 5 Nov 2024

Comments • 14

  • @kasugaryuichi9767
    @kasugaryuichi9767 2 years ago +3

    I found you thanks to SOME, looking forward to more~

  • @hydraliskinfector4529
    @hydraliskinfector4529 2 years ago +2

    Great content! I hope someday you get the followers and views that you deserve.

  • @sergiolucas38
    @sergiolucas38 2 years ago +1

    Great video, thanks :)

  • @MaxxTosh
    @MaxxTosh 2 years ago +1

    My only nitpick: you said calculus doesn’t work with infinities - doesn’t calculus exclusively work with infinities?

    • @mindmaster107
      @mindmaster107 2 years ago +2

      Calculus works with limits. You can search "calculus first principles" on YouTube if you want a look.
      When it comes to infinity, algebra doesn't work. Calculus does a special trick: instead of an actual infinity, we substitute an arbitrarily large variable. This imitates infinity while still allowing the algebra to happen.
      Note that infinity itself still breaks algebra, and by extension calculus.
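The "arbitrarily increasing variable" trick can be sketched numerically; a minimal Python example (the function e^(-x), the bounds, and all names here are my own illustration, not from the thread):

```python
# Sketch of the limit trick: instead of plugging infinity into the
# integral of e^(-x) from 0 to infinity (whose exact value is 1),
# integrate up to a finite bound t and watch what happens as t grows.
import math

def integral_exp_decay(t, steps=100_000):
    """Approximate the integral of e^(-x) on [0, t] with the midpoint rule."""
    dx = t / steps
    return sum(math.exp(-(i + 0.5) * dx) for i in range(steps)) * dx

# The finite-t values creep up toward the limit value 1.
for t in [1, 5, 10, 20]:
    print(t, integral_exp_decay(t))
```

Every value printed is an ordinary finite calculation; "infinity" only ever appears as the limit the sequence approaches.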

    • @MaxxTosh
      @MaxxTosh 2 years ago +1

      @@mindmaster107 That makes sense! You reminded me that in certain integrals we couldn’t plug in infinity; we had to plug in “t approaches infinity.”

  • @Lincoln_Bio
    @Lincoln_Bio 2 years ago

    I'm reading Penrose's Road to Reality at the moment. The depth of his understanding of the mathematics of geometry - and vice versa - is astounding; I'm lost half the time lol

    • @mindmaster107
      @mindmaster107 2 years ago

      Well, drop me some questions!
      I can’t guarantee I can answer each one, but give it a shot!

    • @Lincoln_Bio
      @Lincoln_Bio 2 years ago

      @@mindmaster107 It's amazing how he starts with Pythagoras and keeps generalising to more and more dimensions. We haven't even hit Relativity yet and I've just learned what fibre bundles are haha
      Tensor indices are confusing me atm, what does it mean when they're upper or lower?

    • @mindmaster107
      @mindmaster107 2 years ago +1

      Step 1: Watch my tensor video (You need to know contravariant, covariant, and scalar)
      Step 2: Read explanation below.
      A tensor is a general vector and has two properties: rank and dimension.
      Rank determines the “vectorness” of the tensor.
      Rank 0 is a scalar. Rank 1 is a vector.
      Rank tells you how strongly the tensor reacts to coordinate changes. Under a doubling of coordinates, a rank-2 covariant tensor’s components increase by a factor of 4.
      It is possible for a tensor to be both covariant and contravariant (the Riemann curvature tensor is rank 1 contravariant and rank 3 covariant), but intuition breaks down slightly here, so I will ignore this for now.
      Dimension is the space the tensor is in. A vector can be in a 2D or 3D space, and still act like a vector (has length, has dot product, etc.)
      Mathematically, the number of dimensions does not change what a tensor can do. (Aside from dimension 0)
      A covariant vector’s coefficients transform with the coordinates, while a contravariant vector’s transform against them. Double the coordinate spacing and a covariant vector’s components double, while a contravariant vector’s halve.
      What if you take a product between a contravariant and a covariant vector? The amount one increases should perfectly cancel the amount the other decreases, and coordinate transformations should do nothing.
      This means that multiplying together the coefficients of one vector from each category makes a scalar. This is what the dot product does.
      More generally, contract any contravariant tensor index with a covariant tensor index and the pair is removed from the equation.
      This is what summing over tensor indices does, and why the notation is set up that way: sum any top index with any bottom index and they remove each other. Since traditional vectors are contravariant and usually have indices on top, covariant indices got the bottom instead.
      There is more to say about the metric tensor and how it’s used to raise and lower indices, but after understanding what I mentioned above, looking through videos yourself should be perfectly possible.
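The transformation rules and the contraction cancellation described above can be sketched in a few lines of plain Python (the particular vectors and the doubling factor are illustrative choices of mine, not from the video):

```python
# Sketch: how components react when the coordinate spacing is doubled,
# i.e. each basis vector is scaled by 2.
scale = 2.0

v_contra = [3.0, 4.0]   # contravariant components (indices "on top")
w_co = [1.0, 2.0]       # covariant components (indices "on the bottom")

# Under the basis doubling:
v_new = [x / scale for x in v_contra]   # contravariant components halve
w_new = [x * scale for x in w_co]       # covariant components double

def contract(a, b):
    """Sum a^i * b_i: pairing one upper index with one lower index."""
    return sum(x * y for x, y in zip(a, b))

# The halving and doubling cancel, so the contraction is the same
# scalar in both coordinate systems.
print(contract(v_contra, w_co))  # 11.0
print(contract(v_new, w_new))    # 11.0
```

The same cancellation is why the summation convention only ever pairs a top index with a bottom index: that is exactly the combination that is coordinate-independent.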

    • @Lincoln_Bio
      @Lincoln_Bio 2 years ago

      @@mindmaster107 I felt like it was a vector/co-vector space thing or something, so I was vaguely along the right lines lol thanks!

  • @siddharthsinghchauhan8664
    @siddharthsinghchauhan8664 2 years ago +2

    Bro.
    You be awesome