Everything is a matrix.

  • Published: Nov 5, 2024

Comments • 71

  • @a.i7538
    @a.i7538 2 years ago +94

    He's Beginning To Believe!

  • @essadababneh5871
    @essadababneh5871 2 years ago +22

    The greatest thing about this channel is that it is heavy on examples. This is the way I learn. I need things spelled out for me. Thank you!

    • @DelandaBaudLacanian
      @DelandaBaudLacanian 2 years ago +1

      Penn has become indispensable, I don't understand everything but he explores it all in such a methodical way that I always learn a little bit along the way

  • @angeldude101
    @angeldude101 2 years ago +19

    That last matrix looks like a rotation matrix, or more accurately a spiral. This honestly makes complete sense given the use of exponentials and trig functions. You can also write it as the complex number a ± bi, which is the exact same rotation. (Though whether you use plus or minus depends on which side you're looking at the plane from.)
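
The correspondence described here is easy to sanity-check: the 2x2 block [[a, -b], [b, a]] acts on a vector (x, y) exactly as multiplication by the complex number a + bi acts on x + iy. A minimal plain-Python sketch (the values of a, b, x, y below are arbitrary):

```python
a, b = 2.0, 3.0
x, y = 0.5, -1.25

# Matrix form: [[a, -b], [b, a]] applied to (x, y)
mx = a * x - b * y
my = b * x + a * y

# Complex form: (a + bi) * (x + iy)
z = complex(a, b) * complex(x, y)

assert abs(z.real - mx) < 1e-12 and abs(z.imag - my) < 1e-12
print(mx, my, z)   # prints 4.75 -1.0 (4.75-1j)
```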

  • @brendand3765
    @brendand3765 2 years ago +21

    Honestly, that's why I love linear algebra so much. So versatile

  • @goodplacetostop2973
    @goodplacetostop2973 2 years ago +17

    17:22

  • @txikitofandango
    @txikitofandango 2 years ago +18

    Was confused at 7:45 because you seem to do XA - AX rather than AX - XA. But I get that you changed what A is to make A the input of T.

  • @johnshortt3006
    @johnshortt3006 2 years ago +59

    I think he reversed A and X in the commutation calculation?

    • @jkid1134
      @jkid1134 2 years ago +7

      Classic Penn

    • @davidt939
      @davidt939 2 years ago +14

      So he computed -T instead of T
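
The sign flip is quick to verify numerically: swapping the order in the commutator only negates the result, XA - AX = -(AX - XA). A minimal plain-Python check (A is the matrix from the video; X is an arbitrary test matrix):

```python
def matmul(P, Q):
    # 2x2 matrix product
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sub(P, Q):
    return [[P[i][j] - Q[i][j] for j in range(2)] for i in range(2)]

A = [[2, 3], [-1, 0]]     # the matrix used in the video
X = [[1, 4], [5, -2]]     # arbitrary test matrix

T = sub(matmul(A, X), matmul(X, A))      # AX - XA
negT = sub(matmul(X, A), matmul(A, X))   # XA - AX

assert all(T[i][j] == -negT[i][j] for i in range(2) for j in range(2))
print(T)   # prints [[19, -1], [-13, -19]]
```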

  • @manucitomx
    @manucitomx 2 years ago +10

    This was a great video.
    Thank you, professor.

  • @ashwinjain5566
    @ashwinjain5566 2 years ago

    the last example was NEAT

  • @SliversRebuilt
    @SliversRebuilt 2 years ago +9

    Have you ever had a dream...

  • @jesusalej1
    @jesusalej1 2 years ago +7

    Everything is IN the matrix. Matrix can become Everything.

  • @Jack_Callcott_AU
    @Jack_Callcott_AU 2 years ago +5

    That last application blew me away. Is that a much easier way to get that integral? I think it would take a long time by heuristic trial and error methods.

    • @alexisren365
      @alexisren365 2 years ago +5

      The integral can be computed using complex numbers: instead of integrating e^(ax)*cos(bx) (resp. e^(ax)*sin(bx)), you integrate e^((a+ib)x) and take the real part (resp. imaginary part) to get the result
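
A quick numerical check of this trick: the real part of e^((a+ib)x) / (a+ib) should be an antiderivative of e^(ax)cos(bx). This is a sketch with arbitrary values a = 0.7, b = 1.3 and an arbitrary sample point, verified by a central finite difference:

```python
import cmath
import math

a, b = 0.7, 1.3
c = complex(a, b)

def F(x):
    # Candidate antiderivative: Re( e^(cx) / c )
    return (cmath.exp(c * x) / c).real

def f(x):
    # The integrand e^(ax) cos(bx)
    return math.exp(a * x) * math.cos(b * x)

x, h = 0.4, 1e-6
derivative = (F(x + h) - F(x - h)) / (2 * h)   # central difference
assert abs(derivative - f(x)) < 1e-6
```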

    • @Jack_Callcott_AU
      @Jack_Callcott_AU 2 years ago

      @@alexisren365 Thanks for telling me this. It's very interesting. Is this method an example of Feynman integration? I think not.

    • @jorex6816
      @jorex6816 2 years ago +8

      The "normal" way would be to use integration by parts (I think two times?). The integral you're trying to solve will come up again and you can easily solve for it by manipulating the equation.

    • @Jack_Callcott_AU
      @Jack_Callcott_AU 2 years ago

      @@jorex6816 Yes! I've seen that done before in the dim and distant past. Thanks. 👍

  • @tiborgrun6963
    @tiborgrun6963 2 years ago +1

    Looking at the inverse of
    [ [a, b];
    [c, d] ]
    1/det *
    [ [d, -b];
    [-c, a] ]
    And next to it the 4x4 matrix for our commutator -T with the matrix A =
    [ [2, 3];
    [-1, 0] ]
    becomes
    [ [0, -1, -3, 0];
    [3, -2, 0, -3];
    [1, 0, 2, -1];
    [0, 1, 3, 0] ]
    This looks like some kind of Kronecker product: the top right is 3 ⊗ [ [-1, 0]; [0, -1] ] and the bottom left is (-1) ⊗ [ [-1, 0]; [0, -1] ]. So as in the inverse case, they get multiplied by -1, but blown up from 1 to 2 dimensions.
    And instead of swapping places with each other, the other two entries are sort of swapping places with the whole matrix A:
    Top left is (if we ignore the 1/det factor) -(A^(-1))^T
    Bottom right then is just A^T
    Maybe this coincidence is just because of the missing minus sign, so if we multiply the 4x4 matrix by -1
    [ [0, 1, 3, 0];
    [-3, 2, 0, 3];
    [-1, 0, -2, 1];
    [0, -1, -3, 0] ]
    and then the top right and bottom left are just the top right and bottom left of matrix A multiplied by the 2x2 identity matrix.
    And top left becomes (A^(-1))^T
    And bottom right becomes -A^T.
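
The Kronecker pattern noticed above can be made exact. Assuming the row-major basis ordering E11, E12, E21, E22, AX flattens to (A ⊗ I)·x and XA flattens to (I ⊗ Aᵀ)·x, so the matrix of -T(X) = XA - AX should be I ⊗ Aᵀ - A ⊗ I. A minimal plain-Python check against the 4x4 written out above (the kron helper is hand-rolled for 2x2 inputs):

```python
def kron(P, Q):
    # Kronecker product of 2x2 matrices: block (i, j) of the result is P[i][j] * Q
    return [[P[i // 2][j // 2] * Q[i % 2][j % 2] for j in range(4)]
            for i in range(4)]

A  = [[2, 3], [-1, 0]]
At = [[2, -1], [3, 0]]   # transpose of A
I  = [[1, 0], [0, 1]]

# Matrix of -T(X) = XA - AX in the basis E11, E12, E21, E22
K = [[kron(I, At)[i][j] - kron(A, I)[i][j] for j in range(4)]
     for i in range(4)]

assert K == [[0, -1, -3, 0],
             [3, -2, 0, -3],
             [1, 0, 2, -1],
             [0, 1, 3, 0]]   # the 4x4 from the comment above
```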

  • @DeanCalhoun
    @DeanCalhoun 2 years ago +3

    “you think that’s calculus you’re doing
    now?”

  • @urumomaos2478
    @urumomaos2478 2 years ago +4

    Nobody:
    Literally nobody:
    Linear algebra teachers be like: everything's a matrix

  • @Nikolas_Davis
    @Nikolas_Davis 2 years ago +8

    You can even use matrices for mappings between infinite dimensional vector spaces. Heisenberg did it!

    • @juliandominikreusch8979
      @juliandominikreusch8979 2 years ago +1

      Well, what do you mean by that? Are there linear maps between infinite dimensional spaces? Yes. Can they be represented by a matrix? Trivially no. Linear operator =/= matrix

    • @MrFtriana
      @MrFtriana 2 years ago +1

      Kind of. Yes, you can represent any linear operator in quantum mechanics using the solutions of the Schrödinger equation and get a matrix representation, and in principle that matrix could be infinite dimensional. But there are some questions of a technical character, so I recommend investigating it in quantum mechanics and functional analysis textbooks in order to get a proper understanding.

    • @asukalangleysoryu6695
      @asukalangleysoryu6695 2 years ago +8

      "Jesse... Jesse, listen to me! We have to transform!"
      "Yo, mister White... I don't think we can transform between infinite dimensional vector spaces."
      "Jesse... What the fuck are you talking about Jesse?"

    • @Nikolas_Davis
      @Nikolas_Davis 2 years ago +2

      @@juliandominikreusch8979
      Yes, I mean linear mappings, which can be expressed as generalized matrices. Heisenberg formulated quantum mechanics in the form of "matrix mechanics", without actually realizing it was equivalent to Schrödinger's "wave mechanics".
      The discrete nature of quantum physics allows one to express physical states, e.g. energy levels, as infinite vectors, and physical operators - which correspond to classical quantities like momentum, energy or position - as matrices acting upon these vectors, and giving us other vectors. The analogy is powerful, and can be made rigorous, because even though you can't actually write down these infinite matrices, their elements are well-defined, just as the terms of an infinite series are well defined.
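
A finite-dimensional illustration of this point (a sketch, not Heisenberg's construction): the derivative operator on polynomials lives on an infinite-dimensional space, but truncating to degree < 4 gives an ordinary 4x4 matrix in the monomial basis 1, x, x², x³, and its entries are well-defined for any truncation size n:

```python
n = 4
D = [[0] * n for _ in range(n)]
for k in range(1, n):
    D[k - 1][k] = k          # d/dx (x^k) = k * x^(k-1)

# p(x) = 1 + 2x + 3x^2 + 4x^3, stored as its coefficient vector
p = [1, 2, 3, 4]
dp = [sum(D[i][j] * p[j] for j in range(n)) for i in range(n)]
print(dp)   # prints [2, 6, 12, 0], i.e. p'(x) = 2 + 6x + 12x^2
```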

    • @MrFtriana
      @MrFtriana 2 years ago +2

      @@Nikolas_Davis but don't forget that not all operators have discrete spectra. For example, the position and momentum operators, or the kinetic energy in the case of free particles. In that case there are some subtleties (and yes, there is the Dirac delta, but remember, it is a distribution, which makes this problem a bit tricky).

  • @abrahammekonnen
    @abrahammekonnen 2 years ago +1

    Great video

  • @corneliusgoh
    @corneliusgoh 2 years ago +1

    I always enjoy his videos, especially because he is excellent at linking linear algebra to other fields like calculus. You never see that in French math, which treats linear algebra abstractly but never applies it to other areas like calculus.

  • @colemanrohde8243
    @colemanrohde8243 2 years ago +1

    Could you do a video on the limit lim n-->∞ BB(n)/TREE(n) where BB(n) is the busy beaver function and TREE(n) is the tree function? Also a video on uncomputable numbers would be cool as well.

  • @abrahammekonnen
    @abrahammekonnen 2 years ago +1

    6:06 when I watched this video through the first time, I thought that left multiplication was wrong here, but if you consider integration to be the inverse of differentiation, then you should right multiply here!
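
The "integration is the inverse of differentiation" idea can be sketched in coordinates. This is a reconstruction, not copied from the video: on span{e^(ax)cos(bx), e^(ax)sin(bx)}, differentiation acts on coordinate vectors by D = [[a, b], [-b, a]] (columns are the images of the basis functions), so integration is multiplication by D⁻¹ = [[a, -b], [b, a]] / (a² + b²). The values of a, b and the sample point below are arbitrary:

```python
import math

a, b = 0.7, 1.3
det = a * a + b * b

# e^(ax)cos(bx) has coordinates (1, 0); applying D^(-1) gives (a, b) / det,
# so its antiderivative should be e^(ax) * (a*cos(bx) + b*sin(bx)) / det.
c1, c2 = a / det, b / det

def F(x):
    return math.exp(a * x) * (c1 * math.cos(b * x) + c2 * math.sin(b * x))

x, h = 0.4, 1e-6
derivative = (F(x + h) - F(x - h)) / (2 * h)   # central difference
assert abs(derivative - math.exp(a * x) * math.cos(b * x)) < 1e-6
```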

  • @MrBeiragua
    @MrBeiragua 2 years ago

    I remember doing this to calculate the half derivative of cos(x)

  • @mircovalentini7153
    @mircovalentini7153 2 years ago

    The mapping (and its inverse) between an element of the finite-dimensional vector space and the "vector" used to encode it can be tricky to define. One positive thing about matrices is that there are a lot of "tools", sharpened by years of use, that can be used to easily establish the properties of the linear transformation itself.

  • @jacemandt
    @jacemandt 2 years ago +2

    Is there an obvious (or not-so-obvious) relationship between the 2x2 matrix A and the 4x4 matrix representing the commutator operation? My linear algebra knowledge isn't very extensive, but it seems like there must be something....

    • @smiley_1000
      @smiley_1000 2 years ago

      Perhaps it is even a linear relationship, given by a (4x4)x(2x2) = 16x4 matrix

    • @schweinmachtbree1013
      @schweinmachtbree1013 2 years ago +2

      if you've got time and patience then you could carry out the same calculation Michael did, but for a general matrix instead of {{2, 3}, {-1, 0}}, and you can discover the answer for yourself (I don't know the answer, but I have neither time nor patience xD). I can certainly see some patterns in the 4x4 matrix Michael gets, though
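
For anyone else without the patience, here is a plain-Python sketch of that computation: build the 4x4 matrix of T(X) = AX - XA column by column (column k is T applied to the k-th basis matrix E11, E12, E21, E22, flattened row by row), for an arbitrary A, and compare it with the Kronecker-product guess A ⊗ I - I ⊗ Aᵀ. The specific A below is an arbitrary choice; any 2x2 matrix works:

```python
def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def kron(P, Q):
    # Kronecker product for 2x2 inputs
    return [[P[i // 2][j // 2] * Q[i % 2][j % 2] for j in range(4)]
            for i in range(4)]

A  = [[5, -2], [7, 4]]                                 # arbitrary 2x2 matrix
At = [[A[j][i] for j in range(2)] for i in range(2)]   # transpose
I  = [[1, 0], [0, 1]]

# Column k of the matrix of T is T(E_k), flattened row-major.
M = [[0] * 4 for _ in range(4)]
for k in range(4):
    E = [[1 if 2 * i + j == k else 0 for j in range(2)] for i in range(2)]
    AX, XA = matmul(A, E), matmul(E, A)
    for i in range(2):
        for j in range(2):
            M[2 * i + j][k] = AX[i][j] - XA[i][j]

guess = [[kron(A, I)[i][j] - kron(I, At)[i][j] for j in range(4)]
         for i in range(4)]
assert M == guess   # the pattern: A kron I minus I kron A-transpose
```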

  • @VeteranVandal
    @VeteranVandal 2 years ago +3

    Nah. Technically everything is a tensor.

  • @wesleydeng71
    @wesleydeng71 2 years ago

    Great video but audio was a bit low.

  • @minwithoutintroduction
    @minwithoutintroduction 2 years ago +3

    We always benefit. Thank you very much.

    • @jesusalej1
      @jesusalej1 2 years ago +2

      I totally agree.

    • @asukalangleysoryu6695
      @asukalangleysoryu6695 2 years ago +4

      I agree.

    • @jesusalej1
      @jesusalej1 2 years ago +2

      I couldn't agree more with you either.

    • @BMK5298
      @BMK5298 2 years ago +2

      Peace be upon you and God's mercy and blessings. Could you recommend some books for the first year of university?

  • @361Jonel
    @361Jonel 2 years ago

    Around 8:00 I think you've accidentally swapped A and X

  • @skypickle29
    @skypickle29 2 years ago

    In the commutator example, he takes rows of the 4x4 matrix and multiplies each row by the corresponding row entry of the vector in front. Doesn’t matrix multiplication require the COLUMNS of the second matrix to be used as the ‘multiplier’?

    • @Alex_Deam
      @Alex_Deam 2 years ago +2

      No, he multiplies each row of the 4x4 matrix with the whole column vector. The matrix isn't 'on the right' in the sense of matrix multiplication, it's a diagram showing an operator - the 4x4 matrix is placed where it is because it's the operator represented by the arrow below it. I believe this diagram is called a commutative diagram. The matrix operator itself is still a left multiplier.
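
Both readings of a matrix-vector product give the same answer: each entry of Mv is a row of M dotted with v, and the full result is a combination of the columns of M weighted by the entries of v. A small plain-Python check, using the 4x4 matrix from the commutator discussion above (v is an arbitrary test vector):

```python
M = [[0, -1, -3, 0],
     [3, -2, 0, -3],
     [1, 0, 2, -1],
     [0, 1, 3, 0]]
v = [1, 4, 5, -2]

# Reading 1: entry i is row i of M dotted with v
by_rows = [sum(M[i][j] * v[j] for j in range(4)) for i in range(4)]

# Reading 2: the result is v[0]*(column 0) + v[1]*(column 1) + ...
by_cols = [0, 0, 0, 0]
for j in range(4):
    for i in range(4):
        by_cols[i] += v[j] * M[i][j]

assert by_rows == by_cols
print(by_rows)   # prints [-19, 1, 13, 19]
```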

  • @General12th
    @General12th 2 years ago

    Hi Dr.!

  • @gedeonss4052
    @gedeonss4052 2 years ago

    Interesting , nice job 🙂👍

  • @CM63_France
    @CM63_France 2 years ago

    Hi,
    These glances to the right all the time are not very pleasant; it looks like Numberphile 🤣

  • @elixiroflife9636
    @elixiroflife9636 2 years ago

    Hi Michael,
    Please suggest a book on linear algebra which has these kinds of theory and examples. Thanks

  • @Jnglfvr
    @Jnglfvr 2 years ago +1

    You look like a young John McEnroe.

  • @rocky171986
    @rocky171986 2 years ago +8

    The video is missing the most central idea: that the matrix can be constructed by examining what the linear transformation does to the basis elements

    • @jorex6816
      @jorex6816 2 years ago +3

      How would that work with the linear map from M(2x2) to M(2x2)?

    • @Alex_Deam
      @Alex_Deam 2 years ago +3

      What? He literally does exactly that in the final example!

    • @mbrusyda9437
      @mbrusyda9437 2 years ago +2

      @@jorex6816 The basis can be the 4 matrices with a 1 in one slot and 0 in the others, no?

  • @charleyhoward4594
    @charleyhoward4594 2 years ago

    the answer at 6:39 makes no sense to me: a 3x1 vector times a 1x3 vector equals a single value????

    • @smiley_1000
      @smiley_1000 2 years ago +1

      Yeah, it should be the other way around. A 1x3 vector times a 3x1 vector gives a 1x1 single value.

  • @RalphDratman
    @RalphDratman 2 years ago +1

    That does not apply to octonion multiplication, which is non-associative.
    Matrix multiplication is always associative, whether or not it is commutative.

    • @bruh-pj3kq
      @bruh-pj3kq 2 years ago +2

      If it’s non-associative, it’s not linear

    • @RalphDratman
      @RalphDratman 2 years ago

      @@bruh-pj3kq I'm not sure what you mean by linear in this context.

  • @ready1fire1aim1
    @ready1fire1aim1 2 years ago

    [Leibniz's contingency argument, clarified]:
    Ten whole, rational numbers 0-9 and their geometric counterparts 0D-9D.
    0 and its geometric counterpart 0D are:
    1) whole
    2) rational
    3) not-natural (not-physical)
    4) necessary
    1-9 and their geometric counterparts 1D-9D are:
    1) whole
    2) rational
    3) natural (physical)
    4) contingent
    Newton says since 0 and 0D are
    "not-natural" ✅
    then they are also
    "not-necessary" 🚫.
    Newton also says since 1-9 and 1D-9D are "natural" ✅
    then they are also
    "necessary" 🚫.
    This is called "conflating" and is repeated throughout Newton's Calculus/Physics/Geometry/Logic.
    con·flate
    verb
    combine (two or more texts, ideas, etc.) into one.
    Leibniz does not make these fundamental mistakes.
    Leibniz's "Monadology" 📚 is zero and it's geometric counterpart zero-dimensional space.
    0D Monad (SNF)
    1D Line (WNF)
    2D Plane (EMF)
    3D Volume (GF)
    We should all be learning Leibniz's Calculus/Physics/Geometry/Logic.
    Fibonacci sequence starts with 0 for a reason. The Fibonacci triangle is 0, 1, 2 (Not 1, 2, 3).
    Newton's 1D-4D "natural ✅ =
    necessary 🚫" universe is a contradiction.
    Natural does not mean necessary. Similar, yet different.
    Not-natural just means no spatial extension; zero size; exact location only. Necessary.
    Newtonian nonsense will never provide a Theory of Everything.
    Leibniz's Law of Sufficient Reason should be required reading 📚.

  • @DelandaBaudLacanian
    @DelandaBaudLacanian 2 years ago

    even our minds can be represented in the form of what's called "Matrixial trans-subjectivity", it reveals the threads of trauma that tie us all together, allowing individuals to transform their unrelenting death drives to life drives

  • @Neomatrixology
    @Neomatrixology 2 years ago +1

    💊🐇🕳🔋🍪📡
    #BeyondTheGlitch

  • @와우-m1y
    @와우-m1y 2 years ago +1

    answer=1 or is it 🤣🤣🤣🤣

  • @SuperYoonHo
    @SuperYoonHo 2 years ago +1

    thanks! 🤗🤗x100
    🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗🤗