Spinors for Beginners 19: Tensor Product Representations of su(2) [Clebsch-Gordan coefficients]
- Published: 9 Jul 2024
- Full spinors playlist: • Spinors for Beginners
Leave me a tip: ko-fi.com/eigenchris
Powerpoint slide files + Exercise answers: github.com/eigenchris/MathNot...
Clebsch-Gordan Tables: pdg.lbl.gov/2002/clebrpp.pdf
Videos on Clebsch-Gordan Coefficients:
- • Deriving Clebsch-Gorda...
- • Clebsch-Gordan Coeffic...
0:00 - Introduction
2:45 - Direct Sum vs Tensor Product
7:19 - Multi-particle systems
8:27 - Tensor Product of Lie Algebras
12:45 - Tensor product of su(2) reps
15:21 - Eigenvalue Operator
17:39 - Ladder Operators
20:50 - 2x2 = 3+1
24:09 - Casimir Operator
26:24 - Clebsch-Gordan Coefficients
29:28 - 3 and 4 spinor products
32:40 - Weight Diagrams
35:12 - Building tensors using spinors
36:59 - Larger Tensor Product Reps.
Literally the best math for physics content on RUclips!
Between eigenchris, @sudgylacmoe and @ron-math we're kinda spoiled. But also @richarde.borcherds7998.
This is a fantastic video! I wish this had been my introduction to this topic. Even just one passive viewing, without committing all the details to memory, removes all the mystery; upon rereading, one would no longer see this all as black magic, but rather be confident in one's ability to understand it!
if only i saw this before my Quantum final, this video is a great explanation and builds more around the CG coefficients
Really good! Bravo again! This series should be required viewing for all graduate students taking the QM course sequence. Things may have changed in the years since I was a grad student, but for me, there wasn't any stress on learning tensor products or Lie algebra representations. But now, with quantum computing being drenched in tensor products, it's irresponsible to fail to teach this to students. (For those who don't know, QM courses tend to be taught by senior professors who use antiquated lecture notes and older textbooks.)
You're a didactic genius!
Always thanks for your work. In my view, this video series is the best way to learn Lie algebra representation theory 👏👏👏
Great video, I used to be only able to mindlessly read the coefficients, now I understand them!
Surprisingly, watching this video has reinforced and enriched (by analogy) the theory of cognition I'm building (which I'm beginning to call something like multidimensional cognitive algebra), even though I know Lie groups and Lie algebras only work in continuous spaces and the cognitive ones aren't. But the direct sum and the tensor product act similarly. Thanks a lot!
Especially appreciate the spin 3/2 × 1/2 and 1 × 1 examples at 37:29. Opportunities for self-affirmation, sort of.
Incredible work, keep it up. Whenever you have a video covering material i am currently learning in uni i am very happy
Nice video as always Chris! 😊
Thanks! For all your work …
Amazing video!
Awesome video!
Great work
Wish this video existed when I was in undergrad… I missed the lecture and felt like I missed a semester
32:44 How is Mark Thomson's graph algorithm easier to understand than standard physics of spinning tops aligning or anti-aligning?
If I have two 1/2h spinners I can
A. line them both up in the same direction to get a 1h angular momentum spinner with 3 orientations (m)
B. flip one of them into the opposite direction to get effectively 0 angular momentum with 1 "orientation" (only m=0).
= 3+1 = 4 states
Then if you get a third 1/2 spinner, you can use it to
A. lengthen (strengthen) the 1h combination from above to 3/2h (4 orientations)
B. shorten (antagonise) the 1h to 1/2h by spinning in the opposite direction (2 orientations of the result)
C. lengthen the singlet state above to 1/2h again (also 2 orientations)
= 4+2+2 =8
Then if you get a fourth 1/2 spinner, you can use it to
A. lengthen the 3/2h to 2h (5 orientations)
B. shorten the 3/2h to 1h (3 orientations)
C. lengthen each of the 1/2h to 1h (two times 3 orientations)
D. shorten each of the 1/2h to 0 (two times 1 orientation)
= 5+3 +3+3 +1+1 =16
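The counting above can be double-checked with a short script (my own sketch, not from the video): it repeatedly applies the "lengthen/shorten" branching rule j → j ± 1/2 and totals the dimensions, which should always come out to 2^n.

```python
# Counting irreps when tensoring n copies of spin-1/2,
# following the "lengthen/shorten" rules described above.
from collections import Counter

def add_spin_half(irreps):
    """Tensor each spin-j irrep with spin-1/2: j -> j+1/2 and j-1/2."""
    out = Counter()
    for j, mult in irreps.items():
        out[j + 0.5] += mult          # "lengthen"
        if j > 0:
            out[j - 0.5] += mult      # "shorten" (impossible for j=0)
    return out

irreps = Counter({0.5: 1})            # start with one spin-1/2 spinor
for n in range(2, 5):
    irreps = add_spin_half(irreps)
    dim = sum(int(2 * j + 1) * m for j, m in irreps.items())
    print(n, dict(irreps), "total dim =", dim)
```

This reproduces the 3+1 = 4, 4+2+2 = 8, and 5+3+3+3+1+1 = 16 counts from the comment.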
Damn, I got flashbacks to the time when I was way too lazy to evaluate the Clebsch-Gordan coeffs on my own
Waiting for tomorrow
Representatinos ❤
I was like "haha I often make that same typo too!"
Oh. ):
It’s a very funny word lol, I love it.
@@somerandomperson569 Sounds like it's a particle, like "neutrino".
Representatinos for beginos! xD
To me, representatinos has several definite grammatical flavours derived from my own language: diminutive, masculine, plural, iterative, projective.
I don't know if you'll get to it, but at 12:10 this feels very similar to finding the whole hamiltonian in a multi-particle Hilbert space. Any relationship? Maybe since hamiltonians act as the generators of the time evolution operator?
I'm not sure exactly, since I haven't studied multi-particle QM much. But the form of the equation at 12:10 is standard for Lie algebras. And Hamiltonian operators live in Lie algebras, and they generate time translations as you say.
Thanks, it's so interesting. I'm starting to get a feel for what spin is, but I've reached my limits (the ladder operators need to be studied again). I'll surely need to go back to college...
Maybe for my retirement.
10000........wonderful.
thanks..
The captions in this one are less... deranged than in the last video. 8/10 stars
I'm guessing the previous video's captions gave it a 10/10.
What sources do you use?
If you do a unitary transformation on, let's say, a triplet state, can it end up in a singlet state?
Can the two transition into each other?
No, they remain separate. It's impossible to change the spin value j of a state using a rotation.
Waooooooo!!!
Great one! Took some insights with me today:
- About the right way to Lie-algebrize the Casimir operator
- Visual weight diagram multiplication
- Clebsch-Gordan tables as change of basis.
Thanks!
In Lie Algebra, what would be the difference between a vector space, tangent space, and tensor space??
I'm not sure I understand the question. The tangent space is the set of all tangent vectors at a point. What's a "tensor space"?
Video at 0:50 and 2:40 cleared up that question. Conceptual semantics matter and in this video possibly a little more than others. Thank you.
It turns out that even the higher-spin tensor product states are useful in classifying what states are there in the "particle zoo", and in figuring out the properties of nuclides when many protons and neutrons combine
Yeah, I may or may not make a video on SU(3) explaining that. Maybe in the 2nd half of 2024. Want to finish the spinor series first.
@@eigenchris This spinor series is in itself a feat, which I'm sure many grad students will use going forward
One last question:
If we start by tensor-producting two algebras g1⊗g2 and exponentiating to get members of the Lie group, is the result related to the Lie group members of the individual algebras?
Can we write exp(g1⊗g2) = exp(g1) ⊗ exp(g2) for example?
whoops turns out there's another question for the community.
For the tensor product of _three_ spin-1/2 states, there are _two_ spin-1/2 irreps.
How are they different?
For your first question, you can't write that. The equation exp(X+Y) = exp(X) exp(Y) only holds if X and Y commute. With "a⊗1 + 1⊗b", the two terms in the sum do commute: (a⊗1)(1⊗b) = (a⊗b) = (1⊗b)(a⊗1). This means we can write exp(a⊗1 + 1⊗b) = exp(a⊗1) exp(1⊗b).
For your second question, there isn't really a difference between the two spin-1/2 reps. They're just different subspaces of the overall 8D space; mathematically, they are identical.
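The commuting-factors identity in the reply above is easy to verify numerically. This is my own sketch (not from the video); the `expm` helper is a truncated Taylor series, which is fine here since the matrices are small.

```python
# Check that exp(a⊗1 + 1⊗b) = exp(a) ⊗ exp(b), which holds
# because a⊗1 and 1⊗b commute.
import numpy as np

def expm(M, terms=30):
    """Matrix exponential via truncated Taylor series (small matrices only)."""
    out = np.eye(M.shape[0], dtype=complex)
    term = np.eye(M.shape[0], dtype=complex)
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

# Two su(2)-style anti-Hermitian generators (my own choice of convention).
a = -0.5j * np.array([[0, 1], [1, 0]])       # ~ -i/2 * sigma_x
b = -0.5j * np.array([[0, -1j], [1j, 0]])    # ~ -i/2 * sigma_y
I = np.eye(2)

lhs = expm(np.kron(a, I) + np.kron(I, b))
rhs = np.kron(expm(a), expm(b))
print(np.allclose(lhs, rhs))  # True
```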
@@eigenchris In my other question you said states of different j can't transform into each other through unitary transformation.
For this spin-3/2 case, the two spin-1/2 have the same j but nevertheless different irreps. Can their states end up transforming into each other? Can we effectively lump the two together as one big spin-1/2 irrep?
@@GeoffryGifari The representations never mix. When you change basis from the 8 "tensor product" basis vectors to the 4+2+2 basis, the accompanying 8x8 unitary matrix will also change basis and become block-diagonal: 4x4, 2x2, 2x2. So the representations only transform into themselves.
Genuine question, because I've been watching your videos for years and I've never been able to figure this out. Is this your real voice or a generator??
It's my real voice lol.
Is commutativity lost in tensor product and direct sum?
They give different results when you flip the order (the components of the result vector come out in a different order), so they are not commutative.
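Numpy's `kron` makes the component reordering explicit (a minimal sketch of my own; the two results differ as vectors, though they're related by a permutation of the basis):

```python
# v⊗w vs w⊗v: same entries, different order of components.
import numpy as np

v = np.array([1, 2])
w = np.array([3, 4])

print(np.kron(v, w))  # [3 4 6 8]
print(np.kron(w, v))  # [3 6 4 8]
```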
11:48
21:00
Rocket science 🤤
Can you explain how you found sqrt(2) from 20:19 to 20:50? I'm confused
For j=1, m=1, the formula at the top right is: sqrt(1*(1+1) - 1*(1-1)) = sqrt(2). So lowering the |m=+1> state gives you an extra coefficient of sqrt(2).
For j=1, m=0, the formula at the top right is: sqrt(1*(1+1) - 0*(0-1)) = sqrt(2). So lowering the |m=0> state gives you an extra coefficient of sqrt(2) as well.
For j=1, m=-1, the formula gives sqrt(1*(1+1) - (-1)*(-1-1)) = 0, which means you can't lower the |m=-1> state, since it's the lowest state.
I cover this in the previous video here: ruclips.net/video/Q_RUDQkDsE0/видео.htmlsi=pDQgx81cPB1RvLh0&t=1630
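The coefficients in the reply above all come from the standard lowering formula J₋|j,m⟩ = √(j(j+1) − m(m−1)) |j,m−1⟩ (with ħ = 1); a minimal sketch:

```python
# Lowering-operator coefficients for the spin-1 (j=1) representation.
import math

def lowering_coeff(j, m):
    """Coefficient in J-|j,m> = sqrt(j(j+1) - m(m-1)) |j,m-1>, hbar = 1."""
    return math.sqrt(j * (j + 1) - m * (m - 1))

for m in (1, 0, -1):
    print(f"j=1, m={m:+d}: coefficient {lowering_coeff(1, m):.4f}")
```

This prints √2 for m = +1 and m = 0, and 0 for m = −1 (the bottom of the ladder).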
@@eigenchris thank you, i understand now
@@eigenchris Last question: how do you know that the singlet state is a spin-0 representation?
@@omega82718 1D representations are always spin-0 representations. Also, the Casimir operator returns the j=0 result.
@@eigenchris thanks
Hi @eigenchris I have some questions 🙏:
1) Why are the spin up/down vectors multiplied by the Lie algebra representations instead of the Lie group representations? I can interpret multiplying the spin state vectors by SU(2) representations as rotations, but how should I interpret multiplying the spin state vectors by su(2) representations instead?
2) Are the spin state vectors sort of "weight space vectors" as described in this video ruclips.net/video/2mGUckJY51A/видео.htmlsi=XfAOgPuwRGdoi9ff ?
3) Are the spin state vectors (especially the spin-1/2 ones) not the same spinor vectors from the previous videos in this series, since they're now being multiplied by Lie algebra generators rather than rotation matrices like SU(2) matrices?
4) It also seems meaningful to perform matrix multiplication between two Lie algebra representations beyond the Lie bracket. For example, g+ g- |up> = |up>, so g+ g- acts like the identity matrix in this case? And g+ g+ |up> = 0, so g+ g+ acts like a nilpotent multiplication where g+² = 0? Are we actually working with spin state vectors in the universal enveloping algebra?
1) The up/down states can be acted on by both the lie algebra matrices and the lie group matrices. I just happen to be focusing on the lie algebras in this video. Multiplying state vectors by su(2) representations gives results that are related to the expected values of spin in the xy, yz, and zx planes in quantum mechanics. Although in my convention, you need to multiply by i to make them hermitian instead of anti-hermitian.
2) Yes, spin eigenstates and "weight vectors" are the same thing.
3) They are the same as the states/spinors we've seen before, and they can still be rotated using SU(2) group matrices. The Lie Algebras matrices just give us new insights and new relationships between the states, and that's what I focus on in this video.
4) g+g- isn't quite the identity. It acts like an identity for |up> but it gives zero on |down>. You can work out the matrix for it pretty simply. It's this projector matrix:
[1 0]
[0 0]
However for the spin-1/2 rep, the combination g+g- + g-g+ is the identity. This is the anti-commutator of g+ and g-, as opposed to the commutator. I go over this in video 15 of this series.
The universal enveloping algebra still operates on the same vector space that the lie algebra does.
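The projector and anticommutator claims in the reply above can be checked directly in the spin-1/2 matrix representation (a minimal numpy sketch; the variable names are my own):

```python
# Spin-1/2 ladder matrices: g+ raises |down> to |up>, g- lowers |up> to |down>.
import numpy as np

g_plus  = np.array([[0, 1], [0, 0]])
g_minus = np.array([[0, 0], [1, 0]])

print(g_plus @ g_minus)                      # projector onto |up>: [[1,0],[0,0]]
print(g_plus @ g_minus + g_minus @ g_plus)   # anticommutator = identity
```

So g+g- acts as the identity only on |up> (it kills |down>), while the anticommutator {g+, g-} is the full identity, as stated in the reply.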
@@eigenchris Thanks for the detailed explanations for all of my questions 👍👍👍. Sorry for my late reply; somehow the notifications in my YouTube app aren't working for comment replies 🙏.
I still have a question about 4): are actions like g+ g- or g+ g+ truly matrix multiplications between Lie algebra elements, where supposedly only the Lie bracket is defined? If yes, then why do all Lie theory tutorials claim matrix multiplication is not defined between Lie algebra elements and only the Lie bracket is?
@@hellfirebb When you have a matrix representation of g+ and g-, you can multiply them as matrices. But in the purely abstract land of Lie Algebras, when you only have symbols, only the Lie brackets are defined. You don't get matrices until you pick a matrix representation.
This is probably not so clear because many videos (including mine, to some extent) don't distinguish between the abstract Lie algebra symbols and their matrix representations. Frequently, when I write g+ and call it a matrix, I really mean the matrix representation π(g+). The Lie bracket rules can be defined purely abstractly, without reference to any matrix representation.
@@eigenchris So I can perform matrix multiplication between two Lie algebra matrix representations even though the resulting matrix no longer represents a Lie algebra element?
@@hellfirebb Correct. Lie Algebra representations only require that the Lie bracket is preserved.
Last time I asked about your educational background and academic training in university. Could I also ask where you're from?
I have an undergrad degree in "engineering physics", but everything I've posted on this channel (tensors, general relativity, spinors), I learned on my own after my degree was finished. Also I'm from Canada.
Best Physics Teacher Number one
@@eigenchris What career options would this series and all the other stuff on your channel lead to?
@@martinnjoroge6006 Most of this stuff is geared towards theoretical physics. So it would either be theoretical physics research, or some other niche area. Tensors are a bit more commonplace, but I don't think you need to learn things like tensor products unless you really want to understand the geometry behind tensors.
I prefer Clifford algebras where circle-plus is just +.
Can't wait for the april fools video
The reason why this series wasn't named "spinors for beginnors" is absolutely beyond me.
Edit: what the actual fuck. I clicked away from this video, and the first thing my eye landed on was a video from eigenchris where the thumbnail said "spinors for beginnors". What the fuck
It would be nicer if you left references somewhere
read the description.
When dealing with scalar multiplication of a tensor product, N(v ⊗ w), would it be illegal to distribute a square root √N to v and another √N to w?
That's allowed.
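A quick numerical check of distributing the scalar (my own sketch, assuming N ≥ 0 so that √N is real):

```python
# N(v⊗w) = (√N v)⊗(√N w), since scalars slide across the tensor product.
import numpy as np

N = 9.0
v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0])

lhs = N * np.kron(v, w)
rhs = np.kron(np.sqrt(N) * v, np.sqrt(N) * w)
print(np.allclose(lhs, rhs))  # True
```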