Functional Analysis 15 | Riesz Representation Theorem

  • Published: 6 Jan 2025

Comments • 84

  • @lucaug10
    @lucaug10 4 years ago +25

    I'm loving the frequent videos! Such a happy surprise to open YouTube and see a new functional analysis video every day! :)

    • @brightsideofmaths
      @brightsideofmaths  4 years ago +4

      Thank you! I am working hard at the moment :)

    • @saadtahir96
      @saadtahir96 4 years ago

      @@brightsideofmaths Thank you for this! Can you please share your email address or message me at saadtahir96@gmail.com? I have some useful material that you may like, and it would ultimately also help me with this course! :D

    • @brightsideofmaths
      @brightsideofmaths  4 years ago +1

      @@saadtahir96 ruclips.net/user/brightsideofmathsabout

  • @dibeos
    @dibeos 4 years ago +5

    I don’t understand why (lambda)*x(hat)-x is in the Kernel of l... 7:08

    • @brightsideofmaths
      @brightsideofmaths  4 years ago +3

      If you apply the map l, you get zero. This is the same calculation as the one done in the blue brackets above.
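Written out, the calculation behind this reply is (with λ = l(x)/l(x̂), as used later in this thread):

```latex
l(\lambda \hat{x} - x)
  = \lambda\, l(\hat{x}) - l(x)
  = \frac{l(x)}{l(\hat{x})}\, l(\hat{x}) - l(x)
  = 0 ,
\qquad \text{so } \lambda \hat{x} - x \in \ker(l).
```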

    • @dibeos
      @dibeos 4 years ago +2

      Ahhh, got it! And by the way, thanks for the videos. They are really amazing. I even already signed up to your website on steadyhq.

    • @brightsideofmaths
      @brightsideofmaths  4 years ago +1

      @@dibeos Thank you very much :)

  • @psyspin
    @psyspin 4 years ago +8

    Congratulations man! This is an amazing intro to a topic that I like very much (although I am not a mathematician) but struggle to understand through self-study. It really helps me a lot! Again, congrats and keep up the good work :)

  • @NorwegianFr34k
    @NorwegianFr34k 4 years ago +15

    Not even joking, yesterday I was about to ask whether you would make a video on this topic. So this was a nice surprise :D

  • @RepTheoAndFriends
    @RepTheoAndFriends 3 years ago +4

    I know that this is a meme, but: I really enjoy the statements of basic functional analysis, because they resemble stuff from representation theory (a continuous function H -> F seems to be an analytic version of an exact functor from a 'nice' triangulated category T to k-Vect). For a certain triangulated category T one can show that K_0(T) is the power series ring k[[t]]. Altogether we obtain a map K_0(T) = k[[t]] -> k = K_0(k-Vect) (at least after tensoring with k). The statement is that every such functor which is continuous (i.e. exact and sends arbitrary coproducts to arbitrary coproducts) is representable (this is the Brown representability theorem).

  • @jorgearturomartinezsanchez4882
    @jorgearturomartinezsanchez4882 2 years ago +1

    Thank you, good sir. I'm writing my thesis and never took functional analysis, so your videos help a lot.

  • @arturo3511
    @arturo3511 1 year ago

    At 4:45, is it always true that, by continuity, the pre-image of a closed set is closed? You said that continuity translates to closed sets via complements. I don't understand what you mean by complements; is there an extra criterion for it to translate to closed sets? Or is it always true that if l is continuous, the pre-image of a closed set is closed? I'm simply asking this to know whether it's possible for the preimage of a closed set to be open, which wouldn't go against the definition of continuity we saw. Thank you!
    Additionally, it seems that at 5:16, x_l can be defined by any x^ (x-hat) that satisfies the given properties; is it true that only one x^ satisfies these properties, since x_l is unique?

    • @brightsideofmaths
      @brightsideofmaths  1 year ago

      By the abstract continuity we have: preimages of open sets are also open. This translates to: preimages of closed sets are also closed.
      Please also note that a set can be closed and open at the same time.
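Spelled out, the translation uses that preimages commute with complements (here K^c denotes the complement of K):

```latex
l^{-1}(K^{c}) = \bigl(l^{-1}(K)\bigr)^{c} :
\qquad
K \text{ closed}
\;\Rightarrow\; K^{c} \text{ open}
\;\Rightarrow\; l^{-1}(K^{c}) \text{ open}
\;\Rightarrow\; l^{-1}(K) \text{ closed}.
```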

  • @RangQuid
    @RangQuid 1 year ago

    The proofs are very elegant, they really bring out the beauty and bright side of mathematics!

  • @anne-catherine_gagne
    @anne-catherine_gagne 8 months ago

    Thank you! Your video really helped me understand the material better. I feel more confident for my final tomorrow.

  • @MikhailBarabanovA
    @MikhailBarabanovA 4 years ago

    Finally an answer to why we can just transpose vector space elements and they become OK as a functional. Thanks!

  • @AadityaVicramSaraf
    @AadityaVicramSaraf 1 year ago

    I'm unsure if I'm being dumb, but at 6:34, doesn't the complex conjugate come when we multiply the scalar in the second component? I might be confused, but kindly clarify.

    • @AadityaVicramSaraf
      @AadityaVicramSaraf 1 year ago

      In Wikipedia and in Conway's functional analysis as well, I saw that it is the conjugate for the second component and normal for the first. I checked out your previous video where you said linear in the second component, though.

    • @AadityaVicramSaraf
      @AadityaVicramSaraf 1 year ago

      Ok, sorry, I rewatched that video and found at 6:18 that you clarified that you had chosen this definition. I also understood that eventually it is there to ensure positivity, so it is our choice to have linearity in the first (or second) argument. Thanks. Stuff is much clearer now.

    • @brightsideofmaths
      @brightsideofmaths  1 year ago

      Great :)

  • @tensorfeld295
    @tensorfeld295 4 years ago +20

    Can you do a course on differential geometry? Starting elementary, then continuing with manifolds. Maybe you can do something with Banach and Hilbert manifolds. Would be nice! ^^

  • @StratosFair
    @StratosFair 2 years ago

    Wanted to give myself a quick refresher on the proof of the Riesz representation theorem, and this was extremely clear and helpful, just like I remembered it to be!
    I hope you will get the chance to cover orthogonal projections at some point.

  • @lonjezosithole6285
    @lonjezosithole6285 3 years ago

    I am learning a lot from your videos, man. Thank you for posting this content

  • @Hold_it
    @Hold_it 4 years ago +4

    I hope you still get enough sleep with all these high quality videos coming out in a short time ;)

  • @mathieumaticien
    @mathieumaticien 3 years ago +2

    Why does the closedness of ker(l) imply that ker(l)^ortho is nontrivial?

    • @brightsideofmaths
      @brightsideofmaths  3 years ago

      We also have the assumption that ker(l) is not the whole space. Hence closedness means that ker(l) is a closed proper subspace and itself a Hilbert space inside the Hilbert space X. Does this already help you?

    • @mathieumaticien
      @mathieumaticien 3 years ago

      @@brightsideofmaths hmmm, now I'm wondering why the closedness is necessary. If we say ker(l) is a strict subset of X, k is in ker(l), and we let x be in X but not in ker(l), then <x, k> = 0 by definition, so x is in ker(l)^ortho. Since 0 is in ker(l) and we defined x not to be in ker(l), x is not 0, and ker(l)^ortho is nontrivial.
      Where does the closedness of ker(l) come into play?

    • @hanfsi
      @hanfsi 3 years ago

      ​@@mathieumaticien To even be able to split the whole space into a subspace and its orthogonal complement, you need to apply the Hilbert projection theorem (which is done implicitly in the video), and that theorem requires a closed subspace (just look at its proof). So it's really a condition imposed by that theorem if you want to be able to split the space up in the first place.
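A standard counterexample (not from the video) for what goes wrong without closedness: for a discontinuous (unbounded) linear functional l on a Hilbert space X, the kernel is a dense proper subspace, and since orthogonality only sees the closure,

```latex
\overline{\ker(l)} = X
\quad\Longrightarrow\quad
\ker(l)^{\perp} = \bigl(\overline{\ker(l)}\bigr)^{\perp} = X^{\perp} = \{0\},
```

so no nonzero vector orthogonal to the kernel exists, and the construction in the proof breaks down.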

  • @hyperduality2838
    @hyperduality2838 2 years ago

    Domain (pre-image) is dual to the co-domain (image) -- rank nullity theorem in linear algebra.
    Isomorphism (sameness) is dual to homomorphism (similar or relative sameness) -- Group Theory.

  • @scollyer.tuition
    @scollyer.tuition 3 years ago

    In a finite dimensional Euclidean space, we often represent linear functionals via row vectors, which map column vectors into the underlying field via a dot/inner product. I guess the Riesz Representation Theorem guarantees:
    a) that this operation can be justified rigorously
    b) that the analogue of this operation in infinite dimensional vector spaces also exists
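The finite-dimensional picture in (a) can be checked directly. A minimal sketch in Python/NumPy, where the functional l is a made-up example (not from the video): the representing vector is read off from the standard basis.

```python
import numpy as np

# A hypothetical linear functional on R^3, given only as a black box.
def l(x):
    return 2.0 * x[0] - 1.0 * x[1] + 0.5 * x[2]

# Riesz in finite dimensions: the representing vector has components
# (x_l)_i = l(e_i), so that l(x) = <x_l, x> for every x.
x_l = np.array([l(e) for e in np.eye(3)])

# Check the representation against the black-box functional.
x = np.array([1.0, 2.0, 3.0])
assert np.isclose(l(x), x_l @ x)
print(x_l)  # the row-vector representation of l
```

In infinite dimensions this recipe no longer works coordinate-wise, which is exactly where the completeness and continuity assumptions mentioned in the reply below come in.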

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +2

      I think that is a short, rough summary one can always have in mind.
      However, in infinite-dimensional spaces some technical details are involved as well: we need completeness, for example, and the dual space consists of *continuous* functionals.

  • @qiaohuizhou6960
    @qiaohuizhou6960 3 years ago +3

    Hi, thank you so much for your video! I am sorry if I throw too many questions at you on the same day... I am wondering if you could please share some insight on why x_l must belong to the orthogonal complement of the kernel of l? I know the kernel is a subspace of a vector space, and I know the row space (or column space) is orthogonal to the null space. I can sort of follow every step up to l(x) = <x_l, x>, but I don't get the insight behind choosing x_l from the orthogonal complement of the kernel. Also, it seems this special x_l is analogous to a singular vector in a finite-dimensional space... are these two concepts somehow connected?
    Sorry, I didn't major in maths and have a very limited background in all sorts of maths subjects. I hope you don't find my question naive or lacking in basic understanding. I would be glad if you could point me in the right direction of study!

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +2

      Don't worry at all. All questions are welcome here; even naive ones can help other viewers quite a lot.
      The choice of x_l makes sense here because the inner product has to send all elements of ker(l) to 0 as well. This is exactly what the inner product can then do.
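A short computation behind this reply, assuming the video's convention l(x) = ⟨x_l, x⟩: if the representation holds for every x, then in particular on the kernel,

```latex
\langle x_l, k \rangle = l(k) = 0
\qquad \text{for all } k \in \ker(l),
```

which is exactly the statement x_l ∈ ker(l)^⊥.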

  • @zaccandels6695
    @zaccandels6695 6 months ago

    Excellent video

  • @zazinjozaza6193
    @zazinjozaza6193 4 years ago +2

    Wow this was a really cool topic, can't wait to see the applications.

  • @luciaperez4400
    @luciaperez4400 1 year ago

    Excellent video! Could you give a reference for the proof that the orthogonal complement of a closed set in a Hilbert space contains elements other than 0?

  • @moritzbecker5703
    @moritzbecker5703 4 years ago

    Thank you very much for your excellent videos!

  • @hoijanlai
    @hoijanlai 3 years ago

    The course I am taking also has a step that proves that the dimension of the orthogonal complement of ker(l) is 1. Do you know why that is? Thanks
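One standard argument, not from the video: for l ≠ 0, any two vectors u, v in ker(l)^⊥ are linearly dependent, because the combination

```latex
w := l(v)\, u - l(u)\, v
\qquad\Longrightarrow\qquad
l(w) = l(v)\, l(u) - l(u)\, l(v) = 0 ,
```

so w lies in ker(l) as well as in the subspace ker(l)^⊥, hence w = 0. This forces dim ker(l)^⊥ = 1.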

  • @tigernov_425
    @tigernov_425 2 years ago

    Why is the bound on the norm of l given by the norm of l(x^) for the unit vector x^ of X?

  • @Domzies
    @Domzies 4 years ago

    3:50 has made me realise I didn't understand this. The professor in my functional analysis course did say that the theorem wouldn't work without X being a Hilbert space, but he didn't explicitly say why. Judging from your video, I also probably don't understand orthogonal projectors as well as I'd like to. I've tried looking into the book Functional Analysis by Peter Lax, but got even more confused. There it almost seems like you need a vector subspace (not just an arbitrary set) in order to even define an orthogonal complement. Besides this, it would seem that the classical relation from linear algebra, namely that X = Y directsum Y^ortho for any vector subspace Y, only holds true in a general Hilbert space if Y is closed?
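A standard example (my addition, not quoted from Lax or the video) showing that the direct-sum relation really needs Y closed: take Y = c_00, the finitely supported sequences, inside ℓ². Y is a dense proper subspace, so

```latex
Y^{\perp} = \{0\},
\qquad
Y \oplus Y^{\perp} = Y \subsetneq \ell^{2} ,
```

whereas the Hilbert projection theorem gives X = Y ⊕ Y^⊥ for every closed subspace Y.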

  • @anowarali668
    @anowarali668 1 year ago

    Thanks for the video. My doubt is: whenever you enter l(x^) in the inner product, you take the conjugate of l(x^). Why? We know the conjugate comes if we take a scalar out of the second term of the inner product. Please clear this up.

    • @brightsideofmaths
      @brightsideofmaths  1 year ago

      I defined the conjugate in the first term of the inner product.

    • @anowarali668
      @anowarali668 1 year ago

      @@brightsideofmaths Is it not against the inner product formula, since we know <ax, y> = a<x, y> and <x, by> = b*<x, y>?

    • @brightsideofmaths
      @brightsideofmaths  1 year ago

      @@anowarali668 What is not against it?

    • @anowarali668
      @anowarali668 1 year ago

      @@brightsideofmaths l(x^)

    • @brightsideofmaths
      @brightsideofmaths  1 year ago

      @@anowarali668 As I said: we defined the inner product with the property <x, by> = b<x, y>.
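The two conventions discussed in this thread can be checked numerically. A small sketch with NumPy, whose vdot conjugates its first argument, matching the video's convention (conjugate-linear in the first slot, linear in the second):

```python
import numpy as np

# Inner product with the conjugate in the FIRST argument:
# <u, v> = sum(conj(u) * v), which is exactly what numpy's vdot computes.
def ip(u, v):
    return np.vdot(u, v)

x = np.array([1 + 2j, 3 - 1j])
y = np.array([2 - 1j, 0 + 1j])
b = 2 + 3j

# Linear in the second argument: <x, b*y> = b * <x, y>
assert np.isclose(ip(x, b * y), b * ip(x, y))
# Conjugate-linear in the first: <b*x, y> = conj(b) * <x, y>
assert np.isclose(ip(b * x, y), np.conj(b) * ip(x, y))
print("both identities hold")
```

Texts like Conway's use the mirror-image convention (linear in the first argument), which is why the formulas there carry the conjugate on the other slot.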

  • @h-bar8649
    @h-bar8649 1 year ago

    No clue where you would put it, but it would be great if somehow the Fréchet and Gâteaux derivatives were discussed in this Functional Analysis series. Unless you think they belong elsewhere? Thanks for the videos!

  • @JR-iu8yl
    @JR-iu8yl 2 years ago

    Thank You

  • @xwyl
    @xwyl 2 years ago

    With your constructed x\hat, the proof goes like a knife through butter. But it raises a bigger question: how did you come up with the construction?

    • @brightsideofmaths
      @brightsideofmaths  2 years ago

      Thanks. We know the start and the goal. One just tries to fill in the gaps and finds x_l.

    • @xwyl
      @xwyl 2 years ago +1

      @@brightsideofmaths I'm trying to understand this without any construction (these constructions were perhaps invented after the theorem was proven, and may hinder deeper understanding).
      The original inspiration may be the Euclidean space R^n. Consider a vector r in R^3, r = (x,0,0) + (0,y,0) + (0,0,z). When we study l(r), taking l((x,0,0)) for example, the linearity of l implies that l((x,0,0)) is just a multiple of x; therefore l(r) is just <v, r> for some fixed vector v. This also implies that dim(ker(l)) = n-1 for the space R^n.
      Knowing that x_l exists, any vector is then the sum of a part parallel to x_l and a part orthogonal to it. So it's natural to propose the unified parallel component x\hat (meaning x_l is a multiple of x\hat), and the parallel part is easily l(x)/l(x\hat) * x\hat = \lambda * x\hat.
      The next big leap is 6:17, where <x_l, x> is miraculously put there. It's natural to approach it from <x_l, x> = <a * x\hat, x>: comparing with the equation l(x) = \lambda * l(x\hat) and knowing that x_l is a multiple of x\hat, say x_l = a * x\hat, we finally get <x_l, x> = \lambda * l(x\hat), solve for a = l(x\hat), i.e. x_l = a * x\hat = l(x\hat) * x\hat. And this process can be generalized to Hilbert spaces.
      Sorry for the messy writing, but the reasoning is completely natural without any prior construction, all from what we already have in the derivation. I prefer this derivation because it's more basic and learner-friendly.

  • @weirdo-jw9kc
    @weirdo-jw9kc 4 years ago

    Do a series on topology and algebra too. If you have done one before, please share the link. I like how you present the ideas; it gives the right intuition.

  • @hectormerinocruz7965
    @hectormerinocruz7965 4 years ago

    A beautiful and very well explained proof of this important theorem.

  • @pan19682
    @pan19682 2 years ago

    We are looking forward to you giving us a video series on topology.

  • @jaimelima2420
    @jaimelima2420 4 years ago

    Thank you so much!

  • @chenliou2578
    @chenliou2578 4 years ago +1

    Thx

  • @Zero-es-natural
    @Zero-es-natural 4 years ago

    Great video!

  • @ecologypig
    @ecologypig 2 years ago

    Thanks for your super helpful videos! 😀 I have a quick question: how do we know that $x_l := l(\hat{x}) \hat{x}$ is still inside the set $X$? Since we have scaled $\hat{x}$ by $l(\hat{x})$, and the scaling might be large, so it could be that $x_l$ now lies outside of the set $X$?

    • @brightsideofmaths
      @brightsideofmaths  2 years ago

      X is not just a set but a vector space. Hence you can never leave it just by scaling :)

    • @ecologypig
      @ecologypig 2 years ago

      @@brightsideofmaths oh got you! Thanks very much for your quick reply!😃

  • @TheWombatGuru
    @TheWombatGuru 4 years ago

    Thank you for this video :)

  • @munausef3891
    @munausef3891 3 years ago

    Thanks, good explanation. Can you give me a site with questions to solve? Thx

  • @RohanKumar-zn4qg
    @RohanKumar-zn4qg 4 years ago

    Can you share your slides?

    • @brightsideofmaths
      @brightsideofmaths  4 years ago +1

      Oh sorry! I totally forgot. Now they are all in :) steadyhq.com/en/brightsideofmaths/posts/c6641292-1666-4a24-a4b9-cd9c4147d7d3

    • @RohanKumar-zn4qg
      @RohanKumar-zn4qg 4 years ago

      @@brightsideofmaths it is asking for member access... please share on some open source platform

    • @brightsideofmaths
      @brightsideofmaths  4 years ago +2

      @@RohanKumar-zn4qg PDFs are a perk for my Steady members.

  • @lorenzougo6571
    @lorenzougo6571 11 months ago

    want to cry, calculus 3 incoming hahahah

  • @sanjursan
    @sanjursan 3 years ago

    My proof is much simpler than this. Of course, it is wrong!

  • @JaspreetSingh-zp2nm
    @JaspreetSingh-zp2nm 3 months ago

    Why does the orthogonal complement, being closed, have to contain something other than the zero vector? Closed is something topological, so I am confused here. In finite dimensions the Gram–Schmidt process may help, but in general I am not sure.

    • @brightsideofmaths
      @brightsideofmaths  3 months ago

      The orthogonal complement is always a closed set. So maybe you can clarify your question?
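The reply's claim follows in one line from continuity of the inner product: for any set S,

```latex
S^{\perp} \;=\; \bigcap_{s \in S} \{\, x : \langle s, x \rangle = 0 \,\} ,
```

an intersection of preimages of the closed set {0} under continuous maps, hence closed, whatever S is.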