Brittany Hamfeldt
  • 76 videos
  • 131,968 views
Advanced Calculus - The Lebesgue Integral
Math 481: Advanced Calculus
The Lebesgue Integral
April 25, 2023
This is a lecture on "The Lebesgue Integral" given as part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester.
Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf
Resources:
Cheney, Analysis for Applied Mathematics
This recorded lecture was supported by NSF DMS-1751996.
Views: 437

Videos

Advanced Calculus - The Lebesgue Measure
263 views · a year ago
Math 481: Advanced Calculus The Lebesgue Measure April 20, 2023 This is a lecture on "The Lebesgue Measure" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Cheney, Analysis for Applied Mathematics This recorded lecture was suppor...
Advanced Calculus - Continuous Functions on Metric Spaces
132 views · a year ago
Math 481: Advanced Calculus Continuous Functions on Metric Spaces April 18, 2023 This is a lecture on "Continuous Functions on Metric Spaces" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis T...
Advanced Calculus - Compact Sets of Continuous Functions
86 views · a year ago
Math 481: Advanced Calculus Compact Sets of Continuous Functions April 13, 2023 This is a lecture on "Compact Sets of Continuous Functions" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis Thi...
Advanced Calculus - Compact Sets in Metric Spaces
76 views · a year ago
Math 481: Advanced Calculus Compact Sets in Metric Spaces April 11, 2023 This is a lecture on "Compact Sets in Metric Spaces" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This recorded lec...
Advanced Calculus - Convergence in a Metric Space
82 views · a year ago
Math 481: Advanced Calculus Convergence in a Metric Space March 30 2023 This is a lecture on "Convergence in a Metric Space" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This recorded lect...
Advanced Calculus - Introduction to Metric Spaces
178 views · a year ago
Math 481: Advanced Calculus Introduction to Metric Spaces March 28, 2023 This is a lecture on "Introduction to Metric Spaces" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This recorded lec...
Advanced Calculus - Change of Variables Theorem
99 views · a year ago
Math 481: Advanced Calculus Change of Variables Theorem March 23, 2023 This is a lecture on "Change of Variables Theorem" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This recorded lecture...
Advanced Calculus - Change of Volume Under Linear Transformations
65 views · a year ago
Math 481: Advanced Calculus Change of Volume Under Linear Transformations March 21, 2023 This is a lecture on "Change of Volume Under Linear Transformations" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to...
Advanced Calculus - Iterated Integrals
75 views · a year ago
Math 481: Advanced Calculus Iterated Integrals March 9, 2023 This is a lecture on "Iterated Integrals" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This recorded lecture was supported by N...
Advanced Calculus - Multiple Integrals Over Regions
52 views · a year ago
Math 481: Advanced Calculus Multiple Integrals Over Regions March 7, 2023 This is a lecture on "Multiple Integrals Over Regions" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This recorded ...
Advanced Calculus - Multiple Integrals Over Rectangles
91 views · a year ago
Math 481: Advanced Calculus Multiple Integrals Over Rectangles March 2, 2023 This is a lecture on "Multiple Integrals Over Rectangles" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This rec...
Advanced Calculus - Proof of the Implicit Function Theorem
206 views · a year ago
Math 481: Advanced Calculus Proof of the Implicit Function Theorem February 28, 2023 This is a lecture on "Proof of the Implicit Function Theorem" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analy...
Advanced Calculus - Implicit Function Theorem
319 views · a year ago
Math 481: Advanced Calculus Implicit Function Theorem February 16, 2023 This is a lecture on "Implicit Function Theorem" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysis This recorded lecture ...
Advanced Calculus - Proof of the Inverse Function Theorem
593 views · a year ago
Math 481: Advanced Calculus Proof of the Inverse Function Theorem February 14, 2023 This is a lecture on "Proof of the Inverse Function Theorem" given as a part of Brittany Hamfeldt's class Math 481: Advanced Calculus at New Jersey Institute of Technology in the Spring 2023 semester. Syllabus: web.njit.edu/~bdfroese/Math481_Spring2023_syllabus.pdf Resources: Trench, Introduction to Real Analysi...
Advanced Calculus - Inverse Function Theorem
381 views · a year ago
Advanced Calculus - Vector-Valued Transformations
102 views · a year ago
Advanced Calculus - Taylor's Theorem
179 views · a year ago
Advanced Calculus - The Chain Rule
196 views · a year ago
Advanced Calculus - Differentiability
174 views · a year ago
Advanced Calculus - Heine-Borel
720 views · a year ago
Introductory Mathematical Analysis - Power Series
299 views · 2 years ago
Introductory Mathematical Analysis - Series of Functions
361 views · 2 years ago
Introductory Mathematical Analysis - Sequences of Functions
743 views · 2 years ago
Numerical Methods I - Least Squares Approximations
349 views · 2 years ago
Numerical Methods I - Spline Interpolation
173 views · 2 years ago
Numerical Methods I - Divided Differences
65 views · 2 years ago
Numerical Methods I - Lagrange Polynomial Interpolation
110 views · 2 years ago
Numerical Methods I - Systems of Nonlinear Equations and Function Approximation
160 views · 2 years ago
Numerical Methods I - Fixed Point Iterations
83 views · 2 years ago

Comments

  • @ebenezerasiama5868 · 14 days ago

    video quality is the main issue here

  • @smu160 · 3 months ago

    Thank you for making this lecture available. How would the equation b_{ij*} = p_{j*} + (v_{ij*} - w_{ij*}) + \epsilon work if we include a 3rd best profit?

  • @binshuaiwang5871 · 3 months ago

    Thanks for sharing this great video! Which paper by Lévy is referred to in the video? Is it "A Numerical Algorithm for L2 Semi-Discrete Optimal Transport in 3D"?

  • @yanhaitao · 3 months ago

    Hello, Professor. May I ask: do you have organized lecture notes for this course? Would you be willing to share them?

  • @GirinChutia-qr4pt · 8 months ago

    Great lecture! Thank you.

  • @dc_daily4514 · 9 months ago

    Thank you Prof. Hamfeldt. I have a question at 53:25: where did the o(1) term go? After integrating over X, we still have the o(1) term left, right?

  • @ahanadeb3148 · a year ago

    At 38:23, shouldn't it be Total Mass coming from x, not X? Great video btw, thank you.

  • @filippoelgorni3238 · a year ago

    Dr. Hamfeldt, please be aware: this lecture series, for free, on YouTube, is an absolute godsend! I will forever be thankful.

  • @zichaoyu · a year ago

    Thank you for sharing; what a nice lecture!

  • @edisonmucllari7904 · a year ago

    Hi, thank you so much for the lectures! I have a question: why is it true that \pi shows up in a linear way in the Kantorovich problem? Thank you so much.

    • @brittanyhamfeldt · a year ago

      This formulation requires computing the cost c(x,y) between ALL possible pairs of points x and y. There are no unknowns in this step. Then the unknown pi(x,y) (the transport plan) essentially multiplies this and determines how much weight is assigned to each pair (i.e. how much mass is moved from x to y).
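      In the discrete setting this linearity is easy to see explicitly (standard discrete Kantorovich formulation, restated here for reference; \mu_i and \nu_j denote the masses at x_i and y_j): the costs c(x_i, y_j) are precomputed numbers, and the objective is linear in the unknown weights \pi_{ij}.

```latex
\min_{\pi_{ij} \ge 0} \; \sum_{i,j} c(x_i, y_j)\, \pi_{ij}
\qquad \text{subject to} \qquad
\sum_j \pi_{ij} = \mu_i, \quad \sum_i \pi_{ij} = \nu_j .
```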

  • @kamalhaider397 · a year ago

    Which book is recommended for this course if someone is watching online? Thanks

    • @brittanyhamfeldt · a year ago

      I was using Trench, Introduction to Real Analysis, which is freely available online.

    • @kamalhaider397 · a year ago

      @@brittanyhamfeldt Thanks for your prompt reply; I have it.

  • @beauzeta1342 · a year ago

    Thank you professor. On the general discrete problem (K), the entropy regularization of Lec. 18 still applies, doesn't it? Additionally, given the sparsity observation, could we add an L1-norm regularization on P_{ij}?

    • @brittanyhamfeldt · a year ago

      Yes, entropy regularisation would still apply in this setting. And good observation - L1 based minimisation tools are certainly reasonable for trying to capture the sparsity.

    • @beauzeta1342 · a year ago

      @@brittanyhamfeldt Thank you professor.

  • @Considerationhhh · a year ago

    Hi Dr. Hamfeldt, thank you very much for the great lectures! Do you have the lecture notes posted online by any chance?

    • @brittanyhamfeldt · a year ago

      Unfortunately I don't have any clean lecture notes available.

  • @ariel415el · a year ago

    It seems like the continuity equation comes from somewhere else in the course. Can someone explain why, once we move to Wasserstein spaces, we need new terms like a vector field and a divergence? Where does it come from, and why is v what we are looking for?

    • @brittanyhamfeldt · a year ago

      We used the continuity equation in a previous lecture on the "Benamou-Brenier Formulation", and showed how to connect this to optimal transport. Now we are looking at gradient flow schemes and trying to show that they can be interpreted as a similar kind of flow satisfying a continuity equation, which requires some v in order to be fully defined.

    • @tobiassugandi · 7 months ago

      The continuity equation is just "mass conservation": the time rate of change at a specific location must be balanced by the flux coming in and out (the divergence term involving the vector field). The gradient flow here must satisfy continuity because we are transporting a pdf whose integral must always equal 1 ("the mass of the pdf must be conserved").
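      In symbols, the continuity equation being described is the standard one, with density \rho and velocity field v; the second identity (conservation of total mass) follows when \rho v vanishes at the boundary or at infinity:

```latex
\partial_t \rho + \nabla \cdot (\rho\, v) = 0 ,
\qquad
\frac{d}{dt} \int \rho(t,x)\, dx = 0 .
```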

  • @punditgi · a year ago

    Excellent video! 😊🎉

  • @cmdcs1 · a year ago

    Does this lecture follow from the one on Heine-Borel?

    • @brittanyhamfeldt · a year ago

      Yes, this would come next after Heine-Borel. There is a bit of intermediary material on limits assumed (but, unfortunately, not recorded due to technical issues), but this is not essential if you have taken a first course on analysis.

  • @cmdcs1 · a year ago

    Thanks for posting

  • @dassorajit · a year ago

    Hey, nice lecture. I am from India and liked this video; it's very helpful to us. Higher secondary science students face great difficulty with these topics, which are asked in various competitive exams in India after 12th class for admission into engineering colleges.

  • @toyomicho · a year ago

    Great lecture, thank you for posting. Nitpick: there's probably a typo in the title; it should read "Calculus" rather than "Calclulus".

  • @cmdcs1 · a year ago

    Thanks for uploading. I think the sound cuts out around 33:45, at least it does for me 🙁

    • @toyomicho · a year ago

      Sound cuts out around 33:45 and doesn't come back until around 1:06:48 ☹

    • @brittanyhamfeldt · a year ago

      Thanks for letting me know. It appears, unfortunately, that the microphone died at that point, and there is a period of silence before they were able to switch over to the backup.

  • @养兔大户 · a year ago

    Hi Brittany, may I ask which textbook you refer to for these lessons? Thank you!

  • @AI_For_Scientists · a year ago

    very nice lecture. thank you for sharing!

  • @supplychainoperationsresearch

    this is now my favorite area of math

  • @AmitSingh-jo8ob · 2 years ago

    Let S_n be a bounded and monotonically decreasing sequence. Then the limit superior will be a lower bound of the sequence, right?

  • @AmitSingh-jo8ob · 2 years ago

    Do we actually need a limit value L in order to show that the Riemann integral exists? Can't we start by saying that there is some L (possibly unknown) and just bound the error terms?

    • @brittanyhamfeldt · 2 years ago

      No, we don't need to know the value (and often don't in practice), we just need to be able to assert that such a value L exists.

  • @AmitSingh-jo8ob · 2 years ago

    Dear Prof, do we have documented takeaways for this lecture series? It may not be possible to always remember the complete math, but takeaways can help. Also, I think Taylor's expansion works fine in practice if |x-x_0|<1; otherwise the error term will increase exponentially.

  • @AmitSingh-jo8ob · 2 years ago

    Dear Prof, can one access handouts or scribble notes and assignments?

    • @brittanyhamfeldt · 2 years ago

      Unfortunately I don't have any notes available at this time. For practice problems, I recommend the exercises in Trench, "Introduction to Real Analysis", which is freely available online.

  • @AmitSingh-jo8ob · 2 years ago

    Do we have to do a proof by contradiction even after proving that the hypotheses are correct? I couldn't understand the motivation behind that.

    • @brittanyhamfeldt · 2 years ago

      We only needed to do this proof by contradiction argument one time: to prove that "mathematical induction" works. That is, we proved by contradiction that anytime the hypotheses hold (base step and induction step), then the conclusion automatically holds as well for all n. From this point on, we know that mathematical induction works. Given a particular problem, all we need to do is demonstrate that the hypotheses are true (base step and induction step). And then we are done: we are guaranteed that the conclusion holds for all n. There is no need for an additional proof by contradiction.

  • @bibek2599 · 2 years ago

    Very precise and clear instruction. Just wondering: is this the same field of optimal transport that Prof. Cédric Villani works in?

    • @brittanyhamfeldt · 2 years ago

      Indeed - actually, Villani wrote one of the main textbooks I used as a reference for this (Topics in Optimal Transportation).

  • @punditgi · 2 years ago

    Great stuff! 😃

  • @UgenrwotRonnieTimothy · 2 years ago

    👏👏👏

  • @Jason-sq7cc · 2 years ago

    Thanks for such an amazing course! I have a question: is cyclical monotonicity a sufficient condition for an optimal mapping? If so, why?

  • @jameschen2308 · 2 years ago

    Mind. Blown. 🤯

  • @jameschen2308 · 2 years ago

    Hi Professor Hamfeldt. At 15:56, you claim that if, for all neighborhoods of x_0, there exists an open set s.t. phi(x)<u(x)=0 in the open set, then grad phi(x_0)!=0. How can I show this? Isn't the contrapositive, that derivative at a point =0 implies that there exists a neighborhood s.t. the function is constant, not necessarily true?

    • @brittanyhamfeldt · 2 years ago

      The missing ingredient here is that phi is convex. Let's restrict everything to the line segment in the direction -x_0 for simplicity. This is our setting: u(x) = 0 for all x < x_0 and phi(x_0) = 0 and phi"(x) >= 0 for all x. Now let's suppose that also phi(x)<0 for all nearby x < x_0. Taylor expand. Then for some c: 0 > phi(x) = phi'(x_0)(x-x_0) + phi"(c)(x-x_0)^2/2 >= phi'(x_0)(x-x_0) for all nearby x < x_0. Since there is a strict inequality involved, it must be the case that phi'(x_0) is not equal to 0.
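      In display form, the Taylor-expansion step above reads (same symbols; c is the intermediate point from Taylor's theorem, and phi(x_0) = 0 so no constant term appears):

```latex
0 > \phi(x) = \phi'(x_0)(x - x_0) + \tfrac{1}{2}\phi''(c)(x - x_0)^2
            \ge \phi'(x_0)(x - x_0)
\quad \text{for nearby } x < x_0,
```

      and since the first inequality is strict, it forces \phi'(x_0) \neq 0.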

    • @jameschen2308 · 2 years ago

      @@brittanyhamfeldt ohhhh thank you very much professor!!

  • @jameschen2308 · 2 years ago

    This is one of the best lecture series on applied math on the Internet! It makes optimal transport theory and its modern applications so much more attainable than the other resources online!

  • @lonjezosithole6237 · 2 years ago

    Thank you, Brittany, for these videos. Extremely useful! And you are a fantastic teacher!

  • @yacinemokhtari7531 · 2 years ago

    Thank you for the great lecture. I have just one question: at 18:30, you made the substitution y = X(t,x); naturally, we have to invert the transform with respect to x and then compute the Jacobian of the inverse of X with respect to x. Am I mistaken? Thank you again.

    • @brittanyhamfeldt · 2 years ago

      Hi, I'm not sure if I'm 100% understanding the question correctly so feel free to follow up. Indeed, we end up inverting the transform with respect to x in order to write down a candidate velocity field. Then you can verify directly that v(t, X(t,x)) = (T-I) (y^{-1}(y(x))) = T(x) - x, which ensures that X(1,x) coincides with the optimal map. Note that if f and g are nice enough, then the Jacobian of T is positive definite (T is the gradient of a convex function) and so is the Jacobian of y.

    • @yacinemokhtari7531 · 2 years ago

      @@brittanyhamfeldt Thanks for the answer. The problem is that I'm not finding the Jacobian in the integral after the variable substitution.

    • @brittanyhamfeldt · 2 years ago

      @@yacinemokhtari7531 This is where we make the substitution s(x) = X(t,x)? Here I am not using the change of variables formula; I am using the fact that s # f = \rho. One way of characterizing the pushforward is that \int \phi(y) \rho(y) dy = \int \phi(s(x)) f(x) dx for every continuous compactly supported function \phi(y).

    • @yacinemokhtari7531 · 2 years ago

      Thank you Brittany for the clarifications. Great lectures

  • @rickmcn1986 · 2 years ago

    Your optimal transport videos are terrific. Thanks for uploading more videos, I'm sure they will help me a lot in the future.

    • @jiezhang2571 · 2 years ago

      Same! Her OT videos helped me a lot

  • @limp_crimpet · 2 years ago

    This is pretty formal, right, since the Kantorovich potential $u$ will depend on $\epsilon$ (it is between $g$ and $f+\epsilon \chi$)?

    • @brittanyhamfeldt · 2 years ago

      Indeed, u depends on epsilon, and you need to appeal to stability of the Kantorovich potential in order to fill in the details. I recommend Santambrogio's book (section 7.2) for a more detailed discussion of the first variation.

    • @limp_crimpet · 2 years ago

      @@brittanyhamfeldt thanks for the ref. Great lectures by the way :)

  • @Ethan-lz5rw · 2 years ago

    Hello professor, thanks for the great video! I have a question regarding cyclical monotonicity. Is cyclical monotonicity closed under functional composition (suppose all involved measures are absolutely continuous)? This is true in R^1, since the composition of two non-decreasing maps is still non-decreasing. I am thinking about whether we have similar properties for cyclical monotonicity in general R^d. Thanks!

    • @brittanyhamfeldt · 2 years ago

      Great question, this actually does not work in R^d. For example, let's consider two linear maps T(x) = Ax, S(y) = By with A and B both symmetric positive definite. These are cyclically monotone since they can be written as the gradient of a convex function (1/2 * x^TAx and 1/2 * y^TBy respectively). Consider the composition of the maps: R(x) = S(T(x)) = BAx. In general, the matrix product BA need not be symmetric, and so the map is not the gradient of any function (much less a convex one), thus not cyclically monotone.
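      A quick numerical sanity check of this counterexample (a sketch; the particular matrices are arbitrary SPD choices, not taken from the lecture):

```python
import numpy as np

# Two symmetric positive definite matrices: T(x) = Ax and S(y) = By are
# each the gradient of a convex quadratic, hence cyclically monotone.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = np.array([[3.0, 0.0],
              [0.0, 1.0]])

# The composition R(x) = S(T(x)) = (BA) x.
BA = B @ A

# BA is not symmetric, so R is not the gradient of any potential,
# let alone a convex one -- hence not cyclically monotone.
print(BA)
print("symmetric:", np.allclose(BA, BA.T))
```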

    • @Ethan-lz5rw · 2 years ago

      @@brittanyhamfeldt Thanks for the clarification, it makes sense! I have a follow-up question. In that case, is the composition of two optimal transport maps still optimal (assume all involved measures are absolutely continuous, supported on compact sets, and use quadratic cost in Monge formulation)? I have this question because of the equivalence between cyclical monotonicity and optimality of transport map, c.f. Brenier-McCann theorem.

    • @brittanyhamfeldt · 2 years ago

      @@Ethan-lz5rw Indeed, a consequence of this is that the composition of two optimal maps is NOT optimal in general.

    • @Ethan-lz5rw · 2 years ago

      @@brittanyhamfeldt Thank you!

  • @viktorajstein · 2 years ago

    56:12 So if the sought-after vector field v is *the gradient of* the first variation, shouldn't the continuity equation become \rho_t - \nabla \cdot (\rho \nabla \delta F / \delta \rho) = 0?

  • @kuroshkabir136 · 2 years ago

    Sorry, this may be a stupid question, but I don't understand why you make use of a maximizer (e.g. at 39:13 of the lecture). Why are we looking for a term that maximizes \phi(x)? I would appreciate it if you could explain in a few sentences. Thank you in advance.

  • @jiwoongjang2764 · 2 years ago

    Hi Professor, I'm a student technically from outside, but I'm interested in the proof you left on Canvas at 23:00. Could I see the proof, if you don't mind sharing?

    • @brittanyhamfeldt · 2 years ago

      The proof is in this paper: www.intlpress.com/site/pub/files/_fulltext/journals/cms/2016/0014/0008/CMS-2016-0014-0008-a009.pdf Feel free to reach out via e-mail if you have trouble viewing it on the journal's website.

  • @jameskalombe2890 · 2 years ago

    Wonderful work. I need a book!

  • @buh357 · 2 years ago

    Hi, what math do I need to follow this course?

    • @brittanyhamfeldt · 2 years ago

      I recommend at least a good background in analysis/measure theory. Some exposure to other topics such as calculus of variations and numerical analysis may be helpful but probably isn't required.

    • @buh357 · 2 years ago

      @@brittanyhamfeldt thank you.

  • @yikunbai6315 · 3 years ago

    Hello, professor. At 27:31, why is the first variation an integral?

    • @brittanyhamfeldt · 3 years ago

      Here F is a functional on the space of probability measures, so when we compute its first variation we expect to get another functional, which will act on the functions Chi through an integral to produce a real number.

  • @yikunbai6315 · 3 years ago

    Hi professor, could you explain the motivation for setting the boundary condition to 0? For example, let f, g be densities of the uniform distribution on [0,1]^2. The optimal phi should be phi(x,y) = 1/2 x^2 + 1/2 y^2, and D^2\phi is just the identity. In this case the boundary condition \phi = 0 is violated when x = 1 or y = 1. But phi is indeed the optimal one that solves the OT problem.

    • @brittanyhamfeldt · 3 years ago

      This is meant to be an example showing that we cannot expect to get smooth solutions in general. Remember: the solution of a PDE depends on both the PDE and the boundary conditions. So phi(x,y)=1/2 x^2+1/2y^2 is actually not the solution in this case, since it violates the boundary condition. I don't know an explicit formula for the solution of the Monge-Ampere equation with this Dirichlet boundary condition. But, we can certainly argue that it cannot be smooth up to the boundary. That is the motivation here: to show that smooth solutions may not be possible.

  • @yikunbai6315 · 3 years ago

    Hi professor, at 35:22, where do you use the condition that r is non-degenerate? I cannot find where this condition is applied in the proof.

    • @brittanyhamfeldt · 3 years ago

      Good question! I may never have stated it explicitly in this lecture, but it comes from showing that we can extract a measure-preserving map from the minimisation problem. The arguments are identical to those in the lecture on "Characterising the Optimal Map". There we made the assumption that \mu and u are sufficiently nice in that they don't give mass to sets of measure 0. This allowed us to represent the optimal maps as an (a.e.) gradient of a convex function, and only integrate over the subsets \tilde{X} or \tilde{Y} where they are differentiable. We don't lose out on anything by doing this since the sets of non-differentiability have measure zero and do not have any mass. In the polar factorisation theorem, the identical arguments go through if we choose \mu to be the Lebesgue measure and u = r # \mu (the pushforward measure). The statement that r is non-degenerate is equivalent to saying that u does not give mass to sets of Lebesgue measure zero.

  • @yikunbai6315 · 3 years ago

    Hello, at 15:51, why can you do the same thing to \psi_n? You do not show that \psi_n is lower bounded, whereas you use a technique to make \phi_n lower bounded.

    • @brittanyhamfeldt · 3 years ago

      Great question! Bounds on \psi follow from its definition as an L-F transform of \phi (and the boundedness of the domains). So we get |\psi| \leq \sup |X| \cdot \sup |Y| + \sup |\phi|.

  • @beauzeta1342 · 3 years ago

    Hi professor, thank you for this great lecture. At 1:00:24, the T_1 map is expressed as an interpolation between the identity map and the optimal transport T, while still being an optimal transport itself. This shows that we are moving along a geodesic curve. I am actually wondering what happens if we extrapolate beyond this geodesic segment. I mean, if we let lambda_1 and lambda_2 take any values while summing to 1, then we can extrapolate the map. If we extrapolate too much, we will end up with a degenerate measure. But if the extrapolation ends up at a nondegenerate measure, do we know whether this extrapolated map is still an optimal transport?

    • @brittanyhamfeldt · 3 years ago

      Thanks for the question! It's not something I'd thought about before. Here is my initial (hand-waving) attempt at an answer. If I understand your question correctly, let us suppose that we are given measures \mu_1 and \mu_2, with the optimal map between them given by T. Now for some real-valued \lambda, let us define the new mapping T_1(x) = \lambda x + (1-\lambda) T(x) and the new measure \mu = T_1 # \mu_1. The question then is: is T_1 the optimal map from \mu_1 to \mu? Let's assume everything is smooth enough that we can represent the optimal map T as the gradient of a convex C^2 function u, T = \nabla u. Then T_1(x) = \nabla ( \lambda |x|^2 / 2 + (1-\lambda) u ) = \nabla \psi(x). Is this optimal? It is if the potential function \psi is convex. Check the Hessian: D^2\psi(x) = \lambda I + (1-\lambda) D^2u(x). This is certainly positive semi-definite for \lambda \in [0,1]. Since we've assumed here smoothness of the Hessian, it will also be positive definite for a range of values of \lambda > 1. If, moreover, u is uniformly convex (so D^2u(x) is strictly positive definite), \psi should also be convex for some values of \lambda < 0 (as long as they are not too large in magnitude). So the answer to your question seems to be yes: if things are "nice enough", we can extrapolate a little ways and still obtain an optimal map.
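      The Hessian condition above is easy to probe numerically (a sketch; the matrix standing in for D^2u is a hypothetical uniformly convex choice, not from the lecture):

```python
import numpy as np

# Stand-in for D^2 u at a point: symmetric, strictly positive definite.
H = np.array([[2.0, 0.5],
              [0.5, 1.0]])
I = np.eye(2)

def psi_is_convex(lam):
    """Check whether D^2 psi = lam*I + (1-lam)*D^2 u is positive definite."""
    eigs = np.linalg.eigvalsh(lam * I + (1 - lam) * H)
    return bool(np.all(eigs > 0))

# Convex on [0,1] (interpolation), with some margin beyond it on either
# side (extrapolation), but convexity eventually fails for large |lam|.
for lam in [0.0, 1.0, 1.5, -0.2, 10.0]:
    print(lam, psi_is_convex(lam))
```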

    • @beauzeta1342 · 3 years ago

      @@brittanyhamfeldt Thank you a lot, professor, for taking time on my question! I like this result a lot: not only do we now know that the extrapolability is controlled by the eigenvalues of D^2u, but we also know that in the nondegenerate case there is always a strictly positive margin by which we can extend this geodesic segment while still being on the same geodesic. Thank you again!