Juan MR Parrondo
  • Videos: 12
  • Views: 49,846
The additivity of entropy in the microcanonical ensemble
Here you have a basic discussion of the additivity of Boltzmann and microcanonical entropy. In the video, I explain the solution of exercise 3.5 of my lectures at Universidad Complutense.
Views: 105

Videos

A simple derivation of the Black-Scholes equation
3.5K views · 1 year ago
At the end of the video there is an important comment that corrects a conceptual error in the derivation.
Two notes on foundations of statistical mechanics: objectivity and the origin of giant fluctuations.
482 views · 2 years ago
Boltzmann’s explanation of irreversibility is based on the concept of macro-states and the definition of entropy as the logarithm of the volume in phase space of the region of micro-states compatible with a given macro-state. The explanation, however, lacks an objective (i.e. non-arbitrary) definition of macro-states and of the crossover between micro- and macro-scales. Here we show that this p...
Why nobody understands thermodynamics
1.3K views · 2 years ago
If you studied thermodynamics and did not understand a thing, maybe this video could help. The main idea is to forget about the first and the second law (which refer to processes) and focus on the characterization of equilibrium states, which is the basic problem in equilibrium thermodynamics. This is the same approach followed by H. B. Callen in his celebrated textbook on thermodynamics but expl...
Lesson 6 (5/5). Stochastic differential equations. Part 5
2.5K views · 4 years ago
Lecture for the course Statistical Physics (Master on Plasma Physics and Nuclear Fusion). Universidad Complutense de Madrid. Course webpage: seneca.fis.ucm.es/parr/sp
Lesson 6 (4/5). Stochastic differential equations. Part 4
3.4K views · 4 years ago
Lecture for the course Statistical Physics (Master on Plasma Physics and Nuclear Fusion). Universidad Complutense de Madrid. Course webpage: seneca.fis.ucm.es/parr/sp
Lesson 6 (3/5). Stochastic differential equations. Part 3
4.3K views · 4 years ago
Lecture for the course Statistical Physics (Master on Plasma Physics and Nuclear Fusion). Universidad Complutense de Madrid. Course webpage: seneca.fis.ucm.es/parr/sp
Lesson 6 (2/5). Stochastic differential equations. Part 2
4.9K views · 4 years ago
Lecture for the course Statistical Physics (Master on Plasma Physics and Nuclear Fusion). Universidad Complutense de Madrid. Course webpage: seneca.fis.ucm.es/parr/sp
Lesson 6 (1/5). Stochastic differential equations. Part 1
25K views · 4 years ago
Lecture for the course Statistical Physics (Master on Plasma Physics and Nuclear Fusion). Universidad Complutense de Madrid. Course webpage: seneca.fis.ucm.es/parr/sp
Lesson 5. Markov chains
2.1K views · 4 years ago
Lecture for the course Statistical Physics (Master on Plasma Physics and Nuclear Fusion). Universidad Complutense de Madrid. Course webpage: seneca.fis.ucm.es/parr/sp
Lesson 3. Quantum Ideal Gases.
1.3K views · 4 years ago
Lecture for the course Statistical Physics (Master on Plasma Physics and Nuclear Fusion). Universidad Complutense de Madrid. This lecture corresponds to Lesson 3 on Quantum Ideal Gases. I'm a bit slow. I recommend watching it at speed x1.25. There are three mistakes that I've detected after recording the video, the last one is conceptually important: mins 7:43 and 8:00. I say "evaluated in" and...
Sortis in ludis: Euler, games and paradoxes.
467 views · 6 years ago
Jornada Euler. 14 February 2007. Universitat Politècnica de Catalunya. Juan M. R. Parrondo (Universidad Complutense de Madrid). Sortis in ludis: Euler, games and paradoxes. Euler tackled problems of probability and statistics on several occasions. One of the most interesting is the work “Vera estimatio sortis in ludis” (The correct evaluation of risk in a game), published posthumously and...

Comments

  • @adityabanerjee9476 · 1 month ago

    Extremely Helpful, thank you prof.

  • @jonathanspurgeon5461 · 3 months ago

Could you kindly explain how v = white noise at around 24:20? Thanks!

    • @juanmrparrondo1375 · 3 months ago

v = white noise results from applying the overdamped limit to the Langevin equation of a free particle. The position of a Brownian particle in the overdamped limit is the Wiener process, and the velocity is white noise. If you include inertia, then the velocity is an Ornstein-Uhlenbeck process, which tends to white noise when the correlation time goes to zero.

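The overdamped limit described in this reply can be checked numerically: the time integral of an Ornstein-Uhlenbeck velocity with short correlation time tau behaves like a Wiener process with variance 2DT. A minimal sketch (parameter values are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
tau, D, T, dt = 0.01, 0.5, 1.0, 5e-4   # correlation time tau << T
n_paths, n_steps = 2000, int(1.0 / 5e-4)

# Start the velocity in its stationary state, variance D/tau.
v = rng.normal(0.0, np.sqrt(D / tau), n_paths)
x = np.zeros(n_paths)
for _ in range(n_steps):
    x += v * dt  # position = integral of the OU velocity
    # Euler-Maruyama step for the OU process dv = -(v/tau) dt + (sqrt(2D)/tau) dW
    v += -v / tau * dt + np.sqrt(2 * D) / tau * np.sqrt(dt) * rng.normal(size=n_paths)

print(np.var(x))  # ≈ 2*D*T = 1.0, up to O(tau) corrections
```

As tau shrinks, the velocity autocorrelation narrows toward a delta function and x(T) converges to the Wiener process, matching the white-noise limit stated above.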
  • @jonathanspurgeon5461 · 3 months ago

Amazing lecture! Around 45:28, I believe the second term on the right of the FPE for the Stratonovich interpretation must have a positive sign and not negative. Please correct me if I am wrong ☺

    • @juanmrparrondo1375 · 3 months ago

      Sure, you're right. The error is there from minute 44:20 to 45:30 approx. Thanks for pointing it out!!

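The Itô/Stratonovich distinction behind this drift-term discussion shows up clearly in simulation: for dx = x dW, the Itô mean stays at x(0) while the Stratonovich mean grows like e^(t/2) because of the extra drift (1/2) g g' = x/2. A minimal sketch comparing Euler-Maruyama (Itô) with a Heun predictor-corrector (Stratonovich); parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
t, n_steps, n_paths = 1.0, 2000, 20000
dt = t / n_steps
x_ito = np.ones(n_paths)   # x(0) = 1 for both schemes
x_str = np.ones(n_paths)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)   # same noise for both
    x_ito += x_ito * dW                 # Euler-Maruyama -> Itô solution
    pred = x_str + x_str * dW           # Heun predictor
    x_str += 0.5 * (x_str + pred) * dW  # midpoint rule -> Stratonovich solution

print(x_ito.mean())  # ≈ 1 (Itô integral is a martingale)
print(x_str.mean())  # ≈ exp(0.5) ≈ 1.65 (extra Stratonovich drift)
```

The gap between the two means is exactly what the spurious-drift term in the Stratonovich FPE accounts for, with the positive sign noted in the comment above.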
  • @oolongtea0922 · 3 months ago

    thank you so much

  • @soundharyas2954 · 7 months ago

Very interesting, sir... Please post more problems and theorems.

  • @shazzadhasan4067 · 7 months ago

Cool explanation, thanks!

  • @stathius · 10 months ago

    Excellent!

  • @thanosathanasopoulos7529 · 10 months ago

May I ask? I might be wrong, but Ito calculus seems to have a problem with oscillating terms. Assume that instead of dot(x) we had i dot(x). Then we would expect that the fluctuations and random noises would just change the frequencies in the problem. But due to the extra term, g^2 -> -g^2, we would have damping. Why? What do I miss?

  • @jayantapari9114 · 11 months ago

    really enjoyable lecture

  • @jjjbossjjj · 1 year ago

    One of the better explanations, thank you Juan!

  • @Bbdu75yg · 1 year ago

    👍 Thanks

  • @pdc7482 · 1 year ago

    Outstanding set of lectures on SDE

  • @lorarand · 1 year ago

    Thank you for the lecture series. Loved the combination of intuition and math. Amazing!

  • @lorarand · 1 year ago

Amazing lecture. This should have a lot more views. I wish you would put more of your lectures online. It's rare to find someone who breaks down difficult concepts so clearly and concisely.

  • @awerawer0708 · 1 year ago

I'm not sure what I'm misunderstanding at 35:00. The expectation of w(t) is 0 because w(t) ~ N(0, σ*root(t)); but shouldn't the expectation of w(t)^2 = Var(W), which equals σ*root(t), rather than σ^2*t?

    • @juanmrparrondo1375 · 1 year ago

I use the notation N(mu, sigma) to indicate a Gaussian random variable with average mu and dispersion sigma (=> variance = sigma^2). I've just learned that the standard notation is N(mu, sigma^2). Sorry for the confusion! The Wiener process at time t is a Gaussian variable with zero average and dispersion sigma*sqrt(t) => variance = sigma^2*t.

    • @awerawer0708 · 1 year ago

      @@juanmrparrondo1375 ah thank you so much Professor, this video helped me greatly

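The notation point settled in this exchange is easy to verify by simulation: with dispersion sigma, W(t) has mean 0 and variance sigma^2*t, not sigma*sqrt(t). A quick sketch (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, t = 2.0, 1.5
n_steps, n_paths = 1000, 5000
dt = t / n_steps

# Wiener process at time t: sum of independent Gaussian increments,
# each with mean 0 and variance sigma^2 * dt.
W = rng.normal(0.0, sigma * np.sqrt(dt), (n_paths, n_steps)).sum(axis=1)

print(W.mean())  # ≈ 0
print(W.var())   # ≈ sigma^2 * t = 6.0  (dispersion sigma*sqrt(t) ≈ 2.45)
```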
  • @durgeshkumarbhagat8314 · 1 year ago

Great. Please give the link for Lesson 1 and Lesson 2.

    • @juanmrparrondo1375 · 1 year ago

      Sorry. This is part of a course and lessons 1-2 are not available. Thanks!

  • @Pomeron34 · 1 year ago

Dear Dr. Parrondo, thank you for the lectures. I have one question. How should one deal with a stochastic DE when the stochastic part of the equation is nonlinear in \xi, i.e. some function of \xi?

    • @juanmrparrondo1375 · 1 year ago

Hi Roman. Thank you for your comment. I do not know the answer to your question. Just a couple of comments: if the noise appears in a term as g(x) h(xi), where xi is Gaussian white noise and h(.) is a nonlinear function, then this is equivalent to considering a non-Gaussian noise. There are theorems proving that some non-Gaussian white noises are equivalent to Gaussian white noise (see for instance the limit of a dichotomous noise in the book by Horsthemke and Lefever, Noise-Induced Transitions). But I don't know if this is general. In fact, the dichotomous noise cannot be written as a nonlinear function of a Gaussian noise. If the noise appears in a more complicated way, like sin(x xi), then I guess things are even more complicated. But I don't know the literature on this topic.

    • @Pomeron34 · 1 year ago

@@juanmrparrondo1375 Thank you for the prompt response. I am reading a couple of articles on suppression/introduction of chaos in nonlinear systems by a random phase in the drive (e.g., doi:10.1155/2011/53820, doi:10.1016/j.chaos.2004.04.014; both refer to the Runge-Kutta-Verner method for simulation of SDEs). Unfortunately I fail to find anything but Ito calculus for SDE numerical methods, which in its turn deals only with linear \xi(t) on the r.h.s. Now I have come to some understanding that for simulation it should be enough to treat the argument of cos(\omega*t+\sigma\xi(t)) as a separate differential equation in the SDE system, so there would be one additional equation d\theta/dt=\omega+\sigma*d\xi(t)/dt (e.g., 10.1006/jsvi.1996.0869). However, I am still confused about the usage of white noise vs. the Wiener process as \xi(t). It seems that different papers mean different things by \xi(t) in the cosine argument. So I believe I still have to do some research.

  • @user-wc7em8kf9d · 1 year ago

Thank you very much, Professor.

  • @garydejong8693 · 2 years ago

    Very well done. You not only derive the equations, but explain the thought process involved in their development. Very helpful.

  • @SayakKolay · 2 years ago

    Hi, this is a most wonderful lecture ! Thank you so much for uploading this, Professor. Would it be possible for you to upload other Stochastics lectures as well ?

  • @shuyuehu273 · 2 years ago

    This lecture is very well organized and incredibly easy to follow. Thank you, professor!

  • @tarektohme2508 · 2 years ago

Regarding equilibrium for microstates: the condition is S(a*, E) - S(A(x), E) <= k for all A in R, where R is some restricted set of observables. Intuitively, whatever the set R, objectively defined or not, and for *any* observable A in R, the function w(a, E) must assign a large volume in phase space to the observed value a = A(x), or else the difference in entropies would be large. This means that no observable A in R should be able to distinguish x from a very large set of other microstates, comparably large to the set which is typical with respect to A (the set associated with a*). I am trying to think of a counter-example: a system with some causal chain between a microscopic event and a macroscopic observable, something like the Schrödinger's cat thought experiment (I know the point here is to bypass the notion of macroscopic variables, but just for the sake of argument). Such a system, by construction, would provide a macroscopic observable that enables us to distinguish a small set of microstates from the rest of phase space (e.g. a lamp that turns on when a particle hits a detector). Therefore, the observables in R that satisfy the above condition depend on the system. How can you define an objective set of observables R that excludes the observable "state of the lamp" from this system? Maybe the answer is that such causal links cannot exist in isolated systems: they must be drawing energy or producing entropy to operate at the level of precision required to amplify microscopic events, and so will be far from equilibrium anyway.

  • @stathius · 2 years ago

I've searched a lot for good SDE tutorials. This was the best video series by far. Nothing, and I mean nothing, is taken for granted. Everything is rigorously explained, and even if I didn't get something there was a concrete reference to go look at and then easily jump back. Congratulations, professor, and thank you very much!

  • @rahulbansal2699 · 2 years ago

Sir, please tell me the basic books for stochastic differential equations and stochastic fractional differential equations, please sir 🙏

    • @juanmrparrondo1375 · 2 years ago

This is a good one, especially if you are interested in simulations (but OK for theory as well): www.amazon.es/Stochastic-Numerical-Methods-Introduction-Scientists/dp/3527411496

  • @adokoka · 2 years ago

Thank you very much for the courses. I followed them, Parts 1 to 5. It would be great if you could also post some lessons on PSDE :)

  • @delq · 2 years ago

    haha just in time, i got an exam tomorrow !!

  • @angeloqwequ-boateng4686 · 2 years ago

    This was very helpful, Juan

  • @santosreckz7203 · 2 years ago

Hello Sir, I am completely new to all of this. What basic knowledge do I need to understand stochastic differential equations, or is this video itself the basic knowledge I need? I hope what I said makes sense.

    • @juanmrparrondo1375 · 2 years ago

I think you can follow it if you know differential calculus, a bit of differential equations, and basic probability (Gaussian variables, the central limit theorem, averages, independent random variables).

  • @lanablaschke5888 · 2 years ago

Dear Mr. Parrondo, thank you so much for your exceptionally clear and incredibly helpful lectures! It might be very obvious, but I get lost once during the derivation of the Fokker-Planck equation (around 31:00). I would be very grateful if you could help me out of my confusion! When replacing the average $\langle \dot A(t) \rangle$ by its integral definition $\int dx\, \rho(x,t)\, \dot A(t)$, I don't understand why $$\langle \dot A \rangle = \int \frac{\partial \rho}{\partial t} A$$ holds or how one would get there...

    • @juanmrparrondo1375 · 2 years ago

Hi Lana, thanks for your comment!! "A" is introduced as an arbitrary function of x, so it's A(x). Then we define A(t) as A(x(t)), and the average is <A(t)> = <A(x(t))> = int dx rho(x,t) A(x). Imagine for instance that A(x) = x; then <A(t)> = <x(t)> = int dx rho(x,t) x. Now, if we differentiate the equation <A(t)> = int dx rho(x,t) A(x) with respect to time, we get d<A(t)>/dt = int dx [\partial rho(x,t)/\partial t] A(x). The l.h.s. is \langle \dot A \rangle. I hope this will help!

    • @lanablaschke5888 · 2 years ago

      @@juanmrparrondo1375 Now it all makes sense, thank you very much!

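The identity <A(t)> = int dx rho(x,t) A(x) discussed in this thread can be checked directly for free diffusion, where rho(x,t) is Gaussian; with A(x) = cos(x) the integral evaluates to exp(-sigma^2 t / 2). A minimal sketch (the observable and parameter values are chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, t, n_paths = 1.0, 2.0, 200_000

# Free diffusion from x(0)=0: x(t) = W(t), so rho(x,t) = N(0, sigma^2 t).
x_t = rng.normal(0.0, sigma * np.sqrt(t), n_paths)

mc = np.cos(x_t).mean()                  # path-average <A(t)> = <cos(x(t))>
exact = np.exp(-sigma**2 * t / 2)        # Gaussian integral of cos(x) rho(x,t)

print(mc, exact)  # both ≈ exp(-1) ≈ 0.368
```

Averaging A over trajectories and integrating A against rho(x,t) agree, which is exactly the step used before differentiating with respect to time in the Fokker-Planck derivation.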
  • @gareebmanus2387 · 2 years ago

    Is there a video of the first lecture?

    • @juanmrparrondo1375 · 2 years ago

Sure, you have the whole course on my YouTube channel.

    • @gareebmanus2387 · 2 years ago

      @@juanmrparrondo1375 Prof. Parrondo, Sorry, I had missed your reply. Thank you very much!

  • @leonardocabrera9253 · 2 years ago

    Loved it

  • @Happylife-en3se · 2 years ago

Thank you. Please, what is the program that you use in these videos (writing on the tablet)?

    • @juanmrparrondo1375 · 2 years ago

Hi. It is Doceri: doceri.com/. It's a great app for the iPad. You can record the writing and play it in class at the speed that you wish.

    • @Happylife-en3se · 2 years ago

@@juanmrparrondo1375 Many thanks!

  • @anupamsarkar5449 · 2 years ago

Corona took away a lot from our lives, but in exchange it gave us professors like you. You are one of the best teachers one can have. Thank you, professor, for such an understandable explanation of obscure concepts. Anticipating more lectures from you in the future.

  • @subu94 · 2 years ago

    Thank you Sir for such an excellent lecture

  • @albertalbesagonzalez9386 · 2 years ago

    Very nicely explained, I very much liked the balance between intuition and formality.

  • @leonardocabrera9253 · 2 years ago

I did a course in stochastic processes and it was a nightmare, as the professor spent too much time on silly demonstrations and proofs, but these videos just explain the concept and applications in five minutes, which is great: simple and efficient. Many thanks for these videos; please keep posting more.

  • @leonardocabrera9253 · 2 years ago

Thanks, Juan

  • @Seastric · 3 years ago

Thank you very much, Juan. Do you have any notes available to the public? Thank you in advance.

  • @franziss1 · 3 years ago

Thank you Prof. Parrondo for your wonderful lectures! I have watched your Parts 1-3 and you have demystified the complex world of SDEs! Can I ask how you derived \sigma^2 \Delta t at 8:44? I am confused. Is it because dW is \Delta x and in your earlier videos you define \frac{\Delta x^2}{\Delta t} = \sigma^2?

  • @darthkenobi66 · 3 years ago

    28:00 great explanation

  • @abdulsaleem7027 · 3 years ago

    can you share the pdf to mail id abdsaleem111@gmail.com

  • @erny.wijayanti · 3 years ago

Cool, thank you very much for the lecture, Prof. Could you please post some videos about SPDEs (stochastic partial differential equations)?

  • @chhaviyadav1741 · 3 years ago

Prof., this is great. Do you mind uploading the other lectures in stat physics from your course?

  • @Olivier_Foka_Djidzem · 3 years ago

It's so interesting... and I'll take some time to understand the Stratonovich integral well.

  • @julesdominik4744 · 4 years ago

Great video, thanks!

  • @alipedram5720 · 4 years ago

    Very informative and concise. Thanks! :D

  • @joshuayao6236 · 4 years ago

Bravo, I am your fan now. Please post the whole lecture series.

  • @joshuayao6236 · 4 years ago

    It is very good. Please post more. Thank you.