2D Spectral Derivatives with NumPy.FFT

  • Published: 16 Jul 2024
  • The Fast Fourier Transform makes it easy to take derivatives of periodic functions. In this video, we look at how this concept extends to two dimensions, such as how to create the wavenumber grid and how to deal with partial derivatives (a minimal NumPy sketch appears after the timestamps below). Here is the notebook: github.com/Ceyron/machine-lea...
    -------
    👉 This educational series is supported by the world-leaders in integrating machine learning and artificial intelligence with simulation and scientific computing, Pasteur Labs and Institute for Simulation Intelligence. Check out simulation.science/ for more on their pursuit of 'Nobel-Turing' technologies (arxiv.org/abs/2112.03235 ), and for partnership or career opportunities.
    -------
    📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): github.com/Ceyron/machine-lea...
    📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: / felix-koehler and / felix_m_koehler
    💸 : If you want to support my work on the channel, you can become a patron here: / mlsim
    🪙: Or you can make a one-time donation via PayPal: www.paypal.com/paypalme/Felix...
    -------
    ⚙️ My Gear:
    (Below are affiliate links to Amazon. If you decide to purchase the product or something else on Amazon through this link, I earn a small commission.)
    - 🎙️ Microphone: Blue Yeti: amzn.to/3NU7OAs
    - ⌨️ Logitech TKL Mechanical Keyboard: amzn.to/3JhEtwp
    - 🎨 Gaomon Drawing Tablet (similar to a WACOM Tablet, but cheaper, works flawlessly under Linux): amzn.to/37katmf
    - 🔌 Laptop Charger: amzn.to/3ja0imP
    - 💻 My Laptop (generally I like the Dell XPS series): amzn.to/38xrABL
    - 📱 My Phone: Fairphone 4 (I love the sustainability and repairability aspect of it): amzn.to/3Jr4ZmV
    If I had to purchase these items again, I would probably change the following:
    - 🎙️ Rode NT: amzn.to/3NUIGtw
    - 💻 Framework Laptop (I do not get a commission here, but I love the vision of Framework. It will definitely be my next Ultrabook): frame.work
    As an Amazon Associate I earn from qualifying purchases.
    -------
    Timestamps:
    00:00 Intro & Overview
    02:50 Domain, Discretization & Mesh
    05:53 Example function and its analytical derivatives
    09:30 Plot & Discussion of function
    13:58 Wavenumber grid in 2D
    18:15 Perform spectral derivatives and compare
    22:41 Bonus: Gradient (both partial derivatives at the same time)
    24:43 Outro
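    As referenced above, here is a minimal NumPy sketch of 2D spectral derivatives in the spirit of the video (not the notebook's exact code; grid size, domain, and the test function are illustrative assumptions):

    ```python
    import numpy as np

    # Square periodic domain [0, 2*pi)^2, sampled on an N x N grid (assumed values).
    N = 64
    L = 2.0 * np.pi
    x = np.linspace(0.0, L, N, endpoint=False)
    X, Y = np.meshgrid(x, x, indexing="ij")

    # Example periodic function and its analytical partial derivatives.
    u = np.sin(2.0 * X) * np.cos(3.0 * Y)
    du_dx_exact = 2.0 * np.cos(2.0 * X) * np.cos(3.0 * Y)
    du_dy_exact = -3.0 * np.sin(2.0 * X) * np.sin(3.0 * Y)

    # Wavenumber grid: full spectrum along the first axis, half spectrum along
    # the last (real-FFT) axis; the 2*pi scaling turns frequencies into wavenumbers.
    kx = np.fft.fftfreq(N, d=L / N) * 2.0 * np.pi
    ky = np.fft.rfftfreq(N, d=L / N) * 2.0 * np.pi
    KX, KY = np.meshgrid(kx, ky, indexing="ij")

    # Spectral partial derivatives: multiply by i*k in Fourier space, transform back.
    u_hat = np.fft.rfft2(u)
    du_dx = np.fft.irfft2(1j * KX * u_hat, s=u.shape)
    du_dy = np.fft.irfft2(1j * KY * u_hat, s=u.shape)

    print(np.max(np.abs(du_dx - du_dx_exact)))  # ~1e-13
    print(np.max(np.abs(du_dy - du_dy_exact)))  # ~1e-13
    ```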

Comments • 14

  • @dutuzeremy5544 • 5 months ago +2

    Great stuff, keep it coming!

  • @user-qp2ps1bk3b • 5 months ago +1

    Very good video, thank you!

  • @matteopiccioni196 • 5 months ago +1

    Great video content. Very nice.

  • @trasor486 • 5 months ago +1

    Hi, good video. To my understanding, spectral derivatives work fine for periodic BCs. However, any non-periodic function would be subject to the Gibbs phenomenon, which kills the derivative. Do you have any ideas/tips for extending spectral derivatives to arbitrary functions?

    • @MachineLearningSimulation • 5 months ago

      Thanks for the kind comment. 😊
      Yes, Fourier-spectral derivatives require periodic boundary conditions; violating this assumption likely kills the quality of the derivative and destroys many nice properties of the FFT.
      For this video, I used "spectral" synonymously with "Fourier-spectral". More generally, it refers to using domain-wide ansatz functions. For instance, to handle Dirichlet boundaries one can use Chebyshev spectral methods. Those also admit a somewhat efficient spectral derivative via the FFT (check Trefethen's "Spectral Methods in MATLAB" for more details).
      However, Fourier-spectral methods have the striking advantage over other spectral methods that the derivative operator diagonalizes in Fourier space. AFAIK, they are the only spectral method with that capability that can also be easily used with the FFT. That makes them so useful for solving PDEs, of course only if you assume periodic boundaries.
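      Concretely, in 2D this diagonalization property can be written as (standard Fourier convention, just a sketch of the identity, not quoted from the video):

      ```latex
      \widehat{\partial_x u}(k_x, k_y) = i\,k_x\,\hat{u}(k_x, k_y),
      \qquad
      \widehat{\partial_y u}(k_x, k_y) = i\,k_y\,\hat{u}(k_x, k_y)
      ```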

  • @user-rl8uu9mv9f • 5 months ago +1

    Thanks, indeed a nice video. Quick question: the derivative operator should be set to 0 at the Nyquist frequency for all odd derivatives, right? In this problem, the amplitude itself is zero there, so it does not matter.

    • @MachineLearningSimulation • 5 months ago

      [note: I confused the zero/mean mode and the Nyquist mode in my first reply. For anyone reading please scroll down in the thread.]
      Hi,
      thanks for the kind words and the question. 😊
      By creating the wavenumber grid using "np.fft.rfftfreq" and "np.fft.fftfreq", the derivative operator is already zero at the Nyquist frequency. You can also check this if you index it at [0, 0]. This actually does not keep the "mean energy" but sets it to zero (which is what we want, because the derivative of a constant offset is zero). If we wanted to retain the "mean energy", we would set it to (real) 1.0. Do you agree? I think by "amplitude" you meant "mean energy" or constant offset.
      The way I interpret the DFT/FFT, the amplitude of each mode is contained in the Fourier coefficients (and they change according to the derivative operator).
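      A minimal sketch of that wavenumber-grid construction (illustrative values; N and L are assumptions, not the notebook's):

      ```python
      import numpy as np

      N = 8
      L = 2.0 * np.pi
      # Full spectrum along the first axis, half spectrum along the last (real-FFT) axis.
      kx = np.fft.fftfreq(N, d=L / N) * 2.0 * np.pi
      ky = np.fft.rfftfreq(N, d=L / N) * 2.0 * np.pi
      KX, KY = np.meshgrid(kx, ky, indexing="ij")

      # The [0, 0] entry is the zero (mean) mode: multiplying the Fourier coefficients
      # by 1j * KX or 1j * KY removes the constant offset, as a derivative should.
      print(KX[0, 0], KY[0, 0])  # 0.0 0.0
      ```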

    • @user-rl8uu9mv9f • 5 months ago

      @@MachineLearningSimulation I think [0,0] is the DC mode, not the Nyquist frequency, which occurs at 2*pi*(N/2)/L for either kx or ky.

    • @MachineLearningSimulation • 5 months ago

      Sorry, I misunderstood you in my initial reply. You referred to the Nyquist mode, not the zero or mean mode. Again, sorry for the confusion.
      I am curious: why does the Nyquist mode need to be set to zero?

    • @user-rl8uu9mv9f • 5 months ago +1

      @@MachineLearningSimulation Your f is sampled only at discrete x_n. Interpolating it on continuous x will have a Nyquist contribution proportional to cos(pi N x / L), whose odd derivatives are proportional to sin(pi N x / L), which is always zero at the sampled points x_n. Please see: math.mit.edu/~stevenj/fft-deriv.pdf

    • @MachineLearningSimulation • 5 months ago

      Thanks for the resource 👍. I was partly aware of it when creating the video, but maybe I interpreted it incorrectly.
      As far as I understand, this should not be a problem for differentiating real signals. Let's say N=6, and I want to sample cos(3 * 2*pi/L x). This gives a contribution to the Nyquist mode [at unscaled wavenumber -3] of N + 0i (N in the real part of the complex number and zero in the imaginary part). Then, there are three scenarios:
      1. I want to obtain the first derivative: I would multiply by 1j * (-3), turning the coefficient into 0 - 3N j. In an inverse transform, the imaginary Nyquist component would result in an imaginary cosine signal (precisely: 3j * cos(3 * 2*pi/L x)). However, in the video we zero out any imaginary components (by only taking the real part of the result array), so there is essentially only a zero real signal, which is fine because the analytical derivative of the cosine at the Nyquist mode would be seen as a zero signal anyway.
      2. I want to obtain the second derivative efficiently: I would multiply by (1j * (-3))^2 = -9. Hence, I would just scale the real component. Transforming back gives the correct derivative signal of -9 cos(3 * 2*pi/L x).
      3. I want to obtain the second derivative by applying the former first derivative twice: as mentioned in the linked resource, this would, incorrectly, also give a zero signal. [If one chooses not to discard the imaginary part of the inversely transformed signal, however, we could still get the right derivative.] This can be problematic, but the effect can also be very small, because a signal with a Nyquist component likely also has higher modes that, with their aliases, produce issues anyway.
      In conclusion, maybe I got it wrong, but my impression was that this is not too big of an issue when differentiating **real** signals **once**. It is an interesting edge case, and I want to dedicate a video to it. Please let me know if I got something wrong; I would be highly interested in clearing up a potential misunderstanding 😊.
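      A quick numerical check of scenarios 1 and 2 (an illustrative sketch; N = 6 and the pure Nyquist-mode cosine are assumptions chosen to isolate the effect):

      ```python
      import numpy as np

      N = 6
      L = 2.0 * np.pi
      x = np.linspace(0.0, L, N, endpoint=False)
      u = np.cos(3.0 * 2.0 * np.pi / L * x)  # pure Nyquist-mode signal for N = 6

      k = np.fft.fftfreq(N, d=L / N) * 2.0 * np.pi  # the Nyquist bin carries k = -3
      u_hat = np.fft.fft(u)

      # Scenario 1: first derivative, keeping only the real part -> zero signal,
      # which matches the analytical derivative -3*sin(3x) sampled at the grid points.
      du = np.real(np.fft.ifft(1j * k * u_hat))
      print(np.max(np.abs(du)))  # ~1e-15

      # Scenario 2: second derivative via (i*k)^2 -> correctly recovers -9*cos(3x).
      ddu = np.real(np.fft.ifft((1j * k) ** 2 * u_hat))
      print(np.max(np.abs(ddu - (-9.0) * u)))  # ~1e-14
      ```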