Introduction to Normalizing Flows (ECCV2020 Tutorial)

  • Published: 12 Dec 2024

Comments • 28

  • @anselmud
    @anselmud 8 months ago +9

    The relevance of this tutorial from 2020 went through the roof in 2024 after the recent release of Stable Diffusion 3 and its use of Flow Matching as an alternative to Diffusion.
    This is a very good building block for understanding Flow Matching, which is how I ended up here.
    It must have been weird for researchers in Normalizing Flows at the time to witness the explosion of Gen AI through Diffusion models that were so close to what they were doing; it was like being missed by a nuclear bomb.
    But good research resists the test of time, and the author's predictions on Continuous-time Normalizing Flows and the research started by FFJORD were spot on.
    Kudos and thanks for putting this together, back in the day.
    I hope you resume posting videos like this!

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago +27

    This video deserves a million views. So clearly explained.

  • @thijsvanweezel4669
    @thijsvanweezel4669 6 months ago +1

    At 47:00, the goal is to reduce dimensionality, but 4x4x1==2x2x4. How does that help?
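
    The 2x2 squeeze used in RealNVP/Glow-style flows does indeed preserve the total element count, since a flow layer must stay bijective: the spatial dimensions shrink while the channel count grows, which lets subsequent coupling layers mix information across spatial positions. Actual dimensionality reduction in multi-scale architectures comes from factoring out half the channels afterwards, not from the squeeze itself. A minimal numpy sketch of the squeeze (an illustrative reconstruction, not code from the tutorial):

    ```python
    import numpy as np

    # A 4x4 single-channel "image".
    x = np.arange(16).reshape(4, 4, 1)

    # Squeeze: fold each 2x2 spatial block into the channel dimension,
    # turning shape (4, 4, 1) into (2, 2, 4).
    h, w, c = x.shape
    y = (x.reshape(h // 2, 2, w // 2, 2, c)
          .transpose(0, 2, 1, 3, 4)
          .reshape(h // 2, w // 2, 4 * c))

    print(x.size, y.size)  # 16 16 -- same number of elements, just rearranged

    # The operation is exactly invertible, as a flow layer must be.
    x_back = (y.reshape(h // 2, w // 2, 2, 2, c)
               .transpose(0, 2, 1, 3, 4)
               .reshape(h, w, c))
    assert np.array_equal(x, x_back)
    ```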

  • @williamashbee
    @williamashbee 2 years ago +3

    don't stop making videos, you are fantastic.

  • @nocomments_s
    @nocomments_s 3 years ago +1

    Amazing video, will share it with my colleagues and friends, it deserves much more views than it has

  • @jackshi7613
    @jackshi7613 3 years ago +2

    Super useful, I have been looking for this video for a long time, finally, I got it. Great video, keep going!

    • @MarcusBrubaker
      @MarcusBrubaker  3 years ago

      Glad it helped! You may want to check out the more recent version of this tutorial here: ruclips.net/video/8XufsgG066A/видео.html Much of the same content, although a few things have been updated and refined.

  • @prabhnoorsingh2104
    @prabhnoorsingh2104 3 years ago +5

    WOW! This has been so helpful. You deserve a medal prof :)

  • @derroitionman
    @derroitionman 3 years ago +4

    Great presentation, thanks for sharing it.

  • @cobaltl8557
    @cobaltl8557 1 year ago

    Thank you for making this excellent tutorial.

  • @ibraheemmoosa
    @ibraheemmoosa 3 years ago +2

    I have a question. How does taking cube root change a bimodal distribution to a unimodal distribution at 13:50?

    • @MarcusBrubaker
      @MarcusBrubaker  3 years ago

      It's hard to give a good intuitive explanation for how/why a cubic transform creates multi-modality. However, I can confirm that this is actually what happens in that particular example; those figures are the real result of transforming those distributions.
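
      While intuition is hard, the mechanics can be checked numerically with the change-of-variables formula: for an invertible map, the transformed density equals the original density at the preimage times the absolute Jacobian. A hedged numpy sketch, assuming an illustrative two-Gaussian bimodal density (not necessarily the one from the slides) and the cube-root map Y = X^(1/3), for which p_Y(y) = p_X(y^3) · |3y^2|:

      ```python
      import numpy as np

      def p_x(x):
          # Illustrative bimodal density: mixture of two unit-variance Gaussians.
          g = lambda m: np.exp(-0.5 * (x - m) ** 2) / np.sqrt(2.0 * np.pi)
          return 0.5 * g(-3.0) + 0.5 * g(3.0)

      # Density of Y = X**(1/3) by change of variables:
      # p_Y(y) = p_X(y^3) * |d(y^3)/dy| = p_X(y^3) * 3y^2.
      y = np.linspace(-2.5, 2.5, 2001)
      p_y = p_x(y ** 3) * np.abs(3.0 * y ** 2)

      # The transform is a bijection, so p_Y still integrates to 1
      # (trapezoidal rule, up to discretization error).
      total = np.sum(0.5 * (p_y[1:] + p_y[:-1]) * np.diff(y))
      print(total)  # close to 1.0
      ```

      The Jacobian factor |3y^2| is what reshapes the density, which is exactly the effect plotted in the slides.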

  • @spyrosp.551
    @spyrosp.551 9 months ago +1

    What I will take away from this lecture is that "if d is small, that is not a big deal".

  • @payamjomeyazdian1794
    @payamjomeyazdian1794 3 years ago +1

    Very nice slides and presentation.

  • @huseyintemiz5249
    @huseyintemiz5249 4 years ago +5

    Nice tutorial.

  • @avideepmukherjee9307
    @avideepmukherjee9307 3 years ago +5

    Can we get the slides, please?

  • @mausci71
    @mausci71 3 years ago +2

    Loved your tutorial, Marcus.

  • @Gaetznaa
    @Gaetznaa 2 years ago +1

    Thanks for the video! Very clearly explained :)

  • @qichaoying4478
    @qichaoying4478 3 years ago +2

    Why is GLOW skipped?

  • @hosseinrafipoor8784
    @hosseinrafipoor8784 2 years ago

    Thanks for the great explanation!

  • @CourtOfWinter
    @CourtOfWinter 1 year ago

    Around 19:30 you say that flow layers technically need to be diffeomorphisms, but is that actually the case? I don't see any reason why the inverse needs to be differentiable as well.

    • @MarcusBrubaker
      @MarcusBrubaker  1 year ago +1

      You need the flow to be differentiable in the normalizing direction in order to enable training and the computation of the Jacobian. Further, the Jacobian needs to be non-singular (non-zero determinant), and that implies (by the inverse function theorem) that the inverse is also differentiable.
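
      The inverse-function-theorem argument can be sanity-checked numerically: for a differentiable bijection with non-singular Jacobian, the derivative of the inverse equals the reciprocal (in general, the matrix inverse) of the Jacobian. A sketch with a hypothetical 1-D flow f(x) = x + 0.5·tanh(x), chosen because it is strictly increasing and hence invertible:

      ```python
      import numpy as np

      def f(x):
          # Hypothetical 1-D flow: strictly increasing, hence a bijection on R.
          return x + 0.5 * np.tanh(x)

      def df(x):
          # Its derivative (the 1x1 Jacobian): 1 + 0.5 / cosh(x)^2 > 0, never singular.
          return 1.0 + 0.5 / np.cosh(x) ** 2

      def f_inv(y, lo=-20.0, hi=20.0):
          # Invert by bisection, exploiting monotonicity.
          for _ in range(80):
              mid = 0.5 * (lo + hi)
              lo, hi = (mid, hi) if f(mid) < y else (lo, mid)
          return 0.5 * (lo + hi)

      x = 0.7
      y = f(x)
      eps = 1e-6
      dinv_numeric = (f_inv(y + eps) - f_inv(y - eps)) / (2.0 * eps)

      # Inverse function theorem: (f^{-1})'(y) = 1 / f'(x).
      print(dinv_numeric, 1.0 / df(x))  # the two values agree closely
      ```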

    • @CourtOfWinter
      @CourtOfWinter 1 year ago

      @@MarcusBrubaker Ah, okay, makes sense. Thanks!

  • @ff-fh8nh
    @ff-fh8nh 2 years ago

    In coupling flows: does the split step split x along the channel (feature) dimension or along other dimensions?

    • @MarcusBrubaker
      @MarcusBrubaker  2 years ago

      It can split the dimensions in any way. Traditionally in image applications the split is along the channel dimension, but, e.g., this paper splits along spatial dimensions: proceedings.neurips.cc/paper/2020/hash/ecb9fe2fbb99c31f567e9823e884dbec-Abstract.html
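
      A minimal numpy sketch of a coupling layer that splits along the last (feature/channel) axis, in the additive (NICE-style) form; the inner network `shift` here is a hypothetical stand-in for a learned network:

      ```python
      import numpy as np

      def coupling_forward(x, shift_net):
          # Split along the last (feature/channel) axis; any fixed partition works.
          x1, x2 = np.split(x, 2, axis=-1)
          # Additive coupling: x1 passes through unchanged, x2 is shifted by an
          # arbitrary function of x1 -- invertibility never depends on shift_net.
          return np.concatenate([x1, x2 + shift_net(x1)], axis=-1)

      def coupling_inverse(y, shift_net):
          # Recompute the same shift from the untouched half and subtract it.
          y1, y2 = np.split(y, 2, axis=-1)
          return np.concatenate([y1, y2 - shift_net(y1)], axis=-1)

      # Hypothetical stand-in for a neural network acting on the first half.
      shift = lambda h: np.tanh(h @ np.full((2, 2), 0.3))

      x = np.random.default_rng(0).normal(size=(5, 4))  # batch of 5, 4 features
      y = coupling_forward(x, shift)
      x_back = coupling_inverse(y, shift)
      print(np.allclose(x, x_back))  # True
      ```

      Because the shift is recomputed from the untouched half, inversion is exact no matter how complex the inner network is.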

  • @chainonsmanquants1630
    @chainonsmanquants1630 3 years ago

    Thanks

  • @laurenpinschannels
    @laurenpinschannels 1 year ago +1

    He keeps saying "probabilistic graphical models" when he means "probabilistic generative models".

  • @aojing
    @aojing 9 months ago

    First, explain the name: what is "normalizing"?