Predictive Coding

  • Published: 8 Apr 2021
  • Prof. Orchard describes Predictive Coding networks, which are biologically plausible networks that learn in a manner similar to backprop.
    00:00 Introduction
    00:36 Biological Plausibility
    04:30 Predictive Coding Idea
    09:22 Error Nodes, State Nodes
    12:45 PC Network
    17:40 Error Node Dynamics
    20:30 Training Goal
    23:30 Joke Break
    26:00 Hopfield Function
    26:56 State Node Dynamics
    32:11 Training a PC Network
    35:31 PC and Backprop
    39:32 Joke Break
    39:47 Updating the Weights
    48:16 Testing the PC Network
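The ideas in the chapter list (error nodes, state nodes, relaxation to equilibrium, local weight updates) can be sketched in a few dozen lines. The code below is an illustrative reconstruction in the general style of the PC literature (e.g., Whittington & Bogacz), not Prof. Orchard's actual code; the network sizes, learning rate, and all variable names are my own assumptions.

```python
import numpy as np

# Minimal predictive-coding (PC) sketch: a two-weight-layer network with
# error nodes and state nodes, trained by relaxing the states to
# equilibrium and then applying local (error x presynaptic activity)
# weight updates. Illustrative assumptions throughout.

rng = np.random.default_rng(0)

def f(x):               # activation function
    return np.tanh(x)

def df(x):              # derivative of the activation
    return 1.0 - np.tanh(x) ** 2

# input (2 units) -> hidden (4 units) -> output (1 unit)
W1 = rng.normal(scale=0.5, size=(4, 2))
W2 = rng.normal(scale=0.5, size=(1, 4))

def relax(x0, y, n_steps=200, dt=0.1):
    """Clamp input x0 and target y, then let the hidden state node
    settle to equilibrium by descending the total squared error."""
    h = np.zeros(4)                       # hidden state node
    e1 = e2 = None
    for _ in range(n_steps):
        e1 = h - W1 @ f(x0)               # error node at the hidden layer
        e2 = y - W2 @ f(h)                # error node at the (clamped) output
        dh = -e1 + df(h) * (W2.T @ e2)    # state-node dynamics
        h = h + dt * dh
    return h, e1, e2

def train_step(x0, y, lr=0.05):
    """Weight updates are local: equilibrium error times presynaptic activity."""
    global W1, W2
    h, e1, e2 = relax(x0, y)
    W1 += lr * np.outer(e1, f(x0))
    W2 += lr * np.outer(e2, f(h))

def forward(x0):
    """Test-time prediction: an ordinary feedforward pass."""
    return W2 @ f(W1 @ f(x0))

# Tiny demo: fit a single input/target pair
x0 = np.array([1.0, -1.0])
y = np.array([0.5])
err_before = abs(forward(x0)[0] - y[0])
for _ in range(200):
    train_step(x0, y)
err_after = abs(forward(x0)[0] - y[0])
```

Note the connection to backprop discussed in the video: at equilibrium, the error nodes carry the same information that backprop's backward pass would compute, but each weight update only uses quantities available at that synapse.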

Comments • 7

  • @sidbonshawn6910
    @sidbonshawn6910 1 year ago

    Great work, thanks a lot for sharing! I was going through Millidge, Friston, and Bogacz's works and was getting a bit confused on some concepts. Having you go through it slowly definitely put some doubts to rest! Seems like a very interesting field!

  • @alirahimi92
    @alirahimi92 8 months ago

    Thank you sir

  • @MLDawn
    @MLDawn 1 year ago

    I believe this version of PC does not consider continuous dynamic states, which is the real-life situation. I'm talking about state-space models for analyzing temporal data, where the states and their parameters keep changing. What you have here reminds me of the work done by the Oxford group, which has simplified PC and provided a solution for the Machine Learning community. Still, this is very interesting🙂

    • @JeffOrchard
      @JeffOrchard  1 year ago

      Yes, this formulation is based on convergence to equilibrium. But I’m not sure what other formulation there is.

    • @MLDawn
      @MLDawn 1 year ago

      @@JeffOrchard I wonder if we could have a Zoom call where I can discuss some of my ideas and questions regarding PC networks? The resources on the web are really limited, and the papers are not explanatory enough (even the original paper).

  • @gustavobrasildesouza2165
    @gustavobrasildesouza2165 8 months ago

    You wrote “predictions sent up”. Aren’t predictions sent down, while errors flow up?

    • @JeffOrchard
      @JeffOrchard  8 months ago +1

      Yes, in more recent implementations, the network is flipped so that predictions are sent down the network, which is more in-line with the cognitive theory of predictive coding.
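The flipped convention described in this reply can be stated in a few lines of code. This is purely illustrative (variable names and sizes are my own assumptions): the higher layer sends a top-down prediction of the lower layer's state, and the mismatch is sent back up.

```python
import numpy as np

# "Flipped" PC convention: predictions flow DOWN the network, errors flow
# UP, matching the cognitive theory of predictive coding. Illustrative only.
rng = np.random.default_rng(1)
f = np.tanh

x_low = rng.normal(size=3)      # lower-layer state node (e.g., closer to input)
x_high = rng.normal(size=2)     # higher-layer state node
W = rng.normal(size=(3, 2))     # top-down prediction weights

prediction = W @ f(x_high)      # prediction sent down
error = x_low - prediction      # mismatch computed at the lower layer...
correction = W.T @ error        # ...and sent up to drive the higher state
```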