Predictive Coding
- Published: 8 Apr 2021
- Prof. Orchard describes Predictive Coding networks, which are biologically plausible networks that learn in a manner similar to backprop.
00:00 Introduction
00:36 Biological Plausibility
04:30 Predictive Coding Idea
09:22 Error Nodes, State Nodes
12:45 PC Network
17:40 Error Node Dynamics
20:30 Training Goal
23:30 Joke Break
26:00 Hopfield Function
26:56 State Node Dynamics
32:11 Training a PC Network
35:31 PC and Backprop
39:32 Joke Break
39:47 Updating the Weights
48:16 Testing the PC Network
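For viewers who want to experiment with the ideas covered in the chapters above, here is a minimal numerical sketch of a two-layer predictive-coding network in the "predictions sent up" formulation used in the lecture. This is not Prof. Orchard's code; the layer sizes, tanh activation, learning rates, and number of relaxation steps are assumptions made purely for illustration.

```python
# Minimal sketch of a two-layer predictive-coding (PC) network with explicit
# error nodes and state nodes. NOT Prof. Orchard's code; all hyperparameters
# and the tanh activation are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(scale=0.1, size=(n_hid, n_in))   # predicts layer 1 from layer 0
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))  # predicts layer 2 from layer 1

def f(x):                       # activation function (tanh is an assumption)
    return np.tanh(x)

def df(x):                      # derivative of tanh
    return 1.0 - np.tanh(x) ** 2

def train_step(W1, W2, x0, target, eta_x=0.1, eta_w=0.01, n_relax=100):
    """Clamp input and output, relax the hidden state, then update weights locally."""
    x1 = W1 @ f(x0)             # initialize hidden state node with its prediction
    x2 = target                 # output state node is clamped to the target

    for _ in range(n_relax):
        e1 = x1 - W1 @ f(x0)    # error node at layer 1
        e2 = x2 - W2 @ f(x1)    # error node at layer 2
        # state-node dynamics: descend the sum-of-squared-errors energy
        x1 += eta_x * (-e1 + df(x1) * (W2.T @ e2))

    # weight update at (approximate) equilibrium: outer product of the error
    # above each connection with the activity below it (updates W1, W2 in place)
    e1 = x1 - W1 @ f(x0)
    e2 = x2 - W2 @ f(x1)
    W1 += eta_w * np.outer(e1, f(x0))
    W2 += eta_w * np.outer(e2, f(x1))

def predict(W1, W2, x0, eta_x=0.1, n_relax=100):
    """Testing: clamp only the input and let the hidden/output states settle."""
    x1 = W1 @ f(x0)
    x2 = W2 @ f(x1)
    for _ in range(n_relax):
        e1 = x1 - W1 @ f(x0)
        e2 = x2 - W2 @ f(x1)
        x1 += eta_x * (-e1 + df(x1) * (W2.T @ e2))
        x2 += eta_x * (-e2)     # unclamped output relaxes toward its prediction
    return x2

# toy usage: learn to map one fixed input to one fixed target
x0, target = np.ones(n_in), np.array([0.5, -0.5])
for _ in range(200):
    train_step(W1, W2, x0, target)
print(predict(W1, W2, x0))
```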
Great work, thanks a lot for sharing! Was going through Millidge, Friston and Bogacz's works and was getting a bit confused on some concepts. Having you go through it slowly definitely put some doubts to rest! Seems like a very interesting field!
Thank you sir
I believe this version of PC does not consider continuously evolving dynamic states, which is the real-life situation. I'm talking about state-space models for analyzing temporal data, where the states and their parameters keep changing. What you have here reminds me of the work done by the Oxford group, which has simplified PC and provided a solution for the Machine Learning community. Still, this is very interesting🙂
Yes, this formulation is based on convergence to equilibrium. But I’m not sure what other formulation there is.
@JeffOrchard I wonder if we could have a Zoom call where I can discuss some of my ideas and questions regarding PC networks? The resources on the web are really limited and the papers are not explanatory enough (even the original paper).
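For anyone else puzzled by the "convergence to equilibrium" point in this thread, a compact way to state it (my paraphrase of the lecture's setup, not a quote) is that the network minimizes a single Hopfield-style energy over states and weights,

$$E = \tfrac{1}{2}\sum_{\ell} \lVert \varepsilon_\ell \rVert^2, \qquad \varepsilon_\ell = x_\ell - W_{\ell-1}\, f(x_{\ell-1}),$$

with the state nodes relaxing by $\dot{x}_\ell = -\partial E/\partial x_\ell$ until equilibrium, and the weights then updated by $\Delta W_{\ell-1} \propto -\partial E/\partial W_{\ell-1} = \varepsilon_\ell\, f(x_{\ell-1})^{\mathsf T}$.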
You wrote “predictions sent up”. Aren't predictions sent down, and errors sent up?
Yes, in more recent implementations the network is flipped so that predictions are sent down the network, which is more in line with the cognitive theory of predictive coding.
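For readers comparing papers, the difference amounts to which layer each weight matrix predicts (my own shorthand, not the video's notation):

$$\text{predictions sent up (this lecture):}\quad \varepsilon_{\ell+1} = x_{\ell+1} - W_\ell\, f(x_\ell)$$
$$\text{predictions sent down (flipped):}\quad \varepsilon_{\ell} = x_{\ell} - V_\ell\, f(x_{\ell+1})$$

where $V_\ell$ carries the top-down prediction. In both conventions the error signal travels in the opposite direction to the predictions, via the transposed weights.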