what a legend!
Thanks! I don't attend this institution, but this was an extremely clear lecture :)
Thanks for explaining that! Great job. Subscribed!
Do you train the pixel CNN on the same data and just not update the VAE weights while training?
Yes, the vector quantization is held constant as the pixel CNN is trained.
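A minimal sketch of the two-stage setup being described, assuming the standard VQ-VAE nearest-neighbour lookup (the codebook size, dimensions, and variable names here are illustrative, not from the lecture):

```python
import numpy as np

# Hypothetical sketch: encoder outputs z_e are snapped to their nearest
# codebook vector, and the resulting discrete indices (the "symbols")
# are the data the pixel CNN is trained to model. The codebook itself
# is frozen during that second training stage.

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))   # K=8 symbols, D=4 dims (held constant)
z_e = rng.normal(size=(5, 4))        # encoder outputs at 5 spatial positions

# nearest-neighbour lookup: argmin over squared distances to each code
dists = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
indices = dists.argmin(axis=1)       # discrete symbols the pixel CNN sees
z_q = codebook[indices]              # quantized vectors fed to the decoder
```

So the pixel CNN only ever sees `indices`; gradients never touch `codebook` in this stage.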
Also, what if you skip the quantization during inference? Would you still get images that make sense?
Do you mean "during generation"? During generation you can't skip the quantization because the pixel-CNN is defined to generate the quantized vectors (the symbols).
@@davidmcallester4973 I guess that during generation it wouldn't make much sense; I was thinking more in the direction of interpolating smoothly between two different symbols.
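A sketch of that interpolation idea: blend two codebook vectors in the continuous embedding space and hand the blend straight to the decoder, sidestepping the quantization step (the codebook shape and the stand-in `decoder` are illustrative assumptions):

```python
import numpy as np

# Hypothetical sketch: intermediate blends of two symbols lie off the
# codebook, but the decoder accepts any continuous vector, so it can
# be evaluated on them anyway.

rng = np.random.default_rng(1)
codebook = rng.normal(size=(8, 4))   # K=8 symbols, D=4 dims
a, b = codebook[2], codebook[5]      # two different symbols

def decoder(z):
    # stand-in for the VQ-VAE decoder network
    return z

# sweep between the two symbols; 0 < alpha < 1 gives off-codebook vectors
blends = [decoder((1 - alpha) * a + alpha * b)
          for alpha in (0.0, 0.25, 0.5, 0.75, 1.0)]
```

Whether the decoded images "make sense" for intermediate alphas is an empirical question, since the decoder was only ever trained on exact codebook vectors.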
Thank you for this video; however, please don't call your variable ŝ 😆 (or at least don't say it out loud)