This is the most concise, simplest and best explanation of latent space that I've found so far!
Thanks a lot! Glad you enjoyed it! 🙏🏽
Excellent clarification of multiple concepts in one pass. It helped me relate encoders to the latent space through a much more accessible metaphor. Thank you.
10 min felt like 30 min, I rewound so many times during the vid. The video is so full of info, thanks a lot.
Thanks Neural Breakdown with AVB! I always wanted to know how "latent space arithmetic" works.
Great job! I have used your videos to help several recent grads, who had some gaps in understanding. You are a very good teacher.
Awesome! Thank you!
This video is clear and concise, amazing work!
Great explanations! Getting a better understanding of how some parts of Stable Diffusion work without any effort :)
This video is super cool. It's great to see these concepts visualized.
Thanks!😊
This video is so fascinating. Amazing work.
Thanks! Glad you enjoyed it!
I really appreciate your lucid explanation. Superb. I'd like to request that you enhance the sound quality a bit. Good wishes, and thanks for such videos.
Thanks! I’ll keep that in mind going forward…
Amazing Explanation
Thanks a lot!
Truly Amazing
A great description of interpreting deep learning models. Well done!
Excellent video! Thanks for your work!
QQ: Is there a repo for the real-time image manipulation software you used as your demo?
Excellent video
This was an insanely good video and explanation, ty.
Thanks! Awesome to hear that! 😊
@avb_fj Thank you! I'd like to see one where you code the encoder and decoder. I'm coding an autoencoder, and it's a little tough trying to find a good balance of reduction while also keeping the important details.
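The bottleneck-size tradeoff this commenter describes can be sketched without any deep learning framework: an MLP trained to reproduce its own input is a simple autoencoder, and the hidden-layer width is the "reduction" knob. The random data below is just a stand-in for flattened images, not anything from the video.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy data standing in for flattened images (invented for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))

# An MLP trained to reproduce its own input, with one narrow hidden layer,
# acts as a simple autoencoder: hidden_layer_sizes=(k,) is the bottleneck.
errs = {}
for k in (2, 8, 16):
    ae = MLPRegressor(hidden_layer_sizes=(k,), max_iter=2000, random_state=0)
    ae.fit(X, X)  # target = input: learn to compress and reconstruct
    errs[k] = float(np.mean((ae.predict(X) - X) ** 2))

print(errs)  # reconstruction error generally drops as the bottleneck widens
```

Sweeping the bottleneck width and plotting reconstruction error like this is one practical way to pick the balance the commenter is after.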
Thanks for the great video! Very well explained!
This is a fantastic breakdown. Great pacing, wonderful examples with easy-to-follow metaphors. Fix your audio and keep 'em coming!
Great content!
Great video, thanks
Which tools did you use to be able to change each principal component and see its effect on output image?
If I remember correctly, I just used ipywidgets inside a jupyter notebook to do the UI and display. I also wrote the logic for the PCA (sklearn), encoding/decoding, and interpolating the latent vectors.
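The logic described in this reply (PCA over latent vectors, then sliding along a principal component) can be sketched roughly as follows. There is no real encoder here: random vectors stand in for the latents, and the decoder call is only indicated in a comment.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in latent vectors: in the demo these would come from an encoder
# run over an image dataset; here random data of a plausible shape suffices.
rng = np.random.default_rng(0)
latents = rng.normal(size=(500, 64))      # 500 samples, 64-d latent space

# Principal components give the directions a UI slider can move along.
pca = PCA(n_components=8).fit(latents)

def edit_latent(z, component, amount):
    """Move latent z along one principal component by `amount` std-devs."""
    direction = pca.components_[component]
    step = amount * np.sqrt(pca.explained_variance_[component])
    return z + step * direction           # decoder(z_edited) would render it

z = latents[0]
z_edited = edit_latent(z, component=0, amount=2.0)
print(z_edited.shape)                     # same shape as the original latent
```

In the actual demo, an ipywidgets slider would call `edit_latent` with its current value and display the decoded image on each change.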
Really great overview of topics. Subbed and liked each one I'm watching!
I hope you can fix the quiet audio!
Unfortunately YT doesn't allow increasing the volume after posting videos. I could re-upload it, but idk if it'd really be worth it. I'll just take this as a lesson for future videos, thanks for the comment!
Thank you sir! You cleared up the concept of latent space for me! And I can't wait to click on the multimodal video on this channel.
Oh man. I was a bit lost when you were saying encoder this, decoder that, but the smile example at 6:50 hit the nail on the head. It's indeed mindblowing. I'd love to know more about AI for outsiders, subscribed.
PS: A concept I picked up from Ezra is that AI turns semantics into geometry. So you can do king - man + woman and get queen! (paraphrasing). If you could expand on this and give more examples in different modalities... that'd be awesome.
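The king − man + woman ≈ queen arithmetic mentioned here can be sketched with toy vectors. The 3-d vectors below are invented purely for illustration; real word embeddings (e.g. word2vec or GloVe) are learned from data and have hundreds of dimensions.

```python
import numpy as np

# Toy word vectors, invented for illustration. Rough intuition for the axes:
# dim 0 ~ "royalty", dim 1 ~ "male", dim 2 ~ "female".
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

def nearest(target, vocab):
    """Return the word whose vector is most cosine-similar to `target`."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(vocab, key=lambda w: cos(vocab[w], target))

# "Semantics into geometry": king - man + woman lands nearest to queen.
result = nearest(vectors["king"] - vectors["man"] + vectors["woman"], vectors)
print(result)
```

The same idea carries over to other modalities: the smile edit in the video is the image-space analogue, adding a "smile" direction to a face's latent vector.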
Nice… glad you enjoyed it and stuck around for the whole thing. The semantic example is pretty awesome, yeah… I've brought it up on the channel in my History of NLP video, but more examples in different modalities seems like a nice idea for a video!
Great content!
Awesome content!
🙏🏽🙌🏼
wow
Can we do the same with the pixels to enhance the image?
Can you clarify what you meant by “doing the same with pixels”?
Just subscribed after your NeRF video, and this one is awesome too! You, Yannic, and Two Minute Papers are great at making AI content relatable and interesting and freaking cool :) What a time to be alive! lol
Wow that’s high praise! Those two are definitely an inspiration, so I’m kinda feeling surreal reading this! Thank you so much!! 🙌🏼🙌🏼
Great content! Just as an FYI, you might want to turn up the mic volume. It's easier to lower the volume than to turn it up from the consumer's POV.
Thanks for the feedback! Will keep it in mind for the next one…
Great video, but the audio level is way too low. Also, the video and audio are not in sync.