Thanks so much for the tutorial, I learned a lot from you! I have a question: what modifications should be made to fabric.setup(model, optimizer) if I use a learning rate scheduler?
And just some personal feedback on an awesome tutorial: it would be great if you could include a gentle reminder that running code on multiple GPUs often requires scripts rather than executing the code directly in a notebook. Sorry if I missed this already being mentioned in the tutorial.
Thanks, and great question! Since standard schedulers don't hold any parameters of their own, you can use them as usual (no need to pass them to fabric.setup). Using fabric.setup also doesn't hurt, though. I added a quick example here: github.com/rasbt/cvpr2023/blob/main/07_fabric-vit-mixed-fsdp-with-scheduler.py
@@nguyenhuuuc2311 Good point. Yeah, notebook (or interactive) environments are generally incompatible with multi-GPU training due to their multiprocessing limitations. Hah, I take it for granted these days but definitely a good time to mention that as a reminder!
Thank you so much, impressive presentation! Do you think it is worth learning Lightning? I am a PhD student and I am comfortable with PyTorch. Does Lightning have all the capabilities of PyTorch? I understand that Lightning is to PyTorch what Keras is to TensorFlow.
Good question @hamzawi2752. Fabric (covered in this video) is basically an add-on to PyTorch. It's useful for tapping into more advanced features like multi-GPU training, mixed-precision training, etc. with minimal code changes. It's essentially a wrapper around PyTorch features, and doing the same in pure PyTorch is definitely more work. So, I'd say it's worth it.
Love the straightforward video, didn't know about Fabric for quickly upgrading existing PyTorch code
Wow, great presentation.
Thanks :)
this is really awesome content.
Great tutorial. 🎉
Thank you! 😊
@@SebastianRaschka Thanks for spending time on my question and the quick answer with a notebook ❤