Nick Mancuso: Happy Scientist Workshop #22: How I learned to stop worrying and love autodiff
- Published: 8 Sep 2024
- Much of modern computational statistics relies on high-dimensional
likelihoods whose gradients are tedious to derive and easy to
implement incorrectly. Recent advances in automatic differentiation
(autodiff) have made it possible to optimize ultra-high-dimensional
objectives (e.g., in deep learning), yet autodiff has received less
attention in statistical settings. Here I'll showcase the utility of
JAX, a state-of-the-art autodiff library for Python. This workshop
will introduce the basics of autodiff, show how to leverage GPUs for
computation with essentially no code changes, and conclude with a
straightforward implementation of a Poisson regression model.
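The kind of example the abstract describes can be sketched as follows. This is not the workshop's actual code, just a minimal illustration under assumed synthetic data: `jax.grad` produces the gradient of the Poisson negative log-likelihood automatically, so no gradient is derived by hand, and the same code dispatches to a GPU when one is available.

```python
import jax
import jax.numpy as jnp

# Synthetic data (an assumption for illustration): counts y drawn
# with rates exp(X @ beta_true).
key_x, key_y = jax.random.split(jax.random.PRNGKey(0))
X = jax.random.normal(key_x, (500, 3))
beta_true = jnp.array([0.5, -0.25, 1.0])
y = jax.random.poisson(key_y, jnp.exp(X @ beta_true))

def neg_log_lik(beta):
    # Poisson negative log-likelihood up to an additive constant:
    # -mean(y * eta - exp(eta)), where eta = X @ beta is the log-rate.
    eta = X @ beta
    return -jnp.mean(y * eta - jnp.exp(eta))

# jax.grad derives the gradient by autodiff; jax.jit compiles it.
# JAX runs the compiled function on an accelerator automatically
# when one is present -- no code changes required.
grad_fn = jax.jit(jax.grad(neg_log_lik))

# Plain gradient descent on the (convex) objective.
beta = jnp.zeros(3)
for _ in range(500):
    beta = beta - 0.1 * grad_fn(beta)
```

In practice one would hand `neg_log_lik` to a proper optimizer (e.g. from `jax.scipy.optimize` or Optax) rather than a fixed-step loop, but the key point stands: only the likelihood is written by hand.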