AIxplained
  • 12 videos
  • 39,829 views
Mini-lecture on Differentiable Neural Architecture Search (DARTS)
Mini-lecture in which we discuss differentiable neural architecture search. More specifically, we explain DARTS (arxiv.org/abs/1806.09055) in quite some detail (although we did not cover everything; for example, we skipped the cell-based search method, in which cells are searched for and stacked in pre-defined ways rather than searching for the entire network architecture at once).
Liked the video? Share with others!
Any feedback, comments, or questions? Let me know in the comments section below!
Views: 2,414
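For readers who want a feel for the core trick before watching: DARTS relaxes the discrete choice between candidate operations on an edge into a softmax-weighted mixture, so the architecture parameters can be optimized by gradient descent. A toy numpy sketch (the operations and numbers here are illustrative, not from the paper):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Candidate operations on an edge (toy stand-ins for conv/pool/skip ops).
ops = [
    lambda x: x,                 # identity / skip connection
    lambda x: np.maximum(x, 0),  # ReLU-like op
    lambda x: 0.0 * x,           # "zero" op (no connection)
]

# Architecture parameters alpha: one logit per candidate op.
alpha = np.array([0.1, 1.5, -0.5])

def mixed_op(x, alpha):
    """DARTS-style continuous relaxation: weight each op by softmax(alpha)."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

x = np.array([-1.0, 2.0])
y = mixed_op(x, alpha)

# After search, each edge is discretized to its highest-weighted operation.
best_op = int(np.argmax(softmax(alpha)))
```

In the real method, `alpha` is trained on validation data while the operation weights are trained on training data; the discretization step at the end is what `best_op` mimics here.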

Videos

Lecture on Deep Meta-Learning (MAML, Matching network, Prototypical network)
1.9K views · 2 years ago
This lecture covers the field of deep meta-learning, the categorization of the different methods, and fundamental algorithms. If you want to learn more about deep meta-learning, be sure to check out our survey paper! (link.springer.com/article/10.1007/s10462-021-10004-4) Liked the video? Share it with others! Any questions, comments, or feedback? Let me know in the comments below!
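As a rough companion to the lecture, here is a toy sketch of the MAML objective: adapt a model with one inner gradient step per task, then improve the shared initialization on the post-adaptation query loss. The finite-difference meta-gradient below stands in for second-order backpropagation, and the 1-D linear tasks are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

def task_loss(w, a, xs):
    """Loss of the model f(x) = w * x on a task whose target slope is a."""
    return np.mean((w * xs - a * xs) ** 2)

def inner_update(w, a, xs, lr=0.1):
    """One gradient step on the task's support set (analytic gradient)."""
    grad = np.mean(2 * (w * xs - a * xs) * xs)
    return w - lr * grad

def meta_loss(w, tasks):
    """Average query loss after one inner step per task (the MAML objective)."""
    total = 0.0
    for a, xs_support, xs_query in tasks:
        w_adapted = inner_update(w, a, xs_support)
        total += task_loss(w_adapted, a, xs_query)
    return total / len(tasks)

# Sample a few toy tasks: each is a slope plus support/query inputs.
tasks = [(a, rng.normal(size=10), rng.normal(size=10))
         for a in rng.uniform(-2, 2, size=8)]

# Outer loop: finite-difference meta-gradient (a stand-in for the
# second-order gradients computed by autodiff in real MAML).
w, meta_lr, eps = 0.0, 0.05, 1e-5
for _ in range(200):
    g = (meta_loss(w + eps, tasks) - meta_loss(w - eps, tasks)) / (2 * eps)
    w -= meta_lr * g
```

The point of the outer loop is that `w` is judged not by its own loss but by the loss it reaches after adaptation, which is what distinguishes meta-learning from ordinary multi-task training.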
Stateless Neural Meta-Learning using Second-Order Gradients (ECMLPKDD 2022)
192 views · 2 years ago
This is the talk of the paper "Stateless Neural Meta-Learning using Second-Order Gradients" given at the ECMLPKDD 2022 conference in Grenoble, France. Link to the paper: link.springer.com/article/10.1007/s10994-022-06210-y If you liked the video, make sure to share it with others! Any questions, comments, or feedback? Let me know below!
Automated Machine Learning - Tree Parzen Estimator (TPE)
8K views · 2 years ago
In this video, we cover another Bayesian Optimization method to perform hyperparameter optimization: Tree Parzen Estimator. Original paper where TPE was proposed: proceedings.neurips.cc/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf The animation of TPE used in the slides was created by Alois Bissuel (medium.com/criteo-engineering/hyper-parameter-optimization-algorithms-2fe447525903...
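The key TPE idea, in toy form: split past trials into "good" and "bad" groups by a quantile of their scores, model each group with a density, and propose the candidate maximizing the ratio l(x)/g(x). A minimal numpy sketch with hand-rolled Gaussian kernel density estimates (the objective, quantile, and constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    """Toy objective to minimize (pretend it is a validation loss)."""
    return (x - 0.3) ** 2

def kde(points, x, bandwidth=0.1):
    """Gaussian kernel density estimate (up to a constant factor)."""
    z = (x[:, None] - points[None, :]) / bandwidth
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(points) * bandwidth)

# Start with a few random evaluations.
xs = list(rng.uniform(0, 1, size=10))
ys = [objective(x) for x in xs]

for _ in range(30):
    order = np.argsort(ys)
    n_good = max(2, int(0.25 * len(xs)))     # gamma = 0.25 split
    good = np.array(xs)[order[:n_good]]      # low-loss configs -> l(x)
    bad = np.array(xs)[order[n_good:]]       # the rest -> g(x)
    cands = rng.uniform(0, 1, size=64)
    ratio = kde(good, cands) / (kde(bad, cands) + 1e-12)
    x_next = cands[np.argmax(ratio)]         # maximizing l/g ~ maximizing EI
    xs.append(x_next)
    ys.append(objective(x_next))

best = xs[int(np.argmin(ys))]
```

The constant factors dropped from the KDE cancel in the ratio, which is why TPE only needs the two densities up to normalization.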
Automated Machine Learning: Sequential Model-Based Optimization (SMBO) and Bayesian Optimization
2.8K views · 2 years ago
In this video, we discuss a model-based approach to hyperparameter optimization: sequential model-based optimization, or SMBO in short. We also discuss Bayesian Optimization (BO) as an instantiation of this scheme. If you like the video, make sure to share it with others. Any questions, feedback, or comments? Let me know below!
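A minimal sketch of the SMBO loop described in the video: fit a surrogate to past evaluations, optimize an acquisition function over candidates, evaluate the winner, and repeat. The surrogate here is deliberately crude (nearest-neighbor prediction plus distance-based uncertainty) and stands in for the Gaussian process used in full Bayesian Optimization; everything is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    """Expensive black-box function we want to minimize (toy stand-in)."""
    return (x - 0.65) ** 2

# Initial design: a few random evaluations.
X = list(rng.uniform(0, 1, size=5))
Y = [objective(x) for x in X]

for _ in range(25):
    Xa, Ya = np.array(X), np.array(Y)
    cands = rng.uniform(0, 1, size=128)
    # Surrogate: predict with the value of the nearest observed point,
    # and use the distance to that point as a crude uncertainty estimate.
    d = np.abs(cands[:, None] - Xa[None, :])
    nearest = np.argmin(d, axis=1)
    mu = Ya[nearest]
    sigma = d[np.arange(len(cands)), nearest]
    # Acquisition: lower confidence bound trades off exploitation
    # (small predicted loss) against exploration (high uncertainty).
    lcb = mu - 2.0 * sigma
    x_next = cands[np.argmin(lcb)]
    X.append(x_next)
    Y.append(objective(x_next))

best_x = X[int(np.argmin(Y))]
```

Swapping in a probabilistic surrogate (e.g. a Gaussian process) and an acquisition such as expected improvement turns this generic SMBO skeleton into Bayesian Optimization proper.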
Automated Machine Learning - Successive Halving and Hyperband
3.5K views · 2 years ago
In this video, we take a look at Successive Halving, which is an extension of random search to make it more efficient, as well as Hyperband, which is an extension of Successive Halving. Both methods can be used for finding good hyperparameters (and algorithms) for a machine learning problem. Original Successive Halving paper: arxiv.org/pdf/1502.07943.pdf Original Hyperband paper: arxiv.org/pdf/...
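The mechanics of Successive Halving fit in a few lines: evaluate all configurations on a small budget, keep the top fraction, and repeat with a larger budget until one survivor remains. The synthetic learning curves below are illustrative stand-ins for real training runs:

```python
import numpy as np

rng = np.random.default_rng(3)

# Each "configuration" gets a learning curve: score improves with budget
# and plateaus at a config-specific ceiling (a stand-in for real training).
ceilings = rng.uniform(0.5, 1.0, size=27)

def evaluate(config, budget):
    """Validation score of a config after training with a given budget."""
    return ceilings[config] * (1 - np.exp(-budget / 10.0))

configs = list(range(27))
budget = 1
while len(configs) > 1:
    scores = {c: evaluate(c, budget) for c in configs}
    # Keep the best third, then triple the budget for the survivors.
    configs = sorted(configs, key=lambda c: scores[c], reverse=True)
    configs = configs[: max(1, len(configs) // 3)]
    budget *= 3

best = configs[0]
```

Hyperband adds an outer loop over different trade-offs between the number of starting configurations and the initial budget, hedging against curves that cross late.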
Automated Machine Learning: Grid Search and Random Search
6K views · 2 years ago
In this video, we look at two methods to perform hyperparameter optimization (finding the best hyperparameters for a learning algorithm to maximize the performance), namely grid search and random search. If you liked the video, make sure to share it with others! Any comments, feedback, or questions? Let me know through the comments section!
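The two methods side by side, as a toy sketch (the hyperparameters and loss surface are made up for illustration): grid search enumerates a Cartesian product of hand-picked values, while random search samples the space, giving each individual hyperparameter more distinct values under the same budget:

```python
import numpy as np

rng = np.random.default_rng(4)

def val_loss(lr, reg):
    """Toy validation loss over two hyperparameters (lower is better)."""
    return (np.log10(lr) + 2) ** 2 + 0.1 * (reg - 0.3) ** 2

# Grid search: Cartesian product of a few values per hyperparameter.
lrs = [1e-4, 1e-3, 1e-2, 1e-1]
regs = [0.0, 0.25, 0.5, 0.75]
grid = [(lr, reg) for lr in lrs for reg in regs]
best_grid = min(grid, key=lambda p: val_loss(*p))

# Random search: same budget (16 trials), sampled from the search space
# (log-uniform for the learning rate, uniform for the regularization).
random_trials = [(10 ** rng.uniform(-4, -1), rng.uniform(0, 1))
                 for _ in range(len(grid))]
best_random = min(random_trials, key=lambda p: val_loss(*p))
```

With 16 grid trials each hyperparameter only ever takes 4 distinct values, whereas the 16 random trials try 16 distinct values per dimension, which is why random search tends to win when only a few hyperparameters really matter.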
Automated Machine Learning: Combined Algorithm Selection and Hyperparameter Optimization (CASH)
617 views · 2 years ago
In this video, we cover the problem of finding the best algorithm and hyperparameter configuration, or CASH in short. In addition, we discuss the relationship to hyperparameter optimization (HPO). If you like the video, make sure to share it with other people! Any feedback, questions, or comments? Let me know through the comments section!
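CASH in toy form: the search space is the union of per-algorithm hyperparameter spaces, and the optimizer (plain random search here, for illustration) jointly picks an algorithm and a configuration for it. The algorithms and their error surfaces below are made up:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy search space: each algorithm has its own hyperparameter range and
# its own hidden response surface standing in for validation error.
space = {
    "knn":  {"param": ("k", 1, 50),
             "error": lambda k: 0.30 + 0.002 * (k - 7) ** 2},
    "tree": {"param": ("depth", 1, 20),
             "error": lambda d: 0.25 + 0.004 * (d - 6) ** 2},
}

best = (None, None, np.inf)
for _ in range(200):
    algo = rng.choice(list(space))            # pick an algorithm...
    name, lo, hi = space[algo]["param"]
    value = int(rng.integers(lo, hi + 1))     # ...and a config for it
    err = space[algo]["error"](value)
    if err < best[2]:
        best = (algo, value, err)
```

This is exactly HPO over an enlarged, conditional space: the "which algorithm" choice is just one more (categorical) hyperparameter whose value determines which other hyperparameters are active.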
Automated Machine Learning: what is it, and why is it important?
828 views · 2 years ago
In this video, I give an introduction to the field of Automated Machine Learning and describe what it is and why it is important. Did you like the video? Please share with other people! Any feedback, questions, or thoughts? Let me know in the comments section!
The Feature Representations of Transfer Learning and Gradient-Based Meta-Learning
223 views · 3 years ago
Our workshop paper presentation for the Meta-learning NeurIPS 2021 workshop. We investigate what transfer learning and gradient-based meta-learning techniques actually do under the hood. How do they facilitate faster learning of new tasks? Liked the video? Share it with others!
Meta-Learning - Master Thesis Presentation
1.4K views · 3 years ago
Master Thesis Presentation of Mike Huisman about Meta-Learning. Liked the video? Share it with others! Abstract: A key characteristic of human intelligence is the ability to learn new tasks quickly. While deep learning techniques have demonstrated super-human level performance on some tasks, their ability to learn quickly is limited. Meta-learning has been proposed as one approach to bridge thi...
Meta-Learning for Neural Networks: what is it?
12K views · 4 years ago
The field of Artificial Intelligence is moving at great velocity. Despite the fact that we can now create (deep) neural networks that learn to perform some tasks just as well as humans, they learn much slower than humans. Meta-learning is an approach to deal with this issue. In this video, we give a brief introduction to this strategy. Liked the video? Share it with others!