DEIB PhD Student in Information Technology
This event will be held online through Microsoft Teams
September 2nd, 2021
11:00 am / 12:00 pm
On September 2nd, 2021 at 11:00 am Luca Sabbioni, PhD Student in Information Technology, will hold the online seminar titled "Hyperparameter Tuning for Deep Learning: from grid search to online optimization".
Since their development, deep neural networks have made significant contributions to everyday life. Despite this success, however, designing and training deep neural networks remain challenging and unpredictable procedures, sometimes likened to alchemy. Consequently, automated hyperparameter optimization (HPO) has become a popular topic in both academia and industry.
Conventional hyperparameter optimization methods are computationally intensive and hard to generalize to scenarios that require dynamically adapting hyperparameters, such as lifelong learning. After an overview of the most common tuning techniques, the seminar presents an analogy between hyperparameter optimization and parameter learning in recurrent neural networks. Taking advantage of this analogy, it is possible to develop an online hyperparameter optimization algorithm that tunes hyperparameters and network parameters simultaneously, yielding systematically better generalization performance than standard methods.
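The contrast between offline tuning and online adaptation can be sketched on a toy one-dimensional quadratic loss. The sketch below is illustrative only and is not the algorithm presented in the seminar: the hypergradient-style learning-rate update, the toy loss, and the names `grid_search`, `fixed_lr`, and `online_lr` are all assumptions made for this example.

```python
def grad(w):
    # Gradient of the toy loss L(w) = (w - 3)^2, minimized at w = 3
    return 2.0 * (w - 3.0)

def fixed_lr(w0, lr, steps):
    # Plain gradient descent with a fixed learning rate
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def grid_search(lrs, steps):
    # Classic offline tuning: try each candidate learning rate
    # and keep the one with the lowest final loss
    return min(lrs, key=lambda lr: (fixed_lr(0.0, lr, steps) - 3.0) ** 2)

def online_lr(w0, lr, hyper_lr, steps):
    # Online tuning (hypergradient-style, an illustrative assumption):
    # the learning rate is adapted at every step using the product of
    # consecutive gradients, so hyperparameter and parameter updates
    # happen simultaneously
    w, g_prev = w0, 0.0
    for _ in range(steps):
        g = grad(w)
        lr += hyper_lr * g * g_prev  # grow lr while gradients agree
        w -= lr * g
        g_prev = g
    return w, lr
```

The offline variant must rerun training once per candidate value, while the online variant adjusts the learning rate during a single run, which is the property that matters in dynamically changing settings such as lifelong learning.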
To participate in the online event, please use this link