DPG Verhandlungen

Freiburg 2019 – Scientific Program


FM: Fall Meeting

FM 55: Quantum & Information Science: Neural Networks, Machine Learning, and Artificial Intelligence II

FM 55.2: Talk

Wednesday, 25 September 2019, 14:30–14:45, 1098

Training Deep Neural Networks by optimizing over paths in hyperparameter space
Vlad Pushkarov¹, Jonathan Efroni¹, Mykola Maksymenko², and •Maciej Koch-Janusz³
¹Technion, Haifa, Israel; ²SoftServe Inc., Lviv, Ukraine; ³ETH Zurich, Switzerland

Hyperparameter optimization is both a practical issue and an interesting theoretical problem in the training of deep architectures. Despite many recent advances, the most commonly used methods almost universally involve training multiple, decoupled copies of the model, in effect sampling the hyperparameter space. We show that, at negligible additional computational cost, results can be improved by sampling paths instead of points in hyperparameter space. To this end we interpret hyperparameters as controlling the level of correlated noise in the training, which can be mapped to an effective temperature. The usually independent instances of the model are then coupled and allowed to exchange their hyperparameters throughout training using the well-established parallel tempering technique of statistical physics. Each simulation then corresponds to a unique path, or history, in the joint hyperparameter/model-parameter space. We provide empirical tests of our method, in particular for dropout and learning rate optimization. We observe faster training and improved resistance to overfitting, and show a systematic decrease in the absolute validation error, improving over benchmark results.
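The abstract describes the method only at a high level, so the following Python sketch is an illustration rather than the authors' implementation: four replicas of a toy network are coupled through their dropout rates, and after each training epoch neighbouring replicas attempt Metropolis swaps of their hyperparameters, with validation loss playing the role of energy and an assumed inverse-temperature mapping beta = 1/(p + 1e-3). The network, the data, and that mapping are hypothetical placeholders.

import math
import random

import torch
import torch.nn as nn

torch.manual_seed(0)
random.seed(0)

def make_model(dropout_p):
    # Small MLP whose only tunable hyperparameter here is the dropout rate.
    return nn.Sequential(
        nn.Linear(20, 64), nn.ReLU(), nn.Dropout(dropout_p),
        nn.Linear(64, 2),
    )

# Toy data as a stand-in for a real training/validation split.
X_train, y_train = torch.randn(512, 20), torch.randint(0, 2, (512,))
X_val, y_val = torch.randn(128, 20), torch.randint(0, 2, (128,))

# Ladder of dropout rates, each mapped to an assumed inverse temperature.
dropouts = [0.1, 0.3, 0.5, 0.7]
betas = [1.0 / (p + 1e-3) for p in dropouts]

models = [make_model(p) for p in dropouts]
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-3) for m in models]
loss_fn = nn.CrossEntropyLoss()

def validation_loss(model):
    # Validation loss plays the role of the "energy" in the swap rule.
    model.eval()
    with torch.no_grad():
        return loss_fn(model(X_val), y_val).item()

def set_dropout(model, p):
    for layer in model:
        if isinstance(layer, nn.Dropout):
            layer.p = p

for epoch in range(20):
    # One gradient step per replica at its current hyperparameter value.
    for model, opt in zip(models, optimizers):
        model.train()
        opt.zero_grad()
        loss_fn(model(X_train), y_train).backward()
        opt.step()

    # Parallel-tempering step: propose hyperparameter swaps between
    # neighbouring replicas, accepted with the Metropolis probability
    # min(1, exp((beta_i - beta_j) * (E_i - E_j))).
    energies = [validation_loss(m) for m in models]
    for i in range(len(models) - 1):
        j = i + 1
        delta = (betas[i] - betas[j]) * (energies[i] - energies[j])
        if delta >= 0 or random.random() < math.exp(delta):
            # Exchange the hyperparameters (not the weights), so each
            # replica traces a path through hyperparameter space.
            dropouts[i], dropouts[j] = dropouts[j], dropouts[i]
            betas[i], betas[j] = betas[j], betas[i]
            set_dropout(models[i], dropouts[i])
            set_dropout(models[j], dropouts[j])

Because the replicas exchange hyperparameters rather than weights, each model's weights evolve continuously while its dropout trace forms a path through hyperparameter space; a learning-rate ladder could be exchanged in the same way by swapping the optimizers' lr settings instead of the dropout rates.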
