OmniOpt: A Tool for Hyperparameter Optimization

The architecture, performance, and accuracy of numerical algorithms are governed by their hyperparameters. Hyperparameter optimization is therefore a crucial task in numerous applications, ranging from neural networks and other machine learning methods to classical simulations. Within ScaDS.AI, the versatile and user-friendly hyperparameter optimization tool OmniOpt has been developed. Its main features are:

  • OmniOpt is applicable to a broad class of problems: virtually every application that runs on a Linux system can be optimized with OmniOpt, including classical simulations, neural networks, and other machine learning methods.
  • OmniOpt takes advantage of the vast resources of the High Performance Computing (HPC) system at TU Dresden, giving the user access to more than 40,000 CPUs and several hundred GPUs.
  • The user is free to choose the programming language, the number and types of hyperparameters, their bounds, and the type of objective function to be optimized.
  • OmniOpt is robust: it checks and installs all dependencies automatically and fixes many problems in the background.

by Peter Winkler & Norman Koch

How does OmniOpt work?

  • The use of OmniOpt is free for all users with an account on the HPC system of TU Dresden. For information on getting an account, please contact us.
  • A user-friendly GUI is provided to enter the control parameters: the computational resources (number of GPUs or CPUs to be used), the number of hyperparameters and their bounds, the number of evaluations to be performed, etc.
  • Users provide their application as a black-box program, written in any programming language that runs on Linux, in which the chosen objective function is calculated. This program has to read the hyperparameters as command-line arguments and print the value of the objective function to standard output (a minimal sketch follows this list).
  • The hyperparameters are determined by a stochastic Bayesian optimization algorithm from the Hyperopt project on GitHub (Bergstra et al., 2013). It performs a multitude of evaluations of the objective function, trading off exploration and exploitation of the optimization space (see the Hyperopt sketch below).
  • The results are twofold: first, the user obtains the full numerical results of the optimization (all evaluated hyperparameter sets with the corresponding objective function values); second, automatic graphical evaluation tools are available, namely 2D slices of the optimization space rendered as color maps for each pair of hyperparameters, as well as a parallel plot of the results (the color-map sketch below illustrates the idea).
  • The parallelization and distribution of the calculations across the HPC resources are handled automatically according to the resources requested by the user.
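To make the black-box contract concrete, below is a minimal sketch of such a program in Python. The parameter names (--lr, --batch_size) and the toy objective are illustrative assumptions standing in for a real training run; the exact output convention OmniOpt expects is described in its documentation.

    #!/usr/bin/env python3
    # Minimal black-box objective: read hyperparameters from the
    # command line and print a single scalar to standard output.
    # Parameter names and the toy objective are illustrative only.
    import argparse

    def main():
        parser = argparse.ArgumentParser()
        parser.add_argument("--lr", type=float, required=True)
        parser.add_argument("--batch_size", type=int, required=True)
        args = parser.parse_args()

        # Stand-in for a real training run: any computation that maps
        # the hyperparameters to one scalar value to be minimized.
        loss = (args.lr - 0.01) ** 2 + ((args.batch_size - 64) / 100) ** 2

        # OmniOpt reads the objective value from standard output.
        print(loss)

    if __name__ == "__main__":
        main()

Called as ./objective.py --lr 0.005 --batch_size 128, the script prints a single number, which is all the optimizer needs to see.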
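For readers curious about what happens underneath, the following sketch shows how Hyperopt's fmin drives such a search, independently of OmniOpt's wrapper around it. The search space and the analytic objective are again illustrative assumptions, not OmniOpt defaults.

    from hyperopt import fmin, tpe, hp, Trials

    # Illustrative search space: one continuous and one quantized parameter.
    space = {
        "lr": hp.loguniform("lr", -7, 0),                    # approx. [0.001, 1.0]
        "batch_size": hp.quniform("batch_size", 16, 256, 16),
    }

    def objective(params):
        # In OmniOpt this call would launch the user's black-box program;
        # a cheap analytic stand-in keeps the sketch self-contained.
        return (params["lr"] - 0.01) ** 2 + ((params["batch_size"] - 64) / 100) ** 2

    trials = Trials()
    best = fmin(
        fn=objective,
        space=space,
        algo=tpe.suggest,   # Tree-structured Parzen Estimator: Hyperopt's
                            # model-based exploration/exploitation strategy
        max_evals=100,      # budget of objective evaluations
        trials=trials,      # records every evaluated hyperparameter set
    )
    print(best)             # best hyperparameters found within the budget

The trials object retains every evaluated configuration together with its objective value, which is exactly the kind of raw numerical result OmniOpt hands back to the user.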
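The color-map slices mentioned above can be pictured as follows; this is a minimal sketch of the idea using matplotlib on the same toy objective, not OmniOpt's actual plotting code.

    import numpy as np
    import matplotlib.pyplot as plt

    # Illustrative 2D slice: evaluate the toy objective on a grid over
    # two hyperparameters (all remaining ones would be held fixed).
    lr = np.linspace(0.001, 1.0, 200)
    bs = np.arange(16, 257, 16)
    LR, BS = np.meshgrid(lr, bs)
    Z = (LR - 0.01) ** 2 + ((BS - 64) / 100) ** 2

    plt.pcolormesh(LR, BS, Z, shading="auto")
    plt.colorbar(label="objective")
    plt.xlabel("lr")
    plt.ylabel("batch_size")
    plt.title("2D slice of the optimization space")
    plt.savefig("slice_lr_batch_size.png")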

Do you have more questions?

Contact us or take part in our training Hyperparameter Optimization with OmniOpt!

References

Bergstra, J., Yamins, D., Cox, D. D. (2013). Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. Proc. of the 30th International Conference on Machine Learning (ICML 2013), June 2013, pp. I-115 to I-123.
https://github.com/hyperopt/hyperopt
