
Bayesian Optimization for Conditional Hyperparameter Spaces


Julien-Charles Lévesque, Audrey Durand, Christian Gagné and Robert Sabourin


Abstract - Hyperparameter optimization is now widely applied to tune the hyperparameters of learning algorithms. Hyperparameter spaces can be structured, with some hyperparameters active only under certain conditions, or depending on the values of other hyperparameters. We target the problem of combined algorithm selection and hyperparameter optimization, which includes at least one conditional hyperparameter: the choice of the learning algorithm. In this work, we show that Bayesian optimization with Gaussian processes can be applied to conditional spaces by injecting knowledge of the conditions into the kernel. We propose and examine the behavior of two kernels: a conditional kernel, which forces the similarity of two samples from different condition branches to be zero, and the Laplace kernel, motivated by similarities with Mondrian processes and random forests. We show the benefit of using such kernels, as well as proper imputation of inactive hyperparameters, on a benchmark of scikit-learn models.
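The two kernels described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the choice of an RBF base kernel inside each branch, and the toy data are all assumptions for the example.

```python
import numpy as np

def rbf_kernel(x, y, length_scale=1.0):
    # Squared-exponential similarity between two hyperparameter vectors.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * length_scale ** 2))

def laplace_kernel(x, y, length_scale=1.0):
    # L1-distance kernel, related to Mondrian-process / random-forest similarities.
    return np.exp(-np.sum(np.abs(x - y)) / length_scale)

def conditional_gram(X, branches, length_scale=1.0):
    """Gram matrix in which samples from different condition branches
    (e.g. different learning algorithms) get zero similarity.

    X        : (n, d) array of hyperparameter vectors
    branches : length-n sequence of branch labels (the conditional choice)
    """
    n = len(X)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if branches[i] == branches[j]:
                K[i, j] = rbf_kernel(X[i], X[j], length_scale)
    return K

# Toy example: two samples tuning one algorithm ("svm"), one tuning another ("rf").
X = np.array([[0.1, 0.2], [0.15, 0.25], [0.9, 0.8]])
branches = ["svm", "svm", "rf"]
K = conditional_gram(X, branches)
```

The resulting Gram matrix is block-diagonal up to a permutation: cross-branch entries are exactly zero, so observations from one algorithm do not influence the Gaussian process posterior over another algorithm's hyperparameters.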


Bibtex:

@inproceedings{Levesque1148,
    author    = { Julien-Charles Lévesque and Audrey Durand and Christian Gagné and Robert Sabourin },
    title     = { Bayesian Optimization for Conditional Hyperparameter Spaces },
    booktitle = { Proc. of the International Joint Conference on Neural Networks (IJCNN) },
    publisher = { IEEE },
    year      = { 2017 },
    month     = { 05 },
    location  = { Anchorage, AK },
    url       = { https://doi.org/10.1109/IJCNN.2017.7965867 }
}

Last modified: 2017/02/21 by cgagne

©2002-. Laboratoire de Vision et Systèmes Numériques. All rights reserved.