Publications

This page is home to my research publications, each linked to its accompanying preprints, posters, and/or news pieces.

Written by Nadhir Hassen

Approximate Bayesian Optimisation for Neural Networks

A novel Bayesian optimization method based on a linearized link function that accounts for under-represented classes using a GP surrogate model. The method builds on Laplace's method and Gauss-Newton approximations to the Hessian. It can improve generalization, is useful when validation data is unavailable (e.g., in nonstationary settings), and can handle heteroscedastic behaviour. Our experiments demonstrate that our Gauss-Newton-based BO approach competes favorably with state-of-the-art black-box optimization algorithms.
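For orientation, the sketch below shows the generic Bayesian-optimization loop this work plugs into: a surrogate model fit to past evaluations and an acquisition function that proposes the next point. It is not the paper's Laplace/Gauss-Newton construction; it uses an off-the-shelf GP surrogate (scikit-learn) and expected improvement purely as an illustrative stand-in.

```python
# Minimal, generic Bayesian-optimization loop (GP surrogate + expected improvement).
# This is NOT the paper's method; it only illustrates the surrogate/acquisition structure.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def expected_improvement(mu, sigma, best, xi=0.01):
    """EI for minimization: expected improvement over the best value seen so far."""
    sigma = np.maximum(sigma, 1e-12)
    gamma = (best - mu - xi) / sigma
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))


def bayesian_optimize(objective, bounds, n_init=5, n_iter=25, seed=0):
    rng = np.random.default_rng(seed)
    dim = bounds.shape[0]
    # Initial random design.
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, dim))
    y = np.array([objective(x) for x in X])

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)
        # Maximize EI over a cheap random candidate set.
        cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2048, dim))
        mu, sigma = gp.predict(cand, return_std=True)
        x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    return X[np.argmin(y)], y.min()


if __name__ == "__main__":
    # Toy black-box objective on [-1, 1]^2.
    f = lambda x: np.sum((x - 0.3) ** 2) + 0.1 * np.sin(5 * x[0])
    bounds = np.array([[-1.0, 1.0], [-1.0, 1.0]])
    x_best, y_best = bayesian_optimize(f, bounds)
    print("best x:", x_best, "best f:", y_best)
```

In the paper's setting, the plain GP fit above is where the linearized, Laplace/Gauss-Newton-based surrogate would enter; the surrounding loop stays the same.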

Oríon: Asynchronous Distributed Hyperparameter Optimization

Oríon is a black-box function optimization library with a key focus on usability and integrability for its users. As a researcher, you can integrate Oríon into your current workflow to tune your models, and you can also use Oríon to develop new optimization algorithms and benchmark them against other algorithms under the same context and conditions.

By Xavier Bouthillier, Nadhir Hassen, Christos Tsirigotis, François Corneau-Tremblay, Thomas Schweizer, Pierre Delaunay, Mirko Bronzi, Lin Dong, Christopher Beckham, et al., in the Journal of Machine Learning Research

December 10, 2020
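As a quick illustration of how Oríon hooks into an existing training script, here is a minimal integration sketch based on Oríon's documented workflow: hyperparameters arrive as ordinary command-line arguments, and the script reports a single objective via orion.client.report_objective. The script name, hyperparameters, and the toy "validation error" are placeholders, not part of the paper; check the Oríon documentation for exact signatures.

```python
# train.py -- hypothetical training script wired up for Oríon.
import argparse

from orion.client import report_objective  # Oríon's helper for reporting a trial's objective


def train_and_validate(lr, momentum):
    # Placeholder for a real training loop; returns a fake validation error.
    return (lr - 0.01) ** 2 + (momentum - 0.9) ** 2


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, required=True)
    parser.add_argument("--momentum", type=float, default=0.9)
    args = parser.parse_args()

    valid_error = train_and_validate(args.lr, args.momentum)
    # Oríon records this value as the trial's objective (lower is better).
    report_objective(valid_error)
```

A search over this script would then be launched with something like `orion hunt -n demo python train.py --lr~'loguniform(1e-5, 1.0)'`, letting Oríon sample the arguments marked with `~` from the declared priors.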