BayesOpt 2017

NIPS Workshop on Bayesian Optimization
December 9, 2017
Long Beach, USA


Workshop schedule


Saturday 9 December, 2017

9:00 AM Introduction and opening remarks
9:10 AM Invited talk 1: Roman Garnett (slides)
9:40 AM Contributed talk 1: Learning Optimal Interventions
9:55 AM Contributed talk 2: Efficient nonmyopic active search (slides)
10:10 AM Poster spotlights 1
10:30 AM Coffee Break
11:00 AM Invited talk 2: Katharina Eggensperger (slides)
11:30 AM Poster spotlights 2
11:45 AM Poster session 1
12:00 PM Lunch Break
2:00 PM Invited talk 3: Alex Wiltschko
2:30 PM Poster session 2
3:00 PM Coffee Break
3:30 PM Poster session 3
4:00 PM Invited talk 4: Marc Toussaint (slides)
4:30 PM Invited talk 5: Joshua Knowles (slides)
5:00 PM Panel discussion — Black-box Optimization & Beyond
6:00 PM End

Poster spotlights

Spotlight 1 at 10:10 AM will present the papers with IDs 1, 5, 13, 17, 19, 23, 24, 26, 2, 3, 10, 14 (in this order). Spotlight 2 at 11:30 AM will present the papers with IDs 4, 7, 11, 12, 15, 16, 18, 20, 21, 28, 29, 30, 32 (in this order).

Invited speakers

Joshua Knowles (University of Birmingham)

Title: Multiobjective Bayesian Optimization

Abstract: Bayesian optimization solves the problem of selecting where to place the next search sample(s) when we are under severe budget limitations. It trades off computational cost against the cost of sampling. When we turn to multiobjective algorithms, further issues arise: How should we set up the Gaussian process surrogate(s), and how should we adapt the acquisition functions? Advanced algorithms must address still harder questions: how do we estimate where the true Pareto front lies, even before we reach it; how do we account for constraints; how do we make use of prior knowledge; how do we handle dynamic constraints that arise when our samples come, for example, from factory or laboratory processes that cannot be stopped and changed at will; and how should we deal with various types of uncertainty? I will offer only partial solutions, and point to others I know working on these challenges, which I hope will lead to fruitful discussion. I will close with some remarks on benchmarking and what we desperately need in that area.
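
The framing above, choosing where to place the next sample via a surrogate and an acquisition function, can be sketched in a few lines. The following single-objective illustration is not from the talk: it assumes a GP surrogate with a fixed squared-exponential kernel, the expected-improvement acquisition, and the standard Forrester test function standing in for an expensive black box. All kernel and loop settings are arbitrary choices for illustration.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf_kernel(a, b, lengthscale=0.15):
    # Squared-exponential kernel between two 1-D arrays of inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # GP posterior mean and variance at test points Xs, given data (X, y).
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    # Prior variance of the RBF kernel is 1, minus the explained part.
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    # EI for minimisation: expected amount by which a sample beats `best`.
    sigma = np.sqrt(var)
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.array([erf(v / sqrt(2)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)
    return sigma * (z * cdf + pdf)

def forrester(x):
    # Standard 1-D test function; global minimum near x = 0.757.
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.array([0.1, 0.9])           # two initial design points
y = forrester(X)
grid = np.linspace(0.0, 1.0, 201)  # candidate sample locations
for _ in range(8):                 # eight "expensive" evaluations
    mu, var = gp_posterior(X, y, grid)
    ei = expected_improvement(mu, var, y.min())
    x_next = grid[np.argmax(ei)]   # place the next sample where EI peaks
    X = np.append(X, x_next)
    y = np.append(y, forrester(x_next))

print(f"best x = {X[np.argmin(y)]:.3f}, best f = {y.min():.3f}")
```

The multiobjective questions in the abstract begin exactly where this sketch ends: with several objectives there is no single `best` incumbent, so the acquisition must instead score improvement over an estimated Pareto front.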


Marc Toussaint (University of Stuttgart)

Title: Bayesian Optimization: Applications in Robotics and Better Hyperparameters

Abstract: Bayesian optimization has become a standard tool for global black-box optimization, in particular in robotics. However, some aspects remain unsatisfying, especially in practice: naive kernel choices (homogeneous squared exponential) tend to lead to grid-like coverings of the search space, and online adaptation of hyperparameters is intricate and, in my view, not sufficiently investigated. I will first report on some of our applications of BayesOpt to (safe) policy search and controller tuning. I will then discuss new work that proposes a novel non-homogeneous covariance function (which can pick up local polynomial models) and an online hyperparameter adaptation scheme that integrates ideas from classical optimization and shows very promising performance.


Roman Garnett (Washington University in St. Louis)

Title: Active Learning of Hyperparameters for Gaussian Processes

Abstract: We propose an active learning method for learning the hyperparameters of a Gaussian process model. Although the technique is general, we will focus on a motivating example: automatically discovering latent low-dimensional structure in nominally high-dimensional functions. High-dimensional settings are increasingly frequent and important, but still present severe practical difficulties. Our technique may be used to perform adaptive initialization for tasks such as GP regression or Bayesian optimization.


Alex Wiltschko

Title: Bayesian Optimization in the Wild

Abstract: The performance of complicated engineering systems depends critically on careful configuration. However, black-box optimization, let alone Bayesian optimization, has not yet penetrated large-scale software engineering disciplines. I will present a few successful case studies applying BayesOpt to infrastructure problems in the modern data center, and highlight the organizational and communication work required to use BayesOpt with teams that have little or no machine learning background.

Katharina Eggensperger (University of Freiburg)

Title: Benchmarking Beyond Branin

Abstract: Bayesian optimization has evolved to deal with problems beyond typical black-box optimization. Recent approaches handle high-dimensional problems, exploit cheap approximations of very costly problems, and optimize across a set of tasks. Tracking progress in this constantly evolving area of research requires thorough empirical evaluations. In this talk I will give an overview of ways to assess the performance of our methods, including a new class of surrogate benchmark problems that enable efficient and realistic comparisons.