Guest lecture in APML course 2023
Guest lecturer: Arto Klami, University of Helsinki
Time and place: 19 September, 14:15-15:00, Ångström 4101
Title: Better priors for everyone
Abstract: Statistical modelling and Bayesian machine learning build on the core principle of updating prior beliefs based on data, to estimate a distribution over the possible values that the unknown parameters of the model could take. Much of the literature focuses on presenting new models or inference algorithms and largely avoids the question of how to specify the prior distributions. In statistical modelling, practitioners are instructed to encode subjective prior knowledge in the form of a suitable distribution, but they lack proper tools for doing so, since defining a prior that matches one's beliefs is typically far from trivial. In machine learning models, the priors often play a regularizing role and are chosen by cross-validation procedures to maximize predictive accuracy, which is computationally costly.
This talk focuses on the question of how to choose the prior distributions in varying scenarios, describing concepts and tools that help in choosing better priors with less cognitive and computational effort. We will go through prior elicitation as a means of transforming tacit human knowledge into valid prior distributions and discuss the current state of prior elicitation techniques. In addition, we will introduce an approach for choosing the prior distributions for Bayesian machine learning methods without cross-validation setups, resulting in an automatic choice of priors that does not require carrying out computationally costly inference.
The talk builds largely on the papers https://doi.org/10.1214/23-BA1381 and https://www.jmlr.org/papers/v24/21-0623.html