VIBASS 3

Introduction to Bayesian Learning

VIBASS3 Basic Course

The first two days include a basic course on Bayesian learning (12 hours), with conceptual sessions in the morning and practical sessions with basic Bayesian packages in the afternoon. This is a summary of the contents of both days.

Monday

Session I: Introduction to Bayesian statistics (10:00 – 11:30)

All you need is… probability. Frequentist and Bayesian probability. Bayes’ theorem for random events and variables, parameters, hypotheses, etc. Sequential updating. Predictive probabilities. Proportions: binomial distribution and likelihood function. Prior distribution: the beta distribution. Summarising posterior inferences. Estimation and prediction. Simulated samples: comparison of independent populations.
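For instance, a minimal Python sketch of the beta-binomial analysis, assuming made-up counts and a uniform Beta(1, 1) prior (none of these values come from the course materials), might look like this:

```python
import numpy as np
from scipy import stats

# Hypothetical data: y successes out of n trials.
n, y = 20, 7

# Beta(a, b) prior; a = b = 1 is the uniform prior (an assumed choice).
a, b = 1.0, 1.0

# Conjugacy: the posterior is Beta(a + y, b + n - y).
posterior = stats.beta(a + y, b + n - y)
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.ppf([0.025, 0.975]))

# Predictive probability that the next observation is a success
# (the posterior mean of the proportion).
print("P(next trial is a success):", (a + y) / (a + b + n))

# Comparison of two independent proportions via simulated posterior samples,
# with hypothetical counts for the second population.
n2, y2 = 25, 13
theta1 = posterior.rvs(size=10_000, random_state=1)
theta2 = stats.beta(a + y2, b + n2 - y2).rvs(size=10_000, random_state=2)
print("P(theta1 > theta2 | data):", np.mean(theta1 > theta2))
```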

Session I: Practice (12:00 – 12:30)

All you need is… lacasitos.

Session II: Basic statistical models (15:00 – 16:30)

Count data: Poisson distribution. Poisson model parameterized in terms of rate and exposure. Gamma distribution as conjugate prior distribution. Negative binomial predictive distribution.
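A minimal Python sketch of this Poisson-gamma model, assuming invented counts, exposures and prior values, might look like this:

```python
import numpy as np
from scipy import stats

# Hypothetical data: counts y_i observed over exposures e_i (e.g. person-years).
y = np.array([3, 5, 2, 4])
e = np.array([1.2, 2.0, 0.8, 1.5])

# Gamma(a, b) prior on the rate (shape a, rate b); values are assumptions.
a, b = 0.5, 0.5

# Conjugacy: the posterior for the rate is Gamma(a + sum(y), b + sum(e)).
post_shape = a + y.sum()
post_rate = b + e.sum()
posterior = stats.gamma(post_shape, scale=1.0 / post_rate)
print("posterior mean of the rate:", posterior.mean())

# Predictive distribution for a future count with exposure e_new:
# negative binomial with size = post_shape and p = post_rate / (post_rate + e_new).
e_new = 1.0
p = post_rate / (post_rate + e_new)
predictive = stats.nbinom(post_shape, p)
print("predictive mean:", predictive.mean())
print("P(y_new = 0):", predictive.pmf(0))
```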

Normal data: Estimation of a normal mean with known variance. Prediction of a future observation. Normal data with unknown mean and variance. Nuisance parameters. Joint prior distributions. Joint, conditional and marginal posterior distributions.
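Sticking to the known-variance case, a minimal Python sketch of the conjugate normal update and of the predictive distribution for a future observation (all values invented) might be:

```python
import numpy as np
from scipy import stats

# Hypothetical data, with the sampling standard deviation assumed known.
y = np.array([172.0, 168.0, 181.0, 175.0, 169.0])
sigma = 7.0                      # known sampling sd (an assumption)
n, ybar = len(y), y.mean()

# Normal prior on the mean: mu ~ N(m0, s0^2); values are assumptions.
m0, s0 = 170.0, 20.0

# Conjugate update: the posterior precision is the sum of prior and data precisions.
post_prec = 1 / s0**2 + n / sigma**2
post_var = 1 / post_prec
post_mean = post_var * (m0 / s0**2 + n * ybar / sigma**2)
print("posterior for mu (mean, sd):", post_mean, np.sqrt(post_var))

# Predictive distribution of a future observation: N(post_mean, post_var + sigma^2).
predictive = stats.norm(post_mean, np.sqrt(post_var + sigma**2))
print("95% prediction interval:", predictive.ppf([0.025, 0.975]))
```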

Session II: Practice (17:00 – 18:30)

How many u’s are there on a page of a Game of Thrones book, and how tall are you?

Tuesday

Session III: Bayesian inference (10:00 – 11:30)

The big problem in the Bayesian framework: the resolution of the integrals that appear when applying the learning process. Numerical approaches: Gaussian approximations, Laplace approximations, Monte Carlo integration and importance sampling. Markov chain Monte Carlo: Gibbs sampling and Metropolis-Hastings. Convergence, inspection of chains, etc. Software for performing MCMC.
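To make the integration problem concrete, the following hedged Python sketch approximates a posterior expectation for a toy beta-shaped target, first by plain Monte Carlo with uniform draws and then by importance sampling with an arbitrarily chosen Beta(2, 2) proposal; the exact answer, 3/8, allows the estimates to be checked:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy target: unnormalised density proportional to a Beta(3, 5), whose mean is 3/8.
def unnorm_post(theta):
    return theta**2 * (1 - theta)**4

# Plain Monte Carlo with uniform draws on (0, 1): both the numerator and the
# normalising constant of E[theta] are approximated by sample averages.
theta = rng.uniform(size=100_000)
w = unnorm_post(theta)
print("Monte Carlo estimate of E[theta]:", np.sum(w * theta) / np.sum(w))

# Importance sampling with a Beta(2, 2) proposal (an arbitrary choice here):
# weights are target/proposal, and the estimate is the weighted average.
proposal = stats.beta(2, 2)
theta = proposal.rvs(size=100_000, random_state=1)
w = unnorm_post(theta) / proposal.pdf(theta)
print("Importance sampling estimate of E[theta]:", np.sum(w * theta) / np.sum(w))
```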

Session III: Practice (12:00 – 13:30)

Programming your own Metropolis-Hastings algorithm.
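As a starting point for this exercise, a minimal random-walk Metropolis-Hastings sketch in Python, targeting the same toy beta-shaped posterior and with arbitrary tuning choices, could be:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    # Unnormalised log-posterior of a Beta(3, 5)-shaped target on (0, 1).
    if theta <= 0 or theta >= 1:
        return -np.inf
    return 2 * np.log(theta) + 4 * np.log(1 - theta)

n_iter, step = 20_000, 0.2        # tuning values are assumptions
chain = np.empty(n_iter)
theta = 0.5                        # starting value
current_lp = log_post(theta)

for i in range(n_iter):
    proposal = theta + step * rng.normal()    # symmetric random-walk proposal
    proposal_lp = log_post(proposal)
    # Accept with probability min(1, posterior ratio); work on the log scale.
    if np.log(rng.uniform()) < proposal_lp - current_lp:
        theta, current_lp = proposal, proposal_lp
    chain[i] = theta

burn_in = 5_000
print("posterior mean (MCMC):", chain[burn_in:].mean())
print("approximate acceptance rate:", np.mean(np.diff(chain) != 0))
```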

Session IV: Bayesian hierarchical models (15:00 – 16:30)

Incorporating random effects: Bayesian hierarchical models (BHMs), the coolest tool for modelling highly structured data. Hierarchies, hyperparameters, and hyperpriors. (Generalized) linear mixed models as basic examples of BHMs.
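The hierarchical structure can be read directly from the generative model. A minimal Python sketch that simulates from an assumed random-intercept hierarchy (hyperpriors, then group-level effects, then data) is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hyperpriors on the hyperparameters (population mean and between-group sd);
# both distributions are assumed choices for illustration.
mu = rng.normal(0.0, 10.0)            # mu ~ N(0, 10^2)
tau = abs(rng.normal(0.0, 1.0))       # tau ~ half-normal(1)

# Group-level random effects drawn from the population distribution.
n_groups, n_per_group = 8, 20
alpha = rng.normal(mu, tau, size=n_groups)        # alpha_j ~ N(mu, tau^2)

# Observations within each group around that group's own effect (normal likelihood
# here; swapping in a GLM family and link gives a generalized linear mixed model).
sigma = 1.0
y = rng.normal(alpha[:, None], sigma, size=(n_groups, n_per_group))
print(y.shape)  # (8, 20): 8 groups, 20 observations each
```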

Session IV: Practice (17:00 – 18:30)

Software for inference in Bayesian hierarchical models.
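Many packages can fit such models; purely as one hedged possibility (not necessarily the software used in the course), a PyMC sketch of the random-intercept model above, fitted to simulated data, might look like this:

```python
import numpy as np
import pymc as pm

# Toy data: 8 groups, 20 observations each, generated with made-up values.
rng = np.random.default_rng(0)
n_groups, n_per_group = 8, 20
group = np.repeat(np.arange(n_groups), n_per_group)
y_obs = rng.normal(rng.normal(0.0, 1.0, n_groups)[group], 1.0)

with pm.Model() as hm:
    mu = pm.Normal("mu", 0.0, 10.0)          # hyperprior on the population mean
    tau = pm.HalfNormal("tau", 1.0)          # hyperprior on the between-group sd
    alpha = pm.Normal("alpha", mu, tau, shape=n_groups)   # random effects
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y", alpha[group], sigma, observed=y_obs)   # likelihood
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

print(float(idata.posterior["mu"].mean()))
```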