# VIBASS2 - Basic Course

Introduction to Bayesian Learning

The first two days include a basic course on Bayesian learning (12 hours), with conceptual sessions in the morning and practical sessions with basic Bayesian packages in the afternoon. This is a summary of the contents of both days.

## Monday 16

### Session I: Theory (10:00 – 11:30)

**Introduction**. All you need is… probability. **Proportions**: binomial distribution and likelihood function. **Prior distribution**: the beta distribution. Posterior distribution is also a beta distribution. **Summarising** posterior inferences. **Estimation and prediction**. Prediction of new binomial data. Inference and prediction with simulated samples: comparison of independent populations.
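The beta–binomial update covered in this session can be sketched numerically. The following is a minimal Python illustration (the practical sessions use `R`, but the arithmetic is identical); the prior parameters and the data are invented for the example:

```python
# Beta-binomial conjugate update: with a Beta(a, b) prior on the proportion
# theta and y successes in n binomial trials, the posterior is
# Beta(a + y, b + n - y). All numbers below are illustrative.
a, b = 1.0, 1.0                            # uniform Beta(1, 1) prior
y, n = 7, 20                               # hypothetical data: 7 successes in 20 trials

a_post, b_post = a + y, b + n - y          # posterior Beta(8, 14)
post_mean = a_post / (a_post + b_post)     # posterior mean of theta

# Under this model, the posterior predictive probability that one new
# trial is a success equals the posterior mean of theta.
```

The same two-line update, applied separately to two independent samples, is what allows the comparison of independent populations via simulated posterior draws.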

### Session II: Theory (12:00 – 13:30)

**Count data**: Poisson distribution. Poisson model parameterized in terms of rate and exposure. Gamma distribution as **conjugate prior distribution**. Negative binomial **predictive distribution**. **Normal data**. Estimation of a normal mean with known variance. **Prediction** of a future observation. Normal data with unknown mean and variance. **Nuisance parameters**. **Joint prior distributions**. Joint, conditional and marginal **posterior distributions**. **Hypothesis testing. Bayes factor**.
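The Poisson–gamma update with exposures can be sketched in the same way. A short Python illustration (the course practicals use `R`; counts, exposures, and prior parameters below are hypothetical):

```python
# Poisson-gamma conjugate update with exposure: counts y_i ~ Poisson(lambda * e_i)
# and a Gamma(a, b) prior (shape a, rate b) on the rate lambda give the posterior
# Gamma(a + sum(y), b + sum(e)). All numbers are illustrative.
a, b = 2.0, 1.0                    # Gamma(2, 1) prior on the rate lambda
y = [3, 5, 2, 4]                   # hypothetical counts
e = [1.0, 2.0, 1.0, 1.5]           # exposures (e.g. person-years at risk)

a_post = a + sum(y)                # posterior shape: 2 + 14 = 16
b_post = b + sum(e)                # posterior rate:  1 + 5.5 = 6.5
post_mean = a_post / b_post        # posterior mean of lambda

# The posterior predictive distribution of a new count is negative binomial,
# with parameters built from a_post, b_post, and the new exposure.
```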

### Sessions III and IV: Practice (15:00 – 16:30, 17:00 – 18:30)

All you need is… Lacasitos, Winterfell, and to measure your height. Conceptual and computational issues for the Beta-Binomial, Poisson-Gamma, and Normal-Normal models.
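To complete the trio of conjugate models from the practicals, the Normal-Normal update with known data variance can be sketched as follows. A Python illustration in the spirit of the height-measuring exercise (the sessions use `R`; the prior, the assumed known variance, and the measurements are all made up):

```python
from statistics import mean

# Normal-normal conjugate update with known data variance sigma2:
# prior mu ~ N(m0, tau2), data y_i ~ N(mu, sigma2).
# Posterior precisions add, and the posterior mean is a precision-weighted
# average of the prior mean and the sample mean. Numbers are illustrative.
m0, tau2 = 170.0, 100.0            # vague prior on mean height (cm)
sigma2 = 25.0                      # assumed known data variance
y = [168.0, 175.0, 172.0, 171.0]   # hypothetical height measurements
n = len(y)

prec_post = 1.0 / tau2 + n / sigma2                     # posterior precision
m_post = (m0 / tau2 + n * mean(y) / sigma2) / prec_post # posterior mean
v_post = 1.0 / prec_post                                # posterior variance
```

With more data (larger `n`), the posterior mean is pulled towards the sample mean and away from the prior, which is exactly the shrinkage behaviour discussed in the theory sessions.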

## Tuesday 17

### Session V: Theory (10:00 – 11:30)

**Bayesian statistical modelling**. Starting with linear and generalized linear models and understanding the basics of how to model a real problem from the Bayesian point of view. Response variables, covariates, factors (fixed and random).

### Session VI: Theory (12:00 – 13:30)

**The big problem** in the Bayesian framework: **resolution of the integrals** that appear when applying the learning process. **Numerical approaches**: Laplace approximations, **Monte Carlo integration** and importance sampling. **Markov chain Monte Carlo**: Gibbs sampling and Metropolis-Hastings. Convergence, inspection of chains, etc. Examples of MCMC. Software for performing MCMC. **Hierarchical Bayesian modelling**. Hierarchies or levels. Parameters and hyperparameters. Priors and hyperpriors.

### Sessions VII and VIII: Practice (15:00 – 16:30, 17:00 – 18:30)

Programming your own Metropolis-Hastings algorithm in `R` for the data and models of Sessions III and IV.

Software for inference in Bayesian hierarchical models.
