session | date | topic | reading (main) | homework
---|---|---|---|---
1 | 10/14 | course overview & probability primer | Kruschke 4 & 5.1 | |
2 | 10/21 | basics of BDA | Kruschke 5 & 6 | |
3 | 10/28 | classical and Bayesian statistics (with shiny) | Wagenmakers (2007) | |
4 | 11/04 | regression modeling in R | Kruschke 3 | hw 1 solutions |
5 | 11/11 | MCMC methods | Kruschke 7 | |
6 | 11/18 | using JAGS | Kruschke 8 | hw 2 solutions |
7 | 11/25 | generative models | Kruschke 9 | |
8 | 12/02 | model comparison | Vandekerckhove et al. (2015) | hw 3 solutions |
9 | 12/09 | estimation, comparison & criticism | Kruschke 11, 12 | |
10 | 12/16 | Stan, JASP, GLMs, projects | Kruschke 10 | hw 4 solutions |
11 | 01/13 | task types & link functions | Franke (2016) | |
12 | 01/20 | Q&A session | | |
13 | 01/27 | estimating subjective beliefs | tba | hw 5 |
14 | 02/03 | nonparametric Bayesian methods | see below | |
15 | 02/10 | project presentations | | |

some project ideas are here

**1. course overview & probability primer, October 14**

- course overview
- student survey
- probability primer
  - subjective vs. objective probability
  - probability mass vs. density
  - cumulative distributions
  - conditional probability
  - Bayes rule
  - joint distributions
  - marginalization
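The primer's pieces (conditional probability, marginalization, Bayes rule) fit into one toy calculation; a minimal standalone sketch in Python (the diagnostic numbers are invented for the example):

```python
# Invented numbers: a disease with 1% base rate, a test with a
# 95% hit rate and a 10% false-alarm rate.
p_d = 0.01            # P(disease)
p_pos_d = 0.95        # P(positive | disease)
p_pos_h = 0.10        # P(positive | healthy)

# marginalization: P(positive) sums over both hypotheses
p_pos = p_pos_d * p_d + p_pos_h * (1 - p_d)

# Bayes rule: P(disease | positive)
p_d_pos = p_pos_d * p_d / p_pos
print(round(p_d_pos, 3))  # 0.088 -- far below the 0.95 hit rate
```

Note how the posterior probability of disease stays below 9% despite the high hit rate, because the base rate dominates.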

*required reading*: Kruschke 2015, chapters 4 & 5.1

**2. basics of BDA, October 21**

- towards BDA (Kruschke 5, 6)
  - models, parameters, data, likelihood, priors
  - coin bias example
  - x% credible intervals
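The coin-bias example can be run end to end with a grid approximation; a quick sketch in Python (the data, 7 heads in 10 flips, and the flat prior are made up for illustration):

```python
import numpy as np

theta = np.linspace(0, 1, 1001)            # grid over the coin bias
prior = np.ones_like(theta)                # flat Beta(1,1) prior
likelihood = theta**7 * (1 - theta)**3     # binomial kernel: 7 heads, 3 tails
posterior = prior * likelihood
posterior /= posterior.sum()               # normalize over the grid

# central 95% credible interval read off the posterior CDF
cdf = np.cumsum(posterior)
lo = theta[np.searchsorted(cdf, 0.025)]
hi = theta[np.searchsorted(cdf, 0.975)]
print(f"95% credible interval: [{lo:.2f}, {hi:.2f}]")
```

With a flat prior this grid posterior matches the conjugate Beta(8, 4) result, so the interval lands near [0.39, 0.89].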

*required reading*: Kruschke 2015, chapters 5 & 6

*optional reading*

**3. classical and Bayesian statistics, October 28**

- recap classical statistics
- motivating binomial example
- maximum likelihood, sufficiency, consistency, bias
- sampling distribution, the bootstrap
- *p*-values, confidence intervals
- problems with *p*-values and confidence intervals

- Bayes factors and (Markov chain) Monte Carlo methods
- t-test
- classical versus Bayesian

*required reading*: Wagenmakers (2007), “A practical solution to the pervasive problems of *p* values”

*optional reading*

**4. regression modeling in R, November 4**

- recap of session 3
- introduction to regression modeling
- loss function, least squares, maximum likelihood
- problems, pitfalls, and assumptions; regularization

- R
  - how to run a regression
  - how to implement your own regression function
  - cool tools: rmarkdown, shiny, RStudio

- Bayesian regression
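For the "implement your own regression function" part, ordinary least squares fits in a few lines; a standalone sketch in Python with simulated data (the true intercept 2.0 and slope 0.5 are invented here):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, size=x.size)  # noisy line

# least squares: solve the normal equations X'X b = X'y
X = np.column_stack([np.ones_like(x), x])            # design matrix
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # intercept and slope estimates, near (2.0, 0.5)
```

The same fit drops out of R's `lm(y ~ x)`; spelling out the normal equations is what makes the loss-function view concrete.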

*optional reading*

**5. MCMC sampling, November 11**

- MCMC sampling (importance, theory)
- convergence checks (theory)
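A bare-bones Metropolis sampler shows the idea before handing things over to JAGS; a Python sketch targeting a Beta(8, 4)-shaped posterior (both the target and the 0.1 step size are arbitrary choices for the example):

```python
import numpy as np

def log_post(theta):
    # unnormalized log posterior: Beta(8, 4) kernel
    if not (0.0 < theta < 1.0):
        return -np.inf
    return 7 * np.log(theta) + 3 * np.log(1 - theta)

rng = np.random.default_rng(0)
theta, samples = 0.5, []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.1)      # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                       # accept; otherwise keep current value
    samples.append(theta)

est = np.mean(samples[5000:])              # discard burn-in, then average
print(round(est, 2))                       # near the Beta(8, 4) mean, 2/3
```

Trace plots and R-hat on chains like this one are exactly the convergence checks the session covers in theory.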

*required reading*: Kruschke 2015, chapter 7

**6. JAGS, November 18**

- why and how to use JAGS
  - syntax
  - calls from R
  - good practices & debugging

- MCMC convergence checks in practice

*required reading*: Kruschke 2015, chapter 8

**7. generative models, November 25**

- generative models, graphical Bayes nets
- notation from Lee & Wagenmakers
- limitations; pointers to Church / WebPPL

*required reading*: Kruschke 2015, chapter 9

*optional reading*

**8. model comparison, December 2**

- model comparison by Bayes factor
- philosophy of Bayes factors
- comparison with AIC, BIC, MDL…

- methods for computing Bayes factors:
  - grid approximation
  - trans-dimensional MCMC
  - importance sampling
  - the Savage-Dickey density ratio
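Of these, Savage-Dickey is the easiest to demo: for a point null nested in the full model, BF01 is the posterior density divided by the prior density at the null value. A Python sketch with made-up data (7 heads in 10 flips) and a flat Beta(1, 1) prior:

```python
import math

def beta_pdf(x, a, b):
    # Beta(a, b) density via the gamma function
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * x**(a - 1) * (1 - x)**(b - 1)

# conjugacy: Beta(1,1) prior + 7 heads, 3 tails -> Beta(8,4) posterior
prior_at_null = beta_pdf(0.5, 1, 1)   # density at theta = 0.5 under the prior
post_at_null = beta_pdf(0.5, 8, 4)    # density at theta = 0.5 under the posterior
bf01 = post_at_null / prior_at_null   # Savage-Dickey: evidence for theta = 0.5
print(round(bf01, 2))                 # 1.29 -- mild evidence for the fair coin
```

When the posterior piles up on the null value relative to the prior, the data support the null; here 7/10 heads barely moves the needle.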

*required reading*: Kruschke 2015, chapter 10

*optional reading*

**9. estimation, comparison, & criticism, December 9**

- estimation, comparison, and criticism answer different questions

*required reading*: Kruschke 2015, chapters 11 & 12

*optional reading*

**10. Stan, JASP, possible projects: December 16**

- Stan

*optional reading*

**11. task types & link functions: January 13**

- theoretical & experimental pragmatics
  - natural language quantifiers
  - scalar implicatures
  - typicality of the quantifier *some*

- generalized linear model
  - types of dependent variables
  - predictors & link functions

- probabilistic model
  - gradient salience of alternative expressions
  - one predictor feeds two link functions

*main reading*:

*optional reading*:

*model code*:

**12. Q&A session: January 20**

- ramblings by Fabian
- here is his talk from the day before (might help for homework 5)

**13. estimating subjective beliefs: January 27**

- models with subjects doing Bayesian inference
- need for subjective belief estimates
- hierarchical model for estimating population-level average beliefs
- reading to be added soon

**14. non-parametric Bayesian methods: February 3**

- motivation and misnomers
- Bayesian linear regression on EEG data (recap)
- Gaussian process regression
  - the kernel trick
  - an infinite-dimensional multivariate normal distribution

- finite mixture models (recap)
- Dirichlet process mixtures
- modeling significant *p* values

*optional reading:*

*optional listening:*

**15. project presentations: February 10**

- short presentations of projects
- final remarks, reflection, discussion