This set is due 11/4 before class. Any form of readable and intelligible presentation will be accepted. Electronic submissions should go to Fabian with the subject line "BDA: Homework 1". We encourage discussing the exercises with fellow students, but only individual solutions will be accepted.
Consider the following two-dimensional matrix of joint probabilities (rows: eye color; columns: hair color):
##       blond brown  red black
## blue   0.22  0.21 0.00  0.01
## green  0.00  0.14 0.06  0.01
## brown  0.16  0.15 0.00  0.04
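For concreteness, here is one way this matrix could be entered and summed in R. This is only an illustrative sketch: the object name `joint` and the `eye`/`hair` dimension labels are illustrative choices, not part of the exercise.

```r
# Illustrative only: enter the matrix with (assumed) eye-color rows and
# hair-color columns, then check marginals and the overall sum.
joint <- matrix(
  c(0.22, 0.21, 0.00, 0.01,
    0.00, 0.14, 0.06, 0.01,
    0.16, 0.15, 0.00, 0.04),
  nrow = 3, byrow = TRUE,
  dimnames = list(eye  = c("blue", "green", "brown"),
                  hair = c("blond", "brown", "red", "black"))
)

rowSums(joint)   # marginal distribution of the row variable
colSums(joint)   # marginal distribution of the column variable
sum(joint)       # equals 1, as expected for a joint probability table
```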
Imagine that there are three cards: one is red on both sides, another is white on both sides, and the third is red on one side and white on the other. Suppose a confederate draws a card from this set of three uniformly at random and shows you a uniformly random side of that card.
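To make the sampling protocol concrete, here is a small R simulation of the drawing process just described. The helper `draw_once` and all names are illustrative choices, and this sketch does not answer the exercise itself.

```r
# Simulate the protocol: pick one of the three cards uniformly at random,
# then show a uniformly random side of that card.
set.seed(1)
cards <- list(c("red", "red"), c("white", "white"), c("red", "white"))

draw_once <- function() {
  card <- sample(cards, 1)[[1]]   # which card was drawn
  side <- sample(1:2, 1)          # which of its two sides is shown
  c(shown = card[side], hidden = card[-side])
}

draws <- replicate(1e5, draw_once())
table(draws["shown", ])           # how often each color is the side you see
```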
We saw two different likelihood functions for coin flips in the second lecture. The first one was the binomial distribution, where \(n = n_h + n_t\) is the total number of flips:
\[P_{\text{binom}}(\langle n_h, n_t \rangle \, | \, \theta) = {{n}\choose{n_h}} \theta^{n_h} \, (1-\theta)^{n_t}\]
The second one was Kruschke’s generalization of the Bernoulli distribution:
\[P_{\text{Bern}}(\langle n_h, n_t \rangle \, | \, \theta) = \theta^{n_h} \, (1-\theta)^{n_t}\]
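Purely as an illustration of how the two formulas relate, the following R snippet evaluates both likelihoods for an arbitrary made-up data set (\(n_h = 7\), \(n_t = 3\)) and an arbitrary \(\theta\); all specific numbers here are assumptions for the example.

```r
# Evaluate both likelihood functions at one illustrative (theta, data) pair.
theta <- 0.6
n_h   <- 7
n_t   <- 3

lik_binom <- choose(n_h + n_t, n_h) * theta^n_h * (1 - theta)^n_t
lik_bern  <- theta^n_h * (1 - theta)^n_t

lik_binom   # identical to dbinom(n_h, size = n_h + n_t, prob = theta)
lik_bern    # lacks the binomial coefficient in front
```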
Prove that no matter what the prior \(P(\theta)\) is and no matter what \(n_h\) and \(n_t\) we observe, the posterior \(P_{\text{binom}}(\theta \, | \, \langle n_h, n_t \rangle)\) derived from the first likelihood function will be identical to the posterior \(P_{\text{Bern}}(\theta \, | \, \langle n_h, n_t \rangle)\) derived from the second.
Please spell out and explain each relevant derivation step, paying close attention to how the normalizing constants are written out and manipulated.
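This is no substitute for the requested proof, but if you want a numerical sanity check of the claim, you can compare the two grid-approximated posteriors under some arbitrary prior. The grid, the prior, and the data below are illustrative assumptions only.

```r
# Grid approximation of both posteriors under an arbitrary non-uniform prior.
theta <- seq(0.001, 0.999, length.out = 999)
prior <- dbeta(theta, 2, 5)              # any prior will do for the check
n_h   <- 7
n_t   <- 3

lik_binom <- dbinom(n_h, size = n_h + n_t, prob = theta)
lik_bern  <- theta^n_h * (1 - theta)^n_t

post_binom <- lik_binom * prior / sum(lik_binom * prior)   # normalize on the grid
post_bern  <- lik_bern  * prior / sum(lik_bern  * prior)

max(abs(post_binom - post_bern))         # numerically zero: the posteriors agree
```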
Read Wagenmakers (2007) and answer the following questions: