HW2 (Due 10/9)
(6 points) Consider a jar with three types of dice. Die A is fair. Die B has probabilities \(0.1,0.1,0.1,0.1,0.3,\) and \(0.3\) for outcomes \(1,2,3,4,5,\) and \(6\), respectively. Die C has probabilities \(0.3,0.3,0.1,0.1,0.1,\) and \(0.1\) for outcomes \(1,2,3,4,5,\) and \(6\), respectively. Assume that the jar contains one A-die, three B-dice, and six C-dice. Suppose we draw one die from the jar, toss it three times, and get all ones. What is the estimated probability of getting another one in the next toss if we use a) MLE, b) MAP, and c) Bayesian estimation?
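Hint (optional): you can sanity-check your answers numerically. The sketch below uses only the jar composition and per-die probabilities stated in the problem; exact fractions avoid rounding issues.

```python
from fractions import Fraction as F

# Prior over dice from the jar composition: 1 A, 3 B, 6 C out of 10
priors = {"A": F(1, 10), "B": F(3, 10), "C": F(6, 10)}
# Probability of rolling a one with each die type
p_one = {"A": F(1, 6), "B": F(1, 10), "C": F(3, 10)}

# Likelihood of observing three ones under each die
lik = {d: p_one[d] ** 3 for d in priors}

# a) MLE: pick the die that maximizes the likelihood, then predict with it
mle_die = max(lik, key=lik.get)
p_mle = p_one[mle_die]

# b) MAP: pick the die that maximizes prior * likelihood
post_unnorm = {d: priors[d] * lik[d] for d in priors}
map_die = max(post_unnorm, key=post_unnorm.get)
p_map = p_one[map_die]

# c) Bayesian: average P(one | die) under the full posterior
Z = sum(post_unnorm.values())
p_bayes = sum((post_unnorm[d] / Z) * p_one[d] for d in priors)

print(mle_die, float(p_mle))   # MLE picks C
print(map_die, float(p_map))   # MAP also picks C
print(float(p_bayes))          # Bayesian average is slightly below 0.3
```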
(6 points) Repeat Q1, but assume instead that we have a Dirichlet prior with parameters \((\alpha_1=\alpha_2=\alpha_3=\alpha_4=\alpha_5=\alpha_6=2)\). The Dirichlet distribution is an extension of the Beta distribution; it is the conjugate prior of the multinomial distribution (just as the Beta is the conjugate prior of the binomial distribution). Please check slides 87-91 if you get stuck on the parts below.
What is the posterior distribution after observing three ones as in Q1?
What is the estimated probability of getting another one if we use MAP for estimation?
What is the estimated probability of getting another one if we use Bayesian estimation?
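Hint (optional): for a Dirichlet prior, conjugacy means the posterior is obtained by simply adding the observed counts to the \(\alpha\) parameters; the MAP estimate is the posterior mode and the Bayesian estimate is the posterior mean. A minimal sketch of that bookkeeping:

```python
# Dirichlet prior parameters, all 2 as stated in the problem
alphas = [2] * 6
# Observed counts: three ones, no other outcomes
counts = [3, 0, 0, 0, 0, 0]

# Conjugacy: posterior is Dirichlet(alpha_k + n_k)
post = [a + n for a, n in zip(alphas, counts)]

K = len(post)
# MAP = posterior mode: (alpha_k - 1) / (sum(alpha) - K)
p_map = (post[0] - 1) / (sum(post) - K)
# Bayesian = posterior mean: alpha_k / sum(alpha)
p_bayes = post[0] / sum(post)

print(post)     # Dirichlet(5, 2, 2, 2, 2, 2)
print(p_map)    # 4/9
print(p_bayes)  # 1/3
```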
(6 points) Back to Q1, let \(D\) denote the die we actually drew, and let \(X_1, X_2, X_3\) be the respective outcomes of the three tosses.
What is \(H(X_1)\)?
What is \(H(X_1|D)\)?
What is \(I(X_1;X_2|D)\)?
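Hint (optional): \(H(X_1)\) is the entropy of the marginal (mixture) distribution of one toss, while \(H(X_1|D)\) averages the per-die entropies under the prior; recall that the tosses are i.i.d. given \(D\). A short numerical check, reusing the jar composition from Q1:

```python
from math import log2

# Prior over dice and per-die outcome distributions from Q1
priors = {"A": 0.1, "B": 0.3, "C": 0.6}
pmf = {
    "A": [1 / 6] * 6,
    "B": [0.1] * 4 + [0.3] * 2,
    "C": [0.3] * 2 + [0.1] * 4,
}

def H(p):
    """Shannon entropy in bits."""
    return -sum(x * log2(x) for x in p if x > 0)

# H(X1): entropy of the marginal of one toss (mixture over dice)
marg = [sum(priors[d] * pmf[d][k] for d in priors) for k in range(6)]
h_x1 = H(marg)

# H(X1|D): prior-weighted average of the per-die entropies
h_x1_given_d = sum(priors[d] * H(pmf[d]) for d in priors)

# I(X1;X2|D): given D the tosses are independent, so this is 0
print(h_x1, h_x1_given_d)
```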
(2 points) Consider sequences of \(10{,}000\) coin flips with the probability of heads equal to \(0.9\). Show that a sequence with \(9{,}000\) heads and \(1{,}000\) tails is typical.
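Hint (optional): a sequence \(x^n\) is typical when its per-symbol log-probability, \(-\frac{1}{n}\log_2 p(x^n)\), is close to the source entropy \(H(X)\). Here the empirical frequency of heads exactly matches \(p=0.9\), so the two quantities coincide, as this sketch verifies:

```python
from math import log2

n, heads, p = 10_000, 9_000, 0.9
tails = n - heads

# Per-symbol log-probability of this particular sequence
per_symbol = -(heads * log2(p) + tails * log2(1 - p)) / n

# Entropy of a single flip, H(0.9)
h = -(p * log2(p) + (1 - p) * log2(1 - p))

# The two agree exactly, so the sequence is in the typical set
# for every epsilon > 0
print(per_symbol, h)
```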