HW2
Due on Oct 8 (20 points)
(8 points) Consider a jar with three types of dice. Die A is fair. Die B has probabilities \(0.1,0.1,0.1,0.1,0.3,\) and \(0.3\) for outcomes \(1,2,3,4,5,\) and \(6\), respectively, and Die C has probabilities \(0.3,0.3,0.1,0.1,0.1,\) and \(0.1\) for outcomes \(1,2,3,4,5,\) and \(6\), respectively. Assume that the jar contains 2 A-dice, 3 B-dice, and 5 C-dice, and that we drew one die from the jar, threw it three times, and got all ones. What is the estimated probability of getting another one on the next toss if we use a) MLE, b) MAP, and c) Bayesian estimation?
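A minimal numerical sketch of the quantities involved is given below, written in Python (an assumption; the assignment does not prescribe any code). It computes the likelihood of three ones under each die type, the posterior over die types, and the resulting MLE, MAP, and Bayesian point estimates; all variable names are illustrative.

    # Illustrative sketch: posterior over die types after observing three ones,
    # and the corresponding MLE, MAP, and Bayesian estimates of P(next toss = 1).
    prior = {"A": 2 / 10, "B": 3 / 10, "C": 5 / 10}   # probability of drawing each die type
    p_one = {"A": 1 / 6, "B": 0.1, "C": 0.3}          # P(outcome = 1) for each die type

    likelihood = {d: p_one[d] ** 3 for d in prior}            # P(three ones | die type)
    joint = {d: prior[d] * likelihood[d] for d in prior}      # prior x likelihood
    evidence = sum(joint.values())
    posterior = {d: joint[d] / evidence for d in prior}       # P(die type | three ones)

    mle_die = max(likelihood, key=likelihood.get)   # die type with the largest likelihood
    map_die = max(posterior, key=posterior.get)     # die type with the largest posterior

    print("MLE estimate:     ", p_one[mle_die])
    print("MAP estimate:     ", p_one[map_die])
    print("Bayesian estimate:", sum(posterior[d] * p_one[d] for d in prior))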
(8 points) Repeat Q1, but assume instead that we have a Dirichlet prior with parameters \((\alpha_1=\alpha_2=\alpha_3=3,\ \alpha_4=\alpha_5=\alpha_6=2)\) over the outcome probabilities. The Dirichlet distribution is an extension of the Beta distribution, and it is the conjugate prior of the multinomial distribution (just as the Beta distribution is the conjugate prior of the binomial distribution). A numerical sketch of the conjugate update is given after part c) below.
a) What is the a posteriori distribution after observing three ones, as in Q1?
b) What is the estimated probability of getting another one if we use MAP for estimation?
c) What is the estimated probability of getting another one if we use Bayesian estimation?
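The sketch below assumes the standard Dirichlet–multinomial conjugate results: the posterior parameters are \(\alpha_i' = \alpha_i + n_i\), the posterior mode of \(p_1\) is \((\alpha_1'-1)/(\sum_j \alpha_j' - 6)\), and the posterior mean is \(\alpha_1'/\sum_j \alpha_j'\). It is illustrative only, again written in Python by assumption.

    # Illustrative sketch of the Dirichlet-multinomial conjugate update.
    alpha = [3, 3, 3, 2, 2, 2]      # prior Dirichlet parameters for outcomes 1..6
    counts = [3, 0, 0, 0, 0, 0]     # observed counts: three ones

    alpha_post = [a + n for a, n in zip(alpha, counts)]   # posterior Dirichlet parameters
    total = sum(alpha_post)

    map_p1 = (alpha_post[0] - 1) / (total - len(alpha_post))   # posterior mode of p_1 (MAP)
    mean_p1 = alpha_post[0] / total                            # posterior mean of p_1 (Bayesian)

    print("Posterior parameters:", alpha_post)
    print("MAP estimate of P(1):", map_p1)
    print("Bayesian estimate of P(1):", mean_p1)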
(4 points) Consider sequences of \(10,000\) coin flips with the probability of heads equal to \(0.8\). Show that a sequence with \(8,000\) heads and \(2,000\) tails is typical.
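A numerical check is sketched below, assuming the standard definition of the \(\epsilon\)-typical set (sequences with \(\left|-\tfrac{1}{n}\log_2 p(x^n) - H(X)\right| \le \epsilon\)); it illustrates the comparison but does not replace the written argument.

    import math

    # Compare the per-symbol log-probability of the given sequence with H(0.8).
    n, p = 10_000, 0.8
    heads, tails = 8_000, 2_000

    # Any sequence with exactly 8,000 heads has probability p^8000 * (1 - p)^2000.
    neg_log_prob_per_symbol = -(heads * math.log2(p) + tails * math.log2(1 - p)) / n

    # Binary entropy H(0.8) in bits.
    entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    print("-(1/n) log2 P(sequence):", neg_log_prob_per_symbol)
    print("H(0.8):                 ", entropy)
    # The two values coincide, so the sequence lies in the typical set for every epsilon > 0.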