HW1 (Due 9/18)

  1. Compute the mean and variance of \(X\) in each of the following cases (a numerical sanity check is sketched after the problem list):

    1. \(X\) is continuous and uniformly distributed between 0 and 2. That is, \(p_X(x)=\begin{cases}0.5, &\mbox{if } 0\le x\le 2\\0, & \mbox{otherwise}\end{cases}\)

    2. \(X\) is discrete and \(\Pr(X=1)=0.4,\ \Pr(X=2)=0.2,\ \Pr(X=3)=0.1,\ \Pr(X=4)=0.3\)

  2. (Markov chain) Recall that five random variables \(X_1,X_2,X_3,X_4,X_5\) form a Markov chain \(X_1 \rightarrow X_2 \rightarrow X_3 \rightarrow X_4 \rightarrow X_5\) if \(X_5 \bot X^3 | X_4\) (i.e., \(p(x_5|x_4,x^3)=p(x_5|x_4)\)), \(X_4 \bot X^2 | X_3\), and \(X_3 \bot X_1|X_2\), where \(X^k\) denotes \((X_1,\dots,X_k)\). We will show in the following that these conditional independence conditions are equivalent to the single factorization \(p(x^5)=p(x_1) p(x_2|x_1) p(x_3|x_2) p(x_4|x_3) p(x_5|x_4)\). (Remark: the proof below generalizes to a chain of any length by mathematical induction. A numerical check of this equivalence is sketched after the problem list.)

    1. Show that the conditional independence conditions lead to the joint distribution equation (Hint: add one \(x\) at a time. That is, first show \(p(x^3) = p(x_1) p(x_2|x_1) p(x_3|x_2)\) using the conditional independence conditions, then \(p(x^4) = p(x_1) p(x_2|x_1) p(x_3|x_2) p(x_4|x_3)\), and so on.)

    2. Show that the joint distribution equation leads to the conditional independence conditions (Hint: use marginalization)

    3. Show that \(p(x^5) = p(x_1) p(x_2|x_1) p(x_3|x_2) p(x_4|x_3) p(x_5|x_4)\) leads to \(p(x^5)=p(x_5) p(x_4|x_5) p(x_3|x_4) p(x_2|x_3) p(x_1|x_2)\). Consequently, the Markov chain definition is symmetric. That is, \(X_1 \rightarrow X_2 \rightarrow X_3 \rightarrow X_4 \rightarrow X_5 \equiv X_5 \rightarrow X_4 \rightarrow X_3 \rightarrow X_2 \rightarrow X_1\). (Hint: if you are stuck, try to show that each of the conditional independence conditions is satisfied.)
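
Numerical sanity check for Problem 1 (optional). A minimal sketch, assuming a standard Python 3 interpreter; the sample size `N` and variable names are illustrative, and the Monte Carlo estimate in the continuous case will only approximate your hand-computed values.

```python
# Sanity check for Problem 1: compare hand computations against a Monte Carlo
# estimate (continuous case) and direct evaluation (discrete case).
import random

N = 1_000_000  # illustrative sample size for the Monte Carlo estimate

# 1.1  X uniform on [0, 2]: estimate E[X] and Var(X) from samples.
samples = [random.uniform(0.0, 2.0) for _ in range(N)]
mean_cont = sum(samples) / N
var_cont = sum((x - mean_cont) ** 2 for x in samples) / N
print(f"continuous: mean ~ {mean_cont:.3f}, variance ~ {var_cont:.3f}")

# 1.2  Discrete X with Pr(X=1)=0.4, Pr(X=2)=0.2, Pr(X=3)=0.1, Pr(X=4)=0.3:
# apply E[X] = sum_x x p(x) and Var(X) = E[X^2] - (E[X])^2 directly.
pmf = {1: 0.4, 2: 0.2, 3: 0.1, 4: 0.3}
mean_disc = sum(x * p for x, p in pmf.items())
var_disc = sum(x**2 * p for x, p in pmf.items()) - mean_disc**2
print(f"discrete: mean = {mean_disc}, variance = {var_disc:.2f}")
```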
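
Numerical check for Problem 2 (optional). A minimal sketch, assuming NumPy is available: it builds one random joint distribution over five binary variables from the forward factorization and then checks, for that example only, the conditional independence \(X_5 \bot X^3 | X_4\) and the reversed factorization from part 3. The helper names (`random_conditional`, `conditional`) are made up for this sketch, and a numerical check is of course not a substitute for the proofs the problem asks for.

```python
# Numerical illustration for Problem 2: construct p(x1,...,x5) over binary
# variables from p(x1) p(x2|x1) p(x3|x2) p(x4|x3) p(x5|x4), then check
# (i) X5 is conditionally independent of (X1, X2, X3) given X4, and
# (ii) the reversed factorization p(x5) p(x4|x5) p(x3|x4) p(x2|x3) p(x1|x2).
import numpy as np

rng = np.random.default_rng(0)

def random_conditional():
    """A random 2x2 table q[a, b] = p(next = b | current = a)."""
    q = rng.random((2, 2))
    return q / q.sum(axis=1, keepdims=True)

p1 = rng.random(2)
p1 /= p1.sum()                                   # p(x1)
p21, p32, p43, p54 = (random_conditional() for _ in range(4))

# Joint distribution via the forward factorization; axes are (x1,...,x5).
joint = np.einsum('a,ab,bc,cd,de->abcde', p1, p21, p32, p43, p54)

# (i) p(x5 | x1, x2, x3, x4) should not depend on (x1, x2, x3).
cond = joint / joint.sum(axis=4, keepdims=True)  # p(x5 | x1, x2, x3, x4)
print("X5 indep. of X^3 given X4:", np.allclose(cond, cond[:1, :1, :1, :, :]))

# (ii) Rebuild the joint from the reversed factorization and compare.
def conditional(pair):
    """From a pairwise marginal p(a, b), return p(a | b) indexed as [b, a]."""
    return (pair / pair.sum(axis=0, keepdims=True)).T

p5  = joint.sum(axis=(0, 1, 2, 3))               # p(x5)
p45 = conditional(joint.sum(axis=(0, 1, 2)))     # p(x4 | x5)
p34 = conditional(joint.sum(axis=(0, 1, 4)))     # p(x3 | x4)
p23 = conditional(joint.sum(axis=(0, 3, 4)))     # p(x2 | x3)
p12 = conditional(joint.sum(axis=(2, 3, 4)))     # p(x1 | x2)
reversed_joint = np.einsum('e,ed,dc,cb,ba->abcde', p5, p45, p34, p23, p12)
print("reversed factorization matches:", np.allclose(joint, reversed_joint))
```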