EECS 126: Probability and Random Processes
Problem Set 14 (Optional)
1. Balls in Bins Estimation
We throw n ≥ 1 balls into m ≥ 2 bins. Let X and Y denote the numbers of balls that land
in bins 1 and 2, respectively.
(a) Calculate E[Y | X].
(b) What are L[Y | X] and Q[Y | X] (where Q[Y | X] is the best quadratic estimator of Y
given X)?
Hint: Your justification should be no more than two or three sentences, no calculations
necessary! Think carefully about the meaning of the MMSE.
(c) Unfortunately, your friend is not convinced by your answer to the previous part. Compute
E[X] and E[Y ].
(d) Compute var(X).
(e) Compute cov(X, Y ).
(f) Compute L[Y | X] using the formula. Ensure that your answer is the same as your
answer to part (b).
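A quick Monte Carlo sketch can be used to check your answers after you have derived them. The values n = 10 and m = 4 below are hypothetical, and the closed forms in the comments are the quantities the parts above ask for, stated so the simulation has something to compare against; verify them against your own derivation rather than taking them as given.

```python
import numpy as np

# Monte Carlo check for Problem 1 (hypothetical parameters n = 10, m = 4).
rng = np.random.default_rng(0)
n, m, trials = 10, 4, 200_000
counts = rng.multinomial(n, [1 / m] * m, size=trials)  # one row per experiment
X, Y = counts[:, 0], counts[:, 1]  # balls in bin 1 and bin 2

# Parts (c)-(e): each ball lands in bin 1 with probability 1/m, so
# E[X] = n/m, var(X) = n(1/m)(1 - 1/m), and cov(X, Y) = -n/m^2.
mean_X = X.mean()
var_X = X.var()
cov_XY = np.cov(X, Y)[0, 1]

# Part (a): given X = x, the remaining n - x balls are spread uniformly
# over the other m - 1 bins, suggesting E[Y | X = x] = (n - x)/(m - 1).
emp_cond = {x: Y[X == x].mean() for x in range(5)}
```

Binning on the event {X = x} and averaging Y gives an empirical estimate of E[Y | X = x], which should line up with the conditional-mean formula for each well-populated value of x.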
2. MMSE and Conditional Expectation
Let X, Y1, . . . , Yn be square-integrable random variables. The MMSE of X given (Y1, . . . , Yn)
is defined as the function φ(Y1, . . . , Yn) which minimizes the mean square error
E[(X − φ(Y1, . . . , Yn))²].
(a) For this part, assume n = 1. Show that the MMSE is precisely the conditional expectation
E[X|Y ]. Hint: expand the difference as (X − E[X|Y ] + E[X|Y ] − φ(Y )).
(b) Argue that
E[(X − E[X | Y1, . . . , Yn])²] ≤ E[(X − (1/n)(E[X | Y1] + · · · + E[X | Yn]))²].
That is, the MMSE does better than the average of the individual estimates E[X | Yi].
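The inequality in part (b) can be seen concretely in a toy jointly Gaussian example (hypothetical distributions chosen here for illustration): with Y1, Y2 i.i.d. standard normal and X = Y1 + Y2 + Z for independent standard normal noise Z, joint Gaussianity makes the conditional expectations linear, so both sides are easy to simulate.

```python
import numpy as np

# Toy example for Problem 2(b) with n = 2 (hypothetical Gaussian setup).
rng = np.random.default_rng(2)
trials = 100_000
Y1 = rng.standard_normal(trials)
Y2 = rng.standard_normal(trials)
Z = rng.standard_normal(trials)
X = Y1 + Y2 + Z

# By joint Gaussianity, E[X | Y1, Y2] = Y1 + Y2 and E[X | Yi] = Yi.
mse_joint = np.mean((X - (Y1 + Y2)) ** 2)        # MMSE error: var(Z) = 1
mse_avg = np.mean((X - 0.5 * (Y1 + Y2)) ** 2)    # error of the averaged estimator
```

Here mse_joint ≈ 1 while mse_avg ≈ 1.5, so the MMSE strictly beats the average of the single-observation estimates, consistent with the claim.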
3. Geometric MMSE
Let N be a geometric random variable with parameter 1 − p, and (Xi)i∈N be i.i.d. exponential
random variables with parameter λ. Let T = X1 + · · · + XN . Compute the LLSE and MMSE
of N given T.
Hint: Compute the MMSE first.
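Once you have a candidate answer, a simulation can check it numerically. The parameters p = 0.6 and λ = 2 below are hypothetical, and the affine candidate 1 + pλT used in the comparison is the form a posterior calculation suggests; treat it as something to verify against your own derivation, not as given. Since E[N | T] is being tested, we compare the empirical mean of N on a slice of T-values against the candidate evaluated at the slice's mean.

```python
import numpy as np

# Numerical check for Problem 3 (hypothetical parameters p = 0.6, lam = 2.0).
rng = np.random.default_rng(1)
p, lam, trials = 0.6, 2.0, 400_000

# N geometric with parameter 1 - p: P(N = n) = p**(n - 1) * (1 - p), n >= 1.
N = rng.geometric(1 - p, size=trials)
# Given N = n, T = X1 + ... + Xn is a sum of n i.i.d. Exp(lam) variables,
# i.e. Gamma with shape n and scale 1/lam.
T = rng.gamma(shape=N, scale=1 / lam)

# Candidate MMSE (to be verified): E[N | T] = 1 + p * lam * T.
mask = (T > 1.0) & (T < 1.5)
emp = N[mask].mean()
cand = 1 + p * lam * T[mask].mean()
```

If the candidate is correct, emp and cand agree up to sampling noise on any well-populated slice of T; an affine MMSE would also explain why the LLSE coincides with it.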