Densities and Expectation
Submit a PDF of your answers to Canvas.
1. Unfair dice, convolution.
a) Consider two independent dice X ∼ p(x) and Y ∼ q(y) where
p(x) = \begin{cases} p_1 & \text{if } x = 1 \\ \;\vdots & \\ p_6 & \text{if } x = 6 \\ 0 & \text{otherwise} \end{cases}
\qquad\text{and}\qquad
q(y) = \begin{cases} q_1 & \text{if } y = 1 \\ \;\vdots & \\ q_6 & \text{if } y = 6 \\ 0 & \text{otherwise.} \end{cases}
Find an expression for the pmf of X + Y. It may be helpful to specify the ways
in which X + Y = i for i = 2, 3, ..., 12.
b) Next consider the more general case. Let X and Y be integer-valued independent
random variables with pmfs given by p_X and p_Y, and define the random variable
Z = X + Y. Show that
p_Z(z) = \sum_i p_X(x_i)\, p_Y(z - x_i).
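The sum in part (b) is exactly a discrete convolution, so the analytic answer can be sanity-checked numerically. A minimal sketch, assuming Python/NumPy; the face probabilities below are placeholders, not values from the problem:

```python
import numpy as np

# Placeholder pmfs for the two dice: any nonnegative vectors over
# faces 1..6 that sum to 1 would work here.
p = np.array([0.1, 0.1, 0.2, 0.2, 0.2, 0.2])  # P(X = 1), ..., P(X = 6)
q = np.array([1/6] * 6)                        # a fair die for Y

# p_Z(z) = sum_i p_X(x_i) p_Y(z - x_i) is a discrete convolution.
pz = np.convolve(p, q)  # length 11, corresponding to z = 2, ..., 12
for z, prob in zip(range(2, 13), pz):
    print(f"P(X + Y = {z}) = {prob:.4f}")
```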
2. In homework 1, you wrote a few lines of code to find the minimum of n i.i.d. samples
of a uniform random variable. Here, you will address the same problem analytically.
More precisely, let
X_i \overset{\text{i.i.d.}}{\sim} U[0, 1],
and define
Y = \min_{i=1,\dots,n} X_i.
a) Find an expression for the pdf of Y. Your answer should depend on n.
b) Find an expression for E[Y].
c) Does this agree with the plot you made previously?
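Your analytic answers to parts (a) and (b) can be compared against a quick Monte Carlo estimate. A minimal sketch, assuming Python/NumPy; your homework 1 code may have looked different:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_mean_min(n, trials=100_000):
    """Estimate E[min(X_1, ..., X_n)] for X_i ~ U[0, 1] by simulation."""
    samples = rng.uniform(size=(trials, n))
    return samples.min(axis=1).mean()

for n in (1, 2, 5, 10):
    # Compare these estimates with your expression from part (b).
    print(n, empirical_mean_min(n))
```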
3. Expectation Basics. Let P(X = 1) = 1/2, P(X = 2) = 1/4, and P(X = 3) = 1/4.
a) Compute E[X].
b) Compute E[g(X)] if g(x) = x^2.
c) The variance of a random variable is defined as Var(X) = E[(X − E[X])^2].
Compute Var(X).
d) Write an expression for E[− log(g(X))] in terms of the function g(X).
e) One particular function is g(X) = P(X), where P(X) is defined above. Write an
expression for E[−log_2(P(X))]. This quantity is the entropy of X.
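Every quantity in this problem is a weighted sum over the pmf, so hand computations are easy to check. A minimal sketch, assuming Python; the helper name expect is just a placeholder:

```python
import math

# pmf from the problem statement
pmf = {1: 1/2, 2: 1/4, 3: 1/4}

def expect(g, pmf):
    """E[g(X)] = sum over x of g(x) * P(X = x) for a discrete pmf."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, pmf)                       # E[X]
second_moment = expect(lambda x: x**2, pmf)           # E[X^2]
variance = expect(lambda x: (x - mean)**2, pmf)       # Var(X)
entropy = expect(lambda x: -math.log2(pmf[x]), pmf)   # E[-log2 P(X)]
print(mean, second_moment, variance, entropy)
```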
4. Bi-variate Random Variables. The joint pmf p(x, y) is defined as follows:
P(X = 1, Y = 1) = 1/8
P(X = 1, Y = 2) = 1/8
P(X = 2, Y = 1) = 1/2
P(X = 2, Y = 2) = 1/4
a) Are X and Y independent? Why or why not?
b) For any two random variables X and Y, the covariance is defined as Cov(X, Y) =
E[(X − E[X])(Y − E[Y])]. Compute Cov(X, Y).
c) Define a new random variable Z = X + Y. Specify its pmf p(z).
d) Write a function (in code) that generates X and Y at random as specified by
the joint PMF. Your function should take no input arguments, and should return
(X, Y) ∈ {1, 2}^2 with the probabilities specified above.
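One possible way to implement part (d) is inverse-transform sampling over the four outcomes. A minimal sketch, assuming Python; the function name sample_xy is a placeholder:

```python
import random

def sample_xy():
    """Draw one (X, Y) pair from the joint pmf in problem 4."""
    outcomes = [((1, 1), 1/8), ((1, 2), 1/8), ((2, 1), 1/2), ((2, 2), 1/4)]
    u = random.random()
    cumulative = 0.0
    # Walk the cumulative distribution until it exceeds u.
    for (x, y), p in outcomes:
        cumulative += p
        if u < cumulative:
            return x, y
    return outcomes[-1][0]  # guard against floating-point round-off
```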
5. Total Probability. Let X and Y be discrete random variables. Conditional expectation
is defined as E_X[X|Y] = \sum_x x\, P(X = x|Y) for discrete random variables. Use this
definition to show that
E[X] = E_Y[E_X[X|Y]].
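The identity can also be sanity-checked numerically, for example with the joint pmf from problem 4. A minimal sketch, assuming Python; this checks the equality on one example, it is not the requested proof:

```python
# Joint pmf from problem 4: keys are (x, y) pairs.
joint = {(1, 1): 1/8, (1, 2): 1/8, (2, 1): 1/2, (2, 2): 1/4}

# Left-hand side: E[X] from the marginal of X.
e_x = sum(x * p for (x, y), p in joint.items())

# Right-hand side: E_Y[ E_X[X | Y] ].
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

e_x_given_y = {
    y: sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y[y]
    for y in p_y
}
rhs = sum(e_x_given_y[y] * p_y[y] for y in p_y)

print(e_x, rhs)  # the two values should match
```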
6. Optional. MAP classification and 1-d discriminant analysis. Let X ∈ R represent a
feature, and Y = 0 or Y = 1 the class label. The distribution of X depends on the
label:
X|Y = 0 ∼ N (0, 1)
X|Y = 1 ∼ N (4, 1)
where N(µ, σ²) is the Gaussian density p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}. Let the prior
probabilities of the classes be p(y = 0) = 3/4 and p(y = 1) = 1/4.
a) Use a computer to create a plot with both pdfs p(x|y = 0) and p(x|y = 1) on the
same axis.
b) Use Bayes' rule and total probability to find an expression for the posterior p(y|x).
c) Use a computer to evaluate p(y = 0|x = 2) using your expression above (a sketch of
one possible approach appears after this problem). What is p(y = 0|x = 2)?
d) Recall that the maximum a posteriori (MAP) classification rule predicts the label y
as follows:
\hat{y} = \arg\max_y p(y|x).
Use the MAP rule to design a classifier that predicts whether Y = 0
or Y = 1 given X = x.
e) What is the true risk of your MAP classifier? Use a computer to find a numerical
answer.
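For parts (b)–(d), the posterior can be evaluated numerically once the class-conditional densities and priors are coded up. A minimal sketch, assuming Python with SciPy (scipy.stats.norm.pdf for the Gaussian density); the helper name posterior is a placeholder:

```python
from scipy.stats import norm

# Class-conditional densities and priors from the problem statement.
prior = {0: 3/4, 1: 1/4}
likelihood = {0: lambda x: norm.pdf(x, loc=0, scale=1),
              1: lambda x: norm.pdf(x, loc=4, scale=1)}

def posterior(y, x):
    """p(y | x) via Bayes' rule and total probability."""
    evidence = sum(likelihood[k](x) * prior[k] for k in (0, 1))
    return likelihood[y](x) * prior[y] / evidence

print(posterior(0, 2.0))                              # p(y = 0 | x = 2)
print(max((0, 1), key=lambda y: posterior(y, 2.0)))   # MAP prediction at x = 2
```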