Cmput 466 Assignment 4

Problem 1.
Prove that, if 𝑃(𝑋, 𝑌) = 𝑓(𝑋)𝑔(𝑌) for some function 𝑓 on 𝑋 only and some function 𝑔 on 𝑌 only, then 𝑋 and 𝑌 are
independent.
Hint: Use 𝑃(𝑋, 𝑌) = 𝑃(𝑋)𝑃(𝑌) to prove independence, which is simply the definition. To obtain
𝑃(𝑋) and 𝑃(𝑌), use the definition of marginal probability. The rule of thumb is that, when we
don’t know how to start a proof, it’s a good indicator that the proof is simple. Just follow the
definitions.
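As a numeric sanity check of the claim (not the proof itself), the following sketch builds a joint distribution from a factorized form 𝑓(𝑋)𝑔(𝑌) and verifies that it equals the product of its marginals. The particular values of f and g are hypothetical.

```python
import numpy as np

# Hypothetical nonnegative functions f over the support of X and g over the
# support of Y; their outer product, normalized, defines a joint P(X, Y).
f = np.array([0.2, 0.5, 0.3])
g = np.array([0.1, 0.4, 0.4, 0.1])

P = np.outer(f, g)
P /= P.sum()  # normalize so P is a valid joint distribution

P_X = P.sum(axis=1)  # marginal of X (sum over Y)
P_Y = P.sum(axis=0)  # marginal of Y (sum over X)

# Independence: the joint equals the product of marginals everywhere.
assert np.allclose(P, np.outer(P_X, P_Y))
```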
Problem 2.
Prove that the expectation is a linear operator:
𝔼_{𝑋∼𝑃(𝑋)}[𝑎𝑓(𝑋) + 𝑏𝑔(𝑋)] = 𝑎 𝔼_{𝑋∼𝑃(𝑋)}[𝑓(𝑋)] + 𝑏 𝔼_{𝑋∼𝑃(𝑋)}[𝑔(𝑋)]
You may treat 𝑋 as a discrete variable.
Hint: Again, use the definition 𝔼_{𝑋∼𝑃(𝑋)}[𝑓(𝑋)] = ∑_𝑋 𝑃(𝑋)𝑓(𝑋).
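A quick numeric check of the identity on a small discrete distribution (the distribution, functions, and constants below are hypothetical):

```python
import numpy as np

# Hypothetical discrete distribution P(X) over support {0, 1, 2, 3}
# and two arbitrary functions f, g of X, with constants a, b.
x = np.arange(4)
P = np.array([0.1, 0.2, 0.3, 0.4])
f = x ** 2
g = np.cos(x)
a, b = 2.0, -3.0

lhs = np.sum(P * (a * f + b * g))            # E[a f(X) + b g(X)]
rhs = a * np.sum(P * f) + b * np.sum(P * g)  # a E[f(X)] + b E[g(X)]
assert np.isclose(lhs, rhs)
```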
Problems 3 and 4 concern the following setting.
Let 𝑋 be a continuous random variable uniformly distributed in the interval [𝑎, 𝑏],
where 𝑎 and 𝑏 are unknown parameters.
We have a dataset 𝒟 of 𝑁 samples, where each data sample is drawn iid from the above distribution,
and we would like to estimate the parameters 𝑎 and 𝑏.
Problem 3.
(a) Give the likelihood of the parameters.
(b) Give the maximum likelihood estimation of the parameters.
Problem 4.
(c) Prove that the MLE is biased in this case.
(d) Prove that the MLE is asymptotically unbiased, i.e., as 𝑁 → ∞.
Hint: A parameter estimate being biased is said in terms of the current dataset 𝒟,
although we imagine that 𝒟 could be repeatedly drawn from a repeatable trial. In other
words, we do not assume 𝑁 goes to infinity in (c). Being asymptotically unbiased means that, as 𝑁
goes to infinity, the bias becomes smaller and converges to 0.
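To build intuition for (c) and (d), the following Monte Carlo sketch estimates the bias of the sample maximum as an estimator of the upper endpoint, for growing 𝑁. It assumes (as Problem 3 invites you to derive) that the MLE of the upper endpoint is the sample maximum; the true parameter values 2.0 and 5.0 are hypothetical. The estimated bias is nonzero for every fixed 𝑁 but shrinks toward 0 as 𝑁 grows.

```python
import numpy as np

# Monte Carlo estimate of the bias of max(samples) as an estimator of b.
# Assumption: the MLE of the upper endpoint is the sample maximum.
rng = np.random.default_rng(0)
a_true, b_true = 2.0, 5.0  # hypothetical true parameters
trials = 20_000

biases = {}
for N in (5, 50, 500):
    samples = rng.uniform(a_true, b_true, size=(trials, N))
    b_hat = samples.max(axis=1)        # estimator: max of the N samples
    biases[N] = b_hat.mean() - b_true  # theory: -(b - a) / (N + 1)
    print(N, biases[N])
```

Since max(samples) can never exceed 𝑏, the bias is always negative for finite 𝑁, which is the qualitative content of (c); its shrinkage with 𝑁 illustrates (d).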