CMPUT 466 Assignment 6

Problem 1.
Suppose 90% of the samples are positive (t=1) and 10% are negative (t=0). Compute the precision (P), recall (R), and F1 scores of the majority guess (always predicting t=1). If a denominator in your derivation is 0, compute the limit instead.
Explain the deficiency of the F1 score in this case, and discuss one possible treatment to resurrect the P, R, and F1 scores for this problem.
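As a quick numerical sanity check of the derivation (not part of the assignment statement), the following minimal Python sketch computes P, R, and F1 for the majority guess; the dataset size of 1000 is an assumption chosen purely for illustration.

    import numpy as np

    t = np.array([1] * 900 + [0] * 100)  # labels: 90% positive, 10% negative
    y = np.ones_like(t)                  # majority guess: always predict t=1

    tp = np.sum((y == 1) & (t == 1))     # true positives  = 900
    fp = np.sum((y == 1) & (t == 0))     # false positives = 100
    fn = np.sum((y == 0) & (t == 1))     # false negatives = 0

    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    print(precision, recall, f1)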
Problem 2.
Prove that the sigmoid function σ(z) = 1 / (1 + exp{−z}) is symmetric about the point (0, 0.5); in other words, σ(−z) = 1 − σ(z).
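A numerical check at sample points (not a substitute for the proof) can be done with the short sketch below; it assumes NumPy is available.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    z = np.linspace(-5.0, 5.0, 11)
    # The identity sigmoid(-z) = 1 - sigmoid(z) should hold at every test point.
    assert np.allclose(sigmoid(-z), 1.0 - sigmoid(z))
    print("symmetry holds at all tested points")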
Problem 3.
Prove that minimizing the loss J = −t log y − (1 − t) log(1 − y) is equivalent to minimizing the Kullback-Leibler (KL) divergence between t and y, denoted by KL(t || y), where t = (1 − t, t) and y = (1 − y, y) are two Bernoulli distributions.
For two discrete distributions P = (p_1, ···, p_K) and Q = (q_1, ···, q_K), the KL divergence is defined as

KL(P || Q) = ∑_{k=1}^{K} p_k log(p_k / q_k)
Note: the KL divergence is not symmetric in P and Q. To minimize KL(P || Q) over Q, Q must cover the entire support of P; thus, the learned distribution may be smoother than it should be.
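A minimal sketch of the key step, assuming a soft label t strictly inside (0, 1) so all logarithms stay finite: J and KL(t || y) differ only by the entropy of t, which does not depend on y, so minimizing one minimizes the other. The specific values of t and y below are illustrative assumptions.

    import numpy as np

    def cross_entropy(t, y):
        # J = -t log y - (1 - t) log(1 - y)
        return -t * np.log(y) - (1 - t) * np.log(1 - y)

    def kl_bernoulli(t, y):
        # KL((1 - t, t) || (1 - y, y))
        return t * np.log(t / y) + (1 - t) * np.log((1 - t) / (1 - y))

    t = 0.7  # a soft label strictly inside (0, 1)
    for y in (0.1, 0.3, 0.7, 0.9):
        gap = cross_entropy(t, y) - kl_bernoulli(t, y)
        print(y, gap)  # the gap is identical for every y: it equals H(t)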