CS410: Artificial Intelligence 
Homework 5: Regression & Neural Networks & Bayes Nets

1. Cross entropy loss. Recall the statement in Lecture 8, Slide 67 that
the cross entropy loss function is convex in θ. Prove this statement.
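As a hedged reference point, one common formulation of the loss and the standard convexity argument, assuming the binary logistic-regression setting (the symbols σ, n, x⁽ⁱ⁾, y⁽ⁱ⁾ below are assumed notation; check them against the definition actually used on Slide 67):

\[
J(\theta) = -\sum_{i=1}^{n}\Big[\, y^{(i)}\log\sigma(\theta^{\top}x^{(i)}) + \big(1-y^{(i)}\big)\log\big(1-\sigma(\theta^{\top}x^{(i)})\big) \Big],
\qquad \sigma(z) = \frac{1}{1+e^{-z}}.
\]

One standard route computes the Hessian,

\[
\nabla^{2} J(\theta) = \sum_{i=1}^{n} \sigma(\theta^{\top}x^{(i)})\big(1-\sigma(\theta^{\top}x^{(i)})\big)\, x^{(i)}x^{(i)\top},
\]

which is a sum of nonnegative scalars times outer products and hence positive semidefinite.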
2. Backpropagation.
Figure 1: Problem 2.
(a) Calculate the output values at nodes h1, h2, and ŷ of this network
for input x1 = 0, x2 = 1. Show all steps in your calculation. Assume
that the neurons use the sigmoid activation function.
(b) Compute one step of the backpropagation algorithm with learning rate
η = 1 for a given example with input x1 = 0, x2 = 1 and target output
y = 1, using the new weights and the old weights respectively. Compute
the updated weights for both the hidden layer and the output layer.
Comment on whether a further forward pass gives a lower error. Show all
steps in your calculation. The error on the given example is defined as
E = ½(y − O)², where O is the real-valued network output of that example
at the output node, and y is the integer-valued target output for that
example.
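The actual weights must be read off Figure 1, which is not reproduced here. The following is a minimal Python sketch of the mechanics with placeholder values (w_h, b_h, w_o, b_o, and the presence of bias terms are assumptions, not the figure's):

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Placeholder parameters -- substitute the values shown in Figure 1.
w_h = [[0.5, 0.5], [0.5, 0.5]]   # w_h[j][i]: input i -> hidden unit j
b_h = [0.0, 0.0]                 # hidden biases (assumed)
w_o = [0.5, 0.5]                 # hidden unit j -> output
b_o = 0.0                        # output bias (assumed)
eta = 1.0                        # learning rate
x = [0.0, 1.0]                   # input x1 = 0, x2 = 1
y = 1.0                          # target output

# Forward pass with sigmoid units.
h = [sigmoid(sum(w_h[j][i] * x[i] for i in range(2)) + b_h[j]) for j in range(2)]
O = sigmoid(sum(w_o[j] * h[j] for j in range(2)) + b_o)
E = 0.5 * (y - O) ** 2

# Backward pass for E = ½(y − O)².
delta_o = (O - y) * O * (1 - O)                                     # dE/d(net_o)
delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]  # dE/d(net_hj)

# Gradient-descent updates (old weights appear on the right-hand sides).
new_w_o = [w_o[j] - eta * delta_o * h[j] for j in range(2)]
new_b_o = b_o - eta * delta_o
new_w_h = [[w_h[j][i] - eta * delta_h[j] * x[i] for i in range(2)] for j in range(2)]
new_b_h = [b_h[j] - eta * delta_h[j] for j in range(2)]

Note that the hidden-layer deltas above use the old output-layer weights, which matches the usual presentation of a single backpropagation step.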
3. Bayes Nets. Consider the following Bayes net. Calculate the marginal
and conditional probabilities P(¬P3), P(P2 | ¬P3), P(P1 | P2, ¬P3), and
P(P1 | ¬P3, P4) using inference by enumeration. Show all steps in
your calculation.
Figure 2: Problem 3.
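The network structure and CPTs come from Figure 2 and are not reproduced here. The Python sketch below only illustrates the enumeration pattern; joint_prob, its assumed chain factorization, and every number in it are placeholders:

from itertools import product

VARS = ["P1", "P2", "P3", "P4"]

def bern(p, value):
    # Probability of a Boolean variable taking `value` when P(True) = p.
    return p if value else 1.0 - p

def joint_prob(a):
    # Placeholder factorization and numbers -- the real ones follow the
    # arrows and CPTs in Figure 2. This stand-in assumes the chain
    # P(P1) P(P2 | P1) P(P3 | P2) P(P4 | P3).
    return (bern(0.5, a["P1"])
            * bern(0.6 if a["P1"] else 0.2, a["P2"])
            * bern(0.7 if a["P2"] else 0.1, a["P3"])
            * bern(0.8 if a["P3"] else 0.3, a["P4"]))

def prob(query, evidence):
    # P(query | evidence) by summing the joint over every assignment
    # to the remaining (hidden) variables.
    def summed(fixed):
        free = [v for v in VARS if v not in fixed]
        return sum(joint_prob({**fixed, **dict(zip(free, vals))})
                   for vals in product([True, False], repeat=len(free)))
    return summed({**evidence, **query}) / summed(evidence)

print("P(¬P3)          ≈", prob({}, {"P3": False}))
print("P(P2 | ¬P3)     ≈", prob({"P2": True}, {"P3": False}))
print("P(P1 | P2, ¬P3) ≈", prob({"P1": True}, {"P2": True, "P3": False}))
print("P(P1 | ¬P3, P4) ≈", prob({"P1": True}, {"P3": False, "P4": True}))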
4. Bayes Nets. Consider the same Bayes net as in Exercise 3. Compute
P(¬P3) and P(P2 | ¬P3) using variable elimination. Compare the computational complexity of inference by enumeration and variable elimination and discuss your findings. Show all steps in your calculation.
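For contrast with the enumeration sketch above, here is a minimal variable-elimination pass for P(¬P3) on the same placeholder chain factorization (again, the factors and numbers are assumptions, not Figure 2's; P4 is omitted because it sums to 1 for this query):

# Factors as dicts mapping value tuples to probabilities.
f_P1 = {(True,): 0.5, (False,): 0.5}                      # P(P1)
f_P2 = {(True, True): 0.6, (True, False): 0.4,            # P(P2 | P1): key = (p1, p2)
        (False, True): 0.2, (False, False): 0.8}
f_P3 = {(True, True): 0.7, (True, False): 0.3,            # P(P3 | P2): key = (p2, p3)
        (False, True): 0.1, (False, False): 0.9}

# Eliminate P1: tau1(P2) = sum_{p1} P(p1) P(P2 | p1)
tau1 = {p2: sum(f_P1[(p1,)] * f_P2[(p1, p2)] for p1 in (True, False))
        for p2 in (True, False)}

# Eliminate P2: tau2(P3) = sum_{p2} tau1(p2) P(P3 | p2)
tau2 = {p3: sum(tau1[p2] * f_P3[(p2, p3)] for p2 in (True, False))
        for p3 in (True, False)}

print("P(¬P3) ≈", tau2[False])

Each elimination step only multiplies and sums the factors that mention the eliminated variable, rather than enumerating all 2ⁿ assignments, which is the source of the complexity difference the exercise asks you to discuss.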
5. Independence. Answer the following questions by explicitly showing all
steps in your calculation.
(a) Is D independent of A given B in Figure 3?
(b) Is D independent of C given E in Figure 4?
(c) Is D independent of A given E in Figure 5?
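All three parts reduce to d-separation checks on the graphs in Figures 3–5, which are not reproduced here. The Python helper below is a self-contained sketch of the ancestral-graph / moralization test; edges_fig3 is a placeholder edge list, not the actual figure:

from collections import defaultdict

def d_separated(edges, x, y, given):
    # Check whether x and y are d-separated by the set `given` in the DAG
    # described by `edges` (a list of (parent, child) pairs read off the figure).
    parents = defaultdict(set)
    for u, v in edges:
        parents[v].add(u)

    # 1. Keep only the ancestors of the variables involved in the query.
    relevant, stack = set(), [x, y, *given]
    while stack:
        node = stack.pop()
        if node not in relevant:
            relevant.add(node)
            stack.extend(parents[node])

    # 2. Moralize: link each node to its parents and "marry" co-parents,
    #    dropping edge directions.
    neighbours = defaultdict(set)
    for v in relevant:
        ps = parents[v] & relevant
        for p in ps:
            neighbours[v].add(p)
            neighbours[p].add(v)
        for p in ps:
            for q in ps:
                if p != q:
                    neighbours[p].add(q)

    # 3. x and y are d-separated by `given` iff removing `given`
    #    disconnects them in the moralized ancestral graph.
    blocked, seen, stack = set(given), set(), [x]
    while stack:
        node = stack.pop()
        if node == y:
            return False
        if node in seen or node in blocked:
            continue
        seen.add(node)
        stack.extend(neighbours[node])
    return True

# Placeholder edges -- replace with the arrows shown in Figure 3 (and similarly
# for Figures 4 and 5). This particular graph is an assumption, not the figure.
edges_fig3 = [("A", "B"), ("B", "C"), ("B", "D"), ("C", "E")]
print(d_separated(edges_fig3, "D", "A", {"B"}))   # pattern for question (a)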
6. Likelihood Weighting. Consider the following Bayesian network and
the corresponding probabilities. Assume we generate the following six
samples given the evidence I1 = T and I2 = F: (W1, I1, W2, I2) =
{(S, T, R, F), (R, T, R, F), (S, T, R, F), (S, T, S, F), (S, T, S, F), (R, T, S, F)}
(a) What is the weight of the first sample (S, T, R, F) above?
(b) Use likelihood weighting to estimate P(W2 | I1 = T, I2 = F).
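The CPTs needed for the weights are given in Figure 6, which is not reproduced here. The Python sketch below shows the mechanics with placeholder numbers; it assumes I1's parent is W1 and I2's parent is W2, and both probability tables are made up:

from collections import defaultdict

# Samples as (W1, I1, W2, I2), copied from the problem statement.
samples = [("S", "T", "R", "F"), ("R", "T", "R", "F"), ("S", "T", "R", "F"),
           ("S", "T", "S", "F"), ("S", "T", "S", "F"), ("R", "T", "S", "F")]

# Placeholder evidence CPTs -- substitute the values from Figure 6.
p_I1_T_given_W1 = {"S": 0.8, "R": 0.3}   # P(I1 = T | W1)
p_I2_F_given_W2 = {"S": 0.1, "R": 0.6}   # P(I2 = F | W2)

weighted = defaultdict(float)
total = 0.0
for w1, i1, w2, i2 in samples:
    # Likelihood weight = product of P(observed evidence value | sampled parents).
    w = p_I1_T_given_W1[w1] * p_I2_F_given_W2[w2]
    weighted[w2] += w
    total += w

# Estimate P(W2 | I1 = T, I2 = F) as normalized weighted counts.
for value in ("S", "R"):
    print(f"P(W2 = {value} | I1=T, I2=F) ≈ {weighted[value] / total:.3f}")

Each sample's weight is the product of the probabilities of the observed evidence values given their sampled parents, and the estimate is the normalized weighted count of each W2 value.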
Figure 3: Problem 5.1.
Figure 4: Problem 5.2.
Figure 5: Problem 5.3.
Figure 6: Problem 6.
