ece, Deep Learning – Assignment 1
Submit by Sept. , pm
tldr: Perform linear regression on a noisy sine wave using a set of Gaussian basis functions with learned location and scale parameters. Model parameters are learned with stochastic gradient descent. Use of automatic differentiation is required. Hint: note your limits!
Problem Statement  Consider a set of scalars {x_1, x_2, . . . , x_N} drawn from U(0, 1) and a corresponding set {y_1, y_2, . . . , y_N} where:

    y_i = \sin(2\pi x_i) + \epsilon_i    (1)

and \epsilon_i is drawn from N(0, \sigma_{\text{noise}}). Given the following functional form:

    \hat{y}_i = \sum_{j=1}^{M} w_j \, \phi_j(x_i \mid \mu_j, \sigma_j) + b    (2)

with:

    \phi(x \mid \mu, \sigma) = \exp\!\left(-\frac{(x - \mu)^2}{\sigma^2}\right)    (3)

find estimates \hat{b}, \{\hat{\mu}_j\}, \{\hat{\sigma}_j\}, and \{\hat{w}_j\} that minimize the loss function:

    J(y, \hat{y}) = \frac{1}{2}\,(y - \hat{y})^2    (4)

for all (x_i, y_i) pairs. Estimates for the parameters must be found using stochastic gradient descent. A framework that supports automatic differentiation must be used. Set N = 50 and \sigma_{\text{noise}} = 0.1. Select M as appropriate. Produce two plots. First, show the data points, a noiseless sine wave, and the manifold produced by the regression model. Second, show each of the M basis functions. Plots must be of suitable visual quality.
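
For concreteness, here is a minimal sketch of one way to set this up, assuming PyTorch as the automatic-differentiation framework. The class name GaussianRBF, the choice M = 8, the log-parameterization of the scales, and the batch size, learning rate, and step count are illustrative assumptions rather than required values.

# Minimal sketch of the model and SGD loop, assuming PyTorch as the autodiff
# framework. M, the learning rate, the batch size, and the number of steps
# are illustrative choices, not values required by the assignment.
import torch

N, M = 50, 8
SIGMA_NOISE = 0.1

# Data: x_i ~ U(0, 1), y_i = sin(2*pi*x_i) + eps_i, eps_i ~ N(0, sigma_noise)
x = torch.rand(N)
y = torch.sin(2 * torch.pi * x) + SIGMA_NOISE * torch.randn(N)

class GaussianRBF(torch.nn.Module):
    """y_hat = sum_j w_j * exp(-(x - mu_j)^2 / sigma_j^2) + b, with learned mu, sigma, w, b."""
    def __init__(self, m):
        super().__init__()
        self.mu = torch.nn.Parameter(torch.rand(m))                 # basis locations
        self.log_sigma = torch.nn.Parameter(torch.zeros(m) - 1.0)   # log-scale keeps sigma > 0
        self.w = torch.nn.Parameter(torch.randn(m) * 0.1)           # mixing weights
        self.b = torch.nn.Parameter(torch.zeros(()))                # bias

    def basis(self, x):
        # phi_j(x) for every input in the batch; result has shape (batch, M)
        sigma = torch.exp(self.log_sigma)
        return torch.exp(-(x[:, None] - self.mu) ** 2 / sigma ** 2)

    def forward(self, x):
        return self.basis(x) @ self.w + self.b

model = GaussianRBF(M)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for step in range(5000):
    idx = torch.randint(0, N, (10,))                       # stochastic mini-batch
    loss = 0.5 * ((y[idx] - model(x[idx])) ** 2).mean()    # J = (1/2)(y - y_hat)^2
    opt.zero_grad()
    loss.backward()                                        # gradients via automatic differentiation
    opt.step()

Parameterizing each sigma_j through its logarithm is just one way to keep the scales positive during optimization; clamping or squaring a raw scale parameter would serve equally well.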
[Figure 1 shows four example panels, each plotting y against x: "Fit 1", "Bases for Fit 1", "Fit 2", and "Bases for Fit 2".]
Figure 1: Example plots for models with equally spaced sigmoid and Gaussian basis functions.
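
The panels above are only examples from a different basis configuration. For the two required deliverables, something along the following lines would work, using matplotlib and continuing from the hypothetical model, x, and y defined in the sketch above; the 200-point evaluation grid and the output file names are assumptions.

# Possible plotting sketch (matplotlib), continuing from the model above.
# Variable names, the dense evaluation grid, and file names are illustrative.
import matplotlib.pyplot as plt
import torch

xs = torch.linspace(0.0, 1.0, 200)
with torch.no_grad():              # no gradient graph needed for plotting
    y_hat = model(xs)
    bases = model.basis(xs)

# Plot 1: data points, noiseless sine wave, and the learned regression manifold.
plt.figure()
plt.scatter(x, y, label="data")
plt.plot(xs, torch.sin(2 * torch.pi * xs), label="sin(2πx)")
plt.plot(xs, y_hat, label="fit")
plt.xlabel("x"); plt.ylabel("y"); plt.title("Fit"); plt.legend()
plt.savefig("fit.png", dpi=200)

# Plot 2: each of the M learned basis functions.
plt.figure()
for j in range(bases.shape[1]):
    plt.plot(xs, bases[:, j])
plt.xlabel("x"); plt.ylabel("y"); plt.title("Bases for Fit")
plt.savefig("bases.png", dpi=200)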