School of Computing Theory Assignment 1 of 4
COMP3670/6670: Introduction to Machine Learning
Maximum credit: 100
Exercise 1 Solving Linear Systems (4+4 credits)
Find the set $S$ of all solutions $x$ of the following inhomogeneous linear systems $Ax = b$, where $A$ and $b$ are defined as follows. Write the solution space $S$ in parametric form.
(a)
$$A = \begin{bmatrix} 0 & 1 & 5 \\ 1 & 4 & 3 \\ 2 & 7 & 1 \end{bmatrix}, \qquad b = \begin{bmatrix} -4 \\ -2 \\ -2 \end{bmatrix}$$
(b)
$$A = \begin{bmatrix} 2 & 3 & 1 \\ 4 & 0 & 3 \end{bmatrix}, \qquad b = \begin{bmatrix} 6 \\ 12 \end{bmatrix}$$
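A hand-derived parametric solution can be sanity-checked numerically. The sketch below (an optional verification aid, not the expected solution method) uses NumPy and SciPy: `lstsq` returns a candidate particular solution together with its residual, and `null_space` returns the homogeneous directions; the system from part (a) is shown, and part (b) works the same way.

```python
import numpy as np
from scipy.linalg import null_space

# System from part (a).
A = np.array([[0., 1., 5.],
              [1., 4., 3.],
              [2., 7., 1.]])
b = np.array([-4., -2., -2.])

# Candidate particular solution; the residual is ~0 iff the system is consistent.
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print("p =", p)
print("||Ap - b|| =", np.linalg.norm(A @ p - b))

# Basis for the solution set of Ax = 0; if the system is consistent, the full
# solution set is p plus all linear combinations of these columns.
print("null-space basis:\n", null_space(A))
```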
Exercise 2 Inverses (4 credits)
For what values of $[a, b, c]^T \in \mathbb{R}^3$ does the inverse of the following matrix exist?
$$\begin{bmatrix} 1 & a & b \\ 1 & 1 & c \\ 1 & 1 & 1 \end{bmatrix}$$
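The inverse exists exactly when the determinant is non-zero, so a symbolic determinant makes the condition on $a, b, c$ easy to read off. A quick check with SymPy (optional, not required by the assignment):

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
M = sp.Matrix([[1, a, b],
               [1, 1, c],
               [1, 1, 1]])

# M is invertible iff det(M) != 0; factoring exposes the condition on a, b, c.
print(sp.factor(M.det()))
```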
Exercise 3 Subspaces (3+3+3+3 credits)
Which of the following sets are subspaces of $\mathbb{R}^2$ or $\mathbb{R}^3$ (as indicated)? Prove your answer. (That is, if it is a subspace, you must demonstrate that the subspace axioms are satisfied, and if it is not a subspace, you must show which axiom fails.)
(a) $A = \{(x, y) \in \mathbb{R}^2 : x \ge 0,\ y \ge 0\}$
(b) $B = \{(x, y, z) \in \mathbb{R}^3 : x + y + z = 0\}$
(c) $C = \{(x, y) \in \mathbb{R}^2 : x = 0 \text{ or } y = 0\}$
(d) $D$ = the set of all solutions $x$ to the matrix equation $Ax = b$, for some matrix $A$ and some vector $b$. (Hint: your answer may depend on $A$ and $b$.)
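When a set fails to be a subspace, a random search often hands you the counterexample to cite in your proof. The helper below is my own illustrative scaffolding (not part of the assignment): it probes closure under addition and scalar multiplication for a membership predicate and returns a witness when a probe fails; note that passing probes prove nothing.

```python
import random

def probe_closure(member, dim, trials=1000, seed=0):
    """Randomly probe closure under + and scalar *; return a counterexample or None."""
    rng = random.Random(seed)
    rand_vec = lambda: tuple(rng.randint(-5, 5) for _ in range(dim))
    for _ in range(trials):
        u, v = rand_vec(), rand_vec()
        if member(u) and member(v):
            s = tuple(ui + vi for ui, vi in zip(u, v))
            if not member(s):
                return ('addition fails', u, v, s)
            lam = rng.randint(-5, 5)
            w = tuple(lam * ui for ui in u)
            if not member(w):
                return ('scaling fails', lam, u, w)
    return None  # no counterexample found (NOT a proof of closure)

# Set (a): the closed first quadrant of R^2.
print(probe_closure(lambda p: p[0] >= 0 and p[1] >= 0, dim=2))
# Set (c): the union of the two coordinate axes in R^2.
print(probe_closure(lambda p: p[0] == 0 or p[1] == 0, dim=2))
```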
Exercise 4 Linear Independence (4+8+8 credits)
Let $V$ and $W$ be vector spaces, and let $T : V \to W$ be a linear transformation.
(a) Prove that $T(0) = 0$.
(b) For any integer $n \ge 1$, prove that for any set of vectors $\{v_1, \ldots, v_n\}$ in $V$ and any set of coefficients $\{c_1, \ldots, c_n\}$ in $\mathbb{R}$,
$$T(c_1 v_1 + \cdots + c_n v_n) = c_1 T(v_1) + \cdots + c_n T(v_n).$$
(c) Let $\{v_1, \ldots, v_n\}$ be a set of linearly dependent vectors in $V$. Define $w_1 := T(v_1), \ldots, w_n := T(v_n)$. Prove that $\{w_1, \ldots, w_n\}$ is a set of linearly dependent vectors in $W$.
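A concrete instance can guide the proof of (c): when $T$ is given by a matrix, any dependence relation among the $v_i$ is inherited by the images $w_i = T(v_i)$ with the very same coefficients. The matrix and vectors below are my own illustrative choices:

```python
import numpy as np

# An illustrative linear map T(v) = T_mat @ v from R^3 to R^2.
T_mat = np.array([[1., 2., 0.],
                  [0., 1., 1.]])

# Dependent vectors: v3 = v1 + 2*v2, i.e. 1*v1 + 2*v2 - 1*v3 = 0.
v1 = np.array([1., 0., 1.])
v2 = np.array([0., 1., 1.])
v3 = v1 + 2 * v2
coeffs = np.array([1., 2., -1.])

# The same coefficients annihilate the images w_i = T(v_i).
W = np.column_stack([T_mat @ v for v in (v1, v2, v3)])
print(W @ coeffs)  # [0. 0.] -- the dependence relation carries over under T
```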
Exercise 5 Inner Products (4+8 credits)
(a) Show that if an inner product $\langle \cdot, \cdot \rangle$ is symmetric and linear in the first argument, then it is bilinear.
(b) Define $\langle \cdot, \cdot \rangle$ for all $x = [x_1, x_2]^T \in \mathbb{R}^2$ and $y = [y_1, y_2]^T \in \mathbb{R}^2$ as
$$\langle x, y \rangle = x_1 y_1 + x_2 y_2 + 2(x_1 y_2 + x_2 y_1).$$
Which of the three inner product axioms does $\langle \cdot, \cdot \rangle$ satisfy?
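For part (b), note that the given form can be written as $\langle x, y \rangle = x^T G y$ with $G = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}$, so symmetry and bilinearity can be read off from $G$, and positive definiteness reduces to the eigenvalues of $G$. A quick numeric check (optional):

```python
import numpy as np

# <x, y> = x^T G y, with G read off from the given formula.
G = np.array([[1., 2.],
              [2., 1.]])
form = lambda x, y: x @ G @ y

x, y = np.array([1., 2.]), np.array([3., -1.])
print(form(x, y), form(y, x))   # equal values illustrate symmetry
print(np.linalg.eigvalsh(G))    # positive definite iff all eigenvalues > 0
```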
Exercise 6 Orthogonality (8+6 credits)
Let $V$ denote a vector space together with an inner product $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{R}$.
Let $x, y$ be non-zero vectors in $V$.
(a) Prove or disprove that if $x$ and $y$ are orthogonal, then they are linearly independent.
(b) Prove or disprove that if $x$ and $y$ are linearly independent, then they are orthogonal.
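Small experiments in $\mathbb{R}^2$ can suggest which way each claim goes before you commit to a proof or a counterexample. The vectors below are my own examples; remember that no single example settles either direction:

```python
import numpy as np

def report(x, y):
    orthogonal = np.isclose(x @ y, 0.0)  # <x, y> = 0 under the dot product?
    independent = np.linalg.matrix_rank(np.column_stack([x, y])) == 2
    print(f"x={x}, y={y}: orthogonal={orthogonal}, independent={independent}")

report(np.array([1., 0.]), np.array([0., 1.]))  # an orthogonal pair
report(np.array([1., 0.]), np.array([1., 1.]))  # independent, not orthogonal
```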
Exercise 7 Properties of Norms (4+4+10 credits)
Given a vector space $V$ with two norms $\|\cdot\|_a : V \to \mathbb{R}_{\ge 0}$ and $\|\cdot\|_b : V \to \mathbb{R}_{\ge 0}$, we say that the two norms $\|\cdot\|_a$ and $\|\cdot\|_b$ are $\varepsilon$-equivalent if for any $v \in V$, we have that
$$\varepsilon \|v\|_a \le \|v\|_b \le \frac{1}{\varepsilon} \|v\|_a,$$
where $\varepsilon \in (0, 1]$. If $\|\cdot\|_a$ is $\varepsilon$-equivalent to $\|\cdot\|_b$, we denote this as $\|\cdot\|_a \overset{\varepsilon}{\sim} \|\cdot\|_b$.
(a) Is $\varepsilon$-equivalence reflexive for all $\varepsilon \in (0, 1]$? (Is it true that $\|\cdot\|_a \overset{\varepsilon}{\sim} \|\cdot\|_a$?)
(b) Is $\varepsilon$-equivalence symmetric for all $\varepsilon \in (0, 1]$? (Does $\|\cdot\|_a \overset{\varepsilon}{\sim} \|\cdot\|_b$ imply $\|\cdot\|_b \overset{\varepsilon}{\sim} \|\cdot\|_a$?)
(c) Assuming that $V = \mathbb{R}^2$, prove that $\|\cdot\|_1 \overset{\varepsilon}{\sim} \|\cdot\|_2$ for the largest $\varepsilon$ possible.
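For (c), the extreme values of the ratio $\|v\|_2 / \|v\|_1$ over non-zero vectors determine which $\varepsilon$ can work. A random search (a numeric exploration only, not a proof) brackets the constants you then have to establish analytically:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.normal(size=(100_000, 2))   # random non-zero directions in R^2

# Ratio ||v||_2 / ||v||_1 for each sample; its extremes hint at the optimal epsilon.
r = np.linalg.norm(V, axis=1) / np.abs(V).sum(axis=1)
print(r.min(), r.max())
```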
Exercise 8 Projections (3+3+3+3 credits)
Consider the Euclidean vector space $\mathbb{R}^3$ with the dot product. A subspace $U \subset \mathbb{R}^3$ and a vector $x \in \mathbb{R}^3$ are given by
$$U = \operatorname{span}\left\{ \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 2 \\ 1 \\ 0 \end{bmatrix} \right\}, \qquad x = \begin{bmatrix} 12 \\ 12 \\ 18 \end{bmatrix}$$
(a) Show that $x \notin U$.
(b) Determine the orthogonal projection of $x$ onto $U$, denoted $\pi_U(x)$.
(c) Show that $\pi_U(x)$ can be written as a linear combination of $[1, 1, 1]^T$ and $[2, 1, 0]^T$.
(d) Determine the distance $d(x, U) := \min_{y \in U} \|x - y\|_2$.
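All four parts can be cross-checked numerically with the standard least-squares projection onto the column space of a basis matrix (an optional check on your hand computation, not a substitute for it):

```python
import numpy as np

B = np.array([[1., 2.],
              [1., 1.],
              [1., 0.]])          # columns are the given spanning vectors of U
x = np.array([12., 12., 18.])

# Normal equations B^T B c = B^T x give the coefficients of pi_U(x) in the basis.
c = np.linalg.solve(B.T @ B, B.T @ x)
proj = B @ c

print("coefficients:", c)                      # part (c)
print("pi_U(x) =", proj)                       # part (b)
print("x in U?", np.allclose(proj, x))         # part (a): False means x not in U
print("d(x, U) =", np.linalg.norm(x - proj))   # part (d)
```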
