HW 1 Orthogonal matrices

Submit your assignments via gradescope.com. We have yet to set this up, so we will post an announcement soon about how to do this.
1. Orthogonal matrices (5 points each)
(a) Show that if $U$ is an orthogonal matrix, then for all $x \in \mathbb{R}^d$, $\|x\| = \|Ux\|$, where the norm is the Euclidean norm.
(b) Show that all $2 \times 2$ orthogonal matrices have the form
$$\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \quad \text{or} \quad \begin{bmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{bmatrix}.$$
In which case does $Ux$ correspond to clockwise rotation of $x$?
(c) Express the equation of an ellipse in $\mathbb{R}^2$ in the form
$$\left\{ x \,\middle|\, (x - c)^T A (x - c) = r^2 \right\},$$
where $c \in \mathbb{R}^2$, $r > 0$, and $A = U \Lambda U^T$, where $U$ is orthogonal and $\Lambda$ is diagonal. Choose $c$, $r$, $U$, and $\Lambda$ such that the major axis has length 3, the minor axis has length 1, the center of the ellipse is at the point $[3\ {-1}]^T$, and the major axis makes an angle of $+\pi/6$ radians with the positive $x$-axis.
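A quick numerical sanity check for Problem 1, written as a minimal NumPy sketch. This is illustration only, not part of the graded solution; the particular $c$, $r$, $U$, and $\Lambda$ below are one candidate choice for part (c), shown to demonstrate how an answer can be tested, not asserted as the required one.

import numpy as np

rng = np.random.default_rng(0)
theta = np.pi / 6
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by pi/6, orthogonal

# Part (a): orthogonal matrices preserve the Euclidean norm.
x = rng.standard_normal(2)
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))

# Part (c): with semi-axes a = 3/2 and b = 1/2 (axis lengths 3 and 1) and
# r = 1, the diagonal entries of Lambda must be 1/a^2 and 1/b^2 (assumption:
# one candidate choice, offered only to illustrate the check).
a, b, r = 1.5, 0.5, 1.0
Lam = np.diag([1 / a**2, 1 / b**2])
A = U @ Lam @ U.T
c = np.array([3.0, -1.0])

# Sample points on the candidate ellipse along its principal axes and verify
# that each satisfies (x - c)^T A (x - c) = r^2.
t = np.linspace(0.0, 2.0 * np.pi, 200)
pts = c[:, None] + U @ np.vstack([a * np.cos(t), b * np.sin(t)])
diffs = pts - c[:, None]
assert np.allclose(np.sum(diffs * (A @ diffs), axis=0), r**2)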
2. Positive (semi-)definite matrices (5 points each)
Let $A$ be a real, symmetric $d \times d$ matrix. We say $A$ is positive semi-definite (PSD) if, for all $x \in \mathbb{R}^d$, $x^T A x \geq 0$. We say $A$ is positive definite (PD) if, for all $x \neq 0$, $x^T A x > 0$. We write $A \succeq 0$ when $A$ is PSD, and $A \succ 0$ when $A$ is PD.
The spectral theorem (which we will assume without proof) says that every real symmetric matrix $A$ can be expressed via the spectral decomposition
$$A = U \Lambda U^T,$$
where $U$ is a $d \times d$ matrix such that $U U^T = U^T U = I$ (called an orthogonal matrix), and $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_d)$. Multiplying on the right by $U$, we see that $AU = U \Lambda$. If we let $u_i$ denote the $i$-th column of $U$, we have $A u_i = \lambda_i u_i$ for each $i$. This expression reveals that the $\lambda_i$ are eigenvalues of $A$, and the corresponding columns of $U$ are eigenvectors associated to the $\lambda_i$. The eigenvalues constitute the “spectrum” of $A$, and the spectral decomposition is also called the eigenvalue decomposition of $A$.
Using the spectral decomposition, show that
(a) $A$ is PSD iff $\lambda_i \geq 0$ for each $i$.
(b) $A$ is PD iff $\lambda_i > 0$ for each $i$.
Hint: Use the identity
$$U \Lambda U^T = \sum_{i=1}^{d} \lambda_i u_i u_i^T,$$
which can be verified just by showing that the matrices representing the left and right hand sides have the same entries.
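A minimal NumPy sketch (illustration only, not part of the assignment) that checks the hint's rank-one expansion and the eigenvalue criterion on a random symmetric matrix:

import numpy as np

rng = np.random.default_rng(0)
d = 4
B = rng.standard_normal((d, d))
A = (B + B.T) / 2                          # random real symmetric matrix

# Spectral decomposition: eigh returns eigenvalues and orthonormal columns.
lam, U = np.linalg.eigh(A)
assert np.allclose(U @ U.T, np.eye(d))     # U is orthogonal
assert np.allclose(U @ np.diag(lam) @ U.T, A)

# The hint's identity: U Lambda U^T = sum_i lam_i u_i u_i^T.
S = sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(d))
assert np.allclose(S, A)

# Consistency with (a): shift A so every eigenvalue is nonnegative; random
# quadratic forms x^T A x should then be nonnegative (up to roundoff).
A_psd = A - min(lam.min(), 0.0) * np.eye(d)
for _ in range(100):
    x = rng.standard_normal(d)
    assert x @ A_psd @ x >= -1e-10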
3. Unconstrained optimization (5 points each)
(a) Show that if f is strictly convex, then f has at most one global minimizer.
(b) Use the Hessian to give a simple proof that the sum of two convex functions is convex. You may assume the functions are twice continuously differentiable.
(c) Consider the function $f(x) = \frac{1}{2} x^T A x + b^T x + c$, where $A$ is a symmetric $d \times d$ matrix. Derive the Hessian of $f$. Under what conditions on $A$ is $f$ convex? Strictly convex?
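For part (c), a finite-difference sketch (illustration only; deriving the closed form is the exercise) showing how a candidate Hessian for this $f$ can be tested numerically:

import numpy as np

rng = np.random.default_rng(1)
d = 3
M = rng.standard_normal((d, d))
A = (M + M.T) / 2                      # symmetric d x d
b = rng.standard_normal(d)
c = 0.7

f = lambda x: 0.5 * x @ A @ x + b @ x + c

# Central-difference estimate of the Hessian of f at a point x0.
x0, h = rng.standard_normal(d), 1e-3
E = np.eye(d)
H = np.array([[(f(x0 + h*E[i] + h*E[j]) - f(x0 + h*E[i] - h*E[j])
                - f(x0 - h*E[i] + h*E[j]) + f(x0 - h*E[i] - h*E[j])) / (4 * h**2)
               for j in range(d)] for i in range(d)])

# For this quadratic f the estimate is independent of x0; compare H against
# the Hessian you derive, and its eigenvalues against your convexity criteria.
print(np.round(H, 6))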
4. Optional – Do not turn in – no extra credit
In this problem you will prove some of the properties of unconstrained optimization problems discussed in class. It will be helpful to understand the proofs presented in the notes. The following multivariable versions of Taylor’s Theorem will also be helpful. A twice continuously differentiable function admits the quadratic expansion
$$f(x) = f(y) + \big\langle \nabla f(y),\, x - y \big\rangle + \frac{1}{2} \big\langle x - y,\, \nabla^2 f(y)(x - y) \big\rangle + o\big(\|x - y\|^2\big),$$
where $o(t)$ denotes a function satisfying $\lim_{t \to 0} o(t)/t = 0$, as well as the expansion
$$f(x) = f(y) + \big\langle \nabla f(y),\, x - y \big\rangle + \frac{1}{2} \big\langle x - y,\, \nabla^2 f\big(y + t(x - y)\big)(x - y) \big\rangle$$
for some $t \in (0, 1)$ possibly depending on $x$ and $y$.
(a) Show that if $f$ is twice continuously differentiable and $x^*$ is a local minimizer, then $\nabla^2 f(x^*) \succeq 0$, i.e., the Hessian of $f$ is positive semi-definite at the local minimizer $x^*$.
(b) Show that if $f$ is twice continuously differentiable, then $f$ is convex if and only if the Hessian $\nabla^2 f(x)$ is positive semi-definite for all $x \in \mathbb{R}^d$.

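The first of the two expansions above can also be watched numerically. A short sketch (using an arbitrary smooth test function chosen only for illustration, with its gradient and Hessian computed by hand) showing the remainder vanishing faster than $\|x - y\|^2$:

import numpy as np

# Smooth test function with hand-computed gradient and Hessian (assumption:
# chosen only to illustrate the expansion, not tied to the assignment).
f    = lambda x: np.exp(x[0]) + np.sin(x[1]) + x[0] * x[1]
grad = lambda x: np.array([np.exp(x[0]) + x[1], np.cos(x[1]) + x[0]])
hess = lambda x: np.array([[np.exp(x[0]), 1.0],
                           [1.0, -np.sin(x[1])]])

y = np.array([0.3, -0.2])
v = np.array([1.0, 2.0])
for eps in [1e-1, 1e-2, 1e-3, 1e-4]:
    x = y + eps * v
    quad = f(y) + grad(y) @ (x - y) + 0.5 * (x - y) @ hess(y) @ (x - y)
    # o(||x - y||^2) means this ratio should tend to 0 as eps shrinks.
    print(eps, abs(f(x) - quad) / np.linalg.norm(x - y) ** 2)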