final exam review Flashcards

1
Q

solution for continuous dynamic systems with eigenvalues p +/- iq

A

for the linear system dx/dt = Ax, where A is a 2 x 2 real matrix with eigenvalues p +/- iq (and q != 0),

consider an eigenvector v + iw with eigenvalue p + iq.

Then x(t) = e^(pt) * S * [ cos(qt) -sin(qt) | sin(qt) cos(qt) ] * S^-1 * x0

where S = [w v]. (S^-1 * x0 is the coordinate vector of x0 with respect to the basis w, v.)
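A quick numerical check of this closed form. The matrix A below is just a sample (not from the card); the formula is verified by checking that x(t) satisfies dx/dt = Ax via a finite difference.

```python
import numpy as np

# Sample matrix with complex eigenvalues 1 +/- i (so p = 1, q = 1).
A = np.array([[0.0, -2.0],
              [1.0,  2.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.imag)          # pick the eigenvalue p + iq with q > 0
p, q = vals[k].real, vals[k].imag
u = vecs[:, k]                    # eigenvector v + iw
v, w = u.real, u.imag
S = np.column_stack([w, v])       # S = [w v]

def x(t, x0):
    R = np.array([[np.cos(q*t), -np.sin(q*t)],
                  [np.sin(q*t),  np.cos(q*t)]])
    return np.exp(p*t) * S @ R @ np.linalg.solve(S, x0)

# The formula should satisfy dx/dt = A x (central-difference test).
x0 = np.array([1.0, -1.0])
t, h = 0.7, 1e-6
deriv = (x(t + h, x0) - x(t - h, x0)) / (2 * h)
print(np.allclose(deriv, A @ x(t, x0), atol=1e-4))
```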

2
Q

trajectories of continuous dynamical systems with eigenvalues p +/- iq

A

ellipses (linearly distorted circles) if p = 0
spirals inward if p is negative
spirals outward if p is positive.

3
Q

stability of continuous dynamic systems (three conditions)

A

for the system dx/dt = Ax, where A is a real 2 x 2 matrix, the zero state is an asymptotically stable equilibrium solution iff tr A < 0 and det A > 0. Equivalently, iff the real parts of all eigenvalues of A are negative.
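The two criteria can be checked against each other numerically; the matrix below is just a sample chosen to satisfy both.

```python
import numpy as np

A = np.array([[-1.0,  3.0],
              [-2.0, -1.0]])   # tr = -2 < 0, det = 1 + 6 = 7 > 0

trace_test = np.trace(A) < 0 and np.linalg.det(A) > 0
eig_test = all(lam.real < 0 for lam in np.linalg.eig(A)[0])
print(trace_test, eig_test)    # the two tests should always agree for 2 x 2 A
```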

4
Q

euler’s formula

A

e^it = cos(t) + i sin(t)
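Euler's formula is easy to spot-check with the standard library:

```python
import cmath
import math

# e^(it) = cos t + i sin t, checked at an arbitrary t
t = 1.234
lhs = cmath.exp(1j * t)
rhs = complex(math.cos(t), math.sin(t))
print(abs(lhs - rhs) < 1e-12)   # True
```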

5
Q

complex exponential function characterization

A

if Y is a complex number, then z = e^(Yt) is the unique complex-valued function such that

dz/dt = Yz and
z(0) = 1
6
Q

general solution of a continuous dynamical system

A

for the system dx/dt = Ax, suppose there is an eigenbasis v1, …, vn for A, with associated eigenvalues Y1, …, Yn.

Then the general solution of the system is
x(t) = c1*e^(Y1 t)*v1 + … + cn*e^(Yn t)*vn

OR, in matrix form,

x(t) = [eigenbasis matrix] * [diagonal matrix of e^(Yi t)] * [vector of coordinates ci]
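The matrix form can be sketched in numpy; the diagonalizable matrix A below is a sample, and the solution is verified by checking dx/dt = Ax with a finite difference.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])     # eigenvalues 1 and 3

vals, S = np.linalg.eig(A)     # columns of S form an eigenbasis

def x(t, x0):
    c = np.linalg.solve(S, x0)            # coordinates of x0 in the eigenbasis
    return S @ (np.exp(vals * t) * c)     # sum of c_i e^(Y_i t) v_i

x0 = np.array([1.0, 0.0])
t, h = 0.5, 1e-6
deriv = (x(t + h, x0) - x(t - h, x0)) / (2 * h)
print(np.allclose(deriv, A @ x(t, x0), atol=1e-4))
```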

7
Q

how to solve a linear differential equation

A

for the linear differential equation dx/dt = kx with initial value x0,
the solution is x(t) = e^(kt) * x0

8
Q

ways a linear dynamical system can be modeled

A

discrete: x(t + 1) = Bx(t)
continuous: dx/dt = Ax

9
Q

singular value decomposition

A

any n x m matrix A can be written as A = U Z V^T.

Z is the diagonal matrix of the singular values of A. (It always has the same dimensions as the original matrix.) The singular values are the square roots of the eigenvalues of the symmetric matrix A^T A.

V holds an orthonormal eigenbasis of A^T A. (Orthonormalizing that basis usually only involves rescaling the vectors to unit length, since eigenvectors of a symmetric matrix for distinct eigenvalues are already perpendicular.)

Each column vector of U is produced by
u1 = (1/sigma1) * A * v1
and likewise for u2, u3, and so on.
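This recipe can be carried out directly and compared against numpy's built-in SVD; the matrix A is a sample.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

eigvals, V = np.linalg.eigh(A.T @ A)      # orthonormal eigenbasis of A^T A
order = np.argsort(eigvals)[::-1]         # sort by decreasing eigenvalue
eigvals, V = eigvals[order], V[:, order]
sigmas = np.sqrt(eigvals)                 # singular values

U = (A @ V) / sigmas                      # u_i = (1/sigma_i) A v_i, column by column

# A should factor as U Z V^T, and the sigmas should match numpy's.
print(np.allclose(U @ np.diag(sigmas) @ V.T, A))
print(np.allclose(sigmas, np.linalg.svd(A, compute_uv=False)))
```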

10
Q

significance of singular value decompositions

A

for L(x) = Ax (a linear transformation from R^m to R^n), there is an orthonormal basis v1, v2, …, vm of R^m such that

the vectors L(vi) are orthogonal, and
their lengths are the singular values of the matrix A.

v1, …, vm form the orthonormal eigenbasis of A^T A, i.e., the columns of V in the singular value decomposition.
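Both properties can be spot-checked on a sample matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

U, sigmas, Vt = np.linalg.svd(A)
images = A @ Vt.T                  # columns are A v_i

print(abs(images[:, 0] @ images[:, 1]) < 1e-10)             # A v_1 and A v_2 orthogonal
print(np.allclose(np.linalg.norm(images, axis=0), sigmas))  # lengths = singular values
```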

11
Q

definiteness of a quadratic form

A

For a quadratic form q(x) = x * Ax, where A is a symmetric n x n matrix,

A is positive definite if q(x) > 0 for all nonzero x,
positive semidefinite if q(x) >= 0 for all x,
and negative definite/semidefinite analogously.
Indefinite if q takes both positive and negative values.

A symmetric matrix is positive definite iff all of its eigenvalues are positive; positive semidefinite iff all are >= 0; and so forth.
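The eigenvalue test can be compared against sampling q(x) on random nonzero vectors; the symmetric matrix below is a sample.

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])       # eigenvalues 1 and 3, so positive definite

print(np.all(np.linalg.eigvalsh(A) > 0))   # eigenvalue test

rng = np.random.default_rng(0)
xs = rng.standard_normal((100, 2))         # random nonzero vectors
# q(x) = x . Ax for each sample row
qvals = np.einsum('ij,jk,ik->i', xs, A, xs)
print(np.all(qvals > 0))
```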

12
Q

how to diagonalize a quadratic form

A
for q(x) = x * Ax,
find an orthonormal eigenbasis B = v1, …, vn with eigenvalues Y1, …, Yn.

Then q(x) = Y1*c1^2 + … + Yn*cn^2

where the ci are the coordinates of x with respect to B.

13
Q

principal axes

A

the eigenspaces of A are the principal axes of q when q(x) = x * Ax
(assuming the eigenvalues of A are distinct, so that each eigenspace is one-dimensional).

14
Q

orthogonal diagonalization of a symmetric matrix A

A

find the eigenvalues of A and a basis of each eigenspace.
Use Gram-Schmidt to find an orthonormal basis of each eigenspace.
Form an orthonormal eigenbasis v1, v2, …, vn for A by concatenating the orthonormal bases found.

S = [ v1 v2 … vn ] is orthogonal, and S^-1 A S will be diagonal. Computing the latter finishes the diagonalization.
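In numpy, `eigh` already returns an orthonormal eigenbasis for a symmetric matrix, so the whole procedure collapses to one call; the matrix below is a sample (with a repeated eigenvalue, to show that case works too).

```python
import numpy as np

A = np.array([[4.0, 2.0, 2.0],
              [2.0, 4.0, 2.0],
              [2.0, 2.0, 4.0]])   # symmetric; eigenvalues 2, 2, 8

vals, S = np.linalg.eigh(A)      # S has an orthonormal eigenbasis as columns
D = S.T @ A @ S                  # = S^-1 A S, since S is orthogonal

print(np.allclose(S.T @ S, np.eye(3)))   # S is orthogonal
print(np.allclose(D, np.diag(vals)))     # S^T A S is diagonal
```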

15
Q

eigenvalue test for matrix symmetry

A

a symmetric n x n matrix A has n real eigenvalues if they are counted with their algebraic multiplicities

16
Q

spectral theorem

A

a matrix A is orthogonally diagonalizable (there exists an orthogonal matrix S such that S^-1 A S = S^T A S is diagonal) iff A is symmetric (A^T = A)

17
Q

orthogonality of eigenvectors

A

if A is symmetric and v1 and v2 are eigenvectors of A with distinct eigenvalues, then v1 * v2 = 0, i.e., the two are orthogonal.

18
Q

modulus and argument

A

modulus: |z|, where z = a + bi, equals sqrt(a^2 + b^2)
argument: the polar angle of the complex number.
Found by drawing the vector [a b] and then using trig to find the angle (usually arctan(b/a), adjusted for the quadrant).

arg(zw) = arg(z) + arg(w)
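Python's `cmath` exposes both quantities directly (`abs` for modulus, `cmath.phase` for argument):

```python
import cmath
import math

z = complex(3, 4)
w = complex(1, 1)

print(abs(z) == 5.0)                                  # sqrt(3^2 + 4^2)
print(math.isclose(cmath.phase(z), math.atan2(4, 3))) # argument via trig

# arg(zw) = arg(z) + arg(w); no wrap-around past pi for this pair
print(math.isclose(cmath.phase(z * w), cmath.phase(z) + cmath.phase(w)))
```

(In general the identity holds only up to a multiple of 2*pi, since `phase` returns values in (-pi, pi].)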

19
Q

polar form

A

z = r(cos Z + i sin Z)

where Z is the polar angle (argument) and r is the modulus of the complex number

20
Q

de moivre’s formula

A

(cos Z + i sin Z)^n = cos(nZ) + i sin(nZ)

21
Q

trace, determinant, complex eigenvalues

A

tr A = Y1 + … + Yn
det A = Y1 * … * Yn
where Y1, …, Yn are all the (complex) eigenvalues, counted with algebraic multiplicity.

22
Q

stability and eigenvalues

A

for a discrete dynamical system x(t + 1) = Ax(t), the zero state is asymptotically stable iff the moduli of all the complex eigenvalues of A are less than 1.

23
Q

discrete model with complex eigenvalues - solution

A

for x(t + 1) = Ax(t), with eigenvalues p +/- iq = r(cos Z +/- i sin Z),

let v + iw be an eigenvector of A with eigenvalue p + iq. Then

x(t) = r^t * S * [ cos(Zt) -sin(Zt) | sin(Zt) cos(Zt) ] * S^-1 * x0

where S = [w v]

24
Q

how to determine if a matrix is diagonalizable

A

we wish to find an invertible matrix S such that S^-1 A S = B is diagonal.

a. find the eigenvalues of A
b. for each eigenvalue, find a basis of its eigenspace
c. A is diagonalizable iff the dimensions of the eigenspaces add up to n.
Then S = [ v1 … vn ] (the eigenbasis) and B = [diagonal matrix of the eigenvalues].

25
Q

eigenvalues of similar matrices

A

if A is similar to B, then
A and B have the same characteristic polynomial,
rank A = rank B and nullity A = nullity B,
A and B have the same eigenvalues (with the same multiplicities),
and the same determinant and trace.

26
Q

characterizations of invertible

A
Ax = b has a unique solution for all b in Rn
rref A = In
rank A = n
im A = Rn
ker A = {0}
columns of A form a basis of Rn
columns of A span Rn
columns of A are linearly independent
det A != 0
0 fails to be an eigenvalue of A
27
Q

volume of a parallelepiped

A

for a 3 x 3 matrix A = [v1 v2 v3], the volume of the parallelepiped defined by its column vectors is |det A|.

For an n x m matrix A it is sqrt(det(A^T * A)).
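Both formulas in a quick numpy check, with sample matrices; for a square matrix the two agree.

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
vol = abs(np.linalg.det(A))                              # volume of the parallelepiped
print(np.isclose(vol, 6.0))
print(np.isclose(np.sqrt(np.linalg.det(A.T @ A)), vol))  # square case: formulas agree

# a 3 x 2 matrix: area of the parallelogram spanned by its two columns
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
print(np.isclose(np.sqrt(np.linalg.det(B.T @ B)), 1.0))
```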

28
Q

expansion factor

A

area of T(region) / area of region; for T(x) = Ax this equals |det A|.

29
Q

determinant of the transpose

A

if A is square, det(A^T) = det A

30
Q

elementary row operations and determinants

A

if B comes from dividing a row of A by a scalar k, det B = (1/k) det A
row swap: det B = -det A
adding a multiple of a row to another row: det B = det A

31
Q

determinants of powers and products

A

det(AB) = det A * det B

so det(A^n) = (det A)^n

32
Q

determinants of similar matrices

A

det A = det B

33
Q

determinants of inversion of a matrix

A

det(A^-1) = 1/det A

34
Q

imA and A^T

A

(im A)^perp = ker(A^T)

^perp is the orthogonal complement: the set of vectors x in Rn orthogonal to all vectors in im A.

35
Q

kernels and transpose

A
ker A = ker(A^T A)
if ker A = {0}, then A^T A is invertible.
36
Q

least squares solution

A

for Ax = b, a least-squares solution is the x* that minimizes ||b - Ax|| over all x in R^m.

the least-squares solution is the exact solution of the consistent system
A^T A x = A^T b. That's the normal equation of Ax = b.
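The normal equation can be solved directly and compared with numpy's least-squares solver; the inconsistent system below is a sample.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])   # no exact solution to Ax = b

x_normal = np.linalg.solve(A.T @ A, A.T @ b)       # normal equation
x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]     # built-in least squares
print(np.allclose(x_normal, x_lstsq))
```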

37
Q

the matrix of the orthogonal projection using the transpose

A

A (A^T A)^-1 A^T is that matrix: the orthogonal projection onto im A (when the columns of A are linearly independent).
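A projection matrix should be symmetric, idempotent, and leave im A fixed; a quick check on a sample matrix:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # sample matrix with independent columns

P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P, P.T))      # symmetric
print(np.allclose(P @ P, P))    # idempotent (projecting twice changes nothing)
print(np.allclose(P @ A, A))    # fixes the columns of A, i.e. im A
```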

38
Q

matrix of orthogonal projection found by an orthonormal basis

A

for a subspace V with orthonormal basis u1, …, um, the matrix P of the orthogonal projection onto V is

P = QQ^T, where Q is the matrix with columns u1, …, um

39
Q

properties of the transpose

A
(A + B) ^ T = A^T + B^T
(kA)^T = k(A^T)
(AB)^T = B^T A^T
rank(A^T) = rankA
(A^T)^-1 = (A^-1)^T
40
Q

characteristics of orthogonal matrices

A

A is orthogonal.
L(x) = Ax preserves length.
columns of A form an orthonormal basis of Rn.
A^T A = In
A^-1 = A^T
A preserves the dot product: (Ax) * (Ay) = x * y for all x and y.

41
Q

angle between two vectors

A

arccos[ (x * y) / (||x|| ||y||) ]

the quantity inside the arccos is also the correlation coefficient (for centered data vectors).

42
Q

formula for the orthogonal projection

A

(different from Gram-Schmidt)

proj_V(x) = (u1 * x)u1 + … + (um * x)um

where u1, …, um is an orthonormal basis of V

43
Q

coordinates in a subspace of Rn

A

[x]_B is the coordinate vector (c1, …, cm) in the expansion x = c1v1 + c2v2 + … + cmvm,

where v1, …, vm are the vectors of the basis B of the subspace V.
Coordinates are linear: [x + y]_B = [x]_B + [y]_B and [kx]_B = k[x]_B.

44
Q

rank-nullity theorem

A

dim(ker A) + dim(im A) = m for any n x m matrix A.

45
Q

matrix for orthogonal projection onto a line

A

[u1^2 u1u2
u1u2 u2^2]

where [u1 u2] is a unit vector parallel to L, the line being projected onto.
Equivalently, proj_L(x) = (u * x)u.

46
Q

reflection

A

[a b
b -a]

found with 2 projL(x) - x for a specific line; here a^2 + b^2 = 1.
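Building the reflection as 2 proj_L - I and checking its properties, with a sample unit vector:

```python
import numpy as np

u = np.array([0.6, 0.8])           # unit vector along the line L
P = np.outer(u, u)                 # projection matrix onto L
R = 2 * P - np.eye(2)              # reflection about L, of the form [[a, b], [b, -a]]

a, b = R[0, 0], R[0, 1]
print(np.isclose(a**2 + b**2, 1.0))        # a^2 + b^2 = 1
print(np.allclose(R.T @ R, np.eye(2)))     # orthogonal (preserves length)
print(np.isclose(np.linalg.det(R), -1.0))  # reflections flip orientation
print(np.allclose(R @ u, u))               # vectors on the line are fixed
```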

47
Q

rotation

A

[cos Z -sin Z
sin Z cos Z]

or [a -b
b a] where a^2 + b^2 = 1

Z is the rotation angle. Multiply by r to also scale (a rotation-scaling).

48
Q

horizontal shear

A

[1 k
0 1]

Vertical shear: [1 0
k 1]