 7.1.1: Suppose A is a real 2 × 2 matrix with complex eigenvalues a ± bi, and supp...
 7.1.2: Find the eigenvalues and eigenvectors of the following real matrice...
 7.1.3: Prove Corollary 1.3. (Hint: Generalize the argument in Exercise 1.)
 7.1.4: Prove Lemma 1.4.
 7.1.5: Verify that, in the case of a 3 × 3 matrix A with dim N(A − λI) = 2 in t...
 7.1.6: Prove that if p(t) = (t − λ)³ and dim N(A − λI) = 1, then we must have N(...
 7.1.7: Mimic the discussion of the examples in the proof of Theorem 1.5 to...
 7.1.8: Determine the Jordan canonical form J of each of the following matr...
 7.1.9: Suppose A is an n × n matrix with all real entries and suppose λ is a com...
 7.1.10: If w, z ∈ ℂⁿ, define their (Hermitian) dot product by w · z = ∑_{j=1}^{n} wj...
 7.1.11: (Gerschgorin's Circle Theorem) Let A be a complex n × n matrix. If λ is a...
 7.1.12: Use Exercise 11 to show that any eigenvalue of an n × n complex matrix ...
 7.1.13: Use Exercise 12 to show that any eigenvalue of a stochastic matrix ...
 7.1.14: Let T : ℂⁿ → ℂⁿ be a linear transformation. We say v ∈ ℂⁿ is a generali...
 7.1.15: a. Suppose T(w) = μw. Prove that (T − λI)ᵏ(w) = (μ − λ)ᵏw. b. Suppose λ1, ...
 7.1.16: a. Let J be a k × k Jordan block with eigenvalue λ. Show that (J − λI)ᵏ = ...
 7.1.17: Prove that the set of n vectors constructed in the proof of Theorem...
Solutions for Chapter 7.1: Complex Eigenvalues and Jordan Canonical Form
Full solutions for Linear Algebra: A Geometric Approach, 2nd Edition
ISBN: 9781429215213
Chapter 7.1: Complex Eigenvalues and Jordan Canonical Form includes 17 full step-by-step solutions, one for each exercise listed above.

Big formula for n by n determinants.
Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or − sign.
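The big formula transcribes directly into a short pure-Python sketch (the function and variable names here are made up for illustration; `sign` computes the ± of each permutation from its inversion count):

```python
from itertools import permutations
from math import prod

def sign(p):
    """+1 or -1 according to the parity of the permutation's inversion count."""
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def big_formula_det(A):
    """Sum over all n! permutations: one signed product per permutation,
    taking one entry from each row and each column."""
    n = len(A)
    return sum(sign(p) * prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))
```

For a 2 × 2 matrix this reproduces ad − bc; the cost is O(n! · n), so it is a definition check, not a practical algorithm.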

Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
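For a 2 × 2 matrix the characteristic polynomial is p(λ) = λ² − (Tr A)λ + det A, so the theorem can be checked in a few lines (a small made-up example, not from the text):

```python
A = [[2, 1],
     [1, 3]]                                   # made-up 2x2 matrix
tr = A[0][0] + A[1][1]                         # trace
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]    # determinant
A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
# p(A) = A^2 - (Tr A) A + (det A) I should be the zero matrix
pA = [[A2[i][j] - tr * A[i][j] + (det if i == j else 0) for j in range(2)] for i in range(2)]
```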

Change of basis matrix M.
The old basis vectors v_j are combinations ∑ m_ij w_i of the new basis vectors. The coordinates of c1v1 + ... + cnvn = d1w1 + ... + dnwn are related by d = Mc. (For n = 2: v1 = m11w1 + m21w2, v2 = m12w1 + m22w2.)
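A quick numerical check of d = Mc in the n = 2 case, with made-up basis vectors w1, w2 and matrix M chosen just to illustrate the bookkeeping:

```python
w = [[1, 0], [1, 1]]             # new basis w1, w2 (made-up)
M = [[2, 1], [1, 1]]             # change of basis matrix (made-up)
# Old basis vectors: v_j = m1j w1 + m2j w2
v = [[M[0][j] * w[0][k] + M[1][j] * w[1][k] for k in range(2)] for j in range(2)]
c = [3, -1]                                                    # v-coordinates
d = [sum(M[i][j] * c[j] for j in range(2)) for i in range(2)]  # d = M c
lhs = [c[0] * v[0][k] + c[1] * v[1][k] for k in range(2)]      # c1 v1 + c2 v2
rhs = [d[0] * w[0][k] + d[1] * w[1][k] for k in range(2)]      # d1 w1 + d2 w2
# lhs and rhs are the same vector, expressed in the two bases
```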

Cofactor C_ij.
Remove row i and column j; multiply the determinant by (−1)^(i+j).

Companion matrix.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ² + ... + cnλⁿ⁻¹ − λⁿ).
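One way to see why the companion matrix has that characteristic polynomial: if λ is a root, the vector v = (1, λ, λ², ...) satisfies Av = λv. A sketch with the made-up polynomial λ³ = 6 − 11λ + 6λ² (roots 1, 2, 3):

```python
c = [6, -11, 6]     # made-up coefficients: lambda^3 = 6 - 11*lambda + 6*lambda^2
n = len(c)
A = [[0] * n for _ in range(n)]
for i in range(n - 1):
    A[i][i + 1] = 1                 # n - 1 ones just above the main diagonal
A[n - 1] = list(c)                  # c1, ..., cn in row n
lam = 2.0                           # one root of the polynomial
v = [lam ** k for k in range(n)]    # (1, lam, lam^2)
Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
# The first n-1 rows shift v up by one power; the last row uses the
# polynomial identity, so Av equals lam * v and lam is an eigenvalue.
```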

Fourier matrix F.
Entries F_jk = e^(2πijk/n) give orthogonal columns: F̄ᵀF = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = ∑ c_k e^(2πijk/n).
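The orthogonality claim F̄ᵀF = nI can be checked directly for a small n using only the standard library (variable names here are made up):

```python
import cmath

n = 4
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)] for j in range(n)]
# Conjugate-transpose of F times F: off-diagonal entries are sums of
# roots of unity, which cancel to zero; diagonal entries sum to n.
G = [[sum(F[r][a].conjugate() * F[r][b] for r in range(n))
      for b in range(n)] for a in range(n)]
# G is (up to rounding) n times the identity matrix
```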

Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n − 1)/2 edges between nodes. A tree has only n − 1 edges and no closed loops.

Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.

|A⁻¹| = 1/|A| and |Aᵀ| = |A|.
The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, volume of box = |det(A)|.
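These determinant rules are easy to verify on a 2 × 2 example (made-up entries, using the explicit ad − bc and inverse formulas):

```python
A = [[3.0, 1.0],
     [4.0, 2.0]]                                  # made-up 2x2, det = 2
detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]      # ad - bc
Ainv = [[ A[1][1] / detA, -A[0][1] / detA],       # 2x2 inverse formula
        [-A[1][0] / detA,  A[0][0] / detA]]
detAinv = Ainv[0][0] * Ainv[1][1] - Ainv[0][1] * Ainv[1][0]
detAT = A[0][0] * A[1][1] - A[1][0] * A[0][1]     # transpose swaps b and c
# detAinv equals 1/detA, and detAT equals detA
```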

Left inverse A⁺.
If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.
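A sketch of the left inverse for a small full-column-rank A (3 × 2, made-up entries), carrying out (AᵀA)⁻¹Aᵀ by hand:

```python
A = [[1, 0],
     [0, 1],
     [1, 1]]                    # made-up 3x2 with full column rank n = 2
AtA = [[sum(A[r][i] * A[r][j] for r in range(3)) for j in range(2)] for i in range(2)]
d = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
AtA_inv = [[ AtA[1][1] / d, -AtA[0][1] / d],       # 2x2 inverse of A^T A
           [-AtA[1][0] / d,  AtA[0][0] / d]]
# A+ = (A^T A)^-1 A^T is 2x3; multiplying A+ times A recovers I_2
Aplus = [[sum(AtA_inv[i][k] * A[r][k] for k in range(2)) for r in range(3)] for i in range(2)]
AplusA = [[sum(Aplus[i][r] * A[r][j] for r in range(3)) for j in range(2)] for i in range(2)]
```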

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
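The gap between AM and GM shows up exactly for a Jordan block. A sketch using NumPy (assumed available; the eigenspace dimension is computed as n minus the rank of A − λI):

```python
import numpy as np

A = np.array([[5.0, 1.0],
              [0.0, 5.0]])      # a single 2x2 Jordan block, eigenvalue 5
lam = 5.0
AM = int(np.sum(np.isclose(np.linalg.eigvals(A), lam)))       # root count of det(A - tI)
GM = A.shape[0] - np.linalg.matrix_rank(A - lam * np.eye(2))  # dim N(A - lam I)
# AM = 2 but GM = 1: A is defective, with no basis of eigenvectors
```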

Network.
A directed graph that has constants c1, ..., cm associated with the edges.

Rank one matrix A = uvᵀ ≠ 0.
Column and row spaces = lines cu and cv.

Row space C(Aᵀ) = all combinations of rows of A.
Column vectors by convention.

Semidefinite matrix A.
(Positive) semidefinite: all xᵀAx ≥ 0, all λ ≥ 0; A = any RᵀR.
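Why A = RᵀR is always semidefinite: xᵀAx = (Rx)ᵀ(Rx) = ‖Rx‖² ≥ 0. A brute-force spot check with random vectors (R is made up for illustration):

```python
import random

R = [[1, 2],
     [0, 1],
     [3, -1]]        # any made-up R gives a semidefinite A = R^T R
A = [[sum(R[r][i] * R[r][j] for r in range(3)) for j in range(2)] for i in range(2)]
random.seed(0)
# x^T A x = ||Rx||^2, so the quadratic form can never go negative
ok = all(
    sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2)) >= 0
    for x in ([random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(1000))
)
```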

Singular Value Decomposition (SVD).
A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular value σᵢ > 0. Last columns are orthonormal bases of the nullspaces.
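NumPy's `numpy.linalg.svd` (assumed available) returns the three factors directly, so each claim can be spot-checked on a made-up matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])               # made-up 2x2 example
U, s, Vt = np.linalg.svd(A)
recon = U @ np.diag(s) @ Vt              # A = U Sigma V^T
# Each right singular vector maps to a singular-value multiple of the
# matching left singular vector: A v_i = sigma_i u_i
check = np.allclose(A @ Vt[0], s[0] * U[:, 0])
```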

Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
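Tr AB = Tr BA holds even when AB ≠ BA; a minimal pure-Python check on made-up 2 × 2 matrices:

```python
A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]                 # made-up matrices with AB != BA

def matmul(X, Y):
    """Plain triple-loop matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def trace(X):
    """Sum of diagonal entries."""
    return sum(X[i][i] for i in range(len(X)))

# trace(matmul(A, B)) and trace(matmul(B, A)) agree, though the products differ
```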

Transpose matrix Aᵀ.
Entries (Aᵀ)_ij = A_ji. Aᵀ is n by m, AᵀA is square, symmetric, and positive semidefinite. The transposes of AB and A⁻¹ are BᵀAᵀ and (Aᵀ)⁻¹.
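The reversal rule (AB)ᵀ = BᵀAᵀ can be spot-checked on rectangular matrices (made-up 2 × 3 and 3 × 2 examples):

```python
def matmul(X, Y):
    """Plain triple-loop matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    """Swap rows and columns."""
    return [list(row) for row in zip(*X)]

A = [[1, 2, 3],
     [4, 5, 6]]      # 2x3 (made-up)
B = [[1, 0],
     [0, 1],
     [2, 2]]         # 3x2 (made-up)
AB_T = transpose(matmul(A, B))              # (AB)^T
BT_AT = matmul(transpose(B), transpose(A))  # B^T A^T -- order reverses
```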

Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.
For matrix norms, ‖A + B‖ ≤ ‖A‖ + ‖B‖.

Vector addition.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.