 7.2.1: Verify that the given matrix is an orthogonal matrix.
 7.2.2: Find the inverse of each of the following orthogonal matrices:
 7.2.3: Show that if A and B are orthogonal matrices, then AB is an orthog...
 7.2.4: Show that if A is an orthogonal matrix, then A^{-1} is orthogonal.
 7.2.5: Prove Theorem 7.8.
 7.2.6: Verify Theorem 7.8 for the matrices in Exercise 2.
 7.2.7: Verify that the matrix P in Example 3 is an orthogonal matrix and ...
 7.2.8: Show that if A is an orthogonal matrix, then det(A) = ±1.
 7.2.9: (a) Verify that the rotation matrix [cos θ  −sin θ; sin θ  cos θ] is orthogonal. (b) P...
 7.2.10: For the orthogonal matrix A with entries cos φ and sin φ, verify ...
 7.2.11: Let A be an n × n orthogonal matrix, and let L: R^n → R^n be the ...
 7.2.12: A linear operator L: V → V, where V is an n-dimensional Euc...
 7.2.13: Let L: R^2 → R^2 be the linear operator performing a counterclockwis...
 7.2.14: Let A be an n × n matrix and let B = P^{-1}AP be similar to A. ...
 7.2.15: In Exercises 15 through 20, diagonalize each given matrix and find ...
 7.2.16: In Exercises 15 through 20, diagonalize each given matrix and find ...
 7.2.17: In Exercises 15 through 20, diagonalize each given matrix and find ...
 7.2.18: In Exercises 15 through 20, diagonalize each given matrix and find ...
 7.2.19: In Exercises 15 through 20, diagonalize each given matrix and find ...
 7.2.20: In Exercises 15 through 20, diagonalize each given matrix and find ...
 7.2.21: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.22: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.23: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.24: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.25: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.26: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.27: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.28: In Exercises 21 through 28, diagonalize each given matrix.
 7.2.29: Prove Theorem 7.9 for the 2 × 2 case by studying the two possible ...
 7.2.30: Let L: V → V be an orthogonal linear operator (see Exercise 12), w...
 7.2.31: Let L: R^2 → R^2 be defined by L([x; y]) = [...][x; y]. Show...
 7.2.32: Let L: R^2 → R^2 be defined by L(x) = Ax for x in R^2, where A is...
 7.2.33: Let L: R^n → R^n be a linear operator. (a) Prove that if L is an i...
 7.2.34: Let L: R^n → R^n be a linear operator defined by L(x) = Ax for x i...
 7.2.35: Let L: R^n → R^n be a linear operator and S = {v_1, v_2, ..., v_n} an or...
 7.2.36: Show that if A^T A y = y for all y in R^n, then A^T A = I_n.
 7.2.37: Show that if A is an orthogonal matrix, then A^T is also orthogonal.
 7.2.38: Let A be an orthogonal matrix. Show that cA is orthogonal if and on...
 7.2.39: Assuming that the software you use has a command for eigenvalues an...
 7.2.40: If the answer to Exercise 39 is no, you can use the Gram–Schmidt pro...
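The orthogonality claims running through these exercises are easy to check numerically. A minimal sketch (angle and matrix chosen for illustration, not taken from the text) verifying A^T A = I and det(A) = 1 for the 2 × 2 rotation matrix of Exercise 9:

```python
import math

def matmul(A, B):
    """Multiply two 2 x 2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

theta = 0.7  # arbitrary angle, chosen for illustration
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# A^T A should be the identity, so A is orthogonal (Exercise 9).
P = matmul(transpose(A), A)
assert all(abs(P[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))

# An orthogonal matrix has det(A) = +/-1 (Exercise 8); a rotation gives +1.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert abs(det - 1.0) < 1e-12
```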
Solutions for Chapter 7.2: Diagonalization and Similar Matrices
Full solutions for Elementary Linear Algebra with Applications, 9th Edition
ISBN: 9780471669593
Chapter 7.2: Diagonalization and Similar Matrices includes 40 full step-by-step solutions.

Big formula for n by n determinants.
Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or − sign.
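The big formula can be coded directly; a sketch over all n! permutations (the test matrix is my own illustration):

```python
from itertools import permutations

def sign(p):
    """Parity of a permutation tuple: +1 for even, -1 for odd."""
    s, p = 1, list(p)
    for i in range(len(p)):
        while p[i] != i:
            j = p[i]
            p[i], p[j] = p[j], p[i]  # swap toward identity
            s = -s
    return s

def det_big_formula(A):
    """Sum of n! terms: one entry from each row and each column of A."""
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        term = sign(p)
        for i in range(n):
            term *= A[i][p[i]]  # row i, column p[i]
        total += term
    return total

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
assert det_big_formula(A) == -3
```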

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C (A).
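A tiny numeric illustration of the column picture, with made-up numbers:

```python
# Ax = b reads b as x1*(column 1) + x2*(column 2) of A.
A = [[2, 1],
     [1, 3]]
x = [3, 1]
cols = [[A[i][j] for i in range(2)] for j in range(2)]  # columns of A
b = [x[0] * cols[0][i] + x[1] * cols[1][i] for i in range(2)]
assert b == [7, 6]  # b is a combination of the columns, so it lies in C(A)
```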

Fourier matrix F.
Entries F_jk = e^{2πijk/n} give orthogonal columns, F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform: y_j = Σ_k c_k e^{2πijk/n}.
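A quick check of the column orthogonality with Python's standard cmath (n = 4 chosen for illustration):

```python
import cmath

n = 4
# Fourier matrix: F[j][k] = e^{2*pi*i*j*k/n}
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)]
     for j in range(n)]

# Columns are orthogonal: conj(F)^T F = n * I.
for a in range(n):
    for b in range(n):
        s = sum(F[j][a].conjugate() * F[j][b] for j in range(n))
        expected = n if a == b else 0
        assert abs(s - expected) < 1e-9
```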

Independent vectors v_1, ..., v_k.
No combination c_1 v_1 + ... + c_k v_k = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
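The normal equations can be sketched for a small made-up data set (fitting y = c + d·t to three points); the final assertion checks that the error e is orthogonal to the columns of A:

```python
# Made-up data points (t, y) for illustration.
pts = [(0, 1), (1, 2), (2, 4)]
A = [(1, t) for t, _ in pts]          # columns: all-ones and t
b = [y for _, y in pts]

# Normal equations A^T A xhat = A^T b (2 x 2, solved by Cramer's rule).
ata = [[sum(r[i] * r[j] for r in A) for j in range(2)] for i in range(2)]
atb = [sum(r[i] * y for r, y in zip(A, b)) for i in range(2)]
det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
xhat = [(atb[0] * ata[1][1] - ata[0][1] * atb[1]) / det,
        (ata[0][0] * atb[1] - atb[0] * ata[1][0]) / det]

# e = b - A xhat is orthogonal to every column of A.
e = [y - (r[0] * xhat[0] + r[1] * xhat[1]) for r, y in zip(A, b)]
for i in range(2):
    assert abs(sum(r[i] * ei for r, ei in zip(A, e))) < 1e-12
```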

Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Linear combination cv + dw or Σ c_j v_j.
Vector addition and scalar multiplication.

Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector Ms = s > 0.
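A sketch with an illustrative 2 × 2 Markov matrix (my own numbers): the powers M^k x converge to the steady state s with Ms = s.

```python
# Columns sum to 1, all entries positive: a Markov matrix.
M = [[0.8, 0.3],
     [0.2, 0.7]]

# Iterate x <- Mx; the result approaches the steady state eigenvector.
x = [1.0, 0.0]
for _ in range(100):
    x = [M[0][0] * x[0] + M[0][1] * x[1],
         M[1][0] * x[0] + M[1][1] * x[1]]

# For this M the steady state is s = (0.6, 0.4), since Ms = s.
assert abs(x[0] - 0.6) < 1e-9 and abs(x[1] - 0.4) < 1e-9
```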

Minimal polynomial of A.
The lowest degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; always m(λ) divides p(λ).

Multiplication Ax
Ax = x_1(column 1) + ... + x_n(column n) = combination of the columns.

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1}, and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ_j (v^T q_j) q_j.
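A minimal check of both properties, using an orthonormal basis of R^2 chosen for illustration:

```python
import math

r = 1 / math.sqrt(2)
q = [(r, r), (r, -r)]                 # orthonormal basis of R^2

# Q^T Q = I: dot products are 0 (i != j) and 1 (i == j).
for i in range(2):
    for j in range(2):
        dot = sum(a * b for a, b in zip(q[i], q[j]))
        assert abs(dot - (1 if i == j else 0)) < 1e-12

# Every v equals the sum of its projections (v . q_j) q_j.
v = (3.0, -1.0)
rebuilt = [0.0, 0.0]
for qj in q:
    c = sum(a * b for a, b in zip(v, qj))   # coefficient v . q_j
    rebuilt = [rebuilt[k] + c * qj[k] for k in range(2)]
assert all(abs(rebuilt[k] - v[k]) < 1e-12 for k in range(2))
```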

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T Ax > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
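Both tests (positive pivots, x^T Ax > 0) sketched for an illustrative 2 × 2 symmetric matrix:

```python
A = [[2.0, 1.0],
     [1.0, 2.0]]

# Pivots from elimination: 2 and 2 - (1/2)*1 = 1.5, both positive.
p1 = A[0][0]
p2 = A[1][1] - A[1][0] * A[0][1] / A[0][0]
assert p1 > 0 and p2 > 0

# Spot-check the definition x^T A x > 0 for a few nonzero x.
for x in [(1.0, 0.0), (1.0, -1.0), (-2.0, 3.0)]:
    q = sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))
    assert q > 0
```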

Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.

Schur complement S = D − CA^{-1}B.
Appears in block elimination on [A B; C D].

Similar matrices A and B.
Every B = M^{-1}AM has the same eigenvalues as A.
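A small sketch (matrices chosen for illustration): for 2 × 2 matrices, trace and determinant fix the characteristic polynomial, so checking they match under B = M^{-1}AM confirms the eigenvalues agree.

```python
def matmul(X, Y):
    """Multiply two 2 x 2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [0, 3]]                  # eigenvalues 2 and 3
M = [[1, 1], [0, 1]]
Minv = [[1, -1], [0, 1]]              # inverse of M

B = matmul(matmul(Minv, A), M)        # B = M^{-1} A M is similar to A

# Same trace and determinant => same characteristic polynomial
# => same eigenvalues.
trace = lambda X: X[0][0] + X[1][1]
det = lambda X: X[0][0] * X[1][1] - X[0][1] * X[1][0]
assert trace(A) == trace(B) and det(A) == det(B)
```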

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Standard basis for Rn.
Columns of the n by n identity matrix (written i, j, k in R^3).

Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D: the eigenvalues and the pivots agree in sign.

Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms, ||A + B|| ≤ ||A|| + ||B||.
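A quick numeric spot-check with made-up vectors:

```python
import math

norm = lambda v: math.sqrt(sum(x * x for x in v))

u, v = (3.0, 4.0), (-1.0, 2.0)
s = tuple(a + b for a, b in zip(u, v))
# ||u + v|| can never exceed ||u|| + ||v||.
assert norm(s) <= norm(u) + norm(v)
```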

Unitary matrix U^H = Ū^T = U^{-1}.
Orthonormal columns (complex analog of Q).