7.3.1: Verify that the given matrix is an orthogonal matrix.
7.3.2: Find the inverse of each of the following orthogonal matrices.
7.3.3: Show that if A and B are orthogonal matrices, then AB is an orthog...
7.3.4: Show that if A is an orthogonal matrix, then A^-1 is orthogonal.
7.3.5: Prove Theorem 7.8.
7.3.6: Verify Theorem 7.8 for the matrices in Exercise 2.
7.3.7: Verify that the matrix P in Example 3 is an orthogonal matrix and ...
7.3.8: Show that if A is an orthogonal matrix, then det(A) = ±1.
7.3.9: (a) Verify that the matrix [cos φ  -sin φ; sin φ  cos φ] is orthogonal. (b) P...
7.3.10: For the orthogonal matrix, verify that (Ax, Ay) = (x, y) for any vec...
7.3.11: Let A be an n × n orthogonal matrix, and let L: R^n → R^n be the ...
7.3.12: A linear operator L: V → V, where V is an n-dimensional Euc...
7.3.13: Let L: R^2 → R^2 be the linear operator performing a counterclockwis...
7.3.14: Let A be an n × n matrix and let B = P^-1 A P be similar to A. ...
7.3.15: In Exercises 15 through 20, diagonalize each given matrix and ...
7.3.16: In Exercises 15 through 20, diagonalize each given matrix and ...
7.3.17: In Exercises 15 through 20, diagonalize each given matrix and ...
7.3.18: In Exercises 15 through 20, diagonalize each given matrix and ...
7.3.19: In Exercises 15 through 20, diagonalize each given matrix and ...
7.3.20: In Exercises 15 through 20, diagonalize each given matrix and ...
7.3.21: In Exercises 21 through 28, diagonalize each given matrix...
7.3.22: In Exercises 21 through 28, diagonalize each given matrix...
7.3.23: In Exercises 21 through 28, diagonalize each given matrix...
7.3.24: In Exercises 21 through 28, diagonalize each given matrix...
7.3.25: In Exercises 21 through 28, diagonalize each given matrix...
7.3.26: In Exercises 21 through 28, diagonalize each given matrix...
7.3.27: In Exercises 21 through 28, diagonalize each given matrix...
7.3.28: In Exercises 21 through 28, diagonalize each given matrix...
7.3.29: Prove Theorem 7.9 for the 2 × 2 case by studying the two possible ...
7.3.30: Let L: V → V be an orthogonal linear operator (see Exercise 12), w...
7.3.31: Let L: R^2 → R^2 be defined by L([x; y]) = ... Show...
7.3.32: Let L: R^2 → R^2 be defined by L(x) = Ax, for x in R^2, where A is...
7.3.33: Let L: R^n → R^n be a linear operator. (a) Prove that if L is an i...
7.3.34: Let L: R^n → R^n be a linear operator defined by L(x) = Ax for x i...
7.3.35: Let L: R^n → R^n be a linear operator and S = {v1, v2, ..., vn} an or...
7.3.36: Show that if A^T A y = y for all y in R^n, then A^T A = I_n.
7.3.37: Show that if A is an orthogonal matrix, then A^T is also orthogonal.
7.3.38: Let A be an orthogonal matrix. Show that cA is orthogonal if and on...
7.3.39: Assuming that the software you use has a command for eigenvalues an...
7.3.40: If the answer to Exercise 39 is no, you can use the Gram-Schmidt pro...
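The two computations that recur throughout these exercises, checking that A^T A = I and orthogonally diagonalizing a symmetric matrix, can be sketched numerically. A minimal NumPy sketch (the sample angle and matrices below are my own, not from the text):

```python
import numpy as np

phi = 0.7
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])   # rotation matrix (cf. Exercise 9)
print(np.allclose(A.T @ A, np.eye(2)))        # True: A is orthogonal

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])                    # a sample symmetric matrix
eigvals, P = np.linalg.eigh(S)                # P has orthonormal eigenvector columns
D = np.diag(eigvals)
print(np.allclose(P.T @ S @ P, D))            # True: P^T S P is diagonal
```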
Solutions for Chapter 7.3: Diagonalization of Symmetric Matrices
Full solutions for Elementary Linear Algebra with Applications, 9th Edition
ISBN: 9780132296540
This textbook survival guide covers the following chapters and their solutions. Chapter 7.3: Diagonalization of Symmetric Matrices includes 40 full step-by-step solutions for Elementary Linear Algebra with Applications, 9th edition (ISBN: 9780132296540). More than 11938 students have viewed full step-by-step solutions from this chapter.

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
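A minimal sketch of this definition (the small graph below is my own example): building the adjacency matrix of an undirected graph, which must satisfy A = A^T.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (0, 3)]  # undirected edges on nodes 0..3
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1   # edge from node i to node j
    A[j, i] = 1   # and back, since the graph is undirected

print(np.array_equal(A, A.T))  # True: symmetric for an undirected graph
```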

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C (A).
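A small numeric sketch of the column picture (the sample system is my own): the solution entries are exactly the coefficients that combine the columns of A into b.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])
x = np.linalg.solve(A, b)
combo = x[0] * A[:, 0] + x[1] * A[:, 1]  # combination of the columns of A
print(np.allclose(combo, b))             # True: b lies in the column space
```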

Column space C (A) =
space of all combinations of the columns of A.

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
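One way to see this numerically (a sketch with matrices of my own choosing): build A and B from the same eigenvector matrix P with different eigenvalues; sharing eigenvectors forces AB = BA.

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # shared (invertible) eigenvector matrix
Pinv = np.linalg.inv(P)
A = P @ np.diag([2.0, 3.0]) @ Pinv
B = P @ np.diag([5.0, 7.0]) @ Pinv
print(np.allclose(A @ B, B @ A))  # True: A and B commute
```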

Complex conjugate
The conjugate of z = a + ib is z̄ = a - ib. Then z z̄ = |z|^2.
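A quick check of z z̄ = |z|^2 with a sample value of my own:

```python
z = 3 + 4j
print(z * z.conjugate())  # (25+0j)
print(abs(z) ** 2)        # 25.0
```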

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
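A minimal conjugate gradient sketch, assuming A is positive definite (the test system is my own; this is an illustration, not a production solver):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    x = np.zeros_like(b)
    r = b - A @ x              # residual, also the negative gradient
    p = r.copy()               # first search direction
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)  # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        beta = (r @ r) / rr    # keeps the new direction A-conjugate
        p = r + beta * p
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))  # True
```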

Cross product u xv in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
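A numeric sketch of both claims, with sample vectors of my own: u × v is perpendicular to u and v, and its length equals the parallelogram area ||u|| ||v|| |sin θ|.

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
w = np.cross(u, v)
print(np.isclose(w @ u, 0.0) and np.isclose(w @ v, 0.0))  # True: perpendicular

cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1 - cos_t**2)
print(np.isclose(np.linalg.norm(w), area))                # True: length = area
```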

Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
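A minimal elimination sketch without row exchanges (the sample matrix is my own): each multiplier ℓij is stored in L and the pivot row is subtracted from row i, so that A = LU.

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
n = A.shape[0]
U = A.copy()
L = np.eye(n)
for j in range(n):                    # pivot column j
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]   # multiplier = (entry to eliminate) / pivot
        U[i, :] -= L[i, j] * U[j, :]  # subtract multiplier times pivot row
print(np.allclose(A, L @ U))          # True: A = LU
```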

Ellipse (or ellipsoid) x T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^-1 y||^2 = y^T (A A^T)^-1 y = 1 displayed by eigshow; axis lengths σi.)

Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.

Hermitian matrix A^H = Ā^T = A.
Complex analog aji = āij of a symmetric matrix.

Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
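A sketch checking two of these equivalent definitions on small matrices of my own: the entrywise dot products and the sum of (column k)(row k) outer products both give AB.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
# (row i of A) . (column j of B) for each entry
entrywise = np.array([[A[i, :] @ B[:, j] for j in range(2)] for i in range(2)])
# sum of (column k of A)(row k of B) outer products
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(2))
print(np.allclose(entrywise, A @ B))  # True
print(np.allclose(outer_sum, A @ B))  # True
```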

Multiplier eij.
The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).

Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
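A sketch with a sample matrix of my own: a strictly upper triangular N satisfies N^3 = 0, and every eigenvalue is 0.

```python
import numpy as np

N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])               # zero diagonal, triangular
print(np.allclose(np.linalg.matrix_power(N, 3), 0))  # True: N^3 = 0
print(np.allclose(np.linalg.eigvals(N), 0))          # True: all eigenvalues 0
```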

Orthonormal vectors q1, ..., qn.
Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
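A sketch using QR factorization to produce orthonormal columns (the matrix and vector are my own samples): Q^T Q = I, and any v in the span of the columns expands as Σ (v^T qj) qj.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(A)                  # columns q1, q2 are orthonormal
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q^T Q = I

v = A @ np.array([2.0, 3.0])            # a vector in the column space
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
print(np.allclose(expansion, v))        # True: v = sum of (v^T qj) qj
```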

Rank one matrix A = u v^T ≠ 0.
Column and row spaces = lines cu and cv.
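A sketch with sample vectors of my own: the outer product u v^T has rank one, and every column is a multiple of u.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])
A = np.outer(u, v)                   # A = u v^T
print(np.linalg.matrix_rank(A))      # 1
# column 1 is a multiple of column 0 (both are multiples of u)
print(np.allclose(A[:, 1], (v[1] / v[0]) * A[:, 0]))  # True
```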

Schur complement S = D - C A^-1 B.
Appears in block elimination on [A B; C D].
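A small sketch with blocks of my own choosing: eliminating the C block of [A B; C D] leaves the Schur complement S = D - C A^-1 B in the lower-right corner.

```python
import numpy as np

A = np.array([[2.0]])
B = np.array([[1.0, 0.0]])
C = np.array([[3.0], [1.0]])
D = np.array([[4.0, 1.0], [2.0, 5.0]])

S = D - C @ np.linalg.inv(A) @ B
M = np.block([[A, B], [C, D]])
# Block elimination: subtract C A^-1 times the first block row.
E = np.block([[np.eye(1), np.zeros((1, 2))],
              [-C @ np.linalg.inv(A), np.eye(2)]])
print(np.allclose((E @ M)[1:, 1:], S))  # True: lower-right block is S
```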

Singular matrix A.
A square matrix that has no inverse: det(A) = 0.

Unitary matrix U^H = Ū^T = U^-1.
Orthonormal columns (complex analog of Q).
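A sketch with a sample unitary matrix of my own: the complex conjugate transpose U^H satisfies both U^H U = I and U^H = U^-1.

```python
import numpy as np

U = (1 / np.sqrt(2)) * np.array([[1.0, 1j],
                                 [1j, 1.0]])
print(np.allclose(U.conj().T @ U, np.eye(2)))     # True: U^H U = I
print(np.allclose(U.conj().T, np.linalg.inv(U)))  # True: U^H = U^-1
```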