 6.4.1: For each of the following pairs of vectors z and w in C^2, compute (...
 6.4.2: Let z1 = 1 + i 2 1 i 2 and z2 = i 2 1 2 (a) Show that {z1, z2} is a...
 6.4.3: Let {u1, u2} be an orthonormal basis for C^2, and let z = (4 + 2i)u...
 6.4.4: Which of the matrices that follow are Hermitian? Normal? (a) 1 i 2 ...
 6.4.5: Find an orthogonal or unitary diagonalizing matrix for each of the ...
 6.4.6: Show that the diagonal entries of a Hermitian matrix must be real.
 6.4.7: Let A be a Hermitian matrix and let x be a vector in C^n. Show that ...
 6.4.8: Let A be a Hermitian matrix and let B = i A. Show that B is skew He...
 6.4.9: Let A and C be matrices in C^(m×n) and let B ∈ C^(n×r). Prove each of the fo...
 6.4.10: Let A and B be Hermitian matrices. Answer true or false for each of...
 6.4.11: Show that ⟨z, w⟩ = w^H z defines an inner product on C^n.
 6.4.12: Let x, y, and z be vectors in C^n and let α and β be complex scalars. Sh...
 6.4.13: Let {u1, . . . , un} be an orthonormal basis for a complex inner pr...
 6.4.14: Given that A is the matrix with rows (4, 0, 0), (0, 1, i), (0, −i, 1), find a matrix B such that B^H B = A.
 6.4.15: Let U be a unitary matrix. Prove that (a) U is normal. (b) ‖Ux‖ = ‖...
 6.4.16: Let u be a unit vector in C^n and define U = I − 2uu^H. Show that U is...
 6.4.17: Show that if a matrix U is both unitary and Hermitian, then any eig...
 6.4.18: Let A be a 2 × 2 matrix with Schur decomposition UTU^H and suppose tha...
 6.4.19: Let A be a 5 × 5 matrix with real entries. Let A = QTQ^T be the real ...
 6.4.20: Let A be an n × n matrix with Schur decomposition UTU^H. Show that if ...
 6.4.21: Show that M = A + iB (where A and B are real matrices) is skew Hermiti...
 6.4.22: Show that if A is skew Hermitian and λ is an eigenvalue of A, then λ is...
 6.4.23: Show that if A is a normal matrix, then each of the following matri...
 6.4.24: Let A be a real 2 × 2 matrix with the property that a_21 a_12 > 0, and l...
 6.4.25: Let p(x) = x^3 + cx^2 + (c + 3)x + 1, where c is a real number. Let C...
 6.4.26: Let A be a Hermitian matrix with eigenvalues λ1, . . . , λn and ortho...
 6.4.27: Let A = [0 1; 1 0]. Write A as a sum λ1 u1 u1^T + λ2 u2 u2^T, where λ1 and λ2 ...
 6.4.28: Let A be a Hermitian matrix with eigenvalues λ1 ≥ λ2 ≥ · · · ≥ λn and orthonormal ...
 6.4.29: Given A ∈ R^(m×m), B ∈ R^(n×n), C ∈ R^(m×n), the equation AX − XB = C (3) is known as S...
Solutions for Chapter 6.4: Hermitian Matrices
Full solutions for Linear Algebra with Applications  8th Edition
ISBN: 9780136009290
Chapter 6.4: Hermitian Matrices of Linear Algebra with Applications (8th edition, ISBN 9780136009290) includes 29 full step-by-step solutions.

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
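A minimal NumPy check of this fact (our own illustration, not from the glossary): the two symmetric matrices below commute, and the eigenvector matrix of one also diagonalizes the other.

```python
import numpy as np

# Two symmetric matrices that commute: AB = BA.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[5.0, 3.0], [3.0, 5.0]])
commute = np.allclose(A @ B, B @ A)

# The eigenvectors of A (columns of V) also diagonalize B,
# so the two matrices share n = 2 eigenvectors.
w, V = np.linalg.eigh(A)
DB = V.T @ B @ V                        # diagonal iff eigenvectors are shared
off_diag = DB - np.diag(np.diag(DB))
```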

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2)x^T Ax − x^T b over growing Krylov subspaces.
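A minimal NumPy sketch of the method (our own illustration; the function name `conjugate_gradient` is ours, not the textbook's). Each step takes an exact line search along a search direction, and successive directions are kept A-conjugate.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve Ax = b for symmetric positive definite A by
    minimizing (1/2) x^T A x - x^T b over Krylov subspaces."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x                 # residual = negative gradient
    p = r.copy()                  # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)          # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p      # keep directions A-conjugate
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])     # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the method terminates in at most n steps; here n = 2.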

Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j)/det(A).
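A short NumPy sketch of the rule (our own illustration; `cramer_solve` is our name), checked against a direct solve.

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's rule: x_j = det(B_j) / det(A)."""
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        B_j = A.copy()
        B_j[:, j] = b                       # b replaces column j of A
        x[j] = np.linalg.det(B_j) / det_A
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)
```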

Cyclic shift S.
Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.
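A small NumPy demonstration (our own illustration) for n = 4: the eigenvalues of S are the 4th roots of unity, and each Fourier column is an eigenvector.

```python
import numpy as np

n = 4
# Cyclic shift: S_21 = S_32 = ... = 1 and S_1n = 1.
S = np.roll(np.eye(n), 1, axis=0)

omega = np.exp(2j * np.pi / n)
# Fourier matrix: F_jk = omega^(jk).
F = omega ** np.outer(np.arange(n), np.arange(n))

eigvals = np.sort_complex(np.linalg.eigvals(S))
roots_of_unity = np.sort_complex(omega ** np.arange(n))

# Column k of F is an eigenvector: S F[:, k] is a scalar multiple of F[:, k].
k = 1
ratio = (S @ F[:, k]) / F[:, k]    # constant vector = the eigenvalue
```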

Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −ℓ_ij in the (i, j) entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.
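A quick NumPy sketch (our own illustration; `elimination_matrix` is our name) showing one elimination step zeroing the (2, 1) entry.

```python
import numpy as np

def elimination_matrix(n, i, j, l):
    """Identity matrix with an extra -l in the (i, j) entry, i != j."""
    E = np.eye(n)
    E[i, j] = -l
    return E

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
l = A[1, 0] / A[0, 0]                 # multiplier for the (2, 1) entry
E = elimination_matrix(2, 1, 0, l)
EA = E @ A                            # subtracts l * (row 0) from row 1
```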

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0), with dimensions r and n − r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
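A NumPy check of the dimensions and the orthogonality (our own illustration), using the SVD to get orthonormal bases for the row space and nullspace of a rank-1 matrix with n = 3.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])       # rank r = 1, n = 3 columns
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))            # rank

row_space = Vt[:r].T                  # orthonormal basis of C(A^T), dim r
null_space = Vt[r:].T                 # orthonormal basis of N(A), dim n - r

# Every nullspace vector is perpendicular to every row-space vector.
cross = row_space.T @ null_space
```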

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Independent vectors v1, . . . , vk.
No combination c1 v1 + · · · + ck vk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
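As a quick NumPy illustration (ours, not from the glossary): columns are independent exactly when the rank equals the number of columns. Here v3 = v1 + v2, so the columns are dependent.

```python
import numpy as np

V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])       # columns v1, v2, v3 with v3 = v1 + v2
rank = np.linalg.matrix_rank(V)
independent = (rank == V.shape[1])    # independent iff rank = k
```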

Jordan form J = M^(−1)AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, . . . , Js). The block Jk is λk Ik + Nk, where Nk has 1's on the first superdiagonal. Each block has one eigenvalue λk and one eigenvector.

Kronecker product (tensor product) A ® B.
Blocks a_ij B; eigenvalues λp(A) λq(B).
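A small NumPy check of the eigenvalue rule (our own illustration): the eigenvalues of A ⊗ B are all products of an eigenvalue of A with an eigenvalue of B.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3
B = np.array([[4.0, 0.0], [0.0, 5.0]])   # eigenvalues 4 and 5
K = np.kron(A, B)                         # blocks a_ij * B

eigs_K = np.sort(np.linalg.eigvals(K).real)
products = np.sort([lam * mu
                    for lam in np.linalg.eigvals(A).real
                    for mu in np.linalg.eigvals(B).real])
```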

Least squares solution x̂.
The vector x̂ that minimizes ‖e‖^2 solves A^T A x̂ = A^T b. Then e = b − Ax̂ is orthogonal to all columns of A.
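A NumPy sketch (our own illustration): solve the normal equations directly and confirm the error is orthogonal to the columns of A.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A x = A^T b
e = b - A @ x_hat                           # error vector
```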

Projection p = a(a^T b / a^T a) onto the line through a.
P = aa^T / a^T a has rank 1.
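A quick NumPy check (our own illustration) that the rank-1 matrix P reproduces the projection and is idempotent.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 1.0])

p = a * (a @ b) / (a @ a)         # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)      # projection matrix, rank 1
```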

Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.

Schwarz inequality
|v · w| ≤ ‖v‖ ‖w‖. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
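A numerical spot-check of both inequalities (our own illustration) on random vectors and a positive definite A.

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(3)
w = rng.standard_normal(3)

# Ordinary Schwarz inequality.
lhs = abs(v @ w)
rhs = np.linalg.norm(v) * np.linalg.norm(w)

# Generalized form for a positive definite A.
A = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
lhs_A = (v @ A @ w) ** 2
rhs_A = (v @ A @ v) * (w @ A @ w)
```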

Singular Value Decomposition
(SVD) A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces of A^T and A.
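A NumPy verification of the defining relation A v_i = σ_i u_i (our own illustration) on a small full-rank matrix.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=True)
r = int(np.sum(s > 1e-12))

# A v_i = sigma_i u_i for each of the first r singular triples.
checks = [np.allclose(A @ Vt[i], s[i] * U[:, i]) for i in range(r)]
```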

Standard basis for Rn.
Columns of the n × n identity matrix (written i, j, k in R^3).

Tridiagonal matrix T: t_ij = 0 if |i − j| > 1.
T^(−1) has rank 1 above and below the diagonal.

Wavelets w_jk(t).
Stretch and shift the time axis to create w_jk(t) = w_00(2^j t − k).