 6.4.1: For each of the following pairs of vectors z and w in C^2, compute (...
 6.4.2: Let z1 = ((1 + i)/2, (1 - i)/2)^T and z2 = (i/sqrt(2), -1/sqrt(2))^T. (a) Show that {z1, z2} is a...
 6.4.3: Let {u1, u2} be an orthonormal basis for C^2, and let z = (4 + 2i)u...
 6.4.4: Which of the matrices that follow are Hermitian? Normal? (a) 1 i 2 ...
 6.4.5: Find an orthogonal or unitary diagonalizing matrix for each of the ...
 6.4.6: Show that the diagonal entries of a Hermitian matrix must be real.
 6.4.7: Let A be a Hermitian matrix and let x be a vector in C^n. Show that ...
 6.4.8: Let A be a Hermitian matrix and let B = iA. Show that B is skew-He...
 6.4.9: Let A and C be matrices in C^(m x n) and let B be in C^(n x r). Prove each of the fo...
 6.4.10: Let A and B be Hermitian matrices. Answer true or false for each of...
 6.4.11: Show that <z, w> = w^H z defines an inner product on C^n.
 6.4.12: Let x, y, and z be vectors in C^n and let a and b be complex scalars. Sh...
 6.4.13: Let {u1, . . . , un} be an orthonormal basis for a complex inner pr...
 6.4.14: Given that A = [4 0 0; 0 1 i; 0 -i 1], find a matrix B such that B^H B = A.
 6.4.15: Let U be a unitary matrix. Prove that (a) U is normal. (b) ||Ux|| = ||...
 6.4.16: Let u be a unit vector in C^n and define U = I - 2uu^H. Show that U is...
 6.4.17: Show that if a matrix U is both unitary and Hermitian, then any eig...
 6.4.18: Let A be a 2 x 2 matrix with Schur decomposition UTU^H and suppose tha...
 6.4.19: Let A be a 5 x 5 matrix with real entries. Let A = QTQ^T be the real ...
 6.4.20: Let A be an n x n matrix with Schur decomposition UTU^H. Show that if ...
 6.4.21: Show that M = A + iB (where A and B are real matrices) is skew-Hermiti...
 6.4.22: Show that if A is skew-Hermitian and λ is an eigenvalue of A, then λ is...
 6.4.23: Show that if A is a normal matrix, then each of the following matri...
 6.4.24: Let A be a real 2 x 2 matrix with the property that a21 a12 > 0, and l...
 6.4.25: Let p(x) = x^3 + cx^2 + (c + 3)x + 1, where c is a real number. Let C...
 6.4.26: Let A be a Hermitian matrix with eigenvalues λ1, . . . , λn and ortho...
 6.4.27: Let A = [0 1; 1 0]. Write A as a sum λ1 u1 u1^T + λ2 u2 u2^T, where λ1 and λ2 ...
 6.4.28: Let A be a Hermitian matrix with eigenvalues λ1 ≥ λ2 ≥ . . . ≥ λn and orthonorm...
 6.4.29: Given A in R^(m x m), B in R^(n x n), C in R^(m x n), the equation AX - XB = C is known as S...
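Exercises 6.4.26-6.4.28 rest on the spectral decomposition of a Hermitian matrix. As an illustrative check (not the book's solution method), NumPy's `eigh` recovers the decomposition for the matrix of Exercise 6.4.27:

```python
import numpy as np

# Real symmetric (hence Hermitian) matrix from Exercise 6.4.27
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# eigh is specialized for Hermitian input: real eigenvalues, orthonormal eigenvectors
eigvals, U = np.linalg.eigh(A)

# Spectral decomposition: A = lambda_1 u1 u1^T + lambda_2 u2 u2^T
A_rebuilt = sum(eigvals[i] * np.outer(U[:, i], U[:, i]) for i in range(2))
```

For this A the eigenvalues come out as -1 and 1, and the two rank-one pieces sum back to A exactly.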
Solutions for Chapter 6.4: Hermitian Matrices
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290
This textbook survival guide covers the following chapters and their solutions. It was created for the textbook Linear Algebra with Applications, 8th edition (ISBN 9780136009290). Chapter 6.4: Hermitian Matrices includes 29 full step-by-step solutions; since those 29 problems have been answered, more than 8,401 students have viewed full step-by-step solutions from this chapter.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases, each basis gives unique c's. A vector space has many bases!

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
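The column picture can be checked numerically; the 2 x 2 system below is a made-up example:

```python
import numpy as np

# Hypothetical system Ax = b
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])

x = np.linalg.solve(A, b)                  # solvable, so b lies in C(A)
combo = x[0] * A[:, 0] + x[1] * A[:, 1]    # the same combination of the columns
```

The solution coefficients x reproduce b as a combination of the columns of A.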

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
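A minimal sketch of the Gauss-Jordan inversion described above (partial pivoting is added for numerical stability; the example matrix is hypothetical):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented block [A | I] to [I | A^-1]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # Partial pivoting: swap in the row with the largest pivot candidate
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]
        # Eliminate this column from every other row
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = gauss_jordan_inverse(A)
```

For this A (det A = 1) the result is [[1, -1], [-1, 2]].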

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Independent vectors v1, ..., vk.
No combination c1v1 + ... + ckvk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
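Independence of the columns can be tested via rank; a small made-up example:

```python
import numpy as np

# Columns are independent exactly when Ax = 0 forces x = 0,
# i.e. when rank(A) equals the number of columns
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
independent = np.linalg.matrix_rank(A) == A.shape[1]
```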

Inverse matrix A^-1.
Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
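The product and transpose rules for inverses are easy to verify numerically, assuming A and B are invertible (random matrices almost surely are):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# (AB)^-1 = B^-1 A^-1
lhs1 = np.linalg.inv(A @ B)
rhs1 = np.linalg.inv(B) @ np.linalg.inv(A)

# (A^T)^-1 = (A^-1)^T
lhs2 = np.linalg.inv(A.T)
rhs2 = np.linalg.inv(A).T
```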

Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

Kronecker product (tensor product) A ⊗ B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).
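A small made-up example confirming the block structure and the eigenvalue rule for the Kronecker product:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])   # eigenvalues 1, 3 (triangular)
B = np.array([[4.0, 0.0],
              [1.0, 5.0]])   # eigenvalues 4, 5 (triangular)

K = np.kron(A, B)            # 4x4 matrix of blocks a_ij * B

# Eigenvalues of A ⊗ B are all products λ_p(A) * λ_q(B)
eigA = np.linalg.eigvals(A).real
eigB = np.linalg.eigvals(B).real
products = np.sort([lp * lq for lp in eigA for lq in eigB])
eigK = np.sort(np.linalg.eigvals(K).real)
```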

|A^-1| = 1/|A| and |A^T| = |A|.
The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n - 1, volume of box = |det(A)|.
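These determinant identities can be spot-checked on a hypothetical 2 x 2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

det_A = np.linalg.det(A)                    # 2*3 - 1*1 = 5
det_inv = np.linalg.det(np.linalg.inv(A))   # should equal 1 / det_A
det_T = np.linalg.det(A.T)                  # should equal det_A
```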

Normal matrix.
If N N^H = N^H N, then N has orthonormal (complex) eigenvectors.
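A 90-degree rotation matrix is normal without being symmetric; a quick check of the definition and of the orthonormal (complex) eigenvectors:

```python
import numpy as np

# Rotation by 90 degrees: not symmetric, but normal
N = np.array([[0.0, -1.0],
              [1.0,  0.0]])

is_normal = np.allclose(N @ N.conj().T, N.conj().T @ N)

# Its eigenvalues are +/- i; the eigenvectors are complex and orthonormal
eigvals, V = np.linalg.eig(N)
orthonormal = np.allclose(V.conj().T @ V, np.eye(2))
```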

Nullspace N(A)
= All solutions to Ax = 0. Dimension n - r = (# columns) - rank.
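Rank-nullity in action on a made-up 3 x 4 matrix whose last two columns are combinations of the first two:

```python
import numpy as np

# Column 3 = 2 * column 1, column 4 = column 1 + column 2, so rank is 2
A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 0.0, 1.0],
              [1.0, 1.0, 2.0, 2.0]])

n = A.shape[1]
r = np.linalg.matrix_rank(A)
nullspace_dim = n - r   # dimension of N(A) = (# columns) - rank
```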

Outer product uv^T
= column times row = rank-one matrix.
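A quick check that an outer product has rank one (the vectors are chosen arbitrarily):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

M = np.outer(u, v)              # column u times row v^T: a 3x2 matrix
rank = np.linalg.matrix_rank(M)  # every row is a multiple of v, so rank 1
```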

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T Ax > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
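Both tests for positive definiteness, positive eigenvalues and a successful Cholesky factorization, on a small example matrix:

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])   # symmetric; eigenvalues 1 and 3

eigvals = np.linalg.eigvalsh(A)
pos_eigs = bool(np.all(eigvals > 0))

# Cholesky succeeds exactly when a symmetric matrix is positive definite
L = np.linalg.cholesky(A)       # A = L L^T with positive diagonal in L
```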

Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
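The projection formula, sketched on made-up vectors a and b:

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])

# Projection of b onto the line through a: p = a (a^T b / a^T a)
p = a * (a @ b) / (a @ a)

# Projection matrix P = a a^T / a^T a: rank 1, and P applied to b gives p
P = np.outer(a, a) / (a @ a)
```

Note that P is idempotent: projecting twice changes nothing.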

Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.

Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
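Any R^T R is positive semidefinite; choosing R with dependent columns makes one eigenvalue exactly zero:

```python
import numpy as np

# R has dependent columns (second = 2 * first), so R^T R is singular
R = np.array([[1.0, 2.0],
              [2.0, 4.0]])
A = R.T @ R                      # [[5, 10], [10, 20]]

eigvals = np.linalg.eigvalsh(A)  # eigenvalues 0 and 25
semidefinite = bool(np.all(eigvals >= -1e-12))
```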

Singular Value Decomposition
(SVD) A = UΣV^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), Av_i = σ_i u_i with singular value σ_i > 0. Last columns are orthonormal bases of the nullspaces.
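The defining relation Av_i = σ_i u_i can be verified with NumPy's SVD (the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# A = U diag(s) Vt, with s sorted in decreasing order
U, s, Vt = np.linalg.svd(A)

# Row i of Vt is v_i; check A v_i = sigma_i u_i for every singular triple
check = [np.allclose(A @ Vt[i], s[i] * U[:, i]) for i in range(len(s))]
```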

Transpose matrix AT.
Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^-1)^T.
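The transpose rules, checked on random rectangular matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 2))

# (AB)^T = B^T A^T
lhs = (A @ B).T
rhs = B.T @ A.T

# A^T A is square, symmetric, and positive semidefinite
G = A.T @ A
symmetric = bool(np.allclose(G, G.T))
psd = bool(np.all(np.linalg.eigvalsh(G) >= -1e-12))
```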