- 7.1: Verify that each matrix is orthogonal, and find its inverse. (a) (b)
- 7.2: Prove: If Q is an orthogonal matrix, then each entry of Q is the sa...
- 7.3: Prove that if A is a positive definite symmetric matrix, and if u a...
- 7.4: Find the characteristic polynomial and the dimensions of the eigens...
- 7.5: Find a matrix P that orthogonally diagonalizes and determine the di...
- 7.6: Express each quadratic form in the matrix notation . (a) (b)
- 7.7: Classify the quadratic form as positive definite, negative definite...
- 7.8: Find an orthogonal change of variable that eliminates the cross pro...
- 7.9: Identify the type of conic section represented by each equation. (a...
- 7.10: Find a unitary matrix U that diagonalizes and determine the diagona...
- 7.11: Show that if U is a unitary matrix and then the product is also un...
- 7.12: Suppose that . (a) Show that iA is Hermitian. (b) Show that A is un...
Solutions for Chapter 7: Diagonalization and Quadratic Forms
Full solutions for Elementary Linear Algebra: Applications Version | 10th Edition
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Change of basis matrix M.
The old basis vectors v_j are combinations Σ m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2, set v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)
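A minimal sketch of this bookkeeping for n = 2 (the basis vectors and the matrix M below are invented for illustration): expanding the old basis vectors in the new basis and applying d = Mc produces the same vector either way.

```python
# New basis w1, w2 and change-of-basis matrix M: invented example values.
w1, w2 = [1.0, 0.0], [1.0, 1.0]
M = [[2.0, 1.0],
     [0.0, 3.0]]   # column j holds the w-coordinates of old basis vector v_j

# v1 = m11*w1 + m21*w2,  v2 = m12*w1 + m22*w2
v1 = [M[0][0] * w1[k] + M[1][0] * w2[k] for k in range(2)]
v2 = [M[0][1] * w1[k] + M[1][1] * w2[k] for k in range(2)]

c = [4.0, -1.0]                                                # old coordinates
d = [sum(M[i][j] * c[j] for j in range(2)) for i in range(2)]  # d = M c

old = [c[0] * v1[k] + c[1] * v2[k] for k in range(2)]  # c1*v1 + c2*v2
new = [d[0] * w1[k] + d[1] * w2[k] for k in range(2)]  # d1*w1 + d2*w2
```

Both `old` and `new` come out to the same vector, confirming d = Mc relates the two coordinate systems.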
Cofactor C_ij.
Remove row i and column j; multiply the determinant by (-1)^(i+j).
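A pure-Python sketch of this rule for a 3x3 matrix (the matrix A is invented for the example): deleting row i and column j gives the minor, the sign (-1)^(i+j) gives the cofactor, and expanding along any row recovers the determinant.

```python
def det2(m):
    # Determinant of a 2x2 matrix.
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cofactor(A, i, j):
    # Remove row i and column j, then apply the sign (-1)**(i+j).
    minor = [[A[r][c] for c in range(3) if c != j] for r in range(3) if r != i]
    return (-1) ** (i + j) * det2(minor)

A = [[2, 0, 1],
     [3, 4, 5],
     [6, 7, 8]]
# Cofactor expansion of det A along row 0: sum of A[0][j] * C_0j
detA = sum(A[0][j] * cofactor(A, 0, j) for j in range(3))
```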
Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
Complex conjugate.
The conjugate of z = a + ib is z̄ = a - ib. Then z z̄ = |z|^2.
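Python's built-in complex type makes this identity easy to check (the value of z is arbitrary):

```python
z = complex(3, 4)        # z = a + ib with a = 3, b = 4
zbar = z.conjugate()     # a - ib
prod = z * zbar          # z times its conjugate: a real number, |z|^2
```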
Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -e_ij in the i, j entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.
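A small sketch (the matrix A and the multiplier 3 are invented for illustration): multiplying by E_21, the identity with -3 in entry (2, 1), subtracts 3 times row 1 from row 2 and leaves the other rows alone.

```python
def matmul(X, Y):
    # Plain triple-loop matrix product.
    return [[sum(X[r][t] * Y[t][c] for t in range(len(Y)))
             for c in range(len(Y[0]))] for r in range(len(X))]

# E_21 with multiplier 3: identity plus -3 in entry (2, 1).
E = [[1, 0, 0],
     [-3, 1, 0],
     [0, 0, 1]]
A = [[1, 2, 0],
     [3, 8, 1],
     [0, 4, 5]]
EA = matmul(E, A)   # row 2 of EA is (row 2 of A) - 3 * (row 1 of A)
```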
Fourier matrix F.
Entries F_jk = e^(2πijk/n) give orthogonal columns: F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ c_k e^(2πijk/n).
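The column orthogonality can be checked directly with cmath (n = 4 here, but any n works): the conjugate-transpose product of F with itself is n times the identity.

```python
import cmath

n = 4
# Fourier matrix entries F[j][k] = e^(2*pi*i*j*k/n)
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)] for j in range(n)]

# G = (conjugate of F)^T times F; should equal n * I
G = [[sum(F[t][r].conjugate() * F[t][c] for t in range(n)) for c in range(n)]
     for r in range(n)]
```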
Free variable Xi.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
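The two edge counts can be checked with itertools (n = 5 is an arbitrary choice; the path graph below is just one possible tree):

```python
from itertools import combinations

n = 5
# Complete graph: every pair of the n nodes is an edge.
complete_edges = list(combinations(range(n), 2))
# A path 0-1-2-3-4 is a tree: n - 1 edges, no closed loops.
tree_edges = [(i, i + 1) for i in range(n - 1)]
```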
Hermitian matrix A^H = Ā^T = A.
Complex analog ā_ji = a_ij of a symmetric matrix.
Lucas numbers.
L_n = 2, 1, 3, 4, ... satisfy L_n = L_(n-1) + L_(n-2) = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
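A quick check that the recurrence L_n = L_(n-1) + L_(n-2) and the eigenvalue formula λ_1^n + λ_2^n agree on the first ten Lucas numbers:

```python
phi = (1 + 5 ** 0.5) / 2          # lambda_1 = (1 + sqrt 5) / 2
psi = (1 - 5 ** 0.5) / 2          # lambda_2 = (1 - sqrt 5) / 2

L = [2, 1]                        # L_0 = 2, L_1 = 1
while len(L) < 10:
    L.append(L[-1] + L[-2])       # L_n = L_{n-1} + L_{n-2}

# Closed form: lambda_1^n + lambda_2^n (round off float error)
closed = [round(phi ** n + psi ** n) for n in range(10)]
```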
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
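For a 2x2 example with distinct eigenvalues (the matrix A below is invented), the minimal polynomial equals the characteristic polynomial, and Cayley-Hamilton gives p(A) = A^2 - (tr A)A + (det A)I = zero matrix:

```python
A = [[2, 1],
     [0, 3]]                      # eigenvalues 2 and 3, not repeated
tr = A[0][0] + A[1][1]            # trace = 5
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # determinant = 6

# A squared, by the plain product formula
A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

# p(A) = A^2 - tr*A + det*I entry by entry; should be the zero matrix
P = [[A2[i][j] - tr * A[i][j] + (det if i == j else 0) for j in range(2)]
     for i in range(2)]
```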
Nullspace N(A).
All solutions to Ax = 0. Dimension n - r = (# columns) - rank.
Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.
Pascal matrix.
P_S = pascal(n) = the symmetric matrix with binomial entries C(i+j-2, i-1). P_S = P_L P_U all contain Pascal's triangle with det = 1 (see Pascal in the index).
Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T Ax > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
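A 2x2 sketch (the entries a, b, c are invented): the pivots are d_1 = a and d_2 = c - b^2/a, and completing the square, x^T Ax = d_1(x_1 + (b/a)x_2)^2 + d_2 x_2^2, shows the form is positive whenever both pivots are.

```python
# A = [[a, b], [b, c]] with a = 4, b = 2, c = 3: an invented example
a, b, c = 4.0, 2.0, 3.0
d1 = a                    # first pivot
d2 = c - b * b / a        # second pivot after one elimination step

def q(x1, x2):
    # The quadratic form x^T A x written out for this 2x2 A.
    return a * x1 * x1 + 2 * b * x1 * x2 + c * x2 * x2

# Sample the form at a few nonzero points; all values should be positive.
vals = [q(1, 0), q(0, 1), q(-1, 2), q(0.5, -0.3)]
```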
Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.
Rotation matrix R = [[c, -s], [s, c]].
R rotates the plane by θ and R^(-1) = R^T rotates back by -θ. Eigenvalues are e^(iθ) and e^(-iθ), eigenvectors are (1, ∓i). c, s = cos θ, sin θ.
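Both facts can be verified numerically (θ = 0.7 is arbitrary): R^T R = I, and R applied to (1, -i) scales it by e^(iθ).

```python
import math

theta = 0.7
c, s = math.cos(theta), math.sin(theta)
R = [[c, -s],
     [s, c]]

# R^T R should be the identity, so R^{-1} = R^T.
RtR = [[sum(R[k][i] * R[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]

# Eigenvector (1, -i) with eigenvalue e^{i*theta}:
v = [1, -1j]
Rv = [R[0][0] * v[0] + R[0][1] * v[1],
      R[1][0] * v[0] + R[1][1] * v[1]]
lam = complex(math.cos(theta), math.sin(theta))   # e^{i*theta}
```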
Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂x_i∂x_j = Hessian matrix) is indefinite.
Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.
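A small sketch of shift invariance (the filter and input are invented): a Toeplitz matrix with 1s on the main diagonal and the first subdiagonal applies the same 2-tap sum y_i = x_i + x_(i-1) at every position.

```python
n = 5
# Toeplitz matrix: 1 on the main diagonal and the first subdiagonal,
# 0 elsewhere -- constant down each diagonal.
T = [[1 if i == j or i == j + 1 else 0 for j in range(n)] for i in range(n)]

x = [1, 2, 3, 4, 5]
# y = T x: each y_i is x_i + x_{i-1} (with x_{-1} treated as 0).
y = [sum(T[i][j] * x[j] for j in range(n)) for i in range(n)]
```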