- 9.5.1: Apply two iterations of the QR method without shifting to the follo...
- 9.5.2: Apply two iterations of the QR method without shifting to the follo...
- 9.5.3: Use the QR Algorithm to determine, to within 10⁻⁵, all the eigenvalu...
- 9.5.4: Use the QR Algorithm to determine, to within 10⁻⁵, all the eigenvalu...
- 9.5.5: Use the Inverse Power method to determine, to within 10⁻⁵, the eigen...
- 9.5.6: Use the Inverse Power method to determine, to within 10⁻⁵, the eigen...
- 9.5.7: a. Show that the rotation matrix [cos θ, sin θ; −sin θ, cos θ] applied to the vec...
- 9.5.8: Let P be the rotation matrix with p_ii = p_jj = cos θ and p_ij = −p_ji = sin θ ...
- 9.5.9: Show that the product of an upper triangular matrix (on the left) a...
- 9.5.10: Let Pk denote a rotation matrix of the form given in (9.17). a. Sho...
- 9.5.11: Jacobi's method for a symmetric matrix A is described by A1 = A, A2 ...
- 9.5.12: Repeat Exercise 3 using the Jacobi method.
- 9.5.13: In the lead example of this chapter, the linear system Aw = 0.04(/p...
- 9.5.14: The (m − 1) × (m − 1) tridiagonal matrix A = ...
- 9.5.15: The eigenvalues of the matrix A in Exercise 14 are λ_i = 1 − 4 sin²(iπ/(2m))...
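The unshifted QR method asked for in Exercises 9.5.1-9.5.4 can be sketched in pure Python: factor A = QR (here via Givens rotations, the same plane rotations that appear in Exercises 9.5.7-9.5.10) and form RQ, which is similar to A; repeating drives a symmetric matrix toward a diagonal of eigenvalues. The 2 × 2 matrix below is a made-up illustration, not one of the textbook's exercise matrices.

```python
import math

def givens_qr(A):
    """QR factorization of a square matrix via Givens rotations."""
    n = len(A)
    R = [row[:] for row in A]
    Q = [[float(i == j) for j in range(n)] for i in range(n)]
    for j in range(n - 1):
        for i in range(j + 1, n):
            if R[i][j] != 0.0:
                r = math.hypot(R[j][j], R[i][j])
                c, s = R[j][j] / r, R[i][j] / r
                for k in range(n):  # rotate rows j and i of R to zero R[i][j]
                    R[j][k], R[i][k] = c * R[j][k] + s * R[i][k], -s * R[j][k] + c * R[i][k]
                for k in range(n):  # accumulate the transposed rotation into Q
                    Q[k][j], Q[k][i] = c * Q[k][j] + s * Q[k][i], -s * Q[k][j] + c * Q[k][i]
    return Q, R

def qr_step(A):
    """One unshifted QR step: factor A = QR, return RQ (similar to A)."""
    Q, R = givens_qr(A)
    n = len(A)
    return [[sum(R[i][k] * Q[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A = [[3.0, 1.0], [1.0, 3.0]]  # symmetric; eigenvalues are 4 and 2
for _ in range(20):
    A = qr_step(A)
print([round(A[i][i], 6) for i in range(2)])  # diagonal -> [4.0, 2.0]
```

Without shifting, the off-diagonal entries decay only like (λ₂/λ₁)ᵏ, which is why the textbook introduces shifts to accelerate convergence.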
Solutions for Chapter 9.5: The QR Algorithm
Full solutions for Numerical Analysis | 9th Edition
Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = Aᵀ when edges go both ways (undirected).
Big formula for n by n determinants.
Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, with rows in order 1, ..., n and column order given by a permutation P. Each of the n! permutations P has a + or − sign.
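The big formula can be written out directly with the standard library; a minimal sketch (the matrices below are made-up examples):

```python
from itertools import permutations

def sign(p):
    """Sign of a permutation: +1 if even, -1 if odd (count inversions)."""
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def big_formula_det(A):
    """det(A) as a signed sum over all n! column permutations P."""
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        term = sign(p)
        for i in range(n):
            term *= A[i][p[i]]  # one entry from each row i and column p[i]
        total += term
    return total

print(big_formula_det([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -> -3
```

The n! growth makes this formula a definition rather than an algorithm; elimination computes the same determinant from the product of pivots.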
Cross product u × v in R³:
Vector perpendicular to u and v; length ‖u‖ ‖v‖ |sin θ| = area of the parallelogram; u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
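Expanding that symbolic determinant along its first row gives the component formula; a minimal check in pure Python (the vectors are made-up examples):

```python
import math

def cross(u, v):
    """u x v from expanding the determinant of [i j k; u1 u2 u3; v1 v2 v3]."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

u, v = [1.0, 0.0, 0.0], [0.0, 2.0, 0.0]
w = cross(u, v)
print(w)                        # [0.0, 0.0, 2.0], perpendicular to u and v
print(math.sqrt(dot(w, w)))     # 2.0 = area of the parallelogram
```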
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Fibonacci numbers.
0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn−1 + Fn−2 = (λ₁ⁿ − λ₂ⁿ)/(λ₁ − λ₂). Growth rate λ₁ = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
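Powers of the Fibonacci matrix generate the sequence, and the ratio of successive terms approaches the growth rate λ₁; a minimal sketch:

```python
def matmul2(A, B):
    """Product of two 2x2 matrices."""
    return [[A[0][0] * B[0][0] + A[0][1] * B[1][0], A[0][0] * B[0][1] + A[0][1] * B[1][1]],
            [A[1][0] * B[0][0] + A[1][1] * B[1][0], A[1][0] * B[0][1] + A[1][1] * B[1][1]]]

F = [[1, 1], [1, 0]]      # Fibonacci matrix
P = [[1, 0], [0, 1]]      # identity; becomes F^n
fibs = []
for _ in range(10):
    P = matmul2(P, F)
    fibs.append(P[0][1])  # the (1,2) entry of F^n is F_n
print(fibs)               # -> [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
print(fibs[-1] / fibs[-2])  # approaches (1 + sqrt(5))/2
```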
Fourier matrix F.
Entries Fjk = e^(2πijk/n) give orthogonal columns: F̄ᵀF = nI. Then y = Fc is the (inverse) Discrete Fourier Transform yj = Σ ck e^(2πijk/n).
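The column orthogonality can be verified numerically with the standard library's `cmath`; a minimal sketch for n = 4:

```python
import cmath

n = 4
# Fourier matrix: F[j][k] = e^(2*pi*i*j*k/n)
F = [[cmath.exp(2 * cmath.pi * 1j * j * k / n) for k in range(n)] for j in range(n)]

# Conjugate-transpose times F should be n times the identity
G = [[sum(F[m][j].conjugate() * F[m][k] for m in range(n)) for k in range(n)]
     for j in range(n)]
print([[round(abs(G[j][k]), 6) for k in range(n)] for j in range(n)])  # n on the diagonal, 0 elsewhere
```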
Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A⁻¹].
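A minimal sketch of that reduction, using exact fractions so no roundoff appears; the 2 × 2 matrix is a made-up example:

```python
from fractions import Fraction

def gauss_jordan_inverse(A):
    """Row-reduce the augmented matrix [A | I] until it reads [I | A^-1]."""
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        pivot = next(r for r in range(col, n) if M[r][col] != 0)  # partial pivot
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]          # scale pivot row to make pivot 1
        for r in range(n):
            if r != col and M[r][col] != 0:       # eliminate above and below
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]                 # right half is A^-1

Ainv = gauss_jordan_inverse([[2, 1], [1, 1]])
print([[int(x) for x in row] for row in Ainv])    # -> [[1, -1], [-1, 2]]
```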
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
Hermitian matrix A^H = Āᵀ = A.
Complex analog a_ji = ā_ij of a symmetric matrix.
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Markov matrix M.
All mij > 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady-state eigenvector s: Ms = s > 0.
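Repeated multiplication shows the columns converging to the steady state; a minimal power-iteration sketch with a made-up 2 × 2 column-stochastic matrix:

```python
def markov_step(M, v):
    """One step of the Markov chain: v -> M v."""
    n = len(v)
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

M = [[0.8, 0.3],
     [0.2, 0.7]]   # all entries > 0, each column sums to 1
v = [1.0, 0.0]     # start in state 1; entries of v always sum to 1
for _ in range(60):
    v = markov_step(M, v)
print([round(x, 6) for x in v])  # -> [0.6, 0.4], the eigenvector with M s = s
```

The second eigenvalue here is 0.5, so the distance to the steady state halves at every step.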
Normal matrix N.
If NNᵀ = NᵀN, then N has orthonormal (complex) eigenvectors.
Nullspace N(A).
All solutions to Ax = 0. Dimension n − r = (# columns) − rank.
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Pascal matrix Ps.
Ps = pascal(n) = the symmetric matrix with binomial entries C(i+j−2, i−1). Ps = P_L P_U; all three contain Pascal's triangle, with det = 1 (see Pascal in the index).
Rotation matrix R.
R = [c −s; s c] rotates the plane by θ, and R⁻¹ = Rᵀ rotates back by −θ. Eigenvalues are e^(iθ) and e^(−iθ); eigenvectors are (1, ±i). Here c, s = cos θ, sin θ.
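Both claims — that the transpose undoes the rotation and that (1, i) is an eigenvector with eigenvalue e^(−iθ) — can be checked numerically; a minimal sketch with a made-up angle:

```python
import cmath
import math

theta = 0.7
c, s = math.cos(theta), math.sin(theta)
R = [[c, -s], [s, c]]   # rotates the plane by theta

# R^T R = I, so the inverse rotation is just the transpose
RtR = [[sum(R[k][i] * R[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
print([[round(x, 12) for x in row] for row in RtR])  # -> identity

# Eigenvector (1, i) with eigenvalue e^(-i*theta)
x = [1, 1j]
Rx = [R[0][0] * x[0] + R[0][1] * x[1], R[1][0] * x[0] + R[1][1] * x[1]]
lam = cmath.exp(-1j * theta)
print(abs(Rx[0] - lam * x[0]) < 1e-12 and abs(Rx[1] - lam * x[1]) < 1e-12)  # True
```

These are exactly the plane rotations the QR Algorithm uses (as Givens rotations) to zero out entries one at a time.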
Row picture of Ax = b.
Each equation gives a plane in Rn; the planes intersect at x.
Symmetric matrix A.
The transpose is Aᵀ = A, and aij = aji. A⁻¹ is also symmetric.
Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.
Tridiagonal matrix T: tij = 0 if |i − j| > 1.
T⁻¹ has rank 1 above and below the diagonal.