- 7.1: Solve the following lower triangular system of equations. 2x1 = 4 3...
- 7.2: Let A = LU. The sequence of transformations R2 - R1, R3 + R1, R3 - 4...
- 7.3: Solve the following system of equations using LU decomposition. 6x1...
- 7.4: Let A = LU. If you are given L and U how many arithmetic operations...
- 7.5: Find the condition number of the matrix 125 201 Let this matrix be th...
- 7.6: Find the condition numbers of the matrices () [i -] 1 -1 -1 0 0 0 0...
- 7.7: Prove that c(A) = c(A⁻¹).
- 7.8: Does c(A) define a linear mapping from Mnn to R?
- 7.9: Solve the following systems of equations using pivoting and scaling...
- 7.10: Solve the following system of equations using the Gauss-Seidel method...
- 7.11: Use the power method to determine the dominant eigenvalue and corre...
- 7.12: Use the power method with deflation to find all eigenvalues and eig...
- 7.13: Find a singular value decomposition of the following matrix A = [ -...
- 7.14: Let A be an m × n matrix and P be an m × m orthogonal matrix. Show t...
- 7.15: Find the unique minimum length least squares solution of the follow...
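The power method asked for in 7.11 and 7.12 can be sketched as below; the 2 × 2 matrix is an illustrative stand-in (eigenvalues 5 and 2), not the matrix from the exercise:

```python
import numpy as np

# Power method: repeatedly apply A and renormalize; the iterate turns
# toward the dominant eigenvector. Illustrative matrix with
# eigenvalues 5 and 2, so the dominant eigenvalue is 5.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
x = np.array([1.0, 0.0])       # arbitrary nonzero starting vector
for _ in range(50):
    x = A @ x
    x = x / np.linalg.norm(x)  # renormalize to avoid overflow
lam = x @ (A @ x)              # Rayleigh quotient (x has unit length)
```

For 7.12, deflation would subtract off the found eigenpair and iterate again to recover the remaining eigenvalues.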
Solutions for Chapter 7: Numerical Methods
Full solutions for Linear Algebra with Applications | 8th Edition
Characteristic equation det(A - λI) = 0.
The n roots are the eigenvalues of A.
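A quick numerical check of this definition on an illustrative 2 × 2 matrix: the characteristic polynomial is λ² - (tr A)λ + det A, and its roots match the eigenvalues.

```python
import numpy as np

# Characteristic polynomial of a 2x2 matrix: l^2 - (tr A) l + det A.
# Its roots are exactly the eigenvalues of A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(coeffs).real)     # roots of det(A - lI) = 0
eigs = np.sort(np.linalg.eigvals(A).real)  # eigenvalues computed directly
```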
Column space C (A) =
space of all combinations of the columns of A.
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½xᵀAx - xᵀb over growing Krylov subspaces.
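A minimal sketch of that idea in NumPy; the matrix and right side are illustrative, and in exact arithmetic CG reaches the solution of an n × n system in at most n steps:

```python
import numpy as np

# Conjugate gradient for symmetric positive definite A: minimizes
# 0.5 x^T A x - x^T b over growing Krylov subspaces.
def conjugate_gradient(A, b, tol=1e-10):
    x = np.zeros(len(b))
    r = b - A @ x              # residual = negative gradient
    p = r.copy()               # first search direction
    for _ in range(len(b)):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)   # exact line search step
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p         # next A-conjugate direction
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```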
Factorization A = LU.
If elimination takes A to U without row exchanges, then the lower triangular L with the multipliers ℓij (and ℓii = 1) brings U back to A.
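A sketch of recording the multipliers during elimination, assuming no row exchanges are needed (the 2 × 2 example matrix is illustrative):

```python
import numpy as np

# A = LU by elimination without row exchanges: U is the eliminated
# matrix, L stores the multipliers l_ij with 1s on the diagonal.
def lu_no_pivot(A):
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]  # multiplier l_ij
            U[i] -= L[i, j] * U[j]       # row operation R_i - l_ij R_j
    return L, U

A = np.array([[2.0, 1.0], [6.0, 8.0]])
L, U = lu_no_pivot(A)
```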
Free variable xi.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A⁻¹].
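A sketch of that procedure, with partial pivoting added for numerical safety (the plain [A I] description does not require it); the 2 × 2 matrix is illustrative:

```python
import numpy as np

# Gauss-Jordan: row-reduce [A | I] until the left half becomes I;
# the right half is then A^-1. Assumes A is invertible.
def gauss_jordan_inverse(A):
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])    # augmented [A | I]
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # partial pivoting
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                      # scale pivot row to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]     # clear the column
    return M[:, n:]                                # right half is A^-1

A = np.array([[2.0, 1.0], [5.0, 3.0]])
A_inv = gauss_jordan_inverse(A)
```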
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
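A small constructor illustrating the i + j dependence (the function name and sample sequence are illustrative):

```python
import numpy as np

# Hankel matrix: h_ij depends only on i + j, so every antidiagonal
# is constant. Built from a sequence c of length 2n - 1.
def hankel(c, n):
    return np.array([[c[i + j] for j in range(n)] for i in range(n)])

H = hankel([1, 2, 3, 4, 5], 3)
```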
Hypercube matrix PL.
Row n + 1 counts corners, edges, faces, ... of a cube in Rn.
Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and -).
Inverse matrix A⁻¹.
Square matrix with A⁻¹A = I and AA⁻¹ = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and Aᵀ are B⁻¹A⁻¹ and (A⁻¹)ᵀ. Cofactor formula (A⁻¹)ij = Cji / det A.
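The inverse identities can be spot-checked numerically on random matrices (seeded here so the examples are invertible):

```python
import numpy as np

# Check (AB)^-1 = B^-1 A^-1 and (A^T)^-1 = (A^-1)^T numerically.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # random matrices are invertible
B = rng.standard_normal((3, 3))   # with probability 1

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
```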
Left nullspace N (AT).
Nullspace of Aᵀ = "left nullspace" of A because yᵀA = 0ᵀ.
Multiplication Ax
= x1(column 1) + ... + xn(column n) = combination of columns.
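A quick numerical check of the column picture (illustrative numbers):

```python
import numpy as np

# Ax equals the combination of the columns of A with weights from x.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([2.0, -1.0])
combo = x[0] * A[:, 0] + x[1] * A[:, 1]   # x1(column 1) + x2(column 2)
```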
Particular solution xp.
Any solution to Ax = b; often xp has free variables = 0.
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
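One standard way to compute a polar decomposition is via the SVD A = UΣVᵀ: take Q = UVᵀ and H = VΣVᵀ. A sketch with an illustrative matrix:

```python
import numpy as np

# Polar decomposition A = QH from the SVD A = U S V^T:
# Q = U V^T is orthogonal, H = V S V^T is symmetric semidefinite.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt
H = Vt.T @ np.diag(s) @ Vt
```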
Schwarz inequality.
|v·w| ≤ ||v|| ||w||. Then |vᵀAw|² ≤ (vᵀAv)(wᵀAw) for positive definite A.
Semidefinite matrix A.
(Positive) semidefinite: all xᵀAx ≥ 0, all λ ≥ 0; A = any RᵀR.
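A quick check that A = RᵀR is semidefinite, since xᵀAx = ||Rx||² ≥ 0: with a rank-deficient R (illustrative here), the smallest eigenvalue is 0, so A is singular but still semidefinite.

```python
import numpy as np

# A = R^T R is positive semidefinite: all eigenvalues >= 0.
# R is 2x3, so A is 3x3 with rank <= 2 (hence singular).
R = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])
A = R.T @ R
eigs = np.linalg.eigvalsh(A)   # sorted ascending; all should be >= 0
```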
Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
Special solutions to As = O.
One free variable is si = 1, other free variables = 0.
Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
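A numerical check of both identities (illustrative matrices; note that AB ≠ BA here, yet the traces still agree):

```python
import numpy as np

# Trace = sum of diagonal entries = sum of eigenvalues,
# and Tr(AB) = Tr(BA) even when AB != BA.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
tr_A = np.trace(A)
eig_sum = np.sum(np.linalg.eigvals(A)).real
```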