
# Solutions for Chapter 7.4: Matrix Norms and Condition Numbers

## Full solutions for Linear Algebra with Applications | 8th Edition

ISBN: 9780136009290


Chapter 7.4: Matrix Norms and Condition Numbers includes 47 full step-by-step solutions, all of which have been answered; more than 4186 students have viewed solutions from this chapter. This textbook survival guide was created for Linear Algebra with Applications, 8th edition (ISBN: 9780136009290), and covers that textbook's chapters and their solutions.
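Since the chapter itself concerns matrix norms and condition numbers, a short numerical illustration may be useful. The sketch below (using NumPy, and an arbitrary example matrix not taken from the textbook) checks the defining identity cond(A) = ‖A‖ · ‖A⁻¹‖ in the 2-norm.

```python
import numpy as np

# An arbitrary small invertible matrix for illustration.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Spectral (2-)norm: the largest singular value.
norm_A = np.linalg.norm(A, 2)
norm_A_inv = np.linalg.norm(np.linalg.inv(A), 2)

# Condition number in the 2-norm: cond(A) = ||A|| * ||A^{-1}||.
cond_A = np.linalg.cond(A, 2)

assert np.isclose(cond_A, norm_A * norm_A_inv)
```

A large condition number signals that solutions of Ax = b are sensitive to perturbations in b; cond(A) = 1 is the best possible (attained by orthogonal matrices).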

## Key Math Terms and Definitions Covered in This Textbook
• Adjacency matrix of a graph.

Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).

• Cayley-Hamilton Theorem.

The characteristic polynomial p(λ) = det(A − λI) satisfies p(A) = zero matrix.
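A quick numerical check of the theorem (a NumPy sketch with an arbitrary example matrix): for a 2 by 2 matrix the characteristic polynomial is p(λ) = λ² − (trace A)λ + det A, and substituting A itself gives the zero matrix.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# For a 2x2 matrix: p(lambda) = lambda^2 - tr(A)*lambda + det(A).
tr = np.trace(A)
det = np.linalg.det(A)

# Cayley-Hamilton: p(A) = A^2 - tr(A)*A + det(A)*I is the zero matrix.
p_of_A = A @ A - tr * A + det * np.eye(2)

assert np.allclose(p_of_A, np.zeros((2, 2)))
```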

• Change of basis matrix M.

The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = Mc. (For n = 2: v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)

• Complex conjugate

z̄ = a − ib for any complex number z = a + ib. Then z z̄ = |z|².

• Echelon matrix U.

The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

• Eigenvalue λ and eigenvector x.

Ax = λx with x ≠ 0, so det(A − λI) = 0.
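The defining equation can be verified numerically. The sketch below (NumPy, with an arbitrary symmetric example matrix so the eigenvalues are real) checks both Ax = λx and det(A − λI) = 0.

```python
import numpy as np

# A symmetric example matrix, so its eigenvalues are real.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues lam and eigenvectors (the columns of X).
lam, X = np.linalg.eig(A)

for k in range(2):
    x = X[:, k]
    # Defining equation: A x = lambda x with x != 0.
    assert np.allclose(A @ x, lam[k] * x)
    # Equivalently, A - lambda I is singular: det(A - lambda I) = 0.
    assert np.isclose(np.linalg.det(A - lam[k] * np.eye(2)), 0.0)
```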

• Exponential e^{At} = I + At + (At)²/2! + ...

has derivative Ae^{At}; e^{At}u(0) solves u' = Au.

• Factorization

A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
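The elimination described above can be sketched directly. This is a minimal Doolittle-style factorization without row exchanges (it assumes nonzero pivots; the example matrix is arbitrary), storing the multipliers ℓ_ij in L.

```python
import numpy as np

def lu_no_pivot(A):
    """LU factorization without row exchanges (assumes nonzero pivots)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]   # multiplier l_ij
            U[i, :] -= L[i, j] * U[j, :]  # eliminate below the pivot
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)

# L (unit lower triangular, holding the multipliers) brings U back to A.
assert np.allclose(L @ U, A)
assert np.allclose(U, np.triu(U))
```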

• Hankel matrix H.

Constant along each antidiagonal; h_ij depends on i + j.

• Hessenberg matrix H.

Triangular matrix with one extra nonzero adjacent diagonal.

• Hypercube matrix pl.

Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

• Jordan form J = M⁻¹AM.

If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

• Left inverse A+.

If A has full column rank n, then A+ = (A^T A)^{-1} A^T has A+ A = I_n.
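This formula is easy to verify numerically. The sketch below (NumPy, with an arbitrary tall full-column-rank example) builds A+ from the formula and checks that it is a left inverse, though not a two-sided one.

```python
import numpy as np

# A tall matrix with full column rank n = 2.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Left inverse from the formula: A+ = (A^T A)^{-1} A^T.
A_plus = np.linalg.inv(A.T @ A) @ A.T

# A+ A = I_n; note A A+ is only a projection, not I, since A is not square.
assert np.allclose(A_plus @ A, np.eye(2))

# With full column rank this agrees with the Moore-Penrose pseudoinverse.
assert np.allclose(A_plus, np.linalg.pinv(A))
```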

• Pivot.

The diagonal entry (first nonzero) at the time when a row is used in elimination.

• Plane (or hyperplane) in Rn.

Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

• Semidefinite matrix A.

(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.

• Solvable system Ax = b.

The right side b is in the column space of A.

• Standard basis for Rn.

Columns of the n by n identity matrix (written i, j, k in R^3).

• Vandermonde matrix V.

Vc = b gives the coefficients of p(x) = c_0 + ... + c_{n−1} x^{n−1} with p(x_i) = b_i. V_ij = (x_i)^{j−1} and det V = product of (x_k − x_i) for k > i.
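Both claims, the interpolation property and the determinant product formula, can be checked in a few lines (NumPy sketch; the interpolation points are an arbitrary example):

```python
import numpy as np

# Fit p(x) = c0 + c1 x + c2 x^2 through three points.
x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 9.0])   # target values b_i = p(x_i)

# Vandermonde matrix with V_ij = x_i^(j-1).
V = np.vander(x, increasing=True)

# Solve V c = b for the polynomial coefficients.
c = np.linalg.solve(V, b)
assert np.allclose(V @ c, b)

# det V = product of (x_k - x_i) over k > i.
det_formula = np.prod([x[k] - x[i] for k in range(3) for i in range(k)])
assert np.isclose(np.linalg.det(V), det_formula)
```

For distinct points x_i the product formula shows det V ≠ 0, so the interpolating polynomial exists and is unique.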

• Wavelets w_jk(t).

Stretch and shift the time axis to create w_jk(t) = w_00(2^j t − k).
