
# Solutions for Chapter 7.8: Inverse of a Matrix. Gauss-Jordan Elimination

## Full solutions for Advanced Engineering Mathematics | 9th Edition

ISBN: 9780471488859


Chapter 7.8: Inverse of a Matrix. Gauss-Jordan Elimination includes 23 full step-by-step solutions. This expansive textbook survival guide covers this chapter along with the others and was created for the textbook Advanced Engineering Mathematics, 9th edition (ISBN: 9780471488859). Since all 23 problems in Chapter 7.8 have been answered, more than 44311 students have viewed full step-by-step solutions from this chapter.
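The chapter's central technique, inverting a matrix by Gauss-Jordan elimination, reduces the augmented matrix [A | I] to [I | A⁻¹]. A minimal sketch in pure Python (the matrix values are hypothetical; only partial pivoting is included as a safeguard):

```python
# Invert A by Gauss-Jordan elimination: row-reduce [A | I] until the
# left half becomes I; the right half is then A^-1.

def gauss_jordan_inverse(A):
    n = len(A)
    # Build the augmented matrix [A | I].
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest pivot candidate.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row so the pivot becomes 1.
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # Eliminate the pivot column from every other row (above and below).
        for r in range(n):
            if r != col:
                factor = M[r][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    # Right half of the augmented matrix is now A^-1.
    return [row[n:] for row in M]

A = [[2.0, 1.0], [5.0, 3.0]]   # det = 1, so the inverse is [[3, -1], [-5, 2]]
print(gauss_jordan_inverse(A))
```

Eliminating above the pivots as well as below is what distinguishes Gauss-Jordan from plain Gaussian elimination: the left block ends as I, not merely upper triangular.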

## Key math terms and definitions covered in this textbook
• Affine transformation

Tv = Av + v0 = linear transformation plus shift.

• Basis for V.

Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases; each basis gives unique c's.

• Companion matrix.

Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n−1) − λ^n).
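The definition above says the c's in the last row are exactly the coefficients of the characteristic polynomial. A small check with hypothetical values c1 = −6, c2 = 5, so that det(A − λI) = ±(λ^2 − 5λ + 6) and the eigenvalues should be 2 and 3:

```python
import numpy as np

# 2x2 companion matrix: a one above the diagonal, c1 and c2 in the last row.
c1, c2 = -6.0, 5.0            # hypothetical coefficients
A = np.array([[0.0, 1.0],
              [c1,  c2]])
# Its eigenvalues are the roots of lam^2 - c2*lam - c1 = lam^2 - 5*lam + 6.
print(sorted(np.linalg.eigvals(A).real))
```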

• Conjugate gradient method.

A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x − x^T b over growing Krylov subspaces.
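The Krylov-subspace minimization described above is the conjugate gradient iteration; a minimal sketch on a hypothetical symmetric positive definite system:

```python
import numpy as np

# Conjugate gradient for positive definite A x = b: each step does an exact
# line search along a direction p that is A-conjugate to the previous ones,
# minimizing (1/2) x^T A x - x^T b over a growing Krylov subspace.
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    x = np.zeros_like(b)
    r = b - A @ x              # residual = negative gradient
    p = r.copy()               # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p      # next A-conjugate direction
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))
```

In exact arithmetic the iteration reaches the solution of an n-by-n system in at most n steps; in practice it is stopped once the residual is small.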

• Covariance matrix Σ.

When random variables xi have mean = average value = 0, their covariances Σij are the averages of xi·xj. With means x̄i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the xi are independent.
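Computing Σ as the mean of the outer products (x − x̄)(x − x̄)^T over hypothetical sample data, and checking it is positive semidefinite:

```python
import numpy as np

# Sample covariance in the glossary's sense: subtract the means, then
# average the outer products (x - xbar)(x - xbar)^T over all samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))       # 1000 hypothetical samples of 3 variables
xbar = X.mean(axis=0)
D = X - xbar
Sigma = (D.T @ D) / len(X)           # mean of the outer products

# Positive semidefinite: every eigenvalue of Sigma is >= 0.
print(np.linalg.eigvalsh(Sigma))
```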

• Diagonalizable matrix A.

Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^(-1) A S = Λ = eigenvalue matrix.

• Diagonalization

Λ = S^(-1) A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^(-1).
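Verifying the factorization and the power formula on a hypothetical symmetric 2-by-2 matrix:

```python
import numpy as np

# Diagonalization: A = S Lambda S^-1 with eigenvectors in the columns of S.
A = np.array([[2.0, 1.0], [1.0, 2.0]])     # eigenvalues 1 and 3
lam, S = np.linalg.eig(A)
Lam = np.diag(lam)

print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))     # A = S Lambda S^-1
# Powers come for free: A^3 = S Lambda^3 S^-1 (Lambda^3 just cubes the diagonal).
print(np.allclose(np.linalg.matrix_power(A, 3),
                  S @ np.diag(lam**3) @ np.linalg.inv(S)))
```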

• Echelon matrix U.

The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

• Exponential e^(At) = I + At + (At)^2/2! + ⋯

has derivative A e^(At); e^(At) u(0) solves u' = Au.
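Summing the series directly gives a quick check. For the hypothetical choice A = [[0, 1], [−1, 0]], the exponential e^(At) is the rotation [[cos t, sin t], [−sin t, cos t]], so the truncated series should reproduce sines and cosines:

```python
import numpy as np

# Truncated series e^(At) = I + At + (At)^2/2! + ...  (a sketch; for serious
# work one would use a library routine such as scipy.linalg.expm).
def expm_series(A, t, terms=30):
    E = np.eye(len(A))
    P = np.eye(len(A))
    for k in range(1, terms):
        P = P @ (A * t) / k      # P accumulates (At)^k / k!
        E = E + P
    return E

A = np.array([[0.0, 1.0], [-1.0, 0.0]])
print(expm_series(A, 1.0))       # should match the rotation by t = 1
```

Then u(t) = e^(At) u(0) solves u' = Au, which for this A is the oscillator u1' = u2, u2' = −u1.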

• Hilbert matrix hilb(n).

Entries Hij = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.

• Hypercube matrix pl.

Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

• Identity matrix I (or In).

Diagonal entries = 1, off-diagonal entries = 0.

• Indefinite matrix.

A symmetric matrix with eigenvalues of both signs (+ and - ).

• Multiplication Ax

= x1(column 1) + ... + xn(column n) = combination of columns.
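The column picture of Ax in two lines, on a hypothetical 2-by-2 example:

```python
import numpy as np

# Ax as a combination of columns: x1*(column 1) + x2*(column 2).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
x = np.array([10.0, 1.0])
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1]
print(by_columns, A @ x)    # the two computations agree
```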

• Network.

A directed graph that has constants c1, ..., cm associated with the edges.

• Outer product uv T

= column times row = rank-one matrix.

• Projection matrix P onto subspace S.

Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A form a basis for S, then P = A(A^T A)^(-1) A^T.
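All of those properties can be checked at once on a hypothetical plane S in R^3 spanned by the columns of A:

```python
import numpy as np

# Projection onto the column space of A: P = A (A^T A)^-1 A^T.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])            # columns = basis for a plane S in R^3
P = A @ np.linalg.inv(A.T @ A) @ A.T

b = np.array([6.0, 0.0, 0.0])
p = P @ b                 # closest point to b in S
e = b - p                 # error, perpendicular to S

print(np.allclose(P @ P, P), np.allclose(P, P.T))   # P^2 = P = P^T
print(A.T @ e)            # ~0: e is perpendicular to every column of A
```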

• Vector space V.

Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.

• Volume of box.

The rows (or the columns) of A generate a box with volume |det(A)|.
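A quick sanity check on a hypothetical rectangular box with edge lengths 2, 3, 4:

```python
import numpy as np

# The rows of a diagonal A span an axis-aligned box; its volume is |det(A)|.
A = np.diag([2.0, 3.0, 4.0])
print(abs(np.linalg.det(A)))   # 2 * 3 * 4 = 24
```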

• Wavelets Wjk(t).

Stretch and shift the time axis to create wjk(t) = w00(2^j t − k).
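A common concrete choice for the mother wavelet w00 (an assumption here, not specified by the glossary) is the Haar wavelet: +1 on [0, 1/2), −1 on [1/2, 1), 0 elsewhere. The stretch-and-shift rule then reads directly as code:

```python
import numpy as np

# Haar mother wavelet w00: +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere.
def w00(t):
    t = np.asarray(t, dtype=float)
    return np.where((0 <= t) & (t < 0.5), 1.0,
           np.where((0.5 <= t) & (t < 1.0), -1.0, 0.0))

# wjk(t) = w00(2^j * t - k): compress by 2^j, shift by k.
def w(j, k, t):
    return w00(2**j * t - k)

t = np.array([0.1, 0.3, 0.6, 0.9])
print(w00(t))        # mother wavelet: +1 on the left half, -1 on the right
print(w(1, 1, t))    # w11 is supported on [1/2, 1): zero before t = 0.5
```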
