
# Solutions for Chapter 4.5: Basis and Dimension

## Full solutions for Elementary Linear Algebra | 8th Edition

ISBN: 9781305658004


This expansive textbook survival guide covers the following chapters and their solutions. This textbook survival guide was created for the textbook Elementary Linear Algebra, edition 8, which is associated with the ISBN 9781305658004. Chapter 4.5: Basis and Dimension includes 172 full step-by-step solutions. Since all 172 problems in chapter 4.5: Basis and Dimension have been answered, more than 44190 students have viewed full step-by-step solutions from this chapter.

## Key Math Terms and definitions covered in this textbook
• Affine transformation

T(v) = Av + v₀ = linear transformation plus shift.
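As a quick numerical sketch of the definition above, the affine map T(v) = Av + v₀ can be checked in NumPy (the matrix A, a 90° rotation, and the shift v₀ are made-up values for illustration):

```python
import numpy as np

# Made-up 2x2 example: affine map T(v) = A v + v0
# (rotation by 90 degrees plus a shift).
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
v0 = np.array([3.0, 1.0])

def T(v):
    """Affine transformation: linear part A @ v plus shift v0."""
    return A @ v + v0

v = np.array([1.0, 2.0])
result = T(v)   # A v = (-2, 1), then shift by v0 gives (1, 2)
```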

• Column picture of Ax = b.

The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).

• Commuting matrices AB = BA.

If diagonalizable, they share n eigenvectors.

• Complete solution x = xp + xn to Ax = b.

(Particular xp) + (xn in nullspace).
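The split x = xp + xn can be demonstrated numerically; the 2×3 matrix A and right-hand side b below are made-up, and the nullspace vector xn is read off from the free column:

```python
import numpy as np

# Complete solution x = xp + xn for a made-up underdetermined system.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
b = np.array([4.0, 5.0])

# One particular solution (least-squares gives the minimum-norm one here).
xp, *_ = np.linalg.lstsq(A, b, rcond=None)

# Column 2 is free, so xn = (-2, 1, 0) satisfies A xn = 0.
xn = np.array([-2.0, 1.0, 0.0])

# Any multiple of xn can be added and Ax = b still holds.
x = xp + 3.7 * xn
```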

• Cramer's Rule for Ax = b.

Bⱼ has b replacing column j of A; xⱼ = det Bⱼ / det A.
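Cramer's rule as stated above translates directly into a few lines of NumPy; the 2×2 system here is made up for illustration:

```python
import numpy as np

# Cramer's rule: x_j = det(B_j) / det(A),
# where B_j is A with column j replaced by b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b                                  # replace column j by b
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

# The result agrees with a direct solve of Ax = b.
```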

• Diagonalizable matrix A.

Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
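The factorization S⁻¹AS = Λ can be verified numerically; the symmetric 2×2 matrix below is a made-up example, chosen because symmetry guarantees n independent eigenvectors:

```python
import numpy as np

# Diagonalization: S holds the eigenvectors, so S^{-1} A S = Lambda.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, S = np.linalg.eig(A)     # columns of S are eigenvectors of A
Lam = np.linalg.inv(S) @ A @ S    # should equal diag(eigvals)
```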

• Dimension of vector space

dim(V) = number of vectors in any basis for V.

• Ellipse (or ellipsoid) x T Ax = 1.

A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A⁻¹y‖² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σᵢ.)

• Fourier matrix F.

Entries Fⱼₖ = e^(2πijk/n) give orthogonal columns: F̄ᵀF = nI. Then y = Fc is the (inverse) Discrete Fourier Transform yⱼ = Σ cₖ e^(2πijk/n).
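The orthogonality relation F̄ᵀF = nI is easy to verify numerically; n = 4 below is an arbitrary small choice:

```python
import numpy as np

# Fourier matrix: F_jk = e^{2*pi*i*j*k/n}.
n = 4
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(2j * np.pi * j * k / n)

# Conjugate transpose times F gives n * I (orthogonal columns).
gram = F.conj().T @ F
```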

• Free variable xᵢ.

Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).

• Full column rank r = n.

Independent columns, N(A) = {0}, no free variables.

• Inverse matrix A⁻¹.

Square matrix with A⁻¹A = I and AA⁻¹ = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and Aᵀ are B⁻¹A⁻¹ and (A⁻¹)ᵀ. Cofactor formula: (A⁻¹)ᵢⱼ = Cⱼᵢ / det A.

• Jordan form J = M⁻¹AM.

If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J₁, ..., Jₛ). The block Jₖ is λₖIₖ + Nₖ, where Nₖ has 1's on diagonal 1. Each block has one eigenvalue λₖ and one eigenvector.

• Left inverse A⁺.

If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.
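The formula A⁺ = (AᵀA)⁻¹Aᵀ can be checked directly; the 3×2 matrix below is made up, with two independent columns so the formula applies:

```python
import numpy as np

# Left inverse of a full-column-rank matrix: A+ = (A^T A)^{-1} A^T.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

A_plus = np.linalg.inv(A.T @ A) @ A.T
check = A_plus @ A   # A+ A = I_2; note A A+ is only a projection, not I_3
```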

• Linear combination cv + dw or Σ cⱼvⱼ.

• Plane (or hyperplane) in Rⁿ.

Vectors x with aᵀx = 0. The plane is perpendicular to a ≠ 0.

• Rank one matrix A = uvᵀ ≠ 0.

Column and row spaces = lines cu and cv.

• Row picture of Ax = b.

Each equation gives a plane in Rⁿ; the planes intersect at x.

• Singular Value Decomposition (SVD).

A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular value σᵢ > 0. Last columns are orthonormal bases of the nullspaces.
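The relation Avᵢ = σᵢuᵢ and the reconstruction A = UΣVᵀ can both be verified with NumPy's SVD; the rectangular matrix below is a made-up rank-2 example:

```python
import numpy as np

# SVD: A = U Sigma V^T, with A v_i = sigma_i u_i.
A = np.array([[3.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)   # U is 3x3, s has the singular values, Vt is 2x2

# Each right singular vector v_i (row i of Vt) maps to sigma_i * u_i.
checks = [np.allclose(A @ Vt[i], s[i] * U[:, i]) for i in range(len(s))]

# Reconstruction from the first r = 2 columns of U.
A_rebuilt = U[:, :2] @ np.diag(s) @ Vt
```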

• Skew-symmetric matrix K.

The transpose is −K, since Kᵢⱼ = −Kⱼᵢ. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
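Both properties above can be checked numerically. The 2×2 skew-symmetric K below is a made-up example, and the matrix exponential is computed with a simple truncated power series to keep the sketch dependency-free:

```python
import numpy as np

# Skew-symmetric K: K^T = -K.
K = np.array([[0.0, -2.0],
              [2.0,  0.0]])

def expm_series(M, terms=30):
    """Matrix exponential via truncated power series (fine for small M)."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k       # accumulate M^k / k!
        result = result + term
    return result

eigvals = np.linalg.eigvals(K)    # purely imaginary (+/- 2i here)
Q = expm_series(K * 0.5)          # e^{Kt} at t = 0.5
orth = Q.T @ Q                    # orthogonal: Q^T Q = I
```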
