- Chapter 1: Vectors
- Chapter 2: Systems of Linear Equations
- Chapter 3: Matrices
- Chapter 4: Eigenvalues and Eigenvectors
- Chapter 5: Orthogonality
- Chapter 6: Vector Spaces
- Chapter 7: Distance and Approximation
Linear Algebra: A Modern Introduction (Available 2011 Titles Enhanced Web Assign), 3rd Edition - Solutions by Chapter
Basis for V.
Independent vectors $v_1, \ldots, v_d$ whose linear combinations give each vector in $V$ as $v = c_1 v_1 + \cdots + c_d v_d$. $V$ has many bases; each basis gives unique $c$'s.
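A quick numpy sketch of that uniqueness (the basis vectors here are chosen purely for illustration): the coordinates $c$ are the one solution of $Bc = v$ when the columns of $B$ form a basis.

```python
import numpy as np

# Example basis for R^2 (chosen for illustration): columns of B are v1, v2.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# Unique coordinates c with v = c1*v1 + c2*v2: solve B c = v.
c = np.linalg.solve(B, v)
print(c)                      # [1. 2.] -> v = 1*v1 + 2*v2
assert np.allclose(B @ c, v)
```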
Change of basis matrix M.
The old basis vectors $v_j$ are combinations $\sum_i m_{ij} w_i$ of the new basis vectors. The coordinates of $c_1 v_1 + \cdots + c_n v_n = d_1 w_1 + \cdots + d_n w_n$ are related by $d = Mc$. (For $n = 2$: $v_1 = m_{11} w_1 + m_{21} w_2$, $v_2 = m_{12} w_1 + m_{22} w_2$.)
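A minimal numpy check of $d = Mc$, with both bases invented for the example; since column $j$ of $V$ is $\sum_i m_{ij} w_i$, $M$ is recovered from $V = WM$.

```python
import numpy as np

# Toy 2x2 example (both bases chosen for illustration).
# Columns of W are the new basis w1, w2; columns of V are the old basis v1, v2.
W = np.array([[1.0, 0.0],
              [1.0, 1.0]])
V = np.array([[2.0, 1.0],
              [3.0, 2.0]])

# v_j = sum_i m_ij * w_i  means  V = W M,  so  M = W^{-1} V.
M = np.linalg.solve(W, V)

c = np.array([1.0, 1.0])           # coordinates in the old basis
d = M @ c                          # coordinates in the new basis: d = M c
assert np.allclose(V @ c, W @ d)   # the same vector, expressed either way
print(d)
```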
Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift $S$. Every circulant is $c_0 I + c_1 S + \cdots + c_{n-1} S^{n-1}$. $Cx$ = convolution $c * x$. Eigenvectors in $F$.
Cyclic shift S.
Permutation with $s_{21} = 1$, $s_{32} = 1$, ..., finally $s_{1n} = 1$. Its eigenvalues are the $n$th roots $e^{2\pi i k/n}$ of 1; eigenvectors are the columns of the Fourier matrix $F$.
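A numpy sketch tying these two entries together (the size $n = 4$ and the coefficients are arbitrary choices): build the shift $S$, assemble a circulant from its powers, and confirm $Cx = c * x$ with the FFT.

```python
import numpy as np

n = 4
# Cyclic shift S: s_{21} = s_{32} = ... = s_{1n} = 1 (a permutation matrix).
S = np.roll(np.eye(n), 1, axis=0)

# A circulant C = c0*I + c1*S + ... + c_{n-1}*S^{n-1} (coefficients chosen freely).
c = np.array([2.0, 1.0, 0.0, 3.0])
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

# C x equals the circular convolution c * x; the FFT diagonalizes every circulant.
x = np.array([1.0, 2.0, 3.0, 4.0])
assert np.allclose(C @ x, np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real)

# Eigenvalues of S are the nth roots of unity e^{2*pi*i*k/n}.
print(np.sort_complex(np.linalg.eigvals(S)))
```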
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
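A small sympy illustration (the matrix is an arbitrary example; `Matrix.echelon_form()` is one way to produce such a $U$):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 7],
               [1, 2, 5]])
U = A.echelon_form()   # each pivot lies in a later column than the pivot above
sp.pprint(U)           # zero rows (if any) come last
```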
Gram-Schmidt orthogonalization A = QR.
Independent columns in $A$, orthonormal columns in $Q$. Each column $q_j$ of $Q$ is a combination of the first $j$ columns of $A$ (and conversely, so $R$ is upper triangular). Convention: $\mathrm{diag}(R) > 0$.
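A compact Gram-Schmidt sketch in numpy, written as a toy implementation for clarity rather than a substitute for `np.linalg.qr` (whose $R$ may carry negative diagonal entries instead of the convention above):

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR by Gram-Schmidt; assumes the columns of A are independent."""
    m, n = A.shape
    Q = A.astype(float)
    R = np.zeros((n, n))
    for j in range(n):
        for i in range(j):                 # subtract components along earlier q_i
            R[i, j] = Q[:, i] @ Q[:, j]
            Q[:, j] -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(Q[:, j])  # convention: diag(R) > 0
        Q[:, j] /= R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A) and np.allclose(Q.T @ Q, np.eye(2))
```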
Hankel matrix H.
Constant along each antidiagonal; $h_{ij}$ depends on $i + j$.
Hermitian matrix $A^H = \bar{A}^T = A$.
Complex analog ($\bar{a}_{ji} = a_{ij}$) of a symmetric matrix.
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Jordan form $J = M^{-1}AM$.
If $A$ has $s$ independent eigenvectors, its "generalized" eigenvector matrix $M$ gives $J = \mathrm{diag}(J_1, \ldots, J_s)$. The block $J_k$ is $\lambda_k I_k + N_k$ where $N_k$ has 1's on diagonal 1. Each block has one eigenvalue $\lambda_k$ and one eigenvector.
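A sympy sketch with a deliberately defective example matrix (chosen so the eigenvalue 2 repeats but has only one independent eigenvector):

```python
import sympy as sp

# A defective 2x2 example: eigenvalue 2 is a double root, one eigenvector.
A = sp.Matrix([[1, 1],
               [-1, 3]])
M, J = A.jordan_form()            # sympy returns (M, J) with A = M*J*M^{-1}
sp.pprint(J)                      # [[2, 1], [0, 2]] -- a single 2x2 Jordan block
assert sp.simplify(M * J * M.inv()) == A
```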
Left nullspace $N(A^T)$.
Nullspace of $A^T$ = "left nullspace" of $A$ because $y^T A = 0^T$.
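One way to compute a basis for $N(A^T)$ numerically (the example matrix is rank 1 by construction; the SVD route shown here is a standard choice, not the only one):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])        # rank 1, so N(A^T) has dimension 3 - 1 = 2

U, s, Vt = np.linalg.svd(A)
rank = np.sum(s > 1e-10)
Y = U[:, rank:]                   # columns span the left nullspace
assert np.allclose(Y.T @ A, 0)    # y^T A = 0^T for every y in N(A^T)
```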
Length $\|x\|$.
Square root of $x^T x$ (Pythagoras in $n$ dimensions).
Linear transformation T.
Each vector $v$ in the input space transforms to $T(v)$ in the output space, and linearity requires $T(cv + dw) = cT(v) + dT(w)$. Examples: matrix multiplication $Av$, differentiation and integration in function space.
Multiplicities AM and GM.
The algebraic multiplicity AM of $\lambda$ is the number of times $\lambda$ appears as a root of $\det(A - \lambda I) = 0$. The geometric multiplicity GM is the number of independent eigenvectors for $\lambda$ (= dimension of the eigenspace).
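A numpy sketch on a classic defective matrix (chosen so AM = 2 but GM = 1):

```python
import numpy as np

# lambda = 2 is a double root of det(A - lambda*I) = 0, but this matrix
# has only one independent eigenvector for it.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# AM: how often lambda = 2 appears among the roots of the characteristic polynomial.
eigvals = np.linalg.eigvals(A)
AM = np.sum(np.isclose(eigvals, 2.0))

# GM: dimension of the eigenspace = n - rank(A - 2I).
GM = A.shape[0] - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
print(AM, GM)    # 2 1
```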
Nilpotent matrix N.
Some power of $N$ is the zero matrix, $N^k = 0$. The only eigenvalue is $\lambda = 0$ (repeated $n$ times). Examples: triangular matrices with zero diagonal.
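A quick numpy check on one such triangular example:

```python
import numpy as np

# Strictly upper triangular => nilpotent: N^3 = 0 here, all eigenvalues 0.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])
assert np.allclose(np.linalg.matrix_power(N, 3), 0)
print(np.linalg.eigvals(N))    # [0. 0. 0.]
```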
Projection matrix P onto subspace S.
Projection $p = Pb$ is the closest point to $b$ in $S$; the error $e = b - Pb$ is perpendicular to $S$. $P^2 = P = P^T$, eigenvalues are 1 or 0, eigenvectors are in $S$ or $S^\perp$. If the columns of $A$ = basis for $S$ then $P = A(A^TA)^{-1}A^T$.
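The formula $P = A(A^TA)^{-1}A^T$ in numpy, with an example basis for $S$ (for ill-conditioned $A$, a QR or least-squares route is preferable in practice):

```python
import numpy as np

# Columns of A form a basis for the subspace S (an example plane in R^3).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T          # P = A (A^T A)^{-1} A^T

b = np.array([1.0, 2.0, 4.0])
p = P @ b                                     # closest point to b in S
e = b - p                                     # error, perpendicular to S
assert np.allclose(P @ P, P) and np.allclose(P, P.T)   # P^2 = P = P^T
assert np.allclose(A.T @ e, 0)                # e is orthogonal to the columns of A
print(np.round(np.linalg.eigvals(P), 10))     # eigenvalues 1, 1, 0 (in some order)
```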
Saddle point of $f(x_1, \ldots, x_n)$.
A point where the first derivatives of $f$ are zero and the second derivative matrix ($\partial^2 f / \partial x_i \partial x_j$ = Hessian matrix) is indefinite.
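A tiny numeric illustration using $f(x, y) = x^2 - y^2$ at its critical point $(0, 0)$, where the Hessian is indefinite:

```python
import numpy as np

# f(x, y) = x^2 - y^2 has zero first derivatives at (0, 0);
# its Hessian there has one positive and one negative eigenvalue.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])      # d2f/dxi dxj at the critical point

eig = np.linalg.eigvalsh(H)      # symmetric matrix, so eigvalsh applies
indefinite = (eig.min() < 0) and (eig.max() > 0)
print(eig, indefinite)           # [-2.  2.] True -> saddle point
```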
Span of $v_1, \ldots, v_m$.
Combinations of $v_1, \ldots, v_m$ fill the space. The columns of $A$ span $C(A)$!
Vandermonde matrix V.
$Vc = b$ gives the coefficients of $p(x) = c_0 + \cdots + c_{n-1}x^{n-1}$ with $p(x_i) = b_i$. $V_{ij} = (x_i)^{j-1}$ and $\det V$ = product of $(x_k - x_i)$ for $k > i$.
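A numpy sketch (the sample points are chosen arbitrarily; note that `np.vander` needs `increasing=True` to match $V_{ij} = x_i^{j-1}$):

```python
import numpy as np
from itertools import combinations

# Interpolate p(x) = c0 + c1*x + c2*x^2 through 3 points.
x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 9.0])

V = np.vander(x, increasing=True)    # V[i, j] = x_i ** j, so V c = b
c = np.linalg.solve(V, b)
print(c)                             # coefficients of the interpolating polynomial
assert np.allclose(np.polyval(c[::-1], x), b)

# det V = product of (x_k - x_i) over k > i
prod = np.prod([x[k] - x[i] for i, k in combinations(range(len(x)), 2)])
assert np.isclose(np.linalg.det(V), prod)
```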
Volume of box.
The rows (or the columns) of $A$ generate a box with volume $|\det(A)|$.
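A one-line check in numpy (the edge vectors are chosen triangular so the determinant is easy to read off):

```python
import numpy as np

# Rows of A are the edge vectors of a box (parallelepiped) in R^3.
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
volume = abs(np.linalg.det(A))
print(volume)    # 6.0 -- |det A| = 1 * 2 * 3 for this triangular example
```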