- Chapter 1: Vectors
- Chapter 2: Systems of Linear Equations
- Chapter 3: Matrices
- Chapter 4: Eigenvalues and Eigenvectors
- Chapter 5: Orthogonality
- Chapter 6: Vector Spaces
- Chapter 7: Distance and Approximation
Linear Algebra: A Modern Introduction (Available 2011 Titles Enhanced Web Assign) 3rd Edition - Solutions by Chapter
Affine transformation T.
Tv = Av + v_0 = linear transformation plus shift.
Basis for V.
Independent vectors v_1, ..., v_d whose linear combinations give each vector in V as v = c_1v_1 + ... + c_dv_d. V has many bases, and each basis gives unique c's.
Change of basis matrix M.
The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1v_1 + ... + c_nv_n = d_1w_1 + ... + d_nw_n are related by d = Mc. (For n = 2, set v_1 = m_11w_1 + m_21w_2 and v_2 = m_12w_1 + m_22w_2.)
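The rule d = Mc above can be checked numerically. A minimal numpy sketch (the specific basis vectors and entries of M are my own example, not from the text):

```python
import numpy as np

# Columns of M express the old basis vectors in the new basis,
# so coordinates transform by d = M c.
w1, w2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])   # new basis (example)
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])                            # v_j = sum_i m_ij w_i
v1 = M[0, 0] * w1 + M[1, 0] * w2                      # old basis from new
v2 = M[0, 1] * w1 + M[1, 1] * w2
c = np.array([4.0, -1.0])                             # coords in old basis
d = M @ c                                             # coords in new basis
# Same vector either way:
assert np.allclose(c[0] * v1 + c[1] * v2, d[0] * w1 + d[1] * w2)
```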
Cofactor C_ij.
Remove row i and column j; multiply the determinant of that minor by (-1)^(i+j).
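The delete-row-i, delete-column-j recipe can be sketched directly in numpy (the helper name `cofactor` is my own):

```python
import numpy as np

def cofactor(A, i, j):
    # Delete row i and column j, take the minor's determinant,
    # and attach the sign (-1)^(i+j).
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[1.0, 2.0], [3.0, 4.0]])
# Cofactor expansion of det(A) along row 0:
det = A[0, 0] * cofactor(A, 0, 0) + A[0, 1] * cofactor(A, 0, 1)
assert np.isclose(det, np.linalg.det(A))
```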
Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
Condition number cond(A).
cond(A) = κ(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to changes in the input.
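Both formulas for the condition number agree, as a quick numpy check shows (the nearly singular matrix is my own example):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])            # nearly singular, so badly conditioned
sigmas = np.linalg.svd(A, compute_uv=False)
cond = sigmas[0] / sigmas[-1]            # sigma_max / sigma_min
# Same number from ||A|| ||A^-1|| in the 2-norm, and from numpy's own cond:
assert np.isclose(cond, np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2))
assert np.isclose(cond, np.linalg.cond(A, 2))
```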
Eigenvalue A and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
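Both conditions can be verified for each computed pair; a numpy sketch with a small symmetric matrix of my own choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, X = np.linalg.eig(A)               # eigenvalues and eigenvector columns
for lam, x in zip(lams, X.T):
    assert np.allclose(A @ x, lam * x)                     # Ax = lambda x
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9  # det(A - lam I) = 0
```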
Fundamental Theorem.
The nullspace N(A) and the row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0, with dimensions n − r and r). Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
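The orthogonality of nullspace and row space can be seen numerically. A hedged numpy sketch (using the SVD to extract a nullspace basis, which is my choice of method, and a rank-1 example matrix of my own):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # rank r = 1, n = 3
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))               # numerical rank
nullspace = Vt[r:].T                     # n - r = 2 independent nullspace vectors
# Every row of A is perpendicular to every nullspace vector:
assert np.allclose(A @ nullspace, 0)
```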
Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j − 1) = ∫_0^1 x^(i−1) x^(j−1) dx. Positive definite but with extremely small λ_min and large condition number: H is ill-conditioned.
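Building hilb(n) from the entry formula and watching the condition number grow makes the ill-conditioning concrete (a sketch; the helper name `hilb` mirrors the MATLAB name mentioned in the entry):

```python
import numpy as np

def hilb(n):
    # H_ij = 1/(i + j - 1) with 1-based indices i, j.
    i, j = np.indices((n, n)) + 1
    return 1.0 / (i + j - 1)

conds = [np.linalg.cond(hilb(n)) for n in (3, 6, 9)]
assert conds[0] < conds[1] < conds[2]            # condition number explodes
assert np.all(np.linalg.eigvalsh(hilb(6)) > 0)   # still positive definite
```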
Hypercube matrix P.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.
Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).
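The sqrt(x^T x) formula matches numpy's built-in norm:

```python
import numpy as np

x = np.array([3.0, 4.0, 12.0])
length = np.sqrt(x @ x)          # Pythagoras in n dimensions
assert np.isclose(length, np.linalg.norm(x))
```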
Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
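A strictly triangular example shows both claims at once (the particular entries are my own):

```python
import numpy as np

N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])   # triangular with zero diagonal
assert np.allclose(np.linalg.matrix_power(N, 3), 0)   # N^3 = 0
assert np.allclose(np.linalg.eigvals(N), 0)           # only eigenvalue is 0
```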
Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves lengths and angles: ||Qx|| = ||x|| and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
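A rotation matrix illustrates all three properties (the angle and test vector are my own example):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])          # rotation by theta
x = np.array([3.0, 4.0])
assert np.allclose(Q.T @ Q, np.eye(2))                   # Q^T = Q^-1
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # length preserved
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)    # all |lambda| = 1
```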
Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Skew-symmetric matrix K.
The transpose is −K, since K_ij = −K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
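For a 2 by 2 skew-symmetric K these facts can be checked directly; the closed form of e^(Kt) as a rotation is specific to this example and is my own addition:

```python
import numpy as np

K = np.array([[0.0, -2.0],
              [2.0,  0.0]])
assert np.allclose(K.T, -K)                  # skew-symmetric
lams = np.linalg.eigvals(K)
assert np.allclose(lams.real, 0.0)           # eigenvalues +-2i, pure imaginary

# For this K, e^(Kt) is rotation by angle 2t (closed form for this example):
t = 0.3
E = np.array([[np.cos(2 * t), -np.sin(2 * t)],
              [np.sin(2 * t),  np.cos(2 * t)]])
assert np.allclose(E.T @ E, np.eye(2))       # e^(Kt) is orthogonal
```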
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax gives the stretching.
Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D (same numbers of positive, negative, and zero entries).
Transpose matrix A^T.
Entries (A^T)_ij = A_ji. If A is m by n, then A^T is n by m; A^T A is square, symmetric, and positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^T)^-1.
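The two transpose rules and the semidefiniteness of A^T A can be confirmed in numpy (the matrices are my own examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 1.0]])
assert np.allclose((A @ B).T, B.T @ A.T)                  # (AB)^T = B^T A^T
assert np.allclose(np.linalg.inv(A).T, np.linalg.inv(A.T))  # (A^-1)^T = (A^T)^-1
G = A.T @ A
assert np.allclose(G, G.T)                                # symmetric
assert np.all(np.linalg.eigvalsh(G) >= -1e-12)            # positive semidefinite
```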
Tridiagonal matrix T: t_ij = 0 if |i − j| > 1.
T^-1 has rank 1 above and below the diagonal.
Volume of box.
The rows (or the columns) of A generate a box with volume |det(A)|.
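An axis-aligned box makes the formula transparent (a sketch with my own example matrix):

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])          # rows span a 2 x 3 x 4 box
volume = abs(np.linalg.det(A))
assert np.isclose(volume, 2.0 * 3.0 * 4.0)
```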