- Chapter 1: Matrices and Systems of Equations
- Chapter 1.1: Systems of Linear Equations
- Chapter 1.2: Row Echelon Form
- Chapter 1.3: Matrix Arithmetic
- Chapter 1.4: Matrix Algebra
- Chapter 1.5: Elementary Matrices
- Chapter 1.6: Partitioned Matrices
- Chapter 2: Determinants
- Chapter 2.1: The Determinant of a Matrix
- Chapter 2.2: Properties of Determinants
- Chapter 2.3: Additional Topics and Applications
- Chapter 3: Vector Spaces
- Chapter 3.1: Definition and Examples
- Chapter 3.2: Subspaces
- Chapter 3.3: Linear Independence
- Chapter 3.4: Basis and Dimension
- Chapter 3.5: Change of Basis
- Chapter 3.6: Row Space and Column Space
- Chapter 4: Linear Transformations
- Chapter 4.1: Definition and Examples
- Chapter 4.2: Matrix Representations of Linear Transformations
- Chapter 4.3: Similarity
- Chapter 5: Orthogonality
- Chapter 5.1: The Scalar Product in Rⁿ
- Chapter 5.2: Orthogonal Subspaces
- Chapter 5.3: Least Squares Problems
- Chapter 5.4: Inner Product Spaces
- Chapter 5.5: Orthonormal Sets
- Chapter 5.6: The Gram-Schmidt Orthogonalization Process
- Chapter 5.7: Orthogonal Polynomials
- Chapter 6: Eigenvalues
- Chapter 6.1: Eigenvalues and Eigenvectors
- Chapter 6.2: Systems of Linear Differential Equations
- Chapter 6.3: Diagonalization
- Chapter 6.4: Hermitian Matrices
- Chapter 6.5: The Singular Value Decomposition
- Chapter 6.6: Quadratic Forms
- Chapter 6.7: Positive Definite Matrices
- Chapter 6.8: Nonnegative Matrices
- Chapter 7: Numerical Linear Algebra
- Chapter 7.1: Floating-Point Numbers
- Chapter 7.2: Gaussian Elimination
- Chapter 7.3: Pivoting Strategies
- Chapter 7.4: Matrix Norms and Condition Numbers
- Chapter 7.5: Orthogonal Transformations
- Chapter 7.6: The Eigenvalue Problem
- Chapter 7.7: Least Squares Problems
Linear Algebra with Applications 8th Edition - Solutions by Chapter
Cramer's Rule for Ax = b.
Bⱼ has b replacing column j of A; xⱼ = det(Bⱼ)/det(A).
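A minimal NumPy sketch of the rule (the 2×2 system here is made up for illustration):

```python
# Cramer's rule: x_j = det(B_j) / det(A), where B_j is A with column j replaced by b.
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])

det_A = np.linalg.det(A)
x = np.empty(len(b))
for j in range(len(b)):
    B_j = A.copy()
    B_j[:, j] = b                      # replace column j of A by b
    x[j] = np.linalg.det(B_j) / det_A

print(x)                               # agrees with np.linalg.solve(A, b)
```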
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
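A quick sketch with numpy.linalg.eig (the matrix is an arbitrary example with distinct eigenvalues, so diagonalization is automatic):

```python
# Diagonalize A: eigenvectors fill the columns of S, and S^{-1} A S = Λ.
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # eigenvalues 5 and 2 (distinct)
lam, S = np.linalg.eig(A)
Lambda = np.linalg.inv(S) @ A @ S        # diagonal up to round-off
print(np.round(Lambda, 10), lam)
```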
Distributive law.
A(B + C) = AB + AC. Add then multiply, or multiply then add.
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
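A small sketch of forward elimination down to an echelon matrix U (no row exchanges are needed for this made-up example; a robust version would pivot):

```python
# Reduce A to an echelon matrix U by subtracting multiples of pivot rows.
import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [2.0, 5.0, 0.0],
              [1.0, 4.0, 2.0]])
U = A.copy()
m, n = U.shape
for k in range(min(m, n)):
    for i in range(k + 1, m):
        if U[k, k] != 0:
            U[i, :] -= (U[i, k] / U[k, k]) * U[k, :]
print(U)    # each pivot sits in a later column than the pivot above it
```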
Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
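A one-line check with NumPy (example matrix made up):

```python
# Verify Ax = λx for each eigenpair returned by numpy.linalg.eig.
import numpy as np

A = np.array([[2.0, 0.0], [1.0, 3.0]])
lam, X = np.linalg.eig(A)
for i in range(len(lam)):
    print(np.allclose(A @ X[:, i], lam[i] * X[:, i]))   # True, with X[:, i] ≠ 0
```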
Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A⁻¹].
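A bare-bones sketch of the idea (no pivoting, so it assumes nonzero pivots appear on the diagonal):

```python
# Row-reduce the augmented matrix [A  I]; when the left half becomes I,
# the right half is A^{-1}.
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])
n = A.shape[0]
M = np.hstack([A, np.eye(n)])            # [A  I]
for k in range(n):
    M[k, :] /= M[k, k]                   # scale pivot row so the pivot is 1
    for i in range(n):
        if i != k:
            M[i, :] -= M[i, k] * M[k, :] # clear the rest of column k
A_inv = M[:, n:]
print(np.allclose(A @ A_inv, np.eye(n))) # True
```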
Hermitian matrix Aᴴ = Āᵀ = A.
Complex analog aⱼᵢ = āᵢⱼ of a symmetric matrix.
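A quick numerical check (matrix made up):

```python
# A Hermitian matrix equals its conjugate transpose and has real eigenvalues.
import numpy as np

A = np.array([[2, 1 - 1j], [1 + 1j, 3]])
print(np.allclose(A, A.conj().T))        # True: a_ji = conj(a_ij)
print(np.linalg.eigvalsh(A))             # real eigenvalues: [1. 4.]
```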
Hilbert matrix hilb(n).
Entries hᵢⱼ = 1/(i + j − 1) = ∫₀¹ xⁱ⁻¹xʲ⁻¹ dx. Positive definite but with extremely small λ_min and large condition number: H is ill-conditioned.
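SciPy provides the matrix directly; this sketch shows the condition number exploding as n grows:

```python
# Condition numbers of hilb(n) grow roughly like e^{3.5n}.
import numpy as np
from scipy.linalg import hilbert

for n in (4, 8, 12):
    H = hilbert(n)
    print(n, np.linalg.cond(H))          # about 1.6e4, 1.5e10, 1.6e16
```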
Determinant |A| = det(A).
|A⁻¹| = 1/|A| and |Aᵀ| = |A|. The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, volume of box = |det(A)|.
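A numerical spot-check of the two identities (matrix made up, chosen with det A = 1):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 7.0]])        # det(A) = 1
print(np.linalg.det(np.linalg.inv(A)))        # 1.0 = 1/det(A)
print(np.linalg.det(A.T))                     # 1.0 = det(A)
```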
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; m(λ) always divides p(λ).
Multiplication Ax.
Ax = x₁(column 1) + ⋯ + xₙ(column n) = combination of columns.
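A two-line demonstration (matrix and vector made up):

```python
# Ax is the combination x₁(column 1) + x₂(column 2) of the columns of A.
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
x = np.array([2.0, -1.0])
print(A @ x)                                  # [0. 2.]
print(x[0] * A[:, 0] + x[1] * A[:, 1])        # the same combination
```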
Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= the dimension of the eigenspace).
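A sketch where the two multiplicities differ (scipy.linalg.null_space computes the eigenspace):

```python
# For A = [[2, 1], [0, 2]], λ = 2 is a double root (AM = 2) but the
# eigenspace is one-dimensional (GM = 1): A is not diagonalizable.
import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 1.0], [0.0, 2.0]])
print(np.linalg.eigvals(A))                   # [2. 2.]  ->  AM = 2
E = null_space(A - 2 * np.eye(2))             # basis of the eigenspace
print(E.shape[1])                             # 1  ->  GM = 1
```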
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
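A hedged sketch with scipy.optimize.linprog; its "highs-ds" option runs a dual-simplex solver rather than the textbook primal method, and the tiny LP here is made up:

```python
# Minimize cᵀx subject to Ax = b and x ≥ 0.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0, 0.0])
A_eq = np.array([[1.0, 1.0, 1.0]])            # x1 + x2 + x3 = 1
b_eq = np.array([1.0])
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs-ds")
print(res.x, res.fun)                         # corner (0, 0, 1) with cost 0
```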
Singular Value Decomposition (SVD).
A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular value σᵢ > 0. The last columns are orthonormal bases of the nullspaces.
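A sketch with numpy.linalg.svd on a made-up rank-1 matrix:

```python
# A = UΣVᵀ; the r nonzero singular values pick out the orthonormal bases
# of C(A) (first r columns of U) and C(Aᵀ) (first r columns of V).
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0], [0.0, 0.0]])   # rank 1
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))
print(r, s)                                   # r = 1, s = [5. 0.]
print(np.allclose(A @ Vt[0], s[0] * U[:, 0])) # Av₁ = σ₁u₁
```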
Special solutions to As = O.
One free variable is sᵢ = 1, the other free variables = 0.
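SymPy's nullspace() basis is built exactly this way, from the reduced echelon form; the matrix below is made up:

```python
# Each basis vector sets one free variable to 1 and the others to 0.
import sympy as sp

A = sp.Matrix([[1, 2, 2, 4],
               [1, 2, 3, 6]])
for s in A.nullspace():
    print(s.T)        # special solutions (-2, 1, 0, 0) and (0, 0, -2, 1)
```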
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = AᵀCA, where C has the spring constants from Hooke's Law and Ax = stretching.
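A minimal sketch, assuming a line of two nodes tied to two walls by three unit springs (so A is the 3×2 difference matrix):

```python
# K = AᵀCA: A turns node movements into spring stretching, C applies
# Hooke's-law constants, and Aᵀ assembles the internal forces.
import numpy as np

A = np.array([[ 1.0,  0.0],
              [-1.0,  1.0],
              [ 0.0, -1.0]])
C = np.diag([1.0, 1.0, 1.0])
K = A.T @ C @ A
print(K)              # [[ 2. -1.] [-1.  2.]]
```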
Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.
For matrix norms, ‖A + B‖ ≤ ‖A‖ + ‖B‖.
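A quick numerical check of both inequalities (vectors and matrices made up):

```python
import numpy as np

u, v = np.array([3.0, 0.0]), np.array([0.0, 4.0])
print(np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v))  # True
A, B = np.eye(2), np.ones((2, 2))
print(np.linalg.norm(A + B, 2) <= np.linalg.norm(A, 2) + np.linalg.norm(B, 2))
```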
Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c₀ + ⋯ + cₙ₋₁xⁿ⁻¹ with p(xᵢ) = bᵢ. vᵢⱼ = (xᵢ)ʲ⁻¹ and det V = product of (xₖ − xᵢ) for k > i.
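numpy.vander builds V (increasing=True gives the vᵢⱼ = xᵢʲ⁻¹ convention used here); the interpolation data are made up:

```python
# Solve Vc = b so that p(x_i) = b_i.
import numpy as np

x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 2.0, 5.0])
V = np.vander(x, increasing=True)             # columns 1, x, x²
c = np.linalg.solve(V, b)
print(c)                                      # [1. 0. 1.]  ->  p(x) = 1 + x²
```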
Volume of box.
The rows (or the columns) of A generate a box with volume |det(A)|.
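A one-line check on a made-up sheared box:

```python
# Rows (2,0,0), (0,3,0), (1,1,4) span a sheared 2 x 3 x 4 box.
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [1.0, 1.0, 4.0]])
print(abs(np.linalg.det(A)))                  # 24.0: shearing preserves volume
```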