- Chapter 1: Vectors
- Chapter 2: Systems of Linear Equations
- Chapter 3: Matrices
- Chapter 4: Eigenvalues and Eigenvectors
- Chapter 5: Orthogonality
- Chapter 6: Vector Spaces
- Chapter 7: Distance and Approximation
Linear Algebra: A Modern Introduction 1st Edition - Solutions by Chapter
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
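A quick numerical sketch of Cramer's Rule in NumPy (the 2x2 system here is illustrative, not from the text): each solution component comes from a determinant ratio.

```python
import numpy as np

# Cramer's Rule: x_j = det(B_j) / det(A), where B_j is A with column j replaced by b.
A = np.array([[2.0, 1.0], [1.0, 3.0]])   # illustrative invertible matrix
b = np.array([3.0, 5.0])

det_A = np.linalg.det(A)
x = np.empty(2)
for j in range(2):
    B_j = A.copy()
    B_j[:, j] = b                        # B_j has b replacing column j of A
    x[j] = np.linalg.det(B_j) / det_A

# Agrees with a direct solve of Ax = b
assert np.allclose(x, np.linalg.solve(A, b))
```

Cramer's Rule is a formula, not an efficient algorithm: elimination is preferred for actual solving.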
Diagonalization Λ = S^-1 A S.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
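A numerical check of diagonalization, with an illustrative 2x2 matrix (not from the text) that has two independent eigenvectors:

```python
import numpy as np

# Diagonalization: Lambda = S^-1 A S, and A^k = S Lambda^k S^-1.
A = np.array([[4.0, 1.0], [2.0, 3.0]])   # illustrative; eigenvalues 5 and 2
eigvals, S = np.linalg.eig(A)            # columns of S are eigenvectors
Lam = np.diag(eigvals)

assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)

# Powers come free from the diagonalization:
A3 = S @ np.linalg.matrix_power(Lam, 3) @ np.linalg.inv(S)
assert np.allclose(A3, np.linalg.matrix_power(A, 3))
```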
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Distributive Law A(B + C) = AB + AC.
Add then multiply, or multiply then add.
Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).
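A small NumPy illustration of these two facts, with illustrative vectors and matrices (perpendicularity via a zero dot product, and the row-times-column rule for matrix multiplication):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -2.0, 0.0])

# x^T y = x_1 y_1 + ... + x_n y_n; here it is 4 - 4 + 0 = 0, so x is perpendicular to y.
assert x @ y == 0.0

# (AB)_ij = (row i of A) . (column j of B)
A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)
assert np.allclose((A @ B)[1, 2], A[1, :] @ B[:, 2])
```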
Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -ℓ_ij in the i, j entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.
Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
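A minimal elimination sketch in NumPy, on an illustrative 3x3 matrix whose pivots are nonzero (so no row exchanges are needed); the multipliers ℓ_ij are saved into L and A = LU is verified:

```python
import numpy as np

# Elimination without row exchanges: reduce A to upper triangular U,
# storing each multiplier l_ij in L. Then A = LU.
A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])      # illustrative matrix with nonzero pivots

n = 3
U = A.copy()
L = np.eye(n)
for j in range(n - 1):                 # eliminate below pivot j
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]    # multiplier l_ij
        U[i, :] -= L[i, j] * U[j, :]   # subtract l_ij times row j from row i

assert np.allclose(L @ U, A)           # A = LU
assert np.allclose(np.triu(U), U)      # U is upper triangular
```

When a pivot position is zero, a row exchange (recorded in P) is required, giving PA = LU.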
Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
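A short classical Gram-Schmidt sketch in NumPy (the 3x2 matrix is illustrative; its columns are assumed independent), producing A = QR with orthonormal Q and positive diagonal in R:

```python
import numpy as np

# Classical Gram-Schmidt: orthogonalize the columns of A into Q, record R.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])             # illustrative, independent columns
m, n = A.shape
Q = np.zeros((m, n))
R = np.zeros((n, n))
for j in range(n):
    v = A[:, j].copy()
    for i in range(j):
        R[i, j] = Q[:, i] @ A[:, j]    # component of column j along earlier q_i
        v -= R[i, j] * Q[:, i]         # subtract it off
    R[j, j] = np.linalg.norm(v)        # convention: diag(R) > 0
    Q[:, j] = v / R[j, j]

assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(n))     # orthonormal columns
assert np.all(np.diag(R) > 0)
```

In practice `np.linalg.qr` is preferred; classical Gram-Schmidt can lose orthogonality in floating point.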
Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j - 1) = ∫_0^1 x^(i-1) x^(j-1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
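The ill-conditioning is easy to see numerically; here is a sketch that builds hilb(6) directly from the entry formula (n = 6 is an illustrative size):

```python
import numpy as np

# Hilbert matrix: H_ij = 1/(i + j - 1), with 1-based indices i, j.
n = 6
i, j = np.indices((n, n)) + 1
H = 1.0 / (i + j - 1)

eigvals = np.linalg.eigvalsh(H)        # H is symmetric
assert eigvals.min() > 0               # positive definite...
assert np.linalg.cond(H) > 1e6         # ...but badly ill-conditioned already at n = 6
```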
Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and -).
Left inverse A+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
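A quick check of the left-inverse formula on an illustrative 3x2 matrix of rank 2:

```python
import numpy as np

# Left inverse: A^+ = (A^T A)^-1 A^T, defined when A has full column rank n.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])             # illustrative, rank 2
A_plus = np.linalg.inv(A.T @ A) @ A.T

assert np.allclose(A_plus @ A, np.eye(2))    # A^+ A = I_n
```

Note that A A^+ is not the identity here (it is the projection onto the column space); only the left product gives I.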
Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
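A one-line linearity check for the matrix example T(v) = Av, with illustrative A, v, w, c, d:

```python
import numpy as np

# T(v) = A v is linear: T(cv + dw) = c T(v) + d T(w).
A = np.array([[1.0, 2.0], [3.0, 4.0]])           # illustrative matrix
T = lambda v: A @ v

v, w = np.array([1.0, -1.0]), np.array([0.5, 2.0])
c, d = 3.0, -2.0
assert np.allclose(T(c * v + d * w), c * T(v) + d * T(w))
```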
Lucas numbers L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [1 1; 1 0]. Compare L_0 = 2 with F_0 = 0.
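A sketch generating Lucas numbers from powers of the Fibonacci matrix, and checking the eigenvalue formula L_n = λ_1^n + λ_2^n (the helper `lucas` is for illustration):

```python
import numpy as np

# Powers of the Fibonacci matrix advance the recurrence:
# [L_{n+1}, L_n] = F^n [L_1, L_0] with L_0 = 2, L_1 = 1.
F = np.array([[1, 1], [1, 0]])

def lucas(n):
    v = np.linalg.matrix_power(F, n) @ np.array([1, 2])
    return int(v[1])

assert [lucas(n) for n in range(6)] == [2, 1, 3, 4, 7, 11]

# L_n = lambda_1^n + lambda_2^n with lambda_1, lambda_2 = (1 +/- sqrt 5)/2
lam1, lam2 = (1 + 5**0.5) / 2, (1 - 5**0.5) / 2
assert round(lam1**7 + lam2**7) == lucas(7)
```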
Nullspace N(A)
= All solutions to Ax = O. Dimension n - r = (# columns) - rank.
Rank r(A)
= number of pivots = dimension of column space = dimension of row space.
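A NumPy check of the counting law dim N(A) = n - r, on an illustrative 2x4 matrix of rank 2; one standard way to get a nullspace basis numerically is from the trailing right singular vectors of the SVD:

```python
import numpy as np

# Illustrative 2x4 matrix: rank r = 2, so dim N(A) = n - r = 4 - 2 = 2.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 3.0]])
r = np.linalg.matrix_rank(A)
assert r == 2

# Rows r..n-1 of V^T span the nullspace of A.
_, _, Vt = np.linalg.svd(A)
N = Vt[r:].T                           # columns form a basis for N(A)
assert N.shape[1] == A.shape[1] - r    # n - r columns
assert np.allclose(A @ N, 0)           # every basis vector solves Ax = 0
```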
Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
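A numerical illustration that any R^T R is semidefinite, using an illustrative rank-1 R (so A is singular, with a zero eigenvalue, yet still semidefinite):

```python
import numpy as np

# A = R^T R is positive semidefinite for any R: x^T A x = ||R x||^2 >= 0.
R = np.array([[1.0, 2.0, 3.0]])        # illustrative 1x3, so A has rank 1
A = R.T @ R

eigvals = np.linalg.eigvalsh(A)
assert np.all(eigvals >= -1e-12)       # all lambda >= 0 (up to roundoff)

rng = np.random.default_rng(0)
x = rng.standard_normal(3)
assert x @ A @ x >= -1e-12             # x^T A x >= 0 for this (and every) x
```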
Spanning set v_1, ..., v_m for V.
Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!
Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.
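A sketch of a Toeplitz matrix built from its first column and first row (illustrative values), verifying that every diagonal is constant:

```python
import numpy as np

# Toeplitz matrix: entry (i, j) depends only on i - j.
c = [1.0, 2.0, 3.0]                    # illustrative first column
r = [1.0, 4.0, 5.0]                    # illustrative first row; r[0] == c[0]
n = 3
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(n)]
              for i in range(n)])

# Constant down each diagonal:
for k in range(-(n - 1), n):
    d = np.diagonal(T, k)
    assert np.all(d == d[0])
```

Multiplying by such a matrix applies the same weights at every position, which is why it acts as a shift-invariant filter.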