Introduction to Linear Algebra 5th Edition - Solutions by Chapter
Block multiplication AB.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
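As a numerical sketch (the matrices below are arbitrary random examples, not from the text), block multiplication of a 4 x 4 product can be checked against the ordinary product:

```python
import numpy as np

# Partition A and B into four 2x2 blocks each and multiply block-by-block,
# exactly like a 2x2 scalar multiplication.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

C11 = A11 @ B11 + A12 @ B21
C12 = A11 @ B12 + A12 @ B22
C21 = A21 @ B11 + A22 @ B21
C22 = A21 @ B12 + A22 @ B22
C = np.block([[C11, C12], [C21, C22]])   # reassemble the blocks

assert np.allclose(C, A @ B)             # block product agrees with AB
```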
Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Fourier matrix F.
Entries F_jk = e^(2πijk/n) give orthogonal columns: F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ c_k e^(2πijk/n).
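A short sketch (n = 8 is an arbitrary choice) builds F explicitly and checks both properties, noting that NumPy's `ifft` carries an extra 1/n factor:

```python
import numpy as np

# Build the n x n Fourier matrix F_jk = e^(2*pi*i*j*k/n).
n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(2j * np.pi * j * k / n)

# Orthogonal columns: conjugate(F)^T F = n I.
assert np.allclose(F.conj().T @ F, n * np.eye(n))

# y = F c is the inverse DFT, up to NumPy's 1/n scaling convention.
c = np.arange(n, dtype=float)
assert np.allclose(F @ c, n * np.fft.ifft(c))
```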
Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^(-1)].
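A teaching sketch of this idea (with partial pivoting added for stability; the function name and test matrix are my own, not from the text):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce the augmented matrix [A I] until it becomes [I A^-1]."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])                 # augmented matrix [A I]
    for col in range(n):
        p = col + np.argmax(np.abs(M[col:, col])) # partial pivoting
        M[[col, p]] = M[[p, col]]
        M[col] /= M[col, col]                     # scale pivot row: pivot = 1
        for row in range(n):                      # clear the rest of the column
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                               # right half is A^-1

A = np.array([[2.0, 1.0], [5.0, 3.0]])
assert np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2))
```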
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).
Linear combination cv + dw or Σ c_j v_j.
Vector addition and scalar multiplication.
Lucas numbers L_n.
L_n = 2, 1, 3, 4, ... satisfy L_n = L_(n-1) + L_(n-2) = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
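The recurrence and the eigenvalue closed form can be checked for the first few Lucas numbers (plain Python, no libraries):

```python
from math import sqrt, isclose

# Eigenvalues of the Fibonacci matrix [[1, 1], [1, 0]].
lam1 = (1 + sqrt(5)) / 2
lam2 = (1 - sqrt(5)) / 2

L = [2, 1]                        # L_0 = 2, L_1 = 1
for n in range(2, 10):
    L.append(L[-1] + L[-2])       # recurrence L_n = L_(n-1) + L_(n-2)

for n, Ln in enumerate(L):
    assert isclose(Ln, lam1**n + lam2**n)   # closed form L_n = lam1^n + lam2^n
```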
Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector: Ms = s > 0.
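A small sketch (the 2 x 2 matrix is an arbitrary positive example, not from the text) shows the columns of M^k converging to the steady state:

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])        # positive entries, each column sums to 1

# Steady state: eigenvector for the largest eigenvalue, lambda = 1.
vals, vecs = np.linalg.eig(M)
s = vecs[:, np.argmax(vals.real)].real
s = s / s.sum()                   # normalize so the components sum to 1

Mk = np.linalg.matrix_power(M, 50)
assert np.allclose(M @ s, s)      # Ms = s
assert np.allclose(Mk[:, 0], s)   # each column of M^k approaches s
assert np.allclose(Mk[:, 1], s)
```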
Matrix multiplication AB.
The i, j entry of AB is (row i of A) · (column j of B) = Σ a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k of A)(row k of B). All these equivalent definitions come from the rule that AB times x equals A times Bx.
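All four definitions can be checked against each other numerically (the matrix shapes are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))
AB = A @ B

# Entry by entry: (AB)_ij = sum over k of a_ik * b_kj.
entries = np.array([[sum(A[i, k] * B[k, j] for k in range(4))
                     for j in range(2)] for i in range(3)])
# By columns: column j of AB is A times column j of B.
by_cols = np.column_stack([A @ B[:, j] for j in range(2)])
# By rows: row i of AB is row i of A times B.
by_rows = np.vstack([A[i, :] @ B for i in range(3)])
# Columns times rows: AB = sum over k of (column k of A)(row k of B).
col_row = sum(np.outer(A[:, k], B[k, :]) for k in range(4))

for X in (entries, by_cols, by_rows, col_row):
    assert np.allclose(X, AB)     # all four ways give the same product
```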
Norm ||A||.
The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
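The four norms in this entry map directly onto `numpy.linalg.norm` (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

sigma_max = np.linalg.svd(A, compute_uv=False)[0]
assert np.isclose(np.linalg.norm(A, 2), sigma_max)            # l2 norm = sigma_max
assert np.isclose(np.linalg.norm(A, 'fro')**2, (A**2).sum())  # Frobenius: sum of a_ij^2
assert np.isclose(np.linalg.norm(A, 1),
                  np.abs(A).sum(axis=0).max())                # l1: largest column sum
assert np.isclose(np.linalg.norm(A, np.inf),
                  np.abs(A).sum(axis=1).max())                # l-inf: largest row sum
```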
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
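A teaching sketch of locating the pivot columns by row reduction (the function and the example matrix, with one dependent column, are my own):

```python
import numpy as np

def pivot_columns(A, tol=1e-10):
    """Row-reduce a copy of A and return the indices of the pivot columns."""
    R = np.asarray(A, dtype=float).copy()
    m, n = R.shape
    pivots, row = [], 0
    for col in range(n):
        if row == m:
            break
        p = row + np.argmax(np.abs(R[row:, col]))  # best available pivot
        if abs(R[p, col]) < tol:
            continue                               # no pivot here: free column
        R[[row, p]] = R[[p, row]]
        R[row] /= R[row, col]
        for r in range(m):
            if r != row:
                R[r] -= R[r, col] * R[row]
        pivots.append(col)
        row += 1
    return pivots

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 7.0],
              [1.0, 2.0, 4.0]])          # column 1 = 2 * column 0 (dependent)
assert pivot_columns(A) == [0, 2]        # the pivot columns span the column space
assert np.linalg.matrix_rank(A) == 2     # rank = number of pivots
```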
Plane (or hyperplane) in R^n.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
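One standard way to compute the two factors is from the SVD A = UΣV^T, taking Q = UV^T and H = VΣV^T (the matrix below is an arbitrary example):

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [3.0, -5.0]])

U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                        # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt        # symmetric positive semidefinite factor

assert np.allclose(Q @ H, A)                    # A = QH
assert np.allclose(Q.T @ Q, np.eye(2))          # Q is orthogonal
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)  # H is positive semidefinite
```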
Rank r(A).
Number of pivots = dimension of the column space = dimension of the row space.
Similar matrices A and B.
Every B = M^(-1)AM has the same eigenvalues as A.
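A quick numerical check (both matrices are arbitrary random examples; the random M is assumed invertible, which holds for generic matrices):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
M = rng.standard_normal((3, 3))          # assumed invertible (generic case)
B = np.linalg.inv(M) @ A @ M             # B is similar to A

# Sort both spectra the same way before comparing.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_B = np.sort_complex(np.linalg.eigvals(B))
assert np.allclose(eig_A, eig_B)         # similar matrices: same eigenvalues
```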
Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.
Tridiagonal matrix T: t_ij = 0 if |i - j| > 1.
T^(-1) has rank 1 above and below the diagonal.
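The rank-1 structure can be seen numerically: on and above the diagonal, the entries of T^(-1) follow a rank-1 pattern, so every 2 x 2 minor taken from that region vanishes (the -1, 2, -1 matrix below is an arbitrary tridiagonal example):

```python
import numpy as np

n = 5
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # -1, 2, -1 tridiagonal
Ti = np.linalg.inv(T)

# Check every 2x2 minor whose four entries lie on or above the diagonal.
for i in range(n):
    for k in range(i + 1, n):
        for j in range(k, n - 1):
            minor = Ti[i, j] * Ti[k, j + 1] - Ti[i, j + 1] * Ti[k, j]
            assert abs(minor) < 1e-9   # rank-1 pattern: minors vanish
```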