 7.2.1: Questions 1–4 extend the first derivative example to higher derivat...
 7.2.2: Questions 1–4 extend the first derivative example to higher derivat...
 7.2.3: Questions 1–4 extend the first derivative example to higher derivat...
 7.2.4: Questions 1–4 extend the first derivative example to higher derivat...
 7.2.5: Questions 5–9 are about a particular T and its matrix A.
 7.2.6: Questions 5–9 are about a particular T and its matrix A.
 7.2.7: Questions 5–9 are about a particular T and its matrix A.
 7.2.8: Questions 5–9 are about a particular T and its matrix A.
 7.2.9: Questions 5–9 are about a particular T and its matrix A.
 7.2.10: Questions 10–13 are about invertible linear transformations.
 7.2.11: Questions 10–13 are about invertible linear transformations.
 7.2.12: Questions 10–13 are about invertible linear transformations.
 7.2.13: Questions 10–13 are about invertible linear transformations.
 7.2.14: Questions 14–19 are about changing the basis.
 7.2.15: Questions 14–19 are about changing the basis.
 7.2.16: Questions 14–19 are about changing the basis.
 7.2.17: Questions 14–19 are about changing the basis.
 7.2.18: Questions 14–19 are about changing the basis.
 7.2.19: Questions 14–19 are about changing the basis.
 7.2.20: Questions 20–23 are about the space of quadratic polynomials A + Bx...
 7.2.21: Questions 20–23 are about the space of quadratic polynomials A + Bx...
 7.2.22: Questions 20–23 are about the space of quadratic polynomials A + Bx...
 7.2.23: Questions 20–23 are about the space of quadratic polynomials A + Bx...
 7.2.24: The Gram-Schmidt process changes a basis a1, a2, a3 to an orthonorm...
 7.2.25: Elimination changes the rows of A to the rows of U with A = LU. Ro...
 7.2.26: Suppose v1, v2, v3 are eigenvectors for T. This means T(vi) = λi vi ...
 7.2.27: Every invertible linear transformation can have I as its matrix! Ch...
 7.2.28: Using v1 = w1 and v2 = w2, find the standard matrix for these T's:
 7.2.29: Suppose T is reflection across the x axis and S is reflection acros...
 7.2.30: Suppose T is reflection across the 45° line, and S is reflection acr...
 7.2.31: Show that the product ST of two reflections is a rotation. Multiply...
 7.2.32: True or false: If we know T(v) for n different nonzero vectors in R...
 7.2.33: Express e = (1, 0, 0, 0) and v = (1, 1, 1, 1) in the wavelet basis...
 7.2.34: To represent v = (7, 5, 3, 1) in the wavelet basis, start with (6, 6,...
 7.2.35: What are the eight vectors in the wavelet basis for R8? They includ...
 7.2.36: Suppose we have two bases v1, ..., vn and w1, ..., wn for Rn. I...
 7.2.37: The space M of 2 by 2 matrices has the basis v1, v2, v3, v4 in Work...
 7.2.38: Suppose A is a 3 by 4 matrix of rank r = 2, and T(v) = Av. Choose i...
Solutions for Chapter 7.2: The Matrix of a Linear Transformation
Full solutions for Introduction to Linear Algebra, 4th Edition
ISBN: 9780980232714
Since the 38 problems in chapter 7.2: The Matrix of a Linear Transformation have been answered, more than 8145 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for the textbook Introduction to Linear Algebra, edition 4, associated with ISBN 9780980232714.

Cayley-Hamilton Theorem.
p(λ) = det(A - λI) has p(A) = zero matrix.
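As a quick check, here is a minimal Python sketch (the 2 by 2 matrix is made up for illustration): for a 2 by 2 matrix, p(λ) = λ² - (trace A)λ + det A, and substituting A itself must give the zero matrix.

```python
# Verify the Cayley-Hamilton theorem for a sample 2 by 2 matrix.
A = [[2.0, 1.0],
     [1.0, 3.0]]
trace = A[0][0] + A[1][1]                      # coefficient of -lambda
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]        # constant term

def matmul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A2 = matmul(A, A)
# p(A) = A^2 - (trace)A + (det)I should be the zero matrix
pA = [[A2[i][j] - trace*A[i][j] + (det if i == j else 0.0)
       for j in range(2)] for i in range(2)]
assert all(abs(pA[i][j]) < 1e-12 for i in range(2) for j in range(2))
```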

Cross product u × v in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
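The determinant formula and the area property can be confirmed in a few lines of Python (sample vectors chosen for illustration); the area check uses the Lagrange identity |u × v|² = |u|²|v|² - (u·v)², which equals (||u|| ||v|| |sin θ|)².

```python
import math

def cross(u, v):
    # expand the "determinant" [i j k; u1 u2 u3; v1 v2 v3] along the top row
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

u, v = [1.0, 2.0, 0.0], [0.0, 1.0, 3.0]
w = cross(u, v)
assert dot(w, u) == 0 and dot(w, v) == 0       # perpendicular to both u and v
# |u x v|^2 = |u|^2 |v|^2 - (u.v)^2 = (area of parallelogram)^2
assert math.isclose(dot(w, w), dot(u, u)*dot(v, v) - dot(u, v)**2)
```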

Free variable xi.
Column i has no pivot in elimination. We can give the n - r free variables any values; then Ax = b determines the r pivot variables (if solvable!).
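A small back-substitution sketch in Python (the system is hypothetical, chosen to have one free variable): U = [[1, 2, 1], [0, 0, 2]] has pivots in columns 1 and 3, so x2 is free and any value for it still gives a solution.

```python
# Solve Ux = b for U = [[1, 2, 1], [0, 0, 2]]: x2 is free, x1 and x3 are pivots.
def solve_with_free(b, x2):
    x3 = b[1] / 2.0                 # second pivot row: 2*x3 = b2
    x1 = b[0] - 2.0*x2 - x3         # first pivot row, with the chosen free x2
    return [x1, x2, x3]

b = [5.0, 4.0]
for t in (0.0, 1.0, -7.5):          # ANY value works for the free variable
    x1, x2, x3 = solve_with_free(b, t)
    assert 1*x1 + 2*x2 + 1*x3 == b[0] and 2*x3 == b[1]
```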

Hermitian matrix A^H = Ā^T = A.
Complex analog a_ji = ā_ij of a symmetric matrix.
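A short Python check on a sample Hermitian matrix (the entries are made up): the defining symmetry a_ji = ā_ij, the forced real diagonal, and the fact that x^H A x is real for every complex x.

```python
# Sample Hermitian matrix: transposing and conjugating gives the matrix back.
A = [[2+0j, 3-1j],
     [3+1j, 5+0j]]
n = 2
assert all(A[j][i] == A[i][j].conjugate() for i in range(n) for j in range(n))
assert all(A[i][i].imag == 0 for i in range(n))   # diagonal entries must be real

# The quadratic form x^H A x is real for any complex vector x
x = [1+2j, -1j]
xHAx = sum(x[i].conjugate() * A[i][j] * x[j]
           for i in range(n) for j in range(n))
assert abs(xHAx.imag) < 1e-12
```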

Hypercube matrix P_L.
Row n + 1 counts corners, edges, faces, ... of a cube in Rn.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and +1 in columns i and j.
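Building the incidence matrix for a small directed graph in Python (a hypothetical 3-node example with edges 1→2, 1→3, 2→3) shows the -1/+1 pattern, and that every row sums to zero, so the all-ones vector lies in the nullspace.

```python
# Edge-node incidence matrix: row per edge, -1 in the "from" column, +1 in the "to".
edges = [(0, 1), (0, 2), (1, 2)]    # directed edges 1->2, 1->3, 2->3 (0-indexed)
n = 3
A = [[0]*n for _ in edges]
for row, (i, j) in enumerate(edges):
    A[row][i] = -1
    A[row][j] = +1

# Each row sums to zero, so x = (1, 1, ..., 1) solves Ax = 0
assert all(sum(row) == 0 for row in A)
```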

Inverse matrix A^-1.
Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
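The cofactor formula can be tested by hand for a 2 by 2 matrix in Python (sample entries for illustration); note the transpose, (A^-1)_ij = C_ji / det A.

```python
# Invert a 2 by 2 matrix via cofactors: C11 = a22, C12 = -a21, C21 = -a12, C22 = a11.
A = [[4.0, 7.0],
     [2.0, 6.0]]
detA = A[0][0]*A[1][1] - A[0][1]*A[1][0]       # 10, nonzero so A is invertible
C = [[ A[1][1], -A[1][0]],
     [-A[0][1],  A[0][0]]]
# (A^-1)_ij = C_ji / det A  -- the cofactor matrix is transposed
Ainv = [[C[j][i] / detA for j in range(2)] for i in range(2)]

# Check A^-1 A = I
prod = [[sum(Ainv[i][k]*A[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
assert all(abs(prod[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(2) for j in range(2))
```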

Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
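A minimal sketch, assuming NumPy is available (the 3 by 2 system is made up): solve the normal equations directly, then confirm that the error is orthogonal to every column and that the answer matches the library's least-squares solver.

```python
import numpy as np

# Overdetermined system: fit a line through 3 points via A^T A xhat = A^T b.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
xhat = np.linalg.solve(A.T @ A, A.T @ b)       # least squares solution
e = b - A @ xhat                               # error vector
assert np.allclose(A.T @ e, 0)                 # e is orthogonal to all columns of A
assert np.allclose(xhat, np.linalg.lstsq(A, b, rcond=None)[0])
```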

Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Lucas numbers.
L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [1 1; 1 0]. Compare L0 = 2 with F0 = 0.
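The recurrence and the eigenvalue formula can be compared directly in Python: λ1^n + λ2^n reproduces every Lucas number exactly (up to floating-point rounding).

```python
import math

# Eigenvalues of the Fibonacci matrix [1 1; 1 0]
lam1 = (1 + math.sqrt(5)) / 2
lam2 = (1 - math.sqrt(5)) / 2

# Lucas numbers from the recurrence, starting L0 = 2, L1 = 1
L = [2, 1]
for n in range(2, 15):
    L.append(L[-1] + L[-2])
assert L[:7] == [2, 1, 3, 4, 7, 11, 18]

# The closed form lam1^n + lam2^n matches the recurrence
for n in range(15):
    assert round(lam1**n + lam2**n) == L[n]
```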

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
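A concrete case where AM > GM, assuming NumPy is available: for A = [[5, 1], [0, 5]], λ = 5 is a double root of (5 - λ)² = 0 (AM = 2), but the eigenspace is only one-dimensional (GM = 1), so A is not diagonalizable.

```python
import numpy as np

A = np.array([[5.0, 1.0],
              [0.0, 5.0]])
lam = 5.0
# GM = dimension of the eigenspace = n - rank(A - lam I)
GM = A.shape[0] - np.linalg.matrix_rank(A - lam*np.eye(2))
assert GM == 1        # AM = 2 > GM = 1: only one independent eigenvector
```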

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.

Rank r(A)
= number of pivots = dimension of column space = dimension of row space.

Rayleigh quotient q(x) = x^T Ax / x^T x for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
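A numerical sketch, assuming NumPy (the symmetric matrix is chosen for illustration, with eigenvalues 1 and 3): random vectors stay inside [λmin, λmax], and the eigenvectors hit the extremes.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # symmetric, eigenvalues 1 and 3
lams, V = np.linalg.eigh(A)             # ascending eigenvalues, orthonormal V

def q(x):
    return (x @ A @ x) / (x @ x)        # Rayleigh quotient

rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.standard_normal(2)
    assert lams[0] - 1e-12 <= q(x) <= lams[-1] + 1e-12

# The extremes are reached exactly at the eigenvectors
assert np.isclose(q(V[:, 0]), lams[0]) and np.isclose(q(V[:, -1]), lams[-1])
```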

Singular Value Decomposition (SVD).
A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
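The defining relations can be verified with NumPy (assuming it is available; the matrix is a made-up 2 by 2 example): reassembling UΣV^T recovers A, each pair satisfies A v_i = σ_i u_i, and U and V are orthogonal.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)             # s holds sigma_1 >= sigma_2 > 0

assert np.allclose(U @ np.diag(s) @ Vt, A)          # A = U Sigma V^T
for i in range(len(s)):
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])   # A v_i = sigma_i u_i
assert np.allclose(U.T @ U, np.eye(2))              # U, V orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(2))
```

Note |det A| = 15 equals the product of the singular values, a quick sanity check.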

Solvable system Ax = b.
The right side b is in the column space of A.

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.