 4.1.1: Show that each of the following are linear operators on R2. Describ...
 4.1.2: Let L be the linear operator on R2 defined by L(x) = (x1 cos x2 sin...
 4.1.3: Let a be a fixed nonzero vector in R2. A mapping of the form L(x) =...
 4.1.4: Let L : R2 → R2 be a linear operator. If L((1, 2)T) = (2, 3)T and L(...
 4.1.5: Determine whether the following are linear transformations from R3 ...
 4.1.6: Determine whether the following are linear transformations from R2 ...
 4.1.7: Determine whether the following are linear operators on Rn×n: (a) L(...
 4.1.8: Let C be a fixed n × n matrix. Determine whether the following are li...
 4.1.9: Determine whether the following are linear transformations from P2 ...
 4.1.10: For each f ∈ C[0, 1], define L(f) = F, where F(x) = ∫0^x f(t) dt ...
 4.1.11: Determine whether the following are linear transformations from C[0...
 4.1.12: Use mathematical induction to prove that if L is a linear transform...
 4.1.13: Let {v1, . . . , vn} be a basis for a vector space V, and let L1 an...
 4.1.14: Let L be a linear operator on R1 and let a = L(1). Show that L(x) =...
 4.1.15: Let L be a linear operator on a vector space V. Define L^n, n ≥ 1, rec...
 4.1.16: Let L1 : U → V and L2 : V → W be linear transformations, and let L = L2...
 4.1.17: Determine the kernel and range of each of the following linear oper...
 4.1.18: Let S be the subspace of R3 spanned by e1 and e2. For each linear o...
 4.1.19: Find the kernel and range of each of the following linear operators...
 4.1.20: Let L : V → W be a linear transformation, and let T be a subspace of ...
 4.1.21: A linear transformation L : V → W is said to be one-to-one if L(v1) =...
 4.1.22: A linear transformation L : V → W is said to map V onto W if L(V) = W...
 4.1.23: Which of the operators defined in Exercise 17 are one-to-one? Which...
 4.1.24: Let A be a 2 × 2 matrix, and let LA be the linear operator defined b...
 4.1.25: Let D be the differentiation operator on P3, and let S = {p ∈ P3 : p(...
Solutions for Chapter 4.1: Definition and Examples
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290
Chapter 4.1: Definition and Examples includes 25 full step-by-step solutions for Linear Algebra with Applications, 8th edition (ISBN: 9780136009290).

Column space C(A) = space of all combinations of the columns of A.

Companion matrix.
Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2 λ + c3 λ^2 + ... + cn λ^(n-1) - λ^n).
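A quick pure-Python sanity check of this definition (the cubic λ^3 - 6λ^2 + 11λ - 6 with roots 1, 2, 3 is my own small example, not from the text): each root λ is an eigenvalue of the companion matrix, with eigenvector (1, λ, λ^2).

```python
# lambda^3 = c1 + c2*lambda + c3*lambda^2 for lambda^3 - 6*lambda^2 + 11*lambda - 6,
# so c1 = 6, c2 = -11, c3 = 6; roots are 1, 2, 3.
c1, c2, c3 = 6, -11, 6
A = [[0, 1, 0],     # ones just above the main diagonal
     [0, 0, 1],
     [c1, c2, c3]]  # c1, ..., cn in row n

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# For each root lam, (1, lam, lam^2) is an eigenvector: A v = lam v.
for lam in (1, 2, 3):
    v = [1, lam, lam**2]
    assert matvec(A, v) == [lam * x for x in v]
```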

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Dot product = Inner product x^T y = x1 y1 + ... + xn yn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A) · (column j of B).
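The row-times-column rule for (AB)ij can be sketched in a few lines of pure Python (the 2 × 2 matrices here are my own example):

```python
# (AB)ij = dot(row i of A, column j of B).
def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

AB = [[dot(A[i], [B[0][j], B[1][j]]) for j in range(2)] for i in range(2)]
assert AB == [[19, 22], [43, 50]]

# Perpendicular vectors have x^T y = 0:
assert dot([1, 1], [1, -1]) == 0
```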

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular, from Ax = 0), with dimensions n - r and r. Applied to A^T: the column space C(A) is the orthogonal complement of N(A^T) in R^m.
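A tiny check of the orthogonality (the rank-1 matrix is my own example): for A = [[1, 2], [2, 4]] in R^2, the nullspace is the line through (-2, 1) and the row space is the line through (1, 2), and they are perpendicular.

```python
# A has rank r = 1, n = 2: dim N(A) = n - r = 1, dim C(A^T) = r = 1.
A = [[1.0, 2.0], [2.0, 4.0]]
null_vec = [-2.0, 1.0]   # solves Ax = 0
row = A[0]               # spans the row space

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

# Every row of A is perpendicular to every nullspace vector:
assert dot(A[0], null_vec) == 0.0
assert dot(A[1], null_vec) == 0.0
```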

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
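The method can be sketched in pure Python as follows; this is a minimal illustration (with partial pivoting and no error handling for singular A), not production code.

```python
# Gauss-Jordan inversion: row-reduce the augmented matrix [A | I] to [I | A^-1].
def invert(A):
    n = len(A)
    # Build [A | I].
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest pivot.
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        piv = M[col][col]
        M[col] = [x / piv for x in M[col]]          # scale pivot row to get a 1
        for r in range(n):                           # clear the column elsewhere
            if r != col and M[r][col] != 0.0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]                    # right half is A^-1

Ainv = invert([[2.0, 1.0], [1.0, 1.0]])
assert Ainv == [[1.0, -1.0], [-1.0, 2.0]]
```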

Hypercube matrix.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

Nullspace N(A)
= All solutions to Ax = 0. Dimension n - r = (# columns) - rank.

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves lengths and angles: ||Qx|| = ||x|| and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
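A rotation matrix is a handy concrete case; a short pure-Python sketch (the angle and test vector are my own choices) confirms the length-preserving property:

```python
import math

# 2x2 rotation matrix Q: orthonormal columns, so ||Qx|| = ||x||.
theta = 0.7  # arbitrary angle for this sketch
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def norm(v):
    return math.hypot(v[0], v[1])

x = [3.0, 4.0]
assert abs(norm(matvec(Q, x)) - norm(x)) < 1e-12   # length preserved

# Columns are orthonormal: perpendicular and of unit length.
col0 = [Q[0][0], Q[1][0]]
col1 = [Q[0][1], Q[1][1]]
assert abs(col0[0] * col1[0] + col0[1] * col1[1]) < 1e-12
assert abs(norm(col0) - 1.0) < 1e-12
```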

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |l_ij| ≤ 1. See condition number.

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T Ax > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
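Both criteria can be checked by hand on a small example (A = [[2, 1], [1, 2]], eigenvalues 1 and 3, is my own choice):

```python
# Positive definiteness: x^T A x > 0 for nonzero x, and positive pivots.
A = [[2.0, 1.0], [1.0, 2.0]]

def quad(A, x):
    return sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))

# Sample a few nonzero vectors (not a proof, just a spot check):
for x in ([1.0, 0.0], [0.0, 1.0], [1.0, -1.0], [3.0, 2.0]):
    assert quad(A, x) > 0

# Pivots from elimination: 2, then 2 - (1/2)*1 = 3/2; both positive.
pivot1 = A[0][0]
pivot2 = A[1][1] - A[1][0] * A[0][1] / A[0][0]
assert pivot1 > 0 and pivot2 > 0
```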

Projection p = a(a^T b / a^T a) onto the line through a.
P = aa^T / a^T a has rank 1.
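The formula is easy to exercise directly; the vectors below are my own example, and the key property checked is that the error b - p is perpendicular to a:

```python
# Projection of b onto the line through a: p = a * (a^T b / a^T a).
def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

a = [1.0, 2.0, 2.0]
b = [3.0, 3.0, 0.0]
coef = dot(a, b) / dot(a, a)          # a^T b / a^T a = 9/9 = 1
p = [coef * ai for ai in a]           # the projection
e = [bi - pi for bi, pi in zip(b, p)] # the error b - p

assert p == [1.0, 2.0, 2.0]
assert dot(a, e) == 0.0               # error is perpendicular to a
```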

Rayleigh quotient q(x) = x^T Ax / x^T x for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
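A small numeric check (the symmetric matrix [[2, 1], [1, 2]], with eigenvalues 1 and 3 and eigenvectors (1, -1) and (1, 1), is my own example):

```python
# Rayleigh quotient q(x) = x^T A x / x^T x for a fixed symmetric 2x2 A.
A = [[2.0, 1.0], [1.0, 2.0]]

def q(x):
    Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
    return (x[0]*Ax[0] + x[1]*Ax[1]) / (x[0]*x[0] + x[1]*x[1])

assert q([1.0, 1.0]) == 3.0            # maximum, at the top eigenvector
assert q([1.0, -1.0]) == 1.0           # minimum, at the bottom eigenvector
assert 1.0 <= q([2.0, 1.0]) <= 3.0     # any other x lands in between
```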

Schur complement S = D - CA^-1 B.
Appears in block elimination on [A B; C D].
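With 1 × 1 blocks (plain numbers, my own example) the formula and its determinant identity det[A B; C D] = det(A) · S can be verified in a couple of lines:

```python
# Schur complement of A in [[A, B], [C, D]], with 1x1 blocks for simplicity.
A, B, C, D = 2.0, 4.0, 3.0, 7.0
S = D - C * (1.0 / A) * B     # S = D - C A^-1 B = 7 - 3*(1/2)*4 = 1
assert S == 1.0

# For this block factorization, det([[A, B], [C, D]]) = A * S.
det = A * D - B * C           # 14 - 12 = 2
assert det == A * S
```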

Special solutions to As = 0.
One free variable is si = 1, other free variables = 0.
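For the single-row matrix A = [1 2 3] (my own example; pivot in column 1, free variables x2 and x3), the recipe gives two special solutions that span the nullspace:

```python
# Special solutions: set one free variable to 1, the others to 0,
# and back-solve for the pivot variable.
A = [1.0, 2.0, 3.0]          # one row; x1 is the pivot variable
s1 = [-2.0, 1.0, 0.0]        # x2 = 1, x3 = 0  ->  x1 = -2
s2 = [-3.0, 0.0, 1.0]        # x2 = 0, x3 = 1  ->  x1 = -3

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

assert dot(A, s1) == 0.0
assert dot(A, s2) == 0.0
```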

Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms ||A + B|| ≤ ||A|| + ||B||.
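The vector case is easy to spot-check numerically (the vectors are my own example; equality holds only when u and v point the same way):

```python
import math

# Triangle inequality in R^2: ||u + v|| <= ||u|| + ||v||.
def norm(w):
    return math.hypot(w[0], w[1])

u = [3.0, 4.0]
v = [-1.0, 2.0]
s = [u[0] + v[0], u[1] + v[1]]
assert norm(s) <= norm(u) + norm(v)

# Equality when u and v are parallel with the same direction:
assert norm([3.0, 0.0]) == norm([1.0, 0.0]) + norm([2.0, 0.0])
```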

Vector addition.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.