 1.SE.1E: Mark each statement True or False. Justify each answer. (If true, c...
 1.SE.2E: Let a and b represent real numbers. Describe the possible solution ...
 1.SE.3E: The solutions (x, y, z) of a single linear equation ax + by + cz = ...
 1.SE.4E: Suppose the coefficient matrix of a linear system of three equation...
 1.SE.5E: Determine h and k such that the solution set of the system (i) is e...
 1.SE.6E: Consider the problem of determining whether the following system of...
 1.SE.7E: Consider the problem of determining whether the following system of...
 1.SE.8E: Describe the possible echelon forms of the matrix A. Use the notati...
 1.SE.9E: Write the vector as the sum of two vectors, one on the line and one ...
 1.SE.10E: Let a1, a2, and b be the vectors in R2 shown in the figure, and...
 1.SE.11E: Construct a 2 × 3 matrix A, not in echelon form, such that the solu...
 1.SE.12E: Construct a 2 × 3 matrix A, not in echelon form, such that the solu...
 1.SE.13E: Write the reduced echelon form of a 3 × 3 matrix A such that the fi...
 1.SE.14E: Determine the value(s) of a such that Is linearly independent.
 1.SE.15E: In (a) and (b), suppose the vectors are linearly independent. What ...
 1.SE.16E: Use Theorem 7 in Section 1.7 to explain why the columns of the matr...
 1.SE.17E: Explain why a set {v1, v2, v3, v4} in R5 must be linearly independe...
 1.SE.18E: Suppose {v1, v2} is a linearly independent set in Rn. Show that {v1...
 1.SE.19E: Suppose v1, v2, v3 are distinct points on one line in R3. The line ...
 1.SE.20E: Let T : Rn → Rm be a linear transformation, and suppose T (u) = v. S...
 1.SE.21E: Let T : R3 → R3 be the linear transformation that reflects each vect...
 1.SE.22E: Let A be a 3 × 3 matrix with the property that the linear transform...
 1.SE.23E: A Givens rotation is a linear transformation from Rn to Rn used i...
 1.SE.24E: The following equation describes a Givens rotation in ?3. Find a an...
 1.SE.25E: A large apartment building is to be built using modular constructio...
Solutions for Chapter 1.SE: Linear Algebra and Its Applications 4th Edition
ISBN: 9780321385178
Chapter 1.SE includes 25 full step-by-step solutions.

Affine transformation
T(v) = Av + v0 = linear transformation plus shift.

Back substitution.
Upper triangular systems are solved in reverse order, xn to x1.
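The reverse-order solve can be sketched in a few lines of NumPy (the function name and the example system are illustrative assumptions, not from the glossary):

```python
import numpy as np

def back_substitution(U, b):
    """Solve Ux = b for upper triangular U, working from xn back to x1."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # subtract the already-known unknowns, then divide by the pivot
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

U = np.array([[2.0, 1.0, 1.0],
              [0.0, 3.0, 2.0],
              [0.0, 0.0, 4.0]])
b = np.array([5.0, 10.0, 8.0])
x = back_substitution(U, b)
```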

Column space C(A).
Space of all combinations of the columns of A.

Cross product u xv in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
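A quick NumPy check of the two properties named above, perpendicularity and the area identity (the vectors u and v are arbitrary illustrative choices):

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
w = np.cross(u, v)          # the "determinant" expansion of [i j k; u; v]

# |sin θ| recovered from the dot product, to test ||u x v|| = area
cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1 - cos_t**2)
```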

Cyclic shift S.
Permutation with s21 = 1, s32 = 1, ..., finally s1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; the eigenvectors are the columns of the Fourier matrix F.
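The definition translates directly into code; this sketch (with n = 4, my choice) builds S and confirms that its eigenvalues sit at the nth roots of 1:

```python
import numpy as np

n = 4
S = np.zeros((n, n))
for k in range(1, n):
    S[k, k - 1] = 1.0        # s21 = s32 = ... = 1
S[0, n - 1] = 1.0            # finally s1n = 1

eigvals = np.linalg.eigvals(S)
roots = np.exp(2j * np.pi * np.arange(n) / n)   # the nth roots of 1
```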

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers lij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.

Ellipse (or ellipsoid) x^T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^(-1)y||^2 = y^T (AA^T)^(-1) y = 1 displayed by eigshow; axis lengths σi.)
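A small numerical check of the axis lengths 1/√λ (the positive definite matrix A below is an illustrative choice, with eigenvalues 1 and 9):

```python
import numpy as np

A = np.array([[5.0, 4.0],
              [4.0, 5.0]])           # positive definite; eigenvalues 1 and 9
lam, V = np.linalg.eigh(A)
axis_lengths = 1 / np.sqrt(lam)      # semi-axis lengths 1/sqrt(lambda)

# the axis endpoints (eigenvector scaled by its axis length) lie on x^T A x = 1
tips = V * axis_lengths              # column k is V[:, k] / sqrt(lam[k])
```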

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers lij (and lii = 1) brings U back to A.
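Elimination without row exchanges produces exactly these multipliers; a minimal sketch (the function name and example matrix are my own, not from the glossary):

```python
import numpy as np

def lu_no_exchanges(A):
    """A = LU by elimination, assuming no row exchanges are needed."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for j in range(n):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]     # multiplier lij (and lii = 1)
            U[i, :] -= L[i, j] * U[j, :]    # row operation on U
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_exchanges(A)
```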

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fnx and Fn^(-1)c can be computed with nℓ/2 multiplications. Revolutionary.
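The Fourier matrix itself is easy to write down, and np.fft.fft returns the same product in O(n log n). (One assumption to flag: NumPy's convention uses e^(-2πijk/n), the conjugate of the glossary's F; the idea is unchanged.)

```python
import numpy as np

n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * j * k / n)   # DFT matrix, np.fft sign convention
x = np.arange(n, dtype=float)
y = F @ x                             # O(n^2); the FFT gets this in O(n log n)
```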

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^(-1)].
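A minimal sketch of that procedure (the function name, the partial-pivoting detail, and the 2 × 2 example are my own additions):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce [A I] to [I A^(-1)]; assumes A is invertible."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for j in range(n):
        p = j + np.argmax(np.abs(M[j:, j]))   # partial pivoting for stability
        M[[j, p]] = M[[p, j]]
        M[j] /= M[j, j]                       # scale pivot row: pivot becomes 1
        for i in range(n):
            if i != j:
                M[i] -= M[i, j] * M[j]        # clear the rest of column j
    return M[:, n:]

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
Ainv = gauss_jordan_inverse(A)
```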

Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n − 1)/2 edges between nodes. A tree has only n − 1 edges and no closed loops.

Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
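The four descriptions can be checked against each other numerically (the random 3 × 4 and 4 × 2 sizes are an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

entrywise = np.array([[A[i] @ B[:, j] for j in range(2)] for i in range(3)])
by_columns = np.column_stack([A @ B[:, j] for j in range(2)])   # A times column j
by_rows = np.vstack([A[i] @ B for i in range(3)])               # row i times B
cols_times_rows = sum(np.outer(A[:, k], B[k]) for k in range(4))
```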

Normal equation A^T A x̂ = A^T b.
Gives the least squares solution x̂ to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − A x̂) = 0.
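A worked instance of the normal equations (the 3 × 2 matrix A and right side b are illustrative choices; the residual comes out perpendicular to the columns of A, as the entry states):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])            # full column rank n = 2
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # solve the normal equations
residual = b - A @ x_hat                    # perpendicular to the columns of A
```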

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves lengths and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
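A rotation is the quickest concrete example; this sketch checks Q^T = Q^(-1), length preservation, and |λ| = 1 (the angle 0.7 and the vector x are arbitrary):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by theta
x = np.array([3.0, 4.0])
```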

Particular solution xp.
Any solution to Ax = b; often xp has free variables = 0.

Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

Symmetric matrix A.
The transpose is A^T = A, and aij = aji. A^(-1) is also symmetric.

Tridiagonal matrix T: tij = 0 if |i − j| > 1.
T^(-1) has rank 1 above and below the diagonal.
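The rank-1 structure can be seen in a concrete case. For the standard -1, 2, -1 tridiagonal matrix (my choice, not from the glossary), the inverse has entries i(n+1−j)/(n+1) on and above the diagonal, which is exactly a rank-1 pattern ui·vj:

```python
import numpy as np

n = 6
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # -1, 2, -1 tridiagonal
Tinv = np.linalg.inv(T)

i = np.arange(1, n + 1)
R = np.outer(i, (n + 1 - i) / (n + 1))   # rank-1 matrix, entries i(n+1-j)/(n+1)
```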

Vector v in Rn.
Sequence of n real numbers v = (v1, ..., vn) = point in Rn.