 1.SE.1E: Mark each statement True or False. Justify each answer. (If true, c...
 1.SE.2E: Let a and b represent real numbers. Describe the possible solution ...
 1.SE.3E: The solutions (x, y, z) of a single linear equation ax + by + cz = ...
 1.SE.4E: Suppose the coefficient matrix of a linear system of three equation...
 1.SE.5E: Determine h and k such that the solution set of the system (i) is e...
 1.SE.6E: Consider the problem of determining whether the following system of...
 1.SE.7E: Consider the problem of determining whether the following system of...
 1.SE.8E: Describe the possible echelon forms of the matrix A. Use the notati...
 1.SE.9E: Write the vector as the sum of two vectors, one on the line and one...
1.SE.10E: Let a1, a2, and b be the vectors in ℝ² shown in the figure, and...
 1.SE.11E: Construct a 2 × 3 matrix A, not in echelon form, such that the solu...
 1.SE.12E: Construct a 2 × 3 matrix A, not in echelon form, such that the solu...
 1.SE.13E: Write the reduced echelon form of a 3 × 3 matrix A such that the fi...
1.SE.14E: Determine the value(s) of a such that the given set of vectors is linearly independent.
 1.SE.15E: In (a) and (b), suppose the vectors are linearly independent. What ...
 1.SE.16E: Use Theorem 7 in Section 1.7 to explain why the columns of the matr...
1.SE.17E: Explain why a set {v1, v2, v3, v4} in ℝ⁵ must be linearly independe...
1.SE.18E: Suppose {v1, v2} is a linearly independent set in ℝⁿ. Show that {v1...
1.SE.19E: Suppose v1, v2, v3 are distinct points on one line in ℝ³. The line ...
1.SE.20E: Let T : ℝⁿ → ℝᵐ be a linear transformation, and suppose T (u) = v. S...
1.SE.21E: Let T : ℝ³ → ℝ³ be the linear transformation that reflects each vect...
1.SE.22E: Let A be a 3 × 3 matrix with the property that the linear transform...
1.SE.23E: A Givens rotation is a linear transformation from ℝⁿ to ℝⁿ used i...
1.SE.24E: The following equation describes a Givens rotation in ℝ³. Find a an...
 1.SE.25E: A large apartment building is to be built using modular constructio...
Solutions for Chapter 1.SE: Linear Algebra and Its Applications 5th Edition
ISBN: 9780321982384
This textbook survival guide covers the textbook Linear Algebra and Its Applications, edition 5 (ISBN 9780321982384). Chapter 1.SE includes 25 full step-by-step solutions, which more than 40,810 students have viewed.

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = Aᵀ when edges go both ways (undirected).
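A minimal NumPy sketch of this definition; the 3-node graph and its edge list are made up for illustration:

```python
import numpy as np

# Hypothetical undirected graph on 3 nodes with edges 0-1 and 1-2.
edges = [(0, 1), (1, 2)]
n = 3
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # undirected: each edge goes both ways

# For an undirected graph the adjacency matrix is symmetric: A = A^T.
symmetric = bool((A == A.T).all())
```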

Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. A vector space has many bases; each basis gives unique c's.

Companion matrix.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ² + ... + cnλⁿ⁻¹ − λⁿ).
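A quick numerical check of this formula, assuming n = 3 and made-up entries c1, c2, c3 = 2, 3, 1:

```python
import numpy as np

# Companion matrix: n - 1 ones just above the main diagonal, c1..cn in row n.
c1, c2, c3 = 2.0, 3.0, 1.0
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [c1,  c2,  c3]])

# np.poly(A) returns the characteristic polynomial's coefficients (monic,
# highest degree first).  The glossary's det(A - lam I) = ±(c1 + c2 lam
# + c3 lam^2 - lam^3) is, in monic form, lam^3 - c3 lam^2 - c2 lam - c1.
coeffs = np.poly(A)
```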

Covariance matrix Σ.
When random variables xi have mean = average value = 0, their covariances Σij are the averages of xixj. With means x̄i, the matrix Σ = mean of (x − x̄)(x − x̄)ᵀ is positive (semi)definite; Σ is diagonal if the xi are independent.
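A sketch of the "mean of (x − x̄)(x − x̄)ᵀ" construction on a made-up sample of 3 variables observed 500 times:

```python
import numpy as np

# Hypothetical data: 500 observations of 3 random variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))

xbar = X.mean(axis=0)
D = X - xbar                # subtract the means
Sigma = (D.T @ D) / len(X)  # mean of (x - xbar)(x - xbar)^T

# A covariance matrix is symmetric positive (semi)definite.
eigs = np.linalg.eigvalsh(Sigma)
psd = bool(np.all(eigs >= -1e-12))
```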

Cyclic shift S.
Permutation with s21 = 1, s32 = 1, ..., finally s1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.
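A sketch verifying the eigenvalue claim for n = 4 (the size is an arbitrary choice):

```python
import numpy as np

# Cyclic shift for n = 4: s21 = s32 = s43 = 1 and s14 = 1.
n = 4
S = np.roll(np.eye(n), 1, axis=0)  # row i of I moves to row i+1 (mod n)

eigvals = np.linalg.eigvals(S)
roots = np.exp(2j * np.pi * np.arange(n) / n)  # the nth roots of 1

# Every nth root of 1 shows up among the eigenvalues of S.
match = all(np.isclose(eigvals, r).any() for r in roots)
```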

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Ellipse (or ellipsoid) xᵀAx = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A⁻¹y‖² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σi.)

Hermitian matrix Aᴴ = Āᵀ = A.
Complex analog aji = āij of a symmetric matrix.

Markov matrix M.
All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of Mᵏ approach the steady state eigenvector s, where Ms = s > 0.
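A sketch of the steady-state behavior with a made-up 2 × 2 Markov matrix (for this M, solving Ms = s gives s = (0.6, 0.4)):

```python
import numpy as np

# Hypothetical Markov matrix: positive entries, each column sums to 1.
M = np.array([[0.8, 0.3],
              [0.2, 0.7]])

s = np.array([0.6, 0.4])  # steady state: M @ s = s, components sum to 1

# Every column of M^k approaches s as k grows.
Mk = np.linalg.matrix_power(M, 50)
columns_converge = np.allclose(Mk, np.column_stack([s, s]))
```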

Nullspace N(A).
All solutions to Ax = 0. Dimension n − r = (# columns) − rank.
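A sketch of the dimension count n − r, using a made-up 3 × 4 matrix whose third row is the sum of the first two (so r = 2 while n = 4):

```python
import numpy as np

A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])  # row 3 = row 1 + row 2

r = np.linalg.matrix_rank(A)
n = A.shape[1]
nullity = n - r  # dimension of N(A), the solutions of Ax = 0
```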

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Qᵀ = Q⁻¹. Preserves lengths and angles: ‖Qx‖ = ‖x‖ and (Qx)ᵀ(Qy) = xᵀy. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
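A sketch with a rotation matrix, one of the examples above (the angle and test vector are arbitrary):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Q^T = Q^-1, and lengths are preserved: ||Qx|| = ||x||.
inverse_ok = np.allclose(Q.T, np.linalg.inv(Q))
x = np.array([3.0, 4.0])
length_ok = np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```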

Outer product uvᵀ.
Column times row = rank one matrix.

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: xᵀAx > 0 unless x = 0. Then A = LDLᵀ with diag(D) > 0.
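A sketch checking a (made-up) symmetric matrix against this definition in three ways; Cholesky's A = LLᵀ stands in here for the A = LDLᵀ factorization, since it exists exactly when A is symmetric positive definite:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])  # eigenvalues 1 and 3, both positive

eig_positive = bool(np.all(np.linalg.eigvalsh(A) > 0))

# Cholesky succeeds exactly for symmetric positive definite A.
L = np.linalg.cholesky(A)
factors_ok = np.allclose(L @ L.T, A)

# Definition check on a few random x != 0: x^T A x > 0.
rng = np.random.default_rng(1)
quad_positive = all((x @ A @ x) > 0 for x in rng.normal(size=(10, 2)))
```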

Projection p = a(aᵀb/aᵀa) onto the line through a.
P = aaᵀ/aᵀa has rank 1.
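A sketch of both formulas with made-up vectors a and b:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 3.0])

p = a * (a @ b) / (a @ a)     # p = a (a^T b / a^T a)
P = np.outer(a, a) / (a @ a)  # P = a a^T / a^T a

same = np.allclose(P @ b, p)                 # P b gives the same projection
rank_one = np.linalg.matrix_rank(P) == 1
perp = np.isclose(a @ (b - p), 0.0)          # error b - p is perpendicular to a
```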

Rayleigh quotient q(x) = xᵀAx / xᵀx for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
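A sketch of the bounds and the extremes, using a made-up symmetric 2 × 2 matrix and random test vectors:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
w, V = np.linalg.eigh(A)  # eigenvalues sorted ascending, columns of V are eigenvectors

def q(x):
    return (x @ A @ x) / (x @ x)

rng = np.random.default_rng(2)
xs = rng.normal(size=(20, 2))
bounded = all(w[0] - 1e-12 <= q(x) <= w[-1] + 1e-12 for x in xs)

# The extremes are hit exactly at the eigenvectors.
extremes = np.isclose(q(V[:, 0]), w[0]) and np.isclose(q(V[:, -1]), w[-1])
```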

Reflection matrix (Householder) Q = I − 2uuᵀ.
Unit vector u is reflected to Qu = −u. All x in the mirror plane uᵀx = 0 have Qx = x. Notice Qᵀ = Q⁻¹ = Q.
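A sketch of all three properties, with a made-up direction u and a vector x chosen in the mirror plane:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)            # Householder needs a unit vector
Q = np.eye(3) - 2.0 * np.outer(u, u)

reflects_u = np.allclose(Q @ u, -u)  # u goes to -u
x = np.array([2.0, -1.0, 0.0])       # satisfies u @ x = 0
fixed = np.allclose(Q @ x, x)        # mirror-plane vectors are unchanged
involution = np.allclose(Q @ Q, np.eye(3))  # Q^T = Q^-1 = Q
```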

Right inverse A⁺.
If A has full row rank m, then A⁺ = Aᵀ(AAᵀ)⁻¹ has AA⁺ = Iₘ.
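A sketch of the formula with a made-up 2 × 3 matrix of full row rank m = 2:

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 1.]])  # rows independent, so AA^T is invertible

A_plus = A.T @ np.linalg.inv(A @ A.T)  # A+ = A^T (A A^T)^-1
right_inverse_ok = np.allclose(A @ A_plus, np.eye(2))
```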

Skew-symmetric matrix K.
The transpose is −K, since Kij = −Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
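A sketch of the first two properties on a made-up 2 × 2 example (whose eigenvalues are ±2i):

```python
import numpy as np

K = np.array([[0., 2.],
              [-2., 0.]])

skew = np.allclose(K.T, -K)
eigs = np.linalg.eigvals(K)
pure_imag = np.allclose(eigs.real, 0.0)  # eigenvalues are pure imaginary
```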

Wavelets wjk(t).
Stretch and shift the time axis to create wjk(t) = w00(2^j t − k).