 3.2.1: Determine whether the following sets form subspaces of R2: (a) {(x1...
 3.2.2: Determine whether the following sets form subspaces of R3: (a) {(x1...
3.2.3: Determine whether the following are subspaces of R2×2: (a) The set o...
 3.2.4: Determine the null space of each of the following matrices: (a) 2 1...
 3.2.5: Determine whether the following are subspaces of P4 (be careful!): ...
3.2.6: Determine whether the following are subspaces of C[−1, 1]: (a) The s...
 3.2.7: Show that Cn[a, b] is a subspace of C[a, b].
3.2.8: Let A be a fixed vector in Rn×n and let S be the set of all matrices...
3.2.9: In each of the following, determine the subspace of R2×2 consisting ...
3.2.10: Let A be a particular vector in R2×2. Determine whether the followin...
 3.2.11: Determine whether the following are spanning sets for R2: (a) _ 2 1...
 3.2.12: Which of the sets that follow are spanning sets for R3? Justify you...
3.2.13: Given x1 = 1 2 3 , x2 = 3 4 2 , x = 2 6 6 , y = 9 2 5 (a) Is x ∈ Span...
 3.2.14: Let {x1, x2, . . . , xk } be a spanning set for a vector space V. (...
 3.2.15: In R22, let E11 = 1 0 0 0 , E12 = 0 1 0 0 E21 = 0 0 1 0 , E22 = 0 0...
 3.2.16: Which of the sets that follow are spanning sets for P3? Justify you...
 3.2.17: Let S be the vector space of infinite sequences defined in Exercise...
3.2.18: Prove that if S is a subspace of R1, then either S = {0} or S = R1.
3.2.19: Let A be an n × n matrix. Prove that the following statements are equ...
 3.2.20: Let U and V be subspaces of a vector space W. Prove that their inte...
 3.2.21: Let S be the subspace of R2 spanned by e1 and let T be the subspace...
3.2.22: Let U and V be subspaces of a vector space W. Define U + V = {z | z...
 3.2.23: Let S, T, and U be subspaces of a vector space V. We can form new s...
Solutions for Chapter 3.2: Subspaces
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290
Chapter 3.2: Subspaces includes 23 full step-by-step solutions for Linear Algebra with Applications, 8th edition (ISBN 9780136009290).

Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
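As a quick illustration, here is a minimal Python sketch (not from the text; the helper name and the small example graph are ours):

```python
def adjacency_matrix(n, edges, undirected=True):
    """a[i][j] = 1 when there is an edge from node i to node j, else 0."""
    a = [[0] * n for _ in range(n)]
    for i, j in edges:
        a[i][j] = 1
        if undirected:
            a[j][i] = 1  # edges go both ways, so A = A^T
    return a

A = adjacency_matrix(3, [(0, 1), (1, 2)])
# undirected graph: A equals its transpose
assert all(A[i][j] == A[j][i] for i in range(3) for j in range(3))
```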

Big formula for n by n determinants.
Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or − sign.
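The big formula can be spelled out directly for small matrices, as in this Python sketch (the function name is ours; the sign is computed from the permutation's inversion count):

```python
from itertools import permutations

def det_big_formula(A):
    """Sum over all n! permutations: sign(P) * a[0][p0] * ... * a[n-1][p(n-1)]."""
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        # sign of P: +1 for an even number of inversions, -1 for odd
        inversions = sum(1 for i in range(n) for j in range(i + 1, n) if p[i] > p[j])
        term = -1 if inversions % 2 else 1
        for row in range(n):
            term *= A[row][p[row]]   # one entry from each row and column
        total += term
    return total

assert det_big_formula([[1, 2], [3, 4]]) == -2
```

This is O(n · n!), so it is a definition to check against, not a practical algorithm; elimination computes the same determinant far faster.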

Cyclic shift S.
Permutation with S21 = 1, S32 = 1, ..., finally S1n = 1. Its eigenvalues are the nth roots e^{2πik/n} of 1; eigenvectors are columns of the Fourier matrix F.
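A numeric check for the 4×4 case (a Python sketch; we write the Fourier-matrix columns out directly, using the conjugate convention f_j = w^(−j)):

```python
import cmath

n = 4
# cyclic shift: S[i][j] = 1 when i = (j + 1) mod n, i.e. S21 = S32 = S43 = 1 and S14 = 1
S = [[1 if i == (j + 1) % n else 0 for j in range(n)] for i in range(n)]

for k in range(n):
    w = cmath.exp(2j * cmath.pi * k / n)      # an n-th root of unity
    f = [w ** (-j) for j in range(n)]         # k-th Fourier column (conjugate convention)
    Sf = [sum(S[i][j] * f[j] for j in range(n)) for i in range(n)]
    # S f = w f: each Fourier column is an eigenvector, eigenvalue a root of unity
    assert all(abs(Sf[i] - w * f[i]) < 1e-12 for i in range(n))
```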

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
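A minimal numeric illustration (Python sketch; the 2×2 matrix and its eigenpair are chosen for the example):

```python
# a symmetric 2×2 example with known eigenpair: λ = 3, x = (1, 1)
A = [[2, 1], [1, 2]]
lam, x = 3, [1, 1]

# Ax = λx with x ≠ 0
Ax = [A[0][0] * x[0] + A[0][1] * x[1],
      A[1][0] * x[0] + A[1][1] * x[1]]
assert Ax == [lam * x[0], lam * x[1]]

# so det(A − λI) = 0 (2×2 determinant: ad − bc)
det = (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]
assert det == 0
```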

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
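A small sketch of the A = LU case (Python; assumes no zero pivots arise, so no row exchanges are needed, and the helper name is ours):

```python
def lu_no_pivot(A):
    """Elimination A = LU with no row exchanges; multipliers l_ik stored in L.

    Assumes every pivot U[k][k] is nonzero (otherwise PA = LU is needed).
    """
    n = len(A)
    U = [row[:] for row in A]
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]       # multiplier l_ik
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]  # row_i -= l_ik * row_k
    return L, U

L, U = lu_no_pivot([[2.0, 1.0], [6.0, 8.0]])
assert L == [[1.0, 0.0], [3.0, 1.0]]
assert U == [[2.0, 1.0], [0.0, 5.0]]
```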

Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n − 1)/2 edges between nodes. A tree has only n − 1 edges and no closed loops.
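The two edge counts are easy to verify (Python sketch; the path graph is one possible tree):

```python
from itertools import combinations

n = 5
# complete graph: one edge for every pair of nodes
complete = list(combinations(range(n), 2))
assert len(complete) == n * (n - 1) // 2     # 10 edges when n = 5

# a path on n nodes is a tree: n − 1 edges and no closed loops
tree = [(i, i + 1) for i in range(n - 1)]
assert len(tree) == n - 1
```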

Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
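For example (Python sketch; the antidiagonal values are arbitrary):

```python
# h[i][j] depends only on i + j, so each antidiagonal is constant
c = [1, 2, 3, 4, 5]    # one value for each possible i + j (0 through 4)
H = [[c[i + j] for j in range(3)] for i in range(3)]
assert H == [[1, 2, 3], [2, 3, 4], [3, 4, 5]]
assert H[0][2] == H[1][1] == H[2][0]   # one antidiagonal, one value
```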

Normal matrix.
If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(−1). Preserves lengths and angles: ‖Qx‖ = ‖x‖ and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
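A numeric check using a rotation matrix (Python sketch; the angle and test vector are arbitrary):

```python
import math

t = 0.7                          # any angle θ
c, s = math.cos(t), math.sin(t)
Q = [[c, -s], [s, c]]            # rotation: orthonormal columns

# columns are orthonormal, so Q^T Q = I and Q^T = Q^(-1)
assert abs(Q[0][0] ** 2 + Q[1][0] ** 2 - 1) < 1e-12          # unit column
assert abs(Q[0][0] * Q[0][1] + Q[1][0] * Q[1][1]) < 1e-12    # orthogonal columns

# preserves length: ||Qx|| = ||x||
x = (3.0, 4.0)
Qx = (Q[0][0] * x[0] + Q[0][1] * x[1], Q[1][0] * x[0] + Q[1][1] * x[1])
assert abs(math.hypot(*Qx) - 5.0) < 1e-12
```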

Outer product uv^T.
Column times row = rank-one matrix.
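A quick check that every row of uv^T is a multiple of the same vector v, which is what rank one means (Python sketch with arbitrary u, v):

```python
u, v = [1, 2, 3], [4, 5]
uvT = [[ui * vj for vj in v] for ui in u]    # column times row: a 3×2 matrix
assert uvT == [[4, 5], [8, 10], [12, 15]]

# every row is a multiple of v, so the rank is one
assert all(row[0] * v[1] == row[1] * v[0] for row in uvT)
```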

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Pseudoinverse A^+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+A and AA^+ are the projection matrices onto the row space and column space. Rank(A^+) = rank(A).
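These properties can be checked numerically with NumPy's `numpy.linalg.pinv` (a sketch; the rank-one example matrix is ours):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])    # rank-one example
Ap = np.linalg.pinv(A)                    # Moore-Penrose pseudoinverse

# defining Penrose conditions: A A+ A = A and A+ A A+ = A+
assert np.allclose(A @ Ap @ A, A)
assert np.allclose(Ap @ A @ Ap, Ap)

# A+ A is the (symmetric, idempotent) projection onto the row space
P = Ap @ A
assert np.allclose(P, P.T) and np.allclose(P @ P, P)
```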

Rayleigh quotient q(x) = x^T Ax / x^T x for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
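A small numeric check (Python sketch; the symmetric matrix, with eigenvalues 1 and 3, is chosen for the example):

```python
def rayleigh(A, x):
    """q(x) = x^T A x / x^T x."""
    n = len(x)
    num = sum(x[i] * sum(A[i][j] * x[j] for j in range(n)) for i in range(n))
    return num / sum(xi * xi for xi in x)

A = [[2, 1], [1, 2]]   # eigenvalues 1 and 3, eigenvectors (1, -1) and (1, 1)
assert rayleigh(A, [1, 1]) == 3.0     # maximum, reached at the eigenvector for λmax
assert rayleigh(A, [1, -1]) == 1.0    # minimum, reached at the eigenvector for λmin
assert 1.0 <= rayleigh(A, [2, 1]) <= 3.0   # any other x lands in between
```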

Rotation matrix R.
R = [c −s; s c] rotates the plane by θ and R^(−1) = R^T rotates back by −θ. Eigenvalues are e^{iθ} and e^{−iθ}; eigenvectors are (1, ∓i). c, s = cos θ, sin θ.

Skew-symmetric matrix K.
The transpose is −K, since Kij = −Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^{Kt} is an orthogonal matrix.

Spectrum of A = the set of eigenvalues {λ1, ..., λn}.
Spectral radius = max of |λi|.

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has spring constants from Hooke's Law and Ax = stretching.

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c0 + ... + c_{n−1}x^{n−1} with p(xi) = bi. Vij = (xi)^{j−1} and det V = product of (xk − xi) for k > i.
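Both facts are easy to check on a tiny example (Python sketch; the sample points are arbitrary):

```python
xs = [1, 2, 3]
n = len(xs)
V = [[x ** j for j in range(n)] for x in xs]    # V_ij = x_i^(j-1)
assert V == [[1, 1, 1], [1, 2, 4], [1, 3, 9]]

# det V = product of (x_k - x_i) for k > i
det = 1
for i in range(n):
    for k in range(i + 1, n):
        det *= xs[k] - xs[i]
assert det == 2    # (2-1)(3-1)(3-2)
```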