 19.1E: Verify that each of the sets in Examples 1– 4 satisfies the axioms ...
 19.2E: (Subspace Test) Prove that a nonempty subset U of a vector space V ...
 19.3E: Verify that the set in Example 6 is a subspace. Find a basis for th...
 19.4E: Verify that the set defined in Example 7 is a subspace. Reference:
 19.5E: Determine whether or not the set {(2, –1, 0), (1, 2, 5), (7, –1, 5)...
 19.6E: Determine whether or not the set is linearly independent over Z5.
 19.7E: If {u, v, w} is a linearly independent subset of a vector space, sh...
 19.8E: If {v1, v2, . . . , vn} is a linearly dependent set of vectors, pro...
 19.9E: (Every spanning collection contains a basis.) If {v1, v2, . . . , v...
 19.10E: (Every independent set is contained in a basis.) Let V be a finite ...
 19.11E: V is a vector space over F of dimension 5 and U and W are subspaces...
 19.12E: Show that the solution set to a system of equations of the form whe...
 19.13E: Let V be the set of all polynomials over Q of degree 2 together wit...
 19.14E: Let V = R3 and W = {(a, b, c) ∈ V | a² + b² = c²}. Is W a subspace ...
 19.15E: Let V = R3 and W = {(a, b, c) ∈ V | a + b = c}. Is W a subspace of ...
 19.16E: Let . Prove that V is a vector space over Q, and find a basis for V...
 19.17E: Verify that the set V in Example 9 is a vector space over R. REFERENCE:
 19.18E: Let P = {(a, b, c) | a, b, c ∈ R, a = 2b + 3c}. Prove that P is a s...
 19.19E: Let B be a subset of a vector space V. Show that B is a basis for V...
 19.20E: If U is a proper subspace of a finite-dimensional vector space V, s...
 19.21E: Referring to the proof of Theorem 19.1, prove that {w1, u2, . . . ,...
 19.22E: If V is a vector space of dimension n over the field Zp, how many e...
 19.23E: Let S = {(a, b, c, d) | a, b, c, d ∈ R, a = c, d = a + b}. Find a ba...
 19.24E: Let U and W be subspaces of a vector space V. Show that U ∩ W is a ...
 19.25E: If a vector space has one basis that contains infinitely many eleme...
 19.26E: Let u = (2, 3, 1), v = (1, 3, 0), and w = (2, –3, 3). Since (1/2)u ...
 19.27E: Define the vector space analog of group homomorphism and ring homom...
 19.28E: Let T be a linear transformation from V to W. Prove that the image ...
 19.29E: Let T be a linear transformation of a vector space V. Prove that {v...
 19.30E: Let T be a linear transformation of V onto W. If {v1, v2, . . . , v...
 19.31E: If V is a vector space over F of dimension n, prove that V is isomo...
 19.32E: Let V be a vector space over an infinite field. Prove that V is not...
Solutions for Chapter 19: Contemporary Abstract Algebra 8th Edition
ISBN: 9781133599708
This guide covers the textbook Contemporary Abstract Algebra, 8th edition (ISBN 9781133599708). Chapter 19 includes 32 full step-by-step solutions.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
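As an illustrative check of this rank criterion (a sketch assuming NumPy is available; the matrices and the helper `solvable` are my own examples, not from the text):

```python
import numpy as np

# A has rank 1: its column space C(A) is the line through (1, 2).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b_in = np.array([[3.0], [6.0]])    # 3 * (first column): lies in C(A)
b_out = np.array([[3.0], [7.0]])   # not a multiple of (1, 2): outside C(A)

def solvable(A, b):
    # Ax = b is solvable exactly when [A b] has the same rank as A.
    return np.linalg.matrix_rank(np.hstack([A, b])) == np.linalg.matrix_rank(A)
```

Here `solvable(A, b_in)` is True and `solvable(A, b_out)` is False, matching the column-space picture.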

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C (A).

Column space C (A) =
space of all combinations of the columns of A.

Companion matrix.
Put CI, ... ,Cn in row n and put n  1 ones just above the main diagonal. Then det(A  AI) = ±(CI + c2A + C3A 2 + .•. + cnA nl  An).
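A small sketch of this construction (assuming NumPy; the function name `companion` is my own):

```python
import numpy as np

def companion(c):
    # c = [c1, ..., cn]: put the n-1 ones just above the diagonal
    # and the coefficients c1, ..., cn in row n.
    n = len(c)
    A = np.zeros((n, n))
    A[np.arange(n - 1), np.arange(1, n)] = 1.0
    A[n - 1, :] = c
    return A

A = companion([1.0, 2.0, 3.0])   # c1 = 1, c2 = 2, c3 = 3
# det(lambda*I - A) = lambda^3 - 3*lambda^2 - 2*lambda - 1,
# i.e. np.poly(A) returns the coefficients [1, -c3, -c2, -c1].
coeffs = np.poly(A)
```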

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½xᵀAx − xᵀb over growing Krylov subspaces.
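A minimal sketch of the classic conjugate gradient iteration (assuming NumPy; this is the textbook algorithm, not code from this book):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    # Solve Ax = b for symmetric positive definite A by minimizing
    # (1/2) x^T A x - x^T b over growing Krylov subspaces.
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x          # residual, also the negative gradient
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(n):     # exact convergence in at most n steps (in theory)
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # A-conjugate to previous directions
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```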

Ellipse (or ellipsoid) xᵀAx = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A⁻¹y‖² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σᵢ.)

Fundamental Theorem.
The nullspace N(A) and row space C(Aᵀ) are orthogonal complements in Rⁿ (perpendicularity comes from Ax = 0, with dimensions r and n − r). Applied to Aᵀ, the column space C(A) is the orthogonal complement of N(Aᵀ) in Rᵐ.
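A concrete check on a rank-1 example (assuming NumPy; the matrix and basis vectors are hand-picked for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])     # rank r = 1, n = 3, m = 2

# N(A) has dimension n - r = 2; these two vectors solve Ax = 0.
null_basis = [np.array([2.0, -1.0, 0.0]),
              np.array([3.0, 0.0, -1.0])]
row_space = A[0]                    # C(A^T) is spanned by (1, 2, 3)

# N(A^T): y with y^T A = 0^T; here 2*(row 1) - (row 2) = 0.
y = np.array([2.0, -1.0])
col_space = A[:, 0]                 # C(A) is spanned by (1, 2)
```

Each nullspace vector is perpendicular to the row space, and y is perpendicular to the column space, as the theorem states.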

Kronecker product (tensor product) A ® B.
Blocks aᵢⱼB, eigenvalues λₚ(A)λ_q(B).
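The eigenvalue property can be checked numerically (a sketch assuming NumPy; symmetric matrices are chosen so all eigenvalues are real):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # eigenvalues 1 and 3
B = np.array([[3.0, 0.0],
              [0.0, 5.0]])          # eigenvalues 3 and 5

K = np.kron(A, B)                   # 4x4 matrix of blocks a_ij * B
eigs_K = np.sort(np.linalg.eigvalsh(K))
# Eigenvalues of the Kronecker product are all products lambda_p(A)*lambda_q(B).
products = np.sort([lam * mu
                    for lam in np.linalg.eigvalsh(A)
                    for mu in np.linalg.eigvalsh(B)])
```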

Left nullspace N (AT).
Nullspace of Aᵀ = "left nullspace" of A because yᵀA = 0ᵀ.

Linearly dependent v1, ..., vn.
A combination other than all ci = 0 gives Σ ci vi = 0.

Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
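Two small checks of this (assuming NumPy; the matrices are my own examples): with distinct eigenvalues the characteristic polynomial already annihilates the matrix, while a repeated eigenvalue can allow a lower-degree minimal polynomial.

```python
import numpy as np

# Distinct eigenvalues 2, 3: p(lambda) = lambda^2 - 5*lambda + 6,
# and Cayley-Hamilton gives p(B) = 0; here m = p.
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
pB = B @ B - 5 * B + 6 * np.eye(2)

# Repeated eigenvalue 1: p(lambda) = (lambda - 1)^2, but the minimal
# polynomial is m(lambda) = lambda - 1, since A - I is already zero.
A = np.eye(2)
mA = A - np.eye(2)
```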

Multiplication Ax
= x1(column 1) + ... + xn(column n) = combination of columns.
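This column view of Ax is easy to verify directly (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])

# Ax = x1*(column 1) + x2*(column 2): a combination of the columns.
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1]
```

Here both `A @ x` and `by_columns` equal (0, 2, 4).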

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Rank one matrix A = uvᵀ ≠ 0.
Column and row spaces = lines cu and cv.

Rank r (A)
= number of pivots = dimension of column space = dimension of row space.

Right inverse A+.
If A has full row rank m, then A⁺ = Aᵀ(AAᵀ)⁻¹ has AA⁺ = Iₘ.
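A quick numerical check of this formula (assuming NumPy; the matrix is my own full-row-rank example):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])        # full row rank m = 2

# Right inverse A+ = A^T (A A^T)^{-1}; A A^T is 2x2 and invertible.
A_plus = A.T @ np.linalg.inv(A @ A.T)
```

Multiplying on the right gives `A @ A_plus` equal to the 2x2 identity, while `A_plus @ A` is generally not the 3x3 identity.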

Schur complement S = D − CA⁻¹B.
Appears in block elimination on [A B; C D].
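A sketch of that block elimination (assuming NumPy; the blocks are my own example): multiplying on the left by [[I, 0], [−CA⁻¹, I]] zeros out the C block and leaves the Schur complement in the lower-right corner.

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 2.0]])
B = np.array([[1.0, 0.0], [0.0, 1.0]])
C = np.array([[1.0, 1.0], [0.0, 1.0]])
D = np.array([[3.0, 0.0], [0.0, 3.0]])

M = np.block([[A, B], [C, D]])
S = D - C @ np.linalg.inv(A) @ B      # Schur complement of A in M

# Block elimination: E @ M = [[A, B], [0, S]].
E = np.block([[np.eye(2), np.zeros((2, 2))],
              [-C @ np.linalg.inv(A), np.eye(2)]])
eliminated = E @ M
```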

Similar matrices A and B.
Every B = M⁻¹AM has the same eigenvalues as A.
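A numerical check of this invariance (assuming NumPy; A and M are my own choices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # eigenvalues 2 and 3
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])            # any invertible M works
B = np.linalg.inv(M) @ A @ M          # B is similar to A

eigs_A = np.sort(np.linalg.eigvals(A))
eigs_B = np.sort(np.linalg.eigvals(B))
```

The sorted eigenvalue lists agree, even though A and B are different matrices.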

Solvable system Ax = b.
The right side b is in the column space of A.