
# Solutions for Chapter 15: GENERATORS. DIRECT PRODUCTS

## Full solutions for Modern Algebra: An Introduction | 6th Edition

ISBN: 9780470384435


Since the 30 problems in Chapter 15: GENERATORS. DIRECT PRODUCTS have been answered, more than 8073 students have viewed full step-by-step solutions from this chapter. Modern Algebra: An Introduction is associated with ISBN 9780470384435. Chapter 15 includes 30 full step-by-step solutions. This textbook survival guide was created for Modern Algebra: An Introduction, 6th edition, and covers the following chapters and their solutions.

## Key math terms and definitions covered in this textbook
• Augmented matrix [A b].

Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
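As a quick numerical check of this rank criterion, one can compare rank(A) with rank([A b]) in NumPy (the matrices below are illustrative examples, not from the text):

```python
import numpy as np

# Ax = b is solvable exactly when rank([A b]) == rank(A),
# i.e. when b lies in the column space of A.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1: columns are dependent
b_in = np.array([[3.0], [6.0]])     # a multiple of column 1 -> solvable
b_out = np.array([[3.0], [7.0]])    # not in the column space -> no solution

def solvable(A, b):
    # Compare the rank of A with the rank of the augmented matrix [A b].
    return np.linalg.matrix_rank(np.hstack([A, b])) == np.linalg.matrix_rank(A)

print(solvable(A, b_in))   # True
print(solvable(A, b_out))  # False
```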

• Change of basis matrix M.

The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = Mc. (For n = 2, set v_1 = m_11 w_1 + m_21 w_2 and v_2 = m_12 w_1 + m_22 w_2.)

• Diagonalizable matrix A.

Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ, the eigenvalue matrix.
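A minimal NumPy sketch of this factorization, using a small illustrative matrix with two different eigenvalues:

```python
import numpy as np

# Diagonalization S^-1 A S = Lambda; the matrix A is my own example
# with distinct eigenvalues (5 and 2), so diagonalization is automatic.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, S = np.linalg.eig(A)       # columns of S = independent eigenvectors
Lam = np.linalg.inv(S) @ A @ S      # should equal diag(eigvals)
print(np.allclose(Lam, np.diag(eigvals)))  # True
```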

• Distributive Law

A(B + C) = AB + AC. Add then multiply, or multiply then add.

• Exponential e^At.

The series e^At = I + At + (At)^2/2! + ... has derivative Ae^At; e^At u(0) solves u' = Au.
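One way to see the series definition in action is to compare a truncated sum against `scipy.linalg.expm` (the rotation-generator matrix below is my own example):

```python
import numpy as np
from scipy.linalg import expm

# e^{At} via scipy vs. the power series I + At + (At)^2/2! + ... (truncated).
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # rotation generator; e^{At} rotates the plane
t = 0.5

series = np.eye(2)
term = np.eye(2)
for k in range(1, 20):             # 20 terms is plenty for this A and t
    term = term @ (A * t) / k
    series = series + term

print(np.allclose(expm(A * t), series))  # True
# e^{At} u(0) solves u' = Au: here it rotates u(0) by angle t.
u0 = np.array([1.0, 0.0])
print(np.allclose(expm(A * t) @ u0, [np.cos(t), -np.sin(t)]))  # True
```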

• Hessenberg matrix H.

Triangular matrix with one extra nonzero adjacent diagonal.

• Identity matrix I (or In).

Diagonal entries = 1, off-diagonal entries = 0.

• Kronecker product (tensor product) A ® B.

Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).
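The eigenvalue rule can be checked directly with `np.kron`; the diagonal matrices below are illustrative:

```python
import numpy as np

# Eigenvalues of the Kronecker product A (x) B are all products
# lambda_p(A) * lambda_q(B); the example matrices are my own.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.array([[1.0, 0.0],
              [0.0, 5.0]])
K = np.kron(A, B)                   # blocks a_ij * B
prod = sorted(a * b for a in np.linalg.eigvals(A)
                    for b in np.linalg.eigvals(B))
print(np.allclose(sorted(np.linalg.eigvals(K)), prod))  # True
```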

• Lucas numbers

L_n = 2, 1, 3, 4, 7, 11, ... satisfy L_n = L_(n-1) + L_(n-2) = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
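A small sketch recovering the Lucas numbers as traces of powers of the Fibonacci matrix, so that L_n = λ_1^n + λ_2^n holds automatically:

```python
import numpy as np

# The trace of [[1,1],[1,0]]^n is the sum of its eigenvalues' n-th powers,
# which is exactly the n-th Lucas number L_n.
F = np.array([[1, 1],
              [1, 0]])

def lucas(n):
    return int(np.trace(np.linalg.matrix_power(F, n)))

print([lucas(n) for n in range(8)])   # [2, 1, 3, 4, 7, 11, 18, 29]
```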

• Markov matrix M.

All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady-state eigenvector s, where Ms = s > 0.
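A positive-column Markov matrix makes the convergence easy to watch in NumPy (M below is an illustrative 2×2 example whose steady state works out to (0.6, 0.4)):

```python
import numpy as np

# Columns of M^k approach the steady-state eigenvector s with Ms = s.
# All entries of this example M are positive and its columns sum to 1.
M = np.array([[0.8, 0.3],
              [0.2, 0.7]])
Mk = np.linalg.matrix_power(M, 50)
s = Mk[:, 0]                        # any column of M^k approximates s
print(np.allclose(M @ s, s))        # True: eigenvalue 1
print(np.allclose(s, [0.6, 0.4]))   # True: steady state for this M
```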

• Orthogonal subspaces.

Every v in V is orthogonal to every w in W.

• Pivot columns of A.

Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

• Pivot.

The diagonal entry (first nonzero) at the time when a row is used in elimination.

• Polar decomposition A = Q H.

Orthogonal Q times positive (semi)definite H.
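SciPy computes this factorization directly via `scipy.linalg.polar` (right polar form A = QH; the matrix A below is an arbitrary invertible example):

```python
import numpy as np
from scipy.linalg import polar

# Polar decomposition A = Q H: orthogonal Q times positive (semi)definite H.
A = np.array([[1.0, -2.0],
              [3.0, 4.0]])              # arbitrary invertible example
Q, H = polar(A)                         # scipy's right polar form: A = Q H
print(np.allclose(Q @ H, A))            # True
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q is orthogonal
print(np.all(np.linalg.eigvalsh(H) > 0))  # True: H is positive definite here
```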

• Projection matrix P onto subspace S.

Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, and eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A (A^T A)^-1 A^T.
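These identities can be verified numerically for any full-column-rank A (the 3×2 example below is mine, not the book's):

```python
import numpy as np

# P = A (A^T A)^-1 A^T projects onto the column space of A;
# P^2 = P = P^T, and the error b - Pb is perpendicular to the columns of A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])             # basis for a plane in R^3
P = A @ np.linalg.inv(A.T @ A) @ A.T
b = np.array([6.0, 0.0, 0.0])
p = P @ b                              # closest point to b in C(A)
e = b - p
print(np.allclose(P @ P, P))           # True: P^2 = P
print(np.allclose(P, P.T))             # True: symmetric
print(np.allclose(A.T @ e, 0))         # True: error is perpendicular to C(A)
```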

• Reduced row echelon form R = rref(A).

Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

• Right inverse A+.

If A has full row rank m, then A+ = A^T (A A^T)^-1 has A A+ = I_m.
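A quick NumPy check with a full-row-rank 2×3 example (mine, not the book's):

```python
import numpy as np

# Right inverse A+ = A^T (A A^T)^-1 for a matrix with full row rank m:
# A A+ = I_m (but A+ A is only a projection, not I, when n > m).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])        # 2x3, rank 2
A_plus = A.T @ np.linalg.inv(A @ A.T)
print(np.allclose(A @ A_plus, np.eye(2)))  # True
```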

• Simplex method for linear programming.

The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). The minimum cost occurs at a corner!

• Singular Value Decomposition

(SVD) A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular values σ_i > 0. The last columns are orthonormal bases of the nullspaces of A^T and A.
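`np.linalg.svd` returns exactly these factors; a sketch verifying A v_i = σ_i u_i on a small example matrix:

```python
import numpy as np

# A = U Sigma V^T; each right singular vector v_i maps to sigma_i * u_i.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, sigma, Vt = np.linalg.svd(A)
print(np.allclose(U @ np.diag(sigma) @ Vt, A))  # True: factorization
print(np.allclose(A @ Vt.T, U * sigma))         # True: A v_i = sigma_i u_i
```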

• Wavelets Wjk(t).

Stretch and shift the time axis to create w_jk(t) = w_00(2^j t - k).
