
# Solutions for Chapter 50: FINITE FIELDS

## Full solutions for Modern Algebra: An Introduction | 6th Edition

ISBN: 9780470384435


This textbook survival guide covers the following chapters and their solutions. Chapter 50: FINITE FIELDS includes 11 full step-by-step solutions. The guide was created for the textbook Modern Algebra: An Introduction, 6th edition. Since the 11 problems in Chapter 50: FINITE FIELDS have been answered, more than 8968 students have viewed full step-by-step solutions from this chapter. Modern Algebra: An Introduction is associated with the ISBN 9780470384435.

## Key Math Terms and definitions covered in this textbook
• Affine transformation

Tv = Av + v₀ = linear transformation plus shift.
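A minimal numpy sketch of the definition (the rotation A and shift v₀ here are illustrative values, not from the text). Note that an affine map is not linear: T(2v) ≠ 2T(v) unless v₀ = 0.

```python
import numpy as np

# Hypothetical linear part A (a 90-degree rotation) and shift v0.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
v0 = np.array([3.0, 1.0])

def T(v):
    """Affine transformation Tv = Av + v0."""
    return A @ v + v0

v = np.array([1.0, 0.0])
w = T(v)  # rotate (1, 0) to (0, 1), then shift by (3, 1)
```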

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
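A small sketch checking the definition: partition A and B into compatible blocks (sizes chosen here for illustration) and confirm that multiplying block-by-block reproduces the ordinary product AB.

```python
import numpy as np

# Illustrative partition: cut A (4x6) and B (6x5) into 2x2 grids of blocks.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 5))

A11, A12 = A[:2, :3], A[:2, 3:]
A21, A22 = A[2:, :3], A[2:, 3:]
B11, B12 = B[:3, :2], B[:3, 2:]
B21, B22 = B[3:, :2], B[3:, 2:]

# Block multiplication: each block of AB is a sum of block products.
blockwise = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])
```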

• Condition number

cond(A) = c(A) = ‖A‖ ‖A⁻¹‖ = σmax/σmin. In Ax = b, the relative change ‖δx‖/‖x‖ is less than cond(A) times the relative change ‖δb‖/‖b‖. Condition numbers measure the sensitivity of the output to changes in the input.
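A sketch with a hypothetical nearly singular matrix: compute cond(A) from the singular values, compare with `np.linalg.cond`, and watch the bound ‖δx‖/‖x‖ ≤ cond(A) · ‖δb‖/‖b‖ in action.

```python
import numpy as np

# Illustrative nearly singular matrix, so cond(A) is large.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
sigma = np.linalg.svd(A, compute_uv=False)
cond_from_svd = sigma.max() / sigma.min()   # sigma_max / sigma_min
cond_np = np.linalg.cond(A)                 # same quantity via numpy

b = np.array([2.0, 2.0001])
db = np.array([0.0, 1e-6])                  # tiny perturbation of b
x = np.linalg.solve(A, b)
dx = np.linalg.solve(A, b + db) - x

rel_out = np.linalg.norm(dx) / np.linalg.norm(x)   # relative change in x
rel_in = np.linalg.norm(db) / np.linalg.norm(b)    # relative change in b
```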

• Diagonalizable matrix A.

Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.

• Diagonalization

Λ = S⁻¹AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
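A numpy sketch of both identities, using a hypothetical 2×2 matrix with distinct real eigenvalues (so diagonalization is automatic): check S⁻¹AS = Λ, then compute A³ as SΛ³S⁻¹.

```python
import numpy as np

# Illustrative matrix with distinct eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, S = np.linalg.eig(A)   # columns of S are eigenvectors
Lam = np.diag(eigvals)

# S^-1 A S = Lambda
check = np.linalg.inv(S) @ A @ S

# A^3 via diagonalization: A^k = S Lambda^k S^-1
A3 = S @ np.diag(eigvals**3) @ np.linalg.inv(S)
```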

• Identity matrix I (or In).

Diagonal entries = 1, off-diagonal entries = 0.

• Incidence matrix of a directed graph.

The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.
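A sketch building the incidence matrix of a small hypothetical directed graph (4 nodes, 5 edges). Every row sums to zero, and for a connected graph the rank is n − 1.

```python
import numpy as np

# Hypothetical directed graph: edge list of (from-node i, to-node j).
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
m, n = len(edges), 4

# Each row gets -1 in column i and +1 in column j.
A = np.zeros((m, n))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1
    A[row, j] = 1
```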

• Jordan form J = M⁻¹AM.

If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J₁, ..., Jₛ). The block Jₖ is λₖIₖ + Nₖ where Nₖ has 1's on its superdiagonal. Each block has one eigenvalue λₖ and one eigenvector.

• Krylov subspace Kj(A, b).

The subspace spanned by b, Ab, ..., Aʲ⁻¹b. Numerical methods approximate A⁻¹b by xⱼ with residual b − Axⱼ in this subspace. A good basis for Kⱼ requires only multiplication by A at each step.
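A sketch of the "one matvec per step" idea with hypothetical random data: build the Krylov vectors b, Ab, A²b by repeated multiplication, then orthonormalize them with QR (the idea behind Arnoldi iteration).

```python
import numpy as np

# Illustrative random A and b.
rng = np.random.default_rng(1)
n, j = 5, 3
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Columns of K are b, Ab, ..., A^(j-1) b: one multiplication by A per step.
K = np.empty((n, j))
v = b.copy()
for col in range(j):
    K[:, col] = v
    v = A @ v

# An orthonormal basis for K_j(A, b), as Arnoldi-type methods use.
Q, _ = np.linalg.qr(K)
```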

• Left inverse A+.

If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.
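A sketch with a hypothetical 3×2 full-column-rank matrix: A⁺A = I₂ holds, but AA⁺ is only a projection onto the column space, not I₃. For full column rank, A⁺ agrees with the pseudoinverse.

```python
import numpy as np

# Illustrative tall matrix with full column rank 2.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

# Left inverse A+ = (A^T A)^-1 A^T
A_plus = np.linalg.inv(A.T @ A) @ A.T
```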

• Length II x II.

Square root of xᵀx (Pythagoras in n dimensions).

• Normal equation AᵀAx = Aᵀb.

Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)ᵀ(b − Ax) = 0.
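A sketch fitting a line to four hypothetical data points: solve AᵀAx = Aᵀb directly and confirm the residual b − Ax is orthogonal to every column of A (that is exactly what the normal equation says).

```python
import numpy as np

# Illustrative data: fit b ~ c + d*t by least squares.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.2, 2.9, 4.1])
A = np.column_stack([np.ones_like(t), t])   # columns: constant and t

# Solve the normal equation A^T A x = A^T b.
x = np.linalg.solve(A.T @ A, A.T @ b)
residual = b - A @ x
```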

• Orthogonal matrix Q.

Square matrix with orthonormal columns, so Qᵀ = Q⁻¹. Preserves length and angles: ‖Qx‖ = ‖x‖ and (Qx)ᵀ(Qy) = xᵀy. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
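A sketch with one of the listed examples, a rotation (the angle is an arbitrary illustrative value): QᵀQ = I, lengths are preserved, and inner products are preserved.

```python
import numpy as np

# Illustrative rotation matrix: orthogonal, so Q^T = Q^-1.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])
```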

• Permutation matrix P.

There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
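A sketch with one hypothetical row order: build P by reordering the rows of I, check that PA reorders the rows of A the same way, and that det P = ±1 (here +1, since a 3-cycle is an even permutation).

```python
import numpy as np

# Illustrative row order: a 3-cycle, which takes two row exchanges (even).
order = [2, 0, 1]
P = np.eye(3)[order]          # rows of I in that order

A = np.arange(9.0).reshape(3, 3)
PA = P @ A                    # rows of A in the same order
```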

• Pivot.

The diagonal entry (first nonzero) at the time when a row is used in elimination.

• Polar decomposition A = Q H.

Orthogonal Q times positive (semi)definite H.
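One standard way to compute the polar decomposition (a sketch, with an illustrative 2×2 matrix) goes through the SVD A = UΣVᵀ: take Q = UVᵀ (orthogonal) and H = VΣVᵀ (symmetric positive semidefinite).

```python
import numpy as np

# Illustrative invertible matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Polar decomposition via SVD: A = U S V^T => Q = U V^T, H = V S V^T.
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt
H = Vt.T @ np.diag(s) @ Vt
```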

• Positive definite matrix A.

Symmetric matrix with positive eigenvalues and positive pivots. Definition: xᵀAx > 0 unless x = 0. Then A = LDLᵀ with diag(D) > 0.

• Rank one matrix A = uvᵀ ≠ 0.

Column and row spaces = lines cu and cv.
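A sketch with illustrative u and v: the outer product uvᵀ has rank one, and every column is a multiple of u (so the column space is the line cu).

```python
import numpy as np

# Illustrative vectors.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

A = np.outer(u, v)   # 3x2 rank-one matrix u v^T
```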

• Sum V + W of subspaces.

Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

• Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.

The signs of the eigenvalues in Λ match the signs of the pivots in D.
