- 50.50.1: Find all generators of the multiplicative group of Z11.
- 50.50.2: The first step in the proof of Theorem 50.2 states that the charact...
- 50.50.3: Construct addition and multiplication tables for the ring Z2[x]/(x ...
- 50.50.4: Verify that 1 + x^2 ∈ Z3[x] is irreducible over Z3. (Suggestion: U...
- 50.50.5: (a) Verify that 1 + x + x^3 ∈ Z2[x] is irreducible over Z2. (b) Construct ...
- 50.50.6: Find all n such that the multiplicative group of GF(13) has an elem...
- 50.50.7: Find all n such that the multiplicative group of GF(27) has an elem...
- 50.50.8: (a) Verify that 1 + x + x^4 ∈ Z2[x] is irreducible over Z2. (b) Find...
- 50.50.9: Assume that F is a finite field of order p^n and let (b1, b2, ...
- 50.50.10: (a) Prove: If H is a subfield of GF(p^n), then H has order p^m for so...
- 50.50.11: Prove that if p is a prime, then p | C(p, k) for 1 ≤ k ≤ p − 1.
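As a computational check on Exercise 50.1, here is a minimal Python sketch (not part of the text) that finds the generators of the multiplicative group of Z11 by computing element orders directly; an element generates the group exactly when its order is 10.

```python
# Sketch: find all generators of (Z_11)*, i.e. elements of order 10.
# Pure Python; function name element_order is illustrative.

def element_order(a, p):
    """Order of a in the multiplicative group mod the prime p."""
    x, k = a % p, 1
    while x != 1:
        x = (x * a) % p
        k += 1
    return k

generators = [a for a in range(1, 11) if element_order(a, 11) == 10]
print(generators)  # -> [2, 6, 7, 8]
```

Once one generator g is known (here g = 2), the rest are the powers g^k with gcd(k, 10) = 1, which reproduces the same four elements.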
Solutions for Chapter 50: FINITE FIELDS
Full solutions for Modern Algebra: An Introduction | 6th Edition
Affine transformation Tv = Av + v0.
Linear transformation plus shift.
Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
Condition number cond(A) = c(A) = ‖A‖ ‖A⁻¹‖ = σmax/σmin.
In Ax = b, the relative change ‖δx‖/‖x‖ is less than cond(A) times the relative change ‖δb‖/‖b‖. Condition numbers measure the sensitivity of the output to changes in the input.
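The amplification bound can be seen in a tiny worked example (illustrative, not from the text): for a diagonal matrix the condition number is the ratio of the largest to smallest diagonal entry, and the worst case ‖δx‖/‖x‖ = cond(A)·‖δb‖/‖b‖ is attained when b lies along the large pivot and the perturbation δb along the small one.

```python
# Sketch: worst-case error amplification for A = diag(100, 1).
import math

d = [100.0, 1.0]                       # A = diag(100, 1)
cond_A = max(d) / min(d)               # sigma_max / sigma_min = 100

b, db = [100.0, 0.0], [0.0, 1.0]       # b along the large pivot
x  = [b[i] / d[i] for i in range(2)]   # x  = A^{-1} b  = (1, 0)
dx = [db[i] / d[i] for i in range(2)]  # dx = A^{-1} db = (0, 1)

def norm(v):
    return math.sqrt(sum(t * t for t in v))

# (||dx|| / ||x||) divided by (||db|| / ||b||)
amplification = (norm(dx) / norm(x)) * (norm(b) / norm(db))
print(cond_A, amplification)           # -> 100.0 100.0
```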
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
Diagonalization Λ = S⁻¹AS.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
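The power formula Aᵏ = SΛᵏS⁻¹ can be verified on a small example (illustrative, not from the text): A = [[2, 1], [0, 3]] has eigenvalues 2 and 3 with eigenvectors (1, 0) and (1, 1), so S and S⁻¹ below are exact.

```python
# Sketch: check A^3 = S Λ^3 S^{-1} for a 2x2 diagonalizable matrix.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1], [0, 3]]
S = [[1, 1], [0, 1]]            # eigenvector matrix
S_inv = [[1, -1], [0, 1]]
Lam3 = [[2**3, 0], [0, 3**3]]   # Λ^3 = diag(8, 27)

A3_direct = matmul(A, matmul(A, A))
A3_spectral = matmul(S, matmul(Lam3, S_inv))
print(A3_direct == A3_spectral)  # -> True
```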
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and +1 in columns i and j.
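A minimal sketch of the construction (the graph and helper name are illustrative): one row per edge, −1 in the "from" column and +1 in the "to" column.

```python
# Sketch: edge-node incidence matrix of a directed graph.

def incidence(n_nodes, edges):
    M = [[0] * n_nodes for _ in edges]
    for r, (i, j) in enumerate(edges):
        M[r][i] = -1   # edge leaves node i
        M[r][j] = 1    # edge enters node j
    return M

edges = [(0, 1), (1, 2), (0, 2)]   # a directed triangle on nodes 0, 1, 2
print(incidence(3, edges))
# -> [[-1, 1, 0], [0, -1, 1], [-1, 0, 1]]
```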
Jordan form 1 = M- 1 AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives 1 = diag(lt, ... , 1s). The block his Akh +Nk where Nk has 1 's on diagonall. Each block has one eigenvalue Ak and one eigenvector.
Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., Aʲ⁻¹b. Numerical methods approximate A⁻¹b by xⱼ with residual b − Axⱼ in this subspace. A good basis for Kⱼ requires only multiplication by A at each step.
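The "one matrix-vector product per step" point can be sketched directly (illustrative example, not from the text): each new Krylov vector is A times the previous one.

```python
# Sketch: generate the vectors b, Ab, ..., A^{j-1} b spanning K_j(A, b).

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

def krylov_basis(A, b, j):
    basis, v = [], b
    for _ in range(j):
        basis.append(v)
        v = matvec(A, v)   # one multiplication by A per step
    return basis

A = [[2, 0], [0, 3]]
print(krylov_basis(A, [1, 1], 3))  # -> [[1, 1], [2, 3], [4, 9]]
```

In practice these raw vectors become nearly parallel, so methods orthogonalize them (Arnoldi/Lanczos) while still paying only one product with A per step.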
Left inverse A+.
If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.
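A small exact check of the formula (illustrative 3×2 matrix, exact arithmetic via Fractions to avoid rounding):

```python
# Sketch: left inverse A+ = (A^T A)^{-1} A^T, verified via A+ A = I.
from fractions import Fraction as F

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[F(1), F(0)], [F(0), F(1)], [F(1), F(1)]]   # 3x2, full column rank
At = [list(row) for row in zip(*A)]              # A^T
AtA = matmul(At, A)                              # [[2, 1], [1, 2]]

# invert the 2x2 matrix A^T A
a, b, c, d = AtA[0][0], AtA[0][1], AtA[1][0], AtA[1][1]
det = a * d - b * c
AtA_inv = [[d / det, -b / det], [-c / det, a / det]]

A_plus = matmul(AtA_inv, At)
print(matmul(A_plus, A) == [[1, 0], [0, 1]])     # -> True
```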
Length ‖x‖.
Square root of xᵀx (Pythagoras in n dimensions).
Normal equation AᵀAx̂ = Aᵀb.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.
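A worked least squares example (a classic fit, solved here exactly with Fractions; the data points are illustrative): fit the line y = c + d·t through (0, 6), (1, 0), (2, 0) by solving the 2×2 normal equations.

```python
# Sketch: least squares line via A^T A x = A^T b, solved by Cramer's rule.
from fractions import Fraction as F

A = [[F(1), F(0)], [F(1), F(1)], [F(1), F(2)]]   # columns: 1, t
b = [F(6), F(0), F(0)]

# form A^T A (2x2) and A^T b (length 2)
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]

# solve the 2x2 system by Cramer's rule
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
c = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det
d = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det
print(c, d)   # -> 5 -3   (best line y = 5 - 3t)
```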
Orthogonal matrix Q.
Square matrix with orthonormal columns, so Qᵀ = Q⁻¹. Preserves length and angles: ‖Qx‖ = ‖x‖ and (Qx)ᵀ(Qy) = xᵀy. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
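The row-reordering effect of PA can be sketched on a small example (illustrative; `perm_matrix` is a hypothetical helper name):

```python
# Sketch: P has the rows of I in a given order; P A reorders A's rows
# the same way.

def perm_matrix(order):
    n = len(order)
    return [[1 if j == order[i] else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = perm_matrix([2, 0, 1])            # rows of I in order 2, 0, 1
A = [[10, 11], [20, 21], [30, 31]]
print(matmul(P, A))                   # -> [[30, 31], [10, 11], [20, 21]]
```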
Pivot d.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: xᵀAx > 0 unless x = 0. Then A = LDLᵀ with diag(D) > 0.
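For a symmetric 2×2 matrix the pivot test is just two inequalities: the first pivot is a₁₁ and the second is det(A)/a₁₁, so positive definiteness is a₁₁ > 0 and det(A) > 0. A minimal sketch (matrices are illustrative):

```python
# Sketch: 2x2 positive-definiteness test via positive pivots.

def is_pos_def_2x2(A):
    """Assumes A symmetric: A[1][0] == A[0][1]."""
    a, b, c = A[0][0], A[0][1], A[1][1]
    return a > 0 and a * c - b * b > 0   # pivots a and det/a both positive

print(is_pos_def_2x2([[2, 1], [1, 2]]))   # -> True  (eigenvalues 1 and 3)
print(is_pos_def_2x2([[1, 2], [2, 1]]))   # -> False (eigenvalue -1)
```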
Rank one matrix A = uvᵀ ≠ 0.
Column and row spaces = lines cu and cv.
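A minimal sketch of the outer product (vectors u and v are illustrative): every row of uvᵀ is a multiple of vᵀ, and every column a multiple of u.

```python
# Sketch: the rank-one matrix A = u v^T as an outer product.
u, v = [1, 2], [3, 4, 5]
A = [[ui * vj for vj in v] for ui in u]
print(A)                                # -> [[3, 4, 5], [6, 8, 10]]
print(A[1] == [2 * t for t in A[0]])    # -> True (row 2 = 2 * row 1)
```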
Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.
Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs in Λ = signs in D.