- 6.4.1: Define SQRT: by SQRT (a) Is SQRT: operation preserving? (b) Is SQRT: ...
- 6.4.2: Define SQR: by SQR (a) Is SQR: operation preserving? (b) Is SQR: oper...
- 6.4.3: Define on by setting (a) Show that is an algebraic system. (b) Show t...
- 6.4.4: Let be the set of all real-valued integrable functions defined on t...
- 6.4.5: Let f: and be OP maps. (a) Prove that is an OP map. (b) Prove that if...
- 6.4.6: Let be the set of all matrices with real entries. Define Det: by Det...
- 6.4.7: Let Conj: be the conjugate mapping for complex numbers given by Conj...
- 6.4.8: Let f be a function from set A to set B. Let f and be the induced f...
- 6.4.9: Prove Theorem 6.4.3.
- 6.4.10: (a) Show that any two groups of order 2 are isomorphic. (b) Show tha...
- 6.4.11: Let and be the sets of integer multiples of 3 and 6, respectively. ...
- 6.4.12: Let and be the groups in Exercise 12 and let g be the function from ...
- 6.4.13: Let be the group with the operation table shown here: a b c; a a b c; b ...
- 6.4.14: Let and (a) Prove that the function given by is well-defined and is ...
- 6.4.15: Let and Define by f(xq) = [4x]. (a) Prove that f is a well-defined ...
- 6.4.16: Let and be groups, i be the identity element for H, and be a homomor...
- 6.4.17: Show that and are isomorphic.
- 6.4.18: Is isomorphic to Explain.
- 6.4.19: Prove that the relation of isomorphism is an equivalence relation. ...
- 6.4.20: Use the method of proof of Cayley's Theorem to find a group of permu...
- 6.4.21: Assign a grade of A (correct), C (partially correct), or F (failure...
- 6.4.22: Claim. Let and be OP maps. Then the composite is an OP map. Proof.
Solutions for Chapter 6.4: Operation Preserving Maps
Full solutions for A Transition to Advanced Mathematics | 7th Edition
Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
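A minimal sketch in Python/NumPy of building an adjacency matrix from an edge list; the small graph here is made up for illustration:

```python
import numpy as np

# Undirected graph on 3 nodes with edges 0-1 and 1-2 (made up for illustration).
edges = [(0, 1), (1, 2)]
n = 3
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1      # edge from node i to node j
    A[j, i] = 1      # edges go both ways, so A = A^T

assert (A == A.T).all()   # undirected: A equals its transpose
```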
Big formula for n by n determinants.
Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! permutations P has a + or - sign.
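A sketch of the big formula in Python, summing over all n! permutations (fine for tiny n only, since n! grows too fast; the test matrix is made up for illustration):

```python
import numpy as np
from itertools import permutations

def big_formula_det(A):
    """det(A) as a sum of n! signed terms, one entry from each row and column."""
    n = A.shape[0]
    total = 0.0
    for p in permutations(range(n)):
        # Sign of P: +1 for an even number of inversions, -1 for odd.
        inversions = sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
        term = -1.0 if inversions % 2 else 1.0
        for row, col in enumerate(p):
            term *= A[row, col]
        total += term
    return total

A = np.array([[2.0, 1.0], [5.0, 3.0]])
print(big_formula_det(A), np.linalg.det(A))   # both print 1.0
```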
Cayley-Hamilton theorem.
p(λ) = det(A - λI) has p(A) = zero matrix.
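A quick numerical check in NumPy on a 2x2 matrix made up for illustration; for 2x2, p(λ) = λ^2 - trace(A)·λ + det(A):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
# p(A) = A^2 - trace(A)*A + det(A)*I should be the zero matrix.
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
print(np.allclose(p_of_A, 0))   # True
```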
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j)/det(A).
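A direct transcription of the rule in NumPy (the system is made up for illustration; for larger systems elimination is far cheaper):

```python
import numpy as np

def cramer_solve(A, b):
    """x_j = det(B_j) / det(A), where B_j is A with column j replaced by b."""
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        B_j = A.copy()
        B_j[:, j] = b
        x[j] = np.linalg.det(B_j) / det_A
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])
print(cramer_solve(A, b))        # [1. 3.]
print(np.linalg.solve(A, b))     # same answer
```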
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^{-1}AS = Λ = eigenvalue matrix.
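A NumPy check on a 2x2 matrix with two different eigenvalues (made up for illustration), confirming that S^{-1}AS comes out diagonal:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])                     # eigenvalues 5 and 2: diagonalizable
eigvals, S = np.linalg.eig(A)                  # eigenvectors fill the columns of S
Lambda = np.linalg.inv(S) @ A @ S              # S^{-1} A S
print(np.allclose(Lambda, np.diag(eigvals)))   # True
```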
Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).
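A small illustration of the complex dot product in NumPy: np.vdot conjugates its first argument, matching x̄^T y (the vectors are made up for illustration):

```python
import numpy as np

x = np.array([1 + 2j, 3j])
y = np.array([2 - 1j, 1 + 1j])
print(np.vdot(x, y))      # conjugates x first: the complex dot product
print(np.conj(x) @ y)     # same value, written out explicitly
```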
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Fourier matrix F.
Entries F_jk = e^{2πijk/n} give orthogonal columns: F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ c_k e^{2πijk/n}.
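A sketch building F for n = 4 and checking both properties; NumPy's ifft uses the same exponent up to a 1/n factor (the vector c is made up for illustration):

```python
import numpy as np

n = 4
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(2j * np.pi * j * k / n)                   # F_jk = e^{2*pi*i*jk/n}

print(np.allclose(F.conj().T @ F, n * np.eye(n)))    # orthogonal columns
c = np.array([1.0, 2.0, 3.0, 4.0])
print(np.allclose(F @ c, n * np.fft.ifft(c)))        # y = Fc matches n * ifft(c)
```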
Free variable x_i.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
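A SymPy sketch on a rank-1 system made up for illustration: column 2 has no pivot, so x2 is free and the pivot variable x1 follows once x2 is chosen:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
A = sp.Matrix([[1, 2], [2, 4]])      # rank 1: no pivot in column 2
b = sp.Matrix([3, 6])
print(sp.linsolve((A, b), x1, x2))   # {(3 - 2*x2, x2)}: pick x2, x1 is determined
```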
Hypercube matrix P_L.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Jordan form J = M^{-1}AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.
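A SymPy sketch on a 2x2 matrix (made up for illustration) with a repeated eigenvalue but only one eigenvector, so J is a single Jordan block:

```python
import sympy as sp

A = sp.Matrix([[1, 1],
               [-1, 3]])          # eigenvalue 2 twice, only one eigenvector
M, J = A.jordan_form()            # A = M J M^{-1}, i.e. J = M^{-1} A M
print(J)                          # Matrix([[2, 1], [0, 2]]): one 2x2 block
print(sp.simplify(M.inv() * A * M) == J)   # True
```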
Linear combination cv + dw or Σ c_j v_j.
Vector addition and scalar multiplication.
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
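A numerical illustration with A = diag(1, 1, 2): p(λ) = (λ-1)^2 (λ-2) has degree 3, but the degree-2 polynomial m(λ) = (λ-1)(λ-2) already sends A to zero:

```python
import numpy as np

A = np.diag([1.0, 1.0, 2.0])     # repeated eigenvalue 1
I = np.eye(3)
m_of_A = (A - I) @ (A - 2 * I)   # m(A) for m(lambda) = (lambda-1)(lambda-2)
print(np.allclose(m_of_A, 0))    # True: m kills A with lower degree than p
```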
Norm ||A|| of a matrix.
The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
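NumPy computes all four norms directly (the matrix is made up for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.linalg.norm(A, 2))       # l2 norm: sigma_max
print(np.linalg.norm(A, 'fro'))   # Frobenius norm
print(np.linalg.norm(A, 1))       # largest column sum of |a_ij| -> 6.0
print(np.linalg.norm(A, np.inf))  # largest row sum of |a_ij|    -> 7.0
```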
Normal equations A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b - A x̂) = 0.
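A sketch fitting 3 equations in 2 unknowns (data made up for illustration), checking that the residual is orthogonal to the columns of A:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])        # full rank n = 2
b = np.array([1.0, 2.0, 4.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations
print(np.allclose(A.T @ (b - A @ x_hat), 0))   # True: residual orthogonal to columns
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))   # True
```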
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |l_ij| ≤ 1. See condition number.
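A SciPy illustration: the tiny pivot in row 1 (made up for illustration) forces a row swap, and every multiplier in L stays at most 1 in size:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[1e-10, 1.0],
              [1.0,   1.0]])      # tiny pivot in row 1 would amplify roundoff
P, L, U = lu(A)                   # LU with partial pivoting: A = P L U
print(P)                          # permutation shows the row swap
print(np.abs(L).max() <= 1.0)     # True: all multipliers have |l_ij| <= 1
```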
Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^{Kt} is an orthogonal matrix.
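A SciPy check of both facts on a 2x2 example (made up for illustration):

```python
import numpy as np
from scipy.linalg import expm

K = np.array([[0.0, 2.0],
              [-2.0, 0.0]])             # K^T = -K
print(np.linalg.eigvals(K))             # pure imaginary: 2j and -2j
Q = expm(0.5 * K)                       # e^{Kt} at t = 0.5
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: e^{Kt} is orthogonal
```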
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has spring constants from Hooke's Law and Ax = stretching.
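A minimal two-spring sketch (spring constants and geometry made up for illustration) assembling K = A^T C A:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [-1.0, 1.0]])      # node movements -> spring stretching
C = np.diag([3.0, 2.0])          # spring constants from Hooke's Law
K = A.T @ C @ A                  # stiffness matrix (symmetric)

x = np.array([0.1, 0.3])         # movements of the two nodes
print(K @ x)                     # internal forces at the nodes
```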