# Solutions for Chapter 5.1: Equivalent Sets; Finite Sets

## Full solutions for A Transition to Advanced Mathematics | 7th Edition

ISBN: 9780495562023


A Transition to Advanced Mathematics (7th edition) is associated with ISBN 9780495562023. This textbook survival guide covers the book's chapters and their solutions. Chapter 5.1: Equivalent Sets; Finite Sets includes 22 full step-by-step solutions. Since all 22 problems in chapter 5.1 have been answered, more than 27269 students have viewed full step-by-step solutions from this chapter.

## Key math terms and definitions covered in this textbook
• Condition number

$\operatorname{cond}(A) = c(A) = \|A\|\,\|A^{-1}\| = \sigma_{\max}/\sigma_{\min}$. In $Ax = b$, the relative change $\|\delta x\|/\|x\|$ is less than $\operatorname{cond}(A)$ times the relative change $\|\delta b\|/\|b\|$. Condition numbers measure the sensitivity of the output to change in the input.
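As a quick check of the definition, the 2-norm condition number can be computed both from the singular values and with NumPy's built-in routine; the nearly singular matrix below is purely illustrative.

```python
import numpy as np

# Illustrative 2x2 example: a nearly singular matrix has a large
# condition number, so solving Ax = b amplifies relative errors in b.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])

# cond(A) = ||A|| * ||A^{-1}|| = sigma_max / sigma_min (2-norm)
sigmas = np.linalg.svd(A, compute_uv=False)
cond_manual = sigmas[0] / sigmas[-1]
cond_numpy = np.linalg.cond(A)      # same quantity via NumPy
```

Both values agree; for this matrix the condition number is on the order of $4 \times 10^4$, so roughly four digits of accuracy can be lost when solving $Ax = b$.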

• Cyclic shift S.

Permutation with $S_{21} = 1$, $S_{32} = 1$, $\ldots$, finally $S_{1n} = 1$. Its eigenvalues are the $n$th roots $e^{2\pi i k/n}$ of 1; its eigenvectors are the columns of the Fourier matrix $F$.
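A small numerical sketch of the claim, with $n = 4$ and the convention that $S$ has 1's on the subdiagonal wrapping around: every eigenvalue is an $n$th root of unity.

```python
import numpy as np

# Sketch: the 4x4 cyclic shift S (a permutation matrix) with
# S21 = 1, S32 = 1, ..., S1n = 1; its eigenvalues are 4th roots of 1.
n = 4
S = np.zeros((n, n))
for i in range(n):
    S[(i + 1) % n, i] = 1.0

eigvals = np.linalg.eigvals(S)
shifted = S @ np.array([1.0, 2.0, 3.0, 4.0])   # cyclic shift of the entries
```

Each eigenvalue satisfies $\lambda^n = 1$ and lies on the unit circle, as the definition states.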

• Diagonalization $\Lambda = S^{-1}AS$.

$\Lambda$ = eigenvalue matrix and $S$ = eigenvector matrix of $A$. $A$ must have $n$ independent eigenvectors to make $S$ invertible. All $A^k = S\Lambda^k S^{-1}$.
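The factorization and the power formula can be verified on a small matrix with distinct eigenvalues (the matrix below is chosen only for illustration).

```python
import numpy as np

# Sketch: A = S Lambda S^{-1} and A^k = S Lambda^k S^{-1}.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2

lam, S = np.linalg.eig(A)           # eigenvalues and eigenvector matrix
Lam = np.diag(lam)

A_rebuilt = S @ Lam @ np.linalg.inv(S)              # A = S Lambda S^{-1}
A_cubed = S @ np.diag(lam**3) @ np.linalg.inv(S)    # A^3 via Lambda^3
```

Raising $\Lambda$ to a power only requires powering the diagonal entries, which is the whole point of diagonalization.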

• Fast Fourier Transform (FFT).

A factorization of the Fourier matrix $F_n$ into $\ell = \log_2 n$ matrices $S_i$ times a permutation. Each $S_i$ needs only $n/2$ multiplications, so $F_n x$ and $F_n^{-1}c$ can be computed with $n\ell/2$ multiplications. Revolutionary.
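A minimal sketch of what the factorization computes: multiplying by the explicit DFT matrix (an $O(n^2)$ product, using NumPy's sign convention) gives the same result as `np.fft.fft`, which uses the $O(n \log n)$ FFT factorization.

```python
import numpy as np

# Sketch: the dense Fourier matrix product vs. the FFT.
n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n))
F = np.exp(-2j * np.pi * j * k / n)   # DFT matrix, NumPy's sign convention

x = np.random.rand(n)
slow = F @ x              # O(n^2) matrix-vector product
fast = np.fft.fft(x)      # FFT factorization, O(n log n)
```

The two answers agree to machine precision; only the operation count differs.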

• Hypercube matrix pl.

Row $n + 1$ counts the corners, edges, faces, $\ldots$ of a cube in $\mathbf{R}^n$.

• Identity matrix I (or In).

Diagonal entries = 1, off-diagonal entries = 0.

• Jordan form $J = M^{-1}AM$.

If $A$ has $s$ independent eigenvectors, its "generalized" eigenvector matrix $M$ gives $J = \operatorname{diag}(J_1, \ldots, J_s)$. The block $J_k$ is $\lambda_k I_k + N_k$ where $N_k$ has 1's on diagonal 1. Each block has one eigenvalue $\lambda_k$ and one eigenvector.

• Lucas numbers

$L_n = 2, 1, 3, 4, \ldots$ satisfy $L_n = L_{n-1} + L_{n-2} = \lambda_1^n + \lambda_2^n$, with $\lambda_1, \lambda_2 = (1 \pm \sqrt{5})/2$ from the Fibonacci matrix $\begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}$. Compare $L_0 = 2$ with $F_0 = 0$.
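The recurrence and the closed form $L_n = \lambda_1^n + \lambda_2^n$ can be compared directly:

```python
import numpy as np

# Sketch: Lucas numbers from the recurrence vs. the eigenvalue formula.
def lucas(n):
    a, b = 2, 1              # L0 = 2, L1 = 1
    for _ in range(n):
        a, b = b, a + b
    return a

lam1 = (1 + np.sqrt(5)) / 2  # eigenvalues of the Fibonacci matrix
lam2 = (1 - np.sqrt(5)) / 2
closed = [round(lam1**n + lam2**n) for n in range(10)]
recur = [lucas(n) for n in range(10)]
```

Since $|\lambda_2| < 1$, the term $\lambda_2^n$ shrinks quickly and rounding $\lambda_1^n + \lambda_2^n$ recovers the integer sequence exactly.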

• Markov matrix M.

All $m_{ij} \ge 0$ and each column sum is 1. Largest eigenvalue $\lambda = 1$. If $m_{ij} > 0$, the columns of $M^k$ approach the steady-state eigenvector $Ms = s > 0$.
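Convergence to the steady state can be seen numerically; the $2 \times 2$ matrix below is an illustrative positive Markov matrix, not one from the text.

```python
import numpy as np

# Sketch: repeated multiplication by a positive-column Markov matrix
# drives any probability vector toward the eigenvector s with Ms = s.
M = np.array([[0.9, 0.2],
              [0.1, 0.8]])        # columns sum to 1, all entries > 0

lam, V = np.linalg.eig(M)
s = V[:, np.argmax(lam)]
s = s / s.sum()                   # normalize so the entries sum to 1

x = np.array([1.0, 0.0])          # any starting probability vector
for _ in range(100):
    x = M @ x                     # M^k x -> s
```

The rate of convergence is governed by the second-largest eigenvalue (here 0.7), so 100 steps is far more than enough.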

• Nilpotent matrix N.

Some power of $N$ is the zero matrix, $N^k = 0$. The only eigenvalue is $\lambda = 0$ (repeated $n$ times). Examples: triangular matrices with zero diagonal.
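A strictly upper triangular example makes the definition concrete:

```python
import numpy as np

# Sketch: a strictly upper triangular 3x3 matrix is nilpotent;
# N^3 = 0 and every eigenvalue is 0.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

N2 = N @ N           # still nonzero
N3 = N2 @ N          # the zero matrix
eigvals = np.linalg.eigvals(N)
```

Each multiplication pushes the nonzero entries one diagonal further up, so after $n$ steps nothing is left.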

• Normal equation $A^{T}A\hat{x} = A^{T}b$.

Gives the least-squares solution to $Ax = b$ if $A$ has full rank $n$ (independent columns). The equation says that (columns of $A$) $\cdot\, (b - A\hat{x}) = 0$.
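A small worked example (the data points are illustrative): solving the normal equations gives the same answer as NumPy's least-squares routine, and the residual is orthogonal to the columns of $A$.

```python
import numpy as np

# Sketch: least squares via the normal equations A^T A x = A^T b,
# fitting a line c0 + c1*t to three points.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])         # full column rank
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations
residual = b - A @ x_hat                     # orthogonal to C(A)
```

Orthogonality of the residual to every column of $A$ is exactly the statement $A^{T}(b - A\hat{x}) = 0$.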

• Particular solution $x_p$.

Any solution to $Ax = b$; often $x_p$ has free variables $= 0$.

• Projection $p = a(a^{T}b/a^{T}a)$ onto the line through $a$.

$P = aa^{T}/a^{T}a$ has rank 1.
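Both forms of the projection (the vector formula and the rank-1 matrix) can be checked on illustrative vectors:

```python
import numpy as np

# Sketch: project b onto the line through a.
a = np.array([1.0, 2.0, 2.0])
b = np.array([1.0, 1.0, 1.0])

p = a * (a @ b) / (a @ a)         # p = a (a^T b / a^T a)
P = np.outer(a, a) / (a @ a)      # projection matrix P = a a^T / a^T a

rank = np.linalg.matrix_rank(P)   # rank 1
```

As expected for a projection, $P$ satisfies $P^2 = P$, and $Pb$ reproduces the vector formula.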

• Pseudoinverse $A^{+}$ (Moore–Penrose inverse).

The $n$ by $m$ matrix that "inverts" $A$ from column space back to row space, with $N(A^{+}) = N(A^{T})$. $A^{+}A$ and $AA^{+}$ are the projection matrices onto the row space and column space. $\operatorname{rank}(A^{+}) = \operatorname{rank}(A)$.
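The projection and rank properties can be verified on an illustrative rank-1 matrix using NumPy's pseudoinverse:

```python
import numpy as np

# Sketch: A+ A and A A+ are projections (onto row space and
# column space), and rank(A+) = rank(A).
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])        # rank 1

A_plus = np.linalg.pinv(A)

row_proj = A_plus @ A             # projection onto the row space
col_proj = A @ A_plus             # projection onto the column space
```

Neither product is the identity here, since $A$ is not full rank, but each is idempotent as a projection must be.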

• Row space $C(A^{T})$ = all combinations of rows of $A$.

Column vectors by convention.

• Schur complement $S = D - CA^{-1}B$.

Appears in block elimination on $\begin{bmatrix} A & B \\ C & D \end{bmatrix}$.

• Spectral theorem $A = Q\Lambda Q^{T}$.

Real symmetric $A$ has real $\lambda$'s and orthonormal $q$'s.
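For a symmetric matrix, `np.linalg.eigh` returns exactly the ingredients of this factorization; the matrix below is an illustrative example.

```python
import numpy as np

# Sketch: A = Q Lambda Q^T for a real symmetric A; eigh returns
# real eigenvalues and an orthonormal eigenvector matrix Q.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, Q = np.linalg.eigh(A)
A_rebuilt = Q @ np.diag(lam) @ Q.T
```

Because $Q$ is orthogonal, its transpose is its inverse, so $Q\Lambda Q^{T}$ really is a diagonalization.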

• Vandermonde matrix V.

$Vc = b$ gives the coefficients of $p(x) = c_0 + \cdots + c_{n-1}x^{n-1}$ with $p(x_i) = b_i$. $V_{ij} = (x_i)^{j-1}$ and $\det V$ = product of $(x_k - x_i)$ for $k > i$.
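Solving $Vc = b$ really does recover an interpolating polynomial; the points below sample $p(x) = 1 + x + x^2$ as an illustration.

```python
import numpy as np

# Sketch: Vandermonde interpolation, V_ij = x_i^(j-1).
x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 7.0])       # values of p(x) = 1 + x + x^2

V = np.vander(x, increasing=True)   # columns 1, x, x^2
c = np.linalg.solve(V, b)           # coefficients c0, c1, c2
```

With distinct points $x_i$ the determinant formula guarantees $\det V \ne 0$, so the system always has a unique solution.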

• Vector $v$ in $\mathbf{R}^n$.

Sequence of $n$ real numbers $v = (v_1, \ldots, v_n)$ = point in $\mathbf{R}^n$.

• Wavelets $w_{jk}(t)$.

Stretch and shift the time axis to create $w_{jk}(t) = w_{00}(2^j t - k)$.