
# Solutions for Chapter 6.3: Analysis of Algorithms

## Full solutions for Discrete Mathematics | 1st Edition

ISBN: 9781577667308


Chapter 6.3: Analysis of Algorithms includes 9 full step-by-step solutions. Since all 9 problems in chapter 6.3 have been answered, more than 12,271 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for the textbook Discrete Mathematics, 1st edition, which is associated to the ISBN 9781577667308.

## Key math terms and definitions covered in this textbook
• Adjacency matrix of a graph.

Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = AT when edges go both ways (undirected).
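
The definition above can be sketched in a few lines of Python; the edge list and node count here are made-up illustration data, not from the textbook.

```python
# Build the adjacency matrix of a small directed graph (illustration data).
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
n = 3

A = [[0] * n for _ in range(n)]
for i, j in edges:
    A[i][j] = 1  # aij = 1 when there is an edge from node i to node j

# A equals its transpose exactly when every edge goes both ways (undirected).
is_undirected = all(A[i][j] == A[j][i] for i in range(n) for j in range(n))
```

Here edge (0, 1) has no reverse edge (1, 0), so A is not symmetric and the graph is directed.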

• Basis for V.

Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases; each basis gives unique c's.

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

• Circulant matrix C.

Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0I + c1S + ... + c(n-1)S^(n-1). Cx = convolution c * x. Eigenvectors are in F.

• Covariance matrix Σ.

When random variables xi have mean = average value = 0, their covariances Σij are the averages of xixj. With means x̄i, the matrix Σ = mean of (x − x̄)(x − x̄)T is positive (semi)definite; Σ is diagonal if the xi are independent.

• Diagonalizable matrix A.

Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^(-1)AS = Λ = eigenvalue matrix.

• Fast Fourier Transform (FFT).

A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fnx and Fn^(-1)c can be computed with nℓ/2 multiplications. Revolutionary.
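
The log2 n halving that powers this count can be sketched with a recursive radix-2 transform in pure Python. This is a generic DFT sketch, not the book's factorization; it assumes n is a power of 2 and uses the common e^(-2πi/n) sign convention (the Fourier matrix is sometimes defined with the opposite sign, which just conjugates the result).

```python
import cmath

def fft(x):
    """Radix-2 FFT sketch; assumes len(x) is a power of 2."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # half-size transforms: log2(n) levels of recursion
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # one twiddle multiplication per k: n/2 multiplications per level
        w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out
```

The all-ones input transforms to (n, 0, ..., 0), a handy sanity check.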

• Hermitian matrix A^H = ĀT = A.

Complex analog aji = āij of a symmetric matrix.

• Jordan form J = M^(-1)AM.

If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λkIk + Nk where Nk has 1's on diagonal 1. Each block has one eigenvalue λk and one eigenvector.

• Kronecker product (tensor product) A ® B.

Blocks aijB, eigenvalues λp(A)λq(B).

• Left nullspace N (AT).

Nullspace of AT = "left nullspace" of A because yTA = 0T.

• Matrix multiplication AB.

The i, j entry of AB is (row i of A)·(column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
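
Two of the equivalent views can be checked directly in Python: the entry-by-entry dot-product rule and the "columns times rows" sum of rank-one pieces. A minimal sketch with nested lists standing in for matrices:

```python
def matmul_rows_cols(A, B):
    # (i, j) entry of AB = (row i of A) · (column j of B) = sum_k aik * bkj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matmul_cols_times_rows(A, B):
    # AB = sum over k of (column k of A)(row k of B): rank-one pieces added up
    m, n, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(m)]
    for k in range(n):
        for i in range(m):
            for j in range(p):
                C[i][j] += A[i][k] * B[k][j]
    return C
```

Both orderings touch the same products aik·bkj, just grouped differently, so they must agree.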

• Multiplicities AM and G M.

The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).

• Nilpotent matrix N.

Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
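
The triangular-with-zero-diagonal example can be verified numerically; this sketch uses a made-up 3 by 3 strictly upper triangular matrix, for which N^3 must vanish.

```python
def mat_power_is_zero(N, k):
    """Check whether N^k is the zero matrix (pure-Python sketch)."""
    n = len(N)
    P = [row[:] for row in N]
    for _ in range(k - 1):
        P = [[sum(P[i][t] * N[t][j] for t in range(n)) for j in range(n)]
             for i in range(n)]
    return all(P[i][j] == 0 for i in range(n) for j in range(n))

# Strictly upper triangular (zero diagonal): nilpotent, with N^3 = 0
N = [[0, 1, 2],
     [0, 0, 3],
     [0, 0, 0]]
```

Each multiplication by N pushes the nonzero entries one diagonal further up, so after n products nothing is left.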

• Partial pivoting.

In each column, choose the largest available pivot to control roundoff; all multipliers have |lij| ≤ 1. See condition number.
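
A minimal elimination sketch showing the pivot choice and the resulting multiplier bound; the row-swap bookkeeping (the permutation) is left implicit here for brevity.

```python
def eliminate_partial_pivot(A):
    """Gaussian elimination with partial pivoting (sketch).
    In each column, the largest available pivot is chosen,
    so every multiplier l satisfies |l| <= 1."""
    n = len(A)
    U = [row[:] for row in A]   # work on a copy
    multipliers = []
    for col in range(n - 1):
        # pick the row at or below the diagonal with the largest entry
        p = max(range(col, n), key=lambda r: abs(U[r][col]))
        U[col], U[p] = U[p], U[col]
        for r in range(col + 1, n):
            l = U[r][col] / U[col][col]
            multipliers.append(l)
            for c in range(col, n):
                U[r][c] -= l * U[col][c]
    return U, multipliers
```

Without the swap, eliminating [[1, 2], [3, 4]] would use the multiplier 3; with it, the multiplier is 1/3.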

• Pseudoinverse A+ (Moore-Penrose inverse).

The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(AT). A+A and AA+ are the projection matrices onto the row space and column space. rank(A+) = rank(A).

• Rank one matrix A = uvT ≠ 0.

Column and row spaces = lines cu and cv.
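
A tiny outer-product sketch makes the "lines cu and cv" concrete; the vectors u and v below are made-up illustration data.

```python
# Rank-one matrix A = u v^T: entry (i, j) is u[i] * v[j]
u = [1, 2, 3]
v = [4, 5]
A = [[ui * vj for vj in v] for ui in u]

# Every column of A is a multiple of u, so any two columns are proportional:
# A[i][0] * v[1] == A[i][1] * v[0] for every row i.
columns_proportional = all(A[i][0] * v[1] == A[i][1] * v[0]
                           for i in range(len(u)))
```

Column j of A is v[j] times u, so the whole column space is the line through u (and by symmetry the row space is the line through v).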

• Rank r(A).

r(A) = number of pivots = dimension of column space = dimension of row space.

• Symmetric matrix A.

The transpose is AT = A, and aij = aji. A^(-1) is also symmetric.

• Tridiagonal matrix T: tij = 0 if |i − j| > 1.

T^(-1) has rank 1 above and below the diagonal.
