5.6.1: Place these permutations of {1, 2, 3, 4, 5} in lexicographic order: ...
5.6.2: Place these permutations of {1, 2, 3, 4, 5, 6} in lexicographic orde...
5.6.3: Find the next larger permutation in lexicographic order after each...
5.6.4: Find the next larger permutation in lexicographic order after each...
5.6.5: Use Algorithm to generate the 24 permutations of the first four po...
5.6.6: Use Algorithm 2 to list all the subsets of the set {1, 2, 3, 4}.
5.6.7: Use Algorithm 3 to list all the 3-combinations of {1, 2, 3, 4, 5}.
5.6.8: Show that Algorithm 1 produces the next larger permutation in lexic...
5.6.9: Show that Algorithm 3 produces the next larger r-combination in lex...
5.6.10: Develop an algorithm for generating the r-permutations of a set o...
5.6.11: List all 3-permutations of {1, 2, 3, 4, 5}.
5.6.12: Find the integers that correspond to these permutations. a) 246531...
5.6.13: Show that the correspondence described here is a bijection between ...
5.6.14: Find the permutations of {1, 2, 3, 4, 5} that correspond to these in...
5.6.15: Develop an algorithm for producing all permutations of a set of n...
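Several of these exercises concern the next-larger-permutation procedure (Algorithm 1 in the text). A minimal Python sketch of that standard algorithm — the function name and list representation are mine, not the book's pseudocode:

```python
def next_permutation(a):
    """Return the next larger permutation of a in lexicographic order.

    Standard algorithm: find the last ascent a[j] < a[j+1], swap a[j]
    with the smallest larger element to its right, reverse the tail.
    """
    a = list(a)
    j = len(a) - 2
    while j >= 0 and a[j] >= a[j + 1]:    # find the last ascent
        j -= 1
    if j < 0:
        return None                       # a is already the largest permutation
    k = len(a) - 1
    while a[k] <= a[j]:                   # smallest element right of j that exceeds a[j]
        k -= 1
    a[j], a[k] = a[k], a[j]
    a[j + 1:] = reversed(a[j + 1:])       # tail becomes the smallest arrangement
    return a

assert next_permutation([1, 2, 3]) == [1, 3, 2]
assert next_permutation([3, 6, 2, 5, 4, 1]) == [3, 6, 4, 1, 2, 5]
```

Repeatedly applying this function starting from the identity permutation lists all n! permutations in lexicographic order, which is the idea behind Exercises 3-5.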
Solutions for Chapter 5.6: Counting
Full solutions for Discrete Mathematics and Its Applications, 6th Edition
ISBN: 9780073229720
This expansive textbook survival guide covers the listed chapters and their solutions. Since all 15 problems in Chapter 5.6: Counting have been answered, more than 35,948 students have viewed full step-by-step solutions from this chapter.

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.

Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

Diagonalization Λ = S^-1 A S.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
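The identity can be checked numerically; this NumPy sketch uses an arbitrary 2×2 example of my choosing, not one from the text:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # eigenvalues 5 and 2, so A is diagonalizable
eigvals, S = np.linalg.eig(A)         # columns of S are eigenvectors of A
Lam = np.diag(eigvals)                # Λ = eigenvalue matrix

# Λ = S^-1 A S (S is invertible because the eigenvectors are independent)
assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)
# Powers come for free: A^3 = S Λ^3 S^-1
assert np.allclose(S @ Lam**3 @ np.linalg.inv(S), np.linalg.matrix_power(A, 3))
```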

Dot product = Inner product x^T y = x1 y1 + ... + xn yn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A)·(column j of B).

Factorization A = LU.
If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
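A small sketch of the elimination that produces L and U (the 3×3 matrix is an illustrative example, not from the text):

```python
import numpy as np

# Elimination without row exchanges: store each multiplier l_ij in L,
# subtract l_ij * (row j) from row i to reach upper triangular U.
A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
n = A.shape[0]
U = A.copy()
L = np.eye(n)                          # diagonal l_ii = 1
for j in range(n):
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]    # multiplier l_ij
        U[i, :] -= L[i, j] * U[j, :]   # eliminate entry (i, j)

assert np.allclose(L @ U, A)           # L brings U back to A
```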

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix F_n into ℓ = log2 n matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^-1 c can be computed with nℓ/2 multiplications. Revolutionary.
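The fast O(n log n) transform can be checked against the plain O(n²) matrix-vector product; a NumPy sketch, using NumPy's e^(−2πi/n) sign convention for the Fourier matrix:

```python
import numpy as np

n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n))
w = np.exp(-2j * np.pi / n)          # NumPy's FFT sign convention
F = w ** (j * k)                     # Fourier matrix, F[j, k] = w^(jk)

x = np.random.default_rng(0).standard_normal(n)
# The FFT computes the same F_n x, but with far fewer multiplications
assert np.allclose(F @ x, np.fft.fft(x))
```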

Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
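A sketch constructing hilb(n) and confirming the ill-conditioning (the helper below mimics MATLAB's hilb; the size n = 8 is my choice):

```python
import numpy as np

def hilb(n):
    # H[i, j] = 1/(i + j - 1) with 1-based indices
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)

H = hilb(8)
assert np.all(np.linalg.eigvalsh(H) > 0)   # positive definite, but lambda_min is tiny
assert np.linalg.cond(H) > 1e9             # severely ill-conditioned already at n = 8
```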

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Independent vectors v1, ..., vk.
No combination c1 v1 + ... + ck vk = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^(j−1) b. Numerical methods approximate A^-1 b by x_j with residual b − A x_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
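One way to get a good basis is to orthonormalize the Krylov columns; a toy sketch (production methods such as Arnoldi orthogonalize as they go rather than via a final QR):

```python
import numpy as np

def krylov_basis(A, b, j):
    # Columns b, Ab, ..., A^(j-1) b, each obtained by one multiplication by A;
    # QR then turns them into a well-conditioned orthonormal basis for K_j.
    K = np.empty((b.size, j))
    v = b.astype(float)
    for k in range(j):
        K[:, k] = v
        v = A @ v
    Q, _ = np.linalg.qr(K)
    return Q

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 0.0])
Q = krylov_basis(A, b, 2)
assert np.allclose(Q.T @ Q, np.eye(2))    # orthonormal basis for K_2(A, b)
```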

Length ‖x‖.
Square root of x T x (Pythagoras in n dimensions).

Linear combination cv + dw or Σ c_j v_j.
Vector addition and scalar multiplication.

Linearly dependent v1, ..., vn.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ_k a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
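The equivalent views can be checked directly; a NumPy sketch with random matrices chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# Entry rule: (AB)[i, j] = sum over k of a_ik b_kj
entry = sum(A[0, k] * B[k, 1] for k in range(4))
assert np.isclose((A @ B)[0, 1], entry)

# Column rule: column j of AB = A times column j of B
assert np.allclose((A @ B)[:, 1], A @ B[:, 1])

# Columns times rows: AB = sum of (column k of A)(row k of B)
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(4))
assert np.allclose(A @ B, outer_sum)
```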

Orthonormal vectors q1, ..., qn.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
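A sketch using a random orthogonal Q produced by QR (the seed, size, and test vector are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # orthonormal columns q1, q2, q3

assert np.allclose(Q.T @ Q, np.eye(3))            # q_i^T q_j = 0, q_i^T q_i = 1

# Square Q: the columns are an orthonormal basis, so v = sum of (v^T q_j) q_j
v = np.array([1.0, 2.0, 3.0])
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
assert np.allclose(v, expansion)
```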

Projection p = a (a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
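A numerical sketch (the vectors a and b are chosen for illustration):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 3.0])

p = a * (a @ b) / (a @ a)            # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)         # projection matrix, rank 1

assert np.allclose(P @ b, p)         # P b reproduces the projection
assert np.linalg.matrix_rank(P) == 1
assert np.isclose((b - p) @ a, 0.0)  # the error b - p is perpendicular to a
```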

Reflection matrix (Householder) Q = I − 2uu^T.
Unit vector u is reflected to Qu = −u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
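A sketch of these three properties (the unit vector u is my example):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0]) / 3.0        # unit vector, u^T u = 1
Q = np.eye(3) - 2.0 * np.outer(u, u)       # Householder reflection

assert np.allclose(Q @ u, -u)              # u reflects to -u
x = np.array([2.0, 1.0, -2.0])             # u^T x = 0: x lies in the mirror plane
assert np.isclose(u @ x, 0.0)
assert np.allclose(Q @ x, x)               # mirror-plane vectors are unchanged
assert np.allclose(Q.T, np.linalg.inv(Q))  # Q^T = Q^-1 = Q
```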

Spectrum of A = the set of eigenvalues {λ1, ..., λn}.
Spectral radius = max of |λ_i|.

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c0 + ... + c_(n−1) x^(n−1) with p(x_i) = b_i. V_ij = (x_i)^(j−1) and det V = product of (x_k − x_i) for k > i.
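A sketch fitting a quadratic through three points (the interpolation points are my example, not from the text):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 7.0])            # target values p(x_i) = b_i

V = np.vander(x, increasing=True)        # V[i, j] = x_i ** j
c = np.linalg.solve(V, b)                # coefficients c0, c1, c2

assert np.allclose(c, [1.0, 1.0, 1.0])   # p(x) = 1 + x + x^2 hits all three points
# det V = product of (x_k - x_i) for k > i = (1-0)(2-0)(2-1) = 2
assert np.isclose(np.linalg.det(V), 2.0)
```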