# Solutions for Chapter 3.2: Discrete Mathematics and Its Applications 7th Edition

## Full solutions for Discrete Mathematics and Its Applications | 7th Edition

ISBN: 9780073383095


This textbook survival guide was created for Discrete Mathematics and Its Applications, 7th edition (ISBN: 9780073383095), and covers that textbook's chapters and their solutions. Chapter 3.2 includes 74 full step-by-step solutions; more than 535,244 students have viewed full step-by-step solutions from this chapter.

## Key Math Terms and Definitions Covered in This Textbook
• Change of basis matrix M.

The old basis vectors v_j are combinations Σ m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2, set v_1 = m_11 w_1 + m_21 w_2 and v_2 = m_12 w_1 + m_22 w_2.)
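As a check on the relation d = M c, a minimal NumPy sketch (the basis vectors and the matrix M below are illustrative values, not from the text):

```python
import numpy as np

# Columns of W are the new basis vectors w1, w2; columns of M hold the
# coefficients m_ij, so the old basis vectors are the columns of V = W M.
W = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # new basis (illustrative values)
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # change of basis matrix (illustrative values)
V = W @ M                    # old basis vectors v1, v2 as columns

c = np.array([1.0, 1.0])     # coordinates in the old basis
d = M @ c                    # coordinates in the new basis: d = M c

# Both coordinate vectors describe the same point:
assert np.allclose(V @ c, W @ d)
```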

• Cramer's Rule for Ax = b.

B_j has b replacing column j of A; x_j = det(B_j) / det(A).
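A minimal NumPy sketch of the rule on a hypothetical 2-by-2 system (illustrative only; for real work, np.linalg.solve is faster and more stable):

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's Rule: x_j = det(B_j) / det(A),
    where B_j is A with column j replaced by b."""
    n = A.shape[0]
    det_A = np.linalg.det(A)
    x = np.empty(n)
    for j in range(n):
        B_j = A.copy()
        B_j[:, j] = b        # replace column j of A by b
        x[j] = np.linalg.det(B_j) / det_A
    return x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)
assert np.allclose(A @ x, b)
```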

• Dimension of vector space

dim(V) = number of vectors in any basis for V.

• Hessenberg matrix H.

Triangular matrix with one extra nonzero adjacent diagonal.

• Hypercube matrix pl.

Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

• Left inverse A^+.

If A has full column rank n, then A^+ = (A^T A)^-1 A^T satisfies A^+ A = I_n.
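A minimal NumPy sketch with a hypothetical 3-by-2 matrix of full column rank:

```python
import numpy as np

# A tall matrix with full column rank n = 2 (illustrative values).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Left inverse: A^+ = (A^T A)^-1 A^T, so that A^+ A = I_n.
A_plus = np.linalg.inv(A.T @ A) @ A.T
assert np.allclose(A_plus @ A, np.eye(2))

# When A has full column rank, this agrees with the pseudoinverse:
assert np.allclose(A_plus, np.linalg.pinv(A))
```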

• Multiplier ℓ_ij.

The pivot row j is multiplied by ℓ_ij and subtracted from row i to eliminate the (i, j) entry: ℓ_ij = (entry to eliminate) / (jth pivot).

• Norm

||A||. The "ℓ^2 norm" of A is the maximum ratio ||Ax|| / ||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. The Frobenius norm satisfies ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
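These four norms can all be computed with np.linalg.norm; a minimal sketch on a hypothetical matrix:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

l2   = np.linalg.norm(A, 2)        # largest singular value sigma_max
fro  = np.linalg.norm(A, 'fro')    # sqrt of the sum of squares of all entries
l1   = np.linalg.norm(A, 1)        # largest column sum of |a_ij|
linf = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij|

assert np.isclose(l2, np.linalg.svd(A, compute_uv=False)[0])
assert np.isclose(fro, np.sqrt((A ** 2).sum()))
assert np.isclose(l1, 6.0)         # |-2| + |4| = 6
assert np.isclose(linf, 7.0)       # |3| + |4| = 7

# The bound ||Ax|| <= ||A|| ||x||:
x = np.array([1.0, 1.0])
assert np.linalg.norm(A @ x) <= l2 * np.linalg.norm(x) + 1e-12
```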

• Normal matrix.

If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.

• Outer product uv^T.

Column times row = a rank-one matrix.
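A minimal NumPy sketch with hypothetical vectors, confirming the rank:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

# Column u times row v^T gives a 3x2 matrix.
A = np.outer(u, v)
assert A.shape == (3, 2)

# Every outer product of nonzero vectors has rank one:
assert np.linalg.matrix_rank(A) == 1
```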

• Polar decomposition A = Q H.

Orthogonal Q times positive (semi)definite H.
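One standard way to compute the polar decomposition is via the SVD (Q = U V^T, H = V Σ V^T); a minimal sketch on a hypothetical matrix:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [3.0, 5.0]])

# From A = U S V^T: Q = U V^T is orthogonal,
# H = V S V^T is symmetric positive semidefinite.
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt
H = Vt.T @ np.diag(s) @ Vt

assert np.allclose(Q @ H, A)                    # A = Q H
assert np.allclose(Q.T @ Q, np.eye(2))          # Q is orthogonal
assert np.allclose(H, H.T)                      # H is symmetric
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)  # H is positive semidefinite
```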

• Schur complement S = D − C A^-1 B.

Appears in block elimination on [A B; C D].
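A minimal NumPy sketch of block elimination with hypothetical 2-by-2 blocks, showing that S = D − C A^-1 B appears in the lower-right block:

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 2.0]])   # illustrative blocks
B = np.array([[1.0, 0.0], [0.0, 1.0]])
C = np.array([[1.0, 1.0], [0.0, 1.0]])
D = np.array([[3.0, 0.0], [0.0, 3.0]])

S = D - C @ np.linalg.inv(A) @ B          # Schur complement

# Eliminate the C block: subtract C A^-1 times the first block row.
M = np.block([[A, B], [C, D]])
E = np.block([[np.eye(2), np.zeros((2, 2))],
              [-C @ np.linalg.inv(A), np.eye(2)]])
reduced = E @ M

assert np.allclose(reduced[2:, :2], 0)    # C block eliminated
assert np.allclose(reduced[2:, 2:], S)    # S in the lower-right block
```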

• Singular Value Decomposition

(SVD) A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
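A minimal NumPy sketch verifying these properties on a hypothetical matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)        # A = U diag(s) V^T

assert np.allclose(U @ np.diag(s) @ Vt, A)

# Columns of U and V are orthonormal:
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(2))

# A v_i = sigma_i u_i for each singular pair:
for i in range(2):
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])
```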

• Standard basis for Rn.

Columns of the n by n identity matrix (written i, j, k in R^3).

• Symmetric factorizations A = LDL^T and A = QΛQ^T.

Signs in Λ = signs in D.

• Symmetric matrix A.

The transpose is A^T = A, and a_ij = a_ji. A^-1 is also symmetric.

• Toeplitz matrix.

Constant down each diagonal = time-invariant (shift-invariant) filter.

• Tridiagonal matrix T: t_ij = 0 if |i − j| > 1.

T^-1 has rank 1 above and below the diagonal.
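The rank-one structure can be seen numerically; a minimal sketch using the standard second-difference tridiagonal matrix (2 on the diagonal, −1 beside it), whose inverse has the known entries min(i, j)(n + 1 − max(i, j)) / (n + 1):

```python
import numpy as np

n = 4
# Tridiagonal second-difference matrix: t_ij = 0 when |i - j| > 1.
T = (np.diag(2.0 * np.ones(n))
     + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1))
Tinv = np.linalg.inv(T)

# Above (and on) the diagonal, T^-1 agrees with the rank-one matrix
# outer(u, v) with u_i = i and v_j = (n + 1 - j) / (n + 1):
u = np.arange(1, n + 1)
v = (n + 1 - np.arange(1, n + 1)) / (n + 1)
R1 = np.outer(u, v)

assert np.allclose(np.triu(Tinv), np.triu(R1))
```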