
Solutions for Chapter 4.4: Discrete Mathematics with Applications 4th Edition

Textbook: Discrete Mathematics with Applications
Edition: 4
Author: Susanna S. Epp
ISBN: 9780495391326

This expansive textbook survival guide covers the following chapters and their solutions. It was created for the textbook Discrete Mathematics with Applications, 4th edition, by Susanna S. Epp, associated with ISBN 9780495391326. Chapter 4.4 includes 53 full step-by-step solutions; all 53 problems in this chapter have been answered, and more than 92404 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
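
    A quick numpy sketch of this solvability test (the matrix and right side below are made-up values, not from the text): Ax = b is solvable exactly when [A b] has the same rank as A.

        import numpy as np

        A = np.array([[1., 2.], [2., 4.]])   # rank-1 matrix (illustrative values)
        b = np.array([[3.], [6.]])           # b chosen to lie in the column space of A

        aug = np.hstack([A, b])              # augmented matrix [A b]
        print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(aug))   # equal ranks, so Ax = b is solvable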

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
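
    A minimal Python/numpy sketch of the method, assuming A is symmetric positive definite; the function name and test values are illustrative, not the textbook's.

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
            """Minimize (1/2) x^T A x - x^T b for symmetric positive definite A."""
            max_iter = max_iter or len(b)
            x = np.zeros(len(b))
            r = b - A @ x                  # residual = negative gradient
            p = r.copy()                   # first search direction
            for _ in range(max_iter):
                Ap = A @ p
                alpha = (r @ r) / (p @ Ap)         # exact line search along p
                x += alpha * p
                r_new = r - alpha * Ap
                if np.linalg.norm(r_new) < tol:
                    break
                beta = (r_new @ r_new) / (r @ r)   # keeps the directions A-conjugate
                p = r_new + beta * p
                r = r_new
            return x

        A = np.array([[4., 1.], [1., 3.]])
        b = np.array([1., 2.])
        print(conjugate_gradient(A, b))    # approx [0.0909, 0.6364], the solution of Ax = b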

  • Covariance matrix Σ.

    When random variables x_i have mean = average value = 0, their covariances σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x - x̄)(x - x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
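
    A small numpy illustration with made-up samples: subtract the means, average the outer products, and check that Σ is positive semidefinite.

        import numpy as np

        X = np.array([[1., 2., 3., 4.],      # samples of x_1 (hypothetical data)
                      [2., 1., 4., 3.]])     # samples of x_2
        m = X.mean(axis=1, keepdims=True)    # means x̄_i
        Sigma = (X - m) @ (X - m).T / X.shape[1]   # Σ = mean of (x - x̄)(x - x̄)^T
        print(Sigma)
        print(np.all(np.linalg.eigvalsh(Sigma) >= -1e-12))   # True: positive (semi)definite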

  • Cross product u × v in R^3.

    Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
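
    A numpy check of these properties for two made-up vectors:

        import numpy as np

        u = np.array([1., 0., 0.])
        v = np.array([1., 1., 0.])
        w = np.cross(u, v)                   # [0. 0. 1.], perpendicular to u and v
        print(np.dot(w, u), np.dot(w, v))    # 0.0 0.0
        print(np.linalg.norm(w))             # 1.0 = ||u|| ||v|| |sin θ| = area of the parallelogram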

  • Fast Fourier Transform (FFT).

    A factorization of the Fourier matrix F_n into ℓ = log2 n matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^{-1} c can be computed with nℓ/2 multiplications. Revolutionary.
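
    A hedged numpy check of the operation-count claim: np.fft.fft returns the same transform as an explicit n-by-n matrix multiply, but in O(n log n) operations (note that numpy's FFT uses the e^{-2πijk/n} sign convention).

        import numpy as np

        n = 8
        j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        F_minus = np.exp(-2j * np.pi * j * k / n)    # dense DFT matrix, entries e^{-2πijk/n}

        x = np.random.rand(n)
        print(np.allclose(F_minus @ x, np.fft.fft(x)))   # True: same result, far fewer multiplications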

  • Fourier matrix F.

    Entries F_jk = e^{2πijk/n} give orthogonal columns: F̄^T F = nI (conjugate transpose). Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ_k c_k e^{2πijk/n}.
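
    A numpy sketch of F as defined here, with entries e^{+2πijk/n}; under this sign convention Fc equals n times numpy's inverse FFT (an assumption worth noting, since libraries differ on the sign).

        import numpy as np

        n = 4
        j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        F = np.exp(2j * np.pi * j * k / n)               # F_jk = e^{2πijk/n}

        print(np.allclose(F.conj().T @ F, n * np.eye(n)))   # orthogonal columns: F̄^T F = n I
        c = np.array([1., 2., 3., 4.])
        print(np.allclose(F @ c, n * np.fft.ifft(c)))       # y = Fc matches n · ifft(c)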

  • Graph G.

    Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.

  • Jordan form J = M^{-1} A M.

    If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

  • Multiplicities AM and G M.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
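
    A short sympy sketch on an illustrative matrix where the two multiplicities differ: λ = 2 is a double root (AM = 2) but has only one independent eigenvector (GM = 1).

        from sympy import Matrix

        A = Matrix([[2, 1],
                    [0, 2]])
        for eigval, alg_mult, eigvecs in A.eigenvects():
            print(eigval, "AM =", alg_mult, "GM =", len(eigvecs))   # 2  AM = 2  GM = 1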

  • Norm ||A||.

    The "ℓ2 norm" of A is the maximum ratio ||Ax|| / ||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ1 and ℓ∞ norms are the largest column and row sums of |a_ij|.
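
    A numpy sketch of the four norms named here, for an arbitrary example matrix:

        import numpy as np

        A = np.array([[1., -2.],
                      [3.,  4.]])
        print(np.linalg.norm(A, 2))        # ℓ2 norm = σ_max, the largest singular value
        print(np.linalg.norm(A, 'fro'))    # Frobenius norm = sqrt(Σ Σ a_ij^2)
        print(np.linalg.norm(A, 1))        # ℓ1 norm = largest column sum of |a_ij| (= 6 here)
        print(np.linalg.norm(A, np.inf))   # ℓ∞ norm = largest row sum of |a_ij| (= 7 here)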

  • Positive definite matrix A.

    Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
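
    A quick numpy check on a small symmetric example: positive eigenvalues, a Cholesky factorization exists, and x^T A x > 0 for a sample nonzero x.

        import numpy as np

        A = np.array([[ 2., -1.],
                      [-1.,  2.]])
        print(np.linalg.eigvalsh(A))          # [1. 3.] -- all positive
        L = np.linalg.cholesky(A)             # succeeds only when A is positive definite
        print(np.allclose(L @ L.T, A))        # True
        x = np.array([0.3, -0.7])
        print(x @ A @ x > 0)                  # True for this (and every) nonzero x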

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S, error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S then P = A (A^T A)^{-1} A^T.
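
    A numpy sketch of the formula, with a made-up basis for S and vector b:

        import numpy as np

        A = np.array([[1., 0.],
                      [1., 1.],
                      [1., 2.]])                        # columns form a basis for S
        P = A @ np.linalg.inv(A.T @ A) @ A.T            # P = A (A^T A)^{-1} A^T
        b = np.array([1., 0., 4.])
        p = P @ b                                       # closest point to b in S
        print(np.allclose(P @ P, P), np.allclose(P, P.T))   # P^2 = P = P^T
        print(np.allclose(A.T @ (b - p), 0))                # error b - p is perpendicular to S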

  • Row picture of Ax = b.

    Each equation gives a plane in R^n; the planes intersect at x.

  • Schur complement S = D - C A^{-1} B.

    Appears in block elimination on the block matrix [A B; C D].
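
    A small numpy illustration with made-up 1-by-1 blocks: block elimination of [A B; C D] leaves S = D - C A^{-1} B in the lower-right corner.

        import numpy as np

        A = np.array([[2.]]); B = np.array([[1.]])
        C = np.array([[4.]]); D = np.array([[3.]])

        S = D - C @ np.linalg.inv(A) @ B                 # Schur complement: 3 - 4·(1/2)·1 = 1
        M = np.block([[A, B], [C, D]])
        E = np.block([[np.eye(1), np.zeros((1, 1))],
                      [-C @ np.linalg.inv(A), np.eye(1)]])   # block elimination step
        print(E @ M)                                     # [[2. 1.] [0. 1.]] -- lower-right entry is S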

  • Singular Value Decomposition (SVD).

    A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
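
    A numpy sketch with an arbitrary example matrix, checking the factorization and the relation A v_i = σ_i u_i:

        import numpy as np

        A = np.array([[3., 0.],
                      [4., 5.]])
        U, s, Vt = np.linalg.svd(A)                      # A = U diag(s) V^T
        print(s)                                         # singular values σ_1 ≥ σ_2 > 0
        print(np.allclose(U @ np.diag(s) @ Vt, A))       # True
        print(np.allclose(A @ Vt[0], s[0] * U[:, 0]))    # A v_1 = σ_1 u_1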

  • Spanning set.

    Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

  • Special solutions to As = 0.

    One free variable is s_i = 1, other free variables = 0.

  • Standard basis for R^n.

    Columns of the n by n identity matrix (written i, j, k in R^3).

  • Stiffness matrix

    If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has spring constants from Hooke's Law and Ax = stretching.

  • Triangle inequality ||u + v|| ≤ ||u|| + ||v||.

    For matrix norms, ||A + B|| ≤ ||A|| + ||B||.