Solutions for Chapter 13.5: Discrete Mathematics and Its Applications 7th Edition

Full solutions for Discrete Mathematics and Its Applications | 7th Edition

ISBN: 9780073383095

Solutions for Chapter 13.5

Textbook: Discrete Mathematics and Its Applications
Edition: 7
Author: Kenneth Rosen
ISBN: 9780073383095

This textbook survival guide was created for the textbook Discrete Mathematics and Its Applications, 7th edition. Discrete Mathematics and Its Applications was written by Kenneth Rosen and is associated with the ISBN 9780073383095. Chapter 13.5 includes 32 full step-by-step solutions, and more than 185,770 students have viewed full step-by-step solutions from this chapter. This expansive textbook survival guide covers the following chapters and their solutions.

Key Math Terms and definitions covered in this textbook
  • Cramer's Rule for Ax = b.

    Bj has b replacing column j of A; xj = det(Bj) / det(A).
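
    For example, a minimal NumPy sketch of this ratio of determinants (the 3×3 system below is made up for illustration, not taken from the textbook):

        import numpy as np

        # Hypothetical example system Ax = b
        A = np.array([[2.0, 1.0, 1.0],
                      [1.0, 3.0, 2.0],
                      [1.0, 0.0, 0.0]])
        b = np.array([4.0, 5.0, 6.0])

        # Cramer's Rule: xj = det(Bj) / det(A), where Bj has b replacing column j of A
        det_A = np.linalg.det(A)
        x = np.empty(3)
        for j in range(3):
            Bj = A.copy()
            Bj[:, j] = b                  # replace column j with b
            x[j] = np.linalg.det(Bj) / det_A

        print(np.allclose(x, np.linalg.solve(A, b)))  # True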

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
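
    A quick NumPy check of the factorization S^-1 A S = Λ (the 2×2 matrix below is an arbitrary example with two distinct eigenvalues):

        import numpy as np

        A = np.array([[4.0, 1.0],
                      [2.0, 3.0]])        # two distinct eigenvalues, so A is diagonalizable

        eigvals, S = np.linalg.eig(A)     # columns of S are eigenvectors
        Lambda = np.diag(eigvals)         # eigenvalue matrix

        # S^-1 A S reproduces the diagonal eigenvalue matrix
        print(np.allclose(np.linalg.inv(S) @ A @ S, Lambda))  # True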

  • Full row rank r = m.

    Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
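
    A small NumPy illustration of full row rank r = m and the resulting solvability of Ax = b (the 2×3 matrix and right-hand side are arbitrary examples):

        import numpy as np

        A = np.array([[1.0, 2.0, 0.0],
                      [0.0, 1.0, 3.0]])               # 2 x 3 with independent rows
        print(np.linalg.matrix_rank(A) == A.shape[0]) # True: full row rank r = m

        # With full row rank, Ax = b has at least one solution for every b in R^m
        b = np.array([5.0, 7.0])
        x, *_ = np.linalg.lstsq(A, b, rcond=None)     # least squares picks one solution
        print(np.allclose(A @ x, b))                  # True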

  • Identity matrix I (or In).

    Diagonal entries = 1, off-diagonal entries = 0.

  • Kronecker product (tensor product) A ⊗ B.

    Blocks aij B, eigenvalues λp(A) λq(B).
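
    A brief NumPy sketch of the block structure and the eigenvalue products (the matrices are arbitrary examples):

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [0.0, 3.0]])
        B = np.array([[4.0, 0.0],
                      [1.0, 5.0]])

        K = np.kron(A, B)                 # blocks aij * B

        # Eigenvalues of A (x) B are all products lambda_p(A) * lambda_q(B)
        prods = np.outer(np.linalg.eigvals(A), np.linalg.eigvals(B)).ravel()
        print(np.allclose(sorted(np.linalg.eigvals(K).real), sorted(prods.real)))  # True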

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.
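
    A tiny NumPy check that matrix multiplication satisfies the linearity rule (the matrix, vectors, and scalars are arbitrary examples):

        import numpy as np

        A = np.array([[1.0, -2.0],
                      [3.0,  0.5]])
        T = lambda x: A @ x               # the transformation T(v) = Av

        v, w = np.array([1.0, 2.0]), np.array([-3.0, 4.0])
        c, d = 2.5, -1.0

        # Linearity: T(cv + dw) == c T(v) + d T(w)
        print(np.allclose(T(c*v + d*w), c*T(v) + d*T(w)))  # True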

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A) · (column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k of A)(row k of B). All these equivalent definitions come from the rule that AB times x equals A times Bx.
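
    A short NumPy sketch checking two of the equivalent views, the entry formula and the columns-times-rows sum (random matrices used purely for illustration):

        import numpy as np

        A = np.random.default_rng(0).standard_normal((3, 4))
        B = np.random.default_rng(1).standard_normal((4, 2))

        AB = A @ B

        # Entry formula: (AB)[i, j] = sum over k of A[i, k] * B[k, j]
        print(np.isclose(AB[1, 0], sum(A[1, k] * B[k, 0] for k in range(4))))  # True

        # Columns times rows: AB = sum over k of (column k of A)(row k of B)
        outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(4))
        print(np.allclose(AB, outer_sum))                                      # True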

  • Multiplier ℓij.

    The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
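
    A minimal NumPy sketch of one elimination step using this multiplier (the 3×3 matrix is a made-up example):

        import numpy as np

        A = np.array([[2.0, 1.0, 1.0],
                      [4.0, 3.0, 3.0],
                      [8.0, 7.0, 9.0]])

        # Eliminate the (2, 1) entry: row i = 1, column j = 0 in zero-based indexing
        i, j = 1, 0
        l_ij = A[i, j] / A[j, j]          # multiplier = entry to eliminate / jth pivot
        A[i, :] -= l_ij * A[j, :]         # subtract l_ij times pivot row j from row i

        print(A[i, j])                    # 0.0 -- the entry is eliminated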

  • Norm ||A||.

    The ℓ2 norm of A is the maximum ratio ||Ax|| / ||x|| = σmax. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. The Frobenius norm satisfies ||A||F^2 = Σ Σ aij^2. The ℓ1 and ℓ∞ norms are the largest column sum and row sum of |aij|.
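
    A short NumPy check of these norm identities (the matrix is an arbitrary example):

        import numpy as np

        A = np.array([[1.0, -2.0],
                      [3.0,  4.0]])

        # l2 norm equals the largest singular value sigma_max
        print(np.isclose(np.linalg.norm(A, 2), np.linalg.svd(A, compute_uv=False)[0]))  # True

        # Frobenius norm squared = sum of squares of all entries
        print(np.isclose(np.linalg.norm(A, 'fro')**2, np.sum(A**2)))                    # True

        # l1 and l-infinity norms: largest column sum and largest row sum of |aij|
        print(np.isclose(np.linalg.norm(A, 1), np.abs(A).sum(axis=0).max()))            # True
        print(np.isclose(np.linalg.norm(A, np.inf), np.abs(A).sum(axis=1).max()))       # True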

  • Pivot.

    The diagonal entry (first nonzero) at the time when a row is used in elimination.

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A (A^T A)^-1 A^T.
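
    A minimal NumPy sketch verifying the projection formula and its properties (the full-column-rank A and the vector b are arbitrary examples):

        import numpy as np

        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [1.0, 2.0]])                   # columns span a plane S in R^3
        b = np.array([1.0, 3.0, 2.0])

        P = A @ np.linalg.inv(A.T @ A) @ A.T         # P = A (A^T A)^-1 A^T

        p = P @ b                                    # closest point to b in S
        e = b - p                                    # error is perpendicular to S
        print(np.allclose(A.T @ e, 0))               # True
        print(np.allclose(P @ P, P), np.allclose(P, P.T))  # True True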

  • Pseudoinverse A+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(A^T). A+ A and A A+ are the projection matrices onto the row space and column space. rank(A+) = rank(A).
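
    A brief NumPy check of these properties (the rank-deficient matrix below is a made-up example):

        import numpy as np

        A = np.array([[1.0, 2.0, 3.0],
                      [2.0, 4.0, 6.0],
                      [1.0, 0.0, 1.0]])              # rank 2, so A has no ordinary inverse

        A_plus = np.linalg.pinv(A)                   # Moore-Penrose pseudoinverse

        print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))  # True

        # A+ A projects onto the row space, A A+ projects onto the column space
        P_row, P_col = A_plus @ A, A @ A_plus
        print(np.allclose(P_row @ P_row, P_row), np.allclose(P_col @ P_col, P_col))  # True True
        print(np.allclose(P_row @ A.T, A.T))         # rows of A are unchanged by P_row: True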

  • Schur complement S = D − C A^-1 B.

    Appears in block elimination on the block matrix [A B; C D].
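
    A small NumPy sketch of block elimination producing the Schur complement (the 2×2 blocks are arbitrary examples):

        import numpy as np

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        B = np.array([[1.0, 0.0], [2.0, 1.0]])
        C = np.array([[0.0, 1.0], [1.0, 1.0]])
        D = np.array([[5.0, 2.0], [2.0, 6.0]])

        M = np.block([[A, B], [C, D]])

        # Eliminate the C block: subtract C A^-1 times the top block row from the bottom
        E = np.block([[np.eye(2), np.zeros((2, 2))],
                      [-C @ np.linalg.inv(A), np.eye(2)]])
        S = D - C @ np.linalg.inv(A) @ B             # Schur complement

        print(np.allclose((E @ M)[2:, 2:], S))       # True: bottom-right block is S
        print(np.allclose((E @ M)[2:, :2], 0))       # True: the C block is eliminated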

  • Schwarz inequality

    |v · w| ≤ ||v|| ||w||. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
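
    A quick NumPy check of both inequalities (random vectors, and a positive definite A built as M^T M + I purely for illustration):

        import numpy as np

        rng = np.random.default_rng(42)
        v, w = rng.standard_normal(3), rng.standard_normal(3)

        # Schwarz inequality: |v . w| <= ||v|| ||w||
        print(abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w))  # True

        # Generalized form for a positive definite A
        M = rng.standard_normal((3, 3))
        A = M.T @ M + np.eye(3)
        print((v @ A @ w)**2 <= (v @ A @ v) * (w @ A @ w))          # True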

  • Singular matrix A.

    A square matrix that has no inverse: det(A) = 0.

  • Subspace S of V.

    Any vector space inside V, including V and Z = {zero vector only}.

  • Toeplitz matrix.

    Constant down each diagonal = time-invariant (shift-invariant) filter.
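
    A tiny NumPy illustration of the constant-diagonal structure (the first column and first row are arbitrary examples):

        import numpy as np

        c = [1.0, 2.0, 3.0, 4.0]          # first column
        r = [1.0, 5.0, 6.0, 7.0]          # first row (r[0] must match c[0])
        T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(4)]
                      for i in range(4)])

        # Every diagonal is constant: T[i, j] depends only on i - j
        print(all(len(set(np.diag(T, k))) == 1 for k in range(-3, 4)))  # True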

  • Trace of A

    = sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
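
    A short NumPy check of both trace identities (the matrices are arbitrary examples):

        import numpy as np

        A = np.array([[1.0, 2.0], [3.0, 4.0]])
        B = np.array([[0.0, 1.0], [5.0, 2.0]])

        # Trace = sum of diagonal entries = sum of eigenvalues
        print(np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A))))  # True

        # Tr AB = Tr BA
        print(np.isclose(np.trace(A @ B), np.trace(B @ A)))           # True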

  • Tridiagonal matrix T: tij = 0 if |i − j| > 1.

    T^-1 has rank 1 above and below the diagonal.
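
    A NumPy sketch of the rank-1 structure above the diagonal of T^-1, using the classic -1, 2, -1 second-difference matrix as an example (by symmetry of this T, the same holds below the diagonal):

        import numpy as np

        n = 5
        T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        Tinv = np.linalg.inv(T)

        # On and above the diagonal, T^-1 agrees with a rank-1 matrix u v^T.
        # Build that rank-1 candidate from the last column and first row of T^-1.
        R = np.outer(Tinv[:, -1], Tinv[0, :]) / Tinv[0, -1]
        print(np.allclose(np.triu(R), np.triu(Tinv)))  # True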

  • Vector space V.

    Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.
