
Solutions for Chapter 3.3: Analysis of Algorithms

Textbook: Mathematical Structures for Computer Science
Edition: 7
Author: Judith L. Gersting
ISBN: 9781429215107

Chapter 3.3: Analysis of Algorithms includes 40 full step-by-step solutions. Since all 40 problems in Chapter 3.3 have been answered, more than 4393 students have viewed full step-by-step solutions from this chapter. This expansive textbook survival guide covers the chapters listed here and their solutions. It was written by Patricia, is associated with ISBN 9781429215107, and was created for the textbook Mathematical Structures for Computer Science, 7th edition, by Judith L. Gersting.

Key Math Terms and definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
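
    As a quick illustration, a minimal NumPy sketch using a made-up 3-node directed graph:

```python
import numpy as np

# Made-up directed graph on nodes 0, 1, 2 with edges 0->1, 1->2, 2->0
edges = [(0, 1), (1, 2), (2, 0)]
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1                    # a_ij = 1 when there is an edge from node i to node j

print(A)
# For an undirected graph we would also set A[j, i] = 1, giving A == A.T
print(np.array_equal(A, A.T))      # False here: these edges only go one way
```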

  • Associative Law (AB)C = A(BC).

    Parentheses can be removed to leave ABC.

  • Cayley-Hamilton Theorem.

    p(λ) = det(A - λI) has p(A) = zero matrix.
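
    A numerical spot-check of the theorem in NumPy, on an arbitrary 2×2 example (np.poly returns the characteristic-polynomial coefficients, up to an overall sign):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                       # arbitrary example matrix

coeffs = np.poly(A)                              # coefficients of det(lambda*I - A)
# Evaluate p(A) as a matrix polynomial: c0*A^n + c1*A^(n-1) + ... + cn*I
p_of_A = sum(c * np.linalg.matrix_power(A, len(coeffs) - 1 - k)
             for k, c in enumerate(coeffs))

print(np.allclose(p_of_A, np.zeros_like(A)))     # True: p(A) is the zero matrix
```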

  • Change of basis matrix M.

    The old basis vectors vj are combinations Σ mij wi of the new basis vectors. The coordinates of c1 v1 + ... + cn vn = d1 w1 + ... + dn wn are related by d = Mc. (For n = 2, set v1 = m11 w1 + m21 w2, v2 = m12 w1 + m22 w2.)
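
    A small NumPy sketch of the relation d = Mc, using a made-up new basis and a made-up M:

```python
import numpy as np

# Made-up new basis w1, w2 (columns of W) and change-of-basis matrix M
W = np.array([[1.0, 1.0],
              [0.0, 2.0]])                 # columns are w1, w2
M = np.array([[2.0, 1.0],
              [1.0, 1.0]])                 # v_j = sum_i m_ij * w_i, so V = W @ M
V = W @ M                                  # columns are the old basis v1, v2

c = np.array([3.0, -1.0])                  # coordinates in the old basis
x = V @ c                                  # the actual vector c1*v1 + c2*v2

d = M @ c                                  # claimed coordinates in the new basis
print(np.allclose(x, W @ d))               # True: d = Mc describes the same vector
```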

  • Column space C(A) =

    space of all combinations of the columns of A.

  • Condition number

    cond(A) = c(A) = ||A|| ||A^(-1)|| = σmax/σmin. In Ax = b, the relative change ||δx|| / ||x|| is less than cond(A) times the relative change ||δb|| / ||b||. Condition numbers measure the sensitivity of the output to change in the input.
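
    The same quantity computed three equivalent ways in NumPy, on a made-up, nearly singular matrix:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])              # nearly singular, so badly conditioned

sigma = np.linalg.svd(A, compute_uv=False)
print(sigma[0] / sigma[-1])                # sigma_max / sigma_min
print(np.linalg.cond(A))                   # NumPy's cond() gives the same value
print(np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2))   # ||A|| ||A^(-1)||
```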

  • Hessenberg matrix H.

    Triangular matrix with one extra nonzero adjacent diagonal.

  • Kronecker product (tensor product) A ® B.

    Blocks aij B, eigenvalues λp(A)λq(B).
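
    A short NumPy check of the eigenvalue statement, with two made-up 2×2 matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = np.array([[4.0, 0.0],
              [1.0, 5.0]])                 # made-up example matrices

K = np.kron(A, B)                          # 4x4 matrix built from blocks a_ij * B

eigK = np.sort(np.linalg.eigvals(K))
products = np.sort([la * lb for la in np.linalg.eigvals(A)
                            for lb in np.linalg.eigvals(B)])
print(np.allclose(eigK, products))         # True: eigenvalues are lambda_p(A) * lambda_q(B)
```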

  • Length ||x||.

    Square root of x^T x (Pythagoras in n dimensions).

  • Multiplier ℓij.

    The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
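
    One elimination step spelled out in NumPy on a made-up 2×2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 4.0]])                 # made-up 2x2 system matrix

# Eliminate the (1, 0) entry using pivot row 0:
l10 = A[1, 0] / A[0, 0]                    # multiplier = (entry to eliminate) / (pivot)
A[1, :] -= l10 * A[0, :]                   # subtract l10 times the pivot row from row 1

print(l10)                                 # 3.0
print(A)                                   # [[2. 1.] [0. 1.]] -- now upper triangular
```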

  • Network.

    A directed graph that has constants c1, ..., cm associated with the edges.

  • Norm ||A||.

    The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σmax. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. The Frobenius norm is ||A||F^2 = Σ Σ aij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |aij|.
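
    The four norms mentioned above, evaluated in NumPy on a made-up matrix, plus one submultiplicativity check:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])                 # made-up example matrix

print(np.linalg.norm(A, 2))                # l2 (spectral) norm = sigma_max
print(np.linalg.norm(A, 'fro'))            # Frobenius norm = sqrt(sum of a_ij^2)
print(np.linalg.norm(A, 1))                # l1 norm = largest column sum of |a_ij|
print(np.linalg.norm(A, np.inf))           # l-infinity norm = largest row sum of |a_ij|

# Check ||AB|| <= ||A|| ||B|| for the spectral norm:
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.norm(A @ B, 2) <= np.linalg.norm(A, 2) * np.linalg.norm(B, 2))
```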

  • Normal equation A^T A x = A^T b.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax) = 0.
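
    A small least-squares sketch in NumPy (the data are made up); it solves the normal equation directly and compares with np.linalg.lstsq:

```python
import numpy as np

# Made-up overdetermined system: fit a line to three points
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                 # full column rank (independent columns)
b = np.array([1.0, 2.0, 2.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)  # solve A^T A x = A^T b
print(x_hat)

# Same answer from NumPy's least-squares routine, and the residual is
# orthogonal to the columns of A: (columns of A) . (b - A x_hat) = 0
print(np.linalg.lstsq(A, b, rcond=None)[0])
print(np.allclose(A.T @ (b - A @ x_hat), 0.0))
```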

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S, error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A are a basis for S then P = A (A^T A)^(-1) A^T.
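
    A NumPy sketch of these properties, with a made-up basis for a plane S in R^3:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])                       # columns form a basis for a plane S in R^3
P = A @ np.linalg.inv(A.T @ A) @ A.T             # P = A (A^T A)^(-1) A^T

b = np.array([1.0, 2.0, 3.0])                    # arbitrary vector to project
p = P @ b                                        # closest point to b in S
e = b - p                                        # error, perpendicular to S

print(np.allclose(P @ P, P), np.allclose(P, P.T))   # P^2 = P = P^T
print(np.allclose(A.T @ e, 0.0))                     # e is orthogonal to the columns of A
print(np.round(np.linalg.eigvals(P), 6))             # eigenvalues are 1, 1, 0 (in some order)
```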

  • Pseudoinverse A+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. Rank(A^+) = rank(A).
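
    A short NumPy illustration using np.linalg.pinv on a made-up rank-1 matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])                 # rank-1 example, so A has no ordinary inverse

A_plus = np.linalg.pinv(A)                 # n x m Moore-Penrose pseudoinverse

print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))   # True
print(np.allclose(A @ A_plus @ A, A))      # defining property: A A^+ A = A
# A^+ A and A A^+ are projections (idempotent) onto row space and column space:
print(np.allclose((A_plus @ A) @ (A_plus @ A), A_plus @ A))
print(np.allclose((A @ A_plus) @ (A @ A_plus), A @ A_plus))
```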

  • Schwarz inequality

    |v·w| ≤ ||v|| ||w||. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
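
    A quick numerical check of both inequalities in NumPy, with randomly generated vectors and a positive definite matrix built as R^T R + I:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(4)
w = rng.standard_normal(4)                 # arbitrary vectors

print(abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w))       # always True

R = rng.standard_normal((4, 4))
A = R.T @ R + np.eye(4)                    # a positive definite matrix
print((v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w))             # generalized form
```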

  • Semidefinite matrix A.

    (Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.

  • Similar matrices A and B.

    Every B = M^(-1) A M has the same eigenvalues as A.
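
    A NumPy check that a similarity transform preserves eigenvalues, with a made-up A and invertible M:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])                 # any invertible M (made-up example)

B = np.linalg.inv(M) @ A @ M               # B = M^(-1) A M is similar to A

print(np.sort(np.linalg.eigvals(A)))       # [2. 3.]
print(np.sort(np.linalg.eigvals(B)))       # same eigenvalues
```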

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Subspace S of V.

    Any vector space inside V, including V and Z = {zero vector only}.
