Solutions for Chapter 10.2: Probability Outcomes and Trials


Full solutions for Discovering Algebra: An Investigative Approach | 2nd Edition

ISBN: 9781559537636

Authors: Jerald Murdock, Ellen Kamischke, Eric Kamischke


This textbook survival guide was created for Discovering Algebra: An Investigative Approach, 2nd edition, written by Jerald Murdock, Ellen Kamischke, and Eric Kamischke and associated with ISBN 9781559537636. The guide covers the textbook's chapters and their solutions. Chapter 10.2: Probability Outcomes and Trials includes 13 full step-by-step solutions; since all 13 problems in this chapter have been answered, more than 8,536 students have viewed its full step-by-step solutions.

Key Math Terms and definitions covered in this textbook
  • Cayley-Hamilton Theorem.

    p(λ) = det(A - λI) has p(A) = zero matrix. (A quick numerical check appears after this list.)

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Ellipse (or ellipsoid) x^T Ax = 1.

    A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^-1 y||^2 = y^T (A A^T)^-1 y = 1 displayed by eigshow; axis lengths σ_i.)

  • Fibonacci numbers

    0, 1, 1, 2, 3, 5, ... satisfy F_n = F_(n-1) + F_(n-2) = (λ_1^n - λ_2^n)/(λ_1 - λ_2). Growth rate λ_1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]]. (See the sketch after this list.)

  • Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).

    Use A^H (the conjugate transpose) for complex A.

  • Hermitian matrix A^H = Ā^T = A.

    Complex analog a_ji = ā_ij of a symmetric matrix.

  • Independent vectors v_1, ..., v_k.

    No combination c_1 v_1 + ... + c_k v_k = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx. (These views are checked in a sketch after this list.)

  • Norm ||A||.

    The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = ΣΣ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|. (See the sketch after this list.)

  • Nullspace matrix N.

    The columns of N are the n - r special solutions to As = 0.

  • Outer product uv^T

    uv^T = column times row = rank one matrix.

  • Particular solution x_p.

    Any solution to Ax = b; often x_p has free variables = 0.

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S then P = A (A^T A)^-1 A^T. (A sketch after this list checks these properties.)

  • Projection p = a(a^T b / a^T a) onto the line through a.

    P = a a^T / a^T a has rank 1.

  • Semidefinite matrix A.

    (Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.

  • Similar matrices A and B.

    Every B = M^-1 A M has the same eigenvalues as A.

  • Singular Value Decomposition

    (SVD) A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i for singular values σ_i > 0. The last columns are orthonormal bases of the nullspaces. (See the sketch after this list.)

  • Skew-symmetric matrix K.

    The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Vector addition.

    v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of parallelogram.
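
Worked sketches for a few of the terms above

The short Python sketches that follow are not part of the textbook; they are illustrative numerical checks of some of the definitions listed above, using NumPy and small made-up matrices.

For the Cayley-Hamilton theorem, a minimal check, on an assumed 2x2 example, that the characteristic polynomial evaluated at A gives the zero matrix:

```python
# Numerical check of Cayley-Hamilton on a 2x2 matrix (illustrative matrix only).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# For a 2x2 matrix, p(lambda) = det(A - lambda*I) = lambda^2 - trace(A)*lambda + det(A).
trace, det = np.trace(A), np.linalg.det(A)

# Evaluate the polynomial at the matrix itself: p(A) = A^2 - trace(A)*A + det(A)*I.
p_of_A = A @ A - trace * A + det * np.eye(2)

print(p_of_A)                    # numerically the zero matrix
assert np.allclose(p_of_A, 0)
```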
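
For the Fibonacci numbers, a sketch (again only illustrative) checking the closed form against a direct loop, and the growth rate (1 + √5)/2 as the largest eigenvalue of [[1, 1], [1, 0]]:

```python
# Fibonacci growth rate and closed form, checked numerically.
import numpy as np

F = np.array([[1.0, 1.0],
              [1.0, 0.0]])                     # Fibonacci matrix
lam2, lam1 = np.linalg.eigvalsh(F)             # eigenvalues, ascending order
print(lam1, (1 + np.sqrt(5)) / 2)              # both about 1.618

def fib(n):
    """F_n by direct iteration, starting from F_0 = 0, F_1 = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

n = 10
closed = (lam1**n - lam2**n) / (lam1 - lam2)   # closed form for F_n
print(fib(n), round(closed))                   # both 55
```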
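
For matrix multiplication AB, a sketch confirming the equivalent entry, column, row, and columns-times-rows descriptions on small random matrices (shapes chosen only for illustration):

```python
# Four equivalent views of the product AB, checked on random matrices.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))
AB = A @ B

# Entry view: (AB)_ij = sum over k of a_ik * b_kj.
assert np.isclose(AB[1, 0], sum(A[1, k] * B[k, 0] for k in range(4)))

# Column view: column j of AB = A times column j of B.
assert np.allclose(AB[:, 1], A @ B[:, 1])

# Row view: row i of AB = (row i of A) times B.
assert np.allclose(AB[1, :], A[1, :] @ B)

# Columns times rows: AB = sum over k of (column k of A)(row k of B).
assert np.allclose(AB, sum(np.outer(A[:, k], B[k, :]) for k in range(4)))
```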
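
For the norm entry, a sketch checking that the ℓ^2 norm equals σ_max and that the Frobenius, ℓ^1, and ℓ^∞ norms match the sums described above; NumPy's norm conventions are assumed here:

```python
# Matrix norms: l2 = largest singular value, Frobenius, l1 and l-infinity.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 2))

sigma_max = np.linalg.svd(A, compute_uv=False)[0]          # largest singular value
assert np.isclose(np.linalg.norm(A, 2), sigma_max)         # l2 norm = sigma_max

assert np.isclose(np.linalg.norm(A, 'fro'),
                  np.sqrt((A**2).sum()))                   # Frobenius: sqrt(sum of a_ij^2)
assert np.isclose(np.linalg.norm(A, 1),
                  np.abs(A).sum(axis=0).max())             # l1: largest column sum
assert np.isclose(np.linalg.norm(A, np.inf),
                  np.abs(A).sum(axis=1).max())             # l-infinity: largest row sum

# The product inequality ||AB|| <= ||A|| ||B|| in the l2 norm.
assert np.linalg.norm(A @ B, 2) <= np.linalg.norm(A, 2) * np.linalg.norm(B, 2) + 1e-12
```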
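
For the projection matrix, a sketch building P = A (A^T A)^-1 A^T for an illustrative A and b and checking the properties quoted above:

```python
# Projection onto the column space of A: P = A (A^T A)^{-1} A^T.
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                 # columns are a basis for a plane S in R^3
b = np.array([6.0, 0.0, 0.0])

P = A @ np.linalg.inv(A.T @ A) @ A.T

p = P @ b                                  # closest point to b in S
e = b - p                                  # error vector

assert np.allclose(P @ P, P)               # P^2 = P
assert np.allclose(P, P.T)                 # P = P^T
assert np.allclose(A.T @ e, 0)             # e is perpendicular to S (to every column of A)
print(np.round(np.linalg.eigvalsh(P), 6))  # eigenvalues are 0 and 1
```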
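
For the singular value decomposition, a sketch checking A = UΣV^T, A v_i = σ_i u_i, and the orthonormality of the columns of U and V, on a random matrix used only for illustration:

```python
# SVD: A = U Sigma V^T with A v_i = sigma_i u_i.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=True)
V = Vt.T

# Rebuild A from the factors.
Sigma = np.zeros_like(A)
np.fill_diagonal(Sigma, s)
assert np.allclose(A, U @ Sigma @ Vt)

# A v_i = sigma_i u_i for each singular value.
for i, sigma in enumerate(s):
    assert np.allclose(A @ V[:, i], sigma * U[:, i])

# Columns of U and V are orthonormal.
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(V.T @ V, np.eye(3))
```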
