 10.2.1: For each trial, list the possible outcomes. a. tossing a coin b. ro...
 10.2.2: The table below shows the distribution by fragrance of candles in a...
 10.2.3: One hundred tiny cubes were dropped onto a circle like the one at r...
 10.2.4: Igbaita (pitch and toss) is a favorite recreational game in Africa...
 10.2.5: Draw and label a segment like this one. Plot and label points on yo...
 10.2.6: Suppose that 350 beans are randomly dropped on the rectangle shown ...
 10.2.7: Dr. Lynn Rogers of the North American Bear Center does research on ...
 10.2.8: Twenty randomly chosen high school students were asked to estimate ...
 10.2.9: In the Wheel of Wealth game, contestants spin a large wheel like th...
10.2.10: The Candy Coated Carob Company produces six different-colored candi...
 10.2.11: If you flip a paper cup into the air, what are the possible outcome...
 10.2.12: Give four pairs of coordinates that would create a shape like this ...
 10.2.13: APPLICATION The star hitter on the baseball team at City Community ...
Solutions for Chapter 10.2: Probability Outcomes and Trials
Full solutions for Discovering Algebra: An Investigative Approach, 2nd Edition
ISBN: 9781559537636
This textbook survival guide was created for the textbook Discovering Algebra: An Investigative Approach, 2nd edition, and covers the following chapters and their solutions. Discovering Algebra: An Investigative Approach is associated with ISBN 9781559537636. Since the 13 problems in Chapter 10.2: Probability Outcomes and Trials have been answered, more than 8536 students have viewed full step-by-step solutions from this chapter. Chapter 10.2 includes 13 full step-by-step solutions.

Cayley-Hamilton Theorem.
p(λ) = det(A - λI) has p(A) = zero matrix.
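A quick numerical check of this identity (an illustrative sketch, not from the textbook; the 2×2 matrix is made up) in Python with NumPy:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])                    # arbitrary example matrix
    # For a 2x2 matrix, p(λ) = det(A - λI) = λ^2 - trace(A)·λ + det(A)
    p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
    print(p_of_A)                                 # ≈ zero matrix, as the theorem predicts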

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Ellipse (or ellipsoid) x^T A x = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^-1 y||^2 = y^T (A A^T)^-1 y = 1 displayed by eigshow; the axis lengths are the singular values σ_i.)
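A small NumPy sketch (with a made-up positive definite matrix) showing the semi-axis lengths 1/√λ and their directions:

    import numpy as np

    A = np.array([[5.0, 4.0],
                  [4.0, 5.0]])                    # positive definite example
    eigenvalues, eigenvectors = np.linalg.eigh(A)
    print(1 / np.sqrt(eigenvalues))               # semi-axis lengths of x^T A x = 1
    print(eigenvectors)                           # axis directions = eigenvectors of A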

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy F_n = F_(n-1) + F_(n-2) = (λ_1^n - λ_2^n) / (λ_1 - λ_2). The growth rate λ_1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]].
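An illustrative check with NumPy (assuming the usual 2×2 Fibonacci matrix):

    import numpy as np

    F = np.array([[1, 1],
                  [1, 0]])                        # Fibonacci matrix
    print(np.linalg.matrix_power(F, 10))          # powers contain Fibonacci numbers (F_10 = 55)
    print(np.linalg.eigvals(F).max())             # ≈ 1.618..., the growth rate (1 + √5)/2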

Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).
Use A^H instead of A^T for complex A.

Hermitian matrix A^H = Ā^T = A.
Complex analog a_ji = ā_ij of a symmetric matrix.
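A minimal NumPy sketch with a made-up 2×2 complex matrix:

    import numpy as np

    A = np.array([[2.0, 1 - 1j],
                  [1 + 1j, 3.0]])
    print(np.allclose(A, A.conj().T))             # True: A^H = A, so A is Hermitian
    print(np.linalg.eigvalsh(A))                  # eigenvalues of a Hermitian matrix are real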

Independent vectors v_1, ..., v_k.
No combination c_1 v_1 + ... + c_k v_k = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
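One way to test independence numerically (a sketch using NumPy's matrix rank; the two column vectors are made up):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [0.0, 1.0],
                  [1.0, 0.0]])                    # columns are the vectors v_1, v_2
    # Columns are independent exactly when rank(A) equals the number of columns,
    # so Ax = 0 has only the solution x = 0.
    print(np.linalg.matrix_rank(A) == A.shape[1])  # True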

Matrix multiplication AB.
The i, j entry of AB is (row i of A) · (column j of B) = Σ_k a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k of A)(row k of B). All these equivalent definitions come from the rule that AB times x equals A times Bx.
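A sketch verifying the "columns times rows" form against the usual product (random matrices, for illustration only):

    import numpy as np

    A = np.random.rand(3, 4)
    B = np.random.rand(4, 2)
    usual = A @ B
    # AB as a sum of rank-one pieces: (column k of A) times (row k of B)
    outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
    print(np.allclose(usual, outer_sum))          # True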

Norm ||A||.
The "ℓ2 norm" of A is the maximum ratio ||Ax|| / ||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ1 and ℓ∞ norms are the largest column and row sums of |a_ij|.
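These four norms can be computed directly with NumPy (made-up matrix):

    import numpy as np

    A = np.array([[1.0, -2.0],
                  [3.0,  4.0]])
    print(np.linalg.norm(A, 2))                   # ℓ2 norm = σ_max
    print(np.linalg.norm(A, 'fro'))               # Frobenius norm
    print(np.linalg.norm(A, 1))                   # largest column sum of |a_ij|
    print(np.linalg.norm(A, np.inf))              # largest row sum of |a_ij|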

Nullspace matrix N.
The columns of N are the n - r special solutions to As = 0.
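As a sketch, SymPy's nullspace() returns these special solutions (the rank-1 matrix below is made up):

    from sympy import Matrix

    A = Matrix([[1, 2, 2],
                [2, 4, 4]])                       # rank r = 1, n = 3 columns
    print(A.nullspace())                          # n - r = 2 special solutions to As = 0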

Outer product uv^T.
Column times row = rank-one matrix.

Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.

Projection matrix P onto subspace S.
The projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, and eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A (A^T A)^-1 A^T.
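A NumPy sketch (made-up basis and b) checking P^2 = P = P^T and that the error is perpendicular to S:

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])                    # columns form a basis for S
    P = A @ np.linalg.inv(A.T @ A) @ A.T          # P = A (A^T A)^-1 A^T
    print(np.allclose(P @ P, P), np.allclose(P, P.T))   # True True
    b = np.array([1.0, 2.0, 4.0])
    print(A.T @ (b - P @ b))                      # ≈ 0: error b - Pb is perpendicular to S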

Projection p = a (a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
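The same formula in NumPy, with a made-up a and b:

    import numpy as np

    a = np.array([1.0, 2.0, 2.0])
    b = np.array([3.0, 0.0, 3.0])
    p = a * (a @ b) / (a @ a)                     # projection of b onto the line through a
    P = np.outer(a, a) / (a @ a)                  # rank-one projection matrix a a^T / a^T a
    print(np.allclose(p, P @ b))                  # True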

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.
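A sketch building a semidefinite matrix as R^T R and checking its eigenvalues (R is made up):

    import numpy as np

    R = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 1.0]])
    A = R.T @ R                                   # any R^T R is positive semidefinite
    print(np.linalg.eigvalsh(A))                  # all eigenvalues ≥ 0 (one is 0: rank(R) = 2 < 3)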

Similar matrices A and B.
Every B = M^-1 A M has the same eigenvalues as A.
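Checking this numerically with a made-up A and invertible M:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    M = np.array([[1.0, 1.0],
                  [1.0, 2.0]])                    # any invertible M
    B = np.linalg.inv(M) @ A @ M
    print(np.sort(np.linalg.eigvals(A)))          # [2. 3.]
    print(np.sort(np.linalg.eigvals(B)))          # same eigenvalues as A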

Singular Value Decomposition (SVD).
A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
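A sketch using NumPy's SVD on a random matrix to check A v_i = σ_i u_i:

    import numpy as np

    A = np.random.rand(4, 3)
    U, s, Vt = np.linalg.svd(A)
    # the first right singular vector v_1 is the first row of V^T
    print(np.allclose(A @ Vt[0], s[0] * U[:, 0])) # True: A v_1 = σ_1 u_1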

Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
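A sketch (using SciPy's matrix exponential and a made-up 2×2 K) checking these three facts:

    import numpy as np
    from scipy.linalg import expm

    K = np.array([[0.0, 2.0],
                  [-2.0, 0.0]])                   # K^T = -K
    print(np.linalg.eigvals(K))                   # pure imaginary: ±2i
    Q = expm(K)                                   # e^(Kt) with t = 1
    print(np.allclose(Q.T @ Q, np.eye(2)))        # True: e^(Kt) is orthogonal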

Solvable system Ax = b.
The right side b is in the column space of A.
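One numerical test (a sketch, with a made-up A and b): Ax = b is solvable exactly when appending b as an extra column does not raise the rank.

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])
    b = np.array([3.0, 6.0])                      # b = 3 × (first column), so b is in C(A)
    augmented = np.column_stack([A, b])
    print(np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(A))   # True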

Vector addition.
v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of the parallelogram.