
# Solutions for Chapter 12-4: Probability of Compound Events

## Full solutions for Algebra 1, Student Edition (MERRILL ALGEBRA 1) | 1st Edition

ISBN: 9780078738227


This textbook survival guide covers the following chapters and their solutions. Algebra 1, Student Edition (MERRILL ALGEBRA 1) is associated with ISBN 9780078738227. Chapter 12-4: Probability of Compound Events includes 71 full step-by-step solutions. Since all 71 problems in Chapter 12-4 have been answered, more than 34578 students have viewed full step-by-step solutions from this chapter. This survival guide was created for the textbook Algebra 1, Student Edition (MERRILL ALGEBRA 1), edition 1.

## Key Math Terms and definitions covered in this textbook
• Cayley-Hamilton Theorem.

p(λ) = det(A − λI) has p(A) = zero matrix.
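
As an illustrative check (a minimal pure-Python sketch; the matrix values are made up for the example), the theorem can be verified directly for a 2×2 matrix, where p(λ) = λ² − trace(A)·λ + det(A):

```python
# Check the Cayley-Hamilton theorem for a 2x2 example matrix:
# p(A) = A^2 - trace(A)*A + det(A)*I should be the zero matrix.
A = [[2.0, 1.0],
     [1.0, 3.0]]

trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A2 = matmul(A, A)
I = [[1.0, 0.0], [0.0, 1.0]]

# p(A), entry by entry: every entry comes out 0
pA = [[A2[i][j] - trace * A[i][j] + det * I[i][j] for j in range(2)]
      for i in range(2)]
print(pA)  # [[0.0, 0.0], [0.0, 0.0]]
```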

• Column space C(A) =

space of all combinations of the columns of A.

• Cramer's Rule for Ax = b.

B_j has b replacing column j of A; x_j = det B_j / det A.
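
The rule is easy to run by hand for a 2×2 system (a minimal sketch; the system here is invented for the example):

```python
# Cramer's Rule for a 2x2 system Ax = b:
# x_j = det(B_j) / det(A), where B_j is A with column j replaced by b.
A = [[3.0, 2.0],
     [1.0, 4.0]]
b = [7.0, 9.0]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

dA = det2(A)
B1 = [[b[0], A[0][1]], [b[1], A[1][1]]]   # b replaces column 1
B2 = [[A[0][0], b[0]], [A[1][0], b[1]]]   # b replaces column 2
x = [det2(B1) / dA, det2(B2) / dA]
print(x)  # [1.0, 2.0]: indeed 3*1 + 2*2 = 7 and 1*1 + 4*2 = 9
```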

• Eigenvalue λ and eigenvector x.

Ax = λx with x ≠ 0, so det(A − λI) = 0.
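
For a 2×2 matrix the determinant condition is a quadratic, so the eigenvalues come straight from the quadratic formula (a minimal sketch with a made-up symmetric matrix, which guarantees real eigenvalues):

```python
# Eigenvalues of a 2x2 matrix from det(A - lam*I) = 0:
# lam^2 - trace*lam + det = 0, solved by the quadratic formula.
import math

A = [[2.0, 1.0],
     [1.0, 2.0]]
trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(trace**2 - 4 * det)   # real since A is symmetric
lams = [(trace + disc) / 2, (trace - disc) / 2]
print(lams)  # [3.0, 1.0]

# Verify Ax = lam*x for the eigenvector x = (1, 1) of lam = 3
x = [1.0, 1.0]
Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
print(Ax)  # [3.0, 3.0]
```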

• Elimination.

A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
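
One elimination step on a 2×2 matrix already shows the A = LU pattern (a minimal sketch, matrix invented for the example, no row exchange needed):

```python
# Elimination on a 2x2 A without row exchanges gives A = LU:
# the multiplier l21 = a21/a11 is stored in L; the result U is triangular.
A = [[2.0, 1.0],
     [6.0, 8.0]]

l21 = A[1][0] / A[0][0]          # multiplier, goes into L
U = [[A[0][0], A[0][1]],
     [0.0, A[1][1] - l21 * A[0][1]]]
L = [[1.0, 0.0],
     [l21, 1.0]]

# Check: L times U rebuilds A
LU = [[sum(L[i][k] * U[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print(LU)  # [[2.0, 1.0], [6.0, 8.0]]
```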

• Exponential e^(At) = I + At + (At)^2/2! + ...

has derivative Ae^(At); e^(At) u(0) solves u' = Au.
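
The series can be summed numerically; the scalar (1×1) case is enough to see it converge to the true solution of u' = au (a minimal sketch, values chosen for the example):

```python
# Truncated series for e^(at): partial sums of 1 + at + (at)^2/2! + ...
# In the scalar case, e^(at) * u(0) solves u' = a*u.
import math

a, t, u0 = 0.5, 2.0, 3.0
term, total = 1.0, 1.0
for k in range(1, 20):           # 20 terms is plenty for at = 1
    term *= (a * t) / k          # next term (at)^k / k!
    total += term

u_t = total * u0                 # series approximation of e^(at) * u(0)
print(u_t, math.exp(a * t) * u0) # both close to 8.1548...
```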

• Hankel matrix H.

Constant along each antidiagonal; hij depends on i + j.
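
Since h_ij depends only on i + j, the whole matrix is determined by one sequence of 2n − 1 values (a minimal sketch with made-up entries):

```python
# A Hankel matrix is constant along each antidiagonal:
# H[i][j] depends only on i + j. Build one from a list h of length 2n - 1.
n = 3
h = [1, 2, 3, 4, 5]              # values for i + j = 0 .. 2n - 2
H = [[h[i + j] for j in range(n)] for i in range(n)]
print(H)  # [[1, 2, 3], [2, 3, 4], [3, 4, 5]]
```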

• Linear combination cv + dw or Σ c_j v_j.

• Linear transformation T.

Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

• Linearly dependent v_1, ..., v_n.

A combination other than all c_i = 0 gives Σ c_i v_i = 0.

• Matrix multiplication AB.

The i, j entry of AB is (row i of A)·(column j of B) = Σ a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
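
Two of the equivalent definitions can be compared directly on a 2×2 example (a minimal sketch; the matrices are invented for the illustration):

```python
# Two equivalent definitions of AB for 2x2 matrices:
# (1) entry i,j = (row i of A) . (column j of B)
# (2) AB = sum over k of (column k of A)(row k of B), an outer-product sum
A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]

# (1) dot products of rows with columns
AB1 = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]

# (2) accumulate the rank-one pieces (column k)(row k)
AB2 = [[0.0, 0.0], [0.0, 0.0]]
for k in range(2):
    for i in range(2):
        for j in range(2):
            AB2[i][j] += A[i][k] * B[k][j]

print(AB1)         # [[19.0, 22.0], [43.0, 50.0]]
print(AB1 == AB2)  # True: both definitions agree
```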

• Orthogonal subspaces.

Every v in V is orthogonal to every w in W.

• Pascal matrix

P_S = pascal(n) = the symmetric matrix with binomial entries (i+j−2 choose i−1). P_S = P_L P_U; all contain Pascal's triangle with det = 1 (see Pascal in the index).
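
Building the symmetric Pascal matrix from its binomial entries is a one-liner (a minimal sketch, n = 4 chosen for the example):

```python
# Symmetric Pascal matrix: entry i, j is the binomial (i+j-2 choose i-1),
# with 1-based indices i, j. Pascal's triangle appears along antidiagonals.
from math import comb

n = 4
Ps = [[comb(i + j - 2, i - 1) for j in range(1, n + 1)]
      for i in range(1, n + 1)]
for row in Ps:
    print(row)
# [1, 1, 1, 1]
# [1, 2, 3, 4]
# [1, 3, 6, 10]
# [1, 4, 10, 20]
```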

• Rank one matrix A = uv^T ≠ 0.

Column and row spaces = lines cu and cv.

• Reflection matrix (Householder) Q = I − 2uu^T.

Unit vector u is reflected to Qu = −u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^(−1) = Q.
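
Both properties are easy to confirm numerically for a 2×2 reflection (a minimal sketch; the unit vector u = (3/5, 4/5) is chosen for the example):

```python
# Householder reflection Q = I - 2uu^T for a unit vector u.
# Check Qu = -u and Q^2 = I (so Q^T = Q^(-1) = Q).
u = [3.0 / 5.0, 4.0 / 5.0]       # unit vector: (3/5)^2 + (4/5)^2 = 1
I = [[1.0, 0.0], [0.0, 1.0]]
Q = [[I[i][j] - 2.0 * u[i] * u[j] for j in range(2)] for i in range(2)]

Qu = [sum(Q[i][k] * u[k] for k in range(2)) for i in range(2)]
print(Qu)                        # approximately [-0.6, -0.8], i.e. -u

Q2 = [[sum(Q[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print(Q2)                        # the identity, up to rounding
```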

• Row space C(A^T) = all combinations of rows of A.

Column vectors by convention.

• Skew-symmetric matrix K.

The transpose is −K, since K_ij = −K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.

• Stiffness matrix

If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has spring constants from Hooke's Law and Ax = stretching.

• Triangle inequality ||u + v|| ≤ ||u|| + ||v||.

For matrix norms ||A + B|| ≤ ||A|| + ||B||.
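
A quick numerical check with the Euclidean norm (a minimal sketch; the vectors are made up for the example):

```python
# Triangle inequality for the Euclidean norm: ||u + v|| <= ||u|| + ||v||.
import math

def norm(v):
    return math.sqrt(sum(x * x for x in v))

u = [3.0, 0.0]
v = [0.0, 4.0]
s = [u[i] + v[i] for i in range(2)]
print(norm(s), norm(u) + norm(v))  # 5.0 <= 7.0
```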

• Vector space V.

Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.
