
Solutions for Chapter 34: Expectation

Full solutions for Mathematics: A Discrete Introduction | 3rd Edition

ISBN: 9780840049421

Textbook: Mathematics: A Discrete Introduction
Edition: 3
Author: Edward A. Scheinerman
ISBN: 9780840049421

Chapter 34: Expectation includes 25 full step-by-step solutions. This textbook survival guide was created for Mathematics: A Discrete Introduction, edition 3, and covers that textbook's chapters and their solutions. Since all 25 problems in Chapter 34: Expectation have been answered, more than 6937 students have viewed full step-by-step solutions from this chapter. Mathematics: A Discrete Introduction was written by Edward A. Scheinerman and is associated with ISBN 9780840049421.

Key Math Terms and definitions covered in this textbook
  • Affine transformation

    Tv = Av + v0 = linear transformation plus shift.

  • Block matrix.

    A matrix can be partitioned into blocks by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
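
    As an illustrative sketch (not from the textbook), the following NumPy code builds two matrices from 2x2 blocks and checks that block multiplication matches ordinary multiplication:

      import numpy as np

      # Partition two 4x4 matrices into 2x2 blocks.
      A11, A12 = np.ones((2, 2)), np.zeros((2, 2))
      A21, A22 = np.zeros((2, 2)), 2 * np.eye(2)
      B11, B12 = np.eye(2), np.ones((2, 2))
      B21, B22 = np.ones((2, 2)), np.eye(2)

      A = np.block([[A11, A12], [A21, A22]])
      B = np.block([[B11, B12], [B21, B22]])

      # Block multiplication: the (1,1) block of AB is A11 B11 + A12 B21, and so on.
      AB_blocks = np.block([
          [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
          [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
      ])
      assert np.allclose(A @ B, AB_blocks)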

  • Complex conjugate

    The conjugate is z̄ = a - ib for any complex number z = a + ib. Then z z̄ = |z|^2.

  • Determinant |A| = det(A).

    Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B| and |A^T| = |A|.
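
    A minimal NumPy check of these determinant rules on arbitrary matrices (an illustrative sketch, not part of the glossary):

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.standard_normal((3, 3))
      B = rng.standard_normal((3, 3))

      assert np.isclose(np.linalg.det(np.eye(3)), 1.0)         # det I = 1
      assert np.isclose(np.linalg.det(A @ B),
                        np.linalg.det(A) * np.linalg.det(B))   # |AB| = |A| |B|
      assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))  # |A^T| = |A|

      # Sign reversal for a row exchange: swap rows 0 and 1 of A.
      A_swapped = A[[1, 0, 2], :]
      assert np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A))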

  • Diagonalization

    Λ = S^-1 A S, where Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. Every A^k = S Λ^k S^-1.
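
    A short NumPy sketch of diagonalization, assuming a matrix with n independent eigenvectors (illustrative only):

      import numpy as np

      A = np.array([[4.0, 1.0],
                    [2.0, 3.0]])        # eigenvalues 5 and 2, so 2 independent eigenvectors

      eigvals, S = np.linalg.eig(A)     # columns of S are eigenvectors of A
      Lam = np.diag(eigvals)            # diagonal eigenvalue matrix

      assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)   # Lam = S^-1 A S
      k = 5
      assert np.allclose(np.linalg.matrix_power(A, k),
                         S @ np.linalg.matrix_power(Lam, k) @ np.linalg.inv(S))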

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Elimination matrix = Elementary matrix Eij.

    The identity matrix with an extra -ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
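
    A hedged NumPy sketch of one elimination step, with an example matrix chosen only for illustration:

      import numpy as np

      A = np.array([[2.0, 1.0, 1.0],
                    [4.0, 3.0, 3.0],
                    [8.0, 7.0, 9.0]])

      # E21 subtracts l21 = 2 times row 1 of A from row 2 (rows are 0-indexed in the code).
      l21 = A[1, 0] / A[0, 0]
      E21 = np.eye(3)
      E21[1, 0] = -l21

      print(E21 @ A)    # row 2 now has a zero below the first pivot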

  • Factorization

    A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓij (and ℓii = 1) brings U back to A.
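
    An illustrative sketch using SciPy's LU routine (which may pivot even when the mathematics does not require it), assuming SciPy is available:

      import numpy as np
      from scipy.linalg import lu

      A = np.array([[2.0, 1.0, 1.0],
                    [4.0, 3.0, 3.0],
                    [8.0, 7.0, 9.0]])

      P, L, U = lu(A)                      # SciPy's convention: A = P L U
      assert np.allclose(P @ L @ U, A)
      assert np.allclose(L, np.tril(L))    # L is lower triangular
      assert np.allclose(np.diag(L), 1.0)  # with 1s on its diagonal
      assert np.allclose(U, np.triu(U))    # U is upper triangular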

  • Full row rank r = m.

    Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

  • Krylov subspace Kj(A, b).

    The subspace spanned by b, Ab, ..., A^(j-1) b. Numerical methods approximate A^-1 b by xj with residual b - A xj in this subspace. A good basis for Kj requires only multiplication by A at each step.
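
    A minimal NumPy sketch, assuming a small random A and b, that builds a Krylov basis using only matrix-vector products:

      import numpy as np

      rng = np.random.default_rng(1)
      A = rng.standard_normal((5, 5))
      b = rng.standard_normal(5)

      # Build b, Ab, ..., A^(j-1) b using only matrix-vector products.
      j = 3
      vectors = [b]
      for _ in range(j - 1):
          vectors.append(A @ vectors[-1])
      K = np.column_stack(vectors)      # columns span Kj(A, b)

      Q, _ = np.linalg.qr(K)            # orthonormalize to get a good basis for Kj
      print(Q.shape)                    # 5 x 3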

  • Left nullspace N(A^T).

    Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

  • Lucas numbers

    Ln = 2, 1, 3, 4, ... satisfy Ln = L(n-1) + L(n-2) = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L0 = 2 with F0 = 0.
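
    A short Python check of the recurrence and the eigenvalue formula (illustrative sketch, not from the textbook):

      import numpy as np

      def lucas(n):
          """Lucas numbers: L0 = 2, L1 = 1, Ln = L(n-1) + L(n-2)."""
          L = [2, 1]
          for _ in range(2, n + 1):
              L.append(L[-1] + L[-2])
          return L[n]

      lam1, lam2 = (1 + np.sqrt(5)) / 2, (1 - np.sqrt(5)) / 2   # eigenvalues of [[1, 1], [1, 0]]
      for n in range(10):
          assert np.isclose(lucas(n), lam1**n + lam2**n)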

  • Markov matrix M.

    All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady-state eigenvector s with Ms = s > 0.
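
    An illustrative NumPy sketch with a small example Markov matrix chosen for demonstration:

      import numpy as np

      # Nonnegative entries, each column sums to 1.
      M = np.array([[0.8, 0.3],
                    [0.2, 0.7]])

      eigvals, eigvecs = np.linalg.eig(M)
      assert np.isclose(max(abs(eigvals)), 1.0)   # largest eigenvalue is 1

      s = eigvecs[:, np.argmax(eigvals)]
      s = s / s.sum()                             # steady state with M s = s, scaled to sum to 1
      Mk = np.linalg.matrix_power(M, 50)
      assert np.allclose(Mk[:, 0], s) and np.allclose(Mk[:, 1], s)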

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A) · (column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
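
    A brief NumPy sketch checking two of these equivalent views on small random matrices (illustrative only):

      import numpy as np

      rng = np.random.default_rng(2)
      A = rng.standard_normal((3, 4))
      B = rng.standard_normal((4, 2))

      # Entry (i, j) of AB is (row i of A) . (column j of B).
      i, j = 1, 0
      assert np.isclose((A @ B)[i, j], A[i, :] @ B[:, j])

      # Columns times rows: AB is the sum of outer products (column k of A)(row k of B).
      outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
      assert np.allclose(A @ B, outer_sum)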

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Orthonormal vectors q1, ..., qn.

    Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
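
    A minimal NumPy sketch, using QR factorization to produce orthonormal columns, that checks these identities (illustrative only):

      import numpy as np

      rng = np.random.default_rng(3)
      Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # columns q1, ..., q4 are orthonormal

      assert np.allclose(Q.T @ Q, np.eye(4))             # Q^T Q = I
      assert np.allclose(Q.T, np.linalg.inv(Q))          # square case: Q^T = Q^-1

      # Expansion in an orthonormal basis: v = sum of (v^T qj) qj.
      v = rng.standard_normal(4)
      expansion = sum((v @ Q[:, k]) * Q[:, k] for k in range(4))
      assert np.allclose(v, expansion)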

  • Permutation matrix P.

    There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges needed to reach I.
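
    A short NumPy sketch with one example order of the rows (illustrative only):

      import numpy as np

      order = [2, 0, 1]                  # one of the n! orders of 1, ..., n (0-indexed here)
      P = np.eye(3)[order, :]            # rows of I in that order

      A = np.arange(9.0).reshape(3, 3)
      assert np.allclose(P @ A, A[order, :])    # P A puts the rows of A in the same order
      assert np.isclose(np.linalg.det(P), 1.0)  # this order needs an even number of exchanges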

  • Singular matrix A.

    A square matrix that has no inverse: det(A) = 0.

  • Stiffness matrix

    If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax gives the stretching.
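
    A hedged sketch of a two-spring chain in NumPy; the specific A and C below are illustrative assumptions, not taken from the textbook:

      import numpy as np

      # Two springs hanging in a line, fixed at the top; x1, x2 are the node movements.
      A = np.array([[ 1.0, 0.0],     # stretching of spring 1 = x1
                    [-1.0, 1.0]])    # stretching of spring 2 = x2 - x1
      C = np.diag([10.0, 5.0])       # spring constants c1, c2 from Hooke's Law

      K = A.T @ C @ A                # stiffness matrix K = A^T C A
      x = np.array([0.1, 0.3])       # movements of the nodes
      print(K @ x)                   # internal forces at the nodes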

  • Symmetric matrix A.

    The transpose is A^T = A, and aij = aji. A^-1 is also symmetric.
