 10.7.1: Suppose that a game has a payoff matrix (a) If players R and C use ...
 10.7.2: Construct a simple example to show that optimal strategies are not ...
 10.7.3: For the strictly determined games with the following payoff matrice...
 10.7.4: For the games with the following payoff matrices, find optimal stra...
 10.7.5: Player R has two playing cards: a black ace and a red four. Player ...
 10.7.6: Verify Equations 6, 7, and 8.
 10.7.7: Verify the statement in the last paragraph of Example 3
 10.7.8: Show that the entries of the optimal strategies and given in Theore...
 10.7.T1: Consider a game between two players where each player can make up t...
 10.7.T2: Consider a game between two players where each player can make up t...
Solutions for Chapter 10.7: Games of Strategy
Full solutions for Elementary Linear Algebra: Applications Version, 10th Edition
ISBN: 9780470432051
Chapter 10.7: Games of Strategy includes 10 full step-by-step solutions. This textbook survival guide was created for the textbook Elementary Linear Algebra: Applications Version, 10th edition (ISBN 9780470432051). Since all 10 problems in Chapter 10.7 have been answered, more than 14,325 students have viewed full step-by-step solutions from this chapter. A glossary of key terms follows.

Affine transformation
T(v) = Av + v_0 = linear transformation plus shift.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Back substitution.
Upper triangular systems are solved in reverse order, x_n back to x_1.
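A minimal Python sketch of back substitution on an upper triangular system Ux = b, solving for x_n first and working back to x_1 (the function name back_substitute is illustrative):

```python
def back_substitute(U, b):
    """Solve Ux = b for upper triangular U, from x_n back to x_1."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        # subtract the already-known terms U[i][j] * x[j] for j > i
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / U[i][i]
    return x
```

For example, U = [[2, 1], [0, 3]] and b = [5, 6] give x_2 = 2 first, then x_1 = (5 - 2)/2 = 1.5.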

Big formula for n by n determinants.
Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or − sign.
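The big formula can be sketched directly in Python: sum over all n! column permutations, with the sign of each permutation found by counting inversions (the names perm_sign and det are illustrative; this is O(n!) and only for small matrices):

```python
from itertools import permutations

def perm_sign(p):
    # + for even permutations, - for odd, via inversion count
    inversions = sum(1 for i in range(len(p))
                     for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inversions % 2 else 1

def det(A):
    # big formula: one entry from each row and column, for every permutation
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        term = perm_sign(p)
        for i in range(n):
            term *= A[i][p[i]]
        total += term
    return total
```

For the 2 by 2 case this reduces to ad − bc: det([[1, 2], [3, 4]]) returns −2.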

Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.

Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0) with dimensions r and n − r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.

Inverse matrix A^-1.
Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.

Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

Network.
A directed graph that has constants c_1, ..., c_m associated with the edges.

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If columns of A = basis for S, then P = A(A^T A)^-1 A^T.
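A small Python sketch of the formula P = A(A^T A)^-1 A^T, assuming A is a tall matrix whose 2 independent columns form a basis for S (helper names transpose, matmul, inv2, projection_matrix are illustrative; inv2 handles only the 2 by 2 case):

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    # explicit inverse of a 2 by 2 matrix
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def projection_matrix(A):
    # P = A (A^T A)^-1 A^T, assuming A has 2 independent columns
    At = transpose(A)
    return matmul(matmul(A, inv2(matmul(At, A))), At)
```

Projecting b = (3, 4, 5) onto the x-y plane (columns e_1, e_2) gives p = (3, 4, 0), and the error e = (0, 0, 5) is perpendicular to the plane.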

Rank r(A)
= number of pivots = dimension of column space = dimension of row space.

Reflection matrix (Householder) Q = I − 2uu^T.
Unit vector u is reflected to Qu = −u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
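The Householder formula Q = I − 2uu^T can be sketched entry by entry in Python (function names householder and matvec are illustrative, and u is assumed to be a unit vector):

```python
def householder(u):
    # Q = I - 2 u u^T for a unit vector u
    n = len(u)
    return [[(1.0 if i == j else 0.0) - 2.0 * u[i] * u[j] for j in range(n)]
            for i in range(n)]

def matvec(Q, x):
    # matrix-vector product Q x
    return [sum(q * xi for q, xi in zip(row, x)) for row in Q]
```

With u = (0, 1) the mirror is the x-axis: Q sends (3, 5) to (3, −5), while u itself maps to −u.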

Rotation matrix
R = [c −s; s c] rotates the plane by θ and R^-1 = R^T rotates back by −θ. Eigenvalues are e^{iθ} and e^{−iθ}, eigenvectors are (1, ±i). c, s = cos θ, sin θ.
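A minimal Python sketch of the 2 by 2 rotation matrix (the names rotation and apply are illustrative):

```python
from math import cos, sin, pi

def rotation(theta):
    # R = [[c, -s], [s, c]] with c, s = cos(theta), sin(theta)
    c, s = cos(theta), sin(theta)
    return [[c, -s], [s, c]]

def apply(R, x):
    # matrix-vector product R x
    return [R[0][0] * x[0] + R[0][1] * x[1],
            R[1][0] * x[0] + R[1][1] * x[1]]
```

Rotating (1, 0) by θ = π/2 gives (0, 1) up to floating-point roundoff; applying rotation(-theta) undoes it, illustrating R^-1 = R^T.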

Schwarz inequality
|v · w| ≤ ||v|| ||w||. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.

Symmetric matrix A.
The transpose is A^T = A, and a_ij = a_ji. A^-1 is also symmetric.