# Solutions for Chapter 11.1: Systems of Linear Equations: Substitution and Elimination

## Full solutions for Precalculus Enhanced with Graphing Utilities | 6th Edition

ISBN: 9780132854351

Chapter 11.1, Systems of Linear Equations: Substitution and Elimination, includes 84 full step-by-step solutions. This textbook survival guide was created for the textbook Precalculus Enhanced with Graphing Utilities, 6th edition (ISBN 9780132854351), and covers the following chapters and their solutions. Since all 84 problems in Chapter 11.1 have been answered, more than 56,208 students have viewed full step-by-step solutions from this chapter.

## Key Math Terms and Definitions Covered in This Textbook

• Conjugate gradient method.

A sequence of steps (end of Chapter 9) to solve positive definite $Ax = b$ by minimizing $\tfrac{1}{2}x^TAx - x^Tb$ over growing Krylov subspaces.
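The iteration described above can be sketched in pure Python on a small positive definite system (the matrix and right-hand side below are illustrative values, not from the text; production code would use a library routine such as `scipy.sparse.linalg.cg`):

```python
# Conjugate gradient sketch: minimize (1/2) x^T A x - x^T b for
# symmetric positive definite A by searching along conjugate directions.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(A, b, tol=1e-12, max_iter=100):
    n = len(b)
    x = [0.0] * n                      # start from the zero vector
    r = b[:]                           # residual r = b - Ax = b at x = 0
    p = r[:]                           # first search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)    # exact line search along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        # New direction stays A-conjugate to the previous ones.
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]           # symmetric positive definite
b = [1.0, 2.0]
x = conjugate_gradient(A, b)           # exact solution is (1/11, 7/11)
```

In exact arithmetic CG solves an $n \times n$ system in at most $n$ steps; here the $2 \times 2$ example converges in two iterations.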

• Fourier matrix F.

Entries $F_{jk} = e^{2\pi i jk/n}$ give orthogonal columns: $\bar{F}^TF = nI$. Then $y = Fc$ is the (inverse) Discrete Fourier Transform: $y_j = \sum_k c_k e^{2\pi i jk/n}$.
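A quick numeric check of the orthogonality fact above, building $F$ with `cmath` for $n = 4$ (the size is an arbitrary choice for illustration):

```python
# Build the Fourier matrix F with entries F_jk = e^{2*pi*i*j*k/n} and
# verify that conj(F)^T F = n I, so the columns are orthogonal.

import cmath

n = 4
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)]
     for j in range(n)]

# G = conj(F)^T F, compared entrywise with n*I.
G = [[sum(F[m][j].conjugate() * F[m][k] for m in range(n))
      for k in range(n)] for j in range(n)]

for j in range(n):
    for k in range(n):
        expected = n if j == k else 0
        assert abs(G[j][k] - expected) < 1e-9
```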

• Gauss-Jordan method.

Invert $A$ by row operations on $[A \;\ I]$ to reach $[I \;\ A^{-1}]$.
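A minimal sketch of that procedure in pure Python (no pivot-size safeguards, so it assumes nonzero pivots appear on the diagonal; the $2 \times 2$ example matrix is chosen for illustration):

```python
# Gauss-Jordan: adjoin I to A and row-reduce [A I] until the left
# half becomes I; the right half is then A^{-1}.

def gauss_jordan_inverse(A):
    n = len(A)
    # Build the augmented matrix [A | I].
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Scale the pivot row so the pivot becomes 1.
        pivot = M[col][col]
        M[col] = [entry / pivot for entry in M[col]]
        # Eliminate this column's entries above and below the pivot.
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                factor = M[r][col]
                M[r] = [a - factor * b for a, b in zip(M[r], M[col])]
    # The right half is now A^{-1}.
    return [row[n:] for row in M]

A = [[2.0, 1.0], [5.0, 3.0]]           # det = 1, inverse is [[3,-1],[-5,2]]
Ainv = gauss_jordan_inverse(A)
```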

• Gram-Schmidt orthogonalization A = QR.

Independent columns in $A$, orthonormal columns in $Q$. Each column $q_j$ of $Q$ is a combination of the first $j$ columns of $A$ (and conversely, so $R$ is upper triangular). Convention: $\operatorname{diag}(R) > 0$.
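The classical Gram-Schmidt loop can be sketched as follows; the coefficients $R_{ij} = q_i^T a_j$ recorded along the way fill the upper triangular $R$ (example columns chosen for illustration):

```python
# Classical Gram-Schmidt producing A = QR with diag(R) > 0.

import math

def gram_schmidt(cols):
    """cols: list of column vectors of A (each a list of floats)."""
    n = len(cols)
    Q, R = [], [[0.0] * n for _ in range(n)]
    for j, a in enumerate(cols):
        v = a[:]
        for i, q in enumerate(Q):
            R[i][j] = sum(qk * ak for qk, ak in zip(q, a))  # q_i^T a_j
            v = [vk - R[i][j] * qk for vk, qk in zip(v, q)]
        R[j][j] = math.sqrt(sum(vk * vk for vk in v))       # > 0 by convention
        Q.append([vk / R[j][j] for vk in v])
    return Q, R

a1, a2 = [3.0, 4.0], [1.0, 2.0]        # independent columns of A
Q, R = gram_schmidt([a1, a2])
```

With these columns, $q_1 = (0.6, 0.8)$ and $q_2 = (-0.8, 0.6)$ are orthonormal, and $R$ comes out upper triangular with positive diagonal $(5, 0.4)$.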

• Graph G.

Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.

• Hessenberg matrix H.

Triangular matrix with one extra nonzero adjacent diagonal.

• Least squares solution $\hat{x}$.

The vector $\hat{x}$ that minimizes the error $\|e\|^2$ solves $A^TA\hat{x} = A^Tb$. Then $e = b - A\hat{x}$ is orthogonal to all columns of $A$.
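A numeric sketch of both facts, fitting a line $y = c + dt$ through three points (the data points are illustrative values): solve the normal equations, then check that the residual is orthogonal to every column of $A$.

```python
# Least squares: solve A^T A x = A^T b for the best line through
# (0, 6), (1, 0), (2, 0), then verify e = b - Ax is orthogonal to
# the columns of A.

A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]   # columns: (1,1,1) and t-values
b = [6.0, 0.0, 0.0]

# Form A^T A (2x2) and A^T b (2-vector).
AtA = [[sum(A[i][r] * A[i][c] for i in range(3)) for c in range(2)]
       for r in range(2)]
Atb = [sum(A[i][r] * b[i] for i in range(3)) for r in range(2)]

# Solve the 2x2 normal equations by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
x = [(Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det,
     (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det]

# Residual e = b - Ax is orthogonal to both columns of A.
e = [b[i] - (A[i][0] * x[0] + A[i][1] * x[1]) for i in range(3)]
for col in range(2):
    assert abs(sum(A[i][col] * e[i] for i in range(3))) < 1e-9
```

The best line here is $y = 5 - 3t$, with residual $e = (1, -2, 1)$.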

• Length $\|x\|$.

Square root of $x^Tx$ (Pythagoras in $n$ dimensions).

• Multiplier $\ell_{ij}$.

The pivot row $j$ is multiplied by $\ell_{ij}$ and subtracted from row $i$ to eliminate the $i,j$ entry: $\ell_{ij} = (\text{entry to eliminate})/(j\text{th pivot})$.
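One elimination step with this multiplier, on an example row pair chosen for illustration:

```python
# To clear the (2,1) entry, take l_21 = (entry to eliminate)/(first
# pivot) and subtract l_21 times the pivot row from row 2.

A = [[2.0, 1.0, -1.0],
     [4.0, 5.0,  3.0]]

l21 = A[1][0] / A[0][0]                # multiplier = 4/2 = 2
A[1] = [a - l21 * p for a, p in zip(A[1], A[0])]
# Row 2 is now [0, 3, 5]: the (2,1) entry has been eliminated.
```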

• Nilpotent matrix N.

Some power of $N$ is the zero matrix, $N^k = 0$. The only eigenvalue is $\lambda = 0$ (repeated $n$ times). Examples: triangular matrices with zero diagonal.
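A check of the triangular example above: a $3 \times 3$ strictly upper triangular matrix (illustrative entries) satisfies $N^3 = 0$.

```python
# A triangular matrix with zero diagonal is nilpotent: here N^3 = 0.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

N = [[0, 1, 2],
     [0, 0, 3],
     [0, 0, 0]]

N2 = matmul(N, N)                      # still strictly upper triangular
N3 = matmul(N2, N)                     # the zero matrix
```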

• Nullspace matrix N.

The columns of $N$ are the $n - r$ special solutions to $As = 0$.

• Partial pivoting.

In each column, choose the largest available pivot to control roundoff; all multipliers have $|\ell_{ij}| \le 1$. See condition number.
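A sketch of one pivoting-and-elimination step on a system where the natural pivot is dangerously small (the matrix is an illustrative example):

```python
# Partial pivoting: before eliminating in a column, swap up the row
# with the largest available pivot, so the multiplier has |l| <= 1.

A = [[0.001, 1.0],
     [1.0,   1.0]]
b = [1.0, 2.0]

col = 0
# Choose the row with the largest entry in this column as pivot row.
p = max(range(col, 2), key=lambda r: abs(A[r][col]))
A[col], A[p] = A[p], A[col]
b[col], b[p] = b[p], b[col]

l = A[1][col] / A[col][col]            # multiplier; |l| <= 1 after the swap
A[1] = [a - l * q for a, q in zip(A[1], A[col])]
b[1] -= l * b[col]
```

Without the swap the multiplier would be $1/0.001 = 1000$, amplifying roundoff; after it, $\ell = 0.001$.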

• Rotation matrix

$R = \begin{bmatrix} c & -s \\ s & c \end{bmatrix}$ rotates the plane by $\theta$ and $R^{-1} = R^T$ rotates back by $-\theta$. Eigenvalues are $e^{i\theta}$ and $e^{-i\theta}$, eigenvectors are $(1, \pm i)$. Here $c, s = \cos\theta, \sin\theta$.
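A numeric check of the inverse-equals-transpose fact (the angle $\pi/6$ and the test vector are arbitrary illustrative choices):

```python
# R rotates the plane by theta; R^T = R^{-1} rotates back by -theta.

import math

theta = math.pi / 6
c, s = math.cos(theta), math.sin(theta)
R = [[c, -s],
     [s,  c]]

v = [1.0, 0.0]
Rv = [R[0][0] * v[0] + R[0][1] * v[1],     # rotate v by theta
      R[1][0] * v[0] + R[1][1] * v[1]]

back = [R[0][0] * Rv[0] + R[1][0] * Rv[1],  # apply R^T: rotate back
        R[0][1] * Rv[0] + R[1][1] * Rv[1]]
```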

• Saddle point of $f(x_1, \dots, x_n)$.

A point where the first derivatives of $f$ are zero and the second derivative matrix ($\partial^2 f/\partial x_i \partial x_j$ = Hessian matrix) is indefinite.

• Semidefinite matrix A.

(Positive) semidefinite: all $x^TAx \ge 0$, all $\lambda \ge 0$; $A =$ any $R^TR$.
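The $A = R^TR$ characterization can be checked numerically, since $x^TAx = \|Rx\|^2 \ge 0$ for every $x$ (the rank-1 matrix $R$ and sample vectors below are illustrative choices):

```python
# Any A = R^T R is positive semidefinite: x^T A x = ||Rx||^2 >= 0.

R = [[1.0, 2.0],
     [0.0, 0.0]]                       # rank 1, so A is singular

# A = R^T R  (here A = [[1, 2], [2, 4]])
A = [[sum(R[k][i] * R[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]

for x in ([1.0, 0.0], [0.0, 1.0], [2.0, -1.0], [-1.0, 0.5]):
    Ax = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    quad = x[0] * Ax[0] + x[1] * Ax[1]
    assert quad >= -1e-12              # never negative; 0 at x = (2, -1)
```

The vector $x = (2, -1)$ gives $x^TAx = 0$, which is why this $A$ is semidefinite but not definite.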

• Spanning set.

Combinations of $v_1, \dots, v_m$ fill the space. The columns of $A$ span $C(A)$!

• Spectrum of $A$ = the set of eigenvalues $\{\lambda_1, \dots, \lambda_n\}$.

Spectral radius = $\max_i |\lambda_i|$.

• Trace of A

= sum of diagonal entries = sum of eigenvalues of $A$. $\operatorname{Tr}(AB) = \operatorname{Tr}(BA)$.
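A quick check that $\operatorname{Tr}(AB) = \operatorname{Tr}(BA)$ even when $AB \ne BA$ (the $2 \times 2$ matrices are illustrative values):

```python
# Trace identity: Tr(AB) = Tr(BA) although AB and BA differ.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]

AB = matmul(A, B)                      # [[10, 13], [20, 27]]
BA = matmul(B, A)                      # [[3, 4], [23, 34]]
```

Here $\operatorname{Tr}(AB) = \operatorname{Tr}(BA) = 37$ while the products themselves differ.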

• Vector space V.

Set of vectors such that all combinations $cv + dw$ remain within $V$. Eight required rules are given in Section 3.1 for scalars $c, d$ and vectors $v, w$.

• Vector $v$ in $\mathbf{R}^n$.

Sequence of $n$ real numbers $v = (v_1, \dots, v_n)$ = point in $\mathbf{R}^n$.
