Solutions for Chapter 6.1: Matrix Solutions to Linear Systems

College Algebra | 7th Edition | ISBN: 9780134469164 | Authors: Robert F. Blitzer

Chapter 6.1: Matrix Solutions to Linear Systems includes 73 full step-by-step solutions. Since those 73 problems have been answered, more than 32,885 students have viewed the step-by-step solutions from this chapter. This textbook survival guide was created for College Algebra, 7th edition (ISBN 9780134469164), by Robert F. Blitzer, and covers the following chapters and their solutions.

Key Math Terms and definitions covered in this textbook
  • Column picture of Ax = b.

    The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
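A quick numeric sketch of the column picture, using a hypothetical 2x2 system in plain Python (the matrix and right side are made up for illustration):

```python
# Hypothetical 2x2 example: read Ax = b column by column.
A = [[1, 2],
     [3, 4]]
x = [1, 1]

# Row picture: b_i = sum over j of A[i][j] * x[j]
b_row = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]

# Column picture: b = x[0]*(column 0 of A) + x[1]*(column 1 of A)
b_col = [x[0] * A[i][0] + x[1] * A[i][1] for i in range(2)]

assert b_row == b_col  # the same b either way
```

Both views produce the same b; the column picture is the one that explains solvability, since b must be a combination of A's columns.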

  • Cyclic shift S.

    Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.
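This can be checked numerically with a small hypothetical case (n = 4, plain Python with the standard-library cmath module):

```python
import cmath

n = 4
# Cyclic shift: row i has a 1 in column (i - 1) mod n, i.e. S_21 = S_32 = S_43 = S_14 = 1.
S = [[1 if j == (i - 1) % n else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# Shifting n times returns every entry home: S^n = I.
P = S
for _ in range(n - 1):
    P = matmul(P, S)
assert P == [[1 if i == j else 0 for j in range(n)] for i in range(n)]

# A Fourier-matrix column v_k = w^k (w an nth root of 1) is an eigenvector.
w = cmath.exp(2j * cmath.pi / n)
v = [w**k for k in range(n)]
Sv = [v[(i - 1) % n] for i in range(n)]   # S applied to v just rotates its entries
lam = Sv[0] / v[0]                        # the eigenvalue, itself an nth root of 1
assert all(abs(Sv[i] - lam * v[i]) < 1e-12 for i in range(n))
assert abs(lam**n - 1) < 1e-12
```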

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^(-1) A S = Λ = eigenvalue matrix.
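A minimal sketch of S^(-1) A S = Λ, assuming NumPy is available; the 2x2 matrix is hypothetical and chosen to have two different eigenvalues (so it is automatically diagonalizable):

```python
import numpy as np

# Hypothetical example with two different eigenvalues (5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)        # columns of S are eigenvectors
Lambda = np.linalg.inv(S) @ A @ S    # S^(-1) A S should be the eigenvalue matrix

assert np.allclose(Lambda, np.diag(eigvals))
```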

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
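Elimination without row exchanges can be sketched in plain Python on a hypothetical 3x3 matrix, storing each multiplier in L and checking that the product LU recovers A:

```python
# Hypothetical 3x3 elimination producing A = LU (no row exchanges needed here).
A = [[2.0, 1.0, 1.0],
     [4.0, 3.0, 3.0],
     [8.0, 7.0, 9.0]]
n = len(A)
U = [row[:] for row in A]
L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

for k in range(n):                    # eliminate below the pivot U[k][k]
    for i in range(k + 1, n):
        m = U[i][k] / U[k][k]         # the multiplier, stored in L
        L[i][k] = m
        for j in range(k, n):
            U[i][j] -= m * U[k][j]

# L times U rebuilds the original A.
LU = [[sum(L[i][k] * U[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
assert all(abs(LU[i][j] - A[i][j]) < 1e-12 for i in range(n) for j in range(n))
```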

  • Exponential e^(At) = I + At + (At)^2/2! + ...

    has derivative Ae^(At); e^(At) u(0) solves u' = Au.
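The series can be summed directly for a small hypothetical matrix; for the rotation generator A = [[0, 1], [-1, 0]] the exact exponential is known to be a rotation matrix, which gives a check:

```python
import math

# Truncated series e^(At) = I + At + (At)^2/2! + ... for A = [[0, 1], [-1, 0]].
A = [[0.0, 1.0], [-1.0, 0.0]]
t = 0.5

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

E = [[1.0, 0.0], [0.0, 1.0]]      # running sum, starts at I
term = [[1.0, 0.0], [0.0, 1.0]]   # current term (At)^k / k!
At = [[A[i][j] * t for j in range(2)] for i in range(2)]
for k in range(1, 20):
    term = matmul(term, At)
    term = [[term[i][j] / k for j in range(2)] for i in range(2)]
    E = [[E[i][j] + term[i][j] for j in range(2)] for i in range(2)]

# For this A, e^(At) is the rotation [[cos t, sin t], [-sin t, cos t]].
exact = [[math.cos(t), math.sin(t)], [-math.sin(t), math.cos(t)]]
assert all(abs(E[i][j] - exact[i][j]) < 1e-9 for i in range(2) for j in range(2))
```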

  • Free columns of A.

    Columns without pivots; these are combinations of earlier columns.

  • Full column rank r = n.

    Independent columns, N(A) = {0}, no free variables.

  • Gauss-Jordan method.

    Invert A by row operations on [A I] to reach [I A^(-1)].
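A minimal plain-Python sketch of the method on a hypothetical 2x2 matrix: row-reduce the augmented block [A I] until the left half is I, and read A^(-1) off the right half.

```python
# Gauss-Jordan on [A I]: when the left half becomes I, the right half is A^(-1).
A = [[2.0, 1.0],
     [5.0, 3.0]]
n = 2
M = [A[i][:] + [1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # [A I]

for k in range(n):
    p = M[k][k]
    M[k] = [x / p for x in M[k]]          # scale the pivot row so the pivot is 1
    for i in range(n):
        if i != k:
            m = M[i][k]
            M[i] = [a - m * b for a, b in zip(M[i], M[k])]  # clear column k

Ainv = [row[n:] for row in M]             # right half is A^(-1)
# Here det A = 1, so A^(-1) = [[3, -1], [-5, 2]] exactly.
assert Ainv == [[3.0, -1.0], [-5.0, 2.0]]
```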

  • Hermitian matrix A^H = Ā^T = A.

    Complex analog a_ji = ā_ij of a symmetric matrix.

  • Inverse matrix A^(-1).

    Square matrix with A^(-1) A = I and A A^(-1) = I. No inverse if det A = 0, rank(A) < n, or Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^(-1) A^(-1) and (A^(-1))^T. Cofactor formula: (A^(-1))_ij = C_ji / det A.
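The reversed-order rule (AB)^(-1) = B^(-1) A^(-1) can be checked on hypothetical 2x2 matrices in plain Python, using the 2x2 cofactor formula for the inverse:

```python
# 2x2 cofactor formula: A^(-1) = [[d, -b], [-c, a]] / det A.
def inv2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    assert det != 0                  # no inverse when det = 0
    return [[d / det, -b / det], [-c / det, a / det]]

def mul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[2.0, 0.0], [1.0, 2.0]]

lhs = inv2(mul2(A, B))               # (AB)^(-1)
rhs = mul2(inv2(B), inv2(A))         # B^(-1) A^(-1), note the reversed order
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
```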

  • Jordan form J = M^(-1) A M.

    If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

  • Multiplicities AM and GM.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
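The two multiplicities can differ. A minimal sketch, assuming NumPy is available, on a hypothetical defective matrix where λ = 5 is a double root (AM = 2) but the eigenspace is only one-dimensional (GM = 1):

```python
import numpy as np

# Hypothetical defective matrix: a Jordan block with eigenvalue 5.
A = np.array([[5.0, 1.0],
              [0.0, 5.0]])

eigvals = np.linalg.eigvals(A)
assert np.allclose(sorted(eigvals), [5.0, 5.0])    # AM = 2: root appears twice

# GM = dim N(A - 5I) = n - rank(A - 5I)
gm = 2 - np.linalg.matrix_rank(A - 5.0 * np.eye(2))
assert gm == 1                                     # only one independent eigenvector
```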

  • Outer product uv^T.

    Column times row = rank-one matrix.
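A plain-Python sketch with hypothetical vectors u and v; every column of uv^T is a multiple of u, which is what makes the rank one:

```python
# Outer product uv^T: entry (i, j) is u[i] * v[j].
u = [1, 2, 3]
v = [4, 5]
M = [[ui * vj for vj in v] for ui in u]   # 3x2 matrix, column j = v[j] times u

assert M == [[4, 5], [8, 10], [12, 15]]
# Each column is a scalar multiple of u, so the rank is one.
assert [row[1] / row[0] for row in M] == [5/4, 5/4, 5/4]
```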

  • Pivot.

    The diagonal entry (first nonzero) at the time when a row is used in elimination.

  • Reflection matrix (Householder) Q = I - 2uu^T.

    Unit vector u is reflected to Qu = -u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^(-1) = Q.
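These three properties are easy to verify numerically. A minimal sketch, assuming NumPy is available and using a hypothetical unit vector u:

```python
import numpy as np

# Householder reflection Q = I - 2 u u^T for a hypothetical unit vector u.
u = np.array([1.0, 2.0, 2.0]) / 3.0          # ||u|| = 1
Q = np.eye(3) - 2.0 * np.outer(u, u)

assert np.allclose(Q @ u, -u)                # u is reflected to -u
x = np.array([2.0, -1.0, 0.0])               # u . x = 0: x lies in the mirror plane
assert abs(u @ x) < 1e-12
assert np.allclose(Q @ x, x)                 # vectors in the plane are unchanged
assert np.allclose(Q.T, Q)                   # Q^T = Q ...
assert np.allclose(Q @ Q, np.eye(3))         # ... and Q is its own inverse
```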

  • Simplex method for linear programming.

    The minimum cost vector x * is found by moving from comer to lower cost comer along the edges of the feasible set (where the constraints Ax = b and x > 0 are satisfied). Minimum cost at a comer!

  • Singular Value Decomposition (SVD).

    A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
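A minimal sketch, assuming NumPy is available; the 2x2 matrix is hypothetical and full rank, so both singular values are positive:

```python
import numpy as np

# SVD of a hypothetical full-rank matrix: A = U Sigma V^T.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

U, s, Vt = np.linalg.svd(A)
recon = U @ np.diag(s) @ Vt
assert np.allclose(recon, A)                 # A = U Sigma V^T
assert np.allclose(U.T @ U, np.eye(2))       # orthonormal columns in U
assert np.allclose(Vt @ Vt.T, np.eye(2))     # and in V

# A v_i = sigma_i u_i for each singular value sigma_i > 0
for i in range(2):
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])
```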

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Toeplitz matrix.

    Constant down each diagonal = time-invariant (shift-invariant) filter.
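A plain-Python sketch with hypothetical filter coefficients: entry (i, j) depends only on i - j, so every diagonal is constant.

```python
# Toeplitz matrix from hypothetical filter coefficients, zero elsewhere.
c = {-1: 0.5, 0: 1.0, 1: 0.25}
n = 4
T = [[c.get(i - j, 0.0) for j in range(n)] for i in range(n)]

# Constant down each diagonal: T[i][j] == T[i+1][j+1]
assert all(T[i][j] == T[i + 1][j + 1] for i in range(n - 1) for j in range(n - 1))
```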

  • Unitary matrix U^H = Ū^T = U^(-1).

    Orthonormal columns (complex analog of Q).