
Solutions for Chapter 1.5: Elementary Matrices and a Method for Finding A⁻¹

Full solutions for Elementary Linear Algebra: Applications Version | 10th Edition

ISBN: 9780470432051

Textbook: Elementary Linear Algebra: Applications Version
Edition: 10
Author: Howard Anton, Chris Rorres
ISBN: 9780470432051

Elementary Linear Algebra: Applications Version was written by Howard Anton and Chris Rorres and is associated with the ISBN 9780470432051. This expansive textbook survival guide covers the following chapters and their solutions, and was created for the textbook Elementary Linear Algebra: Applications Version, edition 10. All 50 problems in Chapter 1.5: Elementary Matrices and a Method for Finding A⁻¹ have been answered, and more than 14,160 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Back substitution.

    Upper triangular systems are solved in reverse order, xn down to x1.
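
    A minimal back-substitution sketch in Python/NumPy; the matrix U and vector b below are illustrative, not from the text:

      import numpy as np

      def back_substitute(U, b):
          # Solve Ux = b for upper triangular U, working from x_n back to x_1.
          n = len(b)
          x = np.zeros(n)
          for i in range(n - 1, -1, -1):
              x[i] = (b[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
          return x

      U = np.array([[2.0, 1.0, 1.0],
                    [0.0, 3.0, 2.0],
                    [0.0, 0.0, 4.0]])
      b = np.array([5.0, 8.0, 4.0])
      print(back_substitute(U, b))   # should match np.linalg.solve(U, b)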

  • Change of basis matrix M.

    The old basis vectors vj are combinations Σ mij wi of the new basis vectors. The coordinates of c1 v1 + ... + cn vn = d1 w1 + ... + dn wn are related by d = M c. (For n = 2, set v1 = m11 w1 + m21 w2, v2 = m12 w1 + m22 w2.)
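
    As a small illustration with made-up basis vectors: column j of M gives vj written in the new basis, and coordinates transform by d = M c.

      import numpy as np

      # New basis w1, w2 (columns of W) and old basis v1, v2 (columns of V), chosen for illustration.
      W = np.array([[1.0, 1.0],
                    [0.0, 1.0]])
      V = np.array([[2.0, 1.0],
                    [1.0, 3.0]])
      M = np.linalg.solve(W, V)          # V = W M, so column j of M expresses v_j in the w's
      c = np.array([1.0, 2.0])           # coordinates in the old basis: x = c1*v1 + c2*v2
      d = M @ c                          # same vector in the new basis: x = d1*w1 + d2*w2
      print(np.allclose(V @ c, W @ d))   # True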

  • Column picture of Ax = b.

    The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C (A).

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½xᵀAx − xᵀb over growing Krylov subspaces.
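
    A bare-bones conjugate gradient iteration in NumPy, for illustration only; library routines such as scipy.sparse.linalg.cg are the practical choice. The test matrix is made up.

      import numpy as np

      def conjugate_gradient(A, b, iterations=25):
          # Minimize (1/2) x^T A x - x^T b for symmetric positive definite A.
          x = np.zeros_like(b)
          r = b - A @ x                  # residual, also the negative gradient
          p = r.copy()
          for _ in range(iterations):
              alpha = (r @ r) / (p @ A @ p)
              x = x + alpha * p
              r_new = r - alpha * (A @ p)
              beta = (r_new @ r_new) / (r @ r)
              p = r_new + beta * p
              r = r_new
          return x

      A = np.array([[4.0, 1.0], [1.0, 3.0]])
      b = np.array([1.0, 2.0])
      print(conjugate_gradient(A, b))    # close to np.linalg.solve(A, b)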

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
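
    For instance, checking S⁻¹AS = Λ numerically for a small made-up matrix with distinct eigenvalues:

      import numpy as np

      A = np.array([[4.0, 1.0],
                    [2.0, 3.0]])
      eigenvalues, S = np.linalg.eig(A)     # columns of S are eigenvectors
      Lambda = np.diag(eigenvalues)
      print(np.allclose(np.linalg.inv(S) @ A @ S, Lambda))   # True when A is diagonalizable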

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Hermitian matrix Aᴴ = Āᵀ = A.

    Complex analog (aji = āij) of a symmetric matrix.

  • Independent vectors v1, ..., vk.

    No combination c1 v1 + ... + ck vk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

  • Krylov subspace Kj(A, b).

    The subspace spanned by b, Ab, ..., Aʲ⁻¹b. Numerical methods approximate A⁻¹b by xj with residual b − Axj in this subspace. A good basis for Kj requires only multiplication by A at each step.
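
    A sketch of building the raw Krylov vectors b, Ab, ..., Aʲ⁻¹b in NumPy; in practice these are orthogonalized, for example by Arnoldi iteration. The matrix and vector are illustrative.

      import numpy as np

      def krylov_vectors(A, b, j):
          # Return the j vectors b, Ab, ..., A^(j-1) b as columns of a matrix.
          vectors = [b]
          for _ in range(j - 1):
              vectors.append(A @ vectors[-1])
          return np.column_stack(vectors)

      A = np.array([[2.0, 1.0], [1.0, 3.0]])
      b = np.array([1.0, 0.0])
      print(krylov_vectors(A, b, 2))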

  • |A⁻¹| = 1/|A| and |Aᵀ| = |A|.

    The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, volume of box = |det(A)|.
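
    These identities are easy to check numerically; the matrix below is arbitrary:

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [1.0, 3.0]])
      print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))   # True
      print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                    # True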

  • Multiplicities AM and GM.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
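
    A classic example where the two multiplicities differ is a 2×2 Jordan block; a quick numerical check (sketch):

      import numpy as np

      A = np.array([[3.0, 1.0],
                    [0.0, 3.0]])
      eigenvalue = 3.0
      # AM: multiplicity of the root of det(A - lambda*I); here (3 - lambda)^2, so AM = 2.
      # GM: dimension of the eigenspace, i.e. the null space of A - lambda*I.
      gm = A.shape[0] - np.linalg.matrix_rank(A - eigenvalue * np.eye(2))
      print(gm)   # 1, so GM = 1 < AM = 2 and A is not diagonalizable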

  • Multiplier ℓij.

    The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
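
    One elimination step in NumPy, using an illustrative 2×2 matrix:

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [6.0, 8.0]])
      l21 = A[1, 0] / A[0, 0]            # (entry to eliminate) / (1st pivot) = 3
      A[1, :] = A[1, :] - l21 * A[0, :]
      print(A)                           # the (2,1) entry is now 0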

  • Permutation matrix P.

    There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
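
    For example, building P from an ordering and checking that PA reorders the rows and that det P = ±1 (the ordering and matrix are made up):

      import numpy as np

      order = [2, 0, 1]                  # rows of I in this order
      P = np.eye(3)[order]
      A = np.array([[1.0, 2.0],
                    [3.0, 4.0],
                    [5.0, 6.0]])
      print(P @ A)                       # rows of A in the order 2, 0, 1
      print(round(np.linalg.det(P)))     # +1 or -1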

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

  • Polar decomposition A = Q H.

    Orthogonal Q times positive (semi)definite H.
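
    SciPy provides this factorization directly; the matrix below is arbitrary:

      import numpy as np
      from scipy.linalg import polar

      A = np.array([[2.0, 1.0],
                    [0.0, 3.0]])
      Q, H = polar(A)                 # A = Q @ H, Q orthogonal, H symmetric positive semidefinite
      print(np.allclose(A, Q @ H))              # True
      print(np.allclose(Q.T @ Q, np.eye(2)))    # True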

  • Positive definite matrix A.

    Symmetric matrix with positive eigenvalues and positive pivots. Definition: xᵀAx > 0 unless x = 0. Then A = LDLᵀ with diag(D) > 0.
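
    A common numerical test is to attempt a Cholesky factorization; a sketch with an illustrative matrix:

      import numpy as np

      A = np.array([[4.0, 1.0],
                    [1.0, 3.0]])
      try:
          np.linalg.cholesky(A)        # succeeds only for (numerically) positive definite A
          print("positive definite")
      except np.linalg.LinAlgError:
          print("not positive definite")
      print(np.all(np.linalg.eigvalsh(A) > 0))   # True: all eigenvalues positive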

  • Rank one matrix A = uvᵀ ≠ 0.

    Column and row spaces = lines cu and cv.

  • Reduced row echelon form R = rref(A).

    Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
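
    SymPy computes R = rref(A) exactly; the matrix below is illustrative:

      from sympy import Matrix

      A = Matrix([[1, 2, 3],
                  [2, 4, 7],
                  [1, 2, 4]])
      R, pivot_columns = A.rref()
      print(R)                # pivots are 1 with zeros above and below
      print(pivot_columns)    # indices of the pivot columns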

  • Schwarz inequality

    |v·w| ≤ ‖v‖ ‖w‖. Then |vᵀAw|² ≤ (vᵀAv)(wᵀAw) for positive definite A.
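
    A quick numerical check with arbitrary vectors:

      import numpy as np

      v = np.array([1.0, 2.0, 3.0])
      w = np.array([4.0, -1.0, 2.0])
      print(abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w))   # True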
