
Solutions for Chapter 9.1: LU-Decompositions

Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version | 11th Edition

Textbook: Elementary Linear Algebra, Binder Ready Version: Applications Version
Edition: 11
Author: Howard Anton, Chris Rorres
ISBN: 9781118474228

This textbook survival guide was created for Elementary Linear Algebra, Binder Ready Version: Applications Version, 11th edition, written by Howard Anton and Chris Rorres and associated with ISBN 9781118474228. The guide covers the textbook's chapters and their solutions. Chapter 9.1: LU-Decompositions includes 22 full step-by-step solutions, and more than 15662 students have viewed full step-by-step solutions from this chapter.
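
As a quick illustration of the chapter's topic (not one of the textbook's exercises), here is a minimal sketch of an LU-based solve using SciPy; the matrix A and right-hand side b below are made-up examples.

```python
import numpy as np
from scipy.linalg import lu, solve_triangular

# Example matrix (not from the textbook) factored as A = P L U.
A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
b = np.array([1.0, 2.0, 3.0])

P, L, U = lu(A)                          # L unit lower triangular, U upper triangular

# Solve Ax = b by forward and back substitution on the triangular factors.
y = solve_triangular(L, P.T @ b, lower=True)
x = solve_triangular(U, y)

assert np.allclose(A @ x, b)
```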

Key Math Terms and definitions covered in this textbook
  • Change of basis matrix M.

    The old basis vectors vⱼ are combinations Σᵢ mᵢⱼwᵢ of the new basis vectors. The coordinates of c₁v₁ + ... + cₙvₙ = d₁w₁ + ... + dₙwₙ are related by d = Mc. (For n = 2: v₁ = m₁₁w₁ + m₂₁w₂, v₂ = m₁₂w₁ + m₂₂w₂.)
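
A small NumPy sketch (bases and numbers invented here for illustration) shows the coordinate change d = Mc:

```python
import numpy as np

# New basis w1, w2 = standard basis; old basis v1 = (1, 1), v2 = (1, -1).
# Column j of M holds the new-basis coordinates of the old basis vector v_j.
M = np.array([[1.0,  1.0],
              [1.0, -1.0]])

c = np.array([2.0, 3.0])                 # coordinates in the old basis
d = M @ c                                # coordinates in the new basis: d = Mc
print(d)                                 # [5. -1.]  (the vector 2*v1 + 3*v2)
```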

  • Cofactor Cij.

    Remove row i and column j; multiply the determinant by (−1)^(i+j).
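
A minimal helper (written here for illustration, not taken from the textbook) makes the definition concrete:

```python
import numpy as np

def cofactor(A, i, j):
    """C_ij (0-based): delete row i and column j, take det, apply the sign (-1)**(i+j)."""
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(cofactor(A, 0, 0))                 # remaining entry is 4, so C_00 = 4.0
```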

  • Condition number

    cond(A) = c(A) = ‖A‖ ‖A⁻¹‖ = σmax/σmin. In Ax = b, the relative change ‖δx‖/‖x‖ is less than cond(A) times the relative change ‖δb‖/‖b‖. Condition numbers measure the sensitivity of the output to changes in the input.
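
For instance, NumPy's np.linalg.cond computes this directly; the nearly singular matrix below is a made-up example:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])            # nearly singular, hence poorly conditioned

print(np.linalg.cond(A))                 # ||A|| ||A^-1|| in the 2-norm, about 4e4
```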

  • Covariance matrix Σ.

    When random variables xᵢ have mean = average value = 0, their covariances Σᵢⱼ are the averages of xᵢxⱼ. With means x̄ᵢ, the matrix Σ = mean of (x − x̄)(x − x̄)ᵀ is positive (semi)definite; Σ is diagonal if the xᵢ are independent.
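
A quick NumPy check with simulated data (sample size and seed chosen arbitrarily here):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))           # 1000 samples of 3 independent variables

Sigma = np.cov(X, rowvar=False)          # 3 x 3 sample covariance matrix
print(np.linalg.eigvalsh(Sigma))         # all positive: positive (semi)definite
print(np.round(Sigma, 2))                # close to diagonal: variables independent
```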

  • Ellipse (or ellipsoid) xᵀAx = 1.

    A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A⁻¹y‖² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σᵢ.)
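
A short eigendecomposition sketch (with an example matrix chosen here) recovers the axis lengths 1/√λ:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 8.0]])               # positive definite example

lam, V = np.linalg.eigh(A)               # eigenvalues and eigenvector axes
print(1.0 / np.sqrt(lam))                # semi-axis lengths of x^T A x = 1
```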

  • Exponential e^(At) = I + At + (At)²/2! + ...

    has derivative Ae^(At); e^(At)u(0) solves u′ = Au.
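
For example, SciPy's expm evaluates e^(At); the matrix A below (a made-up rotation generator) is only for illustration:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])              # example generator: u' = Au rotates the plane

u0 = np.array([1.0, 0.0])
t = np.pi / 2
print(expm(A * t) @ u0)                  # u(t) = e^(At) u(0), approximately [0, -1]
```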

  • Hilbert matrix hilb(n).

    Entries Hᵢⱼ = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but with extremely small λmin and a large condition number: H is ill-conditioned.
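
A quick check with SciPy (size n = 8 chosen arbitrarily here):

```python
import numpy as np
from scipy.linalg import hilbert

H = hilbert(8)                           # H_ij = 1/(i + j - 1), 8 x 8
print(np.linalg.cond(H))                 # about 1e10: severely ill-conditioned
```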

  • Identity matrix I (or In).

    Diagonal entries = 1, off-diagonal entries = 0.

  • Indefinite matrix.

    A symmetric matrix with eigenvalues of both signs (+ and - ).

  • Inverse matrix A⁻¹.

    Square matrix with A⁻¹A = I and AA⁻¹ = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and Aᵀ are B⁻¹A⁻¹ and (A⁻¹)ᵀ. Cofactor formula: (A⁻¹)ᵢⱼ = Cⱼᵢ / det A.
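
A short NumPy sketch (example matrices invented here) verifies the product and transpose rules:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
B = np.array([[1.0, 2.0],
              [3.0, 5.0]])

# (AB)^-1 = B^-1 A^-1  and  (A^T)^-1 = (A^-1)^T
assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A))
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
```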

  • Iterative method.

    A sequence of steps intended to approach the desired solution.

  • Kronecker product (tensor product) A ⊗ B.

    Blocks aᵢⱼB; the eigenvalues are the products λᵢ(A)λⱼ(B).
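
For instance, with np.kron (example matrices chosen here):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = np.array([[4.0, 0.0],
              [0.0, 5.0]])

K = np.kron(A, B)                        # 4 x 4 matrix of blocks a_ij * B

# Every eigenvalue of K is a product of an eigenvalue of A and one of B.
eig_K = np.sort(np.linalg.eigvals(K))
products = np.sort([la * lb for la in np.linalg.eigvals(A)
                            for lb in np.linalg.eigvals(B)])
assert np.allclose(eig_K, products)
```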

  • Linear combination cv + dw or Σ cⱼvⱼ.

    Vector addition and scalar multiplication.

  • Nullspace N(A)

    = all solutions to Ax = 0. Dimension n − r = (# columns) − rank.
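
A quick SciPy illustration (example matrix chosen here) of the dimension count:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # rank 1, n = 3 columns

N = null_space(A)                        # columns form a basis of N(A)
print(N.shape[1])                        # dimension n - r = 3 - 1 = 2
assert np.allclose(A @ N, 0)
```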

  • Orthonormal vectors q₁, ..., qₙ.

    Dot products are qᵢᵀqⱼ = 0 if i ≠ j and qᵢᵀqᵢ = 1. The matrix Q with these orthonormal columns has QᵀQ = I. If m = n then Qᵀ = Q⁻¹ and q₁, ..., qₙ is an orthonormal basis for Rⁿ: every v = Σ (vᵀqⱼ)qⱼ.
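
For example, a QR factorization produces orthonormal columns; the random matrix below is only an illustration:

```python
import numpy as np

A = np.random.default_rng(1).normal(size=(4, 4))   # random full-rank matrix
Q, R = np.linalg.qr(A)                             # columns of Q are orthonormal

assert np.allclose(Q.T @ Q, np.eye(4))             # Q^T Q = I

v = np.array([1.0, 2.0, 3.0, 4.0])
# Expansion in the orthonormal basis: v = sum over j of (v . q_j) q_j
assert np.allclose(Q @ (Q.T @ v), v)
```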

  • Pseudoinverse A⁺ (Moore–Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A⁺) = N(Aᵀ). A⁺A and AA⁺ are the projection matrices onto the row space and column space. rank(A⁺) = rank(A).
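
For instance, with np.linalg.pinv (example matrix chosen here):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 0.0]])               # 3 x 2 matrix of rank 1

A_plus = np.linalg.pinv(A)               # 2 x 3 Moore-Penrose pseudoinverse

print(A_plus @ A)                        # projection onto the row space
assert np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A)
```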

  • Rank one matrix A = uvᵀ ≠ 0.

    Column and row spaces = lines cu and cv.
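
A one-line NumPy illustration (vectors chosen here):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

A = np.outer(u, v)                       # A = u v^T, a 3 x 2 matrix
print(np.linalg.matrix_rank(A))          # 1
```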

  • Rotation matrix

    R = [c −s; s c] rotates the plane by θ and R⁻¹ = Rᵀ rotates back by −θ. Eigenvalues are e^(iθ) and e^(−iθ); eigenvectors are (1, ±i). c, s = cos θ, sin θ.
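
A short NumPy sketch (angle chosen arbitrarily here) checks these properties:

```python
import numpy as np

theta = np.pi / 3                        # arbitrary angle
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])                  # rotates the plane by theta

assert np.allclose(R.T @ R, np.eye(2))   # R^-1 = R^T (rotate back by -theta)
print(np.linalg.eigvals(R))              # e^(i theta) and e^(-i theta)
```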

  • Singular Value Decomposition (SVD).

    A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular values σᵢ > 0. The last columns are orthonormal bases of the nullspaces.
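
For example, with np.linalg.svd (example matrix chosen here):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

U, s, Vt = np.linalg.svd(A)              # A = U Sigma V^T
assert np.allclose(U @ np.diag(s) @ Vt, A)

# A v_i = sigma_i u_i for each singular value
assert np.allclose(A @ Vt.T[:, 0], s[0] * U[:, 0])
```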

  • Spanning set.

    Combinations of v₁, ..., vₘ fill the space. The columns of A span C(A)!
