
Solutions for Chapter 1.4: Linear Algebra and Its Applications 5th Edition

Textbook: Linear Algebra and Its Applications
Edition: 5
Author: David C. Lay; Steven R. Lay; Judi J. McDonald
ISBN: 9780321982384

Chapter 1.4 includes 42 full step-by-step solutions. All 42 problems in this chapter have been answered, and more than 40,831 students have viewed the full step-by-step solutions from this chapter. This expansive textbook survival guide covers the textbook's chapters and their solutions. Linear Algebra and Its Applications was written by David C. Lay, Steven R. Lay, and Judi J. McDonald and is associated with ISBN 9780321982384. This textbook survival guide was created for Linear Algebra and Its Applications, edition 5.

Key math terms and definitions covered in this textbook
  • Block matrix.

    A matrix can be partitioned into matrix blocks by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
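
    As a quick, non-textbook illustration, the following NumPy sketch partitions two made-up 4×4 matrices into 2×2 blocks and checks that block multiplication reproduces AB:

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.standard_normal((4, 4))      # made-up matrices for the check
      B = rng.standard_normal((4, 4))

      # Cut between rows 2 and 3 and between columns 2 and 3.
      A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
      B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

      AB_blocks = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
                            [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])
      assert np.allclose(AB_blocks, A @ B)  # block product equals the full product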

  • Change of basis matrix M.

    The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2, set v_1 = m_11 w_1 + m_21 w_2 and v_2 = m_12 w_1 + m_22 w_2.)
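
    A small numerical check with hypothetical bases (not from the textbook): if the columns of W are the new basis and V = W M column by column, then the coordinate vectors satisfy d = M c.

      import numpy as np

      W = np.array([[1.0, 1.0],
                    [0.0, 1.0]])       # columns are the new basis w_1, w_2 (made up)
      M = np.array([[2.0, 1.0],
                    [1.0, 3.0]])       # v_j = sum_i m_ij w_i, i.e. V = W M
      V = W @ M                        # columns are the old basis v_1, v_2

      c = np.array([5.0, -2.0])        # coordinates in the old basis
      d = M @ c                        # coordinates in the new basis
      assert np.allclose(V @ c, W @ d) # same vector in both bases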

  • Cholesky factorization

    A = C^T C = (L√D)(L√D)^T for positive definite A.
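
    For a concrete check (the matrix below is made up), NumPy's cholesky routine returns a lower triangular L with A = L L^T; taking C = L^T gives the A = C^T C form used here:

      import numpy as np

      A = np.array([[4.0, 2.0],
                    [2.0, 3.0]])       # a small positive definite matrix

      L = np.linalg.cholesky(A)        # lower triangular, A = L L^T
      C = L.T                          # upper triangular factor
      assert np.allclose(C.T @ C, A)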

  • Condition number

    cond(A) = c(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx|| / ||x|| is less than cond(A) times the relative change ||δb|| / ||b||. Condition numbers measure the sensitivity of the output to changes in the input.
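
    A short NumPy check on a made-up, nearly singular matrix: ||A|| ||A^-1||, σ_max/σ_min, and numpy.linalg.cond all give the same number in the 2-norm:

      import numpy as np

      A = np.array([[1.0, 1.0],
                    [1.0, 1.0001]])    # nearly singular, so badly conditioned

      sigma = np.linalg.svd(A, compute_uv=False)
      c1 = sigma[0] / sigma[-1]                                    # sigma_max / sigma_min
      c2 = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
      c3 = np.linalg.cond(A)                                       # 2-norm condition number
      assert np.allclose([c1, c2], c3)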

  • Cross product u × v in R^3.

    Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of the matrix with rows [i j k], [u_1 u_2 u_3], [v_1 v_2 v_3].
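
    A quick check with made-up vectors: numpy.cross returns a vector perpendicular to both inputs whose length equals the parallelogram area:

      import numpy as np

      u = np.array([1.0, 2.0, 0.0])    # made-up vectors in R^3
      v = np.array([0.0, 1.0, 3.0])

      w = np.cross(u, v)
      assert np.isclose(w @ u, 0.0) and np.isclose(w @ v, 0.0)   # perpendicular to u and v

      cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
      area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1.0 - cos_t**2)
      assert np.isclose(np.linalg.norm(w), area)                 # length = parallelogram area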

  • Diagonalization

    Λ = S^-1 A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
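
    The sketch below (a made-up 2×2 matrix with distinct eigenvalues) uses numpy.linalg.eig to build S and Λ and checks Λ = S^-1 A S and A^3 = S Λ^3 S^-1:

      import numpy as np

      A = np.array([[4.0, 1.0],
                    [2.0, 3.0]])       # eigenvalues 5 and 2, so 2 independent eigenvectors

      eigvals, S = np.linalg.eig(A)    # S = eigenvector matrix
      Lam = np.diag(eigvals)           # Lambda = eigenvalue matrix

      assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)
      assert np.allclose(np.linalg.matrix_power(A, 3),
                         S @ np.diag(eigvals**3) @ np.linalg.inv(S))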

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Fast Fourier Transform (FFT).

    A factorization of the Fourier matrix F_n into ℓ = log_2 n matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^-1 c can be computed with nℓ/2 multiplications. Revolutionary.
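
    As a non-textbook sanity check, the sketch below builds an 8×8 DFT matrix explicitly and compares it with numpy.fft.fft; note that NumPy's sign convention for the exponent may differ from the book's definition of F_n:

      import numpy as np

      n = 8
      x = np.random.default_rng(1).standard_normal(n)

      # Explicit DFT matrix with entries exp(-2*pi*i*j*k/n), matching numpy.fft.fft.
      j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
      F = np.exp(-2j * np.pi * j * k / n)

      assert np.allclose(F @ x, np.fft.fft(x))   # FFT gives the same result in O(n log n)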

  • Kronecker product (tensor product) A ⊗ B.

    Blocks a_ij B, eigenvalues λ_p(A) λ_q(B).
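
    A small check with made-up triangular matrices: numpy.kron builds the blocks a_ij B, and the eigenvalues of A ⊗ B are exactly the products λ_p(A) λ_q(B):

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [0.0, 3.0]])       # eigenvalues 2, 3
      B = np.array([[1.0, 4.0],
                    [0.0, 5.0]])       # eigenvalues 1, 5

      K = np.kron(A, B)                # 4x4 matrix of blocks a_ij * B

      eig_K = np.sort(np.linalg.eigvals(K))
      eig_products = np.sort([lp * lq for lp in np.linalg.eigvals(A)
                                      for lq in np.linalg.eigvals(B)])
      assert np.allclose(eig_K, eig_products)    # 2, 3, 10, 15 in both cases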

  • Least squares solution x̂.

    The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
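
    A minimal NumPy example with made-up data: numpy.linalg.lstsq returns x̂, and the residual e = b - A x̂ is orthogonal to every column of A:

      import numpy as np

      A = np.array([[1.0, 1.0],
                    [1.0, 2.0],
                    [1.0, 3.0]])       # 3 equations, 2 unknowns (made-up data)
      b = np.array([1.0, 2.0, 2.0])

      x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
      e = b - A @ x_hat
      assert np.allclose(A.T @ e, 0.0)           # residual is orthogonal to the columns of A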

  • Normal equation A^T A x̂ = A^T b.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - A x̂) = 0.
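
    Continuing the same made-up data as above, solving the normal equations directly gives the same x̂ as NumPy's least squares routine (in practice lstsq or a QR factorization is preferred numerically, since forming A^T A squares the condition number):

      import numpy as np

      A = np.array([[1.0, 1.0],
                    [1.0, 2.0],
                    [1.0, 3.0]])
      b = np.array([1.0, 2.0, 2.0])

      x_normal = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations
      x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
      assert np.allclose(x_normal, x_lstsq)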

  • Orthonormal vectors q_1, ..., q_n.

    Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ_j (v^T q_j) q_j.
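
    A quick check on a made-up square matrix: the Q from numpy.linalg.qr has orthonormal columns, Q^T Q = I, and any vector can be rebuilt from the expansion v = Σ (v^T q_j) q_j:

      import numpy as np

      rng = np.random.default_rng(2)
      Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # columns q_1, q_2, q_3 are orthonormal

      assert np.allclose(Q.T @ Q, np.eye(3))

      v = rng.standard_normal(3)
      v_rebuilt = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
      assert np.allclose(v_rebuilt, v)                   # v = sum_j (v^T q_j) q_j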

  • Permutation matrix P.

    There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. P A puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges needed to reach I.
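
    A small illustration with a made-up order: taking the rows of I in a chosen order gives P, P A reorders the rows of A the same way, and det P is ±1:

      import numpy as np

      order = [2, 0, 1]                # one of the 3! = 6 orders of the rows of I
      P = np.eye(3)[order]

      A = np.arange(9.0).reshape(3, 3)
      assert np.allclose(P @ A, A[order])        # P A puts the rows of A in that order
      assert np.isclose(np.linalg.det(P), 1.0)   # this permutation is even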

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
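
    As a non-textbook illustration, SymPy's rref reports the pivot column indices directly (the example matrix is made up):

      import sympy as sp

      A = sp.Matrix([[1, 2, 2, 4],
                     [1, 2, 3, 5],
                     [2, 4, 5, 9]])    # made-up matrix of rank 2

      R, pivot_cols = A.rref()         # reduced row echelon form and pivot column indices
      print(pivot_cols)                # (0, 2): the first and third columns are the pivot columns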

  • Polar decomposition A = Q H.

    Orthogonal Q times positive (semi)definite H.
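
    SciPy provides this factorization directly as scipy.linalg.polar; a quick check on a made-up 2×2 matrix:

      import numpy as np
      from scipy.linalg import polar

      A = np.array([[2.0, -1.0],
                    [1.0,  1.0]])      # made-up invertible matrix

      Q, H = polar(A)                  # A = Q H
      assert np.allclose(Q @ H, A)
      assert np.allclose(Q.T @ Q, np.eye(2))        # Q is orthogonal
      assert np.all(np.linalg.eigvalsh(H) > 0)      # H is positive definite here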

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A are a basis for S, then P = A (A^T A)^-1 A^T.
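
    A short NumPy check (made-up basis for S in R^3): building P = A (A^T A)^-1 A^T gives P^2 = P = P^T, and the error b - Pb is perpendicular to S:

      import numpy as np

      A = np.array([[1.0, 0.0],
                    [1.0, 1.0],
                    [0.0, 1.0]])       # columns form a made-up basis for S
      P = A @ np.linalg.inv(A.T @ A) @ A.T

      assert np.allclose(P @ P, P) and np.allclose(P, P.T)

      b = np.array([1.0, 2.0, 4.0])
      e = b - P @ b
      assert np.allclose(A.T @ e, 0.0)           # error is perpendicular to S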

  • Row picture of Ax = b.

    Each equation gives a plane in R^n; the planes intersect at x.

  • Singular Value Decomposition

    (SVD) A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
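
    A minimal check on a made-up 2×3 matrix with numpy.linalg.svd: the relation A v_i = σ_i u_i holds for each of the r = rank(A) singular triples:

      import numpy as np

      A = np.array([[3.0, 1.0, 1.0],
                    [-1.0, 3.0, 1.0]])   # made-up 2x3 matrix of rank 2

      U, s, Vt = np.linalg.svd(A)         # A = U Sigma V^T
      r = int(np.sum(s > 1e-12))          # numerical rank

      for i in range(r):
          assert np.allclose(A @ Vt[i], s[i] * U[:, i])   # A v_i = sigma_i u_i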

  • Trace of A

    = sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
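
    A two-line numerical check on made-up matrices: the trace equals the sum of the eigenvalues, and Tr AB = Tr BA:

      import numpy as np

      rng = np.random.default_rng(3)
      A = rng.standard_normal((4, 4))
      B = rng.standard_normal((4, 4))

      assert np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A)).real)
      assert np.isclose(np.trace(A @ B), np.trace(B @ A))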

  • Vector space V.

    Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.
