Solutions for Chapter 4.9: Rank of a Matrix

Textbook: Elementary Linear Algebra with Applications
Edition: 9
Authors: Bernard Kolman, David Hill
ISBN: 9780132296540

Chapter 4.9: Rank of a Matrix includes 51 full step-by-step solutions. This expansive textbook survival guide covers the textbook's chapters and their solutions. Since all 51 problems in Chapter 4.9: Rank of a Matrix have been answered, more than 58,450 students have viewed full step-by-step solutions from this chapter. Elementary Linear Algebra with Applications was written by Bernard Kolman and David Hill and is associated with the ISBN 9780132296540. This textbook survival guide was created for the 9th edition of Elementary Linear Algebra with Applications.

Key Math Terms and definitions covered in this textbook
  • Big formula for n by n determinants.

    Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, with the rows taken in order 1, ..., n and the column order given by a permutation P. Each of the n! permutations P has a + or - sign.
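
    A minimal Python/NumPy sketch of the big formula (the 3 by 3 matrix is made up for illustration), checked against np.linalg.det:

      import numpy as np
      from itertools import permutations

      def big_formula_det(A):
          # Sum over all n! permutations P: sign(P) * a[0, P(0)] * ... * a[n-1, P(n-1)]
          n = A.shape[0]
          total = 0.0
          for P in permutations(range(n)):
              inversions = sum(P[i] > P[j] for i in range(n) for j in range(i + 1, n))
              sign = -1.0 if inversions % 2 else 1.0
              total += sign * np.prod([A[i, P[i]] for i in range(n)])
          return total

      A = np.array([[2.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
      print(big_formula_det(A), np.linalg.det(A))   # both print 8.0 (up to roundoff)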

  • Change of basis matrix M.

    The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2: v_1 = m_11 w_1 + m_21 w_2 and v_2 = m_12 w_1 + m_22 w_2.)
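
    A small numeric check (the bases and coordinates below are made up) that d = M c names the same point in both bases:

      import numpy as np

      W = np.array([[1.0, 1.0], [0.0, 2.0]])   # columns are the new basis vectors w1, w2
      M = np.array([[3.0, 1.0], [2.0, 4.0]])   # change of basis matrix
      V = W @ M                                # columns are the old basis vectors v_j = sum_i m_ij * w_i

      c = np.array([5.0, -1.0])                # coordinates in the old (v) basis
      d = M @ c                                # coordinates in the new (w) basis
      print(np.allclose(V @ c, W @ d))         # True: the same vector either way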

  • Diagonal matrix D.

    d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

  • Echelon matrix U.

    The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

  • Elimination matrix = Elementary matrix E_ij.

    The identity matrix with an extra -e_ij in the i, j entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.
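
    A short sketch (example numbers only) that builds E_ij and checks the row operation:

      import numpy as np

      def elimination_matrix(n, i, j, e_ij):
          # Identity matrix with an extra -e_ij in the (i, j) entry
          E = np.eye(n)
          E[i, j] = -e_ij
          return E

      A = np.array([[2.0, 1.0], [4.0, 3.0]])
      E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 times row 0 from row 1
      print(E @ A)                           # [[2. 1.] [0. 1.]]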

  • Factorization

    A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers e_ij (and e_ii = 1) brings U back to A.
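
    A minimal sketch of A = LU by elimination, assuming no row exchanges are needed (true for this made-up matrix):

      import numpy as np

      def lu_no_exchanges(A):
          # Store each multiplier e_ij in L (with e_ii = 1) while reducing A to U
          n = A.shape[0]
          U = A.astype(float).copy()
          L = np.eye(n)
          for j in range(n):
              for i in range(j + 1, n):
                  L[i, j] = U[i, j] / U[j, j]
                  U[i, :] -= L[i, j] * U[j, :]
          return L, U

      A = np.array([[2.0, 1.0, 1.0], [4.0, 3.0, 3.0], [8.0, 7.0, 9.0]])
      L, U = lu_no_exchanges(A)
      print(np.allclose(L @ U, A))   # True: L brings U back to A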

  • Independent vectors v_1, ..., v_k.

    No combination c_1 v_1 + ... + c_k v_k gives the zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
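
    One numeric way to test independence (example vectors below): the columns are independent exactly when the rank equals the number of columns:

      import numpy as np

      v1 = np.array([1.0, 0.0, 1.0])
      v2 = np.array([0.0, 1.0, 1.0])
      v3 = np.array([1.0, 1.0, 2.0])           # v3 = v1 + v2, so the three are dependent

      A = np.column_stack([v1, v2, v3])
      print(np.linalg.matrix_rank(A))          # 2 < 3 columns: dependent
      B = np.column_stack([v1, v2])
      print(np.linalg.matrix_rank(B) == 2)     # True: v1, v2 are independent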

  • Inverse matrix A^-1.

    Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
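
    A quick NumPy check of the product and transpose rules (random matrices, which are invertible with probability 1):

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.standard_normal((3, 3))
      B = rng.standard_normal((3, 3))

      print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A)))   # (AB)^-1 = B^-1 A^-1
      print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))                      # (A^T)^-1 = (A^-1)^T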

  • Least squares solution x̂.

    The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
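
    A minimal sketch (made-up data) that solves the normal equations and checks the orthogonality of the error:

      import numpy as np

      A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # 3 equations, 2 unknowns
      b = np.array([6.0, 0.0, 0.0])

      x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # solves A^T A x = A^T b
      e = b - A @ x_hat
      print(x_hat)                       # [ 5. -3.]
      print(np.allclose(A.T @ e, 0))     # True: the error is orthogonal to the columns of A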

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ_k a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
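
    A small check (arbitrary 2 by 2 matrices) that the column-at-a-time and columns-times-rows descriptions agree with A @ B:

      import numpy as np

      A = np.array([[1.0, 2.0], [3.0, 4.0]])
      B = np.array([[5.0, 6.0], [7.0, 8.0]])

      by_columns = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])   # column j of AB = A (column j of B)
      by_outer = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))    # sum of (column k)(row k)
      print(np.allclose(by_columns, A @ B), np.allclose(by_outer, A @ B))      # True True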

  • Orthogonal matrix Q.

    Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
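
    A sketch (Q taken from the QR factorization of a random matrix) checking Q^T = Q^-1 and length preservation:

      import numpy as np

      rng = np.random.default_rng(1)
      Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # Q has orthonormal columns

      x = rng.standard_normal(4)
      print(np.allclose(Q.T @ Q, np.eye(4)))                         # Q^T Q = I
      print(np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))   # ||Qx|| = ||x||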

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
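
    A sketch using SymPy's rref on a made-up matrix to read off the pivot columns:

      import sympy as sp

      A = sp.Matrix([[1, 2, 2, 4],
                     [1, 2, 3, 5],
                     [2, 4, 5, 9]])
      R, pivot_cols = A.rref()
      print(pivot_cols)               # (0, 2): columns 0 and 2 contain the pivots
      print(A[:, list(pivot_cols)])   # those columns of A are a basis for the column space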

  • Projection matrix P onto subspace S.

    The projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A (A^T A)^-1 A^T.
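
    A minimal sketch (basis chosen arbitrarily) of P = A (A^T A)^-1 A^T and its properties:

      import numpy as np

      A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # columns are a basis for the subspace S
      P = A @ np.linalg.inv(A.T @ A) @ A.T

      b = np.array([6.0, 0.0, 0.0])
      e = b - P @ b                                        # error from the projection p = Pb
      print(np.allclose(P @ P, P), np.allclose(P, P.T))    # True True: P^2 = P = P^T
      print(np.allclose(A.T @ e, 0))                       # True: e is perpendicular to S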

  • Rank r(A)

    = number of pivots = dimension of column space = dimension of row space.
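
    A quick check on an example matrix that the rank of A equals the rank of A^T (column rank = row rank):

      import numpy as np

      A = np.array([[1.0, 2.0, 3.0],
                    [2.0, 4.0, 6.0],
                    [1.0, 0.0, 1.0]])      # row 2 is twice row 1
      print(np.linalg.matrix_rank(A))      # 2
      print(np.linalg.matrix_rank(A.T))    # 2: the row space has the same dimension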

  • Singular Value Decomposition (SVD).

    A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
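
    A sketch (random rectangular matrix) of A = U Σ V^T and the relation A v_i = σ_i u_i:

      import numpy as np

      rng = np.random.default_rng(2)
      A = rng.standard_normal((4, 3))
      U, s, Vt = np.linalg.svd(A, full_matrices=False)

      print(np.allclose(U @ np.diag(s) @ Vt, A))         # A = U Sigma V^T
      i = 0
      print(np.allclose(A @ Vt[i, :], s[i] * U[:, i]))   # A v_i = sigma_i u_i (v_i is row i of V^T)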

  • Spanning set.

    Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

  • Spectrum of A = the set of eigenvalues {λ_1, ..., λ_n}.

    Spectral radius = max of |λ_i|.
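
    A short sketch computing the spectrum and spectral radius of a made-up matrix:

      import numpy as np

      A = np.array([[0.0, 1.0], [-2.0, -3.0]])
      eigenvalues = np.linalg.eigvals(A)       # the spectrum: -1 and -2 (in some order)
      print(np.max(np.abs(eigenvalues)))       # 2.0, the spectral radius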

  • Symmetric factorizations A = L D L^T and A = Q Λ Q^T.

    Signs in Λ = signs in D (the signs of the eigenvalues match the signs of the pivots).
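
    A small check (made-up symmetric indefinite matrix) that the pivot signs in D match the eigenvalue signs in Λ:

      import numpy as np

      A = np.array([[2.0, 1.0, 0.0],
                    [1.0, -1.0, 1.0],
                    [0.0, 1.0, 3.0]])

      # Pivots of A (the diagonal of D in A = L D L^T), via elimination without row exchanges
      U = A.copy()
      for j in range(3):
          for i in range(j + 1, 3):
              U[i, :] -= (U[i, j] / U[j, j]) * U[j, :]

      print(np.sign(np.diag(U)))              # [ 1. -1.  1.]: pivot signs
      print(np.sign(np.linalg.eigh(A)[0]))    # [-1.  1.  1.]: same count of + and - signs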

  • Toeplitz matrix.

    Constant down each diagonal = time-invariant (shift-invariant) filter.
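
    A quick sketch (example values) with scipy.linalg.toeplitz; the result is constant down each diagonal:

      from scipy.linalg import toeplitz

      # The first column and the first row determine the whole matrix
      print(toeplitz([1, 2, 3], [1, 4, 5]))
      # [[1 4 5]
      #  [2 1 4]
      #  [3 2 1]]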

  • Vector v in R^n.

    Sequence of n real numbers v = (v_1, ..., v_n) = a point in R^n.