
Solutions for Chapter 7.3: The Inverse of a Square Matrix

Full solutions for College Algebra | 8th Edition

Textbook: College Algebra
Edition: 8
Author: Ron Larson
ISBN: 9781439048696

Chapter 7.3: The Inverse of a Square Matrix includes 41 full step-by-step solutions. Since all 41 problems in this chapter have been answered, more than 30585 students have viewed the full step-by-step solutions for it. This textbook survival guide was created for College Algebra, 8th edition, written by Ron Larson and associated with ISBN 9781439048696, and it covers this chapter along with the rest of the textbook.

Key Math Terms and definitions covered in this textbook
  • Characteristic equation det(A − λI) = 0.

    The n roots are the eigenvalues of A.
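
    As a quick illustration (my own sketch, not from the textbook), the roots of det(A − λI) = 0 can be compared with the eigenvalues NumPy computes directly; the 2×2 matrix here is a made-up example.

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, 2.0]])              # example matrix (hypothetical)
        coeffs = np.poly(A)                     # coefficients of det(A - lambda*I)
        roots = np.roots(coeffs)                # roots of the characteristic equation
        eigenvalues = np.linalg.eigvals(A)      # eigenvalues computed directly
        print(np.sort(roots), np.sort(eigenvalues))   # both give 1.0 and 3.0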

  • Cyclic shift S.

    Permutation with S_{21} = 1, S_{32} = 1, ..., finally S_{1n} = 1. Its eigenvalues are the nth roots e^{2πik/n} of 1; eigenvectors are the columns of the Fourier matrix F.

  • Echelon matrix U.

    The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

  • Elimination matrix = Elementary matrix E_{ij}.

    The identity matrix with an extra −ℓ_{ij} in the i, j entry (i ≠ j). Then E_{ij} A subtracts ℓ_{ij} times row j of A from row i.
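
    A minimal sketch (my own example) of building E_{21} for a 3×3 matrix and checking that E_{21} A subtracts ℓ_{21} times row 1 from row 2:

        import numpy as np

        A = np.array([[2.0, 1.0, 1.0],
                      [4.0, 3.0, 3.0],
                      [8.0, 7.0, 9.0]])   # example matrix (hypothetical)
        l21 = A[1, 0] / A[0, 0]           # multiplier: entry to eliminate / pivot
        E21 = np.eye(3)
        E21[1, 0] = -l21                  # identity plus an extra -l21 in the (2, 1) entry
        print(E21 @ A)                    # row 2 is now row 2 - l21 * row 1, so the (2, 1) entry is 0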

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with the multipliers ℓ_{ij} in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
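
    As a sketch (my own example, using SciPy's lu factorization rather than hand-coded elimination), the PA = LU form with a row exchange looks like this:

        import numpy as np
        from scipy.linalg import lu

        A = np.array([[0.0, 1.0],
                      [2.0, 3.0]])        # example that needs a row exchange (hypothetical)
        P, L, U = lu(A)                   # SciPy returns A = P @ L @ U
        print(np.allclose(A, P @ L @ U))        # True
        print(np.allclose(P.T @ A, L @ U))      # the PA = LU form, with the permutation written as P.T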

  • Fourier matrix F.

    Entries F_{jk} = e^{2πijk/n} give orthogonal columns, so F^H F = nI (conjugate transpose). Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ_k c_k e^{2πijk/n}.
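
    A short sketch (my own, assuming NumPy) that builds F, checks the orthogonality of its columns, and matches y = Fc against numpy.fft.ifft up to the 1/n scaling:

        import numpy as np

        n = 4
        j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        F = np.exp(2j * np.pi * j * k / n)              # F[j, k] = e^{2*pi*i*j*k/n}
        print(np.allclose(F.conj().T @ F, n * np.eye(n)))   # F^H F = n I
        c = np.array([1.0, 2.0, 3.0, 4.0])
        print(np.allclose(F @ c, n * np.fft.ifft(c)))       # y = F c is the inverse DFT times n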

  • Hessenberg matrix H.

    Triangular matrix with one extra nonzero adjacent diagonal.

  • Indefinite matrix.

    A symmetric matrix with eigenvalues of both signs (+ and - ).

  • Inverse matrix A^{-1}.

    Square matrix with A^{-1} A = I and A A^{-1} = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^{-1} A^{-1} and (A^{-1})^T. Cofactor formula: (A^{-1})_{ij} = C_{ji} / det A.
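
    A short NumPy check (my own example) of the rules (AB)^{-1} = B^{-1} A^{-1} and (A^T)^{-1} = (A^{-1})^T:

        import numpy as np

        A = np.array([[4.0, 7.0], [2.0, 6.0]])   # det A = 10, so A is invertible (example)
        B = np.array([[1.0, 2.0], [3.0, 5.0]])   # det B = -1 (example)
        Ainv = np.linalg.inv(A)
        print(np.allclose(A @ Ainv, np.eye(2)))                              # A A^{-1} = I
        print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ Ainv))    # (AB)^{-1} = B^{-1} A^{-1}
        print(np.allclose(np.linalg.inv(A.T), Ainv.T))                       # (A^T)^{-1} = (A^{-1})^T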

  • Linearly dependent v_1, ..., v_n.

    A combination other than all c_i = 0 gives Σ c_i v_i = 0.

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ_k a_{ik} b_{kj}. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
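
    A sketch (my own example) of the four equivalent ways to form AB: entry by entry, by columns, by rows, and as a sum of column-times-row outer products:

        import numpy as np

        A = np.array([[1.0, 2.0], [3.0, 4.0]])   # example matrices (hypothetical)
        B = np.array([[5.0, 6.0], [7.0, 8.0]])

        by_entries = np.array([[A[i, :] @ B[:, j] for j in range(2)] for i in range(2)])
        by_columns = np.column_stack([A @ B[:, j] for j in range(2)])
        by_rows    = np.vstack([A[i, :] @ B for i in range(2)])
        by_outer   = sum(np.outer(A[:, k], B[k, :]) for k in range(2))
        print(all(np.allclose(X, A @ B) for X in (by_entries, by_columns, by_rows, by_outer)))   # True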

  • Multiplier ℓ_{ij}.

    The pivot row j is multiplied by ℓ_{ij} and subtracted from row i to eliminate the i, j entry: ℓ_{ij} = (entry to eliminate) / (jth pivot).

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
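
    A small sketch (my own example, using SymPy) that reads the pivot columns off rref and takes them as a basis for the column space:

        from sympy import Matrix

        A = Matrix([[1, 2, 2, 4],
                    [1, 2, 3, 5],
                    [2, 4, 5, 9]])          # example; column 2 is twice column 1
        R, pivot_cols = A.rref()            # reduced row echelon form and pivot column indices
        print(pivot_cols)                   # (0, 2): columns 1 and 3 contain the pivots
        basis = [A[:, j] for j in pivot_cols]   # pivot columns of A form a basis for its column space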

  • Polar decomposition A = Q H.

    Orthogonal Q times positive (semi)definite H.
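
    A quick sketch (my own example) with scipy.linalg.polar, which returns the orthogonal factor and the positive semidefinite factor:

        import numpy as np
        from scipy.linalg import polar

        A = np.array([[1.0, 2.0], [3.0, 4.0]])   # example matrix (hypothetical)
        Q, H = polar(A)                          # A = Q @ H
        print(np.allclose(A, Q @ H))
        print(np.allclose(Q.T @ Q, np.eye(2)))                  # Q is orthogonal
        print(np.all(np.linalg.eigvalsh(H) >= -1e-12))          # H is symmetric positive semidefinite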

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, its eigenvalues are 1 or 0, and its eigenvectors are in S or S⊥. If the columns of A form a basis for S then P = A(A^T A)^{-1} A^T.
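
    A brief sketch (my own example) that forms P = A(A^T A)^{-1} A^T for a column basis A and checks P^2 = P = P^T and that the error is perpendicular to S:

        import numpy as np

        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [1.0, 2.0]])             # columns are a basis for the subspace S (example)
        P = A @ np.linalg.inv(A.T @ A) @ A.T   # projection matrix onto the column space of A
        b = np.array([6.0, 0.0, 0.0])          # example vector to project
        e = b - P @ b                          # error e = b - Pb
        print(np.allclose(P @ P, P), np.allclose(P, P.T))   # P^2 = P = P^T
        print(np.allclose(A.T @ e, 0))                      # e is perpendicular to S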

  • Pseudoinverse A^+ (Moore–Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).
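
    A quick sketch (my own example) with numpy.linalg.pinv, checking that A^+ A and A A^+ are projections and that rank(A^+) = rank(A):

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [2.0, 4.0],
                      [3.0, 6.0]])             # rank-1 example (column 2 = 2 * column 1)
        Aplus = np.linalg.pinv(A)              # the n-by-m pseudoinverse
        P_row = Aplus @ A                      # projection onto the row space of A
        P_col = A @ Aplus                      # projection onto the column space of A
        print(np.allclose(P_row @ P_row, P_row), np.allclose(P_col @ P_col, P_col))
        print(np.linalg.matrix_rank(Aplus) == np.linalg.matrix_rank(A))   # both equal 1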

  • Rayleigh quotient q(x) = x^T A x / x^T x for symmetric A: λ_min ≤ q(x) ≤ λ_max.

    Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A).
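
    A short sketch (my own example) checking that the Rayleigh quotient stays between λ_min and λ_max and is reached at the eigenvectors:

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, 3.0]])             # symmetric example matrix
        evals, evecs = np.linalg.eigh(A)       # eigenvalues in increasing order
        q = lambda x: (x @ A @ x) / (x @ x)    # Rayleigh quotient x^T A x / x^T x
        rng = np.random.default_rng(0)
        samples = [q(rng.standard_normal(2)) for _ in range(1000)]
        print(evals[0] <= min(samples) <= max(samples) <= evals[-1])       # True
        print(np.isclose(q(evecs[:, 0]), evals[0]), np.isclose(q(evecs[:, -1]), evals[-1]))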

  • Spanning set.

    Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

  • Sum V + W of subspaces.

    Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.
