Solutions for Chapter 3.2: DETERMINANTS

Elementary Linear Algebra: A Matrix Approach | 2nd Edition | ISBN: 9780131871410 | Authors: Lawrence E. Spence

Full solutions for Elementary Linear Algebra: A Matrix Approach | 2nd Edition

ISBN: 9780131871410

Textbook: Elementary Linear Algebra: A Matrix Approach
Edition: 2
Author: Lawrence E. Spence
ISBN: 9780131871410

Elementary Linear Algebra: A Matrix Approach (2nd edition, ISBN 9780131871410) was written by Lawrence E. Spence. Chapter 3.2: DETERMINANTS includes 85 full step-by-step solutions, and more than 34,899 students have viewed solutions from this chapter. This textbook survival guide covers every chapter of the book.

Key Math Terms and definitions covered in this textbook
  • Block matrix.

    A matrix can be partitioned into matrix blocks by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
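
A small NumPy sketch of this rule (the block values here are made up for illustration): np.block assembles a partitioned matrix, and the blocks multiply exactly as ordinary entries do.

```python
import numpy as np

# Hypothetical blocks, chosen only for illustration.
A11 = np.array([[1.0, 2.0], [3.0, 4.0]])
A12 = np.zeros((2, 2))
A21 = np.eye(2)
A22 = np.array([[5.0, 6.0], [7.0, 8.0]])
B11, B12, B21, B22 = A22, A21, A12, A11

A = np.block([[A11, A12], [A21, A22]])   # 4x4 partitioned matrix
B = np.block([[B11, B12], [B21, B22]])

# Block multiplication: the (1,1) block of AB is A11 B11 + A12 B21.
top_left = A11 @ B11 + A12 @ B21
assert np.allclose((A @ B)[:2, :2], top_left)
```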

  • Cholesky factorization

    A = C^T C = (L√D)(L√D)^T for positive definite A.
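
A quick NumPy check: np.linalg.cholesky returns a lower-triangular L with A = L L^T, so taking C = L^T gives the A = C^T C form of the definition.

```python
import numpy as np

A = np.array([[4.0, 2.0], [2.0, 3.0]])   # positive definite
L = np.linalg.cholesky(A)                # lower triangular, A = L L^T
C = L.T                                  # then A = C^T C as in the definition
assert np.allclose(A, L @ L.T)
assert np.allclose(A, C.T @ C)
```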

  • Circulant matrix C.

    Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0 I + c1 S + ... + c(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
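
A sketch in NumPy (example values for c and x): building C from powers of the cyclic shift S and checking that Cx equals the cyclic convolution c * x, computed here via the FFT.

```python
import numpy as np

c = np.array([1.0, 2.0, 3.0, 4.0])       # first column of C (example values)
n = len(c)
S = np.roll(np.eye(n), 1, axis=0)        # cyclic shift matrix
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

x = np.array([1.0, 0.0, 2.0, 0.0])
# C @ x equals the cyclic convolution c * x (FFT diagonalizes every circulant).
conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
assert np.allclose(C @ x, conv)
```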

  • Determinant |A| = det(A).

    Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B|, |A^-1| = 1/|A|, and |A^T| = |A|.

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
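
A minimal sketch of the A = LU case, assuming no row exchanges are needed (nonzero pivots); the helper name lu_no_pivot is invented for this example.

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU without row exchanges (assumes nonzero pivots)."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float).copy()
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]        # multiplier l_ij
            U[i, :] -= L[i, j] * U[j, :]       # subtract l_ij times pivot row
    return L, U

A = np.array([[2.0, 1.0], [6.0, 8.0]])
L, U = lu_no_pivot(A)
assert np.allclose(L @ U, A)                   # A = LU
assert np.allclose(U, np.triu(U))              # U is upper triangular
```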

  • Hilbert matrix hilb(n).

    Entries Hij = 1/(i + j - 1) = ∫₀¹ x^(i-1) x^(j-1) dx. Positive definite but with extremely small λmin and a large condition number: H is ill-conditioned.
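
A short NumPy sketch (the helper hilb is written out here rather than taken from a library): build H from the entry formula and confirm it is positive definite yet badly conditioned.

```python
import numpy as np

def hilb(n):
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)    # 0-based i, j gives H[i,j] = 1/(i + j - 1) 1-based

H = hilb(5)
assert np.all(np.linalg.eigvalsh(H) > 0)   # positive definite
cond = np.linalg.cond(H)                   # already huge for n = 5
assert cond > 1e5
```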

  • Left nullspace N (AT).

    Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

  • Linearly dependent v1, ..., vn.

    A combination other than all ci = 0 gives Σ ci vi = 0.

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
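
The equivalence of these definitions can be checked directly in NumPy (random matrices here are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# Entry form: (AB)[i, j] = sum over k of A[i, k] * B[k, j]
entry = sum(A[1, k] * B[k, 0] for k in range(4))
assert np.isclose((A @ B)[1, 0], entry)

# Column form: column j of AB = A times column j of B
assert np.allclose((A @ B)[:, 1], A @ B[:, 1])

# Columns times rows: AB = sum of outer products (column k of A)(row k of B)
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(4))
assert np.allclose(A @ B, outer_sum)
```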

  • Multiplier ℓij.

    The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).

  • Orthonormal vectors q1, ..., qn.

    Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
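
A NumPy sketch: np.linalg.qr produces orthonormal columns, and the expansion v = Σ (v^T qj) qj can be checked term by term.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # orthonormal columns

assert np.allclose(Q.T @ Q, np.eye(3))             # Q^T Q = I
assert np.allclose(Q.T, np.linalg.inv(Q))          # square case: Q^T = Q^-1

v = np.array([1.0, 2.0, 3.0])
# Expansion in the orthonormal basis: v = sum of (v^T q_j) q_j
recon = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
assert np.allclose(v, recon)
```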

  • Polar decomposition A = Q H.

    Orthogonal Q times positive (semi)definite H.

  • Reduced row echelon form R = rref(A).

    Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

  • Schur complement S = D - C A^-1 B.

    Appears in block elimination on [A B; C D].
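
A small numerical sketch (block sizes and entries invented for illustration): computing S and checking the standard identity det M = det A · det S that block elimination produces.

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.0, 2.0]])
D = np.array([[5.0]])

M = np.block([[A, B], [C, D]])
S = D - C @ np.linalg.inv(A) @ B        # Schur complement of A in M

# Block elimination leaves S in the (2,2) block, so det M = det A * det S.
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S))
```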

  • Singular Value Decomposition (SVD).

    A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A vi = σi ui for each singular value σi > 0. The last columns are orthonormal bases of the nullspaces.
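
A NumPy sketch of these properties on a small rank-2 matrix (entries chosen for illustration):

```python
import numpy as np

A = np.array([[3.0, 0.0], [4.0, 5.0], [0.0, 0.0]])   # 3x2, rank 2
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Reconstruct A = U Sigma V^T (Sigma padded to 3x2).
Sigma = np.zeros_like(A)
Sigma[:2, :2] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vt)

# A v_i = sigma_i u_i for each singular value.
for i in range(2):
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])

# The last column of U spans the left nullspace: A^T u = 0.
assert np.allclose(A.T @ U[:, 2], 0.0)
```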

  • Special solutions to As = 0.

    One free variable is si = 1, other free variables = 0.

  • Spectral Theorem A = Q Λ Q^T.

    Real symmetric A has real λ's and orthonormal q's.
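
In NumPy, np.linalg.eigh is the routine for real symmetric matrices and returns exactly this factorization:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])      # real symmetric
lam, Q = np.linalg.eigh(A)                  # real eigenvalues, orthonormal q's

assert np.allclose(Q.T @ Q, np.eye(2))              # orthonormal eigenvectors
assert np.allclose(A, Q @ np.diag(lam) @ Q.T)       # A = Q Lambda Q^T
assert np.allclose(lam, [1.0, 3.0])                 # eigenvalues of this A
```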

  • Spectrum of A = the set of eigenvalues {λ1, ..., λn}.

    Spectral radius = max of |λi|.

  • Stiffness matrix

    If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax = stretching.

  • Volume of box.

    The rows (or the columns) of A generate a box with volume I det(A) I.
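
A quick NumPy check on a sheared box (edge vectors chosen for illustration): shearing does not change the volume, so |det A| is still base times height.

```python
import numpy as np

# Rows of A are the edge vectors of a parallelepiped ("box") from the origin.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [1.0, 1.0, 4.0]])   # third edge is sheared but has height 4

volume = abs(np.linalg.det(A))
assert np.isclose(volume, 24.0)   # base area 2*3 times height 4
```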