Solutions for Chapter 1.2: Matrices, Vectors, and Gauss-Jordan Elimination

Full solutions for Linear Algebra with Applications | 4th Edition

ISBN: 9780136009269

Textbook: Linear Algebra with Applications
Edition: 4
Author: Otto Bretscher
ISBN: 9780136009269

Linear Algebra with Applications was written by Otto Bretscher and is associated with the ISBN 9780136009269. Chapter 1.2, Matrices, Vectors, and Gauss-Jordan Elimination, includes 78 full step-by-step solutions. This expansive textbook survival guide covers the following chapters and their solutions. Since all 78 problems in Chapter 1.2 have been answered, more than 45,628 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for the textbook Linear Algebra with Applications, edition 4.

Key Math Terms and definitions covered in this textbook
  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
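
    The solvability test can be illustrated in code. The sketch below row-reduces small matrices with exact fractions and compares rank(A) with rank([A b]); the `rank` helper and the example matrices are illustrative, not code from the textbook.

    ```python
    from fractions import Fraction

    def rank(rows):
        # Row-reduce a copy of the matrix and count the pivot rows.
        m = [[Fraction(x) for x in row] for row in rows]
        r = 0
        for col in range(len(m[0])):
            # Find a pivot in this column at or below row r.
            piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
            if piv is None:
                continue
            m[r], m[piv] = m[piv], m[r]
            # Eliminate the entries below the pivot.
            for i in range(r + 1, len(m)):
                factor = m[i][col] / m[r][col]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
            r += 1
        return r

    A = [[1, 2], [2, 4]]   # rank 1: the second row is twice the first
    b_good = [3, 6]        # in the column space of A -> solvable
    b_bad = [3, 7]         # not in the column space -> unsolvable

    aug_good = [row + [bi] for row, bi in zip(A, b_good)]
    aug_bad = [row + [bi] for row, bi in zip(A, b_bad)]

    # Solvable exactly when [A b] has the same rank as A.
    print(rank(A), rank(aug_good), rank(aug_bad))   # 1 1 2
    ```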

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.

  • Cross product u × v in R^3.

    Vector perpendicular to u and v, with length ||u|| ||v|| |sin θ| = area of the parallelogram; u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].

  • Cyclic shift S.

    Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots e^{2πik/n} of 1; eigenvectors are the columns of the Fourier matrix F.
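
    A small numerical sanity check, assuming a 0-indexed shift S[i][(i-1) mod n] = 1: the k-th Fourier column is an eigenvector, and the eigenvalue comes out as e^{-2πik/n} with this shift direction, which is still an nth root of 1.

    ```python
    import cmath

    n, k = 4, 1
    omega = cmath.exp(2j * cmath.pi / n)   # primitive nth root of 1

    # Cyclic shift in 0-indexed form: (S x)_i = x_{(i-1) mod n}.
    S = [[1 if j == (i - 1) % n else 0 for j in range(n)] for i in range(n)]

    # k-th column of the Fourier matrix: f_j = omega**(j * k).
    f = [omega ** (j * k) for j in range(n)]

    Sf = [sum(S[i][j] * f[j] for j in range(n)) for i in range(n)]
    lam = omega ** (-k)   # the eigenvalue for this shift direction

    print(abs(lam ** n - 1))   # ~0: lam is indeed an nth root of 1
    ```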

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^{-1} A S = Λ = eigenvalue matrix.

  • Hankel matrix H.

    Constant along each antidiagonal; hij depends on i + j.

  • Identity matrix I (or In).

    Diagonal entries = 1, off-diagonal entries = 0.

  • Left inverse A^+.

    If A has full column rank n, then A^+ = (A^T A)^{-1} A^T has A^+ A = I_n.
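
    A quick check of the left-inverse formula on a 3x2 matrix of full column rank, using exact fractions and an explicit 2x2 inverse; the helper functions and the example matrix are illustrative.

    ```python
    from fractions import Fraction

    A = [[1, 0], [1, 1], [0, 1]]   # 3x2, full column rank n = 2

    def matmul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
                 for j in range(len(Y[0]))] for i in range(len(X))]

    def transpose(X):
        return [list(row) for row in zip(*X)]

    At = transpose(A)
    AtA = matmul(At, A)   # 2x2, invertible because A has full column rank

    # Explicit inverse of the 2x2 matrix AtA.
    (a, b), (c, d) = AtA
    det = Fraction(a * d - b * c)
    AtA_inv = [[d / det, -b / det], [-c / det, a / det]]

    A_plus = matmul(AtA_inv, At)   # (A^T A)^{-1} A^T
    print(matmul(A_plus, A))       # the 2x2 identity: A^+ A = I_n
    ```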

  • Lucas numbers

    L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [1 1; 1 0]. Compare L_0 = 2 with F_0 = 0.
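
    The recurrence and the closed form can be compared directly; a minimal sketch, assuming the starting values L_0 = 2, L_1 = 1.

    ```python
    import math

    # lam1, lam2 = (1 ± sqrt 5)/2, the eigenvalues of the Fibonacci matrix.
    lam1 = (1 + math.sqrt(5)) / 2
    lam2 = (1 - math.sqrt(5)) / 2

    L = [2, 1]   # L_0 = 2, L_1 = 1 (compare F_0 = 0, F_1 = 1)
    for n in range(2, 10):
        L.append(L[-1] + L[-2])

    # Closed form L_n = lam1**n + lam2**n, rounded to kill float error.
    closed = [round(lam1 ** n + lam2 ** n) for n in range(10)]
    print(L)   # [2, 1, 3, 4, 7, 11, 18, 29, 47, 76]
    ```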

  • Multiplication Ax

    = x_1 (column 1) + ... + x_n (column n) = combination of the columns of A.

  • Nilpotent matrix N.

    Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
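
    The triangular example is easy to verify by hand or in code; a minimal sketch with an illustrative 3x3 matrix.

    ```python
    def matmul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
                 for j in range(len(Y[0]))] for i in range(len(X))]

    # Strictly upper triangular (zero diagonal), hence nilpotent.
    N = [[0, 1, 0],
         [0, 0, 1],
         [0, 0, 0]]

    N2 = matmul(N, N)    # the 1s move up to the next superdiagonal
    N3 = matmul(N2, N)   # the zero matrix: N^3 = 0
    print(N3)
    ```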

  • Rayleigh quotient q(x) = x^T A x / x^T x for symmetric A: λ_min ≤ q(x) ≤ λ_max.

    Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A).
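
    The bounds can be sampled numerically. A sketch with an assumed 2x2 symmetric example whose eigenvalues are 1 and 3, with eigenvectors (1, -1) and (1, 1).

    ```python
    import random

    A = [[2, 1], [1, 2]]   # symmetric; eigenvalues 1 and 3

    def q(x):
        # Rayleigh quotient x^T A x / x^T x for this 2x2 A.
        Ax = [A[0][0] * x[0] + A[0][1] * x[1],
              A[1][0] * x[0] + A[1][1] * x[1]]
        return (x[0] * Ax[0] + x[1] * Ax[1]) / (x[0] ** 2 + x[1] ** 2)

    random.seed(0)
    samples = [q([random.uniform(-1, 1), random.uniform(-1, 1)])
               for _ in range(100)]
    # Every sample stays inside [lambda_min, lambda_max] = [1, 3];
    # the extremes are hit exactly at the eigenvectors.
    print(min(samples), max(samples))
    ```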

  • Reduced row echelon form R = rref(A).

    Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

  • Reflection matrix (Householder) Q = I - 2uu^T.

    Unit vector u is reflected to Qu = -u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^{-1} = Q.
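
    Both properties can be checked directly; a minimal sketch with an illustrative unit vector in R^2.

    ```python
    import math

    # Unit vector u; Q = I - 2 u u^T reflects across the line u^T x = 0.
    u = [1 / math.sqrt(2), 1 / math.sqrt(2)]
    Q = [[(1 if i == j else 0) - 2 * u[i] * u[j] for j in range(2)]
         for i in range(2)]

    def apply(M, x):
        return [sum(M[i][j] * x[j] for j in range(len(x)))
                for i in range(len(M))]

    Qu = apply(Q, u)   # reflected to -u
    x = [1, -1]        # u^T x = 0, so x lies in the mirror plane
    Qx = apply(Q, x)   # unchanged: Qx = x
    print(Qu, Qx)
    ```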

  • Schur complement S = D - C A^{-1} B.

    Appears in block elimination on [A B; C D].

  • Similar matrices A and B.

    Every B = M^{-1} A M has the same eigenvalues as A.

  • Sum V + W of subspaces.

    Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

  • Symmetric factorizations A = LDL^T and A = QΛQ^T.

    Signs in Λ = signs in D.

  • Vandermonde matrix V.

    V c = b gives the coefficients of p(x) = c_0 + ... + c_{n-1} x^{n-1} with p(x_i) = b_i. V_{ij} = (x_i)^{j-1}, and det V = product of (x_k - x_i) for k > i.
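
    The determinant formula can be verified exactly for small cases. A sketch with the assumed nodes 1, 2, 4, using a brute-force Leibniz determinant (fine for a tiny matrix).

    ```python
    from fractions import Fraction
    from itertools import permutations

    xs = [Fraction(1), Fraction(2), Fraction(4)]
    n = len(xs)
    V = [[x ** j for j in range(n)] for x in xs]   # V[i][j] = x_i^j (0-indexed)

    def det(M):
        # Leibniz formula: sum over permutations with inversion signs.
        total = Fraction(0)
        for perm in permutations(range(len(M))):
            sign = 1
            for i in range(len(perm)):
                for j in range(i + 1, len(perm)):
                    if perm[i] > perm[j]:
                        sign = -sign
            prod = Fraction(1)
            for i, p in enumerate(perm):
                prod *= M[i][p]
            total += sign * prod
        return total

    # Product of (x_k - x_i) over all pairs with k > i.
    product = Fraction(1)
    for i in range(n):
        for k in range(i + 1, n):
            product *= xs[k] - xs[i]

    print(det(V), product)   # equal: (2-1)(4-1)(4-2) = 6
    ```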

  • Vector v in R^n.

    Sequence of n real numbers v = (v_1, ..., v_n) = point in R^n.