
Solutions for Chapter 7.1: The Jordan Canonical Form I

Linear Algebra | 4th Edition | ISBN: 9780130084514 | Authors: Stephen H. Friedberg, Arnold J. Insel, Lawrence E. Spence

Full solutions for Linear Algebra | 4th Edition

This textbook survival guide was created for Linear Algebra, edition 4 (ISBN 9780130084514), and covers the book's chapters and their solutions. Chapter 7.1: The Jordan Canonical Form I includes 13 full step-by-step solutions, and more than 12,146 students have viewed solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Complete solution x = x_p + x_n to Ax = b.

    (Particular x_p) + (x_n in the nullspace).

  • Cross product u × v in R³.

    Vector perpendicular to u and v, length ‖u‖‖v‖|sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
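The determinant expansion above can be sketched directly in a few lines of Python (the helper name `cross` is just for illustration):

```python
# A minimal sketch of the 3-vector cross product, expanding the
# "determinant" of [i j k; u1 u2 u3; v1 v2 v3] component by component.
def cross(u, v):
    """Return u x v for 3-component sequences u and v."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

u, v = [1, 0, 0], [0, 1, 0]
w = cross(u, v)          # i x j = k, so w = [0, 0, 1]
# Perpendicularity: w . u = w . v = 0
assert sum(wi * ui for wi, ui in zip(w, u)) == 0
assert sum(wi * vi for wi, vi in zip(w, v)) == 0
```

The two assertions check the defining property: the result is orthogonal to both inputs.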

  • Cyclic shift S.

    Permutation with S21 = 1, S32 = 1, ..., finally S1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.
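The eigenvalue claim is easy to verify numerically. A minimal sketch with NumPy, for n = 4: build S with S21 = S32 = S43 = 1 and S14 = 1, then check that Sⁿ = I and that every eigenvalue is an nth root of 1.

```python
import numpy as np

# Build the 4x4 cyclic shift: column i has its single 1 in row i+1 (cyclically).
n = 4
S = np.zeros((n, n))
for i in range(n):
    S[(i + 1) % n, i] = 1.0

eigs = np.linalg.eigvals(S)
assert np.allclose(np.linalg.matrix_power(S, n), np.eye(n))  # S^n = I
assert np.allclose(eigs**n, 1.0)   # each eigenvalue is an nth root of unity
```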

  • Diagonal matrix D.

    d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

  • Diagonalization Λ = S⁻¹AS.

    Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = SΛ^k S⁻¹.
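A minimal sketch of diagonalization with NumPy, using a small symmetric matrix (which is guaranteed to have n independent eigenvectors): confirm Λ = S⁻¹AS and the power formula A^k = SΛ^kS⁻¹.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, S = np.linalg.eig(A)     # columns of S are eigenvectors of A
Lam = np.diag(lam)            # Lambda: eigenvalues on the diagonal

# Lambda = S^{-1} A S
assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)

# A^k = S Lambda^k S^{-1}
k = 5
assert np.allclose(np.linalg.matrix_power(A, k),
                   S @ np.linalg.matrix_power(Lam, k) @ np.linalg.inv(S))
```

Powers of Λ are trivial (just powers of the diagonal entries), which is the practical payoff of diagonalizing.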

  • Hankel matrix H.

    Constant along each antidiagonal; h_ij depends on i + j.

  • Independent vectors v1, ..., vk.

    No combination c1·v1 + ... + ck·vk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

  • Least squares solution x̂.

    The vector x̂ that minimizes ‖e‖² solves AᵀAx̂ = Aᵀb. Then e = b − Ax̂ is orthogonal to all columns of A.
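A minimal sketch of this definition with NumPy (the data here is an arbitrary small overdetermined system, chosen only for illustration): solve AᵀAx̂ = Aᵀb and check that the error e = b − Ax̂ is orthogonal to every column of A.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

# Solve the normal equations A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The residual e = b - A x_hat is orthogonal to the column space of A
e = b - A @ x_hat
assert np.allclose(A.T @ e, 0.0)

# Agrees with NumPy's own least squares solver
assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])
```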

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

  • Minimal polynomial of A.

    The lowest-degree polynomial m(λ) with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; m(λ) always divides p(λ).
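A minimal sketch of the distinction, using A = I (2×2) as the standard example with a repeated eigenvalue: the characteristic polynomial is p(λ) = (1 − λ)², but the minimal polynomial is only m(λ) = λ − 1, which already annihilates A and divides p.

```python
import numpy as np

A = np.eye(2)                                  # repeated eigenvalue 1, 1
m_of_A = A - np.eye(2)                         # m(A) = A - I
p_of_A = (np.eye(2) - A) @ (np.eye(2) - A)     # p(A) = (I - A)^2

assert np.allclose(m_of_A, 0.0)   # the lower-degree m(A) is already zero
assert np.allclose(p_of_A, 0.0)   # Cayley-Hamilton: p(A) = 0 as well
```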

  • Normal equation AᵀAx̂ = Aᵀb.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.

  • Nullspace N(A).

    All solutions to Ax = 0. Dimension n − r = (# columns) − rank.

  • Partial pivoting.

    In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓij| ≤ 1. See condition number.
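A minimal sketch of elimination with partial pivoting (the helper name `solve_with_partial_pivoting` is just for illustration): at each step the row with the largest available pivot is swapped up, so every multiplier satisfies |ℓij| ≤ 1 and roundoff stays controlled even when a natural pivot is tiny.

```python
# Gaussian elimination with partial pivoting, plus back substitution.
def solve_with_partial_pivoting(A, b):
    """Solve Ax = b for A given as a list of row lists."""
    n = len(A)
    A = [row[:] for row in A]      # work on copies
    b = b[:]
    for col in range(n):
        # Choose the largest available pivot in this column
        p = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        b[col], b[p] = b[p], b[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]   # |m| <= 1 by the pivot choice
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    # Back substitution on the resulting upper-triangular system
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x

# A system with a tiny leading entry, where pivoting matters:
x = solve_with_partial_pivoting([[1e-12, 1.0], [1.0, 1.0]], [1.0, 2.0])
# The exact solution is very close to x = [1, 1]
```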

  • Particular solution x_p.

    Any solution to Ax = b; often x_p has free variables = 0.

  • Rank one matrix A = uvᵀ ≠ 0.

    Column and row spaces = lines cu and cv.

  • Right inverse A⁺.

    If A has full row rank m, then A⁺ = Aᵀ(AAᵀ)⁻¹ has AA⁺ = I_m.
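A minimal sketch with NumPy, using an arbitrary full-row-rank 2×3 matrix: form A⁺ = Aᵀ(AAᵀ)⁻¹ and confirm AA⁺ = I₂ (note A⁺A ≠ I₃ in general, since A has a nullspace).

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])     # full row rank m = 2

A_plus = A.T @ np.linalg.inv(A @ A.T)   # right inverse A^T (A A^T)^{-1}

assert np.allclose(A @ A_plus, np.eye(2))       # A A+ = I_m
# In this full-row-rank case it matches the Moore-Penrose pseudoinverse:
assert np.allclose(A_plus, np.linalg.pinv(A))
```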

  • Row picture of Ax = b.

    Each equation gives a plane in Rⁿ; the planes intersect at x.

  • Similar matrices A and B.

    Every B = M⁻¹AM has the same eigenvalues as A.

  • Spectral Theorem A = QΛQᵀ.

    Real symmetric A has real λ's and orthonormal q's.

  • Sum V + W of subspaces.

    Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.