Solutions for Chapter 6: Vector Spaces

Full solutions for Linear Algebra: A Modern Introduction (Available 2011 Titles Enhanced Web Assign) | 3rd Edition

Textbook: Linear Algebra: A Modern Introduction (Available 2011 Titles Enhanced Web Assign)
Edition: 3
Author: David Poole
ISBN: 9780538735452

Linear Algebra: A Modern Introduction (Available 2011 Titles Enhanced Web Assign), 3rd edition, was written by David Poole and is associated with the ISBN 9780538735452. This expansive textbook survival guide covers the book's chapters and their solutions. Chapter 6: Vector Spaces includes 308 full step-by-step solutions, and more than 66547 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Affine transformation

    T(v) = Av + v0 = linear transformation plus shift.

  • Basis for V.

    Independent vectors v1, ..., vd whose linear combinations give every vector in V as v = c1v1 + ... + cdvd. A vector space has many bases; each basis gives unique c's.
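
    For example, a minimal NumPy sketch (the basis matrix B below is an illustrative choice, not from the text): when the basis vectors are the columns of an invertible B, the unique coefficients c for any v solve Bc = v.

    import numpy as np

    B = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])   # columns form a basis of R^3
    v = np.array([2.0, 3.0, 4.0])
    c = np.linalg.solve(B, v)         # unique c's: v = c1*b1 + c2*b2 + c3*b3
    print(c, np.allclose(B @ c, v))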

  • Cholesky factorization

    A = C^T C = (L√D)(L√D)^T for positive definite A.
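
    A minimal NumPy sketch (the positive definite A below is an illustrative example): np.linalg.cholesky returns a lower triangular L with A = L L^T, so C = L^T gives the A = C^T C form above.

    import numpy as np

    A = np.array([[4.0, 2.0],
                  [2.0, 3.0]])        # positive definite
    L = np.linalg.cholesky(A)         # lower triangular, A = L @ L.T
    C = L.T                           # upper triangular Cholesky factor
    print(np.allclose(C.T @ C, A))    # True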

  • Condition number

    cond(A) = c(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx|| / ||x|| is less than cond(A) times the relative change ||δb|| / ||b||. Condition numbers measure the sensitivity of the output to changes in the input.
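
    A minimal NumPy sketch (the nearly singular A below is an illustrative example): the 2-norm condition number is the ratio of the largest to the smallest singular value, which is what np.linalg.cond returns by default.

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 1.0001]])            # nearly singular, so badly conditioned
    s = np.linalg.svd(A, compute_uv=False)   # singular values, largest first
    print(s[0] / s[-1])                      # sigma_max / sigma_min
    print(np.linalg.cond(A))                 # the same number (2-norm by default)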

  • Covariance matrix Σ.

    When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
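
    A minimal NumPy sketch (random sample data, purely illustrative): subtract the means, average the outer products (x − x̄)(x − x̄)^T, and compare with np.cov.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))            # 1000 samples of 3 random variables
    Xc = X - X.mean(axis=0)                   # subtract the means
    Sigma = (Xc.T @ Xc) / len(X)              # mean of (x - xbar)(x - xbar)^T
    print(np.allclose(Sigma, np.cov(X.T, bias=True)))   # True
    print(np.all(np.linalg.eigvalsh(Sigma) >= 0))       # positive (semi)definite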

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
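
    A minimal NumPy sketch (the symmetric A below is an illustrative example with two different eigenvalues): with the eigenvectors in the columns of S, S^-1 A S is the diagonal eigenvalue matrix.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])                 # eigenvalues 1 and 3
    eigvals, S = np.linalg.eig(A)              # eigenvectors in the columns of S
    Lam = np.linalg.inv(S) @ A @ S             # S^-1 A S
    print(np.allclose(Lam, np.diag(eigvals)))  # True: the eigenvalue matrix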

  • Fast Fourier Transform (FFT).

    A factorization of the Fourier matrix F_n into ℓ = log2 n matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^-1 c can be computed with nℓ/2 multiplications. Revolutionary.
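
    A minimal NumPy sketch (n and x are illustrative, and the Fourier matrix is built with NumPy's sign convention): np.fft.fft produces F_n x without ever forming the n by n matrix, which is where the fast operation count comes from.

    import numpy as np

    n = 8
    j, k = np.meshgrid(np.arange(n), np.arange(n))
    F = np.exp(-2j * np.pi * j * k / n)        # Fourier matrix, NumPy's sign convention
    x = np.random.default_rng(1).normal(size=n)
    print(np.allclose(F @ x, np.fft.fft(x)))   # True: the FFT computes F_n x fast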

  • Graph G.

    Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.

  • Hessenberg matrix H.

    Triangular matrix with one extra nonzero adjacent diagonal.

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
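
    A minimal NumPy sketch (a hypothetical directed graph with 3 nodes and 3 edges): each row gets a -1 in the start-node column and a +1 in the end-node column.

    import numpy as np

    edges = [(0, 1), (1, 2), (0, 2)]      # directed edges: node i -> node j
    m, n = len(edges), 3
    A = np.zeros((m, n))                  # m by n edge-node incidence matrix
    for row, (i, j) in enumerate(edges):
        A[row, i] = -1                    # edge leaves node i
        A[row, j] = +1                    # edge enters node j
    print(A)
    print(A.sum(axis=1))                  # every row sums to zero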

  • Indefinite matrix.

    A symmetric matrix with eigenvalues of both signs (+ and - ).

  • Inverse matrix A^-1.

    Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
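
    A minimal NumPy sketch (the matrices A and B below are illustrative) checking the two inverse identities stated above.

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 1.0]])
    B = np.array([[1.0, 3.0], [0.0, 1.0]])
    inv = np.linalg.inv
    print(np.allclose(inv(A @ B), inv(B) @ inv(A)))   # (AB)^-1 = B^-1 A^-1
    print(np.allclose(inv(A.T), inv(A).T))            # (A^T)^-1 = (A^-1)^T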

  • Kronecker product (tensor product) A ⊗ B.

    Blocks a_ij B, eigenvalues λ_p(A) λ_q(B).
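
    A minimal NumPy sketch (small symmetric examples chosen so the eigenvalues stay real): the eigenvalues of np.kron(A, B) are exactly the products λ_p(A) λ_q(B).

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 2.0]])               # eigenvalues 1, 3
    B = np.array([[3.0, 0.0], [0.0, 5.0]])               # eigenvalues 3, 5
    K = np.kron(A, B)                                    # blocks a_ij * B
    products = np.outer(np.linalg.eigvalsh(A), np.linalg.eigvalsh(B)).ravel()
    print(np.allclose(np.sort(np.linalg.eigvalsh(K)), np.sort(products)))   # True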

  • Krylov subspace K_j(A, b).

    The subspace spanned by b, Ab, ..., A^(j-1) b. Numerical methods approximate A^-1 b by x_j with residual b − A x_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
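
    A minimal NumPy sketch (a hypothetical well-conditioned A and random b): build the basis b, Ab, ..., A^(j-1) b and take the combination that minimizes the residual b − A x_j, in the spirit of the Krylov methods mentioned above.

    import numpy as np

    rng = np.random.default_rng(2)
    n, j = 20, 5
    A = np.eye(n) + 0.1 * rng.normal(size=(n, n))   # well-conditioned example
    b = rng.normal(size=n)
    K = np.column_stack([np.linalg.matrix_power(A, p) @ b for p in range(j)])  # Krylov basis
    c, *_ = np.linalg.lstsq(A @ K, b, rcond=None)   # minimize ||b - A K c||
    x_j = K @ c                                     # approximation to A^-1 b from K_j
    print(np.linalg.norm(b - A @ x_j))              # small residual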

  • Least squares solution x̂.

    The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
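
    A minimal NumPy sketch (a hypothetical overdetermined system with random data): solve the normal equations A^T A x̂ = A^T b and check that the error e = b − A x̂ is orthogonal to the columns of A.

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.normal(size=(10, 3))                 # 10 equations, 3 unknowns
    b = rng.normal(size=10)
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)    # normal equations
    e = b - A @ x_hat                            # error vector
    print(np.allclose(A.T @ e, 0, atol=1e-10))   # orthogonal to every column of A
    print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))   # matches lstsq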

  • Length ||x||.

    Square root of x^T x (Pythagoras in n dimensions).

  • Orthogonal matrix Q.

    Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves lengths and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
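
    A minimal NumPy sketch (a rotation chosen as the example): Q^T Q = I, and Q preserves lengths and dot products.

    import numpy as np

    t = 0.7
    Q = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])      # rotation: an orthogonal matrix
    x = np.array([3.0, 4.0])
    y = np.array([-1.0, 2.0])
    print(np.allclose(Q.T @ Q, np.eye(2)))                          # Q^T = Q^-1
    print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))     # ||Qx|| = ||x||
    print(np.isclose((Q @ x) @ (Q @ y), x @ y))                     # (Qx)^T (Qy) = x^T y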

  • Rank r(A)

    = number of pivots = dimension of column space = dimension of row space.
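
    A minimal NumPy sketch (an illustrative matrix with a dependent row): the rank equals the dimension of the column space and of the row space.

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],      # row 2 = 2 * row 1
                  [0.0, 1.0, 1.0]])
    print(np.linalg.matrix_rank(A))     # 2 = dimension of the column space
    print(np.linalg.matrix_rank(A.T))   # 2 = dimension of the row space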

  • Standard basis for R^n.

    Columns of the n by n identity matrix (written i, j, k in R^3).

  • Vector v in R^n.

    Sequence of n real numbers v = (v1, ..., vn) = point in R^n.