Solutions for Chapter 7.3: Symmetric Matrices and Orthogonal Diagonalization

Full solutions for Elementary Linear Algebra | 6th Edition

ISBN: 9780618783762

Textbook: Elementary Linear Algebra
Edition: 6
Author: Ron Larson, David C. Falvo
ISBN: 9780618783762

This textbook survival guide was created for the textbook Elementary Linear Algebra, edition 6. Elementary Linear Algebra was written by Ron Larson and David C. Falvo and is associated with the ISBN 9780618783762. Since 41 problems in chapter 7.3: Symmetric Matrices and Orthogonal Diagonalization have been answered, more than 19579 students have viewed full step-by-step solutions from this chapter. This expansive textbook survival guide covers the following chapters and their solutions. Chapter 7.3: Symmetric Matrices and Orthogonal Diagonalization includes 41 full step-by-step solutions.
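
As a quick illustration of this chapter's topic (a sketch only, not one of the textbook's solutions, with made-up matrix entries), the snippet below uses numpy to orthogonally diagonalize a small symmetric matrix A as A = Q D Q^T, where Q has orthonormal eigenvector columns.

    import numpy as np

    # A small symmetric matrix (values chosen only for illustration).
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # eigh is numpy's routine for symmetric (Hermitian) matrices:
    # it returns real eigenvalues and orthonormal eigenvectors.
    eigenvalues, Q = np.linalg.eigh(A)
    D = np.diag(eigenvalues)

    # Check the orthogonal diagonalization A = Q D Q^T and Q^T Q = I.
    print(np.allclose(A, Q @ D @ Q.T))        # True
    print(np.allclose(Q.T @ Q, np.eye(2)))    # True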

Key math terms and definitions covered in this textbook
  • Big formula for n by n determinants.

    Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, with rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or - sign.
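
    A minimal Python sketch (illustration only; the 3 x 3 matrix is invented) that evaluates this n!-term permutation sum and checks it against numpy's built-in determinant:

        import numpy as np
        from itertools import permutations

        def big_formula_det(A):
            # Sum over all n! permutations: sign(P) * A[0, p0] * A[1, p1] * ...
            n = A.shape[0]
            total = 0.0
            for perm in permutations(range(n)):
                # Sign of the permutation = (-1) ** (number of inversions).
                inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                                 if perm[i] > perm[j])
                sign = -1.0 if inversions % 2 else 1.0
                term = sign
                for row, col in enumerate(perm):
                    term *= A[row, col]
                total += term
            return total

        A = np.array([[2.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 2.0]])
        print(big_formula_det(A), np.linalg.det(A))   # both 8.0 (up to rounding)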

  • Change of basis matrix M.

    The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = Mc. (For n = 2, set v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)
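
    A small numpy check of d = Mc for n = 2 (a sketch with made-up basis vectors, not taken from the text):

        import numpy as np

        # New basis w1, w2 and old basis v1, v2 in standard coordinates.
        w1, w2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
        M = np.array([[2.0, 1.0],
                      [1.0, 3.0]])
        v1 = M[0, 0] * w1 + M[1, 0] * w2      # v_j = sum_i m_ij w_i
        v2 = M[0, 1] * w1 + M[1, 1] * w2

        # A vector written as c1 v1 + c2 v2 has w-coordinates d = M c.
        c = np.array([1.0, 2.0])
        d = M @ c
        print(np.allclose(c[0] * v1 + c[1] * v2,
                          d[0] * w1 + d[1] * w2))     # True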

  • Circulant matrix C.

    Constant diagonals wrap around as in the cyclic shift S. Every circulant is c_0 I + c_1 S + ... + c_(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
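
    A short numpy sketch (illustrative values only) that builds C = c_0 I + c_1 S + ... + c_(n-1) S^(n-1) from the cyclic shift S and confirms that Cx matches the circular convolution c * x computed with the FFT:

        import numpy as np

        c = np.array([1.0, 2.0, 0.0, 3.0])    # arbitrary example coefficients
        n = len(c)
        S = np.roll(np.eye(n), 1, axis=0)     # cyclic shift matrix
        C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

        # C x equals the circular convolution of c with x.
        x = np.array([4.0, 1.0, 2.0, 5.0])
        conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
        print(np.allclose(C @ x, conv))       # True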

  • Complete solution x = x_p + x_n to Ax = b.

    (Particular solution x_p) + (x_n in the nullspace).
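
    A numpy sketch of x = x_p + x_n for an invented underdetermined system: one particular solution from least squares, a nullspace basis from the SVD.

        import numpy as np

        A = np.array([[1.0, 2.0, 1.0],
                      [0.0, 1.0, 1.0]])
        b = np.array([4.0, 1.0])

        # One particular solution x_p (least squares gives the minimum-norm one).
        x_p, *_ = np.linalg.lstsq(A, b, rcond=None)

        # Nullspace basis: rows of Vt beyond the rank of A.
        rank = np.linalg.matrix_rank(A)
        _, _, Vt = np.linalg.svd(A)
        x_n = Vt[rank:][0]

        # Every x_p + t * x_n also solves Ax = b.
        x = x_p + 2.5 * x_n
        print(np.allclose(A @ x, b))          # True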

  • Covariance matrix Σ.

    When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x - x̄)(x - x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
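
    A small numpy sketch (random sample data, purely illustrative) that estimates Σ and checks that it is symmetric positive semidefinite:

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 3))        # rows = observations of 3 variables

        # Sigma = mean of (x - xbar)(x - xbar)^T, estimated from the samples.
        centered = X - X.mean(axis=0)
        Sigma = centered.T @ centered / len(X)

        print(np.allclose(Sigma, Sigma.T))                    # symmetric: True
        print(np.all(np.linalg.eigvalsh(Sigma) >= -1e-12))    # PSD: True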

  • Determinant |A| = det(A).

    Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A^(-1)| = 1/|A|.
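
    A quick numpy check (random matrices, illustration only) of det I = 1, the product rule |AB| = |A||B|, and sign reversal under a row exchange:

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(4, 4))
        B = rng.normal(size=(4, 4))

        print(np.isclose(np.linalg.det(np.eye(4)), 1.0))               # True
        print(np.isclose(np.linalg.det(A @ B),
                         np.linalg.det(A) * np.linalg.det(B)))         # True

        # Exchanging two rows reverses the sign of the determinant.
        A_swapped = A[[1, 0, 2, 3], :]
        print(np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A))) # True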

  • Full column rank r = n.

    Independent columns, N(A) = {0}, no free variables.

  • Full row rank r = m.

    Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
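
    A tiny numpy check (made-up matrices) of full column rank versus full row rank via matrix_rank:

        import numpy as np

        tall = np.array([[1.0, 0.0],
                         [0.0, 1.0],
                         [1.0, 1.0]])         # 3 x 2, independent columns
        wide = np.array([[1.0, 0.0, 2.0],
                         [0.0, 1.0, 1.0]])    # 2 x 3, independent rows

        print(np.linalg.matrix_rank(tall) == tall.shape[1])   # full column rank: True
        print(np.linalg.matrix_rank(wide) == wide.shape[0])   # full row rank: True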

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.
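
    A one-line numpy check (random data, illustration only) that matrix multiplication satisfies the linearity requirement:

        import numpy as np

        rng = np.random.default_rng(2)
        A = rng.normal(size=(3, 3))
        v, w = rng.normal(size=3), rng.normal(size=3)
        c, d = 2.0, -1.5

        # T(c v + d w) = c T(v) + d T(w) for T(v) = A v.
        print(np.allclose(A @ (c * v + d * w), c * (A @ v) + d * (A @ w)))  # True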

  • Multiplicities AM and G M.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
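
    A small numpy sketch (a standard 2 x 2 example, not from the text) where the eigenvalue 2 has AM = 2 but GM = 1:

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [0.0, 2.0]])

        eigenvalues = np.linalg.eigvals(A)
        AM = np.sum(np.isclose(eigenvalues, 2.0))   # algebraic multiplicity

        # Geometric multiplicity = dimension of the nullspace of A - 2I.
        GM = A.shape[0] - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
        print(AM, GM)                               # 2 1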

  • Network.

    A directed graph that has constants c_1, ..., c_m associated with the edges.

  • Nilpotent matrix N.

    Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
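
    A quick numpy check (invented entries) that a strictly upper triangular matrix satisfies N^3 = 0 with all eigenvalues zero:

        import numpy as np

        N = np.array([[0.0, 1.0, 2.0],
                      [0.0, 0.0, 3.0],
                      [0.0, 0.0, 0.0]])       # zero diagonal

        print(np.allclose(np.linalg.matrix_power(N, 3), 0))   # N^3 = 0: True
        print(np.allclose(np.linalg.eigvals(N), 0))           # only eigenvalue 0: True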

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Pascal matrix

    P_S = pascal(n) = the symmetric matrix with binomial entries C(i + j - 2, i - 1). P_S = P_L P_U; all of these contain Pascal's triangle, with det = 1 (see Pascal in the index).
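
    A short Python sketch (illustration only) that builds the symmetric Pascal matrix from binomial coefficients and checks P_S = P_L P_U (taking P_U = P_L^T) and det = 1:

        import numpy as np
        from math import comb

        n = 4
        PS = np.array([[comb(i + j - 2, i - 1) for j in range(1, n + 1)]
                       for i in range(1, n + 1)], dtype=float)
        PL = np.array([[comb(i - 1, j - 1) for j in range(1, n + 1)]
                       for i in range(1, n + 1)], dtype=float)   # lower triangular Pascal

        print(np.allclose(PS, PL @ PL.T))             # P_S = P_L P_U: True
        print(np.isclose(np.linalg.det(PS), 1.0))     # det = 1: True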

  • Projection p = a(a^T b / a^T a) onto the line through a.

    P = a a^T / a^T a has rank 1.
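
    A small numpy sketch (made-up vectors a and b) of the projection p and the rank-1 projection matrix P:

        import numpy as np

        a = np.array([1.0, 2.0, 2.0])
        b = np.array([3.0, 0.0, 3.0])

        p = a * (a @ b) / (a @ a)             # p = a (a^T b / a^T a)
        P = np.outer(a, a) / (a @ a)          # P = a a^T / a^T a

        print(np.allclose(P @ b, p))          # True
        print(np.linalg.matrix_rank(P))       # 1
        print(np.allclose(P @ P, P))          # projections satisfy P^2 = P: True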

  • Pseudoinverse A+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. Rank(A^+) = rank(A).
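
    A numpy sketch (an invented rank-1 matrix) using np.linalg.pinv to check that A^+ A and A A^+ are idempotent projections and that rank(A^+) = rank(A):

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [2.0, 4.0],
                      [3.0, 6.0]])            # rank 1: second column = 2 * first
        A_plus = np.linalg.pinv(A)

        P_row = A_plus @ A                    # projects onto the row space
        P_col = A @ A_plus                    # projects onto the column space
        print(np.allclose(P_row @ P_row, P_row))    # True
        print(np.allclose(P_col @ P_col, P_col))    # True
        print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))  # True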

  • Skew-symmetric matrix K.

    The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
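
    A small sketch (arbitrary entries) using numpy and scipy.linalg.expm to check these properties; scipy is assumed to be available:

        import numpy as np
        from scipy.linalg import expm

        K = np.array([[ 0.0,  1.0, -2.0],
                      [-1.0,  0.0,  3.0],
                      [ 2.0, -3.0,  0.0]])    # K^T = -K

        print(np.allclose(K.T, -K))                             # True
        print(np.allclose(np.linalg.eigvals(K).real, 0.0))      # pure imaginary: True
        Q = expm(K)                                             # e^(Kt) with t = 1
        print(np.allclose(Q.T @ Q, np.eye(3)))                  # orthogonal: True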

  • Spectrum of A = the set of eigenvalues {λ_1, ..., λ_n}.

    Spectral radius = max of |λ_i|.
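
    A two-line numpy illustration (made-up matrix) of the spectrum and the spectral radius:

        import numpy as np

        A = np.array([[0.0, 2.0],
                      [-3.0, 1.0]])
        eigenvalues = np.linalg.eigvals(A)              # the spectrum of A
        print(np.max(np.abs(eigenvalues)))              # spectral radius, about 2.449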

  • Subspace S of V.

    Any vector space inside V, including V and Z = {zero vector only}.

  • Vector v in R^n.

    Sequence of n real numbers v = (v_1, ..., v_n) = point in R^n.
