
Solutions for Chapter 7.3: Diagonalization of Symmetric Matrices

Full solutions for Elementary Linear Algebra with Applications | 9th Edition

Textbook: Elementary Linear Algebra with Applications
Edition: 9
Authors: Bernard Kolman, David Hill
ISBN: 9780132296540

This expansive textbook survival guide covers the following chapters and their solutions. Chapter 7.3: Diagonalization of Symmetric Matrices includes 40 full step-by-step solutions. Elementary Linear Algebra with Applications, 9th edition, by Bernard Kolman and David Hill is associated with the ISBN 9780132296540. Since all 40 problems in Chapter 7.3 have been answered, more than 11,938 students have viewed full step-by-step solutions from this chapter.
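The chapter's central fact is the spectral theorem: a real symmetric matrix A can always be diagonalized by an orthogonal matrix Q, so A = Q D Q^T with real eigenvalues on the diagonal of D. A minimal NumPy sketch (the 2x2 matrix is an illustrative choice, not from the textbook):

```python
import numpy as np

# Spectral theorem sketch: a real symmetric A factors as A = Q D Q^T,
# where Q is orthogonal and D holds the real eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # illustrative symmetric matrix

eigenvalues, Q = np.linalg.eigh(A)   # eigh is specialized for symmetric/Hermitian A
D = np.diag(eigenvalues)

# Q has orthonormal columns (Q^T Q = I) and Q D Q^T reconstructs A.
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(Q @ D @ Q.T, A)
```

`np.linalg.eigh` returns the eigenvalues in ascending order, which is why it is preferred over the general `np.linalg.eig` for symmetric input.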

Key Math Terms and definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
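For a concrete picture, here is the adjacency matrix of a small undirected graph (the edge list is a made-up example):

```python
import numpy as np

# Adjacency matrix of an undirected graph on nodes 0..3:
# a_ij = 1 exactly when there is an edge between nodes i and j.
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]   # hypothetical example edges
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1   # undirected: edges go both ways, so A = A^T

assert (A == A.T).all()
```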

  • Column picture of Ax = b.

    The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
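The column picture is easy to verify numerically: Ax is literally the weighted sum of the columns of A, with the entries of x as weights (matrix and vector below are illustrative):

```python
import numpy as np

# Column picture: Ax is a combination of the columns of A with weights from x.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([2.0, -1.0])

b = A @ x
combination = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(b, combination)   # b lies in the column space C(A)
```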

  • Column space C(A).

    The space of all linear combinations of the columns of A.

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.
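A quick way to see shared eigenvectors: any polynomial in A commutes with A, and the eigenvectors of A diagonalize it too (matrices below are illustrative):

```python
import numpy as np

# Commuting matrices: B = A^2 + 3I is a polynomial in A, so AB = BA,
# and the eigenvectors of A also diagonalize B.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # illustrative symmetric matrix
B = A @ A + 3 * np.eye(2)

assert np.allclose(A @ B, B @ A)

_, Q = np.linalg.eigh(A)          # eigenvectors of A
QBQ = Q.T @ B @ Q
assert np.allclose(QBQ, np.diag(np.diag(QBQ)))   # Q diagonalizes B as well
```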

  • Complex conjugate

    z̄ = a - ib for any complex number z = a + ib. Then z z̄ = |z|^2.
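Python's built-in complex type makes the identity z z̄ = |z|^2 easy to check:

```python
# Complex conjugate: for z = a + ib, z̄ = a - ib and z * z̄ = |z|^2.
z = 3 + 4j
z_bar = z.conjugate()                 # 3 - 4j

assert z_bar == 3 - 4j
assert z * z_bar == abs(z) ** 2       # 25, since |3 + 4i| = 5
```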

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
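A minimal sketch of the textbook conjugate gradient iteration, assuming A is symmetric positive definite (the 2x2 system is an illustrative choice):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=50):
    """Textbook CG sketch for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual = negative gradient of (1/2) x^T A x - x^T b
    p = r.copy()           # first search direction
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)       # exact line search along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p             # new direction, A-conjugate to the old ones
        r = r_new
    return x

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
assert np.allclose(A @ x, b)
```

In exact arithmetic CG solves an n-by-n system in at most n steps, one per Krylov subspace dimension.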

  • Cross product u xv in R3:

    Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
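Both properties (perpendicularity and the parallelogram area) can be checked with `np.cross` on illustrative vectors:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])

w = np.cross(u, v)                        # perpendicular to both u and v
assert np.isclose(w @ u, 0) and np.isclose(w @ v, 0)

# ||u x v|| = ||u|| ||v|| |sin θ| = area of the parallelogram on u and v.
area = np.linalg.norm(w)
assert np.isclose(area, 1.0)              # base 1, height 1
```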

  • Diagonal matrix D.

    d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
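A one-step elimination on an illustrative 2x2 matrix shows how the multiplier ℓ_21 = (entry to eliminate)/(pivot) moves into L while the row operation produces U:

```python
import numpy as np

# Elimination without row exchanges: subtract l21 times row 1 from row 2
# to reach upper triangular U; the multiplier is recorded in L.
A = np.array([[2.0, 1.0],
              [6.0, 8.0]])

l21 = A[1, 0] / A[0, 0]          # multiplier = entry to eliminate / pivot = 3
U = A.copy()
U[1] = U[1] - l21 * U[0]         # row operation: U is now upper triangular

L = np.array([[1.0, 0.0],
              [l21, 1.0]])
assert np.allclose(L @ U, A)     # A = LU
```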

  • Ellipse (or ellipsoid) x T Ax = 1.

    A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^{-1} y||^2 = y^T (A A^T)^{-1} y = 1 displayed by eigshow; axis lengths σ_i.)
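The half-axis lengths 1/√λ can be verified directly: a point at distance 1/√λ along an eigenvector of A satisfies x^T A x = 1 (the matrix below is an illustrative positive definite choice):

```python
import numpy as np

# Ellipse x^T A x = 1 for positive definite A: axes along the eigenvectors,
# half-axis lengths 1 / sqrt(lambda).
A = np.array([[5.0, 4.0],
              [4.0, 5.0]])          # eigenvalues 1 and 9

lams, Q = np.linalg.eigh(A)
half_axes = 1.0 / np.sqrt(lams)     # lengths 1 and 1/3

for lam, q in zip(lams, Q.T):       # each column q is a unit eigenvector
    x = q / np.sqrt(lam)            # point on the ellipse along that axis
    assert np.isclose(x @ A @ x, 1.0)
```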

  • Graph G.

    Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
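The two edge counts are easy to confirm in plain Python (the path graph used as a tree is an illustrative choice):

```python
from itertools import combinations

# Complete graph on n nodes: all n(n-1)/2 pairwise edges.
n = 5
complete_edges = list(combinations(range(n), 2))
assert len(complete_edges) == n * (n - 1) // 2   # 10 edges for n = 5

# A tree on n nodes has exactly n - 1 edges and no closed loops;
# e.g. the path 0-1-2-3-4 (a hypothetical tree for this sketch).
tree_edges = [(i, i + 1) for i in range(n - 1)]
assert len(tree_edges) == n - 1
```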

  • Hermitian matrix A^H = Ā^T = A.

    Complex analog a_ji = ā_ij of a symmetric matrix.
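A small illustrative Hermitian matrix: it equals its conjugate transpose, and its eigenvalues come out real:

```python
import numpy as np

# Hermitian: A equals its conjugate transpose, a_ji = conj(a_ij).
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

assert np.allclose(A, A.conj().T)          # A^H = A

# Hermitian matrices have real eigenvalues; eigvalsh exploits the structure.
eigenvalues = np.linalg.eigvalsh(A)
assert np.all(np.isreal(eigenvalues))
```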

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ_k a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
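Three of these equivalent definitions, computed side by side on illustrative 2x2 matrices, all reproduce A @ B:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

AB = A @ B

# Entry picture: (AB)_ij = (row i of A) dot (column j of B).
entry = np.array([[A[i] @ B[:, j] for j in range(2)] for i in range(2)])

# Column picture: column j of AB is A times column j of B.
cols = np.column_stack([A @ B[:, j] for j in range(2)])

# Columns times rows: AB = sum over k of (column k of A)(row k of B).
outer = sum(np.outer(A[:, k], B[k]) for k in range(2))

assert np.allclose(entry, AB) and np.allclose(cols, AB) and np.allclose(outer, AB)
```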

  • Multiplier ℓ_ij.

    The pivot row j is multiplied by ℓ_ij and subtracted from row i to eliminate the i, j entry: ℓ_ij = (entry to eliminate) / (jth pivot).

  • Nilpotent matrix N.

    Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
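A strictly upper triangular matrix illustrates both claims: N^3 = 0 and every eigenvalue is zero:

```python
import numpy as np

# Nilpotent example: strictly upper triangular, so some power is zero.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

assert np.allclose(np.linalg.matrix_power(N, 3), 0)   # N^3 = 0

# The only eigenvalue is 0, repeated n times.
assert np.allclose(np.linalg.eigvals(N), 0)
```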

  • Orthonormal vectors q_1, ..., q_n.

    Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1} and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
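QR factorization produces orthonormal columns, and the expansion v = Σ (v^T q_j) q_j recovers any v in their span (the 3x2 matrix is an illustrative choice):

```python
import numpy as np

# QR gives orthonormal columns q_1, ..., q_n spanning the column space of A.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(A)

assert np.allclose(Q.T @ Q, np.eye(2))   # q_i^T q_j = 0 (i != j), q_i^T q_i = 1

# Any v in the span expands as v = sum over j of (v^T q_j) q_j.
v = A @ np.array([2.0, -1.0])            # a vector in C(A)
expansion = sum((v @ q) * q for q in Q.T)
assert np.allclose(expansion, v)
```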

  • Rank one matrix A = uv^T ≠ 0.

    Column and row spaces = lines cu and cv.
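An outer product of illustrative vectors shows the structure: rank one, with every column a multiple of u:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

A = np.outer(u, v)                        # A = u v^T, a 3x2 rank-one matrix
assert np.linalg.matrix_rank(A) == 1

# Every column is a multiple of u: column 1 is (v_1 / v_0) times column 0.
assert np.allclose(A[:, 1], (v[1] / v[0]) * A[:, 0])
```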

  • Schur complement S = D - C A^{-1} B.

    Appears in block elimination on [A B; C D].
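Block elimination on illustrative blocks makes this concrete: subtracting C A^{-1} times the first block row leaves exactly S in the lower-right corner:

```python
import numpy as np

# Block elimination on M = [A B; C D] with multiplier C A^{-1};
# the remaining (2,2) block is the Schur complement S = D - C A^{-1} B.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 2.0]])
D = np.array([[5.0]])

S = D - C @ np.linalg.inv(A) @ B

M = np.block([[A, B], [C, D]])
E = np.block([[np.eye(2), np.zeros((2, 1))],
              [-C @ np.linalg.inv(A), np.eye(1)]])
# After one block elimination step, the lower-right block of E M is S.
assert np.allclose((E @ M)[2:, 2:], S)
```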

  • Singular matrix A.

    A square matrix that has no inverse: det(A) = 0.
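An illustrative singular matrix (one row a multiple of another) has zero determinant, and NumPy refuses to invert it:

```python
import numpy as np

# Singular: the second row is twice the first, so det(A) = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

assert np.isclose(np.linalg.det(A), 0.0)

# Inverting a singular matrix raises LinAlgError.
try:
    np.linalg.inv(A)
    inverted = True
except np.linalg.LinAlgError:
    inverted = False
assert not inverted
```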

  • Unitary matrix U^H = Ū^T = U^{-1}.

    Orthonormal columns (complex analog of Q).
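An illustrative complex matrix with orthonormal columns shows the defining property U^H U = I, hence U^H = U^{-1}:

```python
import numpy as np

# Unitary example: columns are orthonormal under the complex inner product.
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [1j * np.sin(theta), 1j * np.cos(theta)]])

assert np.allclose(U.conj().T @ U, np.eye(2))      # U^H U = I
assert np.allclose(U.conj().T, np.linalg.inv(U))   # so U^H = U^{-1}
```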
