 6.6.1: Find the matrix associated with each of the following quadratic for...
 6.6.2: Reorder the eigenvalues in Example 2 so that λ1 = 4 and λ2 = 2, and r...
 6.6.3: In each of the following, (i) find a suitable change of coordinates...
 6.6.4: Let λ1 and λ2 be the eigenvalues of A = [a b; b c]. What kind of conic se...
 6.6.5: Let A be a symmetric 2×2 matrix and let α be a nonzero scalar for whi...
 6.6.6: Which of the matrices that follow are positive definite? Negative d...
 6.6.7: For each of the following functions, determine whether the given st...
 6.6.8: Show that if A is symmetric positive definite, then det(A) > 0. Giv...
 6.6.9: Show that if A is a symmetric positive definite matrix, then A is n...
 6.6.10: Let A be a singular n×n matrix. Show that AᵀA is positive semidefin...
 6.6.11: Let A be a symmetric n×n matrix with eigenvalues λ1, . . . , λn. Show ...
 6.6.12: Let A be a symmetric positive definite matrix. Show that the diagon...
 6.6.13: Let A be a symmetric positive definite n×n matrix and let S be a non...
 6.6.14: Let A be a symmetric positive definite n×n matrix. Show that A can b...
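Several of the exercises above ask for the symmetric matrix associated with a quadratic form. A minimal NumPy sketch, using a hypothetical form f(x, y) = 3x² + 4xy + 2y² (not one of the numbered problems): the squared-term coefficients go on the diagonal and half of each cross coefficient goes off the diagonal.

```python
import numpy as np

# Hypothetical example: f(x, y) = 3x^2 + 4xy + 2y^2.
# Diagonal entries are the x^2 and y^2 coefficients;
# each off-diagonal entry is half the xy coefficient.
A = np.array([[3.0, 2.0],
              [2.0, 2.0]])

def quad_form(A, v):
    """Evaluate the quadratic form v^T A v."""
    return v @ A @ v

v = np.array([1.0, 1.0])
print(quad_form(A, v))   # f(1, 1) = 3 + 4 + 2 = 9
```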
Solutions for Chapter 6.6: Quadratic Forms
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290

Affine transformation
T(v) = Av + v₀ = linear transformation plus shift.

Big formula for n by n determinants.
det(A) is a sum of n! terms. For each term, multiply one entry from each row and column of A: rows in order 1, . . . , n and column order given by a permutation P. Each of the n! P's has a + or − sign.
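The permutation expansion above can be written out directly; this brute-force sketch (only an illustration, since it sums all n! terms and is far slower than elimination) computes each term's sign by counting inversions:

```python
import itertools
import numpy as np

def big_formula_det(A):
    """Determinant via the n! permutation expansion."""
    n = A.shape[0]
    total = 0.0
    for perm in itertools.permutations(range(n)):
        # Sign of the permutation: +1 if even, -1 if odd (count inversions).
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        term = float(sign)
        for row, col in enumerate(perm):
            term *= A[row, col]      # one entry from each row and column
        total += term
    return total

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(big_formula_det(A))   # -2.0, matching np.linalg.det(A)
```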

Cayley–Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓij (and ℓii = 1) brings U back to A.
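A minimal sketch of the elimination just described, assuming no row exchanges are needed (every pivot nonzero); the matrix here is a made-up 2×2 example:

```python
import numpy as np

def lu_no_pivot(A):
    """LU factorization without row exchanges (assumes nonzero pivots)."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]    # multiplier l_ik
            U[i, :] -= L[i, k] * U[k, :]   # eliminate below the pivot
    return L, U

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])
L, U = lu_no_pivot(A)
# L holds the multiplier 3 below its unit diagonal, and L @ U rebuilds A.
```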

Hilbert matrix hilb(n).
Entries Hij = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.
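A quick numerical check of these claims, with a hand-rolled helper (scipy.linalg.hilbert provides the same matrix, but NumPy alone suffices here):

```python
import numpy as np

def hilb(n):
    """Hilbert matrix with entries H[i, j] = 1/(i + j - 1), 1-based."""
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)   # 0-based indices, so i + j + 1

H = hilb(5)
# Symmetric positive definite (all eigenvalues > 0), yet lambda_min is
# tiny, so the condition number grows explosively with n.
print(np.linalg.cond(H))   # already of order 1e5 for n = 5
```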

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, . . . , A^(j−1)b. Numerical methods approximate A⁻¹b by xⱼ with residual b − Axⱼ in this subspace. A good basis for Kⱼ requires only multiplication by A at each step.
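As the entry notes, the basis needs only one matrix–vector product per step, never an explicit power of A. A small sketch with a hypothetical A and b:

```python
import numpy as np

def krylov_basis(A, b, j):
    """Columns b, Ab, ..., A^(j-1) b, one product by A per step."""
    cols = [b]
    for _ in range(j - 1):
        cols.append(A @ cols[-1])   # multiply the previous column by A
    return np.column_stack(cols)

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
b = np.array([1.0, 1.0])
K = krylov_basis(A, b, 2)   # columns are b and Ab
```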

Length ‖x‖.
Square root of xᵀx (Pythagoras in n dimensions).

Linearly dependent v1, . . . , vn.
A combination other than all ci = 0 gives Σ ci vi = 0.

Normal equations AᵀAx̂ = Aᵀb.
Gives the least-squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.
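A small least-squares sketch with made-up data (fitting a line through three points); it also checks the orthogonality statement that the residual is perpendicular to every column of A:

```python
import numpy as np

# Hypothetical data: fit y = c0 + c1*t through three points.
t = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 2.0, 2.0])
A = np.column_stack([np.ones_like(t), t])   # full column rank

# Solve the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ y)

# The residual b - A x_hat is orthogonal to the columns of A.
residual = y - A @ x_hat
print(A.T @ residual)   # numerically zero
```

For ill-conditioned A, np.linalg.lstsq (which works from an orthogonal factorization) is numerically safer than forming AᵀA explicitly, but the answer is the same here.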

Outer product uvᵀ
= column times row = rank one matrix.

Permutation matrix P.
There are n! orders of 1, . . . , n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
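These facts are easy to verify numerically; a sketch for the hypothetical row order [2, 0, 1] (a 3-cycle, i.e. two row exchanges, so an even permutation with det P = +1):

```python
import numpy as np

def perm_matrix(order):
    """Permutation matrix whose rows are the rows of I in the given order."""
    return np.eye(len(order))[list(order)]

P = perm_matrix([2, 0, 1])
A = np.arange(9.0).reshape(3, 3)

# P @ A puts the rows of A in the order 2, 0, 1.
print(P @ A)
print(np.linalg.det(P))   # +1: this permutation is even
```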

Rank one matrix A = uvᵀ ≠ 0.
Column and row spaces = lines cu and cv.

Rank r(A)
= number of pivots = dimension of column space = dimension of row space.

Rayleigh quotient q(x) = xᵀAx / xᵀx for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
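A numerical check of these bounds on a small hypothetical symmetric matrix (eigenvalues 1 and 3), including the fact that the extremes are attained at the eigenvectors:

```python
import numpy as np

def rayleigh(A, x):
    """Rayleigh quotient x^T A x / x^T x for symmetric A."""
    return (x @ A @ x) / (x @ x)

# Hypothetical symmetric matrix with eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eigh(A)   # vals sorted ascending

x = np.array([1.0, 0.0])
q = rayleigh(A, x)
# lambda_min <= q <= lambda_max, with equality at the eigenvectors.
print(vals[0], q, vals[1])
```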

Singular Value Decomposition
(SVD) A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular values σᵢ > 0. The last columns are orthonormal bases of the nullspaces of Aᵀ and A.
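np.linalg.svd returns U, the singular values, and Vᵀ; a quick check of A = UΣVᵀ and Avᵢ = σᵢuᵢ for a hypothetical 2×2 matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)   # s holds the singular values, descending
Sigma = np.diag(s)

# U and V are orthogonal, and A v_i = sigma_i u_i for each i
# (v_i is the i-th column of V, i.e. the i-th row of Vt).
print(np.allclose(U @ Sigma @ Vt, A))   # True
```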

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = AᵀCA, where C has the spring constants from Hooke's Law and Ax = stretching.

Transpose matrix Aᵀ.
Entries (Aᵀ)ij = Aji. Aᵀ is n by m, and AᵀA is square, symmetric, positive semidefinite. The transposes of AB and A⁻¹ are BᵀAᵀ and (Aᵀ)⁻¹.