
# Solutions for Chapter 8.8: Quadric Surfaces

## Full solutions for Elementary Linear Algebra with Applications | 9th Edition

ISBN: 9780471669593


Chapter 8.8: Quadric Surfaces includes 28 full step-by-step solutions. Since all 28 problems in Chapter 8.8 have been answered, more than 9,141 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for Elementary Linear Algebra with Applications, 9th edition (ISBN: 9780471669593), and covers that textbook's chapters and their solutions.

## Key math terms and definitions covered in this textbook
• Adjacency matrix of a graph.

Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = Aᵀ when edges go both ways (undirected).
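As a quick sketch of this definition (the 3-node triangle graph below is an invented example, not from the text), an undirected edge list produces a symmetric adjacency matrix:

```python
# Adjacency matrix: aij = 1 when there is an edge from node i to
# node j, otherwise 0. For an undirected graph, A equals its
# transpose. The triangle graph here is an invented example.

def adjacency_matrix(n, edges, undirected=True):
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] = 1
        if undirected:
            A[j][i] = 1          # edges go both ways
    return A

A = adjacency_matrix(3, [(0, 1), (1, 2), (2, 0)])
AT = [[A[j][i] for j in range(3)] for i in range(3)]  # transpose
```

Here `A == AT`, confirming A = Aᵀ for the undirected case.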

• Companion matrix.

Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ² + ··· + cnλⁿ⁻¹ − λⁿ).
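A minimal pure-Python sketch of this construction, using an invented polynomial λ² = 2 + λ (so c1 = 2, c2 = 1, with root λ = 2). For a companion matrix, (1, λ, λ², ...) is an eigenvector for each root λ:

```python
# Companion matrix as described: n-1 ones just above the main
# diagonal, coefficients c1, ..., cn in the last row. The polynomial
# lambda^2 = 2 + lambda is an invented example.

def companion(c):
    n = len(c)
    A = [[0] * n for _ in range(n)]
    for i in range(n - 1):
        A[i][i + 1] = 1          # ones just above the diagonal
    A[n - 1] = list(c)           # row n holds c1, ..., cn
    return A

A = companion([2, 1])
lam = 2                          # a root: 2^2 = 2 + 2
v = [1, lam]                     # eigenvector (1, lam)
Av = [sum(a * x for a, x in zip(row, v)) for row in A]
```

Multiplying shows Av = λv, which is exactly how the companion matrix encodes its characteristic polynomial.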

• Conjugate gradient method.

A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½xᵀAx − xᵀb over growing Krylov subspaces.
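A minimal pure-Python sketch of the conjugate gradient iteration for a small symmetric positive definite system; the matrix A and right-hand side b below are invented examples (for a 2×2 system, CG converges in at most two steps):

```python
# Conjugate gradient for SPD Ax = b: each step does an exact line
# search along a search direction p that is made A-conjugate to the
# previous directions. A and b are invented examples.

def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(A, b, tol=1e-12, max_iter=50):
    x = [0.0] * len(b)            # start at the origin
    r = b[:]                      # residual r = b - Ax (x = 0)
    p = r[:]                      # first search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)   # exact line search along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]      # symmetric positive definite
b = [1.0, 2.0]
x = conjugate_gradient(A, b)      # exact solution is (1/11, 7/11)
```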

• Echelon matrix U.

The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

• Hankel matrix H.

Constant along each antidiagonal; hij depends on i + j.
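As a sketch (the generating sequence h below is an invented example), building H from a single sequence indexed by i + j makes every antidiagonal constant:

```python
# Hankel matrix: hij depends only on i + j, so each antidiagonal is
# constant. The sequence h is an invented example.

h = [1, 2, 3, 4, 5]              # one entry per value of i + j
n = 3
H = [[h[i + j] for j in range(n)] for i in range(n)]
```

For this H, the antidiagonal i + j = 2 holds the constant value h[2] = 3 in all three positions.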

• Hermitian matrix A^H = Āᵀ = A.

Complex analog aji = āij of a symmetric matrix.
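A small pure-Python check of the definition (the 2×2 complex matrix below is an invented example): conjugating and transposing a Hermitian matrix gives the matrix back.

```python
# Hermitian matrix: equal to its conjugate transpose, a_ji = conj(a_ij).
# Diagonal entries are therefore real. The matrix A is an invented
# example.

A = [[2 + 0j, 1 + 1j],
     [1 - 1j, 3 + 0j]]

def conjugate_transpose(A):
    n = len(A)
    return [[A[j][i].conjugate() for j in range(n)] for i in range(n)]

AH = conjugate_transpose(A)
```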

• Identity matrix I (or In).

Diagonal entries = 1, off-diagonal entries = 0.

• Iterative method.

A sequence of steps intended to approach the desired solution.

• Jordan form J = M⁻¹AM.

If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λkIk + Nk, where Nk has 1's on diagonal 1. Each block has one eigenvalue λk and one eigenvector.

• Length II x II.

Square root of xᵀx (Pythagoras in n dimensions).

• Norm ‖A‖.

The "ℓ² norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σmax. Then ‖Ax‖ ≤ ‖A‖ ‖x‖, ‖AB‖ ≤ ‖A‖ ‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. The Frobenius norm is ‖A‖²F = Σi Σj a²ij. The ℓ¹ and ℓ∞ norms are the largest column and row sums of |aij|.
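The ℓ¹ and ℓ∞ norms are the easiest to compute by hand; a short pure-Python sketch (the matrix A is an invented example):

```python
# l1 norm of a matrix: largest absolute column sum.
# l-infinity norm: largest absolute row sum.
# The matrix A is an invented example.

def l1_norm(A):
    rows, cols = len(A), len(A[0])
    return max(sum(abs(A[i][j]) for i in range(rows)) for j in range(cols))

def linf_norm(A):
    return max(sum(abs(a) for a in row) for row in A)

A = [[1, -2], [3, 4]]
# column sums of |aij|: 1+3 = 4 and 2+4 = 6  -> l1 norm 6
# row sums of |aij|:    1+2 = 3 and 3+4 = 7  -> l-infinity norm 7
```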

• Orthogonal matrix Q.

Square matrix with orthonormal columns, so Qᵀ = Q⁻¹. Preserves lengths and angles: ‖Qx‖ = ‖x‖ and (Qx)ᵀ(Qy) = xᵀy. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
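The rotation example can be checked directly (the angle and test vector below are invented choices): applying a 2×2 rotation leaves the length of a vector unchanged.

```python
import math

# A rotation matrix Q has orthonormal columns, so ||Qx|| = ||x||.
# The angle 0.7 and the vector (3, 4) are invented examples.

def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def norm(x):
    return math.sqrt(sum(v * v for v in x))

Q = rotation(0.7)
x = [3.0, 4.0]                   # length 5
Qx = [sum(q * v for q, v in zip(row, x)) for row in Q]
```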

• Plane (or hyperplane) in Rn.

Vectors x with aᵀx = 0. The plane is perpendicular to a ≠ 0.

• Polar decomposition A = Q H.

Orthogonal Q times positive (semi)definite H.

• Positive definite matrix A.

Symmetric matrix with positive eigenvalues and positive pivots. Definition: xᵀAx > 0 unless x = 0. Then A = LDLᵀ with diag(D) > 0.
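The pivot test can be sketched with plain elimination (the matrix A below is an invented example): a symmetric matrix is positive definite exactly when every pivot that appears during elimination is positive.

```python
# Positive definiteness via pivots: run elimination and record the
# diagonal pivots; all positive means positive definite. The matrix A
# is an invented example. (No row exchanges are needed here; a zero
# pivot would already disqualify a positive definite candidate.)

def pivots(A):
    n = len(A)
    U = [row[:] for row in A]    # eliminate on a copy
    piv = []
    for k in range(n):
        piv.append(U[k][k])
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return piv

A = [[2.0, 1.0], [1.0, 2.0]]     # symmetric
p = pivots(A)                    # pivots 2 and 1.5, both positive
```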

• Pseudoinverse A+ (Moore-Penrose inverse).

The n by m matrix that "inverts" A from column space back to row space, with N(A⁺) = N(Aᵀ). A⁺A and AA⁺ are the projection matrices onto the row space and column space. rank(A⁺) = rank(A).

• Special solutions to As = O.

One free variable is si = 1, other free variables = 0.
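A minimal sketch using an invented 1×3 example A = [1 2 3], where x1 is the pivot variable and x2, x3 are free: each special solution sets one free variable to 1, the other to 0, and solves for the pivot variable.

```python
# Special solutions to A s = 0 for the invented example A = [1 2 3]:
# x1 is the pivot variable; x2 and x3 are free.

A = [[1, 2, 3]]
s1 = [-2, 1, 0]                  # free variable x2 = 1, x3 = 0
s2 = [-3, 0, 1]                  # free variable x3 = 1, x2 = 0

def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]
```

Both special solutions land in the nullspace, and every solution of As = 0 is a combination of them.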

• Sum V + W of subspaces.

Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

• Trace of A.

Sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
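The identity Tr AB = Tr BA holds even when AB ≠ BA, which a small pure-Python check makes concrete (the 2×2 matrices below are invented examples):

```python
# Trace: sum of the diagonal entries. Tr AB = Tr BA even though
# matrix multiplication is not commutative. A and B are invented
# examples.

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
```

Here AB and BA are different matrices, but both have trace 69.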