- 6.8.1: Label the following statements as true or false. (a) Every quadrati...
- 6.8.2: Prove properties 1, 2, 3, and 4 on page 423.
- 6.8.3: (a) Prove that the sum of two bilinear forms is a bilinear form. (b...
- 6.8.4: Determine which of the mappings that follow are bilinear forms. Jus...
- 6.8.5: Verify that each of the given mappings is a bilinear form. Then com...
- 6.8.6: Let H: R2 × R2 → R be the function defined by ...
- 6.8.7: Let V and W be vector spaces over the same field, and let T: V → W ...
- 6.8.8: Assume the notation of Theorem 6.32. (a) Prove that for any ordered...
- 6.8.9: Assume the notation of Theorem 6.32. (a) Prove that for any ordered...
- 6.8.10: Prove Corollary 2 to Theorem 6.32.
- 6.8.11: Prove Corollary 3 to Theorem 6.32.
- 6.8.12: Prove that the relation of congruence is an equivalence relation.
- 6.8.13: The following outline provides an alternative proof to Theorem 6.33...
- 6.8.14: Let V be a finite-dimensional vector space and H ∈ B(V). Prove tha...
- 6.8.15: Prove the following results. (a) Any square diagonal matrix is symm...
- 6.8.16: Let V be a vector space over a field F not of characteristic two, a...
- 6.8.17: For each of the given quadratic forms K on a real inner product spa...
- 6.8.18: Let S be the set of all (t1, t2, t3) ∈ R3 for which 3t1^2 + 3t2^2 + 3...
- 6.8.19: Prove the following refinement of Theorem 6.37(d). (a) If 0 < rank(...
- 6.8.20: Prove the following variation of the second-derivative test for the...
- 6.8.21: Let A and E be in Mn×n(F), with E an elementary matrix. In Section ...
- 6.8.22: For each of the following matrices A with entries from R, find a di...
- 6.8.23: Prove that if the diagonal entries of a diagonal matrix are permute...
- 6.8.24: Let T be a linear operator on a real inner product space V, and def...
- 6.8.25: Prove the converse to Exercise 24(a): Let V be a finite-dimensional...
- 6.8.26: Prove that the number of distinct equivalence classes of congruent ...
Solutions for Chapter 6.8: Bilinear and Quadratic Forms
Full solutions for Linear Algebra | 4th Edition
Affine transformation Tv = Av + v0 = linear transformation plus shift.
Cofactor Cij.
Remove row i and column j; multiply the determinant by (-1)^(i+j).
Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
Complete solution x = xp + xn to Ax = b.
(Particular xp) + (xn in nullspace).
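A minimal numpy sketch of particular-plus-nullspace solutions; the singular matrix A and the vectors below are assumed for illustration only.

```python
import numpy as np

# Hypothetical singular system: rank-1 A, so the nullspace is nontrivial.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([3.0, 6.0])

x_p = np.array([3.0, 0.0])   # one particular solution: A @ x_p = b
x_n = np.array([-2.0, 1.0])  # nullspace vector: A @ x_n = 0

# Any particular solution plus any nullspace vector also solves Ax = b.
x = x_p + 5.0 * x_n
assert np.allclose(A @ x_p, b)
assert np.allclose(A @ x_n, 0)
assert np.allclose(A @ x, b)
```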
Cramer's Rule for Ax = b.
Bj has b replacing column j of A; xj = det Bj / det A.
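The rule above can be sketched in numpy; the 2x2 system is an assumed example.

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's Rule: x_j = det(B_j) / det(A),
    where B_j is A with column j replaced by b."""
    detA = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        Bj = A.copy()
        Bj[:, j] = b                     # replace column j with b
        x[j] = np.linalg.det(Bj) / detA
    return x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)
assert np.allclose(A @ x, b)
assert np.allclose(x, np.linalg.solve(A, b))
```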
Cross product u × v in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram. u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
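Both properties can be checked numerically; u and v below are arbitrary example vectors.

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])  # example vectors, assumed for illustration
v = np.array([0.0, 1.0, 3.0])
w = np.cross(u, v)

# w is perpendicular to both u and v.
assert np.isclose(w @ u, 0) and np.isclose(w @ v, 0)

# ||u x v|| = ||u|| ||v|| |sin(theta)| = area of the parallelogram.
cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1 - cos_t**2)
assert np.isclose(np.linalg.norm(w), area)
```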
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A^T| = |A|.
Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.
Hilbert matrix hilb(n).
Entries Hij = 1/(i + j - 1) = ∫0^1 x^(i-1) x^(j-1) dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.
Jordan form J = M^(-1) A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λk Ik + Nk, where Nk has 1's on diagonal 1. Each block has one eigenvalue λk and one eigenvector.
Linearly dependent VI, ... , Vn.
A combination other than all ci = 0 gives Σ ci vi = 0.
Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
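Both multiplicities can be computed numerically; the 2x2 defective matrix below is an assumed example where AM > GM.

```python
import numpy as np

# A hypothetical defective matrix: eigenvalue 1 repeated, one eigenvector.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Algebraic multiplicity: λ = 1 is a double root of det(A - λI) = 0.
eigs = np.linalg.eigvals(A)
AM = int(np.sum(np.isclose(eigs, 1.0)))

# Geometric multiplicity: dimension of the nullspace of (A - I).
GM = A.shape[0] - np.linalg.matrix_rank(A - np.eye(2))

assert AM == 2 and GM == 1  # AM > GM here: A is not diagonalizable
```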
Multiplier ℓij.
The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
Normal equation A^T A x = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b - Ax) = 0.
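A minimal numpy sketch of the normal equation; the tall matrix A and vector b are assumed example data.

```python
import numpy as np

# Hypothetical overdetermined system: 4 equations, 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Least squares solution from the normal equation A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])

# The residual b - A x_hat is orthogonal to every column of A.
assert np.allclose(A.T @ (b - A @ x_hat), 0)
```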
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
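One standard way to compute Q and H is from the SVD; the sketch below uses an assumed 2x2 example matrix.

```python
import numpy as np

def polar(A):
    """Polar decomposition A = Q H from the SVD A = U S V^T:
    Q = U V^T (orthogonal), H = V S V^T (symmetric positive semidefinite)."""
    U, s, Vt = np.linalg.svd(A)
    Q = U @ Vt
    H = Vt.T @ np.diag(s) @ Vt
    return Q, H

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
Q, H = polar(A)
assert np.allclose(Q @ H, A)
assert np.allclose(Q.T @ Q, np.eye(2))          # Q is orthogonal
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)  # H is positive semidefinite
```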
Similar matrices A and B.
Every B = M^(-1) A M has the same eigenvalues as A.
Special solutions to As = O.
One free variable is si = 1, other free variables = 0.
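A small sketch with an assumed rank-1 matrix whose free variables are x2 and x3; the special solutions were read off by hand and checked numerically.

```python
import numpy as np

# Hypothetical A with pivot column 1 and free columns 2 and 3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# One special solution per free variable: set that free variable to 1,
# the other free variables to 0, and back-solve for the pivot variable.
s1 = np.array([-2.0, 1.0, 0.0])  # free variable x2 = 1
s2 = np.array([-3.0, 0.0, 1.0])  # free variable x3 = 1

assert np.allclose(A @ s1, 0)
assert np.allclose(A @ s2, 0)
```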
Trace of A.
Sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
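Both identities are easy to verify numerically on random example matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Trace = sum of diagonal entries = sum of eigenvalues (real for real trace).
assert np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A)).real)

# Tr(AB) = Tr(BA), even though AB != BA in general.
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```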
Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms ||A + B|| ≤ ||A|| + ||B||.
Unitary matrix: U^H (conjugate transpose of U) = U^(-1).
Orthonormal columns (complex analog of Q).
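A quick numerical check on an assumed example, the normalized 4x4 discrete Fourier matrix:

```python
import numpy as np

# Hypothetical unitary matrix: the 4x4 Fourier matrix divided by sqrt(4).
n = 4
j, k = np.indices((n, n))
U = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

# U^H (conjugate transpose) inverts U, and the columns are orthonormal.
assert np.allclose(U.conj().T @ U, np.eye(n))
assert np.allclose(U.conj().T, np.linalg.inv(U))
```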