 6.1: Let u = (x1, x2) and v = (y1, y2) be elements of R^2. Prove that the ...
 6.2: Determine the inner product, the norms, and the distance between the...
 6.3: Let R^2 have inner product defined by ((x1, x2), (y1, y2)) = 2x1y1 +...
 6.4: Consider R^2 with the inner product ((x1, x2), (y1, y2)) = 2x1y1 + 3...
 6.5: Find the least squares linear approximation to f(x) = x^2 + 2x - 1 ...
 6.6: Find the fourth-order Fourier approximation to f(x) = 2x - 1 over [0...
 6.7: List all the vectors of V7 that lie in the sphere having radius 1 ...
 6.8: Find the pseudoinverse of the matrix [ ! ]
 6.9: Determine the least squares parabola for the data (1, 6), (2, 2), (...
 6.10: Let A be a matrix with pseudoinverse B. Show that (a) ABA = A (b) BA...
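As an illustrative numpy sketch for Problems 6.8 and 6.10 (the matrix below is hypothetical, since the entries of the matrix in 6.8 were lost in transcription), np.linalg.pinv computes the pseudoinverse B, and both Penrose identities can be checked numerically:

```python
import numpy as np

# Hypothetical 3x2 matrix; not the one from Problem 6.8.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

B = np.linalg.pinv(A)  # Moore-Penrose pseudoinverse of A

# The identities from Problem 6.10:
print(np.allclose(A @ B @ A, A))  # (a) ABA = A -> True
print(np.allclose(B @ A @ B, B))  # (b) BAB = B -> True
```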
Solutions for Chapter 6: Inner Product Spaces
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9781449679545

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
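The rank test described above can be sketched in numpy (an illustrative helper, not from the text; is_solvable is a name chosen here):

```python
import numpy as np

def is_solvable(A, b):
    """Ax = b is solvable iff rank [A b] equals rank A,
    i.e. b lies in the column space of A."""
    Ab = np.column_stack([A, b])  # augmented matrix [A b]
    return np.linalg.matrix_rank(Ab) == np.linalg.matrix_rank(A)

A = np.array([[1.0, 2.0], [2.0, 4.0]])       # rank 1
print(is_solvable(A, np.array([1.0, 2.0])))  # True: b is in the column space
print(is_solvable(A, np.array([1.0, 0.0])))  # False: b is not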

Cofactor Cij.
Remove row i and column j; multiply the determinant by (-1)^(i+j).

Complete solution x = xp + xn to Ax = b.
(Particular solution xp) + (any xn in the nullspace).

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2)x^T A x - x^T b over growing Krylov subspaces.
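A minimal numpy sketch of the method (a textbook-style implementation written for illustration, not taken from the book):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve Ax = b for symmetric positive definite A by minimizing
    (1/2) x^T A x - x^T b over growing Krylov subspaces."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x               # residual = negative gradient
    p = r.copy()                # first search direction
    rs_old = r @ r
    for _ in range(n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # next A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))  # True
```

In exact arithmetic the iterate after k steps minimizes the quadratic over the k-dimensional Krylov subspace, so an n-by-n system converges in at most n steps.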

Covariance matrix Σ.
When random variables xi have mean = average value = 0, their covariances Σij are the averages of xi xj. With means x̄i, the matrix Σ = mean of (x - x̄)(x - x̄)^T is positive (semi)definite; Σ is diagonal if the xi are independent.
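As a numerical illustration (synthetic data, chosen here for the example), the sample covariance matrix built from centered data is symmetric positive semidefinite:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))   # 1000 samples of 3 random variables
Xc = X - X.mean(axis=0)          # subtract the means

Sigma = (Xc.T @ Xc) / len(Xc)    # Sigma = mean of (x - mean)(x - mean)^T

print(np.allclose(Sigma, Sigma.T))              # symmetric
print(np.all(np.linalg.eigvalsh(Sigma) >= 0))   # all eigenvalues >= 0
```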

Determinant IAI = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A^T| = |A|.
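These determinant properties are easy to confirm numerically (an illustrative check with arbitrary matrices, not from the text):

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])
B = np.array([[1.0, 4.0], [0.0, 2.0]])

print(np.isclose(np.linalg.det(np.eye(2)), 1.0))        # det I = 1
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))  # |AB| = |A||B|

S = np.array([[1.0, 2.0], [2.0, 4.0]])                  # dependent rows
print(np.isclose(np.linalg.det(S), 0.0))                # singular: |S| = 0
```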

Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.

Dimension of vector space.
dim(V) = number of vectors in any basis for V.

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fn x and Fn^(-1) c can be computed with nℓ/2 multiplications. Revolutionary.
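A minimal radix-2 FFT sketch (a standard recursive formulation written here for illustration; each level applies the n/2 twiddle-factor multiplications described above), verified against numpy's FFT:

```python
import numpy as np

def fft(x):
    """Recursive radix-2 FFT; len(x) must be a power of 2.
    Each level combines two half-size transforms using n/2
    twiddle multiplications, giving O(n log n) work overall."""
    n = len(x)
    if n == 1:
        return x
    even = fft(x[0::2])
    odd = fft(x[1::2])
    w = np.exp(-2j * np.pi * np.arange(n // 2) / n)  # twiddle factors
    return np.concatenate([even + w * odd, even - w * odd])

x = np.random.default_rng(1).normal(size=8).astype(complex)
print(np.allclose(fft(x), np.fft.fft(x)))  # True
```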

Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).
Use A^H for complex A.

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n  1)/2 edges between nodes. A tree has only n  1 edges and no closed loops.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
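A concrete example where the two multiplicities differ (an illustrative numpy check, not from the text):

```python
import numpy as np

# A = [[5, 1], [0, 5]] has eigenvalue 5 as a double root of
# det(A - xI) = (5 - x)^2, so AM = 2, but only one independent
# eigenvector, so GM = 1.
A = np.array([[5.0, 1.0], [0.0, 5.0]])
lam = 5.0

am = np.sum(np.isclose(np.linalg.eigvals(A), lam))            # algebraic
gm = A.shape[0] - np.linalg.matrix_rank(A - lam * np.eye(2))  # geometric
print(am, gm)  # 2 1
```

GM ≤ AM always; when GM < AM for some eigenvalue, the matrix is not diagonalizable.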

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
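A small numpy illustration (the particular order is arbitrary, chosen for the example):

```python
import numpy as np

order = [2, 0, 1]            # desired row order
P = np.eye(3)[order]         # rows of I in that order

A = np.arange(9).reshape(3, 3)
print(np.array_equal(P @ A, A[order]))    # PA reorders the rows of A
print(np.isclose(np.linalg.det(P), 1.0))  # even permutation: det P = +1
```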

Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
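A numerical sketch of both formulas (vectors chosen arbitrarily for illustration):

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([3.0, 3.0])

p = a * (a @ b) / (a @ a)     # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)  # projection matrix, rank 1

print(np.allclose(P @ b, p))          # P b gives the same p -> True
print(np.linalg.matrix_rank(P))       # 1
print(np.allclose(a @ (b - p), 0.0))  # error b - p is perpendicular to a
```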

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.

Skew-symmetric matrix K.
The transpose is −K, since Kij = −Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
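These properties can be checked on a 2x2 example (for this K, e^(Kt) is the rotation matrix by angle 2t, a fact used below rather than computed by a matrix-exponential routine):

```python
import numpy as np

K = np.array([[0.0, -2.0], [2.0, 0.0]])  # K^T = -K

print(np.array_equal(K.T, -K))                      # skew-symmetric
print(np.allclose(np.linalg.eigvals(K).real, 0.0))  # pure imaginary eigenvalues

t = 0.7
Q = np.array([[np.cos(2*t), -np.sin(2*t)],          # e^{Kt} = rotation by 2t
              [np.sin(2*t),  np.cos(2*t)]])
print(np.allclose(Q.T @ Q, np.eye(2)))              # orthogonal: Q^T Q = I
```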

Spectrum of A = the set of eigenvalues {λ1, ..., λn}.
Spectral radius = max of |λi|.

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.