- 6.1: Let have the Euclidean inner product. (a) Find a vector in that is ...
- 6.2: Prove: If is the Euclidean inner product on , and if A is an matrix...
- 6.3: Let have the inner product that was defined in Example 6 of Section...
- 6.4: Let be a system of m equations in n unknowns. Show that is a soluti...
- 6.5: Use the Cauchy-Schwarz inequality to show that if are positive real ...
- 6.6: Show that if x and y are vectors in an inner product space and c is...
- 6.7: Let have the Euclidean inner product. Find two vectors of length 1 ...
- 6.8: Find a weighted Euclidean inner product on such that the vectors fo...
- 6.9: Is there a weighted Euclidean inner product on for which the vector...
- 6.10: If u and v are vectors in an inner product space , then u, v, and c...
- 6.11: (a) As shown in Figure 3.2.6, the vectors (k, 0, 0), (0, k, 0), and...
- 6.12: Let u and v be vectors in an inner product space. (a) Prove that if...
- 6.13: Let u be a vector in an inner product space V, and let be an orthon...
- 6.14: Prove: If and are two inner products on a vector space V, then the ...
- 6.15: Prove Theorem 6.2.5.
- 6.16: Prove: If A has linearly independent column vectors, and if b is or...
- 6.17: Is there any value of s for which and is the least-squares solution ...
- 6.18: Show that if p and q are distinct positive integers, then the funct...
- 6.19: Show that if p and q are positive integers, then the functions and ...
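Several of these exercises (6.5 in particular) lean on the Cauchy-Schwarz inequality |⟨u, v⟩| ≤ ‖u‖ ‖v‖. A minimal numerical sketch for the Euclidean inner product, using hypothetical vectors not taken from the exercises:

```python
import math

def dot(u, v):
    # Euclidean inner product <u, v> = u_1*v_1 + ... + u_n*v_n
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    # Length induced by the inner product: ||u|| = sqrt(<u, u>)
    return math.sqrt(dot(u, u))

# Hypothetical vectors, chosen only for illustration.
u = [1.0, 2.0, 3.0]
v = [4.0, -1.0, 2.0]

# Cauchy-Schwarz: |<u, v>| <= ||u|| * ||v||
assert abs(dot(u, v)) <= norm(u) * norm(v)
```

The same `dot` and `norm` helpers can be reused to check the triangle inequality of exercise-style problems, since it follows from Cauchy-Schwarz.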
Solutions for Chapter 6: Inner Product Spaces
Full solutions for Elementary Linear Algebra: Applications Version | 10th Edition
Big formula for n by n determinants.
Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A: rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or - sign.
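The big formula can be written directly as a sum over permutations. A small sketch (illustrative only; this n! algorithm is far slower than elimination for large n):

```python
from itertools import permutations

def sign(p):
    # Sign of a permutation: +1 if even, -1 if odd (count inversions).
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det_big_formula(A):
    # Sum of n! signed terms; each term multiplies one entry from
    # each row (rows in order) and each column (order given by P).
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        term = sign(p)
        for i in range(n):
            term *= A[i][p[i]]
        total += term
    return total

print(det_big_formula([[1, 2], [3, 4]]))  # 1*4 - 2*3 = -2
```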
Cholesky factorization.
A = CC^T = (L√D)(L√D)^T for positive definite A.
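A pure-Python sketch of the Cholesky factorization, computing a lower-triangular C with A = CC^T and verifying the reconstruction on a small hypothetical positive definite matrix:

```python
import math

def cholesky(A):
    # Lower-triangular C with A = C C^T, for symmetric positive definite A.
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(C[i][k] * C[j][k] for k in range(j))
            if i == j:
                C[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                C[i][j] = (A[i][j] - s) / C[j][j]  # below-diagonal entry
    return C

A = [[4.0, 2.0], [2.0, 3.0]]   # illustrative SPD matrix
C = cholesky(A)

# Check that C C^T reproduces A
for i in range(2):
    for j in range(2):
        recon = sum(C[i][k] * C[j][k] for k in range(2))
        assert abs(recon - A[i][j]) < 1e-12
```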
Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
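The column picture can be checked numerically: Ax computed row by row equals the combination x_1(col 1) + x_2(col 2). A small sketch with an illustrative 2-by-2 system:

```python
# Column picture: A x is a combination of the columns of A.
A = [[1, 2],
     [3, 4]]
x = [5, 6]

cols = list(zip(*A))   # columns of A as tuples

# Combination of columns: x[0]*(col 1) + x[1]*(col 2)
combo = [x[0] * cols[0][i] + x[1] * cols[1][i] for i in range(2)]

# Row picture: dot each row of A with x
rowwise = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]

assert combo == rowwise   # both give b = A x
```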
Column space C(A) =
space of all combinations of the columns of A.
Complex conjugate.
z̄ = a - ib for any complex number z = a + ib. Then z z̄ = |z|^2.
Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x - x̄)(x - x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
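The definition Σ = mean of (x - x̄)(x - x̄)^T translates directly into code. A sketch with a small hypothetical sample:

```python
# Sample covariance matrix: Sigma = mean of (x - xbar)(x - xbar)^T
samples = [[1.0, 2.0], [3.0, 0.0], [5.0, 4.0]]   # illustrative data
n, d = len(samples), len(samples[0])

mean = [sum(s[j] for s in samples) / n for j in range(d)]
centered = [[s[j] - mean[j] for j in range(d)] for s in samples]

Sigma = [[sum(c[i] * c[j] for c in centered) / n for j in range(d)]
         for i in range(d)]

# A covariance matrix is symmetric with nonnegative diagonal
assert Sigma[0][1] == Sigma[1][0]
assert all(Sigma[i][i] >= 0 for i in range(d))
```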
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).
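Both facts in this entry are one-liners in code: perpendicular vectors have zero dot product, and matrix multiplication is rows of A dotted with columns of B. A minimal sketch:

```python
def dot(u, v):
    # x^T y = x_1*y_1 + ... + x_n*y_n
    return sum(ui * vi for ui, vi in zip(u, v))

# Perpendicular vectors have x^T y = 0
assert dot([1, 2], [-2, 1]) == 0

def matmul(A, B):
    # (AB)_ij = (row i of A) . (column j of B)
    return [[dot(row, col) for col in zip(*B)] for row in A]

# Multiplying by the identity returns A unchanged
assert matmul([[1, 2], [3, 4]], [[1, 0], [0, 1]]) == [[1, 2], [3, 4]]
```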
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Fourier matrix F.
Entries F_jk = e^{2πijk/n} give orthogonal columns: F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ_k c_k e^{2πijk/n}.
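The orthogonality F̄^T F = nI can be verified numerically with `cmath`; each pair of distinct columns has (conjugated) dot product 0, and each column has squared length n:

```python
import cmath

n = 4
# Fourier matrix entries F_jk = e^{2*pi*i*j*k/n}
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)]
     for j in range(n)]

# Columns are orthogonal: conj(F)^T F = n I
for a in range(n):
    for b in range(n):
        s = sum(F[j][a].conjugate() * F[j][b] for j in range(n))
        expected = n if a == b else 0
        assert abs(s - expected) < 1e-9
```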
Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Kirchhoff's Laws.
Current law: net current (in minus out) is zero at each node. Voltage law: potential differences (voltage drops) add to zero around any closed loop.
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
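The linearity requirement T(cv + dw) = cT(v) + dT(w) is easy to spot-check for a matrix map T(v) = Av; a sketch with illustrative values:

```python
# Linearity check for T(v) = A v on R^2, with a hypothetical matrix A.
A = [[2, 0], [1, 3]]

def T(v):
    return [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]

c, d = 2.0, -1.0
v, w = [1.0, 2.0], [3.0, -4.0]

# T(c v + d w) should equal c T(v) + d T(w)
lhs = T([c * vi + d * wi for vi, wi in zip(v, w)])
rhs = [c * a + d * b for a, b in zip(T(v), T(w))]
assert lhs == rhs
```

A single pair (v, w) is only a spot check, of course; a proof needs the argument for all vectors and scalars.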
Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.
Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.
Rank r(A)
= number of pivots = dimension of column space = dimension of row space.
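Rank = number of pivots suggests computing it by elimination: run Gaussian elimination and count the pivots found. A small pure-Python sketch:

```python
def rank(A, tol=1e-12):
    # Count pivots found during Gaussian elimination.
    M = [row[:] for row in A]          # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        # Find a pivot at or below row r in column c
        piv = next((i for i in range(r, rows) if abs(M[i][c]) > tol), None)
        if piv is None:
            continue                    # no pivot in this column
        M[r], M[piv] = M[piv], M[r]     # row exchange
        for i in range(r + 1, rows):    # eliminate below the pivot
            f = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

assert rank([[1, 2], [2, 4]]) == 1   # row 2 = 2 * row 1
assert rank([[1, 0], [0, 1]]) == 2
```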
Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.
Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms ||A + B|| ≤ ||A|| + ||B||.
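A quick numerical check of the vector triangle inequality, using the Euclidean norm and illustrative vectors:

```python
import math

def norm(u):
    # Euclidean length ||u|| = sqrt(u_1^2 + ... + u_n^2)
    return math.sqrt(sum(x * x for x in u))

u = [3.0, 4.0]
v = [-1.0, 2.0]
s = [a + b for a, b in zip(u, v)]

# ||u + v|| <= ||u|| + ||v||
assert norm(s) <= norm(u) + norm(v)
```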