 6.6.1: Find the least squares approximation of f(x) = 1 + x over the inter...
6.6.2: Find the least squares approximation of f(x) = x^2 over the interval...
 6.6.3: (a) Find the least squares approximation of x over the interval [0,...
6.6.4: (a) Find the least squares approximation of e^x over the interval [0...
 6.6.5: (a) Find the least squares approximation of sin x over the interval...
6.6.6: Use the Gram-Schmidt process to obtain the orthonormal basis (5) fro...
 6.6.7: Carry out the integrations indicated in Formulas (9a), (9b), and (9c)
6.6.8: Find the Fourier series of f(x) = x over the interval [0, 2π].
6.6.9: Find the Fourier series of f(x) = 1, 0 ≤ x ≤ 2π over the interval [0, 2π].
 6.6.10: What is the Fourier series of sin(3x)?
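Each of these exercises reduces to computing inner products of functions, i.e. integrals. As a minimal numerical sketch (assuming the standard interval [0, 2π] and taking f(x) = x from 6.6.8, whose known series is π - 2Σ sin(kx)/k), in Python:

    import numpy as np

    # Midpoint-rule approximation of the Fourier coefficients of f(x) = x on
    # [0, 2*pi]: a_k = (1/pi) * integral of f(x)cos(kx), b_k likewise with sin.
    n = 20000
    dx = 2 * np.pi / n
    x = (np.arange(n) + 0.5) * dx          # midpoints of n subintervals
    f = x
    a0 = np.sum(f) * dx / np.pi            # ~ 2*pi, so the mean term a0/2 is pi
    for k in range(1, 4):
        ak = np.sum(f * np.cos(k * x)) * dx / np.pi
        bk = np.sum(f * np.sin(k * x)) * dx / np.pi
        print(k, round(ak, 4), round(bk, 4))   # expect ak ~ 0 and bk ~ -2/k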
Solutions for Chapter 6.6: Function Approximation; Fourier Series
Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version, 11th Edition
ISBN: 9781118474228
Since 10 problems in Chapter 6.6: Function Approximation; Fourier Series have been answered, more than 15,124 students have viewed full step-by-step solutions from this chapter. This expansive textbook survival guide covers the following chapters and their solutions. This textbook survival guide was created for the textbook: Elementary Linear Algebra, Binder Ready Version: Applications Version, edition: 11. Elementary Linear Algebra, Binder Ready Version: Applications Version was written by Howard Anton and Chris Rorres and is associated with the ISBN: 9781118474228. Chapter 6.6: Function Approximation; Fourier Series includes 10 full step-by-step solutions.

Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
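A minimal sketch of this data structure in Python (the 3-node edge list is made up for illustration):

    import numpy as np

    edges = [(0, 1), (1, 2), (2, 0)]   # hypothetical undirected triangle graph
    n = 3
    A = np.zeros((n, n), dtype=int)
    for i, j in edges:
        A[i, j] = 1
        A[j, i] = 1                    # undirected: the edge goes both ways
    assert (A == A.T).all()            # so A = A^T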

Diagonalization
Λ = S^-1 A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
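A quick numerical check of the definition (the matrix is made up for illustration; its eigenvalues are 5 and 2):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    eigvals, S = np.linalg.eig(A)       # columns of S are eigenvectors
    Lam = np.diag(eigvals)              # Lambda = eigenvalue matrix
    assert np.allclose(S @ Lam @ np.linalg.inv(S), A)
    # Powers come for free: A^3 = S Lambda^3 S^-1
    assert np.allclose(S @ Lam**3 @ np.linalg.inv(S),
                       np.linalg.matrix_power(A, 3))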

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
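SciPy can reduce any square matrix to this form by orthogonal similarity; a sketch (the random matrix is only for illustration):

    import numpy as np
    from scipy.linalg import hessenberg

    A = np.random.default_rng(0).standard_normal((5, 5))
    H, Q = hessenberg(A, calc_q=True)      # A = Q H Q^T, H upper Hessenberg
    assert np.allclose(Q @ H @ Q.T, A)
    assert np.allclose(np.tril(H, -2), 0)  # zeros below the first subdiagonal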

Independent vectors v_1, ..., v_k.
No combination c_1 v_1 + ... + c_k v_k = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
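In floating point, independence is usually tested through the rank; a sketch with a deliberately dependent set:

    import numpy as np

    V = np.array([[1.0, 2.0, 3.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])     # columns v1, v2, v3 with v3 = v1 + v2
    print(np.linalg.matrix_rank(V))     # 2 < 3, so the columns are dependent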

Jordan form J = M^-1 A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.
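SymPy can compute the Jordan form exactly; a sketch on a 2 by 2 matrix with a repeated eigenvalue and only one eigenvector:

    from sympy import Matrix

    A = Matrix([[5, 1],
                [0, 5]])
    M, J = A.jordan_form()   # A = M J M^-1
    print(J)                 # Matrix([[5, 1], [0, 5]]): a single 2x2 block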

Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
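A numerical check (the tall matrix is made up for illustration):

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])                 # full column rank, n = 2
    A_plus = np.linalg.inv(A.T @ A) @ A.T      # (A^T A)^-1 A^T
    assert np.allclose(A_plus @ A, np.eye(2))  # A^+ A = I_n (A A^+ is not I_m)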

Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).
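A tiny illustration of the distinction: for A = 2I the characteristic polynomial is (λ - 2)^2, but the minimal polynomial is already λ - 2:

    import numpy as np

    A = 2 * np.eye(2)                          # repeated eigenvalue 2
    # m(A) = A - 2I is the zero matrix, so m has degree 1 while p has degree 2
    print(np.allclose(A - 2 * np.eye(2), 0))   # True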

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n, then Q^T = Q^-1 and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
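In practice an orthonormal basis is produced by a QR factorization (NumPy uses Householder reflections, a numerically stable stand-in for the Gram-Schmidt process of 6.6.6); a sketch:

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])
    Q, R = np.linalg.qr(A)                  # columns of Q are orthonormal
    assert np.allclose(Q.T @ Q, np.eye(2))  # Q^T Q = I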

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A (A^T A)^-1 A^T.
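This formula is the heart of least squares approximation; a numerical sketch (the matrix and b are made up for illustration):

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])                 # columns: a basis for S
    P = A @ np.linalg.inv(A.T @ A) @ A.T       # P = A (A^T A)^-1 A^T
    b = np.array([1.0, 2.0, 7.0])
    p = P @ b                                  # closest point to b in S
    assert np.allclose(P @ P, P) and np.allclose(P, P.T)
    assert np.allclose(A.T @ (b - p), 0)       # error is perpendicular to S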

Pseudoinverse A^+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).
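NumPy computes A^+ from the SVD; a sketch with a rank-1 matrix:

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])                 # rank 1
    A_plus = np.linalg.pinv(A)
    assert np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A)
    P_col = A @ A_plus                         # projects onto the column space
    assert np.allclose(P_col @ P_col, P_col)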

Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
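SymPy computes R exactly, along with the pivot columns; a sketch:

    from sympy import Matrix

    A = Matrix([[1, 2, 1],
                [2, 4, 0]])
    R, pivot_cols = A.rref()
    print(R)            # Matrix([[1, 2, 0], [0, 0, 1]])
    print(pivot_cols)   # (0, 2)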

Rotation matrix
R = [c -s; s c] rotates the plane by θ, and R^-1 = R^T rotates back by -θ. Eigenvalues are e^(iθ) and e^(-iθ); eigenvectors are (1, ±i). Here c, s = cos θ, sin θ.
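A numerical check of these properties (the angle is arbitrary):

    import numpy as np

    theta = 0.7
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])
    assert np.allclose(np.linalg.inv(R), R.T)   # R^-1 = R^T
    eig = np.sort_complex(np.linalg.eigvals(R))
    expected = np.sort_complex(np.array([np.exp(1j * theta),
                                         np.exp(-1j * theta)]))
    assert np.allclose(eig, expected)           # eigenvalues e^(±i*theta)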

Singular Value Decomposition (SVD)
A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
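A sketch with NumPy (the matrix is made up for illustration):

    import numpy as np

    A = np.array([[3.0, 0.0],
                  [4.0, 5.0]])
    U, sigma, Vt = np.linalg.svd(A)             # A = U Sigma V^T
    assert np.allclose(U @ np.diag(sigma) @ Vt, A)
    assert np.allclose(A @ Vt[0], sigma[0] * U[:, 0])   # A v_1 = sigma_1 u_1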

Spectral Theorem A = QΛQ^T.
Real symmetric A has real λ's and orthonormal q's.
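NumPy's eigh is built for exactly this case; a sketch:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])                     # real symmetric
    lam, Q = np.linalg.eigh(A)                     # real eigenvalues, orthonormal Q
    assert np.allclose(Q @ np.diag(lam) @ Q.T, A)  # A = Q Lambda Q^T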

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D.
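A sketch of this sign agreement (the law of inertia) using SciPy's ldl; for this small indefinite matrix, D comes out diagonal:

    import numpy as np
    from scipy.linalg import ldl

    A = np.array([[2.0, 1.0],
                  [1.0, -3.0]])                # symmetric and indefinite
    L, D, perm = ldl(A)
    d_signs = sorted(np.sign(np.diag(D)))
    eig_signs = sorted(np.sign(np.linalg.eigvalsh(A)))
    print(d_signs, eig_signs)                  # the same: one +, one -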

Tridiagonal matrix T: t_ij = 0 if |i - j| > 1.
T^-1 has rank 1 above and below the diagonal.

Vector v in R^n.
Sequence of n real numbers v = (v_1, ..., v_n) = a point in R^n.