8.5.1: Integrate the trig identity 2 cos jx cos kx = cos(j + k)x + cos(j ...
8.5.2: Show that 1, x, and x^2 − 1/3 are orthogonal, when the integration is...
8.5.3: Find a vector (w1, w2, w3, ...) that is orthogonal to v = (1, 1/2, 1/4, ...
8.5.4: The first three Legendre polynomials are 1, x, and x^2 − 1/3. Choose c...
8.5.5: For the square wave f(x) in Example 3, show that the integral from 0 to 2π of f(x) cos x ...
8.5.6: The square wave has ||f||^2 = 2π. Then (6) gives what remarkable sum...
 8.5.7: Graph the square wave. Then graph by hand the sum of two sine terms...
8.5.8: Find the lengths of these vectors in Hilbert space: (a) v = (...
 8.5.9: Compute the Fourier coefficients ak and bk for f(x) defined from 0 ...
8.5.10: When f(x) has period 2π, why is its integral from −π to π the same ...
 8.5.11: From trig identities find the only two terms in the Fourier series ...
 8.5.12: The functions 1, cos x, sin x, cos 2x, sin 2x, ... are a basis for ...
8.5.13: Find the Fourier coefficients ak and bk of the square pulse F(x) centered...
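Several of the problems above (8.5.5, 8.5.9, 8.5.13) ask for Fourier coefficients a_k and b_k. A minimal numerical sketch, assuming the square wave of Example 3 is f(x) = 1 on (0, π) and −1 on (π, 2π), whose sine series is (4/π)(sin x + sin 3x/3 + sin 5x/5 + ...):

```python
import numpy as np

# Riemann-sum estimate of the Fourier coefficients over one period (0, 2*pi).
# The square wave below is an assumption matching Example 3's sine series.
N = 200000
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
f = np.where(x < np.pi, 1.0, -1.0)
dx = 2 * np.pi / N

def a(k):
    # cosine coefficient a_k = (1/pi) * integral of f(x) cos(kx) dx
    return np.sum(f * np.cos(k * x)) * dx / np.pi

def b(k):
    # sine coefficient b_k = (1/pi) * integral of f(x) sin(kx) dx
    return np.sum(f * np.sin(k * x)) * dx / np.pi

print(b(1), b(3), a(2))   # approximately 4/pi, 4/(3*pi), 0
```

All cosine coefficients vanish because this square wave is odd; only the odd sine terms survive, as problem 8.5.5 suggests.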
Solutions for Chapter 8.5: Fourier Series: Linear Algebra for Functions
Full solutions for Introduction to Linear Algebra  4th Edition
ISBN: 9780980232714
Chapter 8.5: Fourier Series: Linear Algebra for Functions includes 13 full step-by-step solutions.

Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in nullspace).
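A minimal sketch of the complete solution, using a small rank-deficient matrix chosen for illustration: any multiple of a nullspace vector can be added to a particular solution without changing Ax.

```python
import numpy as np

# Complete solution x = x_p + x_n to Ax = b (example matrix is an assumption).
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])            # rank 1, so the nullspace is nontrivial
b = np.array([6., 12.])                 # b lies in the column space of A

xp, *_ = np.linalg.lstsq(A, b, rcond=None)   # one particular solution
xn = np.array([2., -1., 0.])                 # A @ xn = 0: xn is in the nullspace

x = xp + 5 * xn                         # still solves Ax = b for any multiple
assert np.allclose(A @ xn, 0)
assert np.allclose(A @ x, b)
```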

Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

Diagonalization Λ = S^-1 A S.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
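A quick numerical check of the diagonalization and the power formula, on a small example matrix of my choosing:

```python
import numpy as np

# Diagonalization: Lambda = S^{-1} A S, and powers A^k = S Lambda^k S^{-1}.
A = np.array([[4., 1.],
              [2., 3.]])               # eigenvalues 5 and 2 (distinct)
lam, S = np.linalg.eig(A)              # eigenvalues and eigenvector matrix S
Lam = np.diag(lam)

# S^{-1} A S recovers the eigenvalue matrix Lambda
assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)

# A^5 via the diagonalization equals direct matrix powering
A5 = S @ np.diag(lam ** 5) @ np.linalg.inv(S)
assert np.allclose(A5, np.linalg.matrix_power(A, 5))
```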

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers l_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
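Elimination without row exchanges can be sketched directly: each multiplier l_ij is stored in L, and the eliminated matrix is U. The example matrix is an assumption chosen so no exchanges are needed.

```python
import numpy as np

# A = LU by elimination: subtract l_ij times row j from row i, store l_ij in L.
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
n = len(A)
U = A.copy()
L = np.eye(n)
for j in range(n):                     # eliminate entries below pivot U[j, j]
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]    # the multiplier l_ij
        U[i] -= L[i, j] * U[j]

assert np.allclose(L @ U, A)           # elimination factored A into L times U
assert np.allclose(U, np.triu(U))      # U is upper triangular
```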

Ellipse (or ellipsoid) x^T A x = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^-1 y||^2 = y^T (A A^T)^-1 y = 1 displayed by eigshow; axis lengths σ_i.)

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy F_n = F_{n−1} + F_{n−2} = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
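A short check that the growth rate is the golden ratio, and that powers of the Fibonacci matrix generate the numbers themselves:

```python
import numpy as np

# Largest eigenvalue of [[1, 1], [1, 0]] is the golden ratio (1 + sqrt(5)) / 2.
F = np.array([[1., 1.],
              [1., 0.]])
lam = np.linalg.eigvals(F)
golden = (1 + np.sqrt(5)) / 2
assert np.isclose(max(lam), golden)

# Powers generate the sequence: F^n = [[F_{n+1}, F_n], [F_n, F_{n-1}]].
Fn = np.linalg.matrix_power(F.astype(int), 10)
assert Fn[0, 1] == 55                  # F_10 = 55
```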

Fourier matrix F.
Entries F_jk = e^{2πijk/n} give orthogonal columns, so F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ c_k e^{2πijk/n}.
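The orthogonality of the columns and the link to the inverse DFT can be verified numerically (the factor n accounts for NumPy's 1/n normalization in `ifft`, an assumption about conventions worth checking against your FFT library):

```python
import numpy as np

# Fourier matrix F_jk = exp(2*pi*i*j*k/n): columns are orthogonal of length sqrt(n).
n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
F = np.exp(2j * np.pi * j * k / n)

# Conjugate-transpose times F gives n * I (orthogonal columns)
assert np.allclose(F.conj().T @ F, n * np.eye(n))

# y = F c matches NumPy's inverse DFT up to its 1/n normalization
c = np.random.default_rng(0).standard_normal(n)
y = F @ c
assert np.allclose(y, np.fft.ifft(c) * n)
```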

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
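A sketch of Gauss-Jordan on the augmented block [A I], assuming SymPy is available for exact row reduction:

```python
import sympy as sp

# Row-reduce [A | I]; the right block of the result is A^{-1}.
A = sp.Matrix([[2, 1],
               [5, 3]])                # det = 1, so A is invertible
aug = A.row_join(sp.eye(2))            # the block matrix [A | I]
R, _ = aug.rref()                      # reduced row echelon form -> [I | A^{-1}]
Ainv = R[:, 2:]                        # read off the right block

assert Ainv == A.inv()
assert A * Ainv == sp.eye(2)
```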

Hermitian matrix A^H = Ā^T = A.
Complex analog a_ji = ā_ij of a symmetric matrix.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Matrix multiplication AB.
The i, j entry of AB is (row i of A) · (column j of B) = Σ a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
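The "columns times rows" view can be checked directly: AB is a sum of rank-one outer products. The matrices here are random examples of my choosing.

```python
import numpy as np

# AB as a sum of rank-one matrices: (column k of A) times (row k of B).
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(4))
assert np.allclose(outer_sum, A @ B)
```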

Normal matrix.
If N N̄^T = N̄^T N, then N has orthonormal (complex) eigenvectors.

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
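A rotation matrix makes a concrete check of each property (the angle 0.7 is an arbitrary choice):

```python
import numpy as np

# Rotation by theta: an orthogonal matrix Q with Q^T Q = I.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))                        # Q^T = Q^{-1}
x = np.array([3., 4.])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))   # ||Qx|| = ||x||
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)         # all |lambda| = 1
```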

Outer product uv^T
= column times row = rank-one matrix.

Pascal matrix P_S.
P_S = pascal(n) = the symmetric matrix with binomial entries C(i+j−2, i−1). P_S = P_L P_U all contain Pascal's triangle with det = 1 (see Pascal in the index).
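A minimal sketch building the symmetric Pascal matrix and its lower-triangular factor in pure Python (0-based indices, so the entry is C(i + j, i)); the factorization P_S = P_L P_L^T and det = 1 follow:

```python
import numpy as np
from math import comb

# Symmetric Pascal matrix and its triangular factor (0-based indexing here).
n = 4
PS = np.array([[comb(i + j, i) for j in range(n)] for i in range(n)])
PL = np.array([[comb(i, j) for j in range(n)] for i in range(n)])  # lower triangular

assert np.allclose(PL @ PL.T, PS)      # P_S = P_L times its transpose (= P_U)
assert round(np.linalg.det(PS.astype(float))) == 1
```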

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
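Enumerating all n! permutation matrices for small n confirms det P = ±1 and the row-reordering action:

```python
import numpy as np
from itertools import permutations

# Every permutation matrix is rows of I in some order, with det = +1 or -1.
n = 3
I = np.eye(n)
for order in permutations(range(n)):
    P = I[list(order)]                 # rows of I in this order
    assert round(np.linalg.det(P)) in (1, -1)

# P A puts the rows of A in the same order
A = np.arange(9.).reshape(3, 3)
P = I[[2, 0, 1]]
assert np.allclose(P @ A, A[[2, 0, 1]])
```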

Right inverse A^+.
If A has full row rank m, then A^+ = A^T (A A^T)^-1 has A A^+ = I_m.
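A quick check with a full-row-rank example: the formula gives a right inverse, and (for full row rank) it agrees with the pseudoinverse.

```python
import numpy as np

# Right inverse A+ = A^T (A A^T)^{-1} for a matrix with full row rank m = 2.
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])
Aplus = A.T @ np.linalg.inv(A @ A.T)

assert np.allclose(A @ Aplus, np.eye(2))       # A A+ = I_m
assert np.allclose(Aplus, np.linalg.pinv(A))   # matches the pseudoinverse here
```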

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.