 10.3.1: Multiply the three matrices in equation (3) and compare with F. In ...
 10.3.2: Invert the three factors in equation (3) to find a fast factorizati...
 10.3.3: F is symmetric. So transpose equation (3) to find a new Fast Fourie...
 10.3.4: All entries in the factorization of F6 involve powers of w6 = sixth...
 10.3.5: If v = (1, 0, 0, 0) and w = (1, 1, 1, 1), show that Fv = w and Fw = 4v. Ther...
 10.3.6: What is F^2 and what is F^4 for the 4 by 4 Fourier matrix?
 10.3.7: Put the vector c = (1, 0, 1, 0) through the three steps of the FFT ...
 10.3.8: Compute y = F8 c by the three FFT steps for c = (1, 0, 1, 0, 1, 0, 1, 0). R...
 10.3.9: If w = e^(2πi/64) then w^2 and √w are among the __ and __ roots of 1.
 10.3.10: (a) Draw all the sixth roots of 1 on the unit circle. Prove they ad...
 10.3.11: The columns of the Fourier matrix F are the eigenvectors of the cyc...
 10.3.12: The equation det(P − λI) = 0 is λ^4 = 1. This shows again that the ...
 10.3.13: (a) Two eigenvectors of C are (1, 1, 1, 1) and (1, i, i^2, i^3). F...
 10.3.14: Find the eigenvalues of the "periodic" −1, 2, −1 matrix from C = 2I − ...
 10.3.15: To multiply C times a vector x, we can multiply F(Λ(F^(-1)x)) inst...
 10.3.16: Why is row i of F̄ the same as row N − i of F (rows numbered 0 to N − 1)?
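Problems 10.3.7 and 10.3.8 put vectors through the three FFT steps. A minimal sketch of that even/odd recursion, using Strang's convention w = e^(2πi/n) for the entries F_jk = w^(jk); the function name fft_recursive is my own:

```python
import numpy as np

def fft_recursive(c):
    """One step of the FFT: split c into even/odd halves, recurse, recombine.

    Mirrors the factorization of F_n into [I D; I -D], two copies of
    F_{n/2}, and the even-odd permutation, with w = e^{2*pi*i/n}.
    """
    c = np.asarray(c, dtype=complex)
    n = len(c)
    if n == 1:
        return c
    even = fft_recursive(c[0::2])        # F_{n/2} acting on even-indexed entries
    odd = fft_recursive(c[1::2])         # F_{n/2} acting on odd-indexed entries
    w = np.exp(2j * np.pi / n)           # primitive n-th root of 1
    d = w ** np.arange(n // 2)           # diagonal D = diag(1, w, ..., w^{n/2-1})
    return np.concatenate([even + d * odd, even - d * odd])

y = fft_recursive([1, 0, 1, 0])          # c = (1, 0, 1, 0) as in Problem 10.3.7
```

Each level does n/2 multiplications by powers of w, which is where the n log n operation count comes from.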
Solutions for Chapter 10.3: The Fast Fourier Transform
Full solutions for Introduction to Linear Algebra  4th Edition
ISBN: 9780980232714

Back substitution.
Upper triangular systems are solved in reverse order, x_n back to x_1.
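A short sketch of that reverse-order solve (the helper name back_substitute is my own):

```python
import numpy as np

def back_substitute(U, b):
    """Solve Ux = b for upper triangular U, from the last equation upward."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                 # x_n first, x_1 last
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

U = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = back_substitute(U, np.array([5.0, 6.0]))       # x2 = 6/3, then x1 = (5 - x2)/2
```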

Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
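A small numerical sketch of that definition, with rows as samples (the data here is made up for illustration):

```python
import numpy as np

# Each row is one sample of (x1, x2); each column is one random variable.
X = np.array([[1.0, 2.0], [3.0, 0.0], [2.0, 4.0], [2.0, 2.0]])
Xc = X - X.mean(axis=0)                # subtract the mean of each variable
Sigma = Xc.T @ Xc / len(X)             # Sigma = mean of (x - xbar)(x - xbar)^T
eigs = np.linalg.eigvalsh(Sigma)       # positive semidefinite: no negative eigenvalues
```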

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).

Exponential e^(At) = I + At + (At)^2/2! + ...
has derivative Ae^(At); e^(At) u(0) solves u' = Au.
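That series can be summed directly for small matrices; a sketch (the helper name expm_series is my own, and a nilpotent A is chosen so the series stops early):

```python
import numpy as np

def expm_series(A, terms=20):
    """e^A from the series I + A + A^2/2! + ...; fine when ||A|| is modest."""
    result = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k            # next term (A^k / k!) built incrementally
        result = result + term
    return result

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])             # A^2 = 0, so e^{At} = I + At exactly
t = 3.0
eAt = expm_series(A * t)
u = eAt @ np.array([2.0, 5.0])         # e^{At} u(0) solves u' = Au
```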

Free variable Xi.
Column i has no pivot in elimination. We can give the n − r free variables any values, then Ax = b determines the r pivot variables (if solvable!).

GaussJordan method.
Invert A by row operations on [A I] to reach [I A^(-1)].
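A minimal sketch of those row operations (assumes nonzero pivots, so no row exchanges; the function name gauss_jordan_inverse is my own):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce the block matrix [A I] until the left block is I;
    the right block is then A^{-1}."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])   # form [A I]
    for j in range(n):
        M[j] = M[j] / M[j, j]                     # scale pivot row to make pivot 1
        for i in range(n):
            if i != j:
                M[i] = M[i] - M[i, j] * M[j]      # clear the rest of column j
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
Ainv = gauss_jordan_inverse(A)
```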

Inverse matrix AI.
Square matrix with A^(-1) A = I and A A^(-1) = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^(-1) A^(-1) and (A^(-1))^T. Cofactor formula: (A^(-1))_ij = C_ji / det A.
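A quick numerical check of the two inverse rules, on matrices chosen only for illustration:

```python
import numpy as np

A = np.array([[2.0, 0.0], [1.0, 1.0]])
B = np.array([[1.0, 1.0], [0.0, 3.0]])

AB_inv = np.linalg.inv(A @ B)
product_rule = np.allclose(AB_inv, np.linalg.inv(B) @ np.linalg.inv(A))   # (AB)^{-1} = B^{-1} A^{-1}
transpose_rule = np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)      # (A^T)^{-1} = (A^{-1})^T
```

Note the order reversal in (AB)^(-1): B acts first in ABx, so B^(-1) must be undone last.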

Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady-state eigenvector Ms = s > 0.
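A sketch of that convergence by repeated multiplication (the 2 by 2 matrix here is an arbitrary example):

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])            # columns sum to 1, all entries > 0

x = np.array([1.0, 0.0])              # any starting distribution
for _ in range(50):
    x = M @ x                         # powers of M push x toward the steady state
steady = x                            # eigenvector for lambda = 1, scaled to sum 1
```

Convergence is geometric: the error shrinks by the second-largest eigenvalue (here 0.5) at every step.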

Matrix multiplication AB.
The i, j entry of AB is (row i of A) · (column j of B) = Σ a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
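All four descriptions can be checked side by side on one small example:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# 1. Entry rule: (AB)_ij = sum_k a_ik b_kj
entries = np.array([[sum(A[i, k] * B[k, j] for k in range(2))
                     for j in range(2)] for i in range(2)])
# 2. By columns: column j of AB is A times column j of B
by_columns = np.column_stack([A @ B[:, j] for j in range(2)])
# 3. By rows: row i of AB is row i of A times B
by_rows = np.vstack([A[i, :] @ B for i in range(2)])
# 4. Columns times rows: sum of rank-one pieces (column k of A)(row k of B)
cols_times_rows = sum(np.outer(A[:, k], B[k, :]) for k in range(2))
```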

Multiplication Ax
= x_1 (column 1) + ... + x_n (column n) = combination of columns.

Network.
A directed graph that has constants c_1, ..., c_m associated with the edges.

Nullspace N (A)
= all solutions to Ax = 0. Dimension n − r = (# columns) − rank.
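That dimension count can be seen on a rank-1 example (matrix chosen for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # second row is twice the first: rank 1
rank = np.linalg.matrix_rank(A)
null_dim = A.shape[1] - rank           # n - r free directions in the nullspace

x = np.array([3.0, 0.0, -1.0])         # one nullspace vector: 1*3 + 3*(-1) = 0
```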

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
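A rotation matrix makes both properties easy to check numerically:

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation: orthonormal columns

x = np.array([3.0, 4.0])                          # a vector of length 5
qt_is_inverse = np.allclose(Q.T @ Q, np.eye(2))   # Q^T = Q^{-1}
length_kept = np.isclose(np.linalg.norm(Q @ x), 5.0)   # ||Qx|| = ||x||
```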

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.

Pascal matrix
PS = pascal(n) = the symmetric matrix with binomial entries C(i+j−2, i−1). PS = PL PU; all three contain Pascal's triangle, with det = 1 (see Pascal in the index).
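A sketch that builds PS and PL directly from binomial coefficients (0-indexed here, so the entries read C(i+j, i)) and checks both claims, using PU = PL^T:

```python
import numpy as np
from math import comb

n = 4
PS = np.array([[comb(i + j, i) for j in range(n)]
               for i in range(n)], dtype=float)   # symmetric Pascal matrix
PL = np.array([[comb(i, j) if j <= i else 0 for j in range(n)]
               for i in range(n)], dtype=float)   # lower triangular Pascal matrix

det_PS = round(np.linalg.det(PS))                 # determinant is exactly 1
factors_match = np.allclose(PL @ PL.T, PS)        # PS = PL PU with PU = PL^T
```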

Pseudoinverse A+ (MoorePenrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(AT). A+ A and AA+ are the projection matrices onto the row space and column space. Rank(A +) = rank(A).
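Both projection facts can be checked with numpy's pinv on a simple rank-1 matrix:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])            # rank 1, so A has no ordinary inverse
A_plus = np.linalg.pinv(A)

row_proj = A_plus @ A                 # projects onto the row space of A
col_proj = A @ A_plus                 # projects onto the column space of A
```

Projections are symmetric and idempotent, which is what the asserts below confirm.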

Row space C (AT) = all combinations of rows of A.
Column vectors by convention.

Singular matrix A.
A square matrix that has no inverse: det(A) = 0.