- 10.18.1: Find the trigonometric polynomial of order 3 that is the least squa...
- 10.18.2: Find the trigonometric polynomial of order 4 that is the least squa...
- 10.18.3: Find the trigonometric polynomial of order 4 that is the least squa...
- 10.18.4: Find the trigonometric polynomial of arbitrary order n that is the ...
- 10.18.5: Find the trigonometric polynomial of arbitrary order n that is the ...
- 10.18.6: For the inner product ⟨u, v⟩ = ∫₀^{2π} u(t)v(t) dt, show that (a) ‖1‖ = √(2π), (b...
- 10.18.7: Show that the 2n + 1 functions 1, cost, cos 2t,... , cos nt,sin t,s...
- 10.18.8: If f (t) is defined and continuous on the interval [0, T ], show th...
- 10.18.T1: Let g be the function g(t) = 3 + 4 sin t 5 4 cos t for 0 ≤ t ≤ 2π. Use a ...
- 10.18.T2: Let g be the function g(t) = 3 + 4 sin t 5 4 cos t for 0 ≤ t ≤ 2π. Use a ...
Solutions for Chapter 10.18: A Least Squares Model for Human Hearing
Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version | 11th Edition
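The exercises in this chapter all ask for a trigonometric polynomial of some order n that best fits a function f on [0, 2π] in the least-squares sense. A minimal sketch of that computation, assuming NumPy; the function name `trig_lstsq` and the sample count `m` are illustrative choices, not from the text:

```python
import numpy as np

def trig_lstsq(f, n, m=400):
    """Least-squares trigonometric polynomial of order n for f on [0, 2*pi].

    Fits c0 + sum_k (a_k cos(k t) + b_k sin(k t)) at m equally spaced
    sample points.  Returns coefficients [c0, a_1..a_n, b_1..b_n].
    """
    t = np.linspace(0, 2 * np.pi, m, endpoint=False)
    cols = [np.ones(m)]
    cols += [np.cos(k * t) for k in range(1, n + 1)]
    cols += [np.sin(k * t) for k in range(1, n + 1)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, f(t), rcond=None)
    return coef

# A function that already is a trig polynomial is recovered exactly:
c = trig_lstsq(lambda t: 2 + 3 * np.cos(t) - np.sin(2 * t), n=2)
```

With equally spaced samples the columns are discretely orthogonal, so the coefficients come out exactly for band-limited inputs.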
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1 v1 + ... + cd vd. V has many bases; each basis gives unique c's.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in nullspace).
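A small numeric sketch of the complete solution, assuming NumPy; the particular solution here is taken to be the minimum-norm one from the pseudoinverse, which is one valid choice of x_p among many:

```python
import numpy as np

# One equation, two unknowns: the nullspace of A is a line.
A = np.array([[1.0, 2.0]])
b = np.array([3.0])

x_p = np.linalg.pinv(A) @ b      # a particular solution (minimum norm)
x_n = np.array([-2.0, 1.0])      # spans N(A): A @ x_n = 0

# Every x_p + c * x_n solves A x = b, for any scalar c.
x = x_p + 5 * x_n
```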
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓij (and ℓii = 1) brings U back to A.
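A 2-by-2 sketch of this factorization, assuming NumPy; one elimination step records its multiplier in L, and L @ U reproduces A:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])

# One elimination step: multiplier l21 = 6/2 = 3 clears the (2,1) entry.
l21 = A[1, 0] / A[0, 0]
U = A.copy()
U[1] -= l21 * U[0]               # row 2 becomes [0, 5]
L = np.array([[1.0, 0.0],
              [l21, 1.0]])       # unit diagonal, multiplier below

# L @ U brings U back to A.
```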
Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j − 1) = ∫₀¹ x^{i−1} x^{j−1} dx. Positive definite but with extremely small λ_min and a large condition number: H is ill-conditioned.
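A quick NumPy sketch of the entry formula and the ill-conditioning; `hilb` is my own small re-implementation of MATLAB's built-in:

```python
import numpy as np

def hilb(n):
    """Hilbert matrix H_ij = 1/(i + j - 1), with 1-based i and j."""
    i = np.arange(1, n + 1)
    return 1.0 / (i[:, None] + i[None, :] - 1)

H = hilb(6)
cond = np.linalg.cond(H)   # already huge for n = 6
```

Even at n = 6 the condition number is on the order of 10^7, which is why solving with Hilbert matrices loses so many digits.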
Hypercube matrix pl.
Row n + 1 counts corners, edges, faces, ... of a cube in Rn.
Inverse matrix A^-1.
Square matrix with A^-1 A = I and AA^-1 = I. No inverse if det A = 0 (equivalently, rank(A) < n, or Ax = 0 for a nonzero vector x). The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
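The two inverse rules can be checked numerically, assuming NumPy and any pair of invertible matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 7.0]])   # det = 1, invertible
B = np.array([[2.0, 0.0], [1.0, 1.0]])   # det = 2, invertible

inv = np.linalg.inv
# (AB)^-1 = B^-1 A^-1  (order reverses)
lhs_product = inv(A @ B)
rhs_product = inv(B) @ inv(A)
# (A^T)^-1 = (A^-1)^T  (inverse and transpose commute)
lhs_transpose = inv(A.T)
rhs_transpose = inv(A).T
```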
Kirchhoff's Laws. Current Law: net current (in minus out) is zero at each node. Voltage Law: potential differences (voltage drops) add to zero around any closed loop.
Multiplication Ax = x1 (column 1) + ... + xn (column n) = combination of columns.
Pseudoinverse A^+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and AA^+ are the projection matrices onto the row space and column space. Rank(A^+) = rank(A).
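A sketch of the two projection facts for a rank-1 matrix, assuming NumPy:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])      # rank 1

Ap = np.linalg.pinv(A)
P_col = A @ Ap                  # projection onto the column space of A
P_row = Ap @ A                  # projection onto the row space of A
```

Both products are symmetric and idempotent, the two defining properties of an orthogonal projection.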
Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal for randn.
Rotation matrix R.
R = [c −s; s c] rotates the plane by θ, and R^-1 = R^T rotates back by −θ. Eigenvalues are e^{iθ} and e^{−iθ}; eigenvectors are (1, ±i). Here c, s = cos θ, sin θ.
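These properties are easy to confirm numerically, assuming NumPy and an arbitrary angle:

```python
import numpy as np

theta = 0.3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])          # rotation by theta

# R^T R = I shows R^-1 = R^T; the eigenvalues are e^{i theta}, e^{-i theta},
# so they have absolute value 1 and sum to 2 cos(theta) (the trace).
eigvals = np.linalg.eigvals(R)
```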
Row picture of Ax = b.
Each equation gives a plane in Rn; the planes intersect at x.
Schur complement S = D − CA^-1 B.
Appears in block elimination on [A B; C D].
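A small check of the block-elimination identity det [A B; C D] = det(A) · det(S), assuming NumPy; the particular matrices are arbitrary:

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([[2.0], [0.0]])
C = B.T
D = np.array([[5.0]])

S = D - C @ np.linalg.inv(A) @ B   # Schur complement of the A block
M = np.block([[A, B], [C, D]])     # the full block matrix

# Block elimination gives det(M) = det(A) * det(S).
```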
Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.
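Any R^T R is semidefinite, and a rank-deficient R gives a zero eigenvalue; a NumPy sketch:

```python
import numpy as np

R = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
A = R.T @ R        # 3x3 and rank 2: semidefinite but singular

eigs = np.linalg.eigvalsh(A)   # all >= 0, smallest is 0
```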
Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
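The corner theorem can be illustrated without the pivot rule itself: for a tiny problem, just test every basic feasible solution (corner). This brute-force sketch, assuming NumPy, is not the simplex method's pivoting, only the fact it relies on:

```python
import numpy as np
from itertools import combinations

# Minimize c.x subject to A x = b, x >= 0 (slacks x3, x4 already added).
c = np.array([-3.0, -1.0, 0.0, 0.0])          # i.e. maximize 3 x1 + x2
A = np.array([[1.0, 1.0, 1.0, 0.0],           # x1 + x2 <= 4
              [1.0, 2.0, 0.0, 1.0]])          # x1 + 2 x2 <= 6
b = np.array([4.0, 6.0])

best = None
for cols in combinations(range(4), 2):        # choose 2 basic columns
    B = A[:, cols]
    if abs(np.linalg.det(B)) < 1e-12:
        continue
    xB = np.linalg.solve(B, b)
    if (xB >= -1e-12).all():                  # feasible corner
        x = np.zeros(4)
        x[list(cols)] = xB
        if best is None or c @ x < c @ best:
            best = x
```

The optimum lands at the corner x = (4, 0) of the original variables, with slack left only in the second constraint.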
Solvable system Ax = b.
The right side b is in the column space of A.
Stiffness matrix K. If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax gives the stretching.
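A two-spring sketch of K = A^T C A, assuming NumPy; the particular geometry (a hanging line of two springs, fixed at the top) and the spring constants are illustrative:

```python
import numpy as np

# Node displacements x1, x2; spring stretchings are A @ x.
A = np.array([[ 1.0, 0.0],     # spring 1 stretches by x1
              [-1.0, 1.0]])    # spring 2 stretches by x2 - x1
C = np.diag([10.0, 5.0])       # spring constants (Hooke's law)

K = A.T @ C @ A                # stiffness matrix: symmetric positive definite
f = K @ np.array([0.1, 0.3])   # internal forces for a given movement x
```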