 5.7.1: Use the recursion formulas to calculate (a) T_4, T_5 and (b) H_4, H_5.
 5.7.2: Let p_0(x), p_1(x), and p_2(x) be orthogonal with respect to the inner...
 5.7.3: Show that the Chebyshev polynomials have the following properties: ...
 5.7.4: Find the best quadratic least squares approximation to e^x on [−1, 1]...
 5.7.5: Let p_0, p_1, . . . be a sequence of orthogonal polynomials and let a...
 5.7.6: Let T_n(x) denote the Chebyshev polynomial of degree n, and define U...
 5.7.7: Let U_{n−1}(x) be defined as in Exercise 6 for n ≥ 1, and define U_{−1}(x) = ...
 5.7.8: Show that the U_i's defined in Exercise 6 are orthogonal with respec...
 5.7.9: Verify that the Legendre polynomial P_n(x) satisfies the second-orde...
 5.7.10: Prove each of the following: (a) H′_n(x) = 2nH_{n−1}(x), n = 0, 1, . . ...
 5.7.11: Given a function f(x) that passes through the points (1, 2), (2,1)...
 5.7.12: Show that if f(x) is a polynomial of degree less than n, then f(x...
 5.7.13: Use the zeros of the Legendre polynomial P_2(x) to obtain a two-poin...
 5.7.14: (a) For what degree polynomials will the quadrature formula in Exer...
 5.7.15: Let x_1, x_2, . . . , x_n be distinct points in the interval [−1, 1] an...
 5.7.16: Let x_1, x_2, . . . , x_n be the roots of the Legendre polynomial P_n. ...
 5.7.17: Let Q_0(x), Q_1(x), . . . be an orthonormal sequence of polynomials; ...
Solutions for Chapter 5.7: Orthogonal Polynomials
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290
Chapter 5.7: Orthogonal Polynomials includes 17 full step-by-step solutions.

Complex conjugate
z̄ = a − ib for any complex number z = a + ib. Then zz̄ = |z|^2.
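A small sketch (the number 3 + 4i is an arbitrary choice, not from the text) checking that z times its conjugate gives |z|^2:

```python
# Verify z * conj(z) = |z|^2 for an arbitrary complex number
z = 3 + 4j
zbar = z.conjugate()                     # a - ib
assert zbar == 3 - 4j
assert (z * zbar).imag == 0.0            # product is real
assert (z * zbar).real == abs(z) ** 2    # equals |z|^2
```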

Condition number
cond(A) = c(A) = ||A|| ||A^{-1}|| = σ_max/σ_min. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to changes in the input.
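A NumPy sketch (the matrix is an arbitrary example) showing that cond(A) in the 2-norm is the ratio of the largest to the smallest singular value:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])                   # arbitrary invertible matrix
sigma = np.linalg.svd(A, compute_uv=False)   # singular values, descending
cond_manual = sigma[0] / sigma[-1]           # sigma_max / sigma_min
# np.linalg.cond uses the 2-norm by default, so the two agree
assert np.isclose(cond_manual, np.linalg.cond(A))
```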

Diagonalization
Λ = S^{-1}AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = SΛ^k S^{-1}.
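A quick NumPy check of the diagonalization (the 2-by-2 matrix is an arbitrary example with two independent eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])              # eigenvalues 5 and 2
lam, S = np.linalg.eig(A)               # Lambda entries and eigenvector matrix S
Lambda = np.diag(lam)
S_inv = np.linalg.inv(S)
# A = S Lambda S^{-1}, and powers follow: A^3 = S Lambda^3 S^{-1}
assert np.allclose(A, S @ Lambda @ S_inv)
assert np.allclose(np.linalg.matrix_power(A, 3),
                   S @ np.linalg.matrix_power(Lambda, 3) @ S_inv)
```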

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

Jordan form J = M^{-1}AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

Left inverse A+.
If A has full column rank n, then A^+ = (A^T A)^{-1} A^T has A^+ A = I_n.
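A NumPy sketch (the tall matrix is an arbitrary full-column-rank example) verifying that (A^T A)^{-1} A^T is a left inverse but not a right inverse:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])               # 3 by 2, full column rank n = 2
A_plus = np.linalg.inv(A.T @ A) @ A.T    # (A^T A)^{-1} A^T
assert np.allclose(A_plus @ A, np.eye(2))     # A+ A = I_n
# A A+ only projects onto the column space; it is not I_m:
assert not np.allclose(A @ A_plus, np.eye(3))
```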

Multiplication Ax
= x_1(column 1) + ... + x_n(column n) = combination of columns.
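A small NumPy check (arbitrary example matrix and vector) that Ax really is the combination of the columns of A with weights from x:

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])
x = np.array([10.0, 100.0])
combo = x[0] * A[:, 0] + x[1] * A[:, 1]   # x1*(column 1) + x2*(column 2)
assert np.allclose(A @ x, combo)          # both give [410, 520, 630]
```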

Multiplicities AM and G M.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
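The classic 2-by-2 Jordan block illustrates AM > GM; a sketch (this particular matrix is my example, not from the text):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])            # lambda = 1 is a double root: AM = 2
lam = np.linalg.eigvals(A)
assert np.allclose(lam, [1.0, 1.0])
# GM = dim of nullspace of A - I = n - rank(A - I)
gm = 2 - np.linalg.matrix_rank(A - np.eye(2))
assert gm == 1                        # GM < AM, so A is not diagonalizable
```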

Norm
||A||. The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
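A NumPy sketch (arbitrary example matrix) computing the four matrix norms in the definition and checking them against their formulas:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [0.0,  3.0]])
# l2 norm = largest singular value
assert np.isclose(np.linalg.norm(A, 2),
                  np.linalg.svd(A, compute_uv=False)[0])
# Frobenius norm squared = sum of squares of entries (1 + 4 + 9 = 14)
assert np.isclose(np.linalg.norm(A, 'fro') ** 2, (A ** 2).sum())
# l1 norm = largest column sum of |a_ij|; l-infinity = largest row sum
assert np.isclose(np.linalg.norm(A, 1), 5.0)
assert np.isclose(np.linalg.norm(A, np.inf), 3.0)
```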

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^{-1}. Preserves lengths and angles: ||Qx|| = ||x|| and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
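A sketch using a 2-by-2 rotation (the angle and test vector are arbitrary choices) to verify the listed properties:

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])     # rotation matrix
assert np.allclose(Q.T, np.linalg.inv(Q))           # Q^T = Q^{-1}
x = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # length preserved
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)        # all |lambda| = 1
```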

Outer product uv T
= column times row = rank-one matrix.
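A short NumPy check (arbitrary vectors) that the outer product has rank one and entries u_i v_j:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])
M = np.outer(u, v)                       # column times row: 3 by 2
assert M.shape == (3, 2)
assert np.linalg.matrix_rank(M) == 1     # rank-one matrix
assert M[2, 1] == u[2] * v[1]            # entry (i, j) is u_i * v_j
```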

Pseudoinverse A+ (MoorePenrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+A and AA^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).
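A NumPy sketch (the rank-deficient matrix is an arbitrary example) checking the projection and rank properties of the pseudoinverse:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 0.0]])               # rank 1: no left or right inverse
A_plus = np.linalg.pinv(A)
P_row = A_plus @ A                       # projection onto the row space
P_col = A @ A_plus                       # projection onto the column space
assert np.allclose(P_row @ P_row, P_row)      # projections are idempotent
assert np.allclose(P_col @ P_col, P_col)
assert np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A) == 1
```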

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.

Schwarz inequality
|v · w| ≤ ||v|| ||w||. Then |v^T Aw|^2 ≤ (v^T Av)(w^T Aw) for positive definite A.
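A numerical spot-check of both inequalities (the vectors and the positive definite tridiagonal matrix are arbitrary choices, not from the text):

```python
import numpy as np

v = np.array([1.0, 2.0, -1.0])
w = np.array([3.0, 0.0, 4.0])
assert abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w)  # |v.w| <= ||v|| ||w||

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])          # positive definite (eigenvalues 2, 2 +/- sqrt 2)
assert np.all(np.linalg.eigvals(A) > 0)
# Generalized Schwarz inequality for pos def A
assert (v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w)
```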

Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D.

Transpose matrix AT.
Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^{-1} are B^T A^T and (A^T)^{-1}.

Unitary matrix U. U^H = Ū^T = U^{-1}.
Orthonormal columns (complex analog of Q).

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c_0 + ... + c_{n−1}x^{n−1} with p(x_i) = b_i. V_ij = (x_i)^{j−1} and det V = product of (x_k − x_i) for k > i.
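A NumPy sketch (sample points and values chosen arbitrarily) building V, solving for the interpolating polynomial, and checking the determinant formula:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 3.0, 6.0])            # target values p(x_i) = b_i
V = np.vander(x, increasing=True)        # V_ij = x_i^(j-1), increasing powers
c = np.linalg.solve(V, b)                # coefficients c_0, ..., c_{n-1}
# The recovered polynomial interpolates the data
assert np.allclose(np.polynomial.polynomial.polyval(x, c), b)
# det V = product of (x_k - x_i) for k > i
det_formula = (x[1] - x[0]) * (x[2] - x[0]) * (x[2] - x[1])
assert np.isclose(np.linalg.det(V), det_formula)   # (1)(2)(1) = 2
```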

Vector addition.
v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of parallelogram.