Solutions for Chapter 15: Special Functions and Eigenfunction Expansions

Full solutions for Advanced Engineering Mathematics | 7th Edition

Textbook: Advanced Engineering Mathematics
Edition: 7
Author: Peter V. O'Neill
ISBN: 9781111427412

Advanced Engineering Mathematics was written by Peter V. O'Neill and is associated with ISBN 9781111427412. This expansive textbook survival guide covers the book's chapters and their solutions. Chapter 15: Special Functions and Eigenfunction Expansions includes 72 full step-by-step solutions, created for the textbook Advanced Engineering Mathematics, edition 7. More than 26,532 students have viewed full step-by-step solutions from this chapter.

Key math terms and definitions covered in this textbook
  • Associative Law (AB)C = A(BC).

    Parentheses can be removed to leave ABC.

  • Circulant matrix C.

    Constant diagonals wrap around as in the cyclic shift S. Every circulant is C = c0 I + c1 S + ... + c_{n-1} S^{n-1}. Cx = convolution c * x. Eigenvectors are in the Fourier matrix F (sketched below).

  • Companion matrix.

    Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2 λ + c3 λ^2 + ... + cn λ^{n-1} - λ^n) (sketched below).

  • Cramer's Rule for Ax = b.

    B_j has b replacing column j of A; x_j = det(B_j) / det(A) (sketched below).

  • Full row rank r = m.

    Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

  • Gram-Schmidt orthogonalization A = QR.

    Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0 (sketched below).

  • Hypercube matrix pl.

    Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

  • Kronecker product (tensor product) A ⊗ B.

    Blocks a_ij B; eigenvalues λ_p(A) λ_q(B) (sketched below).

  • Least squares solution x̂.

    The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A (sketched below).

  • Linearly dependent v1, ..., vn.

    A combination other than all c_i = 0 gives Σ c_i v_i = 0.

  • Normal equation A^T A x̂ = A^T b.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - A x̂) = 0 (see the least squares sketch below).

  • Orthogonal matrix Q.

    Square matrix with orthonormal columns, so Q^T = Q^{-1}. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation (sketched below).

  • Pivot.

    The diagonal entry (first nonzero) at the time when a row is used in elimination.

  • Projection p = a (a^T b / a^T a) onto the line through a.

    P = a a^T / a^T a has rank 1 (sketched below).

  • Pseudoinverse A^+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A) (sketched below).

  • Row space C(A^T) = all combinations of rows of A.

    Column vectors by convention.

  • Schur complement S = D - C A^{-1} B.

    Appears in block elimination on [[A, B], [C, D]] (sketched below).

  • Singular Value Decomposition (SVD).

    A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T); A v_i = σ_i u_i with singular value σ_i > 0. Last columns are orthonormal bases of the nullspaces (sketched below).

  • Skew-symmetric matrix K.

    The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^{Kt} is an orthogonal matrix (sketched below).

  • Spectral Theorem A = Q Λ Q^T.

    Real symmetric A has real λ's and orthonormal q's (sketched below).
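
The sketches below illustrate several of the terms above in Python/numpy; every matrix, vector, and helper name in them is illustrative, not taken from the textbook.

Circulant matrix. A minimal sketch on a 4x4 example: it builds C = c0 I + c1 S + ... + c3 S^3 from the cyclic shift S, checks that Cx is the circular convolution c * x, and reads the eigenvalues off the DFT of c.

    import numpy as np

    c = np.array([2.0, 1.0, 0.0, 3.0])          # first column of C (illustrative)
    n = len(c)
    S = np.roll(np.eye(n), 1, axis=0)           # cyclic shift: (Sx)_i = x_{i-1}
    C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

    x = np.array([1.0, -1.0, 2.0, 0.5])
    # Cx equals the circular convolution c * x, computed here via the FFT
    assert np.allclose(C @ x, np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x))))
    eigvals = np.fft.fft(c)                     # eigenvalues of C are the DFT of c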
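
Companion matrix. A minimal sketch: c1, c2, c3 go in the last row, ones sit just above the diagonal, and the eigenvalues of the result are the roots of the polynomial in the definition. The coefficients are illustrative, chosen so the roots are 1, 2, 3.

    import numpy as np

    c = np.array([6.0, -11.0, 6.0])             # c1, c2, c3 (illustrative)
    n = len(c)
    A = np.diag(np.ones(n - 1), k=1)            # n - 1 ones above the diagonal
    A[-1, :] = c                                # coefficients in row n
    # det(A - λI) = ±(c1 + c2 λ + c3 λ^2 - λ^3), so the eigenvalues are its roots
    print(np.sort(np.linalg.eigvals(A).real))   # -> [1. 2. 3.]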
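
Cramer's Rule. A minimal sketch with an illustrative 3x3 system: each B_j is A with column j replaced by b, and x_j = det(B_j) / det(A).

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    b = np.array([3.0, 5.0, 3.0])

    x = np.empty(len(b))
    for j in range(len(b)):
        Bj = A.copy()
        Bj[:, j] = b                            # B_j: column j of A replaced by b
        x[j] = np.linalg.det(Bj) / np.linalg.det(A)

    assert np.allclose(A @ x, b)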
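
Gram-Schmidt orthogonalization. A minimal sketch of the classical algorithm, assuming A has independent columns; gram_schmidt is a hypothetical helper name.

    import numpy as np

    def gram_schmidt(A):
        m, n = A.shape
        Q = np.zeros((m, n))
        R = np.zeros((n, n))
        for j in range(n):
            v = A[:, j].copy()
            for i in range(j):                  # subtract components along earlier q's
                R[i, j] = Q[:, i] @ A[:, j]
                v -= R[i, j] * Q[:, i]
            R[j, j] = np.linalg.norm(v)         # diag(R) > 0 by convention
            Q[:, j] = v / R[j, j]
        return Q, R

    A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
    Q, R = gram_schmidt(A)
    assert np.allclose(Q @ R, A) and np.allclose(Q.T @ Q, np.eye(2))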
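
Kronecker product. A minimal sketch: np.kron lays out the blocks a_ij B, and the eigenvalues of A ⊗ B are the products λ_p(A) λ_q(B). The triangular matrices are illustrative so the eigenvalues are easy to see.

    import numpy as np

    A = np.array([[1.0, 2.0], [0.0, 3.0]])
    B = np.array([[4.0, 1.0], [0.0, 2.0]])

    K = np.kron(A, B)
    assert np.allclose(K[:2, 2:], A[0, 1] * B)  # block (1,2) equals a_12 B
    prods = np.outer(np.linalg.eigvals(A), np.linalg.eigvals(B)).ravel()
    assert np.allclose(np.sort(np.linalg.eigvals(K).real), np.sort(prods.real))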
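
Least squares and the normal equation. A minimal sketch fitting an illustrative three-point line: x_hat solves A^T A x̂ = A^T b, and the error e = b - A x̂ comes out orthogonal to the columns of A.

    import numpy as np

    A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # full column rank
    b = np.array([6.0, 0.0, 0.0])

    x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equation
    e = b - A @ x_hat
    assert np.allclose(A.T @ e, 0.0)            # e is orthogonal to every column
    # np.linalg.lstsq(A, b, rcond=None)[0] returns the same x_hat more stably.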
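
Orthogonal matrix. A minimal sketch using an illustrative 2x2 rotation: Q^T = Q^{-1}, lengths and inner products are preserved, and every eigenvalue has |λ| = 1.

    import numpy as np

    theta = 0.3                                 # illustrative angle
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    x, y = np.array([1.0, 2.0]), np.array([-1.0, 4.0])
    assert np.allclose(Q.T @ Q, np.eye(2))      # Q^T = Q^{-1}
    assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
    assert np.isclose((Q @ x) @ (Q @ y), x @ y) # inner products preserved
    assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)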
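
Projection onto a line. A minimal sketch with illustrative vectors: p = a (a^T b / a^T a), and P = a a^T / a^T a is the rank-1 projection matrix.

    import numpy as np

    a = np.array([1.0, 2.0, 2.0])
    b = np.array([3.0, 0.0, 3.0])

    p = a * (a @ b) / (a @ a)                   # projection of b onto the line
    P = np.outer(a, a) / (a @ a)                # rank-1 projection matrix
    assert np.allclose(P @ b, p)
    assert np.linalg.matrix_rank(P) == 1 and np.allclose(P @ P, P)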
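
Pseudoinverse. A minimal sketch on an illustrative rank-1 matrix: np.linalg.pinv computes A^+, and A^+ A and A A^+ behave as the projections onto the row space and column space.

    import numpy as np

    A = np.array([[1.0, 2.0], [2.0, 4.0], [0.0, 0.0]])  # rank 1
    A_plus = np.linalg.pinv(A)

    row_proj, col_proj = A_plus @ A, A @ A_plus
    assert np.allclose(row_proj @ row_proj, row_proj)   # a projection matrix
    assert np.allclose(col_proj @ col_proj, col_proj)   # a projection matrix
    assert np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A)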
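
Schur complement. A minimal sketch with illustrative 1x1 blocks: eliminating the C block of [[A, B], [C, D]] leaves S = D - C A^{-1} B in the lower-right corner.

    import numpy as np

    A = np.array([[2.0]]); B = np.array([[1.0]])
    C = np.array([[4.0]]); D = np.array([[5.0]])

    M = np.block([[A, B], [C, D]])
    S = D - C @ np.linalg.inv(A) @ B            # Schur complement of A
    E = np.block([[np.eye(1), np.zeros((1, 1))],
                  [-C @ np.linalg.inv(A), np.eye(1)]])  # block elimination step
    assert np.allclose((E @ M)[1:, 1:], S)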
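
Singular Value Decomposition. A minimal sketch on an illustrative 2x2 matrix: A = U Σ V^T, and each pair satisfies A v_i = σ_i u_i.

    import numpy as np

    A = np.array([[3.0, 0.0], [4.0, 5.0]])
    U, s, Vt = np.linalg.svd(A)                 # s holds the σ_i, descending

    assert np.allclose(U @ np.diag(s) @ Vt, A)
    for i in range(len(s)):
        assert np.allclose(A @ Vt[i], s[i] * U[:, i])   # A v_i = σ_i u_i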
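
Skew-symmetric matrix. A minimal sketch, assuming scipy is available for the matrix exponential: K^T = -K, the eigenvalues are pure imaginary, and e^{Kt} is an orthogonal matrix.

    import numpy as np
    from scipy.linalg import expm

    K = np.array([[0.0, -1.0], [1.0, 0.0]])     # illustrative skew-symmetric K
    assert np.allclose(K.T, -K)
    assert np.allclose(np.linalg.eigvals(K).real, 0.0)  # pure imaginary spectrum
    Qt = expm(0.7 * K)                          # e^{Kt} at t = 0.7
    assert np.allclose(Qt.T @ Qt, np.eye(2))    # orthogonal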
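
Spectral Theorem. A minimal sketch on an illustrative symmetric matrix: np.linalg.eigh returns real eigenvalues and orthonormal eigenvectors, so A = Q Λ Q^T reassembles exactly.

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 2.0]])      # real symmetric (illustrative)
    lam, Q = np.linalg.eigh(A)                  # eigh: symmetric eigensolver

    assert np.allclose(Q @ np.diag(lam) @ Q.T, A)   # A = Q Λ Q^T
    assert np.allclose(Q.T @ Q, np.eye(2))          # orthonormal q's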