- 9.3.43: Find the analytic solution of the initial-value problem in Example ...
- 9.3.44: Write a computer program to implement the Adams-Bashforth-Moulton me...
- 9.3.45: In 3 and 4 use the Adams-Bashforth-Moulton method to approximate y(...
- 9.3.46: In 3 and 4 use the Adams-Bashforth-Moulton method to approximate y(...
- 9.3.47: In 5-8 use the Adams-Bashforth-Moulton method to approximate y(1.0),...
- 9.3.48: In 5-8 use the Adams-Bashforth-Moulton method to approximate y(1.0),...
- 9.3.49: In 5-8 use the Adams-Bashforth-Moulton method to approximate y(1.0),...
- 9.3.50: In 5-8 use the Adams-Bashforth-Moulton method to approximate y(1.0),...
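The problems above all use the Adams-Bashforth-Moulton method. As a minimal sketch (not the book's program), here is the standard fourth-order Adams-Bashforth predictor with the Adams-Moulton corrector, started with RK4 values; the test equation y' = y, y(0) = 1 and the step h = 0.2 are assumptions for illustration, not one of the book's problems.

```python
import numpy as np

def rk4_step(f, x, y, h):
    """One classical fourth-order Runge-Kutta step (supplies starting values)."""
    k1 = f(x, y)
    k2 = f(x + h/2, y + h*k1/2)
    k3 = f(x + h/2, y + h*k2/2)
    k4 = f(x + h, y + h*k3)
    return y + h*(k1 + 2*k2 + 2*k3 + k4)/6

def abm4(f, x0, y0, h, n):
    """Fourth-order Adams-Bashforth-Moulton predictor-corrector for y' = f(x, y)."""
    xs = x0 + h*np.arange(n + 1)
    ys = np.empty(n + 1)
    ys[0] = y0
    for i in range(min(3, n)):                       # RK4 starting values
        ys[i + 1] = rk4_step(f, xs[i], ys[i], h)
    for i in range(3, n):
        f0, f1, f2, f3 = (f(xs[i], ys[i]), f(xs[i-1], ys[i-1]),
                          f(xs[i-2], ys[i-2]), f(xs[i-3], ys[i-3]))
        # Adams-Bashforth predictor
        yp = ys[i] + h*(55*f0 - 59*f1 + 37*f2 - 9*f3)/24
        # Adams-Moulton corrector, applied once
        ys[i + 1] = ys[i] + h*(9*f(xs[i + 1], yp) + 19*f0 - 5*f1 + f2)/24
    return xs, ys

# Test problem (assumed): y' = y, y(0) = 1, so y(1.0) should be close to e.
xs, ys = abm4(lambda x, y: y, 0.0, 1.0, 0.2, 5)
```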
Solutions for Chapter 9.3: Numerical Solutions of Ordinary Differential Equations
Full solutions for Differential Equations with Boundary-Value Problems, 8th Edition
Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
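A small illustration of the definition; the graph and its edge list here are hypothetical, chosen only to show the construction.

```python
import numpy as np

# Directed graph on nodes 0..3, given as a list of edges (i, j) -- example data.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]

n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1          # a_ij = 1 when there is an edge from node i to node j

# A = A^T exactly when every edge goes both ways (undirected graph).
undirected = bool((A == A.T).all())

# A useful consequence: entry (i, j) of A @ A counts walks of length 2 from i to j.
walks2 = A @ A
```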
Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
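A quick numerical check of the column picture; the particular matrix and vector are arbitrary examples.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
x = np.array([2.0, -1.0])
b = A @ x

# Column picture: b is x_1 times column 1 plus x_2 times column 2 of A.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
```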
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det B_j / det A.
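The rule translates directly into code; the 2-by-2 system below is an assumed example.

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's Rule: x_j = det(B_j) / det(A),
    where B_j is A with column j replaced by b."""
    detA = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        Bj = A.copy()
        Bj[:, j] = b          # replace column j of A by b
        x[j] = np.linalg.det(Bj) / detA
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)
```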
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A^T| = |A|.
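These defining properties can be checked numerically; the random matrices are arbitrary test data.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

assert np.isclose(np.linalg.det(np.eye(4)), 1.0)             # det I = 1
P = np.eye(4)[[1, 0, 2, 3]]                                  # one row exchange
assert np.isclose(np.linalg.det(P @ A), -np.linalg.det(A))   # sign reversal
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))       # |AB| = |A||B|
S = A.copy()
S[3] = S[0]                                                  # repeated row
assert np.isclose(np.linalg.det(S), 0.0)                     # singular -> det 0
```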
Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -e_ij in the (i, j) entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.
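A sketch of one elimination step, which also shows the multiplier formula e_ij = (entry to eliminate) / (jth pivot); the 2-by-2 matrix is an assumed example.

```python
import numpy as np

def elimination_matrix(n, i, j, e_ij):
    """Identity with -e_ij in the (i, j) entry; E @ A subtracts
    e_ij times row j of A from row i."""
    E = np.eye(n)
    E[i, j] = -e_ij
    return E

A = np.array([[2.0, 1.0], [4.0, 5.0]])
e21 = A[1, 0] / A[0, 0]              # multiplier: entry to eliminate / pivot
E = elimination_matrix(2, 1, 0, e21)
U = E @ A                            # the (2, 1) entry is eliminated
```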
Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.
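A minimal constructor for the definition; the antidiagonal values 1..5 are arbitrary sample data.

```python
import numpy as np

def hankel_matrix(v):
    """n-by-n Hankel matrix with h_ij = v[i + j] (0-based indices),
    so each antidiagonal i + j = const is constant."""
    n = (len(v) + 1) // 2
    return np.array([[v[i + j] for j in range(n)] for i in range(n)])

H = hankel_matrix([1, 2, 3, 4, 5])   # 3x3; antidiagonals carry 1, 2, 3, 4, 5
```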
Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
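A sketch of the normal equations in action, fitting a line through four assumed data points; the orthogonality of the error to the columns of A is then checked.

```python
import numpy as np

# Overdetermined system: fit a line c0 + c1*t through four points (sample data).
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0, 4.0])
A = np.column_stack([np.ones_like(t), t])

# Normal equations A^T A x_hat = A^T b give the least-squares solution.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

e = b - A @ x_hat        # error vector, orthogonal to every column of A
```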
Lucas numbers.
L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
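The recurrence and the eigenvalue formula can be compared directly; both should produce the same sequence.

```python
import numpy as np

def lucas(n):
    """Lucas numbers L_0..L_n from L_k = L_{k-1} + L_{k-2}, L_0 = 2, L_1 = 1."""
    L = [2, 1]
    for _ in range(n - 1):
        L.append(L[-1] + L[-2])
    return L[:n + 1]

# Eigenvalue formula: L_n = lam1**n + lam2**n, where lam1, lam2 = (1 ± sqrt(5))/2
# are the eigenvalues of the Fibonacci matrix [[1, 1], [1, 0]].
lam1, lam2 = (1 + np.sqrt(5)) / 2, (1 - np.sqrt(5)) / 2
closed_form = [round(lam1**n + lam2**n) for n in range(11)]
```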
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
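A numerical check on a matrix with distinct eigenvalues (an assumed example): A satisfies its characteristic polynomial (Cayley-Hamilton), and since the eigenvalues 2 and 3 are not repeated, the minimal polynomial is the same degree-2 polynomial.

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])   # triangular, eigenvalues 2 and 3

coeffs = np.poly(A)                       # characteristic polynomial: λ^2 - 5λ + 6
pA = coeffs[0] * (A @ A) + coeffs[1] * A + coeffs[2] * np.eye(2)

# Distinct eigenvalues: m(λ) = p(λ), so already (A - 2I)(A - 3I) = 0.
mA = (A - 2 * np.eye(2)) @ (A - 3 * np.eye(2))
```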
Multiplier e_ij.
The pivot row j is multiplied by e_ij and subtracted from row i to eliminate the (i, j) entry: e_ij = (entry to eliminate) / (jth pivot).
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1} and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
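One convenient source of orthonormal columns is the QR factorization (a choice made here for illustration); the expansion v = Σ (v^T q_j) q_j is then verified for a random vector.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
Q, R = np.linalg.qr(A)                        # columns of Q are orthonormal

assert np.allclose(Q.T @ Q, np.eye(4))        # Q^T Q = I
assert np.allclose(np.linalg.inv(Q), Q.T)     # square case: Q^{-1} = Q^T

# Expansion in an orthonormal basis: v = sum of (v^T q_j) q_j.
v = rng.standard_normal(4)
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(4))
```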
Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.
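A sketch of the "free variables = 0" choice on a small underdetermined system (assumed example): set the free variable to zero and solve for the pivot variables.

```python
import numpy as np

# Underdetermined system: x1 + 2*x2 + x3 = 4 and x2 + x3 = 1, with x3 free.
A = np.array([[1.0, 2.0, 1.0], [0.0, 1.0, 1.0]])
b = np.array([4.0, 1.0])

# Particular solution: free variable x3 = 0, then solve for the pivot variables.
xp = np.zeros(3)
xp[:2] = np.linalg.solve(A[:, :2], b)
```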
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
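The definition in code, using an assumed 3-element order; the chosen order [2, 0, 1] is an even permutation, so det P = +1.

```python
import numpy as np

def permutation_matrix(order):
    """Rows of the identity in the given order; P @ A reorders the rows of A."""
    return np.eye(len(order))[list(order)]

P = permutation_matrix([2, 0, 1])           # rows of I in order 2, 0, 1
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
PA = P @ A                                  # rows of A in the same order

sign = np.linalg.det(P)                     # +1 for even, -1 for odd
```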
Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
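These equivalent tests can be run on a small symmetric example (assumed data). NumPy has no LDL^T routine, so the closely related Cholesky factorization A = LL^T, which succeeds exactly when A is positive definite, stands in for it here.

```python
import numpy as np

A = np.array([[4.0, 2.0], [2.0, 3.0]])      # symmetric example

eigvals = np.linalg.eigvalsh(A)
assert (eigvals > 0).all()                   # positive eigenvalues

np.linalg.cholesky(A)                        # raises LinAlgError if A is not PD

# Definition check: x^T A x > 0 for random nonzero x.
rng = np.random.default_rng(2)
for _ in range(100):
    x = rng.standard_normal(2)
    assert x @ A @ x > 0
```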
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A are a basis for S then P = A(A^T A)^{-1} A^T.
Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
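A sketch covering both entries: the general formula P = A(A^T A)^{-1} A^T, applied to a single column a (an assumed vector), reduces to the rank-1 line projection P = a a^T / a^T a.

```python
import numpy as np

def projection_matrix(A):
    """P = A (A^T A)^{-1} A^T, projecting onto the column space of A."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

# One column: the same formula gives the line projection P = a a^T / a^T a.
a = np.array([[1.0], [2.0], [2.0]])
P = projection_matrix(a)

b = np.array([3.0, 0.0, 0.0])
p = P @ b                  # closest point to b on the line through a
e = b - p                  # error, perpendicular to the line
```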
Row picture of Ax = b.
Each equation gives a plane in R^n; the planes intersect at the solution x.
Standard basis for R^n.
Columns of the n by n identity matrix (written i, j, k in R^3).
Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c_0 + ... + c_{n-1} x^{n-1} with p(x_i) = b_i. V_ij = (x_i)^{j-1} and det V = product of (x_k - x_i) for k > i.
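Both facts can be checked on a small interpolation problem (the three points are assumed sample data); `np.vander` with `increasing=True` matches the convention V_ij = (x_i)^{j-1}.

```python
import numpy as np

# Interpolate p(x) = c0 + c1*x + c2*x^2 through three sample points.
x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 9.0])

V = np.vander(x, increasing=True)      # columns x^0, x^1, x^2, so V c = b
c = np.linalg.solve(V, b)

# det V = product of (x_k - x_i) over k > i.
detV = np.prod([x[k] - x[i] for k in range(3) for i in range(k)])
```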
Vector v in R^n.
Sequence of n real numbers v = (v_1, ..., v_n) = point in R^n.