- 5.9.1: Use the Runge-Kutta method for systems to approximate the solutions...
- 5.9.2: Use the Runge-Kutta method for systems to approximate the solutions...
- 5.9.3: Use the Runge-Kutta for Systems Algorithm to approximate the soluti...
- 5.9.4: Use the Runge-Kutta for Systems Algorithm to approximate the soluti...
- 5.9.5: Change the Adams Fourth-Order Predictor-Corrector Algorithm to obta...
- 5.9.6: Repeat Exercise 2 using the algorithm developed in Exercise 5.
- 5.9.7: Repeat Exercise 1 using the algorithm developed in Exercise 5.
- 5.9.8: Suppose the swinging pendulum described in the lead example of this...
- 5.9.9: The study of mathematical models for predicting the population dyna...
- 5.9.10: In Exercise 9 we considered the problem of predicting the populatio...
Solutions for Chapter 5.9: Higher-Order Equations and Systems of Differential Equations
Full solutions for Numerical Analysis | 9th Edition
Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
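As a quick illustration of this definition, the following NumPy sketch builds the adjacency matrix of a small undirected graph; the node count and edge list are made up for the example:

```python
import numpy as np

# Hypothetical undirected graph on 4 nodes with edges (0,1), (1,2), (2,3), (0,3).
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
A = np.zeros((4, 4), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # edges go both ways, so A = A^T

assert np.array_equal(A, A.T)  # undirected graph => symmetric adjacency matrix
```

For a directed graph one would set only `A[i, j] = 1`, and the symmetry check would fail in general.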
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
Cross product u × v in R³.
Vector perpendicular to u and v, length ‖u‖‖v‖|sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
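Both properties, perpendicularity and the parallelogram area, can be checked numerically; the two vectors below are arbitrary examples:

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 3.0, 1.0])
w = np.cross(u, v)  # the cross product u × v

# w is perpendicular to both u and v
assert abs(w @ u) < 1e-12 and abs(w @ v) < 1e-12

# |w| equals the parallelogram area ‖u‖‖v‖|sin θ|
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(
    1 - (u @ v) ** 2 / ((u @ u) * (v @ v)))
assert abs(np.linalg.norm(w) - area) < 1e-12
```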
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A^T| = |A|.
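These rules are easy to confirm numerically; the sketch below uses two random 4×4 matrices (the seed and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Product rule and transpose rule
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))

# Exchanging two rows reverses the sign
P = np.eye(4)[[1, 0, 2, 3]]  # permutation that swaps rows 0 and 1
assert np.isclose(np.linalg.det(P @ A), -np.linalg.det(A))
```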
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Dot product = Inner product x^T y = x1 y1 + ... + xn yn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A) · (column j of B).
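A small sketch with made-up vectors shows both the real perpendicularity test and the conjugated complex dot product:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 2.0, -3.0])
assert x @ y == 1*2 + 2*2 + 2*(-3) == 0  # x^T y = 0: perpendicular

# Complex vectors use the conjugate: np.vdot computes x̄^T y
z = np.array([1j, 1.0])
assert np.isclose(np.vdot(z, z).real, 2.0)  # ‖z‖² = z̄^T z = 2
```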
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
Fibonacci numbers.
0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn−1 + Fn−2 = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
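The eigenvalue claim and the closed formula can both be verified directly:

```python
import numpy as np

F = np.array([[1, 1], [1, 0]])  # Fibonacci matrix
lam1, lam2 = np.linalg.eigvalsh(F).max(), np.linalg.eigvalsh(F).min()
assert np.isclose(lam1, (1 + np.sqrt(5)) / 2)  # growth rate = golden ratio

# Fn = (λ1^n − λ2^n)/(λ1 − λ2) reproduces 0, 1, 1, 2, 3, 5, ...
fib = [round((lam1**n - lam2**n) / (lam1 - lam2)) for n in range(8)]
assert fib == [0, 1, 1, 2, 3, 5, 8, 13]
```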
Free variable xi.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
Matrix multiplication AB.
The i, j entry of AB is (row i of A) · (column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
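Two of these equivalent definitions, the entry rule and columns-times-rows, can be checked against each other on small example matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# Entry rule: (AB)_ij = (row i of A) · (column j of B)
entry = np.array([[A[i] @ B[:, j] for j in range(2)] for i in range(2)])

# Columns times rows: AB = sum over k of (column k of A)(row k of B)
outer = sum(np.outer(A[:, k], B[k]) for k in range(2))

assert np.allclose(entry, A @ B) and np.allclose(outer, A @ B)
```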
Normal matrix N.
If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.
Nullspace matrix N.
The columns of N are the n − r special solutions to As = 0.
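For a concrete (made-up) example with rank r = 1 and n = 3 columns, the special solutions set one free variable to 1 and the rest to 0, then solve for the pivot variable:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0]])  # rank 1; column 1 has the pivot, columns 2, 3 are free

# Each column of N sets one free variable to 1, the other to 0,
# and back-solves for the pivot variable.
N = np.array([[-2.0, -3.0],
              [ 1.0,  0.0],
              [ 0.0,  1.0]])  # n - r = 2 special solutions as columns

assert np.allclose(A @ N, 0)  # every column of N solves As = 0
```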
Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |lij| ≤ 1. See condition number.
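A one-step sketch on a hypothetical 2×2 system shows why the row exchange keeps the multiplier small:

```python
import numpy as np

# Without pivoting the multiplier would be 1.0 / 0.001 = 1000.
A = np.array([[0.001, 1.0],
              [1.0,   1.0]])
U = A.copy()

# Column 0: the largest available pivot is in row 1, so exchange rows first.
U[[0, 1]] = U[[1, 0]]
l10 = U[1, 0] / U[0, 0]  # multiplier after the exchange
assert abs(l10) <= 1
U[1] -= l10 * U[0]       # eliminate below the pivot
```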
Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.
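A quick check with arbitrary nonzero u and v confirms that the outer product has rank one:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])
A = np.outer(u, v)  # A = u v^T

assert np.linalg.matrix_rank(A) == 1
# Every column of A is a multiple of u (column space = line cu):
assert np.allclose(A[:, 1], (v[1] / v[0]) * A[:, 0])
```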
Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.
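As an illustration, f(x, y) = x² − y² has a saddle at the origin: the gradient vanishes there and the Hessian is indefinite (one positive and one negative eigenvalue):

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2 at the origin
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])
eig = np.linalg.eigvalsh(H)
assert eig.min() < 0 < eig.max()  # indefinite => saddle point
```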
Similar matrices A and B.
Every B = M⁻¹AM has the same eigenvalues as A.
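The invariance of the eigenvalues can be checked on an arbitrary pair A, M (M invertible; both chosen here just for the example):

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])   # eigenvalues 2 and 3
M = np.array([[1.0, 1.0], [0.0, 1.0]])   # any invertible M
B = np.linalg.inv(M) @ A @ M             # B = M^{-1} A M, similar to A

assert np.allclose(np.sort(np.linalg.eigvals(B)),
                   np.sort(np.linalg.eigvals(A)))
```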
Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.
For matrix norms ‖A + B‖ ≤ ‖A‖ + ‖B‖.
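Both the vector and the matrix version can be spot-checked; the vectors and matrices below are arbitrary, and `np.linalg.norm(..., 2)` is the spectral norm for matrices:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([-1.0, 2.0])
assert np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v)

# Matrix (spectral) norm version
A = np.array([[1.0, 0.0], [0.0, 2.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
assert np.linalg.norm(A + B, 2) <= np.linalg.norm(A, 2) + np.linalg.norm(B, 2)
```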