 7.4.1: Prove the statement following Theorem 7.4.1 for an arbitrary value ...
 7.4.2: In this problem we outline a proof of Theorem 7.4.3 in the case n =...
 7.4.3: Show that the Wronskians of two fundamental sets of solutions of the...
 7.4.4: If x1 = y and x2 = y′, then the second order equation y″ + p(t)y′ + q...
 7.4.5: Show that the general solution of x′ = P(t)x + g(t) is the sum of any...
 7.4.6: Consider the vectors x(1)(t) = (t, 1)ᵀ and x(2)(t) = (t², 2t)ᵀ. (a) Com...
 7.4.7: Consider the vectors x(1)(t) = (t², 2t)ᵀ and x(2)(t) = (e^t, e^t)ᵀ, and a...
 7.4.8: Let x(1), ..., x(m) be solutions of x′ = P(t)x on the interval α < t...
 7.4.9: Let x(1), ..., x(n) be linearly independent solutions of x′ = P(t)...
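The substitution in Problem 7.4.4 (x1 = y, x2 = y′) can be sketched numerically. The sample equation y″ + 3y′ + 2y = 0 and the helper name `companion` below are illustrative assumptions, not from the text:

```python
import numpy as np

# Sketch: with x1 = y and x2 = y', the equation y'' + p y' + q y = 0
# becomes the first-order system x' = A x, where A is the companion
# matrix below (constant p, q assumed for this illustration).
def companion(p, q):
    return np.array([[0.0, 1.0], [-q, -p]])

# For y'' + 3y' + 2y = 0 the characteristic roots are -1 and -2;
# they reappear as the eigenvalues of A.
A = companion(3.0, 2.0)
eigenvalues = np.sort(np.linalg.eigvals(A))
```

The eigenvalues of the companion matrix coincide with the roots of the scalar equation's characteristic polynomial, which is why the system form preserves the solution structure.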
Solutions for Chapter 7.4: Basic Theory of Systems of First Order Linear Equations
Full solutions for Elementary Differential Equations and Boundary Value Problems  9th Edition
ISBN: 9780470383346
These solutions accompany Elementary Differential Equations and Boundary Value Problems, 9th edition (ISBN 9780470383346). Chapter 7.4: Basic Theory of Systems of First Order Linear Equations includes 9 full step-by-step solutions.

Complete solution x = xp + xn to Ax = b.
(Particular solution xp) + (any xn in the nullspace).

Cramer's Rule for Ax = b.
Bj has b replacing column j of A; xj = det Bj / det A.
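A quick NumPy sketch of the rule; the 2x2 system below is an illustrative assumption:

```python
import numpy as np

# Cramer's Rule: x_j = det(B_j) / det(A), where B_j is A with
# column j replaced by b (0-based indices here).
def cramer_solve(A, b):
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        B = A.copy()
        B[:, j] = b                       # B_j: column j of A replaced by b
        x[j] = np.linalg.det(B) / det_A
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)                    # same answer as np.linalg.solve
```

Useful for theory and tiny systems; for anything larger, elimination (`np.linalg.solve`) is far cheaper than computing n + 1 determinants.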

Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −eij in the i, j entry (i ≠ j). Then Eij A subtracts eij times row j of A from row i.
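A minimal NumPy sketch of one elimination step (0-based indices here, unlike the 1-based glossary notation; the matrix is an illustrative assumption):

```python
import numpy as np

# Elimination matrix: the identity with an extra -l in entry (i, j);
# E @ A then subtracts l times row j of A from row i.
def elimination_matrix(n, i, j, l):
    E = np.eye(n)
    E[i, j] = -l
    return E

A = np.array([[2.0, 1.0], [4.0, 5.0]])
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * row 0 from row 1
EA = E @ A                             # row 1 becomes [0, 3]
```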

Ellipse (or ellipsoid) x^T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λi. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^−1 y||² = y^T (AA^T)^−1 y = 1 displayed by eigshow; axis lengths σi.)

Hilbert matrix hilb(n).
Entries Hij = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.
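A short NumPy sketch building the matrix from its entry formula and checking the ill-conditioning (the helper name `hilb` mirrors the MATLAB-style name in the heading):

```python
import numpy as np

# Hilbert matrix H_ij = 1/(i + j - 1), with 1-based i, j as in the
# glossary; the condition number blows up even for small n.
def hilb(n):
    idx = np.arange(1, n + 1)
    return 1.0 / (idx[:, None] + idx[None, :] - 1)

H = hilb(5)
cond = np.linalg.cond(H)   # already above 1e5 at n = 5
```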

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.

|A^−1| = 1/|A| and |A^T| = |A|.
The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, volume of box = |det(A)|.
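A numerical spot-check of the two identities on a random matrix (the seed and size are arbitrary choices for this sketch):

```python
import numpy as np

# Check |A^-1| = 1/|A| and |A^T| = |A| on a random 4x4 matrix,
# which is invertible with probability 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
dA = np.linalg.det(A)
d_inv = np.linalg.det(np.linalg.inv(A))
d_T = np.linalg.det(A.T)
```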

Least squares solution x̂.
The vector x̂ that minimizes the error ||e||² solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
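A sketch of the normal equations in NumPy; the data (fitting a line through the points (0, 6), (1, 0), (2, 0)) is an illustrative assumption:

```python
import numpy as np

# Least squares via the normal equations A^T A x = A^T b; the
# residual e = b - A x_hat is orthogonal to every column of A.
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # best-fit line 5 - 3t
e = b - A @ x_hat                           # orthogonal to C(A)
```

In practice `np.linalg.lstsq` (QR/SVD based) is preferred over forming A^T A explicitly, since squaring A also squares its condition number.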

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
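A small sketch of the AM > GM case using the classic defective example (the matrix is an illustrative assumption):

```python
import numpy as np

# A has lambda = 5 with algebraic multiplicity AM = 2 (double root of
# det(A - lambda I) = 0) but geometric multiplicity GM = 1, since the
# nullspace of A - 5I is only one-dimensional.
A = np.array([[5.0, 1.0], [0.0, 5.0]])
eigenvalues = np.linalg.eigvals(A)                   # 5 appears twice
gm = 2 - np.linalg.matrix_rank(A - 5.0 * np.eye(2))  # rank-nullity
```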

Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T Ax > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
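One standard equivalent test is attempting a Cholesky factorization A = LL^T, which succeeds exactly for symmetric positive definite matrices; a NumPy sketch (the two sample matrices are illustrative assumptions):

```python
import numpy as np

# Positive definiteness via Cholesky: np.linalg.cholesky raises
# LinAlgError exactly when the symmetric matrix is not positive definite.
def is_positive_definite(A):
    try:
        np.linalg.cholesky(A)
        return True
    except np.linalg.LinAlgError:
        return False

A = np.array([[2.0, -1.0], [-1.0, 2.0]])   # eigenvalues 1 and 3
B = np.array([[1.0, 2.0], [2.0, 1.0]])     # eigenvalues 3 and -1
```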

Rotation matrix
R = [c −s; s c] rotates the plane by θ and R^−1 = R^T rotates back by −θ. Eigenvalues are e^(iθ) and e^(−iθ), eigenvectors are (1, ±i). Here c, s = cos θ, sin θ.
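A quick NumPy check of these facts (the angle θ = 0.3 is an arbitrary choice):

```python
import numpy as np

# Rotation by theta: R is orthogonal (R^T R = I), so R^-1 = R^T, and
# the eigenvalues e^(±i theta) lie on the unit circle.
theta = 0.3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])
eigenvalues = np.linalg.eigvals(R)   # complex conjugate pair
```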

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.

Singular Value Decomposition
(SVD) A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with Avi = σi ui and singular values σi > 0. The last columns are orthonormal bases of the nullspaces.
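A NumPy sketch of the factorization and the relation Avi = σi ui (the 2x2 matrix is an illustrative assumption):

```python
import numpy as np

# np.linalg.svd returns A = U diag(sigma) V^T with orthonormal U, V
# and singular values sigma_i >= 0 in decreasing order.
A = np.array([[3.0, 0.0], [4.0, 5.0]])
U, sigma, Vt = np.linalg.svd(A)
reconstructed = U @ np.diag(sigma) @ Vt   # recovers A
```

Note that `svd` returns V^T, so the right singular vectors vi are the rows of `Vt`, not its columns.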

Symmetric matrix A.
The transpose is A^T = A, and aij = aji. A^−1 is also symmetric.

Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms ||A + B|| ≤ ||A|| + ||B||.

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c0 + c1x + ... + c(n−1)x^(n−1) with p(xi) = bi. Vij = (xi)^(j−1) and det V = product of (xk − xi) for k > i.
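A NumPy sketch of polynomial interpolation through Vc = b; the sample points are an illustrative assumption:

```python
import numpy as np

# Solving V c = b with V_ij = x_i^(j-1) recovers the coefficients
# of the interpolating polynomial p(x) = c0 + c1 x + ...
x_pts = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 2.0, 5.0])             # target values p(x_i)
V = np.vander(x_pts, increasing=True)     # rows (1, x_i, x_i^2)
c = np.linalg.solve(V, b)                 # here p(x) = 1 + x^2
```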

Wavelets wjk(t).
Stretch and shift the time axis to create wjk(t) = w00(2^j t − k).