6.2.1: Find the general solution of each of the following systems: (a) y' ...
6.2.2: Solve each of the following initial value problems: (a) y1' = y1 +...
6.2.3: Given Y = c1 e^(λ1 t) x1 + c2 e^(λ2 t) x2 + ⋯ + cn e^(λn t) xn is the solution to the init...
6.2.4: Two tanks each contain 100 liters of a mixture. Initially, the mixt...
6.2.5: Find the general solution of each of the following systems: (a) y''...
6.2.6: Solve the initial value problem y1'' = 2y2 + y1' + 2y2', y2'' = ...
 6.2.7: In Application 2, assume that the solutions are of the form x1 = a1...
6.2.8: Solve the problem in Application 2, using the initial condition...
 6.2.9: Two masses are connected by springs as shown in the accompanying di...
 6.2.10: Three masses are connected by a series of springs between two fixed...
6.2.11: Transform the nth-order equation y^(n) = a0 y + a1 y' + ⋯ + a_(n-1) y^(n-1) i...
Solutions for Chapter 6.2: Systems of Linear Differential Equations
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290
Since 11 problems in Chapter 6.2: Systems of Linear Differential Equations have been answered, more than 6174 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for the textbook Linear Algebra with Applications, edition 8 (ISBN: 9780136009290), and covers the following chapters and their solutions.

Back substitution.
Upper triangular systems are solved in reverse order, x_n down to x_1.
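As a sketch of the idea, back substitution on Ux = b runs from the last row up. Plain Python, assuming a square U (list of rows) with nonzero diagonal entries; the function name is ours:

```python
# Back substitution for an upper triangular system Ux = b,
# solving in reverse order x_n down to x_1.
def back_substitute(U, b):
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        # subtract the already-known unknowns, then divide by the pivot U[i][i]
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / U[i][i]
    return x

# Example: 2x + y = 5, 3y = 6  ->  y = 2, x = 1.5
print(back_substitute([[2.0, 1.0], [0.0, 3.0]], [5.0, 6.0]))  # [1.5, 2.0]
```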

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½x^T Ax - x^T b over growing Krylov subspaces.
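A minimal sketch of the method in plain Python (lists, no libraries); the function name and the small test system are ours, and zero pivots in the line search are assumed away:

```python
# Conjugate gradient for positive definite Ax = b: each step minimizes
# (1/2) x^T A x - x^T b over a growing Krylov subspace.
def conjugate_gradient(A, b, iters=25):
    n = len(b)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = [0.0] * n
    r = b[:]              # residual b - Ax with x = 0
    p = r[:]              # first search direction
    for _ in range(iters):
        Ap = matvec(A, p)
        alpha = dot(r, r) / dot(p, Ap)          # exact line search
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r_new = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r_new, r_new) < 1e-20:           # converged
            break
        beta = dot(r_new, r_new) / dot(r, r)    # makes directions A-conjugate
        p = [rn + beta * pi for rn, pi in zip(r_new, p)]
        r = r_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]
print(conjugate_gradient(A, b))  # close to the exact solution [1/11, 7/11]
```

In exact arithmetic the method finishes in at most n steps, here two.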

Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A - λI) = 0.
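Both parts of the definition can be checked numerically on a hand-picked example (λ = 3 with x = (1, 1) is a known eigenpair of this 2×2 matrix):

```python
# Check Ax = λx with x ≠ 0, and det(A - λI) = 0, on a small example.
A = [[2.0, 1.0], [1.0, 2.0]]
lam, x = 3.0, [1.0, 1.0]            # known eigenpair of A

Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
print(Ax == [lam * xi for xi in x])  # True: Ax equals λx

# det(A - λI) for a 2x2 matrix: (a - λ)(d - λ) - bc
det = (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]
print(det)  # 0.0
```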

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -ℓ_ij in the i, j entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.
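A quick demonstration in plain Python (0-based indices; the helper names are ours): building E with -ℓ in the (i, j) entry and multiplying does exactly the row subtraction.

```python
# Elementary matrix E_ij: identity with an extra -ell in the (i, j) entry;
# E_ij @ A subtracts ell times row j of A from row i.
def elimination_matrix(n, i, j, ell):
    E = [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]
    E[i][j] = -ell
    return E

def matmul(X, Y):
    n, m, p = len(X), len(Y), len(Y[0])
    return [[sum(X[r][k] * Y[k][c] for k in range(m)) for c in range(p)]
            for r in range(n)]

A = [[2.0, 1.0], [4.0, 5.0]]
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * row 0 from row 1
print(matmul(E, A))                    # [[2.0, 1.0], [0.0, 3.0]]
```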

Four Fundamental Subspaces C (A), N (A), C (AT), N (AT).
Use A^H (the conjugate transpose) for complex A.

Fourier matrix F.
Entries F_jk = e^(2πijk/n) give orthogonal columns: F^H F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform: y_j = Σ c_k e^(2πijk/n).
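The column orthogonality F^H F = nI can be verified numerically with the standard library's cmath (n = 4 here is our choice):

```python
# Fourier matrix with entries F_jk = e^{2*pi*i*j*k/n}; its columns are
# orthogonal, so F^H F = nI.
import cmath

n = 4
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)] for j in range(n)]

# F^H F: conjugate-transpose of F times F
FhF = [[sum(F[r][i].conjugate() * F[r][k] for r in range(n)) for k in range(n)]
       for i in range(n)]

# Off-diagonal entries vanish; diagonal entries equal n (up to rounding)
ok = all(abs(FhF[i][k] - (n if i == k else 0)) < 1e-9
         for i in range(n) for k in range(n))
print(ok)  # True
```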

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Jordan form J = M^(-1) A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1 (the superdiagonal). Each block has one eigenvalue λ_k and one eigenvector.

Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Multiplier ℓ_ij.
The pivot row j is multiplied by ℓ_ij and subtracted from row i to eliminate the i, j entry: ℓ_ij = (entry to eliminate) / (jth pivot).

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
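The multiplier and pivot definitions above combine into one pass of forward elimination. A minimal sketch in plain Python (the function name is ours; every pivot is assumed nonzero, so no row exchanges are needed):

```python
# Forward elimination: at each step the pivot is the diagonal entry of the
# current row, and the multiplier ell = (entry to eliminate) / (jth pivot)
# scales the pivot row before subtracting it.
def forward_eliminate(A):
    A = [row[:] for row in A]          # work on a copy
    n = len(A)
    for j in range(n):                 # column of the current pivot
        pivot = A[j][j]
        for i in range(j + 1, n):      # rows below the pivot
            ell = A[i][j] / pivot      # multiplier l_ij
            for k in range(j, n):
                A[i][k] -= ell * A[j][k]
    return A                           # upper triangular U

print(forward_eliminate([[2.0, 1.0, 1.0],
                         [4.0, 3.0, 3.0],
                         [8.0, 7.0, 9.0]]))
# [[2.0, 1.0, 1.0], [0.0, 1.0, 1.0], [0.0, 0.0, 2.0]]
```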

Pseudoinverse A^+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).
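A sketch for the special case of full column rank, where A^+ reduces to (A^T A)^(-1) A^T; the general case needs the SVD. The function name, the m×2 restriction, and the test matrix are ours:

```python
# Pseudoinverse for a full-column-rank m x 2 matrix: A+ = (A^T A)^(-1) A^T,
# inverting the 2x2 matrix A^T A by the cofactor formula.
def pinv_tall(A):
    m = len(A)
    # G = A^T A  (2 x 2)
    G = [[sum(A[r][i] * A[r][j] for r in range(m)) for j in range(2)]
         for i in range(2)]
    det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    Ginv = [[ G[1][1] / det, -G[0][1] / det],
            [-G[1][0] / det,  G[0][0] / det]]
    # A+ = Ginv A^T  (2 x m)
    return [[sum(Ginv[i][k] * A[r][k] for k in range(2)) for r in range(m)]
            for i in range(2)]

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
Aplus = pinv_tall(A)
# A+ A should be the 2x2 identity when A has full column rank
AplusA = [[sum(Aplus[i][r] * A[r][j] for r in range(3)) for j in range(2)]
          for i in range(2)]
print(AplusA)
```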

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and drawn from the standard normal distribution for randn.

Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂x_i ∂x_j = Hessian matrix) is indefinite.

Schwarz inequality.
|v·w| ≤ ‖v‖ ‖w‖. Then |v^T A w|² ≤ (v^T A v)(w^T A w) for positive definite A.
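Both forms can be checked numerically on hand-picked small vectors (our choices) with the standard library's math module:

```python
# Numeric check of |v.w| <= ||v|| ||w|| and of the weighted form
# |v^T A w|^2 <= (v^T A v)(w^T A w) for a positive definite A.
import math

v, w = [1.0, 2.0], [3.0, -1.0]
dot = lambda a, b: sum(x * y for x, y in zip(a, b))

print(abs(dot(v, w)) <= math.sqrt(dot(v, v)) * math.sqrt(dot(w, w)))  # True

A = [[2.0, 0.0], [0.0, 3.0]]    # positive definite (positive diagonal)
Av = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
Aw = [sum(A[i][j] * w[j] for j in range(2)) for i in range(2)]
print(dot(v, Aw) ** 2 <= dot(v, Av) * dot(w, Aw))  # True
```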

Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Tridiagonal matrix T: t_ij = 0 if |i - j| > 1.
T^(-1) has rank 1 above and below the diagonal.