Solutions for Chapter 10.3: Applications
Full solutions for Biocalculus: Calculus for Life Sciences  1st Edition
ISBN: 9781133109631

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c_0 I + c_1 S + ... + c_(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
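A quick NumPy check of these facts (the vectors c and the loop check are my own toy choices, not from the text): build C from its first column, compare Cx with circular convolution computed by the FFT, and verify that each Fourier column is an eigenvector with eigenvalue fft(c)[k].

```python
import numpy as np

c = np.array([2.0, 1.0, 0.0, 3.0])
n = len(c)
# Column k of a circulant is the first column cyclically shifted down k places
C = np.column_stack([np.roll(c, k) for k in range(n)])

# Cx equals the circular convolution c * x (computed here via the FFT)
x = np.array([1.0, -1.0, 2.0, 0.5])
conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
assert np.allclose(C @ x, conv)

# The Fourier columns are eigenvectors; eigenvalues are the DFT of c
lam = np.fft.fft(c)
j = np.arange(n)
for k in range(n):
    v = np.exp(2j * np.pi * j * k / n)   # k-th column of the Fourier matrix F
    assert np.allclose(C @ v, lam[k] * v)
```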

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers l_ij (and l_ii = 1) brings U back to A.
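A minimal sketch of elimination without row exchanges, assuming nonzero pivots (the helper name lu_no_pivot and the 2x2 example are mine, not a library routine):

```python
import numpy as np

def lu_no_pivot(A):
    """Elimination A -> U, recording multipliers l_ij in L, so L @ U == A."""
    U = A.astype(float).copy()
    n = U.shape[0]
    L = np.eye(n)
    for j in range(n):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]      # multiplier l_ij (pivot assumed nonzero)
            U[i, :] -= L[i, j] * U[j, :]     # eliminate the entry below the pivot
    return L, U

A = np.array([[2.0, 1.0], [6.0, 8.0]])
L, U = lu_no_pivot(A)
assert np.allclose(L @ U, A)                         # L brings U back to A
assert np.allclose(L, np.tril(L)) and np.allclose(U, np.triu(U))
```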

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
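A short classical Gram-Schmidt sketch (the function name and the 3x2 example are my own; real libraries use the more stable Householder QR):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt: A = QR with orthonormal Q, upper triangular R."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component of column j along earlier q_i
            v = v - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)       # convention: diag(R) > 0
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(2))                  # orthonormal columns
assert np.allclose(R, np.triu(R)) and np.all(np.diag(R) > 0)
```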

Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j - 1) = integral from 0 to 1 of x^(i-1) x^(j-1) dx. Positive definite but extremely small lambda_min and large condition number: H is ill-conditioned.
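A small NumPy illustration (the helper name hilb and the size n = 5 are my choices, mirroring MATLAB's hilb(n)):

```python
import numpy as np

def hilb(n):
    """Hilbert matrix: H_ij = 1/(i + j - 1) with 1-based indices."""
    i, j = np.indices((n, n)) + 1
    return 1.0 / (i + j - 1)

H = hilb(5)
assert np.all(np.linalg.eigvalsh(H) > 0)   # positive definite (all eigenvalues > 0)
assert np.linalg.cond(H) > 1e5             # already badly conditioned at n = 5
```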

Iterative method.
A sequence of steps intended to approach the desired solution.

Jordan form J = M^(-1) A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is lambda_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue lambda_k and one eigenvector.

Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.
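Both laws can be checked with an edge-node incidence matrix. A minimal sketch (the 3-node loop, edge directions, and numbers are my own toy example): A^T y = 0 expresses the Current Law for a loop current y, and the drops Ax coming from node potentials x sum to zero around the closed loop.

```python
import numpy as np

# Incidence matrix for a 3-node loop: edges 1->2, 2->3, 3->1
A = np.array([[-1,  1,  0],
              [ 0, -1,  1],
              [ 1,  0, -1]], dtype=float)

y = np.array([5.0, 5.0, 5.0])       # the same current circulates around the loop
assert np.allclose(A.T @ y, 0)      # Current Law: net current at every node is zero

x = np.array([3.0, 7.0, 1.0])       # node potentials
drops = A @ x                       # potential differences along each edge
assert np.isclose(drops.sum(), 0)   # Voltage Law: drops add to zero around the loop
```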

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

Nullspace N (A)
= All solutions to Ax = 0. Dimension n - r = (# columns) - rank.
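The dimension count can be verified numerically; a sketch using a rank-1 example of my own (the trailing rows of V^T in the SVD give a nullspace basis):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1, n = 3 columns
r = np.linalg.matrix_rank(A)
assert A.shape[1] - r == 2           # dim N(A) = n - r = 3 - 1

_, _, Vt = np.linalg.svd(A)
N = Vt[r:].T                         # columns form a basis for the nullspace
assert np.allclose(A @ N, 0)         # every basis vector solves Ax = 0
```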

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |lambda| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
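These properties are easy to confirm for a rotation (the angle and test vectors below are my own toy numbers):

```python
import numpy as np

t = np.pi / 6
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])           # rotation by 30 degrees

assert np.allclose(Q.T @ Q, np.eye(2))            # Q^T = Q^(-1)
x = np.array([3.0, 4.0]); y = np.array([-1.0, 2.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))   # length preserved
assert np.isclose((Q @ x) @ (Q @ y), x @ y)       # inner products preserved
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1)           # all |lambda| = 1
```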

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S-perp. If the columns of A = basis for S then P = A (A^T A)^(-1) A^T.
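A NumPy sketch of the formula (the plane S and the vector b are my own example): project b onto the column space of A and confirm the stated properties.

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # columns = basis for a plane S
P = A @ np.linalg.inv(A.T @ A) @ A.T                 # projection onto S

b = np.array([6.0, 0.0, 0.0])
p = P @ b                       # closest point to b in S
e = b - p                       # error vector
assert np.allclose(A.T @ e, 0)  # e is perpendicular to S
assert np.allclose(P @ P, P) and np.allclose(P, P.T)           # P^2 = P = P^T
assert np.allclose(np.sort(np.linalg.eigvalsh(P)), [0, 1, 1])  # eigenvalues 0 and 1
```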

Rank one matrix A = uv^T != 0.
Column and row spaces = lines cu and cv.

Reflection matrix (Householder) Q = I - 2uu^T.
Unit vector u is reflected to Qu = -u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^(-1) = Q.
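A quick check with a concrete unit vector (my own choice, picked so u^T x = 0 for the second test vector):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0]) / 3.0          # unit vector (norm 1)
Q = np.eye(3) - 2.0 * np.outer(u, u)         # Householder reflection

assert np.allclose(Q @ u, -u)                # u reflects to -u
x = np.array([2.0, 1.0, -2.0])               # u^T x = (2 + 2 - 4)/3 = 0
assert np.allclose(Q @ x, x)                 # the mirror plane is unchanged
assert np.allclose(Q @ Q, np.eye(3))         # Q^T = Q^(-1) = Q
```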

Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second derivative matrix (d^2 f / dx_i dx_j = Hessian matrix) is indefinite.
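The standard example f(x, y) = x^2 - y^2 makes "indefinite" concrete: the gradient vanishes at the origin, and the Hessian has one positive and one negative eigenvalue.

```python
import numpy as np

H = np.array([[2.0, 0.0], [0.0, -2.0]])    # Hessian of f(x, y) = x^2 - y^2
eig = np.linalg.eigvalsh(H)
assert eig.min() < 0 < eig.max()           # indefinite Hessian => saddle point
```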

Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
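A 2x2 sketch (my own matrix; for K = [[0, -a], [a, 0]] the exponential e^(Kt) has the closed form of a rotation by angle at, which avoids needing a matrix-exponential routine):

```python
import numpy as np

K = np.array([[0.0, -2.0], [2.0, 0.0]])
assert np.allclose(K.T, -K)                          # skew-symmetric
assert np.allclose(np.linalg.eigvals(K).real, 0)     # eigenvalues +/- 2i, pure imaginary

t = 0.7
E = np.array([[np.cos(2*t), -np.sin(2*t)],           # e^(Kt) = rotation by 2t
              [np.sin(2*t),  np.cos(2*t)]])
assert np.allclose(E.T @ E, np.eye(2))               # e^(Kt) is orthogonal
```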

Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
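Both identities checked on small matrices of my own choosing:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [5.0, -2.0]])

# Trace = sum of diagonal entries = sum of eigenvalues
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real)
# Tr AB = Tr BA even though AB != BA in general
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```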

Vector space V.
Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.

Wavelets w_jk(t).
Stretch and shift the time axis to create w_jk(t) = w_00(2^j t - k).
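A sketch using the Haar mother wavelet as w_00 (my choice of w_00; the glossary only states the stretch-and-shift rule):

```python
import numpy as np

def w00(t):
    """Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere."""
    t = np.asarray(t, dtype=float)
    return np.where((0 <= t) & (t < 0.5), 1.0,
           np.where((0.5 <= t) & (t < 1.0), -1.0, 0.0))

def w(j, k, t):
    """Stretched and shifted wavelet w_jk(t) = w_00(2^j t - k)."""
    return w00(2.0**j * np.asarray(t, dtype=float) - k)

# w_10 is w_00 compressed onto [0, 1/2): w_10(t) = w_00(2t)
assert w(1, 0, 0.1) == 1.0 and w(1, 0, 0.3) == -1.0 and w(1, 0, 0.6) == 0.0
```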