Complex Variables and Applications 9th Edition - Solutions by Chapter
- Chapter 1: COMPLEX NUMBERS
- Chapter 2: ANALYTIC FUNCTIONS
- Chapter 3: ELEMENTARY FUNCTIONS
- Chapter 4: INTEGRALS
- Chapter 5: SERIES
- Chapter 6: RESIDUES AND POLES
- Chapter 7: APPLICATIONS OF RESIDUES
- Chapter 8: MAPPING BY ELEMENTARY FUNCTIONS
- Chapter 9: CONFORMAL MAPPING
- Chapter 10: APPLICATIONS OF CONFORMAL MAPPING
- Chapter 11: THE SCHWARZ-CHRISTOFFEL TRANSFORMATION
- Chapter 12: INTEGRAL FORMULAS OF THE POISSON TYPE
Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
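As a small numerical sketch (the 4-node graph and its edge list are invented for the example), the definition can be checked with NumPy:

```python
import numpy as np

# Hypothetical 4-node undirected graph: edges 0-1, 1-2, 2-3, 0-3
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # undirected: each edge goes both ways

symmetric = np.array_equal(A, A.T)  # A = A^T for an undirected graph
```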
Characteristic equation det(A - λI) = 0.
The n roots are the eigenvalues of A.
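A quick check with an illustrative 2x2 matrix: its characteristic polynomial is λ^2 - 4λ + 3 (worked out by hand), and its roots match the eigenvalues NumPy computes.

```python
import numpy as np

# Illustrative symmetric matrix
A = np.array([[2.0, 1.0], [1.0, 2.0]])
# Roots of det(A - lambda*I) = 0 are the eigenvalues
eigenvalues = np.linalg.eigvals(A)
# For this matrix det(A - lambda*I) = lambda^2 - 4*lambda + 3, roots 1 and 3
roots = np.roots([1, -4, 3])
```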
Cross product u x v in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u x v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
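Both properties (perpendicularity and the parallelogram area) can be verified numerically; the vectors here are arbitrary examples:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 2.0, 0.0])
w = np.cross(u, v)          # perpendicular to both u and v
area = np.linalg.norm(w)    # ||u|| ||v|| |sin theta| = parallelogram area
```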
Ellipse (or ellipsoid) x^T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^-1 y||^2 = y^T (AA^T)^-1 y = 1 displayed by eigshow; axis lengths σi.)
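A sketch of the axis-length claim, using an invented diagonal positive definite matrix so the eigenvectors are the coordinate axes: the endpoint of each semi-axis, at distance 1/√λ along an eigenvector, lies on the ellipse x^T Ax = 1.

```python
import numpy as np

# Illustrative positive definite A; the ellipse is x^T A x = 1
A = np.array([[2.0, 0.0], [0.0, 8.0]])
lam, V = np.linalg.eigh(A)           # eigenvalues ascending: 2, 8
axis_lengths = 1 / np.sqrt(lam)      # semi-axis lengths 1/sqrt(lambda)
# The endpoint of each axis satisfies x^T A x = 1
on_ellipse = [float((axis_lengths[i] * V[:, i]) @ A @ (axis_lengths[i] * V[:, i]))
              for i in range(2)]
```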
Free columns of A.
Columns without pivots; these are combinations of earlier columns.
Fundamental theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0, with dimensions r and n - r). Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
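This orthogonality can be checked on an invented 3x4 matrix of rank 2; a nullspace basis comes from the last rows of V^T in the SVD, and every nullspace vector is perpendicular to every row of A:

```python
import numpy as np

# Illustrative 3x4 matrix: row 3 = row 1 + row 2, so rank r = 2
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0, 2.0]])
r = np.linalg.matrix_rank(A)
# Nullspace basis: right singular vectors for the zero singular values
_, _, Vt = np.linalg.svd(A)
nullspace = Vt[r:].T                 # n - r = 2 basis vectors in R^4
perp = np.allclose(A @ nullspace, 0)  # nullspace is perpendicular to the rows
```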
Left inverse A+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
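A direct check of the formula, on an invented 3x2 matrix with full column rank:

```python
import numpy as np

# Illustrative A with full column rank n = 2
A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
A_plus = np.linalg.inv(A.T @ A) @ A.T   # left inverse (A^T A)^-1 A^T
left_identity = A_plus @ A              # equals I_2
```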
Multiplication Ax = x1(column 1) + ... + xn(column n) = combination of columns.
Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |lij| ≤ 1. See condition number.
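A toy implementation of elimination with partial pivoting (a sketch, not library code; the 2x2 input matrix is invented) makes the multiplier bound visible:

```python
import numpy as np

def lu_partial_pivot(A):
    """Toy PA = LU with partial pivoting: pick the largest pivot in each column."""
    A = A.astype(float).copy()
    n = A.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))   # largest available pivot
        if p != k:
            A[[k, p]] = A[[p, k]]
            P[[k, p]] = P[[p, k]]
            L[[k, p], :k] = L[[p, k], :k]
        for i in range(k + 1, n):
            L[i, k] = A[i, k] / A[k, k]       # multiplier, |l_ik| <= 1
            A[i, k:] -= L[i, k] * A[k, k:]
    return P, L, A

P, L, U = lu_partial_pivot(np.array([[1.0, 4.0], [3.0, 2.0]]))
```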
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
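For example (the 3x3 matrix and the row order (2, 0, 1) are arbitrary choices), building P from the rows of I and multiplying reorders the rows of A the same way:

```python
import numpy as np

# Rows of I in the order (2, 0, 1): an even permutation (a 3-cycle)
P = np.eye(3)[[2, 0, 1]]
A = np.arange(9).reshape(3, 3)
reordered = P @ A           # rows of A in the same order
sign = np.linalg.det(P)     # +1 (even) or -1 (odd)
```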
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
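One standard way to compute the polar factors is through the SVD, A = UΣV^T = (UV^T)(VΣV^T) = QH; the 2x2 matrix below is an invented example:

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # illustrative matrix
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                      # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt      # symmetric positive (semi)definite factor
```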
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S then P = A(A^T A)^-1 A^T.
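These properties check out numerically; the basis matrix A (a plane in R^3) and the vector b are invented for the example:

```python
import numpy as np

# Columns of A = basis for the subspace S (an illustrative plane in R^3)
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T   # P = A (A^T A)^-1 A^T
b = np.array([1.0, 2.0, 0.0])
p = P @ b                 # closest point to b in S
e = b - p                 # error, perpendicular to S
```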
Rayleigh quotient q(x) = x^T Ax / x^T x for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
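A numeric sketch of both claims, using an invented symmetric matrix with eigenvalues 1 and 3 and a (seeded) random test vector:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric, eigenvalues 1 and 3
lam, V = np.linalg.eigh(A)

def q(x):
    return (x @ A @ x) / (x @ x)

rng = np.random.default_rng(0)
x = rng.standard_normal(2)
value = q(x)   # always between lambda_min and lambda_max
```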
Right inverse A+.
If A has full row rank m, then A^+ = A^T(AA^T)^-1 has AA^+ = I_m.
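The mirror image of the left-inverse check, on an invented 2x3 matrix with full row rank:

```python
import numpy as np

# Illustrative A with full row rank m = 2
A = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
A_plus = A.T @ np.linalg.inv(A @ A.T)   # right inverse A^T (A A^T)^-1
right_identity = A @ A_plus             # equals I_2
```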
Rotation matrix R = [c -s; s c] rotates the plane by θ and R^-1 = R^T rotates back by -θ. Eigenvalues are e^{iθ} and e^{-iθ}, eigenvectors are (1, ±i). c, s = cos θ, sin θ.
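Both facts can be confirmed for an arbitrary angle (θ = 0.3 here is an illustrative choice):

```python
import numpy as np

theta = 0.3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])
eigenvalues = np.linalg.eigvals(R)   # e^{i theta} and e^{-i theta}
```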
Schur complement S = D - CA^-1 B.
Appears in block elimination on [A B; C D].
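A sketch of that block elimination with invented 2x2 blocks: subtracting CA^-1 times the top block rows zeroes out C and leaves the Schur complement in the bottom-right block.

```python
import numpy as np

# Illustrative 2x2 blocks of [[A, B], [C, D]]
A = np.array([[2.0, 0.0], [0.0, 2.0]])
B = np.array([[1.0, 0.0], [0.0, 1.0]])
C = np.array([[1.0, 1.0], [0.0, 1.0]])
D = np.array([[3.0, 1.0], [1.0, 3.0]])
S = D - C @ np.linalg.inv(A) @ B   # Schur complement of A

M = np.block([[A, B], [C, D]])
# One step of block elimination
E = np.block([[np.eye(2), np.zeros((2, 2))],
              [-C @ np.linalg.inv(A), np.eye(2)]])
eliminated = E @ M
```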
Singular matrix A.
A square matrix that has no inverse: det(A) = 0.
Singular Value Decomposition
(SVD) A = UΣV^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with Avi = σi ui and singular value σi > 0. Last columns are orthonormal bases of the nullspaces.
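The relation Avi = σi ui can be verified pair by pair on an invented matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0], [4.0, 5.0]])   # illustrative matrix
U, s, Vt = np.linalg.svd(A)
# A v_i = sigma_i u_i for each singular pair (v_i is row i of Vt)
checks = [np.allclose(A @ Vt[i], s[i] * U[:, i]) for i in range(len(s))]
```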
Special solutions to As = 0.
One free variable is si = 1, other free variables = 0.
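For a concrete sketch, take an invented matrix already in reduced echelon form with pivots in columns 1 and 3; setting one free variable to 1 and the other to 0 gives the two special solutions, each in the nullspace:

```python
import numpy as np

# Illustrative reduced matrix: pivot columns 1 and 3, free columns 2 and 4
R = np.array([[1.0, 2.0, 0.0, 3.0],
              [0.0, 0.0, 1.0, 4.0]])
# Special solution with free variable x2 = 1 (x4 = 0)
s1 = np.array([-2.0, 1.0, 0.0, 0.0])
# Special solution with free variable x4 = 1 (x2 = 0)
s2 = np.array([-3.0, 0.0, -4.0, 1.0])
```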
Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.