- 51: Derive the mapping in Example 2 from (2).
- 52: (Inverse) Find the inverse of the mapping in Example 1. Show that u...
- 53: Verify the formula (3) for disks.
- 54: Derive the mapping in Example 4 from (2). Find its inverse and prov...
- 55: (Inverse) If w = f(z) is any transformation that has an inverse, pr...
- 56: CAS EXPERIMENT. Linear Fractional Transformations (LFTs). (a) Graph...
- 57: Find the LFT that maps the given three points onto the three given ...
- 58: Find the LFT that maps the given three points onto the three given ...
- 59: Find the LFT that maps the given three points onto the three given ...
- 60: Find the LFT that maps the given three points onto the three given ...
- 61: Find the LFT that maps the given three points onto the three given ...
- 62: Find the LFT that maps the given three points onto the three given ...
- 63: Find the LFT that maps the given three points onto the three given ...
- 64: Find the LFT that maps the given three points onto the three given ...
- 65: Find the LFT that maps the given three points onto the three given ...
- 66: Find all LFTs w(z) that map the x-axis onto the u-axis.
- 67: Find an LFT that maps |z| ≤ 1 onto |w| ≤ 1 so that z = i/2 is map...
- 68: Find an analytic function that maps the second quadrant of the z-pl...
- 69: Find an analytic function w = f(z) that maps the region 0 ≤ arg z ≤...
- 70: (Composite) Show that the composite of two LFTs is an LFT.
Solutions for Chapter 17.3: Special Linear Fractional Transformations
Full solutions for Advanced Engineering Mathematics | 9th Edition
Cholesky factorization.
A = CᵀC = (L√D)(L√D)ᵀ for positive definite A.
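A quick numerical check of this factorization, sketched with NumPy (not part of the glossary). `np.linalg.cholesky` returns the lower-triangular factor L with A = L Lᵀ, so C = Lᵀ gives the CᵀC form above.

```python
import numpy as np

# A positive definite matrix (symmetric, positive eigenvalues)
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# np.linalg.cholesky returns lower-triangular L with A = L @ L.T;
# taking C = L.T recovers the glossary form A = C.T @ C
L = np.linalg.cholesky(A)
C = L.T
ok = np.allclose(A, C.T @ C)
```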
Cross product u × v in R³:
Vector perpendicular to u and v, length ‖u‖ ‖v‖ |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
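A minimal NumPy check of the two properties above (perpendicularity and parallelogram area), added as an illustration:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])

w = np.cross(u, v)                                   # perpendicular to u and v
perp = np.isclose(w @ u, 0.0) and np.isclose(w @ v, 0.0)

# length of u x v = area of the parallelogram = ||u|| ||v|| |sin(theta)|
# here theta = 45 degrees, so area = 1 * sqrt(2) * sin(45) = 1
area = np.linalg.norm(w)
```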
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
Diagonalization.
Λ = S⁻¹AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
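The diagonalization identities can be verified numerically; a short NumPy sketch (the example matrix is an assumption, chosen for its distinct eigenvalues):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])              # distinct eigenvalues 2 and 3

lam, S = np.linalg.eig(A)               # eigenvectors in the columns of S
Lam = np.diag(lam)

# S^-1 A S = Lambda, and A^k = S Lambda^k S^-1
diag_ok = np.allclose(np.linalg.inv(S) @ A @ S, Lam)
power_ok = np.allclose(np.linalg.matrix_power(A, 5),
                       S @ np.linalg.matrix_power(Lam, 5) @ np.linalg.inv(S))
```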
Hypercube matrix P².
Row n + 1 counts corners, edges, faces, ... of a cube in Rⁿ.
Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
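Building such a matrix is a one-loop job; a small illustrative sketch (the 3-node graph is an assumption):

```python
import numpy as np

# Directed graph on 3 nodes with edges 0->1, 1->2, 0->2
edges = [(0, 1), (1, 2), (0, 2)]
n = 3

M = np.zeros((len(edges), n))
for row, (i, j) in enumerate(edges):
    M[row, i] = -1.0                    # edge leaves node i
    M[row, j] = +1.0                    # edge enters node j

# every row contains exactly one -1 and one +1, so rows sum to zero
rows_sum_zero = np.allclose(M.sum(axis=1), 0.0)
```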
Jordan form J = M⁻¹AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λk Ik + Nk where Nk has 1's on diagonal 1. Each block has one eigenvalue λk and one eigenvector.
|A⁻¹| = 1/|A| and |Aᵀ| = |A|.
The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n - 1, volume of box = |det(A)|.
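The two determinant rules above are easy to confirm numerically; a minimal NumPy check (the matrix is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])              # det(A) = 5, invertible

# |A^-1| = 1/|A| and |A^T| = |A|
inv_rule = np.isclose(np.linalg.det(np.linalg.inv(A)),
                      1.0 / np.linalg.det(A))
transpose_rule = np.isclose(np.linalg.det(A.T), np.linalg.det(A))
```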
Left inverse A+.
If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = I (n by n).
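A direct NumPy computation of this left inverse, included as an illustration (the tall matrix is an assumed example with full column rank):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])              # full column rank n = 2

# left inverse A+ = (A^T A)^-1 A^T, so A+ A = I_2
A_plus = np.linalg.inv(A.T @ A) @ A.T
left_ok = np.allclose(A_plus @ A, np.eye(2))
```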
Minimal polynomial of A.
The lowest degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
Network.
A directed graph that has constants c1, ..., cm associated with the edges.
‖A‖.
The "ℓ² norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σmax. Then ‖Ax‖ ≤ ‖A‖ ‖x‖, ‖AB‖ ≤ ‖A‖ ‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. Frobenius norm: ‖A‖F² = ΣΣ a_ij². The ℓ¹ and ℓ∞ norms are the largest column and row sums of |a_ij|.
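NumPy exposes all four of these norms through `np.linalg.norm`; a short sketch checking each against its definition (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

# l2 norm = largest singular value sigma_max
sigma_max = np.linalg.svd(A, compute_uv=False)[0]
l2_ok = np.isclose(np.linalg.norm(A, 2), sigma_max)

# Frobenius norm squared = sum of all a_ij^2
frob_ok = np.isclose(np.linalg.norm(A, 'fro') ** 2, np.sum(A ** 2))

# l1 norm = largest column sum, l-infinity norm = largest row sum of |a_ij|
l1_ok = np.isclose(np.linalg.norm(A, 1), np.abs(A).sum(axis=0).max())
linf_ok = np.isclose(np.linalg.norm(A, np.inf), np.abs(A).sum(axis=1).max())
```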
Normal matrix N.
If NNᵀ = NᵀN, then N has orthonormal (complex) eigenvectors.
Nullspace matrix N.
The columns of N are the n - r special solutions to As = O.
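One way to compute a nullspace matrix numerically is via the SVD; this sketch returns an orthonormal basis for the nullspace (not the special-solution columns themselves, which come from elimination), with an assumed rank-1 example:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank r = 1, so n - r = 2 nullspace vectors

_, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))             # numerical rank
N = Vt[r:].T                           # columns of N span the nullspace of A

null_ok = np.allclose(A @ N, 0.0)
dim_ok = N.shape[1] == A.shape[1] - r  # n - r columns
```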
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
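The polar factors can be built from the SVD A = UΣVᵀ via Q = UVᵀ and H = VΣVᵀ; a minimal NumPy sketch (the matrix is an assumed example):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# From A = U S V^T: Q = U V^T (orthogonal), H = V S V^T (positive semidefinite)
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt
H = Vt.T @ np.diag(s) @ Vt

recon_ok = np.allclose(A, Q @ H)                       # A = Q H
orth_ok = np.allclose(Q.T @ Q, np.eye(2))              # Q orthogonal
psd_ok = bool(np.all(np.linalg.eigvalsh(H) >= -1e-12)) # H positive semidefinite
```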
Rayleigh quotient q(x) = xᵀAx / xᵀx for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
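A numerical illustration of both claims (the symmetric matrix and the random sampling are assumptions for the demo):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # symmetric, eigenvalues 1 and 3

lam, X = np.linalg.eigh(A)              # lam ascending, eigenvectors in columns

def q(x):
    return (x @ A @ x) / (x @ x)        # Rayleigh quotient

# random vectors stay between lambda_min and lambda_max ...
rng = np.random.default_rng(0)
samples = [q(rng.standard_normal(2)) for _ in range(100)]
bounds_ok = all(lam[0] - 1e-9 <= s <= lam[-1] + 1e-9 for s in samples)

# ... and the extremes are hit exactly at the eigenvectors
extremes_ok = np.isclose(q(X[:, 0]), lam[0]) and np.isclose(q(X[:, -1]), lam[-1])
```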
Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.
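A concrete instance: f(x, y) = x² - y² has a saddle at the origin, where the gradient vanishes and the Hessian is indefinite (this worked example is an addition, not from the glossary):

```python
import numpy as np

# f(x, y) = x^2 - y^2: gradient (2x, -2y) is zero at the origin,
# and the (constant) Hessian has one positive and one negative eigenvalue
H = np.array([[2.0,  0.0],
              [0.0, -2.0]])

eigs = np.linalg.eigvalsh(H)
indefinite = bool(eigs.min() < 0 < eigs.max())
```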
Similar matrices A and B.
Every B = M⁻¹AM has the same eigenvalues as A.
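A quick NumPy confirmation that a similarity transform preserves eigenvalues (A and M are assumed examples; any invertible M works):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])              # any invertible M

B = np.linalg.inv(M) @ A @ M            # B is similar to A

# same eigenvalues (compare as sorted multisets)
same_eigs = np.allclose(np.sort(np.linalg.eigvals(A)),
                        np.sort(np.linalg.eigvals(B)))
```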
Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
Volume of box.
The rows (or the columns) of A generate a box with volume |det(A)|.
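For a rectangular box this reduces to the familiar product of edge lengths; a one-line NumPy check (the 2 × 3 × 4 box is an assumed example):

```python
import numpy as np

# Rows of A are the edge vectors of a 2 x 3 x 4 rectangular box
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])

volume = abs(np.linalg.det(A))          # |det(A)| = 2 * 3 * 4 = 24
```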