- 10.13.1: The self-similar set in Figure Ex-1 has the sizes indicated. Given ...
- 10.13.2: Find the Hausdorff dimension of the self-similar set shown in Figur...
- 10.13.3: Each of the 12 self-similar sets in Figure Ex-3 results from three ...
- 10.13.4: For each of the self-similar sets in Figure Ex-4, find: (i) the sca...
- 10.13.5: Show that of the four affine transformations shown in Figure 10.13....
- 10.13.6: Find the coordinates of the tip of the fern in Figure 10.13.22. [Hi...
- 10.13.7: The square in Figure 10.13.7a was expressed as the union of 4 nonov...
- 10.13.8: Show that the four similitudes express the unit square as the union...
- 10.13.9: All of the results in this section can be extended to . Compute the...
- 10.13.10: The set in Figure Ex-10 is called the Menger sponge. It is a sel...
- 10.13.11: The two similitudes and determine a fractal known as the Cantor set...
- 10.13.12: Compute the areas of the sets , , , , and in Figure 10.13.15.
- 10.13.T1: Use similitudes of the form to show that the Menger sponge (see Exe...
- 10.13.T2: Generalize the ideas involved in the Cantor set (in ), the Sierpins...
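Several of the exercises above ask for the Hausdorff dimension of a self-similar set. For a set built from N congruent pieces, each a copy of the whole scaled by a factor s, that dimension equals the similarity dimension ln N / ln(1/s). A minimal sketch (the function name is my own; the Cantor/Sierpinski/Menger parameters are the standard ones):

```python
import math

def similarity_dimension(n_copies, scale):
    """Similarity dimension of a self-similar set made of n_copies
    pieces, each scaled by the factor `scale` (0 < scale < 1)."""
    return math.log(n_copies) / math.log(1 / scale)

# Classic examples related to this chapter:
cantor     = similarity_dimension(2, 1/3)    # Cantor set: 2 copies at scale 1/3, ~0.631
sierpinski = similarity_dimension(3, 1/2)    # Sierpinski triangle: 3 copies at 1/2, ~1.585
carpet     = similarity_dimension(8, 1/3)    # Sierpinski carpet: 8 copies at 1/3, ~1.893
sponge     = similarity_dimension(20, 1/3)   # Menger sponge: 20 copies at 1/3, ~2.727
```

The same formula answers the Menger-sponge and Cantor-set exercises directly once N and s are read off the figure.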
Solutions for Chapter 10.13: Fractals
Full solutions for Elementary Linear Algebra: Applications Version | 10th Edition
Affine transformation Tv = Av + v0 = linear transformation plus shift.
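The affine map Tv = Av + v0 is the building block of the iterated-function-system fractals in this chapter. A small sketch applying one such map (the matrix and shift below are my own illustrative choice of a quarter-scale similitude, not a specific one from the text):

```python
def affine(A, v0, v):
    """Apply T(v) = A v + v0 for a 2x2 matrix A and 2-vectors v0, v."""
    return (A[0][0] * v[0] + A[0][1] * v[1] + v0[0],
            A[1][0] * v[0] + A[1][1] * v[1] + v0[1])

# A similitude: scale by 1/2, then shift right by 1/2 -- the kind of map
# that carries the unit square onto one of its quarter-size copies.
A  = ((0.5, 0.0), (0.0, 0.5))
v0 = (0.5, 0.0)
print(affine(A, v0, (1.0, 1.0)))   # corner (1, 1) maps to (1.0, 0.5)
```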
Back substitution. Upper triangular systems are solved in reverse order, x_n back to x_1.
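Back substitution can be sketched in a few lines (the function name and the 2-by-2 example are mine):

```python
def back_substitute(U, b):
    """Solve Ux = b for upper triangular U, working from x_n back to x_1."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / U[i][i]
    return x

# Example: 2x + y = 5 and 3y = 6, so y = 2 first, then x = 1.5.
print(back_substitute([[2, 1], [0, 3]], [5, 6]))   # [1.5, 2.0]
```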
Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
Diagonalization Λ = S^-1 A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
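The power formula A^k = S Λ^k S^-1 can be checked numerically. A small sketch with a matrix whose eigendata are easy to read off (the particular A, S, and k are my own example):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A = [[2, 1], [0, 3]] has eigenvalues 2, 3 with eigenvectors (1,0), (1,1).
S     = [[1, 1], [0, 1]]          # eigenvector matrix
S_inv = [[1, -1], [0, 1]]
k  = 4
Lk = [[2**k, 0], [0, 3**k]]       # Lambda^k: just power the eigenvalues
Ak = matmul(matmul(S, Lk), S_inv) # A^k = S Lambda^k S^-1

# Compare with direct repeated multiplication:
A = [[2, 1], [0, 3]]
P = [[1, 0], [0, 1]]
for _ in range(k):
    P = matmul(P, A)
print(Ak == P)   # True
```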
Free columns of A.
Columns without pivots; these are combinations of earlier columns.
Kirchhoff's Laws. Current Law: net current (in minus out) is zero at each node. Voltage Law: potential differences (voltage drops) add to zero around any closed loop.
Kronecker product (tensor product) A ® B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).
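The block structure and the eigenvalue product rule for A ⊗ B are easy to see with diagonal matrices, whose eigenvalues sit on the diagonal. A sketch (the `kron` helper and the sample matrices are my own):

```python
def kron(A, B):
    """Kronecker product: block (i, j) of the result is A[i][j] * B."""
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)] for i in range(len(A) * p)]

# Diagonal A and B: eigenvalues are the diagonal entries, and the
# eigenvalues of kron(A, B) are all products lambda_p(A) * lambda_q(B).
A = [[2, 0], [0, 3]]
B = [[5, 0], [0, 7]]
K = kron(A, B)
print([K[i][i] for i in range(4)])   # [10, 14, 15, 21]
```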
Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.
Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax̂) = 0.
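Fitting a straight line is the standard use of the normal equations. A self-contained sketch that forms A^T A and A^T b by hand and solves the 2-by-2 system by Cramer's rule (the data points are my own example):

```python
# Least squares line y = c + d*t through (t_i, b_i) via A^T A x = A^T b.
ts = [0.0, 1.0, 2.0]
bs = [1.0, 2.0, 4.0]
A = [[1.0, t] for t in ts]         # columns: all-ones and the t values

# Form the 2x2 normal equations directly.
ata = [[sum(A[i][r] * A[i][c] for i in range(3)) for c in range(2)]
       for r in range(2)]
atb = [sum(A[i][r] * bs[i] for i in range(3)) for r in range(2)]

# Solve the 2x2 system by Cramer's rule.
det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
c = (atb[0] * ata[1][1] - ata[0][1] * atb[1]) / det
d = (ata[0][0] * atb[1] - atb[0] * ata[1][0]) / det
print(c, d)   # c = 5/6, d = 1.5: best line y = 5/6 + 1.5 t
```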
Nullspace matrix N.
The columns of N are the n - r special solutions to As = 0.
Orthogonal subspaces. Every v in V is orthogonal to every w in W.
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
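The expansion v = Σ (v^T q_j) q_j can be verified directly with a rotated orthonormal basis of R^2 (the angle and test vector are my own choices):

```python
import math

th = 0.7
q1 = (math.cos(th), math.sin(th))      # orthonormal columns of a 2x2 Q
q2 = (-math.sin(th), math.cos(th))

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Q^T Q = I: unit lengths, zero dot product.
print(dot(q1, q1), dot(q1, q2))        # approximately 1 and 0

# Any v expands as v = (v.q1) q1 + (v.q2) q2.
v = (3.0, -1.0)
c1, c2 = dot(v, q1), dot(v, q2)
rebuilt = (c1 * q1[0] + c2 * q2[0], c1 * q1[1] + c2 * q2[1])
print(rebuilt)                          # approximately (3.0, -1.0)
```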
Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
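The projection formula and the key perpendicularity property (the error b - p is orthogonal to a) can be checked in a few lines (the vectors a and b are my own example):

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a = (1.0, 2.0)
b = (3.0, 1.0)
c = dot(a, b) / dot(a, a)            # a^T b / a^T a = 5/5 = 1
p = (c * a[0], c * a[1])             # projection of b onto the line through a
e = (b[0] - p[0], b[1] - p[1])       # error b - p
print(p, dot(a, e))                  # (1.0, 2.0) and 0.0: error is perpendicular to a
```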
Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂x_i ∂x_j = Hessian matrix) is indefinite.
Similar matrices A and B.
Every B = M^-1 A M has the same eigenvalues as A.
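Since similar matrices share a characteristic polynomial, their trace and determinant must agree, which gives a quick numerical check (the matrices A and M below are my own example):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A     = [[2, 1], [0, 3]]
M     = [[1, 1], [0, 1]]
M_inv = [[1, -1], [0, 1]]
B = matmul(M_inv, matmul(A, M))      # B = M^-1 A M

# Same eigenvalues => same trace (sum) and same determinant (product).
tr  = lambda X: X[0][0] + X[1][1]
det = lambda X: X[0][0] * X[1][1] - X[0][1] * X[1][0]
print(tr(A) == tr(B), det(A) == det(B))   # True True
```

In this example B comes out diagonal, [[2, 0], [0, 3]], because M happens to be the eigenvector matrix of A.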
Special solutions to As = O.
One free variable is s_i = 1, other free variables = 0.
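A concrete sketch of one special solution (the echelon matrix below is my own example, not from the text): pivots sit in columns 1 and 3, column 2 is free, so set x_2 = 1 and back-substitute for the pivot variables.

```python
# A in echelon form with pivots in columns 1 and 3; column 2 is free.
A = [[1, 2, 1],
     [0, 0, 1]]

# Special solution: free variable x2 = 1.
# Row 2 gives x3 = 0; row 1 gives x1 = -2*x2 - x3 = -2.
s = [-2, 1, 0]
print([sum(A[i][j] * s[j] for j in range(3)) for i in range(2)])   # [0, 0]
```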
Transpose matrix A^T.
Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^T)^-1.
Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.
For matrix norms, ‖A + B‖ ≤ ‖A‖ + ‖B‖.
Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c_0 + ... + c_{n-1} x^{n-1} with p(x_i) = b_i. V_ij = (x_i)^{j-1} and det V = product of (x_k - x_i) for k > i.
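The determinant formula can be verified for a small case by comparing a direct 3-by-3 cofactor expansion against the product of differences (the sample points x_i = 1, 2, 3 are my own choice):

```python
xs = [1.0, 2.0, 3.0]
V = [[x ** j for j in range(3)] for x in xs]   # V[i][j] = x_i^j (0-based powers)

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 1."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

# det V should equal the product of (x_k - x_i) over k > i.
product = 1.0
for k in range(3):
    for i in range(k):
        product *= xs[k] - xs[i]

print(det3(V), product)   # both 2.0: (2-1)(3-1)(3-2) = 2
```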