10.12.1: The self-similar set in Figure Ex1 has the sizes indicated. Given ...
10.12.2: Find the Hausdorff dimension of the self-similar set shown in Figur...
10.12.3: Each of the 12 self-similar sets in Figure Ex3 results from three ...
10.12.4: For each of the self-similar sets in Figure Ex4, find: (i) the sca...
 10.12.5: Show that of the four affine transformations shown in Figure 10.12....
 10.12.6: Find the coordinates of the tip of the fern in Figure 10.12.22. [Hi...
 10.12.7: The square in Figure 10.12.7a was expressed as the union of 4 nonov...
10.12.8: Show that the four similitudes T1(x, y) = (3/4)[1 0; 0 1](x, y), T2(x, y) = (3 ...
 10.12.9: All of the results in this section can be extended to Rn. Compute t...
 10.12.10: The set in R3 in Figure Ex10 is called the Menger sponge. It is a ...
10.12.11: The two similitudes T1(x, y) = (1/3)[1 0; 0 1](x, y) and T2(x, y) = (1/3)[1 0; 0...
 10.12.12: Compute the areas of the sets S0, S1, S2, S3, and S4 in Figure 10.1...
10.12.T1: Use similitudes of the form Ti(x, y, z) = (1/3)[1 0 0; 0 1 0; 0 0 1](x, y, z) + (ai, b...
 10.12.T2: Generalize the ideas involved in the Cantor set (in R1), the Sierpi...
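Most of the exercises above turn on the similarity-dimension formula: a self-similar set built from N nonoverlapping copies of itself, each scaled by a factor s, has Hausdorff dimension d = log N / log(1/s), i.e. the d solving N·s^d = 1. A minimal sketch; the example sets are classic ones, not the book's figures:

```python
from math import log

def similarity_dimension(n_copies: int, scale: float) -> float:
    """Dimension d solving n_copies * scale**d = 1, i.e. d = log N / log(1/s)."""
    return log(n_copies) / log(1.0 / scale)

# Classic examples (illustrative, not the figures from the exercises):
cantor = similarity_dimension(2, 1/3)        # Cantor set: 2 copies at scale 1/3
sierpinski = similarity_dimension(3, 1/2)    # Sierpinski triangle: 3 copies at scale 1/2
menger = similarity_dimension(20, 1/3)       # Menger sponge (cf. Exercise 10): 20 copies at scale 1/3
```

The same one-line computation answers any "find the Hausdorff dimension" exercise once N and s are read off the figure.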
Solutions for Chapter 10.12: Fractals
Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version, 11th Edition
ISBN: 9781118474228
Chapter 10.12: Fractals includes 14 full step-by-step solutions.

Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases, and each basis gives unique c's.

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

Cholesky factorization
A = CᵀC = (L√D)(L√D)ᵀ for positive definite A.
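Writing C = (L√D)ᵀ, the product CᵀC is the familiar LLᵀ form. A hand-rolled 2×2 sketch (illustrative matrix, not a library routine):

```python
from math import sqrt

def cholesky2(a):
    """Lower-triangular L with A = L Lᵀ, for a 2x2 positive definite A."""
    l11 = sqrt(a[0][0])
    l21 = a[1][0] / l11
    l22 = sqrt(a[1][1] - l21 * l21)
    return [[l11, 0.0], [l21, l22]]

A = [[4.0, 2.0], [2.0, 3.0]]
L = cholesky2(A)
# Reassemble A: (L Lᵀ)[i][j] = sum_k L[i][k] * L[j][k]
LLt = [[sum(L[i][k] * L[j][k] for k in range(2)) for j in range(2)] for i in range(2)]
```

Positive definiteness is what guarantees the arguments of both square roots are positive.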

Column space C(A).
The space of all combinations of the columns of A.

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Determinant |A| = det(A).
Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |Aᵀ| = |A|.
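The product rule |AB| = |A||B| is easy to confirm numerically; a quick check with illustrative 2×2 matrices:

```python
def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul2(a, b):
    """Product of two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]   # det = -2
B = [[0, 1], [5, 6]]   # det = -5
AB = matmul2(A, B)
# det(AB) should equal det(A) * det(B) = (-2)(-5) = 10
```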

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −eij in the i, j entry (i ≠ j). Then Eij A subtracts eij times row j of A from row i.
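The row-operation effect of Eij can be verified directly; a minimal sketch with e21 = 3 (illustrative numbers):

```python
def elim_matrix(n, i, j, e):
    """Identity with entry -e in position (i, j): multiplying E A subtracts e * row j from row i."""
    E = [[float(r == c) for c in range(n)] for r in range(n)]
    E[i][j] = -e
    return E

def matmul(a, b):
    """General matrix product of nested lists."""
    return [[sum(a[r][k] * b[k][c] for k in range(len(b))) for c in range(len(b[0]))] for r in range(len(a))]

A = [[2.0, 1.0], [6.0, 4.0]]
E = elim_matrix(2, 1, 0, 3.0)   # subtract 3 * row 0 from row 1
EA = matmul(E, A)               # row 1 becomes [6 - 3*2, 4 - 3*1] = [0, 1]
```

This is exactly one step of Gaussian elimination, packaged as a matrix multiplication.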

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn−1 + Fn−2 = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
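The growth-rate claim can be checked by computing the ratio of consecutive Fibonacci numbers, which converges to λ1 = (1 + √5)/2; a short sketch:

```python
from math import sqrt

def fib(n):
    """Iterative Fibonacci with F0 = 0, F1 = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

golden = (1 + sqrt(5)) / 2     # largest eigenvalue of [1 1; 1 0]
ratio = fib(21) / fib(20)      # consecutive-term ratio, already close to golden
```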

Gauss–Jordan method.
Invert A by row operations on [A I] to reach [I A⁻¹].
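A pure-Python sketch of the method: row-reduce the augmented matrix [A I] until the left half is I, and the right half is A⁻¹. Pivoting and singular-A handling are omitted for brevity, and the example matrix is illustrative:

```python
def invert(a):
    """Row-reduce [A | I] to [I | A^-1]; assumes A is invertible with nonzero pivots."""
    n = len(a)
    # Build the augmented matrix [A | I].
    aug = [row[:] + [float(r == c) for c in range(n)] for r, row in enumerate(a)]
    for col in range(n):
        pivot = aug[col][col]
        aug[col] = [x / pivot for x in aug[col]]          # scale pivot row to make pivot 1
        for r in range(n):
            if r != col:
                factor = aug[r][col]
                aug[r] = [x - factor * p for x, p in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]                        # right half is A^-1

A = [[4.0, 7.0], [2.0, 6.0]]
Ainv = invert(A)   # ≈ [[0.6, -0.7], [-0.2, 0.4]], since det A = 10
```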

Gram–Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column qj of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
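A hand-rolled sketch of classical Gram–Schmidt on the columns of A (illustrative vectors; classical rather than modified GS, for clarity):

```python
from math import sqrt

def gram_schmidt(cols):
    """Classical Gram-Schmidt on a list of column vectors; returns (Q, R) with A = QR."""
    n = len(cols)
    q, r = [], [[0.0] * n for _ in range(n)]
    for j, a in enumerate(cols):
        v = a[:]
        for i in range(j):
            r[i][j] = sum(qi * ak for qi, ak in zip(q[i], a))   # projection coefficient
            v = [vk - r[i][j] * qk for vk, qk in zip(v, q[i])]  # subtract the projection
        r[j][j] = sqrt(sum(vk * vk for vk in v))                # positive diagonal of R
        q.append([vk / r[j][j] for vk in v])
    return q, r

a1, a2 = [1.0, 1.0, 0.0], [1.0, 0.0, 1.0]   # independent columns of A (illustrative)
Q, R = gram_schmidt([a1, a2])
```

By construction R is upper triangular with positive diagonal, and the columns of Q are orthonormal.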

Independent vectors v1, ..., vk.
No combination c1v1 + ... + ckvk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

Particular solution xp.
Any solution to Ax = b; often xp has free variables = 0.

Pseudoinverse A⁺ (Moore–Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A⁺) = N(Aᵀ). A⁺A and AA⁺ are the projection matrices onto the row space and column space. Rank(A⁺) = rank(A).
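When A has independent columns, the pseudoinverse reduces to A⁺ = (AᵀA)⁻¹Aᵀ, and A⁺A is the identity (projection onto the full row space). A pure-Python sketch for a 3×2 example (illustrative data, hand-rolled 2×2 inverse):

```python
def transpose(m):
    return [list(col) for col in zip(*m)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b))) for j in range(len(b[0]))] for i in range(len(a))]

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    d = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / d, -m[0][1] / d], [-m[1][0] / d, m[0][0] / d]]

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # 3x2 with independent columns
At = transpose(A)
Aplus = matmul(inv2(matmul(At, A)), At)    # A+ = (AᵀA)⁻¹ Aᵀ
AplusA = matmul(Aplus, A)                  # should be (numerically) the 2x2 identity
```

For rank-deficient A the (AᵀA)⁻¹ formula breaks down and the SVD definition of A⁺ takes over.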

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Skew-symmetric matrix K.
The transpose is −K, since Kij = −Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
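For the 2×2 example K = [0 −1; 1 0], both claims are easy to check numerically: Kᵀ = −K, and e^(Kt) is the rotation matrix through angle t (the closed form of the exponential for this particular K), which is orthogonal. A sketch with an illustrative t:

```python
from math import cos, sin

K = [[0.0, -1.0], [1.0, 0.0]]
# Skew-symmetry: K[i][j] == -K[j][i] for all entries.
skew_ok = all(K[i][j] == -K[j][i] for i in range(2) for j in range(2))

t = 0.7
eKt = [[cos(t), -sin(t)], [sin(t), cos(t)]]   # closed form of exp(Kt) for this K
# Orthogonality check: (eKt)ᵀ eKt should be the identity.
QtQ = [[sum(eKt[k][i] * eKt[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
```

The diagonal of QtQ is cos²t + sin²t = 1 and the off-diagonal entries cancel, so e^(Kt) is orthogonal for every t.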

Standard basis for Rn.
Columns of the n by n identity matrix (written i, j, k in R3).

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = AᵀCA where C has spring constants from Hooke's law and Ax = stretching.
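A minimal sketch for a chain of two springs fixed at the top (the spring constants and stretching matrix A are illustrative): the stretches are e1 = x1 and e2 = x2 − x1, C is diagonal with the spring constants, and K = AᵀCA comes out symmetric with the familiar two-spring pattern:

```python
def transpose(m):
    return [list(c) for c in zip(*m)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b))) for j in range(len(b[0]))] for i in range(len(a))]

c1, c2 = 3.0, 5.0                        # spring constants (Hooke's law), illustrative
A = [[1.0, 0.0], [-1.0, 1.0]]            # stretches: e1 = x1, e2 = x2 - x1
C = [[c1, 0.0], [0.0, c2]]               # material/constitutive matrix
K = matmul(matmul(transpose(A), C), A)   # stiffness: [[c1 + c2, -c2], [-c2, c2]]
```

Given node movements x, the internal forces are then Kx.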

Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs of eigenvalues in Λ = signs of pivots in D.

Trace of A.
The sum of the diagonal entries = sum of the eigenvalues of A. Tr AB = Tr BA.
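The identity Tr AB = Tr BA holds even when AB ≠ BA; a quick check with illustrative 2×2 matrices:

```python
def trace(m):
    """Sum of the diagonal entries."""
    return sum(m[i][i] for i in range(len(m)))

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b))) for j in range(len(b[0]))] for i in range(len(a))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
# AB and BA differ, but their traces agree (both 69 here).
AB, BA = matmul(A, B), matmul(B, A)
```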

Volume of box.
The rows (or the columns) of A generate a box with volume I det(A) I.
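In 2 dimensions the "box" is a parallelogram and its area is |det A|; a quick check with illustrative rows:

```python
def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Rows of A span a parallelogram in R^2 with base 3 and height 2.
A = [[3.0, 0.0], [1.0, 2.0]]
area = abs(det2(A))   # base * height = 6
```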