- 10.9.1: View 9 is a view of a square with vertices (0, 0, 0), (1, 0, 0), (1...
- 10.9.2: (a) If the coordinate matrix of View 9 is multiplied by the matrix ...
- 10.9.3: (a) The reflection about the xz-plane is defined as the transformat...
- 10.9.4: (a) View 13 is View 1 subject to the following five transformations...
- 10.9.5: (a) View 14 is View 1 subject to the following seven transformation...
- 10.9.6: Suppose that a view with coordinate matrix P is to be rotated throu...
- 10.9.7: This exercise illustrates a technique for translating a point with ...
- 10.9.8: For the three rotation matrices given with Views 4, 5, and 6, sho...
- 10.9.T1: Let (a, b, c) be a unit vector normal to the plane ax + by + cz = 0...
- 10.9.T2: A vector v = (x, y, z) is rotated by an angle about an axis having ...
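Exercise 10.9.T2 concerns rotating a vector about an axis through the origin; one standard way to compute such a rotation is Rodrigues' formula, v_rot = v cos θ + (k × v) sin θ + k (k · v)(1 − cos θ) for a unit axis k. A minimal NumPy sketch (the function name and test vectors are illustrative, not taken from the exercise):

```python
import numpy as np

def rotate_about_axis(v, k, theta):
    """Rodrigues' formula: rotate v by angle theta about the unit axis k."""
    v, k = np.asarray(v, float), np.asarray(k, float)
    return (v * np.cos(theta)
            + np.cross(k, v) * np.sin(theta)
            + k * np.dot(k, v) * (1 - np.cos(theta)))

# Rotating (1, 0, 0) by 90 degrees about the z-axis gives (0, 1, 0).
v_rot = rotate_about_axis([1, 0, 0], [0, 0, 1], np.pi / 2)
```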
Solutions for Chapter 10.9: Computer Graphics
Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version | 11th Edition
Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases, and each basis gives unique c's.
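The uniqueness of the coefficients can be seen numerically: with basis vectors as the columns of an invertible matrix B, solving Bc = v produces exactly one coefficient vector c. A small NumPy sketch (the basis and target vector are arbitrary examples):

```python
import numpy as np

# Basis vectors v1, v2 of R^2, stored as the columns of B.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
v = np.array([3.0, 4.0])

# The unique coefficients c satisfy B c = v.
c = np.linalg.solve(B, v)
```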
Characteristic equation det(A - λI) = 0.
The n roots are the eigenvalues of A.
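For a 2 by 2 matrix the characteristic polynomial is λ^2 - (trace)λ + det, and its roots agree with the eigenvalues. A quick NumPy check (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic polynomial of a 2x2 matrix: lam^2 - trace(A)*lam + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(coeffs))
eigs = np.sort(np.linalg.eigvals(A))
```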
Cofactor Cij.
Remove row i and column j; multiply that determinant by (-1)^(i+j).
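The delete-a-row-and-column rule can be coded directly, and the cofactors along a row expand the determinant. A small NumPy sketch (the function name and matrix are illustrative):

```python
import numpy as np

def cofactor(A, i, j):
    """Delete row i and column j, then multiply the minor's determinant by (-1)**(i+j)."""
    M = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(M)

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Cofactor expansion along row 0 recovers det(A).
det_by_expansion = sum(A[0, j] * cofactor(A, 0, j) for j in range(2))
```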
Cross product u xv in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram; u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
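Both properties — perpendicularity and the parallelogram area — are easy to verify numerically. A small NumPy sketch (the vectors are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 2.0, 0.0])

w = np.cross(u, v)          # perpendicular to both u and v
area = np.linalg.norm(w)    # area of the parallelogram on u and v
```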
Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
Diagonalization Λ = S^-1 A S.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
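Both identities — S^-1 A S = Λ and A^k = S Λ^k S^-1 — can be checked numerically. A small NumPy sketch (the matrix, with distinct eigenvalues 5 and 2, is an arbitrary example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, S = np.linalg.eig(A)        # eigenvalues and eigenvector columns
Lam = np.diag(lam)

# S^-1 A S = Lambda, and A^3 = S Lambda^3 S^-1.
check = np.linalg.inv(S) @ A @ S
A_cubed = S @ np.linalg.matrix_power(Lam, 3) @ np.linalg.inv(S)
```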
Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
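The row operation can be seen by building such a matrix and multiplying. A small NumPy sketch (the function name and matrix are illustrative):

```python
import numpy as np

def elimination_matrix(n, i, j, l):
    """Identity with an extra -l in entry (i, j); E @ A subtracts l * (row j) from row i."""
    E = np.eye(n)
    E[i, j] = -l
    return E

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * row 0 from row 1
U = E @ A                              # upper triangular after this step
```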
Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).
Use A^H (conjugate transpose) for complex A.
Free variable Xi.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
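The edge count for a complete graph can be checked by listing every pair of nodes. A small sketch using the standard library (n = 5 is an arbitrary example):

```python
from itertools import combinations

n = 5
# A complete graph joins every pair of the n nodes, giving n(n - 1)/2 edges.
edges = list(combinations(range(n), 2))
tree_edge_count = n - 1   # a spanning tree keeps only n - 1 of them
```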
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b - A x̂) = 0.
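Solving the normal equation and checking that the residual is orthogonal to every column of A is straightforward. A small NumPy sketch (the data fit a line to three points; the numbers are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

# Least squares: solve A^T A x = A^T b (A has independent columns).
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
residual = b - A @ x_hat   # orthogonal to each column of A
```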
Particular solution xp.
Any solution to Ax = b; often xp has free variables = 0.
Pivot d.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
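Both the projected vector and the rank-1 projection matrix are easy to form and verify (P should reproduce p and satisfy P^2 = P). A small NumPy sketch (the vectors are arbitrary examples):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 0.0])

p = a * (a @ b) / (a @ a)        # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)     # rank-1 projection matrix
```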
Row picture of Ax = b.
Each equation gives a plane in Rn; the planes intersect at x.
Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.
Spectral Theorem A = Q Λ Q^T.
Real symmetric A has real λ's and orthonormal q's.
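The factorization A = Q Λ Q^T, with orthonormal columns in Q, can be checked numerically for any real symmetric matrix. A small NumPy sketch (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is for symmetric matrices: real eigenvalues, orthonormal eigenvectors.
lam, Q = np.linalg.eigh(A)
```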
Transpose matrix A^T.
Entries (A^T)ij = Aji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^T)^-1.
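The identities (AB)^T = B^T A^T and the positive semidefiniteness of A^T A can be spot-checked on random matrices. A small NumPy sketch (the shapes and seed are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))
B = rng.standard_normal((2, 3))

lhs = (A @ B).T            # (AB)^T ...
rhs = B.T @ A.T            # ... equals B^T A^T
eigs = np.linalg.eigvalsh(A.T @ A)   # all >= 0: positive semidefinite
```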