- 2.4.1: Label the following statements as true or false. In each part, V an...
- 2.4.2: For each of the following linear transformations T, determine wheth...
- 2.4.3: Which of the following pairs of vector spaces are isomorphic? Justi...
- 2.4.4: Let A and B be n x n invertible matrices. Prove that AB is invertib...
- 2.4.5: Let A be invertible. Prove that Aᵗ is invertible and (Aᵗ)⁻¹ = (A⁻¹...
- 2.4.6: Prove that if A is invertible and AB = O, then B = O.
- 2.4.7: Let A be an n x n matrix. (a) Suppose that A² = O. Prove that A is ...
- 2.4.8: Prove Corollaries 1 and 2 of Theorem 2.18.
- 2.4.9: Let A and B be n x n matrices such that AB is invertible. Prove tha...
- 2.4.10: Let A and B be n x n matrices such that AB = Iₙ. (a) Use Exercise 9 t...
- 2.4.11: Verify that the transformation in Example 5 is one-to-one.
- 2.4.12: Prove Theorem 2.21.
- 2.4.13: Let ~ mean "is isomorphic to." Prove that ~ is an equivalence relat...
- 2.4.14: Let Construct an isomorphism from V to F .
- 2.4.15: Let V and W be n-dimensional vector spaces, and let T: V → W be a l...
- 2.4.16: Let B be an n x n invertible matrix. Define Φ: Mnxn(F) → Mnxn(F...
- 2.4.17: Let V and W be finite-dimensional vector spaces and T: V → W be an ...
- 2.4.18: Repeat Example 7 with the polynomial p(x) = 1 + x + 2x² + x³.
- 2.4.19: In Example 5 of Section 2.1, the mapping T: M2x2(R) → M2x2(R) def...
- 2.4.20: Let T: V → W be a linear transformation from an n-dimensional vecto...
- 2.4.21: Let V and W be finite-dimensional vector spaces with ordered bases ...
- 2.4.22: Let c0, c1, ..., cn be distinct scalars from an infinite field F. Def...
- 2.4.23: Let V denote the vector space defined in Example 5 of Section 1.2, ...
- 2.4.24: Let T: V → Z be a linear transformation of a vector space V onto a ve...
- 2.4.25: Let V be a nonzero vector space over a field F, and suppose that S ...
Solutions for Chapter 2.4: Invertibility and Isomorphisms
Full solutions for Linear Algebra | 4th Edition
Cholesky factorization A = CᵀC = (L√D)(L√D)ᵀ for positive definite A.
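As a quick numerical sketch of the factorization (assuming NumPy is available; the matrix A below is a made-up example), note that `np.linalg.cholesky` returns the lower-triangular factor L with A = LLᵀ, so the glossary's C corresponds to Lᵀ here:

```python
import numpy as np

# A small symmetric positive definite matrix (hypothetical example).
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# NumPy returns the lower-triangular factor L with A = L @ L.T,
# so C in the glossary's A = C^T C corresponds to L.T here.
L = np.linalg.cholesky(A)

# The factorization reproduces A exactly (up to roundoff).
assert np.allclose(L @ L.T, A)
```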
Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c₀I + c₁S + ... + cₙ₋₁Sⁿ⁻¹. Cx = convolution c * x. Eigenvectors in the Fourier matrix F.
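A minimal sketch of these two facts for n = 3 (assuming NumPy; the vectors c and x are arbitrary examples): the circulant built from powers of the shift S multiplies x by circular convolution with c.

```python
import numpy as np

# Cyclic shift S for n = 3: (S @ x)[i] = x[(i - 1) mod 3].
S = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]], dtype=float)

c = np.array([2.0, 5.0, 7.0])                 # first column of the circulant
C = c[0] * np.eye(3) + c[1] * S + c[2] * (S @ S)

x = np.array([1.0, 2.0, 3.0])
# C @ x equals the circular convolution (c * x)[k] = sum_j c[j] x[(k - j) mod 3].
conv = [sum(c[j] * x[(k - j) % 3] for j in range(3)) for k in range(3)]
assert np.allclose(C @ x, conv)
```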
cond(A) = c(A) = ‖A‖ ‖A⁻¹‖ = σmax/σmin. In Ax = b, the relative change ‖δx‖/‖x‖ is less than cond(A) times the relative change ‖δb‖/‖b‖. Condition numbers measure the sensitivity of the output to changes in the input.
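The bound can be checked numerically (a sketch assuming NumPy; the nearly singular A and the perturbation δb are invented for illustration):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])          # nearly singular, so ill-conditioned

# cond(A) = ||A|| ||A^{-1}|| = sigma_max / sigma_min in the 2-norm.
sig = np.linalg.svd(A, compute_uv=False)
c = sig[0] / sig[-1]
assert np.isclose(c, np.linalg.cond(A))

# Perturb b: the relative change in x is at most cond(A) times
# the relative change in b.
b = np.array([2.0, 2.0])
db = np.array([0.0, 1e-4])
x = np.linalg.solve(A, b)
dx = np.linalg.solve(A, b + db) - x
rel_out = np.linalg.norm(dx) / np.linalg.norm(x)
rel_in = np.linalg.norm(db) / np.linalg.norm(b)
assert rel_out <= c * rel_in + 1e-12
```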
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
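A short check of S⁻¹AS = Λ (assuming NumPy; the triangular A below is an arbitrary example with two different eigenvalues, so diagonalizability is automatic):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [0.0, 2.0]])             # two different eigenvalues: 4 and 2

lam, S = np.linalg.eig(A)              # eigenvectors go in the columns of S
Lam = np.linalg.inv(S) @ A @ S         # S^{-1} A S = Lambda

# Lambda is the diagonal eigenvalue matrix.
assert np.allclose(Lam, np.diag(lam))
```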
Distributive law A(B + C) = AB + AC.
Add then multiply, or multiply then add.
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
Hilbert matrix hilb(n).
Entries Hij = 1/(i + j - 1) = ∫₀¹ xⁱ⁻¹xʲ⁻¹ dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.
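The ill-conditioning already shows up at modest sizes (a sketch assuming NumPy; `hilb` is a small helper written here, not a NumPy function):

```python
import numpy as np

def hilb(n):
    # H[i, j] = 1 / (i + j - 1) with 1-based indices, as in the glossary.
    i, j = np.indices((n, n)) + 1
    return 1.0 / (i + j - 1)

H = hilb(6)
assert np.allclose(H, H.T)                 # symmetric
assert np.all(np.linalg.eigvalsh(H) > 0)   # positive definite
assert np.linalg.cond(H) > 1e6             # already badly conditioned at n = 6
```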
Hypercube matrix pl.
Row n + 1 counts corners, edges, faces, ... of a cube in Rⁿ.
Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: potential differences (voltage drops) add to zero around any closed loop.
Kronecker product (tensor product) A ® B.
Blocks a_ij B, eigenvalues λ_p(A) λ_q(B).
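The eigenvalue rule is easy to verify numerically (a sketch assuming NumPy; A and B are arbitrary triangular examples so their eigenvalues are visible on the diagonals):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])             # eigenvalues 1, 3
B = np.array([[4.0, 0.0],
              [1.0, 5.0]])             # eigenvalues 4, 5

K = np.kron(A, B)                      # blocks a_ij * B

# Eigenvalues of the Kronecker product are all products lambda_p(A) lambda_q(B).
prod = sorted((la * lb).real for la in np.linalg.eigvals(A)
              for lb in np.linalg.eigvals(B))
assert np.allclose(sorted(np.linalg.eigvals(K).real), prod)
```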
Left nullspace N(Aᵀ).
Nullspace of Aᵀ = "left nullspace" of A because yᵀA = 0ᵀ.
Nilpotent matrix N.
Some power of N is the zero matrix, Nᵏ = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
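A minimal example of both facts (assuming NumPy; N is an arbitrary strictly upper triangular matrix):

```python
import numpy as np

# Triangular with zero diagonal, hence nilpotent: N^3 = 0 here.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

assert np.allclose(np.linalg.matrix_power(N, 3), 0)
# Every eigenvalue is 0 (repeated n times).
assert np.allclose(np.linalg.eigvals(N), 0)
```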
Normal matrix N.
If NNᵀ = NᵀN, then N has orthonormal (complex) eigenvectors.
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
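Both properties for one concrete P (a sketch assuming NumPy; the row order (2, 0, 1) is an arbitrary choice):

```python
import numpy as np

# The permutation matrix with the rows of I in order (2, 0, 1).
P = np.eye(3)[[2, 0, 1]]

A = np.arange(9.0).reshape(3, 3)
# P @ A puts the rows of A in that same order.
assert np.allclose(P @ A, A[[2, 0, 1]])

# det P = +1 or -1; the order (2, 0, 1) needs two row exchanges, so det P = +1.
assert np.isclose(np.linalg.det(P), 1.0)
```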
Plane (or hyperplane) in Rn.
Vectors x with aᵀx = 0. The plane is perpendicular to a ≠ 0.
Row space C(Aᵀ) = all combinations of rows of A.
Column vectors by convention.
Saddle point of f(x₁, ..., xₙ).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂xᵢ∂xⱼ = Hessian matrix) is indefinite.
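The classic example is f(x, y) = x² - y² at the origin (a sketch assuming NumPy; the function is my choice for illustration):

```python
import numpy as np

# f(x, y) = x^2 - y^2 has a saddle at the origin: both first derivatives
# vanish there, and the (constant) Hessian is indefinite.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])            # Hessian of f

eig = np.linalg.eigvalsh(H)
# Indefinite: eigenvalues of both signs, so neither a min nor a max.
assert eig.min() < 0 < eig.max()
```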
Special solutions to As = 0.
One free variable is sᵢ = 1, other free variables = 0.
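A worked instance (assuming NumPy; the echelon matrix A is an invented example with pivots in columns 1 and 3, so columns 2 and 4 are free):

```python
import numpy as np

# Echelon form: pivots in columns 0 and 2 (0-based); columns 1 and 3 are free.
A = np.array([[1.0, 2.0, 0.0, 4.0],
              [0.0, 0.0, 1.0, 3.0]])

# One special solution per free column: that free variable = 1, the other = 0,
# then back-substitute for the pivot variables.
s1 = np.array([-2.0, 1.0, 0.0, 0.0])   # free variable x2 = 1
s2 = np.array([-4.0, 0.0, -3.0, 1.0])  # free variable x4 = 1

assert np.allclose(A @ s1, 0)
assert np.allclose(A @ s2, 0)
```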
Toeplitz matrix T.
Constant down each diagonal = time-invariant (shift-invariant) filter.