- 2.4.7: Repeat Exercise 6 for the given matrix A.
- 2.4.8: Let A be an m × n matrix...
- 2.4.9: Let A and B be m × n matrices. Show that A is equivalent to B i...
- 2.4.10: For each of the following matrices A, find a matrix B ≠ A that is...
- 2.4.11: Let A and B be equivalent square matrices. Prove that A is nonsing...
Solutions for Chapter 2.4: Equivalent Matrices
Full solutions for Elementary Linear Algebra with Applications | 9th Edition
Covariance matrix Σ.
When random variables xi have mean = average value = 0, their covariances Σij are the averages of xixj. With means x̄i, the matrix Σ = mean of (x − x̄)(x − x̄)ᵀ is positive (semi)definite; Σ is diagonal if the xi are independent.
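As a quick numerical illustration (a minimal numpy sketch; the sample data is made up), a covariance matrix built as the mean of (x − x̄)(x − x̄)ᵀ comes out symmetric and positive semidefinite:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))   # 500 made-up samples of a 3-component vector
xbar = X.mean(axis=0)               # mean vector
D = X - xbar                        # centered samples x - xbar
Sigma = D.T @ D / len(X)            # mean of (x - xbar)(x - xbar)^T
eigs = np.linalg.eigvalsh(Sigma)    # all eigenvalues >= 0 for a (semi)definite Sigma
```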
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A⁻¹| = 1/|A|.
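These defining properties can be spot-checked numerically; a minimal numpy sketch with arbitrarily chosen example matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[0.0, 1.0], [4.0, 2.0]])

det_I = np.linalg.det(np.eye(3))         # det I = 1
det_swap = np.linalg.det(A[[1, 0], :])   # a row exchange reverses the sign
det_AB = np.linalg.det(A @ B)            # product rule |AB| = |A||B|
det_singular = np.linalg.det(np.array([[1.0, 2.0],
                                       [2.0, 4.0]]))  # dependent rows -> 0
```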
Dot product = Inner product xᵀy = x₁y₁ + ··· + xₙyₙ.
Complex dot product is x̄ᵀy. Perpendicular vectors have xᵀy = 0. (AB)ij = (row i of A) · (column j of B).
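A minimal numpy sketch of both products (vectors chosen arbitrarily); note that np.vdot conjugates its first argument, matching the complex dot product x̄ᵀy:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 0.0, -1.0])
dot = x @ y                      # x1*y1 + x2*y2 + x3*y3 = 3 + 0 - 3 = 0

z = np.array([1 + 1j, 2j])
w = np.array([1 - 1j, 1 + 0j])
complex_dot = np.vdot(z, w)      # conjugates z first: the complex dot product
```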
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Exponential e^At = I + At + (At)²/2! + ···
has derivative Ae^At; e^At u(0) solves u′ = Au.
Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn−1 + Fn−2 = (λ₁ⁿ − λ₂ⁿ)/(λ₁ − λ₂). Growth rate λ₁ = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]].
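A minimal numpy sketch confirming the growth rate and the closed form (Binet's formula) for the Fibonacci numbers:

```python
import numpy as np

F = np.array([[1.0, 1.0], [1.0, 0.0]])   # Fibonacci matrix
lam2, lam1 = np.linalg.eigvalsh(F)       # eigenvalues in ascending order
golden = (1 + np.sqrt(5)) / 2            # growth rate lambda_1

def fib(n):
    # Fn = (lam1^n - lam2^n) / (lam1 - lam2)
    return (lam1**n - lam2**n) / (lam1 - lam2)
```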
Fundamental Theorem.
The nullspace N(A) and row space C(Aᵀ) are orthogonal complements in Rⁿ (perpendicularity comes from Ax = 0), with dimensions n − r and r. Applied to Aᵀ, the column space C(A) is the orthogonal complement of N(Aᵀ) in Rᵐ.
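A minimal numpy sketch with an arbitrary rank-one A: any solution of Ax = 0 is perpendicular to every row:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])     # rank r = 1, so N(A) has dimension n - r = 2
x = np.array([3.0, 0.0, -1.0])      # one solution of Ax = 0
residual = A @ x                    # zero exactly when x is perpendicular to each row
r = np.linalg.matrix_rank(A)
```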
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
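A minimal numpy sketch of a 4 × 4 Hankel pattern built directly from i + j:

```python
import numpy as np

n = 4
H = np.fromfunction(lambda i, j: i + j, (n, n))   # h_ij depends only on i + j
# each antidiagonal (constant i + j) holds one constant value
```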
Hypercube matrix pl.
Row n + 1 counts corners, edges, faces, ... of a cube in Rn.
Iterative method.
A sequence of steps intended to approach the desired solution.
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
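For the matrix example, linearity is just the distributive law; a minimal numpy check with arbitrarily chosen v, w, c, d:

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, 1.0]])
v = np.array([1.0, -1.0])
w = np.array([2.0, 3.0])
c, d = 2.0, -3.0

lhs = A @ (c * v + d * w)        # T(cv + dw)
rhs = c * (A @ v) + d * (A @ w)  # cT(v) + dT(w)
```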
Nullspace N(A)
= All solutions to Ax = 0. Dimension n − r = (# columns) − rank.
Outer product uvᵀ
= column times row = rank one matrix.
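A minimal numpy sketch: a column times a row gives a matrix whose rank is one:

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])   # 3 x 1 column
v = np.array([[4.0], [5.0]])          # 2 x 1 column
R = u @ v.T                           # 3 x 2 outer product
rank_R = np.linalg.matrix_rank(R)     # rank one
```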
Rank r(A)
= number of pivots = dimension of column space = dimension of row space.
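The rank and nullspace entries fit together as rank + nullity = n; a minimal numpy sketch with an arbitrary 3 × 3 example:

```python
import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 2.0],      # twice row 1, so the rank drops
              [1.0, 2.0, 3.0]])
r = np.linalg.matrix_rank(A)        # number of pivots
nullity = A.shape[1] - r            # dim N(A) = n - r
```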
Schur complement S = D − CA⁻¹B.
Appears in block elimination on [[A, B], [C, D]].
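A minimal numpy sketch (blocks chosen arbitrarily); one consequence of block elimination is det [[A, B], [C, D]] = det(A) · det(S):

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.0, 2.0]])
D = np.array([[5.0]])

M = np.block([[A, B], [C, D]])       # the block matrix being eliminated
S = D - C @ np.linalg.inv(A) @ B     # Schur complement left in the (2,2) block
```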
Singular Value Decomposition
(SVD) A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular value σᵢ > 0. Last columns are orthonormal bases of the nullspaces.
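A minimal numpy sketch checking Avᵢ = σᵢuᵢ for an arbitrary 2 × 2 matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0], [4.0, 5.0]])
U, sigma, Vt = np.linalg.svd(A)   # A = U Sigma V^T, sigma in decreasing order
left = A @ Vt.T                   # columns are A v_i
right = U * sigma                 # columns are sigma_i u_i
```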
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = AᵀCA where C has spring constants from Hooke's Law and Ax = stretching.
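A minimal numpy sketch; the two-spring stretching matrix A and the constants in C are made-up illustration values, not from the text:

```python
import numpy as np

A = np.array([[1.0, 0.0],       # hypothetical stretching matrix: e = Ax
              [-1.0, 1.0]])     # for two springs in a line
C = np.diag([3.0, 2.0])         # hypothetical Hooke's-law spring constants
K = A.T @ C @ A                 # stiffness matrix K = A^T C A
eigs = np.linalg.eigvalsh(K)    # symmetric and positive (semi)definite
```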
Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs in Λ = signs in D.
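A minimal numpy sketch of that sign count (the law of inertia) for an arbitrary indefinite 2 × 2 symmetric matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 1.0]])      # symmetric, indefinite
d1 = A[0, 0]                                # first pivot
d2 = A[1, 1] - A[1, 0] * A[0, 1] / A[0, 0]  # second pivot after elimination
pivots = np.array([d1, d2])                 # 1 and -3
eigs = np.linalg.eigvalsh(A)                # -1 and 3: same count of + signs
```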
Tridiagonal matrix T: tij = 0 if |i − j| > 1.
T⁻¹ has rank 1 above and below the diagonal.
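A minimal numpy sketch using the −1, 2, −1 second-difference matrix as one concrete tridiagonal example: on and above the diagonal the entries of T⁻¹ follow a rank-one pattern, so a 2 × 2 minor drawn from that triangle vanishes:

```python
import numpy as np

n = 4
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # tridiagonal -1, 2, -1
Tinv = np.linalg.inv(T)
# 2 x 2 minor taken on/above the diagonal (rows 0,1 and columns 1,2)
minor = Tinv[0, 1] * Tinv[1, 2] - Tinv[0, 2] * Tinv[1, 1]
```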
Unitary matrix Uᴴ = Ūᵀ = U⁻¹.
Orthonormal columns (complex analog of Q).
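A minimal numpy sketch with one arbitrary 2 × 2 unitary matrix: its conjugate transpose is its inverse, which is the orthonormal-columns condition:

```python
import numpy as np

U = np.array([[1.0, 1j], [1j, 1.0]]) / np.sqrt(2)   # a unitary matrix
UH = U.conj().T                                     # conjugate transpose U^H
identity_check = UH @ U                             # equals I: columns orthonormal
```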