- 7.3.1E: In Exercises 1 and 2, find the change of variable x = Py that trans...
- 7.3.2E: In Exercise find the change of variable x = Py that transforms the ...
- 7.3.3E: In Exercises 1 and 2, find the change of variable x = Py that trans...
- 7.3.7E: Find a unit vector x in R3 at which Q(x) is maximized, subject to x...
- 7.3.8E: Find a unit vector x in R3 at which Q(x) is maximized, subject to x...
- 7.3.9E: Find the maximum value of subject to the constraint (Do not go on t...
- 7.3.10E: Find the maximum value of subject to the constraint (Do not go on t...
- 7.3.11E: Suppose x is a unit eigenvector of a matrix A corresponding to an e...
- 7.3.12E: Let be any eigenvalue of a symmetric matrix A. Justify the statemen...
- 7.3.13E: Let A be an n × n symmetric matrix, let M and m denote the maximum ...
- 7.3.15E: In Exercises 14–17, follow the instructions given for Exercises 3–6.
- 7.3.16E: In Exercises 14–17, follow the instructions given for Exercises 3–6.
- 7.3.17E: In Exercises 14–17, follow the instructions given for Exercises 3–6.
Solutions for Chapter 7.3: Linear Algebra and Its Applications 5th Edition
Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
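As a small sketch of the definition above, an adjacency matrix can be built in numpy (the 3-node graph and its edges are made up for illustration):

```python
import numpy as np

# Toy undirected graph on 3 nodes with edges 0-1 and 1-2 (0-indexed here).
A = np.zeros((3, 3), dtype=int)
for i, j in [(0, 1), (1, 2)]:
    A[i, j] = 1   # edge from node i to node j
    A[j, i] = 1   # undirected: the edge goes both ways, so A = A^T
```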
Back substitution. Upper triangular systems are solved in reverse order, x_n to x_1.
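The reverse-order solve can be sketched directly (the 2x2 system is an assumed example):

```python
import numpy as np

def back_substitute(U, b):
    """Solve Ux = b for upper triangular U: find x_n first, then x_{n-1}, ..., x_1."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Subtract the already-known later variables, then divide by the pivot.
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

U = np.array([[2.0, 1.0],
              [0.0, 3.0]])
b = np.array([5.0, 6.0])
x = back_substitute(U, b)   # x_2 = 2, then x_1 = (5 - 1*2)/2 = 1.5
```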
Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
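A minimal numpy check of the column picture (matrix and vector are illustrative): Ax is exactly the combination of columns weighted by the entries of x.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 1.0]])
x = np.array([3.0, -1.0])
b = A @ x

# Column picture: b = x_1 * (column 1) + x_2 * (column 2).
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
```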
Companion matrix. Put c_1, ..., c_n in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c_1 + c_2 λ + c_3 λ^2 + ... + c_n λ^(n-1) - λ^n).
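A sketch of that construction, assuming the coefficients c = (6, -11, 6), chosen so the characteristic polynomial factors as (λ-1)(λ-2)(λ-3):

```python
import numpy as np

c = np.array([6.0, -11.0, 6.0])   # assumed example coefficients c_1, ..., c_n
n = len(c)
A = np.zeros((n, n))
A[:-1, 1:] = np.eye(n - 1)        # n - 1 ones just above the main diagonal
A[-1, :] = c                      # c_1, ..., c_n in row n

# Its eigenvalues are the roots of c_1 + c_2*x + c_3*x^2 - x^3 = 0, i.e. 1, 2, 3.
eigs = np.sort(np.linalg.eigvals(A).real)
```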
Complex conjugate z̄ = a - ib for any complex number z = a + ib. Then z z̄ = |z|^2.
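Python's built-in complex type shows the identity directly (the sample value 3 + 4i is arbitrary):

```python
z = 3 + 4j
zbar = z.conjugate()   # a - ib
product = z * zbar     # equals |z|^2, a nonnegative real: 9 + 16 = 25
```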
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A^T| = |A|.
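These determinant rules can be spot-checked numerically (the random 3x3 matrices are illustrative, not special):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

prod_rule = np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
transpose_rule = np.isclose(np.linalg.det(A.T), np.linalg.det(A))

# Exchanging two rows reverses the sign of the determinant.
A_swapped = A[[1, 0, 2], :]
sign_rule = np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A))
```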
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Free variable x_i.
Column i has no pivot in elimination. We can give the n - r free variables any values; then Ax = b determines the r pivot variables (if solvable!).
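A sketch with an assumed rank-2 system where column 3 is free (column 3 = column 1 + column 2): pick any value for the free variable, and the pivot variables follow.

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])   # rank 2; column 3 has no pivot
b = np.array([3.0, 4.0])

x3 = 7.0                  # give the free variable any value
x1 = b[0] - x3            # then the pivot variables are determined
x2 = b[1] - x3
x = np.array([x1, x2, x3])
```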
Hermitian matrix A^H = Ā^T = A.
Complex analog a_ji = ā_ij of a symmetric matrix.
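A small check with an assumed 2x2 Hermitian matrix; like real symmetric matrices, its eigenvalues come out real:

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])          # a_ji = conj(a_ij)
is_hermitian = np.allclose(A, A.conj().T)
eigs = np.linalg.eigvalsh(A)           # Hermitian matrices have real eigenvalues
```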
Norm ||A||. The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
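All four norms are available through `numpy.linalg.norm` with different `ord` values (the 2x2 matrix is an assumed example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

l2   = np.linalg.norm(A, 2)        # sigma_max, the largest singular value
fro  = np.linalg.norm(A, 'fro')    # sqrt(sum of a_ij^2) = sqrt(30)
l1   = np.linalg.norm(A, 1)        # largest column sum of |a_ij| = 6
linf = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij| = 7
```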
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
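A sketch of that expansion, using QR factorization of a random matrix to manufacture an orthonormal basis of R^3 (the seed and vector v are arbitrary):

```python
import numpy as np

Q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))

v = np.array([1.0, 2.0, 3.0])
# v = sum over j of (v^T q_j) q_j, with q_j the columns of Q.
recon = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
```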
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
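A sketch with one assumed order, (2, 0, 1): two row exchanges reach I, so this P is even with det P = 1.

```python
import numpy as np

order = [2, 0, 1]
P = np.eye(3)[order]            # rows of I in the order (2, 0, 1)

A = np.arange(9.0).reshape(3, 3)
rows_match = np.array_equal(P @ A, A[order])   # PA reorders the rows of A

det_P = round(np.linalg.det(P))  # even permutation: det P = +1
```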
Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
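A sketch with an assumed 3x3 matrix whose third column equals column 1 + column 2, so the pivot columns are the first two and they span C(A):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])   # column 3 = column 1 + column 2

rank = np.linalg.matrix_rank(A)   # 2 pivots, so 2 pivot columns
pivot_cols = A[:, :2]             # a basis for the column space here

# Column 3 is a combination of the earlier (pivot) columns:
coeffs, *_ = np.linalg.lstsq(pivot_cols, A[:, 2], rcond=None)
```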
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
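The polar factors can be computed from the SVD A = UΣV^T: take Q = UV^T and H = VΣV^T (the 2x2 matrix is an assumed example):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
U, s, Vt = np.linalg.svd(A)

Q = U @ Vt                    # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt    # positive semidefinite factor, so A = QH
```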
Right inverse A+.
If A has full row rank m, then A⁺ = A^T (A A^T)^(-1) has A A⁺ = I_m.
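The formula above translates directly (the 2x3 full-row-rank matrix is an assumed example):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])          # full row rank m = 2

A_plus = A.T @ np.linalg.inv(A @ A.T)    # right inverse: A @ A_plus = I_m
```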
Spectral Theorem A = QAQT.
Real symmetric A has real λ's and orthonormal q's.
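`numpy.linalg.eigh` returns exactly these factors for a symmetric matrix (the 2x2 example is assumed; its eigenvalues are 1 and 3):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # real symmetric
lam, Q = np.linalg.eigh(A)              # real eigenvalues, orthonormal eigenvectors

recon = Q @ np.diag(lam) @ Q.T          # A = Q Lambda Q^T
```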
Stiffness matrix K. If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax gives the stretching.
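A minimal sketch, assuming a toy line of two springs: A maps node movements to spring stretching, C holds assumed spring constants, and K = A^T C A comes out symmetric positive definite.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [-1.0, 1.0]])       # stretching of each spring from node movements x
C = np.diag([10.0, 20.0])         # assumed Hooke's-law spring constants
K = A.T @ C @ A                   # stiffness matrix

x = np.array([0.1, 0.3])          # node movements
forces = K @ x                    # internal forces
K_symmetric = np.allclose(K, K.T)
```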
Tridiagonal matrix T: t_ij = 0 if |i - j| > 1.
T^(-1) has rank 1 above and below the diagonal.
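A numerical sketch with the classic -1, 2, -1 tridiagonal matrix (an assumed example): any 2x2 block taken entirely above the diagonal of T^(-1) is singular, consistent with the rank-1 structure.

```python
import numpy as np

n = 4
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # tridiagonal -1, 2, -1
Tinv = np.linalg.inv(T)

# 2x2 submatrix from rows 0,1 and columns 2,3 -- strictly above the diagonal.
minor = Tinv[np.ix_([0, 1], [2, 3])]
rank_above = np.linalg.matrix_rank(minor)
```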
Unitary matrix U^H = Ū^T = U^(-1).
Orthonormal columns (complex analog of Q).
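A quick check with an assumed 2x2 unitary matrix: the conjugate transpose really is the inverse.

```python
import numpy as np

U = np.array([[1.0, 1j],
              [1j, 1.0]]) / np.sqrt(2)   # unitary: complex analog of Q
UH = U.conj().T                          # U^H = conj(U)^T = U^(-1)
```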
Volume of box.
The rows (or the columns) of A generate a box with volume |det(A)|.
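In two dimensions the "box" is a parallelogram; with an assumed 2x2 example the area is |det(A)| = 6:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [1.0, 2.0]])            # rows generate a parallelogram
volume = abs(np.linalg.det(A))        # area of that parallelogram
```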