
Solutions for Chapter 3.2: The Rank of a Matrix and Matrix Inverses

Full solutions for Linear Algebra | 4th Edition

ISBN: 9780130084514


This textbook survival guide covers the listed chapters and their solutions. The 22 problems in Chapter 3.2: The Rank of a Matrix and Matrix Inverses have all been answered, and more than 11010 students have viewed full step-by-step solutions from this chapter. Linear Algebra is associated with the ISBN 9780130084514, and Chapter 3.2 includes 22 full step-by-step solutions. This survival guide was created for the textbook Linear Algebra, 4th edition.

Key Math Terms and definitions covered in this textbook
• Characteristic equation det(A − λI) = 0.

The n roots are the eigenvalues of A.

• Companion matrix.

Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ² + ... + cnλ^(n−1) − λ^n).

• Cyclic shift S.

Permutation with S21 = 1, S32 = 1, ..., finally S1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.

• Distributive Law

A(B + C) = AB + AC. Add then multiply, or multiply then add.

• Dot product = Inner product x^T y = x1y1 + ... + xnyn.

The complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A) · (column j of B).

• Elimination matrix = Elementary matrix Eij.

The identity matrix with an extra −ℓij in the (i, j) entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
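As an illustration of the definition above, a pure-Python sketch (the helper names are my own, not from the text): build Eij as the identity with a single −ℓ entry and check that multiplying subtracts a multiple of one row from another.

```python
# Sketch of the elimination matrix Eij: the identity with an extra -l
# in entry (i, j). Multiplying Eij * A subtracts l times row j from row i.

def identity(n):
    return [[float(i == j) for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def elimination_matrix(n, i, j, l):
    E = identity(n)
    E[i][j] = -l          # the single off-diagonal entry
    return E

A = [[2.0, 1.0],
     [4.0, 5.0]]
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * (row 0) from row 1
EA = matmul(E, A)
# EA == [[2.0, 1.0], [0.0, 3.0]] -- the (1, 0) entry is eliminated
```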

• Fibonacci numbers

0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn−1 + Fn−2 = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]].
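A small sketch of the matrix view mentioned above (pure Python, illustrative names): powers of [[1, 1], [1, 0]] contain consecutive Fibonacci numbers, so F_n can be read off the n-th power.

```python
# The n-th power of the Fibonacci matrix [[1, 1], [1, 0]] holds
# [[F_{n+1}, F_n], [F_n, F_{n-1}]], so its (0, 1) entry is F_n.

def mat2_mul(X, Y):
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

def fib(n):
    A = [[1, 1], [1, 0]]
    P = [[1, 0], [0, 1]]          # start from the identity
    for _ in range(n):
        P = mat2_mul(P, A)
    return P[0][1]                # F_n

print([fib(n) for n in range(8)])   # [0, 1, 1, 2, 3, 5, 8, 13]
```

The ratio fib(n+1)/fib(n) approaches the growth rate (1 + √5)/2, the largest eigenvalue.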

• Gauss-Jordan method.

Invert A by row operations on [A I] to reach [I A^-1].
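A minimal pure-Python sketch of this procedure (the partial-pivoting step is an assumption for numerical safety, not part of the definition): row-reduce the augmented block [A I] until the left half is I; the right half is then A^-1.

```python
# Gauss-Jordan: row-reduce [A I] to [I A^-1] (assumes A is invertible).

def gauss_jordan_inverse(A):
    n = len(A)
    # Build the augmented block [A I]
    M = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting (an added safeguard): swap up the largest entry
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        pivot = M[col][col]
        M[col] = [x / pivot for x in M[col]]          # scale pivot row to 1
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col]                          # clear the rest of the column
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]                      # right half is A^-1

A = [[4.0, 7.0], [2.0, 6.0]]
Ainv = gauss_jordan_inverse(A)
# Ainv is approximately [[0.6, -0.7], [-0.2, 0.4]]
```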

• Gram-Schmidt orthogonalization A = QR.

Independent columns in A, orthonormal columns in Q. Each column qj of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
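A sketch of classical Gram-Schmidt in pure Python (illustrative, assumes the columns are independent): it orthonormalizes the columns and records the coefficients in an upper-triangular R with positive diagonal, matching the convention above.

```python
# Classical Gram-Schmidt: columns of A -> orthonormal Q, upper-triangular R
# with diag(R) > 0 (assumes independent columns).
import math

def gram_schmidt(cols):
    Q, R = [], [[0.0] * len(cols) for _ in range(len(cols))]
    for j, a in enumerate(cols):
        v = a[:]
        for i, q in enumerate(Q):
            R[i][j] = sum(qk * ak for qk, ak in zip(q, a))   # coefficient on q_i
            v = [vk - R[i][j] * qk for vk, qk in zip(v, q)]  # remove that component
        R[j][j] = math.sqrt(sum(vk * vk for vk in v))        # positive by convention
        Q.append([vk / R[j][j] for vk in v])
    return Q, R

cols = [[3.0, 4.0], [2.0, 1.0]]        # columns of A
Q, R = gram_schmidt(cols)
# Q's columns are orthonormal; R is upper triangular with positive diagonal
```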

• Hankel matrix H.

Constant along each antidiagonal; hij depends on i + j.

• Jordan form J = M^-1 A M.

If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λk Ik + Nk, where Nk has 1's on diagonal 1. Each block has one eigenvalue λk and one eigenvector.

• Left inverse A+.

If A has full column rank n, then A+ = (A^T A)^-1 A^T has A+ A = In.
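A pure-Python sketch of this formula for a tall matrix with two independent columns (the 2x2 Gram matrix A^T A is inverted by the adjugate formula; the helper names are illustrative):

```python
# Left inverse A+ = (A^T A)^-1 A^T for a tall matrix with 2 independent
# columns, so A+ A = I_2.

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def left_inverse(A):
    At = transpose(A)
    G = matmul(At, A)                       # 2x2 Gram matrix A^T A
    det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    Ginv = [[G[1][1] / det, -G[0][1] / det],
            [-G[1][0] / det, G[0][0] / det]]
    return matmul(Ginv, At)                 # (A^T A)^-1 A^T

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]    # 3x2, full column rank
Aplus = left_inverse(A)
I2 = matmul(Aplus, A)                       # approximately the 2x2 identity
```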

• Length ‖x‖.

Square root of x^T x (Pythagoras in n dimensions).

• Norm ‖A‖.

The "ℓ² norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σmax. Then ‖Ax‖ ≤ ‖A‖ ‖x‖ and ‖AB‖ ≤ ‖A‖ ‖B‖ and ‖A + B‖ ≤ ‖A‖ + ‖B‖. Frobenius norm: ‖A‖F² = Σ Σ aij². The ℓ¹ and ℓ∞ norms are the largest column and row sums of |aij|.
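The ℓ¹ and ℓ∞ cases are easy to compute directly; a small pure-Python sketch (function names are my own):

```python
# l1 norm = largest column sum of |a_ij|; l-infinity norm = largest row sum.

def norm_l1(A):
    return max(sum(abs(A[i][j]) for i in range(len(A)))
               for j in range(len(A[0])))

def norm_linf(A):
    return max(sum(abs(x) for x in row) for row in A)

A = [[1.0, -2.0],
     [3.0,  4.0]]
print(norm_l1(A), norm_linf(A))   # 6.0 7.0
```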

• Normal matrix.

If N N^H = N^H N, then N has orthonormal (complex) eigenvectors.

• Positive definite matrix A.

Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = L D L^T with diag(D) > 0.
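The pivot test above can be sketched in pure Python (illustrative helper, assumes elimination needs no row exchanges, which holds for definite matrices): run elimination and check that every pivot is positive.

```python
# Collect the pivots from plain elimination; a symmetric matrix is
# positive definite exactly when all pivots are positive.

def pivots(A):
    M = [row[:] for row in A]
    n = len(M)
    out = []
    for k in range(n):
        p = M[k][k]
        out.append(p)
        for r in range(k + 1, n):
            f = M[r][k] / p
            M[r] = [x - f * y for x, y in zip(M[r], M[k])]
    return out

A = [[ 2.0, -1.0,  0.0],
     [-1.0,  2.0, -1.0],
     [ 0.0, -1.0,  2.0]]
print(all(p > 0 for p in pivots(A)))   # True: A is positive definite
```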

• Row picture of Ax = b.

Each equation gives a plane in R^n; the planes intersect at x.

• Schur complement S = D − C A^-1 B.

Appears in block elimination on [A B; C D].

• Standard basis for Rn.

Columns of the n by n identity matrix (written i, j, k in R^3).

• Unitary matrix U^H = Ū^T = U^-1.

Orthonormal columns (complex analog of Q).
