
# Solutions for Chapter 2.8: MATRICES AND LINEAR TRANSFORMATIONS

## Full solutions for Elementary Linear Algebra: A Matrix Approach | 2nd Edition

ISBN: 9780131871410

Chapter 2.8: MATRICES AND LINEAR TRANSFORMATIONS includes 100 full step-by-step solutions. Since all 100 problems in this chapter have been answered, more than 25,395 students have viewed full step-by-step solutions from it. This expansive textbook survival guide was created for Elementary Linear Algebra: A Matrix Approach, 2nd edition (ISBN: 9780131871410), and covers every chapter of the book.

## Key Math Terms and Definitions Covered in This Textbook
• Associative Law $(AB)C = A(BC)$.

Parentheses can be removed to leave ABC.

• Complex conjugate

$\bar{z} = a - ib$ for any complex number $z = a + ib$. Then $z\bar{z} = |z|^2$.
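
A quick check with Python's built-in complex type (the value of $z$ is arbitrary, chosen only to illustrate the identity):

```python
# Verify z * conj(z) = |z|^2 for an arbitrary complex number.
z = 3 + 4j                 # z = a + ib with a = 3, b = 4
print(z.conjugate())       # (3-4j), i.e. a - ib
print(z * z.conjugate())   # (25+0j)
print(abs(z) ** 2)         # 25.0, so z * conj(z) = |z|^2
```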

• Cramer's Rule for $Ax = b$.

$B_j$ has $b$ replacing column $j$ of $A$; then $x_j = \det B_j / \det A$.
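
A minimal numpy sketch of the rule on a made-up 2×2 system ($A$ and $b$ are illustrative, not from the textbook):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b                               # B_j: b replaces column j of A
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

print(x)                                       # [0.8 1.4]
print(np.allclose(x, np.linalg.solve(A, b)))   # True: agrees with a direct solve
```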

• Diagonalization

$\Lambda = S^{-1}AS$, where $\Lambda$ is the eigenvalue matrix and $S$ is the eigenvector matrix of $A$. $A$ must have $n$ independent eigenvectors to make $S$ invertible. All $A^k = S\Lambda^k S^{-1}$.
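
The same relations checked numerically with numpy (the matrix $A$ is an arbitrary diagonalizable example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)          # S = eigenvector matrix (columns)
Lam = np.diag(eigvals)                 # Lambda = eigenvalue matrix

# Lambda = S^{-1} A S
print(np.allclose(np.linalg.inv(S) @ A @ S, Lam))     # True

# A^k = S Lambda^k S^{-1}
k = 5
Ak = S @ np.linalg.matrix_power(Lam, k) @ np.linalg.inv(S)
print(np.allclose(Ak, np.linalg.matrix_power(A, k)))  # True
```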

• Dimension of vector space

$\dim(V)$ = the number of vectors in any basis for $V$.

• Eigenvalue $\lambda$ and eigenvector $x$.

$Ax = \lambda x$ with $x \neq 0$, so $\det(A - \lambda I) = 0$.
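
Both conditions are easy to verify numerically (the symmetric matrix here is just an example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, X = np.linalg.eig(A)              # eigenvalues and eigenvectors (columns of X)
for l, x in zip(lam, X.T):
    print(np.allclose(A @ x, l * x))                        # Ax = lambda x
    print(np.isclose(np.linalg.det(A - l * np.eye(2)), 0))  # det(A - lambda I) = 0
```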

• Elimination matrix = Elementary matrix $E_{ij}$.

The identity matrix with an extra $-e_{ij}$ in the $i, j$ entry ($i \neq j$). Then $E_{ij}A$ subtracts $e_{ij}$ times row $j$ of $A$ from row $i$.
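
A small numpy sketch of one elimination step (the matrix and multiplier are illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

i, j, e = 1, 0, 2.0        # subtract 2 * (row 0) from row 1
E = np.eye(3)
E[i, j] = -e               # identity with an extra -e in the (i, j) entry

print(E @ A)               # row 1 becomes [0. 1. 1.]; other rows unchanged
```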

• Exponential $e^{At} = I + At + (At)^2/2! + \cdots$

has derivative $Ae^{At}$; $e^{At}u(0)$ solves $u' = Au$.
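
A sketch of the series definition in numpy (truncated after a fixed number of terms, so it is only an approximation; scipy.linalg.expm would be the production choice):

```python
import numpy as np

def expm_series(A, t, terms=30):
    """Truncated series I + At + (At)^2/2! + ... (adequate for small ||At||)."""
    X = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ (A * t) / k      # next term: (At)^k / k!
        X = X + term
    return X

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])            # illustrative: e^{At} is a rotation
print(expm_series(A, 1.0))             # approx [[cos 1, sin 1], [-sin 1, cos 1]]

u0 = np.array([1.0, 0.0])
print(expm_series(A, 1.0) @ u0)        # u(1) = e^{At} u(0) solves u' = Au
```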

• Gauss-Jordan method.

Invert $A$ by row operations on $[A \; I]$ to reach $[I \; A^{-1}]$.
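
A bare-bones numpy sketch of the method (no row exchanges, so it assumes every pivot is nonzero; the matrix is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

n = A.shape[0]
M = np.hstack([A, np.eye(n)])          # the augmented block [A I]
for col in range(n):
    M[col] /= M[col, col]              # scale pivot row so pivot = 1
    for row in range(n):
        if row != col:
            M[row] -= M[row, col] * M[col]   # clear the rest of the column

A_inv = M[:, n:]                       # [A I] has become [I A^{-1}]
print(np.allclose(A @ A_inv, np.eye(n)))     # True
```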

• Hermitian matrix $A^H = \bar{A}^T = A$.

Complex analog $a_{ji} = \bar{a}_{ij}$ of a symmetric matrix.

• Identity matrix $I$ (or $I_n$).

Diagonal entries $= 1$, off-diagonal entries $= 0$.

• Matrix multiplication $AB$.

The $i, j$ entry of $AB$ is (row $i$ of $A$) $\cdot$ (column $j$ of $B$) $= \sum_k a_{ik}b_{kj}$. By columns: column $j$ of $AB$ is $A$ times column $j$ of $B$. By rows: row $i$ of $A$ multiplies $B$. Columns times rows: $AB$ = sum of (column $k$)(row $k$). All these equivalent definitions come from the rule that $(AB)x = A(Bx)$.
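
All four views can be checked side by side in numpy ($A$ and $B$ are arbitrary 2×2 examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])
AB = A @ B

# Entry view: (AB)_ij = sum_k a_ik b_kj
print(np.isclose(AB[0, 1], A[0, :] @ B[:, 1]))    # True

# Column view: column j of AB = A times column j of B
print(np.allclose(AB[:, 1], A @ B[:, 1]))         # True

# Row view: row i of AB = (row i of A) times B
print(np.allclose(AB[0, :], A[0, :] @ B))         # True

# Columns times rows: AB = sum of (column k of A)(row k of B)
print(np.allclose(AB, sum(np.outer(A[:, k], B[k, :]) for k in range(2))))  # True
```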

• Norm $\|A\|$.

The "$\ell^2$ norm" of $A$ is the maximum ratio $\|Ax\|/\|x\| = \sigma_{\max}$. Then $\|Ax\| \le \|A\|\,\|x\|$, $\|AB\| \le \|A\|\,\|B\|$, and $\|A + B\| \le \|A\| + \|B\|$. The Frobenius norm is $\|A\|_F^2 = \sum\sum a_{ij}^2$. The $\ell^1$ and $\ell^\infty$ norms are the largest column and row sums of $|a_{ij}|$.
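
numpy.linalg.norm exposes all four of these norms directly (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

print(np.linalg.norm(A, 2))        # l2 norm = sigma_max (largest singular value)
print(np.linalg.norm(A, 'fro'))    # Frobenius norm: sqrt(sum of a_ij^2)
print(np.linalg.norm(A, 1))        # l1 norm: largest column sum of |a_ij|
print(np.linalg.norm(A, np.inf))   # l-infinity norm: largest row sum of |a_ij|

# The l2 norm agrees with the largest singular value from the SVD.
print(np.isclose(np.linalg.norm(A, 2),
                 np.linalg.svd(A, compute_uv=False)[0]))   # True
```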

• Orthonormal vectors $q_1, \ldots, q_n$.

Dot products are $q_i^T q_j = 0$ if $i \neq j$ and $q_i^T q_i = 1$. The matrix $Q$ with these orthonormal columns has $Q^TQ = I$. If $m = n$ then $Q^T = Q^{-1}$ and $q_1, \ldots, q_n$ is an orthonormal basis for $\mathbb{R}^n$: every $v = \sum (v^T q_j)\,q_j$.
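
A numpy sketch using QR factorization to produce orthonormal columns (the random seed and size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # columns q_1, ..., q_n

print(np.allclose(Q.T @ Q, np.eye(3)))             # Q^T Q = I
print(np.allclose(Q.T, np.linalg.inv(Q)))          # square Q: Q^T = Q^{-1}

# Orthonormal basis: every v = sum_j (v^T q_j) q_j
v = np.array([1.0, 2.0, 3.0])
print(np.allclose(v, sum((v @ Q[:, j]) * Q[:, j] for j in range(3))))  # True
```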

• Pivot.

The diagonal entry (first nonzero) at the time when a row is used in elimination.

• Projection $p = a\,(a^Tb/a^Ta)$ onto the line through $a$.

$P = aa^T/a^Ta$ has rank 1.
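
The projection formulas in numpy ($a$ and $b$ are arbitrary vectors chosen for illustration):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 3.0])

p = a * (a @ b) / (a @ a)          # p = a (a^T b / a^T a)
P = np.outer(a, a) / (a @ a)       # P = a a^T / a^T a

print(np.allclose(P @ b, p))       # True: P b is the projection of b
print(np.linalg.matrix_rank(P))    # 1
print(np.allclose(P @ P, P))       # True: projecting twice changes nothing
```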

• Semidefinite matrix $A$.

(Positive) semidefinite: all $x^TAx \ge 0$, all $\lambda \ge 0$; $A$ = any $R^TR$.
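
A quick numpy check that any $R^TR$ is positive semidefinite ($R$ is deliberately rank-deficient here, so one eigenvalue is exactly zero):

```python
import numpy as np

R = np.array([[1.0, 2.0],
              [0.0, 0.0]])
A = R.T @ R                        # A = R^T R, here [[1, 2], [2, 4]]

print(np.linalg.eigvalsh(A))       # [0. 5.]: all eigenvalues >= 0

x = np.array([3.0, -1.0])
print(x @ A @ x >= 0)              # True: x^T A x >= 0 (spot check)
```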

• Solvable system $Ax = b$.

The right side $b$ is in the column space of $A$.
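
The column-space condition can be tested with a rank comparison (the rank-1 matrix and the two right sides are illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])         # rank 1
b_good = np.array([3.0, 6.0])      # a multiple of a column: in the column space
b_bad = np.array([3.0, 5.0])       # not in the column space

def solvable(A, b):
    # Ax = b is solvable exactly when rank([A b]) == rank(A)
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(solvable(A, b_good))   # True
print(solvable(A, b_bad))    # False
```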

• Symmetric factorizations $A = LDL^T$ and $A = Q\Lambda Q^T$.

Signs in $\Lambda$ = signs in $D$.
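
A hand-rolled 2×2 sketch of $A = LDL^T$ (no pivoting, so it assumes $a_{11} \neq 0$; the indefinite matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 4.0],
              [4.0, 3.0]])          # symmetric, indefinite

l = A[1, 0] / A[0, 0]               # one elimination multiplier
L = np.array([[1.0, 0.0],
              [l,   1.0]])
D = np.diag([A[0, 0], A[1, 1] - l * A[1, 0]])   # pivots on the diagonal

print(np.allclose(L @ D @ L.T, A))  # True

# Same count of positive and negative signs in Lambda as in D:
print(np.sign(np.linalg.eigvalsh(A)))   # [-1.  1.]
print(np.sign(np.diag(D)))              # [ 1. -1.]
```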