 2.SE.1E: Use row operations to show that the determinants in Exercises 2–4 a...
 2.SE.2E: Find the matrix C whose inverse is
 2.SE.3E: Let . Show that A3 = 0. Use matrix algebra to compute the product .
 2.SE.4E: Suppose for some n > 1. Find an inverse for I – A.
 2.SE.5E: Suppose an n × n matrix A satisfies the equation
 2.SE.6E: Let These are Pauli spin matrices used in the study of electron spi...
 2.SE.7E: Compute without computing A–1. [Hint: A–1B is the solution of the e...
 2.SE.8E: Find a matrix A such that the transformation maps , respectively. [...
 2.SE.9E: Suppose Find A.
 2.SE.10E: Suppose A is invertible. Explain why ATA is also invertible. Then s...
 2.SE.11E: Let be fixed numbers. The matrix below, called a Vandermonde matrix...
 2.SE.12E: Let A = LU, where L is an invertible lower triangular matrix and U ...
 2.SE.13E: Given u in (an outer product) and Q = I – 2P . Justify statements (...
2.SE.14E: Determine P and Q as in Exercise 13, and compute Px and Qx. The fig
 2.SE.15E: Suppose , where are elementary matrices. Explain why C is row equiv...
 2.SE.16E: Let A be an n × n singular matrix. Describe how to construct an n ×...
 2.SE.17E: Let A be a 6 × 4 matrix and B a 4 × 6 matrix. Show that the 6 × 6 m...
 2.SE.18E: Suppose A is a 5 × 3 matrix and there exists a 3 × 5 matrix C such ...
 2.SE.19E: [M] Certain dynamical systems can be studied by examining powers of...
 2.SE.20E: [M] Let An be the n × n matrix with 0’s on the main diagonal and 1’...
Solutions for Chapter 2.SE: Linear Algebra and Its Applications 5th Edition
ISBN: 9780321982384
This textbook survival guide covers the listed chapters and their solutions. Chapter 2.SE includes 20 full step-by-step solutions. The guide was created for the textbook Linear Algebra and Its Applications, edition 5. Since all 20 problems in chapter 2.SE have been answered, more than 43,641 students have viewed full step-by-step solutions from this chapter. Linear Algebra and Its Applications is associated with ISBN 9780321982384.

Affine transformation
T(v) = Av + v0 = linear transformation plus shift.

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
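As an illustrative sketch (the matrices and the 2 x 2 cut are made-up, not from the text), block multiplication says the top-left block of AB is A11*B11 + A12*B21 when the cuts line up:

```python
def matmul(A, B):
    """Ordinary matrix multiplication of nested lists."""
    return [[sum(A[r][k] * B[k][c] for k in range(len(B)))
             for c in range(len(B[0]))] for r in range(len(A))]

def matadd(A, B):
    """Entrywise sum of two matrices of the same shape."""
    return [[A[r][c] + B[r][c] for c in range(len(A[0]))] for r in range(len(A))]

def block(M, r, c):
    """The (r, c) 2x2 block of a 4x4 matrix, cut between rows/columns 2 and 3."""
    return [row[2 * c:2 * c + 2] for row in M[2 * r:2 * r + 2]]

A = [[1, 2, 0, 1],
     [0, 1, 3, 0],
     [2, 0, 1, 1],
     [1, 1, 0, 2]]
B = [[1, 0, 2, 0],
     [0, 1, 0, 1],
     [1, 1, 0, 0],
     [0, 2, 1, 0]]

direct = matmul(A, B)
# Top-left block of A*B computed blockwise: A11*B11 + A12*B21.
top_left = matadd(matmul(block(A, 0, 0), block(B, 0, 0)),
                  matmul(block(A, 0, 1), block(B, 1, 0)))
# top_left agrees with the top-left 2x2 corner of the direct product.
```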

Change of basis matrix M.
The old basis vectors v_j are combinations Σ m_ij w_i of the new basis vectors. The coordinates of c1 v1 + ... + cn vn = d1 w1 + ... + dn wn are related by d = M c. (For n = 2, set v1 = m11 w1 + m21 w2, v2 = m12 w1 + m22 w2.)
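A small n = 2 sketch with made-up basis vectors in R^2 (none of these numbers come from the text): the old basis v1, v2 is built from the columns of M, and the coordinate vectors satisfy d = M c.

```python
# New basis w1, w2 and a change-of-basis matrix M (illustrative values).
w1, w2 = [1, 0], [1, 1]
M = [[2, 1],
     [1, 1]]

# Old basis from the columns of M: v_j = m1j*w1 + m2j*w2.
v1 = [2 * w1[k] + 1 * w2[k] for k in range(2)]   # v1 = m11*w1 + m21*w2
v2 = [1 * w1[k] + 1 * w2[k] for k in range(2)]   # v2 = m12*w1 + m22*w2

c = [3, -1]                                       # old coordinates
d = [M[0][0] * c[0] + M[0][1] * c[1],             # d = M c
     M[1][0] * c[0] + M[1][1] * c[1]]

# Both coordinate vectors describe the same vector:
lhs = [c[0] * v1[k] + c[1] * v2[k] for k in range(2)]
rhs = [d[0] * w1[k] + d[1] * w2[k] for k in range(2)]
```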

Dot product = Inner product x^T y = x1 y1 + ... + xn yn.
The complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).
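A minimal sketch of the real dot product in pure Python (the sample vectors are made up for illustration):

```python
def dot(x, y):
    """Inner product x^T y = x1*y1 + ... + xn*yn."""
    return sum(xi * yi for xi, yi in zip(x, y))

x = [1, 2, 2]
y = [2, -1, 0]
length_squared = dot(x, x)        # x^T x = 9
perpendicular = dot(x, y) == 0    # x^T y = 0, so x and y are perpendicular
```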

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −e_ij in the (i, j) entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.
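A sketch of this definition with made-up numbers: the identity plus one off-diagonal entry −e, so that multiplying on the left subtracts e times row j from row i.

```python
def elimination_matrix(n, i, j, e):
    """Identity matrix with an extra -e in the (i, j) entry, i != j."""
    E = [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]
    E[i][j] = -e
    return E

def matmul(A, B):
    return [[sum(A[r][k] * B[k][c] for k in range(len(B)))
             for c in range(len(B[0]))] for r in range(len(A))]

A = [[2.0, 1.0],
     [4.0, 3.0]]
E = elimination_matrix(2, 1, 0, 2.0)  # subtract 2 * row 0 from row 1
EA = matmul(E, A)                      # the (1, 0) entry of A is eliminated
```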

Ellipse (or ellipsoid) x^T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^(-1) y||^2 = y^T (A A^T)^(-1) y = 1 displayed by eigshow; axis lengths σ_i.)

Exponential e^(At) = I + At + (At)^2/2! + ...
has derivative A e^(At); e^(At) u(0) solves u' = Au.
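A sketch of the series in the 1 x 1 case, where A is a scalar a and the series reduces to the ordinary exponential; the values a = 0.5, t = 2 and the 50-term cutoff are illustrative choices, not from the text.

```python
def exp_series(a, t, terms=50):
    """Partial sum 1 + at + (at)^2/2! + ... for scalar a (the 1x1 case of e^(At))."""
    total, power, fact = 0.0, 1.0, 1.0
    for k in range(terms):
        total += power / fact
        power *= a * t        # next power of (at)
        fact *= (k + 1)       # next factorial
    return total

approx = exp_series(0.5, 2.0)   # should be very close to e^1
```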

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).
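A one-line sketch of this definition, using the classic 3-4-5 right triangle as a made-up example:

```python
import math

def length(x):
    """||x|| = square root of x^T x."""
    return math.sqrt(sum(xi * xi for xi in x))

hyp = length([3, 4])   # Pythagoras: sqrt(9 + 16) = 5
```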

Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; always m(λ) divides p(λ).

Rank-one matrix A = u v^T ≠ 0.
Column and row spaces = lines cu and cv.

Row picture of Ax = b.
Each equation gives a plane in R^n; the planes intersect at x.

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂x_i∂x_j = Hessian matrix) is indefinite.
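An illustrative example (chosen here, not taken from the text): f(x, y) = x² − y² has a saddle at the origin, where both first derivatives vanish and the Hessian has eigenvalues of both signs.

```python
# Hessian of f(x, y) = x**2 - y**2: second partials are constant.
H = [[2.0, 0.0],
     [0.0, -2.0]]

# H is diagonal, so its eigenvalues are the diagonal entries.
eigenvalues = [H[0][0], H[1][1]]

# Indefinite: one positive and one negative eigenvalue -> saddle point.
indefinite = min(eigenvalues) < 0 < max(eigenvalues)
```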

Singular matrix A.
A square matrix that has no inverse: det(A) = 0.

Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.
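A sketch of the defining property, with made-up diagonal values: entry (i, j) depends only on i − j, so every diagonal is constant.

```python
def toeplitz(first_col, first_row):
    """Build a Toeplitz matrix from its first column and first row
    (first entries assumed to agree)."""
    n = len(first_col)
    return [[first_col[i - j] if i >= j else first_row[j - i]
             for j in range(n)] for i in range(n)]

T = toeplitz([1, 2, 3], [1, 4, 5])
# T = [[1, 4, 5],
#      [2, 1, 4],
#      [3, 2, 1]]   -- constant down each diagonal
```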

Tridiagonal matrix T: t_ij = 0 if |i − j| > 1.
T^(-1) has rank 1 above and below the diagonal.

Vandermonde matrix V.
V c = b gives the coefficients of p(x) = c_0 + ... + c_(n−1) x^(n−1) with p(x_i) = b_i. V_ij = (x_i)^(j−1) and det V = product of (x_k − x_i) for k > i.
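A sketch checking the determinant formula on three made-up points (0-based indexing, so V[i][j] = x_i**j):

```python
def vandermonde(xs):
    """V_ij = (x_i)^(j-1), written 0-based as V[i][j] = x_i**j."""
    n = len(xs)
    return [[x ** j for j in range(n)] for x in xs]

def det3(M):
    """Cofactor expansion of a 3x3 determinant."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

xs = [1, 2, 4]
V = vandermonde(xs)
lhs = det3(V)

# det V = product of (x_k - x_i) for k > i
rhs = 1
for i in range(3):
    for k in range(i + 1, 3):
        rhs *= xs[k] - xs[i]
# lhs == rhs == (2-1)*(4-1)*(4-2) = 6
```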