- 1.2.1: Which of the matrices that follow are in row echelon form? Which ar...
- 1.2.2: The augmented matrices that follow are in row echelon form. For eac...
- 1.2.3: The augmented matrices that follow are in reduced row echelon form....
- 1.2.4: For each of the systems in Exercise 3, make a list of the lead vari...
- 1.2.5: For each of the systems of equations that follow, use Gaussian elim...
- 1.2.6: Use Gauss-Jordan reduction to solve each of the following systems: (...
- 1.2.7: Give a geometric explanation of why a homogeneous linear system con...
- 1.2.8: Consider a linear system whose augmented matrix is of the form 1 2 ...
- 1.2.9: Consider a linear system whose augmented matrix is of the form 1 2 ...
- 1.2.10: Consider a linear system whose augmented matrix is of the form 1 1 ...
- 1.2.11: Given the linear systems (a) x1 + 2x2 = 2 3x1 + 7x2 = 8 (b) x1 + 2x...
- 1.2.12: Given the linear systems (a) x1 + 2x2 + x3 = 2 x1 x2 + 2x3 = 3 2x1 ...
- 1.2.13: Given a homogeneous system of linear equations, if the system is ov...
- 1.2.14: Given a nonhomogeneous system of linear equations, if the system is...
- 1.2.15: Determine the values of x1, x2, x3, and x4 for the following traffi...
- 1.2.16: Consider the traffic flow diagram that follows, where a1, a2, a3, a...
- 1.2.17: Let (c1, c2) be a solution of the 2 × 2 system a11x1 + a12x2 = 0 a21x...
- 1.2.18: In Application 3, the solution (6, 6, 6, 1) was obtained by setting...
- 1.2.19: Liquid benzene burns in the atmosphere. If a cold object is placed ...
- 1.2.20: Nitric acid is prepared commercially by a series of three chemical ...
- 1.2.21: In Application 4, determine the relative values of x1, x2, and x3 i...
- 1.2.22: Determine the amount of each current for the following networks:
Solutions for Chapter 1.2: Row Echelon Form
Full solutions for Linear Algebra with Applications | 8th Edition
Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
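A minimal sketch of this definition in plain Python (the 3-node graph and its edge list are an invented example):

```python
# Adjacency matrix of a small undirected graph on 3 nodes
# with edges 0-1 and 1-2 (0-based node indices; example data mine).
edges = [(0, 1), (1, 2)]
n = 3
A = [[0] * n for _ in range(n)]
for i, j in edges:
    A[i][j] = 1
    A[j][i] = 1          # undirected: the edge goes both ways

# A equals its transpose, as the entry above states for undirected graphs.
symmetric = all(A[i][j] == A[j][i] for i in range(n) for j in range(n))
print(A)          # [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
```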
cond(A) = c(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to changes in the input.
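As a numerical sanity check of these equalities, a NumPy sketch (the 2 × 2 matrix is arbitrary; `np.linalg.cond` is the library routine):

```python
import numpy as np

# cond(A) three ways: singular values, norms, and the library call.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

sigma = np.linalg.svd(A, compute_uv=False)       # singular values
c1 = sigma.max() / sigma.min()                   # sigma_max / sigma_min
c2 = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
c3 = np.linalg.cond(A, 2)                        # library routine

print(c1, c2, c3)   # all three agree up to rounding
```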
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j)/det(A).
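A NumPy sketch of Cramer's Rule on an invented 2 × 2 system:

```python
import numpy as np

# Cramer's Rule for Ax = b (example data mine).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b          # B_j: b replaces column j of A
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

print(x)                  # [0.8 1.4], matching np.linalg.solve(A, b)
```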
Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.
Eigenvalue A and eigenvector x.
Ax = λx with x ≠ 0, so det(A - λI) = 0.
Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
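The row-operation recipe can be sketched in NumPy as Gauss-Jordan elimination with partial pivoting (the helper name and the example matrix are my own):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row operations on [A I] until it reads [I A^-1]."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])        # augmented [A I]
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # partial pivoting
        M[[col, pivot]] = M[[pivot, col]]              # swap rows
        M[col] /= M[col, col]                          # scale pivot to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]         # clear the column
    return M[:, n:]                                    # right half is A^-1

A = np.array([[2.0, 1.0], [1.0, 1.0]])
Ainv = gauss_jordan_inverse(A)
print(Ainv)        # [[ 1. -1.] [-1.  2.]]
```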
Determinant |A| = det(A).
|A^-1| = 1/|A| and |A^T| = |A|. The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n - 1, volume of box = |det(A)|.
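A small NumPy check of the two determinant identities, on an arbitrary 3 × 3 matrix:

```python
import numpy as np

# Verify |A^-1| = 1/|A| and |A^T| = |A| numerically (example A mine).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 1.0]])
d = np.linalg.det(A)

ok_inv = np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / d)
ok_transpose = np.isclose(np.linalg.det(A.T), d)
print(ok_inv, ok_transpose)   # True True
```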
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
Linearly dependent v1, ..., vn.
A combination other than all ci = 0 gives Σ ci vi = 0.
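One common way to test dependence numerically is a rank check; a NumPy sketch with an invented dependent set:

```python
import numpy as np

# v1, ..., vn are dependent exactly when the matrix with those columns
# has rank < n, i.e. some nonzero combination of them gives 0.
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([2.0, 4.0, 6.0])     # v2 = 2*v1, so the set is dependent
v3 = np.array([0.0, 1.0, 0.0])

V = np.column_stack([v1, v2, v3])
dependent = np.linalg.matrix_rank(V) < 3
print(dependent)                   # True: 2*v1 - 1*v2 + 0*v3 = 0
```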
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).
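A tiny NumPy illustration of the repeated-eigenvalue case (A = 3I is my example): the characteristic polynomial is (λ - 3)^2, but already λ - 3 annihilates A, so m has lower degree than p:

```python
import numpy as np

# A = 3I has the single eigenvalue 3, repeated twice.
A = 3 * np.eye(2)

m_of_A = A - 3 * np.eye(2)                            # m(A), m(l) = l - 3
p_of_A = (A - 3 * np.eye(2)) @ (A - 3 * np.eye(2))    # p(A), p(l) = (l - 3)^2

# Both vanish (Cayley-Hamilton guarantees p(A) = 0), but m has degree 1.
print(np.allclose(m_of_A, 0), np.allclose(p_of_A, 0))   # True True
```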
Pascal matrix Ps.
Ps = pascal(n) = the symmetric matrix with binomial entries C(i+j-2, i-1). Ps = PL PU all contain Pascal's triangle with det = 1 (see Pascal in the index).
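The binomial-entry definition can be sketched in plain Python (pascal(n) above is MATLAB notation; this sketch only rebuilds the symmetric matrix Ps):

```python
from math import comb

# Symmetric Pascal matrix: entry (i, j) is C(i+j-2, i-1), 1-based.
n = 4
Ps = [[comb(i + j - 2, i - 1) for j in range(1, n + 1)]
      for i in range(1, n + 1)]

print(Ps)
# [[1, 1, 1, 1], [1, 2, 3, 4], [1, 3, 6, 10], [1, 4, 10, 20]]
# Rows and columns both read off Pascal's triangle, and Ps is symmetric.
```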
Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A(A^T A)^-1 A^T.
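A NumPy sketch verifying these properties for an invented basis A of a plane S in R^3:

```python
import numpy as np

# P = A (A^T A)^-1 A^T projects onto the column space of A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

b = np.array([1.0, 2.0, 3.0])
p = P @ b                  # closest point to b in S
e = b - p                  # error vector

print(np.allclose(P @ P, P), np.allclose(P, P.T))  # P^2 = P = P^T
print(np.allclose(A.T @ e, 0))                     # e is perpendicular to S
```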
Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.
Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.
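A short NumPy sketch (u and v chosen arbitrarily):

```python
import numpy as np

# A = u v^T has rank one: every column is a multiple of u
# and every row is a multiple of v^T.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])
A = np.outer(u, v)          # u v^T, a 3x2 matrix

print(np.linalg.matrix_rank(A))   # 1
```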
Span of v1, ..., vm.
Combinations of v1, ..., vm fill the space. The columns of A span C(A)!
Spectral Theorem A = QAQT.
Real symmetric A has real λ's and orthonormal q's.
Vector addition.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.
Vector space V.
Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.