
# Solutions for Chapter 2.1: LINEAR EQUATIONS IN TWO VARIABLES

## Full solutions for College Algebra | 8th Edition

ISBN: 9781439048696

Chapter 2.1: LINEAR EQUATIONS IN TWO VARIABLES includes 146 full step-by-step solutions. This textbook survival guide was created for the textbook College Algebra, edition 8, which is associated with ISBN 9781439048696. Since all 146 problems in Chapter 2.1: LINEAR EQUATIONS IN TWO VARIABLES have been answered, more than 33191 students have viewed full step-by-step solutions from this chapter. This expansive textbook survival guide covers the following chapters and their solutions.

## Key math terms and definitions covered in this textbook
• Augmented matrix [A b].

Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
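
The rank test above can be sketched in a few lines of NumPy (the matrices here are illustrative, not from the textbook):

```python
import numpy as np

# Hypothetical example: Ax = b is solvable exactly when the augmented
# matrix [A b] has the same rank as A.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1: second row = 2 * first row
b_in = np.array([[3.0], [6.0]])     # lies in the column space of A
b_out = np.array([[3.0], [5.0]])    # does not

def solvable(A, b):
    """Return True iff rank([A b]) == rank(A)."""
    return np.linalg.matrix_rank(np.hstack([A, b])) == np.linalg.matrix_rank(A)

print(solvable(A, b_in))   # True
print(solvable(A, b_out))  # False
```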

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
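
A quick numerical check of the block formula, with an arbitrary 4-by-4 example partitioned into 2-by-2 blocks:

```python
import numpy as np

# Sketch: multiply two 4x4 matrices via 2x2 blocks and confirm the
# block formula matches ordinary multiplication.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Partition: A = [[A11, A12], [A21, A22]], likewise for B.
A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

# Block multiplication: (AB)11 = A11 B11 + A12 B21, and so on.
AB_blocks = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])

assert np.allclose(AB_blocks, A @ B)
```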

• Circulant matrix C.

Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0 I + c1 S + ... + c(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors in the Fourier matrix F.
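
The definition can be verified directly: build C from powers of the cyclic shift S and check that Cx is the circular convolution of c with x (the vectors below are made up for illustration):

```python
import numpy as np

# Sketch: circulant C = c0*I + c1*S + ... + c_{n-1}*S^{n-1} built from
# the cyclic shift S; then Cx equals the circular convolution c * x.
c = np.array([2.0, 5.0, 1.0, 3.0])
n = len(c)

S = np.roll(np.eye(n), 1, axis=0)   # cyclic shift: (Sx)[i] = x[i-1 mod n]
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

x = np.array([1.0, 0.0, 2.0, -1.0])
conv = np.array([sum(c[k] * x[(i - k) % n] for k in range(n))
                 for i in range(n)])

assert np.allclose(C @ x, conv)
```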

• Column picture of Ax = b.

The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C (A).
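
The column picture in one short computation (small example matrix, not from the textbook):

```python
import numpy as np

# Sketch: Ax is the combination of the columns of A with weights from x,
# so b = Ax says exactly that b lies in the column space C(A).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([2.0, -1.0])

combo = x[0] * A[:, 0] + x[1] * A[:, 1]   # column picture of Ax
assert np.allclose(combo, A @ x)
```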

• Companion matrix.

Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - lambda*I) = +/-(c1 + c2 lambda + c3 lambda^2 + ... + cn lambda^(n-1) - lambda^n).
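
A small sketch of that construction: the eigenvalues of the companion matrix are the roots of lambda^n = c1 + c2 lambda + ... + cn lambda^(n-1). The coefficients below are chosen so the roots are 1, 2, 3:

```python
import numpy as np

# Sketch: companion matrix with c1, ..., cn in row n and ones just
# above the main diagonal.
c = [6.0, -11.0, 6.0]     # lambda^3 = 6 - 11*lambda + 6*lambda^2, roots 1, 2, 3
n = len(c)

A = np.zeros((n, n))
A[np.arange(n - 1), np.arange(1, n)] = 1.0   # ones above the diagonal
A[n - 1, :] = c                              # coefficients in row n

eig = np.sort(np.linalg.eigvals(A).real)
print(np.round(eig, 6))   # [1. 2. 3.]
```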

• Diagonal matrix D.

dij = 0 if i != j. Block-diagonal: zero outside square blocks Dii.

• Dimension of vector space

dim(V) = number of vectors in any basis for V.

• Echelon matrix U.

The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

• Full column rank r = n.

Independent columns, N(A) = {0}, no free variables.

• Gauss-Jordan method.

Invert A by row operations on [A I] to reach [I A^(-1)].
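
A bare-bones sketch of Gauss-Jordan (no pivot exchanges, so it assumes nonzero pivots appear on the diagonal; the matrix is an arbitrary example):

```python
import numpy as np

# Sketch: row-reduce [A I] until the left half is I; the right half
# is then A^{-1}.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = A.shape[0]
M = np.hstack([A, np.eye(n)])        # the augmented matrix [A I]

for col in range(n):
    M[col] /= M[col, col]            # scale pivot row so the pivot is 1
    for row in range(n):
        if row != col:
            M[row] -= M[row, col] * M[col]   # clear the rest of the column

A_inv = M[:, n:]
assert np.allclose(A @ A_inv, np.eye(n))
```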

• Lucas numbers

Ln = 2, 1, 3, 4, ... satisfy Ln = L(n-1) + L(n-2) = lambda1^n + lambda2^n, with lambda1, lambda2 = (1 +/- sqrt(5))/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L0 = 2 with F0 = 0.
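
The recurrence and the eigenvalue formula agree, as a quick check shows:

```python
# Sketch: Lucas numbers from the recurrence Ln = L(n-1) + L(n-2) and
# from the closed form Ln = lambda1^n + lambda2^n, lambda = (1 +/- sqrt(5))/2.
from math import sqrt

def lucas(n):
    a, b = 2, 1          # L0 = 2, L1 = 1
    for _ in range(n):
        a, b = b, a + b
    return a

l1, l2 = (1 + sqrt(5)) / 2, (1 - sqrt(5)) / 2
for n in range(10):
    assert abs(lucas(n) - (l1**n + l2**n)) < 1e-9

print([lucas(n) for n in range(6)])   # [2, 1, 3, 4, 7, 11]
```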

• Matrix multiplication AB.

The i, j entry of AB is (row i of A) . (column j of B) = sum over k of aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
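
Two of the equivalent views checked numerically on arbitrary random matrices:

```python
import numpy as np

# Sketch: "columns times rows" writes AB as a sum of rank-one outer
# products (column k of A)(row k of B).
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
assert np.allclose(outer_sum, A @ B)

# Column view: column j of AB is A times column j of B.
assert np.allclose(A @ B[:, 1], (A @ B)[:, 1])
```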

• Orthonormal vectors q1, ..., qn.

Dot products are qi^T qj = 0 if i != j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q1, ..., qn is an orthonormal basis for R^n: every v = sum of (v^T qj) qj.
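
Both properties are easy to verify with an orthonormal basis obtained from a QR factorization of a random matrix:

```python
import numpy as np

# Sketch: Q with orthonormal columns has Q^T Q = I, and any v expands
# as v = sum of (v^T qj) qj in that basis.
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

assert np.allclose(Q.T @ Q, np.eye(4))

v = rng.standard_normal(4)
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(4))
assert np.allclose(expansion, v)
```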

• Partial pivoting.

In each column, choose the largest available pivot to control roundoff; all multipliers have |lij| <= 1. See condition number.

• Pivot.

The diagonal entry (first nonzero) at the time when a row is used in elimination.

• Right inverse A+.

If A has full row rank m, then A+ = A^T (A A^T)^(-1) has A A+ = Im.
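
The formula checked on a small full-row-rank example (the matrix is arbitrary):

```python
import numpy as np

# Sketch: for full row rank m, the right inverse A+ = A^T (A A^T)^{-1}
# satisfies A A+ = I_m.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])     # rank 2 = number of rows

A_plus = A.T @ np.linalg.inv(A @ A.T)
assert np.allclose(A @ A_plus, np.eye(2))
```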

• Saddle point of f(x1, ..., xn).

A point where the first derivatives of f are zero and the second derivative matrix (d^2 f / dxi dxj = Hessian matrix) is indefinite.

• Skew-symmetric matrix K.

The transpose is -K, since Kij = -Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.

• Tridiagonal matrix T: tij = 0 if |i - j| > 1.

T^(-1) has rank 1 above and below the diagonal.

• Unitary matrix U: U^H = (conjugate of U)^T = U^(-1).

Orthonormal columns (complex analog of Q).
