Solutions for Chapter 3.1: Systems of Linear Equations in Two Variables

Full solutions for Intermediate Algebra for College Students | 6th Edition

ISBN: 9780321758934

Textbook: Intermediate Algebra for College Students
Edition: 6
Author: Robert F. Blitzer
ISBN: 9780321758934

Chapter 3.1: Systems of Linear Equations in Two Variables includes 125 full step-by-step solutions. More than 87,990 students have viewed the step-by-step solutions from this chapter. Intermediate Algebra for College Students, 6th edition, was written by Robert F. Blitzer and is associated with ISBN 9780321758934. This expansive textbook survival guide covers the textbook's chapters and their solutions.

Key Math Terms and definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
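    For illustration, a minimal NumPy sketch (the 3-node directed graph and its edges are invented) that builds an adjacency matrix and checks whether it is symmetric:

```python
import numpy as np

# Hypothetical 3-node directed graph with edges 0->1, 1->2, 2->0, 0->2
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
n = 3

A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1                      # aij = 1 when there is an edge from node i to node j

print(A)
print(np.array_equal(A, A.T))        # True only when every edge goes both ways (undirected)
```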

  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

  • Back substitution.

    Upper triangular systems are solved in reverse order, from x_n back to x_1.
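    A minimal NumPy sketch of the idea (the example matrix and right-hand side are invented): solve Ux = b starting from the last unknown and working backward.

```python
import numpy as np

def back_substitution(U, b):
    """Solve Ux = b for upper triangular U, working from x_n back to x_1."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

U = np.array([[2.0, 1.0, 1.0],
              [0.0, 3.0, 2.0],
              [0.0, 0.0, 4.0]])
b = np.array([7.0, 8.0, 4.0])
print(back_substitution(U, b))       # [2. 2. 1.], the same as np.linalg.solve(U, b)
```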

  • Condition number.

    cond(A) = c(A) = ||A|| ||A^-1|| = σmax/σmin. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to changes in the input.
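    A short NumPy check of the formula (the nearly singular 2x2 matrix below is an invented example):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])        # nearly singular, so badly conditioned

# cond(A) = ||A|| ||A^-1|| = sigma_max / sigma_min (2-norm)
s = np.linalg.svd(A, compute_uv=False)
print(s[0] / s[-1])
print(np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2))
print(np.linalg.cond(A, 2))          # all three agree
```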

  • Echelon matrix U.

    The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

  • Elimination matrix = Elementary matrix Eij.

    The identity matrix with an extra -eij in the i, j entry (i ≠ j). Then Eij A subtracts eij times row j of A from row i.
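    A small NumPy sketch (the matrix is invented for illustration): E21 with multiplier e21 = 2 subtracts 2 times row 1 from row 2.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])

E = np.eye(2)
E[1, 0] = -2.0                       # identity with an extra -e21 = -2 in the (2, 1) entry

print(E @ A)                         # row 2 becomes [0, 3]: 2 times row 1 was subtracted
```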

  • Free columns of A.

    Columns without pivots; these are combinations of earlier columns.

  • Free variable Xi.

    Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).

  • Full row rank r = m.

    Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

  • Hessenberg matrix H.

    Triangular matrix with one extra nonzero adjacent diagonal.

  • Kronecker product (tensor product) A ® B.

    Blocks aij B; eigenvalues are λp(A) λq(B).
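    A quick NumPy illustration (both matrices invented) that the eigenvalues of the Kronecker product are the products λp(A) λq(B):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

K = np.kron(A, B)                    # 4x4 matrix built from blocks aij * B

print(np.sort(np.linalg.eigvals(K)))                       # [-3, -1, 1, 3]
print(np.sort(np.outer(np.linalg.eigvals(A),
                       np.linalg.eigvals(B)).ravel()))     # the same products
```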

  • Left nullspace N(A^T).

    Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

  • Markov matrix M.

    All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady state eigenvector s with Ms = s > 0.
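    A small NumPy sketch (the 2x2 matrix is an invented example) showing powers of M pushing any probability vector toward the steady state s with Ms = s:

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])           # entries positive, each column sums to 1

x = np.array([1.0, 0.0])
for _ in range(50):
    x = M @ x                        # repeated multiplication by M
print(x)                             # approaches (0.6, 0.4)

w, V = np.linalg.eig(M)
s = V[:, np.argmax(w)]               # eigenvector for the largest eigenvalue, lambda = 1
print(s / s.sum())                   # also (0.6, 0.4)
```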

  • Permutation matrix P.

    There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges needed to reach I.
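    A brief NumPy illustration (the row order is arbitrary): build P from the rows of I and apply it.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])

order = [2, 0, 1]                    # one of the 3! = 6 orders of the rows of I
P = np.eye(3)[order]

print(P @ A)                         # rows of A in the order 2, 0, 1
print(np.linalg.det(P))              # +1 here: this 3-cycle is an even permutation
```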

  • Pseudoinverse A+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(A^T). A+ A and A A+ are the projection matrices onto the row space and column space. rank(A+) = rank(A).
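    A NumPy sketch (the rank-1 matrix is invented for illustration) of the projection and rank properties:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])           # rank 1, so A has no ordinary inverse

A_plus = np.linalg.pinv(A)           # the n by m Moore-Penrose pseudoinverse

print(np.round(A @ A_plus, 6))       # projection onto the column space of A
print(np.round(A_plus @ A, 6))       # projection onto the row space of A
print(np.linalg.matrix_rank(A_plus)) # rank(A+) = rank(A) = 1
```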

  • Reduced row echelon form R = rref(A).

    Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
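    A short SymPy check (the example matrix is invented); Matrix.rref returns R together with the pivot columns.

```python
import sympy as sp

A = sp.Matrix([[1, 2, 1],
               [2, 4, 0],
               [3, 6, 1]])

R, pivot_cols = A.rref()             # reduced row echelon form and pivot columns
print(R)                             # pivots are 1, with zeros above and below them
print(pivot_cols)                    # (0, 2): two pivots, so rank r = 2
```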

  • Rotation matrix R.

    R = [c -s; s c] rotates the plane by θ, and R^-1 = R^T rotates back by -θ. Eigenvalues are e^{iθ} and e^{-iθ}; eigenvectors are (1, ±i). Here c, s = cos θ, sin θ.
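    A NumPy sketch with θ = π/3 (an arbitrary angle) confirming R^-1 = R^T and the complex eigenvalues:

```python
import numpy as np

theta = np.pi / 3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])

print(np.round(R @ R.T, 10))         # the identity, since R^-1 = R^T
print(np.linalg.eigvals(R))          # cos(theta) +/- i sin(theta) = e^{+/- i theta}
```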

  • Row space C(A^T) = all combinations of rows of A.

    Column vectors by convention.

  • Spectrum of A = the set of eigenvalues {λ1, ..., λn}.

    Spectral radius = max of |λi|.
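    A one-line NumPy check (the matrix is invented): compute the spectrum, then take the largest absolute eigenvalue.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, -3.0]])

spectrum = np.linalg.eigvals(A)      # the set of eigenvalues {lambda_1, ..., lambda_n}
print(spectrum)                      # [ 2. -3.]
print(np.max(np.abs(spectrum)))      # spectral radius = max |lambda_i| = 3.0
```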

  • Symmetric matrix A.

    The transpose is A^T = A, and aij = aji. A^-1 is also symmetric.