Solutions for Chapter 6.6: Zero and Negative Exponents

Discovering Algebra: An Investigative Approach | 2nd Edition | ISBN: 9781559537636 | Authors: Jerald Murdock, Ellen Kamischke, Eric Kamischke


Discovering Algebra: An Investigative Approach (2nd edition) is associated with ISBN 9781559537636. Chapter 6.6: Zero and Negative Exponents includes 15 full step-by-step solutions, and more than 14,264 students have viewed them. This expansive textbook survival guide covers this chapter along with the book's other chapters and their solutions.

Key Math Terms and definitions covered in this textbook
  • Companion matrix.

    Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n-1) - λ^n).
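As a quick check of the formula, the sketch below (pure Python; the helper names `companion` and `det` are mine, not from the text) builds a 3 by 3 companion matrix and evaluates det(A - λI) at one sample λ:

```python
# Build the companion matrix of c1 + c2*lam + c3*lam^2 - lam^3 and test
# det(A - lam*I) against the formula at a sample lam (up to the ± sign).

def companion(c):
    """n x n matrix: ones just above the main diagonal, c1..cn in row n."""
    n = len(c)
    A = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        A[i][i + 1] = 1.0
    A[n - 1] = [float(cj) for cj in c]
    return A

def det(M):
    """Cofactor expansion along the first row (fine for tiny matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

c = [2.0, -5.0, 3.0]                       # c1, c2, c3
A = companion(c)
lam = 1.5
shifted = [[A[i][j] - (lam if i == j else 0.0) for j in range(3)] for i in range(3)]
p = c[0] + c[1] * lam + c[2] * lam ** 2 - lam ** 3
print(abs(det(shifted)) - abs(p))          # ~0: determinant matches the polynomial
```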

  • Cramer's Rule for Ax = b.

    Bj has b replacing column j of A; xj = det(Bj) / det(A).
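A concrete 2 by 2 instance (a minimal pure-Python sketch; `det2` is my own helper name):

```python
# Cramer's Rule for a 2x2 system Ax = b: x_j = det(B_j) / det(A),
# where B_j is A with column j replaced by b.

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[2.0, 1.0],
     [1.0, 3.0]]
b = [5.0, 10.0]

B1 = [[b[0], A[0][1]], [b[1], A[1][1]]]   # b replaces column 1
B2 = [[A[0][0], b[0]], [A[1][0], b[1]]]   # b replaces column 2

x = [det2(B1) / det2(A), det2(B2) / det2(A)]
print(x)  # solves 2x + y = 5, x + 3y = 10  ->  [1.0, 3.0]
```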

  • Cyclic shift S.

    Permutation with S21 = 1, S32 = 1, ..., finally S1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.
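The eigenvector claim can be verified numerically for n = 4 (a sketch; `shift`, `f`, and `lam` are my own names):

```python
import cmath

# A Fourier-matrix column is an eigenvector of the cyclic shift S,
# with an nth root of unity as its eigenvalue.

n, k = 4, 1
w = cmath.exp(2j * cmath.pi / n)
f = [w ** (k * j) for j in range(n)]        # column k of the Fourier matrix

def shift(v):
    """Apply S (S21 = S32 = ... = S1n = 1): cyclically shift entries down."""
    return [v[-1]] + v[:-1]

Sf = shift(f)
lam = Sf[0] / f[0]                          # candidate eigenvalue
print(abs(lam ** n - 1))                    # ~0: lam is an nth root of 1
print(max(abs(Sf[j] - lam * f[j]) for j in range(n)))  # ~0: S f = lam f
```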

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Dot product = Inner product x^T y = x1y1 + ... + xnyn.

    Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A) · (column j of B).
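Both facts in one small pure-Python sketch (`dot` is my own helper name):

```python
# Dot product, perpendicularity, and the entry formula
# (AB)_ij = (row i of A) . (column j of B).

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

x, y = [1.0, 2.0, -1.0], [3.0, 0.0, 3.0]
print(dot(x, y))          # 1*3 + 2*0 + (-1)*3 = 0.0 -> x and y are perpendicular

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
colB1 = [row[1] for row in B]       # column j = 1 of B
print(dot(A[0], colB1))             # (AB)_01 = 1*6 + 2*8 = 22.0
```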

  • Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).

    Use A^H (the conjugate transpose) for complex A.

  • Gram-Schmidt orthogonalization A = QR.

    Independent columns in A, orthonormal columns in Q. Each column qj of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
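The process above can be sketched in pure Python (classical Gram-Schmidt on two independent columns; `gram_schmidt` is my own name for it):

```python
import math

# Orthonormalize the columns of A, recording the coefficients in an
# upper-triangular R with diag(R) > 0, so that A = QR.

def gram_schmidt(cols):
    Q, R = [], [[0.0] * len(cols) for _ in cols]
    for j, a in enumerate(cols):
        v = a[:]
        for i, q in enumerate(Q):
            R[i][j] = sum(qk * ak for qk, ak in zip(q, a))   # q_i . a_j
            v = [vk - R[i][j] * qk for vk, qk in zip(v, q)]  # subtract projection
        R[j][j] = math.sqrt(sum(vk * vk for vk in v))        # positive diagonal
        Q.append([vk / R[j][j] for vk in v])
    return Q, R   # Q holds orthonormal columns; R is upper triangular

cols = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]   # independent columns of A
Q, R = gram_schmidt(cols)
print(sum(a * b for a, b in zip(Q[0], Q[1])))   # ~0: columns of Q are orthogonal
print(R[1][0], R[0][0] > 0 and R[1][1] > 0)     # 0.0 below the diagonal, diag > 0
```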

  • Hessenberg matrix H.

    Triangular matrix with one extra nonzero adjacent diagonal.

  • Hilbert matrix hilb(n).

    Entries Hij = 1/(i + j - 1) = ∫₀¹ x^(i-1) x^(j-1) dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
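A small directed graph makes the definition concrete (a sketch; `incidence` and the example graph are my own):

```python
# Edge-node incidence matrix: one row per edge (i -> j),
# with -1 in column i and +1 in column j.

def incidence(edges, n_nodes):
    A = []
    for i, j in edges:
        row = [0] * n_nodes
        row[i], row[j] = -1, 1
        A.append(row)
    return A

edges = [(0, 1), (1, 2), (0, 2)]   # 3 edges on nodes 0, 1, 2
A = incidence(edges, 3)
print(A)                           # [[-1, 1, 0], [0, -1, 1], [-1, 0, 1]]
# Each row sums to 0, so the all-ones vector lies in the nullspace N(A).
print([sum(row) for row in A])     # [0, 0, 0]
```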

  • Indefinite matrix.

    A symmetric matrix with eigenvalues of both signs (+ and - ).

  • Determinant.

    |A^(-1)| = 1/|A| and |A^T| = |A|. The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n - 1, volume of box = |det(A)|.
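The big formula can be written directly as a sum over all n! permutations, and it makes |A^T| = |A| easy to check (a sketch; `perm_sign` and `det_big` are my own names):

```python
import math
from itertools import permutations

# det(A) as a signed sum over all n! permutations: for n = 3 this is 6 terms.

def perm_sign(p):
    """Sign of a permutation: (-1)^(number of inversions)."""
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det_big(A):
    n = len(A)
    return sum(perm_sign(p) * math.prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

A = [[2.0, 1.0, 0.0],
     [0.0, 3.0, 1.0],
     [1.0, 1.0, 2.0]]
print(det_big(A))                  # sum of 3! = 6 terms

AT = [list(col) for col in zip(*A)]
print(det_big(AT) == det_big(A))   # |A^T| = |A| -> True
```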

  • Least squares solution x̂.

    The vector x̂ that minimizes ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
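A worked instance, fitting a line c + d·t to three points via the normal equations (a minimal pure-Python sketch; the data and variable names are mine):

```python
# Least squares: solve A^T A x = A^T b for a line c + d*t through 3 points.

ts = [0.0, 1.0, 2.0]
bs = [1.0, 3.0, 4.0]                     # data to fit
A = [[1.0, t] for t in ts]               # columns of A: ones, then t

# Form A^T A (2x2) and A^T b (2-vector).
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atb = [sum(A[k][i] * bs[k] for k in range(3)) for i in range(2)]

# Solve the 2x2 system (any small solver would do).
d = (AtA[0][0] * Atb[1] - AtA[1][0] * Atb[0]) / \
    (AtA[0][0] * AtA[1][1] - AtA[1][0] * AtA[0][1])
c = (Atb[0] - AtA[0][1] * d) / AtA[0][0]
print(c, d)                              # best-fit intercept and slope

# The error e = b - A x_hat is orthogonal to every column of A.
e = [bs[k] - (c + d * ts[k]) for k in range(3)]
print(sum(e), sum(ek * t for ek, t in zip(e, ts)))  # both ~0
```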

  • Markov matrix M.

    All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady state eigenvector Ms = s > 0.
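The convergence of the columns of M^k can be watched directly (a sketch; the 2 by 2 matrix and `matmul` are my own):

```python
# Columns of M^k approach the steady-state eigenvector s with Ms = s.
# This M has all entries positive and column sums equal to 1.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

M = [[0.8, 0.3],
     [0.2, 0.7]]           # column sums are 1

P = M
for _ in range(50):
    P = matmul(P, M)       # P = M^51

print(P)                          # both columns near the steady state (0.6, 0.4)
print(matmul(M, [[0.6], [0.4]]))  # M s ~ s: approximately [[0.6], [0.4]]
```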

  • Pivot.

    The diagonal entry (first nonzero) at the time when a row is used in elimination.

  • Schur complement S = D - CA^(-1)B.

    Appears in block elimination on the block matrix [A B; C D].

  • Standard basis for Rn.

    Columns of the n by n identity matrix (written i, j, k in R3).

  • Trace of A.

    = sum of diagonal entries = sum of eigenvalues of A. Tr(AB) = Tr(BA).
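The identity Tr(AB) = Tr(BA) holds even when AB ≠ BA, which a quick pure-Python check confirms (`matmul` and `trace` are my own helper names):

```python
# Tr(AB) = Tr(BA) for matrices that do not commute.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [5.0, 2.0]]
AB, BA = matmul(A, B), matmul(B, A)
print(AB == BA)                 # False: A and B do not commute
print(trace(AB), trace(BA))     # yet the traces agree
```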

  • Vector addition.

    v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.

  • Vector v in Rn.

    Sequence of n real numbers v = (v1, ..., vn) = point in Rn.