Modern Algebra: An Introduction 6th Edition - Solutions by Chapter

Full solutions for Modern Algebra: An Introduction | 6th Edition

Textbook: Modern Algebra: An Introduction
Edition: 6
Author: John R. Durbin
ISBN: 9780470384435

The full step-by-step solutions to the problems in Modern Algebra: An Introduction were answered by our top Math solution expert on 03/16/18, 02:52PM. This expansive textbook survival guide covers all 66 chapters. Modern Algebra: An Introduction was written by John R. Durbin and is associated with ISBN 9780470384435. This textbook survival guide was created for the textbook Modern Algebra: An Introduction, edition 6. Since problems from all 66 chapters have been answered, more than 12732 students have viewed the full step-by-step answers.

Key Math Terms and definitions covered in this textbook
  • Change of basis matrix M.

    The old basis vectors vj are combinations Σi mij wi of the new basis vectors. The coordinates of c1 v1 + ... + cn vn = d1 w1 + ... + dn wn are related by d = M c. (For n = 2, set v1 = m11 w1 + m21 w2, v2 = m12 w1 + m22 w2.)
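
    A quick numerical check of the relation d = M c (a minimal sketch in Python with NumPy; the matrices W and M below are arbitrary example values, not from the textbook):

      import numpy as np

      # New basis w1, w2 as columns of W; change-of-basis matrix M (arbitrary example values).
      W = np.array([[1.0, 1.0],
                    [0.0, 2.0]])
      M = np.array([[2.0, 1.0],
                    [1.0, 3.0]])

      # Old basis vectors v_j = sum_i m_ij w_i, i.e. V = W M.
      V = W @ M

      # A vector with coordinates c in the old basis has coordinates d = M c in the new basis.
      c = np.array([0.5, -1.0])
      d = M @ c
      assert np.allclose(V @ c, W @ d)   # same vector, expressed in either basis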

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.

  • Gram-Schmidt orthogonalization A = QR.

    Independent columns in A, orthonormal columns in Q. Each column qj of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
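
    NumPy's QR factorization produces exactly this shape of Q and R (a minimal sketch with an arbitrary example matrix; NumPy's sign convention for diag(R) may differ from the one above):

      import numpy as np

      A = np.array([[1.0, 1.0],
                    [1.0, 0.0],
                    [0.0, 1.0]])                    # independent columns

      Q, R = np.linalg.qr(A)                        # A = Q R
      assert np.allclose(Q.T @ Q, np.eye(2))        # orthonormal columns in Q
      assert np.allclose(Q @ R, A)                  # reconstruction
      assert np.allclose(np.tril(R, -1), 0)         # R is upper triangular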

  • Iterative method.

    A sequence of steps intended to approach the desired solution.
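
    Jacobi iteration is one common instance (an illustrative sketch only, not a method named in this glossary entry): each new estimate is computed from the previous one until it approaches the solution.

      import numpy as np

      def jacobi(A, b, steps=50):
          """Jacobi iteration for Ax = b (assumes A is diagonally dominant so it converges)."""
          x = np.zeros_like(b, dtype=float)
          D = np.diag(A)                  # diagonal entries of A
          R = A - np.diagflat(D)          # off-diagonal part
          for _ in range(steps):
              x = (b - R @ x) / D         # next estimate from the previous one
          return x

      A = np.array([[4.0, 1.0], [2.0, 5.0]])
      b = np.array([1.0, 2.0])
      assert np.allclose(jacobi(A, b), np.linalg.solve(A, b), atol=1e-8)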

  • Jordan form J = M^-1 A M.

    If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λk Ik + Nk, where Nk has 1's on diagonal 1 (the superdiagonal). Each block has one eigenvalue λk and one eigenvector.

  • |A^-1| = 1/|A| and |A^T| = |A|.

    The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n - 1, and the volume of a box is |det(A)|.
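
    Both identities are easy to confirm numerically (a minimal sketch with an arbitrary invertible matrix):

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [1.0, 3.0]])                    # arbitrary invertible matrix, det = 5

      assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A))   # |A^-1| = 1/|A|
      assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))                    # |A^T| = |A|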

  • Left inverse A+.

    If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
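
    A minimal numerical sketch (any tall matrix with independent columns will do):

      import numpy as np

      A = np.array([[1.0, 0.0],
                    [1.0, 1.0],
                    [0.0, 1.0]])                     # full column rank n = 2

      A_plus = np.linalg.inv(A.T @ A) @ A.T          # left inverse (A^T A)^-1 A^T
      assert np.allclose(A_plus @ A, np.eye(2))      # A^+ A = I_n
      assert np.allclose(A_plus, np.linalg.pinv(A))  # matches the pseudoinverse when columns are independent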

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σk aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
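
    The equivalent descriptions give the same product; a minimal sketch comparing three of them on arbitrary 2 by 2 matrices:

      import numpy as np

      A = np.array([[1.0, 2.0], [3.0, 4.0]])
      B = np.array([[5.0, 6.0], [7.0, 8.0]])

      # Entry i, j = (row i of A) . (column j of B)
      entrywise = np.array([[sum(A[i, k] * B[k, j] for k in range(2)) for j in range(2)]
                            for i in range(2)])
      # Column j of AB = A times column j of B
      by_columns = np.column_stack([A @ B[:, j] for j in range(2)])
      # AB = sum of (column k of A)(row k of B)
      cols_times_rows = sum(np.outer(A[:, k], B[k, :]) for k in range(2))

      assert np.allclose(entrywise, A @ B)
      assert np.allclose(by_columns, A @ B)
      assert np.allclose(cols_times_rows, A @ B)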

  • Minimal polynomial of A.

    The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).
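
    A minimal sketch with two hand-picked examples: A = 2I has characteristic polynomial (λ - 2)^3 but minimal polynomial λ - 2, while the Jordan block B below needs the square.

      import numpy as np

      A = 2 * np.eye(3)                                # repeated eigenvalue 2, but diagonalizable
      assert np.allclose(A - 2 * np.eye(3), 0)         # m(λ) = λ - 2 already sends A to the zero matrix

      B = np.array([[2.0, 1.0],
                    [0.0, 2.0]])                       # Jordan block: only one eigenvector for λ = 2
      assert not np.allclose(B - 2 * np.eye(2), 0)     # degree 1 is not enough
      assert np.allclose((B - 2 * np.eye(2)) @ (B - 2 * np.eye(2)), 0)   # m(λ) = (λ - 2)^2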

  • Multiplication Ax

    = x1 (column 1) + ... + xn (column n) = combination of columns.

  • Multiplicities AM and GM.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
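
    A minimal sketch contrasting the two multiplicities for λ = 2, with GM computed as n minus the rank of A - λI (the example matrix is arbitrary):

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [0.0, 2.0]])                           # eigenvalue 2 repeated, only one eigenvector

      eigenvalues = np.linalg.eigvals(A)
      AM = np.sum(np.isclose(eigenvalues, 2.0))            # algebraic multiplicity: 2
      GM = 2 - np.linalg.matrix_rank(A - 2.0 * np.eye(2))  # geometric multiplicity: 1
      print(AM, GM)                                        # 2 1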

  • Normal equation A^T A x = A^T b.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax) = 0.
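
    A minimal least-squares sketch on arbitrary data: solving the normal equation agrees with NumPy's built-in least-squares solver.

      import numpy as np

      A = np.array([[1.0, 0.0],
                    [1.0, 1.0],
                    [1.0, 2.0]])                    # full column rank
      b = np.array([1.0, 0.0, 2.0])

      x_hat = np.linalg.solve(A.T @ A, A.T @ b)     # normal equation A^T A x = A^T b
      assert np.allclose(A.T @ (b - A @ x_hat), 0)  # columns of A are orthogonal to the residual
      assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])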

  • Particular solution xp.

    Any solution to Ax = b; often xp has free variables = 0.

  • Permutation matrix P.

    There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges needed to reach I.
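
    A minimal sketch: build P from one order of the rows of I, then check that PA reorders the rows of A and that det P = ±1.

      import numpy as np

      order = [2, 0, 1]                            # one of the n! orders of the rows of I
      P = np.eye(3)[order]                         # rows of I in that order
      A = np.arange(9.0).reshape(3, 3)

      assert np.allclose(P @ A, A[order])          # P A puts the rows of A in the same order
      assert np.isclose(abs(np.linalg.det(P)), 1)  # det P = +1 or -1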

  • Skew-symmetric matrix K.

    The transpose is -K, since Kij = -Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^{Kt} is an orthogonal matrix.
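
    A minimal sketch with a 2 by 2 example; the matrix exponential comes from SciPy's expm.

      import numpy as np
      from scipy.linalg import expm

      K = np.array([[0.0, 1.0],
                    [-1.0, 0.0]])                      # K^T = -K

      assert np.allclose(K.T, -K)
      assert np.allclose(np.linalg.eigvals(K).real, 0) # eigenvalues +i and -i are pure imaginary
      Q = expm(K * 0.7)                                # e^{Kt} for t = 0.7 (a rotation here)
      assert np.allclose(Q.T @ Q, np.eye(2))           # orthogonal matrix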

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Spectrum of A = the set of eigenvalues {λ1, ..., λn}.

    Spectral radius = max of |λi|.
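
    A minimal sketch computing the spectrum and spectral radius of an arbitrary example matrix:

      import numpy as np

      A = np.array([[0.0, 2.0],
                    [-2.0, 0.0]])                  # eigenvalues 2i and -2i

      spectrum = np.linalg.eigvals(A)
      spectral_radius = max(abs(spectrum))         # max of |λi| = 2
      print(spectrum, spectral_radius)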

  • Subspace S of V.

    Any vector space inside V, including V and Z = {zero vector only}.

  • Vector addition.

    v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.

  • Wavelets wjk(t).

    Stretch and shift the time axis to create wjk(t) = w00(2^j t - k).
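
    A minimal sketch using the Haar mother wavelet as w00 (an assumption for illustration; this glossary entry does not fix a particular w00):

      def w00(t):
          """Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere."""
          if 0 <= t < 0.5:
              return 1.0
          if 0.5 <= t < 1.0:
              return -1.0
          return 0.0

      def w(j, k, t):
          """Stretched and shifted wavelet w_jk(t) = w00(2^j t - k)."""
          return w00(2**j * t - k)

      # w_{1,1} lives on [1/2, 1): it is +1 on [0.5, 0.75) and -1 on [0.75, 1).
      print(w(1, 1, 0.6), w(1, 1, 0.8), w(1, 1, 0.2))   # 1.0 -1.0 0.0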