Complex Variables and Applications 9th Edition - Solutions by Chapter

Full solutions for Complex Variables and Applications | 9th Edition

ISBN: 9780073383170 | Authors: James Ward Brown

This expansive textbook survival guide covers 12 chapters and was created for the textbook Complex Variables and Applications, 9th edition. Since problems from all 12 chapters have been answered, more than 13,959 students have viewed the full step-by-step answers. Complex Variables and Applications was written by James Ward Brown and is associated with the ISBN 9780073383170. The full step-by-step solutions to the problems in Complex Variables and Applications were answered by our top Math solution expert on 12/23/17, 04:39 PM.

Key Math Terms and Definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
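
    A minimal NumPy sketch (not from the textbook; the 4-node edge list is hypothetical) showing how the a_ij = 1 rule fills the matrix:

        import numpy as np

        edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # hypothetical directed edges i -> j
        n = 4
        A = np.zeros((n, n), dtype=int)
        for i, j in edges:
            A[i, j] = 1                            # a_ij = 1 for an edge from node i to node j
        print(A)
        print(np.array_equal(A, A.T))              # True only if every edge also runs the other way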

  • Affine transformation

    T(v) = Av + v_0 = linear transformation plus shift.
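
    A small sketch (the matrix A, shift v_0, and input v are made-up examples) of applying T(v) = Av + v_0 with NumPy:

        import numpy as np

        A = np.array([[2.0, 0.0],
                      [0.0, 3.0]])       # hypothetical linear part
        v0 = np.array([1.0, -1.0])       # hypothetical shift
        v = np.array([4.0, 5.0])

        Tv = A @ v + v0                  # linear transformation plus shift
        print(Tv)                        # [ 9. 14.]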

  • Complete solution x = x_p + x_n to Ax = b.

    (Particular x_p) + (x_n in the nullspace).
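
    A sketch for a hypothetical underdetermined system (the 2x3 matrix and right-hand side are assumptions), taking a particular solution from the pseudoinverse and a nullspace vector from the SVD:

        import numpy as np

        A = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])
        b = np.array([3.0, 5.0])

        x_p = np.linalg.pinv(A) @ b       # one particular solution of Ax = b
        _, s, Vt = np.linalg.svd(A)
        x_n = Vt[-1]                      # spans the nullspace: A @ x_n is (numerically) 0

        x = x_p + 2.5 * x_n               # any multiple of x_n can be added
        print(np.allclose(A @ x, b))      # True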

  • Condition number

    cond(A) = c(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to change in the input.
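
    A numerical check on a hypothetical 2x2 matrix that the expressions for cond(A) agree in the 2-norm:

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])                        # hypothetical example matrix

        sigma = np.linalg.svd(A, compute_uv=False)
        cond_svd = sigma.max() / sigma.min()              # sigma_max / sigma_min
        cond_norm = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)

        print(np.isclose(cond_svd, cond_norm))            # True
        print(np.isclose(cond_svd, np.linalg.cond(A, 2))) # matches NumPy's built-in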

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Fourier matrix F.

    Entries F_jk = e^(2πijk/n) give orthogonal columns: F^H F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ_k c_k e^(2πijk/n).
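
    A sketch with n = 4 (an assumed size) that builds F entrywise, checks F^H F = nI, and compares y = Fc against NumPy's inverse FFT (which carries an extra 1/n factor):

        import numpy as np

        n = 4
        j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        F = np.exp(2j * np.pi * j * k / n)                 # F_jk = e^(2*pi*i*j*k/n)

        print(np.allclose(F.conj().T @ F, n * np.eye(n)))  # orthogonal columns: F^H F = nI

        c = np.array([1.0, 2.0, 0.0, -1.0])                # hypothetical coefficients
        y = F @ c                                          # y_j = sum_k c_k e^(2*pi*i*j*k/n)
        print(np.allclose(y, n * np.fft.ifft(c)))          # True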

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
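
    A sketch building the incidence matrix for the same hypothetical 4-node edge list used above (the -1 marks where the edge leaves, the +1 where it enters):

        import numpy as np

        edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # hypothetical directed edges i -> j
        n = 4                                       # nodes
        m = len(edges)                              # edges

        B = np.zeros((m, n), dtype=int)
        for row, (i, j) in enumerate(edges):
            B[row, i] = -1                          # edge leaves node i
            B[row, j] = 1                           # edge enters node j
        print(B)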

  • Jordan form J = M^-1 AM.

    If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on the first superdiagonal. Each block has one eigenvalue λ_k and one eigenvector.
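
    A sketch using SymPy (assumed available) on a hypothetical defective matrix; jordan_form returns the transform M and the block-diagonal J with A = M J M^-1:

        from sympy import Matrix

        A = Matrix([[1, 1],
                    [-1, 3]])       # hypothetical: eigenvalue 2 repeated, only one eigenvector

        M, J = A.jordan_form()      # A == M * J * M**-1
        print(J)                    # Matrix([[2, 1], [0, 2]]): a single 2x2 Jordan block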

  • Left inverse A^+.

    If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
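
    A check on a hypothetical 3x2 full-column-rank matrix that (A^T A)^-1 A^T is a left inverse and agrees with the pseudoinverse in this case:

        import numpy as np

        A = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [1.0, 1.0]])                    # hypothetical 3x2 matrix, rank 2

        A_plus = np.linalg.inv(A.T @ A) @ A.T         # left inverse (A^T A)^-1 A^T
        print(np.allclose(A_plus @ A, np.eye(2)))     # A^+ A = I_n
        print(np.allclose(A_plus, np.linalg.pinv(A))) # matches the pseudoinverse here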

  • Multiplication Ax

    Ax = x_1 (column 1) + ... + x_n (column n) = combination of columns.
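
    A quick check (hypothetical A and x) that the matrix-vector product really is a combination of the columns:

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [3.0, 4.0],
                      [5.0, 6.0]])
        x = np.array([10.0, -1.0])

        combo = x[0] * A[:, 0] + x[1] * A[:, 1]   # x_1 (column 1) + x_2 (column 2)
        print(np.allclose(A @ x, combo))          # True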

  • Network.

    A directed graph that has constants c_1, ..., c_m associated with the edges.

  • Normal matrix.

    If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.

  • Particular solution x_p.

    Any solution to Ax = b; often x_p has free variables = 0.

  • Positive definite matrix A.

    Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = L D L^T with diag(D) > 0.
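
    A sketch on a hypothetical 2x2 symmetric matrix checking the equivalent tests: positive eigenvalues, a successful Cholesky factorization (whose squared diagonal gives the pivots of A = LDL^T), and x^T A x > 0 for a sample nonzero x:

        import numpy as np

        A = np.array([[4.0, 1.0],
                      [1.0, 3.0]])                  # hypothetical symmetric matrix

        print(np.all(np.linalg.eigvalsh(A) > 0))    # positive eigenvalues
        L = np.linalg.cholesky(A)                   # succeeds only for positive definite A
        print(np.all(np.diag(L) ** 2 > 0))          # squared diagonal = pivots of A = LDL^T

        x = np.array([0.5, -2.0])                   # a sample nonzero x
        print(x @ A @ x > 0)                        # x^T A x > 0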

  • Rank one matrix A = u v^T ≠ 0.

    Column and row spaces = lines cu and cv.
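
    A sketch with hypothetical u and v showing that the outer product u v^T has rank one and that every column is a multiple of u:

        import numpy as np

        u = np.array([1.0, 2.0, 3.0])
        v = np.array([4.0, 5.0])
        A = np.outer(u, v)                       # A = u v^T, a 3x2 matrix

        print(np.linalg.matrix_rank(A))          # 1
        print(np.allclose(A[:, 0], v[0] * u))    # each column lies on the line through u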

  • Reflection matrix (Householder) Q = I - 2uu^T.

    Unit vector u is reflected to Qu = -u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
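
    A sketch with a hypothetical unit vector u verifying Qu = -u, Qx = x for x in the mirror plane, and Q^T = Q^-1 = Q:

        import numpy as np

        u = np.array([1.0, 2.0, 2.0])
        u = u / np.linalg.norm(u)                # hypothetical unit vector
        Q = np.eye(3) - 2.0 * np.outer(u, u)     # Householder reflection Q = I - 2uu^T

        print(np.allclose(Q @ u, -u))            # u is reflected to -u
        x = np.array([2.0, -1.0, 0.0])           # u^T x = 0: x lies in the mirror plane
        print(np.allclose(Q @ x, x))             # unchanged
        print(np.allclose(Q @ Q, np.eye(3)))     # Q^T = Q^-1 = Q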

  • Row space C(A^T) = all combinations of rows of A.

    Column vectors by convention.

  • Singular matrix A.

    A square matrix that has no inverse: det(A) = 0.

  • Symmetric matrix A.

    The transpose is A^T = A, and a_ij = a_ji. A^-1 is also symmetric.

  • Unitary matrix U^H = Ū^T = U^-1.

    Orthonormal columns (complex analog of Q).