
# Solutions for Chapter 2.5: Discrete Mathematics and Its Applications 7th Edition

## Full solutions for Discrete Mathematics and Its Applications | 7th Edition

ISBN: 9780073383095


This textbook survival guide was created for Discrete Mathematics and Its Applications, 7th edition (ISBN: 9780073383095). Chapter 2.5 includes 40 full step-by-step solutions, which have been viewed by more than 150,256 students.

## Key Math Terms and definitions covered in this textbook
• Adjacency matrix of a graph.

Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
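As a quick illustration (not from the text), the adjacency matrix of a small undirected graph can be built directly from an edge list; the node labels and edges below are a hypothetical example. Because each edge is recorded in both directions, the matrix equals its own transpose:

```python
# Build the adjacency matrix of a small undirected graph.
# Nodes are 0..3; edges are a hypothetical example.
n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

A = [[0] * n for _ in range(n)]
for i, j in edges:
    A[i][j] = 1  # edge from i to j
    A[j][i] = 1  # undirected: the edge goes both ways

# For an undirected graph, A = A^T.
A_T = [[A[j][i] for j in range(n)] for i in range(n)]
assert A == A_T
```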

• Cholesky factorization

A = C^T C = (L√D)(L√D)^T for positive definite A.

• Dimension of vector space

dim(V) = number of vectors in any basis for V.

• Echelon matrix U.

The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

• Exponential e^At = I + At + (At)^2/2! + ···

has derivative Ae^At; e^At u(0) solves u' = Au.

• Full column rank r = n.

Independent columns, N(A) = {0}, no free variables.

• Kirchhoff's Laws.

Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

• Linear combination cv + dw or Σ c_j v_j.

• Linear transformation T.

Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.

• Nullspace N(A)

= All solutions to Ax = 0. Dimension n - r = (# columns) - rank.

• Pivot.

The diagonal entry (first nonzero) at the time when a row is used in elimination.

• Plane (or hyperplane) in Rn.

Vectors x with a^T x = 0. Plane is perpendicular to a ≠ 0.

• Right inverse A^+.

If A has full row rank m, then A^+ = A^T(AA^T)^-1 has AA^+ = I_m.

• Rotation matrix

R = [c -s; s c] rotates the plane by θ and R^-1 = R^T rotates back by -θ. Eigenvalues are e^(iθ) and e^(-iθ); eigenvectors are (1, ±i). c, s = cos θ, sin θ.
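A minimal numeric check of the rotation-matrix property (the angle π/3 is an arbitrary choice for illustration): multiplying R^T by R should give the identity, confirming R^-1 = R^T.

```python
import math

# Rotation by theta: R = [[c, -s], [s, c]] with c, s = cos(theta), sin(theta).
theta = math.pi / 3  # arbitrary illustrative angle
c, s = math.cos(theta), math.sin(theta)
R = [[c, -s], [s, c]]

# R^-1 = R^T: the product R^T R should be the 2x2 identity.
R_T = [[R[j][i] for j in range(2)] for i in range(2)]
I2 = [[sum(R_T[i][k] * R[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

for i in range(2):
    for j in range(2):
        expected = 1.0 if i == j else 0.0
        assert abs(I2[i][j] - expected) < 1e-12
```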

• Row space C(A^T) = all combinations of rows of A.

Column vectors by convention.

• Similar matrices A and B.

Every B = M^-1 AM has the same eigenvalues as A.

• Skew-symmetric matrix K.

The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^Kt is an orthogonal matrix.

• Spanning set.

Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

• Transpose matrix AT.

Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, and positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^T)^-1.
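The transpose rule for products can be checked on small matrices; the 3-by-2 and 2-by-3 matrices below are arbitrary examples chosen for illustration, verifying that (AB)^T equals B^T A^T.

```python
# Verify (AB)^T = B^T A^T with plain-Python helpers.
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 4], [5, 6]]   # 3 by 2 (arbitrary example)
B = [[7, 8, 9], [10, 11, 12]]  # 2 by 3 (arbitrary example)

lhs = transpose(matmul(A, B))              # (AB)^T
rhs = matmul(transpose(B), transpose(A))   # B^T A^T
assert lhs == rhs
```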

• Vandermonde matrix V.

Vc = b gives the coefficients of p(x) = c_0 + ··· + c_(n-1) x^(n-1) with p(x_i) = b_i. V_ij = (x_i)^(j-1) and det V = product of (x_k - x_i) for k > i.
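The determinant formula can be checked numerically for a small case (the three sample points below are a hypothetical choice): build V with V_ij = (x_i)^(j-1), compute its 3-by-3 determinant by cofactor expansion, and compare with the product of (x_k - x_i) for k > i.

```python
# Vandermonde matrix for three sample points (hypothetical choice).
x = [1, 2, 4]
n = len(x)
V = [[x[i] ** j for j in range(n)] for i in range(n)]  # V_ij = x_i^(j-1)

def det3(M):
    # Determinant of a 3x3 matrix by cofactor expansion along the first row.
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

# det V = product of (x_k - x_i) over k > i.
product = 1
for k in range(n):
    for i in range(k):
        product *= x[k] - x[i]

assert det3(V) == product  # both equal (2-1)(4-1)(4-2) = 6
```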
