
Solutions for Chapter 11.3: Implications; The Biconditional Connective; Tautologies

Full solutions for Finite Mathematics, Binder Ready Version: An Applied Approach | 11th Edition

ISBN: 9780470876398

Textbook: Finite Mathematics, Binder Ready Version: An Applied Approach
Edition: 11
Author: Michael Sullivan
ISBN: 9780470876398

Finite Mathematics, Binder Ready Version: An Applied Approach was written by Michael Sullivan and is associated with the ISBN 9780470876398. Chapter 11.3: Implications; The Biconditional Connective; Tautologies includes 38 full step-by-step solutions. Since all 38 problems in Chapter 11.3 have been answered, more than 16715 students have viewed full step-by-step solutions from this chapter. This expansive textbook survival guide covers every chapter of the textbook and its solutions, and was created for Finite Mathematics, Binder Ready Version: An Applied Approach, 11th edition.

Key Math Terms and definitions covered in this textbook
  • Back substitution.

    Upper triangular systems are solved in reverse order, x_n back to x_1.
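
A minimal NumPy sketch of back substitution for an upper triangular system Ux = b (the matrix and right-hand side below are illustrative, not from the textbook):

```python
import numpy as np

def back_substitution(U, b):
    """Solve Ux = b for upper triangular U, working from x_n back to x_1."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):               # last unknown first
        x[i] = (b[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

U = np.array([[2.0, 1.0],
              [0.0, 3.0]])
b = np.array([5.0, 6.0])
print(back_substitution(U, b))                   # [1.5, 2.0]
```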

  • Circulant matrix C.

    Constant diagonals wrap around as in the cyclic shift S. Every circulant is c_0 I + c_1 S + ... + c_{n-1} S^{n-1}. Cx = cyclic convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
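
A short sketch (the vectors are illustrative) building a circulant from powers of the cyclic shift S and checking that Cx equals the cyclic convolution c * x, computed here with the FFT:

```python
import numpy as np

n = 4
c = np.array([1.0, 2.0, 3.0, 4.0])
x = np.array([5.0, 6.0, 7.0, 8.0])

S = np.roll(np.eye(n), 1, axis=0)    # cyclic shift: S sends e_k to e_{k+1}

# C = c_0 I + c_1 S + ... + c_{n-1} S^{n-1}
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

# Cx should match the cyclic convolution c * x (convolution theorem via FFT)
conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
print(np.allclose(C @ x, conv))      # True
```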

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
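
A hedged sketch of the standard conjugate gradient iteration for symmetric positive definite A (the small test problem is made up for illustration):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Minimize (1/2) x^T A x - x^T b, i.e. solve Ax = b, for SPD A."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x                      # residual = negative gradient
    p = r.copy()                       # first search direction
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)     # exact line search along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p           # next direction, A-conjugate to the others
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))        # agrees with np.linalg.solve(A, b)
```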

  • Cyclic shift S.

    Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots of unity e^{2πik/n}; eigenvectors are the columns of the Fourier matrix F.

  • Diagonalization

    Λ = S^{-1} A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^{-1}.
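
A small NumPy check (matrix chosen for illustration) that A = S Λ S^{-1} and that powers follow A^k = S Λ^k S^{-1}:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])                       # eigenvalues 5 and 2

eigvals, S = np.linalg.eig(A)                    # S = eigenvector matrix
Lam = np.diag(eigvals)                           # Lambda = eigenvalue matrix

print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))        # A = S Lambda S^{-1}

k = 5
print(np.allclose(np.linalg.matrix_power(A, k),
                  S @ np.diag(eigvals ** k) @ np.linalg.inv(S)))  # A^k
```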

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.

    Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A)^T (column j of B).

  • Hankel matrix H.

    Constant along each antidiagonal; h_ij depends on i + j.

  • Hermitian matrix A^H = Ā^T = A.

    Complex analog ā_ji = a_ij of a symmetric matrix.

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
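
A small sketch (the edge list is made up for illustration) that builds the edge-node incidence matrix of a directed graph:

```python
import numpy as np

# Directed edges (from node i to node j), nodes numbered 0..3
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
m, n = len(edges), 4

A = np.zeros((m, n))                 # one row per edge
for row, (i, j) in enumerate(edges):
    A[row, i] = -1                   # edge leaves node i
    A[row, j] = +1                   # edge enters node j

print(A)
```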

  • Indefinite matrix.

    A symmetric matrix with eigenvalues of both signs (+ and - ).

  • Length ||x||.

    Square root of x^T x (Pythagoras in n dimensions).

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

  • Minimal polynomial of A.

    The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).

  • Norm

    ||A||. The ℓ^2 norm of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm ||A||_F^2 = ΣΣ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
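
A quick NumPy comparison of these norms on an illustrative matrix (np.linalg.norm covers all four):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

l2   = np.linalg.norm(A, 2)          # largest singular value, sigma_max
fro  = np.linalg.norm(A, 'fro')      # square root of the sum of squared entries
l1   = np.linalg.norm(A, 1)          # largest absolute column sum
linf = np.linalg.norm(A, np.inf)     # largest absolute row sum

print(l2, fro, l1, linf)
print(l1 == np.abs(A).sum(axis=0).max(),      # True
      linf == np.abs(A).sum(axis=1).max())    # True
```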

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A = basis for S then P = A(A^T A)^{-1} A^T.
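
A short sketch (basis chosen for illustration) forming P = A(A^T A)^{-1} A^T and checking its projection properties:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                        # columns = basis for S
P = A @ np.linalg.inv(A.T @ A) @ A.T              # P = A (A^T A)^{-1} A^T

b = np.array([6.0, 0.0, 0.0])
p = P @ b                                         # closest point to b in S
e = b - p                                         # error vector

print(np.allclose(P @ P, P), np.allclose(P, P.T)) # P^2 = P = P^T
print(np.allclose(A.T @ e, 0))                    # e is perpendicular to S
```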

  • Random matrix rand(n) or randn(n).

    MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.
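
The glossary describes MATLAB's rand(n) and randn(n); a rough NumPy analogue (seed and size are arbitrary) looks like this:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)

U = rng.random((n, n))            # like rand(n): uniform on [0, 1)
G = rng.standard_normal((n, n))   # like randn(n): standard normal entries

print(U)
print(G)
```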

  • Saddle point of f(x_1, ..., x_n).

    A point where the first derivatives of f are zero and the second-derivative matrix (∂^2 f / ∂x_i ∂x_j = Hessian matrix) is indefinite.
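
A tiny illustration (the function f(x, y) = x^2 - y^2 is an example of my choosing, not the textbook's): at the origin both first derivatives vanish and the Hessian is indefinite.

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2 at the origin
H = np.array([[2.0,  0.0],
              [0.0, -2.0]])

print(np.linalg.eigvalsh(H))      # one positive and one negative eigenvalue
```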

  • Vandermonde matrix V.

    Vc = b gives the coefficients of p(x) = c_0 + c_1 x + ... + c_{n-1} x^{n-1} with p(x_i) = b_i. V_ij = (x_i)^{j-1} and det V = product of (x_k - x_i) for k > i.
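
A brief sketch (points and values are illustrative) solving Vc = b for polynomial coefficients, with V_ij = x_i^{j-1} built by np.vander:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])        # interpolation points x_i
b = np.array([2.0, 5.0, 10.0])       # target values b_i = p(x_i)

V = np.vander(x, increasing=True)    # columns are 1, x, x^2, so V_ij = x_i^(j-1)
c = np.linalg.solve(V, b)            # coefficients c_0, ..., c_{n-1}

print(c)                             # [1, 0, 1]: here p(x) = 1 + x^2
print(np.allclose(V @ c, b))         # p(x_i) = b_i
```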

  • Vector addition.

    v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of parallelogram.
