 1.R.1RQ: a) Define the negation of a proposition. b) What is ...
 1.R.2RQ: a) Define (using truth tables) the disjunction, conjunction, exclus...
 1.R.3RQ: a) Describe at least five different ways to write the conditional...
 1.R.4RQ: a) What does it mean for two propositions to be logically equivale...
 1.R.5RQ: a) Given a truth table, explain how to use disjunctive normal form ...
 1.R.6RQ: What are the universal and existential quantifications of a predica...
 1.R.7RQ: a) What is the difference between the quantification \(\exists x \...
 1.R.8RQ: Describe what is meant by a valid argument in propositional logic a...
 1.R.9RQ: Use rules of inference to show that if the premises "All zebras hav...
 1.R.10RQ: a) Describe what is meant by a direct proof, a proof by contraposit...
 1.R.11RQ: a) Describe a way to prove the biconditional p ↔ q. b...
 1.R.12RQ: To prove that the statements p1, p2, p3, and p4 are equivalent, is ...
 1.R.13RQ: a) Suppose that a statement of the form is false. How can this be p...
 1.R.14RQ: What is the difference between a constructive and nonconstructive ...
 1.R.15RQ: What are the elements of a proof that there is a unique element x s...
 1.R.16RQ: Explain how a proof by cases can be used to prove a result about ab...
Solutions for Chapter 1.R: Propositional Equivalences
Full solutions for Discrete Mathematics and Its Applications, 7th Edition
ISBN: 9780073383095
Summary of Chapter 1.R: Propositional Equivalences
An important type of step used in a mathematical argument is the replacement of a statement with another statement with the same truth value.

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
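A minimal numpy sketch of the column picture, using a made-up 2×2 system: the entries of x are the coefficients that combine the columns of A into b.

```python
import numpy as np

# Hypothetical 2x2 system to illustrate the column picture.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])

x = np.linalg.solve(A, b)          # x = [1, 2]
# b is the combination x[0]*(column 1) + x[1]*(column 2).
print(np.allclose(x[0] * A[:, 0] + x[1] * A[:, 1], b))  # True
```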

Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
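The rule can be sketched directly in numpy (a made-up 2×2 example; for large systems this is far more expensive than a direct solve):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

n = len(b)
x = np.empty(n)
for j in range(n):
    Bj = A.copy()
    Bj[:, j] = b                              # replace column j of A with b
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

# Cramer's Rule agrees with the direct solve.
print(np.allclose(x, np.linalg.solve(A, b)))  # True
```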

Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
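A short sketch of elimination without row exchanges (assumes no zero pivots appear; the 2×2 matrix is made up): the multipliers that clear each column are stored in L, and multiplying back gives A.

```python
import numpy as np

def lu_no_exchanges(A):
    """Gaussian elimination without row exchanges: returns L, U with A = L @ U."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]       # multiplier l_ik
            U[i, :] -= L[i, k] * U[k, :]      # subtract l_ik * (row k)
    return L, U

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])
L, U = lu_no_exchanges(A)
print(np.allclose(L @ U, A))                  # True: L brings U back to A
```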

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Kronecker product (tensor product) A ⊗ B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).
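A numpy check of the eigenvalue rule on two made-up 2×2 matrices: the eigenvalues of A ⊗ B are all pairwise products of the eigenvalues of A and B.

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, 3.0]])        # eigenvalues 1, 3
B = np.array([[4.0, 0.0], [1.0, 2.0]])        # eigenvalues 4, 2

K = np.kron(A, B)                             # blocks a_ij * B, shape (4, 4)
eigA = np.linalg.eigvals(A)
eigB = np.linalg.eigvals(B)
products = np.sort([p * q for p in eigA for q in eigB])
print(np.allclose(np.sort(np.linalg.eigvals(K)), products))  # True
```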

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
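A sketch with made-up data (3 equations, 2 unknowns): solving the normal equations gives x̂, and the residual e is orthogonal to every column of A.

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns (hypothetical data).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)     # normal equations A^T A x̂ = A^T b
e = b - A @ x_hat

print(np.allclose(A.T @ e, 0))                # True: e ⟂ every column of A
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```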

Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that (AB)x equals A(Bx).

Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; m(λ) always divides p(λ).
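A small numpy sketch (examples chosen for illustration): for A = I the characteristic polynomial is (λ − 1)², but the degree-1 polynomial λ − 1 already annihilates A; with distinct eigenvalues, m(λ) has full degree.

```python
import numpy as np

# A = I (2x2): characteristic polynomial (1 - lambda)^2,
# but minimal polynomial is (lambda - 1): A - I is already zero.
A = np.eye(2)
print(np.allclose(A - np.eye(2), 0))          # True

# Distinct eigenvalues 1 and 2: the minimal polynomial needs both factors.
B = np.diag([1.0, 2.0])
print(np.allclose((B - 1 * np.eye(2)) @ (B - 2 * np.eye(2)), 0))  # True
print(np.allclose(B - 1 * np.eye(2), 0))      # False: degree 1 is not enough
```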

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
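A made-up Jordan block shows the two multiplicities can differ: λ = 5 is a double root, but the eigenspace is only one-dimensional.

```python
import numpy as np

A = np.array([[5.0, 1.0],
              [0.0, 5.0]])                    # Jordan block for lambda = 5

vals = np.linalg.eigvals(A)
print(np.allclose(vals, [5.0, 5.0]))          # True: AM = 2
# GM = dimension of the eigenspace = nullity of A - 5I.
gm = 2 - np.linalg.matrix_rank(A - 5.0 * np.eye(2))
print(gm)                                     # 1
```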

Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
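A quick numpy check on a made-up strictly triangular example:

```python
import numpy as np

# Triangular with zero diagonal: nilpotent.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

print(np.allclose(np.linalg.matrix_power(N, 3), 0))   # True: N^3 = 0
print(np.allclose(np.linalg.eigvals(N), 0))           # True: only eigenvalue is 0
```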

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1} and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
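A sketch using a QR factorization of a random matrix to produce orthonormal columns (the 3×3 size and test vector are made up):

```python
import numpy as np

# QR of a random square matrix gives orthonormal columns q_1, q_2, q_3.
Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((3, 3)))

print(np.allclose(Q.T @ Q, np.eye(3)))        # True: Q^T Q = I

v = np.array([1.0, 2.0, 3.0])
# Expansion in the orthonormal basis: v = sum of (v^T q_j) q_j.
recon = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
print(np.allclose(recon, v))                  # True
```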

Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
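This can be seen with sympy's row reduction on a made-up matrix whose middle column is twice the first (so it is free), while columns 1 and 3 carry the pivots:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 7]])
R, pivots = A.rref()                          # reduced row echelon form
print(pivots)                                 # (0, 2): pivot columns 1 and 3
# Column 2 is a free column: a combination (2x) of the earlier pivot column.
print(A[:, 1] == 2 * A[:, 0])                 # True
```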

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P² = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S then P = A(A^T A)^{-1} A^T.
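These properties can be checked in numpy (the basis matrix A and vector b are made up):

```python
import numpy as np

# Columns of A are a basis for the subspace S (hypothetical example).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T          # P = A (A^T A)^{-1} A^T

b = np.array([6.0, 0.0, 0.0])
e = b - P @ b                                 # error e = b - Pb
print(np.allclose(A.T @ e, 0))                # True: e perpendicular to S
print(np.allclose(P @ P, P) and np.allclose(P, P.T))   # True: P^2 = P = P^T
print(np.allclose(np.linalg.eigvalsh(P), [0.0, 1.0, 1.0]))  # eigenvalues 0 or 1
```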

Rayleigh quotient q(x) = x^T A x / x^T x for symmetric A: λ_min ≤ q(x) ≤ λ_max.
Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A).
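A numpy sketch with a made-up symmetric matrix: random vectors stay inside the eigenvalue bounds, and the bottom eigenvector attains λ_min.

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])                    # symmetric; eigenvalues 1 and 3
lam = np.linalg.eigvalsh(S)                   # ascending: [1., 3.]

rng = np.random.default_rng(1)
for _ in range(100):
    x = rng.standard_normal(2)
    q = (x @ S @ x) / (x @ x)
    assert lam[0] - 1e-12 <= q <= lam[-1] + 1e-12   # lam_min <= q(x) <= lam_max

# The extreme is reached at the eigenvector for lam_min.
vals, vecs = np.linalg.eigh(S)
x_min = vecs[:, 0]
print(np.isclose((x_min @ S @ x_min) / (x_min @ x_min), vals[0]))  # True
```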

Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

Reflection matrix (Householder) Q = I − 2uu^T.
Unit vector u is reflected to Qu = −u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^{-1} = Q.
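All three facts check out in numpy (the vectors u and x are made up, with u^T x = 0 by construction):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)                     # unit vector
Q = np.eye(3) - 2.0 * np.outer(u, u)          # Q = I - 2 u u^T

print(np.allclose(Q @ u, -u))                 # True: u reflects to -u
x = np.array([2.0, 1.0, -2.0])                # u^T x = 0: x is in the mirror
print(np.allclose(Q @ x, x))                  # True: mirror plane is fixed
print(np.allclose(Q.T, Q) and np.allclose(Q @ Q, np.eye(3)))  # Q^T = Q^{-1} = Q
```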

Right inverse A^+.
If A has full row rank m, then A^+ = A^T(AA^T)^{-1} has AA^+ = I_m.
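A quick numpy check on a made-up 2×3 matrix of full row rank:

```python
import numpy as np

# Full row rank: 2x3, rank 2 (hypothetical example).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
A_plus = A.T @ np.linalg.inv(A @ A.T)         # A^+ = A^T (A A^T)^{-1}

print(np.allclose(A @ A_plus, np.eye(2)))     # True: A A^+ = I_m
```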

Skew-symmetric matrix K.
The transpose is −K, since K_ij = −K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^{Kt} is an orthogonal matrix.
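The eigenvalue claim can be verified in numpy on a made-up 2×2 example (eigenvalues ±2i):

```python
import numpy as np

K = np.array([[0.0, 2.0],
              [-2.0, 0.0]])
print(np.allclose(K.T, -K))                   # True: K is skew-symmetric

eig = np.linalg.eigvals(K)                    # +2i and -2i
print(np.allclose(eig.real, 0))               # True: pure imaginary
```

For this K, e^{Kt} is the plane rotation by angle 2t, which is orthogonal.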