 1.4.1: All flowers are plants. Pansies are flowers.
 1.4.2: All flowers are plants. Pansies are plants.
 1.4.3: All flowers are red or purple. Pansies are flowers. Pansies are not...
 1.4.4: Some flowers are purple. All purple flowers are small.
 1.4.5: Some flowers are red. Some flowers are purple. Pansies are flowers.
 1.4.6: Some flowers are red. Some flowers are purple. Pansies are flowers.
 1.4.7: Justify each step in the following proof sequence of (∃x)[P(x) → Q...
 1.4.8: Justify each step in the following proof sequence of (∃x)P(x) ∧ (∀...
 1.4.9: Consider the wff (∀x)[(∃y)P(x, y) ∧ (∃y)Q(x, y)] → (∀x)(∃y)[P(x,...
 1.4.10: Consider the wff (∀y)(∃x)Q(x, y) → (∃x)(∀y)Q(x, y) a. Find an int...
 1.4.11: In Exercises 11–16, prove that each wff is a valid argument.
 1.4.12: In Exercises 11–16, prove that each wff is a valid argument.
 1.4.13: In Exercises 11–16, prove that each wff is a valid argument.
 1.4.14: In Exercises 11–16, prove that each wff is a valid argument.
 1.4.15: In Exercises 11–16, prove that each wff is a valid argument.
 1.4.16: In Exercises 11–16, prove that each wff is a valid argument.
 1.4.17: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.18: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.19: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.20: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.21: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.22: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.23: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.24: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.25: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.26: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.27: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.28: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.29: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.30: In Exercises 17–30, either prove that the wff is a valid argument or...
 1.4.31: The Greek philosopher Aristotle (384–322 B.C.E.) studied under Plato...
 1.4.32: Some plants are flowers. All flowers smell sweet. Therefore, some p...
 1.4.33: Every crocodile is bigger than every alligator. Sam is a crocodile....
 1.4.34: There is an astronomer who is not nearsighted. Everyone who wears g...
 1.4.35: Every member of the board comes from industry or government. Everyo...
 1.4.36: There is some movie star who is richer than everyone. Anyone who is...
 1.4.37: Everyone with red hair has freckles. Someone has red hair and big f...
 1.4.38: Cats eat only animals. Something fuzzy exists. Everything that's fuz...
 1.4.39: Every computer science student works harder than somebody, and ever...
 1.4.40: Every ambassador speaks only to diplomats, and some ambassador spea...
 1.4.41: Some elephants are afraid of all mice. Some mice are small. Therefo...
 1.4.42: Every farmer owns a cow. No dentist owns a cow. Therefore no dentis...
 1.4.43: Prove that [(∀x)A(x)]′ ↔ (∃x)[A(x)]′ is valid. (Hint: Instead of a p...
 1.4.44: The equivalence of Exercise 43 says that if it is false that every ...
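The negation equivalence behind Exercises 43–44 can be sanity-checked by brute force over every possible predicate on a small finite domain. This is only an illustrative Python script (a finite check, not a substitute for the formal proof the exercise asks for); the names `domain` and `equivalent_on` are made up for this sketch.

```python
# Check that ¬(∀x)A(x) and (∃x)¬A(x) agree for every predicate A
# on a small finite domain, by enumerating all truth assignments.
from itertools import product

domain = range(4)

def equivalent_on(truth_values):
    """truth_values[i] = A(i) for each element i of the domain."""
    lhs = not all(truth_values)              # ¬(∀x)A(x)
    rhs = any(not v for v in truth_values)   # (∃x)¬A(x)
    return lhs == rhs

# Every one of the 2^4 possible predicates agrees on both sides.
all_agree = all(equivalent_on(tv)
                for tv in product([False, True], repeat=len(domain)))
print(all_agree)  # True
```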
Solutions for Chapter 1.4: Predicate Logic
Full solutions for Mathematical Structures for Computer Science, 7th Edition
ISBN: 9781429215107

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
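The rank test above is easy to illustrate numerically. A quick NumPy check (the matrices are made-up examples, not from the text):

```python
import numpy as np

A = np.array([[1., 2.], [2., 4.]])   # rank 1
b_good = np.array([[3.], [6.]])      # in the column space of A
b_bad = np.array([[3.], [7.]])       # not in the column space

def solvable(A, b):
    # Ax = b is solvable exactly when rank([A b]) == rank(A).
    return np.linalg.matrix_rank(np.hstack([A, b])) == np.linalg.matrix_rank(A)

print(solvable(A, b_good))  # True
print(solvable(A, b_bad))   # False
```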

Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in nullspace).

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2)x^T Ax − x^T b over growing Krylov subspaces.
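A minimal sketch of the conjugate gradient iteration described above, in plain NumPy. The 2×2 system is an arbitrary positive definite example, and `conjugate_gradient` is an illustrative name, not a library routine:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Minimize (1/2) x^T A x - x^T b for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual = negative gradient
    p = r.copy()           # first search direction
    for _ in range(len(b)):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)   # keeps directions A-conjugate
        p = r_new + beta * p
        r = r_new
    return x

A = np.array([[4., 1.], [1., 3.]])
b = np.array([1., 2.])
x = conjugate_gradient(A, b)
```

In exact arithmetic the method terminates in at most n steps, which is why the loop runs `len(b)` times.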

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Exponential e^{At} = I + At + (At)^2/2! + ...
has derivative Ae^{At}; e^{At}u(0) solves u' = Au.
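The truncated series can be checked numerically against a case where the exact answer is known. A NumPy sketch with an assumed diagonal A (so each component of u just grows like e^{a_ii t}):

```python
import numpy as np

def expm_series(A, terms=30):
    """Truncated series for e^{A} (t = 1): I + A + A^2/2! + ..."""
    result = np.eye(len(A))
    power = np.eye(len(A))
    for k in range(1, terms):
        power = power @ A / k      # power now holds A^k / k!
        result = result + power
    return result

# u' = Au with diagonal A: u_i(t) = e^{a_ii t} u_i(0).
A = np.diag([1.0, -2.0])
u0 = np.array([3.0, 5.0])
u1 = expm_series(A) @ u0           # approximates u(1) = [3e, 5e^{-2}]
```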

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
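A minimal elimination-based LU sketch in NumPy (no pivoting, so it assumes nonzero pivots; `lu_no_pivot` is an illustrative name):

```python
import numpy as np

def lu_no_pivot(A):
    """A = LU by elimination without row exchanges.
    L stores the multipliers l_ij below a unit diagonal."""
    n = len(A)
    U = A.astype(float).copy()
    L = np.eye(n)
    for j in range(n):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]            # multiplier l_ij
            U[i, :] = U[i, :] - L[i, j] * U[j, :]  # eliminate entry (i, j)
    return L, U

A = np.array([[2., 1.], [6., 8.]])
L, U = lu_no_pivot(A)
# L has l_21 = 3 on a unit diagonal, and L @ U brings U back to A.
```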

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy F_n = F_{n-1} + F_{n-2} = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1 1], [1 0]].
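Both claims are easy to verify numerically. A NumPy sketch (`binet` is an illustrative helper name for the closed-form formula):

```python
import numpy as np

lam1 = (1 + np.sqrt(5)) / 2      # growth rate, largest eigenvalue
lam2 = (1 - np.sqrt(5)) / 2

def binet(n):
    """F_n = (lam1^n - lam2^n) / (lam1 - lam2)."""
    return (lam1**n - lam2**n) / (lam1 - lam2)

# F_0..F_7 from the recurrence F_n = F_{n-1} + F_{n-2}
fibs = [0, 1]
for _ in range(6):
    fibs.append(fibs[-1] + fibs[-2])

formula = [round(binet(n)) for n in range(8)]
print(fibs == formula)  # True

# lam1 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]].
eigs = np.linalg.eigvalsh(np.array([[1., 1.], [1., 0.]]))
print(np.isclose(max(eigs), lam1))  # True
```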

Fourier matrix F.
Entries F_jk = e^{2πijk/n} give orthogonal columns: F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform y_j = Σ c_k e^{2πijk/n}.
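A NumPy check of the orthogonality relation and of the connection to numpy's `ifft` (which uses the same exponent sign but includes a 1/n factor, so y = Fc = n·ifft(c)):

```python
import numpy as np

n = 4
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(2j * np.pi * j * k / n)        # F_jk = e^{2*pi*i*j*k/n}

# Orthogonal columns: conj(F)^T F = n I.
print(np.allclose(F.conj().T @ F, n * np.eye(n)))  # True

c = np.array([1.0, 2.0, 3.0, 4.0])
y = F @ c
print(np.allclose(y, n * np.fft.ifft(c)))  # True
```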

Hermitian matrix A^H = Ā^T = A.
Complex analog a_ji = ā_ij of a symmetric matrix.

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

Left nullspace N (AT).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Linear combination cv + dw or Σ c_j v_j.
Vector addition and scalar multiplication.

Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If all m_ij > 0, the columns of M^k approach the steady state eigenvector s with Ms = s > 0.
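A small NumPy illustration with a made-up 2×2 Markov matrix, taking a high power of M and reading off the steady state from its columns:

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])          # columns sum to 1, all entries > 0

# Powers of M approach a rank-one matrix whose columns are the
# steady state eigenvector s with M s = s.
Mk = np.linalg.matrix_power(M, 50)
s = Mk[:, 0]

print(np.allclose(M @ s, s))        # s is the eigenvector for lambda = 1
print(np.allclose(Mk[:, 1], s))     # both columns have converged to s
```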

Multiplier ℓ_ij.
The pivot row j is multiplied by ℓ_ij and subtracted from row i to eliminate the (i, j) entry: ℓ_ij = (entry to eliminate)/(jth pivot).

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1} and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
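Orthonormal columns obtained from a QR factorization illustrate both properties. A NumPy sketch (the random matrix is an arbitrary example):

```python
import numpy as np

# Orthonormal columns via QR of a random square matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

print(np.allclose(Q.T @ Q, np.eye(3)))   # True: orthonormal columns

# Square Q: Q^T = Q^{-1}, and the columns are a basis for R^3,
# so every v expands as v = sum_j (v^T q_j) q_j.
v = np.array([1.0, -2.0, 0.5])
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
print(np.allclose(expansion, v))         # True
```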

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
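A short NumPy illustration (the chosen order is arbitrary; note Python indexes rows from 0 rather than 1):

```python
import numpy as np

order = [2, 0, 1]                        # one of the n! orders of 0, ..., n-1
P = np.eye(3)[order]                     # rows of I in that order

A = np.arange(9.0).reshape(3, 3)
print(np.allclose(P @ A, A[order]))      # PA puts the rows of A in that order

# det P = +1 or -1; this 3-cycle needs two row exchanges, so P is even.
print(round(np.linalg.det(P)))  # 1
```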

Pseudoinverse A^+ (Moore–Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+A and AA^+ are the projection matrices onto the row space and column space. Rank(A^+) = rank(A).
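The projection and rank properties can be checked with NumPy's `pinv`. The rank-1 rectangular matrix below is a made-up example:

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 0.],
              [0., 0.]])               # rank 1, not square, not invertible
A_plus = np.linalg.pinv(A)

# A+A and AA+ are projections (P^2 = P) onto row space and column space.
P_row = A_plus @ A
P_col = A @ A_plus
print(np.allclose(P_row @ P_row, P_row))  # True
print(np.allclose(P_col @ P_col, P_col))  # True

# rank(A+) = rank(A)
print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))  # True
```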

Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second derivative matrix (∂^2 f/∂x_i ∂x_j = Hessian matrix) is indefinite.

Schwarz inequality
|v·w| ≤ ||v|| ||w||. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
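Both inequalities are easy to test numerically. A NumPy sketch with random vectors; A is made positive definite by construction (B B^T plus a multiple of I):

```python
import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(3)
w = rng.standard_normal(3)

# |v . w| <= ||v|| ||w||
print(abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w))  # True

# For positive definite A: |v^T A w|^2 <= (v^T A v)(w^T A w)
B = rng.standard_normal((3, 3))
A = B @ B.T + 3 * np.eye(3)            # positive definite by construction
print((v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w))  # True
```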