- 7.3.1: Prove that (a) 7 is an accumulation point for [3, 7). (b) 0 is an acc...
- 7.3.2: Find an example of an infinite subset of that has (a) no accumulatio...
- 7.3.3: Find the derived set of each of the following sets.
- 7.3.4: Let Find
- 7.3.5: Prove that if and then z is an accumulation point of A.
- 7.3.6: (a) Prove that if then (b) Is the converse of part (a) true? Explain.
- 7.3.7: Let A and B be subsets of (a) Prove that (The operation of finding t...
- 7.3.8: Let A and B be sets of real numbers. Prove that (a) if B is closed a...
- 7.3.9: (a) Prove that if x is an interior point of the set A, then x is an...
- 7.3.10: Which of the following must have at least one accumulation point? (...
- 7.3.11: Let A be a set of real numbers. Prove that (A)c (Ac).
- 7.3.12: Let A and F be sets of real numbers and let F be finite. Prove that...
- 7.3.13: Assign a grade of A (correct), C (partially correct), or F (failure...
Solutions for Chapter 7.3: The Bolzano-Weierstrass Theorem
Full solutions for A Transition to Advanced Mathematics | 7th Edition
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
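The solvability test above can be sketched numerically. A minimal example (the matrices here are assumed illustrations, not from the text): a right side in the column space keeps rank([A b]) equal to rank(A), while one outside it raises the rank.

```python
import numpy as np

# Assumed example: A has rank 1, so its column space is the line through (1, 2).
A = np.array([[1.0, 2.0], [2.0, 4.0]])
b_in = np.array([[3.0], [6.0]])    # 3 * (first column): in the column space
b_out = np.array([[3.0], [7.0]])   # not a multiple of (1, 2)

rank = np.linalg.matrix_rank
solvable_in = rank(np.hstack([A, b_in])) == rank(A)     # True: same rank
solvable_out = rank(np.hstack([A, b_out])) == rank(A)   # False: rank went up
```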
Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
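A quick sketch of block multiplication, with assumed random 4x4 matrices cut into 2x2 blocks: the blockwise product agrees with the ordinary product.

```python
import numpy as np

# Assumed example data: two random 4x4 matrices, partitioned into 2x2 blocks.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

# Block formula: (AB)_11 = A11 B11 + A12 B21, and similarly for the other blocks.
top = np.hstack([A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22])
bot = np.hstack([A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22])
blockwise = np.vstack([top, bot])
```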
Cholesky factorization.
A = C^T C = (L sqrt(D))(L sqrt(D))^T for positive definite A.
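A sketch with an assumed 2x2 positive definite matrix. Note that NumPy's `np.linalg.cholesky` returns the lower-triangular factor L with A = L L^T (the transpose convention of the C above), with positive diagonal.

```python
import numpy as np

# Assumed example: a 2x2 positive definite matrix.
A = np.array([[4.0, 2.0], [2.0, 3.0]])
C = np.linalg.cholesky(A)   # lower triangular, A = C C^T
```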
Diagonalization.
Lambda = S^{-1} A S. Lambda = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Lambda^k S^{-1}.
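Both identities can be checked on an assumed symmetric 2x2 example, where `np.linalg.eig` returns the eigenvalues and the eigenvector matrix S.

```python
import numpy as np

# Assumed example matrix with two independent eigenvectors.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Lambda = S^{-1} A S, and A^k = S Lambda^k S^{-1}.
diagonalized = np.linalg.inv(S) @ A @ S
k = 5
Ak = S @ np.diag(eigvals**k) @ np.linalg.inv(S)
```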
Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is conj(x)^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) . (column j of B).
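A short sketch with assumed example vectors: the real dot product is the sum of products, and `np.vdot` conjugates its first argument, matching the complex dot product.

```python
import numpy as np

# Real dot product (assumed example vectors).
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
real_dot = x @ y   # 1*4 + 2*5 + 3*6 = 32

# Complex dot product: conjugate the first vector.
xc = np.array([1 + 1j, 2 - 1j])
yc = np.array([3 - 2j, 1 + 4j])
complex_dot = np.vdot(xc, yc)   # equals conj(xc) @ yc
```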
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
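The pivot pattern can be verified programmatically; here is a minimal sketch with an assumed echelon matrix U, checking that pivot columns move strictly to the right.

```python
import numpy as np

# Assumed example echelon matrix: pivots in columns 0 and 2, zero row last.
U = np.array([[2.0, 1.0, 3.0, 4.0],
              [0.0, 0.0, 5.0, 6.0],
              [0.0, 0.0, 0.0, 0.0]])

# Column index of the first nonzero entry in each nonzero row.
pivot_cols = [int(np.flatnonzero(row)[0]) for row in U if np.any(row)]
```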
Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
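A sketch using `np.linalg.qr` on an assumed 3x2 matrix with independent columns. (NumPy does not enforce the diag(R) > 0 convention, so signs may differ from a hand Gram-Schmidt.)

```python
import numpy as np

# Assumed example matrix with independent columns.
A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = np.linalg.qr(A)   # reduced factorization: Q is 3x2, R is 2x2
```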
Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
Independent vectors VI, .. " vk.
No combination cl VI + ... + qVk = zero vector unless all ci = O. If the v's are the columns of A, the only solution to Ax = 0 is x = o.
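Independence can be tested by comparing the rank with the number of columns; a sketch with assumed example vectors:

```python
import numpy as np

# Independent columns: rank equals the number of columns, so Ax = 0 forces x = 0.
indep = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
# Dependent columns: the second column is 2 times the first.
dep = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])

rank_indep = np.linalg.matrix_rank(indep)
rank_dep = np.linalg.matrix_rank(dep)
```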
Lucas numbers.
L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = lambda_1^n + lambda_2^n, with lambda_1, lambda_2 = (1 +- sqrt(5))/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
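The recurrence and the eigenvalue formula can be checked against each other; a minimal sketch:

```python
import numpy as np

# Lucas numbers from the recurrence L_n = L_{n-1} + L_{n-2}, starting 2, 1.
L = [2, 1]
for n in range(2, 10):
    L.append(L[-1] + L[-2])

# Same numbers from lambda_1^n + lambda_2^n, lambda = (1 +- sqrt(5))/2.
lam1 = (1 + 5**0.5) / 2
lam2 = (1 - 5**0.5) / 2
from_eigs = [round(lam1**n + lam2**n) for n in range(10)]
```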
Nullspace matrix N.
The columns of N are the n - r special solutions to As = 0.
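A sketch of N for an assumed matrix already in reduced form, with pivots in columns 1 and 3 and free variables x_2 and x_4; each special solution sets one free variable to 1 and the other to 0.

```python
import numpy as np

# Assumed example: rank r = 2, n = 4, so n - r = 2 special solutions.
A = np.array([[1.0, 2.0, 0.0, 3.0],
              [0.0, 0.0, 1.0, 4.0]])

s1 = np.array([-2.0, 1.0, 0.0, 0.0])    # free variable x_2 = 1
s2 = np.array([-3.0, 0.0, -4.0, 1.0])   # free variable x_4 = 1
N = np.column_stack([s1, s2])           # nullspace matrix
```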
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i != j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1} and q_1, ..., q_n is an orthonormal basis for R^n: every v = sum_j (v^T q_j) q_j.
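The expansion v = sum_j (v^T q_j) q_j can be sketched with an assumed orthonormal basis (the columns of a rotation matrix):

```python
import numpy as np

# Assumed orthonormal basis: columns of a 2x2 rotation matrix.
t = 0.7
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
v = np.array([2.0, -1.0])

# Reconstruct v from its coefficients against each basis vector.
recon = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
```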
Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Rank one matrix A = u v^T != 0.
Column and row spaces = lines cu and cv.
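A sketch with assumed vectors u and v: the outer product has rank one, and every column is a multiple of u.

```python
import numpy as np

# Assumed example vectors.
u = np.array([[1.0], [2.0], [3.0]])
v = np.array([[4.0], [5.0]])
A = u @ v.T   # columns are 4u and 5u: the line through u

rank_A = np.linalg.matrix_rank(A)
```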
Rotation.
R = [[c, -s], [s, c]] rotates the plane by theta and R^{-1} = R^T rotates back by -theta. Eigenvalues are e^{i theta} and e^{-i theta}, eigenvectors are (1, +-i). c, s = cos theta, sin theta.
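These properties can be verified for an assumed angle:

```python
import numpy as np

# Assumed example angle.
theta = 0.5
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])

eigvals = sorted(np.linalg.eigvals(R), key=lambda z: z.imag)
# Sorted by imaginary part: e^{-i theta} first, then e^{i theta}.
```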
Schur complement S = D - C A^{-1} B.
Appears in block elimination on [[A, B], [C, D]].
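A sketch of the block elimination, with assumed example blocks: multiplying on the left by the block elimination matrix clears the C block and leaves S in the (2, 2) position.

```python
import numpy as np

# Assumed example blocks (A must be invertible).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = np.array([[1.0], [2.0]])
C = np.array([[3.0, 4.0]])
D = np.array([[10.0]])

M = np.block([[A, B], [C, D]])
E = np.block([[np.eye(2), np.zeros((2, 1))],
              [-C @ np.linalg.inv(A), np.eye(1)]])   # block elimination step

S = D - C @ np.linalg.inv(A) @ B   # Schur complement
eliminated = E @ M
```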
Semidefinite matrix A.
(Positive) semidefinite: all x^T A x >= 0, all eigenvalues lambda >= 0; A = any R^T R.
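A sketch: any A = R^T R is semidefinite because x^T A x = |Rx|^2 >= 0. The assumed R here has dependent columns, so A is singular (semidefinite but not definite).

```python
import numpy as np

# Assumed example: R has rank 1, so A = R^T R is singular.
R = np.array([[1.0, 2.0], [2.0, 4.0]])
A = R.T @ R

eigvals = np.linalg.eigvalsh(A)   # eigenvalues of the symmetric A, ascending
```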
Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^{Kt} is an orthogonal matrix.
Special solutions to As = O.
One free variable is s_i = 1, other free variables = 0.
Symmetric matrix A.
The transpose is A^T = A, and a_ij = a_ji. A^{-1} is also symmetric.
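Both facts can be checked on an assumed invertible symmetric matrix:

```python
import numpy as np

# Assumed example symmetric matrix.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
Ainv = np.linalg.inv(A)
```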