 6.1: Write down the first four terms of the sequence if you start with: ...
 6.2: For each of the following write a description of the sequence and f...
 6.3: Describe the following number patterns and write down the next 3 te...
 6.4: Find the next two terms of: a 95, 91, 87, 83, .... b 5, 20, 80, 320...
6.5: Evaluate the first five terms of the sequence {15 − (−2)^n}.
6.6: A sequence is defined by u_n = (71 − 7n)/2. a Prove that the sequence i...
 6.7: Find k given the consecutive arithmetic terms: a 32, k, 3 b k, 7, 1...
 6.8: Find the general term un for an arithmetic sequence with: a u7 = 41...
 6.9: a Insert three numbers between 5 and 10 so that all five numbers ar...
6.10: Consider the arithmetic sequence 36, 35 1/3, 34 2/3, .... a Find ...
 6.11: An arithmetic sequence starts 23, 36, 49, 62, .... What is the firs...
 6.12: Five consecutive terms of an arithmetic sequence have a sum of 40. ...
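The arithmetic-sequence exercises above all rest on the general term u_n = u_1 + (n − 1)d. A minimal sketch in Python (the function name is my own, not from the textbook):

```python
def arithmetic_term(u1, d, n):
    """n-th term u_n = u1 + (n - 1) * d of an arithmetic sequence."""
    return u1 + (n - 1) * d

# Exercise 6.11's sequence 23, 36, 49, 62, ... has u1 = 23 and d = 13.
first_four = [arithmetic_term(23, 13, n) for n in range(1, 5)]
# first_four == [23, 36, 49, 62]
```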
Solutions for Chapter 6: NUMBER SEQUENCES
Full solutions for Mathematics for the International Student: Mathematics SL  3rd Edition
ISBN: 9781921972089

Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
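A quick numerical illustration, using NumPy (my choice of tool, not from the source):

```python
import numpy as np

# Undirected graph on 3 nodes with edges 0-1 and 1-2.
edges = [(0, 1), (1, 2)]
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # edges go both ways, so A equals its transpose

assert (A == A.T).all()
```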

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
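The rank test for solvability can be checked directly; a sketch with NumPy (the example matrices are my own):

```python
import numpy as np

A = np.array([[1., 2.], [2., 4.]])   # rank 1
b_good = np.array([3., 6.])          # in the column space of A
b_bad = np.array([3., 7.])           # not in the column space

rank_A = np.linalg.matrix_rank(A)
# Ax = b is solvable exactly when [A b] has the same rank as A.
solvable = np.linalg.matrix_rank(np.column_stack([A, b_good])) == rank_A
unsolvable = np.linalg.matrix_rank(np.column_stack([A, b_bad])) > rank_A
```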

Cholesky factorization
A = C^T C = (L√D)(L√D)^T for positive definite A.
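A numerical check with NumPy (an assumption of tooling; note that np.linalg.cholesky returns a lower-triangular L with A = L L^T, so the glossary's upper-triangular C is L^T):

```python
import numpy as np

A = np.array([[4., 2.], [2., 3.]])   # positive definite
L = np.linalg.cholesky(A)            # lower triangular, A = L @ L.T
C = L.T                              # upper triangular factor: A = C.T @ C
assert np.allclose(C.T @ C, A)
```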

Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in nullspace).

Diagonalization
Λ = S^-1 AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
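Diagonalization and the power formula can be verified numerically; a sketch with NumPy (the matrix is my own example):

```python
import numpy as np

A = np.array([[4., 1.], [2., 3.]])     # eigenvalues 5 and 2
eigvals, S = np.linalg.eig(A)          # S holds the eigenvectors as columns
Lam = np.diag(eigvals)                 # Lambda = eigenvalue matrix
assert np.allclose(S @ Lam @ np.linalg.inv(S), A)          # A = S Lam S^-1
# Powers diagonalize the same way: A^3 = S Lam^3 S^-1
assert np.allclose(np.linalg.matrix_power(A, 3),
                   S @ np.linalg.matrix_power(Lam, 3) @ np.linalg.inv(S))
```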

Fourier matrix F.
Entries F_jk = e^(2πijk/n) give orthogonal columns, F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform: y_j = Σ c_k e^(2πijk/n).
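Both properties are easy to confirm numerically; a sketch with NumPy (np.fft.ifft includes a 1/n factor, so F @ c equals n times it):

```python
import numpy as np

n = 4
J, K = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(2j * np.pi * J * K / n)           # F_jk = e^(2*pi*i*j*k/n)
# Orthogonal columns: conj(F).T @ F = n I
assert np.allclose(F.conj().T @ F, n * np.eye(n))

c = np.array([1., 2., 3., 4.])
y = F @ c                                    # y_j = sum_k c_k e^(2*pi*i*j*k/n)
assert np.allclose(y, n * np.fft.ifft(c))    # inverse DFT up to the 1/n factor
```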

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.

Hypercube matrix pl.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.
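Dependence shows up as a rank deficiency; a sketch with NumPy (the vectors are my own example):

```python
import numpy as np

v1 = np.array([1., 2., 3.])
v2 = np.array([2., 4., 6.])              # v2 = 2 * v1, so the set is dependent
V = np.column_stack([v1, v2])
dependent = np.linalg.matrix_rank(V) < V.shape[1]
# The nonzero combination with not all c_i = 0: 2*v1 - v2 = 0.
assert np.allclose(2 * v1 - v2, 0)
```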

Outer product uv T
= column times row = rank one matrix.
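A one-line check with NumPy (my choice of tool):

```python
import numpy as np

u = np.array([1., 2.])
v = np.array([3., 4., 5.])
M = np.outer(u, v)                       # column times row
assert np.linalg.matrix_rank(M) == 1     # rank one matrix
```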

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If columns of A = basis for S then P = A (A^T A)^-1 A^T.
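Every listed property can be verified on a small example; a sketch with NumPy (the basis matrix is my own):

```python
import numpy as np

A = np.array([[1., 0.], [1., 1.], [1., 2.]])   # columns form a basis for S
P = A @ np.linalg.inv(A.T @ A) @ A.T           # P = A (A^T A)^-1 A^T
b = np.array([6., 0., 0.])
p = P @ b                                      # closest point to b in S
e = b - p                                      # error, perpendicular to S
assert np.allclose(P @ P, P) and np.allclose(P, P.T)
assert np.allclose(A.T @ e, 0)                 # e is orthogonal to the columns of A
```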

Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
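SymPy computes rref symbolically (an assumption of tooling; NumPy has no rref routine):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3], [2, 4, 7]])
R, pivot_cols = A.rref()
# R = [[1, 2, 0], [0, 0, 1]]: pivots are 1, with zeros above and below;
# pivot_cols = (0, 2), and the 2 nonzero rows span the row space of A.
```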

Reflection matrix (Householder) Q = I − 2uu^T.
Unit vector u is reflected to Qu = −u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
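A sketch with NumPy (the unit vector is my own example):

```python
import numpy as np

u = np.array([1., 0., 0.])               # unit vector
Q = np.eye(3) - 2 * np.outer(u, u)       # Householder reflection
assert np.allclose(Q @ u, -u)            # u is reflected to -u
x = np.array([0., 2., 3.])               # in the mirror plane u.T x = 0
assert np.allclose(Q @ x, x)             # mirror-plane vectors are fixed
assert np.allclose(Q.T, Q) and np.allclose(Q @ Q, np.eye(3))
```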

Similar matrices A and B.
Every B = M^-1 AM has the same eigenvalues as A.

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
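A small linear program solved with SciPy (an assumption of tooling; scipy.optimize.linprog uses the HiGHS solvers rather than textbook simplex, but the corner-optimum structure is the same):

```python
from scipy.optimize import linprog

# Minimize c . x subject to A_eq x = b_eq and x >= 0.
c = [1., 2., 0.]
A_eq = [[1., 1., 1.]]
b_eq = [4.]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3)
# The optimum sits at the corner x = (0, 0, 4) with cost 0.
```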

Singular Value Decomposition (SVD).
A = UΣV^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular value σ_i > 0. Last columns are orthonormal bases of the nullspaces.
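A numerical check with NumPy (np.linalg.svd returns V^T, so its rows are the v_i):

```python
import numpy as np

A = np.array([[3., 0.], [4., 5.]])
U, s, Vt = np.linalg.svd(A)          # A = U Sigma V^T
Sigma = np.diag(s)
assert np.allclose(U @ Sigma @ Vt, A)
# A v_i = sigma_i u_i with sigma_i > 0 (A is invertible here, so r = 2).
for i in range(len(s)):
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])
```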

Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D.
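This sign agreement (the law of inertia) can be checked with SciPy's LDL^T routine (an assumption of tooling; scipy.linalg.ldl returns the factor, the block-diagonal D, and a permutation):

```python
import numpy as np
from scipy.linalg import ldl

A = np.array([[2., 1.], [1., -3.]])      # symmetric, indefinite
L, D, perm = ldl(A)
assert np.allclose(L @ D @ L.T, A)
# Signs of the eigenvalues of A match the signs on the diagonal of D.
assert np.allclose(np.sort(np.sign(np.diag(D))),
                   np.sort(np.sign(np.linalg.eigvalsh(A))))
```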