 9.1.1: Find the eigenvalues and associated eigenvectors of the following 3...
 9.1.2: Find the eigenvalues and associated eigenvectors of the following 3...
9.1.3: Use the Gershgorin Circle Theorem to determine bounds for the eigenva...
9.1.4: Use the Gershgorin Circle Theorem to determine bounds for the eigenva...
 9.1.5: For the matrices in Exercise 1 that have 3 linearly independent eig...
 9.1.6: For the matrices in Exercise 2 that have 3 linearly independent eig...
9.1.7: Show that v1 = (2, 1)^t, v2 = (1, 1)^t, and v3 = (1, 3)^t are linear...
 9.1.8: Show that the three eigenvectors in Example 3 are linearly independ...
 9.1.9: Show that a set {v1, ... , vk } of k nonzero orthogonal vectors is ...
9.1.10: Show that if A is a matrix and λ1, λ2, ..., λk are distinct eigenvalues w...
 9.1.11: Let {v1, ... , vn} be a set of orthonormal nonzero vectors in Rn an...
 9.1.12: Assume that {x1, x2}, {x1, x3}, and {x2, x3}, are all linearly inde...
9.1.13: Consider the following sets of vectors. (i) Show that the set is linea...
9.1.14: Consider the following sets of vectors. (i) Show that the set is linea...
9.1.15: Use the Gershgorin Circle Theorem to show that a strictly diagonally ...
 9.1.16: Prove that the set of vectors {v1, v2, ... , vk } described in the ...
 9.1.17: A persymmetric matrix is a matrix that is symmetric about both diag...
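Several of the exercises above (9.1.3, 9.1.4, 9.1.15) rely on the Gershgorin Circle Theorem: every eigenvalue of A lies in at least one disc centered at a diagonal entry a_ii with radius equal to the off-diagonal row sum. A minimal numerical sketch in NumPy (an assumption; the matrix below is illustrative, not one of the textbook's exercises):

```python
import numpy as np

# Gershgorin Circle Theorem: every eigenvalue of A lies in at least one
# disc centered at a_ii with radius r_i = sum over j != i of |a_ij|.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

centers = np.diag(A)
radii = np.sum(np.abs(A), axis=1) - np.abs(centers)

eigenvalues = np.linalg.eigvals(A)
for lam in eigenvalues:
    # Each eigenvalue falls inside at least one Gershgorin disc.
    assert any(abs(lam - c) <= r + 1e-12 for c, r in zip(centers, radii))
```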
Solutions for Chapter 9.1: Linear Algebra and Eigenvalues
Full solutions for Numerical Analysis, 9th Edition
ISBN: 9780538733519
This textbook survival guide was created for the textbook Numerical Analysis, edition 9 (ISBN 9780538733519). Chapter 9.1: Linear Algebra and Eigenvalues includes 17 full step-by-step solutions.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Block matrix.
A matrix can be partitioned into matrix blocks by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
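Both defining relations can be checked numerically. A minimal sketch using NumPy (an assumption; the 2×2 matrix is illustrative):

```python
import numpy as np

# Ax = lambda*x with x != 0, so det(A - lambda*I) = 0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

lam = eigenvalues[0]
x = eigenvectors[:, 0]
assert np.allclose(A @ x, lam * x)                      # Ax = lambda*x
assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-10  # det vanishes
```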

Elimination matrix = Elementary matrix E_ij.
The identity matrix with an extra −ℓ_ij in the (i, j) entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.
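As a sketch of this definition in NumPy (the matrix is an assumption chosen for illustration):

```python
import numpy as np

# E_21 = identity with an extra -l in the (2, 1) entry;
# then E_21 @ A subtracts l times row 1 of A from row 2.
A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
l = A[1, 0] / A[0, 0]   # multiplier l_21 = 2
E = np.eye(2)
E[1, 0] = -l            # the elimination matrix carries -l_21

B = E @ A               # row 2 of B = row 2 of A - 2 * row 1
assert B[1, 0] == 0.0   # the (2, 1) entry is eliminated
```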

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
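A minimal sketch of the method in NumPy (assumed for illustration; no pivoting, so it presumes nonzero pivots appear in order):

```python
import numpy as np

# Gauss-Jordan: row-reduce [A | I] until the left block is I;
# the right block is then A^-1.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
M = np.hstack([A, np.eye(2)])   # the augmented matrix [A I]

n = len(A)
for i in range(n):
    M[i] = M[i] / M[i, i]                  # scale pivot row: pivot = 1
    for k in range(n):
        if k != i:
            M[k] = M[k] - M[k, i] * M[i]   # clear column i elsewhere

A_inv = M[:, n:]
assert np.allclose(A @ A_inv, np.eye(2))
```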

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Iterative method.
A sequence of steps intended to approach the desired solution.

Determinant |A| = det(A).
|A^-1| = 1/|A| and |A^T| = |A|. The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, volume of box = |det(A)|.

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
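The two multiplicities can differ. A minimal NumPy sketch (the defective matrix below is an illustrative assumption):

```python
import numpy as np

# For A = [[2, 1], [0, 2]], lambda = 2 is a double root of
# det(A - lambda*I) = (2 - lambda)^2, so AM = 2, but the eigenspace
# (nullspace of A - 2I) is only one-dimensional, so GM = 1.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)
AM = int(np.sum(np.isclose(eigenvalues, 2.0)))        # algebraic

# GM = dim nullspace of (A - 2I) = n - rank(A - 2I)
GM = 2 - np.linalg.matrix_rank(A - 2.0 * np.eye(2))   # geometric

assert AM == 2 and GM == 1
```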

Norm ||A||.
The ℓ² norm of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. The Frobenius norm has ||A||²_F = Σ Σ a_ij²; the ℓ¹ and ℓ^∞ norms are the largest column and row sums of |a_ij|.
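These norms are all available in NumPy; a minimal sketch with an illustrative matrix (an assumption, not from the text):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

l2_norm   = np.linalg.norm(A, 2)        # sigma_max, largest singular value
fro_norm  = np.linalg.norm(A, 'fro')    # sqrt(sum of a_ij^2)
l1_norm   = np.linalg.norm(A, 1)        # largest column sum of |a_ij| = 6
linf_norm = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij| = 7

assert np.isclose(l1_norm, 6.0) and np.isclose(linf_norm, 7.0)
assert np.isclose(fro_norm, np.sqrt(30.0))

# The l2 norm bounds the ratio ||Ax|| / ||x|| for every x:
x = np.array([1.0, 1.0])
assert np.linalg.norm(A @ x) <= l2_norm * np.linalg.norm(x) + 1e-12
```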

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
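A minimal NumPy sketch: build orthonormal columns by QR factorization (the input matrix is an illustrative assumption) and verify Q^T Q = I and the expansion of v:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)          # columns of Q are orthonormal

assert np.allclose(Q.T @ Q, np.eye(2))

# Every v expands in the orthonormal basis: v = sum (v^T q_j) q_j
v = np.array([3.0, 4.0])
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
assert np.allclose(expansion, v)
```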

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓ_ij| ≤ 1. See condition number.

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal for randn.

Rank r (A)
= number of pivots = dimension of column space = dimension of row space.
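A minimal NumPy sketch of these equalities (the matrix, with one dependent row, is an illustrative assumption):

```python
import numpy as np

# rank = number of pivots = dim(column space) = dim(row space)
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # twice row 1: contributes no new pivot
              [1.0, 0.0, 1.0]])

r = np.linalg.matrix_rank(A)
assert r == 2
# Row rank equals column rank:
assert np.linalg.matrix_rank(A.T) == r
```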

Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂x_i∂x_j = Hessian matrix) is indefinite.
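A minimal NumPy sketch with the standard example f(x, y) = x² − y² (an illustrative assumption):

```python
import numpy as np

# f(x, y) = x^2 - y^2 has zero first derivatives at the origin;
# its Hessian diag(2, -2) is indefinite, so (0, 0) is a saddle point.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])     # Hessian of f at the origin

eigenvalues = np.linalg.eigvalsh(H)
# Eigenvalues of both signs => indefinite => saddle point.
assert eigenvalues[0] < 0 < eigenvalues[1]
```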

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.

Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax = stretching.
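A minimal NumPy sketch of K = A^T C A for two springs in series (the spring constants and geometry are illustrative assumptions):

```python
import numpy as np

# Two springs (constants c1, c2): spring 1 ties node 1 to a wall,
# spring 2 ties node 2 to node 1. A maps node movements x to stretches;
# C holds the Hooke's-law constants.
c1, c2 = 3.0, 2.0
A = np.array([[ 1.0, 0.0],      # stretch of spring 1 = x1 - 0 (wall)
              [-1.0, 1.0]])     # stretch of spring 2 = x2 - x1
C = np.diag([c1, c2])

K = A.T @ C @ A                             # stiffness matrix
assert np.allclose(K, K.T)                  # K is symmetric
assert np.all(np.linalg.eigvalsh(K) > 0)    # and positive definite here
```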