 3.1: Let u = (2, 0, 4), v = (3, 1, 6), and w = (2, 5, 5). Compute (a) 3v...
 3.2: Repeat Exercise 1 for the vectors u = 3i − 5j + k, v = 2i + 2k, and w...
 3.3: Repeat parts (a)(d) of Exercise 1 for the vectors u = (2, 6, 2, 1),...
 3.4: (a) The set of all vectors in R2 that are orthogonal to a nonzero v...
 3.5: Let A, B, and C be three distinct noncollinear points in 3-space. ...
 3.6: Let A, B, and C be three distinct noncollinear points in 3-space. ...
 3.7: Consider the points P (3, 1, 4), Q(6, 0, 2), and R(5, 1, 1). Find t...
 3.8: Consider the points P (3, 1, 0, 6), Q(0, 5, 1, 2), and R(4, 1, 4, 0...
 3.9: Using the points in Exercise 7, find the cosine of the angle betwee...
 3.10: Using the points in Exercise 8, find the cosine of the angle betwee...
 3.11: Find the distance between the point P (3, 1, 3) and the plane 5x + ...
 3.12: Show that the planes 3x − y + 6z = 7 and −6x + 2y − 12z = 1 are parallel...
 3.13: In Exercises 13–18, find vector and parametric equations for the lin...
 3.14: In Exercises 13–18, find vector and parametric equations for the lin...
 3.15: In Exercises 13–18, find vector and parametric equations for the lin...
 3.16: In Exercises 13–18, find vector and parametric equations for the lin...
 3.17: In Exercises 13–18, find vector and parametric equations for the lin...
 3.18: In Exercises 13–18, find vector and parametric equations for the lin...
 3.19: In Exercises 19–21, find a point-normal equation for the given plane...
 3.20: In Exercises 19–21, find a point-normal equation for the given plane...
 3.21: In Exercises 19–21, find a point-normal equation for the given plane...
 3.22: Suppose that V = {v1, v2, v3} and W = {w1, w2} are two sets of vect...
 3.23: Show that in 3-space the distance d from a point P to the line L th...
 3.24: Prove that ‖u + v‖ = ‖u‖ + ‖v‖ if and only if one of the vectors is a sca...
 3.25: The equation Ax + By = 0 represents a line through the origin in R2...
Solutions for Chapter 3: Euclidean Vector Spaces
Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version, 11th Edition
ISBN: 9781118474228
Since 25 problems in Chapter 3: Euclidean Vector Spaces have been answered, more than 14,944 students have viewed full step-by-step solutions from this chapter. This survival guide was created for the textbook Elementary Linear Algebra, Binder Ready Version: Applications Version, 11th edition (ISBN 9781118474228), and covers each of its chapters; Chapter 3 includes 25 full step-by-step solutions.

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
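A small numpy illustration of this definition, on a made-up 3-node graph (the edge list is hypothetical):

```python
import numpy as np

# Hypothetical directed graph on 3 nodes with edges 0->1, 1->2, 2->0.
edges = [(0, 1), (1, 2), (2, 0)]
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1                 # aij = 1 exactly when there is an edge i -> j

# Making every edge go both ways gives an undirected graph, so A = A^T.
A_undirected = A + A.T
```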

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
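A quick numerical check of the diagonalization, on a hypothetical 2×2 matrix with two different eigenvalues:

```python
import numpy as np

# Two different eigenvalues (4 and 2), so 2 independent eigenvectors are automatic.
A = np.array([[4.0, 1.0],
              [0.0, 2.0]])
lam, S = np.linalg.eig(A)        # eigenvectors sit in the columns of S
Lam = np.linalg.inv(S) @ A @ S   # S^-1 A S = diagonal eigenvalue matrix
```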

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓij (and ℓii = 1) brings U back to A.
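One elimination step, written out in numpy on a made-up 2×2 matrix that needs no row exchange:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])
l21 = A[1, 0] / A[0, 0]          # multiplier l21 = 3
U = A.copy()
U[1] -= l21 * U[0]               # elimination takes A to upper triangular U
L = np.array([[1.0, 0.0],        # lower triangular, ones on the diagonal,
              [l21, 1.0]])       # multiplier l21 below
```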

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fn x and Fn^-1 c can be computed with nℓ/2 multiplications. Revolutionary.
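The FFT produces the same product Fn x as multiplying by the full Fourier matrix; a sketch of that check with numpy, on a small hypothetical vector:

```python
import numpy as np

n = 8
x = np.arange(n, dtype=float)
k = np.arange(n)
F = np.exp(-2j * np.pi * np.outer(k, k) / n)   # n x n Fourier matrix Fn
direct = F @ x                                  # n^2 multiplications
fast = np.fft.fft(x)                            # n*log2(n)/2 via the factorization
```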

Iterative method.
A sequence of steps intended to approach the desired solution.

Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
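A numpy sketch of this on made-up data: three equations, two unknowns, solved through the normal equations:

```python
import numpy as np

# Hypothetical overdetermined system Ax = b.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # solve A^T A x = A^T b
e = b - A @ x_hat                           # the error vector
```

The last assertion below is the geometric fact in the definition: the error is orthogonal to every column of A.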

Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
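Checking the formula on a hypothetical 3×2 matrix with full column rank:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])                # full column rank n = 2
A_plus = np.linalg.inv(A.T @ A) @ A.T     # A^+ = (A^T A)^-1 A^T
```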

Linear combination cv + dw or Σ cj vj.
Vector addition and scalar multiplication.

Norm ‖A‖.
The "ℓ^2 norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σ_max. Then ‖Ax‖ ≤ ‖A‖ ‖x‖, ‖AB‖ ≤ ‖A‖ ‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. Frobenius norm: ‖A‖_F^2 = ΣΣ aij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |aij|.
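All four norms are available through `numpy.linalg.norm`; a sketch on a hypothetical 2×2 matrix:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [0.0,  3.0]])

two_norm = np.linalg.norm(A, 2)        # sigma_max, the largest singular value
fro_norm = np.linalg.norm(A, 'fro')    # sqrt(sum of aij^2) = sqrt(14)
one_norm = np.linalg.norm(A, 1)        # largest column sum of |aij|: |-2|+|3| = 5
inf_norm = np.linalg.norm(A, np.inf)   # largest row sum of |aij|: |1|+|-2| = 3
```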

Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − A x̂) = 0.

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓij| ≤ 1. See condition number.
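A minimal sketch of the pivot choice, on a hypothetical 2×2 system where the larger entry 4 is below the natural pivot 1:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [4.0, 2.0]])
p = int(np.argmax(np.abs(A[:, 0])))   # row index of the largest available pivot
A = A[[p, 1 - p]]                     # row exchange brings 4 to the pivot position
l21 = A[1, 0] / A[0, 0]               # multiplier 1/4: pivoting keeps |l21| <= 1
```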

Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.

Plane (or hyperplane) in R^n.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Rotation matrix R = [c −s; s c].
R rotates the plane by θ and R^-1 = R^T rotates back by −θ. Eigenvalues are e^{iθ} and e^{−iθ}, eigenvectors are (1, ±i). c, s = cos θ, sin θ.
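The orthogonality and the complex eigenvalues can be checked numerically (θ = π/3 is an arbitrary choice):

```python
import numpy as np

theta = np.pi / 3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])            # rotates the plane by theta

back = R.T @ R                     # R^T = R^-1, so this is the identity
eigs = np.linalg.eigvals(R)        # e^{i theta} and e^{-i theta}
```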

Spanning set.
Combinations of v1, ..., vm fill the space. The columns of A span C(A)!

Spectral Theorem A = QΛQ^T.
Real symmetric A has real λ's and orthonormal q's.
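A numpy check on a hypothetical real symmetric matrix, using `eigh` (which is specialized to the symmetric case):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])         # real symmetric
lam, Q = np.linalg.eigh(A)         # real eigenvalues, orthonormal columns q
```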

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has spring constants from Hooke's Law and Ax = stretching.

Tridiagonal matrix T: tij = 0 if |i − j| > 1.
T^-1 has rank 1 above and below the diagonal.
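A sketch of that rank-1 structure, using the classic −1, 2, −1 tridiagonal matrix as a made-up example:

```python
import numpy as np

# The -1, 2, -1 tridiagonal matrix: tij = 0 whenever |i - j| > 1.
n = 5
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

Tinv = np.linalg.inv(T)            # the inverse is a full matrix ...
block = Tinv[:2, 2:]               # ... but each block strictly above
                                   # (or below) the diagonal has rank 1
```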