 7.1.1: In each part of Exercises 1–4, determine whether the matrix is ortho...
 7.1.2: In each part of Exercises 1–4, determine whether the matrix is ortho...
 7.1.3: In each part of Exercises 1–4, determine whether the matrix is ortho...
 7.1.4: In each part of Exercises 1–4, determine whether the matrix is ortho...
 7.1.5: In Exercises 5–6, show that the matrix is orthogonal three ways: fir...
 7.1.6: In Exercises 5–6, show that the matrix is orthogonal three ways: fir...
 7.1.7: Let T_A: R^3 → R^3 be multiplication by the orthogonal matrix in Exercis...
 7.1.8: Let T_A: R^3 → R^3 be multiplication by the orthogonal matrix in Exercis...
 7.1.9: Are the standard matrices for the reflections in Tables 1 and 2 of ...
 7.1.10: Are the standard matrices for the orthogonal projections in Tables ...
 7.1.11: What conditions must a and b satisfy for the matrix [a + b  b − a; a − b  b...
 7.1.12: Under what conditions will a diagonal matrix be orthogonal?
 7.1.13: Let a rectangular x′y′-coordinate system be obtained by rotating a ...
 7.1.14: Repeat Exercise 13 with θ = 3π/4.
 7.1.15: Let a rectangular x′y′z′-coordinate system be obtained by rotating ...
 7.1.16: Repeat Exercise 15 for a rotation of θ = 3π/4 counterclockwise about t...
 7.1.17: Repeat Exercise 15 for a rotation of θ = π/3 counterclockwise about th...
 7.1.18: A rectangular x′y′z′-coordinate system is obtained by rotating an x...
 7.1.19: Repeat Exercise 18 for a rotation about the x-axis.
 7.1.20: A rectangular x′y′z′-coordinate system is obtained by first rotatin...
 7.1.21: A linear operator on R^2 is called rigid if it does not change the l...
 7.1.22: Can an orthogonal operator T_A: R^n → R^n map nonzero vectors that are n...
 7.1.23: The set S = {1/√3, (1/√2)x, √(3/2)x² − √(2/3)} is an orthonormal basis f...
 7.1.24: The sets S = {1, x} and S′ = {(1/√2)(1 + x), (1/√2)(1 − x)} are orthonorm...
 7.1.25: Prove that if x is an n × 1 matrix, then the matrix A = I_n − (2/(x^T x)) xx^T...
 7.1.26: Prove that a 2 × 2 orthogonal matrix A has only one of two possible f...
 7.1.27: (a) Use the result in Exercise 26 to prove that multiplication by a...
 7.1.28: In each part, use the result in Exercise 27(a) to determine whether...
 7.1.29: In each part, use the result in Exercise 27(a) to determine whether...
 7.1.30: Euler's Axis of Rotation Theorem states that: If A is an orthogonal ...
 7.1.31: Prove the equivalence of statements (a) and (c) that are given in...
 7.1.T1: TF. In parts (a)–(h) determine whether the statement is true or fals...
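Exercises 1–6 ask whether a given matrix is orthogonal. A minimal numpy sketch of the standard checks (the 2 × 2 rotation matrix here is an illustrative choice, not one of the exercise matrices): Q^T Q = I, Q Q^T = I (orthonormal rows), and preservation of length.

```python
import numpy as np

# A rotation matrix is a standard example of an orthogonal matrix.
theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Criterion 1: Q^T Q = I (columns are orthonormal).
is_orth_1 = np.allclose(Q.T @ Q, np.eye(2))
# Criterion 2: Q Q^T = I (rows are orthonormal).
is_orth_2 = np.allclose(Q @ Q.T, np.eye(2))
# Criterion 3: Q preserves lengths (an orthogonal operator is rigid).
x = np.array([3.0, 4.0])
preserves_length = np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```

The three criteria are equivalent for square matrices, so any one of them suffices in practice.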
Solutions for Chapter 7.1: Orthogonal Matrices
Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version  11th Edition
ISBN: 9781118474228
Chapter 7.1: Orthogonal Matrices includes 32 full step-by-step solutions.

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
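A quick numerical illustration of the shared-eigenvector property (the matrices here are arbitrary examples; B is a polynomial in A, which guarantees AB = BA):

```python
import numpy as np

# A and B = A^2 + A commute, since both are polynomials in A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = A @ A + A
commute = np.allclose(A @ B, B @ A)

# Both are symmetric (hence diagonalizable); the eigenvector matrix V of A
# also diagonalizes B, illustrating the shared-eigenvector property.
eigvals_A, V = np.linalg.eigh(A)
B_diag = V.T @ B @ V               # should come out diagonal
shared = np.allclose(B_diag, np.diag(np.diag(B_diag)))
```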

Determinant |A| = det(A).
Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 exactly when A is singular. Also |AB| = |A||B| and |A^T| = |A|.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
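Both conditions can be checked numerically (the 2 × 2 matrix is an arbitrary example); np.linalg.eig returns eigenvalue/eigenvector pairs satisfying Ax = λx, and det(A − λI) vanishes at each eigenvalue:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, V = np.linalg.eig(A)          # eigenvalues and eigenvectors (as columns)

# Each pair satisfies A x = lambda x.
for i in range(2):
    assert np.allclose(A @ V[:, i], lam[i] * V[:, i])

# Equivalently, det(A - lambda I) = 0 for each eigenvalue.
dets = [np.linalg.det(A - l * np.eye(2)) for l in lam]
```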

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
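A minimal elimination sketch, assuming no row exchanges are needed (lu_no_pivot is a name chosen here for illustration; the 3 × 3 matrix is an arbitrary example with nonzero pivots):

```python
import numpy as np

def lu_no_pivot(A):
    """Elimination A -> U, recording the multipliers l_ij in L (l_ii = 1)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]      # multiplier l_ij
            U[i, :] -= L[i, j] * U[j, :]     # subtract l_ij times row j
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)
# L (unit diagonal) brings U back to A.
recovered = np.allclose(L @ U, A)
```

A production code would use a pivoted factorization (PA = LU) instead; this version exists only to show how the multipliers rebuild A.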

Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).
For complex A, use A^H in place of A^T.

Independent vectors v_1, ..., v_k.
No combination c_1 v_1 + ... + c_k v_k = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

Inverse matrix A^-1.
Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0, rank(A) < n, and Ax = 0 for some nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
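The two-sided inverse and the product/transpose rules can be verified numerically (the matrices here are arbitrary invertible examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])           # det = -1, so A is invertible
B = np.array([[2.0, 0.0],
              [1.0, 1.0]])

A_inv = np.linalg.inv(A)
ok_left  = np.allclose(A_inv @ A, np.eye(2))
ok_right = np.allclose(A @ A_inv, np.eye(2))

# (AB)^-1 = B^-1 A^-1 and (A^T)^-1 = (A^-1)^T
ok_product   = np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ A_inv)
ok_transpose = np.allclose(np.linalg.inv(A.T), A_inv.T)
```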

Kronecker product (tensor product) A ⊗ B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).
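Both claims are easy to check with np.kron (the triangular matrices are arbitrary examples chosen so the eigenvalues are visible on the diagonals):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = np.array([[4.0, 0.0],
              [1.0, 2.0]])

K = np.kron(A, B)                  # block (i, j) of K is a_ij * B

# The top-left block of K is a_11 * B.
block_ok = np.allclose(K[:2, :2], A[0, 0] * B)

# Eigenvalues of A (x) B are the products lambda_p(A) * lambda_q(B).
eigs_K = np.sort(np.linalg.eigvals(K).real)
eigs_prod = np.sort([p * q for p in np.linalg.eigvals(A).real
                           for q in np.linalg.eigvals(B).real])
eig_ok = np.allclose(eigs_K, eigs_prod)
```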

Lucas numbers
L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
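The recurrence and the closed form λ_1^n + λ_2^n agree, as a short sketch confirms:

```python
import numpy as np

# Recurrence: L_0 = 2, L_1 = 1, L_n = L_{n-1} + L_{n-2}.
L = [2, 1]
for _ in range(10):
    L.append(L[-1] + L[-2])

# Closed form: L_n = lambda1^n + lambda2^n, with the eigenvalues
# of the Fibonacci matrix [[1, 1], [1, 0]].
lam1 = (1 + np.sqrt(5)) / 2
lam2 = (1 - np.sqrt(5)) / 2
closed = [round(lam1**n + lam2**n) for n in range(len(L))]
```

No matrix power is needed here because the eigenvalues are known in closed form; rounding absorbs floating-point error.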

Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If all m_ij > 0, the columns of M^k approach the steady-state eigenvector: Ms = s > 0.
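A small column-stochastic example (the 2 × 2 matrix is an arbitrary choice with strictly positive entries) showing convergence of the columns of M^k to the steady state:

```python
import numpy as np

# Column-stochastic matrix: entries >= 0, each column sums to 1.
M = np.array([[0.8, 0.3],
              [0.2, 0.7]])

# Largest eigenvalue is 1; its eigenvector s is the steady state.
eigvals, eigvecs = np.linalg.eig(M)
i = np.argmax(eigvals.real)
s = eigvecs[:, i].real
s = s / s.sum()                    # normalize so the entries sum to 1

# The columns of M^k approach s.
Mk = np.linalg.matrix_power(M, 50)
converged = np.allclose(Mk, np.column_stack([s, s]))
```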

Pascal matrix
P_S = pascal(n) = the symmetric matrix with binomial entries C(i + j − 2, i − 1). P_S = P_L P_U; all three contain Pascal's triangle, with det = 1 (see Pascal in the index).
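A self-contained sketch of the factorization and the unit determinant, building the matrices directly from binomial coefficients (scipy.linalg.pascal offers the same matrices ready-made; the matrix names follow the entry above):

```python
import numpy as np
from math import comb

n = 4
# Symmetric Pascal matrix: entry (i, j) is C(i + j - 2, i - 1), 1-based indices.
PS = np.array([[comb(i + j - 2, i - 1) for j in range(1, n + 1)]
               for i in range(1, n + 1)], dtype=float)
# Lower-triangular Pascal matrix: binomial coefficients C(i - 1, j - 1).
PL = np.array([[comb(i - 1, j - 1) for j in range(1, n + 1)]
               for i in range(1, n + 1)], dtype=float)
PU = PL.T

factor_ok = np.allclose(PS, PL @ PU)          # P_S = P_L P_U
det_one = np.isclose(np.linalg.det(PS), 1.0)  # det P_S = 1
```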

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = +1 or −1) based on the number of row exchanges needed to reach I.
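A quick illustration with an arbitrary 3 × 3 example (the row order chosen here happens to be an even permutation):

```python
import numpy as np

# Permutation matrix with the rows of I in the order (2, 0, 1).
order = [2, 0, 1]
P = np.eye(3)[order]

# P A puts the rows of A in that same order.
A = np.arange(9.0).reshape(3, 3)
rows_reordered = np.allclose(P @ A, A[order])

# det P is +1 or -1 depending on the parity; a 3-cycle is even.
sign = np.linalg.det(P)
```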

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Plane (or hyperplane) in R^n.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A(A^T A)^-1 A^T.
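The three stated properties can be verified for a concrete subspace (the basis matrix A and vector b are arbitrary examples):

```python
import numpy as np

# Columns of A form a basis for the subspace S (a plane in R^3).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

P = A @ np.linalg.inv(A.T @ A) @ A.T

# P is idempotent and symmetric.
idempotent = np.allclose(P @ P, P)
symmetric = np.allclose(P, P.T)

# The error e = b - Pb is perpendicular to S (to every column of A).
b = np.array([1.0, 2.0, 3.0])
e = b - P @ b
perp = np.allclose(A.T @ e, 0.0)
```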

Right inverse A^+.
If A has full row rank m, then A^+ = A^T(AA^T)^-1 has AA^+ = I_m.
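A short check with an arbitrary full-row-rank example (2 rows, 3 columns):

```python
import numpy as np

# A has full row rank m = 2: its rows are independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Right inverse: A A^+ = I_m (A^+ A is only a projection, not I_n).
A_plus = A.T @ np.linalg.inv(A @ A.T)
right_inverse_ok = np.allclose(A @ A_plus, np.eye(2))
```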

Row space C(A^T) = all combinations of the rows of A.
Column vectors by convention.

Schur complement S = D − C A^-1 B.
Appears in block elimination on [[A, B], [C, D]].
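Block elimination gives det [[A, B], [C, D]] = det A · det S, which a small numerical sketch confirms (all four blocks are arbitrary 2 × 2 examples with A invertible):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = np.eye(2)
C = np.array([[0.0, 1.0],
              [1.0, 0.0]])
D = 3.0 * np.eye(2)

M = np.block([[A, B], [C, D]])
S = D - C @ np.linalg.inv(A) @ B    # Schur complement of A in M

# Block elimination: det M = det A * det S.
det_ok = np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S))
```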

Solvable system Ax = b.
The right side b is in the column space of A.

Spanning set.
Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!