 5.3.1: In Exercises 1–2, find ū, Re(u), Im(u), and ||u||. u = (2 − i, 4i, 1 + i)
 5.3.2: In Exercises 1–2, find ū, Re(u), Im(u), and ||u||. u = (6, 1 + 4i, 6 − 2i)
 5.3.3: In Exercises 3–4, show that u, v, and k satisfy Theorem 5.3.1. u = (3...
 5.3.4: In Exercises 3–4, show that u, v, and k satisfy Theorem 5.3.1. u = (6...
 5.3.5: Solve the equation ix − 3v = u for x, where u and v are the vectors i...
 5.3.6: Solve the equation (1 + i)x + 2u = v for x, where u and v are the v...
 5.3.7: In Exercises 7–8, find Ā, Re(A), Im(A), det(A), and tr(A). A = 5i 4 2...
 5.3.8: In Exercises 7–8, find Ā, Re(A), Im(A), det(A), and tr(A). A = 4i 2 3...
 5.3.9: Let A be the matrix given in Exercise 7, and let B be the matrix B =...
 5.3.10: Let A be the matrix given in Exercise 8, and let B be the matrix B =...
 5.3.11: In Exercises 11–12, compute u · v, u · w, and v · w, and show that the vec...
 5.3.12: In Exercises 11–12, compute u · v, u · w, and v · w, and show that the vec...
 5.3.13: Compute (u · v) − w · u for the vectors u, v, and w in Exercise 11.
 5.3.14: Compute (iu · w) + (ū · v)u for the vectors u, v, and w in Exercise 12.
 5.3.15: In Exercises 15–18, find the eigenvalues and bases for the eigenspac...
 5.3.16: In Exercises 15–18, find the eigenvalues and bases for the eigenspac...
 5.3.17: In Exercises 15–18, find the eigenvalues and bases for the eigenspac...
 5.3.18: In Exercises 15–18, find the eigenvalues and bases for the eigenspac...
 5.3.19: In Exercises 19–22, each matrix C has form (15). Theorem 5.3.7 impli...
 5.3.20: In Exercises 19–22, each matrix C has form (15). Theorem 5.3.7 impli...
 5.3.21: In Exercises 19–22, each matrix C has form (15). Theorem 5.3.7 impli...
 5.3.22: In Exercises 19–22, each matrix C has form (15). Theorem 5.3.7 impli...
 5.3.23: In Exercises 23–26, find an invertible matrix P and a matrix C of fo...
 5.3.24: In Exercises 23–26, find an invertible matrix P and a matrix C of fo...
 5.3.25: In Exercises 23–26, find an invertible matrix P and a matrix C of fo...
 5.3.26: In Exercises 23–26, find an invertible matrix P and a matrix C of fo...
 5.3.27: Find all complex scalars k, if any, for which u and v are orthogona...
 5.3.28: Show that if A is a real n × n matrix and x is a column vector in C^n,...
 5.3.29: The matrices σ1 = [0 1; 1 0], σ2 = [0 −i; i 0], σ3 = [1 0; 0 −1], called Pauli s...
 5.3.30: If k is a real scalar and v is a vector in R^n, then Theorem 3.2.1 s...
 5.3.31: Prove part (c) of Theorem 5.3.1.
 5.3.32: Prove Theorem 5.3.2.
 5.3.33: Prove that if u and v are vectors in C^n, then u · v = (1/4)||u + v||^2 − (1/4)...
 5.3.34: It follows from Theorem 5.3.7 that the eigenvalues of the rotation ...
 5.3.35: The two parts of this exercise lead you through a proof of Theorem ...
 5.3.36: In this problem you will prove the complex analog of the Cauchy–Schw...
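Exercises 15–26 revolve around complex eigenvalues of real matrices and the factorization A = PCP^-1 with C of form (15). A minimal sketch of the computation, assuming NumPy is available; the example matrix is my own illustration, not one from the exercises:

```python
import numpy as np

# A real 2x2 matrix with no real eigenvalues (chosen for illustration):
# its characteristic polynomial is lambda^2 - tr(A)*lambda + det(A) = lambda^2 + 1,
# so the eigenvalues are the conjugate pair +-i.
A = np.array([[-2.0, -1.0],
              [ 5.0,  2.0]])

vals, vecs = np.linalg.eig(A)

# Pick the eigenvalue lambda = a - bi with negative imaginary part, so that
# C below comes out in the block form [[a, -b], [b, a]] of equation (15).
k = int(np.argmin(vals.imag))
lam, x = vals[k], vecs[:, k]

P = np.column_stack([x.real, x.imag])          # columns Re(x), Im(x)
C = np.array([[ lam.real, lam.imag],
              [-lam.imag, lam.real]])

check = P @ C @ np.linalg.inv(P)               # should reproduce A
```

Any eigenvector of the chosen eigenvalue works here: rescaling x by a complex scalar mixes Re(x) and Im(x), but the identity A = PCP^-1 still holds as long as the eigenvalue is non-real.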
Solutions for Chapter 5.3: Complex Vector Spaces
Full solutions for Elementary Linear Algebra, Binder Ready Version: Applications Version, 11th Edition
ISBN: 9781118474228
This textbook survival guide was created for the textbook Elementary Linear Algebra, Binder Ready Version: Applications Version, 11th edition (ISBN 9781118474228). Chapter 5.3: Complex Vector Spaces includes 36 full step-by-step solutions.

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).

Back substitution.
Upper triangular systems are solved in reverse order, x_n to x_1.
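A sketch of the definition above, assuming NumPy; the function name and example system are illustrative:

```python
import numpy as np

def back_substitute(U, b):
    """Solve Ux = b for upper-triangular U, working from x_n back to x_1."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Subtract the already-known terms, then divide by the pivot U[i, i].
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

U = np.array([[2.0, 1.0, 1.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])
b = np.array([5.0, 7.0, 8.0])
x = back_substitute(U, b)
```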

Column space C(A).
Space of all combinations of the columns of A.

Companion matrix.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n−1) − λ^n).
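A numerical check of this definition, assuming NumPy; the polynomial (roots 1, 2, 3) is my own example:

```python
import numpy as np

def companion(c):
    """Companion matrix: n-1 ones just above the diagonal, c1..cn in row n."""
    n = len(c)
    A = np.diag(np.ones(n - 1), k=1)
    A[-1, :] = c
    return A

# lambda^3 - 6*lambda^2 + 11*lambda - 6 has roots 1, 2, 3, i.e.
# lambda^3 = 6 - 11*lambda + 6*lambda^2, so c = (6, -11, 6).
A = companion([6.0, -11.0, 6.0])

# The eigenvalues of the companion matrix are the roots of the polynomial.
vals = np.sort(np.linalg.eig(A)[0].real)
```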

Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B|.

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n − 1)/2 edges between nodes. A tree has only n − 1 edges and no closed loops.

Hermitian matrix A^H = Ā^T = A.
Complex analog a_ji = ā_ij of a symmetric matrix.
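A quick check of the definition, assuming NumPy; the matrix is my own example. A useful consequence: Hermitian matrices have real eigenvalues.

```python
import numpy as np

# A sample Hermitian matrix: equal to its own conjugate transpose.
A = np.array([[2.0,     1 - 1j],
              [1 + 1j,  3.0   ]])

assert np.allclose(A, A.conj().T)   # A^H = A

# eigvalsh exploits the Hermitian structure and returns real eigenvalues
# in ascending order (here 1 and 4, from trace 5 and determinant 4).
vals = np.linalg.eigvalsh(A)
```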

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.

Inverse matrix A^-1.
Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
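A check of the cofactor formula and the (AB)^-1 = B^-1 A^-1 rule in the 2x2 case, assuming NumPy; the matrices are my own examples:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
detA = np.linalg.det(A)              # 4*6 - 7*2 = 10, so A is invertible

# Cofactor formula for 2x2: (A^-1)_ij = C_ji / det A,
# which gives the familiar [[d, -b], [-c, a]] / (ad - bc).
Ainv = np.array([[ A[1, 1], -A[0, 1]],
                 [-A[1, 0],  A[0, 0]]]) / detA

B = np.array([[1.0, 2.0],
              [3.0, 5.0]])           # det B = -1, also invertible
```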

|A^-1| = 1/|A| and |A^T| = |A|.
The big formula for det(A) has a sum of n! terms; the cofactor formula uses determinants of size n − 1; volume of box = |det(A)|.
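The determinant identities above can be verified numerically, assuming NumPy; the random seed and size are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))      # random matrices are invertible
B = rng.standard_normal((4, 4))      # with probability 1

dA, dB = np.linalg.det(A), np.linalg.det(B)
```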

Left nullspace N (AT).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
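AM can exceed GM. A minimal example with NumPy (the matrix is the standard 2x2 Jordan block, chosen by me for illustration):

```python
import numpy as np

# Defective matrix: eigenvalue 1 is a double root of (1 - lambda)^2 = 0,
# so AM = 2, but A - I has rank 1, so the eigenspace is one-dimensional.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

am = 2                                             # double root lambda = 1
gm = 2 - np.linalg.matrix_rank(A - np.eye(2))      # dim of nullspace of A - I
```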

Network.
A directed graph that has constants c1, ..., cm associated with the edges.

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
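These properties are easy to confirm for a rotation, assuming NumPy; the angle and test vector are my own choices:

```python
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by theta

x = np.array([3.0, 4.0])                          # ||x|| = 5
```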

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.

Pseudoinverse A+ (Moore–Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(A^T). A+A and AA+ are the projection matrices onto the row space and column space. rank(A+) = rank(A).
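NumPy computes the pseudoinverse directly; a check on a rank-1 example of my own choosing:

```python
import numpy as np

# Rank-1, non-square: no ordinary inverse, but a pseudoinverse exists.
A = np.array([[1.0, 1.0],
              [2.0, 2.0],
              [0.0, 0.0]])

Aplus = np.linalg.pinv(A)

# AA+ projects onto the column space of A, so it is symmetric and idempotent.
P = A @ Aplus
```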

Row picture of Ax = b.
Each equation gives a plane in R^n; the planes intersect at x.

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.

Solvable system Ax = b.
The right side b is in the column space of A.