 3.F.1: Explain why every linear functional is either surjective or the zer...
 3.F.2: Give three distinct examples of linear functionals on R^[0,1].
 3.F.3: Suppose V is finite-dimensional and v ∈ V with v ≠ 0. Prove that ther...
 3.F.4: Suppose V is finite-dimensional and U is a subspace of V such that U...
 3.F.5: Suppose V1, ..., Vm are vector spaces. Prove that (V1 × ··· × Vm)′ and V1′ × ··· × Vm′...
 3.F.6: Suppose V is finite-dimensional and v1, ..., vm ∈ V. Define a linear...
 3.F.7: Suppose m is a positive integer. Show that the dual basis of the ba...
 3.F.8: Suppose m is a positive integer. (a) Show that 1, x − 5, ..., (x − 5)...
 3.F.9: Suppose v1, ..., vn is a basis of V and φ1, ..., φn is the correspondi...
 3.F.10: Prove the first two bullet points in 3.101.
 3.F.11: Suppose A is an m-by-n matrix with A ≠ 0. Prove that the rank of A is ...
 3.F.12: Show that the dual map of the identity map on V is the identity map...
 3.F.13: Define T : R^3 → R^2 by T(x, y, z) = (4x + 5y + 6z, 7x + 8y + 9z). Su...
 3.F.14: Define T : P(R) → P(R) by (Tp)(x) = x^2 p(x) + p″(x) for x ∈ R. (a) S...
 3.F.15: Suppose W is finite-dimensional and T ∈ L(V, W). Prove that T′ = 0 ...
 3.F.16: Suppose V and W are finite-dimensional. Prove that the map that tak...
 3.F.17: Suppose U ⊆ V. Explain why U⁰ = {φ ∈ V′ : U ⊆ null φ}.
 3.F.18: Suppose V is finite-dimensional and U ⊆ V. Show that U = {0} if and on...
 3.F.19: Suppose V is finite-dimensional and U is a subspace of V. Show that...
 3.F.20: Suppose U and W are subsets of V with U ⊆ W. Prove that W⁰ ⊆ U⁰.
 3.F.21: Suppose V is finite-dimensional and U and W are subspaces of V with...
 3.F.22: Suppose U, W are subspaces of V. Show that (U + W)⁰ = U⁰ ∩ W⁰.
 3.F.23: Suppose V is finite-dimensional and U and W are subspaces of V. Pro...
 3.F.24: Prove 3.106 using the ideas sketched in the discussion before the s...
 3.F.25: Suppose V is finite-dimensional and U is a subspace of V. Show that...
 3.F.26: Suppose V is finite-dimensional and Γ is a subspace of V′. Show that...
 3.F.27: Suppose T ∈ L(P5(R), P5(R)) and null T′ = span(φ), where φ is the linea...
 3.F.28: Suppose V and W are finite-dimensional, T ∈ L(V, W), and there exi...
 3.F.29: Suppose V and W are finite-dimensional, T ∈ L(V, W), and there exi...
 3.F.30: Suppose V is finite-dimensional and φ1, ..., φm is a linearly indepen...
 3.F.31: Suppose V is finite-dimensional and φ1, ..., φn is a basis of V′. Sh...
 3.F.32: Suppose T ∈ L(V), and u1, ..., un and v1, ..., vn are bases of V. Pro...
 3.F.33: Suppose m and n are positive integers. Prove that the function that...
 3.F.34: The double dual space of V, denoted V″, is defined to be the dual...
 3.F.35: Show that P(R)′ and R^∞ are isomorphic.
 3.F.36: Suppose U is a subspace of V. Let i : U → V be the inclusion map def...
 3.F.37: Suppose U is a subspace of V. Let π : V → V/U be the usual quotient ma...
Solutions for Chapter 3.F: Duality
Full solutions for Linear Algebra Done Right (Undergraduate Texts in Mathematics)  3rd Edition
ISBN: 9783319110790
Chapter 3.F: Duality includes 37 full step-by-step solutions.

Affine transformation
Tv = Av + v0 = linear transformation plus shift.

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
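
A small numpy sketch of this way of reading Ax = b (the 2x2 values below are made up for illustration):

```python
import numpy as np

# Column picture: b is a combination of the columns of A.
# Hypothetical 2x2 example; solvable because b lies in C(A).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])

x = np.linalg.solve(A, b)                  # coefficients of the combination
combo = x[0] * A[:, 0] + x[1] * A[:, 1]    # same combination, column by column

assert np.allclose(combo, b)               # the columns of A reproduce b
```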

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Complex conjugate
z̄ = a − ib for any complex number z = a + ib. Then z z̄ = |z|^2.
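
The rule is easy to check in Python, whose built-in complex type has a conjugate method (the sample value is made up):

```python
# Conjugate rule: for z = a + ib, the conjugate flips the sign of the
# imaginary part, and z times its conjugate is the squared magnitude.
z = 3 + 4j
assert z.conjugate() == 3 - 4j
assert z * z.conjugate() == abs(z) ** 2   # (3 + 4j)(3 - 4j) = 9 + 16 = 25
```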

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

Exponential e^(At) = I + At + (At)^2/2! + ···
has derivative Ae^(At); e^(At) u(0) solves u′ = Au.
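
A minimal sketch of the series with a made-up nilpotent A, where the series terminates after the linear term and the claims can be verified exactly:

```python
import numpy as np

# Since A^2 = 0 here, e^(At) = I + At exactly (all higher series terms vanish).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
t = 0.5
expAt = np.eye(2) + A * t          # I + At + (At)^2/2! + ... truncates exactly

u0 = np.array([1.0, 2.0])
u = expAt @ u0                     # u(t) = e^(At) u(0) solves u' = Au

# d/dt[(I + At) u0] = A u0, and A u(t) = A u0 because A^2 = 0,
# so u'(t) = A u(t) as claimed.
assert np.allclose(A @ u0, A @ u)
```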

Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).
Use A^H (conjugate transpose) for complex A.

Free variable xi.
Column i has no pivot in elimination. We can give the n − r free variables any values; then Ax = b determines the r pivot variables (if solvable!).
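
A numerical sketch of the count n − r with a hypothetical 2x3 system (values chosen so the rank and the free column are easy to see):

```python
import numpy as np

# n = 3 unknowns, rank r = 2, so n - r = 1 free variable.
# Pivots sit in columns 1 and 3; column 2 has no pivot, so x2 is free.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 4.0]])
n = A.shape[1]
r = np.linalg.matrix_rank(A)
assert n - r == 1                        # exactly one free variable

# Choosing x2 = 1 and back-substituting gives x3 = 0, x1 = -2: a null-space
# direction, since A x = 0.
x_free = np.array([-2.0, 1.0, 0.0])
assert np.allclose(A @ x_free, 0.0)
```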

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.

Independent vectors v1, ..., vk.
No combination c1 v1 + ··· + ck vk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

Inverse matrix A^−1.
Square matrix with A^−1 A = I and A A^−1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^−1 A^−1 and (A^−1)^T. Cofactor formula: (A^−1)_ij = C_ji / det A.
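
The cofactor formula can be checked against numpy's built-in inverse; the 3x3 matrix below is a made-up invertible example:

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])

def cofactor(M, i, j):
    # C_ij = (-1)^(i+j) times the minor that deletes row i and column j.
    minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

n = A.shape[0]
C = np.array([[cofactor(A, i, j) for j in range(n)] for i in range(n)])
A_inv = C.T / np.linalg.det(A)     # the transpose realizes (A^-1)_ij = C_ji / det A

assert np.allclose(A_inv, np.linalg.inv(A))
assert np.allclose(A @ A_inv, np.eye(n))
```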

Krylov subspace K_j(A, b).
The subspace spanned by b, Ab, ..., A^(j−1) b. Numerical methods approximate A^−1 b by x_j with residual b − A x_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
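
A toy sketch of this idea (random but well-conditioned A, seeded for reproducibility): least squares over a growing Krylov basis, built only by repeated multiplication by A, drives the residual b − A x_j toward zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)) + n * np.eye(n)   # made-up, safely invertible
b = rng.standard_normal(n)

cols, res_norms = [b], []
for j in range(1, n + 1):
    K = np.column_stack(cols)                     # basis of K_j(A, b)
    y, *_ = np.linalg.lstsq(A @ K, b, rcond=None)
    x_j = K @ y                                   # best x_j within K_j(A, b)
    res_norms.append(np.linalg.norm(b - A @ x_j))
    cols.append(A @ cols[-1])                     # extend basis: next power A^j b

# Nested subspaces => residuals weakly decrease; K_n typically contains A^-1 b.
assert all(res_norms[k + 1] <= res_norms[k] + 1e-8 for k in range(n - 1))
assert res_norms[-1] < 1e-6 * res_norms[0]
```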

Minimal polynomial of A.
The lowest degree polynomial with meA) = zero matrix. This is peA) = det(A  AI) if no eigenvalues are repeated; always meA) divides peA).
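
For a made-up matrix with distinct eigenvalues, the minimal polynomial equals the characteristic polynomial, so evaluating the characteristic polynomial at A itself should give the zero matrix (this is also the Cayley-Hamilton theorem):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 5.0]])         # eigenvalues 2 and 5: distinct

coeffs = np.poly(A)                # characteristic polynomial coefficients [1, -7, 10]

# Horner evaluation of the polynomial at the matrix A instead of a scalar.
p_of_A = np.zeros_like(A)
for c in coeffs:
    p_of_A = p_of_A @ A + c * np.eye(2)

assert np.allclose(p_of_A, 0.0)    # m(A) = p(A) = zero matrix here
```

For a repeated-eigenvalue contrast, A = I has p(λ) = (λ − 1)^2 but minimal polynomial λ − 1, so m(λ) properly divides p(λ).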

Multiplier ℓij.
The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
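
One elimination step on a made-up 2x2 matrix shows the multiplier at work:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [4.0, 7.0]])

l21 = A[1, 0] / A[0, 0]            # multiplier: (entry to eliminate) / (pivot) = 4/2 = 2
A[1, :] -= l21 * A[0, :]           # row 2 <- row 2 - l21 * row 1

assert A[1, 0] == 0.0                             # the (2, 1) entry is eliminated
assert np.allclose(A, [[2.0, 3.0], [0.0, 1.0]])   # the resulting echelon factor U
```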

Plane (or hyperplane) in R^n.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
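
A quick numpy check of both formulas, with made-up vectors a and b:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 3.0])

p = a * (a @ b) / (a @ a)          # p = a (a^T b / a^T a)
P = np.outer(a, a) / (a @ a)       # P = a a^T / a^T a, the projection matrix

assert np.allclose(P @ b, p)                 # matrix form gives the same projection
assert np.linalg.matrix_rank(P) == 1         # rank 1
assert np.allclose(P @ P, P)                 # projecting twice changes nothing
assert np.isclose(a @ (b - p), 0.0)          # the error b - p is perpendicular to a
```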

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.
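
A concrete instance: f(x, y) = x^2 − y^2 has first derivatives zero at the origin, and its constant Hessian has one positive and one negative eigenvalue, so the Hessian is indefinite and (0, 0) is a saddle point:

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2 (second derivatives: 2, 0, 0, -2).
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])

eigs = np.linalg.eigvalsh(H)
assert eigs.min() < 0 < eigs.max()   # mixed signs => indefinite => saddle point
```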

Symmetric matrix A.
The transpose is A^T = A, and aij = aji. A^−1 is also symmetric.