Solutions for Chapter 5.1: The Scalar Product in Rn

Full solutions for Linear Algebra with Applications | 8th Edition

ISBN: 9780136009290

Textbook: Linear Algebra with Applications
Edition: 8
Author: Steve Leon
ISBN: 9780136009290

Chapter 5.1: The Scalar Product in Rn includes 21 full step-by-step solutions, and more than 13,248 students have viewed solutions from this chapter. Linear Algebra with Applications (8th edition, ISBN 9780136009290) was written by Steve Leon. This textbook survival guide covers the following chapters and their solutions.

Key Math Terms and definitions covered in this textbook
  • Basis for V.

    Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1 v1 + ... + cd vd. Each basis gives unique c's. A vector space has many bases!
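
    A minimal NumPy sketch (the basis B and vector v below are my own choices, not from the text): the unique coordinates c come from solving the system whose columns are the basis vectors.

      import numpy as np

      # Hypothetical basis of R^3: the columns of B are independent vectors b1, b2, b3.
      B = np.array([[1.0, 1.0, 0.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 1.0]])
      v = np.array([2.0, 3.0, 4.0])

      # The unique coordinates c solve B c = v, i.e. v = c1*b1 + c2*b2 + c3*b3.
      c = np.linalg.solve(B, v)
      print(c)                      # [ 3. -1.  4.]
      print(np.allclose(B @ c, v))  # True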

  • Characteristic equation det(A - λI) = 0.

    The n roots are the eigenvalues of A.
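
    As a quick check (the 2-by-2 matrix is my own example, computed with NumPy rather than the textbook's software), the eigenvalues returned by eigvals are exactly the roots of det(A - λI) = 0.

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [1.0, 2.0]])

      eig = np.linalg.eigvals(A)        # eigenvalues 3 and 1

      # Coefficients of det(A - lambda*I): lambda^2 - 4*lambda + 3.
      coeffs = np.poly(A)               # [ 1. -4.  3.]
      print(np.sort(np.roots(coeffs)))  # [1. 3.] -- the roots ...
      print(np.sort(eig))               # [1. 3.] -- ... are the eigenvalues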

  • Condition number

    cond(A) = c(A) = ||A|| ||A⁻¹|| = σmax/σmin. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to change in the input.
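
    A short NumPy illustration (the nearly singular matrix is my own example) of how cond(A) bounds the error amplification in Ax = b:

      import numpy as np

      A = np.array([[1.0, 1.0],
                    [1.0, 1.0001]])        # nearly singular, so badly conditioned
      x = np.array([1.0, 1.0])
      b = A @ x

      cond = np.linalg.cond(A)             # sigma_max / sigma_min, roughly 4e4
      db = np.array([0.0, 1e-4])           # small change in the right-hand side
      dx = np.linalg.solve(A, b + db) - x  # resulting change in the solution

      rel_in = np.linalg.norm(db) / np.linalg.norm(b)
      rel_out = np.linalg.norm(dx) / np.linalg.norm(x)
      print(rel_out <= cond * rel_in)      # True: output change bounded by cond(A) times input change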

  • Dot product = Inner product xᵀy = x1y1 + ... + xnyn.

    The complex dot product is x̄ᵀy (conjugate the first vector). Perpendicular vectors have xᵀy = 0. (AB)ij = (row i of A)·(column j of B).
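
    Since this chapter is about the scalar product, here is a small NumPy sketch (vectors chosen by me); note that np.vdot conjugates its first argument, matching the complex dot product above.

      import numpy as np

      x = np.array([1.0, 2.0, 2.0])
      y = np.array([2.0, -1.0, 0.0])

      print(x @ y)          # x^T y = 1*2 + 2*(-1) + 2*0 = 0, so x and y are perpendicular
      print(np.dot(x, y))   # the same scalar product

      # Complex case: conjugate the first vector.
      u = np.array([1 + 1j, 2j])
      v = np.array([1 - 1j, 1j])
      print(np.vdot(u, v))  # conj(u)^T v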

  • Elimination matrix = Elementary matrix Eij.

    The identity matrix with an extra -eij in the i, j entry (i ≠ j). Then Eij A subtracts eij times row j of A from row i.
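
    A tiny NumPy demonstration (the 2-by-2 matrix is my own): E21 subtracts e21 = 2 times row 1 of A from row 2.

      import numpy as np

      A = np.array([[2.0, 1.0],
                    [4.0, 7.0]])

      # E21: the identity with -e21 = -2 in the (2, 1) entry, where e21 = a21/a11 = 2.
      E21 = np.array([[ 1.0, 0.0],
                      [-2.0, 1.0]])

      print(E21 @ A)  # [[2. 1.]
                      #  [0. 5.]] -- row 2 minus 2 times row 1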

  • Ellipse (or ellipsoid) xᵀAx = 1.

    A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A⁻¹y||² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σi.)
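
    A quick numerical check (positive definite matrix chosen by me): the semi-axis lengths of xᵀAx = 1 are 1/√λ for each eigenvalue λ of A.

      import numpy as np

      A = np.array([[2.0, 0.0],
                    [0.0, 0.5]])   # positive definite

      lam = np.linalg.eigvalsh(A)  # eigenvalues 0.5 and 2.0
      print(1.0 / np.sqrt(lam))    # semi-axis lengths: about [1.414, 0.707]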

  • Full column rank r = n.

    Independent columns, N(A) = {0}, no free variables.

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
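
    A small NumPy sketch (the 3-node, 3-edge directed graph is my own example) of building the edge-node incidence matrix:

      import numpy as np

      # Directed edges (from node i, to node j) of a hypothetical 3-node graph.
      edges = [(0, 1), (1, 2), (0, 2)]
      n_nodes = 3

      A = np.zeros((len(edges), n_nodes))
      for row, (i, j) in enumerate(edges):
          A[row, i] = -1   # the edge leaves node i
          A[row, j] = +1   # the edge enters node j

      print(A)
      # [[-1.  1.  0.]
      #  [ 0. -1.  1.]
      #  [-1.  0.  1.]]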

  • Kirchhoff's Laws.

    Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

  • Left inverse A⁺.

    If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = In.
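
    A NumPy check (the 3-by-2 matrix is my own) that (AᵀA)⁻¹Aᵀ acts as a left inverse when the columns are independent:

      import numpy as np

      A = np.array([[1.0, 0.0],
                    [1.0, 1.0],
                    [0.0, 1.0]])  # full column rank: r = n = 2

      A_plus = np.linalg.inv(A.T @ A) @ A.T
      print(np.allclose(A_plus @ A, np.eye(2)))      # True: A+ A = I
      print(np.allclose(A_plus, np.linalg.pinv(A)))  # True: equals the pseudoinverse here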

  • Markov matrix M.

    All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of Mᵏ approach the steady-state eigenvector Ms = s > 0.
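
    A brief NumPy illustration (the 2-by-2 Markov matrix is my own) of the columns of Mᵏ approaching the steady-state eigenvector s with Ms = s:

      import numpy as np

      M = np.array([[0.9, 0.2],
                    [0.1, 0.8]])            # positive entries, each column sums to 1

      print(np.linalg.matrix_power(M, 50))  # both columns are close to the steady state

      # Steady state s: eigenvector for the largest eigenvalue (which is 1), scaled to sum to 1.
      vals, vecs = np.linalg.eig(M)
      s = vecs[:, np.argmax(vals)]
      s = s / s.sum()
      print(s)                              # approximately [0.667, 0.333]
      print(np.allclose(M @ s, s))          # True: M s = s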

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
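
    A NumPy sketch (2-by-2 matrices chosen by me) of these equivalent ways to form AB:

      import numpy as np

      A = np.array([[1.0, 2.0],
                    [3.0, 4.0]])
      B = np.array([[5.0, 6.0],
                    [7.0, 8.0]])
      AB = A @ B

      # Entry (i, j) as a sum over k of a_ik * b_kj:
      entry_01 = sum(A[0, k] * B[k, 1] for k in range(2))
      print(np.isclose(entry_01, AB[0, 1]))                 # True

      # Column j of AB = A times column j of B:
      print(np.allclose(A @ B[:, 1], AB[:, 1]))             # True

      # Columns times rows: AB = sum of (column k of A)(row k of B):
      outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(2))
      print(np.allclose(outer_sum, AB))                     # True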

  • Network.

    A directed graph that has constants c1, ..., cm associated with the edges.

  • Norm ||A||.

    The "ℓ2 norm" of A is the maximum ratio ||Ax||/||x|| = σmax. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||F² = ΣΣ aij². The ℓ1 and ℓ∞ norms are the largest column and row sums of |aij|.
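
    A quick NumPy comparison (the matrix is my own example) of the norms listed above:

      import numpy as np

      A = np.array([[1.0, -2.0],
                    [3.0,  4.0]])

      sigma = np.linalg.svd(A, compute_uv=False)
      print(np.isclose(np.linalg.norm(A, 2), sigma[0]))  # True: l2 norm = sigma_max
      print(np.linalg.norm(A, 'fro'))                    # Frobenius: sqrt of the sum of aij^2
      print(np.linalg.norm(A, 1))                        # largest column sum of |aij| = 6
      print(np.linalg.norm(A, np.inf))                   # largest row sum of |aij| = 7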

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Permutation matrix P.

    There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges needed to reach I.
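
    A small NumPy example (the order is my own choice): reordering the rows of I gives P, and PA reorders the rows of A the same way.

      import numpy as np

      order = [2, 0, 1]        # one of the n! = 6 orders of the rows
      P = np.eye(3)[order]     # rows of I in that order

      A = np.arange(9.0).reshape(3, 3)
      print(P @ A)             # rows of A in the same order
      print(np.linalg.det(P))  # +1 or -1 (even or odd permutation)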

  • Pivot.

    The diagonal entry (first nonzero) at the time when a row is used in elimination.

  • Row picture of Ax = b.

    Each equation gives a plane in Rn; the planes intersect at x.

  • Standard basis for Rn.

    Columns of the n by n identity matrix (written i, j, k in R3).

  • Subspace S of V.

    Any vector space inside V, including V and Z = {zero vector only}.