- 9.4.1: Use the zero-product property to solve each equation. a. (x + 4)(x ...
- 9.4.2: Graph each equation and then rewrite it in factored form. a. y = x^2...
- 9.4.3: Name the x-intercepts of the parabola described by each quadratic e...
- 9.4.4: Write an equation of a quadratic function that corresponds to each ...
- 9.4.5: Consider the equation y = (x + 1)(x - 3). a. How many x-intercepts do...
- 9.4.6: Is the expression on the left equivalent to the expression on the r...
- 9.4.7: Use a rectangle diagram to factor each expression. a. x^2 + 7x + 6 b...
- 9.4.8: The sum and product of the roots of a quadratic equation are relate...
- 9.4.9: Mini-Investigation In this exercise you will discover whether knowi...
- 9.4.10: Write a quadratic equation of a parabola with x-intercepts at 3 and...
- 9.4.11: APPLICATION The school ecology club wants to fence in an area along...
- 9.4.12: Mini-Investigation Consider the equation y = x^2 - 9. a. Graph the equ...
- 9.4.13: Kayleigh says that the roots of 0 = -x^2 + 16 are 4 and -4 because (4)...
- 9.4.14: Reduce the rational expressions by dividing out common factors from...
- 9.4.15: Multiply and combine like terms. a. (x - 21)(x + 2) b. (3x + 1)(x + ...
- 9.4.16: Edward is responsible for keeping the stockroom packed with the bes...
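Several of the exercises above lean on the zero-product property. A minimal sketch, using (x + 4)(x - 3) = 0 as a hypothetical completion of the truncated equation in 9.4.1 (not necessarily the book's):

```python
# Zero-product property: if a * b = 0 then a = 0 or b = 0.
# Hypothetical example equation: (x + 4)(x - 3) = 0.
roots = [-4, 3]                       # from x + 4 = 0 and x - 3 = 0

for x in roots:
    assert (x + 4) * (x - 3) == 0     # each root makes one factor zero
    assert x**2 + x - 12 == 0         # same roots in expanded form
```

The same check works in reverse for 9.4.2: expanding the factored form and comparing coefficients confirms a proposed factorization.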
Solutions for Chapter 9.4: Factored Form
Full solutions for Discovering Algebra: An Investigative Approach | 2nd Edition
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
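The rank test can be checked numerically; a small NumPy sketch with a made-up rank-1 matrix:

```python
import numpy as np

A = np.array([[1., 2.], [2., 4.]])   # rank 1: second row = 2 * first row
b_good = np.array([[3.], [6.]])      # in the column space of A
b_bad = np.array([[3.], [7.]])       # not in the column space

rank_A = np.linalg.matrix_rank(A)
# Ax = b is solvable exactly when [A b] has the same rank as A:
solvable = np.linalg.matrix_rank(np.hstack([A, b_good])) == rank_A
unsolvable = np.linalg.matrix_rank(np.hstack([A, b_bad])) > rank_A
```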
Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1 v1 + ... + cd vd. V has many bases; each basis gives unique c's.
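The uniqueness of the coefficients c can be seen by solving a linear system; a quick sketch with an invented basis of R^2:

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])          # v1, v2 independent: a basis for R^2
V = np.column_stack([v1, v2])

v = np.array([3.0, 2.0])
c = np.linalg.solve(V, v)          # the unique coefficients for this basis
assert np.allclose(c[0] * v1 + c[1] * v2, v)   # v = c1 v1 + c2 v2
```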
cond(A) = c(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to change in the input.
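The bound can be demonstrated on a nearly singular (hence badly conditioned) made-up matrix:

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 1.0001]])    # nearly singular
c = np.linalg.cond(A)                        # = sigma_max / sigma_min (2-norm)
assert np.isclose(c, np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2))

# A tiny relative change in b causes a large relative change in x:
b = np.array([2.0, 2.0001])
db = np.array([0.0, 0.0001])
x = np.linalg.solve(A, b)
dx = np.linalg.solve(A, b + db) - x
rel_out = np.linalg.norm(dx) / np.linalg.norm(x)
rel_in = np.linalg.norm(db) / np.linalg.norm(b)
assert rel_out <= c * rel_in                 # the cond(A) bound from the glossary
```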
Diagonalization Λ = S^-1 A S.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
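A quick NumPy check of both identities, using an invented 2x2 matrix with distinct eigenvalues (so S is invertible):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # eigenvalues 5 and 2
lam, S = np.linalg.eig(A)                # eigenvalues lam, eigenvector matrix S
Lam = np.diag(lam)

# Diagonalization: A = S Lam S^{-1}
assert np.allclose(A, S @ Lam @ np.linalg.inv(S))

# Powers come for free: A^3 = S Lam^3 S^{-1}
assert np.allclose(np.linalg.matrix_power(A, 3), S @ Lam**3 @ np.linalg.inv(S))
```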
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
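The pivot pattern is easy to test mechanically; a small sketch (the helper `is_echelon` is illustrative, not a library function):

```python
import numpy as np

U = np.array([[2, 1, 3],
              [0, 0, 4],
              [0, 0, 0]])       # pivots 2 and 4 move right; zero row comes last

def is_echelon(U):
    """Check: each pivot is in a later column than the one above; zero rows last."""
    last_pivot_col = -1
    seen_zero_row = False
    for row in U:
        nonzero = np.flatnonzero(row)
        if nonzero.size == 0:
            seen_zero_row = True
        elif seen_zero_row or nonzero[0] <= last_pivot_col:
            return False
        else:
            last_pivot_col = nonzero[0]
    return True

assert is_echelon(U)
assert not is_echelon(np.array([[0, 1], [1, 0]]))   # pivot moves left: not echelon
```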
Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
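A sketch with a made-up 2x3 matrix of full row rank, showing every b in R^m is reachable:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])            # 2 x 3, independent rows
m = A.shape[0]
assert np.linalg.matrix_rank(A) == m       # full row rank r = m

# Then Ax = b has at least one solution for every b in R^m:
b = np.array([5.0, 6.0])
x, *_ = np.linalg.lstsq(A, b, rcond=None)  # least squares picks one solution
assert np.allclose(A @ x, b)               # zero residual: b is reachable
```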
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
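The two edge counts above, as trivially checkable helpers:

```python
def complete_edges(n):
    return n * (n - 1) // 2        # one edge for every pair of n nodes

def tree_edges(n):
    return n - 1                   # a tree on n nodes: no closed loops

assert complete_edges(4) == 6      # K4 has 6 edges
assert tree_edges(4) == 3
```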
Hypercube matrix P_L.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.
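Those counts follow a standard formula: the cube in R^n has C(n, k) * 2^(n-k) faces of dimension k. A short stdlib sketch (the function name is ours):

```python
from math import comb

def faces_of_cube(n):
    """Counts of k-dimensional faces of the cube in R^n, for k = 0..n."""
    return [comb(n, k) * 2 ** (n - k) for k in range(n + 1)]

# A 3-D cube: 8 corners, 12 edges, 6 faces, 1 solid cell.
assert faces_of_cube(3) == [8, 12, 6, 1]
# A square (cube in R^2): 4 corners, 4 edges, 1 face.
assert faces_of_cube(2) == [4, 4, 1]
```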
Indefinite matrix A.
A symmetric matrix with eigenvalues of both signs (+ and -).
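Equivalently, x^T A x takes both signs. A check with an invented symmetric matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 1.0]])     # symmetric
eig = np.linalg.eigvalsh(A)                # eigenvalues -1 and 3
assert eig.min() < 0 < eig.max()           # both signs: A is indefinite

# x^T A x takes both signs too:
assert np.array([1.0, 1.0]) @ A @ np.array([1.0, 1.0]) > 0
assert np.array([1.0, -1.0]) @ A @ np.array([1.0, -1.0]) < 0
```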
Linearly dependent v1, ..., vn.
A combination other than all ci = 0 gives c1 v1 + ... + cn vn = 0.
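A sketch with two invented dependent vectors and the nonzero combination that produces zero:

```python
import numpy as np

v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 6.0])          # v2 = 3 * v1, so {v1, v2} is dependent

# A combination other than c1 = c2 = 0 giving the zero vector:
c1, c2 = 3.0, -1.0
assert np.allclose(c1 * v1 + c2 * v2, 0)

# Equivalently, the matrix with these columns has rank < number of columns:
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) < 2
```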
Norm ||A||. The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm ||A||_F^2 = ΣΣ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
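All four norms are available in NumPy; a check with a made-up 2x2 matrix:

```python
import numpy as np

A = np.array([[1.0, -2.0], [3.0, 4.0]])

two_norm = np.linalg.norm(A, 2)            # largest singular value sigma_max
assert np.isclose(two_norm, np.linalg.svd(A, compute_uv=False)[0])

fro = np.linalg.norm(A, 'fro')             # sqrt of the sum of a_ij^2
assert np.isclose(fro**2, (A**2).sum())

one_norm = np.linalg.norm(A, 1)            # largest column sum of |a_ij|
inf_norm = np.linalg.norm(A, np.inf)       # largest row sum of |a_ij|
assert one_norm == max(abs(A).sum(axis=0))
assert inf_norm == max(abs(A).sum(axis=1))

# The key inequality ||Ax|| <= ||A|| ||x||:
x = np.array([1.0, 1.0])
assert np.linalg.norm(A @ x) <= two_norm * np.linalg.norm(x) + 1e-12
```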
Normal matrix N.
If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.
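A rotation matrix is a convenient normal (but not symmetric) example:

```python
import numpy as np

N = np.array([[0.0, -1.0], [1.0, 0.0]])    # 90-degree rotation
assert np.allclose(N @ N.T, N.T @ N)       # N is normal

lam, X = np.linalg.eig(N)                  # complex eigenvalues +-i
# The eigenvector columns are orthonormal: X^H X = I
assert np.allclose(X.conj().T @ X, np.eye(2))
```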
Outer product uv^T.
Column times row = rank one matrix.
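A check with two invented vectors, confirming the rank-one property:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])
A = np.outer(u, v)                          # column u times row v^T: 3 x 2

assert A.shape == (3, 2)
assert np.linalg.matrix_rank(A) == 1        # every outer product has rank one
assert np.allclose(A, u[:, None] * v[None, :])
```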
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
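One way to compute Q and H is from the SVD A = U Σ V^T: then Q = U V^T and H = V Σ V^T. A sketch with an invented 2x2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                          # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt          # symmetric positive (semi)definite factor

assert np.allclose(A, Q @ H)                    # A = Q H
assert np.allclose(Q.T @ Q, np.eye(2))          # Q is orthogonal
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)  # H is positive semidefinite
```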
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S then P = A(A^T A)^-1 A^T.
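All of these properties can be verified at once; a sketch projecting an invented b onto the column space of an invented A:

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # columns = basis for S
P = A @ np.linalg.inv(A.T @ A) @ A.T                  # P = A (A^T A)^{-1} A^T

b = np.array([1.0, 2.0, 4.0])
p = P @ b                     # closest point to b in S
e = b - p                     # error, perpendicular to S

assert np.allclose(P @ P, P) and np.allclose(P, P.T)  # P^2 = P = P^T
assert np.allclose(A.T @ e, 0)                        # e is perpendicular to S
assert np.allclose(np.linalg.eigvalsh(P), [0.0, 1.0, 1.0])  # eigenvalues 0, 1
```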
Schur complement S = D - C A^-1 B.
Appears in block elimination on [A B; C D].
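A check with invented blocks, using the standard identity det M = det(A) det(S) that block elimination produces:

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([[2.0], [0.0]])
C = np.array([[1.0, 1.0]])
D = np.array([[5.0]])
M = np.block([[A, B], [C, D]])

# Block elimination subtracts C A^{-1} times the first block row,
# leaving the Schur complement S = D - C A^{-1} B in the corner:
S = D - C @ np.linalg.inv(A) @ B

# det M = det A * det S
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S))
```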
Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.
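Any R, even a singular one, gives a semidefinite R^T R; a quick sketch:

```python
import numpy as np

R = np.array([[1.0, 2.0], [0.0, 0.0]])   # any R gives semidefinite A = R^T R
A = R.T @ R                              # here A = [[1, 2], [2, 4]]

assert np.all(np.linalg.eigvalsh(A) >= -1e-12)   # all eigenvalues >= 0

rng = np.random.default_rng(0)                   # x^T A x >= 0 for random x
for _ in range(100):
    x = rng.standard_normal(2)
    assert x @ A @ x >= -1e-12
```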
Singular matrix A.
A square matrix that has no inverse: det(A) = 0.
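A sketch showing both symptoms at once (zero determinant, and the inverse routine refusing):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])     # dependent rows: singular
assert np.isclose(np.linalg.det(A), 0.0)   # det(A) = 0

try:
    np.linalg.inv(A)                       # no inverse exists
    raised = False
except np.linalg.LinAlgError:
    raised = True
assert raised
```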
Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
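The eigenvalue claim is easy to verify on an invented 2x2 example:

```python
import numpy as np

K = np.array([[0.0, 2.0], [-2.0, 0.0]])    # skew-symmetric: K^T = -K
assert np.allclose(K.T, -K)

lam = np.linalg.eigvals(K)                 # eigenvalues are +-2i
assert np.allclose(lam.real, 0)            # pure imaginary, as the glossary says
```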
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has the spring constants from Hooke's Law and Ax = stretching.
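A sketch of K = A^T C A for an invented line of three springs and two free nodes (the geometry and spring constants are made up):

```python
import numpy as np

# Two free nodes between two walls, joined by three springs in a line.
# A maps node movements x to spring stretches; C holds spring constants.
A = np.array([[ 1.0,  0.0],     # spring 1 stretches by x1
              [-1.0,  1.0],     # spring 2 stretches by x2 - x1
              [ 0.0, -1.0]])    # spring 3 stretches by -x2
C = np.diag([2.0, 3.0, 2.0])    # Hooke's-law constants

K = A.T @ C @ A                 # stiffness matrix K = A^T C A

assert np.allclose(K, K.T)                  # K is symmetric
assert np.all(np.linalg.eigvalsh(K) > 0)    # and positive definite here
f = K @ np.array([1.0, 0.0])                # internal forces from a movement
```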