# Scientific Computation ECS 231

This 3 page Class Notes was uploaded by Ashleigh Dare on Tuesday September 8, 2015. The Class Notes belongs to ECS 231 at University of California - Davis taught by Zhaojun Bai in Fall. Since its upload, it has received 74 views. For similar materials see /class/187717/ecs-231-university-of-california-davis in Engineering Computer Science at University of California - Davis.

#### ECS 231 Handout 6 (April 21, 2009)

**1. The landscape of solvers** for the linear system of equations $Ax = b$, where $A$ is an $n \times n$ matrix and $b$ is an $n$-vector:

| | Direct | Iterative |
|---|---|---|
| Nonsymmetric $A$ | $A = LU$ | GMRES |
| Symmetric positive definite $A$ | $A = LL^T$ (Cholesky) | CG |

**2. A general framework for iterative projection methods.** The basic idea of projection techniques is to extract an approximate solution from subspaces of $\mathbb{R}^n$. Let $\mathcal{W}$ and $\mathcal{V}$ be two $m$-dimensional subspaces of $\mathbb{R}^n$, and let $x_0$ be a good initial guess of the solution. The projection technique is to

Find $\tilde{x} \in x_0 + \mathcal{W}$ such that $b - A\tilde{x} \perp \mathcal{V}$. &nbsp; (1)

Write $\tilde{x} = x_0 + z$ with $z \in \mathcal{W}$, and define the initial residual $r_0 = b - Ax_0$. Notice that $b - A\tilde{x} = b - Ax_0 - Az = r_0 - Az$. Then formulation (1) is equivalent to

Find $z \in \mathcal{W}$ such that $r_0 - Az \perp \mathcal{V}$. &nbsp; (1a)

This is a basic projection step in its most general form. Most standard techniques use a succession of such projections. Typically, a new projection step uses a new pair of subspaces $\mathcal{W}$ and $\mathcal{V}$ updated from the previous step, and an initial guess $x_0$ equal to the most recent approximation. This simple framework is common to many methods in numerical computing. If $\mathcal{W} = \mathcal{V}$, the method is called an *orthogonal* projection method; otherwise it is called an *oblique* projection method. The orthogonality constraint in (1a) is known as the Petrov-Galerkin condition.

**3. Matrix representation of iterative subspace projection methods.** Let $V = [v_1, v_2, \ldots, v_m]$ be an $n \times m$ matrix whose columns form a basis of $\mathcal{V}$, and similarly let $W = [w_1, w_2, \ldots, w_m]$ be an $n \times m$ matrix whose columns form a basis of $\mathcal{W}$. Then any approximate solution in $x_0 + \mathcal{W}$ can be written as

$\tilde{x} = x_0 + Wy$, &nbsp; i.e., $z = Wy$,

and the orthogonality condition implies $V^T(r_0 - Az) = 0$, thus

$V^T A W y = V^T r_0 \quad\Longrightarrow\quad y = (V^T A W)^{-1} V^T r_0$,

provided $V^T A W$ is invertible. Putting it all together, we have

$\tilde{x} = x_0 + W (V^T A W)^{-1} V^T r_0$.

Now we have a prototype subspace projection method:

0. Let $x_0$ be an initial approximation.
1. Iterate until convergence:
2. &nbsp;&nbsp; Select a pair of subspaces $\mathcal{V}$ and $\mathcal{W}$.
3. &nbsp;&nbsp; Generate basis matrices $V$ and $W$ for $\mathcal{V}$ and $\mathcal{W}$.
4. &nbsp;&nbsp; $r_0 \leftarrow b - Ax_0$
5. &nbsp;&nbsp; $y \leftarrow (V^T A W)^{-1} V^T r_0$
6. &nbsp;&nbsp; $x_0 \leftarrow x_0 + Wy$

A few important remarks:

1. In many practical algorithms to be discussed below, the matrix $V^T A W$ does not have to be formed explicitly. It is available as a by-product of Steps 2 and 3, e.g., the Arnoldi process.
2. The method is defined only when $V^T A W$ is nonsingular, which is not guaranteed to be true even when $A$ is nonsingular.
3. There are two important special cases where the nonsingularity of $V^T A W$ is guaranteed:
   - (a) If $A$ is symmetric positive definite (SPD) and $\mathcal{W} = \mathcal{V}$, then $V^T A W = W^T A W$ is nonsingular.
   - (b) If $A$ is nonsingular and $\mathcal{V} = A\mathcal{W}$, then $V^T A W = W^T A^T A W$ is nonsingular.

**4. Optimality for orthogonal projection.**

*Theorem 1.* Assume that $A$ is SPD and that $\mathcal{V} = \mathcal{W}$. Then a vector $\tilde{x}$ is the result of (1) if and only if

$\|x_* - \tilde{x}\|_A = \min_{x \in x_0 + \mathcal{W}} \|x_* - x\|_A$,

where $\|x_* - x\|_A = \left((x_* - x)^T A (x_* - x)\right)^{1/2}$ and $x_*$ is the exact solution to $Ax = b$.

*Proof.* Notice that $(\cdot,\cdot)_A$ is an inner product on $\mathbb{R}^n$. Thus $\|x_* - x\|_A$ over all possible $x \in x_0 + \mathcal{W}$ is minimized at $\tilde{x}$ if and only if $x_* - \tilde{x} \perp_A \mathcal{W}$, i.e., $(A(x_* - \tilde{x}), w) = (b - A\tilde{x}, w) = 0$ for any $w \in \mathcal{W} = \mathcal{V}$. This is (1). ∎

*Remark.* The corresponding methods are the steepest descent (SD) method and the conjugate gradient (CG) method.

**5. Optimality for oblique projection.**

*Theorem 2.* Let $A$ be an arbitrary square matrix and assume $\mathcal{V} = A\mathcal{W}$. Then a vector $\tilde{x}$ is the result of (1) if and only if

$\|b - A\tilde{x}\|_2 = \min_{x \in x_0 + \mathcal{W}} \|b - Ax\|_2$.

*Proof.* $\|b - Ax\|_2$ over all possible $x \in x_0 + \mathcal{W}$ is minimized at $\tilde{x}$ if and only if $b - A\tilde{x} \perp A\mathcal{W}$, i.e., $(b - A\tilde{x}, v) = 0$ for any $v \in A\mathcal{W} = \mathcal{V}$. This is (1). ∎

*Remark.* The corresponding methods are the minimal residual (MR) method and the generalized minimal residual (GMRES) method.

**6. Steepest descent (SD) method.** A one-dimensional subspace projection process is defined when $\mathcal{W} = \operatorname{span}\{w\}$ and $\mathcal{V} = \operatorname{span}\{v\}$, where $w$ and $v$ are two vectors. In this case, the new approximation takes the form $\tilde{x} = x + z = x + \alpha w$, and condition (1a) implies $v^T(r - Az) = v^T(r - \alpha A w) = 0$, and thus

$\alpha = \dfrac{v^T r}{v^T A w}$.

When $A$ is SPD, at each step take $v = w = r$, the residual vector. This yields the

STEEPEST DESCENT (SD) ALGORITHM
1. Pick an initial guess $x_0$.
2. Until convergence, for $k = 0, 1, 2, \ldots$ do
3. &nbsp;&nbsp; $r_k = b - Ax_k$
4. &nbsp;&nbsp; $\alpha_k = \dfrac{r_k^T r_k}{r_k^T A r_k}$
5. &nbsp;&nbsp; $x_{k+1} = x_k + \alpha_k r_k$
6. Enddo

We can view each step of the above iteration as minimizing

$f(x) \stackrel{\mathrm{def}}{=} \tfrac{1}{2}\|x - x_*\|_A^2 = \tfrac{1}{2}(x - x_*)^T A (x - x_*)$

over all vectors of the form $x - \alpha \nabla f(x)$, where $\nabla f(x)$ is the gradient of $f$ at $x$. Recall that the negative of the gradient direction is locally the direction that yields the fastest rate of decrease for $f$.
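The closed-form correction $\tilde{x} = x_0 + W (V^T A W)^{-1} V^T r_0$ from the matrix-representation section can be sketched in a few lines of NumPy. This is an illustration only, not one of the practical algorithms the handout alludes to: the function name `projection_step` and the test matrix are our own choices, and the code assumes $V^T A W$ is nonsingular (see remark 2 above).

```python
import numpy as np

def projection_step(A, b, x0, V, W):
    """One prototype projection step: x0 + W (V^T A W)^{-1} V^T r0.

    V, W are n-by-m basis matrices for the subspaces; assumes
    V^T A W is nonsingular (not guaranteed in general).
    """
    r0 = b - A @ x0                              # initial residual
    y = np.linalg.solve(V.T @ A @ W, V.T @ r0)   # small m-by-m system
    return x0 + W @ y

# Sanity check: with full bases W = V = R^n, a single projection
# step must recover the exact solution of Ax = b.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)  # well-conditioned test matrix
b = rng.standard_normal(4)
x = projection_step(A, b, np.zeros(4), np.eye(4), np.eye(4))
print(np.allclose(A @ x, b))  # True
```

Practical methods never form the dense $m \times m$ system this way; as remark 1 notes, processes like Arnoldi produce $V^T A W$ (or a structured equivalent) as a by-product.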
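The SD algorithm above translates almost line for line into code. A minimal sketch, assuming $A$ is SPD; the stopping tolerance, iteration cap, and the SPD test problem ($M^T M + I$) are our own additions, not part of the handout.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, maxit=1000):
    """Steepest descent for SPD A: the 1-D projection with v = w = r_k."""
    x = x0.astype(float)
    for _ in range(maxit):
        r = b - A @ x                      # step 3: residual r_k
        if np.linalg.norm(r) < tol:        # convergence test
            break
        alpha = (r @ r) / (r @ (A @ r))    # step 4: alpha_k = r^T r / r^T A r
        x = x + alpha * r                  # step 5: x_{k+1} = x_k + alpha_k r_k
    return x

# Build an SPD test problem: M^T M + I is symmetric positive definite.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)
b = rng.standard_normal(5)
x = steepest_descent(A, b, np.zeros(5))
print(np.allclose(A @ x, b, atol=1e-8))  # True
```

Note that for SPD $A$, $\nabla f(x) = A(x - x_*) = -r$, so the update $x + \alpha_k r_k$ is exactly a step along the negative gradient, matching the minimization view at the end of the handout.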

