# Class Notes for STAT 635 at Ohio State University
# STATISTICS 635, Summer 2005: Lecture Outline 5

*Statistical Analysis of Time Series*

> "Thoughts are forces." (Ralph Waldo Trine)

## Construction of Strictly Stationary Series

- Let $\{Z_t\}$ be an iid sequence, which we know to be strictly stationary.
- Define the series $\{X_t\}$ as follows:
  $$X_t = g(Z_t, Z_{t-1}, \ldots, Z_{t-q})$$
  for some real-valued function $g$.
- The time series $\{X_t\}$ is said to be constructed by *filtering* the iid sequence $\{Z_t\}$ using the filter $g$.
- Clearly, $\{X_t\}$ is strictly stationary, since
  $$(Z_{t+h}, Z_{t+h-1}, \ldots, Z_{t+h-q}) \overset{d}{=} (Z_t, Z_{t-1}, \ldots, Z_{t-q})$$
  for all integers $h$.
- $\{X_t\}$ is clearly $q$-dependent.

**Remark.** The above method provides one of the simplest ways to construct time series that are strictly stationary.

## The MA(q) Process and Related Results

- Let $\{Z_t\}$ be a $\mathrm{WN}(0, \sigma^2)$ process. Let $\theta_0 = 1$, let $q$ be a positive integer, and consider real constants $\theta_1, \theta_2, \ldots, \theta_q$ with $\theta_q \neq 0$. Define
  $$X_t = Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2} + \cdots + \theta_q Z_{t-q} = \sum_{j=0}^{q} \theta_j Z_{t-j}.$$
- $\{X_t\}$ is the moving average of order $q$, or the MA(q) process.
- $X_t$ is a linear combination of $q + 1$ white noise variables.
- The MA(q) process is $q$-correlated, since $X_t$ and $X_{t+h}$ are uncorrelated for all lags $h > q$.
- Using the backward shift operator $B^j Z_t = Z_{t-j}$, we can write $X_t = \theta(B) Z_t$, where
  $$\theta(z) = 1 + \theta_1 z + \theta_2 z^2 + \cdots + \theta_q z^q.$$
- The MA(q) process provides a very rich family of models. In fact, distinctly different processes can be generated from different parameters (see the example below).

**Theorem 1.** If $\{X_t\}$ is a stationary $q$-correlated time series with zero mean, then it can be represented as an MA(q) process.

## Simulation of MA(2) Processes

- To simulate an MA(2) with $\theta_1 = 0.45$ and $\theta_2 = 0.45$ in R, use

  ```r
  ts.sim <- arima.sim(n = 200, list(ma = c(0.45, 0.45)), sd = 1)
  ```

- [Figure: time plots of two simulated MA(2) series along with their sample ACFs, for two different settings of $(\theta_1, \theta_2)$; the exact parameter signs are garbled in the source.]
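The notes use R's `arima.sim` for the plots above. A minimal sketch in Python with NumPy (an assumption; any environment with a normal random generator works) simulates an MA(q) by convolving iid noise with the filter coefficients $(\theta_0, \ldots, \theta_q)$ and estimates the sample ACF, whose cut-off beyond lag $q$ mirrors the plots described above:

```python
import numpy as np

def simulate_ma(theta, n, sigma=1.0, seed=None):
    """Simulate n values of the MA(q) process X_t = sum_{j=0}^q theta_j Z_{t-j} (theta_0 = 1)."""
    rng = np.random.default_rng(seed)
    q = len(theta)
    z = rng.normal(0.0, sigma, size=n + q)            # q extra warm-up noise values
    coeffs = np.concatenate(([1.0], np.asarray(theta, float)))
    return np.convolve(z, coeffs)[q:n + q]            # each X_t uses a full window of noise

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(0..max_lag), using the (biased) 1/n ACVF estimator."""
    x = x - x.mean()
    n = len(x)
    acvf = np.array([(x[:n - h] * x[h:]).sum() / n for h in range(max_lag + 1)])
    return acvf / acvf[0]

x = simulate_ma([0.45, 0.45], n=200, seed=0)
rho = sample_acf(x, 5)          # the theoretical ACF of this MA(2) is zero beyond lag 2
```

With a long simulated series, `rho[1]` and `rho[2]` settle near the theoretical values $(\theta_1 + \theta_1\theta_2)/(1 + \theta_1^2 + \theta_2^2)$ and $\theta_2/(1 + \theta_1^2 + \theta_2^2)$, while higher lags hover near zero.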
- Note the distinct qualitative differences between the simulated series.
- Note also the sharp cut-off of the ACF at lag 2.

## Linear Processes

- A set of real constants $\{\psi_j, j \in \mathbb{Z}\}$ is *absolutely summable* if
  $$\sum_{j=-\infty}^{\infty} |\psi_j| < \infty.$$
- The time series $\{X_t\}$ is a *linear process* if it has the representation
  $$X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j} \quad \forall t,$$
  where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $\{\psi_j\}$ is absolutely summable.
- $X_t$ is therefore expressed as a linear combination of all the noise terms: past, current, and future.
- Absolute summability guarantees that the infinite sum converges.
- Using the backward shift operator $B^j X_t = X_{t-j}$, we can write $X_t = \psi(B) Z_t$, where $\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j$.
- The operator $\psi(B)$ is a sort of linear filter, taking the white noise $\{Z_t\}$ as input and delivering the stationary series $\{X_t\}$ as output.
- A linear process is called a moving average, or MA(∞), if $\psi_j = 0$ for all $j < 0$, i.e.,
  $$X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}.$$

## Aspects of Linear Processes

- Every second-order stationary process
  - is a linear process, or
  - can be transformed to a linear process,
  by Wold's decomposition (discussed later).
- The class of linear processes provides a general framework for studying stationary processes.

**Proposition 1.** Let $\{Y_t\}$ be a stationary time series with mean 0 and covariance function $\gamma_Y$. If $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$, then the time series
$$X_t = \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} = \psi(B) Y_t$$
is stationary with mean 0 and autocovariance function
$$\gamma_X(h) = \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k\, \gamma_Y(h + k - j).$$
In the special case where $\{X_t\}$ is a linear process,
$$\gamma_X(h) = \sum_{j=-\infty}^{\infty} \psi_j \psi_{j+h}\, \sigma^2.$$

Intuitively, the above proposition states that a linear filter, when applied to any stationary time series as input, produces a stationary time series as output.

## An AR(1) Process

- The definition of the AR(1) process, as we know it, is
  $$X_t = \phi X_{t-1} + Z_t, \quad (1)$$
  where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, $|\phi| < 1$, and $Z_t$ is uncorrelated with $X_s$ for each $s < t$.
- The solution to the AR(1) equation exists and is unique.
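One can check numerically that the geometric filter $X_t = \sum_{j \ge 0} \phi^j Z_{t-j}$ solves the AR(1) equation. A sketch in Python with NumPy (an assumption; the notes use R) builds a truncated version of the infinite sum and verifies that the recursion residual $X_t - \phi X_{t-1} - Z_t$ vanishes up to truncation error:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, n, tail = 0.6, 400, 200          # tail = truncation depth for the infinite sum

z = rng.normal(size=n + tail)

# Truncated MA(infinity) candidate solution: X_t = sum_{j=0}^{tail-1} phi^j Z_{t-j}
w = phi ** np.arange(tail)
x = np.array([w @ z[t:t - tail:-1] for t in range(tail, n + tail)])

# Residual of the AR(1) recursion X_t = phi X_{t-1} + Z_t; it is O(phi**tail), i.e. negligible
resid = x[1:] - phi * x[:-1] - z[tail + 1:n + tail]
```

The residual is zero up to floating-point noise, illustrating why the series in (2) below is the stationary solution when $|\phi| < 1$.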
- Consider the linear process
  $$X_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j}. \quad (2)$$
- We know that $\{\psi_j = \phi^j, j \ge 0\}$ is absolutely summable, since $|\phi| < 1$.
- It is easy to verify that the solution to (1) is given by (2), and the ACVF is
  $$\gamma(h) = \sum_{j=0}^{\infty} \phi^{h+j} \phi^j \sigma^2 = \frac{\sigma^2 \phi^h}{1 - \phi^2}$$
  for $h \ge 0$.
- Exercise: prove that (2) is the only solution to the AR(1) equation.
- Using the backward shift operator $B^j X_t = X_{t-j}$ and the operator
  $$\pi(B) = \sum_{j=0}^{\infty} \phi^j B^j,$$
  the solution (2) to (1) is easily derived as
  $$X_t = \pi(B) Z_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j}.$$

## Solution to the AR(1) Equation When |φ| > 1

- If $|\phi| > 1$, then the series (2) does not converge. However,
  $$X_t = -\phi^{-1} Z_{t+1} + \phi^{-1} X_{t+1}.$$
- Therefore,
  $$X_t = -\sum_{j=1}^{\infty} \phi^{-j} Z_{t+j} \quad (3)$$
  is the unique stationary solution to the AR(1) equation.
- The $X_t$ in equation (3) are correlated with the future values $\{Z_s, s > t\}$ of the noise process.
- This is the "unnatural" solution to the AR(1) equation, and it is not used in practice. Hence the restriction in practice to AR(1) processes with $|\phi| < 1$.

### Some Remarks

- Every AR(1) with $|\phi| > 1$ can be re-expressed as an AR(1) process with $|\phi| < 1$ and a new white noise sequence.
  - Nothing is therefore lost by restricting attention to AR(1) processes with $|\phi| < 1$.
- The AR(1) with $|\phi| < 1$ is said to be a *causal* or *future-independent* function of $\{Z_t\}$, since $X_t$ has a representation in terms of only $\{Z_s, s \le t\}$: the past and present noise values.
- If $\phi = \pm 1$, there is no stationary solution to the AR(1) equation.

## Sample Mean and Autocorrelation Functions

We are given a set of observations $X_1, \ldots, X_n$.

- The sample mean $\bar{X}_n = n^{-1}(X_1 + X_2 + \cdots + X_n)$ is an unbiased estimator of $\mu$.
- The mean squared error of $\bar{X}_n$ is
  $$E(\bar{X}_n - \mu)^2 = \frac{1}{n} \sum_{|h| < n} \left(1 - \frac{|h|}{n}\right) \gamma(h). \quad (4)$$

**Proposition 2.** If $\{X_t\}$ is a stationary time series with mean $\mu$ and autocovariance function $\gamma(\cdot)$, then as $n \to \infty$,
$$V(\bar{X}_n) \to 0 \quad \text{if } \gamma(n) \to 0,$$
and
$$n\, V(\bar{X}_n) \to \sum_{|h| < \infty} \gamma(h) \quad \text{if } \sum_{h} |\gamma(h)| < \infty.$$

- What is the sampling distribution of $\bar{X}_n$? Approximately,
  $$\bar{X}_n \sim N\!\left(\mu, \frac{v}{n}\right), \quad \text{where } v = \sum_{|h| < n} \left(1 - \frac{|h|}{n}\right) \gamma(h).$$
- An approximate 95% confidence interval for $\mu$ is then
  $$\left(\bar{X}_n - 1.96\, v^{1/2} / \sqrt{n},\;\; \bar{X}_n + 1.96\, v^{1/2} / \sqrt{n}\right).$$
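Formula (4) and the resulting confidence-interval half-width can be evaluated directly once $\gamma(\cdot)$ is known. A hedged Python/NumPy sketch (the notes use R; the function names here are illustrative, not from the notes):

```python
import numpy as np

def mse_sample_mean(gamma, n):
    """Formula (4): E(Xbar_n - mu)^2 = (1/n) sum_{|h|<n} (1 - |h|/n) gamma(h).
    `gamma` is a callable returning the ACVF at an integer lag."""
    h = np.arange(-(n - 1), n)
    vals = np.array([gamma(abs(k)) for k in h])
    return float(np.sum((1 - np.abs(h) / n) * vals)) / n

# AR(1) example: gamma(h) = sigma^2 phi^|h| / (1 - phi^2); the lag sum tends to sigma^2/(1-phi)^2
phi, sigma2 = 0.5, 1.0
gamma = lambda h: sigma2 * phi ** abs(h) / (1 - phi ** 2)
n = 2000
v = n * mse_sample_mean(gamma, n)           # close to sigma^2 / (1 - phi)^2 = 4 here
half_width = 1.96 * np.sqrt(v / n)          # approximate 95% CI: xbar +/- half_width
```

For iid noise the formula collapses to the familiar $\gamma(0)/n$, which makes a convenient sanity check.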
## Examples of Confidence Intervals for μ

Let $\{X_t\}$ be an AR(1) process with mean $\mu$, defined by the equations
$$X_t - \mu = \phi(X_{t-1} - \mu) + Z_t,$$
where $|\phi| < 1$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$. Clearly, for an AR(1), we have $\gamma(h) = \gamma(0)\,\phi^{|h|}$, and hence, summing over all lags,
$$v \approx \sum_{|h| < \infty} \gamma(h) = \gamma(0)\,\frac{1 + \phi}{1 - \phi} = \frac{\sigma^2}{(1 - \phi)^2}.$$

## Estimation of γ and ρ

- The sample ACVF and sample ACF are
  $$\hat{\gamma}(h) = \frac{1}{n} \sum_{t=1}^{n - |h|} (X_{t+|h|} - \bar{X}_n)(X_t - \bar{X}_n) \quad \text{and} \quad \hat{\rho}(h) = \frac{\hat{\gamma}(h)}{\hat{\gamma}(0)}.$$
- Both $\hat{\gamma}$ and $\hat{\rho}$ are biased, even if the divisor $n$ is replaced by $n - h$.
- For $k \ge 1$, the sample covariance matrix
  $$\hat{\Gamma}_k = \begin{pmatrix} \hat{\gamma}(0) & \hat{\gamma}(1) & \cdots & \hat{\gamma}(k-1) \\ \hat{\gamma}(1) & \hat{\gamma}(0) & \cdots & \hat{\gamma}(k-2) \\ \vdots & \vdots & \ddots & \vdots \\ \hat{\gamma}(k-1) & \hat{\gamma}(k-2) & \cdots & \hat{\gamma}(0) \end{pmatrix}$$
  is nonnegative definite.

## Sampling Distribution of ρ̂

- Bartlett's formula:
  $$w_{ij} = \sum_{k=1}^{\infty} \left[\rho(k+i) + \rho(k-i) - 2\rho(i)\rho(k)\right]\left[\rho(k+j) + \rho(k-j) - 2\rho(j)\rho(k)\right].$$
- For large $n$, the vector $\hat{\rho}_k = (\hat{\rho}(1), \ldots, \hat{\rho}(k))^{\mathsf T}$ is approximately normally distributed:
  $$\hat{\rho}_k \approx N\!\left(\rho_k,\, n^{-1} W\right),$$
  where $W = [w_{ij}]$ is obtained from Bartlett's formula.

### Examples

- The iid noise.
- An MA(1) process.
- An AR(1) process.

## Introduction to ARMA Processes

- The time series $\{X_t\}$ is an ARMA(1,1) process if it is stationary and satisfies, for every $t$,
  $$X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}, \quad (5)$$
  where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $\phi + \theta \neq 0$.
- With the backward shift operator $B$, the ARMA equation becomes
  $$\phi(B) X_t = \theta(B) Z_t, \quad (6)$$
  where $\phi(\cdot)$ and $\theta(\cdot)$ are linear filters defined by $\phi(B) = 1 - \phi B$ and $\theta(B) = 1 + \theta B$.
- What are the $\phi$ and $\theta$ for which a solution to equation (5) exists?

## Construction of ARMA(1,1) When |φ| < 1

- Assume that $|\phi| < 1$, and consider the power series expansion
  $$\chi(z) = \sum_{j=0}^{\infty} \phi^j z^j = \frac{1}{1 - \phi z}.$$
  Since $|\phi| < 1$, the coefficients $\phi^j$ of $\chi(z)$ are absolutely summable.
- Now define the filtering operator
  $$\chi(B) = \sum_{j=0}^{\infty} \phi^j B^j, \quad \text{so that} \quad \chi(B)\phi(B) = 1.$$
- Our ARMA(1,1) process of equation (6) is rewritten as $X_t = \chi(B)\theta(B) Z_t = \psi(B) Z_t$, where
  $$\psi(B) = (1 + \phi B + \phi^2 B^2 + \cdots)(1 + \theta B) = 1 + (\phi + \theta)B + (\phi + \theta)\phi B^2 + (\phi + \theta)\phi^2 B^3 + \cdots = \sum_{j=0}^{\infty} \psi_j B^j,$$
  with $\psi_0 = 1$ and $\psi_j = (\phi + \theta)\phi^{j-1}$ for $j \ge 1$.
- We conclude that the MA(∞) process
  $$X_t = Z_t + (\phi + \theta) \sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j} \quad (7)$$
  is the unique stationary solution of (5).
- Since each $X_t$ can be expressed in terms of the current and past values $\{Z_s, s \le t\}$, $\{X_t\}$ is said to be *causal*, or more precisely a causal function of $\{Z_t\}$.
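The closed form $\psi_0 = 1$, $\psi_j = (\phi + \theta)\phi^{j-1}$ can be confirmed mechanically by multiplying out the two power series. A small Python/NumPy sketch (an assumption; the notes work in R) compares the truncated product $\chi(z)\theta(z)$ against the closed form:

```python
import numpy as np

phi, theta, N = 0.9, 0.5, 30

# Coefficients of chi(z) * theta(z) = (1 + phi z + phi^2 z^2 + ...)(1 + theta z), up to degree N
chi = phi ** np.arange(N + 1)
psi = np.convolve(chi, [1.0, theta])[: N + 1]

# Closed form from the notes: psi_0 = 1, psi_j = (phi + theta) phi^(j-1) for j >= 1
closed = np.concatenate(([1.0], (phi + theta) * phi ** np.arange(N)))
```

Truncating at degree $N$ is harmless here, because the coefficient of $z^j$ in the product only involves terms of degree at most $j$.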
## Construction of ARMA(1,1) When |φ| > 1

- Show that with $|\phi| > 1$, we have
  $$\frac{1}{1 - \phi z} = -\sum_{j=1}^{\infty} \phi^{-j} z^{-j}.$$
- Define the new operator
  $$\chi(B) = -\sum_{j=1}^{\infty} \phi^{-j} B^{-j}.$$
- Applying the same approach as earlier to $X_t = \chi(B)\theta(B) Z_t$, we get
  $$X_t = -\theta \phi^{-1} Z_t - (\phi + \theta) \sum_{j=1}^{\infty} \phi^{-j-1} Z_{t+j} \quad (8)$$
  as the unique stationary solution to (5) when $|\phi| > 1$.
- This solution is *noncausal*, since $X_t$ is then a function of $\{Z_s, s \ge t\}$.

**Remark.** If $\phi = \pm 1$, there is no stationary solution of (5). Consequently, there is no such thing as an ARMA(1,1) process with $\phi = \pm 1$, according to our definition. More formally:

**Fact 1.** A stationary solution of the ARMA(1,1) equations exists if and only if $\phi \neq \pm 1$.

## Simulation of ARMA(1,1) Processes

- To simulate an ARMA(1,1) with $\phi = 0.9$ and $\theta = 0.5$ in R, use

  ```r
  ts.sim <- arima.sim(n = 200, list(ar = 0.9, ma = 0.5), sd = 1)
  ```

- [Figure: time plots of four simulated ARMA(1,1) series, one for each sign combination of $\phi = \pm 0.9$ and $\theta = \pm 0.5$.]
- Note the distinct qualitative differences.

## ACF Plots of the ARMA(1,1) Processes

- To plot the ACF of the simulated ARMA(1,1) in R, use

  ```r
  acf(ts.sim, col = "green",
      main = "ACF of ARMA(1,1) with phi = 0.9 and theta = 0.5")
  ```

- [Figure: ACF plots of the simulated ARMA(1,1) series for the same four sign combinations of $\phi$ and $\theta$.]
- Again, note the distinct qualitative differences.

## Invertibility of ARMA(1,1) Processes

**Definition 1.** The process $\{X_t\}$ is *invertible* if $Z_t$ is expressible in terms of $\{X_s, s \le t\}$.

**Fact 2.** The ARMA(1,1) process of (5) is invertible if $|\theta| < 1$.
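Fact 2 can be demonstrated numerically: when $|\theta| < 1$, the noise is recovered from present and past observations by $Z_t = X_t - (\phi + \theta)\sum_{j \ge 1}(-\theta)^{j-1} X_{t-j}$ (the expansion derived just below). A Python/NumPy sketch (an assumption; the notes use R) simulates the ARMA(1,1) recursion, applies a truncated version of this inversion, and compares the result with the true noise:

```python
import numpy as np

phi, theta, n, burn = 0.9, 0.5, 500, 200
rng = np.random.default_rng(0)
z = rng.normal(size=n + burn)

# Simulate X_t = phi X_{t-1} + Z_t + theta Z_{t-1}, discarding a burn-in stretch
x = np.zeros(n + burn)
for t in range(1, n + burn):
    x[t] = phi * x[t - 1] + z[t] + theta * z[t - 1]

# Truncated inversion (valid since |theta| < 1): error is of order theta**K
K = 60
def invert(t):
    j = np.arange(1, K + 1)
    return x[t] - (phi + theta) * np.sum((-theta) ** (j - 1) * x[t - j])

recovered = np.array([invert(t) for t in range(burn, n + burn)])
# recovered matches z[burn:] up to truncation and round-off error
```

Because the weights $(-\theta)^{j-1}$ decay geometrically, a modest truncation depth already recovers the innovations essentially exactly.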
- Assume that $|\theta| < 1$. Then the power series expansion
  $$\frac{1}{\theta(z)} = \frac{1}{1 + \theta z} = \sum_{j=0}^{\infty} (-\theta)^j z^j$$
  allows us to rewrite the ARMA(1,1) equation (5) as
  $$\sum_{j=0}^{\infty} (-\theta)^j B^j \left(X_t - \phi X_{t-1}\right) = Z_t.$$
- A little work reveals that
  $$Z_t = X_t - (\phi + \theta) \sum_{j=1}^{\infty} (-\theta)^{j-1} X_{t-j}.$$
- The ARMA(1,1) process is invertible, since each $Z_t$ can be expressed in terms of the present and past values $\{X_s, s \le t\}$ of the process.

**Fact 3.** The ARMA(1,1) process of (5) is noninvertible if $|\theta| > 1$, since
$$Z_t = -\phi \theta^{-1} X_t + (\phi + \theta) \sum_{j=1}^{\infty} (-\theta)^{-j-1} X_{t+j},$$
i.e., $Z_t$ is a function of $\{X_s, s \ge t\}$ when $|\theta| > 1$.

## Additional Important Aspects of ARMA(1,1)

- Causality and invertibility guide the determination of the admissible region for the values of the parameters $\phi$ and $\theta$, which is the square
  $$S = \{(\phi, \theta) : -1 < \phi < 1,\; -1 < \theta < 1\}.$$
- If $\theta = \pm 1$, then the ARMA(1,1) process is still invertible in a weaker sense, since at least $Z_t$ is the mean square limit of finite linear combinations of $\{X_s, s \le t\}$.
  - Note that in this case, $Z_t$ is not expressed as an infinite linear combination of $\{X_s, s \le t\}$.
- Simple form of invertibility: the process $\{X_t\}$ is invertible if
  $$Z_t = \sum_{j=0}^{\infty} \pi_j X_{t-j},$$
  where $\sum_{j=0}^{\infty} |\pi_j| < \infty$. This is the definition that will be used from now on throughout the course.
- If the ARMA(1,1) process $\{X_t\}$ is noncausal or noninvertible with $|\theta| > 1$, then it is possible to find a new white noise process such that $\{X_t\}$ is a causal and invertible ARMA(1,1) process relative to it.
  - Nothing is lost by restricting attention to causal and invertible ARMA(1,1) processes, in a second-order sense.
  - This will turn out to be true also for higher-order ARMA models.
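Causality and invertibility fit together neatly: the causal weights come from $\theta(z)/\phi(z)$ and the invertibility weights from $\phi(z)/\theta(z)$, so the two filters are inverses of each other. A Python/NumPy sketch (an assumption; the notes use R) checks that convolving the $\psi$- and $\pi$-weights of an ARMA(1,1) yields the identity filter $(1, 0, 0, \ldots)$:

```python
import numpy as np

phi, theta, N = 0.9, 0.5, 60

# Causal weights psi_j (X_t in terms of Z's) and invertibility weights pi_j (Z_t in terms of X's)
psi = np.concatenate(([1.0], (phi + theta) * phi ** np.arange(N)))
pi  = np.concatenate(([1.0], -(phi + theta) * (-theta) ** np.arange(N)))

# psi(z) pi(z) = [theta(z)/phi(z)] [phi(z)/theta(z)] = 1, so the convolution is (1, 0, 0, ...)
conv = np.convolve(psi, pi)[: N + 1]
```

The low-order coefficients of the convolution are exact, since the coefficient of $z^j$ only involves weights up to index $j$.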
## Forecasting Stationary Time Series

- Let $\{X_t\}$ be a stationary time series with known mean $\mu$ and autocovariance function $\gamma$.
- Consider predicting the values of $X_{n+h}$, $h > 0$, in terms of the values $X_n, X_{n-1}, \ldots, X_1$.
- Goal: find the linear combination of $1, X_n, X_{n-1}, \ldots, X_1$ that forecasts $X_{n+h}$ with minimum mean squared error.
- This best linear predictor is denoted $P_n X_{n+h}$, whose form is
  $$P_n X_{n+h} = a_0 + a_1 X_n + a_2 X_{n-1} + \cdots + a_n X_1 = a_0 + \sum_{i=1}^{n} a_i X_{n+1-i}.$$
- Find the coefficients $a_0, a_1, \ldots, a_n$ that minimize
  $$S(a_0, a_1, \ldots, a_n) = E\left[\left(X_{n+h} - a_0 - a_1 X_n - a_2 X_{n-1} - \cdots - a_n X_1\right)^2\right].$$
- Setting the derivative with respect to $a_0$ to 0 gives
  $$E\left[X_{n+h} - a_0 - \sum_{i=1}^{n} a_i X_{n+1-i}\right] = 0, \quad (9)$$
  from which we derive
  $$a_0 = \mu\left(1 - \sum_{i=1}^{n} a_i\right). \quad (10)$$
  - Exercise: establish equation (10).
- Setting the derivative with respect to $a_j$, $j = 1, \ldots, n$, to 0 gives
  $$E\left[\left(X_{n+h} - a_0 - \sum_{i=1}^{n} a_i X_{n+1-i}\right) X_{n+1-j}\right] = 0, \quad (11)$$
  from which we derive
  $$\Gamma_n a_n = \gamma_n(h), \quad (12)$$
  where $a_n = (a_1, \ldots, a_n)^{\mathsf T}$, $\gamma_n(h) = (\gamma(h), \gamma(h+1), \ldots, \gamma(h+n-1))^{\mathsf T}$, and $\Gamma_n = [\gamma(i-j)]_{i,j=1}^{n}$.
  - Exercise: establish equation (12).
- From all the above, we get the best linear predictor
  $$P_n X_{n+h} = \mu + \sum_{i=1}^{n} a_i (X_{n+1-i} - \mu), \quad (13)$$
  where the vector $a_n$ satisfies equation (12).
- The mean squared error of the above best linear predictor is
  $$E\left[\left(X_{n+h} - P_n X_{n+h}\right)^2\right] = \gamma(0) - 2\sum_{j=1}^{n} a_j \gamma(h + j - 1) + \sum_{i=1}^{n}\sum_{j=1}^{n} a_i\, \gamma(i - j)\, a_j \quad (14)$$
  $$= \gamma(0) - a_n^{\mathsf T} \gamma_n(h). \quad (15)$$
- The best linear predictor found above is unique, and its two determining equations (9) and (11) can be summarized succinctly as
  $$E[\text{Error} \times \text{PredictorVariable}] = 0.$$

## One-Step Prediction of an AR(1) Series

- Recall that the equation for an AR(1) is
  $$X_t = \phi X_{t-1} + Z_t, \quad t = 0, \pm 1, \pm 2, \ldots,$$
  where $|\phi| < 1$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$.
- Guessing and checking:
  - $E[Z_{n+1}] = 0$ and $E[Z_{n+1} X_j] = 0$ for $j \le n$.
  - The best one-step linear predictor for an AR(1) is therefore $P_n X_{n+1} = \phi X_n$.
- Using the method formally:
  - The best linear predictor of $X_{n+1}$ in terms of $1, X_n, \ldots, X_1$ is $P_n X_{n+1} = a_n^{\mathsf T} X_n$.
  - Reminder: for an AR(1), we know that the ACVF is $\gamma(h) = \sigma^2 \phi^{|h|} / (1 - \phi^2)$.
  - The vector $a_n$ such that $\Gamma_n a_n = \gamma_n(1)$ will satisfy
    $$\begin{pmatrix} 1 & \phi & \phi^2 & \cdots & \phi^{n-1} \\ \phi & 1 & \phi & \cdots & \phi^{n-2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \phi^{n-1} & \phi^{n-2} & \phi^{n-3} & \cdots & 1 \end{pmatrix} \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \begin{pmatrix} \phi \\ \phi^2 \\ \vdots \\ \phi^n \end{pmatrix}. \quad (16)$$
  - It is straightforward to see that $a_n = (\phi, 0, \ldots, 0)^{\mathsf T}$ is a solution to (16), and hence, for an AR(1), the best linear predictor is simply
    $$P_n X_{n+1} = a_n^{\mathsf T} X_n = \phi X_n,$$
    with mean squared error
    $$E\left[\left(X_{n+1} - P_n X_{n+1}\right)^2\right] = \sigma^2.$$
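The formal route can be carried out numerically: build $\Gamma_n$ and $\gamma_n(1)$ from the AR(1) ACVF, solve the linear system, and confirm that the solution collapses to $(\phi, 0, \ldots, 0)^{\mathsf T}$ with one-step MSE $\sigma^2$. A Python/NumPy sketch (an assumption; the notes use R):

```python
import numpy as np

phi, sigma2, n = 0.7, 1.0, 8
gamma = lambda h: sigma2 * phi ** abs(h) / (1 - phi ** 2)   # AR(1) ACVF

# The system Gamma_n a_n = gamma_n(1) from the notes
Gamma = np.array([[gamma(i - j) for j in range(n)] for i in range(n)])
g1 = np.array([gamma(h) for h in range(1, n + 1)])
a = np.linalg.solve(Gamma, g1)

mse = gamma(0) - a @ g1       # gamma(0) - a_n' gamma_n(1)
# a collapses to (phi, 0, ..., 0) and mse equals sigma^2
```

This mirrors the hand calculation: only the most recent observation carries weight for one-step AR(1) prediction.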
## Prediction of Second-Order Random Variables

- Let $Y$ and $W_n, \ldots, W_1$ be any random variables.
- Assume $E Y = \mu_Y$, $E W = \mu_W$, $\mathrm{cov}(Y, Y)$, $\mathrm{cov}(Y, W)$, and $\mathrm{cov}(W, W)$ are all known, and $E Y^2 < \infty$ and $E W_i^2 < \infty$.
- Define the notations $W = (W_n, \ldots, W_1)^{\mathsf T}$ and
  $$\gamma = \left(\mathrm{cov}(Y, W_n), \mathrm{cov}(Y, W_{n-1}), \ldots, \mathrm{cov}(Y, W_1)\right)^{\mathsf T}.$$
- Let the covariance matrix be
  $$\Gamma = \mathrm{cov}(W, W) = \left[\mathrm{cov}(W_{n+1-i}, W_{n+1-j})\right]_{i,j=1}^{n}.$$
- The best linear predictor of $Y$ in terms of $1, W_n, \ldots, W_1$ is
  $$P(Y \mid W) = \mu_Y + a^{\mathsf T}(W - \mu_W), \quad (17)$$
  where
  $$\Gamma a = \gamma. \quad (18)$$
- The mean squared error of the predictor is
  $$E\left[\left(Y - P(Y \mid W)\right)^2\right] = \mathrm{Var}(Y) - a^{\mathsf T} \gamma.$$

## Estimating a Missing Value

- Suppose we observe $X_1$ and $X_3$ of an AR(1) series and wish to estimate the missing value $X_2$. We set $W = (X_1, X_3)^{\mathsf T}$, and we have
  $$\Gamma = \gamma(0) \begin{pmatrix} 1 & \phi^2 \\ \phi^2 & 1 \end{pmatrix}, \qquad \gamma = \gamma(0) \begin{pmatrix} \phi \\ \phi \end{pmatrix}.$$
- The best estimator of $X_2$ is thus
  $$P(X_2 \mid W) = \frac{\phi}{1 + \phi^2}\,(X_1 + X_3).$$
- The corresponding mean squared error is $\sigma^2 / (1 + \phi^2)$.

## The Prediction Operator P(· | W)

- Given $W = (W_n, \ldots, W_1)^{\mathsf T}$ and $Y$ with finite second moments, the function $P(\cdot \mid W)$ is called a *prediction operator*.
  - The best linear predictor $P_n$ is a prediction operator with $W = (X_n, X_{n-1}, \ldots, X_1)^{\mathsf T}$.
- Prediction operators can sometimes be used to simplify the calculation of best linear predictors. Suppose that $E U^2 < \infty$, $E V^2 < \infty$, $\Gamma = \mathrm{cov}(W, W)$, and $\alpha_1, \ldots, \alpha_n, \beta$ are constants. Then:
  - $P(U \mid W) = E U + a^{\mathsf T}(W - E W)$, where $\Gamma a = \mathrm{cov}(U, W)$;
  - $E\left[(U - P(U \mid W))\, W\right] = 0$ and $E\left[U - P(U \mid W)\right] = 0$;
  - $E\left[(U - P(U \mid W))^2\right] = \mathrm{Var}(U) - a^{\mathsf T} \mathrm{cov}(U, W)$;
  - $P(\alpha_1 U + \alpha_2 V + \beta \mid W) = \alpha_1 P(U \mid W) + \alpha_2 P(V \mid W) + \beta$;
  - $P\left(\sum_{i=1}^{n} \alpha_i W_i + \beta \,\middle|\, W\right) = \sum_{i=1}^{n} \alpha_i W_i + \beta$;
  - $P(U \mid W) = E U$ if $\mathrm{cov}(U, W) = 0$;
  - $P(U \mid W) = P(P(U \mid W, V) \mid W)$ if $V$ is a random vector such that the components of $E(V V^{\mathsf T})$ are all finite.

Exercise: an AR(1) series with nonzero mean.
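The missing-value example is a direct application of (17) and (18): with $W = (X_1, X_3)^{\mathsf T}$, solving the 2-by-2 system reproduces the weights $\phi/(1+\phi^2)$ and the MSE $\sigma^2/(1+\phi^2)$. A Python/NumPy sketch (an assumption; the notes use R):

```python
import numpy as np

phi, sigma2 = 0.6, 1.0
gamma = lambda h: sigma2 * phi ** abs(h) / (1 - phi ** 2)   # AR(1) ACVF

# W = (X1, X3): predict the missing X2 via Gamma a = gamma_vec, cf. (17)-(18)
Gamma = np.array([[gamma(0), gamma(2)],
                  [gamma(2), gamma(0)]])
gvec = np.array([gamma(1), gamma(1)])
a = np.linalg.solve(Gamma, gvec)       # both weights equal phi / (1 + phi^2)

mse = gamma(0) - a @ gvec              # equals sigma^2 / (1 + phi^2)
```

By symmetry the two observed neighbors receive identical weight, and the MSE is smaller than the one-step value $\sigma^2$, since the predictor also uses a future observation.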
## The Durbin-Levinson Algorithm

- Let $\{Y_t\}$ be a stationary time series with nonzero mean $\mu$.
- Let $X_t = Y_t - \mu$ for all $t$.
- The best linear predictor of $Y_{n+h}$ can be determined by finding the best linear predictor of $X_{n+h}$ and then adding $\mu$, i.e., $P_n Y_{n+h} = \mu + P_n X_{n+h}$.
- We can therefore restrict our focus to zero-mean series, making adjustments mutatis mutandis.
- If $\{X_t\}$ is a zero-mean stationary time series with ACVF $\gamma$, then the prediction equations (17) and (18) provide all the needed details for finding the best linear predictor $P_n X_{n+h}$ of $X_{n+h}$.
- However, the direct solution requires solving $\Gamma a = \gamma$, which is computationally intensive when $n$ gets large.
- It would be helpful if the one-step predictor $P_n X_{n+1}$, based on $n$ previous observations, could be used to simplify the calculation of $P_{n+1} X_{n+2}$, the one-step predictor based on $n + 1$ previous observations.
- Prediction algorithms based on this idea are said to be *recursive*.
- Let $\Gamma_n$ be the nonsingular covariance matrix, and write the one-step predictor as
  $$P_n X_{n+1} = \phi_{n1} X_n + \cdots + \phi_{nn} X_1,$$
  where $\phi_n = (\phi_{n1}, \ldots, \phi_{nn})^{\mathsf T} = \Gamma_n^{-1} \gamma_n$ with $\gamma_n = (\gamma(1), \ldots, \gamma(n))^{\mathsf T}$, and the mean squared error is
  $$v_n = E\left[\left(X_{n+1} - P_n X_{n+1}\right)^2\right] = \gamma(0) - \phi_n^{\mathsf T} \gamma_n.$$
- We need all the autocovariance matrices $\Gamma_1, \Gamma_2, \ldots$ to be nonsingular. A useful sufficient condition for this to be the case is $\gamma(0) > 0$ and $\gamma(h) \to 0$ as $h \to \infty$.

**The Durbin-Levinson Algorithm.**
$$\phi_{nn} = \left[\gamma(n) - \sum_{j=1}^{n-1} \phi_{n-1,j}\, \gamma(n - j)\right] v_{n-1}^{-1},$$
$$\begin{pmatrix} \phi_{n1} \\ \vdots \\ \phi_{n,n-1} \end{pmatrix} = \begin{pmatrix} \phi_{n-1,1} \\ \vdots \\ \phi_{n-1,n-1} \end{pmatrix} - \phi_{nn} \begin{pmatrix} \phi_{n-1,n-1} \\ \vdots \\ \phi_{n-1,1} \end{pmatrix},$$
$$v_n = v_{n-1}\left(1 - \phi_{nn}^2\right),$$
where $\phi_{11} = \gamma(1)/\gamma(0)$ and $v_0 = \gamma(0)$.

- The function defined by $\alpha(0) = 1$ and $\alpha(n) = \phi_{nn}$, $n \ge 1$, is known as the *partial autocorrelation function* (PACF) of $\{X_t\}$.
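The Durbin-Levinson recursion can be sketched directly from the three update equations above. The Python/NumPy implementation below (an assumption; the notes use R, and the function name is illustrative) also reads off the PACF as the sequence $\phi_{nn}$; for an AR(1) the PACF is $\phi$ at lag 1 and zero afterwards, and every one-step MSE past lag 0 equals $\sigma^2$:

```python
import numpy as np

def durbin_levinson(gamma):
    """Durbin-Levinson recursion for one-step prediction coefficients.
    gamma: ACVF values gamma(0), ..., gamma(m).
    Returns (phis, v, pacf): phis[n-1] = (phi_n1, ..., phi_nn), v = MSEs v_0..v_m,
    pacf[n-1] = phi_nn = alpha(n)."""
    gamma = np.asarray(gamma, float)
    m = len(gamma) - 1
    phis = [np.array([gamma[1] / gamma[0]])]
    v = [gamma[0], gamma[0] * (1 - phis[0][0] ** 2)]
    for n in range(2, m + 1):
        prev = phis[-1]
        phi_nn = (gamma[n] - prev @ gamma[n - 1:0:-1]) / v[-1]
        head = prev - phi_nn * prev[::-1]          # (phi_n1, ..., phi_n,n-1)
        phis.append(np.concatenate([head, [phi_nn]]))
        v.append(v[-1] * (1 - phi_nn ** 2))
    pacf = np.array([p[-1] for p in phis])
    return phis, np.array(v), pacf

# AR(1) check with phi = 0.8, sigma^2 = 1
p, s2 = 0.8, 1.0
g = s2 * p ** np.arange(11) / (1 - p ** 2)
phis, v, pacf = durbin_levinson(g)
```

Each step costs $O(n)$, so computing all predictors up to lag $m$ costs $O(m^2)$, versus $O(m^3)$ for repeatedly solving $\Gamma_n \phi_n = \gamma_n$ from scratch.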
## Description of the Innovations Algorithm

- This algorithm is applicable to all series with finite second moments, whether they are stationary or not.
- Let $\{X_t\}$ be a series with $E X_t = 0$ and $E|X_t|^2 < \infty$ for each $t$.
- Let $\kappa(i, j) = E[X_i X_j]$.
- Also let
  $$\hat{X}_n = \begin{cases} 0, & n = 1, \\ P_{n-1} X_n, & n = 2, 3, \ldots, \end{cases} \qquad v_n = E\left[\left(X_{n+1} - P_n X_{n+1}\right)^2\right].$$
- Let the *innovations*, or one-step errors, be $U_n = X_n - \hat{X}_n$.
- Now let $U_n = (U_1, \ldots, U_n)^{\mathsf T}$ and $X_n = (X_1, \ldots, X_n)^{\mathsf T}$. Then $U_n = A_n X_n$, where
  $$A_n = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \\ a_{11} & 1 & 0 & \cdots & 0 \\ a_{22} & a_{21} & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_{n-1,n-1} & a_{n-1,n-2} & a_{n-1,n-3} & \cdots & 1 \end{pmatrix}.$$
- $A_n$ is clearly nonsingular and has an inverse, say $C_n$, of the form
  $$C_n = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \\ \theta_{11} & 1 & 0 & \cdots & 0 \\ \theta_{22} & \theta_{21} & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \theta_{n-1,n-1} & \theta_{n-1,n-2} & \theta_{n-1,n-3} & \cdots & 1 \end{pmatrix}.$$
- The vector of one-step predictors $\hat{X}_n = (\hat{X}_1, P_1 X_2, \ldots, P_{n-1} X_n)^{\mathsf T}$ can be expressed as
  $$\hat{X}_n = X_n - U_n = C_n U_n - U_n = \Theta_n U_n = \Theta_n (X_n - \hat{X}_n),$$
  where
  $$\Theta_n = C_n - I = \begin{pmatrix} 0 & 0 & 0 & \cdots & 0 \\ \theta_{11} & 0 & 0 & \cdots & 0 \\ \theta_{22} & \theta_{21} & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \theta_{n-1,n-1} & \theta_{n-1,n-2} & \theta_{n-1,n-3} & \cdots & 0 \end{pmatrix}.$$
- The prediction equation is therefore rewritten as
  $$\hat{X}_{n+1} = \begin{cases} 0, & n = 0, \\ \displaystyle\sum_{j=1}^{n} \theta_{nj}\left(X_{n+1-j} - \hat{X}_{n+1-j}\right), & n = 1, 2, 3, \ldots \end{cases} \quad (19)$$

**The Innovations Algorithm.** For computing $\theta_{n1}, \ldots, \theta_{nn}$ recursively:
$$v_0 = \kappa(1, 1),$$
$$\theta_{n,n-k} = v_k^{-1}\left[\kappa(n+1, k+1) - \sum_{j=0}^{k-1} \theta_{k,k-j}\, \theta_{n,n-j}\, v_j\right], \quad 0 \le k < n,$$
$$v_n = \kappa(n+1, n+1) - \sum_{j=0}^{n-1} \theta_{n,n-j}^2\, v_j.$$

Some important remarks about the two algorithms:

- The Durbin-Levinson recursion gives the coefficients of $X_n, \ldots, X_1$ through
  $$\hat{X}_{n+1} = \sum_{j=1}^{n} \phi_{nj} X_{n+1-j}.$$
  - This algorithm is particularly well suited to forecasting AR(p) processes, since for them $\phi_{nj} = 0$ for $j > p$ when $n \ge p$.
- The innovations algorithm gives the coefficients of $X_n - \hat{X}_n, \ldots, X_1 - \hat{X}_1$ through
  $$\hat{X}_{n+1} = \sum_{j=1}^{n} \theta_{nj}\left(X_{n+1-j} - \hat{X}_{n+1-j}\right).$$
  - This algorithm is particularly well suited to forecasting MA(q) processes, since for them $\theta_{nj} = 0$ for $j > q$ when $n \ge q$.
- The innovations algorithm has several advantages due to the fact that the innovations are uncorrelated.

## Prediction From Infinitely Many Past Values

- Define the prediction operator
  $$\tilde{P}_n X_{n+h} = P(X_{n+h} \mid X_t,\ -\infty < t \le n).$$
- $\tilde{P}_n$ is referred to as the prediction operator based on the *infinite past* $\{X_t, -\infty < t \le n\}$.
- $P_n$ is referred to as the prediction operator based on the *finite past* $X_n, \ldots, X_1$.
- One approach is to assume that
  $$\tilde{P}_n X_{n+h} = \sum_{j=1}^{\infty} \alpha_j X_{n+1-j},$$
  in which case the determining equations reduce to
  $$E\left[\left(X_{n+h} - \sum_{j=1}^{\infty} \alpha_j X_{n+1-j}\right) X_{n+1-i}\right] = 0, \quad i = 1, 2, \ldots,$$
  or equivalently
  $$\sum_{j=1}^{\infty} \gamma(i - j)\, \alpha_j = \gamma(h + i - 1), \quad i = 1, 2, \ldots$$
  This is an infinite set of linear equations for the unknown coefficients $\alpha_j$ that determine $\tilde{P}_n X_{n+h}$, provided that the resulting series converges.

## Example of an ARMA(1,1)

Consider the following ARMA(1,1) series:
$$X_t = \phi X_{t-1} + Z_t + \theta Z_{t-1}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2),$$
so that
$$X_{n+1} = Z_{n+1} + (\phi + \theta) \sum_{j=1}^{\infty} \phi^{j-1} Z_{n+1-j}$$
and
$$Z_{n+1} = X_{n+1} - (\phi + \theta) \sum_{j=1}^{\infty} (-\theta)^{j-1} X_{n+1-j}.$$
Applying $\tilde{P}_n$ to the second equation gives
$$\tilde{P}_n X_{n+1} = (\phi + \theta) \sum_{j=1}^{\infty} (-\theta)^{j-1} X_{n+1-j},$$
and to the first equation,
$$\tilde{P}_n X_{n+1} = (\phi + \theta) \sum_{j=1}^{\infty} \phi^{j-1} Z_{n+1-j}.$$
Hence
$$X_{n+1} - \tilde{P}_n X_{n+1} = Z_{n+1},$$
so that the mean squared error of the predictor $\tilde{P}_n X_{n+1}$ is
$$E\left[Z_{n+1}^2\right] = \sigma^2.$$

## Deterministic Processes

- Let $\{X_t\}$ be a stationary time series. If $\{X_t\}$ has the property that
  $$X_{n+1} - \tilde{P}_n X_{n+1} = 0 \quad \forall n,$$
  then $\{X_t\}$ is said to be a *deterministic* process.
## The Wold Decomposition

**The Wold Decomposition.** If $\{X_t\}$ is a nondeterministic stationary time series, then
$$X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j} + V_t \quad \forall t, \quad (20)$$
where

1. $\psi_0 = 1$ and $\sum_{j=0}^{\infty} \psi_j^2 < \infty$;
2. $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$;
3. $\mathrm{cov}(Z_s, V_t) = 0$ for all $s$ and $t$;
4. $Z_t = \tilde{P}_t Z_t$ for all $t$;
5. $V_t = \tilde{P}_s V_t$ for all $s$ and $t$;
6. $\{V_t\}$ is deterministic.

- If the deterministic component $V_t$ is 0 for all $t$, the series is said to be *purely nondeterministic*. This is the case for most ARMA processes considered in this course.
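As a closing illustration, the innovations recursion from the earlier section can be sketched in Python with NumPy (an assumption; the notes use R, and the function name is illustrative). For an MA(1) the recursion is especially transparent: $\theta_{n,j} = 0$ for $j \ge 2$, while $\theta_{n,1} \to \theta$ and $v_n \to \sigma^2$ as $n$ grows:

```python
import numpy as np

def innovations(kappa, m):
    """Innovations algorithm for kappa(i, j) = E[X_i X_j] (1-based indices).
    Returns theta (theta[n, j] holds theta_{n,j} for j = 1..n) and v (MSEs v_0..v_m)."""
    v = np.zeros(m + 1)
    theta = np.zeros((m + 1, m + 1))
    v[0] = kappa(1, 1)
    for n in range(1, m + 1):
        for k in range(n):                       # ascending k: earlier entries are ready
            s = sum(theta[k, k - j] * theta[n, n - j] * v[j] for j in range(k))
            theta[n, n - k] = (kappa(n + 1, k + 1) - s) / v[k]
        v[n] = kappa(n + 1, n + 1) - sum(theta[n, n - j] ** 2 * v[j] for j in range(n))
    return theta, v

# MA(1) check: kappa(i, j) = gamma(i - j) with gamma(0) = s2 (1 + t^2), gamma(+/-1) = s2 t
t, s2 = 0.5, 1.0
def kappa(i, j):
    h = abs(i - j)
    return s2 * (1 + t * t) if h == 0 else (s2 * t if h == 1 else 0.0)

theta, v = innovations(kappa, 50)
```

The convergence $\theta_{n,1} \to \theta$, $v_n \to \sigma^2$ reflects the fact that, with more and more past data, finite-past prediction approaches the infinite-past predictor of the invertible MA(1).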
