# Applied Time Series Analysis ST 730


These 28 pages of class notes were uploaded by Jordane Kemmer on Thursday, October 15, 2015. The notes are for ST 730 at North Carolina State University, taught by Peter Bloomfield in the Fall semester.



#### Forecasting

- General problem: predict $x_{n+m}$ given $x_n, x_{n-1}, \dots, x_1$.
- General solution: the conditional distribution of $x_{n+m}$ given $x_n, x_{n-1}, \dots, x_1$.
- In particular, the conditional mean is the best predictor, i.e. it has minimum mean squared error.
- Special case: if $x_t$ is Gaussian, the conditional distribution is also Gaussian, with a conditional mean that is a linear function of $x_n, x_{n-1}, \dots, x_1$ and a conditional variance that does not depend on $x_n, x_{n-1}, \dots, x_1$.

#### Linear Forecasting

- What if $x_t$ is not Gaussian?
- Use the best *linear* predictor $x^n_{n+m}$.
- Not the best possible predictor, but computable.

#### One-step Prediction

- The hard way: suppose $x^n_{n+1} = \phi_{n1} x_n + \phi_{n2} x_{n-1} + \cdots + \phi_{nn} x_1$.
- Choose $\phi_{n1}, \phi_{n2}, \dots, \phi_{nn}$ to minimize the mean squared prediction error $E\left(x_{n+1} - x^n_{n+1}\right)^2$.
- Differentiate and equate to zero: $n$ linear equations in the $n$ unknowns.
- Solve recursively in $n$ using the Durbin–Levinson algorithm. Incidentally, the PACF is $\phi_{nn}$.

#### One-step Prediction for an ARMA Model

- The easy way: suppose we can write $x_{n+1} = {}$ (some linear combination of $x_n, x_{n-1}, \dots, x_1$) $+$ (something uncorrelated with $x_n, x_{n-1}, \dots, x_1$).
- Then the first part is the best linear predictor and the second part is the prediction error.
- E.g. AR($p$), $p \le n$:
  $$x_{n+1} = \underbrace{\phi_1 x_n + \phi_2 x_{n-1} + \cdots + \phi_p x_{n+1-p}}_{\text{first part}} + \underbrace{w_{n+1}}_{\text{second part}}.$$

#### General ARMA Case

- Now
  $$x_{n+1} = \phi_1 x_n + \cdots + \phi_p x_{n+1-p} + \theta_1 w_n + \theta_2 w_{n-1} + \cdots + \theta_q w_{n+1-q} + w_{n+1}.$$
- The first part on the right-hand side is a linear combination of $x_n, x_{n-1}, \dots, x_1$.
- The last part, $w_{n+1}$, is uncorrelated with $x_n, x_{n-1}, \dots, x_1$.
- The middle part: if the model is invertible, $w_t$ is a linear combination of $x_t, x_{t-1}, \dots$, so if $n$ is large we can truncate the sum at $x_1$, and $w_n, w_{n-1}, \dots, w_{n+1-q}$ are all approximately linear combinations of $x_n, x_{n-1}, \dots, x_1$.
- So the middle part is also approximately a linear combination of $x_n, x_{n-1}, \dots, x_1$, whence
  $$x^n_{n+1} = \phi_1 x_n + \phi_2 x_{n-1} + \cdots + \phi_p x_{n+1-p} + \theta_1 w_n + \theta_2 w_{n-1} + \cdots + \theta_q w_{n+1-q},$$
  and $w_{n+1}$ is the prediction error $x_{n+1} - x^n_{n+1}$.

#### Multi-step Prediction

- The easy way: build on one-step prediction. E.g. two-step:
  $$x_{n+2} = \phi_1 x_{n+1} + \phi_2 x_n + \cdots + \phi_p x_{n+2-p} + \theta_1 w_{n+1} + \theta_2 w_n + \cdots + \theta_q w_{n+2-q} + w_{n+2}.$$
- Replace $x_{n+1}$ by $x^n_{n+1} + w_{n+1}$:
  $$x_{n+2} = \phi_1 x^n_{n+1} + \phi_2 x_n + \cdots + \phi_p x_{n+2-p} + \theta_2 w_n + \cdots + \theta_q w_{n+2-q} + w_{n+2} + (\phi_1 + \theta_1) w_{n+1}.$$
- The first two parts are again approximately linear combinations of $x_n, x_{n-1}, \dots, x_1$, and
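The Durbin–Levinson recursion mentioned above is short enough to sketch directly. This is a minimal pure-Python illustration (not code from the course, which uses R and SAS), assuming the autocovariances $\gamma(0), \dots, \gamma(n)$ are known; for an AR(1) with $\phi = 0.5$ the PACF it produces is $0.5$ at lag 1 and zero at higher lags, as expected.

```python
def durbin_levinson(gamma, n):
    """Compute one-step prediction coefficients phi[k][j] from the
    autocovariances gamma[0..n] via the Durbin-Levinson recursion.
    The partial autocorrelation (PACF) at lag k is phi[k][k]."""
    phi = {1: {1: gamma[1] / gamma[0]}}
    v = gamma[0] * (1 - phi[1][1] ** 2)  # one-step prediction error variance
    for k in range(2, n + 1):
        num = gamma[k] - sum(phi[k - 1][j] * gamma[k - j] for j in range(1, k))
        phi_kk = num / v
        # update the remaining coefficients from the previous order
        phi[k] = {j: phi[k - 1][j] - phi_kk * phi[k - 1][k - j] for j in range(1, k)}
        phi[k][k] = phi_kk
        v *= 1 - phi_kk ** 2
    return phi

# AR(1) with phi = 0.5: gamma(h) is proportional to 0.5**h
gamma = [0.5 ** h for h in range(6)]
phi = durbin_levinson(gamma, 5)
pacf = [phi[k][k] for k in range(1, 6)]
print(pacf)  # 0.5 at lag 1, (numerically) zero at higher lags
```

The best one-step predictor of order $n$ is then $\sum_j \texttt{phi[n][j]}\, x_{n+1-j}$, and the running variable `v` is its mean squared prediction error.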
  the last is uncorrelated with $x_n, x_{n-1}, \dots, x_1$, so
  $$x^n_{n+2} = \phi_1 x^n_{n+1} + \phi_2 x_n + \cdots + \phi_p x_{n+2-p} + \theta_2 w_n + \cdots + \theta_q w_{n+2-q},$$
  and the prediction error is $x_{n+2} - x^n_{n+2} = w_{n+2} + (\phi_1 + \theta_1) w_{n+1}$.
- Note that the mean squared prediction error is $\sigma_w^2\left[1 + (\phi_1 + \theta_1)^2\right] \ge \sigma_w^2$: the mean squared prediction error increases as we predict further into the future.

#### Forecasting with proc arima

- E.g. the fishery recruitment data: proc arima program and output.
- Note that the predictions approach the series mean, and the standard errors approach the series standard deviation.
- The autocorrelation test for residuals is borderline, largely because of residual autocorrelations at lags 12, 24.
- Spectrum analysis shows that these are caused by seasonal means, which can be removed: proc arima program and output.

#### Multiple Series

- Jointly stationary series $x_t$ and $y_t$ have cross-covariances
  $$\gamma_{xy}(h) = E\left[(x_{t+h} - \mu_x)(y_t - \mu_y)\right].$$
- The cross-spectral density is
  $$f_{xy}(\omega) = \sum_{h=-\infty}^{\infty} \gamma_{xy}(h) e^{-2\pi i \omega h},$$
  and
  $$\gamma_{xy}(h) = \int_{-1/2}^{1/2} f_{xy}(\omega) e^{2\pi i \omega h}\, d\omega.$$
- The cross-spectral density is complex: $f_{xy}(\omega) = c_{xy}(\omega) - i\, q_{xy}(\omega)$, where $c_{xy}(\omega)$ and $q_{xy}(\omega)$ are the cospectrum and quadspectrum, respectively.
- Since $\gamma_{yx}(h) = \gamma_{xy}(-h)$, we have $f_{yx}(\omega) = \overline{f_{xy}(\omega)}$, so $c_{yx}(\omega) = c_{xy}(\omega)$ and $q_{yx}(\omega) = -q_{xy}(\omega)$.

#### Squared Coherence

- Recall that $f_{xx}(\omega_j)$ is the variance of the sine and cosine coefficients in the representation
  $$x_t = \frac{d_c(0)}{2} + \sum_{j=1}^{(n-1)/2} \left[ d_c(\omega_j) \cos(2\pi j t/n) + d_s(\omega_j) \sin(2\pi j t/n) \right].$$
- Similarly, $c_{xy}(\omega_j)$ and $q_{xy}(\omega_j)$ are the covariances of the sine and cosine coefficients in the representations of $x_t$ and $y_t$.
- Correlations are usually easier to interpret than covariances: the squared coherence
  $$\rho_{yx}^2(\omega) = \frac{|f_{yx}(\omega)|^2}{f_{xx}(\omega)\, f_{yy}(\omega)}$$
  measures the strength of the relationship between $x_t$ and $y_t$ at frequency $\omega$.
- $\rho_{yx}^2(\omega)$ is also an analog of $R^2$: it measures the fraction of the variance of $y_t$ at frequency $\omega$, $f_{yy}(\omega)$, that is explained by $x_t$.

#### Spectral Matrix

- For a general $p$-variate time series,
  $$\Gamma(h) = E\left[(\mathbf{x}_{t+h} - \boldsymbol{\mu})(\mathbf{x}_t - \boldsymbol{\mu})'\right].$$
- The spectral matrix is
  $$f(\omega) = \sum_{h=-\infty}^{\infty} \Gamma(h) e^{-2\pi i \omega h},$$
  and
  $$\Gamma(h) = \int_{-1/2}^{1/2} f(\omega) e^{2\pi i \omega h}\, d\omega.$$
- When $p = 2$, the spectral matrix is just
  $$f(\omega) = \begin{pmatrix} f_{11}(\omega) & f_{12}(\omega) \\ f_{21}(\omega) & f_{22}(\omega) \end{pmatrix}.$$

#### Nonparametric Estimation

- Generalize the univariate case:
  $$\bar f(\omega_j) = \frac{1}{L} \sum_{k=-m}^{m} I(\omega_j + k/n),$$
  where $L = 2m + 1$ is the number of terms in the sum; or, more generally,
  $$\hat f(\omega_j) = \sum_{k=-m}^{m} h_k\, I(\omega_j + k/n).$$
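One point worth making explicit about the smoothed estimators above: smoothing is not optional when estimating coherence. With $L = 1$ (the raw, unsmoothed periodogram) the estimated squared coherence is identically 1 at every frequency, because $|d_x \bar d_y|^2 = |d_x|^2 |d_y|^2$ exactly. A minimal pure-Python sketch (hypothetical data, with the discrete Fourier coefficient written out directly rather than computed by FFT):

```python
import cmath

def dft_coef(x, j):
    """Discrete Fourier transform coefficient of x at frequency j/n."""
    n = len(x)
    return sum(x[t] * cmath.exp(-2j * cmath.pi * j * t / n)
               for t in range(n)) / n ** 0.5

def raw_sq_coherence(x, y, j):
    """Estimated squared coherence at frequency j/n from the raw
    (unsmoothed, L = 1) periodogram: |f_xy|^2 / (f_xx * f_yy)."""
    dx, dy = dft_coef(x, j), dft_coef(y, j)
    fxy = dx * dy.conjugate()            # raw cross-periodogram
    fxx, fyy = abs(dx) ** 2, abs(dy) ** 2
    return abs(fxy) ** 2 / (fxx * fyy)

# Two arbitrary (hypothetical) short series:
x = [0.3, -1.2, 0.7, 2.1, -0.5, 0.9, -1.7, 0.4]
y = [1.0, 0.2, -0.8, 1.5, 0.6, -2.0, 0.3, 1.1]
print(raw_sq_coherence(x, y, 1))  # identically 1, up to rounding
```

Averaging $L > 1$ periodogram ordinates before forming the ratio is what makes the estimate informative, which is the role of the Daniell kernel in the R code below.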
#### Estimated Coherence

- If we use the simple average $\bar f(\omega_j)$, we get the estimated squared coherence
  $$\hat\rho_{yx}^2(\omega) = \frac{|\bar f_{yx}(\omega)|^2}{\bar f_{xx}(\omega)\, \bar f_{yy}(\omega)}.$$
- If the true coherence is zero,
  $$\frac{\hat\rho_{yx}^2(\omega)}{1 - \hat\rho_{yx}^2(\omega)}\,(L - 1) \sim F_{2,\,2L-2}.$$
- If we use instead $\hat f(\omega_j)$, a similar result holds approximately.

#### Significant Coherence

- So under the null hypothesis $\rho_{yx}^2(\omega) = 0$,
  $$P\left[\hat\rho_{yx}^2(\omega) > \frac{F_{2,2L-2}(\alpha)}{L - 1 + F_{2,2L-2}(\alpha)}\right] = \alpha.$$
- At frequencies where $\hat\rho_{yx}^2(\omega)$ is below this critical value, we therefore cannot say that the coherence is significant.
- Frequencies at which $\hat\rho_{yx}^2(\omega)$ exceeds this value show significant coherence.

#### Example

R code to show the squared coherence between the Southern Oscillation Index and the fisheries recruitment series:

```r
par(mfcol = c(2, 1))
s <- spectrum(cbind(soi, rec), kernel("daniell", 9), taper = 0, fast = FALSE)
plot(s, plot.type = "coh", ci.lty = 2)
f <- qf(.999, 2, s$df - 2)
abline(h = f / (s$df/2 - 1 + f), col = "red")
f <- qf(.95, 2, s$df - 2)
abline(h = f / (s$df/2 - 1 + f), col = "red", lty = 2)
```

Remove seasonal effects as monthly means, then recalculate:

```r
soiSA <- residuals(lm(soi ~ factor(cycle(soi))))
soiSA <- ts(soiSA, start = start(soi), frequency = frequency(soi))
recSA <- residuals(lm(rec ~ factor(cycle(rec))))
recSA <- ts(recSA, start = start(rec), frequency = frequency(rec))
sSA <- spectrum(cbind(soiSA, recSA), kernel("daniell", 9), taper = 0, fast = FALSE)
plot(sSA, plot.type = "coh", ci.lty = 2)
f <- qf(.999, 2, sSA$df - 2)
abline(h = f / (sSA$df/2 - 1 + f), col = "red")
f <- qf(.95, 2, sSA$df - 2)
abline(h = f / (sSA$df/2 - 1 + f), col = "red", lty = 2)
```

#### Statistics 730, Fall 2008: Applied Time Series Analysis

Professor Peter Bloomfield
Email: bloomfield@stat.ncsu.edu
http://www.stat.ncsu.edu/people/bloomfield/courses/st730

#### Characteristics of Time Series

- A time series is a collection of observations made at different times on a given system.
- For example:
  - Earnings per share of Johnson and Johnson stock (quarterly);
  - Global temperature anomalies from 1856 to 1997 (annual);
  - Investment returns on the New York Stock Exchange (daily).

#### Digression: Retrieving the Data Using R

```r
jj <- scan("http://www.stat.pitt.edu/stoffer/tsa2/data/jj.dat")
jj <- ts(jj, frequency = 4, start = c(1960, 1))
plot(jj)
```
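The critical value in the significance test above has a convenient closed form: for the $F_{2,2L-2}$ distribution, $F_{2,2L-2}(\alpha)/(L - 1 + F_{2,2L-2}(\alpha))$ simplifies to $1 - \alpha^{1/(L-1)}$, which avoids F tables entirely. A small pure-Python sketch (the R code above obtains the same cutoff via `qf`; the function name here is my own):

```python
def coherence_cutoff(L, alpha):
    """Critical value for the estimated squared coherence under the null
    of zero coherence: reject when rho2 > F(alpha) / (L - 1 + F(alpha))
    with F ~ F(2, 2L-2), which simplifies to 1 - alpha**(1/(L-1))."""
    return 1.0 - alpha ** (1.0 / (L - 1))

# A Daniell kernel with m = 9 averages L = 2*9 + 1 = 19 ordinates:
print(coherence_cutoff(19, 0.001))  # about 0.32
print(coherence_cutoff(19, 0.05))   # about 0.15
```

So with the smoothing used for the SOI/recruitment example, estimated squared coherences above roughly 0.32 are significant at the 0.1% level.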
```r
globtemp <- scan("http://www.stat.pitt.edu/stoffer/tsa2/data/globtemp.dat")
globtemp <- ts(globtemp, start = 1856)
plot(globtemp)
nyse <- scan("http://www.stat.pitt.edu/stoffer/tsa2/data/nyse.dat")
nyse <- ts(nyse)
plot(nyse)
```

#### Correlation

- Time series data are almost always correlated with each other (autocorrelated).
- We may want to exploit that correlation, or merely to cope with it.

#### Exploiting Correlation: Forecasting

- Suppose $Y_t$ is the $t$th observation, and we observe $Y_0, Y_1, \dots, Y_{n-1}$. What can we say about $Y_n$?
- If we know the correlation structure, or more precisely the joint distribution of $Y_0, Y_1, \dots, Y_{n-1}, Y_n$, then we can calculate the conditional distribution of $Y_n \mid Y_0, Y_1, \dots, Y_{n-1}$.
- The conditional mean is the best forecast of $Y_n$, and the conditional standard deviation is the root mean square forecast error. If the conditional distribution is normal, we can use them to make probability statements about $Y_n$.

#### Coping with Correlation: Regression

- Suppose instead that $Y_t$ is related to a covariate $x_t$, and we are interested in the regression of $Y_t$ on $x_t$.
- Because the $Y$'s are correlated, we should not use Ordinary Least Squares to fit the regression.
- If we knew the correlation structure, we would use Generalized Least Squares.
- Usually we don't know it, so we must estimate it, typically using a parsimonious parametric model.

#### Time Domain and Frequency Domain

- Methods that focus on how a time series evolves from one time to the next are called time-domain methods.
- Some graphs, e.g. residuals of global temperatures from a quadratic trend, suggest the possibility of waves in the data:

```r
l <- lm(globtemp ~ time(globtemp) + I(time(globtemp)^2))
plot(globtemp - fitted(l))
```

- Since a wave is described in terms of its period, or alternatively its frequency, methods that measure the waves in a time series are called frequency-domain methods.
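The forecasting idea above, where the conditional mean is the best forecast and the conditional standard deviation is the root mean square forecast error, can be made concrete for a Gaussian AR(1), $Y_t = \phi Y_{t-1} + w_t$ with $w_t \sim N(0, \sigma^2)$: the $m$-step forecast is $\phi^m$ times the last observation, with forecast variance $\sigma^2(1 + \phi^2 + \cdots + \phi^{2(m-1)})$. A minimal pure-Python sketch with hypothetical parameter values:

```python
import math

def ar1_forecast(y_last, phi, sigma2, m):
    """m-step-ahead forecast for a Gaussian AR(1), y_t = phi*y_{t-1} + w_t,
    w_t ~ N(0, sigma2). Returns the conditional mean (best forecast) and
    the conditional standard deviation (root mean square forecast error)."""
    mean = phi ** m * y_last
    # var = sigma2 * (1 - phi**(2m)) / (1 - phi**2), written as a finite sum
    var = sigma2 * sum(phi ** (2 * k) for k in range(m))
    return mean, math.sqrt(var)

# Hypothetical values: last observation 2.0, phi = 0.8, sigma2 = 1.0
for m in (1, 2, 10):
    mean, rmse = ar1_forecast(2.0, 0.8, 1.0, m)
    print(m, round(mean, 3), round(rmse, 3))
```

As $m$ grows, the forecast decays toward the series mean (zero) and the forecast error grows toward the series standard deviation $\sigma/\sqrt{1-\phi^2}$, exactly the behavior noted for the proc arima forecasts earlier in these notes.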
