Consider the Markov chain in Example 7.2, for the case where m = 4, as in Fig. 7.2, and assume that the process starts at any of the four states with equal probability. Let Yn = 1 whenever the Markov chain is at state 1 or 2, and Yn = 2 whenever it is at state 3 or 4. Is the process Yn a Markov chain?
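The transition probabilities of Example 7.2 are not reproduced in these notes, so the sketch below assumes a hypothetical birth-death chain on states 1 through 4 purely for illustration. It probes the question empirically: if Yn were Markov, the frequency of Yn+1 = 2 given Yn = 1 could not depend on Yn−1.

```python
import random

# Hypothetical transition probabilities on states 1..4 (an assumption made
# for illustration only; Example 7.2's actual values are not given here).
P = {
    1: [(1, 0.5), (2, 0.5)],
    2: [(1, 0.3), (3, 0.7)],
    3: [(2, 0.7), (4, 0.3)],
    4: [(3, 0.5), (4, 0.5)],
}

def step(state, rng):
    """Draw the next state from the transition distribution of `state`."""
    r, acc = rng.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return P[state][-1][0]

def simulate(n, rng):
    """Run the chain from a uniformly random start; return the lumped Y."""
    x = rng.choice([1, 2, 3, 4])
    ys = []
    for _ in range(n):
        ys.append(1 if x in (1, 2) else 2)
        x = step(x, rng)
    return ys

rng = random.Random(0)
ys = simulate(200_000, rng)

# Estimate P(Y_{n+1} = 2 | Y_n = 1, Y_{n-1} = j) for j = 1 and j = 2.
# If Y were Markov, the two estimates would agree up to sampling noise;
# a clear gap shows the extra history matters, so Y is not Markov here.
counts = {1: [0, 0], 2: [0, 0]}   # j -> [occurrences, transitions to 2]
for prev, cur, nxt in zip(ys, ys[1:], ys[2:]):
    if cur == 1:
        counts[prev][0] += 1
        counts[prev][1] += (nxt == 2)

p_given_1 = counts[1][1] / counts[1][0]
p_given_2 = counts[2][1] / counts[2][0]
print(round(p_given_1, 3), round(p_given_2, 3))  # the two estimates differ
```

Under these assumed probabilities, Yn−1 = 2 forces Xn = 2 (the only state in {1, 2} reachable from {3, 4}), so the two conditional frequencies differ. This is the standard reason a lumped (aggregated) chain need not itself be Markov.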

Stat 461: Week 1 Notes
Principles of Experimental Design
1. What is being assumed
2. Nuisance/Noise Factors (aka Confounding Variables)
   a. Def: factors other than the one being studied that could alter the outcome of the experiment. (Ex: if you are testing which type of fertilizer is best for a crop, a confounding variable could be the soil; if the two crops have two different soil types as well as two different fertilizers, the difference in yield may be due to the soil and not the fertilizer.)
   b. Statistical Model: the difference between a single observation and the mean of the data is attributed to confounding/noise variables: y_i = μ + ε_i (observation = mean plus confounding error).
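The fertilizer/soil example above can be made concrete with a small simulation. All effect sizes below (overall mean 100, soil bonus +10, noise sd 2) are invented for the sketch; it shows how a nuisance factor that is aligned with the treatment masquerades as a treatment effect, and how randomizing it away removes the bias.

```python
import random

rng = random.Random(1)

def yield_obs(fertilizer_effect, soil_effect):
    """One crop yield following the notes' model: overall mean, plus the
    systematic effects, plus random error (y = mu + effects + error)."""
    mu = 100.0
    return mu + fertilizer_effect + soil_effect + rng.gauss(0, 2)

# Confounded design: the two fertilizers truly have EQUAL effect (both 0),
# but fertilizer A is planted only on good soil (+10) and B on poor soil.
a_conf = [yield_obs(0.0, 10.0) for _ in range(500)]
b_conf = [yield_obs(0.0, 0.0) for _ in range(500)]

mean = lambda xs: sum(xs) / len(xs)
gap_conf = mean(a_conf) - mean(b_conf)
print(round(gap_conf, 1))  # near 10: the soil, not the fertilizer, drives it

# Randomized design: soil type is assigned independently of fertilizer,
# so the soil effect averages out of the comparison.
a_rand = [yield_obs(0.0, rng.choice([10.0, 0.0])) for _ in range(500)]
b_rand = [yield_obs(0.0, rng.choice([10.0, 0.0])) for _ in range(500)]
gap_rand = mean(a_rand) - mean(b_rand)
print(round(gap_rand, 1))  # near 0: no spurious fertilizer effect remains
```

The confounded comparison reports a large gap even though the fertilizers are identical, which is exactly the trap described in 2a; randomization is the design-level fix.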