Chapter 10, Problem 13
Consider a Markov chain with the transition matrix shown. If the initial distribution is as given, what is the probability distribution in the next observation?
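The problem's specific transition matrix and initial distribution are not reproduced here, but the general method is: with row-vector convention, the next-step distribution is x₁ = x₀P, where P is the transition matrix (rows summing to 1) and x₀ the current distribution. A minimal sketch with a hypothetical 2-state matrix and a uniform starting distribution (both are placeholders, not the problem's actual values):

```python
import numpy as np

# Hypothetical 2-state transition matrix (NOT the problem's matrix);
# entry P[i, j] is the probability of moving from state i to state j,
# so each row sums to 1.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Hypothetical initial distribution as a row vector.
x0 = np.array([0.5, 0.5])

# One step of the chain: x1 = x0 P.
x1 = x0 @ P
print(x1)  # [0.55 0.45]
```

Note that some texts use the column-vector convention instead, writing x₁ = Px₀ with columns of P summing to 1; check which convention your chapter uses before multiplying.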