Expected value of a simple Markov Chain

 
 
Posted Fri 7 Jun, 2013 03:12 am
This is probably simple for those who know stochastic processes, but I am finding it difficult to understand how to compute expectations of a sequence.
If y(t) is a simple Markov chain where y(t) = r·y(t-1) and r is a constant, what is the unconditional expectation of the product of the last k observations, y(t-k)·y(t-k+1)·...·y(t-1)·y(t)? I know how to solve it up to t-1: E[y(t-1)·y(t)] = r, but I need help with how to solve E[y(t-2)·y(t-1)·y(t)], for example. I would be grateful for even a lead.
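In case a concrete check helps: the statement E[y(t-1)·y(t)] = r only works out if the chain has some randomness and is standardized, so the sketch below assumes the intended model is a stationary Gaussian AR(1), y(t) = r·y(t-1) + e(t) with |r| < 1 and the innovations scaled so that Var[y(t)] = 1. That model, and the value r = 0.6 used below, are assumptions for illustration, not something stated in the question. The code simulates a long path and estimates the pairwise and triple product moments by Monte Carlo, which at least gives a numerical target to check any closed-form answer against.

# Monte Carlo sketch (assumed model): stationary AR(1) with unit marginal variance,
#   y(t) = r*y(t-1) + e(t),  e(t) ~ N(0, 1 - r^2),  |r| < 1.
# Under this assumption E[y(t-1)*y(t)] = r; the triple moment is estimated the same way.
import numpy as np

r = 0.6          # assumed autoregressive coefficient (illustrative value)
n = 1_000_000    # path length for the Monte Carlo estimate
rng = np.random.default_rng(0)

y = np.empty(n)
y[0] = rng.normal(0.0, 1.0)                       # start in the stationary distribution N(0, 1)
eps = rng.normal(0.0, np.sqrt(1.0 - r**2), n)     # innovations scaled so Var[y(t)] = 1
for t in range(1, n):
    y[t] = r * y[t - 1] + eps[t]

pair   = np.mean(y[1:] * y[:-1])                  # estimate of E[y(t-1)*y(t)], should be near r
triple = np.mean(y[2:] * y[1:-1] * y[:-2])        # estimate of E[y(t-2)*y(t-1)*y(t)]
print(f"E[y(t-1)y(t)]       ~ {pair:.4f}   (r = {r})")
print(f"E[y(t-2)y(t-1)y(t)] ~ {triple:.4f}")

Under that Gaussian assumption the pairwise estimate should land near r, while the triple product hovers near zero, since odd joint moments of a zero-mean Gaussian vector vanish; if the intended model is different, the assumed recursion at the top of the sketch is the part to change.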