Figure 1: State-Transition Diagram for a Binary Bivariate Markov Model
Let us illustrate how definitions (2)-(3) may be made operational by applying
them to a precise stochastic process and information set. For the simplest
possible example, let us restrict the information set to the canonical filtration
associated to {Y_t}, and furthermore assume that {Y_t} is a Markov
process (or Markov chain), so that
Pr{y_t | y_{t-1}, ..., y_0} = Pr{y_t | y_{t-1}}
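As a minimal sketch of this property, the following simulates a two-state chain from a hypothetical transition matrix P (the numbers are purely illustrative, not taken from the text); by construction, each draw conditions only on the last state, exactly as the display above requires:

```python
import numpy as np

# Hypothetical transition matrix: P[i, j] = Pr{y_t = j | y_{t-1} = i}.
# Rows sum to one; the numbers are made up for illustration.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def simulate_chain(P, y0, T, rng):
    """Simulate T steps of a Markov chain: each draw uses only the last state."""
    y = [y0]
    for _ in range(T):
        y.append(int(rng.choice(len(P), p=P[y[-1]])))
    return y

rng = np.random.default_rng(0)
path = simulate_chain(P, 0, 10, rng)
```

Because the transition matrix P does not change with t, this is also an example of the stationary transition probabilities discussed next.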
The most restrictive definition of a Markov process requires that the transition
probabilities do not vary over time. More specifically, under this assumption the
process is called a Markov chain with stationary transition probabilities. Notice
that the assumption of stationary transition probabilities alone excludes any
impact of covariates on the transition probabilities. In this simplified framework,
the definitions given above specialize as follows:
Definition 3 - Strong one step ahead non-causality for a Markov chain with
stationary transition probabilities: Y^2 does not strongly cause Y^1 one step
ahead, given Y_{t-1}, if:

Pr{y_t^1 | y_{t-1}} = Pr{y_t^1 | y_{t-1}^1}   ∀ t ∈ {1, ..., T}   (4)
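Condition (4) can be checked numerically. The sketch below builds a hypothetical joint transition matrix for a binary bivariate chain in which Y^1 evolves using only its own past while Y^2 may depend on both components (all probabilities are made up for illustration), and verifies that the marginal one-step law of Y^1 does not depend on y_{t-1}^2:

```python
import numpy as np

# Hypothetical kernels (illustrative numbers only):
# P1[a, c]      = Pr{y1_t = c | y1_{t-1} = a}           (ignores y2: condition (4))
# P2[(a, b)][d] = Pr{y2_t = d | y1_{t-1} = a, y2_{t-1} = b}
P1 = np.array([[0.8, 0.2],
               [0.3, 0.7]])
P2 = {(0, 0): [0.6, 0.4], (0, 1): [0.5, 0.5],
      (1, 0): [0.2, 0.8], (1, 1): [0.1, 0.9]}

# Joint state s = (y1, y2); joint transition Q over the 4 states.
states = [(a, b) for a in (0, 1) for b in (0, 1)]
Q = np.zeros((4, 4))
for i, (a, b) in enumerate(states):
    for j, (c, d) in enumerate(states):
        Q[i, j] = P1[a, c] * P2[(a, b)][d]

# Check (4): Pr{y1_t = c | y1_{t-1} = a, y2_{t-1} = b}, obtained by summing
# Q over y2_t, must be the same for b = 0 and b = 1.
for a in (0, 1):
    for c in (0, 1):
        probs = [sum(Q[states.index((a, b)), states.index((c, d))]
                     for d in (0, 1)) for b in (0, 1)]
        assert abs(probs[0] - probs[1]) < 1e-12
```

Changing P1 so that it also depends on y2_{t-1} would make the assertion fail, which is exactly the strong causality case excluded by (4).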
4 The equivalence between (2) and (4) in this framework comes immediately by noticing
that, under the Markov assumption and the assumption that the information set I_{t-1} coincides
with Y_{t-1}, the conditional independence statement (2) implies:

Pr{y_t^1, y_{t-1}^2 | y_{t-1}^1} = Pr{y_t^1 | y_{t-1}^1} Pr{y_{t-1}^2 | y_{t-1}^1}   ∀ t ∈ {1, ..., T}
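The footnote's factorization can likewise be verified with hypothetical numbers: starting from an arbitrary joint law pi for (y1_{t-1}, y2_{t-1}) and a kernel for y1_t that ignores y2_{t-1} (i.e. condition (4) holds), the conditional joint law factorizes exactly:

```python
# Illustrative numbers only (not from the text):
# pi[(a, b)]   = Pr{y1_{t-1} = a, y2_{t-1} = b}
# K[(a, b)][c] = Pr{y1_t = c | y1_{t-1} = a, y2_{t-1} = b}; here K ignores b,
#                which is exactly the non-causality condition (4).
pi = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
P1 = {0: [0.8, 0.2], 1: [0.3, 0.7]}
K = {(a, b): P1[a] for a in (0, 1) for b in (0, 1)}

for a in (0, 1):
    pa = sum(pi[(a, b)] for b in (0, 1))              # Pr{y1_{t-1} = a}
    for c in (0, 1):
        # marginal Pr{y1_t = c | y1_{t-1} = a}, integrating y2_{t-1} out
        pc = sum(pi[(a, b)] / pa * K[(a, b)][c] for b in (0, 1))
        for b in (0, 1):
            lhs = (pi[(a, b)] / pa) * K[(a, b)][c]    # Pr{y1_t=c, y2_{t-1}=b | y1_{t-1}=a}
            rhs = pc * (pi[(a, b)] / pa)              # product of the two conditionals
            assert abs(lhs - rhs) < 1e-12
```

With a kernel K that genuinely varies in b, lhs and rhs would differ, so the factorization characterizes the non-causal case.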