Let us put $\tilde X_i := X_i - \theta$. Applying (3.3.2) to the function $f(x) = \sum_\alpha (x^\alpha_i - \theta^\alpha)(x^\alpha_j - \theta^\alpha)$, using bounded convergence to interchange an infinite sum and expectation, we get
$$
\frac{\partial}{\partial t}\,\operatorname{Cov}\big(X_i(t), X_j(t)\big)
= \sum_{k,l} a(k-l)\,\mathbb{E}\Big[\sum_\alpha \big(\tilde X^\alpha_k(t) - \tilde X^\alpha_l(t)\big)\big(\delta_{il}\,\tilde X^\alpha_j(t) + \delta_{jl}\,\tilde X^\alpha_i(t)\big)\Big]
+ 2\delta_{ij}\,\mathbb{E}\big[\operatorname{tr} w(X_0(t))\big]. \qquad (3.3.6)
$$
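To make the next step explicit (a sketch, assuming that (3.3.4) identifies $\mathbb{E}\big[\sum_\alpha \tilde X^\alpha_k(t)\,\tilde X^\alpha_j(t)\big]$ with the covariance function $C_t(j-k)$, consistent with (3.3.7) below), note that the Kronecker delta $\delta_{il}$ collapses the sum over $l$ to the single term $l = i$:
$$
\sum_{k,l} a(k-l)\,\delta_{il}\,\mathbb{E}\Big[\sum_\alpha \big(\tilde X^\alpha_k(t) - \tilde X^\alpha_l(t)\big)\tilde X^\alpha_j(t)\Big]
= \sum_k a(k-i)\big(C_t(j-k) - C_t(j-i)\big),
$$
and the term containing $\delta_{jl}$ is treated in the same way.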
Inserting (3.3.4) we get
$$
\frac{\partial}{\partial t}\,C_t(j-i)
= \sum_k a(k-i)\big(C_t(j-k) - C_t(j-i)\big)
+ \sum_k a(k-j)\big(C_t(k-i) - C_t(j-i)\big)
+ 2\delta_{ij}\,\mathbb{E}\big[\operatorname{tr} w(X_0(t))\big]. \qquad (3.3.7)
$$
Substituting $\tilde\imath := j - i$, $\tilde l := k - i$ and $\tilde k := j - k$ and reordering the summations, we find that
$$
\frac{\partial}{\partial t}\,C_t(\tilde\imath)
= \sum_{\tilde l} a(\tilde l)\big(C_t(\tilde\imath - \tilde l) - C_t(\tilde\imath)\big)
+ \sum_{\tilde k} a(-\tilde k)\big(C_t(\tilde\imath - \tilde k) - C_t(\tilde\imath)\big)
+ 2\delta_{\tilde\imath\,0}\,\mathbb{E}\big[\operatorname{tr} w(X_0(t))\big]. \qquad (3.3.8)
$$
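Relabelling the summation variable in the second sum, the two sums combine into a single sum over the symmetrized kernel:
$$
\frac{\partial}{\partial t}\,C_t(\tilde\imath)
= \sum_{k} \big(a(k) + a(-k)\big)\big(C_t(\tilde\imath - k) - C_t(\tilde\imath)\big)
+ 2\delta_{\tilde\imath\,0}\,\mathbb{E}\big[\operatorname{tr} w(X_0(t))\big].
$$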
This shows that formula (3.1.34) holds.
3.3.2 Random walk representations
Let $B$ be the Banach space of bounded real functions on $\Lambda$, equipped with the supremum norm. The operator $G$ in (3.1.35) is a bounded linear operator on $B$.
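Although (3.1.35) is not restated here, in the present notation $G$ is presumably the generator of the symmetrized random walk, i.e.
$$
G f(i) = \sum_j a_S(j-i)\big(f(j) - f(i)\big) \qquad (i \in \Lambda,\ f \in B),
$$
which is the operator appearing on the right-hand side of (3.3.12) below.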
We define a Feller semigroup on $B$ by
$$
P_t f := e^{tG} f, \qquad (3.3.9)
$$
where $e^{tG} := \sum_{n=0}^{\infty} \frac{1}{n!}(tG)^n$. This semigroup corresponds to a continuous-time random walk $(I_t)_{t \geq 0}$ on $\Lambda$ that jumps from $i$ to $j$ with rate $a_S(j-i)$. By shift-invariance there exists a function $P : [0,\infty) \times \Lambda \to \mathbb{R}$ such that
$$
P_t(j-i) = \mathbb{P}^i[I_t = j]. \qquad (3.3.10)
$$
We can consider $P_t(j-i)$ as the $(i,j)$-th element of the matrix of the operator $P_t$ in (3.3.9), in the following sense:
$$
P_t f(i) = \sum_j P_t(j-i)\, f(j). \qquad (3.3.11)
$$
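Indeed, since $(P_t)_{t \geq 0}$ is the transition semigroup of $(I_t)_{t \geq 0}$, we have, using (3.3.10),
$$
P_t f(i) = \mathbb{E}^i\big[f(I_t)\big] = \sum_j \mathbb{P}^i[I_t = j]\, f(j) = \sum_j P_t(j-i)\, f(j).
$$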
Lemma 3.3.1 Assume that $f, g : [0,\infty) \to B$ are continuous functions, such that $t \mapsto f_t(i)$ is continuously differentiable for each $i \in \Lambda$ and
$$
\frac{\partial}{\partial t} f_t(i) = \sum_j a_S(j-i)\big(f_t(j) - f_t(i)\big) + g_t(i) \qquad (t \geq 0,\ i \in \Lambda). \qquad (3.3.12)
$$
Then
$$
f_t(i) = \sum_j P_t(j-i)\, f_0(j) + \int_0^t \sum_j P_s(j-i)\, g_{t-s}(j)\, ds \qquad (t \geq 0,\ i \in \Lambda). \qquad (3.3.13)
$$
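In operator notation, and with the matrix interpretation (3.3.11), formula (3.3.13) is the variation-of-constants (Duhamel) representation
$$
f_t = P_t f_0 + \int_0^t P_s\, g_{t-s}\, ds \qquad (t \geq 0).
$$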
Proof of Lemma 3.3.1: We define derivatives and Riemann integrals of $B$-valued functions as in [16], chapter 1. In that language, we would like to rewrite (3.3.12) as
$$
\frac{\partial}{\partial t} f_t = G f_t + g_t \qquad (t \geq 0). \qquad (3.3.14)
$$
However, care is needed because it is not immediately clear that the derivative
$$
\frac{\partial}{\partial t} f_t := \lim_{\varepsilon \to 0} \varepsilon^{-1}\big(f_{t+\varepsilon} - f_t\big)
$$
exists in the topology on $B$. To see that this is all right, we note that the function
$$
t \mapsto G f_t + g_t \qquad (3.3.15)
$$
is continuous in $t$ and therefore
$$
t \mapsto \int_0^t \big(G f_s + g_s\big)\, ds \qquad (3.3.16)
$$
exists and is a continuously differentiable $B$-valued function. Formula (3.3.12) implies that
$$
f_t = f_0 + \int_0^t \big(G f_s + g_s\big)\, ds \qquad (3.3.17)
$$
and it follows that $t \mapsto f_t$ is continuously differentiable and (3.3.14) holds. Let $(I_t)_{t \geq 0}$ be the continuous-time random walk with kernel $a_S$. This process solves the martingale problem for $G$, and therefore
$$
\mathbb{E}^i\big[f_t(I_0)\big]
= \mathbb{E}^i\big[f_0(I_t)\big] - \int_0^t \mathbb{E}^i\Big[\Big(\tfrac{\partial}{\partial s} + G\Big) f_{t-s}(I_s)\Big]\, ds
= \mathbb{E}^i\big[f_0(I_t)\big] + \int_0^t \mathbb{E}^i\big[g_{t-s}(I_s)\big]\, ds. \qquad (3.3.18)
$$
This is formula (3.3.13).
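Indeed, by (3.3.10) and (3.3.11), $\mathbb{E}^i[f_t(I_0)] = f_t(i)$, $\mathbb{E}^i[f_0(I_t)] = \sum_j P_t(j-i)\, f_0(j)$ and $\mathbb{E}^i[g_{t-s}(I_s)] = \sum_j P_s(j-i)\, g_{t-s}(j)$, so that (3.3.18) is exactly (3.3.13).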
3.3.3 Spatially ergodic measures
The $\sigma$-field of shift-invariant events is
$$
\mathcal{S} := \{A \in \mathcal{B}(K^\Lambda) : T_i^{-1} A = A \ \ \forall i \in \Lambda\}. \qquad (3.3.19)
$$
A probability measure $\mu$ on $K^\Lambda$ is spatially ergodic if for every $A \in \mathcal{S}$, either $\mu(A) = 1$ or $\mu(A) = 0$. We state the following standard ergodic theorem in $L^2$ without proof (see [24]).
Lemma 3.3.2 For $n = 1, 2, \ldots$, let $p_n : \Lambda \to [0,\infty)$ be functions satisfying $\sum_i p_n(i) = 1$ and
$$
\lim_{n \to \infty} \sum_k \big|p_n(i-k) - p_n(j-k)\big| = 0 \qquad \forall i, j \in \Lambda. \qquad (3.3.20)
$$
Let $X = (X_i)_{i \in \Lambda}$ be a family of $K$-valued random variables with shift-invariant, ergodic law $\mathcal{L}(X)$. If $\mathbb{E}[X_0] = \theta$, then
$$
\lim_{n \to \infty} \mathbb{E}\Big[\Big|\theta - \sum_i p_n(i)\, X_i\Big|^2\Big] = 0. \qquad (3.3.21)
$$
In our case, probability distributions $p_n$ satisfying (3.3.20) will arise in the following way.
Lemma 3.3.3 Let $P : [0,\infty) \times \Lambda \to \mathbb{R}$ be as in (3.3.10). Then for any $i, j \in \Lambda$:
$$
\lim_{t \to \infty} \sum_k \big|P_t(i-k) - P_t(j-k)\big| = 0. \qquad (3.3.22)
$$
Proof of Lemma 3.3.3: We use the Ornstein coupling [25]. To see how this works for random walks on arbitrary Abelian groups, let $\Delta \subset \Lambda$ be a set such that $a_S(k) > 0$ for each $k \in \Delta$, and such that for each $k \in \Lambda$ with $a_S(k) > 0$, either $k$ or $-k$ (but not both) is in $\Delta$. By irreducibility, we can decompose $j - i$ as
$$
j - i = \sum_{k \in \Delta} n_k\, k, \qquad (3.3.23)
$$
where $n_k \in \mathbb{Z}$ and only finitely many of the $n_k$ are non-zero. We may couple two random walks starting in the points $i$ and $j$ in such a way that they always make a jump of size $k$ or $-k$ at the same time. They choose $k$ or $-k$ independently of each other, until the walk starting in $j$ has made $n_k$ more of these jumps than the walk starting in $i$. After that, they choose either both $k$ or both $-k$. This coupling is obviously successful, and Lemma 3.3.3 now follows easily.
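To spell out the last step (a brief sketch, writing $\tau$ for the random time at which the two coupled walks first coincide, after which they move together): substituting $m = i + j - k$ and using (3.3.10),
$$
\sum_k \big|P_t(i-k) - P_t(j-k)\big| = \sum_m \big|\mathbb{P}^j[I_t = m] - \mathbb{P}^i[I_t = m]\big| \leq 2\,\mathbb{P}[\tau > t],
$$
and $\mathbb{P}[\tau > t] \to 0$ as $t \to \infty$ because the coupling is successful.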