this can be easily checked by conditioning on the last visit of $y$ when walking from $x$ to $w$. We have the following important equations, which follow by conditioning on the last visits of $x_i$ and $x$, and on the first visits of $y_i$ and $y$ respectively:
$$G_i(x_i, y_i|z) = G_i(x_i, x_i|z) \cdot L_i(x_i, y_i|z) = F_i(x_i, y_i|z) \cdot G_i(y_i, y_i|z),$$
$$G(x, y|z) = G(x, x|z) \cdot L(x, y|z) = F(x, y|z) \cdot G(y, y|z). \tag{2.4}$$
Observe that the generating functions $F(\cdot,\cdot|z)$ and $L(\cdot,\cdot|z)$ also have radii of convergence strictly bigger than $1$.
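For intuition, the factorizations in (2.4) can be checked numerically on any finite-state Markov chain, since $G$, $F$ and $L$ then solve finite linear systems. The following sketch does this for an arbitrary toy transition matrix; `P`, `x`, `y` and `z` are illustrative choices, not objects from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.random((5, 5)); P /= P.sum(axis=1, keepdims=True)  # toy transition matrix
x, y, z = 0, 3, 0.9   # evaluation point z inside the radius of convergence

n = P.shape[0]
G = np.linalg.inv(np.eye(n) - z * P)        # G(u,v|z) = sum_k p^(k)(u,v) z^k

def first_passage(x, y):
    """F(x,y|z): paths from x that hit y for the first time (taboo at y)."""
    keep = [u for u in range(n) if u != y]
    Q = P[np.ix_(keep, keep)]
    f = np.linalg.solve(np.eye(n - 1) - z * Q, z * P[keep, y])
    return f[keep.index(x)]

def last_exit(x, y):
    """L(x,y|z): paths from x to y that never return to x (taboo at x)."""
    keep = [u for u in range(n) if u != x]
    Gx = np.linalg.inv(np.eye(n - 1) - z * P[np.ix_(keep, keep)])
    return z * P[x, keep] @ Gx[:, keep.index(y)]

F, L = first_passage(x, y), last_exit(x, y)
print(np.isclose(G[x, y], G[x, x] * L))  # G(x,y|z) = G(x,x|z) * L(x,y|z)
print(np.isclose(G[x, y], F * G[y, y]))  # G(x,y|z) = F(x,y|z) * G(y,y|z)
```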
3 The Asymptotic Entropy
3.1 Rate of Escape w.r.t. a Specific Length Function
In this subsection we prove existence of the rate of escape with respect to a specific length function. From this we will deduce existence of, and a formula for, the asymptotic entropy in the upcoming subsection.
We assign to each element $x_i \in V_i$ the "length" $l_i(x_i) := -\log L(o, x_i|1) = -\log L_i(o_i, x_i|\xi_i)$. We extend it to a length function on $V$ by assigning to $v_1 \dots v_n \in V$ the length
$$l(v_1 \dots v_n) := \sum_{i=1}^{n} l_{\tau(v_i)}(v_i) = -\sum_{i=1}^{n} \log L(o, v_i|1) = -\log L(o, v_1 \dots v_n|1).$$
Observe that the lengths can also be negative; e.g., this can be interpreted as height differences.
The aim of this subsection is to show existence of a number $\ell \in \mathbb{R}$ such that the quotient $l(X_n)/n$ tends to $\ell$ almost surely as $n \to \infty$. We call $\ell$ the rate of escape w.r.t. the length function $l(\cdot)$. We now follow the reasoning of [11, Section 3]. Denote by $X_n^{(k)}$ the projection of $X_n$ to the first $k$ letters. We define the $k$-th exit time as
$$\mathbf{e}_k := \min\bigl\{ m \in \mathbb{N} \mid \forall n \ge m : X_n^{(k)} \text{ is constant} \bigr\}.$$
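The exit times can be computed mechanically from a trajectory of words. The sketch below does this for a hand-made toy trajectory (not a genuine free-product walk); with a finite horizon one can of course only detect that a prefix is constant within the observed window.

```python
# Sketch computing exit times e_k from a finite trajectory of words,
# each word given as a tuple of letters.
def exit_times(words, k_max):
    times = {}
    for k in range(1, k_max + 1):
        proj = [w[:k] for w in words]          # X_n^(k): first k letters of X_n
        m = len(proj) - 1
        while m > 0 and proj[m - 1] == proj[-1]:
            m -= 1
        if proj[m] == proj[-1]:
            times[k] = m                       # e_k within this window
    return times

# Toy words X_0, ..., X_6 with letters a, b, c:
traj = [(), ('a',), ('a', 'b'), ('a',), ('a', 'c'), ('a', 'c', 'b'), ('a', 'c', 'b')]
print(exit_times(traj, 2))                     # {1: 1, 2: 4}
```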
Moreover, we define $W_k := X_{\mathbf{e}_k}^{(k)}$, $\tau_k := \tau(W_k)$ and $\mathbf{k}(n) := \max\{k \in \mathbb{N} \mid \mathbf{e}_k \le n\}$. We remark that $\|X_n\| \to \infty$ as $n \to \infty$, and consequently $\mathbf{e}_k < \infty$ almost surely for every $k \in \mathbb{N}$; see [11, Prop. 2.5]. Recall that $\widetilde{W}_k$ is just the last letter of the random word $X_{\mathbf{e}_k}^{(k)}$. The process $(\tau_k)_{k\in\mathbb{N}}$ is Markovian and has transition probabilities
$$\hat q(i,j) = \frac{\alpha_j\,\xi_i\,(1-\xi_j)}{\alpha_i\,\xi_j\,(1-\xi_i)} \left( \frac{1}{(1-\xi_j)\,G_j(o_j,o_j|\xi_j)} - 1 \right) \quad\text{for } i \ne j, \qquad \hat q(i,i) = 0;$$
see [11, Lemma 3.4]. This process is positive recurrent with invariant probability measure
$$\nu(i) = C^{-1} \cdot \frac{\alpha_i\,(1-\xi_i)}{\xi_i} \Bigl( 1 - (1-\xi_i)\,G_i(o_i,o_i|\xi_i) \Bigr), \quad\text{where}\quad C := \sum_{i\in I} \frac{\alpha_i\,(1-\xi_i)}{\xi_i} \Bigl( 1 - (1-\xi_i)\,G_i(o_i,o_i|\xi_i) \Bigr);$$
see [11, Section 3].
Furthermore, the rate of escape w.r.t. the block length exists almost surely and is given by the almost sure constant limit
$$\ell_0 = \lim_{n\to\infty} \frac{\|X_n\|}{n} = \lim_{k\to\infty} \frac{k}{\mathbf{e}_k} = \Biggl(\, \sum_{i,j\in I,\, i\ne j} \nu(i)\,\alpha_j\,\frac{1-\xi_j}{1-\xi_i}\,\gamma'_{i,j}(1) \Biggr)^{-1};$$
see [11, Theorem 3.3], where
$$\gamma_{i,j}(z) := \frac{1}{\alpha_i}\,\frac{\xi_i(z)}{\xi_j(z)} \biggl( \frac{1}{(1-\xi_j(z))\,G_j\bigl(o_j,o_j \big| \xi_j(z)\bigr)} - 1 \biggr).$$
(We write $\ell_0$ for this block-length rate of escape to distinguish it from the rate of escape $\ell$ w.r.t. $l(\cdot)$ introduced above.)
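At $z = 1$, assuming (as the notation suggests) that $\xi_i(1) = \xi_i$, the function $\gamma_{i,j}$ is tied to $\hat q$ by $\hat q(i,j) = \alpha_j \frac{1-\xi_j}{1-\xi_i}\,\gamma_{i,j}(1)$. The following sketch checks this purely algebraic identity; all numbers are placeholders, not derived from an actual free product.

```python
import math

alpha = {1: 0.5, 2: 0.5}
xi    = {1: 0.6, 2: 0.7}
Gjj   = {1: 1.8, 2: 1.5}    # stands in for G_j(o_j, o_j | xi_j)

def gamma(i, j):            # gamma_{i,j}(1)
    return (1 / alpha[i]) * (xi[i] / xi[j]) * (1 / ((1 - xi[j]) * Gjj[j]) - 1)

def qhat(i, j):             # transition probabilities of (tau_k)
    return (alpha[j] * xi[i] * (1 - xi[j])) / (alpha[i] * xi[j] * (1 - xi[i])) \
           * (1 / ((1 - xi[j]) * Gjj[j]) - 1)

print(math.isclose(qhat(1, 2), alpha[2] * (1 - xi[2]) / (1 - xi[1]) * gamma(1, 2)))
```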
Lemma 3.1. The process $\bigl(\widetilde{W}_k, \tau_k\bigr)_{k\in\mathbb{N}}$ is Markovian and has transition probabilities
$$q\bigl((g,i),(h,j)\bigr) = \begin{cases} \dfrac{\alpha_j\,\xi_i\,(1-\xi_j)}{\alpha_i\,\xi_j\,(1-\xi_i)}\; L_j(o_j, h|\xi_j), & \text{if } i \ne j,\\[1ex] 0, & \text{if } i = j. \end{cases}$$
Furthermore, the process is positive recurrent with invariant probability measure
$$\pi(g,i) = \sum_{j\in I} \nu(j)\, q\bigl((*,j),(g,i)\bigr).$$
Remark: Observe that the transition probabilities $q\bigl((g,i),(h,j)\bigr)$ of $\bigl(\widetilde{W}_k, \tau_k\bigr)_{k\in\mathbb{N}}$ do not depend on $g$. Therefore, we will sometimes write an asterisk instead of $g$.
Proof. By [11, Section 3], the process $\bigl(\widetilde{W}_k, \mathbf{e}_k - \mathbf{e}_{k-1}, \tau_k\bigr)_{k\in\mathbb{N}}$ is Markovian and has transition probabilities
$$\tilde q\bigl((g,m,i),(h,n,j)\bigr) = \begin{cases} \dfrac{1-\xi_j}{1-\xi_i} \displaystyle\sum_{s\in V_j} k_i^{(n-1)}(s)\, p(s,h), & \text{if } i \ne j,\\[1ex] 0, & \text{if } i = j, \end{cases}$$
where $k_i^{(n)}(s) := \mathbb{P}\bigl[X_n = s,\ \forall l \le n : X_l \notin V_i^\times \mid X_0 = o\bigr]$ for $s \in V_*^\times \setminus V_i$. Thus, $\bigl(\widetilde{W}_k, \tau_k\bigr)_{k\in\mathbb{N}}$ is also Markovian and has the following transition probabilities if $i \ne j$:
$$q\bigl((g,i),(h,j)\bigr) = \sum_{n\ge 1} \tilde q\bigl((g,*,i),(h,n,j)\bigr) = \frac{1-\xi_j}{1-\xi_i} \sum_{s\in V_j} \sum_{n\ge 1} k_i^{(n-1)}(s)\, p(s,h) = \frac{1-\xi_j}{1-\xi_i} \sum_{s\in V_j} \frac{L_j(o_j, s|\xi_j)}{1 - \bar H_i(1)}\, p(s,h) = \frac{\alpha_j\,\xi_i\,(1-\xi_j)}{\alpha_i\,\xi_j\,(1-\xi_i)}\, L_j(o_j, h|\xi_j).$$
In the third equality we conditioned on the last visit of $o$ before finally walking from $o$ to $s$, and we remark that $h \in V_j^\times$. A straightforward computation shows that $\pi$ is the invariant probability measure of $\bigl(\widetilde{W}_k, \tau_k\bigr)_{k\in\mathbb{N}}$, where we write $A := \bigl\{(g,i) \mid i \in I,\ g \in V_i^\times\bigr\}$:
$$\begin{aligned}
\sum_{(g,i)\in A} \pi(g,i)\cdot q\bigl((g,i),(h,j)\bigr) &= \sum_{(g,i)\in A} \sum_{k\in I} \nu(k)\cdot q\bigl((*,k),(g,i)\bigr)\cdot q\bigl((*,i),(h,j)\bigr)\\
&= \sum_{i\in I} q\bigl((*,i),(h,j)\bigr) \sum_{k\in I} \nu(k) \sum_{g\in V_i^\times} q\bigl((*,k),(g,i)\bigr) = \sum_{i\in I} q\bigl((*,i),(h,j)\bigr) \sum_{k\in I} \nu(k)\cdot \hat q(k,i)\\
&= \sum_{i\in I} q\bigl((*,i),(h,j)\bigr)\cdot \nu(i) = \pi(h,j).
\end{aligned}$$
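The structure of this computation, namely that the kernel factorizes as $q\bigl((g,i),(h,j)\bigr) = \hat q(i,j)\, m_j(h)$ with $m_j(h)$ the normalized values $L_j(o_j,h|\xi_j)$, and that $\pi(h,j) = \nu(j)\, m_j(h)$ is then invariant, can be checked numerically. All matrices below are toy placeholders.

```python
import numpy as np

qhat = np.array([[0.0, 0.7, 0.3],
                 [0.4, 0.0, 0.6],
                 [0.5, 0.5, 0.0]])          # toy kernel of (tau_k)
eigvals, eigvecs = np.linalg.eig(qhat.T)
nu = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
nu /= nu.sum()

m = [np.array([0.2, 0.8]), np.array([0.5, 0.5]), np.array([0.1, 0.9])]  # m_j(h)

# pi(g, i) = sum_j nu(j) q((*, j), (g, i)) collapses to nu(i) * m_i(g).
pi = {(g, i): nu[i] * m[i][g] for i in range(3) for g in range(2)}

# Invariance: sum over (g,i) of pi(g,i) q((g,i),(h,j)) equals pi(h,j).
for j in range(3):
    for h in range(2):
        total = sum(pi[(g, i)] * qhat[i, j] * m[j][h]
                    for i in range(3) for g in range(2))
        assert np.isclose(total, pi[(h, j)])
print("pi is invariant for the toy factorized kernel")
```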
Now we are able to prove the following:
Proposition 3.2. There is a number $\ell \in \mathbb{R}$ such that
$$\ell = \lim_{n\to\infty} \frac{l(X_n)}{n} \quad\text{almost surely.}$$
Proof. Define $h : A \to \mathbb{R}$ by $h(g,j) := l(g)$. Then $\sum_{\lambda=1}^{k} h\bigl(\widetilde{W}_\lambda, \tau_\lambda\bigr) = \sum_{\lambda=1}^{k} l\bigl(\widetilde{W}_\lambda\bigr) = l(W_k)$. An application of the ergodic theorem for positive recurrent Markov chains yields
$$\frac{l(W_k)}{k} = \frac{1}{k} \sum_{\lambda=1}^{k} h\bigl(\widetilde{W}_\lambda, \tau_\lambda\bigr) \xrightarrow{k\to\infty} C_h := \int h \, d\pi,$$
if the integral on the right hand side exists. We now show that this property holds. Observe that the values $G_j(o_j, g|\xi_j)$ are uniformly bounded from above for all $(g,j) \in A$:
$$G_j(o_j, g|\xi_j) = \sum_{n\ge 0} p_j^{(n)}(o_j, g)\,\xi_j^{n} \le \frac{1}{1-\xi_j} \le \frac{1}{1-\xi_{\max}}.$$
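This geometric-series bound is easy to confirm numerically: every entry of $(I - \xi P)^{-1}$ is at most $1/(1-\xi)$ for a stochastic $P$, since each $p^{(n)}(o,g) \le 1$. A quick toy check (illustrative matrix only):

```python
import numpy as np

rng = np.random.default_rng(3)
P = rng.random((6, 6)); P /= P.sum(axis=1, keepdims=True)  # toy stochastic matrix
xi = 0.7
G = np.linalg.inv(np.eye(6) - xi * P)   # entries are sums p^(n)(u,v) xi^n
print(G.max() <= 1.0 / (1.0 - xi))      # True for every pair (u, v)
```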
For $g \in V_*^\times$, denote by $|g|$ the smallest $n \in \mathbb{N}$ such that $p_{\tau(g)}^{(n)}\bigl(o_{\tau(g)}, g\bigr) > 0$. Uniform irreducibility of the random walks $P_i$ on $V_i$ implies that there are some $\epsilon > 0$ and $K \in \mathbb{N}$ such that for all $j \in I$ and all $x_j, y_j \in V_j$ with $p_j(x_j, y_j) > 0$ we have $p_j^{(k)}(x_j, y_j) \ge \epsilon$ for some $k \le K$. Thus, for $(g,j) \in A$ we have
$$G_j(o_j, g|\xi_j) \ge \epsilon^{|g|}\, \xi_j^{|g|\cdot K} \ge \bigl(\epsilon\, \xi_{\min}^{K}\bigr)^{|g|}.$$
Observe that the inequality $|g| \cdot \bigl|\log\bigl(\epsilon\,\xi_{\min}^{K}\bigr)\bigr| > \bigl|\log(1-\xi_{\max})\bigr|$ holds if and only if $|g| > \log(1-\xi_{\max}) \big/ \log\bigl(\epsilon\,\xi_{\min}^{K}\bigr)$. Define the sets
$$M_1 := \Bigl\{ g \in V_*^\times \;\Big|\; |g| \ge \frac{\log(1-\xi_{\max})}{\log\bigl(\epsilon\,\xi_{\min}^{K}\bigr)} \Bigr\}, \qquad M_2 := \Bigl\{ g \in V_*^\times \;\Big|\; |g| < \frac{\log(1-\xi_{\max})}{\log\bigl(\epsilon\,\xi_{\min}^{K}\bigr)} \Bigr\}.$$
Recall Equation (2.4). We can now prove existence of $\int h \, d\pi$:
$$\begin{aligned}
\int |h| \, d\pi &= \sum_{(g,j)\in A} \bigl|\log L_j(o_j, g|\xi_j)\bigr| \cdot \pi(g,j)\\
&\le \sum_{(g,j)\in A} \bigl|\log G_j(o_j, g|\xi_j)\bigr| \cdot \pi(g,j) + \sum_{(g,j)\in A} \bigl|\log G_j(o_j, o_j|\xi_j)\bigr| \cdot \pi(g,j)\\
&\le \sum_{(g,j)\in A:\, g\in M_1} \bigl|\log G_j(o_j, g|\xi_j)\bigr| \cdot \pi(g,j) + \sum_{(g,j)\in A:\, g\in M_2} \bigl|\log G_j(o_j, g|\xi_j)\bigr| \cdot \pi(g,j) + \max_{j\in I} \bigl|\log G_j(o_j, o_j|\xi_j)\bigr|\\
&\le \sum_{(g,j)\in A:\, g\in M_1} \Bigl|\log \bigl(\epsilon\,\xi_{\min}^{K}\bigr)^{|g|}\Bigr| \cdot \pi(g,j) + \sum_{(g,j)\in A:\, g\in M_2} \bigl|\log(1-\xi_{\max})\bigr| \cdot \pi(g,j) + \max_{j\in I} \bigl|\log G_j(o_j, o_j|\xi_j)\bigr|\\
&\le \bigl|\log\bigl(\epsilon\,\xi_{\min}^{K}\bigr)\bigr| \cdot \sum_{(g,j)\in A:\, g\in M_1} |g| \cdot \pi(g,j) + \bigl|\log(1-\xi_{\max})\bigr| + \max_{j\in I} \bigl|\log G_j(o_j, o_j|\xi_j)\bigr| < \infty,
\end{aligned}$$
since $\sum_{(g,j)\in A} |g| \cdot \pi(g,j) < \infty$; see [11, Proof of Prop. 3.2]. From this it follows that $l(W_k)/k$ tends to $C_h$ almost surely. The next step is to show that
$$\frac{l(X_n) - l\bigl(W_{\mathbf{k}(n)}\bigr)}{n} \xrightarrow{n\to\infty} 0 \quad\text{almost surely.} \tag{3.1}$$
To prove this, assume now that we have the representations
$$W_{\mathbf{k}(n)} = g_1 g_2 \dots g_{\mathbf{k}(n)} \quad\text{and}\quad X_n = g_1 g_2 \dots g_{\mathbf{k}(n)} \dots g_{\|X_n\|}.$$
Define $M := \max\bigl\{ \bigl|\log\bigl(\epsilon\,\xi_{\min}^{K}\bigr)\bigr|,\ \bigl|\log(1-\xi_{\max})\bigr| \bigr\}$. Then:
$$\begin{aligned}
\bigl| l(X_n) - l\bigl(W_{\mathbf{k}(n)}\bigr) \bigr| &= \Biggl|\, -\sum_{i=\mathbf{k}(n)+1}^{\|X_n\|} \log L_{\tau(g_i)}\bigl(o_{\tau(g_i)}, g_i \big| \xi_{\tau(g_i)}\bigr) \Biggr| \le \sum_{i=\mathbf{k}(n)+1}^{\|X_n\|} \Biggl| \log \frac{G_{\tau(g_i)}\bigl(o_{\tau(g_i)}, g_i \big| \xi_{\tau(g_i)}\bigr)}{G_{\tau(g_i)}\bigl(o_{\tau(g_i)}, o_{\tau(g_i)} \big| \xi_{\tau(g_i)}\bigr)} \Biggr|\\
&\le \sum_{\substack{i=\mathbf{k}(n)+1:\\ g_i \in M_1}}^{\|X_n\|} \bigl|\log G_{\tau(g_i)}\bigl(o_{\tau(g_i)}, g_i \big| \xi_{\tau(g_i)}\bigr)\bigr| + \sum_{\substack{i=\mathbf{k}(n)+1:\\ g_i \in M_2}}^{\|X_n\|} \bigl|\log G_{\tau(g_i)}\bigl(o_{\tau(g_i)}, g_i \big| \xi_{\tau(g_i)}\bigr)\bigr| + \bigl(\|X_n\| - \mathbf{k}(n)\bigr)\cdot\bigl|\log(1-\xi_{\max})\bigr|\\
&\le \sum_{\substack{i=\mathbf{k}(n)+1:\\ g_i \in M_1}}^{\|X_n\|} \Bigl|\log\bigl(\epsilon\,\xi_{\min}^{K}\bigr)^{|g_i|}\Bigr| + \sum_{\substack{i=\mathbf{k}(n)+1:\\ g_i \in M_2}}^{\|X_n\|} \bigl|\log(1-\xi_{\max})\bigr| + \bigl(\|X_n\| - \mathbf{k}(n)\bigr)\cdot\bigl|\log(1-\xi_{\max})\bigr|\\
&\le \sum_{\substack{i=\mathbf{k}(n)+1:\\ g_i \in M_1}}^{\|X_n\|} |g_i|\cdot M + \sum_{\substack{i=\mathbf{k}(n)+1:\\ g_i \in M_2}}^{\|X_n\|} M + \bigl(\|X_n\| - \mathbf{k}(n)\bigr)\cdot M \;\le\; 3 \cdot M \cdot \bigl(n - \mathbf{e}_{\mathbf{k}(n)}\bigr).
\end{aligned}$$
Dividing the last inequality by $n$ and letting $n \to \infty$ yields, analogously to Nagnibeda and Woess [23, Section 5], that $\lim_{n\to\infty} \bigl( l(X_n) - l(W_{\mathbf{k}(n)}) \bigr)/n = 0$ almost surely. Recall also that $k/\mathbf{e}_k \to \ell_0$ and $\mathbf{e}_{\mathbf{k}(n)}/n \to 1$ almost surely; compare [23, Proof of Theorem D] and [11, Prop. 3.2, Thm. 3.3]. Now we can conclude:
$$\frac{l(X_n)}{n} = \frac{l(X_n) - l\bigl(W_{\mathbf{k}(n)}\bigr)}{n} + \frac{l\bigl(W_{\mathbf{k}(n)}\bigr)}{\mathbf{k}(n)} \cdot \frac{\mathbf{k}(n)}{\mathbf{e}_{\mathbf{k}(n)}} \cdot \frac{\mathbf{e}_{\mathbf{k}(n)}}{n} \xrightarrow{n\to\infty} C_h \cdot \ell_0 \quad\text{almost surely.} \tag{3.2}$$
In particular, $\ell = C_h \cdot \ell_0$.
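The ergodic-theorem step of this proof can be illustrated by simulation: running a toy $(\widetilde{W}_k, \tau_k)$ chain with a factorized kernel as in the Lemma 3.1 sketch, the empirical averages of $h$ approach $\int h \, d\pi$. All parameters below, including the letter lengths (some negative), are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

qhat = np.array([[0.0, 0.7, 0.3],
                 [0.4, 0.0, 0.6],
                 [0.5, 0.5, 0.0]])                    # toy kernel of (tau_k)
m = np.array([[0.2, 0.8], [0.5, 0.5], [0.1, 0.9]])    # toy letter laws m_j(h)
lvals = np.array([[0.3, -0.1], [0.2, 0.4], [-0.2, 0.5]])  # toy letter lengths l(g)

tau, steps, total = 0, 200_000, 0.0
for _ in range(steps):
    tau = rng.choice(3, p=qhat[tau])        # next block type tau_{k+1}
    g = rng.choice(2, p=m[tau])             # next final letter W~_{k+1}
    total += lvals[tau, g]                  # accumulate h(W~_k, tau_k) = l(W~_k)

eigvals, eigvecs = np.linalg.eig(qhat.T)
nu = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
nu /= nu.sum()
C_h = sum(nu[j] * m[j] @ lvals[j] for j in range(3))  # integral of h w.r.t. pi
print(total / steps, C_h)                   # the two numbers should be close
```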
We now compute the constant $C_h$ from the last proposition explicitly:
$$C_h = \sum_{(g,j)\in A} l(g) \cdot \sum_{i\in I} \nu(i)\, q\bigl((*,i),(g,j)\bigr) = \sum_{i,j\in I,\, i\ne j}\; \sum_{g\in V_j^\times} \Bigl( -\log L_j(o_j, g|\xi_j) \Bigr)\, \nu(i)\, \frac{\alpha_j\,\xi_i\,(1-\xi_j)}{\alpha_i\,\xi_j\,(1-\xi_i)}\, L_j(o_j, g|\xi_j). \tag{3.3}$$
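For a toy model with finitely many letters per factor, (3.3) can be evaluated directly. The numbers below are placeholders; in particular, the values standing in for $L_j(o_j,g|\xi_j)$ are not derived from an actual walk.

```python
import math

# Illustrative placeholders: two blocks, two letters per block.
I_set = [0, 1]
alpha = [0.5, 0.5]; xi = [0.6, 0.7]; nu = [0.5, 0.5]
Lvals = {0: [0.3, 0.4], 1: [0.2, 0.5]}     # stands in for L_j(o_j, g | xi_j)

C_h = 0.0
for i in I_set:
    for j in I_set:
        if i == j:
            continue
        coef = nu[i] * (alpha[j] * xi[i] * (1 - xi[j])) \
                     / (alpha[i] * xi[j] * (1 - xi[i]))
        C_h += coef * sum(-math.log(L) * L for L in Lvals[j])  # -log L_j * L_j
print(C_h)
```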
We conclude this subsection with the following observation:

Corollary 3.3. The rate of escape $\ell$ is non-negative and it is the rate of escape w.r.t. the Greenian metric, which is given by $d_{\mathrm{Green}}(x,y) := -\log F(x,y|1)$. That is,
$$\ell = \lim_{n\to\infty} -\frac{1}{n}\,\log F\bigl(e, X_n \big| 1\bigr) \;\ge\; 0.$$

Proof. By (2.4), we get
$$\ell = \lim_{n\to\infty} \Bigl( -\frac{1}{n}\,\log F\bigl(e, X_n \big| 1\bigr) - \frac{1}{n}\,\log G\bigl(X_n, X_n \big| 1\bigr) + \frac{1}{n}\,\log G(o,o|1) \Bigr).$$
Since $F\bigl(e, X_n \big| 1\bigr) \le 1$, it remains to show that $G(x,x|1)$ is uniformly bounded in $x \in V$: for $v, w \in V$, the first visit generating function is defined as
$$U(v,w|z) := \sum_{n\ge 1} \mathbb{P}\bigl[X_n = w,\ \forall m \in \{1,\dots,n-1\}: X_m \ne w \mid X_0 = v\bigr]\, z^n. \tag{3.4}$$
Therefore,
$$G(x,x|z) = \sum_{n\ge 0} U(x,x|z)^n = \frac{1}{1 - U(x,x|z)}.$$
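The renewal identity $G(x,x|z) = 1/(1-U(x,x|z))$ can again be verified by linear algebra for a toy finite chain; `P`, `x` and `z` below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.random((4, 4)); P /= P.sum(axis=1, keepdims=True)  # toy chain
x, z = 0, 0.85
n = P.shape[0]

G = np.linalg.inv(np.eye(n) - z * P)

# First-return generating function U(x,x|z): one step, then first
# passage back to x (taboo at x for the intermediate steps).
keep = [u for u in range(n) if u != x]
Q = P[np.ix_(keep, keep)]
f = np.linalg.solve(np.eye(n - 1) - z * Q, z * P[keep, x])  # F(w,x|z), w != x
U = z * P[x, x] + z * P[x, keep] @ f

print(np.isclose(G[x, x], 1.0 / (1.0 - U)))  # G(x,x|z) = 1/(1 - U(x,x|z))
```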
Since $U(x,x|z) < 1$ for all $z \in [1, R)$, $U(x,x|0) = 0$ and $U(x,x|z)$ is continuous, strictly increasing and strictly convex, we must have $U(x,x|1) \le \frac{1}{R}$ (indeed, convexity together with $U(x,x|0) = 0$ implies that $U(x,x|z)/z$ is non-decreasing, whence $U(x,x|1) \le U(x,x|z)/z < 1/z$ for all $z \in (1,R)$), that is, $1 \le G(x,x|1) \le \bigl(1 - \frac{1}{R}\bigr)^{-1}$. This finishes the proof.
3.2 Asymptotic Entropy