Definition 4.3. Let $K + 1$ be the number of equivalence classes of $\leftrightarrow$ on $I$. We denote by $(l_i)_{1\leq i\leq K}$ and $(r_i)_{1\leq i\leq K}$ the left (resp. right) endpoints of the equivalence classes:

• The equivalence classes of $\leftrightarrow$ on $I$ are $\{0\}, [l_1, r_1], \ldots, [l_{K-1}, r_{K-1}], [l_K, r_K]$.
• $0 < l_1 \leq r_1 < l_2 \leq r_2 < \ldots \leq r_{K-1} < l_K \leq r_K = \infty$.
• We have $l_K \leq M + 1$.

We denote by $P_k$, $1 \leq k \leq K$, the sub-matrices of $P$ defined by
$$P_k \overset{\text{def}}{=} \bigl(p(i,j)\bigr)_{l_k \leq i, j \leq r_k}.$$
By construction, the $P_k$ are irreducible sub-stochastic matrices and $P$ has the form
$$P = \begin{pmatrix}
1 & & & & \\
\ast & P_1 & & & \\
\ast & \ast & P_2 & & \\
\vdots & \vdots & \ddots & \ddots & \\
\ast & \ast & \cdots & \ast & P_K
\end{pmatrix},$$
where all entries above the diagonal blocks are zero, the $\ast$'s stand for possibly non-zero entries below them, and the last diagonal block $P_K$ is the one associated with the infinite class.
Remark 4.4. The sequences $(l_i)_{1\leq i\leq K}$ and $(r_i)_{1\leq i\leq K-1}$ can be explicitly expressed in terms of the positions of the zeros in the vector $(p_1, \ldots, p_M)$. By construction, we have
$$\{l_i,\ 1 \leq i \leq K\} = \{n \geq 1,\ p(n,n) \neq 0 \text{ and } p(n-1,n) = 0\},$$
$$\{r_i,\ 1 \leq i \leq K-1\} = \{n \geq 1,\ p(n,n) \neq 0 \text{ and } p(n,n+1) = 0\},$$
which we may rewrite in terms of the cookie vector:
$$\{l_i,\ 1 \leq i \leq K\} = \{n \geq 1,\ \sharp\{1 \leq j \leq 2n-1,\ p_j = 0\} = n-1 \text{ and } p_{2n-1} \neq 0\},$$
$$\{r_i,\ 1 \leq i \leq K-1\} = \{n \geq 1,\ \sharp\{1 \leq j \leq 2n-1,\ p_j = 0\} = n-1 \text{ and } p_{2n} = 0\}.$$
For example, if there is no cookie with strength 0, then $K = 1$ and $l_1 = 1$. Conversely, if all the $p_i$'s have strength 0 (the digging random walk case), then $K = 1$ and $l_1 = M + 1$.
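To make the first characterization above concrete, here is a small illustrative sketch (ours, not the paper's). It reads the endpoints off a finite truncation of $P$, assumed to be available as a NumPy array `P`; the function name `class_endpoints` is hypothetical.

```python
import numpy as np

def class_endpoints(P):
    """Illustrative sketch (not from the paper): read off the left endpoints l_i
    and the finite right endpoints r_i from a finite truncation of P, using the
    characterization of Remark 4.4:
        l-points : p(n, n) != 0 and p(n-1, n) == 0,
        r-points : p(n, n) != 0 and p(n, n+1) == 0.
    The right endpoint of the infinite class (r_K = infinity) is of course
    not visible in a finite truncation."""
    N = P.shape[0]                                   # states 0, 1, ..., N-1
    lefts = [n for n in range(1, N) if P[n, n] != 0 and P[n - 1, n] == 0]
    rights = [n for n in range(1, N - 1) if P[n, n] != 0 and P[n, n + 1] == 0]
    return lefts, rights
```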
4.2 The process Z
In order to study the branching Markov chain $L$ introduced in the previous section, it is convenient to keep track of the typical evolution of a particle of $L$: fix a deterministic sequence $(j_i)_{i\geq 0} \in \{1, \ldots, b\}^{\mathbb{N}}$ and set $x_0 \overset{\text{def}}{=} o$, $x_{i+1} \overset{\text{def}}{=} \overrightarrow{x_i}(j_i)$ for $i \geq 0$.
Define the process $Z = (Z_n)_{n\geq 0}$ by $Z_n \overset{\text{def}}{=} \ell(x_n)$. According to (c) of Lemma 3.2, given a particle $x$ located at $\ell(x)$, the positions of its $b$ children have the same law. Therefore, the law of $Z$ does not depend on the choice of the sequence $(j_i)_{i\geq 0}$. Moreover, Lemma 3.2 yields:

Lemma 4.5. Under $P_i$, the process $Z$ is a Markov chain starting from $i$, with transition matrix $P$ given in Definition 3.1.
Let us note that, if $Z_n$ is in some irreducible class $[l_k, r_k]$, it follows from Lemma 4.1 that $Z_m \leq r_k$ for all $m \geq n$. Thus, $Z$ can only move from an irreducible class $[l_k, r_k]$ to another class $[l_{k'}, r_{k'}]$ where $k' < k$. Recall also that $\{0\}$ is always an irreducible class (it is the unique absorbing state for $Z$). We introduce the absorption time
$$T \overset{\text{def}}{=} \inf\{k \geq 0,\ Z_k = 0\}. \qquad (7)$$
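As a purely illustrative aside (not part of the paper's argument), the absorption time $T$ can be estimated by direct simulation once the transition matrix $P$ of Definition 3.1 is available, say as a finite truncation stored in a NumPy array; the function name `absorption_time` and the truncation are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def absorption_time(P, i, max_steps=10_000):
    """Simulate the chain Z from state i using a finite truncation P of its
    transition matrix, and return the first hitting time of state 0,
    or None if it is not reached within max_steps."""
    if i == 0:
        return 0
    state = i
    for t in range(1, max_steps + 1):
        row = P[state]
        # A finite truncation loses some mass; renormalizing the row is a
        # crude approximation that keeps the sketch runnable.
        state = rng.choice(len(row), p=row / row.sum())
        if state == 0:
            return t
    return None
```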
Lemma 4.6. Assume that the cookie environment is such that $q < \frac{b}{b+1}$. Let $i \in \mathbb{N}$; we have
(a) $T < \infty$, $P_i$-a.s.
(b) For any $\alpha > 0$, $\sup_n E_i[Z_n^\alpha] < \infty$.

Proof.
The proof of the lemma is based on a coupling argument. Recall Definition 3.1 and notice that the sequence $(\xi_k)_{k\geq M+1}$ is i.i.d. Thus, for any stopping time $\tau$ such that $\tau \geq M+1$ a.s., the number of random variables in the sub-sequence $(\xi_k)_{k>\tau}$ taking value 1 before the first failure in this sub-sequence has a geometric distribution with parameter
$$s \overset{\text{def}}{=} P\{\xi_{M+1} = 1 \mid \xi_{M+1} \in \{0,1\}\} = \frac{q}{q + b(1-q)}.$$
It follows that, for any $i$, the number of random variables in the sequence $(\xi_k)_{k\geq 1}$ taking value 1 before the $i$-th failure is stochastically dominated by $M + G_1 + \ldots + G_i$, where $(G_k)_{k\geq 1}$ denotes a sequence of i.i.d. random variables with geometric distribution, i.e.
$$P\{G_k = n\} = (1-s)s^n \quad \text{for } n \geq 0.$$
This exactly means that, conditionally on $Z_n = i$, the distribution of $Z_{n+1}$ is stochastically dominated by $G_1 + \ldots + G_i + M$. Let us therefore introduce a new Markov chain $\tilde{Z}$ with transition probabilities
$$P\{\tilde{Z}_{n+1} = j \mid \tilde{Z}_n = i\} = P\{G_1 + \ldots + G_i + M = j\}.$$
It follows from the stochastic domination stated above that we can construct both processes $Z$ and $\tilde{Z}$ on the same space in such a way that, under $P_i$, almost surely, $Z_0 = \tilde{Z}_0 = i$ and
$$Z_n \leq \tilde{Z}_n \quad \text{for all } n \geq 1. \qquad (8)$$
The process $\tilde{Z}$ is a branching process with geometric reproduction and with $M$ immigrants at each generation. Setting
$$c \overset{\text{def}}{=} \frac{q}{b(1-q)} = E[G_1],$$
we get
$$E[\tilde{Z}_{n+1} \mid \tilde{Z}_n] = c\tilde{Z}_n + M. \qquad (9)$$
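A quick way to see relation (9) in action is to simulate the dominating chain $\tilde{Z}$ directly from its definition: each of the $\tilde{Z}_n$ current individuals produces a Geometric number of offspring supported on $\{0,1,2,\ldots\}$ with mean $c = s/(1-s)$, and $M$ immigrants arrive at every generation. The following sketch is purely illustrative; the parameter values and the function name `step` are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions, not from the paper): b-ary tree, bias q, M cookies.
b, q, M = 2, 0.5, 3                     # here q < b/(b+1) = 2/3
s = q / (q + b * (1 - q))               # success parameter of the geometric law
c = q / (b * (1 - q))                   # = s/(1-s) = E[G_1] < 1

def step(z):
    """One generation of the dominating chain: geometric offspring plus M immigrants."""
    # numpy's geometric law is supported on {1,2,...}; subtract 1 to get support {0,1,...}.
    offspring = rng.geometric(1 - s, size=z) - 1
    return offspring.sum() + M

# Empirical check of (9): E[Z~_{n+1} | Z~_n = i] should be close to c*i + M.
i = 10
samples = [step(i) for _ in range(20000)]
print(np.mean(samples), c * i + M)
```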
When $q < \frac{b}{b+1}$, we have $c < 1$, so that $\tilde{Z}_n \geq \frac{M}{1-c}$ implies $E[\tilde{Z}_{n+1} \mid \tilde{Z}_n] \leq \tilde{Z}_n$. Therefore, the process $\tilde{Z}$ stopped at its first hitting time of $[0, \frac{M}{1-c}]$ is a positive super-martingale which converges almost surely. Since no state in $(\frac{M}{1-c}, \infty)$ is absorbing for $\tilde{Z}$, we deduce that $\tilde{Z}$ hits the set $[0, \frac{M}{1-c}]$ in finite time. Using the Markov property of $\tilde{Z}$, it follows that $\tilde{Z}$ returns below $\frac{M}{1-c}$ infinitely often, almost surely. Since $Z \leq \tilde{Z}$, the same result also holds for $Z$. Furthermore, the process $Z$ has a strictly positive probability of reaching 0 from any $i \leq \frac{M}{1-c}$ in one step (because no cookie has strength 1). Thus $Z$ reaches 0 in finite time. This entails (a).
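For the reader's convenience, let us record the elementary verification behind the first sentence of this argument (this check is ours, not the paper's); since $0 < q < 1$,
$$c = \frac{q}{b(1-q)} < 1 \iff q < b(1-q) \iff q(1+b) < b \iff q < \frac{b}{b+1}.$$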
Concerning assertion (b), it suffices to prove the result for the process $\tilde{Z}$ when $\alpha$ is an integer. We prove the result by induction on $\alpha$. For $\alpha = 1$, equation (9) implies $E_i[\tilde{Z}_{n+1}] = cE_i[\tilde{Z}_n] + M$, so that $\sup_n E_i[\tilde{Z}_n] \leq \max(i, \frac{M}{1-c})$.

Let us now assume that, for any $\beta \leq \alpha$, $E_i[\tilde{Z}_n^\beta]$ is uniformly bounded in $n$. We have
$$E_i[\tilde{Z}_{n+1}^{\alpha+1}] = E_i\Bigl[E\bigl[(G_1 + \ldots + G_{\tilde{Z}_n} + M)^{\alpha+1} \,\big|\, \tilde{Z}_n\bigr]\Bigr] = c^{\alpha+1} E_i[\tilde{Z}_n^{\alpha+1}] + E_i[Q(\tilde{Z}_n)], \qquad (10)$$
where $Q$ is a polynomial of degree at most $\alpha$. Therefore the induction hypothesis yields $\sup_n |E_i[Q(\tilde{Z}_n)]| < \infty$. In view of (10), we conclude that $\sup_n E_i[\tilde{Z}_n^{\alpha+1}] < \infty$.

The following lemma roughly states that $Z$ does not reach 0 with a big jump.
Lemma 4.7. Assume that the cookie environment is such that $q < \frac{b}{b+1}$. Recall that $[l_K, \infty)$ denotes the unique infinite irreducible class of $Z$. We have
$$\inf_{j \geq l_K} P_j\{\exists n \geq 0,\ Z_n = l_K\} > 0.$$

Proof.
We introduce the stopping time $\sigma \overset{\text{def}}{=} \inf\{n > 0,\ Z_n \leq M+1\}$. We are going to prove that
$$\inf_{j > M+1} P_j\{Z_\sigma = M+1\} > 0. \qquad (11)$$
This will entail the lemma since $P_{M+1}\{Z_1 = l_K\} > 0$ (recall that $l_K \leq M+1$). According to (a) of Lemma 4.6, $\sigma$ is almost surely finite from any starting point $j$, so we can write, for $j > M+1$,
$$1 = \sum_{k=0}^{M+1} \sum_{i=M+2}^{\infty} P_j\{Z_{\sigma-1} = i \text{ and } Z_\sigma = k\} = \sum_{k=0}^{M+1} \sum_{i=M+2}^{\infty} P_j\{Z_{\sigma-1} = i\}\, \frac{p(i,k)}{\sum_{m=0}^{M+1} p(i,m)}. \qquad (12)$$
Let us for the time being admit that, for $i > M+1$ and $k \in \{0, \ldots, M+1\}$,
$$p(i,k) \leq \Bigl(\frac{b}{q}\Bigr)^{M+1} p(i, M+1). \qquad (13)$$
Then, combining (12) and (13), we get
$$1 \leq \Bigl(\frac{b}{q}\Bigr)^{M+1} (M+2) \sum_{i=M+2}^{\infty} P_j\{Z_{\sigma-1} = i\}\, \frac{p(i,M+1)}{\sum_{m=0}^{M+1} p(i,m)} = \Bigl(\frac{b}{q}\Bigr)^{M+1} (M+2)\, P_j\{Z_\sigma = M+1\},$$
which yields (11). It remains to prove (13). Recalling Definition 3.1, we have
$$p(i,k) = \sum_{n=M}^{\infty} \;\sum_{\substack{e_1,\ldots,e_n \text{ s.t.} \\ \sharp\{j\leq n,\ e_j=1\}=k \\ \sharp\{j\leq n,\ e_j=0\}=i-1}} P\{\xi_1 = e_1, \ldots, \xi_n = e_n\}\, P\{\xi_{n+1} = 0\}.$$
Keeping in mind that the $(\xi_j)_{j\geq M+1}$ are i.i.d. with $P\{\xi_j = 1\} = \frac{q}{b}$, we get, for $n \geq M$,
$$P\{\xi_{n+1} = 0\} = \Bigl(\frac{b}{q}\Bigr)^{M+1-k} P\{\xi_{n+1} = 1, \ldots, \xi_{n+M+1-k} = 1\}\, P\{\xi_{n+M+2-k} = 0\}.$$
Thus,
$$p(i,k) \leq \Bigl(\frac{b}{q}\Bigr)^{M+1-k} \sum_{\tilde{n}=M}^{\infty} \;\sum_{\substack{e_1,\ldots,e_{\tilde{n}} \text{ s.t.} \\ \sharp\{j\leq\tilde{n},\ e_j=1\}=M+1 \\ \sharp\{j\leq\tilde{n},\ e_j=0\}=i-1}} P\{\xi_1 = e_1, \ldots, \xi_{\tilde{n}} = e_{\tilde{n}}\}\, P\{\xi_{\tilde{n}+1} = 0\} \leq \Bigl(\frac{b}{q}\Bigr)^{M+1} p(i, M+1).$$
5 Proof of Theorem 1.2

The monotonicity result of Theorem 1.2 was proved in Corollary 3.8. It remains to prove the recurrence/transience criterion. The proof is split into four propositions: Propositions 5.2, 5.4, 5.5 and 5.6.
Definition 5.1. Given an irreducible non-negative matrix $Q$, its spectral radius is defined as:
$$\lambda = \lim_{n\to\infty} \bigl(q^{(n)}_{i,j}\bigr)^{\frac{1}{n}},$$
where $q^{(n)}_{i,j}$ denotes the $(i,j)$ coefficient of the matrix $Q^n$. According to Vere-Jones [23], this quantity is well defined and is independent of $i$ and $j$.
When $Q$ is a finite matrix, it follows from the classical Perron-Frobenius theory that $\lambda$ is the largest positive eigenvalue of $Q$. In particular, there exist left and right $\lambda$-eigenvectors with positive coefficients. However, when $Q$ is infinite, the situation is more complicated. In this case, one cannot ensure, without additional assumptions, the existence of left and right eigenvectors associated with the value $\lambda$. Yet, we have the following characterization of $\lambda$ in terms of right sub-invariant vectors (cf. [23], p. 372):

• $\lambda$ is the smallest value for which there exists a vector $Y$ with strictly positive coefficients such that $QY \leq \lambda Y$.

By symmetry, we have a similar characterization with left sub-invariant vectors. Let us stress that, contrary to the finite dimensional case, this characterization does not apply to super-invariant vectors: there may exist a strictly positive vector $Y$ such that $QY \geq \lambda' Y$ for some $\lambda' > \lambda$. For more details, one can refer to [19; 23].
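In the finite case, the two descriptions of $\lambda$ (the limit of $(q^{(n)}_{i,j})^{1/n}$ in Definition 5.1 and the Perron-Frobenius eigenvalue) are easy to compare numerically. The following sketch is only an illustration on an arbitrary small matrix of our own choosing; it is not part of the paper.

```python
import numpy as np

# An arbitrary irreducible non-negative (sub-stochastic) matrix, chosen for illustration.
Q = np.array([[0.2, 0.5, 0.0],
              [0.1, 0.3, 0.4],
              [0.3, 0.0, 0.2]])

# Perron-Frobenius: for a finite irreducible non-negative matrix, the spectral
# radius is the largest positive eigenvalue.
lam_pf = max(abs(np.linalg.eigvals(Q)))

# Definition 5.1: lambda = lim_n (q^(n)_{i,j})^(1/n); approximate with a large n.
n = 200
lam_lim = np.linalg.matrix_power(Q, n)[0, 0] ** (1.0 / n)

print(lam_pf, lam_lim)   # the two values should agree to a few decimals
```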
Recall that, according to Definition 4.3, $P_1, \ldots, P_K$ denote the irreducible sub-matrices of $P$. Let $\lambda_1, \ldots, \lambda_K$ stand for their associated spectral radii. We denote by $\lambda$ the largest spectral radius of these sub-matrices:
$$\lambda \overset{\text{def}}{=} \max(\lambda_1, \ldots, \lambda_K). \qquad (14)$$
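Numerically, (14) suggests a simple heuristic recipe, sketched below under the assumption that a large finite truncation of $P$ and the endpoints $l_k, r_k$ are available as NumPy data: compute the Perron root of each finite block $P_1, \ldots, P_{K-1}$ exactly, and approximate $\lambda_K$ from below by the Perron root of a large north-west truncation of the infinite block $P_K$ (any finite truncation can only underestimate it). The names and the truncation size are ours, not the paper's.

```python
import numpy as np

def perron_root(A):
    """Largest modulus of the eigenvalues of a finite non-negative matrix."""
    return max(abs(np.linalg.eigvals(A)))

def approx_lambda(P, left, right, l_K, trunc=500):
    """Heuristic approximation of (14).

    P      : large finite truncation of the transition matrix (NumPy array)
    left   : [l_1, ..., l_{K-1}]   (left endpoints of the finite classes)
    right  : [r_1, ..., r_{K-1}]   (right endpoints of the finite classes)
    l_K    : left endpoint of the infinite class
    trunc  : size of the north-west truncation used for the infinite block P_K
    """
    lams = [perron_root(P[l:r + 1, l:r + 1]) for l, r in zip(left, right)]
    # Truncating the infinite block yields a lower bound on lambda_K.
    lams.append(perron_root(P[l_K:l_K + trunc, l_K:l_K + trunc]))
    return max(lams)
```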
5.1 Proof of recurrence