Stochastic Processes and their Applications 90 (2000) 243–262
www.elsevier.com/locate/spa

The necessary and sufficient conditions for various self-similar sets and their dimension ☆

Dihe Hu
College of Mathematical Science, Wuhan University, Wuhan 430072, People's Republic of China

Received 30 September 1999; received in revised form 17 April 2000; accepted 2 May 2000

☆ Supported by the National Natural Science Foundation and the Doctoral Programme Foundation of China.

Abstract

In this paper we give several necessary and sufficient conditions for statistically self-similar sets and for a.s. self-similar sets, and we obtain the Hausdorff dimension and the exact Hausdorff measure function of any a.s. self-similar set. These results are useful in the study of the probability properties, fractal properties and structure of statistically recursive sets. © 2000 Elsevier Science B.V. All rights reserved.

MSC: 60G10

Keywords: Statistically self-similar sets; A.s. self-similar sets; Hausdorff dimension; Statistically recursive sets; Statistical contraction operators
0. Introduction
Hutchinson (1981) introduced the concepts of (strictly) self-similar sets and self-similar measures and obtained many important results on their fractal properties in the same paper. Falconer (1994), Graf (1987), and Mauldin and Williams (1986) independently introduced the concepts of statistically self-similar sets and measures; they also obtained the Hausdorff dimensions and Hausdorff measures of some statistically self-similar sets under certain conditions. Let us preview the results of the present paper. First we introduce the concept of a.s. self-similar sets and the concept of a statistically self-similar measure, in combination with that of a statistically self-similar set. We then give several necessary and sufficient conditions for a statistically recursive set to be a statistically self-similar set or an a.s. self-similar set. Finally, we obtain the Hausdorff dimension and the exact Hausdorff measure function of any a.s. self-similar set.
1. Notations and lemmas
Let (Ω, F, P) be a complete probability space and (E, ρ) a separable complete metric space. K(E) denotes the collection of all non-empty compact subsets of E, and δ is the Hausdorff metric on K(E); that is, for all I, J ∈ K(E),

    δ(I, J) = sup{ρ(x, I), ρ(y, J): x ∈ J, y ∈ I},   ρ(x, I) = inf{ρ(x, y): y ∈ I}.

(K(E), δ) is also a separable complete metric space.
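To make the Hausdorff metric concrete, here is a small illustrative sketch (not part of the original paper) that evaluates δ for two finite point sets in ℝ^d, where the suprema and infima reduce to maxima and minima:

```python
import numpy as np

def hausdorff_distance(I, J):
    """Hausdorff distance between finite non-empty point sets I, J in R^d.

    delta(I, J) = sup{ rho(x, I), rho(y, J) : x in J, y in I },
    where rho(x, I) = inf_{y in I} |x - y|; for finite sets sup/inf are max/min.
    """
    I, J = np.asarray(I, dtype=float), np.asarray(J, dtype=float)
    # pairwise distances: dist[a, b] = |I[a] - J[b]|
    dist = np.linalg.norm(I[:, None, :] - J[None, :, :], axis=-1)
    sup_over_J = dist.min(axis=0).max()   # sup_{x in J} rho(x, I)
    sup_over_I = dist.min(axis=1).max()   # sup_{y in I} rho(y, J)
    return max(sup_over_J, sup_over_I)

if __name__ == "__main__":
    I = [[0.0], [1.0]]
    J = [[0.0], [0.5], [2.0]]
    print(hausdorff_distance(I, J))  # 1.0
```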
Let N ≥ 2 be a fixed integer, C_0 = {∅}, C_n = C_n(N) = {0, 1, ..., N−1}^n (n ≥ 1), D = ∪_{n≥0} C_n, C = {0, 1, ..., N−1}^ℕ, ℕ = {1, 2, ...}. For σ ∈ C ∪ D, |σ| is the length (or dimension) of σ. For σ = (σ_1, σ_2, ..., σ_n) ∈ D and τ = (τ_1, τ_2, ...), the sequence σ∗τ = (σ_1, σ_2, ..., σ_n, τ_1, τ_2, ...) is the juxtaposition of σ and τ, and σ|k = (σ_1, ..., σ_k) (if |σ| ≥ k).
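In code terms (purely illustrative, not from the paper), addresses in D can be represented as tuples over {0, ..., N−1}, with σ∗τ the tuple concatenation and σ|k the length-k prefix:

```python
def juxtapose(sigma, tau):
    """sigma * tau: concatenation of two addresses."""
    return tuple(sigma) + tuple(tau)

def restrict(sigma, k):
    """sigma | k: the first k coordinates of sigma (requires |sigma| >= k)."""
    return tuple(sigma)[:k]

if __name__ == "__main__":
    sigma, tau = (0, 2, 1), (1, 1)
    print(juxtapose(sigma, tau))  # (0, 2, 1, 1, 1)
    print(restrict(sigma, 2))     # (0, 2)
    print(len(sigma))             # |sigma| = 3
```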
For f: E → E, we call

    Lip(f) = sup_{x≠y, x,y∈E} ρ(f(x), f(y)) / ρ(x, y)                                           (1)

the Lipschitz coefficient of f. Denote

    con(E) = {f: E → E : Lip(f) < 1};

con(E) carries the topology of pointwise convergence, {0, 1, ..., N−1} carries the discrete topology and C = {0, 1, ..., N−1}^ℕ carries the product topology.
For any topological space T, B(T) denotes the Borel σ-algebra and P(T) denotes the collection of all Borel probability measures on B(T). Let

    con(Ω, E) = {f^(ω): f^(ω): Ω → con(E), such that f^{−1}(B(con(E))) ⊂ F};

that is, con(Ω, E) consists of all random elements from (Ω, F, P) to con(E). We denote the distribution of a random element X on (Ω, F, P) by P ∘ X^{−1}.
For any subset A of a metric space, A^0, Ā and diam(A) denote the interior, closure and diameter of A. We always assume diam(E) < ∞.
Definition 1. For {f_0, ..., f_{N−1}} ⊂ con(E) and K ∈ K(E), we call K an (f_0, ..., f_{N−1})-(strictly) self-similar set iff K = ∪_{i=0}^{N−1} f_i(K).
(For A ⊂ E we always denote the image of A under f by f(A).)
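As a quick illustration (added here, not from the paper), the sketch below iterates the map J ↦ ∪_i f_i(J) for the two similitudes f_0(x) = x/3 and f_1(x) = x/3 + 2/3 on [0, 1]; the iterates converge in the Hausdorff metric to the middle-thirds Cantor set, the (f_0, f_1)-strictly self-similar set:

```python
def hutchinson_iterate(points, maps, n_iter):
    """Iterate J -> union_i f_i(J), starting from a finite set of sample points."""
    for _ in range(n_iter):
        points = sorted({f(x) for f in maps for x in points})
    return points

if __name__ == "__main__":
    f0 = lambda x: x / 3.0               # Lip(f0) = 1/3
    f1 = lambda x: x / 3.0 + 2.0 / 3.0   # Lip(f1) = 1/3
    approx = hutchinson_iterate([0.0, 1.0], [f0, f1], n_iter=6)
    print(len(approx), approx[:4])  # 128 endpoints of the level-6 Cantor intervals
```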
Let {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E) and Q ∈ P(K(E)). We call Q a P-(f_0, ..., f_{N−1}) statistically self-similar measure iff, for every B ∈ B(K(E)),

    Q(B) = P × Q^N ({(ω, J_0, ..., J_{N−1}) ∈ Ω × K(E)^N : ∪_{i=0}^{N−1} f_i^(ω)(J_i) ∈ B}).          (2)

We call a random element K*(ω) from (Ω, F, P) to K(E) a P-(f_0, ..., f_{N−1}) statistically self-similar set iff its distribution P ∘ (K*)^{−1} is a P-(f_0, ..., f_{N−1}) statistically self-similar measure.
We call K*(ω) a P-(f_0, ..., f_{N−1}) a.s. self-similar set iff

    K*(ω) = ∪_{i=0}^{N−1} f_i^(ω)(K*(ω_i)),   P^{N+1}-a.s. (ω, ω_0, ..., ω_{N−1}).                    (3)
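For intuition about the statistically recursive constructions studied below (a sketch under assumptions of my own choosing, not taken from the paper), the following simulates one sample of such a construction on E = [0, 1]: at every node of the binary address tree an independent pair of random contraction ratios is drawn, and the level-n approximation is the union of the images g_{σ|1} ∘ ⋯ ∘ g_{σ|n}(E) over σ ∈ C_n.

```python
import random

def random_similitudes(rng):
    """Draw one i.i.d. pair (f_0, f_1) of random similitudes of [0, 1].

    Illustrative assumption only: ratios uniform on [0.2, 0.45],
    f_0 anchored at 0 and f_1 anchored at 1, so the two images are disjoint.
    """
    r0, r1 = rng.uniform(0.2, 0.45), rng.uniform(0.2, 0.45)
    return (lambda x: r0 * x, lambda x: 1.0 - r1 * (1.0 - x))

def level_n_intervals(n, rng):
    """Return the 2^n intervals g_{sigma|1} o ... o g_{sigma|n}([0, 1]), sigma in C_n."""
    intervals = [(0.0, 1.0)]
    for _ in range(n):
        new = []
        for (a, b) in intervals:
            f0, f1 = random_similitudes(rng)   # fresh i.i.d. pair at this node
            # children of [a, b]: rescale the unit-interval images of f0 and f1
            new.append((a + (b - a) * f0(0.0), a + (b - a) * f0(1.0)))
            new.append((a + (b - a) * f1(0.0), a + (b - a) * f1(1.0)))
        intervals = new
    return intervals

if __name__ == "__main__":
    rng = random.Random(0)
    print(level_n_intervals(3, rng))  # 8 random sub-intervals of [0, 1]
```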
Lemma 1. Let f: E → E. Then:
(1) if f is continuous, then f(K(E)) ⊂ K(E);
(2) Lip(f): con(E) → [0, 1) is lower-semicontinuous;
(3) f(J): con(E) × K(E) → K(E) is continuous;
(4) g(J_1, ..., J_n) = ∪_{i=1}^{n} J_i : K(E)^n → K(E) is continuous;
(5) h(f_1, ..., f_m) = f_1 ∘ ⋯ ∘ f_m : con(E)^m → con(E) is continuous.

Proof. Cf. Graf (1987).
Lemma 2. If {f^(ω), f_1^(ω), ..., f_m^(ω)} ⊂ con(Ω, E), then
(1) f^(ω)(J): Ω × K(E) → K(E) is Borel measurable;
(2) f_1^(ω) ∘ ⋯ ∘ f_m^(ω) ∈ con(Ω, E), and f_1^(ω) ∘ ⋯ ∘ f_m^(ω)(J) is Borel measurable from Ω × K(E) to K(E).

Proof. It is easy to prove this lemma from the definitions.
Lemma 3. Let (W_i, G_i, Q_i) be complete probability spaces (i = 1, 2). For every fixed σ ∈ D let
    h_σ^(u): W_1 → con(E),   h_σ^{−1}(B(con(E))) ⊂ G_1,
    L(v): W_2 → K(E),   L^{−1}(B(K(E))) ⊂ G_2,
    M(v): W_2 → K(E),   M^{−1}(B(K(E))) ⊂ G_2.
If L =_d M, i.e. Q_2 ∘ L^{−1} = Q_2 ∘ M^{−1}, then, for any m ≥ 1, we have

    ∪_{σ∈C_m} h_σ^(u)(L(v)) =_d ∪_{σ∈C_m} h_σ^(u)(M(v))   (for Q_1 × Q_2).
Proof. Let 1_A be the indicator function of A. Then, for any B ∈ B(K(E)), we have

    (Q_1 × Q_2)({(u, v) ∈ W_1 × W_2 : ∪_{σ∈C_m} h_σ^(u)(L(v)) ∈ B})
        = ∫_{W_1} Q_1(du) ∫_{W_2} Q_2(dv) 1_{{(u,v): ∪_{σ∈C_m} h_σ^(u)(L(v)) ∈ B}}(u, v)
        = ∫_{W_1} Q_1(du) ∫_{K(E)} (Q_2 ∘ L^{−1})(dJ) 1_{{(u,J): ∪_{σ∈C_m} h_σ^(u)(J) ∈ B}}(u, J)
        = ∫_{W_1} Q_1(du) ∫_{K(E)} (Q_2 ∘ M^{−1})(dJ) 1_{{(u,J): ∪_{σ∈C_m} h_σ^(u)(J) ∈ B}}(u, J)
        = (Q_1 × Q_2)({(u, v) ∈ W_1 × W_2 : ∪_{σ∈C_m} h_σ^(u)(M(v)) ∈ B}).

Lemma 3 is proved.
Lemma 4. Let (Ω, F, P) be any complete probability space, let {(g_{σ∗0}^(ω), ..., g_{σ∗(N−1)}^(ω)); σ ∈ D} be a collection of i.i.d. random elements from (Ω, F, P) to con(E)^N, and let g_{n,σ}^(ω) = g_{σ|1}^(ω) ∘ ⋯ ∘ g_{σ|n}^(ω) (σ ∈ C_n, n ≥ 1). If

    P({ω ∈ Ω : Lip(g_i^(ω)) < λ_i}) = 1   (0 < λ_i < 1, 0 ≤ i < N),

then

    lim_{n→∞} sup_{σ∈C_n} Lip(g_{n,σ}^(ω)) = 0,   P-a.s. ω.
Proof. Let μ_σ = P ∘ (Lip(g_{σ|1}^(ω)), ..., Lip(g_{σ|n}^(ω)))^{−1} (σ ∈ C_n, n ≥ 1) be the P-distribution of (Lip(g_{σ|1}^(ω)), ..., Lip(g_{σ|n}^(ω))). (It is a random element from Ω to [0, 1)^n by Lemmas 1 and 2.) Since {(g_{σ∗0}^(ω), ..., g_{σ∗(N−1)}^(ω)), σ ∈ D} are i.i.d., for all α, β ∈ D with |α| ≠ |β|, Lip(g_α^(ω)) and Lip(g_β^(ω)) are independent, and Lip(g_{σ∗i}^(ω)) =_d Lip(g_i^(ω)). Hence, for any σ = (σ_1, ..., σ_n) ∈ C_n, we have

    μ_σ = ×_{k=1}^{n} P ∘ (Lip(g_{σ|k}^(ω)))^{−1} = ×_{k=1}^{n} P ∘ (Lip(g_{σ_k}^(ω)))^{−1} = ×_{k=1}^{n} μ_{σ_k}.

For any ε > 0, let λ = max_{0≤i<N} λ_i. We have

    P({ω : ∏_{k=1}^{n} Lip(g_{σ|k}^(ω)) > ε}) = μ_σ({(t_1, ..., t_n) ∈ [0, 1)^n : ∏_{k=1}^{n} t_k > ε})
        = (×_{k=1}^{n} μ_{σ_k})({(t_1, ..., t_n) ∈ [0, 1)^n : ∏_{k=1}^{n} t_k > ε})
        = (×_{k=1}^{n} μ_{σ_k})({(t_1, ..., t_n) ∈ (0, λ)^n : ∏_{k=1}^{n} t_k > ε}) = 0
          (when |σ| = n > n_0 := (log ε)/(log λ)),

since {(t_1, ..., t_n) ∈ (0, λ)^n : ∏_{k=1}^{n} t_k > ε} is the empty set when n > (log ε)/(log λ). Hence, when |σ| = n > (log ε)/(log λ), we have

    P(sup_{σ∈C_n} Lip(g_{n,σ}^(ω)) > ε) ≤ P(sup_{σ∈C_n} ∏_{k=1}^{n} Lip(g_{σ|k}^(ω)) > ε)
        ≤ ∑_{σ∈C_n} P(∏_{k=1}^{n} Lip(g_{σ|k}^(ω)) > ε) = 0.

Lemma 4 follows from the Borel–Cantelli lemma and the above inequality.
Lemma 5. Let (Ω, F, P) be any complete probability space, {(g_{σ∗0}^(ω), ..., g_{σ∗(N−1)}^(ω)); σ ∈ D} ⊂ con(Ω, E),

    Ω_0 = {ω ∈ Ω : lim_{n→∞} ∏_{k=1}^{n} Lip(g_{σ|k}^(ω)) = 0, ∀σ ∈ C},                             (4)

    Ω_1 = {ω ∈ Ω : lim_{n→∞} sup_{σ∈C_n} ∏_{k=1}^{n} Lip(g_{σ|k}^(ω)) = 0};                         (5)

then Ω_0 = Ω_1.
Proof. For ε > 0 let

    A_n(ω) = {σ ∈ C : ∏_{k=1}^{n} Lip(g_{σ|k}^(ω)) < ε};                                            (6)

then {A_n(ω), n ≥ 1} is an increasing collection of open sets in the compact space C. For every ω ∈ Ω_0 we have ∪_{k=1}^{∞} A_k(ω) ⊃ C, so there is an integer n_0 such that

    A_n(ω) = ∪_{k=1}^{n} A_k(ω) = C   (∀n ≥ n_0).

This means Ω_0 ⊂ Ω_1; and Ω_1 ⊂ Ω_0 is obvious.
Lemma 6. Let (Ω, F, P) and {(g_{σ∗0}^(ω), ..., g_{σ∗(N−1)}^(ω)); σ ∈ D} be defined as in Lemma 5, and let G: con(E)^N → [0, 1) be measurable,

    Ω_G = {ω ∈ Ω : ∏_{n=1}^{∞} G(g_{(σ|(n−1))∗0}^(ω), ..., g_{(σ|(n−1))∗(N−1)}^(ω)) = 0, ∀σ ∈ C};      (7)

then Ω_G ∈ F.
Proof. We can prove, as in Lemma 5, that

    Ω_G = {ω ∈ Ω : ∀m ≥ 1, ∃k ∈ ℕ such that ∏_{n=1}^{k} G(g_{(σ|(n−1))∗0}^(ω), ..., g_{(σ|(n−1))∗(N−1)}^(ω)) < 1/m, ∀σ ∈ C_k}
        = ∩_{m=1}^{∞} ∪_{k=1}^{∞} ∩_{σ∈C_k} {ω ∈ Ω : ∏_{n=1}^{k} G(g_{(σ|(n−1))∗0}^(ω), ..., g_{(σ|(n−1))∗(N−1)}^(ω)) < 1/m} ∈ F.
Lemma 7. Let Ω^D = ×_{σ∈D} Ω_σ, F^D = ×_{σ∈D} F_σ, P^D = ×_{σ∈D} P_σ, where (Ω_σ, F_σ, P_σ) ≡ (Ω, F, P) (∀σ ∈ D), and let θ_σ be the coordinate map on Ω^D:

    θ_σ : Ω^D → Ω,   θ_σ((ω_τ, τ ∈ D)) = ω_σ   (∀σ ∈ D).

Define F: Ω × (Ω^D)^N → Ω^D by, for all ω_∅ ∈ Ω and (ω̄^(0), ..., ω̄^(N−1)) ∈ (Ω^D)^N,

    F(ω_∅, ω̄^(0), ..., ω̄^(N−1)) = ω̄ = (ω_∅, ω_{i∗σ} = θ_σ(ω̄^(i)), i ∈ C_1, σ ∈ D).

Then F is a measurable map, i.e. F^{−1}(F^D) ⊂ F × (F^D)^N, and

    P^D(A) = (P × (P^D)^N)(F^{−1}(A))   (∀A ∈ F^D).                                                 (8)
Proof. For all k ≥ 0, σ ∈ ∪_{n=0}^{k+1} C_n, A_σ ∈ F, we have

    F^{−1}({ω̄ ∈ Ω^D : θ_σ(ω̄) ∈ A_σ, ∀σ ∈ ∪_{n=0}^{k+1} C_n})
        = F^{−1}({ω̄ ∈ Ω^D : θ_∅(ω̄) ∈ A_∅, θ_{i∗σ}(ω̄) ∈ A_{i∗σ}, ∀0 ≤ i < N, σ ∈ ∪_{n=0}^{k} C_n})
        = {(ω_∅, ω̄^(0), ..., ω̄^(N−1)) : ω_∅ ∈ A_∅, θ_σ(ω̄^(i)) ∈ A_{i∗σ}, ∀0 ≤ i < N, σ ∈ ∪_{n=0}^{k} C_n}
        = A_∅ × ×_{i=0}^{N−1} ( ×_{σ∈∪_{n=0}^{k} C_n} A_{i∗σ} × ×_{σ∈∪_{n>k} C_n} Ω_σ ) ∈ F × (F^D)^N;

hence F^{−1}(F^D) ⊂ F × (F^D)^N. By the Fubini theorem and the definition of F we get (8).
Lemma 8. Let {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E), (Ω̄, F̄, P̄) = (Ω^D, F^D, P^D), g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)) (i ∈ C_1, σ ∈ D); let G and Ω̄_G be defined as in Lemma 6 and F as in Lemma 7. Then

    P̄(Ω̄_G) = P^D(Ω̄_G) = 1.                                                                        (9)
Proof. We denote the element of Ω̄ = Ω^D by ω̄ = (ω_σ, σ ∈ D) = (θ_σ(ω̄), σ ∈ D). Let

    H(ω_σ) = G(f_0^(ω_σ), ..., f_{N−1}^(ω_σ)),

    A_a = {ω̄ ∈ Ω̄ : ∃σ ∈ C such that ∏_{n=1}^{∞} G(g_{(σ|(n−1))∗0}^(ω̄), ..., g_{(σ|(n−1))∗(N−1)}^(ω̄)) ≥ a}
        = {ω̄ ∈ Ω̄ : ∃σ ∈ C such that ∏_{n=1}^{∞} G(f_0^(ω_{σ|(n−1)}), ..., f_{N−1}^(ω_{σ|(n−1)})) ≥ a}
        = {ω̄ ∈ Ω̄ : ∃σ ∈ C s.t. ∏_{n=1}^{∞} H(ω_{σ|(n−1)}) ≥ a};                                    (10)

then

    F^{−1}(A_a) = {(ω_∅, ω̄^(0), ..., ω̄^(N−1)) ∈ Ω × (Ω^D)^N : (ω_∅, ω_{i∗σ} = θ_σ(ω̄^(i)), i ∈ C_1, σ ∈ D)
        satisfies: ∃ 0 ≤ σ_1 < N, ∃τ = (σ_2, σ_3, ...) ∈ C such that H(ω_∅) ∏_{n=2}^{∞} H(ω_{σ_1∗(τ|(n−2))}) ≥ a}.   (11)
As in the proof of Ω̄_G ∈ F̄, we can prove A_a ∈ F̄ (∀a > 0). Hence p(a) := P̄(A_a) is a non-increasing function and

    Ω̄_G = Ω̄ − ∪_{m=1}^{∞} A_{1/m}.                                                                 (12)

So, in order to prove P̄(Ω̄_G) = 1, it is enough to prove

    p(a) ≡ 0   (∀a ∈ (0, 1)).                                                                       (13)

From 0 ≤ H < 1 we have

    H(ω_∅) ∏_{n=2}^{∞} H(ω_{σ_1∗(τ|(n−2))}) ≥ a  ⇒  H(ω_∅) ≥ a  and  ∏_{n=2}^{∞} H(ω_{σ_1∗(τ|(n−2))}) ≥ a.   (14)

Since H(ω_∅) and ∏_{n=2}^{∞} H(ω_{σ_1∗(τ|(n−2))}) are independent under the measure P × (P^D)^N, we get

    p(a) = P^D(A_a) = (P × (P^D)^N)(F^{−1}(A_a)) ≤ N P(H(ω_∅) ≥ a) p(a)   (∀a ∈ (0, 1))               (15)

by (11) and (14). Since H < 1, there is b ∈ (0, 1) such that

    P(H(ω_∅) ≥ b) < 1/N².                                                                           (16)

It follows from (15), (16) and N ≥ 2 that

    p(b) = 0.                                                                                       (17)

Hence

    β := inf{a ∈ (0, 1): p(a) = 0} < 1.                                                             (18)

If β > 0, then we can choose a ∈ (β, 1) such that

    ab < β,   p(a) = 0                                                                              (19)

by the definition of β, p(·) being non-increasing, and b ∈ (0, 1). By (8) and the definitions of p(·) and A_a we have

    p(ab) = (P × (P^D)^N)(F^{−1}(A_{ab}))
        ≤ N · (P × (P^D)^N)({(ω_∅, ω̄^(0), ..., ω̄^(N−1)) ∈ Ω × (Ω^D)^N : (ω_∅, ω_{i∗σ} = θ_σ(ω̄^(i)), i ∈ C_1, σ ∈ D)
              satisfies ∃σ ∈ C such that H(ω_∅) ∏_{n=1}^{∞} H(ω_{σ_1∗(σ|(n−1))}) ≥ ab})
        = N · (P × P^D)({(ω_∅, ω̄^(σ_1)) ∈ Ω × Ω^D : ∃σ ∈ C s.t. H(ω_∅) ∏_{n=1}^{∞} H(θ_{σ|(n−1)}(ω̄^(σ_1))) ≥ ab}),   (20)

    1 = 1 − p(a) = P^D(Ω^D − A_a) = P^D({ω̄ ∈ Ω^D : ∏_{n=1}^{∞} H(θ_{σ|(n−1)}(ω̄)) < a, ∀σ ∈ C}).      (21)

Hence, from (20), (21) and (16), we have

    p(ab) ≤ N · (P × P^D)({(ω_∅, ω̄^(σ_1)) ∈ Ω × Ω^D : ∃σ ∈ C s.t. H(ω_∅) ≥ b, ∏_{n=1}^{∞} H(θ_{σ|(n−1)}(ω̄^(σ_1))) ≥ ab})
        = N · P(H(ω_∅) ≥ b) p(ab) ≤ (1/N) p(ab).

So p(ab) = 0. But ab < β, which is a contradiction. Hence β = 0. Lemma 8 is proved.
Lemma 9. Suppose diam(E) < ∞, (Ω, F, P) is a complete probability space, {g_σ; σ ∈ D} ⊂ con(Ω, E), and g_{n,σ} = g_{σ|1} ∘ ⋯ ∘ g_{σ|n} (σ ∈ C_n, n ≥ 1). Let

    K = ∩_{n=1}^{∞} ∪_{σ∈C_n} g_{n,σ}(E).

If

    lim_{n→∞} sup_{σ∈C_n} ∏_{k=1}^{n} Lip(g_{σ|k}) = 0,   a.s.,

then:
(I) if {J_σ; σ ∈ D} is any collection of random elements from (Ω, F, P) to K(E), then

    lim_{n→∞} δ(K, ∪_{σ∈C_n} g_{n,σ}(J_σ)) = 0,   a.s.;

(II) K is a random element from (Ω, F, P) to K(E).
Proof. (I) Let Γ_n = ∪_{σ∈C_n} g_{n,σ}(J_σ); we want to show

    lim_{m,n→∞} δ(Γ_n, Γ_m) = 0   a.s.

For m > n, σ ∈ C_n, τ ∈ C_{m−n}, let L_{σ∗τ} = g_{σ∗(τ|1)} ∘ ⋯ ∘ g_{σ∗τ}(J_{σ∗τ}); then

    g_{m,σ∗τ}(J_{σ∗τ}) = g_{n,σ}(L_{σ∗τ}).

For every x ∈ Γ_n there is a σ ∈ C_n such that x ∈ g_{n,σ}(J_σ) ⊂ g_{n,σ}(E). Hence

    g_{n,σ}(E) ∩ Γ_m ⊃ g_{n,σ}(E) ∩ ∪_{τ∈C_{m−n}} g_{n,σ}(L_{σ∗τ}) ⊃ g_{n,σ}( ∪_{τ∈C_{m−n}} L_{σ∗τ} ) ≠ ∅.

So

    ρ(x, Γ_m) ≤ diam(g_{n,σ}(E)) ≤ diam(E) ∏_{k=1}^{n} Lip(g_{σ|k}).                                 (∗)

For every y ∈ Γ_m there are σ ∈ C_n, τ ∈ C_{m−n} such that

    y ∈ g_{m,σ∗τ}(J_{σ∗τ}) = g_{n,σ}(L_{σ∗τ}).

Hence there is t ∈ L_{σ∗τ} such that y = g_{n,σ}(t), and then

    ρ(y, Γ_n) = ρ(g_{n,σ}(t), ∪_{σ∈C_n} g_{n,σ}(J_σ)) ≤ ρ(g_{n,σ}(t), g_{n,σ}(J_σ)) ≤ diam(E) ∏_{k=1}^{n} Lip(g_{σ|k}).   (∗∗)

It follows from (∗), (∗∗) and the definition of δ that

    lim_{m,n→∞} δ(Γ_n, Γ_m) = 0   a.s.

But K(E) is complete, hence there is K′ ∈ K(E) such that

    lim_{n→∞} δ(K′, Γ_n) = 0   a.s.

It is easy to show that K′ ⊂ K. On the other hand, we can also prove K ⊂ K′. In fact, if there were an x ∈ K − K′, then ρ(x, K′) > 0 and there would be σ_n ∈ C_n such that

    x ∈ g_{n,σ_n}(E)   (∀n ≥ 1).

But by the conditions of this lemma there is an n_0 such that

    diam(g_{n,σ_n}(E)) ≤ diam(E) Lip(g_{n,σ_n}) < ½ ρ(x, K′)   (∀n ≥ n_0)   a.s.

Hence it follows from x ∈ g_{n,σ_n}(E) and the above inequality that

    g_{n,σ_n}(J_{σ_n}) ⊂ g_{n,σ_n}(E) ⊂ {y : ρ(x, y) < ½ ρ(x, K′)}   (∀n ≥ n_0)   a.s.,

and then

    ρ(g_{n,σ_n}(J_{σ_n}), K′) ≥ ½ ρ(x, K′)   (∀n ≥ n_0)   a.s.

Hence

    δ(K′, Γ_n) ≥ sup{ρ(K′, y) : y ∈ Γ_n} ≥ sup{ρ(K′, y) : y ∈ g_{n,σ_n}(J_{σ_n})}
        ≥ ρ(K′, g_{n,σ_n}(J_{σ_n})) ≥ ½ ρ(x, K′) > 0   (∀n ≥ n_0)   a.s.

This is a contradiction. So K = K′.
(II) It follows from Lemma 1 immediately that K is a random element.
2. Statistically self-similar sets

In this section we give several necessary and sufficient conditions for a statistically recursive set to be a statistically self-similar set.
First of all, we give the following convergence theorem.
Theorem 1. Let {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E) be fixed, (Ω̄, F̄, P̄) = (Ω^D, F^D, P^D), and let g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)) be as in Lemma 8, g_{n,σ}^(ω̄) = g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄) (σ ∈ C_n, n ≥ 1). Let

    K̄(ω̄) = ∩_{n=1}^{∞} ∪_{σ∈C_n} g_{n,σ}^(ω̄)(E);                                                   (22)

then we have:
(i) If {J_σ(ω̄); σ ∈ D} is any collection of random elements from (Ω^D, F^D, P^D) to K(E), then

    lim_{n→∞} δ(K̄(ω̄), ∪_{σ∈C_n} g_{n,σ}^(ω̄)(J_σ(ω̄))) = 0   (P^D-a.s. ω̄ ∈ Ω^D).                    (23)

(ii) K̄(·) is a random element from (Ω^D, F^D, P^D) to K(E).

Proof. Let Ω_0 and Ω_1 be defined as in Lemma 5. Taking G(h_0, ..., h_{N−1}) = max_{0≤i<N} Lip(h_i) in Lemma 6, we have Ω_0 = Ω_1 ⊃ Ω̄_G. It follows from Lemma 8 that

    P̄(Ω_0) = P̄(Ω_1) = 1,                                                                           (24)

and then Theorem 1 is true by Lemma 9.
Theorem 2. Let (Ω, F, P) be any complete probability space and {(g_{σ∗0}^(ω), ..., g_{σ∗(N−1)}^(ω)); σ ∈ D} be a collection of i.i.d. random elements from (Ω, F, P) to con(E)^N. Let

    K(ω) = ∩_{n=1}^{∞} ∪_{σ∈C_n} g_{n,σ}^(ω)(E)   (ω ∈ Ω),
    g_{n,σ}^(ω) = g_{σ|1}^(ω) ∘ ⋯ ∘ g_{σ|n}^(ω)   (σ ∈ C_n, n ≥ 1).

If lim_{n→∞} sup_{σ∈C_n} Lip(g_{n,σ}^(ω)) = 0 (P-a.s. ω), then K(ω) is a P-(g_0^(ω), ..., g_{N−1}^(ω)) statistically self-similar set.

Proof. Cf. Hu (1999a, b, Theorem 2.3).
Theorem 3. Let K̄(ω̄) be as in Theorem 1 and K*(ω) be a random element from (Ω, F, P) to K(E). Then the following statements are equivalent:
(a) K*(ω) is a P-(f_0, ..., f_{N−1}) statistically self-similar set;
(b) for every n ≥ 1 we have

    K*(θ_∅(ω̄)) =_d ∪_{σ∈C_n} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)(K*(θ_σ(ω̄)))                               (25)

    (=_d denotes equality in distribution);
(c) P ∘ (K*)^{−1} = P^D ∘ (K̃)^{−1} = P^D ∘ K̄^{−1},   where K̃(ω̄) := K*(θ_∅(ω̄)).                     (26)

Proof. (a) ⇒ (b): Suppose K*(ω) is a statistically self-similar set. It is easy to see that (25) is true for n = 1. Suppose (25) is true for n ≤ m. Since

    ∪_{σ∈C_{m+1}} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|(m+1)}^(ω̄)(K*(θ_σ(ω̄)))
        = ∪_{σ∈C_m} (g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|m}^(ω̄)) ( ∪_{i∈C_1} g_{σ∗i}^(ω̄)(K*(θ_{σ∗i}(ω̄))) )          (27)

and

    K*(ω_∅) =_d ∪_{i∈C_1} f_i^(ω_∅)(K*(ω_i)),                                                        (28)

we have

    ∪_{i∈C_1} g_{σ∗i}^(ω̄)(K*(θ_{σ∗i}(ω̄))) = ∪_{i∈C_1} f_i^(ω_σ)(K*(ω_{σ∗i})) =_d K*(ω_σ) = K*(θ_σ(ω̄)).   (29)

It follows from (29), (27), Lemma 3 and the fact that (25) is true for n ≤ m that

    ∪_{σ∈C_{m+1}} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|(m+1)}^(ω̄)(K*(θ_σ(ω̄)))
        =_d ∪_{σ∈C_m} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|m}^(ω̄)(K*(θ_σ(ω̄))) =_d K*(θ_∅(ω̄)).                      (30)

So (b) is true.
(b) ⇒ (c): Suppose (b) is true. Let J_σ(ω̄) = K*(θ_σ(ω̄)). Then the right-hand side of (25) equals ∪_{σ∈C_n} g_{n,σ}^(ω̄)(J_σ(ω̄)). It follows from Theorem 1 that

    lim_{n→∞} δ(K̄(ω̄), ∪_{σ∈C_n} g_{n,σ}^(ω̄)(J_σ(ω̄))) = 0   (P^D-a.s. ω̄).                           (31)

Hence

    ∪_{σ∈C_n} g_{n,σ}^(ω̄)(J_σ(ω̄)) →_W K̄(ω̄)   (→_W means convergence in distribution).               (32)

But it follows from (b) that the distribution of ∪_{σ∈C_n} g_{n,σ}^(ω̄)(J_σ(ω̄)) does not depend on n and always equals P ∘ (K*)^{−1}, so (c) is true.
(c) ⇒ (a): Suppose (c) is true. It follows from the definition of {g_{σ∗i}^(ω̄); i ∈ C_1, σ ∈ D} that {(g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)); σ ∈ D} are i.i.d. random elements from (Ω^D, F^D, P^D) to con(E)^N. It follows from Lemma 8 (taking G(h_0, ..., h_{N−1}) = max_{0≤i<N} Lip(h_i)) and Theorem 2 that K̄(ω̄) is a P^D-(g_0^(ω̄), ..., g_{N−1}^(ω̄)) statistically self-similar set. But (c) is true, so K̃ is also a P^D-(g_0^(ω̄), ..., g_{N−1}^(ω̄)) statistically self-similar set. Hence, for every B ∈ B(K(E)), we have

    P ∘ (K*)^{−1}(B) = P^D ∘ (K̃)^{−1}(B)
        = (P^D × (P^D ∘ (K̃)^{−1})^N)({(ω̄, J_0, ..., J_{N−1}) ∈ Ω^D × K(E)^N : ∪_{i=0}^{N−1} g_i^(ω̄)(J_i) ∈ B})
        = (P × (P ∘ (K*)^{−1})^N)({(ω_∅, J_0, ..., J_{N−1}) ∈ Ω × K(E)^N : ∪_{i=0}^{N−1} f_i^(ω_∅)(J_i) ∈ B}).

This means K*(ω) is a P-(f_0^(ω), ..., f_{N−1}^(ω)) statistically self-similar set. The theorem is proved.
Theorem 4. Suppose {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E) and K*(ω) is a random element from (Ω, F, P) to K(E). Then the necessary and sufficient condition for K*(ω) to be a P-(f_0, ..., f_{N−1}) statistically self-similar set is that there exist the following objects:
(a) a complete probability space (Ω̄, F̄, P̄);
(b) a collection of maps {θ_σ; σ ∈ D}, θ_σ: Ω̄ → Ω, θ_σ^{−1}(F) ⊂ F̄;
(c) a collection of i.i.d. random elements {(g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)); σ ∈ D} from (Ω̄, F̄, P̄) to con(E)^N,
satisfying

    g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)),   P̄ ∘ θ_∅^{−1} = P   (i ∈ C_1, σ ∈ D, ω̄ ∈ Ω̄),                       (33)

    lim_{n→∞} sup_{σ∈C_n} Lip(g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)) = 0   (P̄-a.s. ω̄),                      (34)

    P ∘ (K*)^{−1} = P̄ ∘ K̄^{−1},                                                                     (35)

where

    K̄(ω̄) = ∩_{n=1}^{∞} ∪_{σ∈C_n} g_{n,σ}^(ω̄)(E)   (ω̄ ∈ Ω̄),                                         (36)

    g_{n,σ}^(ω̄) = g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)   (σ ∈ C_n, n ≥ 1).                                  (37)
Proof. Necessity: Suppose K*(ω) is a P-(f_0, ..., f_{N−1}) statistically self-similar set. Let (Ω̄, F̄, P̄) = (Ω^D, F^D, P^D), θ_σ = the coordinate map, g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)) (ω̄ ∈ Ω̄ = Ω^D, σ ∈ D, i ∈ C_1). It is easy to see that (Ω̄, F̄, P̄) is a complete probability space, {θ_σ; σ ∈ D} is a collection of measurable maps, and {(g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)); σ ∈ D} is a collection of i.i.d. random elements. Eq. (33) is obviously true. Eq. (34) follows from Lemma 8 (taking G(h_0, ..., h_{N−1}) = max_{0≤i<N} Lip(h_i)). Eq. (35) follows from Theorem 3.
Sufficiency: Suppose there are a complete probability space (Ω̄, F̄, P̄), a collection of measurable maps {θ_σ; σ ∈ D} from (Ω̄, F̄, P̄) to (Ω, F, P) and a collection of i.i.d. random elements {(g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)); σ ∈ D} from (Ω̄, F̄, P̄) to con(E)^N satisfying (33)–(37). It follows from Theorem 2 that K̄(ω̄) is a P̄-(g_0^(ω̄), ..., g_{N−1}^(ω̄)) statistically self-similar set. It follows from the definition of a statistically self-similar set and (33)–(37) that

    (P ∘ (K*)^{−1})(B) = (P̄ ∘ K̄^{−1})(B)
        = (P̄ × (P̄ ∘ K̄^{−1})^N)({(ω̄, J_0, ..., J_{N−1}) ∈ Ω̄ × K(E)^N : ∪_{i=0}^{N−1} g_i^(ω̄)(J_i) ∈ B})
        = (P̄ × (P ∘ (K*)^{−1})^N)({(ω̄, J_0, ..., J_{N−1}) ∈ Ω̄ × K(E)^N : ∪_{i=0}^{N−1} f_i^(θ_∅(ω̄))(J_i) ∈ B})
          (∀B ∈ B(K(E))).                                                                           (38)

Since P̄ ∘ θ_∅^{−1} = P, we have

    P̄({ω̄ ∈ Ω̄ : ∪_{i=0}^{N−1} f_i^(θ_∅(ω̄))(J_i) ∈ B}) = P({ω ∈ Ω : ∪_{i=0}^{N−1} f_i^(ω)(J_i) ∈ B}).   (39)

Using the Fubini theorem in (38) and noting (39), we get

    (P ∘ (K*)^{−1})(B) = (P × (P ∘ (K*)^{−1})^N)({(ω, J_0, ..., J_{N−1}) ∈ Ω × K(E)^N : ∪_{i=0}^{N−1} f_i^(ω)(J_i) ∈ B})
          (∀B ∈ B(K(E))).                                                                           (40)

The sufficiency is proved.
Remark 1. If the three objects in Theorem 4 satisfy (33), then

    P̄ ∘ (g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄))^{−1} = P ∘ (f_0^(ω), ..., f_{N−1}^(ω))^{−1}   (∀σ ∈ D).   (41)

Proof.

    P̄ ∘ (g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄))^{−1} = P̄ ∘ (g_0^(ω̄), ..., g_{N−1}^(ω̄))^{−1}
        = P̄ ∘ (f_0^(θ_∅(ω̄)), ..., f_{N−1}^(θ_∅(ω̄)))^{−1} = P ∘ (f_0^(ω), ..., f_{N−1}^(ω))^{−1}.
Remark 2. Let {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E) and let K*(ω) be a P-(f_0, ..., f_{N−1}) statistically self-similar set from (Ω, F, P) to K(E); then there are three objects as in Theorem 4 which satisfy (34)–(37) and

    g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)),   P̄ ∘ θ_σ^{−1} = P   (∀i ∈ C_1, σ ∈ D, ω̄ ∈ Ω̄).                     (33′)

Proof. In the proof of the necessity part of Theorem 4 we constructed (Ω̄, F̄, P̄) = (Ω^D, F^D, P^D), θ_σ = the coordinate map, g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)) (ω̄ ∈ Ω̄ = Ω^D, σ ∈ D, i ∈ C_1). Hence, for any A ∈ F, σ ∈ D, we have

    (P̄ ∘ θ_σ^{−1})(A) = P̄({ω̄ ∈ Ω^D : θ_σ(ω̄) ∈ A}) = P^D({ω̄ ∈ Ω^D : ω_σ ∈ A}) = P_σ({ω_σ ∈ A}) = P(A).   (42)
Theorem 5. Let {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E) satisfy

    P({ω ∈ Ω : Lip(f_i^(ω)) < λ_i}) = 1   (λ_i < 1, 0 ≤ i < N),                                      (43)

and let K*(ω) be a random element from (Ω, F, P) to K(E). Then K*(ω) is a P-(f_0, ..., f_{N−1}) statistically self-similar set if and only if there are three objects (Ω̄, F̄, P̄), {θ_σ; σ ∈ D} and {(g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)); σ ∈ D} as in Theorem 4 which satisfy (33) and (35)–(37).

Proof. There is no need to prove the necessity part. For the sufficiency it is enough to prove that, under condition (43), the three objects constructed as before, which satisfy (33) and (35)–(37), always satisfy (34). In fact,

    P̄(Lip(g_i^(ω̄)) < λ_i) = P̄(Lip(f_i^(θ_∅(ω̄))) < λ_i) = P(Lip(f_i^(ω)) < λ_i) = 1.                  (44)

Hence it follows from Lemma 4 that (34) is true.
3. A.s. self-similar sets

In this section we give several necessary and sufficient conditions for a statistically recursive set to be an a.s. self-similar set.
Theorem 6. Let K̄(ω̄) be defined as in Theorem 1 and K*(ω) be a random element from (Ω, F, P) to K(E); then the following statements are equivalent:
(a) K*(ω) is a P-(f_0, ..., f_{N−1}) a.s. self-similar set;
(b) for every n ≥ 1 we have

    K*(θ_∅(ω̄)) = ∪_{σ∈C_n} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)(K*(θ_σ(ω̄)))   for P̄ = P^D-a.s. ω̄;          (45)

(c) K*(θ_∅(ω̄)) = K̄(ω̄),   P^D-a.s. ω̄.                                                               (46)
Proof. (a) ⇒ (b): Suppose (a) is true. It follows from the definition of an a.s. self-similar set that (45) is true for n = 1. If (45) is true for n ≤ m, then

    ∪_{σ∈C_{m+1}} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|(m+1)}^(ω̄)(K*(θ_σ(ω̄)))
        = ∪_{σ∈C_m} (g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|m}^(ω̄)) ( ∪_{i∈C_1} g_{σ∗i}^(ω̄)(K*(θ_{σ∗i}(ω̄))) )
        = ∪_{σ∈C_m} (g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|m}^(ω̄)) ( ∪_{i∈C_1} f_i^(θ_σ(ω̄))(K*(θ_{σ∗i}(ω̄))) )
        = ∪_{σ∈C_m} (g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|m}^(ω̄))(K*(θ_σ(ω̄)))
        = K*(θ_∅(ω̄))   P^D-a.s. ω̄.

Hence (45) is true for every n ≥ 1.
(b) ⇒ (c): Suppose (b) is true. Taking J_σ(ω̄) = K*(θ_σ(ω̄)) in Theorem 1, it follows from Theorem 1 that

    lim_{n→∞} δ(K̄(ω̄), ∪_{σ∈C_n} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)(K*(θ_σ(ω̄)))) = 0,   P^D-a.s. ω̄.       (47)

Hence (c) is true by (b) and (47).
(c) ⇒ (a): Suppose (c) is true. It follows from Lemma 1 that

    h = h_1 ∘ h_2 ∘ ⋯ ∘ h_s : con(E)^s → con(E),   J = ∪_{i=1}^{t} J_i : K(E)^t → K(E)

are continuous maps; hence, from Theorem 1, we know

    K̄(ω̄) = lim_{n→∞} ∪_{σ∈C_n} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)(K*(θ_σ(ω̄)))
        = lim_{n→∞} ∪_{i∈C_1} g_i^(ω̄) ( ∪_{τ∈C_{n−1}} g_{i∗(τ|1)}^(ω̄) ∘ ⋯ ∘ g_{i∗τ}^(ω̄)(K*(θ_{i∗τ}(ω̄))) )
        = ∪_{i∈C_1} g_i^(ω̄) ( lim_{n→∞} ∪_{τ∈C_{n−1}} g_{i∗(τ|1)}^(ω̄) ∘ ⋯ ∘ g_{i∗τ}^(ω̄)(K*(θ_{i∗τ}(ω̄))) )
          (P^D-a.s. ω̄; the limits are taken in the Hausdorff metric δ).                             (48)

Let ω̄^(i) = (ω_σ^(i), σ ∈ D) = (ω_{i∗σ}, σ ∈ D) and τ = (τ_1, ..., τ_{n−1}); then by Theorem 1

    lim_{n→∞} ∪_{τ∈C_{n−1}} g_{i∗(τ|1)}^(ω̄) ∘ ⋯ ∘ g_{i∗τ}^(ω̄)(K*(θ_{i∗τ}(ω̄)))
        = lim_{n→∞} ∪_{τ∈C_{n−1}} f_{τ_1}^(ω_i) ∘ ⋯ ∘ f_{τ_{n−1}}^(ω_{i∗(τ|(n−2))})(K*(θ_τ(ω̄^(i))))
        = lim_{n→∞} ∪_{τ∈C_{n−1}} g_{τ|1}^(ω̄^(i)) ∘ ⋯ ∘ g_{τ|(n−1)}^(ω̄^(i))(K*(θ_τ(ω̄^(i))))
        = K̄(ω̄^(i)),   P^D-a.s. ω̄^(i) (in the Hausdorff metric δ).                                   (49)

From (48) and (49) we have

    K̄(ω̄) = ∪_{i∈C_1} g_i^(ω̄)(K̄(ω̄^(i))),   (P^D)^{N+1}-a.s. (ω̄, ω̄^(0), ..., ω̄^(N−1)).                (50)

This means that K̄(ω̄) is a P^D-(g_0^(ω̄), ..., g_{N−1}^(ω̄)) a.s. self-similar set, and then so is K*(θ_∅(ω̄)) by (c). Since g_i^(ω̄) = f_i^(θ_∅(ω̄)), K*(ω) is a P-(f_0^(ω), ..., f_{N−1}^(ω)) a.s. self-similar set. Theorem 6 is proved.
Theorem 7. Let {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E). Then a random element K*(ω) from (Ω, F, P) to K(E) is a P-(f_0, ..., f_{N−1}) a.s. self-similar set if and only if there are the following three objects:
(a) a complete probability space (Ω̄, F̄, P̄);
(b) a collection of measurable maps {θ_σ; σ ∈ D}, θ_σ: Ω̄ → Ω, θ_σ^{−1}(F) ⊂ F̄ (∀σ ∈ D);
(c) a collection of i.i.d. random elements {(g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)); σ ∈ D} from (Ω̄, F̄, P̄) to con(E)^N,
which satisfy

    g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)),   P̄ ∘ θ_σ^{−1} = P   (i ∈ C_1, σ ∈ D, ω̄ ∈ Ω̄),                       (51)

    lim_{n→∞} sup_{σ∈C_n} Lip(g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)) = 0   (P̄-a.s. ω̄),                      (52)

    K*(θ_∅(ω̄)) = K̄(ω̄),   P̄-a.s. ω̄,                                                                 (53)

where

    K̄(ω̄) = ∩_{n=1}^{∞} ∪_{σ∈C_n} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)(E)   (ω̄ ∈ Ω̄).                        (54)
Proof. Necessity: We construct the three objects as in Theorem 4. It is easy to see that (51) and (52) are true, and (53) follows from Theorem 6.
Sufficiency: If there are three objects defined as in the theorem which satisfy (51)–(53), then, using the same argument as in the proof of "(c) ⇒ (a)" in Theorem 6, we can prove that K̄(ω̄) is a P̄-(g_0^(ω̄), ..., g_{N−1}^(ω̄)) a.s. self-similar set. Hence K*(ω) is a P-(f_0^(ω), ..., f_{N−1}^(ω)) a.s. self-similar set by condition (53). The theorem is proved.
4. Dimension and measure

In this section we give the Hausdorff dimension and the exact Hausdorff measure function of any a.s. self-similar set.
Theorem 8. Let (Ω, F, P) be a complete probability space, N ≥ 2 an integer, {f_0^(ω), ..., f_{N−1}^(ω)} ⊂ con(Ω, E), and let K*(ω) be any P-(f_0, ..., f_{N−1}) a.s. self-similar set satisfying:
(1) E is the closure of its interior E^0, E ⊂ ℝ^d, E is compact;
(2) {f_0, ..., f_{N−1}} ⊂ sicon(Ω, E) := {f ∈ con(Ω, E): f^(ω) is a similitude from E to E for every fixed ω ∈ Ω};
(3) f_i(E^0) ⊂ E^0, f_i(E^0) ∩ f_j(E^0) = ∅ (0 ≤ i, j < N, i ≠ j);
(4) min_{0≤i<N} Lip(f_i) > 0;
(5) sup_{ω∈Ω, 0≤i<N} Lip(f_i^(ω)) ≤ λ < 1.
Then

    P({ω : dim(K*(ω)) = α}) = 1,                                                                    (55)

where α is the unique solution of the equation

    E^P( ∑_{i=0}^{N−1} Lip(f_i^(ω))^α ) = 1,   0 ≤ α ≤ d.                                             (56)

Here E^P is the expectation operator w.r.t. the probability measure P and dim(·) is the Hausdorff dimension.
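For a concrete feel for equation (56) (an illustration added here, not part of the paper), the sketch below approximates the expectation by Monte Carlo over samples of the random contraction ratios and solves for α by bisection; with the ratios degenerate at (1/3, 1/3) it recovers the Cantor value log 2 / log 3. The ratio distribution used in the second call is an assumption chosen purely for the example.

```python
import random

def solve_alpha(sample_ratios, d, n_samples=20000, tol=1e-6, seed=0):
    """Solve E[ sum_i r_i(omega)**alpha ] = 1 for alpha in [0, d] by bisection.

    `sample_ratios(rng)` returns one draw (r_0, ..., r_{N-1}) of the ratios
    Lip(f_i^(omega)); the expectation is approximated by Monte Carlo.
    The left-hand side is decreasing in alpha (all r_i < 1), so bisection applies.
    """
    rng = random.Random(seed)
    draws = [sample_ratios(rng) for _ in range(n_samples)]

    def phi(alpha):
        return sum(sum(r ** alpha for r in rs) for rs in draws) / len(draws)

    lo, hi = 0.0, float(d)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if phi(mid) > 1.0 else (lo, mid)
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    # Deterministic middle-thirds Cantor case: r_0 = r_1 = 1/3 -> alpha = log 2 / log 3.
    print(solve_alpha(lambda rng: (1/3, 1/3), d=1))                     # ~0.6309
    # Random case (illustrative assumption): two i.i.d. ratios uniform on [0.2, 0.45].
    print(solve_alpha(lambda rng: (rng.uniform(0.2, 0.45),
                                   rng.uniform(0.2, 0.45)), d=1))
```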
In order to prove Theorem 8, we need the following lemma.

Lemma 10. Suppose (Ω, F, P) is any complete probability space and {(g_{σ∗i}^(ω), 0 ≤ i < N): σ ∈ D} are i.i.d. random elements from Ω to con(E)^N satisfying:
(1) E is the closure of its interior E^0, E ⊂ ℝ^d, E is compact;
(2) {g_σ; σ ∈ D} ⊂ sicon(Ω, E);
(3) g_{n,σ}(E^0) ∩ g_{n,τ}(E^0) = ∅, g_{n,σ} = g_{σ|1} ∘ ⋯ ∘ g_{σ|n} (∀σ, τ ∈ C_n, σ ≠ τ, n ≥ 1);
(4) min_{0≤i<N} Lip(g_i) > 0;
(5) 2 ≤ N < ∞;
(6) sup_{σ∈D−{∅}} Lip(g_σ) = λ < 1.
Let

    K(ω) = ∩_{n=1}^{∞} ∪_{σ∈C_n} g_{σ|1}^(ω) ∘ ⋯ ∘ g_{σ|n}^(ω)(E);

then

    P(dim(K(ω)) = α) = 1,

where α is the unique solution of the equation

    E^P( ∑_{0≤i<N} Lip(g_i^(ω))^α ) = 1,   0 ≤ α ≤ d.

This lemma is just Theorem 1 in Hu (1999a, b).
Proof of Theorem 8. For any P-(f_0, ..., f_{N−1}) a.s. self-similar set K*(ω) there are, according to the proof of necessity in Theorem 7, three objects:
(a) a complete probability space (Ω̄, F̄, P̄) = (Ω^D, F^D, P^D);
(b) a collection of measurable maps {θ_σ; σ ∈ D}, θ_σ(ω̄) = ω_σ (when ω̄ = (ω_σ, σ ∈ D) ∈ Ω^D), θ_σ^{−1}(F) ⊂ F̄;
(c) a collection of i.i.d. random elements {(g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)); σ ∈ D} from (Ω̄, F̄, P̄) to con(E)^N,
which satisfy (51)–(53) and

    K̄(ω̄) := ∩_{n=1}^{∞} ∪_{σ∈C_n} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)(E).                                  (54)

Since dim(·): K(E) → [0, ∞) is measurable, it follows from (53) and (51) that

    P̄({ω̄ : dim(K̄(ω̄)) = α}) = P̄({ω̄ : dim(K*(θ_∅(ω̄))) = α})
        = (P̄ ∘ θ_∅^{−1})({ω : dim(K*(ω)) = α}) = P({ω : dim(K*(ω)) = α}).                            (57)

Similarly, Lip(·): con(E) → [0, 1) is also measurable, hence

    E^P̄( ∑_{i=0}^{N−1} Lip(g_i^(ω̄))^α ) = E^P( ∑_{i=0}^{N−1} Lip(f_i^(ω))^α ),                          (58)

and then α is the unique solution of the equation

    E^P̄( ∑_{i=0}^{N−1} Lip(g_i^(ω̄))^α ) = 1,   0 ≤ α ≤ d.                                              (59)

In order to prove Theorem 8, it is therefore enough to prove

    P̄(dim(K̄(ω̄)) = α) = 1,                                                                           (60)

where α is the unique solution of Eq. (59). By Lemma 10, to prove (60) it is enough to check that conditions (1)–(6) of Lemma 10 are satisfied. Since in Theorem 8 we have g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)) (i ∈ C_1, σ ∈ D, ω̄ ∈ Ω̄ = Ω^D), conditions (1)–(4) of Lemma 10 follow from conditions (1)–(4) of Theorem 8 immediately. Condition (5) of Lemma 10 is obviously true, so we only need to verify condition (6) of Lemma 10. In fact, by condition (5) of Theorem 8 we have

    sup_{σ∈D−{∅}} Lip(g_σ^(ω̄)) = sup_{0≤i<N, σ∈D} Lip(g_{σ∗i}^(ω̄)) = sup_{0≤i<N, σ∈D} Lip(f_i^(θ_σ(ω̄)))
        ≤ sup_{0≤i<N, ω∈Ω} Lip(f_i^(ω)) ≤ λ < 1.

The theorem is proved.
Lemma 11. Under the conditions of Lemma 10, if
(1) P(min_{0≤i<N} Lip(g_i) ≥ c) = 1 (for some c > 0);
(2) P(∑_{0≤i<N} Lip(g_i)^α = 1) = 1,
then

    P(0 < H^φ(K(ω)) < ∞) = 1,

where H^φ(·) is the Hausdorff measure defined by φ(s) = s^α.

This lemma is just Theorem 2 in Hu (1999a, b).
Theorem 9. Under the conditions of Theorem 8, if
(1) P(min_{0≤i<N} Lip(f_i) ≥ c) = 1 (for some c > 0);
(2) P(∑_{0≤i<N} Lip(f_i)^α = 1) = 1,
then

    P(0 < H^φ(K*(ω)) < ∞) = 1;

that is, the exact Hausdorff measure function of K* is φ(s) = s^α.

Proof. Since g_{σ∗i}^(ω̄) = f_i^(θ_σ(ω̄)) and P̄ ∘ θ_σ^{−1} = P (i ∈ C_1, σ ∈ D, ω̄ ∈ Ω̄ = Ω^D) in this theorem, we get Theorem 9 from Lemma 11 immediately.
Remark 3. If f_i^(ω) does not depend on ω (0 ≤ i < N), then Theorems 8 and 9 become Theorem 1(ii) in Hutchinson (1981, p. 737).
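As a worked instance of this deterministic case (added for illustration; the numbers are the standard middle-thirds Cantor example, not taken from the paper), with N = 2 and Lip(f_0) = Lip(f_1) = 1/3, equation (56) reduces to an elementary computation:

```latex
% Deterministic case of (56): the expectation drops out and
% \sum_{i=0}^{N-1} \operatorname{Lip}(f_i)^{\alpha} = 1.
% For N = 2 and \operatorname{Lip}(f_0) = \operatorname{Lip}(f_1) = 1/3:
2 \cdot \left(\tfrac{1}{3}\right)^{\alpha} = 1
\;\Longleftrightarrow\;
\alpha = \frac{\log 2}{\log 3} \approx 0.6309.
% Theorem 9 then gives 0 < H^{\varphi}(K^*) < \infty for \varphi(s) = s^{\alpha},
% since \sum_i (1/3)^{\alpha} = 1 holds with probability one.
```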
Remark 4. Take Ω = con(E)^N and Ω̄ = Ω^D = (con(E)^N)^D in Theorems 8 and 9. For any element ω̄ ∈ Ω̄ write ω̄ = (ω_σ, σ ∈ D), ω_∅ = (S_0, ..., S_{N−1}) ∈ Ω = con(E)^N, ω_σ = (S_{σ∗0}, ..., S_{σ∗(N−1)}) (σ ∈ D), and (f_0^(ω), ..., f_{N−1}^(ω)) = ω (∀ω ∈ Ω = con(E)^N); then

    (g_{σ∗0}^(ω̄), ..., g_{σ∗(N−1)}^(ω̄)) = (f_0^(θ_σ(ω̄)), ..., f_{N−1}^(θ_σ(ω̄))) = θ_σ(ω̄) = ω_σ = (S_{σ∗0}, ..., S_{σ∗(N−1)})
        (∀ω̄ = (ω_σ, σ ∈ D)).

Hence

    K̄(ω̄) = ∩_{n≥1} ∪_{σ∈C_n} g_{σ|1}^(ω̄) ∘ ⋯ ∘ g_{σ|n}^(ω̄)(E) = ∩_{n≥1} ∪_{σ∈C_n} S_{σ|1} ∘ ⋯ ∘ S_{σ|n}(E);

i.e. K̄(ω̄) is just the random set defined in Theorem 2.2 in Graf (1987).
For the complete probability space (Ω̄, F̄, P̄) = (Ω^D, F^D, P^D) we proved, in the proof of (c) ⇒ (a) in Theorem 6, that K̄(ω̄) is also a P̄-(g_0, ..., g_{N−1}) a.s. self-similar set. In this case, Theorems 8 and 9 become Theorems 7.6 and 7.8 in Graf (1987), respectively.
Remark 5. Remark 4 tells us that the random set K̄ defined in Theorem 2.2 in Graf (1987) is not only a P̄-(g_0, ..., g_{N−1}) statistically self-similar set but also a P̄-(g_0, ..., g_{N−1}) a.s. self-similar set. Hence Theorems 7.6 and 7.8 in Graf (1987) are special cases of Theorems 8 and 9, respectively.
5. For further reading
The following references are also of interest to the reader: Graf et al., 1988; Matthias,
1991.
References

Falconer, K.J., 1994. The multifractal spectrum of statistically self-similar measures. J. Theoret. Probab. 7, 681–702.
Graf, S., 1987. Statistically self-similar fractals. Probab. Theory Related Fields 74, 357–392.
Graf, S., Mauldin, R.D., Williams, S.C., 1988. The exact Hausdorff dimension in random recursive constructions. Mem. Amer. Math. Soc. 38, 1–121.
Hu, D., 1999a. Dimension and measure of a class of statistically recursive sets. Chinese Sci. Abstracts 5 (9), 1152.
Hu, D., 1999b. The probability character of statistically self-similar sets and measures. Acta Math. Sci. 19 (3), 338–346.
Hutchinson, J.E., 1981. Fractals and self-similarity. Indiana Univ. Math. J. 30, 713–747.
Matthias, A., 1991. Random recursive construction of self-similar fractal measures. The noncompact case. Probab. Theory Related Fields 88, 497–520.
Mauldin, R.D., Williams, S.C., 1986. Random recursive constructions: asymptotic geometric and topological properties. Trans. Amer. Math. Soc. 295, 325–346.