parameters than the VARMA model. Pfeifer and Deutsch [6,7] further studied those models and developed the procedure for their modeling.
In the STAR model, the autoregressive parameters are assumed to be the same for all locations. This assumption is impractical, since different locations usually lead to different parameters. A more flexible model, the Generalized STAR (GSTAR) model, was proposed by Borovkova et al. [2], allowing different autoregressive parameters for each location.
This paper presents a method of GSTAR modeling through procedures adopted from Box and Jenkins [1]. The procedures start from the stationarity condition; discussions of stationary GSTAR models can be found in [8,9] and [12]. This paper focuses on the three-stage modeling procedure: model identification, parameter estimation, and a diagnostic check to ensure that the model has a white noise error vector.
2. GSTAR (Generalized Space-Time Autoregressive) Model
The GSTAR model is a specific form of the VAR (Vector Autoregressive) model that reveals linear dependencies in space and time. The main difference lies in the spatial dependence, which in the GSTAR model is expressed by weight matrices. Let $\{\mathbf{Z}(t) : t = 0, \pm 1, \pm 2, \ldots\}$ be a multivariate time series of $N$ components. In matrix notation, the GSTAR model of autoregressive order $p$ and spatial orders $\lambda_1, \lambda_2, \ldots, \lambda_p$, denoted GSTAR$(p; \lambda_1, \lambda_2, \ldots, \lambda_p)$, can be written as (see [2])
$$\mathbf{Z}(t) = \sum_{s=1}^{p}\left[\boldsymbol{\Phi}_{s0} + \sum_{k=1}^{\lambda_s} \boldsymbol{\Phi}_{sk}\mathbf{W}^{(k)}\right]\mathbf{Z}(t-s) + \mathbf{e}(t) \qquad (1)$$
where $\boldsymbol{\Phi}_{s0} = \operatorname{diag}\left(\phi_{s0}^{1}, \ldots, \phi_{s0}^{N}\right)$ and $\boldsymbol{\Phi}_{sk} = \operatorname{diag}\left(\phi_{sk}^{1}, \ldots, \phi_{sk}^{N}\right)$, and the weights are chosen to satisfy $w_{ii}^{(k)} = 0$ and $\sum_{j \neq i} w_{ij}^{(k)} = 1$.
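To make equation (1) concrete, the following sketch simulates a GSTAR(1;1) process for $N = 3$ locations in Python; the coefficient values, the weight matrix, and the sample size are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 3, 200                      # locations, time points (illustrative)
# Diagonal coefficient matrices of equation (1) for p = 1, lambda_1 = 1
Phi10 = np.diag([0.3, 0.4, 0.2])   # own-location (time) effects
Phi11 = np.diag([0.2, 0.1, 0.3])   # spatial-lag effects
# Weight matrix with zero diagonal and rows summing to one
W = np.array([[0.0, 0.5, 0.5],
              [0.6, 0.0, 0.4],
              [0.5, 0.5, 0.0]])

Z = np.zeros((T, N))
for t in range(1, T):
    e = rng.normal(scale=0.5, size=N)          # white-noise error vector
    Z[t] = (Phi10 + Phi11 @ W) @ Z[t - 1] + e  # equation (1) with p = 1
```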
3. Model Identification
As in time series modeling, the first step is identifying a tentative model, which is characterized by a spatial order and a time order. The spatial order is generally restricted to order 1, since higher orders are difficult to interpret. Adopting the method used for VAR and VARMA models, the time autoregressive order is determined using the Akaike Information Criterion (AIC) (see [13])

$$\mathrm{AIC}(i) = \ln\left|\hat{\boldsymbol{\Sigma}}_i\right| + \frac{2k}{T} \qquad (2)$$

where $k$ is the number of parameters in the model and $T$ is the number of observations. The autoregressive order of the GSTAR model is the value $p$ such that $\mathrm{AIC}(p) = \min_i \mathrm{AIC}(i)$ over the candidate orders $i$. This value of $p$ can be obtained in SAS using PROC STATESPACE.
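The paper relies on SAS PROC STATESPACE; purely as an illustration, a rough Python sketch of equation (2) for candidate VAR orders might look as follows, where the function name and the least squares fitting step are our own choices.

```python
import numpy as np

def var_aic(Z, max_order):
    """AIC(i) = ln|Sigma_hat_i| + 2k/T for candidate VAR orders i, as in equation (2)."""
    T, N = Z.shape
    aic = {}
    for i in range(1, max_order + 1):
        Y = Z[i:]                                                 # responses Z(t)
        X = np.hstack([Z[i - s:T - s] for s in range(1, i + 1)])  # lagged regressors
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)                 # VAR(i) by least squares
        resid = Y - X @ B
        Sigma = resid.T @ resid / len(resid)
        k = N * N * i                                             # number of AR parameters
        aic[i] = np.log(np.linalg.det(Sigma)) + 2 * k / T
    return aic

# aics = var_aic(Z, 5); p = min(aics, key=aics.get)               # illustrative usage
```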
4. Parameter Estimation
4.1. Determination of space weights
Determination of space weights by normalising the cross-correlations between locations at the appropriate time lag was first proposed by Suhartono and Atok (see [10,11]). They demonstrated by a simulation study that the method performs well for the GSTAR(1;1) model. In general, the cross-correlation between two variables or locations $i$ and $j$ at time lag $k$, $\operatorname{corr}\left[Z_i(t), Z_j(t-k)\right]$, is defined as (see [1,14])

$$\rho_{ij}(k) = \frac{\gamma_{ij}(k)}{\sigma_i \sigma_j}, \qquad k = 0, \pm 1, \pm 2, \ldots \qquad (3)$$
where $\gamma_{ij}(k)$ is the cross-covariance between the observations in locations $i$ and $j$ at time lag $k$, and $\sigma_i$ and $\sigma_j$ are the standard deviations of the observations in locations $i$ and $j$. The cross-correlation estimated from sample data is
$$r_{ij}(k) = \frac{\sum_{t=k+1}^{n}\left[Z_i(t) - \bar{Z}_i\right]\left[Z_j(t-k) - \bar{Z}_j\right]}{\sqrt{\left(\sum_{t=1}^{n}\left[Z_i(t) - \bar{Z}_i\right]^2\right)\left(\sum_{t=1}^{n}\left[Z_j(t) - \bar{Z}_j\right]^2\right)}}. \qquad (4)$$
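A direct Python transcription of equation (4) could look like the sketch below; the function name cross_corr is ours, and the two series are assumed to be one-dimensional arrays of equal length $n$.

```python
import numpy as np

def cross_corr(Zi, Zj, k):
    """Sample cross-correlation r_ij(k) between locations i and j at lag k (equation (4))."""
    Zi, Zj = np.asarray(Zi, float), np.asarray(Zj, float)
    di, dj = Zi - Zi.mean(), Zj - Zj.mean()
    num = np.sum(di[k:] * dj[:len(dj) - k])          # sum over t = k+1, ..., n
    den = np.sqrt(np.sum(di ** 2) * np.sum(dj ** 2))
    return num / den
```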
Then, the space weights can be determined by normalising the cross-correlations between locations at the appropriate time lag. This process generally yields space weights for the GSTAR$(p;1)$ model, i.e.,

$$w_{ij} = \frac{r_{ij}(k)}{\sum_{l \neq i}\left|r_{il}(k)\right|}, \qquad i \neq j, \quad k = 1, \ldots, p \qquad (5)$$

and this weight satisfies

$$\sum_{j \neq i}\left|w_{ij}\right| = 1.$$
Space weights obtained by normalising the cross-correlations between locations at the appropriate time lag allow all possible forms of relationship between locations. Hence, there is no strict constraint requiring the weight values to depend on the distance between locations. This weighting also gives flexibility in the sign and magnitude of the relationship between locations, as illustrated in the sketch below.
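Building on the cross_corr sketch above, equation (5) could be implemented as follows for the columns of an observation matrix Z of shape (T, N); the function name and interface are again illustrative.

```python
import numpy as np

def corr_weights(Z, k):
    """Space weights w_ij from normalised cross-correlations at lag k (equation (5))."""
    T, N = Z.shape
    W = np.zeros((N, N))
    for i in range(N):
        r = np.array([cross_corr(Z[:, i], Z[:, j], k) if j != i else 0.0
                      for j in range(N)])
        W[i] = r / np.sum(np.abs(r))     # normalise so that sum_j |w_ij| = 1
        W[i, i] = 0.0
    return W
```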
4.2. Autoregressive Parameter Estimation of the GSTAR$(p; \lambda_1, \lambda_2, \ldots, \lambda_p)$ Model
Recall the GSTAR$(p; \lambda_1, \lambda_2, \ldots, \lambda_p)$ model written location by location,

$$Z_i(t) = \sum_{s=1}^{p}\sum_{k=0}^{\lambda_s} \phi_{sk}^{i}\left[w_{i1}^{(k)} Z_1(t-s) + w_{i2}^{(k)} Z_2(t-s) + \cdots + w_{iN}^{(k)} Z_N(t-s)\right] + e_i(t) \qquad (6)$$
for $t = p+1, \ldots, T$ and $i = 1, 2, \ldots, N$, where $w_{ij}^{(0)} = 1$ for $i = j$ and zero otherwise. It is assumed that $\mathbf{e}(t) \sim \text{White Noise}(\mathbf{0}, \boldsymbol{\Sigma})$, where $\mathbf{e}(t) = \left(e_1(t), e_2(t), \ldots, e_N(t)\right)'$.
The least squares estimator of the autoregressive parameters has been derived by Borovkova et al. [3]. They define the new notations

$$V_i^{(k)}(t) = \sum_{j \neq i} w_{ij}^{(k)} Z_j(t) \quad \text{for } k \geq 1, \qquad V_i^{(0)}(t) = Z_i(t),$$

$$\mathbf{Y}_i = \left(Z_i(p+1), \ldots, Z_i(T)\right)', \qquad \mathbf{u}_i = \left(e_i(p+1), \ldots, e_i(T)\right)',$$

$$\mathbf{X}_i = \begin{pmatrix}
V_i^{(0)}(p) & V_i^{(1)}(p) & \cdots & V_i^{(\lambda_1)}(p) & \cdots & V_i^{(0)}(1) & \cdots & V_i^{(\lambda_p)}(1) \\
\vdots & \vdots & & \vdots & & \vdots & & \vdots \\
V_i^{(0)}(T-1) & V_i^{(1)}(T-1) & \cdots & V_i^{(\lambda_1)}(T-1) & \cdots & V_i^{(0)}(T-p) & \cdots & V_i^{(\lambda_p)}(T-p)
\end{pmatrix}$$

and

$$\boldsymbol{\beta}_i = \left(\phi_{10}^{i}, \phi_{11}^{i}, \ldots, \phi_{1\lambda_1}^{i}, \phi_{20}^{i}, \phi_{21}^{i}, \ldots, \phi_{2\lambda_2}^{i}, \ldots, \phi_{p0}^{i}, \ldots, \phi_{p\lambda_p}^{i}\right)'.$$
Equation (6) can be expressed for all locations simultaneously as the linear model

$$\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{u}, \qquad (7)$$

where $\mathbf{Y} = \left(\mathbf{Y}_1', \ldots, \mathbf{Y}_N'\right)'$, $\mathbf{X} = \operatorname{diag}\left(\mathbf{X}_1, \ldots, \mathbf{X}_N\right)$, $\boldsymbol{\beta} = \left(\boldsymbol{\beta}_1', \ldots, \boldsymbol{\beta}_N'\right)'$, and $\mathbf{u} = \left(\mathbf{u}_1', \ldots, \mathbf{u}_N'\right)'$.
Thus, the least squares estimator $\hat{\boldsymbol{\beta}}_T$ has the form

$$\hat{\boldsymbol{\beta}}_T = \left(\mathbf{X}'\mathbf{X}\right)^{-1}\mathbf{X}'\mathbf{Y}. \qquad (8)$$
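As an illustration of how (6)-(8) fit together, the sketch below assembles the regressor blocks $\mathbf{X}_i$, the block-diagonal system (7), and the estimator (8) for the special case GSTAR(1;1); the function name, column ordering, and the use of scipy.linalg.block_diag are our own choices and assume SciPy is available.

```python
import numpy as np
from scipy.linalg import block_diag

def gstar11_ls(Z, W):
    """Least squares estimate (8) for GSTAR(1;1): row i of the result is (phi_10^i, phi_11^i)."""
    T, N = Z.shape
    V = Z @ W.T                                   # V_i^{(1)}(t) = sum_{j != i} w_ij Z_j(t)
    Xs, Ys = [], []
    for i in range(N):
        # rows t = 2,...,T: regressors V_i^{(0)}(t-1) = Z_i(t-1) and V_i^{(1)}(t-1)
        Xs.append(np.column_stack([Z[:-1, i], V[:-1, i]]))
        Ys.append(Z[1:, i])
    X = block_diag(*Xs)                           # block-diagonal X as in (7)
    Y = np.concatenate(Ys)
    beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # (X'X)^{-1} X'Y
    return beta_hat.reshape(N, 2)
```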
The asymptotic normality of the estimator $\hat{\boldsymbol{\beta}}_T$ is explained below.
Write $\mathbf{M} = \operatorname{diag}\left(\mathbf{M}_1, \ldots, \mathbf{M}_N\right)$, where $\mathbf{M}_i = \operatorname{diag}\left(\mathbf{M}_{i1}, \ldots, \mathbf{M}_{ip}\right)$ and, for $i = 1, 2, \ldots, N$ and $s = 1, 2, \ldots, p$,

$$\mathbf{M}_{is} = \begin{pmatrix}
w_{i1}^{(1)} & \cdots & w_{i1}^{(\lambda_s)} \\
\vdots & & \vdots \\
w_{i,i-1}^{(1)} & \cdots & w_{i,i-1}^{(\lambda_s)} \\
w_{i,i+1}^{(1)} & \cdots & w_{i,i+1}^{(\lambda_s)} \\
\vdots & & \vdots \\
w_{iN}^{(1)} & \cdots & w_{iN}^{(\lambda_s)}
\end{pmatrix}. \qquad (9)$$
Define the covariance matrix

$$\boldsymbol{\Gamma}_p = \begin{bmatrix}
\boldsymbol{\Gamma}(0) & \boldsymbol{\Gamma}(1) & \cdots & \boldsymbol{\Gamma}(p-1) \\
\boldsymbol{\Gamma}(-1) & \boldsymbol{\Gamma}(0) & \cdots & \boldsymbol{\Gamma}(p-2) \\
\vdots & \vdots & & \vdots \\
\boldsymbol{\Gamma}(-p+1) & \boldsymbol{\Gamma}(-p+2) & \cdots & \boldsymbol{\Gamma}(0)
\end{bmatrix} \qquad (10)$$

where $\boldsymbol{\Gamma}(s) = E\left[\mathbf{Z}(t)\mathbf{Z}(t+s)'\right]$.
Under some conditions, Borovkova et al. [3] have shown asymptotic multivariate normality of the least squares estimator in the GSTAR model, i.e.,

$$\sqrt{T}\left(\hat{\boldsymbol{\beta}}_T - \boldsymbol{\beta}\right) \xrightarrow{d} N_d\left(\mathbf{0}, \mathbf{C}\right), \qquad (11)$$

where $d = N\left(p + \lambda_1 + \cdots + \lambda_p\right)$ and

$$\mathbf{C} = \left[\mathbf{M}'\left(\mathbf{I} \otimes \boldsymbol{\Gamma}_p\right)\mathbf{M}\right]^{-1}\mathbf{M}'\left(\boldsymbol{\Sigma} \otimes \boldsymbol{\Gamma}_p\right)\mathbf{M}\left[\mathbf{M}'\left(\mathbf{I} \otimes \boldsymbol{\Gamma}_p\right)\mathbf{M}\right]^{-1},$$

with $\mathbf{M}$ defined in (9) and $\boldsymbol{\Gamma}_p$ defined in (10).
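Given arrays for M, Γ_p, and Σ, the covariance C in (11) can be assembled with Kronecker products, as in the sketch below; taking the identity to be N x N is our assumption about the intended dimension.

```python
import numpy as np

def asymptotic_cov(M, Gamma_p, Sigma):
    """Covariance C of (11): (M'(I⊗Γp)M)^{-1} M'(Σ⊗Γp)M (M'(I⊗Γp)M)^{-1}."""
    N = Sigma.shape[0]
    A = M.T @ np.kron(np.eye(N), Gamma_p) @ M
    B = M.T @ np.kron(Sigma, Gamma_p) @ M
    A_inv = np.linalg.inv(A)
    return A_inv @ B @ A_inv
```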
The parameter $\boldsymbol{\Sigma}$ can be estimated by

$$\hat{\boldsymbol{\Sigma}}_T = \frac{1}{T-p}\sum_{t=p+1}^{T}\left(\mathbf{Z}(t) - \hat{\mathbf{Z}}(t)\right)\left(\mathbf{Z}(t) - \hat{\mathbf{Z}}(t)\right)'$$

where

$$\hat{\mathbf{Z}}(t) = \sum_{s=1}^{p}\left[\hat{\boldsymbol{\Phi}}_{s0}\mathbf{Z}(t-s) + \sum_{k=1}^{\lambda_s}\hat{\boldsymbol{\Phi}}_{sk}\mathbf{W}^{(k)}\mathbf{Z}(t-s)\right].$$
The elements of the matrix $\boldsymbol{\Gamma}_p$ can be estimated by

$$\hat{\boldsymbol{\Gamma}}_T(s) = \frac{1}{T-s}\sum_{t=1}^{T-s}\mathbf{Z}(t)\mathbf{Z}(t+s)'$$

and $\hat{\boldsymbol{\Gamma}}_T(-s) = \hat{\boldsymbol{\Gamma}}_T(s)'$, for $s \geq 0$. Consequently, the estimate of $\boldsymbol{\Gamma}_p$ is obtained.
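A minimal sketch of the two estimators, assuming the fitted values Ẑ(t) have already been computed and stored row-wise in Z_hat alongside Z; the function names are ours.

```python
import numpy as np

def sigma_hat(Z, Z_hat, p):
    """Residual covariance estimate of Sigma over t = p+1, ..., T."""
    R = Z[p:] - Z_hat[p:]                      # residuals Z(t) - Z_hat(t)
    return R.T @ R / (len(Z) - p)

def gamma_hat(Z, s):
    """Sample autocovariance Gamma_hat_T(s) = (1/(T-s)) sum_t Z(t) Z(t+s)'."""
    T = len(Z)
    return Z[:T - s].T @ Z[s:] / (T - s)
```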
It is also proven in [3] that $\hat{\boldsymbol{\Sigma}}_T$ and $\hat{\boldsymbol{\Gamma}}_p$ are consistent estimators of $\boldsymbol{\Sigma}$ and $\boldsymbol{\Gamma}_p$, respectively.
The hypothesis for testing the significance of parameters in the GSTAR model can be written as

$$H_0 : \mathbf{R}\boldsymbol{\beta} = r \quad \text{against} \quad H_1 : \mathbf{R}\boldsymbol{\beta} \neq r. \qquad (12)$$

Using property (11), the fact that $\hat{\boldsymbol{\Sigma}}_T$ and $\hat{\boldsymbol{\Gamma}}_p$ are consistent estimators of $\boldsymbol{\Sigma}$ and $\boldsymbol{\Gamma}_p$, and modifying the Wald statistic for linear models from White [15], we develop a Wald statistic for the hypothesis (12). Let $m = \operatorname{rank}(\mathbf{R})$. If $H_0 : \mathbf{R}\boldsymbol{\beta} = r$ holds, then under some conditions,
(i) $\sqrt{T}\left(\mathbf{R}\hat{\boldsymbol{\beta}}_T - r\right) \xrightarrow{d} N_m\left(\mathbf{0}, \boldsymbol{\Omega}\right), \qquad (13)$

where $d = N\left(p + \lambda_1 + \cdots + \lambda_p\right)$ and $\boldsymbol{\Omega} = \mathbf{R}\mathbf{C}\mathbf{R}'$;
(ii) the Wald statistic

$$\hat{\mathbf{W}}_T = T\left(\mathbf{R}\hat{\boldsymbol{\beta}}_T - r\right)'\hat{\boldsymbol{\Omega}}^{-1}\left(\mathbf{R}\hat{\boldsymbol{\beta}}_T - r\right) \xrightarrow{d} \chi_m^2, \qquad (14)$$

where $\hat{\boldsymbol{\Omega}} = \mathbf{R}\hat{\mathbf{C}}\mathbf{R}'$.
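Given R, r, the estimate β̂_T, and a consistent estimate Ĉ, the Wald statistic (14) could be computed as in the sketch below; the use of scipy.stats.chi2 for the p-value is our own addition.

```python
import numpy as np
from scipy.stats import chi2

def wald_test(R, r, beta_hat, C_hat, T):
    """Wald statistic (14): T (R beta_hat - r)' Omega_hat^{-1} (R beta_hat - r) ~ chi^2_m."""
    diff = R @ beta_hat - r
    Omega_hat = R @ C_hat @ R.T
    W = T * diff @ np.linalg.solve(Omega_hat, diff)
    m = np.linalg.matrix_rank(R)
    return W, chi2.sf(W, df=m)                 # statistic and p-value
```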
5. Diagnostic Check
After a significant model has been found, the final step is checking whether the model fulfills the assumption of a white noise error vector. White noise means that the residuals of the model form an uncorrelated sequence, which indicates that the model fits. Based on the estimated model, the error estimate can be written as $\hat{\mathbf{e}}(t) = \mathbf{Z}(t) - \hat{\mathbf{Z}}(t)$.
The multivariate Portmanteau test, a generalization of the Ljung-Box test to the multivariate case (see [13]), can be used to test for a white noise error vector. The null hypothesis of this test is

$$H_0 : \boldsymbol{\rho}(1) = \cdots = \boldsymbol{\rho}(m) = \mathbf{0}, \qquad (15)$$

where $\boldsymbol{\rho}(k)$ is the correlation matrix of the error vector.
The test statistic is

$$Q(m) = T^2\sum_{t=1}^{m}\frac{1}{T-t}\operatorname{tr}\left(\hat{\boldsymbol{\Gamma}}_t'\,\hat{\boldsymbol{\Gamma}}_0^{-1}\,\hat{\boldsymbol{\Gamma}}_t\,\hat{\boldsymbol{\Gamma}}_0^{-1}\right), \qquad (16)$$

where $\hat{\boldsymbol{\Gamma}}_t$ is the sample autocovariance matrix of the residuals at lag $t$. Under the null hypothesis (15), the test statistic (16) approximately follows a chi-square distribution with $N^2 m$ degrees of freedom.
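A sketch of the statistic (16) computed from a residual matrix of shape (T, N); the residual autocovariances here are estimated in the same way as Γ̂_T(s) above, and the function name is ours.

```python
import numpy as np
from scipy.stats import chi2

def portmanteau(resid, m):
    """Multivariate Portmanteau statistic Q(m) of equation (16)."""
    T, N = resid.shape
    def gamma(s):                                  # residual autocovariance at lag s
        return resid[:T - s].T @ resid[s:] / T
    G0_inv = np.linalg.inv(gamma(0))
    Q = T ** 2 * sum(np.trace(gamma(t).T @ G0_inv @ gamma(t) @ G0_inv) / (T - t)
                     for t in range(1, m + 1))
    return Q, chi2.sf(Q, df=N * N * m)             # compare with chi^2 on N^2 m d.o.f.
```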
6. Empirical Study