Another issue is related to the specification of the volatility regression. In JKL's study, they separated volume into the components of average trade size and the number of transactions. The average trade size is defined as the total number of shares traded in a period (a day) divided by the number of transactions. By definition, total trading volume is simply the product of average trade size and the number of transactions. Thus, one cannot linearly separate total trading volume into these two components. One possible consequence is that the total effect of trading volume may not be completely captured by the sum of the effects of the size and frequency of trades. An alternative specification is to take a log transformation of each transaction variable, so that log total volume is exactly equal to the sum of the logs of average trade size and the number of transactions. In principle, this log transformation may provide a better decomposition of the total volume effect. In our empirical investigation, we use the size and frequency of trades with and without the log transformation and examine the sensitivity of the estimation results to these variable specifications.
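To make the point concrete, the following identity (a sketch in notation we introduce here: V_t denotes total share volume, while ATS_t and NT_t are the average trade size and number of transactions that appear in Eq. (3) below) shows why the decomposition is exact only in logs:

V_t = ATS_t \times NT_t \quad \Longrightarrow \quad \ln V_t = \ln ATS_t + \ln NT_t,

whereas in levels V_t \neq ATS_t + NT_t in general, so a linear specification in ATS_t and NT_t need not recover the full volume effect.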
3. Empirical methodology
We use the GMM method for empirical estimation (Hansen, 1982). The GMM requires much weaker distributional assumptions than other estimation methods. It is therefore especially suitable for estimating a model that may not satisfy some regularity conditions of least squares. It is well known that heteroskedasticity of return volatility is a pervasive phenomenon at both the individual security and portfolio levels. Also, in the present case, the volatility measure is essentially the conditional absolute return, which may be subject to serial correlation. The GMM provides a more robust framework for parameter estimation and hypothesis tests in the presence of both statistical problems. GMM estimation involves specifying a set of moment conditions. For example, for each individual regression specified in Eq. (2), the following orthogonality conditions must be satisfied:
E(\mu_t) = 0, \quad E(\mu_t ATS_t) = 0, \quad E(\mu_t NT_t) = 0, \quad E(\mu_t OPEN_t) = 0, \quad E(\mu_t CLOSE_t) = 0, \quad E(\mu_t MONDAY_t) = 0, \quad E(\mu_t VOL_{t-j}) = 0, \quad j = 1, 2, \ldots, 14. \qquad (3)
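As an illustration only (not the authors' code), the sample analogues of these orthogonality conditions can be computed by stacking a constant (for the E(\mu_t) = 0 condition), ATS_t, NT_t, the three dummy variables, and the 14 lagged volatility terms into an instrument vector and averaging the cross-products with the regression residual; all function and variable names below are placeholders:

import numpy as np

def sample_moments(resid, ats, nt, open_d, close_d, monday_d, vol_lags):
    """A sketch: sample analogue of the orthogonality conditions in Eq. (3).

    resid    : (N,) residuals mu_t from one security's volatility regression
    ats, nt  : (N,) average trade size and number of transactions
    open_d, close_d, monday_d : (N,) dummy variables
    vol_lags : (N, 14) lagged volatility terms VOL_{t-1}, ..., VOL_{t-14}
    """
    N = resid.shape[0]
    # Constant + ATS + NT + 3 dummies + 14 lags = 20 instruments, matching
    # the 20 orthogonality conditions discussed in the text.
    Z = np.column_stack([np.ones(N), ats, nt, open_d, close_d, monday_d, vol_lags])
    return Z.T @ resid / N   # each element should be near 0 at the GMM estimate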
These conditions state that the expected cross-products of the unobservable disturbances, μ_t, and the explanatory variables are equal to 0. The first moment of the cross-products is:
m_N(\theta) = \frac{1}{N} \sum_{t=1}^{N} \mu_t(\theta) Z_t \qquad (4)
where Z_t is a vector of variables included in the regression model, θ is a vector of parameters to be estimated, N is the number of observations, and the disturbance μ_t is stationary. For each regression, there are 20 orthogonality conditions and 20 parameters to be estimated. For example, for 141 cross-sectional units, there are a total of 141 × 20 = 2,820 parameters to be estimated. Since the number of parameters to be estimated equals the number of orthogonality conditions, the model is just identified. We estimate the true parameter vector θ by the value of θ̂ that minimizes the following quadratic function:
S(\theta, V) = [N m_N(\theta)]' V^{-1} [N m_N(\theta)] \qquad (5)

where V = \mathrm{Cov}\{[N m_N(\theta)], [N m_N(\theta)]'\}.
We impose the moment conditions [Eq. (3)] in estimating the volatility regressions of individual securities. The GMM estimation procedure involves selecting an estimator that sets a linear combination of the moment conditions to 0 while minimizing [Eq. (5)]. We apply the GMM to the system of equations involving all securities in the sample. More specifically, we estimate Eq. (2) or Eq. (2a) simultaneously for all securities to provide efficient estimates of the parameters by accounting for cross-correlation in the error terms.
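For concreteness, a minimal sketch of this estimation step follows (our illustration, not the authors' implementation), assuming a linear moment function, instruments equal to the regressors so the system is just identified, and a first-step identity weighting matrix; a second step would replace the weighting matrix with the inverse covariance of the moments, V^{-1} in Eq. (5).

import numpy as np
from scipy.optimize import minimize

def gmm_criterion(theta, y, X, Z, W):
    # Quadratic form [N m_N(theta)]' W [N m_N(theta)], as in Eq. (5),
    # for a linear model y = X @ theta + mu with instruments Z.
    N = y.shape[0]
    resid = y - X @ theta          # mu_t(theta)
    g = N * (Z.T @ resid / N)      # N * m_N(theta)
    return g @ W @ g

def estimate_gmm(y, X, Z):
    # First step: identity weighting matrix; in the just-identified case the
    # minimized criterion is essentially 0 regardless of the weighting choice.
    W = np.eye(Z.shape[1])
    theta0 = np.zeros(X.shape[1])
    result = minimize(gmm_criterion, theta0, args=(y, X, Z, W), method="BFGS")
    return result.x

Estimating the system for all securities jointly, as described above, would stack the moment conditions across securities and use a weighting matrix that reflects the cross-correlation in the error terms.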
4. Data and empirical results