
the market has developed. In Fig. 1 and Figs. 6 and 7 in the appendix, the emerging dominant standard F provides slightly different cost-benefits vis-à-vis G; that is, F sits at x > Δx while G sits at x < Δx, where Δx is a small difference in benefits. Assuming that firms favor F over G, this slight initial difference in position between F and G is magnified as network externalities grow, driving F and G farther apart as shown in Figs. 1, 6 and 7. Interesting market examples of this behavior are provided by the Apple-ATARI competition in the early 1980s and the VHS-BETA competition in VCRs. While the Apple-ATARI battle is the more interesting, the VHS-BETA competition is the most often cited. Conventional wisdom holds that in the VHS-versus-BETA format 'war', the small initial advantage of a 3 versus 2.5 h recording capability ultimately resulted in more consumers picking VHS over BETA: since American football games run around 3 h, only VHS allowed them to be taped in their entirety. This led to more software (rental movies) being available in the VHS format, which, in turn, led new consumers to pick VHS over BETA as they came into the market. In time, a better competitor G may enter the now-developed market to challenge F. Since the market now has high network externalities, G must provide benefits in excess of the threshold value x for firms to switch. If it does, then the firms will shift in a bandwagon to the new competitor G, as shown in Figs. 6 and 7.
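To make the lock-in and threshold-switching dynamics concrete, a minimal numerical sketch is given below. It is not part of the original analysis: it assumes the cusp equilibrium surface Z^3 − Y·Z − X = 0 used in Section 3 and purely illustrative parameter values. Sweeping the control factor X up and then back down at a high value of the splitting factor Y reproduces the hysteresis the figures depict: the system stays on the incumbent branch until the challenger's advantage crosses a threshold, at which point the bandwagon jump occurs.

# Minimal sketch of cusp hysteresis: equilibria of Z^3 - Y*Z - X = 0
# (illustrative values only; not the paper's calibration).
import numpy as np

def nearest_equilibrium(x, y, z_prev):
    """Real roots of z^3 - y*z - x = 0; return the one closest to z_prev."""
    roots = np.roots([1.0, 0.0, -y, -x])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return real[np.argmin(np.abs(real - z_prev))]

Y = 3.0                       # high splitting factor: strong network externalities
xs = np.linspace(-3, 3, 121)  # sweep the control (benefit-advantage) factor

z = nearest_equilibrium(xs[0], Y, z_prev=-2.0)   # start on the incumbent branch
up = []
for x in xs:                  # increase X: system stays low, then jumps up
    z = nearest_equilibrium(x, Y, z)
    up.append(z)

z = nearest_equilibrium(xs[-1], Y, z_prev=2.0)
down = []
for x in xs[::-1]:            # decrease X: the jump back occurs at a different X
    z = nearest_equilibrium(x, Y, z)
    down.append(z)

# The two sweeps disagree over a range of X: small early advantages lock in,
# and a later entrant must push X past a threshold before the bandwagon flips.
print("upward-sweep jump near X =", xs[int(np.argmax(np.diff(up)))])
print("downward-sweep jump near X =", xs[::-1][int(np.argmax(-np.diff(down)))])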

3. Estimation issues

Estimation of chaos models in general, and catastrophe models in particular, is difficult because of their nonlinear dynamic characteristics. For example, Eq. (1) presents a problem because it is implicit, multivalued in the dependent variable, and discontinuous. So while the cusp model is parsimonious in its ability to describe a large variety of complex behaviors, it presents major estimation difficulties.

Initial efforts to estimate catastrophe models were simplistic. The first published empirical social science application of a catastrophe model is generally credited to Zeeman et al. (1976), which focused on institutional disturbances (riots and takeovers) in a United Kingdom prison. The approach used might best be described as quasi-graphical. It was later refined by Sheridan and Abelson (1983), who used an updated version of the graphical approach in a study of employee turnover. Taking a different tack, Oliva et al. (1981) modeled a collective bargaining situation using a set of rule-based predictions about the dependent variable's behavior. Their method made predictions about bargaining-system behavior and then used a chi-square-type measure to assess the accuracy of the model. Although more empirically satisfying than the Zeeman et al. (1976) method, it was simple and ad hoc. While these were critical first steps, they tended to provide only limited empirical support.

Important progress was made by Loren Cobb (1978, 1981), who was working on developing statistical distributions for catastrophe models in the biosciences. Drawing on Cobb's analytical work, Guastello (1982, 1995) developed a promising statistical specification for the cusp model by starting with the following deterministic equation:

dZ/dt = Z^3 − Z·Y − X = 0.

By inserting beta weights and setting dt equal to 1, he developed his statistical expression:

ΔZ = Z_2 − Z_1 = β_0 + β_1·Z_1^3 + β_2·Z_1·Y + β_3·X + ε.

A major limitation of this and Cobb's approach is that it does not allow for a priori specification of the control variables. Rather, the technique 'finds' catastrophe if it exists and identifies, in a probabilistic sense, which independent variables are associated with the control factor X in Eq. (1) and which are associated with the splitting factor Y in Eq. (1). Clearly, this is a problem when the researcher is trying to develop a confirmatory estimate of a specific catastrophe model. Additionally, the dependent variable is required to be univariate. Consequently, the technique's usefulness is limited when the catastrophe model uses or requires a multivariate dependent construct. To deal with the problem, researchers using Cobb's (1981) or Guastello's (1995) techniques have typically averaged or otherwise scaled the measures to obtain a single dependent measure. Unfortunately, such averaging can cause the loss of valuable information when a true catastrophe model is present, as demonstrated in Oliva et al. (1987).
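As a purely hypothetical illustration, Guastello's difference-equation specification can be estimated as an ordinary least-squares regression of ΔZ on Z_1^3, Z_1·Y, and X. The variable names and data in the sketch below are synthetic placeholders, not measures from this study.

# Hypothetical sketch: fitting Guastello's cusp specification
# dZ = Z2 - Z1 = b0 + b1*Z1^3 + b2*Z1*Y + b3*X + e  by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 200
Z1 = rng.normal(size=n)          # state at time 1 (placeholder data)
Z2 = rng.normal(size=n)          # state at time 2 (placeholder data)
Y  = rng.normal(size=n)          # candidate splitting factor
X  = rng.normal(size=n)          # candidate control (asymmetry) factor

dZ = Z2 - Z1
design = np.column_stack([np.ones(n), Z1**3, Z1 * Y, X])
betas, *_ = np.linalg.lstsq(design, dZ, rcond=None)
print("b0..b3 =", betas)
# A significant cubic term is what signals cusp-like structure here; note that
# this approach cannot fix a priori which observed variable plays X or Y,
# which is the limitation GEMCAT was designed to remove.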
A solution to the problem was developed by Oliva et al. (1987). Their method, called the General Multivariate Methodology for Estimating Catastrophe Models (GEMCAT), used a scaling approach that allows for the a priori specification of variable type and can handle multivariate constructs in all the variables. In particular, Oliva et al. (1987) generalize the variables Z, Y, and X to their multivariate counterparts. The Z_i, Y_j, and X_k are observable indicator variables with weights α_i, β_j, and γ_k, respectively. Hence, we define Eqs. (2), (3), and (4) below, which allow the catastrophe equation to be rewritten as shown in Eq. (5):

Z_t = Σ_{i=1}^{I} Z_{it} α_i,   (2)

Y_t = Σ_{j=1}^{J} Y_{jt} β_j,   (3)

X_t = Σ_{k=1}^{K} X_{kt} γ_k,   (4)

0 = Z_t^3 − X_t − Y_t·Z_t.   (5)

From Eq. (5), the estimation goal is to minimize Eq. (6):

Min_{α_i, β_j, γ_k} F = Σ_{t=1}^{T} [Z_t^3 − X_t − Y_t·Z_t]^2,   (6)

where the error at time t, ε_t, is the bracketed term in Eq. (6). For a given set of measures on the constructs, the object is to estimate the impact coefficients that define their respective latent variables. This is analogous to the minimization of the error sum of squares in a regression analysis, by making F as close to zero as possible (Oliva et al., 1987). We explicitly note that, from a statistical standpoint, the error structure in Eq. (5) would be extremely complex. Hence, the approach reported here is a way to make the estimation problem tractable. Extensive simulation results (Lang et al., 1999) indicate the approach can distinguish between linear and nonlinear surfaces, and Bootstrap and Jackknife procedures provide some assurance that the parameter estimates are reasonable.

GEMCAT approaches have been successfully applied in a number of organizational research contexts (e.g. Oliva, 1991; Gresov et al., 1993; Kauffman and Oliva, 1994). More recently, Lang et al. (1999) developed an improved version of the algorithm, called GEMCAT II, which provides greater speed, efficiency, utility, and flexibility in terms of analysis and testing. For example, the new version has options to perform both Bootstrap and Jackknife testing procedures, and it produces SPSS files for further analysis. In addition, GEMCAT II is slightly more general in that it allows offsets α_0, β_0, and γ_0 to be included in Eqs. (2), (3), and (4).

Finally, in their comparison of the Cobb (1981) and Guastello (1995) techniques with the GEMCAT approach, Alexander et al. (1992) note that for exploratory situations in which theory construction is the focus, or when the existence of catastrophe in the data is the issue and univariate dependent measures are sufficient, Cobb-related approaches are the best choice. However, Alexander et al. (1992) argue that GEMCAT is the best choice for theory testing or confirmatory contexts, and for those requiring multivariate indicators in the dependent variable. Given the use of a multivariate dependent construct and the confirmatory nature of this work, the GEMCAT II procedure is the appropriate estimation technique to use.
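A minimal sketch of the minimization in Eq. (6) is shown below. The indicator matrices and the use of a general-purpose optimizer are placeholders and not the GEMCAT II implementation; the sketch fixes the first dependent-construct weight at 1 simply to rule out the trivial all-zero solution, and the identification and testing procedures of the published algorithm (including the Bootstrap and Jackknife options mentioned above) are not reproduced here.

# Illustrative sketch of the Eq. (6) objective: choose indicator weights
# (alpha, beta, gamma) so that 0 = Z_t^3 - X_t - Y_t*Z_t holds as closely
# as possible across all t. Indicator data here are placeholders.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T, I, J, K = 100, 3, 2, 2
Zmat = rng.normal(size=(T, I))   # indicators of the dependent construct Z
Ymat = rng.normal(size=(T, J))   # indicators of the splitting factor Y
Xmat = rng.normal(size=(T, K))   # indicators of the control factor X

def objective(w):
    # w packs [alpha_2..alpha_I, beta_1..beta_J, gamma_1..gamma_K];
    # alpha_1 is fixed at 1 to rule out the trivial all-zero solution.
    alpha = np.concatenate(([1.0], w[:I - 1]))
    beta  = w[I - 1:I - 1 + J]
    gamma = w[I - 1 + J:]
    Z = Zmat @ alpha             # Eq. (2): latent Z_t
    Y = Ymat @ beta              # Eq. (3): latent Y_t
    X = Xmat @ gamma             # Eq. (4): latent X_t
    eps = Z**3 - X - Y * Z       # Eq. (5) residual at each t
    return np.sum(eps**2)        # Eq. (6): F

w0 = np.full(I - 1 + J + K, 0.1)
result = minimize(objective, w0, method="Nelder-Mead")
print("F at optimum:", result.fun)
print("estimated weights:", result.x)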

4. Data
