A Comparison of the Welsch-Type M-Estimation and the Least Trimmed Squares Method for Handling Outlier Data


APPENDIX 1

MATLAB results for the Welsch-type M-estimation iterations: ITERATION I

w = diag([0.9832 0.9825 0.9345 0.9813 0.9984 0.9487 0.8680 0.9995 ...
          0.7250 0.9999 0.9378 0.9838 0.6739 0.9890 0.6983 0.5971 ...
          0.7523 0.8908 0.9349 0.9882 0.8164 0.8848 0.9992 0.9814 ...
          0.9665 0.9244 0.8636 0.7581])


y = [7.6;7.7;4.3;5.9;5;6.5;8.3;8.2;13.2;12.6;10.4;10.8;13.1;12.3;10.4;10.5;7.7;9.5;12;12.6;13.6;14.1;13.5;11.5;12;13;14.1;15.1]


X = [1,8.2,4,23.005; 1,7.6,5,23.873; 1,4.6,0,26.417; 1,4.3,1,24.868; ...
     1,5.9,2,29.895; 1,5,3,24.2; 1,6.5,4,23.215; 1,8.3,5,21.862; ...
     1,10.1,0,22.274; 1,13.2,1,23.83; 1,12.6,2,25.144; 1,10.4,3,22.43; ...
     1,10.8,4,21.785; 1,13.1,5,22.38; 1,13.3,0,23.927; 1,10.4,1,33.443; ...
     1,10.5,2,24.859; 1,7.7,3,22.686; 1,10,0,21.789; 1,12,1,22.041; ...
     1,12.1,4,21.033; 1,13.6,5,21.005; 1,15,0,25.865; 1,13.5,1,26.29; ...
     1,11.5,2,22.932; 1,12,3,21.313; 1,13,4,20.769; 1,14.1,5,21.393]




>> A = X';


>> J = inv(A*w*X)
J =
    7.0254   -0.0961   -0.1874   -0.2334
   -0.0961    0.0046    0.0013    0.0019
   -0.1874    0.0013    0.0167    0.0056
   -0.2334    0.0019    0.0056    0.0084
>> K = A*w*y
K =
   1.0e+003 *
    0.2620
    2.8937
    0.6709
    6.1084
>> β = J*K
β =
   10.9825
    0.7686
   -0.0725
   -0.3476
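Each iteration above computes J = inv(A*w*X) and K = A*w*y with A = X', so β = J·K is one weighted least-squares step, β = (XᵀWX)⁻¹XᵀWy. A minimal sketch of that step, written in Python/NumPy purely as an illustration (the thesis itself works in MATLAB):

```python
import numpy as np

def wls_step(X, w, y):
    """One weighted least-squares step, beta = (X'WX)^-1 X'Wy,
    mirroring J = inv(A*w*X), K = A*w*y, beta = J*K with A = X'."""
    W = np.diag(w)
    J = np.linalg.inv(X.T @ W @ X)
    K = X.T @ W @ y
    return J @ K

# Sanity check on a tiny toy data set: with all weights equal to 1
# the step reduces to ordinary least squares (here y = 2*x exactly).
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
beta = wls_step(X, np.ones(3), y)  # -> approximately [0, 2]
```

In the Welsch M-estimation scheme the diagonal weights w are recomputed from the residuals of the previous iteration, and this step is repeated until β stabilises.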


ITERATION II

w = diag([0.7090 0.9630 0.8282 0.9811 0.9990 0.9839 0.9217 0.7763 ...
          0.4280 0.9948 0.7148 0.9954 0.6092 0.9368 0.3368 0.1613 ...
          0.3140 0.9169 0.8658 0.9971 0.8621 0.9812 0.9999 0.9289 ...
          0.9848 0.9694 0.9316 0.8149])


X = [1,8.2,4,23.005; 1,7.6,5,23.873; 1,4.6,0,26.417; 1,4.3,1,24.868; ...
     1,5.9,2,29.895; 1,5,3,24.2; 1,6.5,4,23.215; 1,8.3,5,21.862; ...
     1,10.1,0,22.274; 1,13.2,1,23.83; 1,12.6,2,25.144; 1,10.4,3,22.43; ...
     1,10.8,4,21.785; 1,13.1,5,22.38; 1,13.3,0,23.927; 1,10.4,1,33.443; ...
     1,10.5,2,24.859; 1,7.7,3,22.686; 1,10,0,21.789; 1,12,1,22.041; ...
     1,12.1,4,21.033; 1,13.6,5,21.005; 1,15,0,25.865; 1,13.5,1,26.29; ...
     1,11.5,2,22.932; 1,12,3,21.313; 1,13,4,20.769; 1,14.1,5,21.393]

>> A = X';


>> J = inv(A*w*X)
J =
   10.1329   -0.1263   -0.2495   -0.3469
   -0.1263    0.0050    0.0016    0.0030
   -0.2495    0.0016    0.0193    0.0078
   -0.3469    0.0030    0.0078    0.0126

y = [7.6;7.7;4.3;5.9;5;6.5;8.3;8.2;13.2;12.6;10.4;10.8;13.1;12.3;10.4;10.5;7.7;9.5;12;12.6;13.6;14.1;13.5;11.5;12;13;14.1;15.1]

>> K = A*w*y
K =
   1.0e+003 *
    0.2423
    2.6904
    0.6527
    5.5885

>> β = J*K
β =
   14.1079
    0.7489
   -0.1159
   -0.4651

ITERATION III

>> w = diag([0.6784 0.9699 0.8486 0.9887 0.9368 0.9890 0.9416 0.7168 ...
             0.5344 0.9934 0.7705 0.9825 0.6734 0.9326 0.3217 0.0403 ...
             0.3398 0.9528 0.9505 0.9971 0.9227 0.9957 0.9938 0.9756 ...
             0.9945 0.9952 0.9750 0.8498])


>>

y = [7.6;7.7;4.3;5.9;5;6.5;8.3;8.2;13.2;12.6;10.4;10.8;13.1;12.3;10.4;10.5;7.7;9.5;12;12.6;13.6;14.1;13.5;11.5;12;13;14.1;15.1]


X = [1,8.2,4,23.005; 1,7.6,5,23.873; 1,4.6,0,26.417; 1,4.3,1,24.868; ...
     1,5.9,2,29.895; 1,5,3,24.2; 1,6.5,4,23.215; 1,8.3,5,21.862; ...
     1,10.1,0,22.274; 1,13.2,1,23.83; 1,12.6,2,25.144; 1,10.4,3,22.43; ...
     1,10.8,4,21.785; 1,13.1,5,22.38; 1,13.3,0,23.927; 1,10.4,1,33.443; ...
     1,10.5,2,24.859; 1,7.7,3,22.686; 1,10,0,21.789; 1,12,1,22.041; ...
     1,12.1,4,21.033; 1,13.6,5,21.005; 1,15,0,25.865; 1,13.5,1,26.29; ...
     1,11.5,2,22.932; 1,12,3,21.313; 1,13,4,20.769; 1,14.1,5,21.393]

>> K = A*w*y

K =
    0.2475
    2.7521
    0.6649
    5.6879

>> J = inv(A*w*X)
J =
   11.1556   -0.1335   -0.2566   -0.3879
   -0.1335    0.0050    0.0016    0.0033
   -0.2566    0.0016    0.0189    0.0082
   -0.3879    0.0033    0.0082    0.0142

>> β = J*K
β =
   16.4382
    0.7283
   -0.1536
   -0.5517

ITERATION IV

w = diag([0.6719 0.9805 0.8634 0.9928 0.8192 0.9906 0.9482 0.6896 ...
          0.6036 0.9960 0.8281 0.9753 0.6982 0.9456 0.3262 0.0105 ...
          0.3745 0.9679 0.9843 0.9878 0.9449 0.9977 0.9713 0.9978 ...
          0.9963 0.9997 0.9876 0.8495])


X = [1,8.2,4,23.005; 1,7.6,5,23.873; 1,4.6,0,26.417; 1,4.3,1,24.868; ...
     1,5.9,2,29.895; 1,5,3,24.2; 1,6.5,4,23.215; 1,8.3,5,21.862; ...
     1,10.1,0,22.274; 1,13.2,1,23.83; 1,12.6,2,25.144; 1,10.4,3,22.43; ...
     1,10.8,4,21.785; 1,13.1,5,22.38; 1,13.3,0,23.927; 1,10.4,1,33.443; ...
     1,10.5,2,24.859; 1,7.7,3,22.686; 1,10,0,21.789; 1,12,1,22.041; ...
     1,12.1,4,21.033; 1,13.6,5,21.005; 1,15,0,25.865; 1,13.5,1,26.29; ...
     1,11.5,2,22.932; 1,12,3,21.313; 1,13,4,20.769; 1,14.1,5,21.393]

>> A = X';


y = [7.6;7.7;4.3;5.9;5;6.5;8.3;8.2;13.2;12.6;10.4;10.8;13.1;12.3;10.4;10.5;7.7;9.5;12;12.6;13.6;14.1;13.5;11.5;12;13;14.1;15.1]


>> J = inv(A*w*X)
J =
   11.6986   -0.1343   -0.2610   -0.4110
   -0.1343    0.0050    0.0015    0.0034
   -0.2610    0.0015    0.0187    0.0084
   -0.4110    0.0034    0.0084    0.0152
>> K = A*w*y


K =
   1.0e+003 *
    0.2498
    2.7799
    0.6693
    5.7332
>> β = J*K
β =
   17.5284
    0.7201
   -0.1721
   -0.5931
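Collecting the β vectors printed by the four iterations above shows the updates settling down. A small Python/NumPy check (the β values are copied from the MATLAB output above; the code itself is only an illustration):

```python
import numpy as np

# beta estimates from the four Welsch iterations above
betas = np.array([
    [10.9825, 0.7686, -0.0725, -0.3476],  # iteration I
    [14.1079, 0.7489, -0.1159, -0.4651],  # iteration II
    [16.4382, 0.7283, -0.1536, -0.5517],  # iteration III
    [17.5284, 0.7201, -0.1721, -0.5931],  # iteration IV
])
# Largest absolute change in any coefficient between successive iterations;
# it shrinks each iteration, suggesting the scheme is converging.
steps = np.abs(np.diff(betas, axis=0)).max(axis=1)
```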

APPENDIX II: SPSS Output

With SPSS 19 the following results are obtained:

REGRESSION


Variables Entered/Removed(b)

Model   Variables Entered   Variables Removed   Method
1       X3, X1, X2(a)       .                   Enter

a. All requested variables entered.
b. Dependent Variable: Y


Model Summary

Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       ,909a   ,826       ,805                1,33032

a. Predictors: (Constant), X3, X1, X2

ANOVA(b)

Model          Sum of Squares   df   Mean Square   F        Sig.
1 Regression   202,176          3    67,392        38,080   ,000a
  Residual     42,474           24   1,770
  Total        244,650          27

a. Predictors: (Constant), X3, X1, X2
b. Dependent Variable: Y

Coefficients(a)

               Unstandardized Coefficients   Standardized Coefficients
Model          B        Std. Error           Beta                        t        Sig.
1 (Constant)   9,590    3,125                                            3,069    ,005
  X1           ,777     ,086                 ,800                        9,013    ,000
  X2           -,026    ,161                 -,015                       -,158    ,875
  X3           -,295    ,107                 -,275                       -2,762   ,011

a. Dependent Variable: Y
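The OLS coefficients in the table above can be reproduced directly from the salinity data of Table 3.1. A sketch in Python/NumPy, given purely as an illustration (the thesis's own computation uses SPSS):

```python
import numpy as np

# Salinity data from Table 3.1: columns are X1 (lagged salinity),
# X2 (trend), X3 (discharge), Y (salinity); n = 28 observations.
data = np.array([
    [8.2, 4, 23.005, 7.6], [7.6, 5, 23.873, 7.7], [4.6, 0, 26.417, 4.3],
    [4.3, 1, 24.868, 5.9], [5.9, 2, 29.895, 5.0], [5.0, 3, 24.200, 6.5],
    [6.5, 4, 23.215, 8.3], [8.3, 5, 21.862, 8.2], [10.1, 0, 22.274, 13.2],
    [13.2, 1, 23.830, 12.6], [12.6, 2, 25.144, 10.4], [10.4, 3, 22.430, 10.8],
    [10.8, 4, 21.785, 13.1], [13.1, 5, 22.380, 12.3], [13.3, 0, 23.927, 10.4],
    [10.4, 1, 33.443, 10.5], [10.5, 2, 24.859, 7.7], [7.7, 3, 22.686, 9.5],
    [10.0, 0, 21.789, 12.0], [12.0, 1, 22.041, 12.6], [12.1, 4, 21.033, 13.6],
    [13.6, 5, 21.005, 14.1], [15.0, 0, 25.865, 13.5], [13.5, 1, 26.290, 11.5],
    [11.5, 2, 22.932, 12.0], [12.0, 3, 21.313, 13.0], [13.0, 4, 20.769, 14.1],
    [14.1, 5, 21.393, 15.1],
])
X = np.column_stack([np.ones(len(data)), data[:, :3]])  # add intercept column
y = data[:, 3]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# -> approximately [9.590, 0.777, -0.026, -0.295], as in the SPSS table
```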

SPSS output for Least Trimmed Squares

Iteration I

Model Summary

Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       ,909a   ,826       ,805                1,3303197

a. Predictors: (Constant), X3, X1, X2

Iteration II

Model Summary

Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       ,972a   ,945       ,917                106095,3938475

a. Predictors: (Constant), X3, X1, X2


REFERENCES

Alfigari. 2002. Analisis Regresi. Sekolah Tinggi Ilmu Ekonomi YKPN. Yogyakarta.

Cahmayati, Dian and Tanuji, Hadi. 2009. Efektivitas Metode Regresi Robust Penduga Welsch dalam Mengatasi Pencilan pada Pemodelan Regresi Linier Berganda. Universitas Sriwijaya.

Dixon, Wilfrid J. and Massey, Frank J. 1991. Pengantar Analisis Statistik. Universitas Gadjah Mada. Yogyakarta.

Draper, N.R. and Smith, H. 1992. Analisis Regresi Terapan. Translated by Bambang Sumantri. Gramedia. Jakarta.

Hasan, Iqbal. 1999. Pokok-pokok Materi Statistik. Bumi Aksara. Jakarta.

Hadi, Sutrisno. 2000. Statistika. Penerbit Andi. Yogyakarta.

Heryanto, Nar and Gantin, Tuti. 2010. Pengantar Statistika Matematika. Penerbit Yrama Widia. Bandung.

Santoso, Singgih. 2004. Statistik Deskriptif. Penerbit Andi. Yogyakarta.

Supangat, Andi. 2007. Statistika dalam Kajian Deskriptif, Inferensi, dan Nonparametrik. Kencana Prenada Media Group. Bandung.

Rousseeuw, Peter J. and Leroy, Annick M. 1986. Robust Regression and Outlier Detection. John Wiley & Sons. USA.


CHAPTER 3

DISCUSSION

The following data set, which contains outliers, consists of measurements of water salinity and river discharge in North Carolina's Pamlico Sound, taken from the book Robust Regression and Outlier Detection.

Table 3.1. Salinity data

Index   Lagged salinity   Trend   Discharge   Salinity
(i)     (X1)              (X2)    (X3)        (Y)

1 8,2 4 23,005 7,6

2 7,6 5 23,873 7,7

3 4,6 0 26,417 4,3

4 4,3 1 24,868 5,9

5 5,9 2 29,895 5,0

6 5,0 3 24,200 6,5

7 6,5 4 23,215 8,3

8 8,3 5 21,862 8,2

9 10,1 0 22,274 13,2

10 13,2 1 23,830 12,6

11 12,6 2 25,144 10,4

12 10,4 3 22,430 10,8

13 10,8 4 21,785 13,1

14 13,1 5 22,380 12,3

15 13,3 0 23,927 10,4

16 10,4 1 33,443 10,5

17 10,5 2 24,859 7,7

18 7,7 3 22,686 9,5

19 10,0 0 21,789 12,0

20 12,0 1 22,041 12,6


21 12,1 4 21,033 13,6

22 13,6 5 21,005 14,1

23 15,0 0 25,865 13,5

24 13,5 1 26,290 11,5

25 11,5 2 22,932 12,0

26 12,0 3 21,313 13,0

27 13,0 4 20,769 14,1

28 14,1 5 21,393 15,1

Source: Robust Regression and Outlier Detection, 1986

3.2 Outlier Detection

Below, outliers are detected for each variable of the data set; each variable has been sorted from the smallest to the largest value.

Table 3.2. Lagged salinity data (X1)

Index   Lagged salinity   Index   Lagged salinity

1 4,3 15 10,8

2 4,6 16 11,5

3 5 17 12

4 5,9 18 12

5 6,5 19 12,1

6 7,6 20 12,6

7 7,7 21 13

8 8,2 22 13,1

9 8,3 23 13,2

10 10 24 13,3

11 10,1 25 13,5

12 10,4 26 13,6

13 10,4 27 14,1

14 10,5 28 15

Solution:

1. Q1 = (x7 + x8)/2 = (7,7 + 8,2)/2 = 7,95

2. Q3 = (x21 + x22)/2 = (13 + 13,1)/2 = 13,05

3. Q3 - Q1 = 13,05 - 7,95 = 5,1

4. The value of 1,5R = 1,5 × 5,1 = 7,65

Table 3.3. Trend data (X2)

Index   Trend   Index   Trend

1 0 15 3

2 0 16 3

3 0 17 3

4 0 18 3

5 0 19 4

6 1 20 4

7 1 21 4

8 1 22 4

9 1 23 4

10 1 24 5

11 2 25 5

12 2 26 5

13 2 27 5

14 2 28 5

Solution:

1. Q1 = (x7 + x8)/2 = (1 + 1)/2 = 1

2. Q3 = (x21 + x22)/2 = (4 + 4)/2 = 4

3. Q3 - Q1 = 4 - 1 = 3

4. The value of 1,5R = 1,5 × 3 = 4,5
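The quartile convention used throughout this section (for n = 28: Q1 as the mean of the 7th and 8th order statistics, Q3 of the 21st and 22nd) and the 1,5·IQR fence can be sketched as follows. Python is used only as an illustration, and the helper name is hypothetical:

```python
def quartiles_n28(sorted_vals):
    """Quartiles under the convention used in the text for n = 28:
    Q1 = (x7 + x8)/2 and Q3 = (x21 + x22)/2 (1-based positions)."""
    q1 = (sorted_vals[6] + sorted_vals[7]) / 2    # x7, x8
    q3 = (sorted_vals[20] + sorted_vals[21]) / 2  # x21, x22
    return q1, q3

# Trend (X2) data of Table 3.3, already sorted ascending
trend = [0]*5 + [1]*5 + [2]*4 + [3]*4 + [4]*5 + [5]*5
q1, q3 = quartiles_n28(trend)
iqr = q3 - q1       # 3
fence = 1.5 * iqr   # 4,5 as in step 4 above
```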

Table 3.4. Discharge data (X3)

Index   Discharge   Index   Discharge

1 23,005 15 22,932

2 20,769 16 23,215


3 21,005 17 23,83

4 21,033 18 23,873

5 21,313 19 23,927

6 21,393 20 24,2

7 21,785 21 24,859

8 21,789 22 24,868

9 21,862 23 25,144

10 22,041 24 25,865

11 22,274 25 26,29

12 22,38 26 26,417

13 22,43 27 29,895

14 22,686 28 33,443

Solution:

1. Q1 = (x7 + x8)/2 = (21,785 + 21,789)/2 = 21,787

2. Q3 = (x21 + x22)/2 = (24,859 + 24,868)/2 = 24,8635

3. Q3 - Q1 = 24,8635 - 21,787 = 3,0765

4. The value of 1,5R = 1,5 × 3,0765 = 4,61475

Table 3.5. Salinity data (Y)

Index   Salinity   Index   Salinity

1 4,3 15 11,5

2 5 16 12

3 5,9 17 12

4 6,5 18 12,3

5 7,6 19 12,6

6 7,7 20 12,6

7 7,7 21 13

8 8,2 22 13,1

9 8,3 23 13,2

10 9,5 24 13,5


11 10,4 25 13,6

12 10,4 26 14,1

13 10,5 27 14,1

14 10,8 28 15,1

1. Q1 = (x7 + x8)/2 = (7,7 + 8,2)/2 = 7,95

2. Q3 = (x21 + x22)/2 = (13 + 13,1)/2 = 13,05

3. Q3 - Q1 = 13,05 - 7,95 = 5,1

4. The value of 1,5R = 1,5 × 5,1 = 7,65

From the computations above, we can conclude that the X2 observations lying more than 1,5 IQR below Q1 are outliers. The complete set of detected outliers is shown below:

Table 3.6. IQR table

Variable   Q1       Q3        1,5 × IQR
X1         7,95     13,05     7,65
X2         1        4         4,5
X3         21,787   24,8635   4,61475
Y          7,95     13,05     7,65

Table 3.7. Outlier data

Index   Trend
1       0
2       0
3       0
4       0
5       0
6       1

3.3 The Least Trimmed Squares Method

The steps of the Least Trimmed Squares solution are as follows.

1. Find the values of β0, β1, β2, β3.

From the program above we obtain β1 = 0,777, β2 = -0,26, β3 = -0,295, β0 = 9,590, so the equation Ŷ = β0 + β1x1 + β2x2 + β3x3 becomes

Ŷ = 9,590 + 0,777x1 - 0,26x2 - 0,295x3

2. Next, compute the residuals.

Table 3.8. Residual values

No   (X1)   (X2)   (X3)   Y   Ŷ   Y-Ŷ   (Y-Ŷ)²

1 8,2 4 23,005 7,6 8,134925 -0,53493 0,28614

2 7,6 5 23,873 7,7 7,152665 0,547335 0,29958

3 4,6 0 26,417 4,3 5,371185 -1,07119 1,14744

4 4,3 1 24,868 5,9 5,335040 0,56496 0,31918

5 5,9 2 29,895 5 4,835275 0,164725 0,02713

6 5,0 3 24,200 6,5 5,556000 0,944 0,89114

7 6,5 4 23,215 8,3 6,752075 1,547925 2,39607

8 8,3 5 21,862 8,2 8,289810 -0,08981 0,00807

9 10,1 0 22,274 13,2 10,866870 2,33313 5,44350

10 13,2 1 23,830 12,6 12,556550 0,04345 0,00189

11 12,6 2 25,144 10,4 11,442720 -1,04272 1,08726

12 10,4 3 22,430 10,8 10,273950 0,52605 0,27673

13 10,8 4 21,785 13,1 10,515025 2,584975 6,68210

14 13,1 5 22,380 12,3 11,866600 0,4334 0,18784

15 13,3 0 23,927 10,4 12,865635 -2,46564 6,07936

16 10,4 1 33,443 10,5 7,545115 2,954885 8,73135

17 10,5 2 24,859 7,7 9,895095 -2,1951 4,81844


18 7,7 3 22,686 9,5 8,100530 1,39947 1,95852

19 10,0 0 21,789 12 10,932245 1,067755 1,14010

20 12,0 1 22,041 12,6 12,151905 0,448095 0,20079

21 12,1 4 21,033 13,6 11,746965 1,853035 3,43374

22 13,6 5 21,005 14,1 12,660725 1,439275 2,07151

23 15,0 0 25,865 13,5 13,614825 -0,11483 0,01318

24 13,5 1 26,290 11,5 12,063950 -0,56395 0,31804

25 11,5 2 22,932 12 11,240560 0,75944 0,57675

26 12,0 3 21,313 13 11,846665 1,153335 1,33018

27 13,0 4 20,769 14,1 12,524145 1,575855 2,48332

28 14,1 5 21,393 15,1 12,934765 2,165235 4,68824

TOTAL 295,5 279,071815 16,42819 56,89757
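Each Ŷ in Table 3.8 follows from substituting a row's X values into the fitted equation with the coefficients the table actually uses (9,590; 0,777; -0,26; -0,295). A quick Python check of the first two rows, given only as an illustration:

```python
def y_hat(x1, x2, x3):
    # Fitted equation underlying Table 3.8
    return 9.590 + 0.777 * x1 - 0.26 * x2 - 0.295 * x3

# First two rows of Table 3.8: (X1, X2, X3, Y)
rows = [(8.2, 4, 23.005, 7.6), (7.6, 5, 23.873, 7.7)]
residuals = [y - y_hat(x1, x2, x3) for x1, x2, x3, y in rows]
# -> approximately [-0.534925, 0.547335], matching the table
```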

ITERASI 1

Langkah awal yang dilakukan adalah menentukan coverage (h):

h = [n/2] + [(p+1)/2]
h = [(n+p+1)/2]
h = [(28+3+1)/2]
h = 16
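Perhitungan coverage di atas dapat disketsakan sebagai berikut (sketsa Python, bukan kode asli skripsi; tanda [.] dibaca sebagai pembulatan ke bawah):

```python
# Sketsa perhitungan coverage h pada LTS: h = [n/2] + [(p+1)/2],
# dengan n banyak pengamatan dan p banyak variabel bebas.
def coverage(n, p):
    return n // 2 + (p + 1) // 2

h_iter1 = coverage(28, 3)  # data lengkap, 3 variabel bebas
h_iter2 = coverage(16, 3)  # 16 pengamatan hasil pemangkasan pada iterasi berikutnya
```

Nilai yang dihasilkan sesuai dengan h = 16 pada iterasi 1 dan h = 10 pada iterasi 2 dalam teks.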

Selanjutnya, mengurutkan nilai kuadrat residual dari yang terkecil sampai ke yang terbesar:

Tabel 3.9. Kuadrat Residual

No Kuadrat Residual No Kuadrat Residual

1 0,00189 15 1,14010

2 0,00807 16 1,14744

3 0,01318 17 1,33018

4 0,02713 18 1,95852

5 0,18784 19 2,07151

6 0,20079 20 2,39607

7 0,27673 21 2,48332

8 0,28614 22 3,43374

9 0,29958 23 4,68824

10 0,31804 24 4,81844

11 0,31918 25 5,44350



14 1,08726 28 8,73135

Karena h = 16, maka residual yang digunakan adalah 16 kuadrat residual terkecil:

Tabel 3.10. Kuadrat Residual setelah diurutkan, iterasi 1

No. Residual kuadrat No. Residual kuadrat

1 0,00189 9 0,29958

2 0,00807 10 0,31804

3 0,01318 11 0,31918

4 0,02713 12 0,57675

5 0,18784 13 0,89114

6 0,20079 14 1,08726

7 0,27673 15 1,1401

8 0,28614 16 1,14744

Nilai jumlah kuadrat residual dari h pengamatan:

∑_{i=1}^{h_new} r_i²(β̂_new) = 6,78126
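Jumlah h kuadrat residual terkecil di atas dapat diperiksa dengan sketsa Python berikut (bukan kode asli skripsi), memakai ke-16 nilai pada Tabel 3.10:

```python
# Sketsa fungsi objektif LTS: jumlah h kuadrat residual terkecil.
# Nilai kuadrat residual diambil dari Tabel 3.10 (h = 16).
r2 = [0.00189, 0.00807, 0.01318, 0.02713, 0.18784, 0.20079, 0.27673,
      0.28614, 0.29958, 0.31804, 0.31918, 0.57675, 0.89114, 1.08726,
      1.14010, 1.14744]

def objektif_lts(kuadrat_residual, h):
    """Mengurutkan kuadrat residual lalu menjumlahkan h nilai terkecil."""
    return sum(sorted(kuadrat_residual)[:h])

nilai_objektif = objektif_lts(r2, 16)
```

Hasilnya sama dengan nilai 6,78126 yang tercantum dalam teks.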

Data setelah diurutkan dari residual terkecil sampai ke residual terbesar:

Tabel 3.11. Data setelah diurutkan dari kuadrat terkecil

No X1 X2 X3 Y

1 13,2 1 23,830 12,6

2 8,3 5 21,862 8,2

3 15,0 0 25,865 13,5

4 5,9 2 29,895 5

5 13,1 5 22,380 12,3

6 12,0 1 22,041 12,6

7 10,4 3 22,430 10,8

8 8,2 4 23,005 7,6

9 7,6 5 23,873 7,7

10 13,5 1 26,290 11,5

11 4,3 1 24,868 5,9

12 11,5 2 22,932 12



16 4,6 0 26,417 4,3

Jumlah 155,2 35 386,821 152,9

Dengan SPSS diperoleh persamaan Ŷ = 13,865 + 0,731X1 - 0,286X2 - 0,446X3 (estimasi parameter b_new dari h pengamatan).

Dengan mengalikan b_new terhadap Xi dan Y diperoleh data sebagai berikut:

Tabel 3.12. Perkalian b_new pada iterasi 1

No  X1  X2  X3  Ŷnew  Y-Ŷnew  (Y-Ŷnew)²  Ŷ  Y
1 89,5126 6,7813 161,5974 85,4448 -72,8439 5306,2274 12,6000 85,4439
2 56,2845 33,9063 148,2519 59,3494 -46,8545 2195,3427 8,7518 55,6067
3 101,7189 0,0000 175,3973 90,1523 -78,2528 6123,5007 13,2942 91,5476
4 40,0094 13,5625 202,7258 28,9753 -29,6336 878,1485 4,2727 33,9065
5 88,8345 33,9063 151,7646 81,5768 -71,3799 5095,0870 12,0296 83,4100
6 81,3751 6,7813 149,4658 84,9070 -72,9232 5317,7876 12,5207 85,4444
7 70,5251 20,3438 152,1037 71,9203 -62,6320 3922,7659 10,6056 73,2380
8 55,6063 27,1250 156,0029 57,3362 -43,0826 1856,1109 8,4550 51,5379
9 51,5376 33,9063 161,8890 49,7973 -44,8725 2013,5377 7,3432 52,2160
10 91,5470 6,7813 178,2793 79,4918 -66,2623 4390,6964 11,7222 77,9850
11 29,1594 6,7813 168,6364 38,1873 -34,3783 1181,8649 5,6312 40,0097
12 77,9845 13,5625 155,5079 77,7943 -69,9033 4886,4702 11,4718 81,3756
13 33,9063 20,3438 164,1065 39,7987 -38,2094 1459,9575 5,8688 44,0785
14 85,4439 13,5625 170,5080 76,5570 -59,2357 3508,8715 11,2894 70,5255
15 67,8126 0,0000 147,7569 77,6944 -69,9180 4888,5287 11,4571 81,3756
16 31,1938 0,0000 179,1405 36,9290 -23,7138 562,3443 5,4456 29,1596

Dengan SPSS akan diperoleh persamaan sebagai berikut:

Ŷ = 94,023 + 0,731X1 - 0,286X2 - 0,446X3

ITERASI II

Langkah awal yang dilakukan adalah menentukan coverage (h):

h = [n/2] + [(p+1)/2]



Selanjutnya, mengurutkan nilai kuadrat residual dari yang terkecil sampai yang terbesar. Karena h = 10, maka residual yang digunakan adalah 10 kuadrat residual terkecil:

Tabel 3.13. Kuadrat Residual yang diurutkan dari yang terkecil

No Kuadrat Residual No Kuadrat Residual

1 562,3443 6 2013,538

2 878,1485 7 2195,343

3 1181,865 8 3508,872

4 1459,958 9 3922,766

5 1856,111 10 4390,696

Nilai jumlah kuadrat residual dari h pengamatan:

∑_{i=1}^{h_new} r_i²(β̂_new) = 21969,64

Didapatkan data berdasarkan kuadrat residual terkecil sampai terbesar menjadi:

Tabel 3.14. Data berdasarkan Kuadrat Residual dari yang terkecil

Dengan SPSS diperoleh persamaan Ŷ = 68,212 + 0,679X1 - 0,9X2 - 0,32X3 (estimasi parameter b_new dari h pengamatan).

Tabel 3.15. Estimasi parameter b_new

No  X1  X2  X3  Y  Ŷnew
1 31,1938 0,0000 179,1405 29,1596 32,0676
2 40,0094 13,5625 202,7258 33,9065 18,2999
3 29,1594 6,7813 168,6364 40,0097 27,9444
4 33,9063 20,3438 164,1065 44,0785 20,4109
5 55,6063 27,1250 156,0029 51,5379 31,6352
6 51,5376 33,9063 161,8890 52,2160 20,8859
7 56,2845 33,9063 148,2519 55,6067 28,4729
8 85,4439 13,5625 170,5080 70,5255 59,4596
9 70,5251 20,3438 152,1037 73,2380 49,1159
10 91,5470 6,7813 178,2793 77,9850 67,2199



2 878992,11 297963,24 4453812,84 744913,60 482208,25 -744895,30
3 640621,52 148982,72 3704881,00 878998,71 680614,50 -878970,76
4 744909,20 446945,96 3605360,73 968388,78 513314,04 -968368,37
5 1221650,39 595926,49 3427327,55 1132269,11 756704,86 -1132237,47
6 1132262,52 744909,20 3556643,05 1147166,72 522872,76 -1147145,84
7 1236550,20 744909,20 3257040,87 1221659,18 684163,96 -1221630,71
8 1877171,72 297963,24 3745999,38 1549419,85 1373731,87 -1549360,39
9 1549411,06 446945,96 3341663,53 1609012,49 1139207,36 -1608963,38
10 2011254,63 148982,72 3916732,04 1713302,38 1547295,37 -1713235,16

Dengan SPSS akan diperoleh persamaan sebagai berikut:

Ŷ = 1.498.591 + 0,679X1 - 0,9X2 - 0,302X3

3.3.1 Interpretasi dari persamaan dengan Least Trimmed Square

Interpretasi dari persamaan ini adalah: tiap titik dalam garis persamaan regresi merupakan sebuah estimasi yang diharapkan atau nilai rata-rata variabel Y apabila dihubungkan dengan nilai X tertentu, yang biasa ditulis dengan E(Y|X). Artinya, setiap pertambahan nilai X1, X2 dan X3 sebesar 1 maka didapatkan nilai Ŷ sebesar 1.498.590,48.

Koefisien determinasinya adalah

R² = (b1 ∑x1y + b2 ∑x2y + b3 ∑x3y) / ∑y²

Dengan software SPSS diperoleh nilai R² pada iterasi 1 = 0,826 dan nilai R² adjusted = 0,805.

Dari iterasi ke-II diperoleh nilai R² = 0,945.

3.4 Fungsi obyektif, fungsi pengaruh dan pembobot Welsch

Metode regresi robust dengan fungsi objektif, fungsi pengaruh dan fungsi pembobot Welsch menghasilkan model yang lebih baik. Metode ini adalah salah satu type dari regresi robust dengan penduga M.



2. Mengestimasi parameter model regresi menggunakan metode kuadrat terkecil sehingga didapatkan ŷ_{i,0} dan menghitung ε_{i,0} = y_i - ŷ_{i,0}.

3. Menentukan σ̂0 dan pembobot awal w_{i,0} = ψ(ε*_{i,0}) / ε*_{i,0}, dengan ε*_{i,0} = ε_{i,0}/σ̂0. Nilai σ̂0 diperoleh dengan menggunakan rumus

σ̂0 = MAR/0,6745 = [(1/n) ∑_{i=1}^{n} |Y_i - Ŷ_i|] / 0,6745

untuk masing-masing iterasi t.

4. Berdasarkan tabel diperoleh ψ(ε*_{i,0}) = ε*_{i,0} exp(-(ε*_{i,0}/c)²).

5. Mencari estimasi pada masing-masing iterasi dengan weighted least square, yaitu

β̂_t = (X^T W_{t-1} X)^{-1} X^T W_{t-1} Y

dengan Y adalah vektor kolom berukuran n×1 pengamatan dari variabel tak bebas y, dan X adalah matriks berukuran n×p dengan n pengamatan dari p variabel bebas x1 sampai dengan xp.

Tahap (3) dan (4) diulang sampai diperoleh estimasi parameter model yang konvergen, artinya selisih hasil iterasi t dengan t-1 bernilai 0.

6. Perhitungan dilakukan menggunakan komputer.
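Langkah weighted least square β = (X^T W X)^{-1} X^T W Y dapat disketsakan dalam Python murni sebagai berikut. Ini bukan kode asli skripsi (perhitungan di skripsi memakai matlab); fungsi `selesaikan` dan `wls` hanya ilustrasi, dengan W diagonal yang diwakili daftar bobot.

```python
# Sketsa weighted least squares: beta = (X^T W X)^(-1) X^T W Y,
# diselesaikan sebagai sistem linear (X^T W X) beta = X^T W Y.

def selesaikan(A, b):
    """Menyelesaikan A x = b dengan eliminasi Gauss berpivot parsial."""
    n = len(A)
    M = [baris[:] + [b[i]] for i, baris in enumerate(A)]
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[piv] = M[piv], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def wls(X, y, w):
    """beta = (X^T W X)^(-1) X^T W y; kolom pertama X berisi 1 untuk intersep."""
    p = len(X[0])
    XtWX = [[sum(w[i] * X[i][a] * X[i][b] for i in range(len(X)))
             for b in range(p)] for a in range(p)]
    XtWy = [sum(w[i] * X[i][a] * y[i] for i in range(len(X))) for a in range(p)]
    return selesaikan(XtWX, XtWy)

# Contoh kecil: data tepat pada garis y = 1 + 2x, bobot seragam
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [1.0, 3.0, 5.0, 7.0]
beta = wls(X, y, [1.0, 1.0, 1.0, 1.0])
```

Pada regresi robust, daftar bobot `w` diisi dengan bobot Welsch dari iterasi sebelumnya, lalu `wls` dipanggil ulang sampai β konvergen.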

Penyelesaian dari Estimasi M dengan Type Welsch

1. Data yang diambil mengandung data pencilan, yaitu data pengukuran keasinan garam dan arus sungai di Carolina's Pamlico Sound Utara dari buku Robust Regression And Outlier Detection.

2. Mengestimasi parameter model regresi menggunakan metode kuadrat terkecil sehingga didapatkan ŷ_{i,0} dan menghitung ε_{i,0} = y_i - ŷ_{i,0}. Dari program di atas didapat nilai:

β1 = 0,777, β2 = -0,26, β3 = -0,295, β0 = 9,590

Sehingga persamaan Ŷ = β0 + β1X1 + β2X2 + β3X3 menjadi

Ŷ = 9,590 + 0,777X1 - 0,26X2 - 0,295X3



Tabel 3.16. Nilai Residual

NO  X1  X2  X3  Y  Ŷ  ε_{i,0} = Y-Ŷ  |Y-Ŷ|

1 8,2 4 23,005 7,6 8,134925 -0,53493 0,53493

2 7,6 5 23,873 7,7 7,152665 0,547335 0,547335

3 4,6 0 26,417 4,3 5,371185 -1,07119 1,07119

4 4,3 1 24,868 5,9 5,335040 0,56496 0,56496

5 5,9 2 29,895 5 4,835275 0,164725 0,164725

6 5,0 3 24,200 6,5 5,556000 0,944 0,944

7 6,5 4 23,215 8,3 6,752075 1,547925 1,547925

8 8,3 5 21,862 8,2 8,289810 -0,08981 0,08981

9 10,1 0 22,274 13,2 10,866870 2,33313 2,33313

10 13,2 1 23,830 12,6 12,556550 0,04345 0,04345

11 12,6 2 25,144 10,4 11,442720 -1,04272 1,04272

12 10,4 3 22,430 10,8 10,273950 0,52605 0,52605

13 10,8 4 21,785 13,1 10,515025 2,584975 2,584975

14 13,1 5 22,380 12,3 11,866600 0,4334 0,4334

15 13,3 0 23,927 10,4 12,865635 -2,46564 2,46564

16 10,4 1 33,443 10,5 7,545115 2,954885 2,954885

17 10,5 2 24,859 7,7 9,895095 -2,1951 2,1951

18 7,7 3 22,686 9,5 8,100530 1,39947 1,39947

19 10,0 0 21,789 12 10,932245 1,067755 1,067755

20 12,0 1 22,041 12,6 12,151905 0,448095 0,448095

21 12,1 4 21,033 13,6 11,746965 1,853035 1,853035

22 13,6 5 21,005 14,1 12,660725 1,439275 1,439275

23 15,0 0 25,865 13,5 13,614825 -0,11483 0,11483

24 13,5 1 26,290 11,5 12,063950 -0,56395 0,56395

25 11,5 2 22,932 12 11,240560 0,75944 0,75944

26 12,0 3 21,313 13 11,846665 1,153335 1,153335

27 13,0 4 20,769 14,1 12,524145 1,575855 1,575855

28 14,1 5 21,393 15,1 12,934765 2,165235 2,165235



3. Menentukan σ̂0 dan pembobot awal w_{i,0} = ψ(ε*_{i,0})/ε*_{i,0}, dengan ε*_{i,0} = ε_{i,0}/σ̂0. Nilai σ̂0 diperoleh dengan menggunakan rumus

σ̂0 = MAR/0,6745 = [(1/n) ∑_{i=1}^{n} |Y_i - Ŷ_i|] / 0,6745

untuk masing-masing iterasi t, dan diperoleh ψ(ε*_{i,0}) = ε*_{i,0} exp(-(ε*_{i,0}/c)²) dengan c = 2,3849.

σ̂0 = [(1/28)(32,58450)] / 0,6745 = 1,1637/0,6745 = 1,7253
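Perhitungan σ̂0 di atas dapat diperiksa dengan sketsa Python berikut (bukan kode asli skripsi), memakai jumlah residual absolut 32,58450 dari Tabel 3.16:

```python
# Sketsa perhitungan skala awal: sigma0 = [(1/n) * sum |Y - Yhat|] / 0.6745
def sigma_topi(jumlah_abs_residual, n):
    """Estimasi skala sesuai rumus MAR/0.6745 pada teks."""
    return (jumlah_abs_residual / n) / 0.6745

sigma0 = sigma_topi(32.58450, 28)
```

Hasilnya cocok dengan nilai 1,7253 yang tercantum dalam teks.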

Tabel 3.17. Pembobot awal

ε*_{i,0} = ε_{i,0}/σ̂0  |ε*_{i,0}|  ψ(ε*_{i,0})  w(ε*_{i,0})
-0,3101 0,3101 -0,8024 0,9832
0,3172 0,3172 -0,1551 0,9825
-0,6209 0,6209 -0,6706 0,9345
0,3275 0,3275 0,2175 0,9813
0,0955 0,0955 0,3494 0,9984
0,5472 0,5472 0,2774 0,9487
0,8972 0,8972 0,5403 0,8680
-0,0521 0,0521 -0,7155 0,9995
1,3523 1,3523 1,0008 0,7250
0,0252 0,0252 -0,1020 0,9999
-0,6044 0,6044 -0,7551 0,9378
0,3049 0,3049 -0,0971 0,9838
1,4983 1,4983 0,9556 0,6739



1,7127 1,7127 0,6640 0,5971
-1,2723 1,2723 -1,0207 0,7523
0,8111 0,8111 0,4924 0,8908
0,6189 0,6189 0,4548 0,9349
0,2597 0,2597 -0,0003 0,9882
1,0740 1,0740 0,6307 0,8164
0,8342 0,8342 0,3198 0,8848
-0,0666 0,0666 0,0911 0,9992
-0,3269 0,3269 -0,2986 0,9814
0,4402 0,4402 0,2201 0,9665
0,6685 0,6685 0,2773 0,9244
0,9134 0,9134 0,4552 0,8636
1,2550 1,2550 0,7813 0,7581
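Fungsi pembobot Welsch w(u) = exp(-(u/c)²), dengan ψ(u) = u·exp(-(u/c)²) dan nilai c = 2,3849 seperti pada teks, dapat disketsakan sebagai berikut (sketsa Python, bukan kode asli skripsi):

```python
import math

# Sketsa fungsi pengaruh dan pembobot Welsch dengan c = 2.3849 (nilai pada teks);
# bobot w(u) = psi(u)/u = exp(-(u/c)^2).
C = 2.3849

def psi_welsch(u, c=C):
    return u * math.exp(-(u / c) ** 2)

def bobot_welsch(u, c=C):
    return math.exp(-(u / c) ** 2)

w1 = bobot_welsch(-0.3101)  # baris pertama Tabel 3.17
w9 = bobot_welsch(1.3523)   # baris kesembilan Tabel 3.17
```

Bobot yang dihasilkan cocok dengan kolom w pada Tabel 3.17 (0,9832 dan 0,7250).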

4. Mencari estimasi pada masing-masing iterasi dengan weighted least square, yaitu β = (X^T W_{t-1} X)^{-1} X^T W_{t-1} Y.

Dengan matlab diperoleh iterasi 1, β = (X^T W0 X)^{-1} X^T W0 Y:

β =
10.9825
0.7686
-0.0725
-0.3476

Iterasi 2

Mencari Nilai Residual

Tabel 3.18. Nilai Residual Iterasi 2

NO  X1  X2  X3  Y  Ŷ  ε_{i,0} = Y-Ŷ  |Y-Ŷ|

1 8,2 4 23,005 7,6 8,9985 -0,53493 0,53493




4 4,3 1 24,868 5,9 5,5709 0,56496 0,56496

5 5,9 2 29,895 5 4,9807 0,164725 0,164725

6 5,0 3 24,200 6,5 6,1961 0,944 0,944

7 6,5 4 23,215 8,3 7,6189 1,547925 1,547925

8 8,3 5 21,862 8,2 9,4001 -0,08981 0,08981

9 10,1 0 22,274 13,2 11,0029 2,33313 2,33313

10 13,2 1 23,830 12,6 12,7722 0,04345 0,04345

11 12,6 2 25,144 10,4 11,7818 -1,04272 1,04272

12 10,4 3 22,430 10,8 10,9618 0,52605 0,52605

13 10,8 4 21,785 13,1 11,4209 2,584975 2,584975

14 13,1 5 22,380 12,3 12,9094 0,4334 0,4334

15 13,3 0 23,927 10,4 12,8879 -2,46564 2,46564

16 10,4 1 33,443 10,5 7,2787 2,954885 2,954885

17 10,5 2 24,859 7,7 10,2668 -2,1951 2,1951

18 7,7 3 22,686 9,5 8,7976 1,39947 1,39947

19 10,0 0 21,789 12 11,0946 1,067755 1,067755

20 12,0 1 22,041 12,6 12,4717 0,448095 0,448095

21 12,1 4 21,033 13,6 12,6815 1,853035 1,853035

22 13,6 5 21,005 14,1 13,7716 1,439275 1,439275

23 15,0 0 25,865 13,5 13,5208 -0,11483 0,11483

24 13,5 1 26,290 11,5 12,1477 -0,56395 0,56395

25 11,5 2 22,932 12 11,7052 0,75944 0,75944

26 12,0 3 21,313 13 12,5798 1,153335 1,153335

27 13,0 4 20,769 14,1 13,4650 1,575855 1,575855

28 14,1 5 21,393 15,1 14,0211 2,165235 2,165235



σ̂0 = MAR/0,6745 = [(1/n) ∑_{i=1}^{n} |Y_i - Ŷ_i|] / 0,6745 = [(1/28)(32,58450)] / 0,6745 = 1,1637/0,6745 = 1,7253

Tabel 3.19. Nilai pembobot Iterasi 2

ε*_{i,0} = ε_{i,0}/σ̂0  |ε*_{i,0}|  ψ(ε*_{i,0})  w(ε*_{i,0})
-0,3101 0,3101 -0,8024 0,9832
0,3172 0,3172 -0,1551 0,9825
-0,6209 0,6209 -0,6706 0,9345
0,3275 0,3275 0,2175 0,9813
0,0955 0,0955 0,3494 0,9984
0,5472 0,5472 0,2774 0,9487
0,8972 0,8972 0,5403 0,8680
-0,0521 0,0521 -0,7155 0,9995
1,3523 1,3523 1,0008 0,7250
0,0252 0,0252 -0,1020 0,9999
-0,6044 0,6044 -0,7551 0,9378
0,3049 0,3049 -0,0971 0,9838
1,4983 1,4983 0,9556 0,6739
0,2512 0,2512 -0,2803 0,9890
-1,4291 1,4291 -1,0147 0,6983
1,7127 1,7127 0,6640 0,5971
-1,2723 1,2723 -1,0207 0,7523
0,8111 0,8111 0,4924 0,8908
0,6189 0,6189 0,4548 0,9349
0,2597 0,2597 -0,0003 0,9882
1,0740 1,0740 0,6307 0,8164
0,8342 0,8342 0,3198 0,8848
-0,0666 0,0666 0,0911 0,9992



0,4402 0,4402 0,2201 0,9665
0,6685 0,6685 0,2773 0,9244
0,9134 0,9134 0,4552 0,8636
1,2550 1,2550 0,7813 0,7581

4. Mencari estimasi pada masing-masing iterasi dengan weighted least square, yaitu β = (X^T W1 X)^{-1} X^T W1 Y.

Dengan matlab diperoleh iterasi 2:

β =
14.107
0.7489
-0.1159
-0.4651

ITERASI 3

Mencari Nilai Residual

Tabel 3.20. Nilai Residual Iterasi 3

NO  X1  X2  X3  Y  Ŷ  ε_{i,0} = Y-Ŷ  |Y-Ŷ|

1 8,2 4 23,005 7,6 9,0857 -1,4857 1,4857

2 7,6 5 23,873 7,7 8,1167 -0,4167 0,4167

3 4,6 0 26,417 4,3 5,2663 -0,9663 0,9663

4 4,3 1 24,868 5,9 5,6462 0,2538 0,2538

5 5,9 2 29,895 5 4,3904 0,6096 0,6096

6 5,0 3 24,200 6,5 6,2493 0,2507 0,2507

7 6,5 4 23,215 8,3 7,7149 0,5851 0,5851

8 8,3 5 21,862 8,2 9,5763 -1,3763 1,3763

9 10,1 0 22,274 13,2 11,3122 1,8878 1,8878

10 13,2 1 23,830 12,6 12,7941 -0,1941 0,1941



13 10,8 4 21,785 13,1 11,6002 1,4998 1,4998

14 13,1 5 22,380 12,3 12,9301 -0,6301 0,6301

15 13,3 0 23,927 10,4 12,9398 -2,5398 2,5398

16 10,4 1 33,443 10,5 6,2262 4,2738 4,2738

17 10,5 2 24,859 7,7 10,1776 -2,4776 2,4776

18 7,7 3 22,686 9,5 8,9755 0,5245 0,5245

19 10,0 0 21,789 12 11,4628 0,5372 0,5372

20 12,0 1 22,041 12,6 12,7275 -0,1275 0,1275

21 12,1 4 21,033 13,6 12,9235 0,6765 0,6765

22 13,6 5 21,005 14,1 13,9440 0,1560 0,1560

23 15,0 0 25,865 13,5 13,3116 0,1884 0,1884

24 13,5 1 26,290 11,5 11,8747 -0,3747 0,3747

25 11,5 2 22,932 12 11,8228 0,1772 0,1772

26 12,0 3 21,313 13 12,8343 0,1657 0,1657

27 13,0 4 20,769 14,1 13,7203 0,3797 0,3797

28 14,1 5 21,393 15,1 14,1380 0,9620 0,9620

JUMLAH 295,5 25,2509

4. Mencari estimasi pada masing-masing iterasi dengan weighted least square, yaitu β = (X^T W_{t-1} X)^{-1} X^T W_{t-1} Y.

σ̂2 = MAR/0,6745 = [(1/n) ∑_{i=1}^{n} |Y_i - Ŷ_i|] / 0,6745 = [(1/28)(25,2509)] / 0,6745 = 0,9018/0,6745 = 1,3370

Tabel 3.21. Nilai pembobot iterasi 3

ε*_i = ε_i/σ̂2  |ε*_i|  ψ(ε*_i)  w(ε*_i)
-1,1112 1,1112 -1,0078 0,6784
-0,3117 0,3117 -0,4042 0,9699



0,4559 0,4559 0,5710 0,9368
0,1875 0,1875 0,2480 0,9890
0,4377 0,4377 0,5510 0,9416
-1,0293 1,0293 -0,9864 0,7168
1,4120 1,4120 1,0089 0,5344
-0,1452 0,1452 -0,1929 0,9934
-0,9108 0,9108 -0,9383 0,7705
-0,2368 0,2368 -0,3110 0,9825
1,1217 1,1217 1,0099 0,6734
-0,4712 0,4712 -0,5876 0,9326
-1,8996 1,8996 -0,8171 0,3217
3,1965 3,1965 0,1722 0,0403
-1,8531 1,8531 -0,8420 0,3398
0,3923 0,3923 0,4998 0,9528
0,4018 0,4018 0,5106 0,9505
-0,0954 0,0954 -0,1272 0,9971
0,5059 0,5059 0,6242 0,9227
0,1167 0,1167 0,1553 0,9957
0,1409 0,1409 0,1872 0,9938
-0,2802 0,2802 -0,3655 0,9756
0,1326 0,1326 0,1762 0,9945
0,1239 0,1239 0,1649 0,9952
0,2840 0,2840 0,3702 0,9750
0,7195 0,7195 0,8175 0,8498

Dengan matlab diperoleh iterasi 3:

β =
16.4382
0.7283
-0.1536
-0.5517



ITERASI 4

Mencari Nilai Residual

Tabel 3.22. Nilai Residual Iterasi 4

NO  X1  X2  X3  Y  Ŷ  ε_{i,0} = Y-Ŷ  |Y-Ŷ|

1 8,2 4 23,005 7,6 9,1040 -1,5040 1,5040

2 7,6 5 23,873 7,7 8,0345 -0,3345 0,3345

3 4,6 0 26,417 4,3 5,2141 -0,9141 0,9141

4 4,3 1 24,868 5,9 5,6966 0,2034 0,2034

5 5,9 2 29,895 5 3,9349 1,0651 1,0651

6 5,0 3 24,200 6,5 6,2678 0,2322 0,2322

7 6,5 4 23,215 8,3 7,7500 0,5500 0,5500

8 8,3 5 21,862 8,2 9,6538 -1,4538 1,4538

9 10,1 0 22,274 13,2 11,5055 1,6945 1,6945

10 13,2 1 23,830 12,6 12,7511 -0,1511 0,1511

11 12,6 2 25,144 10,4 11,4356 -1,0356 1,0356

12 10,4 3 22,430 10,8 11,1771 -0,3771 0,3771

13 10,8 4 21,785 13,1 11,6707 1,4293 1,4293

14 13,1 5 22,380 12,3 12,8639 -0,5639 0,5639

15 13,3 0 23,927 10,4 12,9241 -2,5241 2,5241

16 10,4 1 33,443 10,5 5,4084 5,0916 5,0916

17 10,5 2 24,859 7,7 10,0634 -2,3634 2,3634

18 7,7 3 22,686 9,5 9,0694 0,4306 0,4306

19 10,0 0 21,789 12 11,7002 0,2998 0,2998

20 12,0 1 22,041 12,6 12,8642 -0,2642 0,2642

21 12,1 4 21,033 13,6 13,0323 0,5677 0,5677

22 13,6 5 21,005 14,1 13,9866 0,1134 0,1134

23 15,0 0 25,865 13,5 13,0930 0,4070 0,4070

24 13,5 1 26,290 11,5 11,6125 -0,1125 0,1125

25 11,5 2 22,932 12 11,8549 0,1451 0,1451



28 14,1 5 21,393 15,1 14,1367 0,9633 0,9633

JUMLAH 295,5 25,0993

4. Mencari estimasi pada masing-masing iterasi dengan weighted least square, yaitu β = (X^T W_{t-1} X)^{-1} X^T W_{t-1} Y.

σ̂3 = MAR/0,6745 = [(1/n) ∑_{i=1}^{n} |Y_i - Ŷ_i|] / 0,6745 = [(1/28)(25,0993)] / 0,6745 = 0,8964/0,6745 = 1,3290

Tabel 3.23. Nilai pembobot iterasi 4

ε*_i = ε_i/σ̂3  |ε*_i|  ψ(ε*_i)  w(ε*_i)
-1,1317 1,1317 -1,0105 0,6719
-0,2517 0,2517 -0,3280 0,9805
-0,6878 0,6878 -0,7892 0,8634
0,1530 0,1530 0,2019 0,9928
0,8014 0,8014 0,8725 0,8192
0,1747 0,1747 0,2300 0,9906
0,4138 0,4138 0,5215 0,9482
-1,0939 1,0939 -1,0026 0,6896
1,2751 1,2751 1,0228 0,6036
-0,1137 0,1137 -0,1505 0,9960
-0,7793 0,7793 -0,8577 0,8281
-0,2837 0,2837 -0,3678 0,9753
1,0755 1,0755 0,9980 0,6982
-0,4243 0,4243 -0,5332 0,9456
-1,8992 1,8992 -0,8235 0,3262
3,8312 3,8312 0,0534 0,0105
-1,7784 1,7784 -0,8852 0,3745
0,3240 0,3240 0,4167 0,9679



0,4271 0,4271 0,5364 0,9449
0,0853 0,0853 0,1131 0,9977
0,3063 0,3063 0,3953 0,9713
-0,0846 0,0846 -0,1122 0,9978
0,1092 0,1092 0,1446 0,9963
0,0311 0,0311 0,0414 0,9997
0,2006 0,2006 0,2632 0,9876
0,7248 0,7248 0,8183 0,8495

Dengan matlab diperoleh iterasi 4:

β =
17.5284
0.7201
-0.1721
-0.5931

Nilai β0, β1, β2, β3 dengan iterasi:

Tabel 3.24. Nilai β0, β1, β2, β3 dengan iterasi

           β0       β1      β2       β3
Iterasi 1  10.9825  0.7686  -0.0725  -0.3476
Iterasi 2  14.1079  0.7489  -0.1159  -0.4651
Iterasi 3  16.4382  0.7283  -0.1536  -0.5517
Iterasi 4  17.5284  0.7201  -0.1721  -0.5931
Iterasi 5  17.9654  0.7178  -0.1799  -0.6102



Iterasi 8   18.2602  0.7168  -0.1855  -0.6221
Iterasi 9   18.2778  0.7168  -0.1858  -0.6228
Iterasi 10  18.2841  0.7168  -0.1859  -0.6230
Iterasi 11  18.2848  0.7168  -0.1860  -0.6231
Iterasi 12  18.2874  0.7168  -0.1860  -0.6232
Iterasi 13  18.2855  0.7168  -0.1860  -0.6231
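Kriteria konvergensi iterasi (selisih estimasi parameter antar iterasi mendekati nol) dapat disketsakan sebagai berikut (sketsa Python, bukan kode asli skripsi), memakai nilai β iterasi 10 dan 11 dari Tabel 3.24:

```python
# Sketsa kriteria konvergensi: iterasi dihentikan saat selisih absolut
# maksimum antar komponen parameter dua iterasi berurutan di bawah toleransi.
beta_iterasi_10 = [18.2841, 0.7168, -0.1859, -0.6230]
beta_iterasi_11 = [18.2848, 0.7168, -0.1860, -0.6231]

def selisih_maks(b_lama, b_baru):
    """Selisih absolut terbesar antar komponen parameter."""
    return max(abs(a - b) for a, b in zip(b_lama, b_baru))

# Dengan toleransi 0.001, selisih iterasi 10 -> 11 sudah dianggap konvergen
konvergen = selisih_maks(beta_iterasi_10, beta_iterasi_11) < 1e-3
```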

3.4.1 Interpretasi dari persamaan dengan estimasi M type Welsch

Berdasarkan hasil iterasi terlihat bahwa selisih estimasi parameter antara iterasi ke-10 dan iterasi ke-11 sudah konvergen (mendekati nol), sehingga diperoleh model persamaan regresi robust

Ŷ = 18,2848 + 0,7168X1 - 0,1860X2 - 0,6231X3

Artinya, setiap pertambahan nilai X1, X2 dan X3 sebesar 1 maka didapatkan nilai Ŷ sebesar 18,1925.

KOEFISIEN DETERMINASI

Berdasarkan nilai R² dapat diketahui tingkat signifikansi atau kesesuaian hubungan antara variabel tak bebas dengan variabel bebas dalam model regresi yang dihasilkan, menggunakan rumus

R² = (b1 ∑x1y + b2 ∑x2y + b3 ∑x3y) / ∑y²

∑x1y = ∑X1Y - (∑X1 ∑Y)/n = 3272,6100 - 3053,1482 = 219,4618



∑x2y = ∑X2Y - (∑X2 ∑Y)/n = 756,5000 - 738,7500 = 17,7500

∑x3y = ∑X3Y - (∑X3 ∑Y)/n = 6904,3464 - 7013,0909 = -108,7445

∑y² = ∑Y² - n(Ȳ)² = 3363,2300 - 3118,5804 = 244,6496

nY Y y 6496 , 244 7445 , 108 6232 , 0 7500 , 17 186 , 0 157,3102 7168 , 0

2 x x x

R   

2

R =0,9065=90,65%

Jadi koefisien determinasinya adalah 90,65%
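Perhitungan R² di atas dapat diperiksa dengan sketsa Python berikut (bukan kode asli skripsi), memakai nilai-nilai jumlah silang dari teks:

```python
# Sketsa perhitungan koefisien determinasi:
# R^2 = (b1*Sx1y + b2*Sx2y + b3*Sx3y) / Sy2
b1, b2, b3 = 0.7168, -0.186, -0.6232
s_x1y, s_x2y, s_x3y = 219.4618, 17.7500, -108.7445
s_y2 = 244.6496

r_kuadrat = (b1 * s_x1y + b2 * s_x2y + b3 * s_x3y) / s_y2
```

Hasilnya cocok dengan nilai R² = 0,9065 pada teks.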

Berdasarkan perbandingan kedua metode di atas, diperoleh bahwa least trimmed squares memiliki nilai R² yang lebih tinggi daripada estimasi M type Welsch; dengan kata lain, estimasi least trimmed squares lebih bagus daripada estimasi M type Welsch.



BAB 4

KESIMPULAN DAN SARAN

4.1 Kesimpulan

Dari hasil penelitian dan pembahasan, dapat diambil kesimpulan sebagai berikut:

1. Metode least trimmed squares (LTS) menggunakan konsep pengepasan metode kuadrat terkecil (OLS) untuk meminimumkan jumlah kuadrat sisaan, dari n residual menjadi h residual.

2. Metode Least Trimmed Square merupakan metode terbaik dalam penelitian ini karena mampu menghasilkan estimasi koefisien regresi yang lebih baik daripada Estimasi M Type Welsch. Hal ini terlihat dari penaksir LTS yang memiliki koefisien determinasi R² lebih tinggi, meskipun selisihnya kecil: Metode Least Trimmed Square memiliki nilai R² sebesar 0,945, sementara Estimasi M Type Welsch memiliki nilai R² sebesar 0,9065.


4.2 Saran

Berikut saran yang dapat diberikan dari hasil penelitian :

1. Dalam penulisan tugas akhir ini, peneliti menggunakan regresi robust dengan Metode Least Trimmed Square dan Estimasi M type Welsch dengan jumlah data yang sedikit. Bagi yang berminat dapat menelitinya dengan jumlah data yang lebih banyak.

2. Peneliti menggunakan pemodelan dengan regresi berganda, namun pemodelan lainnya juga bisa digunakan yaitu pemodelan dengan regresi linier sederhana.



ABSTRAK

Analisis regresi digunakan untuk mengetahui hubungan antara variabel bebas dan variabel terikat. Salah satu metode penaksir parameter dalam model analisis regresi yaitu metode kuadrat terkecil (OLS). Jika terdapat pencilan, metode OLS tidak lagi efisien, sehingga metode yang cocok untuk permasalahan pencilan yaitu metode regresi robust. Pencilan adalah data yang tidak mengikuti sebagian besar pola dan terletak jauh dari pusat data, dapat dideteksi dengan metode boxplot (interquartile range) dan menentukan nilai Leverage, DfFITS dan Cook's Distance. Least trimmed squares (LTS) yaitu metode penaksiran parameter regresi robust yang menggunakan konsep pemangkasan OLS untuk meminimumkan jumlah kuadrat residual. Penaksir M yaitu metode dalam mengatasi pencilan dan dapat menggunakan fungsi Welsch dalam mengestimasi parameter regresi. Tujuan penelitian ini yaitu membandingkan dua metode regresi robust, yakni penaksir LTS dan penaksir M Type Welsch, dalam mengatasi permasalahan data pencilan. Hasil penelitian yang diperoleh yaitu penaksir LTS merupakan metode paling baik karena mampu mengatasi pencilan, dan diperoleh bahwa least trimmed squares memiliki nilai R² yang lebih tinggi daripada penaksir M type Welsch; dengan kata lain, penaksir least trimmed squares lebih bagus daripada penaksir M type Welsch.

Kata kunci: pencilan, metode kuadrat terkecil, regresi robust, penaksir least trimmed squares, penaksir M type Welsch, interquartile range, boxplot



A Comparative Study of the Welsch-Type M Estimator and the Least Trimmed Squares Estimator in Robust Regression to Overcome Outlier Data

ABSTRACT

Regression analysis is used to determine the relationship between variables. One of the methods for estimating the parameters in a regression model is ordinary least squares (OLS). If there are outliers, OLS is no longer efficient, so a suitable method for the problem of outliers is robust regression. An outlier is a data point that is inconsistent with the overall pattern and lies far from the center of the data; it can be detected with the interquartile range and by determining the leverage value, DfFITS and Cook's Distance. Least trimmed squares (LTS) is an estimation method for robust regression that uses the fitting concept of OLS to minimize the trimmed sum of squared errors. The M estimator is a method for overcoming outliers and can use the Welsch function in estimating the regression parameters. The purpose of this study is to compare two robust regression methods, the LTS estimator and the Welsch-type M estimator, in overcoming the problem of outliers. The conclusions are that LTS is the best method because it can overcome the outliers, and that least trimmed squares has a higher R² than the Welsch-type M estimation; in other words, least trimmed squares performs better than the Welsch-type M estimation.

Keywords: outliers, ordinary least squares, robust regression, least trimmed squares estimator, M estimator Welsch type, interquartile range, boxplot.



DAFTAR ISI

Halaman

Persetujuan ii

Pernyataan iii

Penghargaan iv

Abstrak vi

Abstract vii

Daftar Isi viii

Daftar Tabel x

Daftar Gambar xi

Daftar Lampiran xiii

Bab 1 Pendahuluan 1

1.1 Latar Belakang 1

1.2 Rumusan Masalah 2

1.3 Tujuan Penelitian 3

1.4 Batasan Masalah 3

1.5 Kontribusi Penelitian 3

1.6 Tinjauan Pustaka 3

Bab 2 Landasan teori 7

2.1 Pengertian Regresi Linier 7

2.2 Pendeteksian Pencilan 8

2.3 Metode Kuadrat Terkecil 10

2.4 Regresi Robust dengan Metode Least Trimmed Square 11

2.5 Estimasi M dengan fungsi objektif, fungsi pengaruh dan fungsi pembobot Welsch 12

2.6 Koefisien Determinasi 13



Bab 3 Pembahasan 15

3.1 Data 15

3.2 Pendeteksian Outlier 16

3.3 Metode Least Trimmed Square 20

3.3.1 Interpretasi dari persamaan dengan Least Trimmed Square 26

3.4 Fungsi obyektif, fungsi pengaruh dan pembobot Welsch 26

3.4.1 Interpretasi dari persamaan dengan estimasi M type Welsch 40

Bab 4 Kesimpulan dan Saran 42

4.1 Kesimpulan 42

4.2 Saran 42



DAFTAR TABEL

Tabel 2.1 Type Regresi Robust dengan penduga 12

Tabel 3.1 Salinity Data 15

Tabel 3.2 Data Lagged Salinity 16

Tabel 3.3 Data Trend 17

Tabel 3.4 Data Discharge 17

Tabel 3.5 Data Salinity 18

Tabel 3.6 Tabel IQR 19

Tabel 3.7 Data pencilan 19

Tabel 3.8 Nilai Residual 20

Tabel 3.9 Kuadrat Residual 21

Tabel 3.10 Kuadrat Residual setelah diurutkan 22

Tabel 3.11 Data setelah diurutkan dari kuadrat terkecil 22

Tabel 3.12 Perkalian bnewpada iterasi 1 23

Tabel 3.13 Kuadrat Residual yang diurutkan dari yang terkecil 24

Tabel 3.14 Data berdasarkan Kuadrat Residual dari yang terkecil 25

Tabel 3.15 Estimasi parameter bnew 25

Tabel 3.16 NilaiResidual 27

Tabel 3.17 Pembobot awal 29

Tabel 3.18 Nilai Residual Iterasi 2 31

Tabel 3.19 Nilai pembobot Iterasi 2 33

Tabel 3.20 Nilai Residual iterasi 3 34

Tabel 3.21 Nilai pembobot iterasi 3 35

Tabel 3.22 Nilai Residual Iterasi 4 36

Tabel 3.23 Nilai pembobot iterasi 4 39

Tabel 3.24 Nilai β0,β1,β2,β3 dengan iterasi 39



DAFTAR GAMBAR

Halaman

Gambar 2.1 Skema Identifikasi Data Pencilan dengan IQR 9

Gambar 2.2 Kriteria Pengambilan Keputusan Adanya Pencilan 10