Distinction between deterministic chaos and stochastic noise
The model predicts the future state vector by using the nearest neighbor of the present state.
We now locate nearby M-dimensional points in the phase space and choose a minimal neighborhood with K closest neighbors such that the predictee (the point from which the prediction is made) is contained within the smallest simplex. To enclose a point in an M-dimensional space, we require a simplex with a minimum of M + 1 points. Then, to obtain a prediction, we project the domain of the chosen nearest neighbors T_p (the prediction step) steps forward and compute F_p to get the predicted value. Since it becomes increasingly difficult to define an enclosing simplex in higher-dimensional embedding spaces, we have extended the above idea to the nearest neighbors in a Euclidean sense. A minimum of M + 1 nearest neighbors are chosen based on the Euclidean distance between the neighbor and the predictee.
Then, we project the domain of the chosen neighbors T_p steps forward and estimate the predicted value. We have explored
several estimation kernels, including the arithmetic average, weighted average, and weighted regression, to estimate the predicted value. The arithmetic average was found to provide comparable prediction accuracy while requiring no tuning parameters; hence, in this study, we use the arithmetic average of the projected neighboring points to obtain the predicted value.
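The procedure described above (delay embedding, Euclidean nearest neighbors, projecting each neighbor T_p steps forward, and taking the arithmetic average) can be sketched in Python. This is a minimal illustration, not the authors' implementation; the function names, the logistic-map test signal, and the parameter values are assumptions chosen for the demo.

```python
import numpy as np

def delay_embed(x, M, tau=1):
    # build M-dimensional delay-coordinate vectors from a scalar series x
    n = len(x) - (M - 1) * tau
    return np.array([x[i : i + (M - 1) * tau + 1 : tau] for i in range(n)])

def phase_space_predict(x, M, K, Tp, n_test=50):
    # Predict the last n_test points of x, Tp steps ahead, by averaging the
    # Tp-step-forward projections of the K Euclidean nearest neighbors of
    # each predictee in the embedding (K >= M + 1 per the text).
    tau = 1
    X = delay_embed(x, M, tau)
    horizon = (M - 1) * tau + Tp      # scalar-index offset of the projection
    usable = len(x) - horizon         # embedded points whose projection exists
    preds, truth = [], []
    for i in range(usable - n_test, usable):
        d = np.linalg.norm(X[:usable] - X[i], axis=1)
        d[i] = np.inf                 # exclude the predictee itself
        nbrs = np.argsort(d)[:K]
        preds.append(x[nbrs + horizon].mean())   # arithmetic average
        truth.append(x[i + horizon])
    return np.array(preds), np.array(truth)

# demo on a chaotic logistic-map series (an assumed test signal)
x = np.empty(2000)
x[0] = 0.4
for t in range(1999):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

pred, obs = phase_space_predict(x, M=3, K=4, Tp=1)
print(np.corrcoef(pred, obs)[0, 1])  # prediction accuracy (correlation)
```

For a densely sampled low-dimensional attractor such as this one, the one-step correlation between predicted and observed values is close to 1.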
There are only two parameters to be chosen for this phase-space prediction model: embedding dimension M,
and the number of nearest neighbors K. In general, M_min ≥ 2D + 1, where D is the attractor dimension. An estimate of the attractor dimension may be obtained from the correlation dimension [4,5]. Prediction results are sensitive to the choice of M [10]. We will look at the prediction accuracy (the correlation between predicted and observed values) as a function of embedding dimension to choose an optimum value of M for our prediction algorithm. Since enclosing a point in an M-dimensional space requires a simplex with a minimum of M + 1 points, one has K_min ≥ M + 1.

Use of the phase space to develop a forecasting model
may appear to be similar to an autoregressive model: a prediction is estimated based on time-lagged vectors. However, the crucial difference is that understanding phase-space geometry frames forecasting as recognizing and then representing the underlying dynamical structures. For example, two neighboring points in a phase space may not be close to each other within the context of a time sequence. The traditional autoregressive (AR) model relies on time-lagged signals
that are neighbors in a temporal sense, whereas a neighbor in a phase space is close in a dynamic sense. In addition,
once the number of lags exceeds the minimum embedding dimension, the geometry of the underlying dynamics will not change. A global linear model, such as the AR model, must approximate these dynamics with a single hyperplane, with no fundamental insight into the underlying geometric structure. Unlike traditional AR models, the proposed methodology also
promises to make a tentative distinction between stochastic noise and low-dimensional chaos. A characteristic feature of
chaotic dynamics is that the prediction accuracy decays exponentially as the prediction time increases. On the other hand, for a noisy system the prediction accuracy does not decay sharply with prediction lead time [9,10].
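This diagnostic can be demonstrated with a small numerical experiment. The sketch below (assumptions: a nearest-neighbor predictor of the kind described above, a logistic-map series standing in for a chaotic signal, and Gaussian white noise standing in for a stochastic one) computes the prediction accuracy at several lead times. The skill for the chaotic series falls off as T_p grows, while the skill for white noise remains roughly flat, near zero at every lead.

```python
import numpy as np

def forecast_skill(x, M, K, Tp, n_test=200):
    # correlation between observed and predicted values at lead time Tp,
    # using an M-dimensional delay embedding (lag 1) and the arithmetic
    # average of the K nearest neighbors' Tp-step-forward projections
    X = np.array([x[i : i + M] for i in range(len(x) - M + 1)])
    horizon = (M - 1) + Tp
    usable = len(x) - horizon
    preds, truth = [], []
    for i in range(usable - n_test, usable):
        d = np.linalg.norm(X[:usable] - X[i], axis=1)
        d[i] = np.inf                 # exclude the predictee itself
        nbrs = np.argsort(d)[:K]
        preds.append(x[nbrs + horizon].mean())
        truth.append(x[i + horizon])
    return np.corrcoef(preds, truth)[0, 1]

rng = np.random.default_rng(1)
chaos = np.empty(2000)
chaos[0] = 0.4
for t in range(1999):
    chaos[t + 1] = 3.9 * chaos[t] * (1.0 - chaos[t])
noise = rng.standard_normal(2000)

for Tp in (1, 2, 4, 8):
    print(Tp, forecast_skill(chaos, M=3, K=4, Tp=Tp),
              forecast_skill(noise, M=3, K=4, Tp=Tp))
```

Because white noise carries no dynamical structure, its skill is low at every lead time rather than decaying from a high initial value, which is what separates it from low-dimensional chaos in this test.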