
Computing Correlation Dimension in EEG

Correlation dimension is a method for estimating the dimension of an attractor, and it has been applied to time series such as EEG with some success (and many caveats).

In the previous blogpost on the Lyapunov exponent we saw how the phase space and attractor can be reconstructed from a univariate time series by applying Takens' embedding theorem. Like the Lyapunov exponent (LE), the correlation dimension (D2) is a measure frequently used to characterize the attractor of a dynamical system. While LE quantifies the divergence of nearby trajectories, D2 is based on the correlation sum, which is simply the fraction of pairs of points on the attractor's trajectory whose distance is less than a certain threshold.

The concept of dimension

Intuitively, the dimension of an object can be thought of as an exponent that expresses how the object's volume (or area, or any measure) scales with its size. To make this concrete, consider a square. We know that a square is two-dimensional. What happens to its area if we double the length of its side? The area increases by a factor of 4 = 2^2. If we triple the side length, the area increases by a factor of 9 = 3^2. In general, if we multiply the side length by n, the area increases by a factor of n^2. In the case of a cube, the volume increases by n^3. The exponent (2 for the square, 3 for the cube) is equal to the dimension of the object. For standard shapes like lines, squares and cubes this exponent is an integer and agrees perfectly with the usual notion of dimension, also known as topological dimension. So 'dimension' here does not refer to the physical dimensions we normally think of, but rather to how something scales at different 'resolutions'. But what happens when the object is not a standard geometric shape? This is the idea of a 'fractal dimension', and this scaling exponent can even be a fraction! It is probably best explained in this very nicely done video.
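This scaling rule can be checked in a couple of lines: the dimension falls out as log(measure ratio) / log(scale factor). (The snippet below is just an illustration of the arithmetic; the function name is our own.)

```python
import math

def scaling_dimension(measure_ratio, scale_factor):
    """Dimension as the exponent relating measure growth to scale growth."""
    return math.log(measure_ratio) / math.log(scale_factor)

# Tripling the side of a square multiplies its area by 9 -> dimension ~2
print(scaling_dimension(9, 3))
# Doubling the side of a cube multiplies its volume by 8 -> dimension ~3
print(scaling_dimension(8, 2))
```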

Attractor dimension

So how can one estimate the dimension of an attractor (the set of trajectories that are frequently visited) that exists in some d-dimensional space? This is where the idea of scaling exponents introduced in the previous section becomes useful. As the video above shows, assume that we can cover the attractor with N(r) d-dimensional cubes of edge length r. If we reduce the edge length to r/2, we need more cubes. Make it r/4, r/8 and so on, and we need more and more cubes. It can be shown that N(r) scales as N(r) ~ r^(−D0), where D0 is known as the box-counting dimension and is just one way of measuring the attractor dimension. However, it is not an ideal choice for measuring the attractor dimension, as it gives no extra importance to cubes in which the trajectories of the attractor spend more time.
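The box-counting idea is easy to sketch numerically. Below, points sampled densely on a circle (a one-dimensional curve sitting in 2-D) are covered with grid boxes of shrinking size, and the slope of log N(r) against log(1/r) recovers a dimension close to 1. This is a minimal illustration under our own choices of box sizes, not a production estimator.

```python
import numpy as np

def box_count(points, r):
    """Number of grid boxes of edge length r occupied by the points."""
    boxes = np.unique(np.floor(points / r), axis=0)
    return len(boxes)

# A circle is a 1-D curve embedded in 2-D, so D0 should come out near 1
theta = np.linspace(0, 2 * np.pi, 100_000, endpoint=False)
pts = np.column_stack([np.cos(theta), np.sin(theta)])

rs = 2.0 ** -np.arange(2, 8)                 # shrinking box sizes
counts = [box_count(pts, r) for r in rs]
slope, _ = np.polyfit(np.log(1 / rs), np.log(counts), 1)
print(round(slope, 2))                       # close to 1
```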

[Figure: trajectory of an attractor in phase space, covered by a grid of boxes, with some regions visited more often than others.]

In the figure above, we see that the attractor visits certain parts of the phase space more often than others. The box-counting dimension only counts the cubes needed to cover the attractor, regardless of how often each is visited.

Correlation dimension D2, introduced by Grassberger and Procaccia [1], is one way of computing the dimension of an attractor while giving more weight to boxes in which the attractor spends more time. A straightforward way to do this is to cover the attractor with boxes of a given size r (or sometimes 'hyper' or n-dimensional spheres of radius r) and compute the probability pi(r), which is equivalent to the relative frequency of finding a point of the attractor in the ith box, where i = 1, 2, ..., N. For a large number of points (i.e., in the limit of N approaching infinity), this probability can be expressed through the correlation sum C(r) [2],

C(r) = (1/N^2) Σ_{i≠j} Θ(r − |Xi − Xj|)

where Xi and Xj are two points on the attractor and |Xi − Xj| is the distance (for example, Euclidean) between them. Θ() is the Heaviside step function, which takes the value 1 when this distance is less than the threshold r, and 0 otherwise. Thus, what we are doing above is simply counting all the pairs whose distance is less than the threshold and finally dividing by the number of possible comparisons (i.e., N^2).

For sufficiently large N and small r, the correlation sum scales as

C(r) ~ r^(D2)

Thus the correlation dimension D2 can be expressed as

D2 = lim_{r→0} log(C(r)) / log(r)

where D2 can be computed in practice as the slope of the linear scaling region in a plot of log(C(r)) versus log(r), as shown in panel C of the figure below.

Figure from [3]: A time series (A) and its reconstructed attractor following Takens' embedding (B). Log-log plot of r versus C(r) for increasing embedding dimension, top to bottom (C), and the slope of the linear scaling region in (C) plotted against log(r), which converges to a value of around 2 for small values of r (D).
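The whole procedure, correlation sum followed by a slope fit over the scaling region, can be sketched in a few lines. This is a bare-bones illustration on points sampled from a known one-dimensional set (a circle), not a production estimator; the function name and the choice of scaling region are our own.

```python
import numpy as np

def correlation_sum(points, r):
    """Fraction of distinct pairs of points closer than r, i.e. C(r)."""
    diffs = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(len(points), k=1)   # each pair counted once
    return np.mean(dist[iu] < r)

# Points on a circle: the "attractor" is a 1-D curve, so D2 should be near 1
theta = np.linspace(0, 2 * np.pi, 1500, endpoint=False)
pts = np.column_stack([np.cos(theta), np.sin(theta)])

rs = np.logspace(np.log10(0.02), np.log10(0.2), 10)  # assumed scaling region
C = np.array([correlation_sum(pts, r) for r in rs])
D2, _ = np.polyfit(np.log(rs), np.log(C), 1)         # slope = D2 estimate
print(round(D2, 2))                                  # close to 1
```

For a real EEG segment one would first reconstruct the attractor by delay embedding, compute C(r) for several embedding dimensions, and look for convergence of the slope, as in panels C and D of the figure above.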

Application in EEG

One of the earliest applications of D2 to EEG was by Babloyantz et al., who showed that D2 decreases from sleep stage 2 to slow wave sleep [4]. Following this, many studies have applied D2 to sleep EEG and shown that D2 decreases from wake through sleep stages I-III and increases again during REM (see [5] and references therein). Furthermore, studies of neonatal EEG suggested that D2 during active sleep tends to be higher than during quiet sleep [6]. Another popular area of application for D2 has been the prediction of epileptic seizures. Lehnertz and Elger used a modified version of D2 known as effective D2, whose lower values are associated with 'lower complexity' of the system, and showed that effective D2 started to decrease several minutes before a seizure [7]. However, another study using ECoG data found that neither D2 nor the correlation sum could reliably predict seizures, and showed that the correlation sum is related to variations in time-frequency energy rather than any nonlinear dynamics [8]. Besides applications in sleep EEG and epilepsy, D2 was shown to increase with eye-opening in non-demented subjects, a responsivity that was altered in subjects with Alzheimer's disease [9].



There are many pitfalls associated with the computation of D2.

  1. D2 can be biased by autocorrelation in the time series. This can be reduced to some extent by discarding pairs of points in the trajectory whose time indices differ by less than the autocorrelation time [3], and by choosing the embedding delay (used to reconstruct the attractor) such that it is at least 3 times the autocorrelation time (the lag at which the autocorrelation function of the time series first falls to 1/e).
  2. Insufficient data length can also bias the estimate of D2; the upper bound for a reliably estimable D2 is 2 log10(N), where N is the data length. When the estimate approaches this bound, claims of low-dimensional dynamics are unreliable [10].
  3. The presence of noise can lead to overestimation of D2.
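The autocorrelation time mentioned in the first pitfall is straightforward to estimate. Below is a minimal sketch using the 1/e criterion described above; the function name and implementation details are our own.

```python
import numpy as np

def autocorr_time(x):
    """Lag at which the autocorrelation function first falls below 1/e."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf = acf / acf[0]                     # normalize so acf[0] == 1
    below = np.where(acf < 1.0 / np.e)[0]
    return int(below[0]) if below.size else len(x)

# A sine with a 100-sample period decorrelates after roughly 0.19 periods
x = np.sin(2 * np.pi * np.arange(5000) / 100)
tau = autocorr_time(x)
print(tau)   # around 19-20 samples
```

The resulting lag can then be used both as the Theiler-style exclusion window for pairs of points and as a guide for choosing the embedding delay.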

Thus, reliable estimation of D2 requires sufficiently long (and stationary) data with a high signal-to-noise ratio, conditions seldom satisfied by EEG signals. Application of D2 to EEG signals must therefore be done with caution and thoroughly tested with surrogate methods [11].
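Surrogate testing typically compares D2 of the recorded signal against D2 values from phase-randomized surrogates that share its power spectrum; if the original value falls within the surrogate distribution, there is no evidence of nonlinear structure. A minimal sketch of generating one such surrogate (the function name is ours; the method is standard phase randomization as in [11]):

```python
import numpy as np

def phase_randomized_surrogate(x, seed=0):
    """Surrogate with the same power spectrum but randomized Fourier phases."""
    rng = np.random.default_rng(seed)
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    Xs = np.abs(X) * np.exp(1j * phases)
    Xs[0] = X[0]                          # keep the mean (DC) unchanged
    if len(x) % 2 == 0:
        Xs[-1] = X[-1]                    # keep the Nyquist component unchanged
    return np.fft.irfft(Xs, n=len(x))

x = np.random.default_rng(1).standard_normal(512)
s = phase_randomized_surrogate(x)
# The amplitude spectrum (hence the autocorrelation) is preserved exactly
print(np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x))))  # True
```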



  1. Peter Grassberger and Itamar Procaccia (1983). “Characterization of Strange Attractors”. Physical Review Letters. 50 (5): 346–349.
  2. Boon, Mei Ying, et al. “The correlation dimension: A useful objective measure of the transient visual evoked potential?.” Journal of vision 8.1 (2008): 6-6.
  3. Stam, Cornelis J. “Nonlinear dynamical analysis of EEG and MEG: review of an emerging field.” Clinical Neurophysiology 116.10 (2005): 2266-2301.
  4. Babloyantz A. Evidence for slow brain waves: a dynamical approach. Electroenceph Clin Neurophysiol 1991;78:402–5.
  5. Ma, Yan, et al. “Nonlinear dynamical analysis of sleep electroencephalography using fractal and entropy approaches.” Sleep medicine reviews 37 (2018): 85-93.
  6. Röschke, J. “Strange attractors, chaotic behavior and informational aspects of sleep EEG data.” Neuropsychobiology 25.3 (1992): 172-176.
  7. Lehnertz, Klaus, and Christian E. Elger. “Can epileptic seizures be predicted? Evidence from nonlinear time series analysis of brain electrical activity.” Physical review letters 80.22 (1998): 5019.
  8. Harrison, Mary Ann F., et al. “Correlation dimension and integral do not predict epileptic seizures.” Chaos: An Interdisciplinary Journal of Nonlinear Science 15.3 (2005): 033106.
  9. Pritchard WS, Duke DW, Coburn KL. Altered EEG dynamical responsivity associated with normal aging and probable Alzheimer’s disease. Dementia 1991;2:102–5.
  10. Eckmann JP, Ruelle D. Fundamental limitations for estimating dimensions and Lyapunov exponents in dynamical systems. Physica D 1992;56:185–7.
  11. Schreiber, Thomas, and Andreas Schmitz. “Surrogate time series.” Physica D: Nonlinear Phenomena 142.3-4 (2000): 346-382.
