Here we provide background on the Lyapunov exponent as a measure of nonlinearity and its application to EEG.
Chaos in dynamical systems
To understand the point of the Lyapunov exponent, we first provide a brief background on dynamical systems (skip ahead if this is old news). Any system that evolves with time is known as a dynamical system. Such systems can be described using differential equations, which specify the rule or function according to which the system evolves. If the equation is nonlinear, the system is known as a nonlinear dynamical system.
Nonlinear dynamical systems can exhibit an interesting phenomenon known as chaos. To understand this, let’s first consider a simple pendulum. The motion of the pendulum can be described in what is known as the phase space, where two variables, the angular velocity and the angular position, completely describe the system. A single pendulum simply swings back and forth along a trajectory that is predictable and boring. Let’s see what happens if we add another pendulum. Take a look at the image below.
( source : https://tenor.com/view/pendulum-gif-6854340)
Now the trajectory of each looks seemingly random and unpredictable. But it is not random: the motion is still governed by deterministic laws, and no source of randomness enters the system anywhere. This is an example of dynamics that seem random but are in fact deterministic. More interesting, however, is the difference between the two examples above. The two double pendulums start from slightly different positions, or initial conditions: in the case on the right, the initial angle of the second pendulum is slightly different. (These angles are indicated at the top of the image.)
As you can see, this small change in the initial condition has caused a huge change in how the trajectory evolves in each example, even though the motion of both double pendulums is governed by exactly the same laws! Deterministic systems in which a slight perturbation of the initial condition causes the trajectories to diverge significantly are known as chaotic systems.
Now one may be interested in quantifying the separation or divergence of trajectories in chaotic systems in a more principled way. Given two trajectories that start with an initial separation of δx0, the rate at which they diverge can be approximated by

|δx(t)| ≈ e^(λt) |δx0|
where λ is referred to as the Lyapunov exponent. If the Lyapunov exponent is positive, the separation grows exponentially as time increases, and if it is negative, the separation tends to zero. If the vector x has n dimensions, then there will be n Lyapunov exponents. In the case of the double pendulum, the phase space is 4-dimensional (angular velocity and angle for each pendulum), and hence such a system will have four Lyapunov exponents! And if the largest Lyapunov exponent is positive, the system is chaotic.
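As a quick numerical illustration (using the logistic map, a standard one-dimensional chaotic system, rather than the pendulum), the Lyapunov exponent of a map can be estimated as the average log of the derivative along the orbit. A minimal sketch, with the function name being ours; at r = 4 the exact value is known to be ln 2 ≈ 0.693:

```python
import math

def logistic_lyapunov(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the orbit average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(burn):              # discard transients
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n
```

A positive result here signals exponential divergence of nearby orbits, exactly the behavior illustrated by the double pendulum above.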
Application in EEG
The application of Lyapunov exponents to EEG data was very popular in the early 90s, particularly in the analysis of epilepsy and sleep. It was shown that the Lyapunov exponent decreased from sleep stage I to stage III, indicating that the Lyapunov exponent decreases as sleep deepens, and this was taken as evidence for low-dimensional chaos underlying the sleep EEG (see the review by Stam and references therein). However, with the introduction of surrogate data analysis, it was shown that these Lyapunov exponents cannot distinguish between true (chaotic) nonlinearity and noise.
The basic idea behind surrogate analysis in nonlinear signal analysis is to manipulate the signal at hand so that it resembles the original signal in all respects except the presence of nonlinearity. One way to do this is to compute the Fourier transform of the signal, randomize its phases while preserving its amplitude spectrum, and transform the signal back to the time domain. Since the early 2000s, the focus has shifted towards the more reasonable goal of establishing the presence of nonlinear structure in EEG signals rather than establishing the presence of chaotic dynamics.
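The phase-randomization step can be sketched with NumPy (a minimal illustration; the function name is ours):

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """Fourier-transform surrogate: keep the amplitude spectrum (and hence
    the linear autocorrelation), but destroy any nonlinear structure by
    randomizing the phases."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=X.shape)
    phases[0] = 0.0                    # keep the DC component real
    if len(x) % 2 == 0:
        phases[-1] = 0.0               # keep the Nyquist component real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))
```

A nonlinear statistic (such as a Lyapunov exponent estimate) computed on the original signal can then be compared against its distribution over an ensemble of such surrogates.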
Establishing the presence of nonlinear structure is particularly useful in the analysis of epileptic data. Several studies have shown Lyapunov exponents to be useful in localizing epileptic foci from interictal recordings as well as in predicting seizures. For example, Iasemidis et al. showed that the Lyapunov exponent decreases at the beginning of a seizure and increases in the post-seizure period. However, its applicability to finite-length, non-stationary and noisy data has been questioned, and its predictive power has been shown to be reduced in signals like ECoG due to non-stationarity, i.e., time-dependent evolutionary rules.
Thus the application of Lyapunov exponents to EEG data, while providing a potential measure of different states or behaviors, faces challenges with reliability. This is a general problem with several nonlinear measures, including the correlation dimension and the nonlinear prediction error.
Computing the Lyapunov exponent
In the case of the pendulum, we have direct access to the phase space, which has two variables – position and velocity. In the case of brain recordings, however, we do not have direct access to the phase space, but rather measure some property of it, like the EEG. So one has to first reconstruct the phase space from the time-domain data. This can be done using Takens's embedding theorem (for a proof, see: https://en.wikipedia.org/wiki/Takens%27s_theorem). In practice, this is achieved by taking m consecutive samples at a certain delay d for every time sample, where m is our best estimate of the underlying dimension of the dynamical system (in practice, obtained using the false nearest neighbor method; see Hegger and Kantz). The time delay d can be chosen as the lag at which the autocorrelation of the signal first reaches a minimum. This is illustrated below.
Source : Ref 
In the above figure, we set m=3 and select three consecutive points from the time series at lag d. This represents one point in the 3-d space. When this is repeated for every time point, we end up with the trajectory shown in the bottom right.
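The two ingredients of the reconstruction – choosing the delay from the autocorrelation and building the delay vectors – can be sketched as follows (a minimal illustration; function names are ours):

```python
import numpy as np

def choose_delay(x, max_lag=100):
    """Pick d as the lag of the first local minimum of the autocorrelation."""
    x = x - x.mean()
    ac = np.correlate(x, x, mode='full')[len(x) - 1:]
    ac = ac / ac[0]
    for lag in range(1, min(max_lag, len(ac) - 1)):
        if ac[lag] < ac[lag + 1]:      # autocorrelation starts rising again
            return lag
    return max_lag

def delay_embed(x, m, d):
    """Each row is one reconstructed phase-space point:
    [x(t), x(t+d), ..., x(t+(m-1)d)]."""
    n = len(x) - (m - 1) * d
    return np.column_stack([x[i * d : i * d + n] for i in range(m)])
```

For example, a sine wave with a period of 20 samples yields a delay near 10 (half a period), and embedding a series of length N with parameters m and d yields N − (m−1)d phase-space points.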
Thus for an EEG signal of length N from one channel, there will be N − (m−1)d such delay vectors, i.e. points of the form [x(t), x(t+d), …, x(t+(m−1)d)]. The Wolf algorithm is commonly used to compute the Lyapunov exponent, and it mainly depends on three parameters – the embedding dimension m, the optimal time delay d and a threshold e. The first step is to map the data, which is usually in the time domain, to the phase space using time-delay embedding. To compute the Lyapunov exponent, a point on the trajectory is selected. Then its nearest neighbor is found and the distance between them is computed (for example, using the Euclidean metric); let this be L0. As time progresses, the distance between the two trajectories is tracked until it exceeds the threshold e. Once the threshold is exceeded, the distance is stored as L′0, a new nearest neighbor is found, and the process continues. This process is shown below:
Source : https://www.cs.colorado.edu/~lizb/chaos/wolf-notes.pdf
The Lyapunov exponent is then given as the average of the log of these distance ratios:

λ = (1 / (t_M − t_0)) Σ_{k=1..M} log( L′(t_k) / L(t_{k−1}) )
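A simplified sketch of this procedure (our own condensed version in the spirit of Wolf et al., not a faithful reimplementation – the original algorithm also constrains the direction of the replacement neighbor):

```python
import numpy as np

def lyap_wolf(x, m=3, d=2, eps=0.5, theiler=10):
    """Wolf-style estimate of the largest Lyapunov exponent of a scalar series."""
    # time-delay embedding of the scalar series
    n = len(x) - (m - 1) * d
    Y = np.column_stack([x[i * d : i * d + n] for i in range(m)])

    log_sum, steps, i = 0.0, 0, 0
    while i < n - 1:
        # nearest neighbour of the current (fiducial) point, excluding
        # temporally close points (Theiler window)
        dists = np.linalg.norm(Y - Y[i], axis=1)
        dists[max(0, i - theiler) : i + theiler + 1] = np.inf
        j = int(np.argmin(dists))
        L0 = dists[j]
        if not np.isfinite(L0) or L0 == 0.0:
            break
        # follow both trajectories until their separation exceeds eps
        k = 1
        while i + k < n and j + k < n:
            if np.linalg.norm(Y[i + k] - Y[j + k]) > eps:
                break
            k += 1
        if i + k >= n or j + k >= n:
            break
        L1 = np.linalg.norm(Y[i + k] - Y[j + k])
        log_sum += np.log(L1 / L0)   # one distance-ratio term
        steps += k                   # time elapsed for this term
        i += k                       # restart from the evolved fiducial point
    return log_sum / steps if steps else float('nan')
```

On simulated chaotic data such as the logistic map this recovers a positive exponent, but on real EEG the estimate is sensitive to the choice of m, d and eps.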
The outcome thus depends significantly on how the three parameters – embedding dimension m, delay d and threshold e – are selected. Too low a value for e amplifies the impact of noise, while too high a value leaves only a small region of exponential growth. Similarly, too low an embedding dimension m can inadequately reconstruct the phase space, while a higher embedding dimension requires a larger amount of data, which also increases exposure to noise and non-stationarity.
The bottom line for EEG
For a signal like the EEG, a time series that arises as a composite of many factors and for which the true dimensionality (and the governing equations) is unknown, selection of the embedding dimension involves a degree of guesswork. Furthermore, noise in the EEG due to instrumentation and artifacts can substantially distort the Lyapunov exponent, to the point that there may be no exponential divergence at the high values of e required to mitigate the noise. Instrumentation and electrode characteristics with different noise profiles can also distort the outcome differently. Consequently, it is not easy to interpret Lyapunov exponents computed from the EEG in terms of nonlinear dynamics and chaos. On the other hand, they can serve as a metric that indicates differences between EEG states, where exploring the behavior across the parameter space using consistent instrumentation could be informative. We also note that there are other ways to quantify the geometry of trajectories in the phase space, such as those based on recurrence, which may perform better for noisy, finite-length biological signals such as the EEG, and which we will discuss in subsequent posts.
- Stam, Cornelis J. “Nonlinear dynamical analysis of EEG and MEG: review of an emerging field.” Clinical Neurophysiology 116.10 (2005): 2266-2301.
- Iasemidis, Leonidas D., et al. “Phase space topography and the Lyapunov exponent of electrocorticograms in partial seizures.” Brain Topography 2 (1990): 187-201.
- Lai, Ying-Cheng, et al. “Inability of Lyapunov exponents to predict epileptic seizures.” Physical Review Letters 91.6 (2003): 068102.
- Subramaniyam, Narayan Puthanmadam, and Jari Hyttinen. “Characterization of dynamical systems under noise using recurrence networks: Application to simulated and EEG data.” Physics Letters A 378.46 (2014): 3464-3474.
- Hegger, Rainer, and Holger Kantz. “Improved false nearest neighbor method to detect determinism in time series.” Physical Review E 60 (1999): 4970-4973.
- Wolf, Alan, et al. “Determining Lyapunov exponents from a time series.” Physica D: Nonlinear Phenomena 16.3 (1985): 285-317.