Measuring brain activity with EEG alongside body movement in real-world environments can tell us far more than controlled lab experiments can. Welcome to MoBI.
Walking, running, dancing, skipping, cycling, jumping. They are all actions. Actions which involve not only your brain, but also your body.
But what if I asked you how you could accurately use EEG to measure the way the brain performs these actions in the real world? A world where you don’t just walk, you walk along a busy street filled with various obstacles that you need to avoid. Where you don’t just dance, you dance to music on a crowded dance floor. A world where your brain is not only controlling your actions, but doing so in the context of an ever-changing 3-dimensional, multi-sensory environment.
Researchers are starting to tackle this complex challenge.
The term mobile brain/body imaging, or MoBI for short, was first coined by Scott Makeig and colleagues from the Swartz Center for Computational Neuroscience at the University of California San Diego in a study published in 2009. The ultimate aim of MoBI is not only to measure high density mobile EEG using dry, lightweight electrode sensors, but to integrate this EEG with a whole host of other behavioral, physiological and situational metrics.
The system aligns real-time EEG signals with concurrent recordings of eye movements, body movements and autonomic responses. Simultaneous video and audio recordings then provide a contextual measure of the on-going changes in the external environment.
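Aligning streams that arrive with their own clocks and sampling rates is the first practical hurdle in this kind of fusion. As a minimal sketch of the idea (the function name, data layout and target rate below are illustrative assumptions, not the MoBI toolbox API), each stream can be resampled onto one shared timebase by interpolation:

```python
import numpy as np

def align_to_common_timebase(streams, fs_target=100.0):
    """Resample independently timestamped streams onto one shared clock.

    `streams` maps a name to a (timestamps, values) pair — e.g. EEG,
    eye movements, heart rate. This layout is illustrative only.
    """
    # Use the time window covered by every stream
    t_start = max(ts[0] for ts, _ in streams.values())
    t_end = min(ts[-1] for ts, _ in streams.values())
    t_common = np.arange(t_start, t_end, 1.0 / fs_target)
    # Linear interpolation of each stream onto the shared timeline
    return t_common, {
        name: np.interp(t_common, ts, vals)
        for name, (ts, vals) in streams.items()
    }
```

After alignment, every modality shares one sample index, so an eye movement or heart-rate change can be read off against the EEG at the same instant.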
Beyond the initial aim of understanding the control of motor responses, systems like MoBI also provide the necessary metrics to support the accurate interpretation of real-time EEG across a broader range of applications.
From Ladouce S, Donaldson DI, Dudchenko PA and Ietswaart M (2017) Understanding Minds in Real-World Environments: Toward a Mobile Cognition Approach. Front. Hum. Neurosci. 10:694.
MoBI: the technical challenges.
Taking EEG out of the lab is notoriously difficult, especially when you are asking your participants to freely move about (see here for a previous discussion on the topic). First, there is the practical challenge of ensuring the MoBI equipment is comfortable to wear. Not an easy task if you imagine all the various wires and sensors required for high density EEG, as well as for monitoring the head, trunk and limbs of the wearer’s body.
Then you have to remove the unwanted artifacts that appear in the EEG data when the participant moves around. These motion artifacts become mixed into the EEG recordings, compromising the quality of the data. Techniques such as Independent Component Analysis (ICA) can be used to isolate the different brain and non-brain “components” within the EEG signal, allowing you to focus the analysis only on the components which originate from relevant neural sources (see here for more information on how ICA works). In this way you can potentially remove the artifacts whilst retaining the integrity of the EEG data. The complexity of doing this effectively is one of the reasons why researchers have, in the past, been so reluctant to explore the control of body action using EEG.
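The core idea can be demonstrated on toy data. The sketch below (a simulated example, not the EEGLAB/MoBI pipeline; the artifact-picking heuristic is purely illustrative) mixes two "neural" sources with a spiky motion-like artifact, unmixes them with FastICA, and reconstructs the channels with the artifact component zeroed out:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.c_[
    np.sin(2 * np.pi * 10 * t),          # 10 Hz alpha-like rhythm
    np.sign(np.sin(2 * np.pi * 3 * t)),  # slower square-ish source
    rng.laplace(size=t.size),            # heavy-tailed "motion artifact"
]
mixing = np.array([[1.0, 0.5, 0.3],      # how each source projects
                   [0.4, 1.0, 0.6],      # onto each of three channels
                   [0.3, 0.4, 1.0]])
channels = sources @ mixing.T            # what the "electrodes" record

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(channels)  # estimated independent components

# "Removing" an artifact = zeroing its component, then projecting back.
# Picking the spikiest component is a crude illustrative heuristic;
# real pipelines inspect scalp topographies, spectra and so on.
artifact_idx = int(np.argmax(np.abs(components).max(axis=0)))
cleaned = components.copy()
cleaned[:, artifact_idx] = 0.0
cleaned_channels = ica.inverse_transform(cleaned)
```

The recovered components match the original sources up to scaling and ordering, which is exactly the ambiguity ICA-based cleaning has to live with in practice.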
Finally, integrating all the different data streams requires some clever analysis tools. Since most EEG analysis software isn’t designed to handle this kind of multidimensional data, the researchers at the University of California San Diego have also developed their own MoBI analysis toolbox.
A promising start.
What started out as a call to action is slowly becoming reality. MoBI equipment is now more readily available through partnerships with companies such as Brain Products, and in 2016 the International Conference on Mobile Brain-Body Imaging and the Neuroscience of Art, Innovation and Creativity brought together 100 engineers, scientists and artists to foster collaboration in the field.
Feasibility studies using the P300 Oddball task, such as this one from Joseph Gwin and co-authors at the University of Michigan, have shown that high density EEG data can be effectively collected while people are walking on a treadmill. What’s more, the researchers were able to show that walking evoked synchronous fluctuations in spectral power (across alpha, beta and gamma frequencies) originating from the anterior cingulate, posterior parietal cortex and sensorimotor cortex, potentially reflecting visual-motor integration and error monitoring.
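Band-limited spectral power of the kind reported here is typically estimated from the power spectral density. As a simplified sketch (the function name and interface are illustrative assumptions, not the authors' actual pipeline), Welch's method gives the mean power within a frequency band:

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, band):
    """Mean spectral power of signal `x` within `band` = (low, high) Hz,
    estimated with Welch's method. Illustrative sketch only."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 1024))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()
```

For example, a pure 10 Hz oscillation sampled at 250 Hz shows far more power in the alpha band (8–13 Hz) than in a gamma band (30–45 Hz), which is the kind of contrast such analyses quantify across walking conditions.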
Another study by Evelyn Jungnickel and Klaus Gramann from the Berlin Mobile Brain-Body Imaging Lab (BeMoBIL) at the Berlin Institute of Technology used a dynamic version of the Oddball task where participants had to move their eyes and head to track, and then point to, moving targets appearing on the screen. Again they showed that the MoBI system, and the associated analysis of independent brain and non-brain components, was able to cope with this rapid, and artifact inducing, pattern of motor responses.
Related work, such as the Geo-EEG project discussed in an earlier post, explores how the human mind interacts with urban environments.
Although these studies are still some way off from measuring high density EEG and body movements in real-life situations such as walking down a busy street, or pointing to an aeroplane in the sky, they herald a new era for research into “embodied cognition” (see here for a review on the topic).
In the future, the ability to combine mobile EEG (and, even better, high density mobile EEG) with other body and situational metrics in one complete system will open up new opportunities for measuring real-time embodied cognition in action.