Over half a million fMRI and EEG papers have been published to date. What drives the trajectory? What have we learned? Who would actually know?
Of all the techniques for observing the living human brain, fMRI and EEG are the most popular, providing a safe, non-invasive functional view with no radioactive tracers or harmful radiation. This post is about the quantitative trajectory of research using these techniques, based on an analysis of PubMed data over the last 80 years. (The two techniques are fundamentally different: one measures electrical activity through sensors placed on the scalp, while the other measures blood oxygenation. You can read more about the techniques and their differences here, or do a deep dive into the EEG signal here.)
So what has the research community produced over the decades? Here’s what we found out:
Research by the Numbers
EEG was first measured by Hans Berger in 1924 (read about its interesting history here), though he did not publish his results for many years, and the technique did not gain traction in the research community until after his death in 1941. After this slow start, EEG began to gain popularity in the 1960s. By 1980, however, it began to lose its luster, perhaps as its promise began to fade, or as advances in magnetic resonance imaging began to take center stage.
The BOLD (blood-oxygen-level-dependent) signal behind fMRI was first discovered and used in the late 1980s, with the first paper published in 1990 by a group led by Seiji Ogawa. It immediately took human research by storm, and the number of papers rose astronomically, quickly outpacing EEG.
Beginning in the year 2000, however, EEG began to regain its popularity as advances in computing allowed more interesting and sophisticated approaches to signal analysis. When the number of papers is normalized to the year-2000 count, you can see that the growth of the two techniques is now matched.
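The normalization described above is simple to state precisely: divide each year's paper count by that series' count in the year 2000, so both curves start at 1.0 and can be compared on growth alone. A minimal sketch, using made-up placeholder counts rather than real PubMed tallies:

```python
# Normalize yearly paper counts to the year-2000 value so two series
# with very different absolute volumes share one growth curve.
# NOTE: these counts are illustrative placeholders, not real PubMed data.
counts = {
    "fMRI": {2000: 1000, 2005: 1800, 2010: 2600},
    "EEG":  {2000: 400,  2005: 720,  2010: 1040},
}

def normalize_to_base_year(series, base_year=2000):
    """Divide every yearly count by the base-year count."""
    base = series[base_year]
    return {year: n / base for year, n in series.items()}

for name, series in counts.items():
    print(name, normalize_to_base_year(series))
```

With these placeholder numbers both series normalize to the same curve (1.0, 1.8, 2.6), which is what "matched growth" looks like even though the absolute volumes differ by a large factor.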
To date, a total of 581,000 EEG and fMRI papers have been published (some plain MRI papers may have sneaked into the count; they were hard to separate), with the volume growing each year over the past decades. In 2015 alone, 29,295 fMRI papers and 5,979 EEG papers were published.
Economics of a Paper
In contrast to EEG, however, fMRI is an extraordinarily expensive technique. Compared to the $100,000 cost of a high-end EEG system today, an fMRI system starts at $300,000 and can go all the way up to $3 million. Part of the reason fMRI gained such volume relative to EEG, perhaps, is that when universities invest in an fMRI machine, they must keep the device busy enough to justify the cost, and therefore need a critical mass of researchers dedicated to the technique. That likely leaves little budget to hire EEG researchers (someone is looking at ROI, yes? Even if the scientists aren't).
Of course, another reason for the dominance of fMRI over EEG was likely the relative difficulty of turning the EEG signal into easily visualized results. fMRI could present brain activity as 'this part lights up' pictures that were easier for the public to understand and share. This no doubt made for easier presentations to the non-scientific committees managing the purse strings.
That said, despite almost 5X more fMRI papers, if you look simply at quantitative output per unit of device cost (assuming a mid-range value of $50,000 for EEG and $1 million for fMRI), EEG has consistently done about 4X better.
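The 4X figure can be checked with back-of-envelope arithmetic using the 2015 counts quoted earlier and the assumed mid-range system costs; a quick sketch:

```python
# Papers per dollar of device cost, using the 2015 paper counts from the
# post and the assumed mid-range system prices (EEG $50,000; fMRI $1M).
papers_2015 = {"fMRI": 29_295, "EEG": 5_979}
cost_usd = {"fMRI": 1_000_000, "EEG": 50_000}

papers_per_dollar = {k: papers_2015[k] / cost_usd[k] for k in papers_2015}
ratio = papers_per_dollar["EEG"] / papers_per_dollar["fMRI"]

print(papers_per_dollar)  # fMRI: ~0.029, EEG: ~0.120 papers per dollar
print(round(ratio, 1))    # ~4.1, i.e. roughly the 4X figure above
```

This is only a single-year snapshot against list prices, of course; it ignores operating costs, scanner lifetime, and shared use, but the rough ratio holds.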
What do they say?
Of course, science is not simply a numbers game. What matters is what these papers say. With over half a million papers out there and over 30k published just in the last year, one thing is abundantly clear: no one can actually know the literature. It is simply too vast. If I am at all representative, I skim perhaps 200-300 papers in a year (about 0.7% of annual output) and thoroughly read about 30-40 (0.1% of annual output). So what has really been discovered about the human brain so far? Who’s keeping track?
5 thoughts on “EEG and FMRI Papers by the Numbers”
The flood of unreplicated EEG studies is maddening to us neuroscientists and muddies the waters of psychiatric research. So why mention them?
That's a pretty shoot-from-the-hip comment. If you are a scientist, back it up. Otherwise, it's best not to judge a field of science you are unfamiliar with. Each methodology has its advantages and disadvantages.
I will take one example. You should have a look at the work of Robert McCarley's group at Harvard. He identified a phenomenon in schizophrenia using ERPs (a form of EEG), followed it up with source reconstruction, which then led him to MRI work and an excellent elucidation of dysfunction in the temporal lobe. Most of this work is published in Science and has been replicated in multiple labs. That's why it should be mentioned.
Certainly there are good studies and bad studies out there. The point, though, is that with so many papers published, how is one to know which are good, bad, or worth trying to replicate, when anyone can read only a fraction of a percent of them?
That is what the peer-review process is for. Top journals in any field employ experts so that results are curated. Yes, there are studies being published in little-known journals, particularly in the field of neuromarketing, but even in that field there are solid scientists who welcome replication and criticism and publish in mainstream journals.
There is also a role for systematic reviews and meta-analyses in coping with the quantity of published research. fMRI and EEG studies are often time-consuming for both experimenters and participants, who can be tested only one at a time, so there is a tendency for them to be statistically underpowered. Combining data across studies is a good way to overcome these limitations. Even underpowered studies can have a positive influence in developing techniques and highlighting research questions. Methodology has undergone huge advances; the resurgence of EEG research may be due in part to mathematical tools originally developed for fMRI. The increasing requirement to provide open access to data will also reduce wasted effort.