
Brain-to-computer interfaces (BCIs) are an emerging technology that promises to revolutionize the way we interact with our digital environment by establishing direct communication between neural activity in the brain and computers, without the need for any peripheral input device like a keyboard or a mouse. While this might sound like science fiction to many people, the technology and the concept behind it have been around for almost a century.
While it would take until 1973 for the term “BCI” to be coined by computer scientist Jacques Vidal, the concept itself has been around since at least the invention of electroencephalography (EEG), first described by German psychiatrist Hans Berger in 1929. For most of the past century, however, BCI technology remained confined to the medical and academic realms, due to the massive cost of research-grade systems and the need for highly skilled medical expertise to interpret neural data. The IT revolution has changed this equation. Mass production of sensors and electronic components for the global IT market has, as of 2021, almost brought the cost of research-grade BCI systems within the means of regular consumers, while the emergence of machine-learning algorithms allows neural data to be interpreted automatically, without the need for a human expert.
It is this current wave of “democratization” of BCI technology that Cortex Machina is harnessing to finally get BCIs out of the lab and into the daily lives of regular consumers. As the title of this article suggests, Cortex Machina has opted specifically for EEG-based BCI technology to achieve this. To explain and justify this choice, we will delve into the nitty-gritty of BCI systems and give the reader a sense of the types of systems that exist, their advantages and disadvantages, and their main applications.
Characteristics of BCI systems
Brain-to-computer communication can be achieved through various means, but broadly speaking, every BCI system consists of at least three components:
- A device to record neural activity data
- A device to process and interpret the gathered data and generate output commands
- A (computer) system operating via the generated output commands
For each of these components, multiple methods and technologies exist, each with its own advantages and disadvantages. A general overview of these is given in the sections below.
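Before going through each component in turn, it may help to see how the three of them fit together in software. The following is a minimal, hypothetical sketch in Python; all class and function names are illustrative assumptions, not part of any actual BCI library or of Cortex Machina’s own stack.

```python
# Minimal, hypothetical sketch of the three-component BCI architecture.
# All names are illustrative; real systems (e.g. built on a vendor SDK)
# differ considerably in the details.

from dataclasses import dataclass

import numpy as np


@dataclass
class NeuralSample:
    """One multi-channel snapshot of recorded neural activity."""
    timestamp: float      # seconds since recording start
    channels: np.ndarray  # shape: (n_channels,), amplitudes in microvolts


class Recorder:
    """Component 1: records neural activity data (stub)."""
    def read(self) -> NeuralSample:
        # A real recorder would pull samples from EEG/MEG/fNIRS hardware.
        return NeuralSample(timestamp=0.0, channels=np.zeros(8))


class Interpreter:
    """Component 2: processes the data and generates output commands."""
    def decode(self, sample: NeuralSample) -> str:
        # A real interpreter would run filtering plus pattern recognition;
        # this stub just thresholds the signal amplitude.
        return "idle" if np.abs(sample.channels).max() < 50.0 else "select"


class Application:
    """Component 3: a (computer) system operated by the commands."""
    def execute(self, command: str) -> None:
        print(f"executing command: {command}")


# Wiring the three components together into one processing loop.
recorder, interpreter, app = Recorder(), Interpreter(), Application()
app.execute(interpreter.decode(recorder.read()))
```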
A device to record neural activity data
Recording neural activity can be done with various types of sensors. An important distinction to make here is between invasive and non-invasive systems.
Invasive systems use sensors that are implanted directly inside the brain, typically providing very high temporal and spatial resolution and allowing access to neuron clusters deep inside the brain that are out of reach for non-invasive systems. Currently, these systems require very complex, expensive and risky brain surgery, and are thus mostly confined to animal testing and to medical patients with extreme forms of epilepsy, whose symptoms can be mitigated by electrodes implanted in the brain as part of deep brain stimulation therapy against seizures. Yet invasive BCI systems outperform non-invasive ones by such a margin that some companies, like Elon Musk’s Neuralink, aim for this type of system anyway. Consumer-friendly invasive systems that are applicable on a massive scale, however, are most probably still decades away.
For consumer applications, non-invasive systems, where the sensors remain external to the body, are a much better fit. Their general disadvantage is that they only record the activity of the upper layers of the brain (the cortex, the seat of our higher cognitive functions such as vision, language, reasoning, learning, touch and fine motor control). Most of the unconscious processes regulating our metabolism, however, are seated deeper in the brain and remain inaccessible to non-invasive systems. The three main examples of non-invasive systems are:
- fNIRS (functional near-infrared spectroscopy);
- MEG (magnetoencephalography);
- EEG (electroencephalography).
fNIRS measures blood oxygenation at the surface of the brain by shining near-infrared light through the scalp and recording how much of it is reflected back. This gives a measure of energy consumption by the neurons and thus shows which parts of the brain are more active at a given moment. While relatively low-cost and providing high spatial resolution, this technique suffers from low brain sensitivity and low temporal resolution.

EEG and MEG both work by recording the ionic electrical currents in and around the neurons of the cortex. The former measures the fluctuations in the electric field these currents generate, using electrodes placed on the scalp, while the latter measures the fluctuations in the magnetic field, using heavy and expensive magnetometers. Both techniques generate similar data and allow for high temporal resolution, but suffer from low spatial resolution. EEG systems, however, are orders of magnitude cheaper and more portable than MEG systems, making them a far better choice for consumer BCI systems.
A device to process and interpret the gathered data and generate output commands
Historically, processing and interpreting the data were two separate processes: processing meant the physical transformation, inside the amplifier (a printed circuit board connected to each individual sensor), of the analog signal coming from the sensors into a usable digital signal, while interpreting meant the actual reading of the resulting data by a human expert. The lines between the two have become far more blurred with the introduction of computer technology. While most current systems still have a physical amplifier for pre-processing of the signal (signal combination, noise reduction), most of the actual processing and interpreting is now done by specialized software on a computer.
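As an illustration, here is a minimal sketch of the kind of digital pre-processing that now happens in software, using a standard band-pass filter from SciPy. The sampling rate and frequency band are illustrative assumptions, not fixed properties of any particular system.

```python
# Minimal sketch of software-side EEG pre-processing: band-pass
# filtering a raw signal to the 1-40 Hz range typically used for EEG.
# The sampling rate and cut-off frequencies here are illustrative.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0  # sampling rate in Hz (assumed; common for consumer EEG amplifiers)

def bandpass(raw: np.ndarray, low: float = 1.0, high: float = 40.0) -> np.ndarray:
    """Zero-phase Butterworth band-pass filter for a 1-D EEG signal."""
    nyquist = FS / 2.0
    b, a = butter(N=4, Wn=[low / nyquist, high / nyquist], btype="band")
    return filtfilt(b, a, raw)  # filter forward and backward: no phase shift

# Example: two seconds of synthetic "EEG" = 10 Hz rhythm + 60 Hz mains noise.
t = np.arange(0, 2.0, 1.0 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
clean = bandpass(raw)  # the 60 Hz mains component is strongly attenuated
```

Zero-phase filtering is a common choice for offline analysis because it avoids distorting the timing of neural events; real-time systems would use a causal filter instead.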
A (computer) system operating via the generated output commands
Here is where we draw the line between monitoring systems and actual BCI systems. Monitoring systems display the data for the user to interpret, typically in a medical or academic setting. BCI systems instead harness the data to generate commands, operating in effect like a peripheral input device. This requires software that can independently recognize neural patterns in the data and generate the corresponding commands. Most BCI systems use algorithms that detect specific and well-defined neural patterns, like common event-related potentials (ERPs), event-related desynchronization/synchronization (ERD/ERS) and slow cortical potentials (SCPs), which is sufficient for broad neurofeedback programs (e.g. a user with an EEG headset moving an arrow on-screen) and for applications such as prosthetic limb control.

The emergence of machine-learning algorithms about a decade ago has changed the game, however. Such algorithms monitor neural activity and can gradually learn to associate any number of desired commands with chosen neural patterns (e.g. move a character on-screen forward when you think about running). After some training, these algorithms can detect much finer and more user-specific neural patterns than traditional algorithms, and can also automatically calibrate themselves to a specific user, i.e. teach themselves to better recognize that user’s neural patterns. This opens the door to consumer-friendly BCI systems that require neither a lengthy and complex set-up nor a steep learning curve.
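To give a sense of what this machine-learning approach looks like in practice, here is a deliberately simplified sketch using scikit-learn. The band-power features, the synthetic calibration data and the “run”/“rest” labels are all illustrative assumptions; real systems use richer features and far more careful training.

```python
# Hypothetical sketch of the machine-learning approach described above:
# a classifier learns to map a user's neural patterns (here reduced to
# per-channel band-power features) to commands such as "run" or "rest".
# Feature choice, labels and data shapes are illustrative assumptions.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def band_power_features(epochs: np.ndarray) -> np.ndarray:
    """Reduce each epoch (n_channels x n_samples) to per-channel log-power."""
    return np.log(np.mean(epochs ** 2, axis=-1))  # shape: (n_epochs, n_channels)

# Synthetic training data standing in for a user's calibration session:
# 100 labeled epochs of 8-channel EEG, 250 samples each.
epochs = rng.normal(size=(100, 8, 250))
labels = rng.integers(0, 2, size=100)  # 0 = "rest", 1 = "run"
epochs[labels == 1, :4] *= 1.5         # imagined movement boosts some channels

clf = LinearDiscriminantAnalysis()
clf.fit(band_power_features(epochs), labels)

# At run time, each new epoch is decoded into a command for the application.
new_epoch = rng.normal(size=(1, 8, 250))
command = "run" if clf.predict(band_power_features(new_epoch))[0] == 1 else "rest"
```

Self-calibration, in this picture, simply means re-running the training step on data from the specific user, so the classifier gradually adapts to that user’s own neural patterns.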
Synthesis
After nearly a century of step-by-step advances in various fields, EEG-based BCI systems have come out of the realm of science fiction and into the real world. The rapid progress seen in the technology over the past 20 years has led to a host of impressive applications, from mind-controlled prosthetics and robots to enabling locked-in patients to communicate with their loved ones. Unfortunately, the technology is still mostly confined to academic and medical settings due to its complexity of use (generally requiring qualified, and expensive, staff), its lack of portability and its high cost per unit, a consequence of currently limited demand. Any BCI system wishing to make the leap from the lab to the global consumer electronics market must find a way to solve these issues.
Of all the BCI systems considered previously, the best contender to overcome these issues seems to be an EEG-based BCI system equipped with machine-learning software. The ongoing miniaturization of electronic components has reached a point where it becomes feasible to design research-grade yet compact and portable EEG systems, while the parallel decrease in the cost of EEG hardware has almost brought unit prices within the means of the average electronics enthusiast or avid gamer. At the same time, the introduction of machine learning to BCI systems has the potential to make the technology accessible to regular, non-specialist consumers. With self-calibrating algorithms that teach themselves to recognize and interpret the user’s neural patterns, installing and operating an out-of-the-box BCI system could become as simple as setting up the latest PlayStation.
If these trends hold, the technological and economic hurdles that have so far prevented EEG-BCI technology from going mainstream can be expected to fall within the next few years. But this alone does not guarantee the future commercial success of consumer EEG-BCI systems, as demand for, and awareness of, such systems among the wider public currently remains very limited. This is where Cortex Machina will make the difference, by coupling the development of the next-generation EEG hardware and software described above with a marketing strategy focused on generating demand for EEG-BCI systems in the gaming sector… But that will be the subject of another article.