IN BRIEF

Brain-computer interfaces (BCIs) could revolutionize human-machine interaction if they become widely available to consumers. However, their arrival raises significant ethical, societal, and security dilemmas that require proactive governance. 

THE GIST

In May of this year, Apple announced a partnership with Synchron, a brain-computer interface manufacturer. This partnership will soon enable individuals with motor impairments to control Apple devices such as iPhones or Vision Pro headsets directly with their minds (i.e., without the need for physical movement or voice commands). This development marks a significant step forward in terms of accessibility, human-machine interaction, and the integration of neurotechnologies into the consumer market.

As their name suggests, brain-computer interfaces (BCIs) are neurotechnologies that connect mind and machine by converting the brain’s electrical signals into computer-executable commands, allowing users to interact with or control external devices through thought alone. The methods used to detect brain signals can be invasive – involving surgical procedures to implant electrodes into the brain (e.g. Elon Musk’s Neuralink) – or non-invasive – using wearable devices and techniques such as electroencephalography (EEG).
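To make the signal-to-command conversion concrete, the sketch below shows one of the simplest non-invasive decoding techniques: estimating the power of an EEG trace in the alpha band (around 10 Hz, which strengthens when a user relaxes or closes their eyes) and mapping strong alpha activity to a device command. This is a hypothetical toy pipeline on simulated data – the function names, the 10 Hz target, and the threshold are illustrative assumptions, not the method used by Synchron, Apple, or any real BCI product.

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Estimate signal power at one frequency using the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Squared magnitude of the DFT at bin k
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def decode_command(samples, sample_rate=256, alpha_hz=10.0, threshold=1e4):
    """Map strong alpha-band activity to a command (illustrative thresholds)."""
    power = goertzel_power(samples, sample_rate, alpha_hz)
    return "SELECT" if power > threshold else "IDLE"

# Simulated one-second EEG trace dominated by a 10 Hz (alpha) oscillation.
fs = 256
alpha_wave = [50 * math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
print(decode_command(alpha_wave, fs))  # strong alpha band -> "SELECT"
```

Real consumer and clinical systems replace this single-frequency threshold with multi-channel filtering, artifact rejection, and machine-learning classifiers, but the basic loop – acquire signal, extract a neural feature, emit a command – is the same.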

Although BCI systems were initially conceived in clinical settings and most devices on the market are still used for medical purposes, the promising results in the healthcare sector have paved the way for a growing number of consumer applications. In fact, dedicated consumer neurotech firms now account for 60% of the global neurotechnology landscape, most of which are geared toward wellness, fitness, and cognitive enhancement.  

As these BCI systems are increasingly embedded with artificial intelligence and integrated into everyday products, they have the potential to transform not only how we interact with machines but also the very core of human interaction – raising significant ethical, societal, and security dilemmas. 

THE TAKEAWAY

Most BCI consumer tools are still in the development or experimental phase. Yet, some of the potential future applications that are being explored include using brain activity monitors in the workplace to enhance productivity and performance; personalized learning platforms that respond to cognitive load, concentration, and emotional states; immersive gaming experiences that allow players to control the game with their mind; and marketing applications to obtain real-time insights into consumers’ emotional states and personalize their retail experience even further. Beyond civilian applications, defense ministries are also investing in BCIs for the battlefield to remotely manage weaponry, improve soldiers’ cognitive performance, or enhance training simulations.  

Exciting as they may be, some of these potential BCI applications raise concerns about the risks associated with a technology capable of recording and interacting with our most personal data – our neural data. These risks include threats to users’ mental privacy, autonomy, and integrity, as BCIs could allow malicious actors to collect, process, and exploit highly sensitive data about how people feel or think for surveillance or manipulation purposes. Further concerns surround unequal access to BCI enhancements and the implications of BCI use for people’s personal identity and cognitive development.

Overall, the shift toward unregulated consumer applications of BCI technology creates a significant governance gap that needs to be addressed before the technology becomes socially widespread. This requires further research and well-informed dialogue on the potential effects of human-machine and human-to-human interaction through BCIs and other forms of neurotechnology. It also entails developing clear rules and regulations on neural data ownership and management, consent, and security standards – ensuring we can control machines with our thoughts while preventing our thoughts from being controlled by machines.

DELVE DEEPER