Yves here. The photo for this article shows a man wearing virtual reality goggles. The article makes clear that the data sources are neuronal activity, as in neuromotor signals, presumably captured via skin monitors (I think of a less intrusive version of the stickies and leads used for EKGs). Of course, these developments were long anticipated in science fiction; see Neuromancer and the many tales with brain implants and related human capability enhancement as a major plot device.
The concerns raised here are of yet more individual data capture and sale and loss of privacy. Paranoid Luddites like me cannot fathom why so many are cavalier about this sort of thing.
Back to the goggles. The fact that one use case is better VR makes me wonder whether people who don’t have binocular vision (and so have no depth perception1, can’t use VR, and therefore would not be included in datasets from VR-type applications) will be excluded from some of these “advances,” at least for a while.
By Michael Nolan, a science and technology writer. His writing covers neurotechnology, data privacy and emerging neuroscience research. Originally published at Undark
The past few decades of neuroscience research have produced a wide array of technologies capable of measuring human brain activity. Functional magnetic resonance imaging, implanted electrode systems, and electroencephalograms, or EEGs, among other techniques, have helped researchers better understand how our brains respond to and control our bodies’ interactions with the world around us.
Now some of these technologies — most notably, EEG — have broken out of the lab and into the consumer market. The earliest of these consumer-facing neurotechnology devices, relatively simple systems that measured electrical signals conducted across the skull and scalp, were marketed mostly as focus trainers or meditation aids to so-called “biohackers” seeking to better themselves through technology. However, tech industry giants have lately taken notice, and they are exploring inventive new ways to make use of the inner electrical conversations in our brains.
In 2019, Meta, then still known as Facebook, paid nearly $1 billion to purchase CTRL-Labs, a startup whose flagship product was a wristband that detects neuromotor signals, allowing the wearer to manipulate a computer system using a range of forearm, hand, and finger movements. Last year, Snap, the parent company of Snapchat, spent an undisclosed sum to acquire NextMind, whose headset uses EEG technology to let a user “push a virtual button simply by focusing on it.” Even Valve, the publisher that runs the massive Steam video game store, has partnered with brain-computer interface developer OpenBCI, with an eye toward integrating brain-computer interfaces into virtual reality headsets.
The promise of these systems is to give users a new, potentially more widely accessible way to control computers — an alternative to standard interfaces such as mice, handheld controllers, and touchscreens. What is sure to appeal to tech industry behemoths, however, is the trove of real-time data that these devices collect about a person’s neuronal activity. This latest revolution in neurotech could conceivably yield a windfall for companies like Meta and Snap, which have built their business models around data-driven advertising. For the average consumer, however, it may portend a new kind of threat to data privacy — one that regulators seem woefully unprepared to corral.
Companies like Meta and Snap make substantial profits by collecting data on users’ web activity, using those data to identify highly specific target demographics for advertising clients, and selling access to user information to third-party businesses and researchers. A key tenet of this model is the idea that, with enough information about individuals and their habits, developers can divine, with fine-grained specificity, how a certain person will respond to certain advertisements. To that end, companies might use feedback surveys to try to determine whether an ad was successful, or track people’s online interactions with ads through measures such as clickthrough rates or the time a person spends hovering their mouse pointer over a given image or video.
Tracking a person’s brain activity in real time, however, could in theory offer a more reliable, more precise, and personalized representation of an ad’s effectiveness. In laboratory experiments, researchers have shown that certain EEG signals can be used to accurately detect when a person has seen a strong sensory stimulus, or suddenly starts paying attention to something new. These signals, called event-related potentials, can in turn be used to gauge user interest and assess advertisement effectiveness. For platforms like Snapchat and Meta, it could herald a faster, more accurate way to get feedback about ad performance.
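To make the idea concrete, here is a minimal sketch of how ERP-based attention detection works in principle: slices of EEG are cut around stimulus onsets, averaged so that random noise cancels out, and the size of the resulting deflection in a post-stimulus window is read as a marker of attention. Everything below (the simulated data, sampling rate, and time windows) is hypothetical and purely illustrative, not any company’s actual pipeline.

```python
import numpy as np

# Hypothetical inputs, for illustration only:
#   eeg    - voltage samples from a single electrode
#   onsets - sample indices at which a stimulus appeared
#   fs     - sampling rate in Hz (250 Hz is a common EEG rate)
fs = 250
rng = np.random.default_rng(0)

# Fake data: 60 s of noise with a small positive deflection
# ~300 ms after each "stimulus" (a P300-like bump).
eeg = rng.normal(0.0, 10.0, 60 * fs)
onsets = np.arange(2 * fs, 58 * fs, 2 * fs)  # one stimulus every 2 s
bump = np.exp(-0.5 * ((np.arange(fs) / fs - 0.3) / 0.05) ** 2) * 5.0
for t in onsets:
    eeg[t : t + fs] += bump

# Cut an epoch around each stimulus: 200 ms before to 800 ms after.
pre, post = int(0.2 * fs), int(0.8 * fs)
epochs = np.stack([eeg[t - pre : t + post] for t in onsets])

# Baseline-correct each epoch on the pre-stimulus interval, then
# average across trials; averaging suppresses random noise and
# leaves the stimulus-locked event-related potential.
epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
erp = epochs.mean(axis=0)

# The P300 component is conventionally read as the peak in roughly
# the 250-500 ms post-stimulus window; a larger peak is taken as a
# sign of stronger attention to the stimulus.
win = slice(pre + int(0.25 * fs), pre + int(0.5 * fs))
print(f"P300-like peak: {erp[win].max():.1f} microvolts (simulated)")
```

The averaging step is the crux: any single trial is dominated by noise, but the response that is time-locked to the stimulus survives the average while everything else washes out.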
The practice of measuring neurological activity to gain insights into consumer behavior, known as neuromarketing, has been around since the early 1990s. Neuromarketing methods have so far been deployed only in controlled research environments, and it’s unclear how well, if at all, they will work in the wild. Still, the recent moves by ad-revenue-driven social media platforms to develop brain-computer interface technology suggest that neuromarketing might be on the cusp of going mainstream. With companies like Meta and Snap already investing billions of dollars into virtual and augmented reality, it is not a stretch to imagine them integrating EEG signal collection into the suite of user data already being collected through head-mounted VR and AR devices. In fact, OpenBCI, which is collaborating with Valve, has already integrated EEG into its Galea VR headset.
Social media firms have long aggregated user data for the purpose of targeted advertising, but the prospect of adding neurological data to this brokerage represents uncharted territory laden with risks.
For one thing, it’s not clear what neuromarketing would mean for the user experience. Neuromarketing metrics are produced from measurements of basal electrochemical reactions in a person’s brain — they are less a genuine measure of whether someone is interested in a product than they are the neurological equivalent of a knee-jerk reflex test. Algorithms that optimize advertising content based on neuromarketing metrics could potentially lead developers to pepper users with the most eye-catching stimuli possible, turning EEG-integrated VR use into a bombardment of weapons-grade annoyance.
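To see why, consider a hedged sketch of that feedback loop: a simple epsilon-greedy bandit that treats a per-impression “neural engagement” score (say, an ERP amplitude) as its reward. The ad names, scores, and numbers below are all invented for illustration, not a description of any platform’s actual system. The point is only that such a loop, left to itself, drifts toward whatever stimulus most reliably jolts the brain.

```python
import random
from collections import defaultdict

# Hypothetical epsilon-greedy ad optimizer whose reward is a
# per-impression "neural engagement" score. Purely illustrative.
ADS = ["calm_product_demo", "flashing_banner", "loud_jump_cut"]
EPSILON = 0.1  # fraction of impressions spent exploring

totals = defaultdict(float)
counts = defaultdict(int)

def fake_neural_score(ad: str) -> float:
    """Stand-in for an ERP-derived engagement score: startling
    content reliably triggers the brain's orienting response."""
    base = {"calm_product_demo": 1.0,
            "flashing_banner": 2.5,
            "loud_jump_cut": 3.0}[ad]
    return random.gauss(base, 1.0)

def choose_ad() -> str:
    # Occasionally explore; otherwise exploit the best mean score.
    if random.random() < EPSILON or not counts:
        return random.choice(ADS)
    return max(ADS, key=lambda a: totals[a] / counts[a] if counts[a] else 0.0)

for _ in range(5_000):  # simulated impressions
    ad = choose_ad()
    counts[ad] += 1
    totals[ad] += fake_neural_score(ad)

for ad in ADS:
    print(f"{ad:18s} shown {counts[ad]:5d} times, "
          f"mean score {totals[ad] / counts[ad]:.2f}")
```

Run it and the startling creative wins nearly every impression, not because anyone chose annoyance as a goal, but because the reflexive signal rewards it.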
Large-scale neuromarketing could also have unforeseen negative consequences for data privacy. If platform companies like Meta and Snap were to connect even rough measurements of a person’s brain activity with the dauntingly large stores of data they already record — including information on users’ location, buying habits, and online activity — it could give them a far more complete picture of their users than the average person might be comfortable handing over. Although the capabilities of EEG and other neurotechnologies fall far short of mind reading, they capture sensory reactions that users have little if any control over, and that could in theory reveal attentive responses to intrusive environmental stimuli a user didn’t intend to focus on.
Algorithms linking heightened neural responses to a world of distractions may erroneously flag arbitrary interactions as important or meaningful.
Meanwhile, laws and regulations governing neural data privacy are not just behind the curve — they are nearly nonexistent. Legislation such as Europe’s General Data Protection Regulation gives individuals some control and protection over their own digital footprint, and at least two states in the U.S. have enacted biometric privacy laws that protect people from unknowingly being subjected to physiological measurements in public spaces. But some experts have argued that neural data privacy is a special case that requires a new regulatory approach. So far, technology firms looking to build out neuromarketing efforts and other neural data monetization schemes have largely been left to police themselves.
That should be enough to give all of us pause.
____
1 Those with no depth perception function pretty normally because they can judge distance by motion against a background. But sports like golf, where you have to “see” how far away the ball is while keeping your head still, are probably not on.