Imagine a world where your emotions, attention span, mental health, and even your unconscious reactions aren’t just personal experiences; they’re data. U.S. Senators Chuck Schumer, Maria Cantwell, and Ed Markey have formally asked the Federal Trade Commission (FTC) to investigate how brain-computer interface (BCI) companies are collecting and monetizing neural data, a new form of biometric information that taps directly into your thoughts, emotions, and mental states.
Their letter follows a landmark 2024 study by the Neurorights Foundation, which analysed the privacy policies of 30 consumer neurotechnology companies and found significant gaps in transparency and consumer protections: 29 of the 30 companies collected users’ neural data, but only 14 gave users the option to delete it, and just a handful allowed users to opt out of data collection, especially outside the EU.
Many neurotech products today, like brain-sensing gadgets, are sold as consumer wellness tools rather than medical devices. Because of this classification, they don’t fall under HIPAA, the U.S. law designed to protect sensitive health data; instead, these companies fall under weaker consumer laws that weren’t written with neural data in mind. In Europe, the General Data Protection Regulation (GDPR) offers broad protections for personal data, including neural information, but it only applies to people in the EU. In the U.S., HIPAA only safeguards neural data when it’s collected in a healthcare setting, so if your brainwave data comes from a meditation app or a neurofeedback headband, you’re likely not covered. That leaves the FTC as the main watchdog, but it mostly steps in when companies engage in deceptive practices, and there are no clear rules for neural data.

Even when companies do offer privacy policies, they’re often vague, confusing, and hidden behind walls of legal jargon. Consent is usually bundled into long, unreadable terms of service, making it hard for everyday users to truly understand what they’re agreeing to, or how their most personal data is being used.
In response to these concerns, a growing number of scientists, ethicists, and human rights advocates are calling for Neurorights, a proposed set of digital rights tailored to protect our cognitive freedoms. These include the right to mental privacy, meaning your thoughts should remain private unless you explicitly share them; the right to cognitive liberty, so you can think, learn, and decide without manipulation; the right to psychological continuity, which protects your identity and sense of self from unwanted interference; and the right to fair access to mental enhancement, so brain-boosting technologies are accessible to all. Countries like Chile have already started embedding these principles into law. The U.S., however, is still playing catch-up.
If you’ve used a meditation app, a smart headband, a VR headset with biometric sensors, or even certain fitness wearables, you’ve likely shared some form of brain or nervous system data. Yet, unless you live in a region with strict privacy laws, like the EU or California, there’s little clarity on how that data is collected, stored, or sold.
Neurotechnology is evolving fast. These tools could help paralyzed individuals communicate, detect early signs of Alzheimer’s, or boost focus and creativity. But with that potential comes an immense risk of misuse. Without strong regulations, neural data could become the next digital gold rush, except this time, companies won’t just know what we click; they’ll know what we think.
What’s unfolding isn’t just a tech trend. It’s a civil rights issue for the digital age. In a world where algorithms already predict our behaviour, the last thing we need is corporations predicting our thoughts and profiting from them. Whether we act now or later will decide whether neural data becomes a force for empowerment or exploitation.
If your organization is handling copious amounts of personal data, do visit www.tsaaro.com
News of the week
- TikTok Fined €530 Million by EU Over Data Transfers to China
TikTok has been fined €530 million by the European Union for violating data privacy rules. The Irish Data Protection Commission found that the platform allowed employees in China to access European user data without proper safeguards, breaching the EU’s General Data Protection Regulation (GDPR). The regulator also said TikTok failed to clearly inform users about where their data was being shared.
TikTok now has a timeline to bring its practices into compliance or stop transferring European data to China altogether. The company plans to appeal the decision and has pointed to its €12 billion initiative, Project Clover, which aims to improve data security in Europe through local data centres and tighter access controls.
- Co-op Confirms Customer Data Breach
The Co-operative Group has revealed that hackers gained access to the personal data of some of its members, including names, contact details, and birthdates. Fortunately, financial details and passwords were not compromised. While the breach has raised concerns, the Co-op is working closely with the UK’s National Cyber Security Centre (NCSC) and the National Crime Agency (NCA) to investigate what happened and prevent future attacks. The company has assured customers that it is taking steps to protect their information and will continue to provide updates as the investigation progresses.
https://www.bbc.com/news/articles/crkx3vy54nzo
- M&S Hit by Major Cyberattack, Disrupting Operations
Marks & Spencer (M&S) is dealing with a major ransomware attack that has affected its IT systems, causing widespread disruptions. The attack, reportedly from the Scattered Spider group, led to issues with online sales and in-store operations, including failed payments and problems with the food supply chain. This has resulted in an estimated £40 million loss for the retailer. M&S is working with Microsoft and cybersecurity firm CrowdStrike to recover and strengthen its systems. However, many internal systems are still down, and the company is facing significant operational challenges as it works through the crisis.