Ethical Concerns for Cutting-Edge Neurotechnologies

“People could end up behaving in ways that they struggle to claim as their own.”

Although it may be years before brain-computer interface (BCI) technology becomes part of daily life, a team of experts is calling for conversations about the field's ethics to begin now.

The demand comes from the Morningside Group, a collection of neuroscientists, neurotechnologists, clinicians, ethicists, and engineers. Their concerns about the technology range from data privacy minutiae to science-fiction nightmares, and they say existing ethical guidelines are insufficient.

“We are on a path to a world in which it will be possible to decode people's mental processes and directly manipulate the brain mechanisms underlying their intentions, emotions and decisions,” they write in a new commentary published this month in Nature.

They identify four main issues they would like to see addressed: privacy and consent, agency and identity, augmentation, and bias.

“People could end up behaving in ways that they struggle to claim as their own,” they write. BCI technology could be vulnerable to exploitation by hackers, and they also worry that it could fundamentally alter individual agency and people’s “private mental life.”

The team proposes that patients have the right to opt out of sharing any personal information gathered during the study and implementation of BCI technology. That option should be the default, they argue, rather than something a consumer must seek out. Neural information, they add, should be treated the same way as donated live organs and tissue.

Neural data from those who choose to share may be enough to draw conclusions about those who don't. The experts also recommend that any commercial use of such information be regulated, with provisions to prevent people from "having neural activity written directly into their brains for financial reward."

To protect agency and avert a neural "arms race," the Morningside authors look to international treaties of the last century. To set limits on BCI capabilities, an equivalent of the 1948 Universal Declaration of Human Rights could be drafted, complete with signatories, oversight, and sanctions for countries in violation.

They insist that any military use of the technology, such as the enhancement of human sensory or endurance capacities, must be "stringently regulated," much as chemical and nuclear weapons are. The authors discourage outright bans, however, for fear of pushing the technology underground.

Bias is another concern. The well-documented underrepresentation of women and minorities in the tech sphere can manifest in the resulting algorithms, the authors write, and risks becoming embedded in neural devices. From the earliest stages of development, marginalized groups should be given a voice in the creation of neurotechnologies.

Underlying the piece is a concern that “profit hunting will often trump social responsibility.” Ethics, they write, should be taught to engineers in the same way patient privacy principles are taught to doctors.

They estimate that current spending on artificial intelligence and neurotechnology exceeds $100 million globally, a number that will grow quickly. Many of the world's major tech companies, including Google, IBM, and Facebook, are investing heavily in neural networks, as are countless startups. While early achievements have been impressive, the team calls them "coarse" compared with what is to come.

“The possible clinical and societal benefits of neurotechnologies are vast,” they conclude. “To reap them, we must guide their development in a way that respects, protects, and enables what is best in humanity.”
