Neurotech – Marvel of Medicine or Threat to Mental Integrity?

Hosted by Dr Michael Wildenauer from the Centre for Artificial Intelligence and Digital Ethics (CAIDE) at the University of Melbourne as part of the Ninian Stephen Law Program powered by the Menzies Foundation, February’s “Neurotechnology and the Law” forum featured four expert panelists who illuminated some of the key legal concerns around this rapidly emerging technology.

The event highlighted CAIDE’s interdisciplinary focus, with an audience of professionals and students from fields such as medicine, law and psychology, and topics that ranged from protections against data-driven advertising to the very framework of modern human rights.

Nick Opie

Professor Nick Opie, the founding CTO of Synchron and Head of the Vascular Bionics Laboratory at the University of Melbourne, provided a scientific perspective on neurotechnology, using Synchron’s Stentrode as the focus of his presentation. This neurotech device allows patients whose brains function normally but who are paralysed (through stroke, motor neuron disease, spinal cord injury, traumatic injury resulting in loss of limbs, and similar conditions) to communicate with and regain some control over their world.

The brain-computer interface (BCI) is designed to let people control digital devices with their thoughts, enabling them to use technology such as computers and phones to communicate and complete tasks of daily living. Although the device is medically groundbreaking and of huge benefit to patients, Prof Opie also highlighted some key challenges of BCIs, such as the difficulty of regulating brain implants that rely on adaptive algorithms, which continue to change after implantation.

While these algorithms may have significant advantages over static software for enabling fast and accurate control of assistive technology, when they control moving devices (such as wheelchairs or exoskeletons) or are coupled with stimulation for the treatment of Parkinson’s tremor or epilepsy, it is imperative that guard rails are put in place to ensure the software continues to do exactly what it ought to be doing.

When BCI devices transition out of the medical field and become readily available to consumers as wearable devices, it may prove challenging to ensure that they are entirely safe and monitored correctly as the delineation between 'medical products' and 'toys' becomes blurred.

Patrick Hooton

Patrick Hooton, a Human Rights Advisor from the Australian Human Rights Commission, who led the Commission’s Background Paper on Neurotechnology and Human Rights, advocated for human rights guardrails to be built into all new and emerging technologies. Patrick focused on three areas: neurorights, the traditional right to privacy, and rights of people with disability.

Neurorights is an umbrella term used to guide discussions about human rights protections for the mind. Patrick discussed recent arguments that the existing human rights framework may not be sufficiently tailored to neurotechnologies. For example, although a human right to bodily integrity exists, neurorights advocates call for a human right to mental integrity to be created, to protect the mind from neurotechnology that may not interfere with the body as such.

Hooton went on to discuss the importance of privacy: neural data is the most sensitive and private form of personal information we have. He used the case of Second Sight, a company providing implants for artificial vision, to illustrate the potential pitfalls of neurotech. The company became insolvent and abruptly discontinued services to those who had its device implanted. In at least one case a user’s device stopped functioning without warning, and others were left without promised software upgrades, facing the possibility of their vision going dark in the event of technical failure.

As this example illustrates, certain consequences associated with neurotech will disproportionately affect those with a disability. There need to be sufficient safeguards to protect people who use neurotech, especially those who are most vulnerable.

Veronica Scott

Veronica Scott, previously a Partner at KPMG Law leading the Cyber, Privacy and Digital Data Practice in Australia (and now a partner at Pinsent Masons), discussed neurotechnology from a privacy angle. Ms Scott outlined how extremely slow the law can be in responding to digital developments and societal needs, citing the three years of the Privacy Act Review without draft legislation as an example of the slow pace of legislative change in this area.

Data and information flow along a lifecycle, with risks at every stage; regulation is therefore necessary at every stage of that lifecycle. Scott emphasised that, with no Bill of Rights in Australia, any general right to privacy is founded only in the Privacy Act 1988 (Cth) (which reflects Australia’s international commitments).

However, when we talk about neural data and information privacy, there is a certain blurring between information about someone that sits separate from them, information that is integral to them, and information that influences them. A new approach must be developed to deal with information privacy and neurotechnology.

Whilst Ms Scott agreed with Hooton on the need to bring neurorights into the conversation, she outlined how the lack of “strong guidance for organisations… [and] a common framework” on existing data protection and privacy law might pose a more immediate challenge, and how addressing it could help set the foundation for a more effective legislative framework.

Michelle Sharpe

Dr Michelle Sharpe, a barrister with wide expertise in consumer law, former chair of the Victorian Bar’s Health and Wellbeing Committee, and co-author of the article ‘What is neurotechnology and why are lawyers getting involved?’, gave a presentation focused on competition and consumers. Dr Sharpe argued that welfare is enhanced by an efficient economy, which requires competition, which in turn requires consumers to make free and informed choices about the goods and services they consume.

Regarding neurotechnological devices such as BCIs, the data generated by monitoring a person’s brain activity could be sold to companies, which could distort the market by corrupting consumers’ free choice. To illustrate this, Sharpe gave the example of a person wearing a BCI device while gaming. The gamer’s brain would show excitability at certain points, and with access to this data a developer could deploy micro-targeted advertising.

As neurotechnology provides a way for the most intimate of data to be sold and used, there must be adequate safeguards in place. Although section 21 of the Australian Consumer Law (ACL) prohibits unconscionable conduct in trade or commerce, the difficulty with BCIs is that consumers may not even be aware that their data is being used to manipulate them, and if they are aware, they may not have enough evidence to prove it. Dr Sharpe urged lawmakers to turn their attention to the ways in which consumers may be electronically cornered into a purchase (arguing these are simply the evolution of protections for consumers “cornered by a door-to-door salesman”), and to legislate sufficient protections.

CAIDE thanks our panel of experts for helping bring together technologists, lawyers and advocates to better understand the impacts of an emerging technology, one that combines AI with neuroscience and biomedical engineering, on society as a whole and on the law in particular.

Update: CAIDE provided verbal and written submissions to the AHRC for its Neurotechnology and Human Rights project. The new background paper on Neurotechnology and Human Rights mentioned by Patrick Hooton in his talk was released by the Australian Human Rights Commission in early March and can be found here: Protecting Cognition.

This summary was prepared by Lada Volkova with additional material by Andrew Lim and Michael Wildenauer.