Looking into the Psychiatric Panopticon: Ethical, Legal and Social Issues of Automated Nursing Observations in Acute Psychiatric Settings

This project was awarded funding under the Centre for AI and Digital Ethics' 2021 seed funding round: Pervasive Devices.

In acute psychiatric units, nurses routinely monitor patients in their bedrooms. ‘Digitally assisted nursing observation’ has emerged as a novel attempt to semi-automate this process, with the aims of improving patient safety and minimising night-time sleep disruption. Patients’ bedrooms are fitted with sensors that monitor the person’s body using ‘computer vision, signal processing and AI techniques to [remotely and continuously] track micromovements and colour changes’.[1] The sensors can detect the person’s pulse and breathing rate, and efforts are reportedly underway to detect behaviours associated with self-harm, assault and suicide.
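For readers unfamiliar with how pulse can be read from colour changes in video, the sketch below shows the general principle behind remote photoplethysmography: subtle, periodic colour fluctuations in skin pixels are band-pass filtered to the plausible human pulse range, and the dominant frequency is taken as the heart rate. This is a minimal illustration only, not the pipeline used by any particular vendor or trial; the function name, filter settings and synthetic test signal are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_pulse_bpm(green_means: np.ndarray, fps: float) -> float:
    """Estimate pulse rate from mean green-channel intensities sampled
    over a skin region in each video frame (hypothetical helper)."""
    # Remove the mean so slow lighting drift does not dominate.
    signal = green_means - np.mean(green_means)
    # Band-pass filter to the plausible human pulse band (~42-180 bpm).
    low, high = 0.7, 3.0  # Hz
    b, a = butter(3, [low / (fps / 2), high / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    # Take the dominant frequency within the band as the pulse estimate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    band = (freqs >= low) & (freqs <= high)
    pulse_hz = freqs[band][np.argmax(spectrum[band])]
    return pulse_hz * 60.0

# Illustration with a synthetic 72 bpm (1.2 Hz) signal at 30 fps.
fps = 30.0
t = np.arange(0, 20, 1.0 / fps)
fake_signal = 0.02 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.01, t.size)
print(f"Estimated pulse: {estimate_pulse_bpm(fake_signal, fps):.0f} bpm")
```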

Psychiatric Panopticon

An initial UK trial was reportedly well received by patients, relatives and staff.[2] However, a ‘patient and public involvement’ research phase elicited concerns about patient safety, privacy and the impact on staffing levels.[3] Several Victorian hospitals are planning to introduce this technology in acute psychiatric units. Yet there remains a research gap concerning the ethical, legal and social implications of such a move. These implications are serious given that around half of inpatients in these settings are detained and treated involuntarily.

References

[1] Alvaro Barrera et al, ‘Introducing Artificial Intelligence in Acute Psychiatric Inpatient Care: Qualitative Study of Its Use to Conduct Nursing Observations’ (2020) 23(1) Evidence-Based Mental Health 34.

[2] Ibid.

[3] Ibid.