Interrogating the Ethics of Biometric Capture in Immersive Musical Performance

This project was awarded funding under CAIDE's Inaugural Art, AI and Digital Ethics Seed funding round.

Immersive Performance

Immersive environments provide opportunities for new forms of musical performance co-created by humans and artificial intelligence (AI), while at the same time raising significant questions concerning ethics and ownership of performance. For example, how should we manage authentication and identity within virtual environments? What ethical and privacy considerations arise with technology owned by third parties? What wearables and headsets are considered secure or safe?

This project aims to cast a critical lens on these issues, while developing a discourse around ethical performance practice frameworks in immersive environments. It will provide opportunities to explore questions of ethics, creative ownership, and data security. To do this, we will produce a new performance work co-created by AI that utilises biometric data captured from performers. The performance will allow us to challenge perceptions of creativity, artistic ownership, and the future of ethical performance practice.

We will work alongside the University's Cybersecurity, Privacy and Legal teams to conduct a privacy impact assessment. This will lead to the creation of a white paper to prompt discussion and inform debate within the University and the wider community.

Biometric information captured through wearable and headset technology can be used across health, training, and performance applications. Along with the benefits of this rich data set come important considerations around authentication, privacy, storage, access, and accessibility. The rapid pace of technological advancement and the increasing accessibility of these devices create an urgent need to address these areas.

This project is significant in its capacity to help lead public debate, develop ethical frameworks, give agency to performers and creative practitioners, and establish research best practice in a timely manner.

Project Team

  • Dr Ryan Kelly

    Senior Research Fellow (Human-Computer Interaction)

    Faculty of Engineering and IT

    University of Melbourne

  • Dr Solange Glasser

    Lecturer in Music (Music Psychology)

    Faculty of Fine Arts and Music

    University of Melbourne

  • Dr Margaret Osborne

    Senior Lecturer in Psychology and Music

    Melbourne School of Psychological Sciences

    University of Melbourne

  • Ben Loveridge

    Coordinator, Immersive Media (Augmented Reality and Virtual Reality)

    Student and Scholarly Services

    University of Melbourne