Experts discuss: Is AI the Future of Mental Health Care?

We are revolutionising how we deliver mental health care to address the enormous unmet need in Australia.

On 31 May, University of Melbourne and industry experts came together to share their insights with the alumni community on how AI technology could provide tangible and sustainable solutions in the field of mental health care.

A total of 146 alumni attended the panel discussion at the Ian Potter Auditorium in the Parkville campus’ Kenneth Myer Building, with more than 1200 alumni registering to watch the discussion online.


Panel moderator and ABC journalist Natasha Mitchell kicked off the session by polling the audience. She asked, “Do you feel confident or apprehensive about a future where artificial intelligence and machine learning play a role in delivering mental health support services?”

The poll, answered via a link on attendees’ smartphones, was split almost evenly: 49 per cent of alumni responded ‘confident’ and 51 per cent ‘apprehensive’.

With such a divided response, the expert panellists were eager to share their knowledge on what the intersection of AI and mental health care looks like now and into the future.

The current landscape

The conversation began with some insights into the current state of mental health care in our country.

Professor Mario Alvarez-Jimenez, Chief of Orygen Digital and Professor of Digital Mental Health at the Centre for Youth Mental Health at the University of Melbourne, delivered some alarming statistics.

“According to the latest research from the National Mental Health Service, 40 per cent of young people in Australia experience mental illness every year,” said Professor Alvarez-Jimenez.

“The real issue is there’s a huge unmet need. People come to access mental health support, and many have to wait months before they see a clinician.”

The academics then shared their optimism about how AI could alleviate this desperate need for faster and more effective services.

Dr Simon D’Alfonso, Lecturer in Digital Health within the School of Computing and Information Systems at the University of Melbourne, described his team’s research in digital phenotyping technology and how it could be implemented practically.

Dr Simon D’Alfonso (left) and Dr Rahul Khanna (right).

“We leave behind a vast digital footprint in our daily lives – in the modern age we constantly interact with digital devices such as smartphones, laptops and smart watches,” explained Dr D’Alfonso.

“Digital phenotyping is basically about mining this data with computational AI techniques, predominantly machine learning, to try to gain some behavioural insight into the individual.”

He then gave the audience some examples of how our digital footprint can become useful information in a clinical setting.

“We might be able to determine somebody's level of socialisation for the week – are they receiving more calls than they're making? Are they missing a lot of calls, perhaps deliberately?” said Dr D’Alfonso.
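As a purely illustrative sketch – not drawn from the panel or from Orygen's systems – the kind of call-log features Dr D’Alfonso describes might be derived from hypothetical smartphone records along these lines:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical call-log record; a real digital-phenotyping pipeline would use
# whatever fields the sensing app on the device actually exposes.
@dataclass
class CallEvent:
    timestamp: datetime
    direction: str      # "incoming" or "outgoing"
    answered: bool
    duration_sec: int

def weekly_socialisation_features(calls: list[CallEvent]) -> dict:
    """Summarise one week of call activity into simple behavioural features."""
    incoming = [c for c in calls if c.direction == "incoming"]
    outgoing = [c for c in calls if c.direction == "outgoing"]
    missed = [c for c in incoming if not c.answered]
    return {
        "calls_made": len(outgoing),
        "calls_received": len(incoming),
        "missed_call_ratio": len(missed) / len(incoming) if incoming else 0.0,
        "talk_minutes": sum(c.duration_sec for c in calls if c.answered) / 60,
    }
```

In a real system, features like these would be only one input among many to a machine-learning model and, as the discussion that follows makes clear, would be collected only with the person's informed and ongoing consent.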

Natasha asked about the importance of consent being transparently negotiated, and whether this kind of monitoring could be perceived as a form of surveillance.

“With a traditional psychometric test, the patient fills it in once and it migrates to their mental health practitioner. If we’re trying to get this real-time data to infer something about their psychological state, that data could keep being reused,” said Dr D’Alfonso.

“With this ongoing monitoring process, we might have to periodically ask somebody for their consent.”

Putting AI into practice

Professor Wendy Chapman, Director of the Centre for Digital Transformation of Health and Associate Dean of Digital Health and Informatics at the University of Melbourne, conducts research on how digital innovations can be best implemented within health care settings.

“We need to, from the very beginning, be thinking about what the clinicians need and how the information in the electronic medical record can be merged with the information that's coming from the patient,” said Professor Chapman.

Professor Wendy Chapman.

“If you're asking a therapist or a psychiatrist to monitor a panel of patients and their social media use, when are they going to look at that, and who's going to look at it? These are all things that depend on the economic model of the system that you're in.”

Professor Chapman also spoke to programs that have been developed in the US where patients choose how the insights gleaned from their digital phenotype are shared.

“You nominate people that you want to know when things are going downhill, and you can choose different people for different types of signals that you might see,” explained Professor Chapman.

“I think it opens up this opportunity for more people to be involved and to increase the power that the individual has.”

Dr Rahul Khanna, Program Director of the Mental Health State-wide Trauma Service and Director of Innovation and Medical Governance at Phoenix Australia, is a practising psychiatrist with a focus on treating people with a lived experience of trauma.

Reflecting on the effectiveness of ChatGPT, Dr Khanna speculated on how similar AI technologies could assist people living with post-traumatic stress disorder.

“Within the trauma space, a lot of our work is in therapies that use language and story to heal, and of course, these are the things that generative AI in particular is doing incredibly well,” said Dr Khanna.

“One approach is what we call nightmare re-scripting – we'll take a client’s recurring nightmare and find ways in which that story or narrative could be reformulated in subtle ways. The person then thinks about this reworked version before bed, which can reshape or reduce the intensity of those nightmares.

“It's a very practical thing that a lot of people do with a clinician, but it's the sort of thing that you could imagine happening with an app or language bot without too much difficulty.”
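To make the idea concrete, the language-bot version of this exercise might amount to little more than wrapping the client's narrative in a carefully worded prompt for a generative model. The sketch below is purely illustrative, was not described by the panel, and assumes a hypothetical llm_client:

```python
def build_rescripting_prompt(nightmare_narrative: str) -> str:
    """Wrap a client's recurring nightmare in a prompt asking a generative
    model to reformulate it in subtle ways. Illustrative wording only;
    this is not a clinically validated protocol."""
    return (
        "The following is a recurring nightmare described by a client.\n"
        "Rewrite it with small, plausible changes so that the story resolves "
        "more safely and the client feels a greater sense of control, while "
        "keeping the setting and characters recognisable.\n\n"
        f"Nightmare: {nightmare_narrative}"
    )

# The prompt would then be passed to whichever large language model the
# service uses; 'llm_client' is a hypothetical placeholder, not a real API:
# rescripted = llm_client.generate(build_rescripting_prompt(narrative))
```

In practice, any such tool would sit alongside, rather than replace, the work a client does with a clinician.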

Dr Khanna also spoke about how existing practices, such as typing notes throughout a consultation, can get in the way of building a strong therapeutic relationship between patient and clinician.

“I think we're finally in a space where you don't need a physical, human scribe, which can be very intrusive. I think there are opportunities to enhance that interaction and buy more time to build that alliance.”

Addressing ethics and inclusion

With 113 questions submitted online throughout the discussion, the audience was captivated by the conversation and enthusiastic to learn more.

One alum asked whether there are new ethical considerations when using AI in experiments with people, and whether our ethics boards are prepared for them.

“I think there are some considerations, and I think probably, our ethics communities are not completely skilled in this yet,” responded Professor Alvarez-Jimenez.

“For example, where data is analysed is very important – whether it goes through to the cloud and then comes back to the device, or whether it’s saved locally. That makes a huge difference in terms of safety for consumers.”

Professor Mario Alvarez-Jimenez.

Another alum posed a question about inclusion: what risks does AI hold for diverse communities? Can AI be trained in cultural sensitivity, and how will the quality of care be measured?

Professor Chapman explained that although AI can perpetuate biases that exist in the industry, simply because it learns from existing data, it also presents an opportunity to correct them.

“I think people, at least in the research world, are really aware of this. There are companies and people that are creating frameworks, from how you collect the data to what features you choose,” said Professor Chapman.

“All these steps along the way are opportunities for biases, and new frameworks are coming out about how to avoid them.”

At the conclusion of the discussion, panellists and alumni stuck around to ask more burning questions and continue their conversations over drinks and canapés.

Many attendees expressed their gratitude for the opportunity to learn from leading industry and academic experts in this dynamic and emerging field.

Learn more about how the University of Melbourne is shaping the delivery of mental health care.

