Using AI in teaching, learning and assessment

An opportunity to re-evaluate the way we teach, what we are teaching, and why.

The advent of globally available Generative AI tools has profoundly changed the world we live in. In education, this new landscape demands that we re-evaluate not only the way we teach (for example, adapting to and integrating these new tools), but also what we are teaching and why. At the same time, the availability of AI technology poses a unique challenge: designing secure assessments that can still ensure academic integrity.

According to 2024 survey data collected from 8,000 Australian students (at the University of Queensland, Monash University, Deakin University and the University of Technology Sydney):

  • 67% of students say they successfully use GenAI for different purposes
  • 65% say that they always or almost always change what GenAI produces to suit their needs
  • 10% admit that they never change what GenAI produces, copying and pasting AI output directly into their work (Shibani & Lim, 2024).

These findings indicate that GenAI use is already widely integrated into students’ higher education learning.

In this context, this article provides advice on responding to the prevalence of GenAI across three equally important facets:

  1. Intellectual property and privacy protection
  2. Securing assessment
  3. Using GenAI in teaching and learning

Intellectual property and privacy protection

Of primary concern are the issues of intellectual property and privacy. Widely available tools such as ChatGPT allow for potential breaches of individual privacy and intellectual property. If, for example, work produced by staff or students (their intellectual property) is uploaded, it may become part of the training data for the model. As teaching staff, when handling students’ work we have a responsibility to protect students’ intellectual property (and also that of colleagues and peers). For this reason, it is advisable to avoid submitting students’ work, or anyone else’s intellectual property, to open third-party tools. Likewise, to protect privacy, do not upload personal or private information to GenAI tools.

For teaching staff, the best practice is:

  • Protect the University’s intellectual property: avoid uploading teaching materials produced by educators (including lecture slides, assignment rubrics, etc.) to third-party software. Instead, consider using SparkAI, where staff can experiment with GenAI using private or confidential information within certain restrictions (see Spark’s data guidelines for more). If such materials are found on third-party sites, there is a takedown process that can be followed.
  • Student work remains the intellectual property of the student and should not be uploaded to any open third-party software that is not part of the University’s enterprise suite.
  • Materials to which we do not own the copyright (e.g. readings provided to students through ReadingsOnline or other means) cannot be further shared.

Securing assessment

In addition, there is the challenge of providing secure modes of assessment.

As Australia’s Tertiary Education Quality and Standards Agency (TEQSA) shifts its focus to regulating assurance of learning in assessment, the University of Melbourne has responded with its Assuring learning at Melbourne guidance. In this regard, the University currently considers the following options ‘secure assessment’:

  • an observed exam or test
  • an interactive oral assessment
  • an observed performance
  • an observed internship or placement

For teaching staff, the best practice takeaways are:

  • Review the Assuring Learning guidelines. Note that your faculty will be developing plans and timelines to work towards ensuring that 50% of your subject’s assessment can be classed as ‘secure’, i.e. that it uses any of the options listed above: an observed exam or test, an interactive oral assessment, or an observed performance, internship or placement. Alternatively, your faculty may consider investing in a full-scale redesign of your program to incorporate programmatic assessment.
  • Note that checking whether an assignment prompt generates an artefact of passable standard is an unreliable way of assessing its vulnerability. There is now a large variety of GenAI tools available (some with features available only by subscription), which a sophisticated user can combine. You would therefore be ill-advised to take a basic test that returns a poor-quality artefact as evidence that the assessment is secure (from Generative AI in T&L at the University of Melbourne – FAQs).

Using GenAI in teaching and learning

As well as the obvious challenges, this new landscape of integrated GenAI provides a valuable opportunity not only to re-evaluate the way we teach (e.g. adapting to and integrating these new tools), but also to interrogate what we are teaching and why.

The following approach may be useful for those who want to re-design their assessments to align better with the world our graduates will be entering.

  1. Consider the knowledge and skills required for intended professions: In the relevant industries that your graduates will be entering, how might professionals currently be using GenAI to assist in their roles? Consequently, in your subject, how might students learn the skills relevant to this usage? How could this inform your learning outcomes and assessment tasks?
  2. Explore practical approaches to assessment design in the context of GenAI: Consider some of the options that mitigate students’ ability to rely on GenAI, including interactive oral assessment and assessing students on process as opposed to artefact (e.g. evidence such as a visual diary, or iterative drafts with cycles of revision).
  3. Reconsider your current assessment task(s) to identify areas needing adaptation or redesign: If your current assessment tasks are ones that GenAI can perform, consider the value of assessing your students on these sorts of tasks. Given that the relevant professional industries may be outsourcing certain tasks to GenAI, in some instances it may be worth re-evaluating whether these same tasks are still relevant assessments.

For teaching staff, the best practice takeaways are:

  • Iteratively improve your understanding of GenAI capabilities and consider relevant use cases in intended professions.
  • Reframe your approach from “How do I secure the assessment?” to “How do I evidence learning?”
  • Consider designing more assessments that assess the process of learning rather than an artefact, e.g. iterative drafts or multiple submission points with cycles of revision, interactive oral assessments, or observed performance or practical assessments.
  • For considerations regarding the use of GenAI for student feedback, you may want to consult the University’s Advice on using artificial intelligence tools for student assessment and feedback.

Resources

The University of Melbourne has developed policy and resources on how we should be navigating the use of GenAI, which can be accessed from the TLI GenAI Policy Guide (internal institutional link only).

In addition, the works listed under References below offer further reading.


References

Chung, J., Henderson, M., Pepperell, N., Slade, C., & Liang, Y. (2024). Student perspectives on AI in Higher Education: Student Survey. Student Perspectives on AI in Higher Education Project. https://doi.org/10.26180/27915930

Fawns, T., & Schuwirth, L. (2024). Rethinking the value proposition of assessment at a time of rapid development in generative artificial intelligence. Medical Education, 58(1), 14-16.

Lodge, J. M., Howard, S., & Broadbent, J. (2023). Assessment redesign for generative AI: A taxonomy of options and their viability.

Perkins, M., Furze, L., Roe, J., & MacVaugh, J. (2024). The Artificial Intelligence Assessment Scale (AIAS): A framework for ethical integration of generative AI in educational assessment. Journal of University Teaching and Learning Practice, 21(6).

Shibani, A. & Lim, L. (2024). Assumption: Students don’t know how to use AI critically. Future Campus, 19 November. Accessed 6 June 2025. https://futurecampus.com.au/2024/11/19/assumption-students-dont-know-how-to-use-ai-critically/

Zaphir, L., Lodge, J. M., Lisec, J., McGrath, D., & Khosravi, H. (2024). How critically can an AI think? A framework for evaluating the quality of thinking of generative artificial intelligence. arXiv preprint arXiv:2406.14769.