Governance and management
The Student Learning Surveys (SLS) are governed by policy and procedures approved by Academic Board, and supported by operational guidelines available for download.
Use of the SLS at the University of Melbourne is approved by Academic Board. The Teaching and Learning Quality Assurance Committee (TALQAC) provides recommendations to the Board on policy and procedures. The following areas of the University play a role in managing and administering the mid- and end-of-subject surveys.
Learning Environments provides service management, operations and coordination for the centrally supported educational technologies that may be used to deliver the mid-subject surveys (MSS), and for the Explorance Blue platform used for the end-of-subject surveys (ESS).
University Decision Support (previously Business Intelligence and Reporting)
University Decision Support (UDS) administers centralised reporting and results.
Schools and faculties
Each faculty or school administers the surveys and oversees the distribution and use of reporting within its area.
TALQAC: Academic Board’s Teaching and Learning Quality Assurance Committee
As an important instrument of the University's subject quality assurance protocols, the SLS are governed through the Academic Board's standing Teaching and Learning Quality Assurance Committee (TALQAC).
Staff involved in administering the SLS, or in using survey results, are expected to be familiar with the following documents.
Policy and procedures
MPF1327: Courses, Subjects, Awards and Programs Policy [Melbourne Policy Library]
Rules and guidelines of the Academic Board
Please note that the nomenclature Subject Experience Survey (SES) currently remains in the policy and on the Academic Board website, pending updates to reflect the naming changes agreed at the September 2021 meeting of the Board.
Per clause 5.54 of the above policy, the Academic Secretary publishes the rules for implementation and use of the SES on the Board’s website.
Those rules and guidelines currently cover encouraging higher rates of student engagement, providing feedback to students, survey timing and duration, how and what reporting is provided, and how reports may be disseminated.
- Guidelines on Interpretation of SES data
- Quality of teaching and learning - Course reviews
- Quality of teaching and learning - Subject reviews
Additional SES resources are available from the Academic Board webpage.
History of Subject Experience Surveys at the University
2019 to 2021
The Deputy Vice-Chancellor Academic commissioned a paper on the Subject Experience Survey (SES) to explore possible future options for the SES methodology. One impetus behind this request was the unexpected discontinuity in the use of the existing SES methodology in 2020 and Semester 1 2021 created by the pandemic, which presented a window for reconsideration and renewal of the methodology.
The Working Group, convened by Chancellery Academic, considered ways in which the SES methodology might be modified to improve confidence in the findings, improve their usefulness and relevance, and enhance the general experience of the survey for both students and staff.
The Working Group drew on the report on the SES prepared for TALQAC in 2019 and the experiences of the use of modified approaches to subject surveys in 2020.
In considering potential changes in the SES methodology, the Working Group was mindful of the:
- Need to provide valid and reliable information for the survey’s purposes (described below)
- Importance of students being offered an opportunity to provide their personal feedback
- Survey demands on students
- Administrative feasibility and costs
- Value for subject coordinators in rapid turnaround of survey findings
- Importance of ease of access to and ease of interpretation of the survey findings.
The SES served two primary purposes:
- Feedback for subject coordinators to allow ongoing improvement in teaching and learning.
- Information for quality assurance.
Both of these purposes continue to be valid. In particular, academic staff will continue to use SES findings as one form of evidence of the quality of teaching and learning under their leadership for the purposes of annual review, confirmation and promotion.
There is a further, secondary purpose for the SES: the questionnaire items should represent and signal the characteristics of effective teaching and learning that are sufficiently common and relevant across all subject environments within the University. In this sense, the questionnaire items, though relatively few, are an explicit statement of what is known about the characteristics of an effective learning environment and what is most valued within the University.
Endorsement from Academic Board
The resulting paper from this work was endorsed by the Academic Board on 2 September 2021 (Academic Board Meeting 6/2021, Tab C.4.a).
The DVC Academic has also agreed to a change of service name at this time, to avoid confusion between the acronym of the QILT Student Experience Survey (SES) and that of the University's Subject Experience Survey (SES). The Student Learning Surveys (SLS) service encompasses delivery and support from Learning Environments for both the mid-subject survey and the end-of-subject survey, as outlined in the paper approved by Academic Board.
2015 to 2018
In 2016 and 2017, TALQAC formed two working groups to examine (i) how to provide effective feedback and (ii) the reasons for students' disengagement with their courses and with the University. The Improving Student Engagement Working Group was established in response to committee members' concerns about findings from the 2015 SES.
In 2011, the SES replaced the former Quality of Teaching (QoT) survey. Although similar to the QoT, the SES placed a stronger focus on students' learning experiences. The move to online survey delivery also improved the survey's efficiency, and made it easier for students to provide more detailed subject feedback.