DONNELLY, Alan and HEATON, Caroline (2021). A Process and Impact Evaluation of a University’s Module Evaluation Questionnaire (MEQ). [STEER Evaluation Collection]
Documents
Microsoft PowerPoint
Donnelly_2021_A_Process_and_Impact_Evaluation.pptx - Published Version
Available under License Creative Commons Attribution.
Abstract
In 2020/21, Sheffield Hallam University’s Leadership Team commissioned a review of the Module Evaluation Questionnaire (MEQ) to find out what difference it was making and how its data were being used. A range of evidence was gathered and drawn upon for this mixed-methods evaluation, including: online focus groups and interviews with 60 module leaders across three colleges; an online reflective activity with teaching and learning (T&L) portfolio leads from 10 departments; interviews with 19 student course representatives; monitoring data and other evidence; and findings from literature and research across the sector. A steady fall in response rates in recent years, not just when the delivery of MEQs changed from paper to online, had limited the use of the data and its ability to fulfil its intended purposes in relation to quality assurance and quality enhancement. The standardised design of the MEQ was seen as restricting its potential usefulness: many module leaders, some T&L portfolio leads and, to a lesser extent, students stated that the MEQ should better reflect the complexity of their modules. The literature highlights a range of practices that help promote questionnaires to students, but the findings of this evaluation suggested that there was a limit to how much effect these practices had, and some were perceived to be harder to implement when teaching and learning was delivered predominantly online. According to module leaders, the earlier switch to online delivery of MEQs had made it harder to personalise the questionnaire and to explain its purpose and value to students. Some student representatives were motivated to take part in the MEQ process, but others were reluctant to engage. There is a need to strengthen evaluation capacity building at all levels; a key component of this is providing more guidance on interpreting and analysing students’ responses, particularly when response rates are low. Recommendations are provided to address the specific issues identified by the evaluation.