Abstract
Employers expect Higher Education graduates to possess both professional procedural knowledge and an understanding of theoretical principles. Laboratories play an important role in allowing students to acquire practice-based skills. A key question is how to assess the output of educational modules that include practical outcomes. If assessment is weighted towards the end of the module, students may act strategically, missing important concepts in the process. To balance student effort across the module, and to improve motivation for engagement, coursework might instead be assessed at regular intervals. Surveys show that students have concerns about poor feedback, so they would benefit from more regular feedback that could realistically inform continuous improvement. However, this would increase the demands on academic staff time during periods when they are committed mainly to teaching. To mitigate this, an application has been developed that automates some of the feedback generation process. Using an educational module on the subject of Cyber Security, the performance and student perceptions of the new application were compared with more conventional means of assessing student coursework. The results show that, using the new application, staff are able to provide detailed, meaningful feedback that students perceive as useful to their progress, and to do so in a more timely manner than alternative methods allow.
Original language | English |
---|---|
Title of host publication | 2020 IEEE Global Engineering Education Conference (EDUCON) 27-30 April 2020 |
Publisher | IEEE |
Pages | 419-428 |
ISBN (Electronic) | 978-1-7281-0930-5 |
DOIs | |
Publication status | Published - 25 Jun 2020 |