Using learning analytics to alleviate course and student support administrative load for large classes: a case study

Vanessa Honson (University of New South Wales, Sydney, Australia)
Thuy Vu (University of New South Wales, Sydney, Australia)
Tich Phuoc Tran (University of New South Wales, Sydney, Australia)
Walter Tejada Estay (University of New South Wales, Sydney, Australia)

Journal of Work-Applied Management

ISSN: 2205-2062

Article publication date: 23 February 2024

Issue publication date: 13 September 2024


Abstract

Purpose

Large class sizes are becoming the norm in higher education amid concerns about declining learning quality. To maintain the standard of learning and add value, one common strategy is for the course convenor to proactively monitor student engagement with learning activities against their assessment outcomes and to intervene in a timely manner. Learning analytics has been increasingly adopted to provide these insights into student engagement and performance. This case study explores how learning analytics can be used to meet the convenor’s requirements and help reduce administrative workload in a large health science class at the University of New South Wales.

Design/methodology/approach

This case-based study adopts an “action learning research approach” to assess ways of using learning analytics to reduce workload in the educator’s own context and to critically reflect on experiences for improvement. This approach emphasises reflexive methodology, where the educator constantly assesses the context, implements an intervention and reflects on the process for timely adjustments, improvements and future development.

Findings

The results highlight how the tool eased the teacher’s early “flagging” of students who were not active within the learning management system or who had performed poorly on assessment tasks. Coupled with the ability to send emails to the “flagged” students, this led to a more personal approach while reducing the number of steps normally required. An unanticipated outcome was the potential for additional time savings through improved scaffolding mechanisms, were the learning analytics customisable for individual courses.

Originality/value

The results provide further evidence of the benefits of learning analytics for assisting the educator in a growing blended learning environment. They also reveal the potential for learning analytics to be an effective adjunct in promoting personalised learning design.

Citation

Honson, V., Vu, T., Tran, T.P. and Tejada Estay, W. (2024), "Using learning analytics to alleviate course and student support administrative load for large classes: a case study", Journal of Work-Applied Management, Vol. 16 No. 2, pp. 303-315. https://doi.org/10.1108/JWAM-11-2023-0121

Publisher: Emerald Publishing Limited

Copyright © 2024, Vanessa Honson, Thuy Vu, Tich Phuoc Tran, Walter Tejada Estay

License

Published in Journal of Work-Applied Management. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

While the trend of higher education moving towards large class teaching is not new, it now comprises a more complex model that combines the traditional in-person class with a greater online component, whether hybrid, blended or flipped (Al Ansi and Al-Ansi, 2020; Kansal et al., 2021). Institutionally, these larger classes are particularly prevalent in undergraduate and foundational courses and programmes (Bikowski et al., 2022; Hornsby and Osman, 2014; Hubbard and Tallents, 2020; Kumar, 2013; Mulryan-Kyne, 2010). In the study by Hubbard and Tallents (2020), any class of 100 or more students was considered large by the interviewed academics. Challenges arise in providing personalised attention and engagement, with some studies suggesting a decline in student performance and satisfaction as large class sizes increase (Bikowski et al., 2022; Hornsby and Osman, 2014; Wadesango, 2021).

In order not to sacrifice quality (of learning) for quantity (of class size), it is essential for large class teaching to maintain meaningful interactions between teachers and students, promote student engagement and foster interactive learning (Bikowski et al., 2022; Hornsby and Osman, 2014; Mulryan-Kyne, 2010; Wadesango, 2021). Indeed, previous studies have found a significant correlation between student engagement and academic success in higher education contexts (Delfino, 2019; Mohamed et al., 2021; Rajabalee et al., 2020; Schnitzler et al., 2021).

This case study seeks to explore the possibilities of leveraging learning analytics (LA) and the associated insights to alleviate the administrative workload for the teaching staff in large classes. Drawing on the reflections of the first author serving as the course convenor, the case study also seeks to identify implications for the effectiveness and efficacy of using learning analytics in higher education. This exploration is particularly pertinent within the evolving landscape of blended learning, with a specific focus on addressing challenges in large class settings.

Literature review

Alongside curriculum design, educational technologies have proved effective in promoting interaction, engagement and active learning (Bikowski et al., 2022; Chronopoulos, 2018; Hornsby and Osman, 2014; Manpreet et al., 2022; Mulryan-Kyne, 2010; Wadesango, 2021). Yet despite the effective use of educational technologies, the learning analytics they provide have been under-utilised (Ifenthaler et al., 2019). Learning analytics encompasses a comprehensive process of collecting and analysing educational data to gain a deeper understanding of students' learning behaviours and performance (Siemens et al., 2011). Schmitz and Hanke (2023, p. 1133) note that “monitoring student engagement can help identify students who are on track for success and those who require additional help”, and that student engagement is central to improving teaching and learning quality, especially in online environments.

The necessary and sudden shift to remote teaching resulting from the pandemic has indelibly altered the landscape of teaching delivery (Tran et al., 2022; Valverde-Berrocoso et al., 2020). Whilst the gains may include flexibility of learning and delivery, the ability for teachers to interact with students and gauge their understanding of concepts has been reduced. Furthermore, the incidental opportunities arising in face-to-face classes to clear confusion and inspire active engagement in the subject matter are less likely in a synchronous online environment and minimal with asynchronous delivery. These reduced interactions between students and teachers make it more necessary to look towards LA to provide a more immediate means of supporting student progression through a course.

In recent years, the use of, and facility for, learning analytics in higher education has become more advanced, especially with greater integration of learning management systems and tools, together with rapid advances in software applications (Ferguson, 2012; Picciano, 2012; Tsai et al., 2020). The emergence of big data, the coinciding demand for online learning and the increased call for institutional performance metrics have fuelled the use and spread of LA across higher education institutions (Ferguson, 2012). Given the correlation between student engagement and academic success in higher education contexts, a significant use of LA has focused on student engagement (Delfino, 2019; Mohamed et al., 2021; Rajabalee et al., 2020; Schnitzler et al., 2021). Learning analytics provides insights into student engagement, which can then be used to identify disengaged students, reinforce engagement and, thereby, enhance performance and outcomes (Foster and Siddle, 2020; Francis et al., 2020; Summers et al., 2021).

According to Cents-Boonstra et al. (2021), there are three types of student engagement in digital environments: relational or social/emotional, behavioural and cognitive. Relational engagement refers to students' interactions in the classroom and community, including interacting with other students and staff via chats and forums. Behavioural engagement refers to their participation in the classroom and community, including asking and responding to questions, participating in class activities and attempting learning opportunities. Cognitive engagement refers to their investment in academic tasks and striving for success, including completing activities, accessing learning resources and feedback and following instructions. This case study focuses on digital academic engagement, namely academic behaviours and cognitive engagement in the online environment. These include accessing, clicking on and viewing sites and resources, attempting formative and summative assessment tasks and performance in those tasks.

The utilisation of learning data to detect students in need of support and to employ nudges that facilitate positive behavioural change has gained considerable attention in recent years. Gatare et al. (2023) explored the optimised use of nudges in an online learning platform through a co-design process involving undergraduate students. Categorised into confront, social and deceive, the nudges were analysed for effectiveness. Social and reinforce nudges were found to be most effective, while confront and fear nudges were discouraged. This insight helps tailor nudges based on reading rates, task engagement and goal achievement. On a similar note, Fouh et al. (2021) investigated the impact of email nudges on student procrastination in an introductory university course. While a single nudge showed mixed results, with increased late-day usage, its effects were short-lived. The study suggests the need for more frequent nudges and explores potential improvements through varied interventions and data analysis methods. Motz et al. (2021) presented a mobile app intervention that employs push notifications to reduce missed assignments. By addressing accidental neglect, the app significantly decreased missed assignments and improved course grades. This study emphasises the positive impact of timely, data-driven interventions in enhancing student outcomes. Finally, focusing on personalised nudges in healthcare education, Piotrkowicz et al. (2020) emphasised the challenges faced by healthcare professionals in undertaking learning activities. The study explores the relationships between user characteristics and the perceived effectiveness of nudges, highlighting the usefulness of observing student access to learning resources to aid the educator towards successful student outcomes. Collectively, this recent research underscores the importance of tailoring nudges to individual characteristics, engagement levels and learning contexts to maximise their impact on student success. Future research should further explore personalised nudging strategies and their scalability across diverse educational environments.

Given the correlation between student engagement and academic success, engagement in the early stages could be used to predict future performance and outcomes (Schmitz and Hanke, 2023; Summers et al., 2021). Summers et al. (2021) studied 1,602 first-year UK undergraduates and found that measures of engagement as early as weeks 3 and 4 of the semester could indicate students' future behaviour and outcomes. Hence, identifying disengagement early and intervening in a timely manner could lead to improved performance. In this regard, learning analytics is effective because it can provide insights into student engagement, particularly where, when and for whom disengagement occurs, and assist the teacher in personalising support for those who need it, when it is needed most, to improve their chances of academic success (Foster and Siddle, 2020; Francis et al., 2020; Summers et al., 2021). Indeed, LA, in the form of insight reports and dashboards, has been widely used across the globe in various contexts, from teaching and learning to support services, due to increasing evidence of its efficiency and efficacy. Examples include the UK (Foster and Siddle, 2020; Summers et al., 2021), Australia (Gribble and Huber, 2022; Tran et al., 2022), Europe and Latin America (Gutiérrez et al., 2020) and the US (Hefling, 2019). Despite this potential of LA, a range of issues remain relevant and warrant further research to increase our understanding and use of learning analytics insights. These include the usability of data, the technical capabilities of users and the associated workload in accessing, interpreting and making use of the data insights. This case study aims to explore these issues in the context of a collaboration between an LA project and a second-year course in the health science programme at the University of New South Wales (UNSW).

The learning analytics project

UNSW uses Moodle as the primary learning management system (LMS), along with some complementary platforms to support specific functions, such as Teams, ECHO360 or Inspera. Historically, Moodle has provided learning analytics in the form of downloadable logs recording unique users' actions on the course site, that is, logging in, the number of times content is viewed or accessed, forum posts created or responded to and more. However, the Moodle logs are provided as raw data and are hence clunky and complicated to work with.

With the rise of learning analytics and its potential to support the management of learning and courses at scale, a three-year strategic project was initiated at UNSW. The project, “Data Insights for Student Learning and Support”, aims to enhance student learning and experience by leveraging educational learning analytics. The project progressively brings together multiple streams of educational data, beginning in 2023 with Moodle, the Student Information System (SiMs) and the Staff Information System (PiMs). It then applies machine learning and artificial intelligence to these data to provide a regularly updated 360° view of students' learning and day-to-day progress.

During 2023, the project developed and tested an alpha version of the Academic Success Monitor (ASM), primarily for course convenors. The monitor uses historical data to build a machine learning model that tracks patterns of student engagement with digital learning opportunities, students' chances of success and their actual performance in the course. ASM helps to identify, early on, students who might be struggling and/or left behind, alerts the course convenor and suggests support services that might benefit those students. The second-year course discussed in this case study is among the 19 courses included in the testing of the monitor in Term 3, 2023 (UNSW operates three conventional terms in an academic year).

Description of the course

Students entering this second-year course come predominantly with an interest in science and a desire to pursue a career in the health sciences. The course usually has 110–120 students, a large size for a clinical practical class, and the course convenor is also the sole teacher in the course. Prior to this course, students have undertaken foundational courses in vision sciences and general STEM courses such as cell biology, optics, and the anatomy and physiology of the body and eye. In the term before this course, students take an optical dispensing course, which begins the bridging process of linking geometric optics and visual optics to the correction of visual needs with optical appliances. This particular Term-3, Year-2 course continues the stream, building links between cell biology and the anatomy and physiology of the eye on the one hand, and the visual needs of individuals with deficits due to refractive errors on the other. The whole process is complex and requires students to integrate their learning from previous courses into a coherent clinical understanding of the expected impacts on vision.

The course consists of a suite of activities to scaffold the process, including pre-recorded lectures followed by in-person practicals and weekly live online review sessions with interactive polls and discussion of questions students may have. It also utilises formative assessments to drive student learning, with student-led quizzes, case-based forum discussions and encouragement to access a link at the end of each pre-recorded lecture to write a learning point and a question.

In the student-led quizzes, students are required to generate a minimum of two questions per term on two lecture topics selected from a given sheet and to answer their peers' questions. To motivate students, a small number of these questions are fed into the final exams. In the case-based discussions, a forum is created in which a weekly case scenario is presented to guide students in applying their learning to authentic examples and building their clinical thinking skills. The case scenario is drawn from the previous week’s pre-recorded lecture content. Students are asked to address the points in the case scenario, and only after posting are they able to view other students’ responses during the week. At the end of each week, the forum is locked before the case discussion in the live online session at the beginning of the following week. This weekly review session also covers the learning points and questions captured directly after students have listened to the lectures in their own time. Learning is complex, and the methods by which students learn are diverse. A major aim of offering varied participatory activities for low-grade assessments was to address the diversity of learning practices and to guide self-directed and active learning. Another goal of the weekly case studies, included in the formative assessment, was to reinforce the need to keep up to date with the weekly content – a challenge in an asynchronous learning environment – and to initiate links between theory and clinical concepts before students attended the practicals.

Having these formative assessment tasks requires a certain level of monitoring to gauge students' engagement with the learning environment and to check this against their preparedness to meet the learning outcomes. The existing method of tracking student activity – viewing of recorded content, participation in HTML5 Package (H5P) activities, completion of forum posts and attendance at the live session – required a certain amount of manual work from the convenor. It was also critical to know early which students had not submitted their assignments by the due date so as to offer them timely support. With these learning analytics goals, the first author was eager to volunteer to test the ASM in her course.

Learning analytics applications in the course

Prior to the availability of ASM, the first author used Moodle logs in combination with the Personalised Learning Designer (PLD) to monitor student engagement and send nudges to students who met her criteria for low engagement. Moodle logs capture all student actions within the platform, categorised as either a “view” – when a student accesses a resource to read – or a “post” – when a student writes something on the platform. For a course with over 100 students, the Moodle log expands to around 5,000 entries per week. Processing this amount of data to track individual students' levels of activity takes considerable time and effort.
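To illustrate the scale of this manual effort, the sketch below shows the kind of log processing involved. It is a minimal example only: the file names and column labels ("Time", "User full name", "Event name", "Student name") are illustrative assumptions, as real Moodle exports vary by version, and the view/post split follows the categorisation described above.

import pandas as pd

# Hypothetical weekly log export and class roster; real column names vary.
log = pd.read_csv("moodle_log_week.csv", parse_dates=["Time"])
roster = pd.read_csv("class_roster.csv")  # one row per enrolled student

# Classify each logged action as a "view" or a "post", as in the course's scheme.
log["action"] = log["Event name"].str.contains("viewed", case=False)
log["action"] = log["action"].map({True: "view", False: "post"})

# Count views and posts per student for the week (thousands of rows collapse here).
counts = log.pivot_table(index="User full name", columns="action",
                         aggfunc="size", fill_value=0)

# Join against the roster so students with *no* log entries are not silently missed.
report = roster.merge(counts, how="left",
                      left_on="Student name", right_index=True).fillna(0)
for col in ("view", "post"):
    if col not in report.columns:
        report[col] = 0

# Candidates for a check-in email: no views and no posts this week.
inactive = report[(report["view"] == 0) & (report["post"] == 0)]
print(inactive["Student name"])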

PLD is an add-on tool for Moodle in which the course convenor can set up conditions on certain activities and automatically send emails to students who meet those conditions, for example, not logging in to the Moodle site for 10 days, not submitting an assignment or achieving less than 50% on a certain assessment task. In this case, the first author set up the PLD to alert students when they had not logged on for a set number of days. The PLD requires time to set up, as it does not roll over with the course. Moreover, within the current PLD setup, there is a strong possibility that repeated emails will be sent to students who continue to meet the not-logging-on condition, which can produce counter-effects such as distraction from other critical learning activities, a tendency for students to ignore repeated emails, and fatigue.
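A minimal sketch of this kind of rule, with a simple guard against the repeated-email problem, follows. The ten-day threshold comes from the example above, while the fourteen-day cooldown and all names are illustrative assumptions; PLD itself is configured through Moodle's interface rather than in code.

from datetime import date, timedelta

NUDGE_COOLDOWN = timedelta(days=14)  # assumed: do not re-nudge within two weeks

def should_nudge(last_login: date, last_nudged: date | None, today: date) -> bool:
    """Nudge a student who has not logged in for 10 days, unless they were
    nudged recently (avoiding the consecutive-email problem noted above)."""
    inactive = (today - last_login) >= timedelta(days=10)
    cooled_down = last_nudged is None or (today - last_nudged) >= NUDGE_COOLDOWN
    return inactive and cooled_down

# Example: inactive for 12 days but nudged 3 days ago -> suppressed.
print(should_nudge(date(2023, 9, 1), date(2023, 9, 10), date(2023, 9, 13)))  # False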

Given these constraints of Moodle logs and PLD, the first author volunteered to trial the alpha version of ASM in Term 3, 2023. This version utilised data from Moodle and the student and staff information systems, including students’ previous academic performance, course historical data and students' digital engagement in the current course. ASM provides a range of insights, including predicted chances of passing, flags for those at high risk of failing, new enrolments, no Moodle log-in, low Moodle activity, no submission and low marks in the course assessments to date. ASM also allows course convenors to send bulk emails to the list of students under each insight. Refer to Figures 1–3 for illustrations of ASM.
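The project has not published its model, but the sketch below illustrates the general shape of such a prediction pipeline, with logistic regression standing in for ASM's machine-learning model. The file names and feature names are illustrative assumptions only.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical historical training data: one row per past student, with a
# binary "passed" outcome alongside engagement and background features.
hist = pd.read_csv("historical_cohorts.csv")
features = ["weekly_logins", "content_views", "forum_posts",
            "prior_average_mark", "is_repeating"]

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(hist[features], hist["passed"])

# Score the current cohort and surface the students with the lowest
# predicted chance of passing, analogous to ASM's risk flags.
current = pd.read_csv("current_cohort.csv")
current["pass_probability"] = model.predict_proba(current[features])[:, 1]
at_risk = current.sort_values("pass_probability").head(10)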

Method

In this case study, we seek to examine the course convenor’s practices of, experience in and insights from using ASM to answer three questions underpinning the essential issues of learning analytics, namely:

  1. Does ASM help to achieve my learning analytics goals?

  2. Does ASM require technical expertise?

  3. Does ASM help to reduce workload for me?

The case study also seeks to answer an additional question about the implication of the tool for other large-class teaching courses and contexts, as follows:

  4. Does ASM meet the course convenor’s goals in applying technology to manage a large course at scale?

This case study takes an “action learning research” approach in reporting the author’s own experience of how usage of the ASM may aid a proactive approach to maximising student engagement, whilst assessing the associated workload implications. Action learning and action research methodologies are interwoven through the iterative process of engaging in the ASM pilot study. Reflection by the course convenor influences subsequent actions, and these actions, in turn, impact the learning experience (Attia and Edge, 2017; Knoblauch, 2021; McNiff, 2013; Zuber-Skerritt, 2001, 2002). Action learning places emphasis on a metaphorical “revolving door” where solutions continually evolve and no single solution fits all contexts (Zuber-Skerritt, 2001, 2002). These approaches centre on a reflexive methodology to be effective (Attia and Edge, 2017; Knoblauch, 2021).

This case study did not go through the ethical clearance processes at UNSW, as it reports on the first author’s own experiences and her efforts to support students and enhance their academic performance in the course. It aims to distil these first-hand experiences into insights and implications for applying technology, in this case learning analytics, to manage courses at scale. To support the insights, the case study uses only aggregated and non-identifiable data within the course from readily available sources, such as myExperience data and course learning analytics. How the insights have the potential to reduce the author’s workload is reported, but not the data itself. Likewise, Figures 1–3 represent the ASM view that a course convenor at the institution would have access to, but the de-identified aggregate data are from another course to preserve the anonymity of information that might be recognisable from this course.

Findings and discussion

The overall experience revealed that the ASM not only assisted in achieving some of my predetermined goals but also provided unexpected benefits. These unforeseen advantages prompted me to contemplate further ways of leveraging learning analytics to support and scaffold student learning. ASM flags students who have not logged into Moodle for any 7-day period and students with little or no Moodle activity. In the most general sense, ASM tells me how active my students are in the online environment.

Administrative time during the term was spent actively reviewing Moodle activity reports manually, such as logs of views of pre-recorded content, completion of quizzes where offered and weekly case study participation. The need to search for further information arose when absenteeism was noted or group project work became a concern – these ad hoc searches through Moodle reports tended to add administrative load during the course session. Additionally, it was necessary to validate the relevant information obtained from Moodle reports before sending check-in emails to students. This process was undertaken when the convenor believed that extending offers of support or taking specific actions would nudge the student into action.

Before ASM, the author needed to set up the PLD with certain rules to alert students when they had not logged on for a set number of days. PLD is a Moodle plugin and is not directly connected to ASM, relying on data exclusively accessible within the Moodle platform. It is mentioned here because it has value as a time-saving tool for academics. However, the PLD still involves a learning curve at the start of each term, necessitating administration and an understanding of its limitations, whereby students may automatically receive the same email consecutively, negating the personal tone of the messages. The in-built email system within the ASM removed the need to set up the PLD. An important benefit offered by the ASM is the ability to send emails to students who were flagged. Here, the integration of ASM into my course has revolutionised course administration by seamlessly combining analytical insights and interventions into a unified and mutually supportive interface. This represents a departure from the previous scenario, where teaching staff had to engage in data analytics and intervention methods independently. ASM not only furnishes essential indicators for decision-making about students requiring attention but also provides a prompt and intuitive mechanism for taking the necessary actions. This feature shows much potential for saving time and providing greater reach to “at-risk” students. In total, the utility of ASM helped to save up to 40% of administrative time.

Besides students' levels of activity on the Moodle site, ASM provides a predicted chance of passing the course for individual students, which ties in other risk factors through machine learning modelling, such as repeating the course and low marks in course assessments. This feature gave me a heads-up to ensure I had not missed any students who might benefit from additional support to enhance their academic success. It also saved me time going through records to understand the cohort and who might need additional support at the start of the course, in order to plan for student support needs.

In the first few weeks of the term, regular updates of student enrolments provided by ASM helped me to ensure that all students were well prepared for the course and set up for success. As I see it, this feature could be particularly useful in large classes with group-work activities to ensure the correct configuration of groups, especially as the convenor was alerted to late enrolments in the ASM. Looking ahead, an improvement to the ASM would be to enable convenors to weight the various analytics captured according to the convenor’s assessments and teaching materials. Whilst counting log-ins and content views is beneficial, students' learning patterns do vary, and students requiring support could be falsely identified or missed if reporting is based primarily on these counts. For example, capturing at-risk students early may require greater weighting of formative assessments, while monitoring during the term may shift emphasis to take the summative assessment data into account, as well as Moodle activity.

ASM requires not only less effort but also little technical expertise to use. ASM is embedded in the Moodle course, with key insights supporting the course convenor's duties. These insights are presented in plain English, telling you how many students may need urgent support, how many may be falling behind, which students are tracking well to pass the course, how many new enrolments there are, and who has not logged into the Moodle site or has had low or no Moodle activity (see Figure 1). With one click, you can view the list of all students in these categories and see whether any of them also have other academic risks or concerns, that is, repeating the course, not submitting assignments or low assessment marks (see Figure 2). The in-built tool allowing personalised emails to be sent directly to flagged students is shown in Figure 3. All these functions have resulted in time efficiencies in managing the course and student support needs.

It is clear from the findings that ASM met some of the goals in tracking student digital engagement, required no technical expertise to use and alleviated administrative workload for the course convenor. To assess whether ASM can provide a technological solution for managing large classes at scale, we now examine the performance of the tool itself.

In my course, ASM was able to predict all students who eventually failed – a recall rate of 100%. Tested on historical data, the ASM recall rate is around 80% of all students who fail. In terms of early detection, the ASM picks up 74% of all students who fail as early as week 3 or 4. This timeline is early enough to support students and make a difference (Schmitz and Hanke, 2023; Summers et al., 2021). Additionally, interventions at this stage often involve low investment, such as bulk nudging emails, reminders of assessments, referrals to support services or reinforcement in tutorial classes. These early interventions could help to direct students towards positive learning behaviours, which could eventually improve student performance in assessments (Gatare et al., 2023; Foster and Siddle, 2020; Francis et al., 2020; Motz et al., 2021; Summers et al., 2021). In my course, 80% of students who had a high risk of failure pulled through and went on to pass. In short, utilising ASM, with its high recall rate, its ability to flag students at risk at an early stage and its function for implementing low-risk interventions, could benefit large classes in managing student engagement and success at scale. Indeed, 70% of my fellow testers found ASM very helpful in this regard, a sentiment consistent with my own experience.
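For reference, the recall figures quoted above reduce to a simple computation over one labelled cohort, sketched below with hypothetical column names and an assumed pass mark of 50; the flag columns stand in for ASM's risk flags.

import pandas as pd

df = pd.read_csv("cohort_outcomes.csv")  # hypothetical: one row per student

failed = df["final_mark"] < 50                        # assumed pass mark of 50
flagged_ever = df["flagged_ever"].astype(bool)        # flagged at any point
flagged_early = df["flagged_by_week4"].astype(bool)   # flagged by week 3 or 4

# Recall = proportion of eventual failures that the tool flagged.
recall = (failed & flagged_ever).sum() / failed.sum()
early_recall = (failed & flagged_early).sum() / failed.sum()
print(f"overall recall: {recall:.0%}, detected by week 4: {early_recall:.0%}")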

Current limitations

Against these benefits, there is room for improvement and for better meeting the needs of course convenors. For instance, in my course, I wanted to monitor student engagement in specific activities that were vital to student learning. Receiving alerts about non- or low-engaged students for these specific activities would enable me to intervene in a timely manner, with specific instructions for improving engagement. Additionally, ASM relies primarily on Moodle data for measuring engagement. Like all LMSs, Moodle measures actions at entry points, that is, when students make a click or a post, but not the duration of those actions; hence, the data may fail to capture the full picture of their learning. To address this limitation, the ASM project plans to incorporate additional data sources, including H5P and Echo360. H5P serves as an interactive learning tool that requires student interaction to progress to the next stage, effectively indicating the length of engagement. Similarly, Echo360, an advanced video-capture platform, comes with its own engagement measures.

There are instances in my course where students who should have been flagged were not, or vice versa. This could be because student success in my course depends on engagement in a selected range of activities, or at particular points in time, but not all of them. Being able to add weightings to the algorithm that calculates the pass probability, according to the specifics of the course, might have improved the accuracy of the prediction. Incorporating a variety of educational data sources, as discussed earlier, might also have provided a fuller picture of student learning, hence increasing the accuracy of the prediction as well as the provision of personalised support to individual students.
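One way such a course-specific weighting could look is sketched below: the convenor assigns weights to the activities that matter most in the course, and a weighted engagement score supplements raw click counts. The weights, threshold and column names are illustrative assumptions, not existing ASM functionality.

import pandas as pd

# Convenor-chosen weights reflecting this course's design (illustrative).
WEIGHTS = {
    "case_forum_posts": 3.0,   # formative case discussions matter most here
    "quiz_attempts": 2.0,
    "lecture_views": 1.0,
    "logins": 0.5,
}

engagement = pd.read_csv("engagement_counts.csv", index_col="student_id")
score = sum(w * engagement[col] for col, w in WEIGHTS.items())

# Flag, say, the bottom 15% of weighted engagement for follow-up.
at_risk = score[score < score.quantile(0.15)].sort_values()
print(at_risk)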

Finally, while the emailing function is a time saver, it could be advanced further if something similar to a customer relationship manager (CRM) were integrated to manage all communications to students as well as any follow-ups. It would also save the course convenor from filtering students to ensure that a particular student does not receive multiple messages when they belong to multiple risk categories.
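A minimal sketch of that de-duplication step follows: a student's multiple risk flags are collapsed into a single message. The student identifiers, category names and the printed stand-in for an email call are all hypothetical.

from collections import defaultdict

# Hypothetical flags as (student, risk category) pairs drawn from the insights.
flags = [("z1234567", "no_login"), ("z1234567", "low_marks"),
         ("z7654321", "no_submission")]

by_student: dict[str, list[str]] = defaultdict(list)
for student, category in flags:
    by_student[student].append(category)

for student, categories in by_student.items():
    # One email per student, covering every applicable concern.
    body = "We noticed the following and would like to help: " + ", ".join(categories)
    print(f"to={student}: {body}")  # stand-in for an actual email-sending call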

Conclusion

In summary, this article has outlined the utilisation of learning analytics in teaching and administering large classes and its effectiveness in supporting teaching staff. In my course, the implementation of learning analytics resulted in a significant reduction in the time spent identifying and reaching out to students in need of support. Specifically, automatic updates from ASM minimised the need for manual review of activity and streamlined the process of sending personalised emails to students identified as requiring support. Additionally, the machine-learning model behind ASM facilitated the extraction of vital patterns across multiple dimensions, encompassing engagement, past student performance and historical course performance – insights that would otherwise be time-consuming to obtain given the large class size. This case study highlights the capacity of educators to use learning analytics as a tool to tailor support and timely prompts to students, which is more efficient than a manual approach.

While my personal experience reflects the time savings and convenience that the ASM brought me, a recent survey of all pilot testers revealed a unanimously positive perception. Moving forward, it is crucial to quantify the perceived usefulness of these benefits. This could be achieved by incorporating into the project’s feedback survey a set of more specific questions on highly weighted tasks and weeks, at-risk detection and automated nudging.

Figures

Figure 1. Learning analytics insights provided by ASM and as seen by the course convenor

Figure 2. List of students in a particular category with other academic risks in display

Figure 3. ASM emailing/nudging function

References

Al Ansi, A.M. and Al-Ansi, A. (2020), “Future of education post covid-19 pandemic: reviewing changes in learning environments and latest trends”, Solid State Technology, Vol. 63 No. 6, pp. 201584-201600.

Attia, M. and Edge, J. (2017), “Be (com) ing a reflexive researcher: a developmental approach to research methodology”, Open Review of Educational Research, Vol. 4 No. 1, pp. 33-45, doi: 10.1080/23265507.2017.1300068.

Bikowski, D., Park, H.K. and Tytko, T. (2022), “Teaching large-enrollment online language courses: faculty perspectives and an emerging curricular model”, System, Vol. 105, 102711, doi: 10.1016/j.system.2021.102711.

Cents-Boonstra, M., Lichtwarck-Aschoff, A., Denessen, E., Aelterman, N. and Haerens, L. (2021), “Fostering student engagement with motivating teaching: an observation study of teacher and student behaviours”, Research Papers in Education, Vol. 36 No. 6, pp. 754-779, doi: 10.1080/02671522.2020.1767184.

Chronopoulos, D. (2018), “Delivering quality along with quantity: the challenge of teaching a large and heterogeneous engineering class”, International Journal of Mechanical Engineering Education, Vol. 46 No. 4, pp. 331-344, doi: 10.1177/0306419018764118.

Delfino, A.P. (2019), “Student engagement and academic performance of students of partido state university”, Asian Journal of University Education, Vol. 15 No. 1, pp. 42-55, doi: 10.24191/ajue.v15i3.05.

Ferguson, R. (2012), “Learning analytics: drivers, developments and challenges”, International Journal of Technology Enhanced Learning, Vol. 4 Nos 5-6, pp. 304-317, doi: 10.1504/ijtel.2012.051816.

Foster, E. and Siddle, R. (2020), “The effectiveness of learning analytics for identifying at-risk students in higher education”, Assessment and Evaluation in Higher Education, Vol. 45 No. 6, pp. 842-854, doi: 10.1080/02602938.2019.1682118.

Fouh, E., Lee, W. and Baker, R.S. (2021), “Nudging students to reduce procrastination in office hours and forums”, 2021 25th International Conference Information Visualisation (IV), Sydney, Australia, pp. 248-254, doi: 10.1109/IV53921.2021.00047.

Francis, P., Broughan, C., Foster, C. and Wilson, C. (2020), “Thinking critically about learning analytics, student outcomes, and equity of attainment”, Assessment and Evaluation in Higher Education, Vol. 45 No. 6, pp. 811-821, doi: 10.1080/02602938.2019.1691975.

Gatare, K., Yang, Y.Y., Majumdar, R. and Ogata, H. (2023), “Co-designing nudges for self directed learning within GOAL system”, 2023 IEEE International Conference on Advanced Learning Technologies (ICALT), Orem, UT, USA, pp. 86-88, doi: 10.1109/ICALT58122.2023.00031.

Gribble, L.C. and Huber, E. (2022), “In the business of connecting: nudging students”, in Wilson, S., Arthars, N., Wardak, D., Yeoman, P., Kalman, E. and Liu, D.Y.T. (Eds), Reconnecting Relationships through Technology. Proceedings of the 39th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education, ASCILITE 2022 in Sydney: e22222, doi: 10.14742/apubs.2022.222.

Gutiérrez, F., Seipp, K., Ochoa, X., Chiluiza, K., De Laet, T. and Verbert, K. (2020), “LADA: a learning analytics dashboard for academic advising”, Computers in Human Behavior, Vol. 107, 105826, doi: 10.1016/j.chb.2018.12.004.

Hefling, K. (2019), “The ‘Moneyball’ solution for higher education”, Politico, available at: https://www.politico.com/agenda/story/2019/01/16/tracking-student-data-graduation-000868

Hornsby, D.J. and Osman, R. (2014), “Massification in higher education: large classes and student learning”, Higher Education, Vol. 67 No. 6, pp. 711-719, doi: 10.1007/s10734-014-9733-1.

Hubbard, K. and Tallents, L. (2020), “Challenging, exciting, impersonal, nervous: academic experiences of large class teaching”, Journal of Perspectives in Applied Academic Practice, Vol. 8 No. 1, pp. 59-73, doi: 10.14297/jpaap.v8i1.405.

Ifenthaler, D., Mah, D.K. and Yau, J.Y.K. (2019), “Utilising learning analytics for study success: reflections on current empirical findings”, in Ifenthaler, D., Mah, D.K. and Yau, J.Y.K. (Eds), Utilizing Learning Analytics to Support Study Success, Springer, Cham, doi: 10.1007/978-3-319-64792-0_2.

Kansal, A.K., Gautam, J., Chintalapudi, N., Jain, S. and Battineni, G. (2021), “Google trend analysis and paradigm shift of online education platforms during the COVID-19 pandemic”, Infectious Disease Reports, Vol. 13 No. 2, pp. 418-428, doi: 10.3390/idr13020040.

Knoblauch, H. (2021), “Reflexive methodology and the empirical theory of science”, Historical Social Research/Historische Sozialforschung, Vol. 46 No. 2, pp. 59-79.

Kumar, S. (2013), “Rack 'em, pack 'em and stack 'em: challenges and opportunities in teaching large classes in higher education”, F1000Research, Vol. 2, p. 42, doi: 10.12688/f1000research.2-42.v1.

Manpreet, K., Soumen, M., Himani, A. and Manasi, B. (2022), “Flipped classroom (FCR) as an effective teaching-learning module for a large classroom: a mixed-method approach”, Cureus, Vol. 14 No. 8, e28173, doi: 10.7759/cureus.28173.

McNiff, J. (2013), Action Research: Principles and Practice, 3rd ed., Routledge, New York.

Mohamed, M.H., Bayoumy and Alsayed, B. (2021), “Investigating relationship of perceived learning engagement, motivation, and academic performance among nursing students: a multisite study”, Advances in Medical Education and Practice, Vol. 12, pp. 351-369, doi: 10.2147/amep.s272745.

Motz, B.A., Mallon, M.G. and Quick, J.D. (2021), “Automated educative nudges to reduce missed assignments in college”, IEEE Transactions on Learning Technologies, Vol. 14 No. 2, pp. 189-200, doi: 10.1109/TLT.2021.3064613.

Mulryan-Kyne, C. (2010), “Teaching large classes at college and university level: challenges and opportunities”, Teaching in Higher Education, Vol. 15 No. 2, pp. 175-185, doi: 10.1080/13562511003620001.

Picciano, A.G. (2012), “The evolution of big data and learning analytics in American higher education”, Journal of Asynchronous Learning Networks, Vol. 16 No. 3, pp. 9-20, doi: 10.24059/olj.v16i3.267.

Piotrkowicz, A., Dimitrova, V., Hallam, J. and Price, R. (2020), “Towards personalisation for learner motivation in healthcare: a study on using learner characteristics to personalise nudges in an e-learning context”, Adjunct Publication of the 28th ACM Conference on User Modeling, Adaptation and Personalization (UMAP '20 Adjunct), Association for Computing Machinery, New York, NY, pp. 287-292, doi: 10.1145/3386392.3399290.

Rajabalee, B.Y., Santally, M.I. and Rennie, F. (2020), “A study of the relationship between students' engagement and their academic performances in an eLearning environment”, E-Learning and Digital Media, Vol. 17 No. 1, pp. 1-20, doi: 10.1177/2042753019882567.

Schmitz, B. and Hanke, K. (2023), “Engage me: learners' expectancies and teachers' efforts in designing effective online classes”, Journal of Computer Assisted Learning, Vol. 39, pp. 1132-1140, doi: 10.1111/jcal.12636.

Schnitzler, K., Holzberger, D. and Seidel, T. (2021), “All better than being disengaged: student engagement patterns and their relations to academic self-concept and achievement”, European Journal of Psychology of Education, Vol. 36 No. 3, pp. 627-652, doi: 10.1007/s10212-020-00500-6.

Siemens, G., Long, P., Gašević, D. and Conole, G. (2011), “Call for papers”, 1st International Conference Learning Analytics and Knowledge (LAK 2011), available at: https://tekri.athabascau.ca/analytics/call-papers

Summers, R.J., Higson, H.E. and Moores, E. (2021), “Measures of engagement in the first three weeks of higher education predict subsequent activity and attainment in first year undergraduate students: a UK case study”, Assessment and Evaluation in Higher Education, Vol. 46 No. 5, pp. 821-836, doi: 10.1080/02602938.2020.1822282.

Tran, T.P., Jan, T. and Kew, S.N. (2022), “Learning analytics for improved course delivery: applications and techniques”, Proceedings of the 6th International Conference on Digital Technology in Education, pp. 100-106.

Tsai, Y.S., Rates, D., Moreno-Marcos, P.M., Muñoz-Merino, P.J., Jivet, I., Scheffel, M., Drachsler, H., Kloos, C.D. and Gašević, D. (2020), “Learning analytics in European higher education—trends and barriers”, Computers and Education, Vol. 155, 103933, doi: 10.1016/j.compedu.2020.103933.

Valverde-Berrocoso, J., Garrido-Arroyo, M.D.C., Burgos-Videla, C. and Morales-Cevallos, M.B. (2020), “Trends in educational research about e-learning: a systematic literature review (2009-2018)”, Sustainability, Vol. 12 No. 12, p. 5153, doi: 10.3390/su12125153.

Wadesango, N. (2021), “Challenges of teaching large classes”, African Perspectives of Research in Teaching and Learning, Vol. 5 No. 2, pp. 127-135.

Zuber-Skerritt, O. (2001), “Action learning and action research: paradigm, praxis and programs”, Effective Change Management Through Action Research and Action Learning: Concepts, Perspectives, Processes and Applications, Vol. 1 No. 20, pp. 1-27.

Zuber‐Skerritt, O. (2002), “The concept of action learning”, The Learning Organization, Vol. 9 No. 3, pp. 114-124, doi: 10.1108/09696470210428831.

Acknowledgements

Whilst the case study received no funding, the learning analytics tool reported in the case study was provided by the PVCESE Innovation Team at the University of New South Wales. The tool is part of a three-year strategic project aimed at leveraging educational learning analytics to enhance learning quality and the staff and student experience.

Corresponding author

Vanessa Honson can be contacted at: v.honson@unsw.edu.au
