The purpose of this paper is to present a systematic review of the mounting research work on learning analytics.
This study collects and summarizes information on the use of learning analytics. It identifies how learning analytics has been used in the higher education sector, and the expected benefits for higher education institutions. Empirical research and case studies on learning analytics were collected, and the details of the studies were categorized, including their objectives, approaches, and major outcomes.
The results show the benefits of learning analytics, which help institutions to utilize available data effectively in decision making. Learning analytics can facilitate evaluation of the effectiveness of pedagogies and instructional designs for improvement. It can also help to monitor closely students’ learning and persistence, predict students’ performance, detect undesirable learning behaviours and emotional states, and identify students at risk, enabling prompt follow-up action and proper assistance for students. In addition, it can provide students with insightful data about their learning characteristics and patterns, which can make their learning experiences more personal and engaging, and promote their reflection and improvement.
Despite being increasingly adopted in higher education, the existing literature on learning analytics has focussed mainly on conventional face-to-face institutions, and has yet to adequately address the context of open and distance education. The findings of this study enable educational organizations and academics, especially those in open and distance institutions, to keep abreast of this emerging field and have a foundation for further exploration of this area.
Wong, B.T.M. (2017), "Learning analytics in higher education: an analysis of case studies", Asian Association of Open Universities Journal, Vol. 12 No. 1, pp. 21-40. https://doi.org/10.1108/AAOUJ-01-2017-0009
Emerald Publishing Limited
Copyright © 2017, Billy Tak Ming Wong
Published in the Asian Association of Open Universities Journal. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode
Learning analytics (LA) refers to the process of collecting, evaluating, analysing, and reporting organizational data for decision making (Campbell and Oblinger, 2007). It involves the use of big data analysis for understanding and improving the performance of educational institutions in educational delivery. Open and distance learning (ODL) institutions present an ideal context for the use of LA as, with their large student numbers and the increasing use of the internet and mobile technologies, they already have a very substantial amount of data available for analysis.
Despite LA being increasingly applied in a wide range of educational organizations, the literature in this area has usually focussed on conventional face-to-face institutions. In the ODL setting, there is yet to be a systematic review summarizing existing work on the potential benefits of LA to open and distance institutions (Firat and Yuzer, 2016; Prinsloo and Slade, 2014), and relevant research findings potentially applicable to these institutions (Rienties et al., 2016).
This paper gives a systematic review of the mounting research work on LA that has been published in recent years to provide an overview of this emerging field and serves as a foundation for further exploration. It addresses the potential problems of ODL institutions that could be solved by using LA, and the benefits that could be obtained according to the existing case studies. It also presents a meta-analysis of relevant empirical studies which shows the effect of intervention for at-risk students based on the use of LA.
LA involves the use of a broad range of data and techniques for analysis – covering, for example, statistical tests, explanatory and predictive models, and data visualization (Arroway et al., 2016). Various stakeholders, such as administrators, teaching staff, and students, can then act on the data-driven analysis. Without a standardized methodology, LA has been implemented using diverse approaches for various objectives. Gašević et al. (2016) summarized three major themes in LA implementation, namely, the development of predictors and indicators for various factors (e.g. academic performance, student engagement, and self-regulated learning skills); the use of visualizations to explore and interpret data and to prompt remedial actions; and the derivation of interventions to shape the learning environment. The diversity in LA implementation poses a challenge for education institutions which plan to be involved in it, leading to a commonly voiced question – “How do we start the process for the adoption of institutional learning analytics?” (Gašević et al., 2016, p. 4).
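As a minimal illustration of the first theme – deriving an indicator from student activity data – the following Python sketch computes a simple engagement score and flags low-engagement students. The activity types, weights, and threshold are hypothetical assumptions for illustration, not values drawn from any of the studies reviewed:

```python
# Illustrative sketch of an engagement indicator built from activity
# counts. The weights and the flagging threshold are hypothetical.

ACTIVITY_WEIGHTS = {"logins": 1.0, "forum_posts": 2.0, "assignments_submitted": 3.0}
LOW_ENGAGEMENT_THRESHOLD = 10.0  # assumed cut-off for flagging a student

def engagement_score(activity: dict) -> float:
    """Weighted sum of a student's activity counts."""
    return sum(ACTIVITY_WEIGHTS.get(k, 0.0) * v for k, v in activity.items())

def flag_low_engagement(students: dict) -> list:
    """Return IDs of students whose score falls below the threshold."""
    return sorted(
        sid for sid, activity in students.items()
        if engagement_score(activity) < LOW_ENGAGEMENT_THRESHOLD
    )

students = {
    "s1": {"logins": 12, "forum_posts": 3, "assignments_submitted": 2},
    "s2": {"logins": 2, "forum_posts": 0, "assignments_submitted": 1},
    "s3": {"logins": 5, "forum_posts": 1, "assignments_submitted": 0},
}
print(flag_low_engagement(students))  # flags s2 (score 5.0) and s3 (score 7.0)
```

An indicator of this kind would typically feed the other two themes: its values can be visualized on a dashboard and used to trigger remedial interventions.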
As an emerging field of study, an increasing number of case studies relevant to the implementation of LA in higher education have been published. However, only a small number of reviews summarize these individual case studies. Among them, Dyckhoff (2011) reviewed the research questions and methods of these studies. The findings showed that existing studies have focussed on six types of research questions: qualitative evaluation; quantitative measures of use and attendance; differentiation between groups of students; differentiation between learning offerings; data consolidation; and effectiveness. The research methods used include online surveys, log files, observations, group interviews, students’ class attendance, eye tracking, and the analysis of examination grades. Based on the results, suggestions were given on LA indicators for improving teaching.
Papamitsiou and Economides (2014) focussed on the impacts of LA and educational data mining on adaptive learning. They reviewed the experimental case studies between 2008 and 2013, and identified four distinct categories, namely, pedagogy-oriented issues, contextualization of learning, networked learning, and the handling of educational resources.
Also, Nunn et al. (2016) discussed the methods, benefits, and challenges of LA. They found that the methods used included visual data analysis, social network analysis, semantic analysis, and educational data mining. The benefits of LA were seen to revolve around targeted course offerings; curriculum development; student learning outcomes; behaviours and processes; personalized learning; improvements in instructor performance; post-educational employment opportunities; and enhancement of educational research. The challenges included the tracking, collection, evaluation and analysis of data, as well as a lack of connection to learning science, the need for learning environment optimization, and issues concerning ethics and privacy.
Focussing on computer science courses, Ihantola et al. (2015) surveyed LA case studies in terms of their goals, approaches, contexts, subjects, tasks, data collection, and methods of analysis. The goals were related to students, programming, and the learning environment. The approaches included case studies, constructive research, experimental studies, and survey research. They also found that most of the research work was undertaken in a course context, with the number of subjects ranging from 10 to 265,000; 64 per cent of the studies had 500 or fewer subjects. In most of the studies, students were required to complete multiple programming tasks. Over 60 per cent of the studies used automated data collection that logged students’ actions, and a variety of data analysis methods such as descriptive and inferential statistics.
The existing reviews of LA case studies provide a basic descriptive summary. However, as a new area in education, there remain many uncertainties for ODL institutions about involving themselves in it. To make an informed decision on whether or not to implement LA, a key question is: “What are the expected benefits for the institution?” This paper addresses this issue by surveying the outcomes of LA implementation for institutions.
This study aims to investigate how LA has been used in higher education institutions and the outcomes obtained. Relevant case studies were collected from Scopus, using the key terms “academic analytics” and “learning analytics” for the period from 2007 to 2016. The studies were selected based on the following criteria:
the study reported one or more empirical cases of the use of LA in a higher education institution;
the institution in question was accredited by the government or government-related bodies;
the institution had 1,000 or more students; and
the source information contained the aims of using LA, a description of the analytics, its implementation and the outcomes.
An initial search returned 1,492 results. After screening, a total of 43 cases which fulfilled the criteria for inclusion were selected for further analysis. They were analysed in terms of their objectives, approaches, and major outcomes.
A meta-analysis was also conducted to synthesize the empirical findings reported in the case studies. Studies which included relevant quantitative data analysis were chosen, resulting in six studies on student support and the analysis of learning behaviours, in which the effect of LA intervention was validated and reported.
Benefits for institutions, staff, and students
A summary of the objectives and approaches of the use of LA in the institutions chosen is presented in Table AI. The benefits of LA for the institutions, staff and students revolve around the following aspects.
Improving student retention
Table I presents the use of LA which improved student retention. By closely monitoring students’ learning and persistence, undesirable learning behaviours and emotional states can be detected, and students who are at risk can be identified early. Factors leading to student dropout or retention can be identified and prediction models developed. Staff can take prompt follow-up action and provide proper assistance to students who need extra support, such as counselling, suggesting learning resources, and formulating individual learning plans. Students’ level of achievement, as well as their retention, can be enhanced.
Supporting informed decision making
Table II shows the use of LA which supported informed decision making. Institutions are provided with information and analyses generated from a massive amount of data for informed decision making. For example, planning can be carried out on course development and resource allocation on the basis of information about the popularity of courses, and the types and frequency of materials reviewed by students.
Increasing cost-effectiveness
Table III presents cases of LA use which increased cost-effectiveness. LA can be integrated with other platforms such as the learning management system. Instructors can then access various kinds of information online for providing feedback and support to students. Analyses and feedback on students’ study progress can be delivered to staff, students, or parents in an automatic and cost-effective manner.
Understanding students’ learning behaviours
Table IV presents the use of LA for understanding students’ learning behaviours. By analysing diverse sources of data (e.g. learning management systems and social networks), institutions and academic staff can understand the relationships among students’ utilization of resources, learning behaviours and characteristics, and learning outcomes, which helps them to evaluate the effectiveness of pedagogies and instructional designs for improvement. For instance, the use of LA helps to capture the students’ behaviours in watching course videos by highlighting the patterns of their preferences and behaviours as well as showing the parts of videos which were watched most and least frequently. Curriculum and learning materials can thus be better designed to address students’ preferences and needs.
Providing personalized assistance for students
Table V illustrates the use of LA for providing students with insightful data about their learning characteristics and patterns, which can make their learning experiences more personal and engaging, and facilitate their reflections and improvements while a course is still in progress. Early alerts can be automatically generated and sent to students if their academic performance is below a certain standard. Students can also be encouraged to engage more in the personalized learning activities which are conducive to success in their studies.
Timely feedback and intervention
Table VI presents the use of LA for timely feedback and intervention. Instructors can obtain up-to-date and holistic information about students’ study progress, so that timely feedback can be given and individualized interventions made. Students develop a sense of belonging to the learner community through personalized feedback given to them. For example, the use of social network analytics allows instructors to understand the development of the learner community and identify students who are performing poorly or are isolated from the main discussion, and then provide intervention during discussion in real time. This is especially important for ODL institutions, where students may be using different study modes and social media is a major communication channel.
Meta-analysis of the effect of interventions on student success
An important function of LA is to predict at-risk students and deliver early alerts and interventions to them, in order to improve their academic attainment, and their retention and graduation rate. This section provides a meta-analysis of the various prediction models utilized in LA systems, and the effect of the intervention solutions on enhancing students’ success.
Among the case studies examined, six provided quantitative analysis results and were selected; their results are synthesized in this section. The effect sizes for each analysis were calculated where the data required for the calculation were available, and a descriptive comparison of the effect sizes across the studies was made. Table VII presents a summary of the predictive models and intervention solutions employed in the six case studies; and Table VIII summarizes the results of quantitative analyses for the intervention solutions and the effect sizes for each study.
To summarize, a common approach utilized in the cases of intervention for student success was to collect and analyse data from students’ learning activities and employ a specific computational model to predict and prioritize those students who were at risk of dropping out or obtaining poor academic results. Based on the findings of the predictive modelling, subsequent measures can be taken for intervention. A common practice was to have academic staff contact the at-risk students and provide personalized learning support to them. Such an approach to prediction and intervention was found to effectively enhance students’ success, as measured by various indicators such as GPA, study progress, the retention rate, and the graduation rate.
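The prediction-and-prioritization step described above can be sketched in Python as follows. The features, coefficients, and risk threshold here are illustrative stand-ins for the institution-specific models used in the reviewed studies, several of which employed logistic regression (e.g. Jayaprakash et al., 2014):

```python
import math

# Hypothetical logistic model for at-risk prediction; the coefficients
# and threshold are illustrative, not taken from any reviewed study.
COEFFS = {"intercept": 1.5, "gpa": -0.9, "logins_per_week": -0.2,
          "missed_assignments": 0.8}
RISK_THRESHOLD = 0.5  # assumed probability cut-off for intervention

def dropout_probability(student: dict) -> float:
    """Logistic score: estimated probability that a student drops out."""
    z = COEFFS["intercept"] + sum(COEFFS[f] * student[f] for f in student)
    return 1.0 / (1.0 + math.exp(-z))

def prioritize_at_risk(students: dict) -> list:
    """Rank students above the risk threshold, highest risk first."""
    risks = {sid: dropout_probability(s) for sid, s in students.items()}
    return sorted((sid for sid, p in risks.items() if p >= RISK_THRESHOLD),
                  key=lambda sid: risks[sid], reverse=True)

cohort = {
    "s1": {"gpa": 3.5, "logins_per_week": 10, "missed_assignments": 0},
    "s2": {"gpa": 1.8, "logins_per_week": 1, "missed_assignments": 4},
    "s3": {"gpa": 2.4, "logins_per_week": 3, "missed_assignments": 2},
}
for sid in prioritize_at_risk(cohort):
    print(f"contact {sid}: P(drop out) = {dropout_probability(cohort[sid]):.2f}")
```

The ranked list corresponds to the prioritization step in the case studies; the subsequent contact by academic staff is the intervention.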
According to the meta-analysis of the quantitative results, all the institutions found improvement in the students’ success in the intervention group compared to the control group, although the effect size varied across different types of indicators for success and different institutions. For instance, the intervention groups in the case of Marist College showed a 6 per cent improvement in the students’ final grades compared to the non-intervention control groups (Sclater et al., 2016), with an effect size in the small-to-medium range based on Cohen’s (1988) convention. For the retention rate examined in Mattingly et al. (2012) for the Course Signal System of Purdue University, the intervention groups showed a nearly 50 per cent improvement compared to the control groups. In spite of the small sample size, the meta-analysis showed encouraging results for the benefits of LA in aiding institutions to make effective informed decisions to improve students’ learning performance and success.
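The effect-size comparison above rests on Cohen’s d, the difference between group means divided by the pooled standard deviation, interpreted with Cohen’s (1988) conventional cut-offs of 0.2 (small), 0.5 (medium) and 0.8 (large). A minimal sketch, using hypothetical final-grade data rather than figures from any of the reviewed studies:

```python
import statistics

def cohens_d(treatment: list, control: list) -> float:
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

def label(d: float) -> str:
    """Cohen's (1988) conventional interpretation of |d|."""
    d = abs(d)
    if d < 0.2:
        return "negligible"
    if d < 0.5:
        return "small"
    if d < 0.8:
        return "medium"
    return "large"

# Hypothetical final grades for an intervention and a control group
intervention = [68, 72, 75, 70, 74, 71]
control = [65, 69, 70, 66, 68, 67]
d = cohens_d(intervention, control)
print(f"d = {d:.2f} ({label(d)})")
```

The same computation can only be applied across studies where means, standard deviations, and group sizes are reported, which is why the effect sizes in Table VIII could be calculated for some studies only.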
Discussion and conclusion
This study shows that positive outcomes have been widely reported in relevant case studies. The results suggest great potential for ODL institutions to utilize LA for analysing existing data, which is expected to benefit their operations in areas such as quality assurance and student support. This study also reviewed various predictive models for student success which were developed and validated to identify and prioritize students who may be in need of support. The quantitative analyses confirmed that the learning performance of these students improved after they had been approached for LA-based interventions. The findings of this study thus show various stakeholders – institutions, staff, and students – the benefits they may gain from LA.
In particular, the results related to student learning suggest that, to change students’ behaviours, it may suffice to simply make them aware of their learning engagement through LA tools in relation to other students or indicate that they are at risk (Jayaprakash et al., 2014; Sclater and Mullan, 2017). Complex data visualizations or dashboards may not be necessary. What is more important, as recommended in Gašević et al. (2016), is to help students to interpret correctly the information from visualizations or dashboards.
The meta-analysis revealed that only a few case studies related to LA implementation provided quantitative analyses – a limitation which may stem from LA being a relatively new development. Therefore, empirical investigations and validation of many new models and new theories in this area remain to be carried out. While an increase in the quantity of empirical and quantitative research can be expected in future, it is also important to develop and test innovative solutions supported by LA. Present LA-based interventions, as reviewed in this paper, were mostly based on the interaction and discussion between students and instructors. Although such interventions were shown to be effective in general, their effectiveness may vary among different groups of students in different contexts.
A challenge in measuring the effectiveness of LA implementation lies in the difficulty of identifying the extent to which any change after the LA implementation can be attributed to the LA itself. As discussed in Sclater and Mullan (2017), it may not be feasible to isolate the influence of LA when it is part of a wider initiative to develop data-informed approaches in an institution. The case studies published and reviewed in this paper would thus be biased towards institutions which deployed LA without other measures in their data-informed approaches.
In the ODL context, work on LA remains at an initial stage. Features of ODL, such as open admission which allows a broad range of students to study the same course with very limited face-to-face interaction, are yet to be studied in relation to LA implementation. It is therefore suggested that future research involve more fine-grained validation studies to identify the effect of the various factors involved in the implementation of LA. In particular, investigation of those factors related to ODL institutions, staff and students, as well as the possible constraints on their use of LA, would shed light on how they can benefit more from involvement in LA.
Use of LA which improved student retention
|Bowie State University||More student activities and communication were initiated through the system||Chacon et al. (2012)|
|Edith Cowan University||The student retention rate for those who got support was higher than the university’s average rate||Atif et al. (2013)|
|Harvard University||The results demonstrate the potential for natural language processing to contribute to predicting student success in MOOCs and other forms of open online learning||Robinson et al. (2016)|
|New York Institute of Technology||An at-risk model of high predictive power was developed||Sclater et al. (2016)|
|Northern Arizona University||Student-instructor interaction was increased and personal interventions were given; and students showed better academic performance, retention and graduation rates||Star and Collette (2010)|
|Paul Smith’s College||Students devoting more effort to their studies had a higher chance of success, and better persistence and graduation rates||McAleese and Taylor (2012)|
|Rio Salado Community College||A 40% decrease in drop-out rate was obtained for students who received welcome e-mails compared with those who did not||Smith et al. (2012)|
|The Open University (UK)||A vast majority of students showed continuous engagement; student retention was at an average to good level; students demonstrated higher satisfaction||Rienties et al. (2016)|
|University of New England||Student attrition dropped from 18% to 12%; students demonstrated an increased sense of belonging to the learner community and greater learning motivation||Sclater et al. (2016)|
Use of LA which supported informed decision making
|Grand Rapids College||Better decisions can be made about course delivery to help to ensure student success, through an LA tool which supports easy end-user analysis||Fritz and Kunnen (2010)|
|The Open University (UK)||Elements tacitly implicated in pedagogical decisions during course design were unpicked||Toetenel and Rienties (2016)|
|University of Adelaide||Educators were provided with guidelines to design collaborative learning activities||Tarmazdi et al. (2015)|
|University of Edinburgh||Through identification of socially engaged students, the instructional team can identify suitable teaching assistants||Kovanović et al. (2016)|
|University of North Bengal||Counsellors and faculty members were provided with useful inputs to advise learners on the best possible completion options||Yasmine (2013)|
|University of Salamanca||Visual analytics was shown to lead to a better understanding of what is happening with a student, so that informed decisions can be made to help students succeed||Conde et al. (2015)|
|The Technical University of Madrid||Information was provided by the LA system which helped to prevent problems, carry out corrective measures and make informed decisions to improve students’ learning||Fidalgo-Blanco et al. (2015)|
Use of LA which increased cost-effectiveness
|Bridgewater College||Notifications were automatically generated and sent to students and their parents to recognize students’ good performance||Sclater et al. (2016)|
|Drexel University||Faculty, programme developers, and programme administrators were able to analyse the connections between a specific programme outcome and data related to that outcome||Harvey (2013)|
|Georgia Institute of Technology and Carnegie Mellon University||High reliability was achieved for analysing students’ online discussion data||Wang et al. (2016)|
|Harvard University||A machine learning prediction model was shown to be effective for predicting students who would complete an online course||Robinson et al. (2016)|
|Lancaster University||Tutors could efficiently access various kinds of data for providing students with timely support||Sclater et al. (2016)|
|New York Institute of Technology||A dashboard simple and easy to use by staff was developed||Sclater et al. (2016)|
|Open University of Catalonia||Information could be updated and maintained automatically||Guitart et al. (2015)|
|Portland State University||Operational efficiency was increased (e.g. faster generation of reports); the system could easily be modified to fit the needs of other institutions|
|Purdue University||Students who had engaged with the LA system sought more help and resources than other students||Arnold and Pistilli (2012)|
|Rio Salado College||The likelihood of successful course completion was accurately assessed||Smith et al. (2012)|
|The Hong Kong Institute of Education||There was greater interaction between teachers and students||Wong and Li (2016)|
|University of Adelaide||Lecturers were allowed to assess and monitor students’ collaboration in an online environment, without having to traverse a large discussion forum||Tarmazdi et al. (2015)|
|University of Michigan||The system demonstrated high scalability and extensibility||Mattingly et al. (2012)|
|University of Salamanca||The system allowed the provision of learning support to students in an automatic manner||Cruz-Benito et al. (2014)|
|University of the South Pacific||The utilization of open source resources could be modified and adapted by anyone to meet specific user needs||Prasad et al. (2016)|
|University of Sydney||LA features such as instant feedback and auto-grading are especially useful for instructors teaching subjects in computer science education||Gramoli et al. (2016)|
Use of LA which helped in understanding students’ learning behaviours
|Ball State University||Data analyses showed the consistent predictive power of the LA system on students’ academic performance, persistence, retention and graduation||Jones and Woosley (2011)|
|Georgia Institute of Technology and Carnegie Mellon University||Students who displayed more higher-order thinking behaviours learnt more through deeper engagement with course materials, as displayed by their discussion behaviours; these students in turn also learnt more than students who were constantly off topic in the forums; social-oriented topics triggered richer discussion compared with biopsychology-oriented topics, and higher-order thinking behaviours tended to appear together within threads in the forums||Wang et al. (2016)|
|McGill University||The system provided an unprecedented opportunity to use data from real learners in authentic learning situations to better understand learning processes; the study demonstrated how to detect learner misconceptions; prediction precision and weighted relative accuracy were significantly increased||Poitras et al. (2016)|
|Oxford Brookes University||Problems were identified with ethnic minority students in particular courses||Sclater et al. (2016)|
|The Hong Kong Institute of Education||Potential indicators were found for predicting student performance, such as the contribution of in-depth contents in online discussion||Wong and Li (2016)|
|The Open University (UK)||Common pedagogical patterns were identified from learning designs, showing the relationship between learning activities and students’ learning outcomes||Toetenel and Rienties (2016)|
|The Technical University of Madrid||Relationship between student interaction and individual performance was identified||Fidalgo-Blanco et al. (2015)|
|The University of Melbourne||Relationships among students’ motivation, participation and performance in MOOCs were found||Barba et al. (2016)|
|The University of Melbourne||Learners’ learning progress could be visualized showing their development from novice to expert||Milligan (2015)|
|University of Adelaide||Lecturers could track the evolution of team roles across each study group and identify various sentiments within each group||Tarmazdi et al. (2015)|
|University of Edinburgh||Patterns of students’ engagement in MOOC learning activities were found, showing differences in their learning behaviours between enrolments in the same courses||Kovanović et al. (2016)|
|University of North Bengal||Factors leading to students’ dropout were identified, such as pregnancy and the remoteness of residence locations||Yasmine (2013)|
|University of Rijeka||Student activities on the learning management system (e.g. assignment uploads and course views) were shown as predictors of academic success||Sisovic et al. (2015)|
|University of Santiago de Compostela||Teachers could understand more clearly how students behave during a course that facilitated the evaluation process||Gewerc et al. (2014)|
Use of LA for providing personalized assistance to students
|Albany Technical College||Based on analysis of students’ study results, demographics and social data, at-risk students were identified for providing individual counselling||Karkhanis and Dumbre (2015)|
|Bridgewater College||Tutors were provided with detailed information to discuss with students on their progress against targets and suggested actions||Sclater et al. (2016)|
|Open Universities Australia||Students obtained from the system recommended content and activities and a personalized learning environment||Atif et al. (2013)|
|The Technical University of Madrid||The LA system provided information for preventing problems, carrying out corrective measures and improving students’ learning||Fidalgo-Blanco et al. (2015)|
|University of Michigan||Customized recommendations were provided, including suggestions on study habits, assignment practice, feedback on progress and encouragement||Mattingly et al. (2012)|
Use of LA for timely feedback and intervention
|Edith Cowan University||Students likely to need support were automatically identified and support staff could efficiently reach them for interventions||Sclater et al. (2016)|
|Marist College||Interventions resulted in a 6% improvement in final grades for the treatment group compared to the control group||Jayaprakash et al. (2014)|
|Northern Arizona University||Instructors’ feedback was available to individual students and to university personnel, facilitating a comprehensive support network for all students||Star and Collette (2010)|
|Purdue University||Interventions were provided to at-risk students, and a higher student retention rate was achieved||Arnold and Pistilli (2012)|
|San Diego State University||Interventions through e-mails were shown to be the best treatment within constraints, while having an impact on student achievement||Dodge et al. (2015)|
|University of Adelaide||The LA system allowed instructors to be aware when particular students are behaving differently from the others for making appropriate and timely interventions||Tarmazdi et al. (2015)|
|University of Edinburgh||Instant feedback was shown to be a useful LA feature for students in courses on computer programming||Kovanović et al. (2016)|
|University of Michigan||Students were provided with feedback (e.g. grade prediction) for self-reflection||Mattingly et al. (2012)|
|University of Wollongong||Students who are isolated from the main discussion could be identified, and interventions could be provided during discussion in real time||Mat et al. (2013)|
Summary of predictive model and intervention solution for selected case studies
|Institution||Learning analytics system(s)||Predictive model||Intervention solution|
|Georgia Institute of Technology and Carnegie Mellon University (Wang et al., 2016)||Interactive-Constructive-Active-Passive (ICAP) framework||It was predicted that engaging in higher-order thinking behaviours results in better learning outcomes than paying general or focussed attention to course materials||Students’ online discussion behaviours were categorized into three types: higher-order (the student contributed at least one constructive or interactive post during a course); paying attention (the student contributed at least one active post during the course but no constructive or interactive posts); and no contribution to any on-topic discussion during the course. Together with the students’ other persistent characteristics, treatment and control groups were formed to investigate differences in their learning outcomes|
|Hong Kong Institute of Education (Wong and Li, 2016)||KeyGraph algorithm and Polaris (a software tool)||A text-mining analytical tool was used to predict students’ academic performance. The tool visualizes the hidden patterns and linkages among students’ learning activities. The findings of the study showed that this approach can provide insights into predicting students’ performance, and students with a higher grade tended to contribute more in-depth contents in an online learning environment||Students’ posts in an online learning forum were extracted and analysed – how the students presented concepts, specifically whether they could make linkages among various concepts. Such a pattern was correlated with the grades they obtained. The findings can be used to guide interventions on students’ learning process, and inform ways to give feedback to improve teaching and learning|
|Marist College (Jayaprakash et al., 2014)||Open Academic Analytics Initiative||A machine-learning model based on logistic regression was used to predict whether students are at risk, based on their demographic details, aptitude data, and various aspects of their usage of the virtual learning environment obtained from the LA system||An online academic support environment was developed containing study skills materials and community support from specialists and student mentors. At-risk students identified by the predictive model were directed to the support environment|
|Nottingham Trent University (Sclater et al., 2016)||NTU Student Dashboard||Students’ engagement was assessed using indicators such as door swipes into academic buildings, visits to the virtual learning environment, the submission of assignments, and the frequency of borrowing library resources. Each student received one of five engagement ratings: high, good, partial, low and not fully enrolled||Tutors are prompted to contact students to give assistance when the students’ engagement drops off. Students can view their own engagement scores on the dashboard to encourage self-motivation|
|Paul Smith’s College (McAleese and Taylor, 2012)||Rapid Insight’s Veera, Starfish EARLY ALERT, and CONNECT||Rapid Insight’s Veera combines different file types and uses automatic analyses and predictive modelling to identify at-risk students prior to their enrolment. Starfish EARLY ALERT automates data collection and uses analytics to improve the identification of at-risk students||Starfish EARLY ALERT and CONNECT automatically prioritize students who are identified as at risk and facilitate intervention and outreach|
|Purdue University (Arnold and Pistilli, 2012)||Course Signal System||The Course Signal System predicted students’ performance using a series of variables, including students’ demographic characteristics, academic performance, past academic history, and the effort students devoted to study||Instructors provided real-time personalized feedback to each student based on the outcomes generated from LA, informing the student about how he/she is doing|
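A common thread in the predictive models summarized above (e.g. the Marist College model) is a logistic regression that maps engagement and demographic features to a probability of a student being at risk. The following is a minimal sketch of that idea in pure Python; the feature set, training data and threshold are illustrative assumptions, not details drawn from any of the case studies.

```python
import math

def sigmoid(z):
    """Logistic function: maps a linear score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=1.0, epochs=3000):
    """Fit logistic regression weights by batch gradient descent."""
    n = len(X[0])
    w, b = [0.0] * n, 0.0
    m = len(X)
    for _ in range(epochs):
        gw, gb = [0.0] * n, 0.0
        for xi, yi in zip(X, y):
            # Prediction error drives the gradient for each example
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j in range(n):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / m for wj, gj in zip(w, gw)]
        b -= lr * gb / m
    return w, b

def risk_probability(w, b, x):
    """Predicted probability that a student is at risk."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Hypothetical features: [normalized VLE logins, share of assignments submitted]
# Label 1 = student later turned out to be at risk, 0 = not at risk
X_train = [[0.1, 0.2], [0.2, 0.1], [0.3, 0.3],
           [0.8, 0.9], [0.9, 0.8], [0.7, 1.0]]
y_train = [1, 1, 1, 0, 0, 0]

w, b = train_logistic(X_train, y_train)
low_engagement = risk_probability(w, b, [0.15, 0.15])   # high predicted risk
high_engagement = risk_probability(w, b, [0.85, 0.90])  # low predicted risk
```

Students whose predicted probability exceeds a chosen cut-off would then be flagged for intervention; in practice the case-study models also draw on demographic and aptitude variables and are validated against held-out cohorts.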
Summary of quantitative analysis results for selected case studies
|Institution||Independent variable||Dependent variable||Statistical method||Description of result||Effect size type||Effect size [95% CI]||Interpretation of effect size|
|Georgia Institute of Technology and Carnegie Mellon University||Higher-order thinking behaviours||Test score||Regression||The average posttest score of the treatment group (with higher-order thinking behaviour) was significantly higher than that of the control group (without higher-order thinking behaviour)||Hedges’ g||0.237 [0.018, 0.492]||Small-to-medium effect size|
|Hong Kong Institute of Education||“Contribution” and “innovation” from students’ postings in discussion forum||Final grade||χ2 test of independence||Students who obtained better grades usually contributed more in-depth content in their posts, linking to other concepts, whereas those with lower grades tended to provide isolated facts with little or no connection or transition from one concept to another||Odds ratio (OR)||0.634 [0.504, 0.798]||The odds of obtaining a higher grade differed significantly between students who contributed in-depth content and those who contributed isolated facts|
|Marist College||Intervention||Final grade||One-way ANOVA||Groups receiving intervention obtained significantly higher final grades than groups receiving no intervention||Hedges’ g||0.373 [0.176, 0.571]||Small-to-medium effect size|
|Nottingham Trent University||Level of engagement rating||Progression status||Descriptive categorical data analysisa||A much larger proportion of students with satisfactory to high engagement ratings obtained progression status than those with low engagement ratings||–||–||–|
|Paul Smith’s College||Intervention||Grade, suspension or probation rate, graduation rate||Descriptive categorical data analysisa||Student groups receiving intervention were less likely to get a grade D or below, to end a semester with probation or suspension, and more likely to get good standing by GPA and to graduate on time||–||–||–|
|Purdue University||Intervention||Retention rate||χ2 test of independence||Student groups receiving intervention had a higher retention rate than those receiving no intervention||Odds ratio (OR)||0.455b [0.427, 0.485]||The odds of dropping out for the intervention group were less than half those of the non-intervention group|
Notes: aThe results presented in the case studies of these two institutions did not involve any statistical tests, and complete information for the data – that is, the sample size for each category – was not provided; therefore, no effect size could be calculated from the available data. bThe effect size was computed by combining the data for the second-year retention rate for three cohorts (2007, 2008, 2009) from the original tables in Mattingly et al. (2012)
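The effect sizes in the table above follow standard formulas. As an illustration only (the figures below are invented, not taken from the case studies), the odds ratio with its Wald 95% confidence interval and the bias-corrected Hedges’ g can be computed as:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = outcome present/absent in group 1, c/d in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # correction factor J
    return j * d

# Invented example numbers: a 2x2 outcome table and two group summaries
or_, lo, hi = odds_ratio_ci(10, 20, 30, 40)
g = hedges_g(75, 10, 50, 70, 10, 50)
```

By Cohen’s (1988) conventions, g near 0.2 is a small effect and g near 0.5 a medium one, which is why values such as 0.237 and 0.373 in the table are interpreted as small-to-medium effects.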
Summary of the objectives and approaches of higher education institutions in the use of learning analytics
|1. Albany Technical College||Monitoring, intervention||Identify at-risk students and provide them with counselling||Karkhanis and Dumbre (2015)|
|2. Ball State University||Monitoring, intervention||Identify at-risk students and provide them with counselling
Increase effectiveness by reducing the time required to diagnose problems and targeting specific issues
Help the institution to make informed decisions about student success programmes and retention services
Allow students to become aware of the gaps between their behaviours and expected outcomes, to understand elements of their academic success, and to utilize on-campus resources to solve their problems
|Jones and Woosley (2011)|
|3. Bowie State University||Monitoring, intervention||Support student retention
Track students’ progress towards graduation to facilitate decision making
Provide early alerts for staff to intervene to prevent dropout
|Chacon et al. (2012)|
|4. Bridgewater College||Monitoring, intervention||Track students’ attainment level
Support students to do better than the national average
|Sclater et al. (2016)|
|5. California State University||Monitoring||Analyse how students use the learning management system||Allen et al. (2012)|
|6. Drexel University||Updating data and curriculum||Measure the effectiveness of specific course components through maintaining data records aligned with the curriculum, courses and syllabi, course learning objectives and assessment strategies
Manage student learning outcomes and performance criteria
|Harvey (2013)|
|7. Edith Cowan University||Monitoring, intervention||Identify students who need support
Establish a system to contact a large number of students and manage interventions
Improve student retention
Improve graduation rates
|Sclater et al. (2016)|
|8. Georgia Institute of Technology and Carnegie Mellon University||Monitoring, analysis||Provide better scaffolding for online discussion to improve learning in a MOOC context
Explore effects of higher-order thinking behaviours in learning
Identify kinds of discussion behaviours associated with learning
Investigate types of learning materials which trigger richer discussion
|Wang et al. (2016)|
|9. Harvard University||Monitoring, prediction||Analyse the extent to which students’ responses about motivation and utility value can predict persistence and completion of study||Robinson et al. (2016)|
|10. Lancaster University||Monitoring, intervention, feedback||Allow tutors to access the transcripts of their students
Allow early intervention
Ensure student work is graded and feedback given to students in a timely manner
|Sclater et al. (2016)|
|11. Loughborough University||Feedback||Provide academics with a better and more holistic picture of student engagement
Provide staff with actionable insights into student learning experience
Provide students with their own educational data in a meaningful way
|Sclater et al. (2016)|
|12. Manchester Metropolitan University||Monitoring, curriculum design||Improve student experience as reflected in the National Student Survey
Provide data for improving the undergraduate curriculum
|Sclater et al. (2016)|
|13. Marist College||Prediction, intervention||Predict academic success
|Jayaprakash et al. (2014)|
|14. McGill University||Monitoring, analysis||Identify misconceptions of medical students as reflected in their interactions in the online learning environment||Poitras et al. (2016)|
|15. New York Institute of Technology||Prediction, intervention||Create an at-risk model to identify students in need of support
Improve student retention in their first year of study
Provide information that could support counsellors in their work
|Sclater et al. (2016)|
|16. Northern Arizona University||Feedback||Facilitate online interaction between students and instructors
Allow students to receive direct feedback on issues such as academic concerns and grades
|Star and Collette (2010)|
|17. Nottingham Trent University||Intervention||Enhance retention and improve attainment
Increase students’ sense of belonging within the course community, particularly with tutors
|Sclater et al. (2016)|
|18. Open Universities Australia||Intervention||Identify at-risk students
Suggest alternative modules to students which are more appropriate for their needs
|Atif et al. (2013)|
|19. Open University of Catalonia||Information collection and management||Automatically identify the pieces of knowledge taught in each subject
Gather students’ information
Keep information updated
|Guitart et al. (2015)|
|20. Oxford Brookes University||Monitoring||Improve student experience
Support progress evaluation of modules and programmes, and the identification of priorities at an institutional level
|Sclater et al. (2016)|
|21. Paul Smith’s College||Monitoring, intervention||Identify at-risk students and prioritize outreach for them
Provide more efficient and effective interventions for student success
|McAleese and Taylor (2012)|
|22. Portland State University||Information management||Make information more accessible and easier to use||Blanton (2012)|
|23. Purdue University||Monitoring, intervention||Give students early and frequent performance notifications
Help faculty members to steer students towards additional campus resources as needed
|Arnold and Pistilli (2012)|
|24. Rio Salado College||Prediction||Identify factors having significant statistical correlations with final course outcomes||Grush (2011)|
|25. San Diego State University||Intervention||Identify methods and interventions that would reduce students’ failure
Discover approaches that could be applied with minimal support and are scalable to a large number of courses
|Dodge et al. (2015)|
|26. The Hong Kong Institute of Education||Monitoring, feedback||Provide insights into predicting students’ performance
Develop measures to assess students’ online learning
Boost teachers’ and students’ interaction
Help students become aware of their own knowledge discovery process
Facilitate teachers’ assessment of students’ performance
|Wong and Li (2016)|
|27. The Open University (UK)||Monitoring, intervention, personalization||Identify learners at risk and needing support
Improve learning design
Deliver personalized intervention for students
|Rienties et al. (2016)|
|Identifying patterns||Identify common patterns in course design
Find out pedagogical implications for various patterns and learning designs
|Toetenel and Rienties (2016)|
|28. The Technical University of Madrid||Monitoring, evaluation||Support teachers’ monitoring and evaluation of individual students’ progress within a team||Fidalgo-Blanco et al. (2015)|
|29. The University of Adelaide||Monitoring, feedback||Analyse students’ online discussion data, such as team mood, role distribution and emotional climate
Develop students’ soft skills necessary for collaborative work
|Tarmazdi et al. (2015)|
|30. The University of East London||Monitoring, feedback||Monitor student attendance and learning activities
Collect student data, such as demographic information, library activities, coursework, and download of free books
Send automated e-mails to students showing their attendance, and warnings to students without satisfactory attendance
|Sclater et al. (2016)|
|31. The University of Melbourne||Monitoring, analysis||Investigate how motivation and participation influence students’ performance in a MOOC||Barba et al. (2016)|
|Analyse how MOOC participants use online forums to support learning||Milligan (2015)|
|Investigate how students interpret feedback delivered via learning analytics dashboard and the relevant influence on their learning strategies and motivation||Corrin and Barba (2015)|
|32. Universidad a Distancia de Madrid||Monitoring, analysis||Find predictors of teamwork and commitment as cross-curricular competences||Iglesias-Pradas et al. (2015)|
|33. University of Edinburgh||Analysis, prediction||Examine MOOC data about students who enrolled in the same course at least twice
Identify changes in their behaviours between the two enrolments to the same course
|Kovanović et al. (2016)|
|34. University of Maryland, Baltimore County||Monitoring, feedback, reflection||Reduce student barriers
Create a community of learners
Improve students’ self-awareness by providing feedback
Provide early alerts to students if their GPA falls below a threshold
|Mattingly et al. (2012)|
|35. University of Michigan||Monitoring, personalization, reflection||Identify at-risk students
Provide personalized feedback to students
|Mattingly et al. (2012)|
|36. University of New England||Monitoring, intervention||Foster a sense of community among students studying part-time, at a distance as well as on-campus
Identify students who are struggling in order to provide timely support
Develop a dynamic, systematic and automated process to capture the learning well-being status of students
Encourage peer-to-peer student networking
Disseminate information and connect support staff with the students
|Sclater et al. (2016)|
|37. University of North Bengal||Prediction||Examine the predictive relationship between learners’ pre-entry demographic information and their dropout behaviours||Yasmine (2013)|
|38. University of Rijeka||Data mining, analysis||Identify factors leading to student success in study
Identify problems in a timely manner and increase the course pass rate
|Sisovic et al. (2015)|
|39. University of Salamanca||Information extraction, analysis||Extract information useful for teaching/administrative staff, such as interaction of students with peers, teachers, the system, and course contents
Provide teachers with tools to facilitate managerial tasks
|Conde et al. (2015)|
|Support practical learning in a 3D virtual environment, analyse the problems that arise, and report relevant data to students and teachers||Cruz-Benito et al. (2014)|
|40. University of Santiago de Compostela||Analysis, evaluation||Automatically generate reports of learners’ activities that take place in a virtual learning environment
Improve the efficiency of the evaluation process
|Gewerc et al. (2014)|
|41. University of Sydney||Analysis, observation||Identify the relationship among student performance, choices of programming languages for study, and times at which a student starts and stops working on an assignment||Gramoli et al. (2016)|
|42. University of the South Pacific||Monitoring||Track individual learners’ online and offline interactions with open learning resources||Prasad et al. (2016)|
|43. University of Wollongong||Analysis, intervention, reflection||Visualize patterns of student interactions on discussion forums
Allow instructors to identify at-risk students and potentially high- and low-performing students for planning interventions, and to gauge the extent to which a learner community is developing in a class
|Mat et al. (2013)|
Allen, W.R., Fernandes, K. and Whitmer, J. (2012), “Analytics in progress: technology use, student characteristics, and student achievement”, EDUCAUSE Review, 12 August, available at: http://er.educause.edu/articles/2012/8/analytics-in-progress-technology-use-student-characteristics-and-student-achievement (accessed 28 December 2016).
Arnold, K.E. and Pistilli, M.D. (2012), “Course signals at Purdue: using learning analytics to increase student success”, The 2nd International Conference on Learning Analytics and Knowledge, Vancouver, pp. 267-270.
Arroway, P., Morgan, G., O’Keefe, M. and Yanosky, R. (2016), “Learning analytics in higher education”, EDUCAUSE, available at: https://library.educause.edu/~/media/files/library/2016/2/ers1504la.pdf (accessed 28 February 2017).
Atif, A., Richards, D., Bilgin, A. and Marrone, M. (2013), “A panorama of learning analytics featuring the technologies for the learning and teaching domain”, The 30th Ascilite Conference, Sydney, pp. 68-72.
Barba, P.D., Kennedy, G. and Ainley, M. (2016), “The role of students’ motivation and participation in predicting performance in a MOOC”, Journal of Computer Assisted Learning, Vol. 32 No. 3, pp. 218-231.
Blanton, S.E. (2012), “Datamaster: success and failure on a journey to business intelligence”, EDUCAUSE case studies, available at: http://er.educause.edu/articles/2012/7/datamaster-success-and-failure-on-a-journey-to-business-intelligence (accessed 28 December 2016).
Campbell, J.P. and Oblinger, D.G. (2007), “Academic analytics”, EDUCAUSE, available at: https://net.educause.edu/ir/library/pdf/PUB6101.pdf (accessed 28 December 2016).
Chacon, F., Spicer, D. and Valbuena, A. (2012), “Analytics in support of student retention and success”, available at: https://net.educause.edu/ir/library/pdf/ERB1203.pdf (accessed 28 December 2016).
Cohen, J. (1988), Statistical Power Analysis for the Behavioral Sciences, 2nd ed., Lawrence Erlbaum, Hillsdale, NJ.
Conde, M.A., Garcia-Penalvo, F.J., Gomez-Aguilar, D. and Theron, R. (2015), “Exploring software engineering subjects by using visual learning analytics techniques”, IEEE Revista Iberoamericana De Tecnologias Del Aprendizaje, Vol. 10 No. 4, pp. 242-252.
Corrin, L. and Barba, P. (2015), “How do students interpret feedback delivered via dashboards?”, Proceedings of the 5th International Conference on Learning Analytics and Knowledge, Poughkeepsie, New York, NY, 16-20 March, pp. 430-431.
Cruz-Benito, J., Theron, R., Garcia-Penalvo, F.J., Maderuelo, C., Perez-Blanco, J.S., Zazo, H. and Martin-Suarez, A. (2014), “Monitoring and feedback of learning processes in virtual worlds through analytics architectures: a real case”, The 9th Iberian Conference on Information Systems and Technologies, Barcelona, pp. 1126-1131.
Dodge, B., Whitmer, J. and Frazee, J.P. (2015), “Improving undergraduate student achievement in large blended courses through data-driven interventions”, The 5th International Conference on Learning Analytics and Knowledge, New York, NY, pp. 412-413.
Dyckhoff, A.L. (2011), “Implications for learning analytics tools: a meta-analysis of applied research questions”, International Journal of Computer Information Systems and Industrial Management Applications, Vol. 3, pp. 594-601.
Fidalgo-Blanco, Á., Sein-Echaluce, M.L., García-Peñalvo, F.J. and Conde, M.Á. (2015), “Using learning analytics to improve teamwork assessment”, Computers in Human Behavior, Vol. 47, June, pp. 149-156.
Firat, M. and Yuzer, T.V. (2016), “Learning analytics: assessment of mass data in distance education”, International Journal on New Trends in Education and their Implications, Vol. 7 No. 2, available at: www.ijonte.org/FileUpload/ks63207/File/01.mehmet_firat_.pdf (accessed 28 December 2016).
Fritz, J. and Kunnen, E. (2010), “Using analytics to intervene with underperforming college students”, available at: www.educom.edu/eli/events/eli-annual-meeting/2010/using-analytics-intervene-underperforming-college-students-innovative-practice (accessed 28 December 2016).
Gašević, D., Dawson, S. and Pardo, A. (2016), “How do we start? State and directions of learning analytics adoption”, International Council for Open and Distance Education, available at: https://icde.memberclicks.net/assets/RESOURCES/dragan_la_report%20cc%20licence.pdf (accessed 28 February 2017).
Gewerc, A., Montero, L. and Lama, M. (2014), “Collaboration and social networking in higher education”, Media Education Research Journal, Vol. 21 No. 42, pp. 55-63.
Gramoli, V., Charleston, M., Jeffries, B., Koprinska, I., McGrane, M., Radu, A., Viglas, A. and Yacef, K. (2016), “Mining autograding data in computer science education”, The Eighteenth Australasian Computing Education Conference, Canberra, 1-5 February, available at: http://dl.acm.org/citation.cfm?id=2843070 (accessed 28 December 2016).
Grush, M. (2011), “Monitoring the PACE of student learning: analytics at Rio Salado College”, Campus Technology, 14 December, available at: https://campustechnology.com/articles/2011/12/14/monitoring-the-pace-of-student-learning-analytics-at-rio-salado-college.aspx
Guitart, I., More, J., Duran, J., Conesa, J., Baneres, D. and Ganan, D. (2015), “A semi-automatic system to detect relevant learning content for each subject”, The 7th International Conference on Intelligent Networking and Collaborative Systems, Taipei, pp. 301-307.
Harvey, F. (2013), “Technology tools for developing, delivering, updating and assessing sustainable high-quality academic programs”, Society for Information Technology & Teacher Education International Conference, New Orleans, LA, pp. 2131-2133.
Iglesias-Pradas, S., Ruiz-de-Azcárate, C. and Agudo-Peregrina, Á.F. (2015), “Assessing the suitability of student interactions from moodle data logs as predictors of cross-curricular competencies”, Computers in Human Behavior, Vol. 47, June, pp. 81-89.
Ihantola, P., Vihavainen, A., Ahadi, A., Butler, M., Börstler, J., Edwards, S.H., Isohanni, E., Korhonen, A., Petersen, A., Rivers, K., Rubio, M.Á., Sheard, J., Skupas, B., Spacco, J., Szabo, C. and Toll, D. (2015), “Educational data mining and learning analytics in programming: literature review and case studies”, The 20th Annual Conference on Innovation and Technology in Computer Science Education – Working Group Reports, ACM, New York, NY, pp. 41-63, available at: http://dl.acm.org/citation.cfm?doid=2858796.2858798 (accessed 28 December 2016).
Jayaprakash, S.M., Moody, E.W., Lauria, E.J.M., Regan, R. and Baron, J.D. (2014), “Early alert of academically at-risk students: an open source analytics initiative”, Journal of Learning Analytics, Vol. 1 No. 1, pp. 6-47.
Jones, D. and Woosley, S. (2011), “The foundation of MAP-Works: research and theoretical underpinnings of MAP-Works”, Educational Benchmarking (EBI), available at: www2.indstate.edu/studentsuccess/pdf/The Foundation of MAP-Works.pdf (accessed 28 December 2016).
Karkhanis, P.S. and Dumbre, S.S. (2015), “A study of application of data mining and analytics in education domain”, International Journal of Computer Applications, Vol. 120 No. 22, pp. 23-29.
Kovanović, V., Joksimović, S., Gašević, D., Owers, J., Scott, A. and Woodgate, A. (2016), “Profiling MOOC course returners: how does student behavior change between two course enrollments?”, The Third ACM Conference on Learning @ Scale, Edinburgh, pp. 269-272.
McAleese, V. and Taylor, L. (2012), “Beyond retention: using targeted analytics to improve student success”, EDUCAUSE Review, 17 July, available at: http://er.educause.edu/articles/2012/7/beyond-retention-using-targeted-analytics-to-improve-student-success (accessed 28 December 2016).
Mat, U.B., Buniyamin, N., Arsad, P.M. and Kassim, R. (2013), “An overview of using academic analytics to predict and improve students’ achievement: a proposed proactive intelligent intervention”, IEEE 5th Conference on Engineering Education (ICEED), pp. 126-130.
Mattingly, K.D., Rice, M.C. and Berge, Z.L. (2012), “Learning analytics as a tool for closing the assessment loop in higher education”, Knowledge Management & E-learning: An International Journal, Vol. 4 No. 3, pp. 236-247.
Milligan, S. (2015), “Crowd-sourced learning in MOOCs”, The 5th International Conference on Learning Analytics and Knowledge, New York, NY, pp. 151-155.
Nunn, S., Avella, J.T., Kanai, T. and Kebritchi, M. (2016), “Learning analytics methods, benefits, and challenges in higher education: a systematic literature review”, Online Learning, Vol. 20 No. 2, available at: https://olj.onlinelearningconsortium.org/index.php/olj/article/view/790 (accessed 28 December 2016).
Papamitsiou, Z. and Economides, A.A. (2014), “Learning analytics and educational data mining in practice: a systematic literature review of empirical evidence”, Educational Technology & Society, Vol. 17 No. 4, pp. 49-64.
Poitras, E.G., Naismith, L.M., Doleck, T. and Lajoie, S.P. (2016), “Using learning analytics to identify medical student misconceptions in an online virtual patient environment”, Online Learning, Vol. 20 No. 2, pp. 239-250.
Prasad, D., Totaram, R. and Usagawa, T. (2016), “Development of open textbooks learning analytics system”, International Review of Research in Open and Distributed Learning, Vol. 17 No. 5, pp. 215-234.
Prinsloo, P. and Slade, S. (2014), “Educational triage in open distance learning: walking a moral tightrope”, International Review of Research in Open and Distance Learning, Vol. 15 No. 4, pp. 306-331.
Rienties, B., Boroowa, A., Cross, S., Kubiak, C., Mayles, K. and Murphy, S. (2016), “Analytics4action evaluation framework: a review of evidence-based learning analytics interventions at the Open University UK”, Journal of Interactive Media in Education, Vol. 1 No. 2, pp. 1-13.
Rienties, B., Boroowa, A., Cross, S., Farrington-Flint, L., Herodotou, C., Prescott, L., Mayles, K., Olney, T., Toetenel, L. and Woodthorpe, J. (2016), “Reviewing three case-studies of learning analytics interventions at the Open University UK”, The 6th International Conference on Learning Analytics & Knowledge, ACM, New York, NY, pp. 534-535.
Robinson, C., Yeomans, M., Reich, J., Hulleman, C. and Gehlbach, H. (2016), “Forecasting student achievement in MOOCs with natural language processing”, The 6th International Conference on Learning Analytics & Knowledge, Edinburgh, pp. 383-387.
Sclater, N. and Mullan, J. (2017), “Jisc briefing: learning analytics and student success – assessing the evidence”, available at: http://repository.jisc.ac.uk/6560/1/learning-analytics_and_student_success.pdf (accessed 28 February 2017).
Sclater, N., Peasgood, A. and Mullan, J. (2016), “Learning analytics in higher education: a review of UK and international practice”, available at: www.jisc.ac.uk/reports/learning-analytics-in-higher-education (accessed 28 December 2016).
Sisovic, S., Matetic, M. and Bakaric, M.B. (2015), “Mining student data to assess the impact of moodle activities and prior knowledge on programming course success”, The 16th International Conference on Computer Systems and Technologies, Dublin, pp. 366-373.
Smith, V.C., Lange, D.R.H. and Huston, D.R. (2012), “Predictive modelling to forecast student outcomes and drive effective interventions in online community college courses”, Journal of Asynchronous Learning Networks, Vol. 16 No. 3, pp. 51-61.
Star, M. and Collette, L. (2010), “GPS: shaping student success one conversation at a time”, EDUCAUSE, available at: http://er.educause.edu/articles/2010/12/gps-shaping-student-success-one-conversation-at-a-time (accessed 28 December 2016).
Tarmazdi, H., Vivian, R., Szabo, C., Falkner, K. and Falkner, N. (2015), “Using learning analytics to visualise computer science teamwork”, The 2015 ACM Conference on Innovation and Technology in Computer Science Education, Vilnius, pp. 165-170.
Toetenel, L. and Rienties, B. (2016), “Analysing 157 learning designs using learning analytic approaches as a means to evaluate the impact of pedagogical decision making”, British Journal of Educational Technology, Vol. 47 No. 5, pp. 981-992.
Wang, X., Wen, M. and Rosé, C.P. (2016), “Towards triggering higher-order thinking behaviors in MOOCs”, The 6th International Conference on Learning Analytics & Knowledge, Edinburgh, pp. 398-407.
Wong, G.K.W. and Li, S.Y.K. (2016), “Academic performance prediction using chance discovery from online discussion forums”, IEEE 40th Annual Computer Software and Applications Conference, Atlanta, GA, pp. 706-711.
Yasmine, D. (2013), “Application of the classification tree model in predicting learner dropout behaviour in open and distance learning”, Distance Education, Vol. 34 No. 2, pp. 218-231.
Pirani, J.A. and Albrecht, B. (2005), “University of Phoenix: driving decisions through academic analytics”, EDUCAUSE Center for Applied Research, available at: https://net.educause.edu/ir/library/pdf/ers0508/cs/ecs0509.pdf (accessed 28 December 2016).
The work described in this paper was partially supported by a grant from the Research Grants Council of the Hong Kong Special Administrative Region, China (UGC/IDS16/15).