Real-time learning analytics system for improvement of on-site lectures

Atsushi Shimada (Kyushu University, Fukuoka, Japan)
Shin’ichi Konomi (Kyushu University, Fukuoka, Japan)
Hiroaki Ogata (Kyoto University, Kyoto, Japan)

Interactive Technology and Smart Education

ISSN: 1741-5659

Article publication date: 5 December 2018

Issue publication date: 10 December 2018


Abstract

Purpose

The purpose of this study is to propose a real-time lecture supporting system. The target of this study is on-site classrooms where teachers give lectures and a lot of students listen to teachers’ explanations, conduct exercises, etc.

Design/methodology/approach

The proposed system uses an e-learning system and an e-book system to collect teaching and learning activities from a teacher and students in real time. The collected data are immediately analyzed to provide feedback to the teacher just before the lecture starts and during the lecture. For example, the teacher can check which pages were well previewed and which pages were not previewed by students using the preview achievement graph. During the lecture, real-time analytics graphs are shown on the teacher’s PC. The teacher can easily grasp students’ status and whether or not students are following the teacher’s explanation.

Findings

Through the case study, the authors first confirmed the effectiveness of each tool developed in this study. Then, the authors conducted a large-scale experiment using a real-time analytics graph and investigated whether the proposed system could improve the teaching and learning in on-site classrooms. The results indicated that teachers could adjust the speed of their lecture based on the real-time feedback system, which also resulted in encouraging students to put bookmarks and highlights on keywords and sentences.

Originality/value

Real-time learning analytics enables teachers and students to enhance their teaching and learning during lectures. Teachers should start considering this new strategy to improve their lectures immediately.

Citation

Shimada, A., Konomi, S. and Ogata, H. (2018), "Real-time learning analytics system for improvement of on-site lectures", Interactive Technology and Smart Education, Vol. 15 No. 4, pp. 314-331. https://doi.org/10.1108/ITSE-05-2018-0026

Publisher

Emerald Publishing Limited

Copyright © 2018, Atsushi Shimada, Shin’ichi Konomi and Hiroaki Ogata.

License

Published by Emerald Group Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

Much attention has been paid to learning analytics in the technology-enhanced learning research domain. The Society for Learning Analytics Research defined learning analytics as the measurement, collection, analysis and reporting of data about learners and their context for the purpose of understanding and optimizing learning and the environments in which it occurs. In the early stages of learning analytics, researchers discussed methods for measuring a learning environment and collecting data. Recently, virtual learning environments and learning management systems such as Blackboard (Bradford et al., 2007) and Moodle (Dougiamas and Taylor, 2003) have enabled us to collect large-scale educational data (educational big data) easily. The latest research has focused on methods for analyzing educational big data and on reporting, that is, how to provide feedback on analysis results to teachers/students.

Khalil and Ebner (2016) surveyed learning analytics and divided its methods into seven categories:

  1. data mining techniques – the prediction of students’ academic achievement (Asif et al., 2017), detecting students at risk using clicker responses (Choi et al., 2018) and forecasting the relation between studying time and learning performance (Jo et al., 2014);

  2. statistics and mathematics – building a grading system (Vogelsang and Ruppertz, 2015) and temporal discourse analysis of an online discussion (Lee and Tan, 2017);

  3. text mining, semantics and linguistic analysis – summarization of students’ learning journals (Taniguchi et al., 2017) and understanding students’ self-reflections (Kovanović et al., 2018);

  4. visualization – a comprehensive overview of students’ learning from a learning management system (Poon et al., 2017), an awareness tool for teachers and learners (Martinez-Maldonado et al., 2015) and a learning analytics dashboard (Aljohani et al., 2018);

  5. network analysis – relationship analysis between technology use and cognitive presence (Kovanović et al., 2017), classification of students’ patterns into categories based on the level of engagement (Khalil and Ebner, 2016) and a network analysis of LAK (Learning Analytics and Knowledge) conference papers (Dawson et al., 2014);

  6. qualitative analysis – an evaluation of discussion forums of MOOCs (Ezen-Can et al., 2015) and analyzing instructors’ comments (Gardner et al., 2016); and

  7. gamification – e-assessment platform with gamification (Gañán et al., 2017), gamified dashboard (Freitas et al., 2017) and a competency map (Grann and Bushway, 2014).

Results of learning analytics are helpful for teachers and learners to improve their teaching and learning. Therefore, one of the important issues in learning analytics is obtaining feedback for optimizing the learning environment and learners themselves. There are roughly three types of feedback loops in terms of their frequency: yearly, weekly and real-time feedback. The above-mentioned studies basically fall into the yearly or weekly feedback types because the analysis results are not immediately fed back to the on-site teachers/students who provide their educational/learning logs for the analytics. The reason is obvious: learning analytics is basically performed after classes, school terms or school years, so the feedback is delayed accordingly. However, if real-time feedback can be obtained, it can be very useful and helpful for teachers and students in on-site classrooms.

Our study focused on how to provide efficient feedback to on-site classrooms even during lectures. The aim of this study is to realize real-time feedback, which has not often been discussed with respect to on-site educational environments. Our target is on-site classrooms where teachers give lectures and many students listen to the teachers’ explanations, conduct exercises, etc. In such a large classroom, it is not easy for teachers to grasp students’ situations and activities. We utilize not only an e-learning system but also an e-book system to collect real-time learning activities during the lectures. We have developed two main feedback systems. One is useful for a teacher just before the lecture starts: the system provides summary reports of the previews of the given materials and of quiz results. The teacher can check which pages were well previewed and which pages were not previewed by students using the preview achievement graph. Additionally, the teacher can check which quizzes were difficult for students and the suggested pages that should be used in the lecture to aid students. The other consists of real-time analytics graphs, which help the teacher control his/her lecture speed during the lecture. The system sequentially collects the e-book logs produced by students and performs analytics in real time to determine how many students are following the teacher’s explanation. In the rest of this paper, we introduce the details of our real-time feedback system and report the experimental results.

Literature review

There are roughly three types of feedback loops in terms of their frequency: yearly, weekly and real-time feedback. A typical example of a yearly (or term-by-term) feedback loop is the assessment and improvement of education. Students’ grades, examination results, class questionnaires and so on are typically analyzed and evaluated. For example, the relationship between self-efficacy and learning behaviors on the e-book system was analyzed (Yamada et al., 2015), showing teachers that student behaviors regarding markers and annotations are related to self-efficacy and to the intrinsic value of the learning materials. Other examples of a yearly (or term-by-term) feedback loop include an analysis of students’ performance (Okubo et al., 2016) and a prediction of students’ final grades (Mouri et al., 2016). The yearly feedback loop is designed so that the feedback results will be delivered in the next year (or term). In other words, students and teachers will not directly receive the feedback results acquired by analyzing their own learning logs; the feedback may instead be based on learning logs collected in previous years.

A weekly feedback loop can recommend related materials based on students’ status determined using a prediction of academic performance through the analysis of learning logs such as attendance reports and quiz results. For example, the analytics of preview and review patterns (Oi et al., 2015) or learning behavior analytics (Yin et al., 2015) is helpful to understand the weekly performance of students. Text analytics technology provides summarized materials for preview (Shimada et al., 2015) and review (Shimada et al., 2016). In contrast to a yearly feedback loop, the analysis results are directly fed back to the students and teachers who provide the learning logs.

There are several related works that tackle real-time learning analytics. Minovic et al. proposed a visualization tool for teachers to track students’ learning progress in real time during a gameplay session (Minovic and Milovanovic, 2013). Piech et al. collected tens of thousands of program codes and applied a machine learning approach to identify “sink” states of students; feedback was given to students just before they were about to enter such problematic “sink” states (Piech et al., 2012). Freitas et al. discussed the effectiveness of immediate feedback to students, especially the impact of gamification in university education (Freitas et al., 2017). Fu et al. also proposed a real-time analysis of program codes (Fu et al., 2017). They provided a learning dashboard to capture the behavior of students in the classroom and identify the different difficulties faced by students. Although these studies obtained real-time feedback, the targets of the analytics and its feedback were activities in virtual learning environments. In contrast, our study aims to realize real-time feedback loops where the analysis results can be fed back to on-site students and teachers even during a lecture. A teacher can check what students are doing, for example, whether students are following the explanation or whether they are doing something not related to the lecture. A teacher can flexibly control the speed of the lecture and/or take more time for exercises rather than engaging in a nonstop talk.

Implementation

Cyber-physical educational system

In our university, various kinds of educational/learning logs are collected by three systems: e-learning (Moodle), e-portfolio (Mahara) and e-book (BookRoll). Students submit their reports, answer quizzes, access materials and reflect on their learning activities using these systems. More precise learning logs are collected by the e-book system, such as when a student opens some material or when he/she turns a page of the material. All students use their own laptops so that they can access these systems from anywhere, either on or off campus.

The e-book logs were collected via an e-book system called “BookRoll”. Table I shows samples of e-book logs. There are many types of operations recorded in the logs. For example, OPEN means that the student opened the e-book file, whereas NEXT means that the student clicked the next button to move to the subsequent page. The browsing duration for each page can be calculated by subtracting successive timestamps. Learning logs on the e-learning system, such as attendance and quiz scores, are collected from tables in the Moodle database. The system analyzes the quiz scores and class attendance by integrating the related tables. In this study, we mainly use the e-learning system and e-book system to realize the proposed real-time learning analytics system.
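
As an illustration of this calculation, the following sketch (ours, not the actual BookRoll implementation) derives per-page browsing durations from log rows shaped like Table I; the function and field names are hypothetical.

    from datetime import datetime

    def browsing_durations(rows):
        """Yield (user, material, page, seconds) for each browsed page.

        Each row is (user, material, operation, page, timestamp), as in
        Table I. A page's duration is the gap between one operation and
        the next operation by the same user on the same material.
        """
        open_page = {}  # (user, material) -> (page, timestamp) now shown
        for user, material, op, page, ts in sorted(rows, key=lambda r: r[4]):
            key = (user, material)
            if key in open_page:
                prev_page, prev_ts = open_page[key]
                yield user, material, prev_page, (ts - prev_ts).total_seconds()
            if op == "CLOSE":
                open_page.pop(key, None)  # session ended; nothing displayed
            else:
                open_page[key] = (page, ts)

    rows = [("X", "Material A", "OPEN", 0, datetime(2014, 10, 15, 9, 1, 9)),
            ("X", "Material A", "CLOSE", 1, datetime(2014, 10, 15, 9, 1, 13))]
    print(list(browsing_durations(rows)))  # [('X', 'Material A', 0, 4.0)]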

On-site lecture supporting system

We present the example case study shown in Figure 1, which was actually applied to a lecture in our university. The time line is divided into two parts: before starting a class and during a class. During the previous lecture, a teacher gave students some preview materials that were automatically generated using the summarization technique (Shimada et al., 2017). Students previewed the given materials before the class, and the operation logs recorded during the previews were collected by the system. Before the class started, students answered the quizzes, and the results were collected on the server.

Just before the lecture started, our system analyzed the learning logs to make a summary report containing the preview achievement and quiz results (details are given in Section 2.6). Additionally, the system provided information regarding important pages that should be explained well in the lecture. For example, the teacher should focus on pages that are related to quizzes, especially those that have led to lower quiz scores. Our system analyzed the relationship between quiz statements and their related pages in the lecture material in advance. Section 2.3 explains how we automatically discovered important pages.

During the lecture, a teacher explained the contents of the materials, and students browsed the materials on their laptops. In our university, students were asked to open and browse the same page as the teacher and to put highlights or memos on the important points. During the lecture, learning logs were sequentially collected and stored. The analysis results were immediately visualized on the web interface and updated each minute. Therefore, the teacher could check the latest student activities. The visualization included real-time information regarding how many students were following the lecture, how many students were browsing previous pages, etc. The web interface is described in Section 2.6. The teacher adaptively controlled the speed of the lecture according to the students. For example, if many students were not following the lecture and were still on the previous page, then the teacher slowed down the lecture.

Important page mining

There is a strong relationship between lecture materials and quizzes because quizzes are often generated using the contents of lecture materials. Related pages are important to understanding the contents of the materials. However, lecture materials and quizzes are stored separately or are very weakly connected in systems using subject names, for example. We can manually assess the relationship between a quiz item and its related pages, but this is not easy or realistic when the number of quiz items and/or the number of pages increase. Furthermore, if the lecture material is updated, that is, the page numbering changes, then the teacher must update the correspondence. Therefore, we developed a method that automatically determines the correspondences.

Our strategy assumes that a related page contains the same keywords as the quiz statement. Each quiz statement QS is divided into morphemes, from which we extract the nouns n_1, …, n_N. For each noun n, a normalized histogram h_n is created: each bin b_{u,n} of the histogram h_n represents how many times page u contains noun n, and the bins are normalized after counting the occurrences of noun n over all pages. To acquire the final result, we sum the histograms over all nouns and define the normalized value r_u as the related score of page u.
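
The following sketch illustrates this scoring under our own assumptions: extract_nouns stands in for a morphological analyzer (for Japanese text, a tool such as MeCab would fill this role), and pages are given as plain text.

    from collections import Counter

    def related_scores(quiz_statement, pages, extract_nouns):
        """Related score r_u of each page u for one quiz statement.

        pages: list of page texts; extract_nouns: any function returning
        the nouns of a text (a stand-in for the morpheme analysis step).
        """
        page_nouns = [Counter(extract_nouns(p)) for p in pages]
        scores = [0.0] * len(pages)
        for noun in set(extract_nouns(quiz_statement)):
            counts = [pn[noun] for pn in page_nouns]  # raw bins b_{u,n}
            total = sum(counts)
            if total == 0:
                continue  # the noun never appears in the material
            for u, c in enumerate(counts):
                scores[u] += c / total  # bin of normalized histogram h_n
        s = sum(scores)
        return [v / s for v in scores] if s else scores  # related scores r_u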

Although the mining method finds pages that are highly related to a given quiz statement, it does not consider the relationships among pages. Therefore, we also apply a ranking method that assigns a ranking score to each page. This idea was inspired by VisualRank (Jing and Baluja, 2008). A ranking vector R is iteratively updated using:

R = αS × R + (1 − α)B
where S is the column-normalized similarity matrix, and S_{u,v} measures the similarity between pages u and v. In this study, we simply evaluate the similarity using the L2 norm between two feature vectors represented by a bag of words (Zhang et al., 2010). B is a bias vector; we use the related score r_u as an element of B. R is repeatedly updated until it converges. α (0 ≤ α ≤ 1) controls the balance between the similarity matrix and the bias vector. According to the literature (Jing and Baluja, 2008), α > 0.8 is often used in practice. After the ranking vector R converges, pages that are related to important pages have larger ranking scores. We select the top N ranked pages as important.
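
A minimal sketch of this iteration, assuming S is already column-normalized and b holds the related scores r_u:

    import numpy as np

    def ranking_scores(S, b, alpha=0.85, tol=1e-8, max_iter=1000):
        """Iterate R = alpha * S @ R + (1 - alpha) * b until convergence."""
        b = np.asarray(b, dtype=float)
        R = np.full(len(b), 1.0 / len(b))  # uniform initial ranking
        for _ in range(max_iter):
            R_next = alpha * (S @ R) + (1 - alpha) * b
            if np.abs(R_next - R).sum() < tol:  # L1 change below tolerance
                return R_next
            R = R_next
        return R

    # Top-N ranked pages are then suggested as important:
    # important = np.argsort(-ranking_scores(S, b))[:N]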

Preview achievement

By analyzing e-book operation logs, we can determine how long students spend previewing each page of a given material. The previewing time for each page can be easily acquired by subtracting two successive timestamps in the operation logs. Note that we ignored durations of less than 3 seconds and more than 600 seconds to discard skipped and abandoned pages. Figure 2 shows an example of a visualized result of preview achievement. A teacher can check the preview status of the given materials before beginning his/her lecture.
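
One plausible way to compute the per-page achievement, sketched under our assumptions (here, “achievement” is read as the fraction of students with at least one valid preview of the page; the names are hypothetical):

    def preview_achievement(durations, n_pages, n_students,
                            min_sec=3, max_sec=600):
        """Fraction of students with a valid preview of each page.

        durations: iterable of (student, page, seconds) from preview logs.
        Views under 3 s (skipped) or over 600 s (abandoned) are discarded,
        following the rule described in the text.
        """
        previewed = [set() for _ in range(n_pages)]
        for student, page, sec in durations:
            if min_sec <= sec <= max_sec:
                previewed[page].add(student)
        return [len(s) / n_students for s in previewed]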

Quiz results

The quiz results and questions are collected from the e-learning system, and the scores are aggregated over the class. We set a threshold for the ratio of correct answers (in our implementation, 50 per cent), and if the accuracy is lower than the threshold, then the important pages, which are automatically mined in advance, are displayed below the summary graph. See Figure 3 for an example of the web page. A summary graph of the quiz results is followed by the quiz statements and related page information, if necessary.

Visualizer of web pages

The proposed visualizer of the analysis results was implemented as a web system. A teacher can easily access the web page from a PC. Before the lecture starts, a teacher can access the web pages that provide summary reports of the previews of given materials and quiz results, as shown in Figures 2 and 3. The teacher can check which pages were well previewed and which pages were not previewed by students using the preview achievement graph. Additionally, the teacher can check which quizzes were difficult for students and the suggested pages that should be explained in the lecture to aid students.

During the lecture, the teacher can access two kinds of real-time analytics graphs. One is the real-time heat map shown in Figure 4. The horizontal and vertical axes represent the time of day and the page number, respectively. In other words, a vertical line corresponds to the distribution of the number of students who are browsing each page. The vertical lines are updated each minute; that is, a new line is added per minute. Each cell represents the number of students. The page being explained by the teacher is highlighted by red-colored rectangles. If a brighter color (red, orange, yellow or green) is used on the page being explained by the teacher, then most students are following the teacher’s explanation. Students are asked to try to be on the same page as the teacher and to add highlights or memos if necessary. Therefore, when the distribution of the students is skewed downward, some students are still browsing previous pages. In such a case, the teacher should slow down the lecture so that students can keep up.
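
A sketch of how one vertical line of the heat map could be built each minute (our illustration, not the deployed code):

    import numpy as np

    def heatmap_column(current_pages, n_pages):
        """Students per page at this minute: one vertical heat-map line.

        current_pages maps each student to the page he/she is browsing,
        derived from that student's latest e-book operation.
        """
        column = np.zeros(n_pages, dtype=int)
        for page in current_pages.values():
            column[page] += 1
        return column

    # Stacking one column per minute yields the time x page matrix of
    # Figure 4; the teacher's current page is highlighted on top of it.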

The other real-time analytics graph is the circular chart (left part of Figure 5), which is a summarized version of the above heat map. It can take a teacher some time to check and understand the situation from the heat map; to provide a visual summary, the second graph focuses on the ratio of three types of students: those browsing previous pages (blue), those browsing the same page as the teacher (green) and those browsing the next pages (red). This chart is also updated each minute to display the latest status of students. The visualizer also provides a breakdown of the three types based on whether students previewed the page in advance (light color) or not (dark color). For example, if many students are still browsing previous pages although most of them previewed the pages in advance, and if the pages are important ones related to a difficult quiz, then the teacher should wait for students to catch up and explain the material slowly and carefully. Conversely, a teacher should proceed with the lecture when many students are browsing the subsequent pages and most of them have previewed the materials in advance; in such a situation, students may get bored during a long explanation, or some students may have finished a given exercise.
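
A sketch of the tally behind the six segments of the chart (our formalization; previewed is a set of (student, page) pairs with a valid preview):

    def circular_chart_counts(current_pages, previewed, teacher_page):
        """Tally the six segments of the circular chart in Figure 5."""
        counts = {(s, p): 0 for s in ("previous", "same", "next")
                  for p in (True, False)}
        for student, page in current_pages.items():
            if page < teacher_page:
                status = "previous"  # blue: behind the teacher
            elif page == teacher_page:
                status = "same"      # green: following the explanation
            else:
                status = "next"      # red: reading ahead
            counts[(status, (student, page) in previewed)] += 1
        return counts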

The right part of Figure 5 is a time series of the circular chart. The teacher can see the recent trends of each status. As described above, real-time analytics graphs provide an opportunity to flexibly adjust the lecture progress based on students’ status.

Experimental results

Investigation of each tool

We investigated the effectiveness of each proposed tool in two classes at our university. One was a control group (N = 58) that did not use the above system, and the other was an experimental group (N = 157) that used the system. The contents of the two lectures were completely the same. Students chose one of them according to their schedules. Therefore, the number of students was not balanced between the two classes. The class was designed to provide an introduction to information and communication technology in a number of disciplines. First-year students, including both arts and science students, attended the class, which commenced in October 2016. All the students brought their own laptops to the class.

The lecture was given by the same teacher using the same materials. The teacher used two materials: Material 1 consisted of 37 pages, and Material 2 consisted of 47 pages. The teacher began with Material 1, moved on to Material 2 and asked students to follow the pages in the materials using bookmarks, highlights and memos. Operation logs were sequentially collected on the server, and real-time analysis was conducted. The results were fed back to the teacher minute by minute in the experimental group only. More details are provided in Table II. We conducted a pretest to determine students’ basic knowledge of information science; there was no significant difference between the two groups.

Synchronization

While the teacher conducted the lecture with the students in the experimental group, he monitored the display on which the real-time analysis results were drawn. He controlled the speed of the lecture to help students keep up as much as possible. Our hypothesis in this experiment was that the students in the experimental group would open and follow the pages explained by the teacher more closely than those in the control group. We evaluated the synchronization of the classroom, that is, how many students were on the pages that were being explained by the teacher. We counted this number minute by minute under an allowable delay setting, that is, a short grace period for accepting delayed e-book operations.
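
The exact counting rule is not spelled out in the text; the sketch below is one plausible formalization in which a student counts as synchronized at minute t if he/she reaches the teacher's page within the allowable delay.

    def synchronization_ratio(student_page, teacher_page, n_students,
                              n_minutes, allowable_delay=3):
        """Share of (student, minute) pairs that are synchronized.

        student_page[(s, t)]: page student s browses at minute t;
        teacher_page[t]: page being explained at minute t.
        """
        synced = 0
        for t in range(n_minutes):
            last = min(t + allowable_delay, n_minutes - 1)
            for s in range(n_students):
                if any(student_page.get((s, u)) == teacher_page[t]
                       for u in range(t, last + 1)):
                    synced += 1
        return synced / (n_students * n_minutes)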

Table III shows the synchronization ratio of each group. For example, if we set the allowable delay to 3 minutes (i.e. if students opened the same page as the teacher within a 3-minute delay), the synchronization ratio of the experimental group was 0.7661, which was significantly different from that of the control group. For the other allowable delay settings as well, the synchronization ratios of the experimental group were higher than those of the control group. We believe that such high synchronization was realized by the lecture speed control through real-time feedback on classroom activities.

Effectiveness of important page suggestions

The analyses of preview status and quiz scores were conducted just before the lecture started. The system reported that most students answered two of the eleven questions incorrectly. The pages related to these quizzes (Page #10 of Material 1 and Page #27 of Material 2) were shown on the display, and the teacher confirmed them. Our hypothesis in this investigation was that the teacher would spend more time explaining these pages, thereby encouraging students to leave more learning actions on them.

We analyzed the browsing duration of each page and found that the teacher kept Page #10 open for 3 minutes in the experimental group but for only 1 minute in the control group. In addition, we analyzed the number of bookmarks, highlights and memos on the above two pages, for which the teacher had emphasized his explanations. About 61 per cent of students used these functions in the experimental group, whereas about 53 per cent did so in the control group.

Additionally, we analyzed the utilization ratios of the three functions throughout the materials and compared the ratios between the two groups. Table IV shows that more students in the experimental group used the functions than in the control group. We believe that the students in the experimental group had enough time to use bookmarks, highlights and/or memos because the teacher emphasized his explanations of important pages while adjusting the speed of his lecture based on the real-time situation of the classroom.

Investigation of on-site lectures

We conducted further experiments in large-scale classrooms with more than 150 students at our university in April 2018. Three classes – Class 1 (N = 174), Class 2 (N = 157) and Class 3 (N = 159) – joined our experiments. We confirmed in advance that there was no significant difference in basic knowledge among the classes. The lecture was designed to teach cyber security to first-year students. In the three classrooms, the same lecture materials were used over two weeks. All students brought their laptops to class. Class 1 was conducted by Teacher A without the supporting system, whereas Classes 2 and 3 were conducted by Teacher B with our supporting system. Classes 1 and 2 were conducted in parallel, and Class 3 was conducted just after Class 2 on the same day. In all classes, students were asked to follow the material pages explained by the teacher and to use bookmarks, highlights and memos, as in the experiments described above.

In Classes 2 and 3, the teacher controlled the lecture speed by checking the real-time heat map. Furthermore, Teacher B developed a better teaching plan for Class 3 by checking the distribution of the real-time heat map just after Class 2. There was a 20-minute break between Classes 2 and 3, so the improvement could be applied immediately to the following class. Teacher B discovered several time slots in which the distribution of the real-time heat map was skewed downward in Class 2. He then checked the corresponding pages in the lecture materials and considered a new teaching strategy for those pages in Class 3. Table VIII summarizes the pages and the improvement plans that the teacher considered and adopted after Class 2.

Effectiveness of real-time heat map

Our hypothesis was that the real-time heat map would enable adaptive control of the lecture speed, giving students enough time to use the e-book functions of bookmarks, highlights and memos. First, we investigated the utilization ratios of these three functions, that is, how often the students used them during the class. We counted the number of operations performed by each student and evaluated the differences among the classes. According to an ANOVA, there was a significant difference among the classes; therefore, we conducted t-tests for all pairwise combinations using a stringent (Bonferroni-corrected) level of statistical significance.
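
A sketch of this testing procedure (c1, c2 and c3 are hypothetical arrays of per-student operation counts for Classes 1-3):

    from scipy import stats

    def compare_classes(c1, c2, c3, alpha=0.05):
        """One-way ANOVA, then Bonferroni-corrected pairwise t-tests."""
        _, p = stats.f_oneway(c1, c2, c3)
        if p >= alpha:
            return None  # no overall difference; stop here
        pairs = {"1 vs 2": (c1, c2), "1 vs 3": (c1, c3), "2 vs 3": (c2, c3)}
        corrected = alpha / len(pairs)  # e.g. p < 0.05/3, as in the figures
        return {name: stats.ttest_ind(a, b).pvalue < corrected
                for name, (a, b) in pairs.items()}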

Tables V and VI show the average number of operations for bookmarks, highlights and memos and their standard deviations (SDs) in each class. Further, Figures 6 and 7 show the visual differences in the average number of operations among the three classes. The results of the t-tests are drawn as *(p < 0.05/3), **(p < 0.01/3) and ***(p < 0.001/3). Although the usage of the memo function did not differ among the classes, the other functions were frequently utilized by students in Classes 2 and 3 over the two weeks. In these two classes, Teacher B controlled the speed of his lecture by checking the real-time heat map; as a result, we believe, students left many bookmarks and highlights on each page of the lecture materials. Furthermore, the students in Class 3 tended to leave more highlights than those in Class 2. We believe this effect came from the improvement in the lecture design implemented just after Class 2 finished. We discuss the details in the following subsection.

Second, we investigated how students followed the lecture by evaluating the distribution of pages browsed by students. The evaluation was conducted by calculating the average and SD of the browsed page numbers minute by minute. The more students browsed the same page, the smaller the SD; a large SD indicates that students browsed a variety of pages. Given the intent of the lecture design, a small SD was ideal because students were asked to listen carefully to the teacher’s talk and to follow the pages, leaving bookmarks, highlights and memos as much as possible. The bottom parts of Figures 8 and 9 depict the periods when the teacher was talking about the contents of the lecture material in each class. Teacher A (Class 1) took a longer time for giving an explanation than Teacher B (Classes 2 and 3). We calculated the average SDs over the talking period and found that the average SDs of Classes 2 and 3 were much smaller than the SD of Class 1. Table VII summarizes the details of the average SDs. In the classrooms with our real-time heat map (Classes 2 and 3), not only the average SD but also its variance was smaller than in the classroom without our system (Class 1). This means that most students followed the pages over time, which resulted in students leaving more bookmarks and highlights while listening to the explanation given by the teacher.
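
A sketch of the minute-by-minute SD series plotted in Figures 8 and 9 (the names are our own):

    import numpy as np

    def pageview_sd_series(student_page, n_students, minutes):
        """SD of the browsed-page distribution at each minute.

        A small SD means most students are on the same page; a large SD
        means they are scattered across the material. student_page[(s, t)]
        is the page student s browses at minute t (absent if no log).
        """
        sds = []
        for t in minutes:
            pages = [student_page[(s, t)] for s in range(n_students)
                     if (s, t) in student_page]
            sds.append(float(np.std(pages)) if pages else float("nan"))
        return sds

    # The "average SD" of Table VII is the mean of this series over the
    # talking period (e.g. np.nanmean(sds)).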

Investigation of lecture improvement

In this experiment, we investigated our hypothesis that the teacher could devise better lecture strategies for an upcoming lecture, resulting in better student engagement. As mentioned above, Classes 2 and 3 were conducted by the same teacher, and Class 2 was followed by Class 3 on the same day. During the break between the two classes, the teacher checked the heat map again and devised the improved lecture plan shown in Table VIII. There were a total of five cases of improvement. For instance, in Cases 1 and 2, the lecture materials provided a tutorial on how to back up data and how to update software in operating systems such as iOS, Android and Windows. In Class 2, the teacher explained all the contents sequentially; according to the real-time heat map shown in Figure 10, the distribution was skewed downward. After Class 2, the teacher decided to let students read these pages by themselves based on their own environment (some students focusing on the iOS pages, others on the Android pages, etc.). As a result, students freely browsed the pages in Class 3. The bottom right part of Figure 10 shows the distribution of Class 3. The distribution does not look very different from that of Class 2, but the number of bookmark and highlight operations drastically increased in Class 3 (Table IX).

Table IX shows the number of operations – bookmark, highlight and memo – recorded on the pages corresponding to the five cases in Table VIII. In total, the students in Class 3 tended to leave more bookmarks and highlights on these pages. Additionally, the number of students who utilized these functions also drastically increased, especially the highlight function in Cases 3, 4 and 5. In these cases, the teacher’s improvement strategy was to give students more time for thinking about, researching and reviewing the contents. We believe that students were naturally encouraged to put bookmarks and highlights on the keywords and sentences that they thought were important.

Conclusion

We proposed a lecture supporting system based on real-time learning analytics for on-site classrooms. Our system provided summary reports of previews and quiz scores just before a lecture started. The report helped teachers check which pages were well previewed and which pages were not previewed by students using the preview achievement graph. Additionally, the teacher could check which quizzes were difficult for students, and our system automatically suggested related pages that needed to be explained in the lecture to aid students. Furthermore, real-time analytics graphs helped the teacher control his/her lecture speed during the lecture. The proposed real-time learning analytics system supported on-site lectures in the following aspects:

  • The teacher could adjust the speed of his or her lecture based on the real-time feedback system.

  • The teacher could emphasize important points that were misunderstood by students.

The following effects on students were also confirmed:

  • The students could keep up with the lecture by following the pages explained by the teacher.

  • Many students could put bookmarks, highlights and memos on important pages.

Through the empirical investigation in the lectures of our university, we found positive effects. First, the teacher could confirm whether students were following his lecture by checking the real-time heat map. When many students opened previous pages, the teacher could slow down the speed of the lecture. This adaptive control of lecture speed provided higher synchronization between the teacher and students, and we believe that high synchronization leads to an improvement in lecture satisfaction. Additionally, in the class that used the real-time heat map, more students tended to follow the same page, with smaller variance, than in the class that did not use the map. The real-time heat map not only allows regulated browsing of lecture material but also gives students an opportunity to use e-book operations such as bookmarks, highlights and memos. Second, the important page suggestion system helped the teacher to quickly check the pages that students had difficulty understanding. The teacher could spend a longer time explaining the suggested pages; as a result, students left more highlights and memos on those pages for a better understanding of the contents. Third, the lecture strategy was improved thanks to the real-time heat map. The teacher could reflect on his lecture just after it finished and consider new lecture plans. The improvement was implemented immediately, and its effectiveness was clearly confirmed: many students were encouraged to put more bookmarks, highlights and memos on the pages where the teacher changed his explanation plan. Traditional weekly or yearly feedback loops in learning analytics could not realize such immediate improvement of a lecture plan. Real-time learning analytics is thus a powerful tool for realizing a real-time feedback loop that supports not only the improvement of lecture plans but also the adaptive adjustment of teaching and learning based on the situation in on-site classrooms.

In future work, we will continue to use the proposed real-time learning analytics system in other lectures and investigate its effectiveness at a larger scale. We also plan to analyze the relationship between students’ learning activities and learning performance, and to develop other report graphs that support the teacher’s decisions in the classroom. Another important aspect is a qualitative evaluation of how the system affects the motivation and satisfaction of students and teachers. We will discuss these issues with researchers in the cognitive and pedagogical fields and conduct further evaluations.

Figures

Figure 1. A case study

Figure 2. Example preview achievement

Figure 3. Example quiz results and related information

Figure 4. Real-time heat map of browsed pages

Figure 5. Real-time circular chart of student status

Figure 6. The average number of operations per student in the first week

Figure 7. The average number of operations per student in the second week

Figure 8. The time-series standard deviation of page-view distribution during the lecture in the first week

Figure 9. The time-series standard deviation of page-view distribution during the lecture in the second week

Figure 10. Real-time heat map of Classes 2 and 3 in the first week

Table I. Sample e-book logs

User Material Operation Page no. Date Time
X Material A OPEN 0 2014/10/15 9:01:09
X Material A CLOSE 1 2014/10/15 9:01:13
Y Material B PREV 25 2014/10/29 10:05:35
Y Material C NEXT 2 2014/11/19 8:52:47
Z Material D NEXT 9 2014/11/12 9:31:30

Table II. Detailed information on each group (n.s.: not significant)

Control Experimental p-value
# of students 58 157
pretest average 6.85 ± 2.28 6.99 ± 2.38 n.s.
e-Book logs 16,335 39,722
logs/students 281.6 ± 123.3 253.0 ± 129.1 n.s.

Table III. Synchronization ratio of each group for three allowable delay lengths. *: p < 0.05, **: p < 0.01

Control Experimental p-value
1 minute 0.4275 0.5174 0.0403 *
3 minutes 0.6598 0.7661 0.0033 **
5 minutes 0.7508 0.8599 0.0014 **

Table IV. Utilization ratios of three functions during the lecture

Function Control Experimental
Bookmarks 0.828 0.904
Highlights 0.759 0.834
Notes 0.293 0.471

Table V. The average number of operations and standard deviations per student in the first week

Class 1 Class 2 Class 3
Average SD Average SD Average SD
Bookmark 0.48 2.13 1.44 3.49 2.48 4.34
Highlight 3.19 7.74 7.34 13.46 10.32 14.76
Memo 1.20 3.41 1.02 3.92 1.20 3.21

Table VI. The average number of operations and standard deviations per student in the second week

Class 1 Class 2 Class 3
Average SD Average SD Average SD
Bookmark 0.23 1.11 1.70 4.52 2.05 3.71
Highlight 1.71 5.69 5.04 12.25 14.60 21.87
Memo 0.91 2.95 0.39 1.92 0.79 2.31

Table VII. The average SDs of browsed pages over the talking period

Class 1 Class 2 Class 3
Average SD SD of SDs Average SD SD of SDs Average SD SD of SDs
1st week 7.42 2.85 4.60 1.70 3.53 1.52
2nd week 7.31 2.63 5.01 1.85 4.95 2.20

Table VIII. Improvement strategy toward Class 3 from the reflection on Class 2

Case Week Pages Contents in the pages and the improvement strategy
Case 1 First week 35-37 Tutorial on how to back up data in each OS (iOS/Android/Windows)
-> Allow students to read the pages depending on their own environment
Case 2 First week 43-55 Tutorial on how to update the software and applications in each OS
-> Allow students to read the pages depending on their own environment
Case 3 Second week 16-20 Password management; what is a good/bad password?
-> Take a little longer to make students think about their own passwords
Case 4 Second week 22 Introduction to password management software
-> Allow a few minutes to research the software on the Web
Case 5 Second week 39-42 Wi-Fi security; points to keep in mind
-> Spend a longer time reviewing the contents of the pages

Table IX. The number of operations and the number of students who utilized each function for each case

The number of operations/The number of students
Bookmark Highlight Memo
Class 2 Class 3 Class 2 Class 3 Class 2 Class 3
Case 1 7/6 35/28 40/22 74/27 13/9 10/9
Case 2 33/19 65/32 160/37 228/55 21/10 6/5
Case 3 15/11 18/11 37/11 139/38 4/2 13/9
Case 4 6/6 10/9 8/7 54/27 0/0 1/1
Case 5 32/25 93/62 71/15 258/48 7/3 15/10

References

Aljohani, N.R., Daud, A., Abbasi, R.A., Alowibdi, J.S., Basheri, M. and Aslam, M.A. (2018), “An integrated framework for course adapted student learning analytics dashboard”, Computers in Human Behavior.

Asif, R., Merceron, A., Ali, S.A. and Haider, N.G. (2017), “Analyzing undergraduate students’ performance using educational data mining”, Computers and Education, Vol. 113, pp. 177-194.

Bradford, P., Porciello, M., Balkon, N. and Backus, D. (2007), “The blackboard learning system: the be all and end all in educational instruction?”, Journal of Educational Technology Systems, Vol. 35 No. 3, pp. 301-314.

Choi, S.P.M., Lam, S.S., Li, K.C. and Wong, B.T.M. (2018), “Learning analytics at low cost: at-risk student prediction with clicker data and systematic proactive interventions”, Journal of Educational Technology and Society, Vol. 21 No. 2, pp. 273-290.

Dawson, S., Gasevic, D., Siemens, G. and Joksimovic, S. (2014), “Current state and future trends: a citation network analysis of the learning analytics field”, in Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, LAK14, pp. 231-240.

Dougiamas, M. and Taylor, P. (2003), “Moodle: Using learning communities to create an open source course management system”, in D. Lassner and C. McNaught, (Eds), Proceedings of EdMedia, World Conference on Educational Media and Technology 2003, pp. 171-178.

Ezen-Can, A., Boyer, K.E., Kellogg, S. and Booth, S. (2015), “Unsupervised modeling for understanding MOOC discussion forums: a learning analytics approach”, in Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, LAK15, pp. 146-150.

Freitas, S.d., Gibson, D., Alvarez, V., Irving, L., Star, K., Charleer, S. and Verbert, K. (2017), “How to use gamified dashboards and learning analytics for providing immediate student feedback and performance tracking in higher education”, Proceedings of the 26th International Conference on World Wide Web Companion, pp. 429-434.

Fu, X., Shimada, A., Taniguchi, Y., Suehiro, D. and Ogata, H. (2017), “Real-time learning analytics for C programming language courses”, in Proceedings of the Seventh International Conference on Learning Analytics And Knowledge, LAK17.

Gañán, D., Caballé, S., Clarisó, R., Conesa, J. and Bañeres, D. (2017), “ICT-FLAG: a web-based e-assessment platform featuring learning analytics and gamification”, International Journal of Web Information Systems, Vol. 13 No. 1, pp. 25-54.

Gardner, E.E., Anderson, L.B. and Wolvin, A.D. (2016), “Understanding instructor immediacy, credibility, and facework strategies through a qualitative analysis of written instructor feedback”, Qualitative Research Reports in Communication, Vol. 18 No. 1, pp. 27-35.

Grann, J. and Bushway, D. (2014), “Competency map: Visualizing student learning to promote student success”, in Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, LAK14, pp. 168-172.

Jing, Y. and Baluja, S. (2008), “Visualrank: applying pagerank to large-scale image search”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 30, pp. 1877-1890.

Jo, I-H., Kim, D. and Yoon, M. (2014), “Analyzing the log patterns of adult learners in LMS using learning analytics”, in Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, LAK14, pp. 183-187.

Kovanović, V., Joksimović, S., Poquet, S., Hennis, T., Dawson, S., Gašević, D., de Vries, P., Hatala, M. and Siemens, G. (2017), “Understanding the relationship between technology-use and cognitive presence in MOOCs”, in Proceedings of the Seventh International Conference on Learning Analytics And Knowledge, LAK17, pp. 582-583.

Kovanović, V., Joksimović, S., Mirriahi, N., Blaine, E., Gašević, D., Siemens, G. and Dawson, S. (2018), “Understand students’ self-reflections through learning analytics”, in Proceedings of the Eighth International Conference on Learning Analytics And Knowledge, LAK18, pp. 389-398.

Khalil, M. and Ebner, M. (2016), “Clustering patterns of engagement in massive open online courses (MOOCs): the use of learning analytics to reveal student categories”, Journal of Computing in Higher Education, Vol. 29 No. 1, pp. 114-132.

Khalil, M. and Ebner, M. (2016), “What is learning analytics about? a survey of different methods used in 2013-2015”, in Proceedings of Smart Learning Conference, pp. 294-304.

Lee, A.V.Y. and Tan, S.C. (2017), “Temporal analytics with discourse analysis: tracing ideas and impact on communal discourse”, in Proceedings of the Seventh International Conference on Learning Analytics And Knowledge, LAK17, pp. 120-127.

Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J. and Clayphan, A. (2015), “The latux workflow: Designing and deploying awareness tools in technology-enabled learning settings”, in Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, LAK15, pp. 1-10.

Minovic, M. and Milovanovic, M. (2013), “Real-time learning analytics in educational games”, The First International Conference on Technological Ecosystem for Enhancing Multiculturality, pp. 245-251.

Mouri, K., Okubo, F., Shimada, A. and Ogata, H. (2016), “Bayesian network for predicting students’ final grade using e-book logs in university education”, in IEEE International Conference on Advanced Learning Technologies (ICALT2016), pp. 85-89.

Oi, M., Okubo, F., Shimada, A., Yin, C. and Ogata, H. (2015), “Analysis of preview and review patterns in undergraduates’ e-book logs”, in The 23rd International Conference on Computers in Education (ICCE2015), pp. 166-171.

Okubo, F., Hirokawa, S., Oi, M., Yin, C., Shimada, A., Kojima, K., Yamada, M. and Ogata, H. (2016), “Learning activity features of high performance students”, in Cross-LAK2016.

Piech, C., Sahami, M., Koller, D., Cooper, S. and Blikstein, P. (2012), “Modeling how students learn to program”, in Proceedings of the 43rd ACM Technical Symposium on Computer Science Education, SIGCSE12, pp. 153-160.

Poon, K.M., Kong, S.C., Yau, T.S.H., Wong, M. and Ling, M.H. (2017), “Learning analytics for monitoring students participation online: visualizing navigational patterns on learning management system”, Blended Learning: New Challenges and Innovative Practices, Vol. 10309, pp. 166-176.

Shimada, A., Okubo, F., Yin, C. and Ogata, H. (2016), “Automatic generation of personalized review materials based on across-learning-system analysis”, in Cross-LAK2016.

Shimada, A., Okubo, F., Yin, C. and Ogata, H. (2015), “Automatic summarization of lecture slides for enhanced student preview”, The 23rd International Conference on Computers in Education (ICCE2015), pp. 218-227.

Shimada, A., Okubo, F., Yin, C. and Ogata, H. (2017), “Automatic summarization of lecture slides for enhanced student preview: technical report and user study”, IEEE Transactions on Learning Technologies, No. 99.

Taniguchi, Y., Okubo, F., Shimada, A. and Konomi, S. (2017), “Exploring students’ learning journals with Web-Based interactive report tool”, 14th International Conference On Cognition and Exploratory Learning in The Digital Age (CELDA 2017), pp. 251-254.

Vogelsang, T. and Ruppertz, L. (2015), “On the validity of peer grading and a cloud teaching assistant system”, in Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, LAK15, pp. 41-50.

Yamada, M., Yin, C., Shimada, A., Kojima, K., Okubo, F. and Ogata, H. (2015), “Preliminary research on self-regulated learning and learning logs in a ubiquitous learning environment”, in 15th IEEE International Conference on Advanced Learning Technologies, ICALT 2015, pp. 93-95.

Yin, C., Okubo, F., Shimada, A., Hirokawa, S., Ogata, H. and Oi, M. (2015), “Identifying and analyzing the learning behaviors of students using e‐books”, in The 23rd International Conference on Computers in Education (ICCE2015), pp. 118-120.

Zhang, Y., Jin, R. and Zhou, Z.-H. (2010), “Understanding bag-of-words model: a statistical framework”, International Journal of Machine Learning and Cybernetics, Vol. 1 Nos 1/4, pp. 43-52.


Acknowledgements

This work was supported by JST PRESTO Grant Number JPMJPR1505 and JSPS KAKENHI Grant Number JP16H06304, Japan.

Corresponding author

Atsushi Shimada can be contacted at: atsushi@ait.kyushu-u.ac.jp
