Within higher education, the introduction of social distancing measures in light of a global pandemic forced an abrupt shift from face-to-face to online lecturing. To enrich the connection between students and instructors, the authors integrated elaborate interactive activities into large online lectures to enhance both students’ cognitive engagement and social presence.
In this study, the goals are twofold. First, the authors introduce a classroom orchestration system and its features that support active learning across learning environments. Second, they investigate the differences and similarities between student behaviors during these activities in face-to-face and online settings.
The findings reveal differences in student behaviors between cohorts, but none between learning environments, highlighting the versatility of the orchestration system across face-to-face and online settings.
This work presents the use of a classroom orchestration tool that is designed to easily support teaching and learning in online and face-to-face contexts and is particularly well suited for large classes.
Online lectures can be more than watching a teacher speak on a computer display. Rich class-wide learning activities can be integrated into online lectures to support more cognitive engagement during the lecture.
Olsen, J.K., Faucon, L. and Dillenbourg, P. (2020), “Transferring interactive activities in large lectures from face-to-face to online settings”, Information and Learning Sciences, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/ILS-04-2020-0109
Copyright © 2020, Emerald Publishing Limited
Within higher education, there was an abrupt shift from face-to-face to online lecturing with the introduction of social distancing measures in light of a global pandemic. Some instructors have addressed this challenge by primarily focusing on the continued dissemination of information through prerecorded lectures, while others have shifted to synchronous video conferencing for continued support through interaction, with many solutions falling in between. For large lectures, this shift to online instruction may lead to a loss of classroom awareness and social presence from both the instructor and student perspectives, leading to lower learning outcomes (Nortvig et al., 2018; Stott, 2016). Many of the face-to-face cues that instructors use to orchestrate their classrooms – with classroom orchestration defined as the real-time management of activities across multiple social levels under multiple classroom constraints (Dillenbourg, 2013) – such as the noise level of the classroom and non-verbal cues like posture and facial expressions, are no longer available. From the student perspective, there is a loss of connection with the class and the material when sitting alone in a room.
To support the connection between students and instructors, we propose integrating interactive activities into the lecture, both to engage and motivate students with the material (Bonwell and Eison, 1991) and to give instructors a sense of student engagement through the awareness features of orchestration tools. Many of the tools developed to support large classes are designed with a specific learning environment in mind and do not easily transfer to the other. For example, in MOOCs, discussion boards are used to asynchronously engage large classes of students in interacting with the material and each other (Hew, 2016). In the classroom, by contrast, class response systems, which are designed around the physical presence of students, are often used (Kay and LeSage, 2009). When implementing synchronous online learning, it is not enough to have tools that support student interaction; instructors also need tools that support awareness of the learning environment, especially in large lectures that are not traditionally well served by video conferencing. In this paper, we are interested in how active learning activities translate from a face-to-face setting to an online environment where, one could argue, they can add even more value, as they not only engage the students cognitively but also may provide social connections between the students and the instructor.
Freeman et al. (2014) show that the use of active learning in STEM classes increases student performance. Particularly in lectures, short exercises and quizzes support student understanding, with the collected data allowing the instructor to assess students’ current state of knowledge (Wessels et al., 2007). The questions and exercises are most effective when they support peer discussion and teaching that adapts to the needs of the students (Draper and Brown, 2004). In contrast to active learning occurring after the relevant instruction, as is often done in lectures, research shows that it may be more effective for students to engage in the active learning activities before receiving any direct instruction (Loibl et al., 2017; Schwartz and Bransford, 1998). Research has shown the importance of student interaction with the content, instructor and classmates across face-to-face and online learning environments (Bolliger and Martin, 2018), but how these connections are established may differ (Szeto and Cheng, 2016).
How active learning is supported can depend on the learning environment and class size. In face-to-face learning, we often have synchronous learning sessions in which everyone is collocated, independent of class size. In this case, instructors can engage their students through audience response systems that make student data more visible in the classroom by aggregating student answers (Kay and LeSage, 2009; Wessels et al., 2007). For example, classroom response systems have been used to gauge student understanding during a lecture by having students answer conceptual questions throughout the lesson (Wessels et al., 2007) and to provide real-time feedback on open-ended questions (Wylie et al., 2014). Response systems are beneficial because they increase the interactions and engagement that students have in class, which in turn positively impacts student learning (Blasco-Arcas et al., 2013). However, they are often limited to short sets of independent questions and answers, whereas we aim to integrate multi-step scenarios with data flowing between activities.
In online settings, there is often a reduction in interaction with faculty (Paulsen and McCormick, 2020). With smaller class sizes, instructors can engage students in synchronous video conferencing to support student interactions (Lee et al., 2017). But, as with face-to-face classes, as the class size grows, individual interactions between the instructor and students become less feasible, and other forms of engagement are needed. Unlike in face-to-face settings, where these tools support real-time interactions, online interactions often become asynchronous, which can support a higher capacity of students (Elison-Bowers et al., 2011). However, this shift does not address the issues of social presence and awareness that are still encountered during a large online lecture.
In this study, our goals are twofold. First, we will introduce our orchestration system and its features that support active learning across learning environments. Second, we investigate the research question: What are the differences and similarities between student behaviors during interactive learning activities in face-to-face and online settings? In particular, we analyze the engagement of students with the activity in terms of their:
start and completion times; and
accuracy of answers.
Through this case study, we aim to present a method of engaging students in large lectures that can be supported in both face-to-face and online settings, allowing for the smooth transition between the two.
In this paper, we compare student performance across three classes in which the students all engaged with the same four interactive activities during lecture. All three of our data sets are from an HCI bachelor’s course and come from consecutive years between 2018 and 2020. Each year, the course had around 80 students attending the lecture. For the 2018 and 2019 data sets, the course was taught fully face-to-face. For the 2020 data set, the students began the semester face-to-face and transitioned to online; the data from 2020 were therefore split between the two learning environments.
During lectures, the students engaged with four experiential activities aimed at teaching concepts of cognition and user interfaces. The four activities are:
Genealogy: The students solve logic problems based on family trees of increasing difficulty necessitating more effort and working memory. They experience the effects of cognitive load, especially overload.
Dual: The students discover another measure of cognitive load. They experience the split-attention effect, as they are asked to indicate whether two shapes are symmetric reflections of each other while playing an increasingly fast game of Breakout.
Stroop: The students experience the Stroop effect by answering questions that identify a color word written either in its matching color or not (a minimal trial sketch follows this list).
Train: The students use four different user interfaces for the same task of ordering three train tickets on each, with the goal of comparing the usability of these interfaces.
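To make the flavor of these activities concrete, the following is a minimal sketch of how a Stroop-style trial could be generated and scored. It is purely illustrative and not the course’s actual implementation; the color set and scoring rule are assumptions.

```python
import random

# Illustrative sketch only; not the course's actual Stroop implementation.
COLORS = ["red", "green", "blue", "yellow"]

def make_trial(congruent):
    """Return (word, ink): the displayed color word and its ink color.

    In a congruent trial the word is written in its matching color;
    in an incongruent trial it is written in a different color.
    """
    word = random.choice(COLORS)
    if congruent:
        ink = word
    else:
        ink = random.choice([c for c in COLORS if c != word])
    return word, ink

def is_correct(trial, answer):
    """Score a response: the student must name the ink color, not the word."""
    _word, ink = trial
    return answer == ink

trial = make_trial(congruent=False)
print(trial, is_correct(trial, trial[1]))  # naming the ink color is correct
```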
Each of the activities was used to introduce a concept before lecturing about that concept, following the principles of “a time for telling” (Schwartz and Bransford, 1998). After the students completed the task, the class data were aggregated and used by the instructor during his lecture to connect the learners’ data with the concept of the day.
For each of the learning activities, the instructor used FROG, a tool for orchestrating a learning scenario composed of individual, team and class-wide activities in technology-enhanced classrooms (Håklev et al., 2017). The originality of FROG as compared to other classroom participation systems is that it can transfer data or artefacts between consecutive activities, which enables functionalities such as automatic team formation or debriefing support. For example, after the train activity, the students were asked to rank the interfaces in order of usability. These ranking data were then used to both:
pair students with differing opinions (to exploit the socio-cognitive conflict mechanism); and
inform the pair of their performance (time to order a train ticket, number of errors) as a factual basis for comparing their experiences (a minimal pairing sketch follows this list).
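As a concrete illustration of how ranking data could drive team formation, here is a minimal sketch assuming hypothetical student rankings and a simple greedy matching; it is not FROG’s actual team-formation algorithm.

```python
from itertools import combinations
from scipy.stats import kendalltau

# Hypothetical ranking data: student id -> ranking of four interfaces
# (labelled A-D here), ordered from most to least usable.
rankings = {
    "s1": ["A", "B", "C", "D"],
    "s2": ["D", "C", "B", "A"],
    "s3": ["B", "A", "D", "C"],
    "s4": ["C", "D", "A", "B"],
}

def disagreement(r1, r2):
    """1 - Kendall's tau: 0 for identical rankings, 2 for fully reversed."""
    positions1 = [r1.index(x) for x in "ABCD"]
    positions2 = [r2.index(x) for x in "ABCD"]
    tau, _ = kendalltau(positions1, positions2)
    return 1 - tau

# Greedy matching: repeatedly pair the two unmatched students who
# disagree the most; an odd student out simply remains unmatched.
unmatched = set(rankings)
pairs = []
while len(unmatched) > 1:
    s1, s2 = max(combinations(unmatched, 2),
                 key=lambda p: disagreement(rankings[p[0]], rankings[p[1]]))
    pairs.append((s1, s2))
    unmatched -= {s1, s2}

print(pairs)  # e.g. [('s1', 's2'), ('s3', 's4')]
```

Pairing the most dissimilar rankings first maximizes disagreement within the earliest-formed pairs; a production system might balance additional constraints such as team history or availability.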
It would be almost impossible for an instructor to manually form pairs of students with opposite opinions in a class of 80 students. Additionally, as was done with the four activities used in this study, instructors can display data to the whole class when debriefing the results to tie them to the lecture material. In a face-to-face lecture, where the instructor needs the students’ attention, this sharing can be restricted to the instructor’s device alone. In an online setting, where there is no central focus, the instructor can configure the information to be displayed to all students.
FROG also facilitates class management by providing information about the current state of the students through teacher dashboards. For example, as with class response systems (Wessels et al., 2007), instructors can view an aggregation of the class data, either to visualize a concept in class, such as cognitive overload in Figure 1, or to display students’ answer distributions to a question representing the class’s understanding. Additionally, the instructor can view the engagement and progress of students during the activity through the timing dashboard. The timing dashboard provides a live update of how many students have finished the current activity and the average progress that students have made within the class (Figure 2). It also provides a prediction of both the completion and progress of the class for the next 3 min of the activity (Faucon et al., 2020). This dashboard serves two purposes. First, it can support instructors in planning their class, as they can proactively decide how much more time students may need for an activity rather than asking students to estimate, which can be difficult to do in a large classroom. If 80% of students have finished, should the instructor give the remaining 20% two more minutes to keep them on board, knowing that the other 80% are wasting time? To make such a decision, the instructor can see on the dashboard a prediction of how many students would finish the activity if two more minutes were given. This is especially important in an online setting where the normal classroom cues are not available. Second, the dashboard can indicate in real time the engagement of the students with the activity. If students are not making progress on the activity, the instructor can intervene rather than waiting until the end to learn whether there were any issues. Again, this can be important in an online classroom setting in which the instructor cannot walk around to see if students are doing the task. Together, these features can give the instructor more awareness of the class by being more connected with the students’ actions.
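The deployed prediction model is described in Faucon et al. (2020); as a rough intuition only, the sketch below linearly extrapolates each student’s observed progress rate over a 3 min horizon. The data structures and numbers are illustrative assumptions, not the actual dashboard internals.

```python
def predict_completions(progress, rates, horizon_s=180):
    """Estimate completions now and within the next horizon.

    progress: dict student -> fraction of the activity done (0..1)
    rates: dict student -> observed progress per second
    """
    done_now = sum(1 for p in progress.values() if p >= 1.0)
    done_soon = sum(
        1
        for s, p in progress.items()
        if p < 1.0 and p + rates.get(s, 0.0) * horizon_s >= 1.0
    )
    return done_now, done_now + done_soon

# Hypothetical snapshot: two students finished, two close, one stalled.
progress = {"s1": 1.0, "s2": 1.0, "s3": 0.9, "s4": 0.8, "s5": 0.2}
rates = {"s3": 0.002, "s4": 0.001, "s5": 0.0}

now, in_3_min = predict_completions(progress, rates)
print(f"{now}/{len(progress)} finished; predicted in 3 min: {in_3_min}")
```

A projection like this is what lets an instructor answer the “two more minutes?” question with data rather than guesswork.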
To investigate how students engaged with the integrated learning activities in face-to-face and online settings, we analyzed students’ behaviors in terms of timing and correctness. In terms of timing, we were interested in both the time it took students to start an activity (using the first student logging on as a baseline) and how long it took them to complete an activity, as signs of disengagement (Gobert et al., 2015). Because response time was an important metric for each of the activities, students taking longer on a task may signal lower engagement. As the dual activity had a fixed duration for all students, we did not include it in the total time analysis. Second, we were interested in whether students in an online environment perform more uniformly than when working face-to-face. For this, we analyzed the differences between the variances for both the start delay and the total time on task. Finally, students may show disengagement with the task by gaming the system (Baker et al., 2008); that is, students may still complete the task but select answers randomly to finish as quickly as possible. To assess this difference between conditions, we analyzed per cent correct between the two environments.
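A minimal sketch of how these measures could be computed from activity logs is shown below; the log format and field names are assumptions made for illustration and do not reflect FROG’s actual schema.

```python
from statistics import variance

# Hypothetical log: (student, activity, start_ts, end_ts, n_correct, n_items).
logs = [
    ("s1", "stroop", 10.0, 95.0, 18, 20),
    ("s2", "stroop", 25.0, 140.0, 15, 20),
    ("s3", "stroop", 12.0, 110.0, 20, 20),
]

# Start delay: seconds after the first student began the activity.
first_start = min(start for _, _, start, _, _, _ in logs)
delays = [start - first_start for _, _, start, _, _, _ in logs]

# Total time on task and per cent correct per student.
times = [end - start for _, _, start, end, _, _ in logs]
correct = [100 * c / n for _, _, _, _, c, n in logs]

print(f"mean delay {sum(delays)/len(delays):.1f}s, "
      f"delay variance {variance(delays):.1f}")
print(f"mean time {sum(times)/len(times):.1f}s, "
      f"mean correctness {sum(correct)/len(correct):.1f}%")
```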
To partially account for differences that might occur between years, we analyzed the differences between activities that occurred face-to-face in all years as a baseline. Furthermore, we found that the move to online instruction led to an additional 24% reduction in attendance beyond the usual variance in class attendance. Given that the students who attended the online lecture were likely the more motivated ones, we limited our analysis in all years to students who completed all four activities to account for these motivational differences. Finally, we also compared the delayed start between the face-to-face and online activities for the 2020 data set to investigate the impact of the environment.
To answer our research question, we ran Levene’s tests to assess the equality of variance and Kruskal–Wallis tests by rank for all other measures, with Bonferroni-corrected pairwise Wilcoxon tests for post hoc analyses. All statistical results can be found in Table 1. For the baseline comparisons, we found that all of the tests except the Stroop total time were significant, indicating performance differences between the class years. In post hoc analyses, we found that these differences always involved either the 2018 or 2019 students differing significantly from the other groups. For example, for the train activity correctness, the students in 2019 (M = 71.6%) performed significantly worse than those in 2018 (M = 81.0%) or 2020 (M = 80.8%), while there was no significant difference between 2018 and 2020. The one exception is the time variance for the train activity, in which the 2020 students had a significantly lower variance than the 2018 or 2019 students. Given that in these activities the students were all in a face-to-face setting, the differences may be owing to either student or instructional differences. However, as the differences were often in a year other than 2020 (in which the students worked online in the later activities), any differences seen between the online and face-to-face genealogy and dual activities may still be owing to environmental factors.
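For readers wishing to reproduce this style of analysis, the sketch below runs the same family of tests using scipy. The input values are placeholders rather than the study’s data, and the authors’ own scripts may differ; pairwise Wilcoxon rank-sum tests for independent groups correspond to scipy’s mannwhitneyu.

```python
from itertools import combinations
from scipy.stats import levene, kruskal, mannwhitneyu

# Placeholder correctness scores per cohort (not the study's data).
groups = {
    "2018": [0.81, 0.78, 0.85, 0.80],
    "2019": [0.70, 0.72, 0.69, 0.74],
    "2020": [0.82, 0.79, 0.83, 0.78],
}

# Levene's test for equality of variances across the three cohorts.
stat, p = levene(*groups.values())
print(f"Levene: F = {stat:.2f}, p = {p:.3f}")

# Kruskal-Wallis test by ranks for a difference in location.
h, p = kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.3f}")

# Post hoc pairwise Wilcoxon rank-sum (Mann-Whitney) tests with a
# Bonferroni correction for the three pairwise comparisons.
pairs = list(combinations(groups, 2))
for a, b in pairs:
    _, p = mannwhitneyu(groups[a], groups[b])
    print(f"{a} vs {b}: corrected p = {min(1.0, p * len(pairs)):.3f}")
```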
In terms of the face-to-face and online comparisons from the genealogy and dual data, we again found significant differences in all of the comparisons except the delay of start for the dual activity and the delay variance for the genealogy activity. As with the baseline comparisons, post hoc analyses showed that these differences always involved either the 2018 or 2019 students differing significantly from the other groups. For example, for the dual correctness, the students in 2019 (M = 69.7%) performed significantly worse than those in 2018 (M = 77.2%) or 2020 (M = 80.5%), while there was no significant difference between 2018 and 2020. Because the 2020 cohort did not differ significantly from both of the face-to-face cohorts, we cannot conclude that the online environment impacted the correctness. We find this same pattern for the other variables as well, as shown in Table 1.
Discussion and conclusions
From our findings, we see that the main differences between years were owing to individual or instructional differences. In the face-to-face and online comparison, although we found significant differences, these differences never primarily involved the online condition. Further, in the baseline comparisons, in which all students were in a face-to-face setting, we still found these same differences, indicating that they are not owing to the learning environment. Before this study, we might have assumed that online learning allows students to begin these activities more efficiently, as they are already using their computers, or that the online learning environment is more distracting (Hollis and Was, 2016), leading to lower performance, but this does not seem to be the case. Rather, in relation to our research question about the similarities and differences in student behaviors between online and face-to-face learning environments, our results show that we can engage students in large lectures equally well in both face-to-face and online settings, allowing for a smooth transition between the two. However, to fully understand the cause of the year-to-year differences and the impact of the learning environment, further research is needed.
A limitation of our work is that all of our data come from a single class and only three years. Although we found differences between the years, with only three years of data, individual differences may be more prominent than any environmental differences. Each year the students are different, and the instructor may change how they conduct the class, leading to different student behaviors. We would thus need a much larger data set to attribute which differences are owing to the students, the instruction or the learning environment. However, our work does illustrate how active learning activities can be integrated into online learning environments to support more interaction between students and instructors during the lecture.
Given our results, we recommend that instructors integrate more interactive learning activities into the classroom, supported by classroom orchestration systems. From the student viewpoint, these activities allow engagement with the concepts being taught in the lecture, connecting students with the material in a way that may otherwise be difficult to replicate within a large online lecture. From the instructor perspective, orchestration tools can bring an awareness to the lecture that may be missing in large face-to-face lectures and even more so in an online lecture, where facial and audio cues are also missing. By using tools that track students’ progress and completion, instructors can monitor the engagement and understanding of the students and tailor their lecture accordingly, as they might do in a face-to-face lecture using non-verbal cues from the students. Further, we recommend that these activities be used not only in online settings but in all large lectures, regardless of the learning environment, to support student learning in a way that is highly adaptable. As we did not find evidence of differences in student performance between learning environments, these activities can be used in face-to-face, online or hybrid settings, providing some sense of coherence in the learning process and a consistent experience when the learning environment is required to change.
Table 1. Results from comparing the baseline face-to-face activities and face-to-face versus online between and within years

| | Total time | Time variance | Start delay | Delay variance | Correctness (%) |
| --- | --- | --- | --- | --- | --- |
| Baseline face-to-face between years | | | | | |
| S | H(2) = 4.5, p = 0.10 | **F(2,137) = 7.9, p < 0.05 | **H(2) = 53.2, p < 0.05 | **F(2,137) = 9.2, p < 0.05 | **H(2) = 13.4, p < 0.05 |
| T | H(2) = 10.3, p < 0.05 | ***F(2,137) = 4.2, p < 0.05 | *H(2) = 39.7, p < 0.05 | *F(2,137) = 11.1, p < 0.05 | **H(2) = 12.1, p < 0.05 |
| Face-to-face versus online between years | | | | | |
| G | *H(2) = 16.5, p < 0.05 | *F(2,137) = 7.8, p < 0.05 | *H(2) = 9.1, p < 0.05 | F(2,137) = 0.7, p = 0.48 | *H(2) = 25.4, p < 0.05 |
| D | – | – | H(2) = 3.1, p = 0.22 | *F(2,137) = 3.6, p < 0.05 | **H(2) = 16.0, p < 0.05 |
| Face-to-face versus online within a year | | | | | |
| 20 | – | – | t(99) = 1.2, p = 0.22 | F(1,198) = 2.1, p = 0.14 | – |

Notes: S = Stroop; T = Train; G = Genealogy; D = Dual; 20 = year 2020. *indicates a significant difference involving just the 2018 students; **indicates a significant difference involving just the 2019 students; ***indicates a significant difference involving just the 2020 students
Baker, R., Walonoski, J., Heffernan, N., Roll, I., Corbett, A. and Koedinger, K. (2008), “Why students engage in ‘gaming the system’ behavior in interactive learning environments”, Journal of Interactive Learning Research, Vol. 19 No. 2, pp. 185-224.
Blasco-Arcas, L., Buil, I., Hernández-Ortega, B. and Sese, F.J. (2013), “Using clickers in class: the role of interactivity, active collaborative learning and engagement in learning performance”, Computers and Education, Vol. 62, pp. 102-110.
Bolliger, D.U. and Martin, F. (2018), “Instructor and student perceptions of online student engagement strategies”, Distance Education, Vol. 39 No. 4, pp. 568-583.
Bonwell, C.C. and Eison, J.A. (1991), “Active Learning: creating Excitement in the Classroom. 1991 ASHE-ERIC Higher Education Reports”, ERIC Clearinghouse on Higher Education, The George Washington University, One Dupont Circle, Suite 630, Washington, DC 20036-1183.
Dillenbourg, P. (2013), “Design for classroom orchestration”, Computers and Education, Vol. 69, pp. 485-492.
Draper, S.W. and Brown, M.I. (2004), “Increasing interactivity in lectures using an electronic voting system”, Journal of Computer Assisted Learning, Vol. 20 No. 2, pp. 81-94.
Elison-Bowers, P., Sand, J., Barlow, M.R. and Wing, T.J. (2011), “Strategies for managing large online classes”, The International Journal of Learning: Annual Review, Vol. 18 No. 2, pp. 57-66, doi: 10.18848/1447-9494/CGP/v18i02/47489.
Faucon, L., Olsen, J.K., Håklev, S. and Dillenbourg, P. (2020), “Real-time prediction of students’ activity progress and completion rates”, Journal of Learning Analytics.
Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H. and Wenderoth, M.P. (2014), “Active learning increases student performance in science, engineering, and mathematics”, Proceedings of the National Academy of Sciences, Vol. 111 No. 23, pp. 8410-8415.
Gobert, J.D., Baker, R.S. and Wixon, M.B. (2015), “Operationalizing and detecting disengagement within online science microworlds”, Educational Psychologist, Vol. 50 No. 1, pp. 43-57.
Håklev, S., Faucon, L., Hadzilacos, T. and Dillenbourg, P. (2017), “Orchestration graphs: enabling rich social pedagogical scenarios in MOOCs”, Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale, pp. 261-264.
Hew, K.F. (2016), “Promoting engagement in online courses: what strategies can we learn from three highly rated MOOCS”, British Journal of Educational Technology, Vol. 47 No. 2, pp. 320-341.
Hollis, R.B. and Was, C.A. (2016), “Mind wandering, control failures, and social media distractions in online learning”, Learning and Instruction, Vol. 42, pp. 104-112.
Kay, R.H. and LeSage, A. (2009), “Examining the benefits and challenges of using audience response systems: a review of the literature”, Computers and Education, Vol. 53 No. 3, pp. 819-827.
Lee, D.H., You, Y.W. and Kim, Y. (2017), “An analysis of online learning tools based on participatory interaction: focused on an analysis of the Minerva school case”, Advances in Computer Science and Ubiquitous Computing, Springer, pp. 1199-1206.
Loibl, K., Roll, I. and Rummel, N. (2017), “Towards a theory of when and how problem solving followed by instruction supports learning”, Educational Psychology Review, Vol. 29 No. 4, pp. 693-715.
Nortvig, A.M., Petersen, A.K. and Balle, S.H. (2018), “A literature review of the factors influencing E-Learning and blended learning in relation to learning outcome, student satisfaction and engagement”, Electronic Journal of E-Learning, Vol. 16 No. 1, pp. 46-55.
Paulsen, J. and McCormick, A.C. (2020), “Reassessing disparities in online learner student engagement in higher education”, Educational Researcher, Vol. 49 No. 1.
Schwartz, D.L. and Bransford, J.D. (1998), “A time for telling”, Cognition and Instruction, Vol. 16 No. 4, pp. 475-522.
Stott, P. (2016), “The perils of a lack of student engagement: reflections of a ‘lonely, brave, and rather exposed’ online instructor”, British Journal of Educational Technology, Vol. 47 No. 1, pp. 51-64.
Szeto, E. and Cheng, A.Y. (2016), “Towards a framework of interactions in a blended synchronous learning environment: what effects are there on students’ social presence experience?”, Interactive Learning Environments, Vol. 24 No. 3, pp. 487-503.
Wessels, A., Fries, S., Horz, H., Scheele, N. and Effelsberg, W. (2007), “Interactive lectures: effective teaching and learning in lectures using wireless networks”, Computers in Human Behavior, Vol. 23 No. 5, pp. 2524-2537.
Wylie, R., Chi, M.T., Talbot, R., Dutilly, E., Trickett, S., Helding, B. and Nielsen, R.D. (2014), “Comprehension SEEDING: providing real-time formative assessment to enhance classroom discussion”, International Society of the Learning Sciences, Boulder, CO.
This paper is part of the special issue, “A Response to Emergency Transitions to Remote Online Education in K-12 and Higher Education”, which contains shorter, rapid-turnaround invited works not subject to double-blind peer review. The issue was called, managed and produced on a short timeline in Summer 2020 toward pragmatic instructional application in the Fall 2020 semester.
This work has been partially supported by the Swiss National Science Foundation through grant No. 187534.