Real-time polling to help corral university-learners’ wandering minds

Purpose – This research investigates the use of real-time online polling to enhance university teaching and learning. Design/methodology/approach – Using a case study and employing action research, this work shows how polling can improve professional practice, learner engagement and teaching performance. Findings – Incorporating the right type of online real-time polling into lessons is a professional challenge and can be hard work for teachers, but has overriding benefits. Research limitations/implications – This research reports one lecturer's experiences within two UK universities and is limited by location, variety of students and lecturer technical capability. The research implication is that further study of online polling, especially in different learning environments, is needed. Previous research is outdated or limited to real-time polling during physical classes. There are therefore research opportunities in the use of polling before, during and after class. Practical implications – This research finds that online polling should be seen as a modern teaching tool that now uses students' personal technology for easier use by students and teachers: it is more than the use of archaic "clickers", which were extra classroom items to be bought and maintained. Also, online polling before, during and after classes can be usefully employed and has validity within teachers' toolboxes. This paper shows how such polls can be successfully deployed. Originality/value – Whilst there are previous reports of polling undertaken within teaching and learning events, this paper builds upon those experiences and boosts collective understanding of polling as a way to improve professional practice and increase learning.


Introduction
As we entered the 21st century, technology-enhanced learning became mainstream in many university-level courses (Price, 2010). Developments in teaching- and learning-related hardware and software have provided both university staff and students with many opportunities to develop new and improved learning, teaching and assessment techniques. For example, the almost ubiquitous use of smartphone technologies, which rapidly develop with each manufacturer's upgrade, is an area where students' personal technology has often overtaken university systems.
Even though online polling has received bad press when used outside education (Guardian, 2020), the challenge for teaching staff remains the same: now is not the time to question whether or not to use such technologies. Rather, the question for teaching professionals is how best to use the software and hardware that are readily available to university staff and students in order to keep teaching and learning professionally current, applicable and valid.
Some, especially more senior and experienced, faculty staff members may not be up-to-date with current smartphone developments, nor have the time to ponder and design new, or redesign their traditional, teaching methods to make full use of current opportunities. Faculty staff are usually "concerned that the time and effort required for course revision would be prohibitive, that their students would learn less content, that outcomes could not be reliably assessed in any case, and that such changes would take students and faculty alike out of their current comfort zones" (Allen and Tanner, 2005). Such mindsets could be marooned in the age of "clicker" (alternatively known as audience response system) technology, giving those tutors the wrong impression that real-time polling is more challenging than it needs to be (Stover et al., 2015; Bunce et al., 2010; Kam and Sommer, 2006).
As well as educational technological developments, the development of teaching practice also moves on. For example (Knight and Wood, 2005), "There is now a great deal of evidence that lecturing is a relatively ineffective pedagogical tool for promoting conceptual understanding". In fact "decreased lecturing and addition of student participation and cooperative problem solving during class time, including frequent in-class assessment of understanding" is a way to improve teaching, learning and summative assessment environments (Knight and Wood, 2005).
Also, and as most experienced, professionally reflective teachers quickly realise, students' minds often wander, putting learner performance at risk (Risko et al., 2012). It is well known that learners have limited attention spans: concentration naturally falls after 10-15 min (Penner, 1984). The natural erosion of learners' attention is further compounded by technological distractions such as the instant notifications presented to their users by smartphones.
So, to boost student engagement, the classes producing the highest levels of learner performance are those with active participation, deep understanding on the part of students, regular assessments, visible learning, increased use of the senses and increased participation and engagement (Stover et al., 2015). Anything that can cut student anxiety whilst raising students' attention levels is also a good development. All such characteristics are enhanced by real-time polling (Sun et al., 2018). This paper focuses upon using students' personal smartphones to deploy real-time polling as a way to corral the wandering minds of distracted university students. After all, one way of "re-engaging attention and minimizing mind wandering is to periodically pose students carefully considered questions" (McGivern and Coxon, 2015).
This paper provides insight into how advancements in both smartphone technology and changes in teaching practice can be used to enhance university teaching and student learning. For those perhaps unfamiliar with this area of technology-enhanced learning, this paper gives an easy-to-use guide to improve practice by maintaining teaching currency and simultaneously boosting student performance.
The ins and outs of how this was achieved are described, along with a discussion of the experience and its results. Later, the implications and limitations of the work are described and conclusions drawn. These conclusions include ways in which the novice can leap to the front of technology-enhanced learning development in this real-time polling arena, keeping students engaged in pursuit of that Holy Grail of academic life: enhanced learning performance.

Methodology
A total of 226 undergraduate and postgraduate students studying 6 different modules within 2 UK universities were asked to respond to a series of online polls as described below. See Table 1.
These polls were run by the same tutor over a 6-week period during timetabled lectures, tutorials and workshop sessions. The students were a mix of ethnicities, genders and ages. Almost half were part-time students, the remainder full-time. This study did not set out to investigate the effects such demographic variables have on polling success. Rather, it aims to provide a case study, based upon real-life action research, resulting in a set of achievable actions that enable other tutors to easily and quickly reap the benefits of online polling.
A thematic approach was used, based upon the reflections of the tutor who led the research and of student participants. Data were recorded verbatim from online polling participants, whilst experiential information was gleaned via reflective professional journaling. After each poll was completed, respondents' inputs were recorded. Also after each poll, a professional reflection was written that explored, in a structured way, the tutor's observations of how the poll went, what went well and what could be improved.

Modifying teaching materials and polling software training
In both face-to-face and online lectures, teaching materials were redesigned to give brief instruction to learners on the use of real-time polling software available online and via a smartphone app. The app used was called Vevox (available at vevox.com). In its basic form, Vevox can be freely downloaded by iOS and Android users. A fully functional version of the software was available to the tutor via a professional subscription.

Pilot testing
A pilot project was used to test the proposed systems. In this, two cohorts of undergraduate students (one face-to-face and one online) and a cohort of (online) postgraduate learners were asked to download the real-time polling app onto their smartphones. Intermixed within all three classes, the same six questions were broadcast in real-time via the Vevox app. Initially, the free version of the app, with limited functionality, was used. Feedback from the real-time polls was used to tailor teaching content instantaneously.
The questions asked within this poll comprised 2 multiple-choice questions and 4 text-based questions whose answers were limited to one word due to the limited functionality of the free polling software. These questions were asked in two sections of the class and concerned students' assessments of their own performance.
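For those planning similar polls, the question mix just described (2 multiple-choice plus 4 one-word free-text questions) can be sketched as simple data before being built in any polling tool. The structure, field names and question wording below are purely illustrative assumptions and are not part of Vevox or any Vevox API:

```python
from dataclasses import dataclass, field

@dataclass
class PollQuestion:
    """One planned poll question (hypothetical structure, not a Vevox API)."""
    prompt: str
    kind: str                                    # "multiple_choice" or "free_text"
    choices: list = field(default_factory=list)  # only used for multiple choice
    max_words: int = 1                           # free-text limit (1 word on the free tier)

# A pilot-style poll: 2 multiple-choice + 4 one-word free-text questions.
pilot_poll = [
    PollQuestion("How confident do you feel about this topic?",
                 "multiple_choice", ["Very", "Somewhat", "Not at all"]),
    PollQuestion("Which revision activity helped most?",
                 "multiple_choice", ["Reading", "Quizzes", "Discussion"]),
    PollQuestion("One word describing your performance today?", "free_text"),
    PollQuestion("One word for what you found hardest?", "free_text"),
    PollQuestion("One word for what you enjoyed most?", "free_text"),
    PollQuestion("One word for what to improve next week?", "free_text"),
]

mc = sum(1 for q in pilot_poll if q.kind == "multiple_choice")
ft = sum(1 for q in pilot_poll if q.kind == "free_text")
print(mc, ft)  # prints: 2 4
```

Planning a poll in this form makes the trade-offs of the pilot explicit: the free tier caps free-text answers at one word, which constrains the kind of self-assessment questions that can be asked.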
Once familiarity with the systems was gained, several different approaches to polling were used. Familiarity with the app took only a few hours of self-study on the part of the tutor including self-guided training with the fully functional app. Added to this was the time taken for the pilot test which happened over 2 weeks within the normal teaching calendar for this tutor.

Real-time questioning: breaking lecture monotony
To enhance student engagement online, real-time polling was scattered throughout four classes to entice and encourage instantaneous learner feedback and to break the monotony of traditional "talk and chalk" teaching. These polls were focused upon assessing levels of understanding of course content as well as students' feelings about their own performance. The instantaneous feedback gained was then used by the tutor to modify planned classroom activities to plug learning gaps. The questions asked within this poll comprised 1 multiple-choice question and 4 text-based questions whose answers were limited to 2,000 words.

Pre-class polling for tailored classroom activities
Polls were broadcast to targeted student cohorts before two timetabled classes. The results were used to help develop more valuable classroom activities. These were focused upon assessment revision.
The question asked within this poll comprised text-based answers which were limited to 2,000 words. This question was focused upon suggestions for content to be included in a future class.

Post-exam polling for critical learner reflection
After exams, polls were used to encourage learners to constructively and critically ponder upon their learning and assessment experience. This helped the tutor develop classroom strategies to help students realise their strengths and weaknesses and develop mitigations.
The 3 questions asked within this poll comprised text-based answers which were limited to 2,000 words. These questions were focused upon the students' personal reflections on their performance in an exam. Due to the nature of the assessment strategies for the modules included in this study, only one module cohort (the "Sustainable Technology" module at the University of South Wales) was included in this reflective-practice poll.

Polling to help enhance learning support materials
Online and physical learning resources were made available to learners at the start of all the courses. These included electronic reading lists, audio and visual media collated by the tutor based upon the syllabus. Part-way through the course, these polls asked students what different resources (or changes to currently provided resources) they would like to see that would enhance their learning.
The question asked within this poll comprised text-based answers which were limited to 2,000 words. This question was focused upon suggestions about how to modify, add to or delete resources included in online learning support materials at the start of each module. Bespoke learning support materials were originally supplied for each module, with each module having different support materials.

Using polling in the flipped classroom to address gaps in learning
Polls in this category were used to gauge levels of student understanding of out-of-classroom activities used in a "flipped classroom" (Price and Neale, 2013). Such assessment of learning enabled the tutor to modify classroom content to help plug learning voids or misunderstandings.
The questions asked within this poll comprised text-based, multiple-choice and Likert-scaled answers. The number of questions asked differed for each module, as each module had different numbers of flipped classroom activities planned within it.

Data collection and analysis
Data in the form of responses to the above types of polls were collected. The polls were used throughout the classes shown in Table 1.
These data were analysed qualitatively to investigate the success or otherwise of the polling experience for both the students and the tutor. Personal reflections from the tutor and some of those polled were also analysed.
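As a rough sketch of how verbatim free-text responses might be grouped qualitatively, the snippet below tags answers against keyword-defined themes. The themes, keywords and helper function are hypothetical illustrations, not the coding frame actually used in this study; the sample responses echo the ad hoc student comments reported in this paper.

```python
# Illustrative thematic tagging of verbatim poll responses.
# Themes and keywords are hypothetical, not the study's actual coding frame.
themes = {
    "engagement": ["wake", "fun", "interesting", "engag"],
    "ease_of_use": ["easy", "simple", "quick"],
    "reflection": ["think", "reflect", "ponder"],
}

responses = [
    "The polls were easy to use",
    "made me wake up during class",
    "got me to think about things I would not have normally",
]

def code_response(text, themes):
    """Return every theme whose keywords appear in the response text."""
    lowered = text.lower()
    return [name for name, keys in themes.items()
            if any(k in lowered for k in keys)]

# Tally how often each theme appears across the response set.
tally = {name: 0 for name in themes}
for r in responses:
    for theme in code_response(r, themes):
        tally[theme] += 1

print(tally)  # prints: {'engagement': 1, 'ease_of_use': 1, 'reflection': 1}
```

In practice, qualitative coding like that reported here is interpretive rather than mechanical; a keyword pass of this kind can only surface candidate themes for the researcher to review.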

Analyses
Results of the pilot test were encouraging: the system worked and students used the app almost problem-free. It soon became apparent that too much polling, or polling without purpose, could easily become a negative experience. A judicious number of polls per unit of classroom activity was needed. Beyond this number, polling was judged by those polled as pointless and dull.
In the early stages of this work, in-class real-time polling was a new experience for the tutor. As such, he was soon presented with the following challenges:
(1) Problem-solving challenges of a technical nature. Some students could not use the polling app due to a lack of understanding about how to, or an inability to, download the app.
(2) His own understanding and ability to run the polling software properly. Whilst in class, this usually meant grappling with the polling software functions while also delivering a lecture and using all the normal tools and devices required. Simultaneously using a slide deck whilst showing poll questions takes practice; managing the classroom and answering students' questions about the polls and their software can take time away from focusing upon the class content.
(3) Polling software or IT equipment crashes or technical "glitches" somewhere in the system. This can scupper even the best-laid plans.
Once familiarity with the software was gained, and polling-related queries from students became less frequent, the polling experience became much more beneficial to the tutor. There was less fire-fighting and anxiety, and the full gamut of polling benefits became apparent. This presented the tutor with opportunities to upgrade teaching, learning and assessment strategies. Students of all levels and in a variety of subjects were generally engaged with the online polls. They were also positive in their feedback about the polls. Participants were also happy to use their own smartphones for polling, with not one complaint about downloading and using the Vevox app. Ad hoc comments were that the polls "were easy to use", "made me wake up during class" and "got me to think about things I would not have normally".
Outside of the classroom, it was found that responses to polls were of better quality: respondents' answers tended to be longer and better explained. Students had more time to ponder and more accurately convey their suggestions for improvements to planned learning activities. This was because:
(1) More time was available to formulate a carefully considered response.
(2) There were fewer distractions present at the time of making the response.
(3) Responding individually meant that there was no peer pressure, real or imagined.
Using polling for online classes seemed to work better, as the students were already primed to use online systems and were not physically present with the tutor. Students in the physical classroom did not always see the reason why online polling was being used when a "tutor could just ask [them] the questions". A positive dimension to this aspect is also reflected in pre- and post-class polling, whereby questions from learners can be posed regardless of the tutor's availability.
Before online in-class polling was used to test learning gained in out-of-class learning activities, that is, using a flipped classroom mode of teaching, the tutor had to rely upon traditional face-to-face questioning. This raises all the commonplace issues around student timidity and increased anxiety, especially for those with weaker learning capability. Using anonymous, online, real-time polling enabled the tutor to minimise such audience anxiety and fright whilst simultaneously gauging understanding. Importantly, this brought to the classroom a previously under-utilised ability for the tutor to identify not just gaps in the understanding of individual learners but also gaps or confusion across large parts of the cohort. In the former case, ad hoc interventions could plug those localised knowledge gaps and/or misunderstandings. In the latter, the responses to real-time polls were used by the tutor to seed peer-group discussions aimed at plugging such large learning gaps. This did sometimes require nimble academic deftness on the part of the tutor, but this is nothing different from fielding vocalised questions within traditional classes.

Discussion
Web-based, digital pedagogy has started to become an important part of engaged learning (Sarvary and Gifford, 2017). It is here to stay. Tutors need to engage and upgrade their teaching and assessment strategies accordingly, or risk losing relevancy and stagnating. Online polling provides one example of such an opportunity.
There is a range of research already published about polling (Allen and Tanner, 2005; Kam and Sommer, 2006; McGivern and Coxon, 2015). The findings of this study bolster understanding of this topic and show that students and tutors can benefit from carefully planned online polling. What is different here is that different forms of polling, deployed at the right time, at the right frequency and of the right length within timetabled learning activities, have been found to be important factors for success. The new insights that this study brings emanate from investigations of polling at five different stages of learning, which are as follows.

Pre-class polling for enhanced and bespoke classroom activities
This type of polling is most useful for tutors who are willing to take on suggestions from student cohorts about what could usefully be included in future planned classroom activities. It is particularly useful for timetabled sessions aimed at assessment revision: for example, a scheduled set of student presentations for which a "preparation workshop" session was planned. The tutor used polls which ran for a week to elicit suggestions from students about what should be included in the workshop. This was found to evoke carefully considered, detailed and valuable poll responses. Sufficient time to allow polled results to be fed into classroom materials has to be planned between the end of the poll and the affected classroom session.

Real-time polling to address gaps in traditional in-class learning
Real-time, anonymous online polling using a small number (usually <5) of questions peppered appropriately throughout lectures is most beneficial for gauging where misunderstandings and knowledge gaps may have occurred. To avoid monotony and boredom, these sorts of polls have to be used astutely and with a clear and transparent learning purpose. If many students seem to have the same misunderstanding or knowledge gap, then the skilful tutor can use the knowledge from such poll responses to seed a group discussion aligned with interactional scaffolding techniques (Riches, 2018; Mahan, 2020).

Using polling in a flipped classroom to address gaps in out-of-class learning
Concerns about student engagement with out-of-class learning activities can be fruitfully addressed by using polling to assess levels of engagement and subsequent learning. Such polling presents to students a clear purpose for the activities requiring their out-of-class engagement. It also indicates future opportunities for learners to show their learning prowess, which is important to some who enjoy such feelings of competition. After all, it is known that as long as the competition is just for fun, ". . . using competition to increase trainees' drive to improve their performance is an acceptable strategy to help them mobilize their talents" (Makhoul et al., 2018).

Post-exam polling for critical learner reflection
Learner reflection plays an important role in knowledge construction and helps to improve learning performance (Nian-Shing et al., 2011). Post-exam polling can therefore be used to spur learners on to critical self-reflection upon their recent exam performance. The challenge for the tutor is to identify insights that are common across the cohort, and perhaps otherwise out of sight for individual learners, and to feed these back usefully to the student cohort.

Out-of-class polling to help enhance learning support materials
Requiring sufficient time for the tutor to react to insights gained, this type of poll is most valuable for the tutor-led development of teaching and learning resources for future cohorts, rather than current learners, as a quick reactive turnaround is probably difficult to achieve. Nonetheless, the results from these types of polls enable the tutor's professional practice to improve, as new and more useful-to-students learning support materials can be identified. Of course, learners' suggestions are not always relevant to module syllabi, and so the tutor's professional judgement has to be used to gauge the usefulness of gleaned suggestions. These polls, however, can multiply the tutor's effort by having other people (learners) on the look-out for materials that can be usefully deployed in future teaching and learning.
As can be seen from the above, incorporating the right real-time polling into teaching, learning and assessment strategies can be technically and professionally challenging and can be a lot of work for teachers (Sarvary and Gifford, 2017). However, for those prepared to forge ahead, the benefits are as follows:
(1) Increased participation: The anonymity of polls allows every student to provide an honest answer without the fear of public humiliation. Polling increases (and can sustain) student engagement.
(2) Reduced learner anxiety: Polling helps to reduce the anxiety of sharing a wrong answer, and can also help students gain insights about one another and build confidence in their own learning.
(3) Uncovered learning gaps: Polls are useful to identify gaps in the comprehension of individual learners and/or large parts of the whole student cohort.
(4) Better feedback for learners: Some students who have used polling in the classroom report that the feedback they get from polling is an important part of their own learning.
The free version of the polling app is limited to multiple-choice and one-word free text answers. This latter constraint was quickly noted by one student user who became frustrated that she could not respond in full sentences. She had more to say than one word would allow! This provides insight into the importance of carefully considered poll design with respect to both the answer format and types of questions asked.
As the literature and the experience from this study show, learners are positive towards using online polling, reporting that polls are fun, provide a welcome break, help learning, provide feedback on what is learnt, make learners think and help with continuous assessment (Salzer, 2018).

Implications and limitations of this study
The implications of this study are that, for online polling to be of the utmost success:
(1) There needs to be institutional buy-in to online pedagogical tools and subscription to polling software. This is not the case in all institutions. Even where such positivity exists, different systems are used, and so tutors will need to be open to such differences.
(2) Academics will require training in the technical aspects of using online polling software, as well as pedagogical guidance in the use of polls and the development of polled questions. This paper goes some way to addressing the latter. However, sharing knowledge from polling experiences via academic peer-groups (perhaps managed by polling companies) to help provide support and drive innovation could enhance such learning.
(3) Research into the effects of the different types of polling discussed in this paper would be beneficial, to better understand how pedagogical theories such as scaffolding, the gamification of learning and negative learner characteristics (such as anxiety, loneliness and uncertainty), whereby "the learners' emotional state before learning [is] an important predictor for learning success", can be better served by polling strategies (Chowanda and Chowanda, 2016; Knorzer et al., 2016). Further research would also be beneficial in universities where technology-enhanced learning is less embedded, and in cohorts of students with higher rates of ownership and more regular use of personal technologies for learning.
(4) Individual academics should:
Embrace utilising the technologies that learners bring to the learning environment. The owners of such personal technology are willing and able to use such devices for their learning benefit.
Understand that online polling is easier now. Gone are the days of having to build a business case for the purchase of "clickers"; students' own, more familiar, technology can be deployed.
Use their professionalism to utmost effect: innovation and trying new things with learner cohorts is a positive development. Learners will understand that technology can contain glitches: you will not look stupid in front of your class. In fact, such things can positively benefit students and tutors' practice and are becoming the norm. Those reticent to change show a lack of creativity and inspiration and risk becoming outdated.
This study was undertaken within two UK-based universities and as such is limited in its scope of participants, types of technology used by students and the software subscribed to. It is also based upon one lecturer's work and experience. Different institutional settings, in different geographical areas and with different cohorts of learners, could bring about different results. For example, using polling where not all learners have smartphones (due, for example, to prohibitive financial cost) would obviously be difficult. Another example is where teachers are less technologically minded and resistant to these sorts of challenges.

Conclusions
Real-time polling is a positive way to corral wandering minds of distracted learners of all ages, learning levels and abilities. Real-time polling does engage students. This engagement has been shown to have a positive impact on student learning (Stover et al., 2015). Other advantages include (Selwyn et al., 2015) the following:

(1) Immediate feedback: tutors are able to identify learning issues within a large class immediately and respond appropriately.
(2) Student self-measurement: students are able to measure their own understanding in comparison to their peers.
(3) Creating an interactive environment: live polling increased engagement of students and promoted improved learning and assessment performance.
(4) Pausing the flow of lecture content, particularly "teacher talk" to allow space for student reflection and response.
(5) Creating topics for positive peer discussion.
(6) Giving ideas for learning support material enhancements.
(7) Giving ideas for enhancing future classroom-based activities.
However, achieving good polling depends upon several things, such as:
(1) Avoiding polling overload. Those doing the polling need to avoid "polling overload", whereby learners are polled to breaking point just because this is the teacher's new toy. Real-time polling should be used judiciously, as one ingredient on a currently relevant teacher's smorgasbord.
(2) Institutional sign-up to polling software. If online software can be utilised to its fullest, then more options for improved teaching and learning are available. This often requires subscription to such a service. Failing that, the more basic, free services provide more limited teaching and learning options.
(3) Open-minded, positive-to-change teaching staff, ideally of a problem-solving disposition. Using software and relying upon students' technology can present teaching staff with novel, and sometimes trying, challenges. An open mind and recognition that we are all on a learning journey can help offset such negative experiences.
(4) Encouraging students to try new things. Students are generally willing and pliant towards the idea of trying new things and usually see polling as a positive development. So anything a teacher can do to fight boredom, enhance learning and add interest to their classrooms is seen as a good thing.
(5) Accessibility for university and student devices. Polling systems have moved on from bespoke, cumbersome "clickers" to easier-to-use, more familiar, ubiquitous smartphone technology. Along with this technological advance come new ways to run real-time polling. Hence, robust wireless infrastructure needs to be in place. Also, the success of live polling in teaching depends on the software being easily accessible from student and lecturer devices, including tablets and smartphones. This might not always be the case in institutions where the use of some software is restricted or blocked.
(6) Alignment with tutor's teaching philosophy. For the most beneficial polling, there is a clear need on the part of the tutor "for an alignment or marrying of technology, activity and educational beliefs" (Selwyn et al., 2015).

A game plan to enable tutors to reap the benefits of online polling
Some ways to introduce polls into tutors' learning strategies include:
(1) Check institutional infrastructure to ensure that it is compatible with large-scale polling, and find out whether institutional subscriptions to online polling systems exist.
(2) Identify poll-able moments within learning, teaching and assessment activities.
Remembering that overloading teaching activities with too many polls will lead to deleterious results, tutors need to carefully think when to use a poll and which type of poll to use. The workflow shown in Figure 1 can help.
(3) Plan the number, content and format of questions asked within a poll: too many and the audience will get bored; are the responses likely to meet the aims of the poll; and which format of answer is most appropriate, for example Likert scale, multiple choice or free text?
(4) Stimulate peer-to-peer discussion. If in-class polling indicates large-scale misunderstanding or knowledge gaps, then tutors can use poll responses to seed peer-group discussions aimed at plugging such gaps. This will require a flexible, less didactic approach from tutors, whereby planned tutor-led "teaching" time gives way to more student-led activity. The tutor's role then becomes critical in squeezing full value from insights gained that are not immediately apparent to all learners.
(5) Do not forget variety. Polls of the same type and format can quickly become routine and boring. Avoid this with the judicious use of polling which always has a clear and transparent purpose. Also mix up the form and purpose of polls and the question response formats.
(6) Gather (and use) data analytics. After running a number of successful polls, associated data will be available to the pollster. These data can be analysed and used to further help tutors positively influence subsequent teaching practice. Do not let this go to waste.
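Point (6) above can be sketched as a minimal pass over per-session response counts, flagging sessions where poll engagement dipped and prompting a rethink of poll timing, length or purpose. The session names, cohort sizes and the 70% threshold below are invented for illustration; most polling platforms export similar per-question response data.

```python
# Hypothetical per-session poll data: (cohort size, responses received).
sessions = {
    "week1_lecture": (40, 36),
    "week2_lecture": (40, 22),
    "week3_workshop": (38, 35),
}

def response_rate(cohort_size, responded):
    """Fraction of the cohort that answered the poll."""
    return responded / cohort_size

# Flag sessions where fewer than 70% of the cohort responded, so the
# tutor can reconsider poll timing, length or purpose for those sessions.
low = sorted(name for name, (size, answered) in sessions.items()
             if response_rate(size, answered) < 0.70)
print(low)  # prints: ['week2_lecture']
```

Tracked over a teaching term, even such a simple response-rate trend can show whether polls are keeping their novelty or sliding into the "polling overload" warned about above.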