Abstract
Purpose
The purpose of this paper is to articulate how the user experience (UX) approach was initiated and integrated into the centre's scope of operations with the objective of improving the e-learning layout on the D2L learning management system (LMS). Evaluating user interfaces through user testing has historically been one of the most effective ways to collect user feedback. The integration of a UX approach by the Centre for ODL Experiences (COLE) at Wawasan Open University has led to a more user-oriented design of FlexLearn by conducting user testing with students as the target users of the platform and gathering course leaders' (CLs) feedback after the presentation of the new template.
Design/methodology/approach
Since design and development is an iterative process, the first user testing methods employed were observation and interviews, conducted over the course of numerous sessions. The data collection used a mixed-methods approach, combining quantitative demographic and background data with qualitative feedback from open-ended questions and real-time interview responses. A standardised questionnaire gathered demographic information, while the feedback form and interview questions were adjusted to the specific tasks assigned in order to explore usability and user interactions comprehensively.
Findings
The findings revealed overall positive feedback, with some concerns highlighted by students who reported trouble navigating the courses in the initial prototype. In addition to the qualitative data from the user testing sessions, a quantitative method based on an online questionnaire was also utilised for the CLs after the presentation of the final layout. Positive responses were received from the CLs, and constructive suggestions were considered for FlexLearn 3.0.
Research limitations/implications
This paper is among the first that articulates the process of initiating and integrating user-centred design in an effort to improve the user experience of online and ODL platforms and LMSs. It will contribute to a dialogue on investigating and prioritising learners’ ODL experiences to ensure education equity across all levels or categories of students, which aligns with the United Nations Sustainable Development Goals.
Practical implications
The integration of UX and user testing allows us to better identify what users like, what concerns them and what they need. We gain important input on how easy or difficult the system is to use and navigate, and how much users enjoy using it. This feedback helps us adjust the design so that the final product is more in line with what users want. It also allows us to discover problems before they become major, saving time and effort later on. Finally, integrating user input improves the LMS, delivering a more enjoyable and successful learning experience for everyone.
Social implications
User-friendly systems arise as institutions prioritise user-centred design, breaking down barriers for various learners. This develops an innovative culture, improving present learning experiences and setting a precedent for future generations. The emphasis on user demands helps to create a more accessible, adaptive and egalitarian educational landscape by connecting education with current technological trends. As education becomes more inclusive, the broader community benefits, emphasising the beneficial social impact of LMS user testing.
Originality/value
Articulating the process of integrating user testing on an LMS/e-learning prototype helps us understand what users like, where they face problems and what needs improvement. By involving users in testing, we get valuable feedback on how easy the system is to use and navigate, and how much they enjoy using it overall. Case studies like this also offer universities concrete examples of real-world challenges and successes.
Citation
Mohamad Noor, N. (2024), "Initiating and integrating user-centred design in upgrading the e-learning layout: a case study on FlexLearn (Brightspace) 2.0 template", Asian Association of Open Universities Journal, Vol. 19 No. 1, pp. 101-115. https://doi.org/10.1108/AAOUJ-12-2023-0139
Publisher
Emerald Publishing Limited
Copyright © 2024, Nurdayana Mohamad Noor
License
Published in the Asian Association of Open Universities Journal. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) license. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this license may be seen at http://creativecommons.org/licences/by/4.0/legalcode
Introduction
As a leading open university in Malaysia, Wawasan Open University (WOU) has been venturing into the digitalisation of learning content to provide its students with the highest quality of open and distance learning (ODL) education. FlexLearn is the university’s trademarked e-learning platform, running on the D2L Brightspace learning management system (LMS) and managed by a unit called the Centre for ODL Experiences (COLE). COLE boasts a dynamic team comprising digital learning designers, user interface (UI) designers, LMS technologists, senior web developers and interaction designers. This collective expertise and varied experience within the team provide diverse and crucial technical perspectives, significantly contributing to COLE's growth.
In an era where education and technology converge in unprecedented ways, WOU believes in the imperative to invest both time and resources in the development of robust e-learning platforms and LMS. This aligns with the seismic transformation of the educational landscape, where more higher education institutions worldwide are embracing technological solutions to adapt in the wake of the global disruptions caused by events like the COVID-19 pandemic (Schneider and Council, 2021). Fortunately, the digitalisation of existing courses, alongside new course development, had already begun in 2019, before the COVID-19 pandemic.
Accurately promoting these tools for top-notch learning experiences is part of realising the potential of e-learning platforms and LMS (Leo et al., 2021; Nuseir et al., 2021). The design of FlexLearn and its accessibility via laptops, computers, smartphones and tablets align with learners' digital fluency and prepare them for success in a technology-driven world (Kesharwani, 2020). Keeping in mind that learners are seamlessly integrating digital devices into their lives, working professionals can also benefit from FlexLearn. This is why investing time and resources in e-learning platforms and LMS is a great investment in the future of education. D2L Brightspace is the LMS provider, and by trademarking it as FlexLearn, WOU aims to revolutionise knowledge sharing, transcend geographical barriers and cater to diverse audiences. COLE, which specialises in digitalising learning content and system management, further complements this vision by fostering engagement, collaboration and self-directed learning.
FlexLearn's integration as a transformative step resonates with the evolving demands of education. This commitment symbolises an investment in learners, educators and education itself. Amid a continuously evolving educational landscape, embracing this paradigm shift propels a future where education empowers society without being confined by limitations, hence utilising technology's potential for comprehensive enhancement. Despite the numerous challenges and hurdles encountered throughout the journey, the implementation thus far has demonstrated productivity and advantages. As FlexLearn solidifies its position as a stable e-learning platform and LMS among WOU's internal staff and students, COLE has initiated a user analysis phase to gain deeper insights into users’ needs and experiences. A user experience (UX) approach has been incorporated into this analysis process, aiming to comprehensively evaluate the experiences of students and staff when utilising the platform.
This paper thus undertakes the task of meticulously detailing the initiation and integration of user-centred design principles, strategically aimed at enhancing the user experience within the realm of online and ODL education. The objective of embracing this approach is to usher in a new era of educational technology epitomised by an LMS that surpasses its conventional role and evolves into an upgraded, highly effective facilitator of learning. In doing so, this initiative becomes an integral contributor to a broader discourse – one that prioritises the thorough investigation and elevation of learners' ODL experience. This unwavering focus on the learners' journey is not merely a technological advancement; it is a conscious step towards attaining equity in education. This effort parallels the fourth United Nations Sustainable Development Goal, particularly its aim of ensuring inclusive and quality education for all to support lifelong learning.
The research questions guiding this inquiry include: How was the UX approach initiated and integrated into FlexLearn's development processes? What insights emerged from user testing sessions with students and subject matter experts (SMEs)? How do the collected feedback and recommendations inform the iterative refinement of FlexLearn's usability and user experience? Lastly, what are the implications of integrating user-centred design principles into LMS development, particularly in online education contexts?
This paper reports a case study that examined the integration of UX within the research and development endeavours of WOU’s LMS platform. The case study unfolds through a series of distinct phases:
The initiation of a UX workshop during the COLE regroup training.
The commencement of the first on-site user testing session involving students and the initial prototype of the LMS platform.
Subsequent iterations of user testing with students, featuring an improved prototype.
Delivery of the finalised prototype for evaluation by SMEs.
The launch of pilot courses designed for real-time course testing with the primary goal of gathering user feedback for further analysis and developmental refinement.
Methodology
The data collection for this series of user testing sessions employed a mixed-methods approach. Demographic and background data were gathered quantitatively, while initial responses were collected through open-ended questions in a feedback form via Google Forms. A quantitative method based on an online questionnaire was also utilised for the course leaders (CLs) after the presentation of the final layout. The subsequent interview responses were qualitative, with questions developed in real time based on the participants' answers. These oral responses were transcribed, and only the key points regarding concerns, challenges and recommendations were recorded and discussed with the team. The integration of both quantitative and qualitative data is advocated by Creswell and Plano Clark (2017) as favourable for providing a comprehensive understanding of user experiences.
A standardised questionnaire was formulated to gather demographic and background information from the students, while a series of questions was crafted following the Nielsen Norman Group's (NN/g) guidelines for assessing usability. The open-ended questions for both the feedback form and interviews were adjusted according to the specific tasks assigned, which varied across the different user testing sessions. Questionnaires and interviews are effective methods for exploring how users interact with systems and identifying the users' preferred features (Watanabe et al., 2020). This understanding is essential for assessing usability, which determines how simple user interfaces are to operate (Seong, 2022). Usability is a critical requirement for an LMS as it enhances the efficiency of student learning and the overall educational experience (Hasan, 2019).
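To make this data-handling workflow concrete, the following is a minimal sketch of how responses exported from a Google Form might be separated into quantitative background data and qualitative open-ended feedback for separate treatment. It is an illustration only, not the centre's actual tooling; the file name and column labels are hypothetical.

```python
import csv
from collections import defaultdict

# Hypothetical column labels; an actual Google Forms export would differ.
QUANT_COLUMNS = ["Age group", "Programme", "How often do you use FlexLearn?"]
OPEN_COLUMNS = [
    "What did you like about the new layout?",
    "What problems did you face while completing the task?",
]

def split_responses(csv_path):
    """Separate quantitative background answers from open-ended feedback."""
    quantitative = defaultdict(list)  # question -> list of answers
    qualitative = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for col in QUANT_COLUMNS:
                if row.get(col):
                    quantitative[col].append(row[col])
            for col in OPEN_COLUMNS:
                if row.get(col):
                    qualitative[col].append(row[col].strip())
    return quantitative, qualitative

if __name__ == "__main__":
    quant, qual = split_responses("feedback_form_export.csv")  # hypothetical file
    for question, answers in quant.items():
        print(question, "->", len(answers), "responses")
```

The quantitative answers can then be tabulated for demographic summaries, while the open-ended answers feed into the qualitative review described in the Analysis section.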
UX workshop during COLE regroup training
The initial workshop took place during the one-week COLE Regroup 2022: Design for Learning Workshop and was led by Dr Nurdayana, a digital learning designer with expertise in UX research. This four-hour session covered an introduction to UX, the fundamental principles of UX and its application in LMS, and concluded with two group activities: storyboarding and reverse brainstorming.
Storyboarding
Storyboarding is a UX method that represents user interactions as a step-by-step comic strip and is often used to assist in planning and enhancing the user experience (Habibi, 2020). During this activity, three distinct scenarios were explored to illustrate LMS usage.
Reverse brainstorming
Reverse brainstorming is another UX method, which involves identifying the causes or reasons behind a problem rather than generating solutions (Svihla and Kachelmeier, 2020). This approach was utilised to understand the root causes of issues identified in the storyboard activity and to find effective solutions to enhance the user experience.
The session's content was primarily informed by design thinking principles advocated by the Nielsen Norman Group (Alao et al., 2022) and enriched by case studies aligned with these principles. NN/g is a pioneer in user-centred design research and development, co-founded by the father of UX, Donald Norman, and its guidance on user experience and interaction design is among the most widely referenced, both in scholarship and in practice (Alao et al., 2022; Hasan, 2019; Rodwell, 2024). NN/g emphasises the criticality of user testing in authentic business contexts, a practice it rigorously adheres to by engaging in global weekly research activities. Its extensive empirical research, which underpins its training, consulting and research services, includes testing over 2,000 interfaces, observing more than 4,000 users across 18 countries on five continents and analysing thousands of hours of user observations (Rodwell, 2024).
Additionally, Marc Hassenzahl's work (Hassenzahl, 2018) contributed valuable insights, resulting in a harmonious blend of practical industry knowledge and academic underpinnings. His work offers critical insights into user experience design, particularly emphasising the significance of user emotions and needs in interaction with digital products (Følstad and Brandtzaeg, 2020). His theoretical framework, which integrates pragmatic and hedonic qualities, highlights the importance of designing for both pleasure and usability. This approach informs the user testing case study by highlighting the necessity of balancing functional performance with positive affective experiences, thus guiding the evaluation and enhancement of interface designs to better meet user expectations and improve overall satisfaction.
Meanwhile, the group activities were conducted as part of the ideation phase and guided by the principles outlined in What is Ideation? (2016), with the aim of delving deeper into the challenges faced by various user types within the LMS. These user categories encompassed students, subject matter experts (SMEs), and UI designers. The activities provided valuable insights into the diverse experiences and obstacles encountered during LMS navigation and usage, thereby informing ongoing efforts to enhance its usability.
First on-site user testing session
The on-site user testing session was conducted with four respondents who were students. This session served as the usability evaluation of Prototype 1, focusing on user experiences among students. A one-to-one interview method was employed to glean insights into user preferences and challenges that would guide the refinement of the navigation and interface. All respondents were allotted 15 min to complete the given task (completing Self-check 1 from the first landing page in Unit 1), followed by an interview to obtain their comments and actions for subsequent analysis.
User 1 articulated a preference for a content layout reminiscent of Lithan's design, another external LMS platform used by one of the schools. They expressed appreciation for the quiz feature and the two-column content structure. However, challenges were noted in task completion due to excessive scrolling, despite satisfaction with the navigation. They advocated for an auto-play video pop-out and emphasised the need for concise content to mitigate scrolling. Meanwhile, User 2 disapproved of extensive scrolling and recommended increased flexibility in quiz answers. They commended the quick link feature and proposed a floating video pop-out for simultaneous reading and watching. Viewing quizzes as recreational, they suggested enhancements and contemplated exploring a universally appealing video pop-out.
User 3 expressed dissatisfaction with excessive scrolling and favoured a pop-out video feature, echoing sentiments similar to those of User 2. However, User 4 disapproved of numerous animated effects and extensive scrolling. They recommended a sticky subtitle/header for improved navigation, suggested subsections within an accordion and preferred a condensed overview of all content on a single accordion page. User 4 also offered a comprehensive set of suggestions to reduce animated effects and explored additional navigation features. Their responses are presented in Table 1.
The findings from this session highlighted scrolling issues and ways to optimise content. User preferences for auto-play video pop-outs and enhanced quiz features also emphasised the importance of aligning the interfaces with recreational learning experiences. Therefore, the team discussed and planned on enhancing the interface by considering feedback on scrolling, video pop-outs and quiz features.
Second user testing session
During the second user testing for Prototype 2, eight respondents (students) were assigned the task of completing a sample quiz in Unit 1.2 on the FlexLearn platform. All respondents initiated the task from the first landing page and after the quiz, they were required to partake in a survey administered via Google Form with a designated timeframe of 15 min. It was followed by a one-to-one interview.
User feedback from this testing phase revealed nuanced perspectives among the respondents. User 1, a frequent FlexLearn user via laptop, expressed preference for the new layout's design yet underscored issues with the reliability of app notifications. A suggestion was made to introduce a live group chat pop-up directly on the FlexLearn platform, a feature slated for potential incorporation in FlexLearn 3.0. User 2, another regular FlexLearn user, reported confusion with the current layout, praising the clearer navigation and improved design of the new version. Several recommendations were offered on the implementation of a search box, the rectification of previous semester quiz visibility and the inclusion of additional external learning materials.
User 3, an occasional FlexLearn user, voiced contentment with the extant design but sought chunkier information, minimised scrolling and a dedicated search function for key points. User 4, a daily FlexLearn user, acknowledged the design improvements in the new layout and advocated for further enhancements, including the introduction of collapsible content. User 5, an infrequent FlexLearn user preferring apps, lauded the new layout's design but raised concerns about slow loading during internet fluctuations, prompting recommendations to reduce file sizes for faster loading and the incorporation of more videos. User 6, a daily FlexLearn user on both laptops and apps, expressed satisfaction with the new layout and preferred comprehensive content organised under expandable subtopics.
As daily FlexLearn users, Users 7 and 8 conveyed contentment with the new layout's overall design and improvements. Aside from the general satisfaction, User 8 articulated a preference for single paragraphs over columned content and suggested the inclusion of more infographics and integrating podcasts into the course. Observational analysis discerned a user's struggle with columned content, prompting the consideration of collapsible accordion lists for improved readability. Their responses are presented in Table 2.
In conclusion, the user feedback and observational analysis from Prototype 2 revealed positive responses to the new layout, with identified areas for refinement. The proposed modifications centred on enhancing navigation, restructuring content organisation and addressing technical concerns to foster an optimised and engaging learning environment for FlexLearn users. The proposed changes are envisioned as essential enhancements in the ongoing evolution of the FlexLearn platform, with further refinements slated for the prospective development of FlexLearn 3.0. Figure 1 illustrates the principal user testing protocol employed in this series of user testing.
Finalised prototype presentation to subject matter experts
Following further enhancement based on the second user testing, the team finalised Prototype 2.0, an improved version of Prototype 2, and presented it to the course leaders (CLs), who served as the subject matter experts (SMEs). They were given a Google Form survey to provide feedback so that their perception and acceptance of the new prototype could be gauged. Evaluation of their feedback on Prototype 2.0 yielded comprehensive insights. The SMEs, totalling 20 respondents, conveyed positive first impressions of the prototype, expressing curiosity and interest in exploring its interactive features. Although the majority expressed favourable views, a single negative comment, somewhat unrelated to the prototype, was observed.
Regarding the navigational aspects of FlexLearn, all SMEs provided ratings exceeding 7, with varied scores including 50% at 8, 25% at 9 and 25% at 10. This was indicative of an overall positive perception of the navigation design. Furthermore, the user-friendliness of the navigation received approval, with 70% of the SMEs assigning a rating of 4 and 15% attributing a rating of 5. The perceived usefulness of new features, such as the Pomodoro timer and games, also garnered positive responses, with all SMEs assigning ratings above 3. Notably, 70% of them assigned a rating of 4 and 15% assigned a rating of 5, signalling a consensus on their utility. In the domain of preferences, the SMEs exhibited a collective liking for the fresh and original aesthetic of the prototype, citing tidiness, user-friendly navigation and an overall positive reception. However, several SMEs expressed a need for hands-on interaction to provide a more informed and nuanced assessment.
Regarding specific features, all SMEs appreciated the user-friendliness, enhanced interactivity and improved navigation facilitated by the new features. Additionally, the provision of more options for applications and games resonated positively among the SMEs. Despite predominantly positive feedback, the SMEs articulated various dislikes, including concerns related to content wording, preferences for PDF printing and reservations about the suitability of games for senior students. Such mixed feedback underscores the multifaceted nature of user preferences within the CL domain.
However, all SMEs expressed positive reactions to the clean graphics, colourful design and realisation of in-house UI/UX expertise, thus highlighting the proficiency of the development team. Concerning irritations, most SMEs reported a lack of discernible irritation. Nevertheless, one SME alluded to potential improvements in managing technical jargon to enhance the overall clarity of the presentation.
The feedback on Prototype 2.0, as summarised in Table 3, was highly useful to the team in their effort to refine and tailor the platform to the unique needs of the SMEs. The diverse perspectives offered shed light on positive perceptions, constructive criticisms and areas for potential enhancement, contributing substantively to the iterative development of FlexLearn. Because a few negative responses were received from several SMEs, a follow-up user feedback form was sent selectively to those SMEs to obtain more in-depth insights that could yield actionable feedback.
Results from the follow-up user feedback highlighted important insights into the SMEs' experiences with the FlexLearn prototype. The responses from five SMEs demonstrated their limited experience with LMS, primarily having used Moodle. This suggests a potential learning curve for those transitioning from Moodle to FlexLearn. The SMEs expressed a desire for centralised systems, ease of use and better integration with the student information systems in FlexLearn. However, some indicated a need for familiarisation with FlexLearn's unique features. Suggestions for additional features included increased storage capacity, improved navigation, the ability for tutors and CLs to mark assignments within FlexLearn, and the introduction of gamification elements. Overall, the SMEs expressed satisfaction with the prototype, although one of them described the colour scheme as “ordinary”, suggesting a potential area for improvement. The positive overall opinions indicate a favourable impression of the prototype, and further exploration and clarification of colour preferences may be necessary. Upon careful consideration of the feedback, COLE selected two courses to be developed using the enhanced 2.0 template for the coming semester with actual students, so as to gauge their user experience with the template.
Pilot courses launched for real-time course testing
The pilot courses, however, did not gather much user feedback. The Google Form survey collected only seven responses from the actual students of the two pilot courses. The initial impressions were generally positive, with two negative comments, one of which was irrelevant to the layout and focused instead on content. Navigational experiences varied, with five students rating them 5 or above and two rating them below 5. The ease of navigation received mixed ratings, with six students rating it 3 or above and one student rating it 1.
Regarding the overall experience with the new FlexLearn template, four students rated it 3 or above, while three rated it below 3. Satisfaction with template traits showed diverse responses: the overall design and layout received positive feedback, interactivity had mixed ratings and colour was generally well received. Additionally, navigation and new features like a study timer and quick links received varied feedback.
Among the positive aspects of the prototype highlighted by the students were announcements, segmented groups, design, colouring and layout. However, their concerns centred around quiz errors, misplaced content and difficulty downloading materials. Unexpectedly, the experience with the new interface and interactivity was mentioned positively, while issues with downloading and understanding layout formats were sources of irritation for some students.
Furthermore, several concerns were raised regarding the loss of marks, the format of notes in PDFs and difficulties in understanding the discussion layout, all of which were irrelevant to the template, as the template applies only to the course content and not to the discussion tab in FlexLearn. Some of the responses displayed a certain level of reluctance to adapt to digital learning. Overall, student suggestions were limited, with some expressing uncertainty and others preferring a return to PDF files. The report indicates a generally positive response to the FlexLearn 2.0 template, with specific areas for improvement, particularly in content organisation and download functionality. The summary is presented in Table 4.
Due to the underwhelming number of responses received from actual students of the courses, COLE conducted two separate methods to gather feedback on the pilot courses through the voluntary participation of students not enrolled in the courses:
On-site user testing session with voluntary on-campus learning (OCL) students.
Online user testing session for a given time window for ODL students.
The on-site user testing session revealed a predominantly positive response from the students. Initial impressions were favourable with most students finding the new template interesting and user-friendly. While the majority had a smooth navigation experience and satisfactory overall feedback, some students noted complications, especially for new users. The report emphasises the importance of addressing these concerns to enhance the template's user-friendliness and suggests iterative improvements based on positive user feedback.
Specific trait satisfaction ratings, including design, layout, interactivity, colour, navigation and new features, indicated overall positive sentiments, with some areas identified for refinement. Notably, students appreciated the colour scheme and the quiz feature providing instant correct answers. Suggestions for improvement included adjusting quiz time, organising content more effectively and enhancing the notification system. One user expressed concern that general announcements and personalised notifications were not separated. She also compared the platform with Moodle, which was used at her previous university. Although the user testing focused on the template of the course content and not the whole LMS platform, her insights were still valuable for future discussion. The summary is presented in Table 5.
The online session for ODL students revealed generally positive impressions of the platform. Students found FlexLearn to be “nice” and “refreshing”, with positive comments on ease of use and improved content location. Navigating the platform received high ratings, with the majority providing scores of 5 or higher, indicating an overall pleasant experience. However, there were variations in the ease of navigation ratings, and some students took longer than 15 min to complete the given task.
In terms of overall template experience, the majority of students expressed satisfaction, particularly with the design, layout, colour scheme and navigation features. Nevertheless, a few students provided lower ratings for interactivity and one reported an issue with discussions and notifications. While some users liked the overall design and colourfulness, concerns were raised about capturing information, too many tabs and the lack of downloadable content options. The report suggests areas for improvement, including addressing the discussion and notification issues and considering alternative content download formats. Further user input is recommended for ongoing enhancements to the FlexLearn platform. The summary of the online session responses is summarised in Table 6.
Analysis
The collected data from user testing, SME feedback and pilot course feedback were analysed through collaborative discussions with the COLE team. This process involved systematically reviewing and interpreting the data to identify key themes and patterns. The team leveraged their collective expertise to contextualise the feedback within the framework of the FlexLearn prototype project's objectives.
Feedback was meticulously examined to assess user satisfaction, identify usability issues and understand user needs and preferences. The UX researcher led the team in employing qualitative analysis techniques, such as coding and thematic analysis, to categorise the feedback into meaningful themes. These themes were then cross-referenced with project goals to evaluate alignment and identify areas for improvement.
Subsequent discussions synthesised the insights gained from the feedback, leading to actionable recommendations for enhancing the user experience. The team prioritised these recommendations based on feasibility, impact and alignment with strategic objectives. This iterative process ensured that user feedback was comprehensively understood and effectively translated into practical improvements for the e-learning platform.
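To make the coding and tallying step concrete, the following is a minimal sketch of how manually assigned codes could be aggregated into themes for discussion. The codes, theme mapping and excerpts are hypothetical examples, not the team's actual coding scheme, which was developed collaboratively during discussions.

```python
from collections import Counter

# Hypothetical codes assigned to transcribed interview excerpts during manual coding.
coded_feedback = [
    ("User 1", "excessive_scrolling"),
    ("User 1", "video_popout_request"),
    ("User 2", "excessive_scrolling"),
    ("User 3", "excessive_scrolling"),
    ("User 4", "too_many_animations"),
]

# Hypothetical mapping of low-level codes to higher-level themes.
code_to_theme = {
    "excessive_scrolling": "Navigation and content length",
    "video_popout_request": "Media presentation",
    "too_many_animations": "Visual load",
}

theme_counts = Counter(code_to_theme[code] for _, code in coded_feedback)
for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned {count} time(s)")
```

Such frequency counts offer one simple way to surface the dominant themes before cross-referencing them with project goals, as described above.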
Discussion
The discussion in this paper is based on a series of iterative processes involving research, design, development and testing. This iterative approach aims to plan future template development and fully utilise the capabilities of FlexLearn. The conclusions and results gathered from the integration of UX into FlexLearn 2.0 provided valuable insights into various aspects of the platform. The following are the key findings from the series of user testing and analyses:
Initial impressions
Most users had positive initial impressions, describing FlexLearn as “nice” and “refreshing”. Positive feedback indicates a successful design that captures users' attention and makes a favourable first impression. However, some users provided uninformative or irrelevant comments. This suggests a need for further clarification or improvement in communicating key features to users upon their first interaction with the platform, and it is one of the many roadblocks encountered in this design and development process.
Navigating experience
Almost all users rated their navigating experience as 5 or above, indicating an overall positive and pleasant experience. This suggests that users find FlexLearn intuitive and user-friendly for achieving their study goals. Despite the high ratings, the limited variability of scores may indicate a ceiling effect, suggesting that while users find navigation easy, there might be opportunities for further improvement that are not reflected in the current rating scale. As we try to embed inclusivity and accessibility in our design to ensure that all levels of user proficiency and ability are catered for, navigation is the most important aspect to include in every research and development phase.
Ease of navigation
From the first prototype to the final template, responses to the navigation improved. The majority of users rated the ease of navigation highly as we continued to listen to their feedback and redesign the template. This consistency in positive responses aligns with the high overall navigating experience scores, confirming that users generally found it easy to navigate FlexLearn with the final template. While most users found navigation easy, the presence of a few low ratings suggests that some aspects or features might need improvement to enhance the overall ease of navigation.
Overall experience with the template
The majority of users rated their overall experience with the final FlexLearn template as 4 or 5, indicating a satisfying user experience. This positive response suggests that the template’s design and features contribute to a generally favourable learning environment. Despite the overall satisfaction, some users rated their experience low, which calls for continuous effort in user testing. This further confirms that identifying and addressing the specific concerns of these users will provide valuable insights for template enhancement.
Satisfaction with template traits
Most users expressed satisfaction with various traits of the latest template, including design, layout, colour, navigation and new features. This indicates that the majority of the template's characteristics meet or exceed user expectations. All these traits aim to create a more engaging learning experience with an appropriate degree of interactivity. However, some users provided lower ratings for interactivity, which indicates either that the level of interactivity is too minimal or that they find it impractical. Some users are very task-oriented: their only concern is the learning content, and they find the interactivity features somewhat irritating or unhelpful. They also expressed varied opinions about the notifications. Exploring these concerns to address any issues related to discussions and notifications can enhance overall satisfaction with these features.
Time to complete the task
One of the ways to measure the template's overall experience and navigation was to look at the time taken by users to complete the given tasks. Most users efficiently completed the tasks in 10 min or less, although a few users admitted to taking longer than 15 min. We therefore analysed the tasks that took longer in an attempt to understand the reasons and challenges behind the extended duration and to identify potential usability improvements. The most common reason given by users was a lack of familiarity with FlexLearn, because they rarely use the platform and often download the content for offline access. Understanding the specific aspects that users found easy or challenging can guide targeted improvements. After all, the consensus that the tasks were easy to complete aligns with the high overall satisfaction ratings.
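For illustration, the sketch below shows how task completion times could be summarised and slow cases flagged against the 15-minute allotment used in the testing protocol. The times are placeholder values, not the recorded session data.

```python
from statistics import median

# Placeholder completion times in minutes; not the recorded session data.
completion_times = {
    "User 1": 8, "User 2": 10, "User 3": 9, "User 4": 17,
    "User 5": 7, "User 6": 12, "User 7": 9, "User 8": 16,
}

TIME_LIMIT_MIN = 15  # time allotted per task in the testing protocol

print("Median completion time:", median(completion_times.values()), "min")
slow_users = [user for user, t in completion_times.items() if t > TIME_LIMIT_MIN]
print("Exceeded the allotted time:", ", ".join(slow_users) or "none")
```

Flagged cases can then be followed up in the interviews to uncover the reasons behind the longer durations, as described above.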
Full screen viewing
We included this question to understand how users usually access the platform. The full features of the template are at their best and fully optimised when users view the course content in full screen. However, only half of the users indicated viewing in full screen, while the other half did not. Exploring the reasons behind these preferences and the potential benefits of full-screen viewing can inform future design considerations.
Likes and dislikes about the template
The users generally liked the overall design, colourfulness and clarity of the icons. However, concerns about capturing information, too many tabs and the inability to download content in specific formats suggest that some users were simply unaware of how to download the content for offline access. This presents an opportunity for further enhancement. For example, the comment on too many tabs could be explored further to identify specific pain points and improve the organisation of content.
Irritations and concerns
Most users did not express significant irritation or concerns about their experience with the final template. Although one user mentioned difficulties with chat and discussions that could possibly indicate potential issues that need attention, this issue is beyond the scope of the template, which only concerns the course content layout and design. Chat and discussion are presented on a different tab within the FlexLearn platform and are not integrated into the course content.
Thoughts and suggestions
Most users did not provide specific thoughts or suggestions during the survey, indicating a potential area for improvement in gathering actionable feedback. Only one user from the interview for the pilot course for OCL students offered extensive thoughts and suggestions on the template. In the future, we need to emphasise encouraging users to share more detailed insights that can provide a richer understanding of their needs and preferences.
Conclusion
In summary, the comprehensive series of user testing conducted for the FlexLearn 2.0 layout yielded predominantly positive insights into the user experience. While the overall sentiment is favourable, a detailed analysis of the feedback underscores several areas demanding improvement. Notably, attention should be directed towards resolving issues pertaining to navigation, restructuring the organisation of content for increased user clarity and exploring options to cater to user preferences for full-screen viewing. These identified areas for enhancement signify valuable opportunities to refine the FlexLearn platform further.
Integrating the UX approach into e-learning or LMS platforms has been proven to significantly enhance the overall learning experience by ensuring that courses are designed with a focus on users' needs, preferences and behaviours, leading to increased engagement and interactivity. The emphasis on usability and accessibility makes the platform more user-friendly, catering to a diverse audience. Personalised learning journeys based on individual profiles and preferences contribute to a more adaptive and effective educational environment. Moreover, the application of UX principles results in higher retention and completion rates as well as more efficient task completion.
Continuous feedback loops enable ongoing improvements, aligning the technology seamlessly with pedagogical goals. A well-designed UX can not only positively influence the brand image but also foster user loyalty, creating a conducive and successful learning ecosystem. To boost the optimisation of the overall learning experience, a more in-depth investigation should be undertaken to delve into specific user concerns and preferences. By addressing these nuanced aspects, the development team can implement targeted enhancements that align more closely with user expectations, ensuring that this LMS continues to evolve as a user-friendly and effective learning platform. Generally, all learning institutions should invest more in the research and development of the LMS platform, as e-learning optimises processes, allocates resources efficiently and mitigates traditional classroom limitations (Laufer et al., 2021). As an environmentally conscious approach, it also reduces reliance on physical establishments and supports budget-friendly tuition, underscoring both financial prudence and ecological responsibility.
Ethically investing in LMS development empowers a technologically advanced learning environment that can equip students with skills for a digital future (Kesharwani, 2020; Valtonen et al., 2022). For universities, the insights gained from testing help enhance the overall functionality and usability of the LMS. By identifying areas for improvement through case studies, universities can tailor the LMS to better align with the preferences and challenges of their student body. This, in turn, fosters a more engaging and efficient online learning experience. Additionally, universities profit from resolving technical issues uncovered during user testing, ensuring a seamless learning process. The collaborative effort involving digital learning designers, UI designers, LMS technologists and interaction designers within university teams contributes to a well-rounded perspective, promoting continuous growth. For students, the advantages are significant. This user testing results in a user-friendly platform that is intuitive and easy to navigate, which saves time, reduces frustration and ultimately enhances the quality of the educational journey. By addressing specific needs and preferences identified through testing, universities can create a positive and effective learning environment, contributing to increased student satisfaction and success in their academic pursuits.
Our commitment to inclusivity aligns with the institution's motto – ensuring high-performers have access to flexible, self-paced online learning. This nurtures attributes essential for personal and professional success: digital literacy, critical thinking and adaptability. We hope that more universities will prioritise students' user experience with e-learning platforms in order to provide the best learning experience and a learning environment conducive to lifelong, affordable learning.
Figures
Responses for user testing on Prototype 1
User | Preferences | Appreciations | Challenges | Recommendations |
---|---|---|---|---|
1 | Content layout similar to Lithan's design; two-column content structure | Quiz feature; navigation | Excessive scrolling hindering task completion | Auto-play video pop-out; concise content to reduce scrolling |
2 | Increased flexibility in quiz answers | Quick link feature | Extensive scrolling | Floating video pop-out for simultaneous reading and watching; enhancements to quizzes |
3 | Pop-out video feature | – | Excessive scrolling | – |
4 | Sticky subtitle/header for navigation; subsections within an accordion; condensed overview on a single accordion page | – | Numerous animated effects; extensive scrolling | Reduce animated effects; explore additional navigation features |
Source(s): Author's work
Responses for user testing on Prototype 2
User | Preferences | Appreciations | Challenges | Recommendations |
---|---|---|---|---|
1 | – | New layout's design | Unreliable app notifications | Live group chat pop-up on FlexLearn platform |
2 | – | Clearer navigation; improved design of the new version | Confusion with current layout | Search box; rectify previous semester quiz visibility; additional external learning materials |
3 | Chunkier information; dedicated search function for key points | Contentment with existing design | Excessive scrolling | Minimize scrolling |
4 | Collapsible content | Design improvements in the new layout | – | Further design enhancements |
5 | – | New layout's design | Slow loading during internet fluctuations | Reduce file sizes; incorporate more videos |
6 | Comprehensive content organized under expandable subtopics | New layout | – | – |
7 | – | New layout's overall design and improvements | – | – |
8 | Single paragraphs over columned content | New layout's overall design and improvements | – | More infographics; integrate podcasts into the course |
Source(s): Author's work
SMEs’ feedback on Prototype 2.0
Aspect | Feedback | Summary |
---|---|---|
First impressions | Positive | Curiosity and interest expressed towards interactive features. A single negative comment observed |
Navigational aspects | 50% at 8; 25% at 9; 25% at 10 | Overall positive perception. User-friendliness approved |
User-friendliness | 70% - rating: 4; 15% - rating: 5 | High approval rate for user-friendliness |
New features | All ratings above 3 | Positive responses for Pomodoro timer and games |
Preferences | Liking for fresh aesthetic; need for hands-on interaction | Positive reception; desire for more hands-on experience |
Specific features | Appreciation for user-friendliness, enhanced interactivity, improved navigation | Positive feedback for new features; some dislikes expressed |
Graphics and UI/UX expertise | Positive reactions | Clean graphics, colourful design and in-house expertise acknowledged |
Irritations | Lack of discernible irritations; one SME suggested improvement in managing technical jargon | Minor irritations reported; room for clarity improvement |
Source(s): Author's work
Pilot course feedback by actual students
Aspect | Feedback |
---|---|
Initial impressions | Generally positive with two negative comments, one of which was irrelevant to the layout |
Navigational experiences | Varied ratings, with five students rating 5 and above, and two rating below 5. Ease of navigation received mixed ratings, with six students rating 3 and above and one rating 1 |
Overall experience | Mixed ratings, with four students rating 3 and above, and three rating below 3. Diverse responses regarding template traits, with positive feedback on design and layout, mixed ratings for interactivity and generally positive feedback on colour |
Positive aspects | Highlighted aspects include announcements, segmented groups, design, colouring and layout |
Concerns | Centred around quiz errors, misplaced content, difficulty in downloading materials and issues with understanding layout formats |
Unexpected positive mentions | Experience with new interface and interactivity mentioned positively |
Out-of-scope concerns raised | Loss of marks, format of notes in PDFs and difficulties in understanding the discussion layout; deemed irrelevant to the template, which applies only to the course content |
Student suggestions | Limited, with some expressing uncertainty and others preferring a return to PDF files |
Overall response | Generally positive response to FlexLearn 2.0 template, with specific areas for improvement noted in content organization and download functionalities |
Source(s): Author's work
On-site user testing on the pilot courses
Aspect | Feedback |
---|---|
Initial impressions | Most students found the new template interesting and user-friendly. Some noted complications, especially for new users |
Specific trait satisfaction | Overall positive sentiments with areas identified for refinement. Notably, appreciation for colour scheme and quiz feature |
Suggestions for improvement | Adjusting quiz time, organizing content more effectively and enhancing notification system. Concerns about unseparated notifications and comparison with Moodle |
Source(s): Author's work
Online session responses on the pilot courses
Aspect | Feedback |
---|---|
Online session feedback | Generally positive impressions of platform. Positive comments on ease of use and improved content location |
Navigation experience | High ratings for navigating the platform, but variations in ease of navigation ratings. Some students took longer than 15 min to complete tasks |
Overall template experience | Majority expressed satisfaction with design, layout, colour scheme and navigation. Some lower ratings for interactivity and concerns about discussions and notifications |
Areas for improvement | Addressing discussion and notification issues, considering alternative content download formats. Further user input recommended for ongoing enhancements |
Source(s): Author's work
References
Alao, O.D., Priscilla, E.A., Amanze, R.C., Kuyoro, S.O. and Adebayo, A.O. (2022), “User-centered/user experience Uc/Ux design thinking approach for designing a university information management system”, Ingenierie Des Systemes d'Information, Vol. 27 No. 4, pp. 577-590, doi: 10.18280/isi.270407.
Creswell, J.W. and Plano Clark, V.L. (2017), Designing and Conducting Mixed Methods Research, SAGE, Thousand Oaks, CA.
Følstad, A. and Brandtzaeg, P.B. (2020), “Users' experiences with chatbots: findings from a questionnaire study”, Quality and User Experience, Vol. 5 No. 1, pp. 1-14, doi: 10.1007/S41233-020-00033-2.
Habibi, M. (2020), “Storyboarding in user experience (UX) design”, Medium, available at: https://bootcamp.uxdesign.cc/storyboarding-in-user-experience-ux-design-7eba60e10fa0
Hasan, L. (2019), “The usefulness and usability of Moodle LMS as employed by Zarqa University in Jordan”, Journal of Information Systems and Technology Management, Vol. 16, pp. 1-19, doi: 10.4301/s1807-1775201916009.
Hassenzahl, M. (2018), “The thing and I: understanding the relationship between user and product”, in Blythe, M.A., Overbeeke, K., Monk, A.F. and Wright, P.C. (Eds), Funology: from Usability to Enjoyment, Springer, Netherlands, pp. 301-313, doi: 10.1007/978-3-319-68213-6_19.
Kesharwani, A. (2020), “Do (how) digital natives adopt a new technology differently than digital immigrants? A longitudinal study”, Information and Management, Vol. 57 No. 2, 103170, doi: 10.1016/J.IM.2019.103170.
Laufer, M., Leiser, A., Deacon, B., Perrin de Brichambaut, P., Fecher, B., Kobsda, C. and Hesse, F. (2021), “Digital higher education: a divider or bridge builder? Leadership perspectives on edtech in a COVID-19 reality”, International Journal of Educational Technology in Higher Education, Vol. 18 No. 1, p. 51, doi: 10.1186/s41239-021-00287-6.
Leo, S., Alsharari, N.M., Abbas, J. and Alshurideh, M.T. (2021), “From offline to online learning: a qualitative study of challenges and opportunities as a response to the COVID-19 pandemic in the UAE higher education context”, Studies in Systems, Decision and Control, Vol. 334, pp. 203-217, doi: 10.1007/978-3-030-67151-8_12/COVER.
Nuseir, M.T., El-Refae, G.A. and Aljumah, A. (2021), “The e-learning of students and university's brand image (post COVID-19): how successfully Al-ain university have embraced the paradigm shift in digital learning”, Studies in Systems, Decision and Control, Vol. 334, pp. 171-187, doi: 10.1007/978-3-030-67151-8_10/COVER.
Rodwell, E. (2024), “Hyped but invisible: good UX and good gender practices in and out of the conversational AI sandbox”, Practicing Anthropology, Vol. 46 No. 1, pp. 46-53, doi: 10.1080/08884552.2024.2305560.
Schneider, S.L. and Council, M.L. (2021), “Distance learning in the era of COVID-19”, Archives of Dermatological Research, Vol. 313 No. 5, pp. 389-390, doi: 10.1007/s00403-020-02088-9.
Seong, D.K.-S. (2022), “The framework of the transition of UX design workshops into the non-Face-to-Face”, The Journal of the Korea Contents Association, Vol. 22 No. 3, pp. 309-321, doi: 10.5392/JKCA.2022.22.03.309.
Svihla, V. and Kachelmeier, L. (2020), “The wrong theory protocol: a design thinking tool to enhance creative ideation”, Proceedings of the 6th International Conference on Design Creativity, ICDC 2020, pp. 223-230, doi: 10.35199/ICDC.2020.28.
Valtonen, T., López-Pernas, S., Saqr, M., Vartiainen, H., Sointu, E.T. and Tedre, M. (2022), “The nature and building blocks of educational technology research”, Computers in Human Behavior, Vol. 128, 107123, doi: 10.1016/j.chb.2021.107123.
Watanabe, M., Washio, T., Iwasaki, M., Arai, T., Saijo, M. and Ohashi, T. (2020), “How effectively do experts predict elderly target-users of assistive devices? Importance of expert knowledge in device development”, Design, User Experience, and Usability. Interaction Design, 9th International Conference, DUXU 2020, Held as Part of the 22nd HCI International Conference, HCII 2020, Copenhagen, Denmark, July 19–24, 2020, Proceedings, Part I, 1st ed, doi: 10.1007/978-3-030-49713-2_20, available at: https://link.springer.com/book/10.1007/978-3-030-49713-2#bibliographic-information
What is Ideation? (2016), Interaction Design Foundation, available at: https://www.interaction-design.org/literature/topics/ideation