Facilitation of information literacy through a multilingual MOOC considering cultural aspects

Stefan Dreisiebner (Department of Information Science and Information Systems, University of Graz, Graz, Austria) (Department of Information Science and Natural Language Processing, University of Hildesheim, Hildesheim, Germany)
Anna Katharina Polzer (Department of Information Science and Information Systems, University of Graz, Graz, Austria)
Lyn Robinson (Centre for Information Science, City, University of London, London, UK)
Paul Libbrecht (Technology Based Assessment Unit, Leibniz Institute for Research and Information in Education, Frankfurt am Main, Germany) (IUBH International University of Applied Sciences, Bad Honnef, Germany)
Juan-José Boté-Vericad (Department of Library and Information Science and Media, University of Barcelona, Barcelona, Spain)
Cristóbal Urbano (Department of Library and Information Science and Media, University of Barcelona, Barcelona, Spain)
Thomas Mandl (Department of Information Science and Natural Language Processing, University of Hildesheim, Hildesheim, Germany)
Polona Vilar (Department of Library and Information Science and Book Studies, University of Ljubljana, Ljubljana, Slovenia)
Maja Žumer (Department of Library and Information Science and Book Studies, University of Ljubljana, Ljubljana, Slovenia)
Mate Juric (Department of Information Sciences, University of Zadar, Zadar, Croatia)
Franjo Pehar (Department of Information Sciences, University of Zadar, Zadar, Croatia)
Ivanka Stričević (Department of Information Sciences, University of Zadar, Zadar, Croatia) (National and University Library, Zagreb, Croatia)

Journal of Documentation

ISSN: 0022-0418

Article publication date: 23 December 2020

Issue publication date: 8 April 2021




Purpose

The purpose of this paper is to demonstrate the rationale, technical framework, content creation workflow and evaluation of a multilingual massive open online course (MOOC) to facilitate information literacy (IL), considering cultural aspects.


Design/methodology/approach

A good practice analysis formed the basis for the technical and content framework. The evaluation approach consisted of three phases: first, the students were asked to fill out a short self-assessment questionnaire and a shortened, adapted version of a standardized IL test. Second, they completed the full version of the IL MOOC. Third, they were asked to fill out the full version of a standardized IL test and a user experience questionnaire.


Findings

The results show, first, that the designed workflow was suitable in practice and led to the implementation of a full-grown MOOC. Second, the implementation itself provides implications for future projects developing multilingual educational resources. Third, the evaluation results show that participants achieved significantly higher results on a standardized IL test after attending the MOOC as mandatory coursework. Variations between the different student groups in the participating countries were observed. Fourth, self-motivation to complete the MOOC proved to be a challenge for students asked to attend the MOOC as a nonmandatory out-of-classroom task. It seems that multilingual facilitation alone is not sufficient to increase active MOOC participation.


Originality/value

This paper presents an innovative approach to developing multilingual IL teaching resources and is one of the first works to evaluate the impact of an IL MOOC on learners' experience and learning outcomes in an international evaluation study.



Dreisiebner, S., Polzer, A.K., Robinson, L., Libbrecht, P., Boté-Vericad, J.-J., Urbano, C., Mandl, T., Vilar, P., Žumer, M., Juric, M., Pehar, F. and Stričević, I. (2021), "Facilitation of information literacy through a multilingual MOOC considering cultural aspects", Journal of Documentation, Vol. 77 No. 3, pp. 777-797. https://doi.org/10.1108/JD-06-2020-0099



Emerald Publishing Limited

Copyright © 2020, Stefan Dreisiebner, Anna Katharina Polzer, Lyn Robinson, Paul Libbrecht, Juan-José Boté-Vericad, Cristóbal Urbano, Thomas Mandl, Polona Vilar, Maja Žumer, Mate Juric, Franjo Pehar and Ivanka Stričević


Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Information literacy (IL) refers to a “set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning” (ACRL, 2016). IL can be considered a necessary prerequisite for participating in today's knowledge society (Klusek and Bornstein, 2006; UNESCO and IFLA, 2012). Nevertheless, several studies have found students' IL skills to be weak. For example, students seem to have trouble using Boolean operators and organizing literature, and tend not to know the appropriate sources for finding scientific literature (Skipton and Bail, 2014; Maurer et al., 2016). In addition, students tend to overestimate their IL skills (Michalak and Rysavy, 2016). New digital channels, especially massive open online courses (MOOCs), seem to be a promising approach for developing students' IL skills (Creed-Dikeogu and Clark, 2013; Gore, 2014; Dreisiebner, 2019). MOOCs have a course-like structure and no formal requirements for enrollment; they have no limit on the number of participants and all materials are available online (Veletsianos and Shepherdson, 2016; Wulf et al., 2014).

Today we see a rising number of MOOCs on IL. While early literature agreed on the possible benefits of MOOCs for IL, concerns were also expressed about how to design them and whether libraries would have the capacities for such projects (Creed-Dikeogu and Clark, 2013; Gore, 2014). As IL MOOCs are provided online with unlimited enrollment, participants might come from all over the world. This creates challenges for the content and instructional design, particularly in IL, where country- and language-specific aspects such as different local information resources should also be considered. However, multilingual and multicultural implementations of IL MOOCs are still rare (Dreisiebner, 2019; Nowrin et al., 2019). Peer-reviewed literature on the multilingual implementation of MOOCs is likewise scarce and focuses on the potential for language learning (Martín-Monje and Bárcena, 2015; Clinnin, 2014); a recent overview has been provided by Nowrin et al. (2019). Multilingual facilitation might also have a positive effect on active MOOC participation (Colas et al., 2016). This paper describes the outcomes of a good practice analysis in IL education and the resulting technical framework, content creation workflow and content framework for a multilingual MOOC to facilitate IL considering cultural aspects, and reports on the evaluation of the MOOC's impact on participants' IL skills and of how it is perceived in terms of user experience. This leads to the main research questions: How can IL be facilitated through a multilingual MOOC? How is such a MOOC perceived by students and how does it impact their IL skills?

The ILO project approach

Methodologically, this research follows the design-based research approach for library and information science, which “draws upon the authentic knowledge of practitioners and users” and “sets out to solve real-life problems with innovative solutions, results in a working product that has been honed to suit the actual needs of its users, and contributes explanatory models or theory” (Bowler and Large, 2008). Given that multilingual and multicultural implementations of IL MOOCs are still rare, the project Information Literacy Online (ILO) was started. Accordingly, the main outcome of the ILO project was a MOOC for developing IL, focusing on higher education students. Seven European higher education institutions collaborated within the project: the University of Graz (Austria), University of Hildesheim (Germany), Leibniz Institute for Research and Information in Education (Germany), City, University of London (Great Britain), University of Barcelona (Spain), University of Ljubljana (Slovenia) and the University of Zadar (Croatia). ILO was co-funded by the European Union through the Erasmus+ Key Action 2: Strategic Partnerships for higher education and implemented in the period from 2016 to 2019. All results including the courses are available online at https://informationliteracy.eu

ILO aimed to design a MOOC that is as generic as possible and thus to focus on IL elements relevant to all disciplines. Examples of such general IL elements are familiarity with the information landscape and resources, recognizing one's own information needs, Boolean operators, basic principles of knowledge organization and basic knowledge of copyright law and information ethics. As IL also covers subject-specific elements (Grafstein, 2002; Dreisiebner and Schlögl, 2019), the project demonstrated the applicability of the generic ILO MOOC to Business Administration and Psychology by developing two sample subject-specific extensions. A special aspect of the project was offering this content for six European cultural and language groups: English, German, Spanish, Catalan, Slovenian and Croatian. The multilingual approach aimed to consider not only formal translation but also culture-specific differences in the various realizations. ILO also aimed to implement a technology-based assessment component which allows participants to get feedback on their learning success. The starting point for the project was an analysis of international good practices in IL education, which led to the collaborative design of a content framework and didactic concept for the ILO MOOC. Based on this framework, the partners collaboratively developed the ILO MOOC in English. As soon as the English MOOC was finished, the individual partners translated and adapted the MOOC for their respective local languages. The German version was developed in cooperation between partners in Austria and Germany. Evaluations of all language versions followed, which led to some adaptations before the MOOC was finally published. Figure 1 provides an overview of the aims and structure of the ILO project.
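As a small illustration of one of these generic elements, the effect of Boolean operators on a result set can be sketched in a few lines (an illustrative example, not part of the MOOC materials):

```python
# Toy document collection to illustrate Boolean search semantics.
docs = [
    "information literacy in higher education",
    "media literacy for children",
    "information retrieval systems",
]

# AND narrows the result set: both terms must appear in a document.
and_hits = [d for d in docs if "information" in d and "literacy" in d]

# OR broadens the result set: either term is sufficient.
or_hits = [d for d in docs if "information" in d or "literacy" in d]

print(len(and_hits), len(or_hits))  # 1 3
```

The sketch shows why instruction matters: students who conflate AND with "more results" will construct queries that do the opposite of what they intend.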

Good practices in information literacy education

The first stage of the project was a survey of international good practices in information literacy education, to inform the content and structure of the MOOC. This involved a selective identification and analysis of published literature and Internet sources, qualified by expert opinion. The survey was selective rather than comprehensive and focused on those aspects of a large literature considered to be of most significance in identifying good practices in teaching and training for information literacy, to guide the structure, content, nature of interaction and pedagogical practices of the MOOC. No attempt was made to identify “best” practice; the aim was rather to identify approaches which had been reported to have successful application in several relevant contexts, without ignoring recent, and hence less widely reported, promising examples.

The main survey was carried out between January and March 2017 and updated through September 2017, when design of the MOOC began. No further full updating of the report was carried out, but additional significant material noted was added up to October 2019. Aspects of the survey and its results have been presented in detail (Robinson and Bawden, 2018a, b; Nowrin et al., 2019); a summary is given here.

Sources used for the survey were: seven bibliographic databases (Library and Information Science Abstracts, Library and Information Science and Technology Abstracts, Web of Science, Educational Abstracts, British Education Index, Educational Resources Information Centre and Applied Social Sciences Index and Abstracts); Internet search engines; other Internet sources (specialist blogs, associations and curricula); citation indexes (Web of Science and Google Scholar) to follow up relevant sources and catalogues of four libraries (City, University of London, University College London, London University Senate House and the British Library). The contents of four particularly relevant journals (Journal of Information Literacy, Communications in Information Literacy, Nordic Journal of Information Literacy in Higher Education and British Journal of Educational Technology) were scanned, as were the contents of relevant book series and monographs.

Since the relevant literature (published and unpublished) is extensive, and practice rapidly developing, the focus was on recent materials: a cut-off publication date of 2012 (five years prior to the start of the analysis) was taken, although particularly significant older material was also included where appropriate. The aim was not to produce a comprehensive bibliography but rather a selective list of resources providing evidence of good practice. However, a comprehensive approach was taken to issues of particular relevance to the project, such as multi-lingual and multi-cultural information literacy education and self-assessment in MOOCs. An analysis of the features of 22 existing MOOCs for information literacy education was also carried out. Finally, comments on aspects of the draft report were obtained from nine information literacy experts not associated with the ILO project, in the UK, Germany, Slovenia and Croatia, to assess the validity of the conclusions.

The analysis confirmed that there is little evidence of good practice in multilingual and multicultural IL instruction, nor in self-assessment of learning, so the ILO project is indeed innovative in these respects. To concentrate on these aspects, the recommendations drew on evidenced standard good practice in other respects. The material from the survey was analyzed under eleven themes: information literacy definition and models; information literacy content and contexts; pedagogical frameworks; teaching and learning methods; interaction and collaboration by learners; structuring learning materials; assessment methods; multi-lingual and multi-cultural aspects; information literacy outside higher education; general MOOC issues and MOOCs specifically for library/information education and for information literacy. A series of good practice recommendations for the development of IL MOOCs was made; the more important ones are summarized here.

A suggested schedule of content for the MOOC was derived by evaluating the content of standard models of information literacy, choosing those elements which occurred in two or more of six information literacy models: ACRL Framework (ACRL, 2016), ACRL Standards (ACRL, 2000), SCONUL Seven Pillars (SCONUL, 2011), Big Six Skills (Eisenberg and Berkowitz, 1990), ANCIL Model (Coonan and Secker, 2011) and Metaliteracy Model (Mackey and Jacobson, 2011). The ACRL Standards were included, despite their age, because of their use in online tutorials and MOOCs. The reason for this approach was that the emphasis in the project was on the multi-lingual and multi-cultural aspects, rather than on reconsidering the basics of information literacy.

This gave a set of sixteen core concepts, to be the basis for MOOC development:

  1. Understand the information environment (in the widest sense)

  2. Use digital tools effectively

  3. Recognise information needs, and how to address them

  4. Know relevant information resources

  5. Find and access information

  6. Critically evaluate information and information sources

  7. Critically evaluate online interactions and online tools

  8. Manage information

  9. Collaborate in information handling

  10. Share digital content ethically

  11. Become an independent and self-directed learner; and a lifelong learner

  12. Learn to learn; develop metacognition

  13. Understand ethical issues of information

  14. Present and communicate information

  15. Create information products

  16. Synthesize information and create new knowledge

Materials for each of the concepts should be created in a generic manner, allowing for use in, and possibly customisation for, specific contexts and subjects. Good practice was found to be creating the materials as “bite-sized” units and weaving them into a longer-duration context. Ideally, they would be sufficiently comprehensive, generic and granular as to be useful in contexts outside higher education. The MOOC should be designed so as to be easily adapted for blended learning within an institution, to be used both as instructor-led over a defined time period and as “any-time self-paced”. It should also be hospitable to self-study and to participants wishing to “drop in” to examine sections of the material.

The basic learning unit ideally takes a two-part (didactic/active) structure, with elements of student interaction and collaborative learning included where possible. As much realistic interaction in exercises and activities as possible should be provided, and there should be an emphasis on reflection for learning, to complement activities and skills. Self-administered quizzes were recommended as the main method for self-assessment; despite their acknowledged limitations, they appear to be the best available choice if the MOOC is to be useable in all environments, including self-study. Gamification was not recommended, on the basis of published experience (Markey et al., 2014; Guo and Goh, 2016), despite its popularity in some circumstances. Finally, the MOOC materials should be constructed as re-useable learning objects.

Relatively little attention has been given to multi-lingual and multi-cultural education for information literacy; such prior art as could be found is described by Nowrin et al. (2019). This confirmed that the concept of the ILO MOOC project was novel and original.

MOOC implementation

Content framework of the MOOC

Based on the initial survey of international good practices in information literacy education, the content framework of the MOOC was derived. Regarding the content and pedagogical approach, the ILO MOOC is innovative in several ways, thus offering original material for learning and teaching IL and for developing MOOCs. The modules, lessons, units and learning objects are based on a combination of the best components of relevant approaches and models of IL and active learning. The multilingual and multicultural nature of the MOOC is considered and self-assessment components are included in it. It is useable as a stand-alone course, but also enables inclusion in subject-specific contexts and blended learning.

This process of MOOC creation was led by a team at one partner institution, which produced the first draft. This draft was then discussed by the project teams of all partners, which led to further comments, adaptations and additional discussions. This iterative process led to the final content framework. Materials for each of the modules were created in a manner which was generic, but which allows for use in, and possibly customisation for, specific contexts and subjects. The core concepts generate specifications of competencies at several levels in six areas: understand and engage in digital practices; clarify research questions; find information; critically evaluate information, online interactions and online tools; manage and communicate information; collaborate and share digital content. The usual structure of tutorials or MOOCs is modular, because of the different needs and competence levels of potential users. Thus, the general MOOC contains six modules:

  1. Module 1: orienting in an information landscape

  2. Module 2: research is a journey of inquiries

  3. Module 3: the power of search

  4. Module 4: critical information appraisal

  5. Module 5: information use: the right and fair way

  6. Module 6: let's create something new based on information and share it!

Each module consists of lessons, and lessons contain a certain number of units. There are 23 lessons and 141 units in total. Besides general information and expected outcomes at the beginning of each module, the lessons that follow start with a short introduction and a list of particular learning outcomes related to the lessons and the units within them. Altogether, the MOOC contains seven final quizzes as tools for feedback, with different types of questions related to particular modules or subjects. Besides the body text, the units consist of different audio-visual and textual learning objects including text, diagrams, audio-video materials, screencasts, images, presentations to attract attention, links, original simulations, scenarios, exercises, quizzes and self-assessment units. Special attention was given to the self-assessment components, as a major shortcoming of current IL courses and MOOCs is the lack of such components.

As this MOOC is multilingual, the compatibility and harmonization of the project team members' work was a permanent issue, so all stages were adjusted and balanced. First, so-called master content was developed in English. Based on the previously agreed aims and learning objectives, each partner developed a draft of the contents for one module. This resulted in a kind of script of the proposed student workflow in the module, with details of the number of lessons, suggested content and exercises, the text and graphic/audiovisual material visible to the student, guidelines for designing that content and those exercises, and, at the end, assessment pieces for each module or quizzes for some lessons. This was followed by discussions and checks for overlap and for consistency across all modules. The English master content was then translated and adapted (culturally and linguistically) to the other languages: German, Spanish, Catalan, Slovenian and Croatian. The same applied to any change that appeared later: it was agreed between the partners, applied in English and then carried over to the other languages. Adaptations in the other languages involved, for example, changing recommended readings and videos, adapting exercises, and adding language-specific rankings and country- and language-specific information sources.

Technical implementation

As described above, it was decided to create the ILO MOOC as a learning opportunity that can be consumed entirely at one's own pace. This decision implied that the MOOC was not to be used in collaborative scenarios or have fixed start and conclusion times. It also implied that all evaluations were to be automatic or self-driven, and that both enrollment and certificate generation were to be automated.

Beyond the learner-side requirements, the ILO MOOC set as its central goal to provide a multilingual offering and to deliver its content without the use of external services, so as to respect privacy in a course intended to explain best practices of information exchange. Finally, the qualities of long-term stability and re-useable content were put on the agenda.

This set of requirements led the project to choose an authoring process, including the services where it takes place, and a delivery platform. The requirements led us to exclude technologies that are frequent targets of hacker attacks and to exclude external services for the delivery of the videos as much as possible. It was thus decided to use a self-hosted non-PHP-based platform which already had strong multilingual facilities: only OpenEdX and Canvas remained as candidates for the final choice. Of the two, OpenEdX appeared to be stronger in terms of multilingual capabilities, with a huge community crowdsourcing the translation of the front and back end of the software into many languages (Transifex, 2020). The service allocated for the ILO MOOC was chosen to run in a classical Linux environment whose security was trusted. While parts of the MOOC reached limits in the process of authoring the content, the main MOOC delivery maintained strong response times during the pilots.

Content creation workflow

In order to create the content, multiple stages of authoring were established. The process started as synchronized collaborative authoring, during which each author also considered the content created by partners to clarify their own; a second phase allowed the comfort of a common word processor where much expressivity was still available; the third phase forced the authors to simplify their content so that it was compatible with the Markdown format (Gruber, 2004). This simplification allowed for highly re-useable and easily stylable content (e.g. for narrow screens). In the subsequent step, the authors copied and pasted the texts into the OpenEdX platform so that they could be displayed to students. The details of this process are described by Libbrecht et al. (2019). It is centered around the versioning system GitLab, to which all authors and technical supporters contributed.
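The benefit of the Markdown simplification, namely that the restricted format makes content easy to restyle and re-use, can be illustrated with a minimal sketch (hypothetical code, not the project's actual tooling) that renders a small Markdown subset as HTML, e.g. for pasting into an OpenEdX HTML component:

```python
import re

def markdown_to_html(md: str) -> str:
    """Convert a tiny Markdown subset (headings, bold, paragraphs) to HTML.

    Illustrative only: real pipelines would use a full Markdown processor.
    """
    html_lines = []
    for block in md.strip().split("\n\n"):
        block = block.strip()
        # Inline emphasis: **text** -> <strong>text</strong>
        block = re.sub(r"\*\*(.+?)\*\*", r"<strong>\1</strong>", block)
        m = re.match(r"(#+)\s+(.*)", block)
        if m:
            level = len(m.group(1))  # number of '#' gives heading level
            html_lines.append(f"<h{level}>{m.group(2)}</h{level}>")
        else:
            html_lines.append(f"<p>{block}</p>")
    return "\n".join(html_lines)

unit = "## Boolean operators\n\nUse **AND** to narrow and **OR** to broaden a search."
print(markdown_to_html(unit))
```

Because the source stays in plain Markdown under version control, the same unit can be re-rendered for narrow screens, print, or another platform without touching the text itself.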

In parallel, another relevant process in content creation was video production and captioning. Some issues had to be solved prior to creating the audiovisual material. First, cultural and language questions arose when producing videos in different languages: learning videos need to be carefully planned to avoid using a single script for every video. Second, the following technical issues were resolved. A video editor had to be selected that the whole team could use on different operating systems, as the partners had different devices and different levels of know-how in creating educational videos. This software had to meet criteria such as easy management and caption generation in the different languages spoken in the partner countries. The chosen software was Screencast-o-Matic (Big Nerd Software, LLC, 2020). It currently works on different platforms such as Windows, Macintosh and Android. It allows creating videos in several ways, such as video lectures or video tutorials (Boté, 2019), as well as using the webcam to speak in front of the camera when necessary. It is also possible to create a script inside the software, facilitating the writing of scripts in different languages. Scripting permits creating the whole text of a video before recording it and gives other members of the team the option to translate the script into any language. This element of openness and level of exchange is very useful when different members of a team need to create videos not only in different languages but also in different versions. Screencast-o-Matic also permits generating captions in several languages with standardized exporting options, including the languages of the project. This functionality greatly facilitates the creation of the same videos in different languages.
It was decided not to use background music or other sounds behind the voice-over, so working with one sound track was enough to meet the criteria established by the team. Third, training was necessary for the team to understand how the software worked. Two specialists on the team were in charge of training the rest of the team in using this software and organizing the video workflow. Finally, the videos could be uploaded in raw versions to the GitLab platform to control versions in different languages. These raw versions could be downloaded by other members of the team to adapt the video to their language.

Some existing videos were also re-used. To ensure long-term availability, the aim was not to embed external videos but to host them on the ILO server. Therefore, the authors of the videos were contacted and asked to provide the original video files and to allow CC-BY licensing and self-hosting. This process proved to be difficult and less successful than initially expected. The identification of high-quality material suited exactly to an item in the ILO MOOC is time consuming and not always successful. In several cases, it was not possible to clarify the rights, or the videos were offered at too high a price.

Figure 2 shows the final outcome of the ILO MOOC: (1) the landing page allowing users to access the MOOC itself with the learning content, a content repository including the full MOOC content in re-useable format and a page with further information on the MOOC; (2) the course selection page allowing access to the MOOC in English, German, Spanish, Catalan, Slovenian and Croatian; (3) one of the learning steps of the course – on this particular page the learner is introduced to the fictional character Marco, whose information problem frames several of the further learning steps; (4) a self-assessment in which the learner goes through a simulated web search and has to decide on the most appropriate search result.



The aim of the study was to gain insight into three areas: (1) students' progress in knowledge, (2) students' opinion of the MOOC and (3) perceptions and opinions of the teachers who were involved in the evaluation of the MOOC. The study was therefore two-fold: the student evaluation provided the data to evaluate the students' learning and the MOOC, while the teacher evaluation was used as an enhancement, to gain insight into how the evaluations were performed in each of the partner institutions.

Student user study

Both aspects, learning outcomes and usability, were covered in one user study. As reported earlier (Dreisiebner et al., 2019), the student evaluation approach consisted of three phases: first, the students were asked to fill in a short pre-test. Second, they completed the full version of the ILO MOOC. Third, they were asked to fill out a longer post-test. To allow matching of the pre- and post-test questionnaires, self-generated identification codes (Yurek et al., 2008) were used. Both questionnaires were implemented in LimeSurvey.
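The exact code scheme used in the study is not specified here; as an illustration of the general idea behind self-generated identification codes (Yurek et al., 2008), a matching code can be derived from stable personal answers, so that pre- and post-test responses can be linked without storing identities. The component questions below are hypothetical:

```python
def self_generated_code(mother_first: str, birth_day: int, birth_city: str) -> str:
    """Build an anonymous matching code from stable personal facts.

    Illustrative scheme only: first two letters of mother's first name,
    zero-padded day of birth, first two letters of city of birth.
    """
    return (mother_first[:2] + f"{birth_day:02d}" + birth_city[:2]).upper()

# The same student produces the same code in pre- and post-test,
# so the two questionnaires can be matched anonymously:
pre = self_generated_code("Maria", 7, "Graz")
post = self_generated_code("Maria", 7, "Graz")
assert pre == post
print(pre)  # MA07GR
```

The design trade-off is between uniqueness (enough components to avoid collisions between students) and stability (answers the student reproduces identically weeks later).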

The pre-test consisted of four parts: (1) a questionnaire on personal background information, such as age, study program and previous degrees; (2) a self-assessment of IL skills consisting of seven questions on previous experience and information needs, rated on a three-point Likert scale and finishing with an open question on perceived problems regarding information needs; (3) a shortened adapted version of a standardized IL test (Boh Podgornik et al., 2016) consisting of 12 single-choice questions (Figure 3 gives an example of one question within this questionnaire) and (4) a short questionnaire with three subject-related questions. Students were expected to complete the pre-test in 5–7 min.

The post-test consisted of three parts: (1) the full version of a standardized IL test (Boh Podgornik et al., 2016) consisting of 39 single-choice questions; (2) the same three subject-related questions as in the pre-test and (3) a user experience questionnaire. This questionnaire asked for the language setting used when attending the MOOC and allowed an open response on the overall experience with the MOOC. Afterwards, nine usability aspects were rated on a five-point Likert scale. For two of these aspects, participants had the possibility to leave comments. Finally, participants were asked how important they considered the availability of information in various formats within the course and had the possibility to comment on anything particularly disturbing or particularly appealing when using the MOOC. Students were expected to complete the post-test in 15–20 min.

The students could gain one point per correct multiple-choice question in the standardized IL test. The pre-test contained 15 multiple-choice questions and the post-test 42, allowing a maximum of 15 and 42 points, respectively. The results were first analyzed using Microsoft Excel, where each participant's achieved points were converted to percentages. Afterwards, IBM SPSS Statistics was used for further statistical analysis.
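Because the pre- and post-test have different maximum scores (15 vs. 42 points), converting to percentages makes them comparable. The conversion step can be sketched as follows (the point values below are illustrative, not actual study data):

```python
def percent_score(points: int, max_points: int) -> float:
    """Convert achieved points to a percentage, rounded to one decimal place."""
    return round(100 * points / max_points, 1)

# Hypothetical participant: 11 of 15 pre-test and 30 of 42 post-test points.
pre = percent_score(11, 15)   # pre-test had 15 multiple-choice questions
post = percent_score(30, 42)  # post-test had 42 multiple-choice questions
print(pre, post)  # 73.3 71.4
```

Only after this normalization can pre/post differences be tested statistically, as was done in SPSS.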

Teacher opinions

Here we used a structured interview consisting of 12 questions, asking when in the semester the MOOC was evaluated, how the teachers conducted the evaluation with their students, whether participation was a voluntary or a mandatory task for the students, whether it resulted in any benefits for the students, what the students' reactions and feedback were, and whether they observed any change in students' information behaviour and/or information skills after attending the MOOC.

Samples and data collection

Data were collected between March and June 2019. We used convenience purposive sampling in both parts of the research. In all, 350 students took part in the evaluation; of these, 198 students (144 female and 54 male) completed both the pre-test and the post-test. The largest group of participants, with 68 students, came from the University of Hildesheim, Germany. The sample consists mostly of bachelor students (84%), and most students were still in the first (53%) or second (34%) year of their studies. 76% of the participants reported having an information science background. Each group used the language version corresponding to its institution. Table 1 gives an overview of the composition of the sample.

Teachers' opinions were gathered through interviews conducted via email with the teachers and lecturers involved in evaluating the MOOC. In all, seven teachers and lecturers participated, one from each country.


Levels of inclusion into the syllabus

The institutions and lecturers participating in the evaluation opted for different approaches to including the MOOC in their teaching, partly because of different teaching schedules and different formal regulations regarding the use and evaluation of online teaching elements. As Table 2 details, these approaches can be categorized into three levels of compulsion and inclusion into the course syllabus:

  1. High level of compulsion and inclusion into the course syllabus

  2. Moderate level of compulsion and inclusion into the course syllabus

  3. Low level of compulsion and inclusion into the course syllabus

The evaluation in Graz and Hildesheim was conducted with a high level of compulsion and inclusion into the course syllabus. The evaluation was announced and started at the beginning of the semester, and participation was announced as mandatory and part of the final grading. In Graz, the evaluation was conducted in one German-language class with master students and one English-language class with bachelor students, who were accordingly asked to participate in the MOOC in the corresponding language. The students filled in the pre-test and post-test in class and were asked to register in the MOOC with their real names, which allowed the instructors to check actual registration and quiz participation through the instructor console of ILO. In the master class, the students received a reminder around halfway through the evaluation period. The amount of MOOC participation was a grading component of the course. In Hildesheim, the students participated in the evaluation as part of a distance-learning course in the bachelor program. They therefore had to fill in the pre- and post-test at home during a two-week period at the beginning of the semester. They received all instructions on the tasks through the course webpage, but no additional reminders. Participation was not checked for the MOOC itself, but it was for the pre- and post-test: the students had to hand in their self-generated identification code after finishing the evaluation. For participation they received eight points that counted toward the final grade. This approach to course inclusion is reflected in completion rates of over 80% for both the pre- and the post-test.

The evaluation in Zadar and Ljubljana was conducted with a moderate level of compulsion and inclusion into the course syllabus. Participation in the evaluation was not mandatory and was conducted toward the end of the semester, with less strict measures in place to control participation. In Zadar, the students were asked to participate in the evaluation in the last month of the semester. The pre-test had to be taken in class, while the post-test was to be taken at home. For most of the students (two classes with 43 students) participation was not mandatory, was not controlled and brought no benefits. However, for a smaller group of one class with 12 students the evaluation was mandatory and participation was checked through the instructor console of the MOOC; these students could gain additional points toward their final grade through participation. Reminders were given during the evaluation period through the Moodle forum and in person during lectures. The overall completion rate was high (91%). In Ljubljana, the students were asked to participate in the evaluation in the last weeks of the semester, which was also the exam period. Participation was not mandatory and all evaluation activities had to be done at home. The students received weekly reminders in class. Students who could prove their participation in the evaluation received a bonus toward their final exam grade. The completion rate was relatively good (60%).

The evaluation in London and Barcelona was conducted with a low level of compulsion and inclusion into the course syllabus. In London, the evaluation was announced around halfway through the semester to a cohort of library/information graduate students and to cohorts of mathematics and engineering undergraduate students. The students were expected to complete the whole evaluation in their own time outside class within a six-week timeframe. One reminder was issued through the e-learning system. Participation was voluntary and no benefits were provided. While some of the students completed the pre-test, none of them completed the post-test. In Barcelona, the evaluation was announced in the last weeks of the semester. It was announced as a mandatory activity, and the students had to fill out the pre- and post-test in class. For this participation they received a small benefit toward their final grade (0.2 points out of the total 10). During the evaluation period they were reminded to participate in the MOOC; however, actual participation in the MOOC was not checked. As students complained about the workload, they were asked to complete at least one module of the MOOC. The students could choose between the Catalan and Spanish versions of the MOOC as they preferred, although the evaluation questionnaire was only provided in Spanish. The Barcelona approach had some similarities to Zadar and Ljubljana regarding compulsion and inclusion; however, it was classified for the purposes of this study as a low level of compulsion and inclusion, as the students were instructed to go through only one of the six units of the MOOC.

Three of the seven interviewed teachers observed changes in the students' information behaviour and/or information skills after attending the MOOC, particularly in that the students were doing well in their term papers. All of these teachers were from the institutions with a high level of inclusion. In one case they also reported very positive feedback from the students, who found the content of the MOOC very valuable, specifically the subject-specific information resources for economics and business administration and the practical guidelines on how to write a scientific thesis. The remaining four teachers did not observe any changes, but also remarked that in their cases the semester was already over when the evaluation was finished.


The self-assessment, filled in as part of the pre-test before attending the MOOC, showed an overall high confidence in the students' own IL skills, but it also indicates a certain awareness of shortcomings. As Table 3 shows, a majority (73.23%) of the participating students frequently (weekly or several times a month) search for information as part of their study activities and 24.75% search for information at least sometimes (several times during a semester). Confidence in knowing information sources is not as high: 79.80% believe they know some of the most important information sources in their field, but only 18.18% believe they know most of them. 75.76% sometimes know which source to use when a particular type of information is needed, but only 21.72% believe they always do. Only around half (56.57%) of the students think that they can successfully use most sources to retrieve the needed information, and 58.08% stated that they usually find the information they need in the sources they use. The students appear to be less confident about comparing and evaluating different resources and about using information appropriately to the task, where only 54.04% and 54.55% answered yes and 6.57% and 4.55% answered no, respectively.

In addition to the questions shown in Table 3, the self-assessment gave participants the possibility to describe problems regarding their information needs in their own words. The main problems the students reported were information overload, finding relevant information and evaluating the quality of information.

Results of the IL testing

Comparing the results the students achieved in the IL test before and after attending the MOOC shows a diverse picture. As Table 4 shows, the average result of all 198 participating students stayed at the same level of approximately 74.9%, with a minimal difference of 0.02%. A paired-sample t-test confirms no significant difference between the pre- and post-test (p = 0.986; T(197) = 0.018), and the effect size (Cohen, 1988) shows no effect (d = 0.00126).
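The paired-sample t statistic and Cohen's d for dependent samples reported throughout this section follow a standard recipe: both are built from the per-participant differences between post- and pre-test scores. A minimal, stdlib-only Python sketch with synthetic scores (not the study's data, and not its actual SPSS workflow) might look like this:

```python
import math

def paired_t_and_d(pre, post):
    """Paired-sample t statistic and Cohen's d for dependent samples:
    t = mean(diff) / (sd(diff) / sqrt(n)),  d = mean(diff) / sd(diff),
    with df = n - 1; the p-value is then read from the t distribution."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample standard deviation of the differences (n - 1 in the denominator).
    sd = math.sqrt(sum((x - mean) ** 2 for x in diffs) / (n - 1))
    return mean / (sd / math.sqrt(n)), mean / sd

# Synthetic percentage scores for six participants (illustrative only).
pre = [70.0, 65.0, 80.0, 75.0, 72.0, 68.0]
post = [74.0, 70.0, 82.0, 77.0, 75.0, 73.0]
t, d = paired_t_and_d(pre, post)
```

Note that this sketch differences post minus pre, so a positive t indicates improvement; SPSS differences pre minus post, which is why the t values for score increases appear with a negative sign in the tables.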

However, as Table 5 shows, the achieved results differed according to the level of compulsion and inclusion into the course syllabus. The group with the highest level of inclusion, i.e. where the evaluation was announced and started at the beginning of the semester and participation was announced as mandatory and part of the final grading, shows an increase of the average result in the post-test by 3.46%. A paired-sample t-test shows a significant difference between the pre- and post-test (p = 0.014; T(96) = −2.503). The effect size is small (d = 0.25) and thus "not so small as to be trivial" (Cohen, 1988). The Graz students were able to increase their result by 6.3%, which is significant (p = 0.007; T(28) = −2.938) with a medium effect (d = 0.54). Among the Graz students, the participants of the German-language MOOC achieved a slightly better result than the participants of the English-language MOOC, which can be attributed to the fact that German is their mother tongue. The Graz students of the German-language MOOC also achieved the highest average score of all groups in the post-test, with 84.87%. In comparison, the Hildesheim students were only able to increase their result by 2.25% to 72.96%, which is not significant (p = 0.1993; T(67) = −1.2963) and shows no effect (d = 0.19).

The group with a medium level of inclusion, i.e. where participation in the evaluation was not mandatory, stayed at the same level. There was a decrease of 1.98%, which is not significant (p = 0.141; T(59) = 1.491) and shows no effect (d = 0.19). Both groups of students (from Zadar and Ljubljana) had similar results. The post-test results of the Zadar students decreased by 1.75%, which is not significant (p = 0.287; T(41) = 1.080) and shows no effect (d = 0.17). In comparison, the post-test results of the Ljubljana students decreased by 2.48%, which is not significant (p = 0.302; T(17) = 1.064) but shows a small effect (d = 0.25).

In the group with a low level of inclusion, i.e. where the students were introduced to the evaluation late in the semester and officially permitted to attend only one of the six units of the MOOC before taking the post-test, the results decreased by 5.37%. This is a significant decrease (p = 0.007; T(40) = 2.847) with a small, nearly medium effect (d = 0.44). This group of students also achieved the worst average score of all groups in the post-test, with 70.04%.

User experience

As the final part of the evaluation, after attending the MOOC the participants were asked to fill in a questionnaire about their user experience, with ratings on a five-point Likert scale. As can be seen in Table 6, all items received relatively positive ratings. The highest ratings were given to the navigation: finding the next/previous button (4.66), moving between the different lessons (4.25) and the navigation in the user interface (4.20) were rated as very easy. The layout of the text on the screen (3.84), the clarity and general quality of the text of the lessons (3.72) and the organization of the interface (3.71) were rated as good. The participants were also satisfied with the language of the interface (3.72). However, the students evaluated the amount of course content more critically: the amount of material in the course (3.73) and the amount of information shown on one screen (3.45) both tended toward too much. The participants were also invited to express their personal preference regarding the importance of having learning information in various formats (e.g. text, videos), which was considered very important (4.26).

At the end of the questionnaire the participants were asked about their overall experience with the MOOC and whether they found anything particularly disturbing or particularly appealing when using the user interface. Not all participants answered these open questions. The general feedback was mostly positive or neutral, with 66.43% of students' remarks describing their experience in terms like exciting, helpful and educational. Particularly well received were the layout and the structure of the MOOC: 44 participants commented that the layout was clear, intuitive or easy to use, while 36 participants highlighted the structure of the content as particularly appealing. Critical comments found the MOOC too detailed and time-consuming (12 participants), found some formulations and tasks not clear enough (10 participants) or felt that the MOOC had too much learning content in text form (six participants). Additional critical comments focused mainly on technical issues, particularly compatibility with devices, an aspect which was resolved later on.

Use of the ILO MOOC

The ILO MOOC was published for the broad public in August 2019. By March 25, 2020 the ILO MOOC had 650 registered learners. According to analytics retrieved from Matomo, the MOOC had 6,732 visits and 32,202 unique page views in the period from March 1, 2019 to February 29, 2020. The top 10 countries by visits in this period were Germany (1,892), Austria (860), Croatia (629), United States (542), Spain (519), Slovenia (341), Italy (320), United Kingdom (197), China (106) and Kenya (95).


As the results of the standardized IL tests conducted before and after attending the MOOC show, the level of integration into the teaching and the duration of the activities seem to be important success factors. A significant increase of knowledge, on average 3.46% (p = 0.014; T(96) = −2.503), was achieved in classes where the evaluation was announced and started at the beginning of the semester, lasted throughout the semester, and where participation was announced as mandatory and part of the final grading. The best result was achieved by the Graz students, whose active participation in the MOOC was also checked through the requirement to register with real names. The highest score was achieved by the Graz students completing the German version of the MOOC, with an average post-test result of 84.87% and a significant increase of 6.54% (p = 0.023; T(19) = −2.470). This was slightly better than the increase of 5.77% achieved by the Graz students attending the English version. One possible reason is that in the Graz master class the results of these checks were announced in the middle of the evaluation period along with an additional reminder; while by that point most students had not shown much course progress, all students completed the MOOC coursework afterwards. Another possible explanation is that English was not the mother tongue of all Graz students attending the English MOOC, which has previously been found to be an obstacle (Fini, 2009). It also seems that student motivation (even if driven by outside means) is important and positively influences the knowledge gained through participating in the MOOC. This goes along with Crawford and Broertjes (2010), who recommended linking online IL units to assessment in credit-bearing units.
The Hildesheim students, who had a similar level of integration but whose actual participation in the MOOC was not checked and carried a lower weight for their final grade, achieved an increase of only 2.25%. In comparison, the students from Barcelona, who were introduced to the evaluation at a late point during the semester and officially permitted to attend only one of the six units of the MOOC, achieved the worst average score of all groups in the post-test with 70.04%, a significant decrease of 5.37% compared to their result before attending the MOOC (p = 0.007; T(40) = 2.847). In London, where the evaluation was announced through the e-learning system only as voluntary, with no benefits provided, none of the students completed the evaluation, although some participated in the pre-test. Previous research has already found that students struggle with the open self-paced learning environment of MOOCs, leading to low completion rates (Reich and Ruipérez-Valiente, 2019; Zawacki-Richter et al., 2018), with students experiencing a lack of self-regulation and procrastination (Barnard et al., 2009). The grading and active monitoring of MOOC participation for the Graz students might have helped them to overcome this issue. However, the results of this study cannot confirm that multilingual facilitation alone has a positive effect on active MOOC participation (Colas et al., 2016; Fini, 2009). The feedback received in the user experience evaluation shows similarities to previous evaluations of online IL tutorials: generally positive feedback, but criticism regarding overly detailed and time-consuming content, too much learning content in text form and technical issues (Crawford and Broertjes, 2010; Weiner et al., 2012).

The evaluation showed that, once the authoring process is complete, the delivery process performed well within the pilots, except for the standardized assessment, which suffered from network problems when the connection speed was low. The hypothesis that led us to favor OpenEdX over Canvas as the multilingual MOOC hosting platform turned out to be wrong: the translations of OpenEdX are much more numerous, but they are also of much lower quality, and translation contributions were barely processed in a consistent way. This resulted, for example, in MOOC pages where a translation for "Forward" was present but not for "Backward". While efforts were made to work around the lack of approval and these inconsistencies through manual adjustments to the script and data, the initially expected complete translation of the MOOC user-interface messages proved impossible. This is also reflected in the relatively low rating of the language of the user interface in the user experience evaluation, in comparison to the other items.

Conclusions, limitations and further research

The aim of this paper was to describe the rationale, technical framework, content creation workflow and evaluation for a multilingual MOOC to facilitate IL considering cultural aspects. The ILO MOOC was implemented in English, German, Spanish, Catalan, Slovenian and Croatian and evaluated in all of these languages by 198 students, who completed a self-assessment, a standardized IL test and a user experience questionnaire. The results show, first, that the designed workflow was suitable in practice and led to the implementation of a fully fledged MOOC. By March 25, 2020 the ILO MOOC already had 650 registered learners and also received numerous visits from outside the partner countries, notably the United States (fourth place by number of visits) and China (ninth place by number of visits). Second, the implementation itself provided implications for future projects developing multilingual educational resources; in particular, the quality of the hosting platform's available translations turned out to be an important issue. Third, the evaluation results show that participants achieved significantly higher results in a standardized IL test after attending the MOOC as mandatory coursework. These participants increased their result significantly, with a small effect, by 3.46% on average. Fourth, self-motivation to complete the MOOC proved to be a challenge for students asked to attend the MOOC as a non-mandatory out-of-classroom task; these students actually achieved a lower result in their post-test. This goes along with previous research that has found students to struggle with the open self-paced learning environment of MOOCs, leading to low completion rates and procrastination (Reich and Ruipérez-Valiente, 2019; Zawacki-Richter et al., 2018; Barnard et al., 2009). Thus, based on these observed effects, it seems that multilingual facilitation alone is not sufficient to increase active MOOC participation.

This work comes with several limitations that in turn provide avenues for future research. First, the evaluation included a relatively small sample of 198 students attending selected classes; 76.26% of them had an information science background and 83.83% were undergraduate students. Second, the evaluation of IL knowledge gains was based only on single-choice questions in the standardized IL test; multiple forms of assessment would be needed to fully measure student performance and program effectiveness (Beile, 2008). Third, the approaches to including the evaluation in the various classes, whether it was made mandatory coursework, and how student participation was checked and encouraged differed between the partners. This was influenced by the different regulations at the participating institutions in six countries, which partly limited the teachers' possibilities to introduce the evaluation and the MOOC into existing courses. Fourth, it cannot be assured that the students participating in the evaluation actually registered for and participated in the MOOC; such checks were only conducted at one of the participating institutions. Fifth, there might also be culture-specific differences in how the MOOC is used (Loizzo and Ertmer, 2016; Liu et al., 2016). The analysis of possible differences in usage patterns between the different language versions of the MOOC is a promising direction for future research.

We have described the difficulties we encountered in re-using existing video content and have attempted to apply better practice in archiving the various stages of our content. The MOOC was used by students for self-study and (in part) in lectures at some of the partner universities. Apart from completely independent consumption of the content in a self-service approach and guided usage of the MOOC, many other forms of content re-use are conceivable, such as re-using only selected content elements. It is not yet clear which forms of re-use should be favored or predicted.

New MOOCs aiming to facilitate IL skills continue to emerge. Recent examples are the MOOC InfoLit for U developed in Hong Kong (The Chinese University of Hong Kong, 2018) and the iMOOC developed in Switzerland (University of St. Gallen, 2019). Some of the approaches found in the ILO MOOC are also reflected in these recent projects. Both InfoLit for U and iMOOC chose OpenEdX as their software platform, and both had a specific group of users in mind when designing the content. While iMOOC focuses on pupils, InfoLit for U focuses on undergraduate students and provides generic content that is extended through subject-specific modules. At the time of writing, the iMOOC was being translated from German into Italian and French, which would make it another example of a multilingual IL MOOC.


Figure 1

Structure of the ILO project


Figure 2


Figure 3

Example of one question within the pre-test

Composition of the sample

| | Zadar (Croatian) | Hildesheim (German) | Graz (English) | Graz (German) | Ljubljana (Slovenian) | Barcelona (Spanish or Catalan) | Total |
|---|---|---|---|---|---|---|---|
| Age: up to 20 years | 7 | 41 | 0 | 0 | 6 | 22 | 91 |
| Age: 21–25 years | 31 | 25 | 7 | 11 | 12 | 16 | 87 |
| Age: 25+ years | 4 | 2 | 2 | 9 | 0 | 3 | 20 |
| Previous degrees: yes | 4 | 2 | 0 | 20 | 1 | 5 | 32 |
| Information science background: yes | 29 | 65 | 2 | 0 | 16 | 41 | 151 |
| Information science background: not known | 3 | 0 | 0 | 0 | 0 | 0 | 3 |
| Year of study: 1 | 26 | 12 | 0 | 8 | 18 | 40 | 104 |
| In total | 42 | 68 | 9 | 20 | 18 | 41 | 198 |

Evaluation approaches using various levels of course inclusion

| | Graz (AT), German | Graz (AT), English | Hildesheim (DE), German | Zadar (HR), Croatian | Ljubljana (SI), Slovenian | London (GB), English | Barcelona (ES), Spanish or Catalan |
|---|---|---|---|---|---|---|---|
| Level of inclusion | High | High | High | Moderate | Moderate | Low | Low |
| Students enrolled | 24 | 10 | 81 | 46 | 30 | 110 | 70 |
| Completed questionnaires | 20 | 9 | 68 | 42 | 18 | 0 | 41 |
| Completion rate (%) | 83 | 90 | 84 | 91 | 60 | 0 | 59 |
| Duration (weeks) | 4 | 8 | 2 | 4 | Not specified | 6 | 3.5 |
| Evaluation time | Beginning of semester | Beginning of semester | Beginning of semester | End of semester | End of semester | End of semester | End of semester |
| Place of completing evaluation forms | In class | In class | At home | Pre-test in class, post-test at home | At home | At home | In class |
| Control of participation | Yes | Yes | Yes | For majority no | No | No | No |
| Reminder | Once mid-evaluation | No | No | Several times | Once a week | No | Once a week |
| Participation mandatory | Yes | Yes | Yes | For majority no | No | No | Yes |
| Assignment | Full MOOC | Full MOOC | Full MOOC | Full MOOC | Full MOOC | Full MOOC | One unit of MOOC |
| Benefits for participation | Yes | Yes | Yes | For majority no | Yes | No | Yes |

Results of the self-assessment of the participants

| Self-assessment question | Never / Not at all | Sometimes / Some of them | Frequently / Most of them |
|---|---|---|---|
| (1) I often search for information as part of my study activities | 2.02% | 24.75% | 73.23% |
| (2) I know the most important sources of information in my field | 2.02% | 79.80% | 18.18% |
| (3) I know which source to use when I need a particular type of information | 2.53% | 75.76% | 21.72% |
| (4) I can successfully use most sources to retrieve the information I need | 0.51% | 42.93% | 56.57% |
| (5) I usually find the information I need in the sources that I'm using | 1.52% | 40.40% | 58.08% |
| (6) I can compare and evaluate different resources | 6.57% | 39.39% | 54.04% |
| (7) I know how to use information appropriately to the task | 4.55% | 40.91% | 54.55% |

Note: question (1) was rated never/sometimes/frequently; questions (2)-(7) were rated not at all/some of them/most of them

Overall results of the evaluation

| M pre-test (%) | SD pre-test (%) | M post-test (%) | SD post-test (%) | T(197) | p | Mean difference (%) | d |

Results of the evaluation considering the level of inclusion at the different institutions/language versions of the MOOC

| Institution | Language of MOOC | M pre-test (%) | SD pre-test (%) | M post-test (%) | SD post-test (%) | T | p | Mean difference (%) | d |
|---|---|---|---|---|---|---|---|---|---|
| High level of inclusion | | 71.82 | 14.31 | 75.28 | 13.32 | −2.503 | 0.014 | 3.46 | 0.25 |
| Graz | German and English | 74.43 | 15.74 | 80.73 | 12.23 | −2.938 | 0.007 | 6.29 | 0.54 |
| Medium level of inclusion | | 79.72 | 10.33 | 77.74 | 8.29 | 1.491 | 0.141 | −1.98 | 0.19 |
| Low level of inclusion | | 75.41 | 12.90 | 70.04 | 13.93 | 2.847 | 0.007 | −5.37 | 0.44 |
| Barcelona | Catalan and Spanish | 75.41 | 12.90 | 70.04 | 13.93 | 2.847 | 0.007 | −5.37 | 0.44 |

Results of the User Experience questionnaire after attending the MOOC

| User experience question | Mean | Std. dev | Min | Max |
|---|---|---|---|---|
| How easy/difficult was it to find the next/previous buttons? 1 (very difficult) - 5 (very easy) | 4.66 | 0.2218 | 1 | 5 |
| How easy/difficult was it to move between individual lessons? 1 (very difficult) - 5 (very easy) | 4.25 | 0.1167 | 1 | 5 |
| How easy/difficult was the navigation in the interface? 1 (very difficult) - 5 (very easy) | 4.20 | 0.1520 | 1 | 5 |
| How satisfied were you with the layout of text on the screen? 1 (unsatisfactory) - 5 (very good) | 3.84 | 0.1539 | 1 | 5 |
| How satisfied were you with the amount of material in the course? 1 (way too little) - 5 (way too much) | 3.73 | 0.0997 | 1 | 5 |
| How would you rate the clarity and general quality of the text of the lessons? 1 (unsatisfactory) - 5 (very good) | 3.72 | 0.1273 | 1 | 5 |
| How satisfied were you with the language of the interface? 1 (very dissatisfied) - 5 (very satisfied) | 3.72 | 0.1496 | 1 | 5 |
| How satisfied were you with the organization of the interface (buttons, menus etc.)? 1 (unsatisfactory) - 5 (very good) | 3.71 | 0.1503 | 1 | 5 |
| How satisfied were you with the amount of information on one screen? 1 (way too little) - 5 (way too much) | 3.45 | 0.1048 | 1 | 5 |
| How important do you consider that you have information in various formats (e.g. text, videos)? 1 (not at all important) - 5 (very important) | 4.26 | 0.1191 | 2 | 5 |


ACRL (2000), “Information literacy competency standards for higher education”, Chicago, Illinois, available at: http://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/standards.pdf.

ACRL (2016), “Framework for information literacy for higher education”, Chicago, available at: http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/framework.pdf.

Barnard, L., Lan, W.Y., To, Y.M., Paton, V.O. and Lai, S.-L. (2009), “Measuring self-regulation in online and blended learning environments”, The Internet and Higher Education, Vol. 12 No. 1, pp. 1-6.

Beile, P. (2008), "Information literacy assessment. A review of objective and interpretive measures", Society for Information Technology & Teacher Education International Conference 2008, Las Vegas, Vol. 2008, pp. 1860-1867.

Big Nerd Software, LLC (2020), Screencast-O-Matic, Big Nerd Software, LLC, Seattle, WA.

Boh Podgornik, B., Dolničar, D., Šorgo, A. and Bartol, T. (2016), “Development, testing, and validation of an information literacy test (ILT) for higher education”, Journal of the Association for Information Science and Technology, Vol. 67 No. 10, pp. 2420-2436.

Boté, J.-J. (2019), “Lack of standards in evaluating YouTube health videos”, Revista Cubana de Información en Ciencias de la Salud, Vol. 30 No. 2.

Bowler, L. and Large, A. (2008), "Design-based research for LIS", Library and Information Science Research, Vol. 30 No. 1, pp. 39-46.

Clinnin, K. (2014), “Redefining the MOOC: examining the multilingual and community potential of massive online courses”, Journal of Global Literacies, Technologies, and Emerging Pedagogies, Vol. 2 No. 3, pp. 140-162.

Cohen, J. (1988), Statistical Power Analysis for the Behavioral Sciences, 2nd ed., Taylor and Francis, Hoboken.

Colas, J.-F., Sloep, P.B. and Garreta-Domingo, M. (2016), “The effect of multilingual facilitation on active participation in MOOCs”, International Review of Research in Open and Distributed Learning, Vol. 17 No. 4, pp. 280-314.

Coonan, E. and Secker, J. (2011), “A new curriculum for information literacy (ANCIL)- curriculum and supporting documents”, available at: http://www.dspace.cam.ac.uk/handle/1810/244638.

Crawford, N. and Broertjes, A. (2010), “Evaluation of a university online Information Literacy unit”, Australian Library Journal, Vol. 59 No. 4, pp. 187-196.

Creed-Dikeogu, G. and Clark, C. (2013), “Are you MOOC-ing yet? A review for academic libraries”, Kansas Library Association College and University Libraries Section Proceedings, Vol. 3 No. 1, pp. 9-13.

Dreisiebner, S. (2019), “Content and instructional design of MOOCs on information literacy: a comprehensive analysis of 11 xMOOCs”, Information and Learning Science, Vol. 120 Nos 3/4, pp. 173-189.

Dreisiebner, S. and Schlögl, C. (2019), “Assessing disciplinary differences in information literacy teaching materials”, Aslib Journal of Information Management, Vol. 71 No. 3, pp. 392-414.

Dreisiebner, S., Žumer, M., Vilar, P. and Mandl, T. (2019), "Evaluation of a MOOC to promote information literacy: first evaluation results", in Bago, P., Grgic, I.H., Ivanjko, T., Juricic, V., Miklosevic, Z. and Stublic, H. (Eds), 7th International Conference: The Future of Information Sciences, INFuture2019: Knowledge in the Digital Age, 21-22 November 2019, FF Press, Zagreb, Croatia, pp. 106-111.

Eisenberg, M. and Berkowitz, R.E. (1990), Information Problem-Solving: The Big Six Skills Approach to Library and Information Skills Instruction, Ablex, Norwood.

Fini, A. (2009), “The technological dimension of a massive open online course: the case of the CCK08 course tools”, The International Review of Research in Open and Distributed Learning, Vol. 10 No. 5.

Gore, H. (2014), “Massive open online courses (MOOCs) and their impact on academic library services. Exploring the issues and challenges”, New Review of Academic Librarianship, Vol. 20 No. 1, pp. 4-28.

Grafstein, A. (2002), “A discipline-based approach to information literacy”, The Journal of Academic Librarianship, Vol. 28 No. 4, pp. 197-204.

Gruber, J. (2004), Markdown, The Daring Fireball Company LLC.

Guo, Y.R. and Goh, D.H.-L. (2016), “Library escape: user-centered design of an information literacy game”, The Library Quarterly, Vol. 86 No. 3, pp. 330-355.

Klusek, L. and Bornstein, J. (2006), “Information literacy skills for business careers”, Journal of Business and Finance Librarianship, Vol. 11 No. 4, pp. 3-21.

Libbrecht, P., Dreisiebner, S., Buchal, B. and Polzer, A. (2019), “Creating multilingual MOOC content for information literacy: a workflow”, Proceedings of the Conference on Learning Information Literacy across the Globe, May 10, 2019, Frankfurt am Main, Germany, pp. 1-15.

Liu, Z., Brown, R., Lynch, C.F., Barnes, T., Baker, R., Bergner, Y. and McNamara, D. (2016), “MOOC learner behaviors by country and culture: an exploratory analysis”, Proceedings of the 9th International Conference on Educational Data Mining, pp. 127-134.

Loizzo, J. and Ertmer, P.A. (2016), “MOOCocracy: the learning culture of massive open online courses”, Educational Technology Research and Development, Vol. 64 No. 6, pp. 1013-1032.

Mackey, T.P. and Jacobson, T.E. (2011), “Reframing information literacy as a metaliteracy”, College and Research Libraries, Vol. 72 No. 1, pp. 62-78.

Markey, K., Leeder, C. and Rieh, S.Y. (2014), Designing Online Information Literacy Games Students Want to Play, Rowman & Littlefield Publishers, Lanham.

Martín-Monje, E. and Bárcena, E. (2015), Language MOOCs. Providing Learning, Transcending Boundaries, De Gruyter, s.l.

Maurer, A., Schlögl, C. and Dreisiebner, S. (2016), “Comparing information literacy of student beginners among different branches of study”, Libellarium, Vol. 9 No. 2, pp. 309-319.

Michalak, R. and Rysavy, M.D.T. (2016), “Information literacy in 2015: international graduate business students' perceptions of information literacy skills compared to test-assessed skills”, Journal of Business and Finance Librarianship, Vol. 21 No. 2, pp. 152-174.

Nowrin, S., Robinson, L. and Bawden, D. (2019), “Multi-lingual and multi-cultural information literacy: perspectives, models and good practice”, Global Knowledge, Memory and Communication, Vol. 68 No. 3, pp. 207-222.

Reich, J. and Ruipérez-Valiente, J.A. (2019), “The MOOC pivot”, Science (New York, N.Y.), Vol. 363 No. 6423, pp. 130-131.

Robinson, L. and Bawden, D. (2018a), “Identifying good practices in information literacy education: creating a multi-lingual, multi-cultural MOOC”, in Kurbanoğlu, S., Boustany, J., Špiranec, S., Grassian, E., Mizrachi, D. and Roy, L. (Eds), Information Literacy in the Workplace, Communications in Computer and Information Science, Vol. 810, Springer International Publishing, Cham, pp. 715-727.

Robinson, L. and Bawden, D. (2018b), “International good practice in information literacy education”, Knjižnica, Vol. 62 No. 1, pp. 169-185.

SCONUL (2011), “The SCONUL seven Pillars of information literacy: core model”, London, available at: http://www.sconul.ac.uk/groups/information_literacy/publications/coremodel.pdf.

Skipton, M.D. and Bail, J. (2014), “Cognitive processes and information literacy. Some initial results from a survey of business students' learning activities”, Journal of Business and Finance Librarianship, Vol. 19 No. 3, pp. 181-233.

The Chinese University of Hong Kong (2018), “InfoLit for U”, available at: https://edx.keep.edu.hk/courses/course-v1:UGCULibs+IL1001+2017_01/about (accessed 15 September 2020).

Transifex (2020), Transifex, Transifex US, Walnut, CA.

UNESCO and IFLA (2012), “The Moscow Declaration on Media and Information Literacy”, Moscow, available at: http://www.ifla.org/files/assets/information-literacy/publications/moscow-declaration-on-mil-en.pdf.

University of St. Gallen (2019), “iMooc – massive open online course”, available at: https://i-mooc.ch/ (accessed 15 September 2020).

Veletsianos, G. and Shepherdson, P. (2016), “A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015”, International Review of Research in Open and Distributed Learning, Vol. 17 No. 2.

Weiner, S., Pelaez, N., Chang, K. and Weiner, J. (2012), “Biology and nursing students' perceptions of a web-based information literacy tutorial”, Communications in Information Literacy, Vol. 5 No. 2, p. 187.

Wulf, J., Blohm, I., Leimeister, J.M. and Brenner, W. (2014), “Massive open online courses”, Business and Information Systems Engineering, Vol. 6 No. 2, pp. 111-114.

Yurek, L.A., Vasey, J. and Sullivan Havens, D. (2008), “The use of self-generated identification codes in longitudinal research”, Evaluation Review, Vol. 32 No. 5, pp. 435-452.

Zawacki-Richter, O., Bozkurt, A., Alturki, U. and Aldraiweesh, A. (2018), “What research says about MOOCs – an explorative content analysis”, International Review of Research in Open and Distributed Learning, Vol. 19 No. 1, pp. 242-259.


The authors would like to express their gratitude to the numerous project team members who did not co-author this paper for their valuable contributions to the final project outcomes, particularly David Bawden (City, University of London), Björn Buchal, Carolin Hahnel and Alexander Botte (Leibniz Institute for Educational Research and Information), Fedra Kuttkat (University of Hildesheim), Christian Schlögl, Thomas Repp, Mirjam Ricarda Resztej and Sandra Borić (University of Graz) and Sílvia Argudo and Ángel Borrego (University of Barcelona). The production of this publication was co-funded by the Erasmus+ Programme of the European Union under the agreement number 2016-1-AT01-KA203-016762. The European Commission's support for the production of this publication does not constitute an endorsement of the contents, which reflect the views only of the authors, and the European Commission cannot be held responsible for any use which may be made of the information contained therein.

Corresponding author

Stefan Dreisiebner can be contacted at: stefan.dreisiebner@uni-graz.at