Purpose – This paper provides a research-based approach for evaluating resources for transitioning to teaching online.
Design/methodology/approach – This paper uses Davies’ (2011) discussion of technological literacy; Koehler and Mishra’s (2009) Technology, Pedagogy and Content Knowledge (TPACK); Leacock and Nesbit’s (2007) Learning Object Review Instrument; and Reynolds and Leeder’s (2018) expanded notion of “technology stewardship” to underpin an approach that educators can use to evaluate educational resources for transitioning to teaching online.
Findings – This paper introduces and applies an approach focused on evaluating the source of a given educational resource, as well as how it can be implemented. It synthesizes frameworks relating to qualities of educational technologies and frameworks relating to qualities of educators, and introduces two criteria for evaluating resources for transitioning to distance learning.
Practical implications – This paper provides readily applicable criteria for evaluating resources in a time of emergency distance learning.
Social implications – This approach enables educators to evaluate resources in a time of emergency distance learning.
Originality/value – This paper’s originality lies in the synthesis of four approaches to evaluating educational technologies, and in the application of the resulting approach to four resources that have emerged to address COVID-19-related instructional needs.
Aguilar, S.J. (2020), "A research-based approach for evaluating resources for transitioning to teaching online", Information and Learning Sciences, Vol. 121 No. 5/6, pp. 301-310. https://doi.org/10.1108/ILS-04-2020-0072
Copyright © 2020, Emerald Publishing Limited
Educational institutions in the USA have largely transitioned to online learning as a result of state and local policies designed to help curb the spread of COVID-19. In response, informative Twitter threads about pedagogy (Kennedy, 2020), blog posts outlining best practices for transitioning online (Darby, 2019), and entire websites (Xu et al., 2020) have been created to help educators cope with the transition. Where does one start when selecting resources to use, however? This paper aims to answer this question by synthesizing four evaluative frameworks that address:
teachers’ technology capacity; and
implementation criteria for e-learning technologies.
A few select criteria from among these frameworks are used to model what it means to be a discerning e-learning technology steward under emergency remote teaching and learning conditions. Once this framework is established, four e-learning resources that may be considered for implementation are evaluated.
Frameworks relating to qualities of educators
Teachers’ educational technology literacy
Davies (2011) provides a tripartite framework for defining technological literacy among educators. In this framework, educators who have little to no experience with implementing educational technologies have limited literacy, or “awareness” of various technologies, and thus may wonder what a particular technology can do. Teachers who are more experienced with technology (i.e. “Praxis”) can successfully answer questions about how a particular technology is used in the classroom. The highest level of literacy, according to Davies, is characterized by “Phronesis,” or having enough experience with technology to discern when a particular technology is appropriate for a given situation (Davies, 2011). Educators who are navigating the transition to online education may be at various literacy levels, depending on the technologies or resources they are being asked to use.
Technology, pedagogy and content knowledge
Koehler and Mishra (2009) examine the intersection between technology, content, and instruction. Technology, pedagogy and content knowledge (TPACK) is thus the “understanding that emerges from interactions among content, pedagogy, and technology knowledge.” TPACK implicitly acknowledges the need for the technological literacy described by Davies (2011) while also foregrounding the importance of disciplinary content (e.g. math), and pedagogy. In this way, TPACK knowledge is a synthesis of what one needs to teach (content), how one should best teach it (pedagogy), and what technology can be brought to bear given the former two (technology knowledge). It is unreasonable to expect educators thrust into online education during a global pandemic to have high levels of TPACK knowledge, but increased TPACK knowledge should be the eventual goal.
It is important to attend to institutional contexts where particular resources might be deployed – potentially for longer periods than originally anticipated. In this vein, Reynolds and Leeder (2018) expand on the notion of “technology stewardship,” which posits that a technology steward should understand community and local context needs; have an awareness of technology options; know the consequences of selecting and installing a given solution; help with adoption and transition; and support the everyday use of a given technology (Wenger et al., 2009 in Reynolds and Leeder, 2018). Technology stewards look beyond the ephemeral instructional needs of a given situation by also attending to larger implications when selecting different technologies. This is particularly important for school leaders and administrators who have purchasing authority and will make decisions with consequences beyond the pandemic.
Frameworks relating to qualities of educational technologies
Learning object review instrument
When evaluating particular learning resources, it is also helpful to turn to work by Leacock and Nesbit (2007), who developed the Learning Object Review Instrument (LORI) in response to the vast growth of digital learning resources (i.e. objects) available on the internet. LORI consists of nine evaluative components:
“Content quality,” defined by an object’s accuracy, level of detail and presentation;
“Learning goal alignment,” defined by an object’s alignment to an instructor’s goals for an activity;
“Feedback and adaptation,” or an object’s ability to respond to the needs of various learners;
“Motivation,” defined as an object’s ability to motivate learners;
“Presentation design,” defined as an object’s visual and auditory design features intended to help students learn;
“Interaction usability,” or an object’s ease of use;
“Accessibility,” or an object’s capability to attend to students with disabilities;
“Reusability,” defined as an object’s ability to be adapted to future situations or different cohorts of learners; and
“Standards compliance,” or an object’s adherence to any technical requirements (Leacock and Nesbit, 2007).
LORI provides a way to examine particular learning objects that can be incorporated into a particular lesson plan, unit plan or course syllabus.
Technology acquisition strategies
Wenger et al. (2009) provide a set of seven technology acquisition strategies that can be used by technology stewards when selecting a technology to meet the given community context. Teachers, librarians, school information-technology personnel or other school staff who have purchasing or implementation authority should consider using the following seven strategies. “Use what you have,” i.e., be aware of current contracts in place for learning management systems or e-learning software; “Build your own,” i.e., understand the needs of your community and build tools that best address them; “Go for the free stuff,” e.g., use social media platforms with existing community organizing features; “Patch elements together,” i.e., find technologies that can work with one another, like RSS feeds and blogs; “Get a commercial platform,” e.g., purchase learning management systems; “Build on an enterprise platform,” i.e., use software applications that are integrated with one another; and “Use open-source tools,” e.g., embed documents or maps by using tools like Google Drive.
Source and implementation: key criteria synthesized from the above frameworks
The following evaluative approach has two major components that are predicated on the above frameworks: source and implementation. Both source and implementation ask educators evaluating technologies to attend to specific features of a given resource in hopes that the above frameworks will be kept in mind during an evaluation. “Source” asks educators to consider where a given resource comes from, as a source’s author may be indicative of applicability to a particular situation. “Implementation” asks educators to look at the logistics of deploying a given resource given their particular circumstances.
The pandemic has encouraged many to contribute to the needs of educators as they transition to online learning. This isn’t to say, however, that one should treat all sources as equal. Instead, one should attend to who has provided a given resource by posing questions outlined by Reynolds and Leeder (2018), such as: “What kinds of evaluation research back up the offerings’ effectiveness?” Doing so will enable one to foreground research on a resource’s effectiveness, unveil potential conflicts of interest, or simply learn more about the source’s author (potentially leading to more helpful resources). When evaluating a resource, it is also important to ask “who wrote it?”. The answer to that question may help guide subsequent implementation decisions. When resources are written and disseminated by organizations focused on learning and research (e.g., Xu et al., 2020; USC Rossier School of Education, 2020), for example, one knows that such resources likely underwent a vetting procedure.
New and untested educational technologies that have large potential implementations should also be scrutinized by those with high levels of technological literacy as described by Davies (2011), or by technology stewards (Reynolds and Leeder, 2018). Smaller implementations (e.g., one’s classroom) should be vetted by the person implementing them. The LORI framework (Leacock and Nesbit, 2007) described above is a good place to start.
As Davies (2011) suggests, an important first question one should ask when evaluating a resource is: “How can I use it?” The answer to this question, moreover, can be broken down into three components: intended use, prerequisites and stability.
Every educational resource that has been developed for use in online learning has an intended use. This intended use includes the audience it was designed for, as well as a set of circumstances (or learning objectives) it was designed to address. Some of the resources I describe below, for example, were designed as a response to online learning needs as a result of COVID-19 campus shutdowns, while others were repurposed to accommodate them.
Some resources are toolkits designed to help educators think through their own designs (Xu et al., 2020), while others are better characterized as multimedia resources (Phenomena for NGSS, 2020). Regardless of their type, understanding the prerequisites of a given resource is essential for its successful implementation. This point is underscored by Reynolds and Leeder (2018). Knowing a given resource’s prerequisites can include understanding the necessary infrastructure (e.g., internet connectivity) that is required to effectively use it, its costs (both in terms of time to implement and monetary costs), or the prerequisite technological literacy levels described by Davies (2011).
Given the surge of resources, it is important to question whether a given resource is ephemeral such that it may not be supported (or may be taken down, as is the case for websites) in a few weeks’ or months’ time. This is particularly important when it comes to new resources, and less important when evaluating resources that are, by design, intended to be static.
Choice of e-learning examples
There are many e-learning resources that discerning educators can familiarize themselves with and implement. Some resources have been developed as a direct response to the needs of emergency distance learning initiatives, whereas others predate the current crisis, but have nonetheless been made freely available in response to it. The resources described below were selected in order to highlight both variety (e.g. guidelines, classroom resources, entire design frameworks) and applicability (e.g. particular to specific instructional environments). All are also free and can be patched together as Wenger et al. (2009) describe to address different instructional demands.
Online course quality rubric
The “Online Course Quality Rubric” was written to “provide a systematic and descriptive benchmark for researchers and practitioners who are striving to develop a culture of high-quality college-level online courses” (Xu et al., 2020). It was chosen as an example of a research-based design framework that is geared toward higher education instructors.
Online learning in the wake of COVID-19
“Online Learning in the Wake of COVID-19” (Aguilar, 2020) is an Open Science Framework (OSF) project that was established by the author of this paper. As a crowd-sourced project, it is an example of an organic resource that reflects the contributions of a broad community. It was selected as an example of a resource that has multiple components (e.g. resources for research and resources for educators) that address the needs of both K-12 and higher education educators.
Statistics by David A. Kenny
“Statistics by David A. Kenny” (Kenny, 2014) is a website that contains webinars and PowerPoints that have been curated over the years to teach advanced statistics concepts such as multiple regression and causal mediation analysis. It was selected as an example of a static resource for higher education instructors that existed before the pandemic, but was made freely available to instructors in response to it. Many such resources have been made freely available in response to distance learning requirements.
Supporting online learning in a time of pandemic
“Supporting Online Learning in A Time of Pandemic” (USC Rossier School of Education, 2020) is a report that was co-written by over a dozen faculty from the USC Rossier School of Education. It is geared toward K12 educators and is written as a series of questions and answers that may be on the minds of educators. It was chosen as a static example of an official institutional response to emergency distance learning needs.
Analysis of e-learning examples
The urgent need to implement online learning solutions in diverse contexts at scale will benefit from drawing on the above frameworks and the key criteria I have identified. In the section that follows, I apply the above evaluative approach to the aforementioned resources.
Online course quality rubric
The Online Course Quality Rubric is part of a larger resource: The Online Learning Research Center (OLRC), which is managed by faculty from the UC Irvine School of Education. OLRC contains resources for educators, students and researchers (www.olrc.us/reflecting-on-course-design.html). As the OLRC is a collaborative project from a ranked school of education, it contains information by faculty with expertise in online learning. This provides a layer of legitimacy, as it represents the views of experts in the field of online education.
As a rubric, the “Online Course Quality Rubric” can be understood as a formal checklist of course design features to evaluate. It is organized into six course components:
website organization and feedback;
logistics and course management; and
targeted support for online learning.
It also centers three online learning concepts that one might consider given the constraints and affordances of online education:
scaffolding self-directed learning skills and guiding the learning process;
student agency; and
presence and interactivity.
The course components and online learning concepts constitute the rows and columns of the rubric, respectively. Many of the cells of the rubric link to sections within the document that define the concept, outline characteristics of “beginning,” “developing” and “proficient” work, and provide examples of each. It has two versions, a full rubric and an abridged version.
The rubric is a resource for higher education instructors who want a framework to evaluate their course designs. It is a static document that can be best implemented in the ideation or evaluative stage of course development. The abridged version may be better for instructors who already have their online courses designed, and wish to evaluate each of the components. The full version is appropriate for instructors who have both the time and the inclination to understand the learning theory that grounds each of the rubric’s components.
This resource presumes that instructors are in a position to either make substantial changes to their courses, or are in a position to evaluate existing courses. As such, users of this tool may require substantial time to familiarize themselves with it.
The OLRC is positioned as a National Science Foundation-supported tool and is thus presumably stable. However, as the website is new, it is not yet clear how frequently it will be updated. Regardless, the PDF rubric is a static resource that, once downloaded, can be referred to without needing to access the website.
Online learning in the wake of COVID-19
The OSF project is a crowd-sourced project that was established by the author of this paper. It is publicly editable so long as a user has an OSF login (https://osf.io/gxjt4). As an aggregator, it links multiple sources, each of which will have to be evaluated for suitability.
As a Wiki, the OSF project has aggregated various resources focused on helping instructors make the transition to online and distance education. It is organized into two sections. The first is a wiki with links to various resources, which includes Twitter threads written by experts (e.g., “Assessment strategies by Mika McKinnon”), blog posts (e.g., “Zoom 101 tips”) and other general resources (e.g., “AASCU Resources for State Colleges and Universities on the Coronavirus”). The second contains survey instruments that practitioners and researchers alike may find useful to use in their classrooms or research projects, respectively.
Online learning in the wake of COVID-19 is designed as an editable repository of resources that educators transitioning to online education can access at any given moment to orient themselves. The section with survey instruments contains measures that have been written in response to specific COVID-19 concerns (e.g., digital equity items). The Google Scholar links to work from various scholars may also be useful for those wishing to dig into the research to inform their practice or the design of their online courses.
Familiarity with OSF wikis is necessary for editing, though otherwise it operates like a normal website. Still, it is a tool that is likely best used by those who are fairly technologically literate because it requires users to already have a sense of what they are looking for.
As an OSF project that is open to the public, the project is designed to grow as support around it grows. As with any such project, there is always a danger of stagnation, or of growth that leads to incoherence.
Statistics by David A. Kenny
David A. Kenny is a Distinguished Professor Emeritus at the University of Connecticut’s Department of Psychological Sciences and can thus be considered a content expert. His website contains webinars and PowerPoints that have been curated over the years to teach advanced statistics concepts such as multiple regression and causal mediation analysis (http://davidakenny.net/webinars/list.html).
This statistics resource is an example of a resource that was adapted (i.e., made free) as a way to help statistics instructors in need of asynchronous material. For example, instructors teaching similar statistics concepts may not need to record an entirely new lecture on a specific topic, as it may already exist in this resource. Implementation of it would thus consist of instructors linking videos on their syllabi or lecture notes.
The videos and PowerPoints are best used asynchronously by instructors of advanced statistics courses who wish to supplement their own course materials. This is thus an example of a resource that responds to a very specific need, and as such is not designed in a way that attends to the needs of instructors who are transitioning to online instruction.
The technological prerequisites of this resource are low; however, an instructor’s TPACK (Koehler and Mishra, 2009) would likely need to be high to recognize when this resource is appropriate.
The website itself is dated, but has been active for years. One can thus presume that it will be stable in the strictest sense: it will not change. This can be seen as an affordance, as the lack of bells and whistles for the website and listed resources ensures that it is more likely to be accessible for students who are forced to use more dated technology. It can also be seen as a drawback, as students may not be receptive to dated material (i.e., component six of Leacock and Nesbit, 2007).
Supporting online learning in a time of pandemic
“Supporting Online Learning in A Time of Pandemic” (USC Rossier School of Education, 2020) is a report that was co-written by over a dozen faculty from the USC Rossier School of Education in collaboration with nine California superintendents and represents a vetted example of an institution’s public response to helping educators cope with the demands brought on by COVID-19 (https://rossier.usc.edu/files/2020/04/report-supporting-online-learning-during-pandemic.pdf).
This report is intended for K-12 educators, administrators and leaders. Each section is organized around a guiding question that is likely to be on the minds of educators as they seek to adapt to changing instructional circumstances (e.g., “What are key lessons for engaging students in online instruction?”). Each section (i.e., answer to a question) begins with a reflection from USC Rossier instructors, and ends with a set of enumerated recommendations that address the question directly. The end of the document contains two appendices, the first of which lists further reading by topic (e.g., apps that may be useful), and the second of which is specific to Zoom.
This is an example of a pandemic-specific resource that is best used as a starting point, as it contains a set of guidelines that can be used to inform subsequent planning on the part of educators. It is also a document that would be well used by technology stewards who are in the position to set policy. Yet, it should be one of many such resources that stewards use, since it offers a perspective specific to a particular institution.
The prerequisites of using this document are not technological; instead, they may require educators to have the flexibility to change their own instructional practices.
As a static document this resource is highly stable, but it is also necessarily unresponsive to changes in instructional circumstances that may be experienced by educators who use it.
Nothing about adapting to the newfound requirements of online instruction in a post-COVID-19 world is easy, including sifting through the various solutions and suggestions that have emerged as a result of moving most instruction away from classrooms and into the homes of students. The preceding research-based approach for evaluating resources for transitioning to teaching online began with an overview of four frameworks that have been used to evaluate educational technologies. These frameworks focused on educators (Davies, 2011; Koehler and Mishra, 2009), tools (Leacock and Nesbit, 2007; Wenger et al., 2009) and context (Wenger et al., 2009; Reynolds and Leeder, 2018). From those, I introduced two key criteria: source and implementation. The former focused on evaluating the authors of a given resource, while the latter focused on how a given resource could be implemented. Finally, I modeled the above approach to help those wishing to be discerning technology educators in the time of emergency distance learning.
Aguilar, S.J. (2020), “Online learning in the wake of COVID-19”, Open Science Framework, available at: osf.io/gxjt4 (accessed 12 May 2020).
Darby, F. (2019), “How to be a better online teacher”, The Chronicle of Higher Education, available at: www.chronicle.com/interactives/advice-online-teaching (accessed 4 May 2020).
Davies, R.S. (2011), “Understanding technology literacy: a framework for evaluating educational technology integration”, TechTrends, Vol. 55 No. 5, p. 45.
Kennedy, A. (2020), “Twitter thread”, Twitter, available at: https://twitter.com/alanakennedy808/status/1237491713679060998 (accessed 4 May 2020).
Kenny, D.A. (2014), “Webinars and PowerPoints (David A. Kenny)”, Davidakenny.Net, available at: http://davidakenny.net/webinars/list.html (accessed 2 May 2020).
Koehler, M. and Mishra, P. (2009), “What is technological pedagogical content knowledge (TPACK)?”, Contemporary Issues in Technology and Teacher Education, Vol. 9 No. 1, pp. 60-70.
Leacock, T.L. and Nesbit, J.C. (2007), “A framework for evaluating the quality of multimedia learning resources”, Journal of Educational Technology and Society, Vol. 10 No. 2, pp. 44-59.
Phenomena for NGSS (2020), “Phenomena for NGSS”, Phenomena for NGSS, available at: www.ngssphenomena.com/ (accessed 1 May 2020).
Reynolds, R. and Leeder, C. (2018), “School librarian decision-making for e-learning innovation”, in Lee, V. and Phillips, A. (Eds), Reconceptualizing Libraries: Opportunities from the Information and Learning Sciences, Routledge.
USC Rossier School of Education (2020), “Supporting online learning in a time of pandemic”, USC Rossier, available at: https://rossier.usc.edu/supporting-online-learning-covid-pandemic/ (accessed 2 May 2020).
Wenger, E., White, N. and Smith, J.D. (2009), Digital Habitats: Stewarding Technology for Communities, CPsquare.
Xu, D., Li, Q. and Zhou, X. (2020), “Online course quality rubric: a tool box”, Online Learning Research Center, available at: www.olrc.us/reflecting-on-course-design.html (accessed 2 May 2020).
This article is part of the special issue, “A Response to Emergency Transitions to Remote Online Education in K-12 and Higher Education,” which contains shorter, rapid-turnaround invited works not subject to double-blind peer review. The issue was called, managed and produced on a short timeline in Summer 2020 toward pragmatic instructional application in the Fall 2020 semester.