Search results
1–10 of over 129,000
Abstract
Purpose
This paper seeks to address current limitations in approaches to training evaluation by presenting a conceptual model of work‐based learning and an associated evaluation framework.
Design/methodology/approach
The model and framework presented in this paper are based on a critical review of current approaches to learning evaluation and insights from learning transfer research and programme theory.
Findings
This paper sets out a conceptual model of workplace learning based on five elements: a pre‐learning stage, the trigger (need) for learning, the learning event, application of learning and the impact of learning. A linked criterion evaluation framework is also described. It is proposed that this provides a scientifically robust but practitioner-friendly framework for workplace learning evaluation.
Practical implications
While most organisations wish to evaluate the effectiveness of their investment in employee training and development, few do. One of the barriers to effective learning evaluation is the failure to ground approaches in a contemporary and comprehensive model of workplace learning. The model and framework set out in this paper aim to assist evaluation by addressing this gap in a practitioner-friendly way.
Originality/value
This paper sets out a novel, flexible and comprehensive conceptual model of workplace learning along with an innovative approach to training evaluation that addresses limitations in existing approaches. It is hoped that this will contribute to the debate on appropriate evaluation methods and assist practitioners to undertake evaluation in a more credible manner.
Susan Geertshuis, Mary Holmes, Harry Geertshuis, David Clancy and Amanda Bristol
Abstract
Reports on an effort to implement good practices in learning evaluation. Reviews learning evaluation practices and gathers data using a dedicated software system. Demonstrates learning takes place within complex social systems populated by a multiplicity of factors that influence perceptions of learning and performance outcomes. Argues that technology enables cost‐effective evaluations to be implemented that encompass a broad spectrum of influencing variables and acknowledge the empowered status of the learner. Discusses the implications for evaluation methodologies and the role of trainers within organisations.
Abstract
Purpose
This paper seeks to argue that workplace learning evaluation theory and practice still constitute an emergent field and that this creates a number of challenges for practitioners and researchers alike.
Design/methodology/approach
This is a descriptive paper based on a critical review of existing approaches and the research literature.
Findings
While programme evaluation has a long history, workplace learning evaluation is yet to establish itself as a distinct field. This has a number of consequences including the lack of a single or settled view on how workplace learning should be evaluated or what specific aspects of learning should be investigated.
Practical implications
The need to demonstrate a return on investment in organisational learning is as pressing as ever. To become more effective, training evaluation methods need to be grounded in theory. This article aims to provide an informed perspective on the current state of workplace learning evaluation, along with insights into how evaluation can be placed on firmer theoretical foundations in order to produce robust findings in a practitioner-friendly way.
Originality/value
This paper provides original insights into the development of workplace evaluation approaches and the challenges the field faces.
Lynne Caley, Sharon J. Williams, Izabela Spernaes, David Thomas, Doris Behrens and Alan Willson
Abstract
Purpose
It has become accepted practice to include an evaluation alongside learning programmes that take place at work, as a means of judging their effectiveness. There is a tendency to focus such evaluations on the relevance of the intervention and the amount of learning achieved by the individual. The aim of this review is to examine existing evaluation frameworks that have been used to evaluate education interventions and, in particular, assess how these have been used and the outcomes of such activity.
Design/methodology/approach
A scoping review using Arksey and O'Malley's five-stage framework was undertaken to examine existing evaluation frameworks claiming to evaluate education interventions.
Findings
Forty-five articles were included in the review. A majority of papers concentrate on learner satisfaction and/or learning achieved. Rarely is a structured framework mentioned, or detail of the approach to analysis cited. Typically, evaluations lacked baseline data, control groups, longitudinal observations and contextual awareness.
Practical implications
This review has implications for those involved in designing and evaluating work-related education programmes, as it identifies areas where evaluations need to be strengthened and recommends how existing frameworks can be combined to improve how evaluations are conducted.
Originality/value
This scoping review is novel in its assessment and critique of evaluation frameworks employed to evaluate work-related education programmes.
Abstract
Evaluation is the process by which we estimate how things should go, explore how things are going, and determine how things went in terms of course redesign. In this chapter, we examine formative and summative methods for assessing student learning and establishing teacher effectiveness and course quality. Evaluation is a subjective, value-laden process. To introduce the rigor needed to make it meaningful, evaluation should be multifaceted, planned in advance, made transparent to learners, and employ valid and reliable methods. Moving courses online presents both opportunities and challenges for evaluation. We explore ways to implement assessment to make full use of the advantages of technology while mitigating the problems associated with online delivery.
Panagiotis Zaharias and Panayiotis Koutsabasis
Abstract
Purpose
The purpose of this paper is to discuss heuristic evaluation as a method for evaluating e‐learning courses and applications and more specifically to investigate the applicability and empirical use of two customized e‐learning heuristic protocols.
Design/methodology/approach
Two representative e‐learning heuristic protocols were chosen for the comparative analysis. These protocols augment the "traditional" heuristic sets so as to cover technology‐enhanced learning properties. Two reviewers who have considerable experience in usability evaluation as well as in e‐learning were involved in this comparative analysis. Coverage, distribution and redundancy were employed as three basic criteria for conducting the evaluation.
Findings
The main results of the study indicate that both heuristic protocols exhibit wide coverage of potential usability problems. The distribution of usability problems is uneven to a large number of heuristics for both heuristic sets, which reveals that some heuristics are more general than others.
Originality/value
This study shows the empirical application of two heuristic protocols in a usability evaluation of e‐learning applications. Furthermore, it provides a comparison of the two heuristic sets according to a set of criteria and provides a first set of suggestions regarding further development and validation of these heuristic sets.
Abstract
Many universities are currently investing significant sums of money into refurbishing existing learning spaces and/or building further infrastructure (including Next Generation Learning Spaces (NGLS)) to support learning and teaching in the face-to-face context. While this is usually welcomed by staff and students, there is often a concern that designs are not informed by input from appropriate stakeholders.
This chapter brings together information from a range of sources to provide practical ideas and advice on designing robust, whole-of-lifecycle evaluations for learning space projects. By incorporating pre- and post-occupancy stages, involving a wide array of stakeholders and looking beyond surveys and focus groups as evaluation techniques, universities can ensure that future designs take into consideration the experiences and context of staff and students at the institution as well as lessons learned from previous projects.
Lefteris Moussiades and Anthi Iliopolou
Abstract
The evolution of Internet Technology has influenced the basis of education by introducing new methodologies into teaching and giving a new dimension to distance learning. On the other hand, there is an emerging need on the users' part to define standards for judging the quality and effectiveness of educational Websites. In this project, an evaluation methodology for Virtual Learning Environments (VLEs) is presented, along with basic guidelines that must be followed when evaluating a VLE. This report emphasises the importance of evaluation in educational practice and reviews the current literature with a view to helping the VLE practitioner find his or her own way through the evaluation process.
Ulrik Brandi and Peter Christensen
Abstract
Purpose
The purpose of this paper is to explore how enterprises should arrange their learning processes in order to optimise the integration and creation of sustainable organisational learning. The paper describes a lite learning evaluation technology that enables processual real-time evaluation of the implementation of new knowledge and competences in a practice context.
Design/methodology/approach
The research is based on a case study that is designed and planned as a mixed method inquiry. The empirical case study is based on data from a large Danish enterprise from the telecommunication industry conducting a leadership and sales training programme. Case study analysis uses data drawn from the implemented pulse survey followed up with qualitative interviews with the course participants.
Findings
The authors show results on two levels. On the individual level, processual real-time lite learning evaluation tools create transparency and adaptability. On the organisational level, the tool shapes the organisational capacity to improve routines and practices for how to work with organisational learning and learning data in general. Instead of treating learning and development as something that happens "automatically", organisations now have a tool for making informed decisions aimed at creating sustainable organisational learning processes and results.
Originality/value
The paper prompts insights that call for enterprises to enhance focus and dialogue on how to work in new and smart ways with learning at a multi-stakeholder level in organisations. The design and deployment of a real-time lite evaluation tool in organisations are key to bolstering learning and competence development, so that organisations and societies can become more responsive to the challenges posed by today's knowledge economy.
Abstract
Purpose
This research paper presents an innovative evaluation methodology which was developed as part of a doctoral research study in a voluntary sector youth organisation in England.
Design/methodology/approach
The transformative methodology synthesises aspects of appreciative inquiry, participatory evaluation and transformative learning and engages the whole organisation in evaluating impact. Using an interpretive paradigm, data were collected from youth workers via semi-structured interviews before and after implementation of the transformative evaluation methodology.
Findings
Drawing on thematic analysis of the youth workers' experiences, it is argued that the illuminative and transformative nature of the methodology enabled the learning and development functions of evaluation to be realised. Further, it is argued that this form of evaluation not only supports the collection of evidence to demonstrate impact externally, but that the process itself has the potential to enhance practice, improve outcomes "in the moment" and promote organisational learning.
Research limitations/implications
The research findings are limited by the small-scale nature of the project. Further research is needed to investigate the supporting and enabling factors that underpin participatory practices in organisation evaluation; and in particular to investigate the experience of the managers and trustees as these were not the focus of this research.
Originality/value
This article makes a significant contribution to knowledge in regard to the design and use of participatory evaluation. It evidences the benefits in relation to generating practice improvements and for practitioners themselves in terms of countering the negative effects of performativity. Transformative evaluation offers an innovative structure and process through which organisational learning can be realised.