Search results
1 – 10 of over 107,000
Abstract
Training evaluation is an elusive concept, especially when it comes to practice. The practice of evaluation in training has received a lot of criticism, largely explained by the unsystematic, informal and ad hoc evaluation conducted by training institutions. In Malaysia, training activities are monitored by the government, and organisations are required to obtain training services from approved training providers registered with the government. Examines the clients' demand for evaluation, the commitment given by training providers, and the overall practice of evaluation by training providers in Malaysia. Finds that the government, clients and economic conditions have influenced evaluation practice in a positive direction.
Ahmad Al‐Athari and Mohamed Zairi
Abstract
This paper is based on a study which examined current training evaluation activity and the challenges facing Kuwaiti organisations. The study sample was five UK organisations (recognised as best-practice organisations in their T&D activities) and 77 Kuwaiti organisations (40 government and 37 private). Interviews and questionnaires were used. The study reveals that the majority of respondents, in both the government and private sectors, evaluate their training programmes only occasionally. The most popular evaluation tool used by both sectors was the questionnaire. The most common model used by Kuwaiti organisations is the Kirkpatrick model, and the most common level of evaluation in both sectors is the reaction level.
Abstract
Purpose
This paper presents research‐based insight on the challenges of evaluating training activities in today's complex organizational settings.
Design/methodology/approach
The research is taken from three case studies conducted in the New Zealand manufacturing sector, as well as sources of relevant literature. The commentary takes a critical‐realist perspective and challenges learning and development professionals to address the poor reputation of training evaluation.
Findings
Human resource practitioners recognise the importance of gaining feedback from learning events, but research reports question the thoroughness of evaluation processes, claiming they rarely happen to the satisfaction of management. Consequently, training budgets become an easy target during periods of rationalization. The problem centres on overcoming the complexity of defining a meaningful cause/effect relationship between the training and resultant benefit. This research discovered the presence of an “evaluation vacuum” and nine thematic areas requiring close attention. The paper offers reasons why the evaluation of training is becoming increasingly difficult.
Research limitations/implications
The findings are contextual and may not fit all settings, but they offer a comparative account of training evaluation in both straightforward and complex learning environments.
Practical implications
The paper has real and practical implications for human resource professionals. Evaluation of training is not a trivial issue and organizations need to get much better at explaining the beneficial outcomes derived from investments in training.
Originality/value
This paper will be of value to human resource professionals and managers, assisting them to think differently about evaluating training. Innovative concepts such as the “evaluation vacuum” and the term “learning bleed” clarify priorities and contribute to a new perspective on evaluation.
Abstract
Changes in the format of library materials, increased amounts of information, and the speed at which information is being produced have created an unrelenting need for training for library staff members. Additionally, library employees are retiring in greater numbers and their accompanying expertise is being lost. The purpose of this study was to document evaluation practices currently used in library training and continuing education programs for library employees, including metrics used in calculating return-on-investment (ROI). This research project asked 272 library training professionals to identify how they evaluate training, what kind of training evaluation practices are in place, how they select programs to evaluate for ROI, and what criteria are important in determining an effective method for calculating ROI.
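The ROI metrics surveyed in this study generally reduce to the standard return-on-investment formula: ROI (%) = (monetised benefits − programme costs) / programme costs × 100. A minimal sketch in Python (the function name and figures are illustrative, not drawn from the study):

```python
def training_roi(benefits: float, costs: float) -> float:
    """Return training ROI as a percentage of programme costs.

    ROI (%) = (benefits - costs) / costs * 100
    """
    return (benefits - costs) / costs * 100

# Hypothetical example: a programme costing 50,000 that yields
# 80,000 in monetised benefits returns 60% on the investment.
print(training_roi(80_000, 50_000))  # → 60.0
```

The hard part in practice, as the study's survey questions suggest, is not the arithmetic but deciding which benefits can credibly be monetised and attributed to the training.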
Abstract
A five‐stage process model of training is presented which integrates research concerning the transfer, evaluation and institutionalisation of training. The model is designed to enhance the on‐the‐job evaluations of training effectiveness by following up on negative reactions, using additional training, and providing feedback to rectify breakdowns in earlier stages of training use.
Abstract
Training initiatives are widely acknowledged to be a salient feature of the competitive organization's corporate strategy. Contends that, despite heavy investment in training, organizations frequently fail to evaluate adequately the value or success of their training programmes. Those companies which do evaluate often use measures considered ineffective by many researchers. Part of the reason for companies' reluctance to evaluate their training may be confusion as to how and what to evaluate. Reviews some of the barriers to effective training evaluation and outlines the benefits that organizations which do evaluate can gain. Describes the results and implications for organizations of a study undertaken in Europe to answer the question: "What should training evaluations evaluate?" Concludes that neither reactions to training nor immediate post‐training knowledge are predictors of subsequent self‐efficacy (an established indicator of actual performance).
Abstract
Hong Kong has experienced an economic transformation from a manufacturing‐based to a service‐based economy, which has affected the demand for manual labour. In 1992, the Employee Retraining Board was set up to provide employee retraining programmes (ERP) for unemployed manual workers, helping them to acquire and develop the knowledge, skills and abilities needed to re‐enter the labour market. This study evaluates the effectiveness of the ERP from the perspectives of the training providers designated by the Employee Retraining Board to fulfil this objective. The evaluation is based on how well the various ERP courses meet the training objectives, the assessment of training needs, the design of the ERP, course evaluation, and the follow‐up services conducted by the selected training bodies. The overall effectiveness of the ERP is found to be low, and the indicators used by the training bodies, participation rate and job placement rate, tend to produce misleading evaluation results for the ERP.
J.A.G. Jones and W.T. Anderson
Abstract
For the last seven years we have been involved in feasibility studies, applying the economist's rigorous cost‐benefit analysis techniques to training investments and activities. The progress of the research has been well documented in various journals. Very early on it was demonstrated that although cost‐benefit techniques could be applied to certain forms of operator training, they formed only a part of the armoury of a training evaluator when more complicated skills and job organisations were encountered. In 1970 a progress summary was published in this journal. As part of the current debate about the evaluation of training we have been asked to write a further progress summary. Rather than repeating parts of other articles, we have attempted to do two things in this article: first, to clarify some confusion we see developing in the terms used by trainers; and second, to air some ideas being developed in the various research projects in which we are engaged. In this way we hope to illustrate, by implication, current thinking in the ITS about the evaluation of training.
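The cost‐benefit techniques described above typically come down to discounting a training investment's projected benefit stream and comparing it with the up‐front outlay. A minimal net-present-value sketch (the figures and function are hypothetical illustrations, not taken from the ITS studies):

```python
def npv(cash_flows, rate):
    """Net present value of a cash-flow stream.

    cash_flows[0] occurs now (typically the negative training outlay);
    cash_flows[t] occurs t periods in the future and is discounted
    by (1 + rate) ** t.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical operator-training investment: 10,000 spent now,
# 4,000 per year in productivity gains for three years, discounted at 8%.
# A positive NPV suggests the training pays for itself in money terms.
flows = [-10_000, 4_000, 4_000, 4_000]
print(round(npv(flows, 0.08), 2))  # → 308.39
```

As the authors note, this kind of calculation fits simple operator training far better than complex skills, where attributing a cash benefit to the training is itself the contested step.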
Abstract
Purpose
The purpose of this paper is to provide training and human resources development practitioners with a practical, credible and strategically‐useful training evaluation method.
Design/methodology/approach
The suggested evaluation strategy and method are based on the author's experience as a thought leader and consultant with hundreds of organizations world‐wide.
Findings
Human resources development practitioners need a more practical, simple, valid and actionable approach to evaluation.
Practical implications
Evaluation should focus on the entire training and performance improvement process, not solely on training events. Leverage for making improvements to training impact is found in the performance management system factors in the larger organization outside the boundaries of the training department or function.
Originality/value
The paper proposes a new, simpler and more valid approach to measuring training impact that has been tried successfully in several dozen leading companies.
Antonio Giangreco, Andrea Carugati and Antonio Sebastiano
Abstract
Purpose
This paper aims to advance the debate regarding the use of training evaluation tools, chiefly the Kirkpatrick model, in reaction to minimal use of the tools reported in the literature and the economic changes that have characterised the industrialised world in the past 20 years.
Design/methodology/approach
The main argument – the need to design new evaluation tools – emerges from an extensive literature review of criticism of the Kirkpatrick model. The approach is deductive; the argument emerges from extant literature.
Findings
The main findings of the literature review show that the major criticisms of the Kirkpatrick model, though rigorous, are not relevant in today's post‐industrial economy. Issues of complexity, accuracy and refinement, which are relevant in stable industrial organisations, must be revised in the new economic world.
Research limitations/implications
This paper is based on a literature review and presents a call for new research. As such, it is not grounded in original empirical evidence, beyond that presented in the cited articles.
Practical implications
The paper calls for training evaluation tools that align better with modern organisational reality. If the research community responds to this call, the results will benefit practitioners directly. This paper also presents practical advice about the use of existing evaluation techniques.
Originality/value
A new angle on criticisms of existing training evaluation systems: rather than reiterating classic criticisms based on logic and mathematics, the paper takes a pragmatic and economic approach. In doing so, it highlights theoretically grounded paradoxes that arise from the existing criticisms of training evaluation.