Search results

1 – 10 of over 6000
Article
Publication date: 26 April 2013

Jamison V. Kovach, Lee Revere and Ken Black

Abstract

Purpose

This study aims to provide healthcare managers with a meaningful synthesis of state-of-the-art knowledge on error proofing strategies. The purpose is to provide a foundation for understanding medical error prevention, to support the strategic deployment of error proofing strategies, and to facilitate the development and implementation of new error proofing strategies.

Design/methodology/approach

A diverse panel of 40 healthcare professionals evaluated the 150 error proofing strategies presented in the AHRQ research monograph using classification systems developed by earlier researchers. Error proofing strategies were ranked on effectiveness, cost, and ease of implementation, as well as on their aim/purpose, i.e. the elimination, replacement, facilitation, detection, or mitigation of errors.
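
As a rough illustration of how such a ranking might be produced (not the study's actual protocol), the sketch below aggregates hypothetical panel ratings of a few strategies on effectiveness, cost and ease of implementation; the strategy names, 1–5 scales and equal weighting are assumptions for illustration only.

```python
from statistics import mean

# Hypothetical panel ratings: strategy -> list of (effectiveness, cost, ease)
# scores, each on a 1-5 scale from one panelist. Names and scales are illustrative.
ratings = {
    "barcode medication check": [(5, 2, 3), (4, 3, 4)],
    "color-coded syringes":     [(3, 4, 5), (4, 5, 5)],
}

def priority(scores):
    """Average each dimension across panelists; lower cost is better,
    so cost is inverted before combining (equal weights assumed)."""
    eff = mean(s[0] for s in scores)
    cost = mean(s[1] for s in scores)
    ease = mean(s[2] for s in scores)
    return eff + (6 - cost) + ease   # invert cost on a 1-5 scale

ranked = sorted(ratings, key=lambda k: priority(ratings[k]), reverse=True)
for strategy in ranked:
    print(f"{strategy}: score {priority(ratings[strategy]):.1f}")
```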

Findings

The findings of this study include prioritized lists of error proofing strategies from the AHRQ manual based on the preferred characteristics (i.e. effectiveness, cost, ease of implementation) and underlying principles (i.e. elimination, replacement, facilitation, detection, or mitigation of errors) associated with each strategy.

Research limitations/implications

The results of this study should be considered in light of certain limitations. The sample size of 40 panelists from hospitals, medical practices, and other healthcare-related companies in the Gulf Coast region of the USA prevents a stronger generalization of the findings to other groups or settings. Future studies that replicate this approach, but employ larger samples, are appropriate. Through the use of public forums and expanded sampling, it may be possible to further validate the findings reported in this paper and to expand and build on the results obtained in this study.

Practical implications

Using the error proofing strategies identified provides a starting point for researchers seeking to better understand the impact of error proofing on healthcare services, the quality of those services and the potential financial ramifications. Further, the results presented enhance the strategic deployment of error proofing strategies by bringing to light some of the important factors that healthcare managers should consider when implementing error proofing solutions. Most notably, healthcare managers are encouraged to implement effective solutions, rather than those that are merely inexpensive and/or easy to implement, as is more often the case.

Originality/value

This study provides a much-needed forum for sharing error proofing strategies, their effectiveness, and their implementation.

Article
Publication date: 26 April 2013

Jennifer Bowerman

Details

Leadership in Health Services, vol. 26 no. 2
Type: Research Article
ISSN: 1751-1879

Article
Publication date: 12 October 2015

Maria Crema and Chiara Verbano

Abstract

Purpose

The purpose of this paper is to investigate connections and overlaps between health lean management (HLM) and clinical risk management (CRM), to understand whether and how these two approaches can be combined to pursue efficiency and patient safety improvements simultaneously.

Design/methodology/approach

A systematic literature review was carried out. Academic databases were searched, and papers that focus not only on HLM but also on clinical errors and risk reduction were included. The general characteristics of the selected papers were analysed, and a content analysis was conducted.

Findings

In most of the papers, by pursuing the objectives of HLM and CRM and adopting tools and practices from both approaches, quality and, particularly, safety improvements were obtained. A two-way link between HLM and CRM emerged, but so far none of the studies has focused directly on the relationship between the two.

Originality/value

Results highlight an emerging research stream, with many useful theoretical and practical implications and opportunities for further research.

Details

International Journal of Health Care Quality Assurance, vol. 28 no. 8
Type: Research Article
ISSN: 0952-6862

Article
Publication date: 15 September 2022

Armagan Altinisik, Utku Yildirim and Y. Ilker Topcu

Abstract

Purpose

Tightening operations are among the most critical operations in automotive assembly lines because of their direct impact on customer safety. This study aims to evaluate the major complexity drivers for manual tightening operations, correlate them with real tightening failure data and propose mitigations to reduce complexity.

Design/methodology/approach

In the first stage, the complexity drivers for manual tightening operations were identified. Then, the relative importance of the risk attributes was defined by using a pairwise comparison questionnaire. Further, failure mode effect analysis–analytic hierarchy process (FMEA–AHP) and AHP ratings methods were applied to 20 manual tightening operations in automotive assembly lines. Finally, the similarities between the results obtained and the real failure rates of a Turkish automotive factory were examined and a sensitivity analysis was conducted.
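
For readers unfamiliar with AHP, the sketch below shows how priority weights can be derived from a pairwise comparison matrix using the geometric-mean approximation, together with a basic consistency check; the matrix values and driver names are illustrative assumptions, not the study's data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix over three complexity drivers
# (e.g. ergonomics, operator competency, tightening sequence) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation of the AHP priority vector.
geo_means = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = geo_means / geo_means.sum()

# Simple consistency check via the principal eigenvalue.
lambda_max = max(np.linalg.eigvals(A).real)
CI = (lambda_max - A.shape[0]) / (A.shape[0] - 1)   # consistency index
print("weights:", np.round(weights, 3), "CI:", round(CI, 3))
```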

Findings

The correlation between the proposed methods and manual tightening failure data was calculated as 83%–86%. On the other hand, the correlation between FMEA–AHP and AHP ratings was found to be 92%. Poor ergonomics, operator competency and training, loss of operator concentration and attention fatigue, manual mounting before the tightening operation, frequent task changes, critical tightening sequence, and positioning of the part and/or directional assembly were found to be relatively critical for the selected 20 tightening operations.

Originality/value

This is a unique study for the evaluation of the attributes for manual tightening complexity in automotive assembly lines. The output of this study can be used to improve manual tightening failures in manual assembly lines and to create low complexity assembly lines in new model launches.

Details

Assembly Automation, vol. 42 no. 5
Type: Research Article
ISSN: 0144-5154

Article
Publication date: 15 February 2021

David Keatley and David D. Clarke

Abstract

Purpose

While behaviour sequence analysis (BSA) is popular, it is not without limitations, namely the level of detail required and the time taken to run analyses; therefore, this paper aims to outline a novel method, using 30 serial homicide cases as a worked example.

Design/methodology/approach

Temporal analysis methods are becoming increasingly popular in applied forensic and criminological research. In recent years, BSA has become a widely used approach.

Findings

Waypoint sequencing provides a streamlined version of the traditional BSA approach, allowing for fewer behaviours to be included and providing a clearer overview of the main behaviours of interest.
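
As a rough sketch of the idea (not the authors' implementation), the snippet below restricts coded case sequences to a handful of hypothetical "waypoint" behaviours and counts the transitions between them, which is the core of running a sequence analysis on fewer behaviours; the behaviour labels and cases are invented for illustration.

```python
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Hypothetical coded case sequences; only the behaviours below are treated
# as waypoints, so all other events are dropped before sequencing.
waypoints = {"surveillance", "approach", "attack", "body disposal"}
cases = [
    ["surveillance", "small talk", "approach", "attack", "body disposal"],
    ["surveillance", "approach", "attack", "flee", "body disposal"],
]

transitions = Counter()
for case in cases:
    filtered = [b for b in case if b in waypoints]
    transitions.update(pairwise(filtered))

for (a, b), n in transitions.most_common():
    print(f"{a} -> {b}: {n}")
```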

Practical implications

Waypoint sequencing is demonstrated in the current paper through serial homicide research, showing how to conduct the analyses and how the method supports current investigations by expediting the process and allowing quicker analysis.

Originality/value

The current research provides a novel approach to sequence analysis that is more useful in applied settings as it requires fewer behaviours or events than traditional BSA.

Article
Publication date: 7 January 2020

Omri Suissa, Avshalom Elmalech and Maayan Zhitomirsky-Geffet

Abstract

Purpose

Digitization of historical documents is a challenging task in many digital humanities projects. A popular approach for digitization is to scan the documents into images, and then convert images into text using optical character recognition (OCR) algorithms. However, the outcome of OCR processing of historical documents is usually inaccurate and requires post-processing error correction. The purpose of this paper is to investigate how crowdsourcing can be utilized to correct OCR errors in historical text collections, and which crowdsourcing methodology is the most effective in different scenarios and for various research objectives.

Design/methodology/approach

A series of experiments with different micro-task structures and text lengths was conducted with 753 workers on Amazon’s Mechanical Turk platform. The workers had to fix OCR errors in a selected historical text. To analyze the results, new accuracy and efficiency measures were devised.
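
The paper's own measures are not reproduced here, but a stand-in accuracy metric of the same flavour might compare a worker's corrected text against a gold transcription and report the share of OCR errors that were fixed; the metric and the example strings below are illustrative assumptions.

```python
import difflib

def correction_accuracy(ocr_text: str, corrected: str, gold: str) -> float:
    """Fraction of OCR errors (relative to a gold transcription) that the
    worker's correction fixed. A stand-in metric, not the measure from the paper."""
    def errors(text):
        # Count character positions that differ from the gold transcription.
        sm = difflib.SequenceMatcher(None, text, gold)
        return sum(max(i2 - i1, j2 - j1)
                   for tag, i1, i2, j1, j2 in sm.get_opcodes() if tag != "equal")
    before, after = errors(ocr_text), errors(corrected)
    return 1.0 if before == 0 else max(0.0, (before - after) / before)

print(correction_accuracy("Th3 qvick fox", "The quick fox", "The quick fox"))  # 1.0
```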

Findings

The analysis suggests that, in terms of accuracy, the optimal text length is medium (paragraph-size) and the optimal structure of the experiment is two-phase with a scanned image. In terms of efficiency, the best results were obtained when using longer texts in the single-stage structure with no image.

Practical implications

The study provides practical recommendations to researchers on how to build the optimal crowdsourcing task for OCR post-correction. The developed methodology can also be utilized to create gold-standard historical texts for automatic OCR post-correction.

Originality/value

This is the first attempt to systematically investigate the influence of various factors on crowdsourcing-based OCR post-correction and propose an optimal strategy for this process.

Details

Aslib Journal of Information Management, vol. 72 no. 2
Type: Research Article
ISSN: 2050-3806

Article
Publication date: 25 August 2023

Kirk Luther, Zak Keeping, Brent Snook, Hannah de Almeida, Weyam Fahmy, Alexia Smith and Tianshuang Han

Abstract

Purpose

The purpose of this study is to contribute to the literature on information elicitation. The authors investigated the impact of social influence strategies on eyewitness recall performance. Specifically, the authors examined the effect of social influence techniques (Cialdini, 2007) on recall performance (Experiment 1) and conducted a follow-up experiment to examine the incremental effect of social proof on the report everything cognitive interview mnemonic (Experiment 2).

Design/methodology/approach

Participants watched a video depicting vandalism (Experiment 1: N = 174) or a verbal altercation (Experiment 2: N = 128) and were asked to recall the witnessed event. Experiment 1: Participants were assigned randomly to one of six conditions: control (open-ended prompt), engage and explain (interview ground rules), consistency (signing an agreement to work diligently), reciprocity (given water and food), authority (told of interviewer’s training) and social proof (shown transcript from an exemplar participant). Experiment 2: The authors used a 2 (social proof: present, absent) × 2 (report everything: present, absent) between-participants design.

Findings

Across both experiments, participants exposed to the social proof tactic (i.e. compared to a model exemplar) spoke longer and recalled more correct details than participants not exposed to the social proof tactic. In Experiment 2, participants interviewed with the report everything mnemonic also spoke longer, recalled more correct and more incorrect details, and provided slightly more confabulations than those not interviewed with the report everything mnemonic.

Originality/value

The findings have practical value for police investigators and other professionals who conduct interviews (e.g. military personnel, doctors obtaining information from patients). Interviewers can incorporate social proof in their interviewing practices to help increase the amount and accuracy of information obtained.

Details

Journal of Criminal Psychology, vol. 14 no. 1
Type: Research Article
ISSN: 2009-3829

Article
Publication date: 4 July 2023

Binghai Zhou and Mingda Wen

Abstract

Purpose

In a kitting supply system, the occurrence of material-handling errors is unavoidable and will cause serious production losses to an assembly line. To minimize production losses, this paper aims to present a dynamic scheduling problem for automotive assembly lines that considers material-handling mistakes by integrating abnormal disturbances into the material distribution problem of mixed-model assembly lines (MMALs).

Design/methodology/approach

A multi-phase dynamic scheduling (MPDS) algorithm is proposed based on the characteristics and properties of the dynamic scheduling problem. In the first phase, the static material distribution scheduling problem is decomposed into three optimization sub-problems, and a dynamic programming algorithm is used to jointly optimize the sub-problems and obtain the optimal initial scheduling plan. In the second phase, a two-stage rescheduling algorithm incorporating removing rules and adding rules was designed according to the status update mechanism of material demand and multi-load AGVs.
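
As a heavily simplified sketch of the second-phase idea (not the authors' MPDS algorithm), the snippet below applies a removing rule that drops a kit affected by a handling error from its AGV trip, and an adding rule that re-inserts it on the earliest feasible trip; all class names, fields and feasibility checks are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Delivery:
    kit_id: int
    due_cycle: int          # assembly cycle by which the kit is needed

@dataclass
class AgvTrip:
    depart_cycle: int
    capacity: int
    load: list = field(default_factory=list)

def reschedule(trips, deliveries, failed_kit: int, current_cycle: int):
    """Toy two-stage repair: a removing rule drops the failed kit from its trip,
    then an adding rule re-inserts it on the earliest trip that still has capacity
    and departs before the kit's due cycle. Illustrative only."""
    # Stage 1: removing rule.
    for trip in trips:
        trip.load = [d for d in trip.load if d.kit_id != failed_kit]
    # Stage 2: adding rule.
    kit = next(d for d in deliveries if d.kit_id == failed_kit)
    for trip in sorted(trips, key=lambda t: t.depart_cycle):
        if current_cycle <= trip.depart_cycle <= kit.due_cycle and len(trip.load) < trip.capacity:
            trip.load.append(kit)
            return trip
    return None  # no feasible trip: some production loss is unavoidable
```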

Findings

Through comparative experiments with the periodic distribution strategy (PD) and the direct insertion method (DI), the superiority of the proposed dynamic scheduling strategy and algorithm is verified.

Originality/value

To the best of the authors’ knowledge, this study is the first to consider the impact of material-handling errors on the material distribution scheduling problem when using a kitting strategy. By designing an MPDS algorithm, this paper aims to maximize the absorption of the disturbance caused by material-handling errors and to reduce the production losses of the assembly line as well as the total cost of material transportation.

Details

Engineering Computations, vol. 40 no. 5
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 5 April 2024

Ziwen Gao, Steven F. Lehrer, Tian Xie and Xinyu Zhang

Abstract

Motivated by empirical features that characterize cryptocurrency volatility data, the authors develop a forecasting strategy that can account for both model uncertainty and heteroskedasticity of unknown form. The theoretical investigation establishes the asymptotic optimality of the proposed heteroskedastic model averaging heterogeneous autoregressive (H-MAHAR) estimator under mild conditions. The authors additionally examine the convergence rate of the estimated weights of the proposed H-MAHAR estimator. This analysis sheds new light on the asymptotic properties of the least squares model averaging estimator under alternative complicated data generating processes (DGPs). To examine the performance of the H-MAHAR estimator, the authors conduct an out-of-sample forecasting application involving 22 different cryptocurrency assets. The results emphasize the importance of accounting for both model uncertainty and heteroskedasticity in practice.
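
The H-MAHAR estimator itself is not reproduced here, but the least squares model averaging step it builds on can be sketched as choosing simplex-constrained weights that minimize in-sample squared error across candidate models' fitted values; the function and toy data below are a generic illustration under that assumption, not the authors' estimator.

```python
import numpy as np
from scipy.optimize import minimize

def average_forecasts(preds: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares model averaging over candidate models' fitted values.
    preds: (n_obs, n_models) in-sample predictions; y: (n_obs,) realized values.
    Weights lie on the simplex; a generic sketch, not the H-MAHAR estimator."""
    m = preds.shape[1]
    loss = lambda w: np.sum((y - preds @ w) ** 2)
    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    res = minimize(loss, np.full(m, 1.0 / m), bounds=[(0, 1)] * m, constraints=cons)
    return res.x

# Toy usage with three hypothetical candidate volatility models' fitted values.
rng = np.random.default_rng(0)
y = rng.normal(size=100)
preds = np.column_stack([y + rng.normal(scale=s, size=100) for s in (0.5, 1.0, 2.0)])
print(np.round(average_forecasts(preds, y), 3))
```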

Article
Publication date: 27 February 2007

Marcel Fernandez, Josep Cotrina‐Navau and Miguel Soriano

Abstract

Purpose

The purpose of this paper is to show that a fingerprinting code is a set of code words that are embedded in each copy of a digital object, with the purpose of making each copy unique. If the fingerprinting code is c-secure, then the decoding of a pirate word created by a coalition of at most c dishonest users will expose at least one of the guilty parties.

Design/methodology/approach

The paper presents a systematic strategy for collusion attacks on a fingerprinting scheme. As a particular case, this strategy shows that linear codes are not good fingerprinting codes. Based on binary linear equidistant codes, the paper constructs a family of fingerprinting codes in which the identification of guilty users can be done efficiently using minimum distance decoding. Moreover, in order to obtain codes with a better rate, a 2-secure fingerprinting code is also constructed by concatenating a code from the previous family with an outer IPP code.
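
As a toy illustration of the minimum distance decoding step (not the paper's code construction), the snippet below accuses the user whose codeword is closest in Hamming distance to an observed pirate word; the binary codewords and the colluder example are illustrative assumptions.

```python
def accuse(codewords: dict, pirate_word: str) -> str:
    """Minimum distance decoding sketch: accuse the user whose codeword is
    closest (in Hamming distance) to the pirate word. Toy code, not the
    equidistant or concatenated construction from the paper."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(codewords, key=lambda user: hamming(codewords[user], pirate_word))

# Toy example: two colluders ("alice", "bob") mix their codewords into a pirate
# copy; under the marking assumption they can only alter positions where they differ.
codewords = {"alice": "000111", "bob": "110001", "carol": "011010"}
print(accuse(codewords, "010101"))  # the closest codeword identifies a colluder
```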

Findings

The particular choice of the codes is such that it allows the use of efficient decoding algorithms that correct errors beyond the error correction bound of the code, namely a simplified version of the Chase algorithms for the inner code and the Koetter‐Vardy soft‐decision list decoding algorithm for the outer code.

Originality/value

The paper presents a fingerprinting code together with an efficient chasing algorithm.

Details

Online Information Review, vol. 31 no. 1
Type: Research Article
ISSN: 1468-4527
