Search results

1 – 10 of 56
Article
Publication date: 10 October 2023

Sou-Sen Leu, Yen-Lin Fu and Pei-Lin Wu

Abstract

Purpose

This paper aims to develop a dynamic civil facility degradation prediction model to forecast the reliability performance tendency and remaining useful life under imperfect maintenance based on the inspection records and the maintenance actions.

Design/methodology/approach

A real-time hidden Markov model (HMM) is proposed in this paper to predict the reliability performance tendency and remaining useful life under imperfect maintenance based on rare failure events. The model assumes a Poisson arrival pattern for facility failure events. The HMM is further adopted to establish the transition probabilities among degradation stages. Finally, simulation inference is conducted with a particle filter (PF) to estimate the most probable model parameters. Water seals at the spillway hydraulic gate of a reservoir in Taiwan are used to examine the appropriateness of the approach.
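The forward process this abstract describes — rare failures arriving at a rate that depends on a hidden degradation stage — can be sketched in a few lines of Python. All probabilities below are invented for illustration; the paper estimates such quantities from inspection records via HMM transitions and particle-filter inference.

```python
import random

# Toy forward simulation of a staged degradation process. The transition
# probabilities and per-period failure probabilities are invented; the paper
# infers the real ones from inspection records.
TRANSITION = {"Good": [("Good", 0.90), ("Fair", 0.10)],
              "Fair": [("Fair", 0.85), ("Poor", 0.15)],
              "Poor": [("Poor", 1.00)]}
FAILURE_PROB = {"Good": 0.01, "Fair": 0.05, "Poor": 0.20}  # per period

def step(state, rng):
    """Advance the hidden degradation stage by one inspection period."""
    u, acc = rng.random(), 0.0
    for nxt, p in TRANSITION[state]:
        acc += p
        if u < acc:
            return nxt
    return state

def simulate(periods=200, seed=7):
    """Count rare failures over a horizon; the failure rate tracks the stage."""
    rng, state, failures = random.Random(seed), "Good", 0
    for _ in range(periods):
        state = step(state, rng)
        failures += rng.random() < FAILURE_PROB[state]
    return state, failures
```

With rare events like these, most periods record no failure at all — exactly the sparse-data regime the particle-filter inference is meant to handle.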

Findings

The defect-probability trends produced by the real-time HMM model are highly consistent with the actual defect patterns of the civil facilities. The proposed degradation prediction model can give the maintenance division early warning of potential failure so that a proper proactive maintenance plan can be established, even when defects are rare.

Originality/value

This model is a new method of civil facility degradation prediction under imperfect maintenance, even with rare failure events. It overcomes several limitations of classical failure pattern prediction approaches and can reliably simulate both the occurrence of rare defects under imperfect maintenance and the effect of inspection reliability degraded by human error. Based on the predicted degradation trend, effective maintenance management plans can be implemented to minimize the frequency and consequences of civil facility failures.

Details

Journal of Quality in Maintenance Engineering, vol. 30 no. 1
Type: Research Article
ISSN: 1355-2511

Keywords

Content available
Article
Publication date: 23 October 2023

Adam Biggs and Joseph Hamilton

Abstract

Purpose

Evaluating warfighter lethality is a critical aspect of military performance. Raw metrics such as marksmanship speed and accuracy can provide some insight, yet interpreting subtle differences can be challenging. For example, is a speed difference of 300 milliseconds more important than a 10% accuracy difference on the same drill? Marksmanship evaluations must have objective methods to differentiate between critical factors while maintaining a holistic view of human performance.

Design/methodology/approach

Monte Carlo simulations are one method to circumvent speed/accuracy trade-offs within marksmanship evaluations. They can accommodate both speed and accuracy implications simultaneously without needing to hold one constant for the sake of the other. Moreover, Monte Carlo simulations can incorporate variability as a key element of performance. This approach thus allows analysts to determine consistency of performance expectations when projecting future outcomes.
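As a concrete illustration of how a Monte Carlo simulation folds speed and accuracy into one outcome, the sketch below estimates the probability of scoring a hit within a fixed time budget. The drill parameters are invented, not taken from the article.

```python
import random

# Hypothetical drill: estimate P(at least one hit before the time budget
# expires) for a shooter with a given shot cadence and hit probability.
def p_hit_within(time_budget, shot_time_mean, shot_time_sd, hit_prob,
                 trials=20_000, seed=3):
    rng, hits = random.Random(seed), 0
    for _ in range(trials):
        t = 0.0
        while True:
            t += max(0.05, rng.gauss(shot_time_mean, shot_time_sd))
            if t > time_budget:
                break          # ran out of time without a hit
            if rng.random() < hit_prob:
                hits += 1      # first hit ends the engagement
                break
    return hits / trials

# Shooter A is ~300 ms faster per shot; shooter B is 10 points more accurate.
fast = p_hit_within(3.0, shot_time_mean=1.0, shot_time_sd=0.2, hit_prob=0.70)
accurate = p_hit_within(3.0, shot_time_mean=1.3, shot_time_sd=0.2, hit_prob=0.80)
```

Comparing `fast` and `accurate` answers the "300 milliseconds versus 10% accuracy" question directly for this (invented) drill, rather than leaving the trade-off to intuition.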

Findings

The review divides outcomes into a theoretical overview section and a practical implications section. Each aspect of the Monte Carlo simulation can be addressed separately, reviewed and then incorporated as a potential component of small arms combat modeling. This structure allows new human performance practitioners to adopt the method more quickly for different applications.

Originality/value

Performance implications are often presented as inferential statistics. By using the Monte Carlo simulations, practitioners can present outcomes in terms of lethality. This method should help convey the impact of any marksmanship evaluation to senior leadership better than current inferential statistics, such as effect size measures.

Details

Journal of Defense Analytics and Logistics, vol. 7 no. 2
Type: Research Article
ISSN: 2399-6439

Keywords

Article
Publication date: 23 February 2024

Anand Prakash and Sudhir Ambekar

Abstract

Purpose

This study aims to describe the fundamentals of teaching risk management in a classroom setting, with an emphasis on the learning interface between higher education and the workplace environment for business management students.

Design/methodology/approach

The study reviews literature that uses spreadsheets to visualize and model risk and uncertainty. Using six distinct case-based activities (CBAs), the study illustrates practical applications of software such as Palisade @RISK in risk management education, helping to close the gap between theory and practice. The software assists in estimating the likelihood of a risk event and the impact it will have if it occurs, a style of risk analysis that makes it possible to identify the risks needing the most active control.
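The likelihood-times-impact analysis that @RISK performs in a spreadsheet can be sketched in plain Python. The risk register below is an invented classroom-style example, not data from the study.

```python
import random

# Invented risk register: (name, probability of firing, (low, high) cost impact).
RISKS = [("supplier delay", 0.30, (10_000, 50_000)),
         ("key staff loss", 0.10, (20_000, 80_000)),
         ("scope creep",    0.50, (5_000, 30_000))]

def simulate_total_cost(trials=10_000, seed=11):
    """Monte Carlo over the register: in each trial a risk fires with its
    probability and, if it fires, draws a cost impact from its range."""
    rng, totals = random.Random(seed), []
    for _name, in []:  # (placeholder removed below)
        pass
    for _ in range(trials):
        total = 0.0
        for _name, p, (lo, hi) in RISKS:
            if rng.random() < p:
                total += rng.uniform(lo, hi)
        totals.append(total)
    totals.sort()
    return {"mean": sum(totals) / trials, "p90": totals[int(0.9 * trials)]}
```

Reading off a high percentile such as the p90 identifies the exposure that needs the most active control — the same kind of percentile reading the commercial software provides.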

Findings

@RISK can be used to create models whose results cover every potential scenario outcome. When a choice or analysis involves uncertainty, @RISK can be used to build a clearer picture of what the future might hold.

Originality/value

The insights from this study can be used to develop critical thinking, independent thinking, problem-solving and other important skills in learners. Further, educators can apply Bloom’s taxonomy and the problem-solving taxonomy to help students make informed decisions in risky situations.

Details

Higher Education, Skills and Work-Based Learning, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2042-3896

Keywords

Article
Publication date: 7 September 2023

Juyeon Lee and Taekyung Park

Abstract

Purpose

Growing attention has been paid to bricolage as a strategic means of overcoming resource constraints in small and medium-sized enterprises (SMEs). In the industrial market, a bricolage strategy and ambidextrous action may help firms remain competitive by responding quickly to business-to-business marketing demands. Despite their paramount importance, questions as to how bricolage is strengthened and how it improves innovation ambidexterity have remained unanswered. This study aims to develop an integrated model of the relationships among environmental turbulence, learning orientation, ambidexterity and performance, with a particular focus on the mediating role of bricolage.

Design/methodology/approach

Building on a literature review of the key constructs, hypotheses were developed. Data were collected by questionnaire from 229 SMEs in South Korea. To test the hypotheses, structural equation modeling and the Monte Carlo method for assessing mediation were performed.
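The "Monte Carlo method for assessing mediation" mentioned above has a compact core: draw the two path coefficients from normal distributions around their estimates and form a confidence interval for the indirect effect a×b. The coefficient values below are invented for illustration, not the study's estimates.

```python
import random

def monte_carlo_indirect(a, se_a, b, se_b, draws=20_000, seed=5):
    """95% Monte Carlo confidence interval for the indirect effect a*b."""
    rng = random.Random(seed)
    products = sorted(rng.gauss(a, se_a) * rng.gauss(b, se_b)
                      for _ in range(draws))
    return products[int(0.025 * draws)], products[int(0.975 * draws)]

# Invented path estimates: X -> mediator (a) and mediator -> Y (b).
lo, hi = monte_carlo_indirect(a=0.40, se_a=0.08, b=0.35, se_b=0.07)
```

If the interval excludes zero, the mediation — here, bricolage between its antecedents and ambidexterity — would be judged significant.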

Findings

Results reveal that environmental turbulence and learning orientation are positively associated with bricolage, which sequentially affects ambidextrous action as a driver of performance. The findings also indicate that bricolage significantly mediates the relationship between its antecedents and ambidexterity.

Originality/value

This research contributes to advancing our understanding of the role of a bricolage strategy for innovation ambidexterity and performance in SMEs. This study is the first to examine the mediation of bricolage between environmental factors and ambidexterity for improved performance.

Details

Journal of Business & Industrial Marketing, vol. 39 no. 3
Type: Research Article
ISSN: 0885-8624

Keywords

Article
Publication date: 11 December 2023

Justin B. Keeler, Noelle F. Scuderi, Meagan E. Brock Baskin, Patricia C. Jordan and Laura M. Meade

Abstract

Purpose

The purpose of this study is to investigate the complexity of how demands and stress are mitigated to enhance employee performance in remote working arrangements.

Design/methodology/approach

A time-lagged snowball sample of 223 full-time remote working adults in the United States participated in an online survey. Data were analyzed using R 4.0.2 and structural equation modeling.

Findings

Results suggest remote job resources involving organizational trust and work flexibility increase performance via serial mediation when considering information communication technology (ICT) demands and work–life interference (WLI). The findings provide insights into counterbalancing the negative aspects of specific demands and stress in remote work arrangements.

Practical implications

This study provides insights for managers into how basic job resources may shape perspectives on demands and WLI and thereby impact performance. Specific to remote working arrangements, establishing trust with employees and promoting accountability alongside work flexibility can play an important part in their performance.

Originality/value

This study contributes theoretically to the literature by evidencing how components of the E-Work Life (EWL) scale can be used with greater versatility beyond the original composite measurement, drawing on the job demands-resources (JD-R) framework and conservation of resources (COR) theory. It answers several calls in the research to investigate the complex roles that ICT demands and WLI play in work performance.

Details

Journal of Organizational Effectiveness: People and Performance, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2051-6614

Keywords

Article
Publication date: 5 April 2024

Fangqi Hong, Pengfei Wei and Michael Beer

Abstract

Purpose

Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the posterior variance contribution (PVC) function, have been developed for adaptive experimental design of the integration points. However, these sequential design strategies prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve computational efficiency.

Design/methodology/approach

By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all make important contributions to the integration accuracy and can be selected as design points, providing a practical path to parallelizing adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are introduced to find multiple local maxima of the PVC function in a single run and thus to implement adaptive BC in parallel.
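The parallelization idea — take every local maximum of the acquisition function as a design point, not just the global one — can be illustrated on a toy multimodal function. The function below is an invented stand-in, not the actual PVC function, and the grid scan stands in for the paper's multimodal optimization algorithms.

```python
import math

def toy_acquisition(x):
    """Toy multimodal stand-in for an acquisition function such as PVC."""
    return math.sin(3 * x) * math.exp(-0.1 * x ** 2)

def local_maxima(f, lo=-5.0, hi=5.0, n=1001):
    """Grid scan for interior local maxima; each becomes a design point
    whose (expensive) integrand evaluation can then run in parallel."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    return [xs[i] for i in range(1, n - 1)
            if ys[i] > ys[i - 1] and ys[i] > ys[i + 1]]

batch = local_maxima(toy_acquisition)  # one batch of parallel design points
```

The paper finds these maxima with multimodal optimizers rather than a grid scan, but the output is the same kind of object: several design points obtained in one run instead of one point per sequential iteration.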

Findings

The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are then demonstrated and compared with the k-means clustering method by using two numerical benchmarks and two engineering examples.

Originality/value

The multimodal behavior of the acquisition function for BC is comprehensively investigated. All local maxima of the acquisition function contribute to adaptive BC accuracy. Parallelization of adaptive BC is realized with four multimodal optimization methods.

Details

Engineering Computations, vol. 41 no. 2
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 24 November 2023

Vikas Ghute and Mahesh Deshpande

Abstract

Purpose

The paper aims to identify the effect of ignoring correlatedness among process observations and to implement two new sampling schemes, skip sampling and mixed sampling, to reduce the effect of autocorrelation on the process capability index (PCI) Cpm.

Design/methodology/approach

Autocorrelated observations are generated from an autoregressive process of order two (AR(2)) using Monte Carlo simulation. The PCI is computed from these observations under an assumption of independence. The skip and mixed sampling schemes are then used to form subgroups among the correlated observations, and the PCIs obtained from these subgroups are assessed using the sample mean and sample standard deviation.
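A minimal version of this experiment — AR(2) data, then a skip-sampling subsample to weaken the autocorrelation before computing Cpm — might look like the following. The AR coefficients and specification limits are invented for illustration.

```python
import random

def ar2(n, phi1=0.6, phi2=0.2, seed=9):
    """Generate a stationary AR(2) series (coefficients invented)."""
    rng, x = random.Random(seed), [0.0, 0.0]
    for _ in range(n):
        x.append(phi1 * x[-1] + phi2 * x[-2] + rng.gauss(0, 1))
    return x[2:]

def cpm(data, lsl, usl, target):
    """Cpm penalizes both spread and off-target centering via tau."""
    tau2 = sum((v - target) ** 2 for v in data) / len(data)
    return (usl - lsl) / (6 * tau2 ** 0.5)

series = ar2(5_000)
full = cpm(series, lsl=-6, usl=6, target=0.0)       # correlated observations
skip = cpm(series[::5], lsl=-6, usl=6, target=0.0)  # skip sampling: every 5th
```

Skip sampling leaves the marginal spread unchanged but makes successive observations closer to independent, which is what stabilizes the sampling behavior of the Cpm estimate.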

Findings

The paper provides empirical insight into the effect of autocorrelation on the estimated value of the PCI Cpm and how that effect can be reduced. The new skip and mixed sampling schemes reduce the effect of autocorrelation on Cpm estimates.

Originality/value

This paper fulfills an identified need to study how to reduce the effect of autocorrelation on PCI Cpm.

Details

International Journal of Quality & Reliability Management, vol. 41 no. 4
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 1 August 2023

Rafaela Aparecida Mendonça Marques, Aline Cristina Maciel, Antonio Fernando Branco Costa and Kleber Roberto da Silva Santos

Abstract

Purpose

This study investigates the mixed repetitive sampling (MRS) plan based on the Cpk index proposed by Aslam et al. (2013a). They were the first to study the MRS plan, but they overlooked the fact that submitting to variable inspection a sample that was first submitted to attribute inspection truncates the X observations. In addition, they did not work with an accurate expression for calculating the probabilities of the Cpk statistic.

Design/methodology/approach

The authors reproduced the results of the original sampling plan through Monte Carlo simulation and derived the theoretical results of the plan for the case where the sample submitted to variable inspection is no longer the same one submitted to attribute inspection.
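The Monte Carlo check behind such a study can be sketched generically: draw repeated samples from a process, compute the Cpk statistic and estimate the lot-acceptance probability. The plan parameters below (n and the acceptance constant k) are invented, not those of Aslam et al.

```python
import random

def p_accept(mu, sigma, lsl, usl, n=50, k=1.0, trials=5_000, seed=13):
    """Estimate P(accept lot) for a variables plan that accepts when the
    sample Cpk meets the acceptance constant k. Parameters are illustrative."""
    rng, accepted = random.Random(seed), 0
    for _ in range(trials):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        m = sum(sample) / n
        s = (sum((v - m) ** 2 for v in sample) / (n - 1)) ** 0.5
        cpk = min(usl - m, m - lsl) / (3 * s)
        accepted += cpk >= k
    return accepted / trials

good = p_accept(mu=0.0, sigma=1.0, lsl=-4, usl=4)  # capable process
bad = p_accept(mu=0.0, sigma=1.6, lsl=-4, usl=4)   # marginal process
```

Comparing simulated acceptance probabilities like these against a plan's advertised α and β risks is exactly how miscalculated risks of the kind criticized in this article come to light.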

Findings

The β risks of the optimum sampling plans presented by Aslam et al. (2013a) are quite high, exceeding 46% on average. The same problem also appears in Saminathan and Mahalingam (2018), Balamurali (2020) and Balamurali et al. (2020), where the β risks of the proposed sampling plans are even higher.

Originality/value

In terms of originality, the authors make the following point: it is easy to propose new sampling plans, but pointless if one cannot correctly obtain their properties. Miscalculated sampling-plan risks are dangerous; imagine a situation where bad lots are accepted more than 50% of the time simply because the plan was incorrectly designed. It is, however, a big deal to warn that this type of problem is arising in a growing number of papers. The authors of this study are the first to show that many studies on sampling plans urgently need to be revised.

Details

International Journal of Quality & Reliability Management, vol. 41 no. 2
Type: Research Article
ISSN: 0265-671X

Keywords

Content available
Book part
Publication date: 18 January 2024

Abstract

Details

Artificial Intelligence, Engineering Systems and Sustainable Development
Type: Book
ISBN: 978-1-83753-540-8

Article
Publication date: 11 October 2023

Xiongming Lai, Yuxin Chen, Yong Zhang and Cheng Wang

Abstract

Purpose

The paper proposes a fast procedure for solving reliability-based robust design optimization (RBRDO) by modifying the RBRDO formulation and transforming it into a series of RBRDO subproblems. For each subproblem, the objective function, constraint functions and reliability index are approximated using Taylor series expansion; the approximate forms depend on the deterministic design vector rather than the random vector, so the uncertainty estimation in the inner loop of the RBRDO can be avoided. In this way, the number of performance-function evaluations is greatly reduced. Lastly, the trust region method is used to manage the sequential RBRDO subproblems for convergence.

Design/methodology/approach

As is known, RBRDO is a nested optimization in which the outer loop updates the design vector and the inner loop estimates the uncertainties, so solving an RBRDO requires a large number of performance-function evaluations. Aiming at this issue, the paper proposes a fast integrated procedure that reduces the number of evaluations. First, it transforms the original RBRDO problem into a series of RBRDO subproblems. In each subproblem, the objective function, constraint functions and reliability index are approximated by simple explicit functions that depend solely on the deterministic design vector rather than the random vector. The need for extensive sampling in the inner loop is thereby greatly reduced, and with it the computation cost. The trust region method is then employed to handle the sequential RBRDO subproblems, ensuring convergence to the optimal solutions. Finally, an engineering test and an application are presented to illustrate the effectiveness and efficiency of the proposed methods.
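The central approximation step — replacing inner-loop uncertainty propagation with a Taylor expansion around the design point — can be illustrated on a toy performance function. The function and noise level below are invented; the sketch only shows why the expansion collapses the inner loop.

```python
import random

def g(d, x):
    """Toy performance function of design variable d and random input x."""
    return d ** 2 + 3 * d * x + x ** 2

def taylor_mean_std(d, mu_x=0.0, sigma_x=0.1, h=1e-5):
    """First-order estimate: mean ~ g(d, mu_x), std ~ |dg/dx| * sigma_x.
    No sampling needed, so the inner loop costs two evaluations."""
    dg_dx = (g(d, mu_x + h) - g(d, mu_x - h)) / (2 * h)
    return g(d, mu_x), abs(dg_dx) * sigma_x

def mc_mean_std(d, mu_x=0.0, sigma_x=0.1, trials=20_000, seed=17):
    """Brute-force sampling reference for the same two quantities."""
    rng = random.Random(seed)
    vals = [g(d, rng.gauss(mu_x, sigma_x)) for _ in range(trials)]
    m = sum(vals) / trials
    return m, (sum((v - m) ** 2 for v in vals) / trials) ** 0.5
```

For small input noise the two estimates agree closely, while the Taylor version costs a handful of function evaluations instead of thousands — the source of the speed-up the paper claims.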

Findings

The proposed fast procedure greatly reduces the number of performance-function evaluations within the RBRDO, and the computation cost is saved accordingly, which makes the method suitable for engineering applications.

Originality/value

The standard deviation of the original RBRDO objective function is replaced by the mean and the reliability index of the original objective function, which are further approximated using Taylor series expansion; these approximate forms depend on the deterministic design vector rather than the random vector. The constraint functions are likewise approximated using Taylor series expansion. In this way, the uncertainty estimation of the performance functions (i.e. the mean of the objective function and the constraint functions) and of the reliability index of the objective function is avoided within the inner loop of the RBRDO.

Details

International Journal of Structural Integrity, vol. 14 no. 6
Type: Research Article
ISSN: 1757-9864

Keywords
