Search results

1 – 10 of 59
Article
Publication date: 5 April 2024

Fangqi Hong, Pengfei Wei and Michael Beer

Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and…

Abstract

Purpose

Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the Posterior Variance Contribution (PVC) function, have been developed for adaptive experimental design of the integration points. However, those sequential design strategies also prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve the computational efficiency.

Design/methodology/approach

By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all make an important contribution to the integration accuracy and can be selected as design points, providing a practical way to parallelize the adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are introduced to find multiple local maxima of the PVC function in a single run and hence to implement the adaptive BC in parallel.
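For illustration, the sketch below shows the generic multi-start idea behind this kind of parallelization: several local maxima of a toy acquisition surface are located by repeated local optimization and kept as a batch of candidate design points. The surface, domain and separation tolerance are illustrative assumptions, not the paper's PVC function.

```python
# A minimal multi-start sketch (not the paper's PVC function): locate several
# local maxima of a toy acquisition surface, then keep the distinct ones as a
# batch of candidate design points that could be evaluated in parallel.
import numpy as np
from scipy.optimize import minimize

def acquisition(x):
    # Toy multimodal surface standing in for an acquisition function.
    return np.sin(3 * x[0]) * np.cos(2 * x[1]) * np.exp(-0.1 * (x[0] ** 2 + x[1] ** 2))

rng = np.random.default_rng(0)
starts = rng.uniform(-3, 3, size=(40, 2))            # random restarts over the domain
maxima = []
for x0 in starts:
    res = minimize(lambda x: -acquisition(x), x0, bounds=[(-3, 3), (-3, 3)])
    if res.success and all(np.linalg.norm(res.x - m) > 0.3 for m in maxima):
        maxima.append(res.x)                          # keep only well-separated maxima

batch = np.array(maxima)                              # one parallel batch of design points
print(batch)
```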

Findings

The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are then demonstrated and compared with the k-means clustering method by using two numerical benchmarks and two engineering examples.

Originality/value

The multimodal behavior of the acquisition function for BC is comprehensively investigated. All the local maxima of the acquisition function contribute to the adaptive BC accuracy. Parallelization of adaptive BC is realized with four multimodal optimization methods.

Details

Engineering Computations, vol. 41 no. 2
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 10 October 2023

Sou-Sen Leu, Yen-Lin Fu and Pei-Lin Wu

This paper aims to develop a dynamic civil facility degradation prediction model to forecast the reliability performance trend and remaining useful life under imperfect…

Abstract

Purpose

This paper aims to develop a dynamic civil facility degradation prediction model to forecast the reliability performance trend and remaining useful life under imperfect maintenance, based on inspection records and maintenance actions.

Design/methodology/approach

A real-time hidden Markov model (HMM) is proposed in this paper to predict the reliability performance trend and remaining useful life under imperfect maintenance based on rare failure events. The model assumes a Poisson arrival pattern for the occurrence of facility failure events. The HMM is further adopted to establish the transition probabilities among degradation stages. Finally, simulation inference is conducted using a particle filter (PF) to estimate the most probable model parameters. Water seals at the spillway hydraulic gate of a reservoir in Taiwan are used to examine the appropriateness of the approach.
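As an illustration of the filtering step, the sketch below runs a bootstrap particle filter on an assumed degradation model in which a latent defect rate drifts upward and each inspection returns a Poisson-distributed defect count; the model, priors and data are illustrative assumptions, not the paper's calibrated specification.

```python
# A minimal bootstrap particle filter sketch under an assumed degradation model:
# a latent defect rate drifts upward over time, and the observed defect count at
# each inspection is Poisson with that rate.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
T, n_particles = 30, 2000
true_rate = np.cumsum(rng.gamma(0.05, 1.0, size=T)) + 0.1       # slowly growing true rate
obs = rng.poisson(true_rate)                                     # rare defect counts

particles = np.full(n_particles, 0.1)                            # initial degradation level
estimates = []
for y in obs:
    particles = particles + rng.gamma(0.05, 1.0, size=n_particles)   # propagate (random drift)
    weights = poisson.pmf(y, particles)                               # likelihood of observed count
    weights = weights / weights.sum()
    idx = rng.choice(n_particles, size=n_particles, p=weights)        # resample
    particles = particles[idx]
    estimates.append(particles.mean())                                # filtered degradation estimate

print(np.round(estimates, 2))
```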

Findings

The defect probability trends produced by the real-time HMM model are highly consistent with the real defect trend pattern of civil facilities. The proposed facility degradation prediction model can provide the maintenance division with early warning of potential failure to establish a proper proactive maintenance plan, even under the condition of rare defects.

Originality/value

This model is a new method of civil facility degradation prediction under imperfect maintenance, even with rare failure events. It overcomes several limitations of classical failure pattern prediction approaches and can reliably simulate the occurrence of rare defects under imperfect maintenance and the effect of inspection reliability caused by human error. Based on the degradation trend pattern prediction, effective maintenance management plans can be practically implemented to minimize the frequency of the occurrence and the consequence of civil facility failures.

Details

Journal of Quality in Maintenance Engineering, vol. 30 no. 1
Type: Research Article
ISSN: 1355-2511


Article
Publication date: 27 November 2023

Velmurugan Kumaresan, S. Saravanasankar and Gianpaolo Di Bona

Through the use of the Markov Decision Model (MDM) approach, this study uncovers significant variations in the availability of machines in both faulty and ideal situations in…

Abstract

Purpose

Through the use of the Markov Decision Model (MDM) approach, this study uncovers significant variations in the availability of machines in both faulty and ideal situations in small and medium-sized enterprises (SMEs). First-order differential equations are used to construct the mathematical equations from the transition-state diagrams of the separate subsystems in a critical-part manufacturing plant.

Design/methodology/approach

To obtain the lowest investment cost, a non-traditional optimization strategy is employed for maintenance operations in SMEs in this research: the particle swarm optimization (PSO) algorithm is used to optimize machine maintenance parameters and find the best solutions, thereby introducing the best decision-making process for optimal maintenance and service operations.
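The sketch below illustrates a basic PSO loop on an assumed two-variable maintenance-cost function (maintenance interval and spare-part stock); the cost model and PSO coefficients are illustrative assumptions, not the paper's plant data.

```python
# A minimal particle swarm optimization (PSO) sketch on a toy maintenance-cost function.
import numpy as np

def cost(x):
    # Toy trade-off: x[0] = preventive-maintenance interval, x[1] = spare-part stock.
    downtime = 50.0 / x[0] + 0.2 * x[0]       # too-rare and too-frequent maintenance both cost
    inventory = 0.5 * x[1] + 30.0 / x[1]
    return downtime + inventory

rng = np.random.default_rng(2)
n, dim, iters = 30, 2, 200
lo, hi = np.array([1.0, 1.0]), np.array([50.0, 50.0])
pos = rng.uniform(lo, hi, size=(n, dim))
vel = np.zeros((n, dim))
pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)   # inertia + pulls
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best parameters:", gbest, "cost:", cost(gbest))
```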

Findings

The major goal of this study is to identify critical subsystems in manufacturing plants and to use an optimal decision-making process to adopt the best maintenance management system in the industry. The optimal findings of this proposed method demonstrate that in problematic conditions, the availability of SME machines can be enhanced by up to 73.25%, while in an ideal situation, the system's availability can be increased by up to 76.17%.

Originality/value

The proposed optimal decision-support system for preventive maintenance management in SMEs is based on these findings, and it aims to achieve maximum productivity with the least expenditure on maintenance and service through an optimal planning and scheduling process.

Details

Journal of Quality in Maintenance Engineering, vol. 30 no. 1
Type: Research Article
ISSN: 1355-2511


Article
Publication date: 18 January 2024

Zaihua Luo, Juliang Xiao, Sijiang Liu, Mingli Wang, Wei Zhao and Haitao Liu

This paper aims to propose a dynamic parameter identification method based on sensitivity analysis for 5-degree-of-freedom (DOF) hybrid robots, to solve the problems of too…

Abstract

Purpose

This paper aims to propose a dynamic parameter identification method based on sensitivity analysis for 5-degree-of-freedom (DOF) hybrid robots, to solve the problems of too many identification parameters, overly complex models, poor convergence of optimization algorithms and the tendency to fall into locally optimal solutions, and to improve the efficiency and accuracy of dynamic parameter identification.

Design/methodology/approach

First, the dynamic parameter identification model of the 5-DOF hybrid robot was established based on the principle of virtual work. Then, the sensitivity of the parameters to be identified was analyzed by Sobol’s sensitivity method and verified by simulation. Finally, an identification strategy based on the sensitivity analysis was designed, experiments were carried out on the real robot and the results were verified.
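As an illustration of the sensitivity step, the sketch below estimates first-order Sobol indices with the pick-freeze (Saltelli-type) estimator on a toy three-parameter model; the model and sampling ranges are assumptions standing in for the robot's actual dynamic identification model.

```python
# A minimal sketch of first-order Sobol sensitivity indices via the pick-freeze
# (Saltelli-style) estimator on a toy model with one strong, one weak and one
# inert parameter.
import numpy as np

def model(X):
    # Toy stand-in for a joint-torque prediction.
    return 5.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.0 * X[:, 2]

rng = np.random.default_rng(3)
N, d = 20000, 3
A = rng.uniform(-1, 1, size=(N, d))
B = rng.uniform(-1, 1, size=(N, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # replace only the i-th column
    fABi = model(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var    # first-order index estimator
    print(f"parameter {i}: first-order Sobol index ~ {S_i:.2f}")
```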

Findings

Compared with the traditional full-parameter identification method, the dynamic parameter identification method based on sensitivity analysis proposed in this paper converges faster when optimized using the genetic algorithm, and the identified dynamic model has higher prediction accuracy for joint drive forces and torques than the full-parameter identification models.

Originality/value

This work analyzes the sensitivity of the parameters to be identified in the dynamic parameter identification model for the first time. Then a parameter identification method is proposed based on the results of the sensitivity analysis, which can effectively reduce the parameters to be identified, simplify the identification model, accelerate the convergence of the optimization algorithm and improve the prediction accuracy of the identified model for the joint driving forces and torques.

Details

Industrial Robot: the international journal of robotics research and application, vol. 51 no. 2
Type: Research Article
ISSN: 0143-991X


Open Access
Article
Publication date: 22 March 2024

Geming Zhang, Lin Yang and Wenxiang Jiang

The purpose of this study is to introduce the top-level design ideas and the overall architecture of the earthquake early-warning system for high-speed railways in China, which is…

Abstract

Purpose

The purpose of this study is to introduce the top-level design ideas and the overall architecture of the earthquake early-warning system for high-speed railways in China, which is based on P-wave earthquake early warning and multiple means of rapid emergency treatment.

Design/methodology/approach

The paper describes the key technologies involved in the development of the system, such as P-wave identification and earthquake early warning, multi-source seismic information fusion and earthquake emergency treatment. The paper also presents the test results of the system, which show that it has complete functions and that its major performance indicators meet the design requirements.
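The abstract does not reproduce the system's P-wave identification algorithm; as a generic illustration only, the sketch below implements a short-term-average/long-term-average (STA/LTA) trigger, a common P-wave detection scheme, on simulated data.

```python
# A minimal STA/LTA trigger sketch on simulated ground-motion data; the sampling
# rate, window lengths and threshold are illustrative assumptions, not the
# operational system's settings.
import numpy as np

rng = np.random.default_rng(4)
fs = 100                                        # sampling rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)
signal = 0.05 * rng.standard_normal(t.size)     # background noise
onset = t >= 12                                  # P-wave arrives at t = 12 s
signal[onset] += 0.6 * np.sin(2 * np.pi * 5 * (t[onset] - 12)) * np.exp(-0.3 * (t[onset] - 12))

energy = signal ** 2
sta_win, lta_win = int(0.5 * fs), int(10 * fs)   # 0.5 s short window, 10 s long window
sta = np.convolve(energy, np.ones(sta_win) / sta_win, mode="same")
lta = np.convolve(energy, np.ones(lta_win) / lta_win, mode="same")
ratio = sta / (lta + 1e-12)

trigger = np.argmax(ratio > 4.0)                 # first sample exceeding the threshold
print(f"P-wave trigger at t = {t[trigger]:.2f} s" if ratio[trigger] > 4.0 else "no trigger")
```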

Findings

The study demonstrates that the high-speed railway earthquake early-warning system serves as an important technical tool for high-speed railways to cope with the threat earthquakes pose to operational safety. The key technical indicators of the system show excellent performance: the first report time of the P-wave is less than three seconds; from the first arrival of the P-wave to the beginning of train braking, the total delay of onboard emergency treatment is 3.63 seconds at 95% probability; and the average total delay for power failures triggered by substations is 3.3 seconds.

Originality/value

The paper provides a valuable reference for the research and development of earthquake early-warning systems for high-speed railways in other countries and regions. It also contributes to earthquake prevention and disaster reduction efforts.

Book part
Publication date: 5 April 2024

Alecos Papadopoulos

The author develops a bilateral Nash bargaining model under value uncertainty and private/asymmetric information, combining ideas from axiomatic and strategic bargaining theory…

Abstract

The author develops a bilateral Nash bargaining model under value uncertainty and private/asymmetric information, combining ideas from axiomatic and strategic bargaining theory. The solution to the model leads organically to a two-tier stochastic frontier (2TSF) setup with intra-error dependence. The author presents two different statistical specifications to estimate the model: one that accounts for regressor endogeneity using copulas, the other able to identify the bargaining-power and private-information effects separately at the individual level. An empirical application using a matched employer–employee data set (MEEDS) from Zambia and a second using one from Ghana showcase the applied potential of the approach.
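As a minimal illustration of the 2TSF structure such a bargaining model leads to, the sketch below simulates an outcome equation with a symmetric noise term plus two opposing one-sided (exponential) terms; the functional form, independence and parameter values are illustrative assumptions, not the chapter's estimated specification (which allows intra-error dependence).

```python
# A minimal simulation sketch of a generic two-tier stochastic frontier (2TSF)
# outcome equation y = x*b + v + w - u, with w and u one-sided terms pulling the
# outcome above and below the frontier.
import numpy as np

rng = np.random.default_rng(5)
n = 5000
x = rng.normal(size=n)                     # a single observed covariate
v = rng.normal(0, 0.3, size=n)             # symmetric noise
w = rng.exponential(0.4, size=n)           # e.g. worker bargaining-power surplus (assumed)
u = rng.exponential(0.6, size=n)           # e.g. employer information advantage (assumed)
y = 1.0 + 0.8 * x + v + w - u              # observed (log) outcome

# The sign of the composite one-sided effect shows who captures the surplus on average.
print("mean of w - u:", (w - u).mean())    # roughly 0.4 - 0.6 = -0.2 here
```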

Book part
Publication date: 4 March 2024

Oswald A. J. Mascarenhas, Munish Thakur and Payal Kumar

We revisit the problem of redesigning the Master in Business Administration (MBA) program, curriculum, and pedagogy, focusing on understanding and seeking to tame its “wicked…

Abstract

Executive Summary

We revisit the problem of redesigning the Master in Business Administration (MBA) program, curriculum, and pedagogy, focusing on understanding and seeking to tame its “wicked problems,” as an intrinsic part and challenge of the MBA program venture, and to render it more realistic and relevant in addressing major problems and their consequences. We briefly review the theory of wicked problems and methods of dealing with their consequences from multiple perspectives. Most characterizations of problems classify them as simple (problems that have known formulations and solutions), complex (where formulations are known but not their resolutions), unstructured (where formulations are unknown, but solutions are estimated), and “wicked” (where both problem formulations and their resolutions are unknown but eventually partially tamable). Uncertainty, unpredictability, randomness, and ambiguity increase from simple to complex to unstructured to wicked problems. A redesigned MBA program should therefore address them effectively across the four semesters of its two years. Most of these problems are real and affect lives and economies, and hence, business schools cannot but incorporate them into their critical, ethical, and moral thinking.

Details

A Primer on Critical Thinking and Business Ethics
Type: Book
ISBN: 978-1-83753-312-1

Article
Publication date: 8 March 2024

Satyajit Mahato and Supriyo Roy

Managing project completion within the stipulated time is significant to all firms' sustainability. Especially for software start-up firms, it is of utmost importance. For any…

Abstract

Purpose

Managing project completion within the stipulated time is significant to all firms' sustainability, and for software start-up firms it is of utmost importance. For any schedule variation, these firms must spend 25 to 40 percent of the development cost on reworking quality defects. Significantly, the existing literature does not address defect rework opportunities from a quality perspective among Indian IT start-ups. The present study aims to fill this niche by proposing a unique mathematical model of the defect rework process aligned with the Six Sigma quality approach.

Design/methodology/approach

An optimization model was formulated comprising two objectives: rework “time” and rework “cost.” A relevant case study was developed, and the model was solved in MATLAB using an elitist Non-dominated Sorting Genetic Algorithm (NSGA-II).
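As a language-agnostic illustration of the bi-objective trade-off (the study itself used MATLAB and NSGA-II), the sketch below extracts the non-dominated (Pareto-optimal) set from candidate rework plans under an assumed time-cost model; it shows only the dominance step that NSGA-II builds on, not the full algorithm.

```python
# A minimal sketch of Pareto (non-dominated) filtering for bi-objective rework
# planning; the time/cost model is an illustrative assumption, not the paper's.
import numpy as np

rng = np.random.default_rng(6)
effort = rng.uniform(1, 10, size=200)                 # candidate rework effort levels
time = 40.0 / effort + rng.normal(0, 0.5, size=200)   # more effort -> shorter rework time
cost = 3.0 * effort + rng.normal(0, 0.5, size=200)    # more effort -> higher rework cost
objs = np.column_stack([time, cost])                  # both objectives to be minimized

def is_dominated(p, others):
    # p is dominated if some other point is no worse in both objectives and better in one.
    return np.any(np.all(others <= p, axis=1) & np.any(others < p, axis=1))

pareto = np.array([not is_dominated(objs[i], np.delete(objs, i, axis=0))
                   for i in range(len(objs))])
front = objs[pareto]
print(f"{pareto.sum()} non-dominated time/cost plans out of {len(objs)}")
print(front[np.argsort(front[:, 0])][:5])             # a few plans, sorted by time
```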

Findings

The output of the proposed approach reduced the “time” by 31 percent at a minimum “cost”. The derived “Pareto Optimal” front can be used to estimate the “cost” for a pre-determined rework “time” and vice versa, thus adding value to the existing literature.

Research limitations/implications

This work deployed a decision tree for defect prediction, a technique often criticized for overfitting; this is one limitation of the paper. In addition, the predicted defect count has not been compared with other prediction models. NSGA-II was applied to solve the optimization problem; however, the optimal results obtained have yet to be compared with those of other algorithms. Further study is envisaged.

Practical implications

The Pareto front provides an effective visual aid for managers to compare multiple strategies to decide the best possible rework “cost” and “time” for their projects. It is beneficial for cost-sensitive start-ups to estimate the rework “cost” and “time” to negotiate with their customers effectively.

Originality/value

This paper proposes a novel quality management framework under the Six Sigma approach, which integrates optimization of critical metrics. As part of this study, a unique mathematical model of the software defect rework process was developed (combined with the proposed framework) to obtain the optimal solution for the perennial problem of schedule slippage in the rework process of software development.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X


Open Access
Article
Publication date: 29 April 2024

Dada Zhang and Chun-Hsing Ho

The purpose of this paper is to investigate the effect of vehicle-based sensor placement and pavement temperature on road condition assessment, as well as to compute a threshold value for the…

Abstract

Purpose

The purpose of this paper is to investigate the effect of vehicle-based sensor placement and pavement temperature on road condition assessment, as well as to compute a threshold value for the classification of pavement conditions.

Design/methodology/approach

Four sensors were placed on the vehicle’s control arms and one inside the vehicle to collect vibration acceleration data for analysis. Analysis of variance (ANOVA) tests were performed to diagnose the effect of the vehicle-based sensors’ placement in the field. To classify road conditions and identify pavement distress (points of interest), a probability distribution was fitted to the magnitude values of the vibration data.
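The sketch below illustrates these two analysis steps on simulated data: a one-way ANOVA across sensor placements and a simple magnitude threshold for flagging points of interest. The data and the 1.7 g value used here are illustrative stand-ins, not the project's measurements.

```python
# A minimal sketch: one-way ANOVA across sensor placements, then threshold-based
# flagging of potential distress points, on simulated vibration magnitudes (in g).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)
front_left = rng.normal(1.0, 0.2, size=500)
front_right = rng.normal(1.0, 0.2, size=500)
rear_left = rng.normal(1.2, 0.2, size=500)

stat, p = f_oneway(front_left, front_right, rear_left)
print(f"ANOVA across placements: F = {stat:.1f}, p = {p:.3g}")

threshold = 1.7                                  # reference value, assumed here
magnitudes = np.concatenate([front_left, front_right, rear_left])
flagged = np.flatnonzero(magnitudes > threshold)
print(f"{flagged.size} samples exceed {threshold} g")
```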

Findings

Results from the ANOVA indicate that the pavement sensing patterns from the sensors placed on the front control arms were statistically significant and that there is no difference between sensors placed on the same side of the vehicle (e.g., left or right side). A reference threshold (1.7 g) was computed using the distribution-fitting method to classify road conditions and identify road distress based on magnitude values that combine the acceleration along all three axes. In addition, the pavement temperature was found to be highly correlated with the sensing patterns, which is noteworthy for future projects.

Originality/value

The paper investigates the effect of pavement sensors’ placement in assessing road conditions, emphasizing the implications for future road condition assessment projects. A threshold value for classifying road conditions was proposed and applied in class assignments (I-17 highway projects).

Details

Built Environment Project and Asset Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2044-124X


Open Access
Article
Publication date: 15 December 2023

Chon Van Le and Uyen Hoang Pham

This paper aims mainly to introduce applied statisticians and econometricians to current research methodology for non-Euclidean data sets. Specifically, it provides the…

Abstract

Purpose

This paper aims mainly to introduce applied statisticians and econometricians to current research methodology for non-Euclidean data sets. Specifically, it provides the basis and rationale for statistics in Wasserstein space, where the metric on probability measures is taken to be a Wasserstein metric arising from optimal transport theory.

Design/methodology/approach

The authors spell out the basis and rationale for using Wasserstein metrics on the data space of (random) probability measures.
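As a minimal illustration of the basic metric involved, the sketch below computes the one-dimensional Wasserstein-1 distance between two empirical samples; the samples are arbitrary and not drawn from the paper's applications.

```python
# A minimal sketch of the 1-D Wasserstein-1 distance between two empirical
# distributions, the basic quantity the methodology builds on.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(8)
sample_a = rng.normal(0.0, 1.0, size=1000)      # empirical measure A
sample_b = rng.normal(0.5, 1.5, size=1000)      # empirical measure B

d = wasserstein_distance(sample_a, sample_b)
print(f"W1 distance between the two empirical measures: {d:.3f}")
```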

Findings

In elaborating the new statistical analysis of non-Euclidean data sets, the paper illustrates the generalization of traditional aspects of statistical inference following Fréchet's program.

Originality/value

Besides the elaboration of research methodology for a new data analysis, the paper discusses the applications of Wasserstein metrics to the robustness of financial risk measures.
