Search results

1 – 10 of 266
Article
Publication date: 5 April 2024

Fangqi Hong, Pengfei Wei and Michael Beer

Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and…

Abstract

Purpose

Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the Posterior Variance Contribution (PVC) function, have been developed for adaptive experimental design of the integration points. However, those sequential design strategies also prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve the computational efficiency.

Design/methodology/approach

By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all contribute significantly to the integration accuracy and can be selected as design points, providing a practical way to parallelize the adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are then introduced for finding multiple local maxima of the PVC function in one run, and hence for parallel implementation of the adaptive BC.

Findings

The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are then demonstrated and compared with the k-means clustering method by using two numerical benchmarks and two engineering examples.

Originality/value

The multimodal behavior of the acquisition function for BC is comprehensively investigated. All the local maxima of the acquisition function contribute to the accuracy of adaptive BC. Parallelization of adaptive BC is realized with four multimodal optimization methods.

Details

Engineering Computations, vol. 41 no. 2
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 6 November 2023

Zhiying Wang and Hongmei Jia

Forecasting the demand for emergency supplies under major epidemics plays a vital role in improving rescue efficiency. Few studies have combined intuitionistic fuzzy sets with…

Abstract

Purpose

Forecasting the demand for emergency supplies under major epidemics plays a vital role in improving rescue efficiency. Few studies have combined intuitionistic fuzzy sets with the grey-Markov method and applied the result to the prediction of emergency supplies demand. Therefore, this article aims to establish a novel method for forecasting the demand for emergency supplies under major epidemics.

Design/methodology/approach

Emergency supplies demand is correlated with the number of infected cases in need of relief services. First, a novel method called the Intuitionistic Fuzzy TPGM(1,1)-Markov Method (IFTPGMM) is proposed and used to forecast the number of infected cases. Then, the demand for emergency supplies is predicted using a method based on safety inventory theory, according to the numbers predicted by IFTPGMM. Finally, to demonstrate the effectiveness of the proposed method, a comparative analysis is conducted between IFTPGMM and four other methods.
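The TPGM(1,1) core builds on the classical GM(1,1) grey model. A minimal sketch of plain GM(1,1), without the intuitionistic fuzzy and Markov extensions described above, looks like this:

```python
import math

def gm11_forecast(series, horizon):
    """Classical GM(1,1) grey forecast: fit a first-order grey ODE to a
    short series, then extrapolate `horizon` steps ahead."""
    n = len(series)
    # 1-AGO: accumulated generating operation x1(k) = sum of x0(1..k)
    x1, s = [], 0.0
    for v in series:
        s += v
        x1.append(s)
    # Background values: consecutive means of the accumulated series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = series[1:]
    m = len(z)
    # Least-squares solution of y(k) = -a*z(k) + b via 2x2 normal equations
    szz, sz = sum(v * v for v in z), sum(z)
    szy, sy = sum(zi * yi for zi, yi in zip(z, y)), sum(y)
    a = (sz * sy - m * szy) / (m * szz - sz * sz)
    b = (sy + sz * a) / m
    # Time response: x1(k) = (x0(1) - b/a) * exp(-a*(k-1)) + b/a, k 1-indexed
    c = series[0] - b / a
    x1_hat = lambda k: c * math.exp(-a * (k - 1)) + b / a
    # Restore the original scale by differencing the accumulated forecasts
    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(1, horizon + 1)]
```

Grey models of this family are designed for exactly the small-sample setting the article targets: a handful of observations is enough to fit the two parameters.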

Findings

The results show that IFTPGMM demonstrates superior predictive performance compared to four other methods. The integration of the grey method and intuitionistic fuzzy set has been shown to effectively handle uncertain information and enhance the accuracy of predictions.

Originality/value

The main contribution of this article is to propose a novel method for emergency supplies demand forecasting under major epidemics. The proposed method combines the benefits of the grey method for handling small sample sizes and of intuitionistic fuzzy sets for handling uncertain information. It not only enhances the existing grey method but also expands the methodologies used for forecasting the demand for emergency supplies.

Highlights (for review)

  1. An intuitionistic fuzzy TPGM(1,1)-Markov method (IFTPGMM) is proposed.

  2. The safety inventory theory is combined with IFTPGMM to construct a prediction method.

  3. Asymptomatic infected cases are taken to forecast the demand for emergency supplies.


Details

Grey Systems: Theory and Application, vol. 14 no. 1
Type: Research Article
ISSN: 2043-9377


Article
Publication date: 22 March 2024

Sanaz Khalaj Rahimi and Donya Rahmani

The study aims to optimize truck routes by minimizing social and economic costs. It introduces a strategy involving diverse drones and their potential for reuse at DNs based on…


Abstract

Purpose

The study aims to optimize truck routes by minimizing social and economic costs. It introduces a strategy involving diverse drones and their potential for reuse at DNs based on flight range. In HTDRP-DC, trucks can select and transport various drones to LDs to reduce deprivation time. This study approximates the nonlinear deprivation cost function with a two-piece linear function, leading to MILP formulations. A heuristic-based Benders Decomposition approach is implemented to address medium and large instances. Valid inequalities and a heuristic method tighten the convergence bounds, ensuring an efficient solution methodology.
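The two-piece linearization of the nonlinear deprivation cost can be sketched as follows. The exponential cost shape, the rate `gamma` and the breakpoint are illustrative assumptions, not the paper's calibration:

```python
import math

def deprivation_cost(t, gamma=0.1):
    """Hypothetical convex deprivation cost, growing with deprivation time t (hours)."""
    return math.exp(gamma * t) - 1.0

def two_piece_approx(f, t_max, breakpoint):
    """Two-piece linear interpolant of f on [0, t_max] with one breakpoint.
    For a convex f the chords lie above the curve, so the approximation
    never underestimates the cost while keeping the MILP objective linear."""
    y0, yb, ym = f(0.0), f(breakpoint), f(t_max)
    s1 = (yb - y0) / breakpoint
    s2 = (ym - yb) / (t_max - breakpoint)
    def g(t):
        if t <= breakpoint:
            return y0 + s1 * t
        return yb + s2 * (t - breakpoint)
    return g
```

In the MILP each segment becomes one linear term plus a selection constraint, which is what makes the deprivation cost tractable inside a Benders Decomposition scheme.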

Design/methodology/approach

Research has yet to address two critical factors in disaster logistics simultaneously: minimizing social and economic costs and using drones in relief distribution. Deprivation, as a social cost, measures the human suffering caused by a shortage of relief supplies. The proposed hybrid truck-drone routing problem minimizing deprivation cost (HTDRP-DC) involves distributing relief supplies to dispersed demand nodes with undamaged (LDs) or damaged (DNs) access roads, using multiple trucks and diverse drones. A Benders Decomposition approach is enhanced with accelerating techniques.

Findings

Incorporating deprivation and economic costs results in selecting optimal routes, effectively reducing the time required to assist affected areas. Additionally, employing various drone types and reusing them at damaged nodes reduces deprivation time and the associated deprivation costs. The study employs valid inequalities and a heuristic method to solve the master problem, substantially reducing computational time and iterations compared to GAMS and the classical Benders Decomposition algorithm. The proposed heuristic-based Benders Decomposition approach is applied to a disaster scenario in Tehran, demonstrating efficient solutions for the HTDRP-DC in terms of computational time and convergence rate.

Originality/value

This research introduces the HTDRP-DC problem, which minimizes deprivation costs by considering the vehicle’s arrival time as the deprivation time, offering a unique way to optimize route selection in relief distribution. Furthermore, integrating heuristic methods and valid inequalities into the Benders Decomposition approach enhances its effectiveness in solving complex routing challenges in disaster scenarios.

Details

Kybernetes, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0368-492X


Article
Publication date: 16 April 2024

Chaofan Wang, Yanmin Jia and Xue Zhao

Prefabricated columns connected by grouted sleeves are increasingly used in practical projects. However, seismic fragility analyses of such structures are rarely conducted…

Abstract

Purpose

Prefabricated columns connected by grouted sleeves are increasingly used in practical projects. However, seismic fragility analyses of such structures are rarely conducted. Seismic fragility analysis plays an important role in seismic hazard evaluation. In this paper, the seismic fragility of sleeve-connected prefabricated columns is analyzed.

Design/methodology/approach

A model for predicting the seismic demand on sleeve-connected prefabricated columns has been created by incorporating engineering demand parameters (EDP) and probabilities of seismic failure. Incremental dynamic analysis (IDA) curve clusters of this type of column were obtained using finite element analysis. The seismic fragility curve is obtained by regression using exponential and logistic function models.
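Fragility curves derived from IDA results are commonly expressed as a lognormal CDF of the intensity measure. The sketch below shows this standard form; it is a generic illustration, not necessarily the exact regression model fitted in the paper, and the parameter values used in the usage example are hypothetical.

```python
import math

def fragility(pga, theta, beta):
    """Lognormal fragility curve: probability of exceeding a damage state
    at a given peak ground acceleration (PGA, in g). `theta` is the median
    capacity and `beta` the lognormal dispersion, both fitted from IDA data."""
    z = math.log(pga / theta) / beta
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

By construction the curve passes through 0.5 at the median capacity (e.g. `fragility(0.5, theta=0.5, beta=0.4)` is exactly one half) and increases monotonically with PGA.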

Findings

The IDA curve clusters gradually increased in dispersion after a peak ground acceleration (PGA) of 0.3 g was reached. For both columns, the response changed significantly once the relative displacement of the column top reached 50 mm. The seismic fragility of the prefabricated column with the sleeve placed in the cap (SPCA) was inadequate.

Originality/value

Placing the sleeve in the column effectively overcomes the seismic fragility of prefabricated columns. In practical engineering, it is advisable to use these columns in earthquake-prone regions with high seismic intensity levels in order to mitigate the risk of structural damage resulting from ground motion.

Details

International Journal of Structural Integrity, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1757-9864


Article
Publication date: 20 March 2024

Vinod Bhatia and K. Kalaivani

Indian railways (IR) is one of the largest railway networks in the world. As a part of its strategic development initiative, demand forecasting can be one of the indispensable…

Abstract

Purpose

Indian railways (IR) is one of the largest railway networks in the world. As a part of its strategic development initiative, demand forecasting can be one of the indispensable activities, as it may provide basic inputs for planning and control of various activities such as coach production, planning new trains, coach augmentation and quota redistribution. The purpose of this study is to suggest an approach to demand forecasting for IR management.

Design/methodology/approach

A case study is carried out wherein several models, namely automated autoregressive integrated moving average (auto-ARIMA), trigonometric regressors (TBATS), the Holt–Winters additive model, the Holt–Winters multiplicative model, simple exponential smoothing and the simple moving average method, have been tested. As per the requirements of IR management, the adopted research methodology is predominantly discursive, and the passenger reservation patterns of a most representative train service over the past five years have been employed. The relative error matrix and the Akaike information criterion have been used to compare the performance of the various models. The Diebold–Mariano test was conducted to examine the accuracy of the models.
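Two of the simpler candidate models, and one entry of a relative error matrix, can be sketched in a few lines. This is an illustration of the techniques named above, not the study's implementation:

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the one-step-ahead forecast is a
    geometrically weighted average of past observations."""
    level = series[0]
    for v in series[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

def moving_average_forecast(series, window=3):
    """Simple moving average over the last `window` observations."""
    return sum(series[-window:]) / window

def mape(actuals, forecasts):
    """Mean absolute percentage error, a typical relative error measure."""
    return sum(abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)) / len(actuals)
```

In a comparison like the one described, each model's out-of-sample forecasts would be scored with such relative error measures, and the best scorer (here, auto-ARIMA) selected for the production plan.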

Findings

A coach production strategy has been proposed based on the most suitable auto-ARIMA model. Around 6,000 railway coaches per year have been produced by IR in the past three years. As per the coach production plan for the year 2023–2024, a tentative 6,551 coaches of various types have been planned for production. The insights gained from this paper may facilitate need-based coach manufacturing and optimum utilization of the inventory.

Originality/value

This study contributes to the literature on rail ticket demand forecasting and adds value to the process of rolling stock management. The proposed model can be a comprehensive decision-making tool to plan for new train services and assess the rolling stock production requirement on any railway system. The analysis may help in making demand predictions for the busy season, and the management can make important decisions about the pricing of services.

Details

foresight, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1463-6689


Article
Publication date: 13 April 2023

Sadia Samar Ali, Shahbaz Khan, Nosheen Fatma, Cenap Ozel and Aftab Hussain

Organisations and industries are often looking for technologies that can accomplish multiple tasks, providing economic benefits and an edge over their competitors. In this…

Abstract

Purpose

Organisations and industries are often looking for technologies that can accomplish multiple tasks, providing economic benefits and an edge over their competitors. In this context, drones have the potential to change many industries by making operations more efficient, safer and more economic. Therefore, this study investigates the use of drones as the next step in smart/digital warehouse management to determine their socio-economic benefits.

Design/methodology/approach

The study identifies various enablers impacting drone applications to improve inventory management, intra-logistics, and inspections and surveillance in smart warehouses through a literature review, a test of concordance and the fuzzy Delphi method. Further, the graph theory matrix approach (GTMA) was applied to rank the enablers of drone application in smart/digital warehouses. In the subsequent phase, the researchers investigated the relationship between drone application performance and the enablers of drone adoption using logistic regression analysis under the TOE framework.
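GTMA scores each alternative by the permanent of its attribute matrix, a determinant-like sum in which every term is added rather than signed. A brute-force sketch is feasible for the small matrices typical of enabler rankings (the O(n!) cost rules it out for large n); the matrix values here are illustrative, not the study's data:

```python
from itertools import permutations

def permanent(matrix):
    """Matrix permanent: sum over all permutations of products of entries,
    with no sign changes (unlike the determinant). Used as the GTMA index."""
    n = len(matrix)
    total = 0.0
    for perm in permutations(range(n)):
        prod = 1.0
        for i, j in enumerate(perm):
            prod *= matrix[i][j]
        total += prod
    return total
```

In GTMA, diagonal entries hold attribute scores and off-diagonal entries hold relative importances; the alternative with the largest permanent ranks first.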

Findings

This study identifies inventory management, intra-logistics, and inspections and surveillance as the three major applications of drones in smart warehousing. Further, nine enablers are identified for the adoption of drones in warehouse management. The findings suggest that operational effectiveness, compatibility of drone integration and quality/value offered are the most impactful enablers of drone adoption in warehouses. The logistic regression findings are useful for warehouse managers who are planning to adopt drones for efficient operations.

Research limitations/implications

This study identifies the enablers of drone adoption in smart and digital warehouses through the literature review and the fuzzy Delphi method; therefore, some enablers may have been overlooked during the identification process. In addition, the analysis is based on the opinions of experts, which might be influenced by their fields of expertise.

Practical implications

By considering the technology-organisation-environment (TOE) framework, warehousing companies can identify the opportunities and challenges associated with using drones in a smart warehouse and develop strategies to integrate drones into their operations effectively.

Originality/value

This study proposes a TOE-based framework for the adoption of drones in warehouse management to improve three prominent warehouse functions (inventory management, intra-logistics, and inspections and surveillance) using a mixed-method approach.

Details

Benchmarking: An International Journal, vol. 31 no. 3
Type: Research Article
ISSN: 1463-5771


Article
Publication date: 4 October 2022

Dhruba Jyoti Borgohain, Raj Kumar Bhardwaj and Manoj Kumar Verma

Artificial Intelligence (AI) is an emerging technology that has turned into a field of knowledge consistently displacing older technologies and changing human life. It is…


Abstract

Purpose

Artificial Intelligence (AI) is an emerging technology that has turned into a field of knowledge consistently displacing older technologies and changing human life. It is applied in all spheres of life, as reflected in the literature review section. Since it is applicable to libraries too, this study scientifically maps the papers on the application of AI in libraries (AAIL) and analyzes their growth, collaboration networks and trending topics or research hotspots to highlight the challenges and opportunities in adopting AI-based advancements in library systems and processes.

Design/methodology/approach

The study was developed with a bibliometric approach, considering the decade 2012 to 2021 for data extraction from a premier database, Scopus. The steps followed were: (1) identification and selection of keywords, and formation of the search strategy with the approval of a panel of computer scientists and librarians; (2) design and development of an algorithm to verify the selected keywords in the title-abstract-keywords fields of Scopus; (3) data processing in state-of-the-art bibliometric visualization tools, Biblioshiny R and VOSviewer; and (4) discussion of the findings, their practical implications and the study's limitations.

Findings

As evident from several papers, not much research has been conducted on AI applications in libraries in comparison to topics like AI applications in cancer, health, medicine, education and agriculture. As per Price's law, the growth pattern is exponential. The total number of papers relevant to the subject is 1,462 (single- and multi-authored), contributed by 5,400 authors, with 0.271 documents per author and around 4 authors per document. Papers occurred mostly in open-access journals. The most productive journal is the Journal of Chemical Information and Modelling (NP = 63), while the most consistent and impactful is the Journal of Machine Learning Research (z-index = 63.58 and CPP = 56.17). Among authors, J Chen (z-index = 28.86 and CPP = 43.75) is the most consistent and impactful. At the country level, the USA has recorded the highest number of papers and is positioned at the center of the co-authorship network, but at the institutional level China takes first position. The trending topics of research are machine learning, large datasets, deep learning, high-level languages, etc. Present information systems have a high potential to improve if integrated with AI technologies.
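The headline collaboration ratios reported above can be checked directly from the paper and author counts:

```python
# Collaboration ratios from the bibliometric counts reported above
papers, authors = 1462, 5400
docs_per_author = papers / authors   # 1462 / 5400 = 0.271 (to three decimals)
authors_per_doc = authors / papers   # 5400 / 1462 = 3.69, i.e. "around 4"
```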

Practical implications

The number of scientific papers has increased over time. The evolution of themes like machine learning implies that AI is a broad field of knowledge that converges with other disciplines. Themes like large datasets imply that AI may be applied to analyze and interpret these data and support decision-making in public sector enterprises. The theme of high-level languages emerged as a research hotspot, indicating that extensive research has been going on in this area to improve computer systems for processing data at high speed. These implications are of high strategic worth for policymakers, library stakeholders, researchers and the government as a whole for decision-making.

Originality/value

The analysis of collaboration and of prolific authors/journals using the consistency factor and CPP, the testing of the relationship between consistency (z-index) and impact (h-index), and the use of state-of-the-art network visualization and cluster analysis techniques make this study novel and differentiate it from traditional bibliometric analyses. To the best of the authors' knowledge, this work is the first attempt to comprehend the research streams and provide a holistic view of research on the application of AI in libraries. The insights obtained from this analysis are instrumental for both academics and practitioners.

Details

Library Hi Tech, vol. 42 no. 1
Type: Research Article
ISSN: 0737-8831


Article
Publication date: 9 April 2024

Lu Wang, Jiahao Zheng, Jianrong Yao and Yuangao Chen

With the rapid growth of the domestic lending industry, assessing whether the borrower of each loan is at risk of default is a pressing issue for financial institutions. Although…

Abstract

Purpose

With the rapid growth of the domestic lending industry, assessing whether the borrower of each loan is at risk of default is a pressing issue for financial institutions. Although some existing models handle such problems well, they still fall short in certain respects. The purpose of this paper is to improve the accuracy of credit assessment models.

Design/methodology/approach

In this paper, three different stages are used to improve the classification performance of LSTM, so that financial institutions can more accurately identify borrowers at risk of default. The first stage uses the K-Means-SMOTE algorithm to mitigate the class imbalance. In the second stage, ResNet is used for feature extraction, and a two-layer LSTM is then used for learning, strengthening the neural network's ability to mine and utilize deep information. Finally, model performance is improved by using the IDWPSO algorithm for optimization when tuning the neural network.
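The oversampling idea behind the first stage can be sketched generically. This is plain SMOTE-style interpolation, without the K-Means partitioning that K-Means-SMOTE adds on top, and the neighbour count and seed are illustrative assumptions:

```python
import random

def smote_like_oversample(minority, n_new, k=3, seed=0):
    """Generate n_new synthetic minority samples: each one interpolates
    between a random minority point and one of its k nearest minority
    neighbours, so new samples stay inside the minority region."""
    rng = random.Random(seed)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: dist2(p, x))[:k]
        nb = rng.choice(neighbours)
        t = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(xi + t * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic
```

Balancing the classes this way before training is what lets the downstream ResNet-LSTM see enough default cases to learn their patterns, even at ratios as extreme as 700:1.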

Findings

On two unbalanced datasets (category ratios of 700:1 and 3:1 respectively), the multi-stage improved model was compared with ten other models using accuracy, precision, specificity, recall, G-measure, F-measure and the nonparametric Wilcoxon test. It was demonstrated that the multi-stage improved model showed a more significant advantage in evaluating the imbalanced credit dataset.
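The evaluation criteria listed above all follow from the binary confusion matrix. A sketch of their computation (note that G-measure is taken here as the geometric mean of recall and specificity, one common definition; the literature also uses other variants):

```python
import math

def classification_metrics(tp, fp, tn, fn):
    """Binary-classification criteria computed from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)              # a.k.a. sensitivity
    specificity = tn / (tn + fp)
    f_measure = 2 * precision * recall / (precision + recall)
    # Geometric mean of recall and specificity: informative under class
    # imbalance, where plain accuracy is misleading
    g_measure = math.sqrt(recall * specificity)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "f": f_measure, "g": g_measure}
```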

Originality/value

In this paper, the parameters of the ResNet-LSTM hybrid neural network, which can fully mine and utilize the deep information, are tuned by an innovative intelligent optimization algorithm to strengthen the classification performance of the model.

Details

Kybernetes, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0368-492X


Article
Publication date: 11 October 2023

Yuhong Wang and Qi Si

This study aims to predict China's carbon emission intensity and put forward a set of policy recommendations for further development of a low-carbon economy in China.

Abstract

Purpose

This study aims to predict China's carbon emission intensity and put forward a set of policy recommendations for further development of a low-carbon economy in China.

Design/methodology/approach

In this paper, the Interaction Effect Grey Power Model of N Variables (IEGPM(1,N)) is developed, and the Dragonfly algorithm (DA) is used to select the best power index for the model. Specific model construction methods and rigorous mathematical proofs are given. In order to verify the applicability and validity, this paper compares the model with the traditional grey model and simulates the carbon emission intensity of China from 2014 to 2021. In addition, the new model is used to predict the carbon emission intensity of China from 2022 to 2025, which can provide a reference for the 14th Five-Year Plan to develop a scientific emission reduction path.

Findings

The results show that if the Chinese government does not take effective policy measures in the future, carbon emission intensity will not achieve the set goals. The IEGPM(1,N) model also provides reliable results and works well in simulation and prediction.

Originality/value

The paper considers the nonlinear and interactive effect of input variables in the system's behavior and proposes an improved grey multivariable model, which fills the gap in previous studies.

Details

Grey Systems: Theory and Application, vol. 14 no. 1
Type: Research Article
ISSN: 2043-9377


Article
Publication date: 28 March 2024

Elisa Gonzalez Santacruz, David Romero, Julieta Noguez and Thorsten Wuest

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework…

Abstract

Purpose

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework (IQ4.0F) for quality improvement (QI) based on Six Sigma and machine learning (ML) techniques towards ZDM. The IQ4.0F aims to contribute to the advancement of defect prediction approaches in diverse manufacturing processes. Furthermore, the work enables a comprehensive analysis of process variables influencing product quality, with emphasis on the use of supervised and unsupervised ML techniques in the “Analyze” stage of Six Sigma’s DMAIC (Define, Measure, Analyze, Improve and Control) cycle.

Design/methodology/approach

The research methodology employed a systematic literature review (SLR) based on PRISMA guidelines to develop the integrated framework, followed by a real industrial case study set in the automotive industry to fulfill the objectives of verifying and validating the proposed IQ4.0F with primary data.

Findings

This research work demonstrates the value of a “stepwise framework” to facilitate a shift from conventional quality management systems (QMSs) to QMSs 4.0. It uses the IDEF0 modeling methodology and Six Sigma’s DMAIC cycle to structure the steps to be followed to adopt the Quality 4.0 paradigm for QI. It also proves the worth of integrating Six Sigma and ML techniques into the “Analyze” stage of the DMAIC cycle for improving defect prediction in manufacturing processes and supporting problem-solving activities for quality managers.

Originality/value

This research paper introduces a first-of-its-kind Quality 4.0 framework – the IQ4.0F. Each step of the IQ4.0F was verified and validated in an original industrial case study set in the automotive industry. It is the first Quality 4.0 framework, according to the SLR conducted, to utilize the principal component analysis technique as a substitute for “Screening Design” in the Design of Experiments phase and K-means clustering technique for multivariable analysis, identifying process parameters that significantly impact product quality. The proposed IQ4.0F not only empowers decision-makers with the knowledge to launch a Quality 4.0 initiative but also provides quality managers with a systematic problem-solving methodology for quality improvement.
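The K-means clustering step used for multivariable analysis can be sketched with plain Lloyd's algorithm. This is a generic sketch of the technique, not the case study's implementation, and the sample points are illustrative:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means (Lloyd's algorithm) on a list of coordinate tuples:
    alternate assigning points to their nearest centroid and moving each
    centroid to the mean of its cluster, until assignments stabilize."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)   # initialize with k distinct data points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        new = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters
```

In the framework's "Analyze" stage, clustering process-parameter vectors this way groups similar operating conditions, helping to flag the parameters that separate good from defective output.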

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731

