Search results

1 – 10 of 82
Open Access
Article
Publication date: 12 July 2022

Tianyue Feng, Lihao Liu, Xingyu Xing and Junyi Chen

The purpose of this paper is to search for the critical scenarios of autonomous vehicles (AVs) quickly and comprehensively, which is essential for verification and validation…

Abstract

Purpose

The purpose of this paper is to search for the critical scenarios of autonomous vehicles (AVs) quickly and comprehensively, which is essential for verification and validation (V&V).

Design/methodology/approach

The authors adopted the index F1 to quantify the critical scenarios' coverage of the search space and proposed an improved particle swarm optimization (IPSO) to enhance exploration ability for higher coverage. Compared with standard particle swarm optimization (PSO), there were three improvements. In the initial phase, the Latin hypercube sampling method was introduced for a uniform distribution of particles. In the iteration phase, a neighborhood operator was adopted to explore more modes, with the particles divided into groups. In the convergence phase, a convergence judgment and a restart strategy were used to keep exploring the search space and avoid local convergence. Experiments on an artificial function and on critical-scenario search were carried out to compare the method with the Monte Carlo method (MC) and PSO and to verify its efficiency and application effect.
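
The paper's IPSO is not reproduced here, but a minimal sketch of two of the ingredients described above, Latin hypercube initialization and a restart once the swarm stops improving, might look as follows; the objective, bounds, inertia and acceleration coefficients are illustrative assumptions, and the grouping/neighborhood operator is omitted.

```python
import numpy as np

def latin_hypercube(n, d, lo, hi, rng):
    """One sample per stratum along every dimension, for uniform start coverage."""
    strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
    return lo + (strata + rng.random((n, d))) / n * (hi - lo)

def ipso_sketch(f, d, lo, hi, n=30, iters=200, stall=20, seed=0):
    """PSO with LHS initialization and a restart strategy against local convergence."""
    rng = np.random.default_rng(seed)
    x = latin_hypercube(n, d, lo, hi, rng)
    v = np.zeros((n, d))
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    gbest, gval = pbest[pval.argmin()].copy(), pval.min()
    stagnant = 0
    for _ in range(iters):
        r1, r2 = rng.random((n, d)), rng.random((n, d))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        if pval.min() < gval:
            gbest, gval, stagnant = pbest[pval.argmin()].copy(), pval.min(), 0
        else:
            stagnant += 1
        if stagnant >= stall:        # convergence judgment: re-seed the swarm
            x = latin_hypercube(n, d, lo, hi, rng)
            v[:] = 0.0
            stagnant = 0
    return gbest, gval

# e.g. ipso_sketch(lambda z: np.sum(z ** 2), d=2, lo=-5.0, hi=5.0)
```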

Findings

Results show that IPSO can search for multimodal critical scenarios comprehensively. With a stricter threshold and fewer samples in the critical-scenario search experiment, the coverage of IPSO is 14% higher than that of PSO and 40% higher than that of MC.

Originality/value

The critical scenarios' coverage of the search space is quantified for the first time by the index F1, and the proposed method achieves higher search efficiency and coverage for the critical-scenario search of AVs, showing application potential for V&V.

Details

Journal of Intelligent and Connected Vehicles, vol. 5 no. 3
Type: Research Article
ISSN: 2399-9802


Open Access
Article
Publication date: 29 July 2020

Ghoulemallah Boukhalfa, Sebti Belkacem, Abdesselem Chikhi and Said Benaggoune

This paper presents the particle swarm optimization (PSO) algorithm in conjunction with the fuzzy logic method in order to achieve an optimized tuning of a proportional integral…


Abstract

This paper presents the particle swarm optimization (PSO) algorithm in conjunction with the fuzzy logic method in order to achieve an optimized tuning of a proportional integral derivative (PID) controller in the DTC control loops of a dual star induction motor (DSIM). The fuzzy controller is insensitive to parametric variations; however, the PSO-based optimization approach yields a judicious choice of gains that makes the system more robust. Matlab simulation results demonstrate that the hybrid DTC of the DSIM improves the speed loop response, ensures system stability, reduces the steady-state error and shortens the rise time. Moreover, with this controller, disturbances do not affect the motor performance.
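
The DSIM drive and its DTC loops are not modeled here; as a rough illustration of what PSO-based PID tuning optimizes, the sketch below scores a candidate (kp, ki, kd) triple by the ITAE criterion on a stand-in first-order plant. A swarm such as the one sketched earlier could then minimize this cost over the three gains; the plant time constant, step size, horizon and example gains are all assumptions.

```python
def pid_itae_cost(gains, tau=0.5, dt=0.001, t_end=3.0):
    """ITAE (integral of time-weighted absolute error) of a unit-step response.

    The plant dy/dt = (-y + u) / tau is only a stand-in for the real speed loop.
    """
    kp, ki, kd = gains
    y, integ, prev_err, cost = 0.0, 0.0, 1.0, 0.0
    for k in range(int(t_end / dt)):
        err = 1.0 - y                      # unit step reference minus plant output
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (-y + u) / tau           # explicit Euler step of the plant
        prev_err = err
        cost += (k * dt) * abs(err) * dt   # time-weighted absolute error
    return cost

# e.g. pid_itae_cost((8.0, 5.0, 0.05)) -> smaller is better
```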

Details

Applied Computing and Informatics, vol. 18 no. 1/2
Type: Research Article
ISSN: 2634-1964


Open Access
Article
Publication date: 9 October 2023

Mingyao Sun and Tianhua Zhang

A real-time production scheduling method for the semiconductor back-end manufacturing process becomes increasingly important in Industry 4.0. Semiconductor back-end manufacturing…

Abstract

Purpose

A real-time production scheduling method for the semiconductor back-end manufacturing process becomes increasingly important in Industry 4.0. The semiconductor back-end manufacturing process is always accompanied by order splitting and merging; in addition, each stage of the process involves multiple machine groups with different production capabilities and capacities. This paper studies a multi-agent based scheduling architecture for the radio frequency identification (RFID)-enabled semiconductor back-end shopfloor, which integrates not only manufacturing resources but also human factors.

Design/methodology/approach

The architecture includes a task management (TM) agent, a staff instruction (SI) agent, a task scheduling (TS) agent, an information management center (IMC), a machine group (MG) agent and a production monitoring (PM) agent. Based on this architecture, the authors developed a scheduling method consisting of capability & capacity planning and machine configuration modules in the TS agent.

Findings

The authors used a greedy policy to assign each order to the appropriate machine groups based on the real-time utilization ratio of each MG in the capability & capacity (C&C) planning module, and used a particle swarm optimization (PSO) algorithm to schedule each split job to the identified machine based on the C&C planning results. Finally, a case study was conducted to demonstrate the proposed multi-agent based real-time production scheduling models and methods.
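
The full C&C planning module is not given in the abstract; a minimal sketch of a greedy assignment that routes each order to the capable machine group with the lowest current utilization ratio might look as follows, where the order tuples and the MachineGroup fields are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MachineGroup:
    name: str
    capability: set           # product types this group can process
    capacity: float           # available machine-hours
    load: float = 0.0         # hours already committed

    @property
    def utilization(self) -> float:
        return self.load / self.capacity

def greedy_assign(orders, groups):
    """Assign each order to the capable group with the lowest utilization ratio."""
    plan = {}
    for order_id, product, hours in orders:      # (id, product type, workload in hours)
        capable = [g for g in groups if product in g.capability]
        if not capable:
            plan[order_id] = None                # no feasible machine group
            continue
        best = min(capable, key=lambda g: g.utilization)
        best.load += hours                       # commit capacity in real time
        plan[order_id] = best.name
    return plan

# e.g.
# groups = [MachineGroup("MG1", {"A", "B"}, 100.0), MachineGroup("MG2", {"B"}, 60.0)]
# greedy_assign([("o1", "B", 10.0), ("o2", "B", 5.0), ("o3", "A", 8.0)], groups)
```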

Originality/value

This paper proposes a multi-agent based real-time scheduling framework for the semiconductor back-end industry. A C&C planning algorithm and a machine configuration algorithm are developed. The paper provides a feasible solution for the semiconductor back-end manufacturing process to realize real-time scheduling.

Details

IIMBG Journal of Sustainable Business and Innovation, vol. 1 no. 1
Type: Research Article
ISSN: 2976-8500


Open Access
Article
Publication date: 20 March 2024

Guijian Xiao, Tangming Zhang, Yi He, Zihan Zheng and Jingzhe Wang

The purpose of this review is to comprehensively consider the material properties and processing of additive titanium alloy and provide a new perspective for the robotic grinding…

Abstract

Purpose

The purpose of this review is to comprehensively consider the material properties and processing of additive titanium alloy and provide a new perspective for the robotic grinding and polishing of additive titanium alloy blades to ensure the surface integrity and machining accuracy of the blades.

Design/methodology/approach

At present, robot grinding and polishing are mainstream processing methods in automatic blade processing. This review systematically summarizes the processing characteristics and processing methods of additive manufacturing (AM) titanium alloy blades. On the one hand, the unique manufacturing process and thermal effects of AM create the distinctive processing characteristics of additive titanium alloy blades. On the other hand, the robot grinding and polishing process needs to incorporate a material removal model into the traditional processing flow according to the processing characteristics of the additive titanium alloy.
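
The review does not commit to a specific material removal model in this passage; purely for illustration, a generic Preston-type law, often used as a first approximation in belt grinding and polishing, is sketched below. The Preston coefficient and the example process values are assumptions, not data from the review.

```python
def preston_removal_depth(pressure_pa, velocity_mps, dwell_s, k_preston=1.0e-13):
    """Removed depth (m) under a Preston-type law: depth = Kp * p * v * t.

    Kp lumps belt, abrasive and material effects and must be calibrated
    experimentally for a given additive titanium alloy and belt.
    """
    return k_preston * pressure_pa * velocity_mps * dwell_s

# e.g. 0.2 MPa contact pressure, 10 m/s belt speed, 0.5 s dwell:
# preston_removal_depth(2.0e5, 10.0, 0.5) -> 1e-7 m (0.1 micrometre) removed
```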

Findings

Robot belt grinding can solve the processing problems of additive titanium alloy blades. For the complex surface of the blade, a robot grinding trajectory is generated through trajectory planning, and this trajectory planning profoundly affects the machining accuracy and surface quality of the blade. Subsequent research is needed to address the high machining accuracy required for blade profiles, complex surface material removal models and the uneven distribution of blade machining allowance. Among the robot process parameters, the grinding parameters, trajectory planning and error compensation affect the surface quality of the blade through the material removal method, grinding force and grinding temperature. The machining accuracy of the blade surface is affected by robot vibration and stiffness.

Originality/value

This review systematically summarizes the processing characteristics and processing methods of aviation titanium alloy blades manufactured by AM. Combined with the material properties of additive titanium alloy, it provides a new perspective for the robot grinding and polishing of such blades.

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2633-6596


Open Access
Article
Publication date: 15 December 2020

Soha Rawas and Ali El-Zaart

Image segmentation is one of the most essential tasks in image processing applications. It is a valuable tool in many application areas such as health-care systems, pattern…

Abstract

Purpose

Image segmentation is one of the most essential tasks in image processing applications. It is a valuable tool in many application areas such as health-care systems, pattern recognition, traffic control, surveillance systems, etc. However, accurate segmentation is a critical task since finding a correct model that fits different types of image processing applications is a persistent problem. This paper develops a novel segmentation model that aims to serve as a unified model for any kind of image processing application. The proposed precise and parallel segmentation model (PPSM) combines three benchmark distribution thresholding techniques (Gaussian, lognormal and gamma distributions) to estimate an optimum threshold value that leads to optimum extraction of the segmented region. Moreover, a parallel boosting algorithm is proposed to improve the performance of the developed segmentation algorithm and minimize its computational cost. To evaluate the effectiveness of the proposed PPSM, different benchmark data sets for image segmentation are used, such as Planet Hunters 2 (PH2), the International Skin Imaging Collaboration (ISIC), Microsoft Research in Cambridge (MSRC), the Berkeley Segmentation Benchmark Data set (BSDS) and Common Objects in COntext (COCO). The obtained results indicate the efficacy of the proposed model in achieving high accuracy with a significant reduction in processing time compared to other segmentation models, across different types and fields of benchmark data sets.

Design/methodology/approach

The proposed PPSM combines three benchmark distribution thresholding techniques (Gaussian, lognormal and gamma distributions) to estimate an optimum threshold value that leads to optimum extraction of the segmented region.
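
The paper's combination of Gaussian, lognormal and gamma models is not reproduced here; as a point of reference, a plain histogram-based minimum cross-entropy threshold (the MCET criterion the PPSM builds on) might be sketched as follows, with the number of gray levels assumed to be 256.

```python
import numpy as np

def mcet_threshold(image, levels=256):
    """Plain minimum cross-entropy threshold (Li's criterion) on a gray-level histogram.

    Gray levels are shifted by 1 so that the logarithm is always defined.
    """
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    g = np.arange(1, levels + 1, dtype=float)      # shifted gray levels
    w = hist.astype(float)
    best_t, best_cost = 1, np.inf
    for t in range(1, levels):
        lo_w, hi_w = w[:t], w[t:]
        lo_g, hi_g = g[:t], g[t:]
        if lo_w.sum() == 0 or hi_w.sum() == 0:
            continue
        mu_lo = (lo_g * lo_w).sum() / lo_w.sum()
        mu_hi = (hi_g * hi_w).sum() / hi_w.sum()
        cost = ((lo_g * lo_w) * np.log(lo_g / mu_lo)).sum() \
             + ((hi_g * hi_w) * np.log(hi_g / mu_hi)).sum()
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t                                  # pixels >= best_t form the object class

# e.g. mcet_threshold(np.random.randint(0, 256, size=(64, 64)))
```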

Findings

On the basis of the achieved results, it can be observed that the proposed PPSM–minimum cross-entropy thresholding (PPSM–MCET)-based segmentation model is a robust, accurate and highly consistent method with high performance.

Originality/value

A novel hybrid segmentation model is constructed by exploiting a combination of Gaussian, gamma and lognormal distributions with MCET. Moreover, to provide accurate and high-performance thresholding with minimum computational cost, the proposed PPSM uses a parallel processing method to minimize the computational effort of MCET computing. The proposed model might be used as a valuable tool in many application areas such as health-care systems, pattern recognition, traffic control, surveillance systems, etc.

Details

Applied Computing and Informatics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2634-1964


Open Access
Article
Publication date: 10 August 2022

Jie Ma, Zhiyuan Hao and Mo Hu

The density peak clustering algorithm (DP) is proposed to identify cluster centers by two parameters, i.e. ρ value (local density) and δ value (the distance between a point and…

Abstract

Purpose

The density peak clustering algorithm (DP) was proposed to identify cluster centers by two parameters, i.e. the ρ value (local density) and the δ value (the minimum distance between a point and any other point with a higher ρ value). According to the center-identifying principle of the DP, potential cluster centers should have both a higher ρ value and a higher δ value than other points. However, this principle may prevent the DP from identifying categories with multiple centers or centers located in lower-density regions. In addition, the improper assignment strategy of the DP can produce wrong assignments for non-center points. This paper aims to address the aforementioned issues and improve the clustering performance of the DP.
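
For reference, the two center-identifying quantities of the original DP can be computed as below (cutoff-kernel density; the cutoff distance d_c is an assumption, and the paper's point-domain extensions are not shown).

```python
import numpy as np

def dp_rho_delta(points, d_c):
    """Local density rho and distance delta used by density peak clustering.

    rho_i:   number of points closer to point i than the cutoff d_c.
    delta_i: minimum distance to any point with a higher rho
             (for the global density maximum: its largest distance to any point).
    """
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    rho = (dist < d_c).sum(axis=1) - 1             # exclude the point itself
    delta = np.empty(len(points))
    for i in range(len(points)):
        higher = np.where(rho > rho[i])[0]
        delta[i] = dist[i].max() if higher.size == 0 else dist[i, higher].min()
    return rho, delta                              # centers: high rho AND high delta

# e.g. rho, delta = dp_rho_delta(np.random.rand(200, 2), d_c=0.1)
```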

Design/methodology/approach

First, to identify as many potential cluster centers as possible, the authors construct a point-domain by introducing a pinhole imaging strategy to extend the search range for potential cluster centers. Second, they design novel methods for calculating the domain distance, point-domain density and domain similarity. Third, they adopt the domain similarity to perform the domain merging process and optimize the final clustering results.

Findings

The experimental results on 12 synthetic data sets and 12 real-world data sets show that the proposed two-stage density peak clustering based on multi-strategy optimization (TMsDP) outperforms the DP and other state-of-the-art algorithms.

Originality/value

The authors propose a novel DP-based clustering method, i.e. TMsDP, and transform the relationship between points into a relationship between domains to further optimize the clustering performance of the DP.

Details

Data Technologies and Applications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-9288


Open Access
Article
Publication date: 12 April 2019

Iman Ghalehkhondabi, Ehsan Ardjmand, William A. Young and Gary R. Weckman

The purpose of this paper is to review the current literature in the field of tourism demand forecasting.


Abstract

Purpose

The purpose of this paper is to review the current literature in the field of tourism demand forecasting.

Design/methodology/approach

Published papers in high-quality journals are studied and categorized based on the forecasting method they use.

Findings

No single forecasting method produces the best forecasts for all problems. Combined forecasting methods provide better forecasts than traditional forecasting methods.
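
As a small illustration of what a combined forecast means in this literature, two individual forecasts can be merged with weights inversely proportional to their past errors; the series and error figures below are made up.

```python
def inverse_error_weights(errors):
    """Weights proportional to 1/MAE of each individual forecasting method."""
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return [w / total for w in inv]

def combine(forecasts, weights):
    """Weighted combination of point forecasts for the same horizon."""
    return [sum(w * f[t] for w, f in zip(weights, forecasts))
            for t in range(len(forecasts[0]))]

# e.g. an ARIMA-style and a neural-network-style forecast of monthly arrivals
arima_fc, nn_fc = [105.0, 110.0, 118.0], [98.0, 107.0, 121.0]
weights = inverse_error_weights([4.0, 6.0])      # past MAE of each method (made up)
combined = combine([arima_fc, nn_fc], weights)   # leans toward the lower-error model
```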

Originality/value

This paper reviews the available literature from 2007 to 2017. No comparable review is currently available in the literature.

Details

Journal of Tourism Futures, vol. 5 no. 1
Type: Research Article
ISSN: 2055-5911


Open Access
Article
Publication date: 23 August 2022

Armin Mahmoodi, Leila Hashemi, Milad Jasemi, Jeremy Laliberté, Richard C. Millar and Hamed Noshadi

In this research, the main purpose is to use a suitable structure to predict the trading signals of the stock market with high accuracy. For this purpose, two models for the…


Abstract

Purpose

In this research, the main purpose is to use a suitable structure to predict the trading signals of the stock market with high accuracy. For this purpose, two models for the analysis of technical adaptation were used.

Design/methodology/approach

A support vector machine (SVM) is used together with particle swarm optimization (PSO), where PSO serves as a fast and accurate means of searching the problem-solving space; finally, the results are compared with the performance of a neural network.
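
The study's indicator construction and labeling are not reproduced here; a hedged sketch of the fitness such an SVM-PSO pairing could evaluate, i.e. the cross-validated hit rate as a function of the SVM hyperparameters C and gamma, is shown below on synthetic stand-in data. The swarm loop itself would mirror the PSO sketch given earlier, maximizing this fitness.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for engineered technical-indicator features and buy/sell labels.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

def svm_fitness(params):
    """Hit rate (cross-validated accuracy) for a candidate (log10 C, log10 gamma)."""
    c, gamma = 10.0 ** params[0], 10.0 ** params[1]
    model = SVC(C=c, gamma=gamma, kernel="rbf")
    return cross_val_score(model, X, y, cv=5).mean()   # a PSO would maximize this

# e.g. svm_fitness((0.0, -2.0)) -> hit rate for C = 1, gamma = 0.01
```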

Findings

Based on the results, the authors can say that both new models are trustworthy within 6 days; however, SVM-PSO performs better than the basic research. The hit rate of SVM-PSO is 77.5%, whereas the hit rate of the neural network (basic research) is 74.2%.

Originality/value

In this research, two approaches, raw-based and signal-based, have been developed to generate input data for the model. For comparison, the hit rate is considered as the percentage of correct predictions over 16 days.

Details

Asian Journal of Economics and Banking, vol. 7 no. 1
Type: Research Article
ISSN: 2615-9821


Open Access
Article
Publication date: 11 April 2018

Mohamed A. Tawhid and Kevin B. Dsouza

In this paper, we present a new hybrid binary version of bat and enhanced particle swarm optimization algorithm in order to solve feature selection problems. The proposed…

Abstract

In this paper, we present a new hybrid binary version of the bat and enhanced particle swarm optimization algorithms in order to solve feature selection problems. The proposed algorithm is called the Hybrid Binary Bat Enhanced Particle Swarm Optimization Algorithm (HBBEPSO). In the proposed HBBEPSO algorithm, we combine the bat algorithm, whose echolocation capability helps explore the feature space, with an enhanced version of particle swarm optimization, whose convergence behavior drives the search toward the best global solution in the search space. To investigate the general performance of the proposed HBBEPSO algorithm, it is compared with the original optimizers and other optimizers that have been used for feature selection in the past. A set of assessment indicators is used to evaluate and compare the different optimizers over 20 standard data sets obtained from the UCI repository. Results prove the ability of the proposed HBBEPSO algorithm to search the feature space for optimal feature combinations.
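
The HBBEPSO itself is not reproduced here; the binary position update that swarm-based feature selectors of this kind rely on, a sigmoid transfer that turns real-valued velocities into a 0/1 feature mask, can be sketched as follows, with the inertia and acceleration constants being assumptions.

```python
import numpy as np

def binary_update(positions, velocities, pbest, gbest, rng, w=0.7, c1=1.5, c2=1.5):
    """One binary PSO step: real-valued velocities, sigmoid-mapped 0/1 positions.

    Each row of `positions` is a feature mask (1 = feature selected).
    """
    r1, r2 = rng.random(positions.shape), rng.random(positions.shape)
    velocities = (w * velocities
                  + c1 * r1 * (pbest - positions)
                  + c2 * r2 * (gbest - positions))
    prob = 1.0 / (1.0 + np.exp(-velocities))          # sigmoid transfer function
    positions = (rng.random(positions.shape) < prob).astype(int)
    return positions, velocities

# e.g. one step for 20 particles over 30 candidate features
rng = np.random.default_rng(0)
pos = rng.integers(0, 2, size=(20, 30))
vel = np.zeros((20, 30))
pos, vel = binary_update(pos, vel, pbest=pos.copy(), gbest=pos[0], rng=rng)
```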

Details

Applied Computing and Informatics, vol. 16 no. 1/2
Type: Research Article
ISSN: 2634-1964


Open Access
Article
Publication date: 13 April 2022

Jian Li, Xinlei Yan, Feifei Zhao and Xin Zhao

To solve the problem that the location of the initiation point cannot be measured accurately in shallow underground space, this paper proposes a…

Abstract

Purpose

The purpose of this paper is to solve the problem that the location of the initiation point cannot be measured accurately in shallow underground space. This paper proposes a method, based on the fusion of multidimensional vibration sensor information, to locate a single shallow underground source.

Design/methodology/approach

First, using the characteristics of low multipath interference and good P-wave polarization in the near field, the adaptive covariance matrix algorithm is used to extract the polarization angle information of the P-wave, and the short-term averaging/long-term averaging algorithm is used to extract the first-break travel time information. Second, a hybrid positioning model based on travel time and polarization angle is constructed. Third, the positioning model is taken as the particle-update fitness function of quantum-behaved particle swarm optimization, and the calculation is performed on the hybrid positioning model. Finally, experimental verification is carried out in the field.
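
The adaptive covariance matrix step and the authors' hybrid positioning model are not reproduced here; a minimal short-term/long-term averaging first-break picker of the kind the abstract refers to might look as follows, with the window lengths, trigger threshold and the synthetic test trace being assumptions.

```python
import numpy as np

def trailing_mean(x, n):
    """Causal moving average of window n (growing window over the first n samples)."""
    c = np.concatenate(([0.0], np.cumsum(x)))
    out = np.empty(len(x))
    out[:n] = c[1:n + 1] / np.arange(1, n + 1)
    out[n:] = (c[n + 1:] - c[1:-n]) / n
    return out

def sta_lta_first_break(signal, fs, sta_win=0.005, lta_win=0.05, threshold=4.0):
    """Index of the first sample where the causal STA/LTA energy ratio exceeds threshold."""
    energy = np.asarray(signal, dtype=float) ** 2
    n_sta, n_lta = max(1, int(sta_win * fs)), max(1, int(lta_win * fs))
    ratio = trailing_mean(energy, n_sta) / np.maximum(trailing_mean(energy, n_lta), 1e-12)
    ratio[:n_lta] = 0.0                      # ignore the LTA warm-up region
    hits = np.where(ratio > threshold)[0]
    return int(hits[0]) if hits.size else None

# e.g. noise followed by a 500 Hz arrival at sample 2000 of a 10 kHz record
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, 4000)
trace[2000:] += np.sin(2 * np.pi * 500 * np.arange(2000) / 10_000)
print(sta_lta_first_break(trace, fs=10_000))   # picks the first break near sample 2000
```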

Findings

The experimental results show that, with root mean square error, spherical error probable and fitness value as evaluation indicators, the positioning performance of this method is better than that of the method without speed prediction. The positioning accuracy is improved by nearly 30%, with all three tests giving a positioning error within 0.5 m and a fitness value less than 1.

Originality/value

This method provides a new idea for high-precision positioning of a single shallow underground source. It has engineering application value in fields such as directional demolition in engineering blasting, water inrush and mud burst prediction, fuze position measurement, underground positioning of ammunition initiation points, mine blasting monitoring and so on.

Details

Sensor Review, vol. 42 no. 3
Type: Research Article
ISSN: 0260-2288

