Search results
1–10 of over 14,000
Abstract
Purpose
The purpose of this paper is to elaborate the effective method of adaptation of the external penalty function to the genetic algorithm.
Design/methodology/approach
When optimization tasks with constraints are solved using the external penalty function, the penalty term takes a much larger value than the primary objective function. A sigmoidal transformation is introduced to solve this problem. A new method of determining the value of the penalty coefficient in successive iterations, associated with the changing penalty, has been proposed. The proposed approach has been applied to the optimization of an electromagnetic linear actuator; the mathematical model of the device contains the equations of the magnetic field, taking into account the nonlinearity of the ferromagnetic material.
Findings
The proposed new variant of the penalty function method consists of reducing the external penalty function in successive penalty iterations, rather than increasing it as in the classical method. In addition, the way constraints are normalized when the optimization problem is formulated has a significant impact on the results of the optimization calculations.
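The reduction-of-penalty idea described above can be sketched as a toy computation. Everything below (the quadratic objective, the single normalized constraint, the halving schedule and the grid search standing in for the genetic algorithm) is an illustrative assumption of this summary, not the authors' formulation:

```python
# Toy sketch of an external penalty method whose coefficient is *reduced*
# in successive penalty iterations, as the abstract describes, rather than
# increased as in the classical method. All values are illustrative.

def objective(x):
    # toy objective: minimize (x - 3)^2, unconstrained minimum at x = 3
    return (x - 3.0) ** 2

def violation(x):
    # normalized constraint g(x) = x - 1 <= 0; positive value means violated
    return max(0.0, x - 1.0)

def penalized(x, r):
    # external quadratic penalty added to the objective
    return objective(x) + r * violation(x) ** 2

def minimize_1d(f, lo=-10.0, hi=10.0, steps=20001):
    # crude grid search stands in for the paper's genetic algorithm
    grid = (lo + i * (hi - lo) / (steps - 1) for i in range(steps))
    return min(grid, key=f)

r = 100.0            # start from a large penalty coefficient ...
x = 0.0
for _ in range(8):
    x = minimize_1d(lambda v: penalized(v, r))
    r *= 0.5         # ... and reduce it in each subsequent penalty iteration

print(round(x, 3))   # lands between the constrained (1.0) and free (3.0) optima
```

The decreasing coefficient trades constraint satisfaction against the primary objective across iterations, which is the balance the sigmoidal transformation and constraint normalization of the paper are meant to control.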
Originality/value
The proposed approach can be applied to solve constrained optimization tasks in the design of electromagnetic devices.
R. Kleyle, A. de Korvin and T. McLaughlin
Abstract
In this paper we discuss a mechanism for making business decisions on the basis of an expected penalty function associated with cost variance. We assume that the decision maker is knowledgeable of the economic environment in which the decision will be made, but that he has no "hard data" such as a market research report. In this setting fuzzy logic is more applicable than ordinary statistical decision theory. We develop a method of computing a fuzzy expected penalty based on a fuzzy distribution of cost variance and a fuzzy penalty function. These fuzzy expected penalties are then "defuzzified" so that a non‐fuzzy decision can be made.
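As a hedged illustration of the pipeline this abstract sketches (a fuzzy distribution of cost variance, a penalty function, and defuzzification), the toy membership values and the linear one-sided penalty below are this summary's assumptions, not the authors' model:

```python
# Toy fuzzy expected penalty: membership-weighted penalties over discrete
# cost-variance levels, "defuzzified" by the common centroid method.

# fuzzy distribution of cost variance: variance level -> membership degree
membership = {-2.0: 0.25, 0.0: 1.0, 2.0: 0.75}

def penalty(v):
    # one-sided penalty: only unfavorable (positive) cost variance is penalized
    return 3.0 * v if v > 0 else 0.0

# centroid defuzzification yields a single crisp expected penalty
crisp = sum(m * penalty(v) for v, m in membership.items()) / sum(membership.values())
print(crisp)  # → 2.25
```

The crisp number is what lets a non-fuzzy business decision be made at the end.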
Kuntal Bhattacharyya, Alfred L. Guiffrida, Milton Rene Soto-Ferrari and Paul Schikora
Abstract
Purpose
Untimely delivery of goods and services, especially in a post-COVID landscape, is a critical harbinger of end-to-end fulfillment. Existing literature in supplier delivery modeling is focused on penalizing suppliers for late deliveries built into a contractual transaction, which eventually erodes trust. As such, a holistic modeling technique focused on long-term relationship building is missing. This study aims to design a supplier evaluation model that analytically equates supplier delivery performance to cost realization while replicating a core attribute of successful supply chains – alignment, leading to long-term supplier relationships.
Design/methodology/approach
The supplier evaluation model designed in this paper uses delivery deviation as a unit of measure as opposed to delivery duration to enhance consistency with enterprise resource planning protocols. A one-sided modified Taguchi-type quality loss function (QLF) models delivery lateness to construct a multinomial probability penalty cost function for untimely delivery. Prescriptive analytics using simulation and optimization of the proposed mathematical model supports buyer–supplier alignment.
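The one-sided Taguchi-type loss idea can be illustrated with a minimal sketch; the loss coefficient, deviation levels and probabilities below are hypothetical, and the paper's full multinomial model is not reproduced:

```python
# Minimal sketch: one-sided quadratic (Taguchi-type) loss for late delivery,
# rolled up into an expected penalty over a discrete deviation distribution.

def late_loss(deviation_days, k=50.0):
    # deviation in days from the agreed date; only lateness (> 0) incurs loss
    return k * deviation_days ** 2 if deviation_days > 0 else 0.0

def expected_penalty(deviation_probs, k=50.0):
    # expected penalty cost under a multinomial distribution of deviations
    return sum(p * late_loss(d, k) for d, p in deviation_probs.items())

# hypothetical supplier: 70% on time, 20% one day late, 10% two days late
supplier = {0: 0.70, 1: 0.20, 2: 0.10}
print(expected_penalty(supplier))  # roughly 30.0 (= 0.2*50 + 0.1*200)
```

Comparing this figure across competing suppliers is the kind of total-landed-cost input the abstract describes, used as an internal yardstick rather than a contractual penalty.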
Findings
The supplier evaluation model designed herein optimizes likelihood parameters for early and late deliveries across competing suppliers, enhancing total landed cost comparisons for on-shore, near-shore and off-shore suppliers. It also allows the creation of an efficient frontier toward supply base optimization.
Research limitations/implications
At a time of systemic disruptions such as the COVID pandemic, global supply chains face risks to business continuity. Supplier evaluation models need to focus on long-term relationship modeling, as opposed to short-term contractual penalty-based modeling, to enhance business continuity.
Practical implications
The results from this analytical approach offer flexibility to a supply manager toward building redundancies in the supply chain using an efficient frontier within the supply landscape, which also helps to manage disruption and maintain end-to-end fulfillment.
Originality/value
The model offered in this paper is grounded in alignment – a cornerstone of successful supply chain integration, and offers an interesting departure from traditional modeling techniques in this genre. The authors offer a rational solution by creating an evaluation model that uses penalty cost modeling as an internal quality measure to rate suppliers and uses the outcome as a yardstick for negotiations instead of imposing penalties within contracts.
Shuangshuang Liu and Xiaoling Li
Abstract
Purpose
Image super-resolution reconstruction with conventional deep learning architectures suffers from difficult training and vanishing gradients. To solve these problems, the purpose of this paper is to propose a novel image super-resolution algorithm based on improved generative adversarial networks (GANs) with the Wasserstein distance and a gradient penalty.
Design/methodology/approach
The proposed algorithm first introduces the conventional GANs architecture, the Wasserstein distance and the gradient penalty into the task of image super-resolution reconstruction (SRWGANs-GP). In addition, a novel perceptual loss function is designed for SRWGANs-GP to suit the task of image super-resolution reconstruction. The content loss is extracted from the deep model's feature maps, and these features are used to calculate the mean square error (MSE) term of the generator's loss.
Findings
To validate the effectiveness and feasibility of the proposed algorithm, extensive comparative experiments are conducted on three common data sets, i.e. Set5, Set14 and BSD100. Experimental results show that the proposed SRWGANs-GP architecture has a stable error gradient and converges iteratively. Compared with the baseline deep models, the proposed GAN models achieve a significant improvement in performance and efficiency for image super-resolution reconstruction. Computing the MSE on the deep model's feature maps is particularly advantageous for reconstructing contour and texture.
Originality/value
Compared with the state-of-the-art algorithms, the proposed algorithm achieves better performance on image super-resolution and better reconstruction of contour and texture.
Abstract
Purpose
The most commonly used approaches for cluster validation are based on indices but the majority of the existing cluster validity indices do not work well on data sets of different complexities. The purpose of this paper is to propose a new cluster validity index (ARSD index) that works well on all types of data sets.
Design/methodology/approach
The authors introduce a new compactness measure that captures the typical behaviour of a cluster, where more points are located around the centre and fewer points towards the outer edge of the cluster. A novel penalty function is proposed for determining the distinctness measure of clusters. A random linear search algorithm is employed to evaluate and compare the performance of five commonly known validity indices and the proposed validity index. The values of the six indices are computed for every candidate number of clusters nc in the range [nc_min, nc_max] to obtain the optimal number of clusters present in a data set. The data sets used in the experiments include shaped, Gaussian-like and real data sets.
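The scan over candidate cluster counts described above can be sketched as follows. The tiny k-means, the compactness/separation ratio and the 1-D toy data are illustrative assumptions of this summary, not the paper's ARSD index:

```python
# Illustrative sketch: score every candidate number of clusters nc in
# [nc_min, nc_max] with a simple validity index and keep the best one.

def kmeans(points, k, iters=50):
    pts = sorted(points)
    # deterministic quantile initialization keeps the toy run reproducible
    centers = [pts[i * (len(pts) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda j: (p - centers[j]) ** 2)].append(p)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

def toy_index(centers, groups):
    # compactness: mean squared distance to the own centre (lower is better);
    # separation: smallest gap between centres (higher is better)
    n = sum(len(g) for g in groups)
    compact = sum((p - c) ** 2 for c, g in zip(centers, groups) for p in g) / n
    sep = min(abs(a - b) for i, a in enumerate(centers) for b in centers[i + 1:])
    return compact / sep  # lower values indicate a better clustering

# 1-D toy data with three well-separated clumps
data = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2, 10.0, 10.1, 10.2]
nc_min, nc_max = 2, 5
best_nc = min(range(nc_min, nc_max + 1),
              key=lambda k: toy_index(*kmeans(data, k)))
print(best_nc)  # → 3
```

The outer loop is the part the paper varies: a better-designed index should pick the right nc on data sets of many different complexities, not just well-separated clumps like these.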
Findings
Through an extensive experimental study, the proposed validity index is observed to be more consistent and reliable in indicating the correct number of clusters than the other validity indices. This is demonstrated experimentally on 11 data sets, on which the proposed index achieves better results.
Originality/value
The originality of the research paper lies in proposing a novel cluster validity index that determines the optimal number of clusters present in data sets of different complexities.
Aziz Kaba and Emre Kiyak
Abstract
Purpose
The purpose of this paper is to introduce an artificial bee colony-based Kalman filter algorithm along with an extended objective function to ensure the optimality of the estimator of the quadrotor in the presence of unknown measurement noise statistics.
Design/methodology/approach
A six-degree-of-freedom mathematical model of the quadrotor is derived, and a position controller for the quadrotor is designed. A Kalman filter-based estimation algorithm is implemented in the sensor feedback loop. An artificial bee colony-based hybrid algorithm is used as the optimization method to handle the unknown noise statistics. The existing objective function is extended with a penalty term, and a mathematical proof of the extended objective function is derived. Results of the proposed algorithm are compared with a de facto genetic algorithm-based Kalman filter.
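The role of the penalty term in an extended objective of this kind can be sketched in toy form. The scalar "innovation cost", the penalty weight and the candidate sweep below are this summary's assumptions; no actual Kalman filtering or bee colony search is performed:

```python
# Toy sketch: extend an estimator-tuning objective with a penalty term so
# that infeasible measurement-noise candidates (R <= 0) are rejected.

def innovation_cost(r, r_true=0.5):
    # stand-in for the filter-consistency cost the optimizer would evaluate
    return (r - r_true) ** 2

def extended_objective(r, w=1e6):
    if r <= 0.0:                  # a Kalman filter needs a positive R
        return w * (1.0 + r * r)  # heavy penalty keeps the search feasible
    return innovation_cost(r)

# crude candidate sweep stands in for the artificial bee colony search
candidates = [i / 100.0 for i in range(-100, 201)]
best = min(candidates, key=extended_objective)
print(best)  # → 0.5
```

Without the penalty branch the sweep could happily return a non-positive noise variance, which is exactly the kind of infeasible solution the paper's extended objective is designed to prevent.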
Findings
The artificial bee colony-based Kalman filter, together with the extended objective function, is able to optimize the measurement noise covariance matrix with an absolute error as low as 0.001 m². The proposed method reduces the noise from 2 to 0.09 m for the x-axis, from 3.4 to 0.14 m for the y-axis and from 3.7 to 0.2 m for the z-axis.
Originality/value
The motivation behind this paper is to offer a novel optimization-based solution to the estimation problem of the quadrotor when the measurement noise statistics are unknown, together with an extended objective function that prevents infeasible solutions, supported by a mathematical convergence analysis.
Ilya V. Avdeev, Alexei I. Borovkov, Olga L. Kiylo, Michael R. Lovell and Dipo Onipede
Abstract
This paper presents a new finite element (FE) approach to modeling edge effects in beam sandwich structures. The approach is based on a mixed 2D and beam formulation coupled by means of a penalty function method. Several sandwich beam and frame bending problems with different boundary conditions and laminate properties are solved to demonstrate the accuracy of the algorithms and software developed. The influence of the penalty factor on the spectral condition number of the stiffness matrix and on the residual norm of the solution is also investigated for different isotropic and sandwich structures. An FE analysis of a complex sandwich beam joint is subsequently presented; its results show the advantages of the developed approach for large problems.
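The penalty-factor trade-off investigated here can be seen even in a diagonal toy matrix; the numbers below are purely illustrative and unrelated to the paper's FE models:

```python
# For a diagonal "stiffness" matrix diag(k, k + p), enforcing a constraint
# with a growing penalty factor p inflates the spectral condition number,
# which for a diagonal matrix is simply the ratio of its extreme eigenvalues.

def cond_diag(k, p):
    return (k + p) / k

for p in (1e2, 1e6, 1e10):
    print(cond_diag(1.0, p))  # grows linearly with the penalty factor
```

This is why the penalty factor cannot be made arbitrarily large: a better-enforced coupling constraint comes at the price of a worse-conditioned system and a larger residual norm.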
Yuebin Zhang, Xin Yi, Shuangshuang Li and Hui Qiu
Abstract
Purpose
This study aims to reduce the construction safety accidents of prefabricated building (PB) projects, improve the efficiency and effectiveness of safety supervision by government departments, and provide theoretical reference for improving the safety supervision system of PB construction.
Design/methodology/approach
Considering the information asymmetry between government supervision departments and construction contractors and the interactive relationship between the two parties under bounded rationality, we propose an evolutionary game model for the construction safety dynamic supervision of PBs and analyze the evolutionary strategy of the game. The system dynamics (SD) method is used to simulate and analyze the evolutionary game process under a dynamic supervision strategy and the adjustment of external variables.
Findings
The cost difference between the government's strong and weak supervision, the construction contractor's additional expenditure on strengthening safety management and other factors affect system stability. The government can dynamically adjust penalties based on the construction contractor's subjective willingness to ignore safety management, and further adjust their rate of change based on how well the supervision goals are met, to improve the efficiency and effectiveness of construction safety supervision.
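A much-simplified sketch of the dynamic-penalty mechanism: a replicator-style update for the share of contractors ignoring safety, with the penalty raised while that share is high. All payoffs, rates and the one-dimensional reduction are illustrative assumptions, not the paper's SD model:

```python
# Toy dynamic-supervision loop: contractors ignore safety while it pays
# (benefit > penalty); the government raises the penalty in proportion to
# the share of contractors currently ignoring safety management.

def simulate(steps=20000, dt=0.01, benefit=2.0):
    x = 0.5            # share of contractors ignoring safety management
    penalty = 1.0      # government's current penalty level
    for _ in range(steps):
        x += dt * x * (1.0 - x) * (benefit - penalty)  # replicator-style update
        penalty += dt * 0.5 * x                        # dynamic penalty adjustment
    return x, penalty

x, penalty = simulate()
print(round(x, 4), round(penalty, 2))
```

In this toy run the penalty climbs past the contractor's benefit from skimping, after which ignoring safety dies out, which mirrors the qualitative finding that dynamically adjusted penalties can stabilize the system.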
Originality/value
This study makes contributions in two areas. Through a combination of SD and an evolutionary game, it provides new insights into the strategic choice of the main body related to PB construction safety. Additionally, considering the nonlinear characteristics of construction safety supervision, it provides useful universal suggestions for PB construction safety.
Abstract
Purpose
Islamic banking was developed to serve two objectives: to replace the interest-based loan system with profit and loss sharing investment modes and to promote equity in resource allocation. The first objective is called procedural, whereas the second is termed consequential. Scholars have been debating the success of Islamic banking in achieving these objectives. This paper aims to develop an index for measuring the extent of convergence between the theory and practice of Islamic banking.
Design/methodology/approach
For measuring the procedural and consequential convergence between objectives and practice of Islamic banking, the paper derives a set of indicators from the celebrated theory of Islamic banking and then develops the methodology of ranking all banks in terms of those indicators.
Findings
The paper ranks the Islamic banks in Pakistan in the light of this index. The results indicate that none of the Islamic banks in Pakistan has done well enough to achieve the convergence; instead, they have been moving in the opposite direction over time.
Practical implications
Using the methodology developed in this paper, a universal ranking of Islamic banks may be issued every year.
Originality/value
Scholars have proposed some indices for measuring the performance of Islamic banking. There are two basic problems with these proposed measures: they do not directly compare the performance of Islamic banking against its stated objectives, and they naively use an additive form of index without explaining this choice, i.e. without stating which desirable characteristics their preferred mathematical form of the index serves. The index proposed in this paper attempts to overcome these shortcomings.