Search results
1 – 10 of 17

Mehmet Kursat Oksuz and Sule Itir Satoglu
Abstract
Purpose
Disaster management and humanitarian logistics (HT) play crucial roles in large-scale events such as earthquakes, floods, hurricanes and tsunamis. Well-organized disaster response is essential for effectively managing medical centres, staff allocation and casualty distribution during emergencies. To address this issue, this study aims to introduce a multi-objective stochastic programming model to enhance disaster preparedness and response, focusing on the critical first 72 h after earthquakes. The purpose is to optimize the allocation of resources, temporary medical centres and medical staff to save lives effectively.
Design/methodology/approach
This study uses stochastic programming-based dynamic modelling and a discrete-time Markov Chain to address uncertainty. The model considers potential road and hospital damage and distance limits and introduces an α-reliability level for untreated casualties. It divides the initial 72 h into four periods to capture earthquake dynamics.
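The Markov Chain ingredient can be illustrated with a short sketch; the health states, transition probabilities and initial casualty mix below are hypothetical stand-ins, since the abstract does not give the paper's actual data.

```python
import numpy as np

# Hypothetical health states and an illustrative one-period transition
# matrix (rows: current state, columns: next state).
states = ["moderate", "severe", "critical", "deceased"]
P = np.array([
    [0.85, 0.10, 0.04, 0.01],   # moderate casualties mostly stay moderate
    [0.20, 0.60, 0.15, 0.05],   # severe may improve or deteriorate
    [0.00, 0.25, 0.55, 0.20],   # critical deteriorate fastest if untreated
    [0.00, 0.00, 0.00, 1.00],   # deceased is absorbing
])

# Evolve an initial casualty mix over the four periods into which
# the model divides the first 72 h.
x = np.array([0.5, 0.3, 0.2, 0.0])
for period in range(4):
    x = x @ P                   # one discrete-time Markov step per period
```

Such a chain lets the model anticipate how the untreated-casualty mix deteriorates between periods when allocating medical capacity.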
Findings
Using a real case study in Istanbul’s Kartal district, the model’s effectiveness is demonstrated for earthquake scenarios. Key insights include optimal medical centre locations, required capacities, necessary medical staff and casualty allocation strategies, all vital for efficient disaster response within the critical first 72 h.
Originality/value
This study innovates by integrating stochastic programming and dynamic modelling to tackle post-disaster medical response. The use of a Markov Chain for uncertain health conditions and focus on the immediate aftermath of earthquakes offer practical value. By optimizing resource allocation amid uncertainties, the study contributes significantly to disaster management and HT research.
Zsolt Tibor Kosztyán, Tibor Csizmadia, Zoltán Kovács and István Mihálcz
Abstract
Purpose
The purpose of this paper is to generalize the traditional risk evaluation methods and to specify a multi-level risk evaluation framework, in order to prepare customized risk evaluations and to enable effective integration of the elements of risk evaluation.
Design/methodology/approach
A real case study of an electric motor manufacturing company is presented to illustrate the advantages of this new framework compared to the traditional and fuzzy failure mode and effect analysis (FMEA) approaches.
Findings
The essence of the proposed total risk evaluation framework (TREF) is its flexible approach that enables the effective integration of firms’ individual requirements by developing tailor-made organizational risk evaluation.
Originality/value
Increasing product/service complexity has led to increasingly complex yet unique organizational operations; as a result, their risk evaluation is a very challenging task. Distinct structures, characteristics and processes within and between organizations require a flexible yet robust approach to evaluating risks efficiently. Most recent risk evaluation approaches are considered inadequate because they lack the flexibility and structure needed to address unique organizational demands and contextual factors. To address this challenge effectively, this study takes a crucial step toward the customization of risk evaluation.
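For context, traditional FMEA collapses each failure mode into a single risk priority number (RPN = severity × occurrence × detection); the sketch below contrasts that fixed aggregation with a customizable one, illustrating the kind of flexibility TREF argues for. The failure modes, ratings and weights are invented for illustration and are not from the paper.

```python
# Classic FMEA: one fixed aggregation, RPN = S * O * D.
failure_modes = {
    "winding short": (8, 3, 4),   # (severity, occurrence, detection)
    "bearing wear":  (5, 6, 5),
    "loose contact": (7, 4, 2),
}
rpn = {k: s * o * d for k, (s, o, d) in failure_modes.items()}

# A customizable aggregation (made-up, not TREF itself): a weighted
# geometric mean lets a firm emphasise severity over detectability.
def weighted_score(s, o, d, ws=0.5, wo=0.3, wd=0.2):
    return (s ** ws) * (o ** wo) * (d ** wd)

custom = {k: weighted_score(*v) for k, v in failure_modes.items()}
```

Note how the two aggregations can rank the same failure modes differently, which is exactly why a one-size-fits-all formula is hard to defend across organizations.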
F.J. Farsana, V.R. Devi and K. Gopakumar
Abstract
This paper introduces an audio encryption algorithm based on permutation of audio samples using a discrete modified Henon map, followed by a substitution operation with a keystream generated from a modified Lorenz hyperchaotic system. In this work, the audio file is initially compressed by the Fast Walsh Hadamard Transform (FWHT) to remove residual intelligibility in the transform domain. The resulting file is then encrypted in two phases. In the first phase, a permutation operation is carried out using the modified discrete Henon map to weaken the correlation between adjacent samples. In the second phase, the modified Lorenz hyperchaotic system is used for the substitution operation to fill the silent periods within the speech conversation. A dynamic keystream generation mechanism is also introduced to enhance the correlation between plaintext and encrypted text. Various quality metrics, such as correlation, signal-to-noise ratio (SNR), differential attack resistance, spectral entropy, histogram analysis, keyspace and key sensitivity, are analysed to evaluate the quality of the proposed algorithm. The simulation results and numerical analyses demonstrate that the proposed algorithm has excellent security performance and is robust against various cryptographic attacks.
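The permute-then-substitute structure described here can be sketched as follows; note that the classic Henon map and a logistic-map keystream below are simplified stand-ins for the paper's modified Henon map and modified Lorenz hyperchaotic generator.

```python
import numpy as np

def henon_permutation(n, x=0.1, y=0.3, a=1.4, b=0.3):
    # Iterate the (classic) Henon map and sort the chaotic values to
    # obtain a sample permutation.
    vals = np.empty(n)
    for i in range(n):
        x, y = 1 - a * x * x + y, b * x
        vals[i] = x
    return np.argsort(vals)

def logistic_keystream(n, x=0.7, r=3.99):
    # Simple chaotic keystream (stand-in for the hyperchaotic system).
    ks = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1 - x)
        ks[i] = int(x * 256) % 256
    return ks

def encrypt(samples):                 # samples: uint8 audio bytes
    perm = henon_permutation(len(samples))
    ks = logistic_keystream(len(samples))
    return samples[perm] ^ ks, perm, ks   # permute, then substitute (XOR)

def decrypt(cipher, perm, ks):
    permuted = cipher ^ ks            # undo substitution
    out = np.empty_like(permuted)
    out[perm] = permuted              # undo permutation
    return out

audio = np.arange(64, dtype=np.uint8)
ct, perm, ks = encrypt(audio)
recovered = decrypt(ct, perm, ks)
```

The permutation stage breaks adjacent-sample correlation while the keystream XOR hides amplitude statistics, mirroring the two phases the abstract describes.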
Yahya Alnashri and Hasan Alzubaidi
Abstract
Purpose
The main purpose of this paper is to introduce the gradient discretisation method (GDM) to a system of reaction diffusion equations subject to non-homogeneous Dirichlet boundary conditions. Then, the authors show that the GDM provides a comprehensive convergence analysis of several numerical methods for the considered model. The convergence is established without non-physical regularity assumptions on the solutions.
Design/methodology/approach
In this paper, the authors use the GDM to discretise a system of reaction diffusion equations with non-homogeneous Dirichlet boundary conditions.
Findings
The authors provide a generic convergence analysis for a system of reaction diffusion equations, introduce a specific example of a numerical scheme that fits into the gradient discretisation method, and conduct a numerical test to measure the efficiency of the proposed method.
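As a concrete (if much simpler) instance of a scheme in the family analysed, here is a basic explicit finite-difference discretisation of the 2D Brusselator reaction-diffusion system with Dirichlet boundaries; all parameters, initial data and step sizes are illustrative, not the authors' test case.

```python
import numpy as np

# Brusselator: u_t = du*Lap(u) + a - (b+1)u + u^2 v,
#              v_t = dv*Lap(v) + b u - u^2 v.
a, b, du, dv = 1.0, 3.0, 0.02, 0.02
n, h, dt = 32, 1.0 / 32, 1e-3

u = np.full((n, n), a)                # start near the steady state
v = np.full((n, n), b / a)
u += 0.1 * np.random.default_rng(0).random((n, n))   # small perturbation

def laplacian(w):
    # 5-point Laplacian on the interior; boundary rows stay zero here.
    lap = np.zeros_like(w)
    lap[1:-1, 1:-1] = (w[2:, 1:-1] + w[:-2, 1:-1] + w[1:-1, 2:]
                       + w[1:-1, :-2] - 4 * w[1:-1, 1:-1]) / h**2
    return lap

for _ in range(100):
    ru = a - (b + 1) * u + u**2 * v
    rv = b * u - u**2 * v
    u_new = u + dt * (du * laplacian(u) + ru)
    v_new = v + dt * (dv * laplacian(v) + rv)
    # reimpose the (non-homogeneous) Dirichlet boundary data
    u_new[[0, -1], :], u_new[:, [0, -1]] = u[[0, -1], :], u[:, [0, -1]]
    v_new[[0, -1], :], v_new[:, [0, -1]] = v[[0, -1], :], v[:, [0, -1]]
    u, v = u_new, v_new
```

The GDM framework covers far more general meshes and schemes (non-conforming finite elements, finite volumes, HMM); this uniform-grid scheme is only the simplest member of that family.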
Originality/value
This work provides a unified convergence analysis of several numerical methods for a system of reaction diffusion equations. The generic convergence is proved under the classical assumptions on the solutions.
Keywords
- A gradient discretisation method (GDM)
- Gradient schemes
- Convergence analysis
- Existence of weak solutions
- Two-dimensional reaction–diffusion Brusselator system
- Dirichlet boundary conditions
- Non-conforming finite element methods
- Finite volume schemes
- Hybrid mixed mimetic (HMM) method
- 35K57
- 65N12
- 65M08
T.O.M. Forslund, I.A.S. Larsson, J.G.I. Hellström and T.S. Lundström
Abstract
Purpose
The purpose of this paper is to present a fast and bare bones implementation of a numerical method for quickly simulating turbulent thermal flows on GPUs. The work also validates earlier research showing that the lattice Boltzmann method (LBM) method is suitable for complex thermal flows.
Design/methodology/approach
A dual lattice hydrodynamic (D3Q27) thermal (D3Q7) multiple-relaxation time LBM model capable of thermal DNS calculations is implemented in CUDA.
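The published solver is CUDA code with an MRT collision operator on the full dual lattice; as a rough illustration of just the thermal D3Q7 side, the NumPy sketch below advances a pure-diffusion problem with the simpler BGK collision and zero advection velocity (weights and relaxation time are illustrative).

```python
import numpy as np

# D3Q7 lattice: rest velocity plus the six axis directions.
N, tau = 16, 0.8                       # grid size, BGK relaxation time
w = np.array([1/4] + [1/8] * 6)        # illustrative lattice weights
c = [(0,0,0), (1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]

T = np.zeros((N, N, N))
T[N//2, N//2, N//2] = 1.0              # point heat source
g = w[:, None, None, None] * T[None]   # initialise at equilibrium

for _ in range(50):
    T = g.sum(axis=0)                  # temperature = zeroth moment
    geq = w[:, None, None, None] * T[None]   # equilibrium (zero velocity)
    g += (geq - g) / tau               # BGK collision step
    for i, shift in enumerate(c):      # streaming (periodic boundaries)
        g[i] = np.roll(g[i], shift, axis=(0, 1, 2))

T = g.sum(axis=0)
```

In the paper's model this thermal lattice is coupled to a D3Q27 hydrodynamic lattice whose velocity field advects the temperature populations, and BGK is replaced by multiple relaxation times.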
Findings
The model matches the computational performance reported in earlier publications on similar LBM solvers. The solver is validated against three benchmark cases for turbulent thermal flow with available data and is shown to be in excellent agreement.
Originality/value
The combination of a D3Q27 and D3Q7 stencil for a multiple-relaxation-time LBM has, to the authors’ knowledge, not been used for simulations of thermal flows. The code is made available in a public repository under a free license.
Cong Li, YunFeng Xie, Gang Wang, XianFeng Zeng and Hui Jing
Abstract
Purpose
This paper studies the lateral stability regulation of intelligent electric vehicle (EV) based on model predictive control (MPC) algorithm.
Design/methodology/approach
Firstly, the bicycle model is adopted in the system modelling process. To improve accuracy, the lateral stiffness of the front and rear tires is estimated from the vehicle’s real-time yaw rate acceleration and lateral acceleration based on the vehicle dynamics. Then the input and output constraints of the model predictive controller are designed. Soft constraints on the lateral speed of the vehicle are designed to guarantee persistent feasibility of the solution and to keep the vehicle’s sideslip angle within a safe range.
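The stiffness estimate follows from the two-degree-of-freedom bicycle model, which is linear in the two cornering stiffnesses once the slip angles, lateral acceleration and yaw rate acceleration are known. The sketch below solves that 2×2 system; all vehicle parameters are invented, and the "measurements" are synthesized from known stiffnesses so the recovery can be checked.

```python
import numpy as np

# Illustrative vehicle parameters (all invented).
m, Iz, lf, lr = 1500.0, 2500.0, 1.2, 1.4    # mass, yaw inertia, CG-to-axle distances
vx, vy, r, delta = 20.0, 0.3, 0.15, 0.05    # speeds, yaw rate, steering angle

alpha_f = delta - (vy + lf * r) / vx        # front tire slip angle
alpha_r = -(vy - lr * r) / vx               # rear tire slip angle

# Synthesize "measured" lateral acceleration and yaw rate acceleration
# from known stiffnesses, so the estimate can be verified.
Cf_true, Cr_true = 80000.0, 100000.0
ay = (Cf_true * alpha_f + Cr_true * alpha_r) / m
r_dot = (lf * Cf_true * alpha_f - lr * Cr_true * alpha_r) / Iz

# The 2-DOF model is linear in (Cf, Cr):
#   m * ay     =      Cf*alpha_f +      Cr*alpha_r
#   Iz * r_dot = lf * Cf*alpha_f - lr * Cr*alpha_r
A = np.array([[alpha_f, alpha_r],
              [lf * alpha_f, -lr * alpha_r]])
b = np.array([m * ay, Iz * r_dot])
Cf, Cr = np.linalg.solve(A, b)
```

Solving this small system at each sample is what makes the online estimate cheap enough to feed the MPC in real time.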
Findings
The simulation results show that the proposed lateral stability controller based on the MPC algorithm can improve the handling and stability performance of the vehicle under complex working conditions.
Originality/value
The MPC scheme and the objective function are established. An integrated active front steering/direct yaw moment control strategy is simultaneously adopted in the model. The vehicle’s sideslip angle is chosen as a constraint and controlled within a stable range. Tire stiffness is estimated online: the vehicle’s lateral acceleration and yaw rate acceleration are modelled in the two-degree-of-freedom equation to solve for the tire cornering stiffness in real time, which ensures the accuracy of the model.
Xuemei Li, Ya Zhang and Kedong Yin
Abstract
Purpose
The traditional grey relational models directly describe the behavioural characteristics of systems based on sample point connections. Few grey relational models can measure the dynamic periodic fluctuation rules of the objects, and most lack the affinity property, so the relational results become unstable under sequence translation. The paper aims to discuss these issues.
Design/methodology/approach
Fourier transform functions are used to fit the system behaviour curves, redefine the area difference between the curves and construct a grey relational model based on discrete Fourier transform (DFTGRA).
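The recipe described here, fitting the behaviour curves with Fourier series and then scoring the area difference between the fits, can be sketched roughly as follows; the mapping from area to relational degree below is purely illustrative and not the DFTGRA formula.

```python
import numpy as np

def fourier_fit(y, n_harmonics=3):
    # Fit a sequence with a truncated discrete Fourier series.
    coeffs = np.fft.rfft(y)
    coeffs[n_harmonics + 1:] = 0          # keep low-frequency structure
    return np.fft.irfft(coeffs, len(y))

def relational_degree(x, y, n_harmonics=3):
    fx, fy = fourier_fit(x, n_harmonics), fourier_fit(y, n_harmonics)
    gap = np.abs(fx - fy).mean()          # mean gap between fitted curves
    return 1.0 / (1.0 + gap)              # larger gap -> weaker relation

t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
a = np.sin(t)
b = np.sin(t) + 0.1                       # translated copy of a
c = np.cos(2 * t)                         # different periodic behaviour
```

Because the fitted curves capture the periodic shape rather than raw sample points, a pure translation (sequence b) barely changes the score, which is the affinity-style behaviour the abstract emphasises.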
Findings
To verify its validity, feasibility and superiority, DFTGRA is applied to research on the correlation between macroeconomic growth and marine economic growth in China coastal areas. It is proved that DFTGRA has the superior properties of affinity, symmetry, uniqueness, etc., and wide applicability.
Originality/value
DFTGRA can not only be applied to equidistant and equal time sequences but also be adopted for non-equidistant and unequal time sequences. DFTGRA can measure both the global relational degree and the dynamic correlation of the variable cyclical fluctuation between sequences.
Jannicke Baalsrud Hauge and Yongkuk Jeong
Abstract
Purpose
This research analyses challenges faced by users at various levels in planning and designing participatory simulation models of cities. It aims to identify issues that hinder experts from maximising the effectiveness of the SUMO (Simulation of Urban MObility) tool. Additionally, evaluating current methods highlights their strengths and weaknesses, facilitating the use of participatory simulation advantages to address these issues. Finally, the presented case studies illustrate the diversity of user groups and emphasise the need for further development of blueprints.
Design/methodology/approach
In this research, action research was used to assess and improve a step-by-step guideline. The guideline's conceptual design is based on stakeholder analysis results from those involved in developing urban logistics scenarios and feedback from potential users. A two-round process of application and refinement was conducted to evaluate and enhance the guideline's initial version.
Findings
The guidelines still demand an advanced skill level in simulation modelling, rendering them less effective for the intended audience. However, they have proven beneficial in a simulation course for students, emphasising the importance of developing accurate conceptual models and the need for careful implementation.
Originality/value
This paper introduces a step-by-step guideline designed to tackle challenges in modelling urban logistics scenarios using SUMO simulation software. The guideline's effectiveness was tested and enhanced through experiments involving diverse groups of students, varying in their experience with simulation modelling. This approach demonstrates the guideline's applicability and adaptability across different skill levels.
Isabel María Parra Oller, Salvador Cruz Rambaud and María del Carmen Valls Martínez
Abstract
Purpose
The main purpose of this paper is to determine the discount function which better fits the individuals' preferences through the empirical analysis of the different functions used in the field of intertemporal choice.
Design/methodology/approach
After an in-depth revision of the existing literature, and unlike most studies, which focus only on exponential and hyperbolic discounting, this manuscript compares how well the data fit six different discount functions. To do this, the analysis applies the usual statistical methods and non-linear least-squares regression, using the Gauss-Newton algorithm, to estimate the models' parameters; finally, the corrected Akaike information criterion (AICc) is used to compare the six proposed models.
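The fitting-and-comparison step can be sketched as follows; SciPy's Levenberg-Marquardt routine stands in for the plain Gauss-Newton algorithm, only two of the six candidate functions are shown, and the (delay, present-value) data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def exponential(t, k):
    return np.exp(-k * t)               # exponential discounting

def hyperbolic(t, k):
    return 1.0 / (1.0 + k * t)          # hyperbolic discounting

def aicc(rss, n, p):
    # Corrected Akaike information criterion for least squares.
    return n * np.log(rss / n) + 2 * p + 2 * p * (p + 1) / (n - p - 1)

t = np.array([0.5, 1, 2, 4, 8, 16, 26, 52], dtype=float)  # delays
v = 1.0 / (1.0 + 0.15 * t)                                # hyperbolic "truth"
v += 0.01 * np.random.default_rng(1).standard_normal(t.size)

scores = {}
for name, f in [("exponential", exponential), ("hyperbolic", hyperbolic)]:
    (k,), _ = curve_fit(f, t, v, p0=[0.1])    # nonlinear least squares
    rss = np.sum((v - f(t, k)) ** 2)
    scores[name] = aicc(rss, t.size, 1)

best = min(scores, key=scores.get)      # lower AICc wins
```

AICc rather than raw residual error is used because it penalises extra parameters, which matters when comparing discount functions of different complexity.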
Findings
This paper shows that the so-called q-exponential function deformed by the amount is the model that best explains individuals' preferences over both delayed gains and losses. To the extent of the authors' knowledge, this is the first time that a function different from the general hyperbola has provided the better fit to individuals' preferences.
Originality/value
This paper contributes to the search of an alternative model able to explain the individual behavior in a more realistic way.
Juan David Cortes, Jonathan E. Jackson and Andres Felipe Cortes
Abstract
Purpose
Despite the abundance of small-scale farms in the USA and their importance for both rural economic development and food availability, the extensive research on small business management and entrepreneurship has mostly neglected the agricultural context, leaving many of these farms' business challenges unexplored. The authors focus on informing a specific decision faced by small farm managers: selling directly to consumers (i.e. farmers' markets) versus selling through aggregators. By collecting historical data and conducting a series of interviews with industry experts, the authors employ simulation methodology to offer a framework that advises how small-scale farmers can allocate their product across these two channels to increase revenue in a given season. The results, which are relevant to the operations management, small business management and entrepreneurship literature, can help small-scale farmers improve their performance and compete against their larger counterparts.
Design/methodology/approach
The authors rely on historical and interview data from key industry players (an aggregator and a small farm manager) to design a simulation analysis that determines which factors influence season-long farm revenue performance under varying strategies of channel allocation and commodity production.
Findings
The model suggests that farm managers should plan to evenly split their production between the two distribution channels, but if an even split is not possible, they should plan to keep a larger percentage in the nonaggregator (farmers' market/direct) channel. Further, the authors find that farmers can benefit significantly from a strong aggregator channel customer base, which suggests that farmers should promote and advertise the aggregator channel even if they only use it for a limited amount of their product.
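A toy Monte Carlo version of this allocation question might look as follows; all prices, capacities and demand distributions are hypothetical, not the paper's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
weeks, runs, harvest = 20, 2000, 100.0      # season length, replications, units/week
p_direct, p_agg = 3.0, 2.0                  # direct channel sells at a premium

def season_revenue(direct_share):
    # Split the weekly harvest between the two channels, then sell
    # against random weekly demand in each.
    direct_cap = harvest * direct_share
    agg_cap = harvest - direct_cap
    d_direct = rng.uniform(20, 80, size=(runs, weeks))
    d_agg = rng.uniform(40, 120, size=(runs, weeks))
    sold_d = np.minimum(direct_cap, d_direct)
    sold_a = np.minimum(agg_cap, d_agg)
    return (p_direct * sold_d + p_agg * sold_a).sum(axis=1).mean()

revenues = {s: season_revenue(s) for s in (0.25, 0.5, 0.75)}
```

Comparing expected season revenue across candidate splits in this way is the basic logic behind the even-split recommendation, though the paper's model is calibrated with historical and interview data rather than uniform demand.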
Originality/value
The authors integrate small business management and operations management literature to study a widely understudied context and present practical implications for the performance of small-scale farms.