Search results
1–10 of 62

Patrik Jonsson, Johan Öhlin, Hafez Shurrab, Johan Bystedt, Azam Sheikh Muhammad and Vilhelm Verendel
Abstract
Purpose
This study aims to explore and empirically test variables influencing material delivery schedule inaccuracies.
Design/methodology/approach
A mixed-method case approach is applied. Explanatory variables are identified from the literature and explored in a qualitative analysis at an automotive original equipment manufacturer. Using logistic regression and random forest classification models, quantitative data (historical schedule transactions and internal data) enables the testing of the predictive difference of variables under various planning horizons and inaccuracy levels.
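The quantitative step described above can be illustrated with a minimal sketch comparing a logistic regression and a random forest classifier, here on synthetic, invented "schedule transaction" features (product complexity, order life-cycle age, planning horizon); the study's actual data and variable definitions are not reproduced:

```python
# Sketch only: synthetic features and labels, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Invented features: product complexity, order life-cycle age, horizon.
X = rng.normal(size=(n, 3))
# Synthetic label: schedule line inaccurate (1) or accurate (0).
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scores = {}
for model in (LogisticRegression(), RandomForestClassifier(random_state=0)):
    model.fit(X_tr, y_tr)
    scores[type(model).__name__] = accuracy_score(y_te, model.predict(X_te))
```

Comparing the two fitted models' held-out accuracy is the classification-model-comparison pattern the abstract describes, applied here to made-up data.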
Findings
The effects on delivery schedule inaccuracies are contingent on a decoupling point, and a variable may have a combined amplifying (complexity generating) and stabilizing (complexity absorbing) moderating effect. Product complexity variables are significant regardless of the time horizon, and the item’s order life cycle is a significant variable with predictive differences that vary. Decoupling management is identified as a mechanism for generating complexity absorption capabilities contributing to delivery schedule accuracy.
Practical implications
The findings provide guidelines for exploring and finding patterns in specific variables to reduce material delivery schedule inaccuracies, and serve as input into predictive forecasting models.
Originality/value
The findings contribute to explaining material delivery schedule variations, identifying potential root causes and moderators, empirically testing and validating effects, and conceptualizing the features that cause and moderate inaccuracies in relation to the decoupling management and complexity theory literature.
Olufemi Gbenga Onatunji, Oluwayemisi Kadijat Adeleke and Akintoye Victor Adejumo
Abstract
Purpose
This study reinvestigates the validity of the Phillips curve in Nigeria for the period 1980–2020 by considering the asymmetric nexus between unemployment and inflation.
Design/methodology/approach
The nonlinear autoregressive distributed lag (NARDL) technique was used to decompose the unemployment variable into two components: tight and loosened labour markets.
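The NARDL-style decomposition mentioned above splits a series into partial sums of its positive and negative changes. A minimal sketch on a made-up unemployment series (the paper's actual data and lag structure are not shown):

```python
# Sketch only: partial-sum decomposition on hypothetical data.
import numpy as np

u = np.array([4.0, 4.5, 4.2, 5.0, 4.8, 5.5])  # hypothetical unemployment rates
du = np.diff(u)
# Partial sums of positive and negative changes, following the abstract's
# labelling of the positive component as "tight", the negative as "loosened".
u_pos = np.concatenate([[0.0], np.cumsum(np.maximum(du, 0))])
u_neg = np.concatenate([[0.0], np.cumsum(np.minimum(du, 0))])

# The initial level plus the two partial sums reconstructs the series.
assert np.allclose(u[0] + u_pos + u_neg, u)
```

Each component then enters the ARDL regression separately, which is what allows tight and loosened labour-market regimes to carry different coefficients.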
Findings
The empirical outcome shows that unemployment has a significant negative effect on inflation when the labour market is tight, and a weaker but still significant negative effect when the labour market is loose. The study confirms an asymmetric Phillips curve in Nigeria, since the positive (tight) component of the unemployment rate exerts a greater effect on inflation than the negative (loosened) component.
Practical implications
The findings of this study have important implications for implementing monetary policy in Nigeria.
Originality/value
To the best of the authors’ knowledge, this is the first study to investigate the existence of a nonlinear Phillips curve in Nigeria.
Yongjiang Xue, Wei Wang and Qingzeng Song
Abstract
Purpose
The primary objective of this study is to tackle the enduring challenge of preserving feature integrity during the manipulation of geometric data in computer graphics. Our work aims to introduce and validate a variational sparse diffusion model that enhances the capability to maintain the definition of sharp features within meshes throughout complex processing tasks such as segmentation and repair.
Design/methodology/approach
We developed a variational sparse diffusion model that integrates a high-order L1 regularization framework with Dirichlet boundary constraints, specifically designed to preserve edge definition. This model employs an innovative vertex updating strategy that optimizes the quality of mesh repairs. We leverage the augmented Lagrangian method to address the computational challenges inherent in this approach, enabling effective management of the trade-off between diffusion strength and feature preservation. Our methodology involves a detailed analysis of segmentation and repair processes, focusing on maintaining the acuity of features on triangulated surfaces.
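The augmented-Lagrangian treatment of an L1 regularizer can be illustrated on a much simpler analogue: 1-D first-order L1 (total-variation) denoising solved with an ADMM split. This sketches only the optimization idea; it is not the paper's high-order mesh model:

```python
# Sketch only: 1-D L1-regularized smoothing via an augmented-Lagrangian
# (ADMM) split, as a toy analogue of the paper's variational model.
import numpy as np

def soft(v, t):
    """Soft-thresholding, the proximal map of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv_denoise_admm(f, lam=1.0, rho=2.0, iters=200):
    """Edge-preserving L1 smoothing of a 1-D signal."""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # first-difference operator
    A = np.eye(n) + rho * D.T @ D
    x, z, w = f.copy(), np.zeros(n - 1), np.zeros(n - 1)
    for _ in range(iters):
        x = np.linalg.solve(A, f + rho * D.T @ (z - w))  # quadratic step
        z = soft(D @ x + w, lam / rho)                   # L1 (sparsity) step
        w = w + D @ x - z                                # dual update
    return x

# A noisy step: L1 diffusion smooths the flats but keeps the jump sharp,
# which is the feature-preservation behaviour the abstract refers to.
rng = np.random.default_rng(1)
f = np.concatenate([np.zeros(30), np.ones(30)]) + 0.1 * rng.normal(size=60)
x = tv_denoise_admm(f, lam=0.5)
```

The split lets the quadratic diffusion term and the non-smooth sparsity term be handled in alternating, easy sub-steps, which is the computational role the augmented Lagrangian plays in the paper's setting as well.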
Findings
Our findings indicate that the proposed variational sparse diffusion model significantly outperforms traditional smooth diffusion methods in preserving sharp features during mesh processing. The model ensures the delineation of clear boundaries in mesh segmentation and achieves high-fidelity restoration of deteriorated meshes in repair tasks. The innovative vertex updating strategy within the model contributes to enhanced mesh quality post-repair. Empirical evaluations demonstrate that our approach maintains the integrity of original, sharp features more effectively, especially in complex geometries with intricate detail.
Originality/value
The originality of this research lies in the novel application of a high-order L1 regularization framework to the field of mesh processing, a method not conventionally applied in this context. The value of our work is in providing a robust solution to the problem of feature degradation during the mesh manipulation process. Our model’s unique vertex updating strategy and the use of the augmented Lagrangian method for optimization are distinctive contributions that enhance the state-of-the-art in geometry processing. The empirical success of our model in preserving features during mesh segmentation and repair presents an advancement in computer graphics, offering practical benefits to both academic research and industry applications.
Yu Song, Bingrui Liu, Lejia Li and Jia Liu
Abstract
Purpose
In recent years, terrorist attacks have gradually become one of the important factors endangering social security. In this context, this research aims to propose methods and principles which can be utilized to make effective evacuation plans to reduce casualties in terrorist attacks.
Design/methodology/approach
By analyzing the statistical data of terrorist attack videos, this paper proposes an extended cellular automaton (CA) model and simulates the panic evacuation of the pedestrians in the terrorist attack.
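A minimal floor-field cellular automaton of the kind that typically underlies such evacuation models can be sketched as follows; the paper's extended rules, panic contagion and attacker dynamics are not reproduced here, and the grid and positions are invented:

```python
# Sketch only: a bare-bones floor-field CA evacuation step.
import numpy as np

W = 7
exit_cell = (3, 6)
# Static floor field: Manhattan distance to the exit; pedestrians descend it.
yy, xx = np.mgrid[0:W, 0:W]
field = np.abs(yy - exit_cell[0]) + np.abs(xx - exit_cell[1])

def step(peds):
    """Move each pedestrian to the free neighbour with the lowest field value."""
    occupied = set(peds)
    out = []
    for (r, c) in peds:
        best = (r, c)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < W and 0 <= nc < W and (nr, nc) not in occupied:
                if field[nr, nc] < field[best]:
                    best = (nr, nc)
        occupied.discard((r, c))
        occupied.add(best)
        out.append(best)
    return out

peds = [(3, 0), (0, 3)]
for _ in range(10):
    # Pedestrians reaching the exit cell leave the room.
    peds = [p for p in step(peds) if p != exit_cell]
```

Extensions such as the paper's would replace the static field with dynamic risk perception and add attacker cells, but the update skeleton stays the same.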
Findings
The main findings are as follows. (1) The panic movement of pedestrians leads to the dispersal of the crowd and an increase in evacuation time. (2) Most deaths occur in the early stage of crowd evacuation, while pedestrians gather without perceiving the risk. (3) Pedestrians face a trade-off between escaping from the room and avoiding the attackers; appropriate panic contagion enables them to respond more quickly to risks. (4) Casualties are mainly concentrated in complex terrains, e.g. walls, corners, obstacles and exits. (5) The initial position of the attackers has a significant effect on the crowd evacuation: evacuation efficiency is reduced if the attacker starts the attack from an exit or a corner.
Originality/value
In this research, the concept of “focus region” is proposed to depict the different reactions of pedestrians to danger and the effects of the attacker’s motion (especially the attack strategies of attackers) are classified. Additionally, the influences on pedestrians by direct and indirect panic sources are studied.
Arjun J Nair, Sridhar Manohar and Amit Mittal
Abstract
Purpose
Amidst unpredictable and turbulent periods, such as the COVID-19 pandemic, service organizations’ responses are required to be innovative, adaptable and resilient. The purpose of this study is to explore the utilization of both reconfiguration and transformational strategies as instruments for cultivating resilience and advancing sustainability in service organizations.
Design/methodology/approach
The study examines a proposed resilience model using fuzzy logic. The research also used a semantic differential scale to capture nuanced and intricate attitudes. Finally, to augment the validity of the resilience model, a measurement scale was formulated using business mathematics and expert opinions.
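The fuzzy-logic element of such a model rests on membership functions that map scale ratings onto linguistic categories. A minimal sketch with a triangular membership function, using invented category anchors (the paper's actual rule base and scale is not reproduced):

```python
# Sketch only: triangular fuzzy membership with made-up anchors.
def tri(x, a, b, c):
    """Triangular membership: 0 outside [a, c], rising to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Map a 1-7 semantic-differential rating onto low/medium/high resilience.
rating = 5.5
low = tri(rating, 1, 2, 4)
med = tri(rating, 3, 4.5, 6)
high = tri(rating, 5, 6, 7)
```

A rating can belong partially to several categories at once (here both "medium" and "high"), which is what lets fuzzy models capture the nuanced attitudes a semantic differential scale elicits.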
Findings
Although investing in resilience training can help organizations gain control and maintain their operations in times of crisis, it may not directly help service organizations understand the external turmoil, seek available resources or create adaptive remedies. Conversely, high levels of reconfiguration and transformation management vigour empower a service organization’s revolutionary, malleable vision, organizational structure and decision-making processes, welcoming talented and innovative employees to enhance capabilities during crises.
Research limitations/implications
The resilience model bestows a comprehensive understanding of the pertinence of building resilience for service organizations, identifies the antecedents that influence the adoption of these strategies and introduces a range of theoretical perspectives that empower service organizations to conceptualize and plan for building resilience. The research guides service organizations to become more resilient to external shocks and to adapt to changing circumstances by diversifying their offerings, optimizing their resources and adopting flexible work arrangements. The study elaborates on how adopting both reconfiguration and transformational strategies enhances resilience, increases innovation, improves efficiency and enhances customer satisfaction, enabling service organizations to remain competitive and to contribute to positive social and economic outcomes.
Practical implications
The study also guides the service organizations to become more resilient to external shocks and adapt to changing circumstances by diversifying their offerings, optimizing their resources and adopting flexible work arrangements. Rapid innovation and business model innovation are essential components, enabling service organizations to foster a culture of innovation and remain competitive. In addition, the adoption can lead to improved financial performance, job creation and economic growth, contributing to positive social and economic impacts.
Social implications
The resilience model bestows a comprehensive understanding of the pertinence of building resilience for service organizations. It identifies the antecedents that influence the adoption of these strategies and introduces a range of theoretical perspectives that empower service organizations to conceptualize and plan for building resilience. The research also provides a foundation for further investigation into the effectiveness of these strategies and their impact on organizational performance and sustainability. By better preparing service organizations for disruptions and uncertainties, this research supports improved organizational performance and sustainability.
Originality/value
Within the realm of the service industry, the present investigation has undertaken the development, quantification and scrutiny of both resilience and tenacity. In addition, it has delved into the intricate dynamics surrounding the influencing factors and antecedents that bear upon resilience, elucidating their consequential impact on the operational performance and outlook of service-oriented organizations. The findings derived from this research furnish valuable insights germane to enhancing operational efficacy and surmounting impediments within the sector.
Cathrine Banga, Abraham Deka, Salim Hamza Ringim, Abubakar Sadiq Mustapha, Hüseyin Özdeşer and Hasan Kilic
Abstract
Purpose
The current study aims to ascertain the association between tourism development, economic growth and environmental quality by using the short-run and long-run autoregressive distributive lag model.
Design/methodology/approach
Tourism development has a major role to play in improving a nation’s economic growth. However, it is also blamed for exacerbating environmental pollution because of its massive use of energy (non-renewable energy).
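An ARDL-style short-run/long-run relationship of the kind the study estimates can be sketched with a plain OLS fit of an ARDL(1,1) on made-up data; the study's actual variables, lag orders and bounds tests are not reproduced:

```python
# Sketch only: ARDL(1,1) by OLS on simulated data.
import numpy as np

rng = np.random.default_rng(2)
T = 200
x = rng.normal(size=T).cumsum()          # a hypothetical trending regressor
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * x[t] + 0.2 * x[t - 1] + 0.1 * rng.normal()

# Regress y_t on (1, y_{t-1}, x_t, x_{t-1}).
Z = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
beta, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)

# Long-run multiplier of x on y: (b_x + b_xlag) / (1 - b_ylag).
lr = (beta[2] + beta[3]) / (1 - beta[1])
```

The short-run effects are the individual lag coefficients, while the long-run multiplier combines them as above; here the true value is (0.3 + 0.2) / (1 − 0.5) = 1 by construction.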
Findings
The major findings of this research show that renewable energy (RE) use and gross domestic product (GDP) negatively impact carbon dioxide (CO2) emissions in South Africa. Tourism arrivals and CO2 emissions negatively impact GDP, while capital positively impacts GDP in the long run.
Practical implications
This research recommends the use of RE, since it reduces carbon emissions, and capital, as it remains the major driver of economic growth.
Originality/value
The originality of the current research is that it uses long-period annual time series data from 1971 to 2019 of South Africa, one of the largest tourist nations in Africa. To the best of the authors’ knowledge, no studies have examined South Africa in this context and minimal research has been conducted to ascertain the impact of the tourism industry on the environment, despite the accusations directed toward it.
Jinsong Zhang, Xinlong Wang, Chen Yang, Mingkang Sun and Zhenwei Huang
Abstract
Purpose
This study aims to investigate the noise-inducing characteristics during the start-up process of a mixed-flow pump and the impact of different start-up schemes on pump noise.
Design/methodology/approach
This study conducted numerical simulations on the mixed-flow pump under different start-up schemes and investigated the flow characteristics and noise distribution under these schemes.
Findings
The results reveal that the dipole noise is mainly caused by pressure fluctuations, while the quadrupole noise is mainly generated by the generation, development and breakdown of vortices. Additionally, the noise evolution characteristics during the start-up process of the mixed-flow pump can be divided into the initial stage, stable growth stage, impulse stage and stable operation stage.
Originality/value
The findings of this study can provide a theoretical basis for the selection of start-up schemes for mixed-flow pumps, reducing flow noise and improving the operational stability of mixed-flow pumps.
Gaurav Kumar, Molla Ramizur Rahman, Abhinav Rajverma and Arun Kumar Misra
Abstract
Purpose
This study aims to analyse the systemic risk emitted by all publicly listed commercial banks in a key emerging economy, India.
Design/methodology/approach
The study makes use of the Adrian and Brunnermeier (2016) estimator to quantify the systemic risk (ΔCoVaR) that banks contribute to the system. The methodology addresses a classification problem based on the probability that a particular bank will emit high or moderate systemic risk. The study applies machine learning models such as logistic regression, random forest (RF), neural networks and gradient boosting machine (GBM), addresses the issue of imbalanced data sets and investigates the balance sheet and stock features that may determine a bank’s systemic risk emission.
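The imbalanced-classification setup can be illustrated with a class-weighted random forest on synthetic data; the feature names echo those the study reports as important, but the data, labels and model settings here are entirely invented:

```python
# Sketch only: class-weighted RF on synthetic, imbalanced data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 2000
# Invented features: lagged systemic-risk estimate, stock beta,
# stock volatility, return on equity.
X = rng.normal(size=(n, 4))
# Rare positive class (~10%): "high systemic risk emitter".
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 1.8).astype(int)

# class_weight="balanced" reweights the rare class so the forest
# does not simply predict the majority label.
clf = RandomForestClassifier(class_weight="balanced", random_state=0).fit(X, y)
importances = clf.feature_importances_   # which features drive the label
```

Inspecting `feature_importances_` is one way such models surface the kind of feature ranking the Findings section reports.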
Findings
Across various performance metrics, the authors find that two specifications are preferred: RF and GBM. The study identifies the lag of the systemic risk estimator, stock beta, stock volatility and return on equity as important features for explaining the emission of systemic risk.
Practical implications
The findings will help banks and regulators with the key features that can be used to formulate the policy decisions.
Originality/value
This study contributes to the existing literature by suggesting classification algorithms that can be used to model the probability of systemic risk emission in a classification problem setting. Further, the study identifies the features responsible for the likelihood of systemic risk.
Julien Dhima and Catherine Bruneau
Abstract
Purpose
This study aims to demonstrate and measure the impact of liquidity shocks on a bank’s solvency, especially when the bank does not hold sufficient liquid assets.
Design/methodology/approach
The proposed model is an extension of Merton’s (1974) model. It assesses the bank’s probability of default over one or two (short) periods relative to liquidity shocks. The shock scenarios are materialised by different net demands for the withdrawal of funds (NDWF) and may lead the bank to sell illiquid assets at a depreciated value. We consider the possibility of second-round effects at the beginning of the second period by introducing the probability of their occurrence. This probability depends on the proportion of illiquid assets put up for sale following the initial shock in different dependency scenarios.
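The baseline that the model extends can be sketched directly: in the Merton (1974) setting, the one-period default probability follows from asset value, debt and asset volatility. The liquidity-shock (NDWF) and second-round machinery of the paper is not reproduced here; the numbers below are illustrative only:

```python
# Sketch only: one-period default probability in the baseline Merton model.
from math import log, sqrt
from statistics import NormalDist

def merton_pd(V, D, mu, sigma, T=1.0):
    """P(asset value < debt at horizon T) under geometric Brownian motion."""
    d2 = (log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return NormalDist().cdf(-d2)

pd_base = merton_pd(V=120, D=100, mu=0.05, sigma=0.2)
# A liquidity shock forcing fire sales of illiquid assets lowers V,
# which is the channel through which it raises the default probability.
pd_shocked = merton_pd(V=105, D=100, mu=0.05, sigma=0.2)
```

The paper's contribution is to make the drop in V endogenous to the NDWF scenario and to chain two such periods, but the solvency calculation at each stage rests on this structure.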
Findings
We observe a positive relationship between the initial NDWF and the bank’s probability of default (particularly over the second period, which is conditional on the second-round effects). However, this relationship is not linear, and a significant proportion of liquid assets makes it possible to attenuate or even eliminate the effects of shock scenarios on bank solvency.
Practical implications
The proposed model enables banks to determine the necessary level of liquid assets, allowing them to resist (i.e. remain solvent) different liquidity shock scenarios for both periods (including eventual second-round effects) under the assumptions considered. Therefore, it can contribute to complementing or improving current internal liquidity adequacy assessment processes (ILAAPs).
Originality/value
The proposed microprudential approach consists of measuring the impact of liquidity risk on a bank’s solvency, complementing the current prudential framework in which these two topics are treated separately. It also complements the existing literature, in which the impact of liquidity risk on solvency risk has not been sufficiently studied. Finally, our model allows banks to manage liquidity using a solvency approach.
Keywords
- Liquidity shock scenarios
- Bank solvency
- Probability of default (over one and two periods)
- Net demand for the withdrawal of funds (NDWF)
- Liquid and illiquid assets
- Second-round effects
- Probability of the occurrence of second-round effects
- Internal liquidity adequacy assessment process (ILAAP)
- C30
- G01
- G21
- G33
Hu Luo, Haobin Ruan and Dawei Tu
Abstract
Purpose
The purpose of this paper is to propose a whole set of methods for underwater target detection, because most underwater targets offer only small samples and underwater images suffer from quality problems such as detail loss, low contrast and color distortion, and to verify the feasibility of the proposed methods through experiments.
Design/methodology/approach
An improved RGHS algorithm is proposed to enhance the original underwater target images. The YOLOv4 deep learning network is then improved for detecting small-sample underwater targets by combining a traditional data expansion method with the Mosaic algorithm, and the feature extraction capability is expanded with an SPP (Spatial Pyramid Pooling) module after each feature extraction layer to extract richer feature information.
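The spatial pyramid pooling idea can be sketched in its classic fixed-grid form: pooling a feature map into 1×1, 2×2 and 4×4 grids and concatenating the results gives a fixed-length vector regardless of input size. This is a simplified single-channel NumPy analogue; the SPP variant used inside YOLOv4 differs in detail (it concatenates parallel max-poolings with different kernel sizes):

```python
# Sketch only: classic fixed-grid SPP on a single-channel feature map.
import numpy as np

def spp(feature, levels=(1, 2, 4)):
    """Max-pool `feature` into 1x1, 2x2 and 4x4 grids and concatenate."""
    H, W = feature.shape
    pooled = []
    for n in levels:
        hs = np.array_split(np.arange(H), n)
        ws = np.array_split(np.arange(W), n)
        for h in hs:
            for w in ws:
                pooled.append(feature[np.ix_(h, w)].max())
    return np.array(pooled)

fmap = np.arange(64, dtype=float).reshape(8, 8)
vec = spp(fmap)   # length 1 + 4 + 16 = 21, regardless of input size
```

Pooling at several grid scales at once is what lets the module aggregate context at multiple receptive-field sizes, the "richer feature information" the abstract refers to.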
Findings
The experimental results, using the official dataset, reveal a 3.5% increase in average detection accuracy for three types of underwater biological targets compared to the traditional YOLOv4 algorithm. In underwater robot application testing, the proposed method achieves an impressive 94.73% average detection accuracy for the three types of underwater biological targets.
Originality/value
Underwater target detection is an important task for underwater robot applications. However, most underwater targets offer only small samples, and detecting such small-sample targets is a comprehensive problem because it is also affected by the quality of underwater images. This paper provides a whole set of methods to solve these problems, which is of great significance to the application of underwater robots.