Search results
1 – 10 of 27
Claire Economidou, Dimitris Karamanis, Alexandra Kechrinioti, Konstantinos N. Konstantakis and Panayotis G. Michaelides
Abstract
Purpose
In this work, the authors analyze the dynamic interdependencies between military expenditures and the real economy for the period 1970–2018, and the authors' approach allows for the existence of dominant economies in the system.
Design/methodology/approach
In this study, the authors employ a Network General Equilibrium GVAR (global vector autoregressive) model.
Findings
By accounting for the interconnections among the top twelve military spenders, the authors' findings show that China acts as a leader in the global military scene based on the respective centrality measures. Meanwhile, statistically significant deviations from equilibrium are observed in most economies' military expenses when they are subjected to an unanticipated unit shock from other countries. Nonetheless, in the medium run, the shocks tend to die out and the economies converge to an equilibrium position.
Originality/value
With this methodology the authors are able to capture not only the effect of nearness on a country's military spending, as the past literature has documented, but also a country's defense and economic dependencies with other countries and how one country's military expenses could shape the spending of the rest. Using state-of-the-art quantitative and econometric techniques, the authors provide a robust and comprehensive analysis.
Details
Keywords
Fangqi Hong, Pengfei Wei and Michael Beer
Abstract
Purpose
Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the Posterior Variance Contribution (PVC) function, have been developed for adaptive experiment design of the integration points. However, those sequential design strategies also prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve the computational efficiency.
Design/methodology/approach
By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all make important contributions to the integration accuracy and can be selected as design points, providing a practical way to parallelize the adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are then introduced for finding multiple local maxima of the PVC function in one run, and further for parallel implementation of the adaptive BC.
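The abstract does not reproduce the PVC function or the four optimizers; as a rough sketch of the parallelization idea, the toy below finds all local maxima of a multimodal acquisition surrogate in one pass, so each maximum could serve as a design point evaluated in parallel (the function and all numbers are hypothetical stand-ins):

```python
import math

# Toy stand-in for a multimodal acquisition function (NOT the paper's
# PVC function): a sum of three Gaussian bumps with maxima near 1, 4, 7.
def acquisition(x):
    return sum(math.exp(-((x - c) ** 2) / 0.5) for c in (1.0, 4.0, 7.0))

# Scan a grid and keep every strict local maximum; in a parallel
# adaptive scheme each such point could become a new design point.
def local_maxima(f, lo, hi, n=2001):
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    return [xs[i] for i in range(1, n - 1) if ys[i - 1] < ys[i] > ys[i + 1]]

peaks = local_maxima(acquisition, 0.0, 8.0)
print(peaks)  # three peaks, near x = 1, 4 and 7
```

A multimodal optimizer (or, as the paper compares against, k-means clustering of candidate points) replaces the grid scan in higher dimensions; the grid is only for a transparent one-dimensional illustration.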
Findings
The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are then demonstrated and compared with the k-means clustering method by using two numerical benchmarks and two engineering examples.
Originality/value
Multimodal behavior of acquisition function for BC is comprehensively investigated. All the local maxima of the acquisition function contribute to adaptive BC accuracy. Parallelization of adaptive BC is realized with four multimodal optimization methods.
Details
Keywords
Warisa Thangjai and Sa-Aat Niwitpong
Abstract
Purpose
Confidence intervals play a crucial role in economics and finance, providing a credible range of values for an unknown parameter along with a corresponding level of certainty. Their applications encompass economic forecasting, market research, financial forecasting, econometric analysis, policy analysis, financial reporting, investment decision-making, credit risk assessment and consumer confidence surveys. Signal-to-noise ratio (SNR) finds applications in economics and finance across various domains such as economic forecasting, financial modeling, market analysis and risk assessment. A high SNR indicates a robust and dependable signal, simplifying the process of making well-informed decisions. On the other hand, a low SNR indicates a weak signal that could be obscured by noise, so decision-making procedures need to take this into serious consideration. This research focuses on the development of confidence intervals for functions derived from the SNR and explores their application in the fields of economics and finance.
Design/methodology/approach
The construction of the confidence intervals involved the application of various methodologies. For the SNR, confidence intervals were formed using the generalized confidence interval (GCI), large sample and Bayesian approaches. The difference between SNRs was estimated through the GCI, large sample, method of variance estimates recovery (MOVER), parametric bootstrap and Bayesian approaches. Additionally, confidence intervals for the common SNR were constructed using the GCI, adjusted MOVER, computational and Bayesian approaches. The performance of these confidence intervals was assessed using coverage probability and average length, evaluated through Monte Carlo simulation.
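The GCI, MOVER and Bayesian constructions above are not detailed in the abstract; as a generic illustration of interval estimation for the SNR (sample mean over sample standard deviation), the sketch below uses a percentile bootstrap, which is not one of the paper's methods, on simulated data:

```python
import random
import statistics

random.seed(42)

def snr(xs):
    # Signal-to-noise ratio: sample mean over sample standard deviation.
    return statistics.mean(xs) / statistics.stdev(xs)

# Simulated data: mean 10, sd 2, so the true SNR is about 5.
data = [random.gauss(10, 2) for _ in range(200)]

# Percentile bootstrap: resample with replacement, recompute the SNR,
# and take the empirical 2.5% and 97.5% quantiles as a 95% interval.
boot = sorted(
    snr([random.choice(data) for _ in data]) for _ in range(2000)
)
lower, upper = boot[49], boot[1949]  # approximate 2.5th / 97.5th percentiles
print(lower, snr(data), upper)
```

Coverage probability and average length, the paper's two performance criteria, would then be estimated by repeating this construction over many simulated samples and recording how often the interval contains the true SNR and how wide it is.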
Findings
The GCI approach demonstrated superior performance over other approaches in terms of both coverage probability and average length for the SNR and the difference between SNRs. Hence, employing the GCI approach is advised for constructing confidence intervals for these parameters. As for the common SNR, the Bayesian approach exhibited the shortest average length. Consequently, the Bayesian approach is recommended for constructing confidence intervals for the common SNR.
Originality/value
This research presents confidence intervals for functions of the SNR to assess SNR estimation in the fields of economics and finance.
Details
Keywords
Mohd Irfan and Anup Kumar Sharma
Abstract
Purpose
A progressive hybrid censoring scheme (PHCS) becomes impractical for ensuring dependable outcomes when there is a low likelihood of encountering a small number of failures prior to the predetermined terminal time T. The generalized progressive hybrid censoring scheme (GPHCS) efficiently overcomes this limitation of the PHCS.
Design/methodology/approach
In this article, estimation of the model parameter, survival function and hazard rate of the Unit-Lindley distribution (ULD), when the sample comes from the GPHCS, is considered. The maximum likelihood estimator has been derived using Newton–Raphson iterative procedures. Approximate confidence intervals of the model parameter and its arbitrary functions are established via the Fisher information matrix. Bayesian estimation procedures have been derived using the Metropolis–Hastings algorithm under the squared error loss function. Convergence of the Markov chain Monte Carlo (MCMC) samples has been examined. Various optimality criteria have been considered. An extensive Monte Carlo simulation analysis is presented to compare and validate the proposed estimation techniques.
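The ULD likelihood under the GPHCS is not reproduced in the abstract; as a minimal sketch of the Newton–Raphson step it describes, the toy below fits an exponential rate instead, chosen only because its closed-form MLE (1 / sample mean) lets the iteration be checked:

```python
# Newton-Raphson for a maximum likelihood estimate. The paper's
# Unit-Lindley likelihood under GPHCS is not reproduced here; as a
# stand-in we fit an exponential rate, whose closed form (1 / mean)
# lets us verify that the iteration converges to the right answer.

data = [0.8, 1.3, 2.1, 0.4, 1.7, 0.9, 2.6, 1.1]   # hypothetical sample
n, s = len(data), sum(data)

def score(lam):      # first derivative of the exponential log-likelihood
    return n / lam - s

def hessian(lam):    # second derivative of the log-likelihood
    return -n / lam ** 2

lam = 1.0            # starting value
for _ in range(50):
    step = score(lam) / hessian(lam)
    lam -= step
    if abs(step) < 1e-10:
        break

print(lam)           # converges to n / s, the closed-form MLE
```

For the ULD the same loop applies with its own score and Hessian, and the observed Fisher information at the converged value yields the approximate confidence intervals the article constructs.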
Findings
The Bayesian MCMC approach to estimating the model parameters and reliability characteristics of generalized progressive hybrid censored data from the ULD is recommended. The authors anticipate that health data analysts and reliability professionals will benefit from the findings and approaches presented in this study.
Originality/value
The ULD has a broad range of practical utility, which makes estimating its model parameter and reliability characteristics an important problem, and the significance of the GPHCS further encouraged the authors to consider the present estimation problem, which has not previously been discussed in the literature.
Details
Keywords
James L. Sullivan, David Novak, Eric Hernandez and Nick Van Den Berg
Abstract
Purpose
This paper introduces a novel quality measure, the percent-within-distribution, or PWD, for acceptance and payment in a quality control/quality assurance (QC/QA) performance specification (PS).
Design/methodology/approach
The new quality measure takes any sample size or distribution and uses a Bayesian updating process to re-estimate parameters of a design distribution as sample observations are fed through the algorithm. This methodology can be employed in a wide range of applications, but the authors demonstrate the use of the measure for a QC/QA PS with upper and lower bounds on 28-day compressive strength of in-place concrete for bridge decks.
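The abstract does not give the PWD's actual updating rules; the sketch below illustrates the general mechanism of re-estimating a design distribution as observations arrive, using a conjugate normal update with known observation variance (all numbers, including the compressive-strength values, are hypothetical):

```python
# Minimal conjugate-normal sketch of sequential Bayesian updating.
# The PWD's actual update rules are not given in the abstract; this
# only illustrates the feed-one-observation-at-a-time mechanism.

prior_mean, prior_var = 30.0, 16.0   # hypothetical design distribution (MPa)
obs_var = 9.0                        # assumed measurement variance

def update(mean, var, x):
    # Posterior for a normal mean with known observation variance.
    post_var = 1.0 / (1.0 / var + 1.0 / obs_var)
    post_mean = post_var * (mean / var + x / obs_var)
    return post_mean, post_var

# Feed sample observations (28-day strengths, say) through one at a time.
mean, var = prior_mean, prior_var
for x in [34.1, 32.7, 35.3, 33.9]:
    mean, var = update(mean, var, x)

print(mean, var)   # mean pulled toward the data, variance shrinking
```

Each lot's observations thus tighten the estimated distribution, and the fraction of that distribution falling between the specification bounds would play the role the PWL's fixed-sample estimate plays today.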
Findings
The authors demonstrate the use of this new quality measure to illustrate how it addresses the shortcomings of the percent-within-limits (PWL), which is the current industry standard quality measure. The authors then use the PWD to develop initial pay factors through simulation regimes. The PWD is shown to function better than the PWL with realistic sample lots simulated to represent a variety of industry responses to a new QC/QA PS.
Originality/value
The analytical contribution of this work is the introduction of the new quality measure. However, the practical and managerial contributions of this work are of equal significance.
Details
Keywords
Sou-Sen Leu, Yen-Lin Fu and Pei-Lin Wu
Abstract
Purpose
This paper aims to develop a dynamic civil facility degradation prediction model to forecast the reliability performance tendency and remaining useful life under imperfect maintenance based on the inspection records and the maintenance actions.
Design/methodology/approach
A real-time hidden Markov model (HMM) is proposed in this paper to predict the reliability performance tendency and remaining useful life under imperfect maintenance based on rare failure events. The model assumes a Poisson arrival pattern for the occurrence of facility failure events. The HMM is further adopted to establish the transition probabilities among stages. Finally, the simulation inference is conducted using a particle filter (PF) to estimate the most probable model parameters. Water seals at the spillway hydraulic gate of a reservoir in Taiwan are used to examine the appropriateness of the approach.
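The paper estimates its model with a particle filter; as a minimal illustration of the underlying HMM-with-Poisson-arrivals structure, the sketch below runs an exact forward filter over a two-stage degradation chain (all states, rates and counts are hypothetical):

```python
import math

# Minimal forward filter for a two-stage degradation HMM with Poisson
# failure counts. All numbers are hypothetical; the paper fits its model
# with a particle filter rather than this exact two-state recursion.

states = ("healthy", "degraded")
trans = {"healthy": {"healthy": 0.95, "degraded": 0.05},
         "degraded": {"healthy": 0.0, "degraded": 1.0}}
rate = {"healthy": 0.2, "degraded": 2.0}   # mean failures per inspection

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

belief = {"healthy": 0.99, "degraded": 0.01}   # initial stage belief
for k in [0, 0, 1, 3, 2]:                      # observed failure counts
    # Predict: push the belief through the transition matrix.
    pred = {s: sum(belief[r] * trans[r][s] for r in states) for s in states}
    # Update: weight by the Poisson likelihood of the observed count.
    post = {s: pred[s] * poisson_pmf(k, rate[s]) for s in states}
    z = sum(post.values())
    belief = {s: post[s] / z for s in states}

print(belief)  # probability mass shifts toward "degraded"
```

A particle filter replaces this exact recursion when the state space is continuous or the parameters themselves must be inferred, which is the situation the paper addresses.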
Findings
The results of defect probabilities tendency from the real-time HMM model are highly consistent with the real defect trend pattern of civil facilities. The proposed facility degradation prediction model can provide the maintenance division with early warning of potential failure to establish a proper proactive maintenance plan, even under the condition of rare defects.
Originality/value
This model is a new method of civil facility degradation prediction under imperfect maintenance, even with rare failure events. It overcomes several limitations of classical failure pattern prediction approaches and can reliably simulate the occurrence of rare defects under imperfect maintenance and the effect of inspection reliability caused by human error. Based on the degradation trend pattern prediction, effective maintenance management plans can be practically implemented to minimize the frequency of the occurrence and the consequence of civil facility failures.
Details
Keywords
Chuyu Tang, Hao Wang, Genliang Chen and Shaoqiu Xu
Abstract
Purpose
This paper aims to propose a robust method for non-rigid point set registration, using the Gaussian mixture model and accommodating non-rigid transformations. The posterior probabilities of the mixture model are determined through the proposed integrated feature divergence.
Design/methodology/approach
The method involves an alternating two-step framework, comprising correspondence estimation and subsequent transformation updating. For correspondence estimation, integrated feature divergences, including both global and local features, are coupled with deterministic annealing to address the non-convexity of registration. For transformation updating, the expectation-maximization iteration scheme is introduced to iteratively refine correspondence and transformation estimates until convergence.
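The paper's integrated feature divergence and non-rigid model are not reproduced in the abstract; the toy below sketches only the alternating scheme it describes, with an E-step of soft Gaussian correspondences, an M-step restricted to a translation, and a simple annealing schedule, recovering a known 2D shift:

```python
import math

# Translation-only sketch of the alternating registration scheme:
# E-step computes soft GMM correspondences, M-step re-estimates the
# transform, with a simple annealing schedule on the bandwidth.
# (The paper's divergence and non-rigid model are richer than this.)

X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]          # target set
true_t = (0.6, -0.4)
Y = [(x - true_t[0], y - true_t[1]) for (x, y) in X]          # source set

t = [0.0, 0.0]
sigma2 = 1.0
for _ in range(50):
    moved = [(yx + t[0], yy + t[1]) for (yx, yy) in Y]
    num, den = [0.0, 0.0], 0.0
    for (xx, xy) in X:
        # E-step: responsibilities of each moved source point for this target.
        w = [math.exp(-((xx - mx) ** 2 + (xy - my) ** 2) / (2 * sigma2))
             for (mx, my) in moved]
        z = sum(w)
        # M-step contribution: responsibility-weighted residuals.
        for (mx, my), wi in zip(moved, w):
            num[0] += wi / z * (xx - mx)
            num[1] += wi / z * (xy - my)
            den += wi / z
    t = [t[0] + num[0] / den, t[1] + num[1] / den]
    sigma2 = max(sigma2 * 0.9, 1e-3)   # deterministic-annealing stand-in

print(t)  # close to the true shift (0.6, -0.4)
```

The annealing keeps early correspondences soft, which is what prevents the premature convergence the Originality section refers to; the paper replaces the plain Gaussian weights here with its integrated feature divergence.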
Findings
The experiments confirm that the proposed registration approach exhibits remarkable robustness to deformation, noise, outliers and occlusion for both 2D and 3D point clouds. Furthermore, the proposed method outperforms existing analogous algorithms in terms of time complexity. An application to stabilizing and securing intermodal containers loaded on ships is presented. The results demonstrate that the proposed registration framework exhibits excellent adaptability to real-scan point clouds and achieves comparatively superior alignments in a shorter time.
Originality/value
The integrated feature divergence, involving both global and local information of points, is proven to be an effective indicator for measuring the reliability of point correspondences. This inclusion prevents premature convergence, resulting in more robust registration results for our proposed method. Simultaneously, the total operating time is reduced due to a lower number of iterations.
Details
Keywords
Wonjun Choi, Wooyoung (William) Jang, Hyunseok Song, Min Jung Kim, Wonju Lee and Kevin K. Byon
Abstract
Purpose
This study aimed to identify subgroups of esports players based on their gaming behavior patterns across game genres and compare self-efficacy, social efficacy, loneliness and three dimensions of quality of life between these subgroups.
Design/methodology/approach
A total of 324 participants were recruited from Prolific Academic to complete an online survey. We employed latent profile analysis (LPA) to identify subgroups of esports players based on their behavioral patterns across genres. Additionally, a one-way multivariate analysis of covariance (MANCOVA) was conducted to test the association between cluster membership and development and well-being outcomes, controlling for age and gender as covariates.
Findings
The LPA identified five clusters (two single-genre gamer groups, two multigenre gamer groups and one all-genre gamer group). Univariate analyses indicated a significant effect of cluster membership on social efficacy, psychological health and social health. Pairwise comparisons highlighted the salience of the physical enactment-plus-sport simulation genre group in these outcomes.
Originality/value
This study contributes to the understanding of the development and well-being benefits experienced by various esports consumers, as well as the role of specific gameplay in facilitating targeted outcomes among these consumer groups.
Details
Keywords
This paper aims to enhance the Global Projection Model (GPM) developed by the International Monetary Fund by constructing a GPM4 model that includes the United States of America…
Abstract
Purpose
This paper aims to enhance the Global Projection Model (GPM) developed by the International Monetary Fund by constructing a GPM4 model that includes the United States of America, the Eurozone, Japan and China.
Design/methodology/approach
This article introduces the United States of America, the Eurozone, Japan and China into a comprehensive global forecasting model, analyzing the impact of liquidity management in G3 economies on nine key macroeconomic variables in China.
Findings
The findings reveal that the liquidity management strategies employed by major economies do exert a certain influence on China's major macroeconomic variables. Different types of liquidity shocks elicit varying effects. Monetary shocks exhibit the strongest instantaneous impact, while credit conditions and policy rate shocks contribute more significantly to China's long-term macroeconomic fluctuations. However, no single shock stands out as the dominant factor.
Originality/value
This paper attempts to expand the GPM model developed by the International Monetary Fund and build a GPM4 model including China, the United States of America, the Eurozone and Japan. For the first time, the GPM model is used to analyze the spillover effects of liquidity management in major economies on China's macroeconomy and to reveal the impact of non-price factors, such as credit conditions, on China's macroeconomic variables.
Details
Keywords
Rajesh Shah, Blerim Gashi, Vikram Mittal, Andreas Rosenkranz and Shuoran Du
Abstract
Purpose
Tribological research is complex and multidisciplinary, with many parameters to consider. As traditional experimentation is time-consuming and expensive due to the complexity of tribological systems, researchers tend to use quantitative and qualitative analysis to monitor critical parameters and material characterization to explain observed dependencies. In this regard, numerical modeling and simulation offers a cost-effective alternative to physical experimentation but must be validated with limited testing. This paper aims to highlight advances in numerical modeling as they relate to the field of tribology.
Design/methodology/approach
This study performed an in-depth literature review of the field of modeling and simulation as it relates to tribology. The authors initially looked at the application of foundational studies (e.g. Stribeck) to understand the gaps in the current knowledge set. The authors then evaluated a number of modern developments related to contact mechanics, surface roughness, tribofilm formation and fluid-film layers. In particular, the study examined key fields driving tribology models, including nanoparticle research and prosthetics. The study then sought to understand the future trends in this research field.
Findings
In the field of tribology, numerical modeling has proven to be a powerful tool that is both time- and cost-effective when compared to standard bench testing. The characterization of tribological systems of interest fundamentally stems from the lubrication regimes designated in the Stribeck curve. The prediction of tribofilm formation, film thickness variation, fluid properties, asperity contact and surface deformation, as well as the continuously changing interactions between such parameters, is an essential challenge for proper modeling.
Originality/value
This paper highlights the major numerical modeling achievements in various disciplines and discusses their efficacy, assumptions and limitations in tribology research.
Peer review
The peer review history for this article is available at: https://publons.com/publon/10.1108/ILT-03-2023-0076/
Details