Vahid Badeli, Sascha Ranftl, Gian Marco Melito, Alice Reinbacher-Köstinger, Wolfgang Von Der Linden, Katrin Ellermann and Oszkar Biro
Abstract
Purpose
This paper aims to introduce a non-invasive and convenient method to detect a life-threatening disease called aortic dissection. Bayesian inference, based on an enhanced multi-sensor impedance cardiography (ICG) method, has been applied to classify signals from healthy and diseased patients.
Design/methodology/approach
A 3D numerical model consisting of simplified organ geometries is used to simulate the electrical impedance changes in the ICG-relevant domain of the human torso. Bayesian probability theory is used for detecting an aortic dissection, providing the probabilities of both cases, a dissected and a healthy aorta. Thus, the reliability and the uncertainty of the disease identification are quantified by this method and may indicate the need for further diagnostic clarification.
Findings
The Bayesian classification shows that the enhanced multi-sensor ICG is more reliable in detecting aortic dissection than conventional ICG. Bayesian probability theory allows a rigorous quantification of all uncertainties to draw reliable conclusions for the medical treatment of aortic dissection.
Originality/value
This paper presents a non-invasive and reliable method based on a numerical simulation that could be beneficial for the medical management of aortic dissection patients. With this method, clinicians would be able to monitor the patient’s status and make better decisions in the treatment procedure of each patient.
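The two-hypothesis Bayesian classification described in this abstract can be sketched in a few lines. This is a hypothetical illustration, not the paper's model: it assumes each ICG sample is Gaussian noise around a mean level that differs between the healthy and dissected cases, and all parameter values are made up.

```python
import math

def log_gaussian(x, mu, sigma):
    """Log-likelihood of one measurement under a Gaussian model."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def posterior_dissected(signal, mu_h, mu_d, sigma, prior_d=0.5):
    """Posterior probability of the 'dissected' hypothesis given ICG samples.

    Each hypothesis predicts the signal as Gaussian noise around its mean
    level (mu_h: healthy, mu_d: dissected); sigma is the noise level.
    """
    log_h = math.log(1 - prior_d) + sum(log_gaussian(x, mu_h, sigma) for x in signal)
    log_d = math.log(prior_d) + sum(log_gaussian(x, mu_d, sigma) for x in signal)
    # Normalise in log space for numerical stability.
    m = max(log_h, log_d)
    return math.exp(log_d - m) / (math.exp(log_h - m) + math.exp(log_d - m))

# A signal lying near the dissected mean gives a posterior close to 1.
p = posterior_dissected([1.9, 2.1, 2.0, 1.8], mu_h=1.0, mu_d=2.0, sigma=0.5)
```

Because the output is a probability rather than a hard label, a borderline posterior (say 0.6) can itself signal the need for further diagnostic clarification, which is the uncertainty quantification the abstract emphasises.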
Son Nguyen, Peggy Shu-Ling Chen and Yuquan Du
Abstract
Purpose
Container shipping is a crucial component of the global supply chain that is affected by a large range of operational risks with high uncertainty, threatening the stability of service, manufacture, distribution and profitability of involved parties. However, quantitative risk analysis (QRA) of container shipping operational risk (CSOR) is being obstructed by the lack of a well-established theoretical structure to guide deeper research efforts. This paper proposes a methodological framework to strengthen the quality and reliability of CSOR analysis (CSORA).
Design/methodology/approach
Focusing on addressing uncertainties, the framework establishes a solid, overarching and updated basis for quantitative CSORA. The framework consists of clearly defined elements and processes, including knowledge establishment, information gathering, aggregation of multiple data sources (social/deliberative and mathematical/statistical), calculation of risk and uncertainty levels, and presentation and interpretation of the quantified results. The framework is applied in a case study of three container shipping companies in Vietnam.
Findings
Various methodological contributions were made regarding CSOR characteristics, settings of analysis models, handling of uncertainties and result interpretation. The empirical study also generated valuable managerial implications regarding CSOR management policies.
Originality/value
This paper fills the gap of an updated framework for CSORA considering the recent advancements of container shipping operations and risk management. The framework can be used by both practitioners as a tool for CSORA and scholars as a test bench to facilitate the comparison and development of QRA models.
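The aggregation step in this framework (combining social/deliberative and mathematical/statistical evidence into a risk level with an uncertainty band) could be sketched as a simple Monte Carlo mixture. The 0-1 risk scale, the equal weighting and the example scores below are all hypothetical illustrations, not values from the paper:

```python
import random
import statistics

def aggregate_risk(expert_scores, incident_rates, w_expert=0.5, n_draws=5000, seed=0):
    """Monte Carlo aggregation of two evidence sources into one risk level.

    expert_scores: deliberative judgements on a 0-1 scale.
    incident_rates: observed event frequencies, rescaled to the same scale.
    Returns the mean risk and a 90% uncertainty interval.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        e = rng.choice(expert_scores)   # resample the deliberative evidence
        s = rng.choice(incident_rates)  # resample the statistical evidence
        draws.append(w_expert * e + (1 - w_expert) * s)
    draws.sort()
    lo, hi = draws[int(0.05 * n_draws)], draws[int(0.95 * n_draws)]
    return statistics.mean(draws), (lo, hi)

risk, (lo, hi) = aggregate_risk([0.6, 0.7, 0.8], [0.4, 0.5])
```

Reporting the interval alongside the point estimate is one way to "calculate risk and uncertainty level" jointly, so decision makers see how firm the aggregated score actually is.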
Markus Neumayer, Thomas Suppan and Thomas Bretterklieber
Abstract
Purpose
The application of statistical inversion theory provides a powerful approach for solving estimation problems including the ability for uncertainty quantification (UQ) by means of Markov chain Monte Carlo (MCMC) methods and Monte Carlo integration. This paper aims to analyze the application of a state reduction technique within different MCMC techniques to improve the computational efficiency and the tuning process of these algorithms.
Design/methodology/approach
A reduced state representation is constructed from a general prior distribution. For sampling, the Metropolis-Hastings (MH) algorithm and the Gibbs sampler are used. Efficient proposal-generation techniques and techniques for conditional sampling are proposed and evaluated on an exemplary inverse problem.
Findings
For the MH algorithm, high acceptance rates can be obtained with a simple proposal kernel. For the Gibbs sampler, an efficient technique for conditional sampling was found. The state reduction scheme stabilizes the ill-posed inverse problem, allowing a solution without a dedicated prior distribution. The state reduction is suitable for representing general material distributions.
Practical implications
The state reduction scheme and the MCMC techniques can be applied in different imaging problems. The stabilizing nature of the state reduction improves the solution of ill-posed problems. The tuning of the MCMC methods is simplified.
Originality/value
The paper presents a method to improve the solution process of inverse problems within the Bayesian framework. The stabilization of the inverse problem due to the state reduction improves the solution. The approach simplifies the tuning of MCMC methods.
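The state-reduction idea can be sketched as follows: instead of sampling a full pixel-wise material distribution, a random-walk Metropolis-Hastings kernel samples a handful of basis coefficients, and the full state is reconstructed from them. Everything here (the two-block basis, the noise level, the toy synthetic data) is a hypothetical illustration, not the paper's setup:

```python
import math
import random

# Reduced state: 2 coefficients of a coarse basis stand in for 20 "pixels".
basis = [[1.0 if i < 10 else 0.0 for i in range(20)],
         [0.0 if i < 10 else 1.0 for i in range(20)]]

def expand(c):
    """Map reduced coefficients back to the full pixel state."""
    return [c[0] * basis[0][i] + c[1] * basis[1][i] for i in range(20)]

def log_likelihood(c, data, sigma=0.1):
    """Gaussian misfit between the expanded state and the measurements."""
    state = expand(c)
    return -sum((s - d) ** 2 for s, d in zip(state, data)) / (2 * sigma**2)

def metropolis_hastings(data, n=2000, step=0.05, seed=1):
    rng = random.Random(seed)
    c, ll = [0.0, 0.0], log_likelihood([0.0, 0.0], data)
    samples, accepted = [], 0
    for _ in range(n):
        # Simple symmetric random-walk proposal on the reduced state.
        prop = [ci + rng.gauss(0, step) for ci in c]
        ll_prop = log_likelihood(prop, data)
        if math.log(rng.random()) < ll_prop - ll:  # MH acceptance rule
            c, ll = prop, ll_prop
            accepted += 1
        samples.append(list(c))
    return samples, accepted / n

true_c = [0.3, 0.7]
data = expand(true_c)  # noise-free synthetic measurements
samples, rate = metropolis_hastings(data)
mean0 = sum(s[0] for s in samples[500:]) / len(samples[500:])
```

The point of the reduction is visible in the tuning: a single scalar step size controls a 2-dimensional chain rather than a 20-dimensional one, which is why a plain proposal kernel can still achieve a healthy acceptance rate.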
Yingjie Yang, Sifeng Liu and Naiming Xie
Abstract
Purpose
The purpose of this paper is to propose a framework for data analytics where everything is grey in nature and the associated uncertainty is considered as an essential part in data collection, profiling, imputation, analysis and decision making.
Design/methodology/approach
A comparative study is conducted between the available uncertainty models and the feasibility of grey systems is highlighted. Furthermore, a general framework for the integration of grey systems and grey sets into data analytics is proposed.
Findings
Grey systems and grey sets are useful not only for small data but also for big data. They are complementary to other models and can play a significant role in data analytics.
Research limitations/implications
The proposed framework brings a radical change to data analytics and may fundamentally change the way we deal with uncertainties.
Practical implications
The proposed model has the potential to avoid the mistake from a misleading data imputation.
Social implications
The proposed model adopts the philosophy of grey systems in recognising the limitations of our knowledge, which has significant implications for the way we deal with our social life and relations.
Originality/value
This is the first time that data analytics as a whole has been considered from the point of view of grey systems.
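The basic object behind grey systems can be illustrated as a grey number: a value known only to lie in an interval, whose endpoints propagate through arithmetic so that the uncertainty of an imputed value is carried into downstream analysis rather than discarded. This minimal sketch (the class name and the whitenisation weight are illustrative choices, not from the paper) shows the idea:

```python
class Grey:
    """A grey number: a value known only to lie in [lo, hi]."""

    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)

    def __add__(self, other):
        # Interval addition: endpoints add directly.
        return Grey(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Interval multiplication: take the extreme endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Grey(min(products), max(products))

    def whitened(self, weight=0.5):
        """Collapse the interval to one representative (crisp) value."""
        return weight * self.lo + (1 - weight) * self.hi

# A missing data point imputed as an interval rather than a point estimate:
revenue = Grey(90, 110) + Grey(40, 60)  # two uncertain components
# revenue now spans [130, 170]; the width records what we do not know.
```

Contrast this with imputing a single number such as 150: the grey result makes explicit that any conclusion sensitive to the range 130-170 rests on incomplete knowledge, which is the misleading-imputation risk the abstract mentions.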
Yanan Wang, Jianqiang Li, Sun Hongbo, Yuan Li, Faheem Akhtar and Azhar Imran
Abstract
Purpose
Simulation is a well-known technique for using computers to imitate or simulate the operations of various kinds of real-world facilities or processes. The facility or process of interest is usually called a system, and to study it scientifically, we often have to make a set of assumptions about how it works. These assumptions, which usually take the form of mathematical or logical relationships, constitute a model that is used to gain some understanding of how the corresponding system behaves. The quality of this understanding depends essentially on the credibility of the given assumptions or models, and assessing that credibility is known as VV&A (verification, validation and accreditation). The main purpose of this paper is to present an in-depth theoretical review and analysis of the application of VV&A in large-scale simulations.
Design/methodology/approach
After summarizing the VV&A of related research studies, the standards, frameworks, techniques, methods and tools have been discussed according to the characteristics of large-scale simulations (such as crowd network simulations).
Findings
The contributions of this paper will be useful for both academics and practitioners for formulating VV&A in large-scale simulations (such as crowd network simulations).
Originality/value
This paper will help researchers by providing recommendations for formulating VV&A in large-scale simulations (such as crowd network simulations).
Abstract
Background: Commodity-driven deforestation is a major driver of forest loss worldwide, and globalisation has increased the disconnect between producer and consumer countries. Recent due-diligence legislation aiming to improve supply chain sustainability covers major forest-risk commodities. However, the evidence base for specific commodities included within policy needs assessing to ensure effective reduction of embedded deforestation.
Methods: We conducted a rapid evidence synthesis in October 2020 using three databases (Google Scholar, Web of Science and Scopus) to assess the literature and identify the commodities with the highest deforestation risk linked to UK imports. Inclusion criteria included publication within the past 10 years; studies that did not link commodity consumption to impacts or to the UK were excluded. A review protocol was developed to minimise bias, and critical appraisal of the underlying data and methods of the studies was conducted to assess the uncertainties around the results.
Results: From a total of 318 results, 17 studies were included in the final synthesis. These studies used various methodologies and input data, yet there is broad alignment on commodities, confirming that those included in due diligence legislation have a high deforestation risk. Soy, palm oil, and beef were identified as critical, with their production being concentrated in just a few global locations. However, there are also emerging commodities that have a high deforestation risk but are not included in legislation, such as sugar and coffee. These commodities are much less extensively studied in the literature and may warrant further research and consideration.
Conclusion: Policy recommendations in the selected studies suggest that further strengthening of the UK due diligence legislation is needed: in particular, providing incentives for policy uptake and wider stakeholder engagement, as well as continually reviewing the commodities included, to ensure a reduction in the UK's overseas deforestation footprint.
Adam Biggs and Joseph Hamilton
Abstract
Purpose
Evaluating warfighter lethality is a critical aspect of military performance. Raw metrics such as marksmanship speed and accuracy can provide some insight, yet interpreting subtle differences can be challenging. For example, is a speed difference of 300 milliseconds more important than a 10% accuracy difference on the same drill? Marksmanship evaluations must have objective methods to differentiate between critical factors while maintaining a holistic view of human performance.
Design/methodology/approach
Monte Carlo simulations are one method to circumvent speed/accuracy trade-offs within marksmanship evaluations. They can accommodate both speed and accuracy implications simultaneously without needing to hold one constant for the sake of the other. Moreover, Monte Carlo simulations can incorporate variability as a key element of performance. This approach thus allows analysts to determine consistency of performance expectations when projecting future outcomes.
Findings
The review divides outcomes into theoretical overview and practical implication sections. Each aspect of the Monte Carlo simulation can be addressed separately, reviewed and then incorporated as a potential component of small arms combat modeling. This allows new human performance practitioners to adopt the method more quickly for different applications.
Originality/value
Performance implications are often presented as inferential statistics. By using Monte Carlo simulations, practitioners can present outcomes in terms of lethality. This method should convey the impact of any marksmanship evaluation to senior leadership better than current inferential statistics, such as effect size measures.
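The speed/accuracy trade-off posed in the abstract (is 300 milliseconds of speed worth 10% of accuracy?) can be sketched as a Monte Carlo duel: each trial draws a shot time and a hit/miss for two shooters, and lethality is summarised as the fraction of trials in which one shooter lands the first hit. The distributions and every parameter value below are illustrative assumptions, not the authors' model:

```python
import random

def win_probability(speed_a, acc_a, speed_b, acc_b, sd=0.2, n=20000, seed=7):
    """Monte Carlo estimate of how often shooter A neutralises the target first.

    speed_*: mean shot time in seconds; acc_*: hit probability per shot;
    sd: shared spread of shot times, so variability is part of the outcome.
    A misses are treated as never hitting (infinite time to first hit).
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        t_a = rng.gauss(speed_a, sd) if rng.random() < acc_a else float("inf")
        t_b = rng.gauss(speed_b, sd) if rng.random() < acc_b else float("inf")
        if t_a < t_b:  # A wins only by landing a strictly earlier hit
            wins += 1
    return wins / n

# Shooter A is 300 ms faster but 10% less accurate than shooter B.
p = win_probability(speed_a=1.2, acc_a=0.80, speed_b=1.5, acc_b=0.90)
```

A single win probability like `p` answers the abstract's question directly in lethality terms, instead of leaving leadership to weigh a millisecond difference against a percentage-point difference by intuition.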