Na Zhang, Haiyan Wang and Zaiwu Gong
Abstract
Purpose
Grey target decision-making serves as a pivotal analytical tool for addressing dynamic multi-attribute group decision-making amidst uncertain information. However, the setting of the bull's eye is frequently subjective, and each stage is treated as independent of the others, even though interference effects between stages can easily influence one another. To address these challenges, this paper employs quantum probability theory to construct quantum-like Bayesian networks that capture interference effects in dynamic multi-attribute group decision-making.
Design/methodology/approach
Firstly, the bull's eye matrix of the scheme stage is derived based on the principle of group negotiation and maximum satisfaction deviation. Secondly, a nonlinear programming model for stage weights is constructed using an improved Orness measure constraint to determine the stage weights. Thirdly, the quantum-like Bayesian network is constructed to explore the interference effect between stages; in this process, the decision at each stage is regarded as a wave function occurring synchronously, with mutual interference impacting the aggregate result. Finally, the effectiveness and rationality of the model are verified through a public health emergency.
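The interference effect described above can be sketched numerically. In quantum probability, an outcome reachable via two paths carries a cross term that classical probability lacks; the function below is a minimal hypothetical illustration under assumed amplitudes and phase, not the paper's actual model:

```python
import numpy as np

# Hypothetical illustration of quantum interference between two decision
# stages. Classical probability adds path probabilities; quantum
# probability adds amplitudes, producing a cross (interference) term.

def quantum_path_probability(p1, p2, theta):
    """Outcome probability via two paths with classical probabilities
    p1, p2 and an assumed relative phase theta between their amplitudes."""
    a1, a2 = np.sqrt(p1), np.sqrt(p2)              # amplitudes
    interference = 2.0 * a1 * a2 * np.cos(theta)   # quantum cross term
    return p1 + p2 + interference

classical = quantum_path_probability(0.3, 0.2, np.pi / 2)  # cross term ~ 0
boosted = quantum_path_probability(0.3, 0.2, 0.0)          # constructive
```

When the relative phase is π/2 the cross term vanishes and the classical additive rule is recovered; other phases produce constructive or destructive interference between stages.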
Findings
The research shows that there are interference effects between each stage. Both the dynamic grey target group decision model and the dynamic multi-attribute group decision model based on quantum-like Bayesian network proposed in this paper are scientific and effective. They enhance the flexibility and stability of actual decision-making and provide significant practical value.
Originality/value
To address issues like stage interference effects, subjective bull's eye settings and the absence of participative behavior in decision-making groups, this paper develops a grey target decision model grounded in group negotiation and maximum satisfaction deviation. Furthermore, by integrating the quantum-like Bayesian network model, this paper offers a novel perspective for addressing information fusion and subjective cognitive biases during decision-making.
Tim Schürmann, Nina Gerber and Paul Gerber
Abstract
Purpose
Online privacy research has seen a focus on user behavior over the last decade, partly to understand and explain user decision-making and seeming inconsistencies regarding users' stated preferences. This article investigates the level of modeling that contemporary approaches rely on to explain said inconsistencies and whether drawn conclusions are justified by the applied modeling methodology. Additionally, it provides resources for researchers interested in using computational modeling.
Design/methodology/approach
The article uses data from a pre-existing literature review on the privacy paradox (N = 179 articles) to identify three characteristics of prior research: (1) the frequency of references to computational-level theories of human decision-making and perception in the literature, (2) the frequency of interpretations of human decision-making based on computational-level theories, and (3) the frequency of actual computational-level modeling implementations.
Findings
After excluding unrelated articles, 44.1 percent of investigated articles reference at least one theory that has been traditionally interpreted on a computational level. 33.1 percent of all relevant articles make statements regarding computational properties of human cognition in online privacy scenarios. Meanwhile, 5.1 percent of all relevant articles apply formalized computational-level modeling to substantiate their claims.
Originality/value
The findings highlight the importance of formal, computational-level modeling in online privacy research, which has so far drawn computational-level conclusions without utilizing appropriate modeling techniques. Furthermore, this article provides an overview of said modeling techniques and their benefits to researchers, as well as references for model theories and resources for practical implementation.
Paul Lewis Reynolds and Geoff Lancaster
Abstract
Purpose
The purpose of this paper is to suggest a framework for sales forecasting more suitable for smaller firms. The authors examine the sales forecasting practices of small firms and then proceed to discuss the application of Bayesian decision theory in the production of sales forecasts, a method arguably more suited to the smaller firm. The authors suggest that many small firm entrepreneurs are inherently “Bayesian” in their approach to predicting events, in that they often rely on subjective estimates, at least for initial starting values.
Design/methodology/approach
A triangulated approach is used: qualitative group discussions with thematic content analysis; a reasonably large‐scale postal questionnaire survey analysed using descriptive statistics and non‐parametric tests of association; and a case study based on the authors' own consultancy activities to illustrate the practical application of the suggested forecasting model.
Findings
Many small firms use no formal sales forecasting framework at all. The majority of small firm owners and/or managers rate sales forecasting skills very low in their list of priorities when given a choice of courses to attend at subsidised rates, and there is no significant difference in the importance small firm owners and/or managers attach to formal sales forecasting skills.
Research limitations/implications
Information has been gained from one geographic area in the north of England although the results may have a wider application to all small firms in the UK and elsewhere. Only the region's six most important industry sectors were included as stratification variables in the sample survey. Other regions will have a different mix of industries and will be stratified differently.
Originality/value
The article addresses the sales forecasting needs of small firms specifically within the marketing for small business context and offers a realistic option with a clear rationale.
Yang Shen, Sifeng Liu, Zhigeng Fang and Mingli Hu
Abstract
Purpose
The purpose of this paper is to reveal the pattern of passenger transfers when a large crowd is stranded at transportation hubs (such as bus stations, railway stations and airports) during climate disasters, and then to propose appropriate policy recommendations for the government to evacuate stranded passengers.
Design/methodology/approach
A model is established based on a Bayesian network and an influence diagram to capture the features of a passenger's decision‐making process, and the transition probabilities of passengers are revised on the basis of the theory of informational herd behavior to describe the influence of group behaviors on individual passengers. Subsequently, a multi‐agent model is developed on the Repast platform in Java, and simulations and analyses are conducted.
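The herd-behavior revision of transition probabilities can be sketched as a convex mix of an agent's private belief and the observed crowd share; the mixing weight below is an assumed parameter, not one taken from the paper:

```python
# Hypothetical illustration of the herd-behavior revision: an agent's
# transition probability mixes its private belief with the observed
# crowd share. `herd_weight` is an assumed parameter.

def revised_probability(private_p, crowd_fraction, herd_weight=0.6):
    """Convex combination of private belief and observed group behavior."""
    return (1 - herd_weight) * private_p + herd_weight * crowd_fraction

# Private probabilities of four agents choosing evacuation route A,
# with 75% of the crowd currently observed on that route:
private = [0.2, 0.4, 0.9, 0.7]
revised = [revised_probability(p, 0.75) for p in private]
# The lowest private belief (0.2) is pulled up toward the crowd's 0.75.
```

Iterating this revision over simulation ticks is what lets group behavior feed back into individual decisions in a multi-agent setting.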
Findings
The results of simulation show that it is possible to apply the theory of herd behaviors and the multi‐agent method in analyzing the effectiveness of government policies on evacuating stranded passengers in climate disasters.
Originality/value
This research has important practical significance for governments developing policies to evacuate stranded passengers during climate disasters, and is a useful exploration toward new methodologies for emergency management.
Abstract
Purpose
In forecasting unknown quantities, risk and finance decision makers often rely on one or more biased experts, statistical specialists representing parties with an interest in the decision maker's final forecast. This problem arises in a variety of contexts, and the decision maker may represent a corporate enterprise, rating agency, government regulator, etc. The purpose of the paper is to assist decision makers, experts, and others to have a better understanding of the dynamics of the problem, and to adopt strategies and practices that enhance efficiency.
Design/methodology/approach
The problem is formulated as a two‐person, non‐cooperative Bayesian game with the decision maker and one expert as players, and perfect Bayesian equilibrium solutions are identified. Then the analysis is extended to variations of the game in which the expert's loss function is not common knowledge, and in which there are multiple experts.
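A minimal sketch of the belief-updating step such a game relies on: the decision maker holds a Gaussian prior over the expert's bias and updates it after one report. The conjugate-Gaussian form and all numbers are illustrative assumptions, not the paper's exact specification:

```python
# Hypothetical illustration: the decision maker holds a Gaussian prior
# over the expert's bias b and updates it after one report, assuming
# report = truth + b + Gaussian noise. All values are illustrative.

def update_bias_belief(mu, var, report, truth_estimate, noise_var):
    """Conjugate-Gaussian posterior over the expert's bias."""
    observed_bias = report - truth_estimate
    post_var = 1.0 / (1.0 / var + 1.0 / noise_var)
    post_mu = post_var * (mu / var + observed_bias / noise_var)
    return post_mu, post_var

# Prior belief: bias ~ N(0, 1); the expert reports 12 when the decision
# maker's own estimate of the truth is 10, with unit report noise:
post_mu, post_var = update_bias_belief(0.0, 1.0, report=12.0,
                                       truth_estimate=10.0, noise_var=1.0)
```

A larger `noise_var` (more uncertainty about the reporting process) would shrink this update, which is consistent with the finding that experts benefit from greater uncertainty.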
Findings
In the struggle for information between the decision maker and the experts, the experts generally benefit from greater uncertainty about the parameters of the model. Thus, in attempting to elicit as much information as possible from the experts, the decision maker must strive to minimize all sources of uncertainty.
Research limitations/implications
As in most Bayesian games, the analysis requires that a variety of process assumptions and model parameters be common knowledge. These conditions may be difficult to satisfy in real‐world applications.
Practical implications
The principal finding of the study is that there is truly a struggle for information between the decision maker and the experts. This generally encourages the experts to inject as much uncertainty as possible into the process. To counter this effect, the decision maker might: provide incentives for the experts to increase their sampling information; try to mitigate specific uncertainties regarding the model parameters; and try to increase the number of experts.
Originality/value
This is the first paper to apply the framework of signaling games to the problem of eliciting information from biased experts. It is of value to decision makers, experts, and economic researchers.
Leonidas A. Zampetakis and Vassilis S. Moustakis
Abstract
Purpose
The purpose of this paper is to present an inductive methodology that supports the ranking of entities. The methodology is based on Bayesian latent variable measurement modeling and makes use of assessments across composite indicators to assess internal and external model validity (uncertainty is used in lieu of validity). The proposed methodology is generic and is demonstrated on a well‐known data set related to a country's relative position in “doing business.”
Design/methodology/approach
The methodology is demonstrated using data from the World Banks' “Doing Business 2008” project. A Bayesian latent variable measurement model is developed and both internal and external model uncertainties are considered.
Findings
The methodology enables the quantification of model structure uncertainty through comparisons among competing models, nested or non‐nested using both an information theoretic approach and a Bayesian approach. Furthermore, it estimates the degree of uncertainty in the rankings of alternatives.
Research limitations/implications
Analyses are restricted to first‐order Bayesian measurement models.
Originality/value
Overall, the presented methodology contributes to a better understanding of ranking efforts providing a useful tool for those who publish rankings to gain greater insights into the nature of the distinctions they disseminate.
Martin Christopher, Jane Kirkland, John Jeffries and Richard Wilson
Abstract
Describes the major influences directing the growth and development of marketing theory. Assesses the relative value of holistic and piecemeal approaches to this theory. Suggests that the most important advances in marketing management will stem from the development of models of the market, advocating a piecemeal approach.
María M. Abad‐Grau and Daniel Arias‐Aranda
Abstract
Purpose
Information analysis tools enhance the possibilities of firm competition in terms of knowledge management. However, the generalization of decision support systems (DSS) is still far away from everyday use by managers and academicians. This paper aims to present a framework of analysis based on Bayesian networks (BN) whose accuracy is measured in order to assess scientific evidence.
Design/methodology/approach
Different learning algorithms based on BN are applied to extract relevant information about the relationship between operations strategy and flexibility in a sample of engineering consulting firms. Feature selection algorithms are able to automatically improve the accuracy of these classifiers.
Findings
Results show that the behaviors of the firms can be reduced to different rules that help in the decision‐making process about investments in technology and production resources.
Originality/value
Contrasting with methods from the classic statistics, Bayesian classifiers are able to model a variety of relationships between the variables affecting the dependent variable. Contrasting with other methods from the artificial intelligence field, such as neural networks or support vector machines, Bayesian classifiers are white‐box models that can directly be interpreted. Together with feature selection techniques from the machine learning field, they are able to automatically learn a model that accurately fits the data.
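The white-box character of a Bayesian classifier can be seen in a small naive Bayes sketch where every learned quantity is an inspectable count; the features and labels below are invented for illustration, not drawn from the paper's sample of consulting firms:

```python
from collections import Counter, defaultdict

# Hypothetical white-box naive Bayes sketch: every learned quantity is a
# readable count, so the model can be inspected directly, unlike a
# neural network or support vector machine.

def train_nb(rows, labels):
    priors = Counter(labels)            # class counts
    cond = defaultdict(Counter)         # (feature_idx, class) -> value counts
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, y)][v] += 1
    return priors, cond

def predict_nb(priors, cond, row, alpha=1.0):
    total = sum(priors.values())
    scores = {}
    for y, n in priors.items():
        score = n / total               # class prior
        for i, v in enumerate(row):
            # Laplace smoothing; 2 = distinct values per feature here
            score *= (cond[(i, y)][v] + alpha) / (n + alpha * 2)
        scores[y] = score
    return max(scores, key=scores.get)

rows = [("high", "yes"), ("high", "no"), ("low", "no"), ("low", "no")]
labels = ["flexible", "flexible", "rigid", "rigid"]
priors, cond = train_nb(rows, labels)
label = predict_nb(priors, cond, ("high", "yes"))
```

Each factor in `score` is a directly interpretable conditional probability, which is the sense in which such classifiers are "white-box" models.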
Emmanuel Blanchard, Adrian Sandu and Corina Sandu
Abstract
Purpose
The purpose of this paper is to propose a new computational approach for parameter estimation in the Bayesian framework. A posteriori probability density functions are obtained using the polynomial chaos theory for propagating uncertainties through system dynamics. The new method has the advantage of being able to deal with large parametric uncertainties, non‐Gaussian probability densities and nonlinear dynamics.
Design/methodology/approach
The maximum likelihood estimates are obtained by minimizing a cost function derived from the Bayesian theorem. Direct stochastic collocation is used as a less computationally expensive alternative to the traditional Galerkin approach to propagate the uncertainties through the system in the polynomial chaos framework.
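The cost-function minimization step can be sketched without the polynomial chaos propagation: a data-misfit term plus a Gaussian prior term is minimized over a grid to obtain a maximum a posteriori estimate. The toy decay model, the grid search, and all numeric values are assumptions standing in for the paper's optimization:

```python
import numpy as np

# Hypothetical illustration of minimizing a Bayesian cost function
# (data misfit + Gaussian prior) to obtain a MAP estimate. The toy
# decay model and all numeric values are assumed.

def cost(k, times, observed, sigma_noise=0.1, k_prior=2.0, sigma_prior=1.0):
    predicted = np.exp(-k * times)                       # toy decay model
    misfit = np.sum((observed - predicted) ** 2) / (2 * sigma_noise ** 2)
    prior = (k - k_prior) ** 2 / (2 * sigma_prior ** 2)  # Gaussian prior
    return misfit + prior

times = np.linspace(0.0, 2.0, 20)
rng = np.random.default_rng(0)
observed = np.exp(-1.5 * times) + 0.01 * rng.standard_normal(times.size)

grid = np.linspace(0.5, 3.0, 2501)
k_map = grid[np.argmin([cost(k, times, observed) for k in grid])]
# k_map lands near the true value 1.5, pulled slightly toward the prior.
```

Noisier measurements or an uninformative excitation would flatten this cost surface, which is exactly the sensitivity the Findings section examines.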
Findings
The new approach is explained and applied to very simple mechanical systems to illustrate how the Bayesian cost function can be affected by the noise level in the measurements, by undersampling, by non‐identifiability of the system, by non‐observability, and by excitation signals that are not rich enough. When the system is non‐identifiable and a priori knowledge of the parameter uncertainties is available, regularization techniques can still yield the most likely values among the possible combinations of uncertain parameters that result in the same time responses as the ones observed.
Originality/value
The polynomial chaos method has been shown to be considerably more efficient than Monte Carlo in the simulation of systems with a small number of uncertain parameters. This is believed to be the first time the polynomial chaos theory has been applied to Bayesian estimation.
Abstract
Purpose
The purpose of this paper is to provide a methodology for benchmarking supplier risks through the creation of Bayesian networks. The networks are used to determine a supplier's external, operational, and network risk probability to assess its potential impact on the buyer organization.
Design/methodology/approach
The research methodology includes the use of a risk assessment model, surveys, data collection from internal and external sources, and the creation of Bayesian networks used to create risk profiles for the study participants.
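The kind of Bayesian-network computation behind a risk profile can be sketched by enumeration over a tiny two-parent network; the structure and all conditional-probability values below are illustrative, not figures from the study's surveys:

```python
# Hypothetical two-parent supplier-risk network: Disruption depends on
# External and Operational risk. All probabilities are illustrative.

p_external = 0.2                        # P(external risk event)
p_operational = 0.3                     # P(operational risk event)
cpt = {                                 # P(disruption | external, operational)
    (True, True): 0.9, (True, False): 0.5,
    (False, True): 0.4, (False, False): 0.05,
}

def p_disruption():
    """Marginal disruption probability by enumerating parent states."""
    total = 0.0
    for e in (True, False):
        for o in (True, False):
            pe = p_external if e else 1 - p_external
            po = p_operational if o else 1 - p_operational
            total += pe * po * cpt[(e, o)]
    return total
```

Comparing such marginal probabilities across suppliers is one way risk profiles can be benchmarked against each other.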
Findings
It is found that Bayesian networks can be used as an effective benchmarking tool to assist managers in making decisions regarding current and prospective suppliers based upon their potential impact on the buyer organization, as illustrated through their associated risk profiles.
Research limitations/implications
A potential limitation to the use of the methodology presented in the study is the ability to acquire the necessary data from current and potential suppliers needed to construct the Bayesian networks.
Practical implications
The methodology presented in this paper can be used by buyer organizations to benchmark supplier risks in supply chain networks, which may lead to adjustments to existing risk management strategies, policies, and tactics.
Originality/value
This paper provides practitioners with an additional tool for benchmarking supplier risks. Additionally, it provides the foundation for future research studies in the use of Bayesian networks for the examination of supplier risks.