Search results

1 – 10 of over 14000
Article
Publication date: 5 June 2009

Boris Mitavskiy, Jonathan Rowe and Chris Cannings

A variety of phenomena, such as the world wide web and social or business network interactions, are modelled by various kinds of networks (such as scale‐free or preferential…

Abstract

Purpose

A variety of phenomena, such as the world wide web and social or business network interactions, are modelled by various kinds of networks (such as scale‐free or preferential attachment networks). However, due to model‐specific requirements, one may want to rewire the network to optimize communication among the various nodes while not overloading the number of channels (i.e. preserving the number of edges). The purpose of this paper is to present a formal framework for this problem and to examine a family of local search strategies to cope with it.

Design/methodology/approach

This is mostly theoretical work. The authors use a rigorous mathematical framework to set up the model and then prove some interesting theorems about it, which pertain to various local search algorithms that work by rewiring the network.

Findings

This paper proves that, in cases where every pair of nodes is sampled with non‐zero probability, the algorithm is ergodic in the sense that it samples, with non‐zero probability, every possible network on the specified set of nodes having the specified number of edges. Incidentally, the ergodicity result led to the construction of a class of algorithms for sampling graphs with a specified number of edges over a specified set of nodes uniformly at random, and it opened other challenging and important questions for future consideration.
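As an illustration of how such a uniform sampler can work (a minimal sketch, not the authors' algorithm), the following edge-rewiring chain removes a uniformly chosen existing edge and adds a uniformly chosen missing one at each step; because the move is symmetric, its stationary distribution is uniform over all graphs on the fixed node set with the fixed number of edges.

```python
import random

def rewire_chain(n_nodes, edges, steps=1000, seed=0):
    """Hypothetical edge-rewiring chain: at each step, delete a uniformly
    chosen existing edge and add a uniformly chosen missing edge, keeping
    the node set and the number of edges fixed.  The move is symmetric, so
    the chain samples such graphs uniformly in the limit."""
    rng = random.Random(seed)
    edges = set(edges)
    all_pairs = {(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)}
    for _ in range(steps):
        absent = sorted(all_pairs - edges)
        if not edges or not absent:
            break                      # nothing can be rewired
        edges.remove(rng.choice(sorted(edges)))
        edges.add(rng.choice(absent))
    return edges

# Start from a path on 6 nodes with 5 edges and let the chain mix.
print(sorted(rewire_chain(6, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)])))
```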

Originality/value

The measure‐theoretic framework presented in the current paper is original and rather general. It allows one to obtain new points of view on the problem.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 2 no. 2
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 27 May 2014

Annibal Parracho Sant’Anna, Lidia Angulo Meza and Rodrigo Otavio Araujo Ribeiro

The purpose of this paper is to discuss the application of a method for combining multiple criteria based on the transformation of numerical evaluations into probabilities of…

Abstract

Purpose

The purpose of this paper is to discuss the application of a method for combining multiple criteria based on the transformation of numerical evaluations into probabilities of preference. It is applied to compare failure risks and to measure efficiency in the retail trade sector.

Design/methodology/approach

The main conceptual aspect of the method employed is that it takes uncertainty into account. Its other important feature is that it allows evaluations to be combined in terms of joint probabilities, which avoids the need to assign weights to the criteria. In the context of failure modes and effects analysis (FMEA), it provides a probabilistic derivation for priority scores. An application of FMEA to the services sector is discussed. Another area of application investigated is the assessment of efficiency.

Findings

Details of the application of the probabilistic composition to the evaluation of failure modes and to the comparison of the operational efficiencies of retail stores are presented.

Research limitations/implications

The study is limited to the retail market. Other factors might be considered in the reliability analysis and other inputs and outputs might be added to the productivity evaluation. The extension of the study to other cases and sectors is straightforward.

Practical implications

Features of the evaluation of modes of failure and of productivity in the retail sector are revealed.

Originality/value

The main contribution of this paper is showing how to use a probabilistic framework to measure efficiency in services management.

Details

International Journal of Quality & Reliability Management, vol. 31 no. 6
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 April 2001

PHILIPP J. SCHÖNBUCHER

This article discusses factor models for portfolio credit risk. In these models, correlations between individual defaults are driven by a few systematic factors. By conditioning on…

Abstract

This article discusses factor models for portfolio credit risk. In these models, correlations between individual defaults are driven by a few systematic factors. By conditioning on these factors, individual defaults become independent. This allows a greater degree of analytical tractability in the model while retaining a realistic dependency structure.
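A minimal sketch of the conditional-independence mechanism, assuming a standard one-factor Gaussian set-up rather than the article's specific model: given the systematic factor, each obligor defaults independently with a factor-conditional probability, and portfolio losses can be simulated scenario by scenario.

```python
import numpy as np
from scipy.stats import norm

def simulate_portfolio_defaults(pd, rho, n_obligors, n_scenarios, seed=0):
    """One-factor Gaussian model: asset value A_i = sqrt(rho)*Z + sqrt(1-rho)*eps_i,
    default if A_i < norm.ppf(pd).  Conditional on Z, defaults are independent."""
    rng = np.random.default_rng(seed)
    threshold = norm.ppf(pd)
    z = rng.standard_normal(n_scenarios)                       # systematic factor
    # conditional default probability given Z
    p_given_z = norm.cdf((threshold - np.sqrt(rho) * z[:, None]) / np.sqrt(1 - rho))
    defaults = rng.random((n_scenarios, n_obligors)) < p_given_z
    return defaults.sum(axis=1)                                # defaults per scenario

losses = simulate_portfolio_defaults(pd=0.02, rho=0.2, n_obligors=100, n_scenarios=10_000)
print("mean defaults:", losses.mean(), " 99th percentile:", np.percentile(losses, 99))
```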

Details

The Journal of Risk Finance, vol. 3 no. 1
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 December 2020

Okechukwu Nwadigo, Nicola Naismith, Ali Ghaffarianhoseini, Amirhosein Ghaffarian Hoseini and John Tookey

A construction project is complex and requires dynamic modelling of a range of factors that deter time performance because of uncertainty and varying operating conditions. In…

Abstract

Purpose

A construction project is complex and requires dynamic modelling of a range of factors that deter time performance because of uncertainty and varying operating conditions. In construction project systems, the system components are the interconnected, time-dependent stages. Within the project stages are the activities, which are subsystems of the system components; this makes the complex system challenging to analyse. The relationship of construction project time management (CTM) with the construction project time-influencing factors (CTFs), and the adaptability of the time-varying system, is a key part of project effectiveness. This study explores the relationship between CTM and CTF, including the potential to model dynamic changes at every project stage.

Design/methodology/approach

This study proposed a dynamic Bayesian network (DBN) model to examine the relationship between CTM and CTF. The model investigates the time performance of a construction project to enhance decision-making. First, the paper establishes a model of probabilistic reasoning and a directed acyclic graph (DAG). Second, the study tests the dynamic impact (IM) of CTM-CTF on the project stages over a specific time, including the adaptability of time performance during disruptive CTF events. To demonstrate the effectiveness of the model, the authors selected a one-organisation, single-location road-improvement project as the case study. Next, confirmation of the model's internal validity relied on conditional probabilities and project knowledge experts selected from the case company.
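The filtering step behind such a DBN can be pictured with a toy two-state example. The sketch below is a generic illustration, not the paper's model: a hidden time-performance state (on-track/delayed) evolves between stages, is perturbed by an observed CTF disruption, and is updated by each stage's progress report; all CPT values are invented for the example.

```python
import numpy as np

# Hypothetical CPTs (illustrative values only).  Hidden state: 0 = on-track, 1 = delayed.
transition = {                       # P(state_t | state_{t-1}, ctf_event_t)
    False: np.array([[0.90, 0.10],   # no CTF disruption at this stage
                     [0.40, 0.60]]),
    True:  np.array([[0.60, 0.40],   # CTF disruption at this stage
                     [0.15, 0.85]]),
}
emission = np.array([[0.8, 0.2],     # P(progress report | state)
                     [0.3, 0.7]])    # rows: state, cols: report 0 = "on schedule", 1 = "behind"

def forward_filter(prior, ctf_events, reports):
    """Standard DBN forward pass: predict with the stage transition CPT,
    then condition on the stage's observed progress report."""
    belief = np.asarray(prior, dtype=float)
    history = []
    for ctf, report in zip(ctf_events, reports):
        belief = belief @ transition[ctf]          # predict next stage
        belief = belief * emission[:, report]      # weight by evidence
        belief = belief / belief.sum()             # normalise
        history.append(belief.copy())
    return history

# Four project stages with a disruption at stage 2.
for i, b in enumerate(forward_filter([0.9, 0.1],
                                     ctf_events=[False, True, False, False],
                                     reports=[0, 1, 1, 0]), start=1):
    print(f"stage {i}: P(delayed) = {b[1]:.2f}")
```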

Findings

The study produced structural dependencies of CTM and CTF with probability observations at each stage. A predictive time performance analysis of the model under different scenarios evaluates the adaptability of CTM during uncertain CTF events. The case demonstration of the model's application shows that CTFs affect CTM strategy, creating observations that help restore time performance after disruptions.

Research limitations/implications

Although the case company's expert panel confirms the internal validity of the results for managing time, the model used conditional probability tables (CPTs) and project state values from a single project contract. A project-wide application will therefore require multi-case data and a data-mining process for generating the CPTs.

Practical implications

The study developed a method for evaluating both quantitative and qualitative relationships between CTM and CTF, as well as knowledge that enhances CTM practice and research. In construction, the project team can use the model's observations to implement predictive or reactive time performance restorations after a disruption, which enhances decision-making.

Originality/value

The model used quantitative and qualitative data on a complex system to generate results, bounded by a range of probability distributions for CTM-CTF interconnections during time performance disruptions and restorations. The research explores an approach that can complement the project team's mental CTM-CTF modelling. The CTM-CTF relationship model developed in this research is fundamental knowledge for future research, besides offering valuable insight into CTF influence on CTM.

Details

Engineering, Construction and Architectural Management, vol. 28 no. 10
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 21 August 2007

Arindam Bandyopadhyay, Tasneem Chherawala and Asish Saha

This paper is a first attempt to empirically calibrate the default and asset correlations for large companies in India and to elaborate on their implications for credit risk capital…

Abstract

Purpose

This paper is a first attempt to empirically calibrate the default and asset correlations for large companies in India and to elaborate on their implications for credit risk capital estimation for a bank.

Design/methodology/approach

The authors estimate default probabilities and default correlations of long‐term bonds of 542 Indian corporates using rating transitions and pair‐wise migrations over ten‐year cohorts of firms. Further, the implicit asset correlations are derived from the estimated default correlations and default thresholds using the asymptotic single risk factor approach.
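The step from default correlation to implicit asset correlation in a single-factor Gaussian setting can be sketched as follows. This is a generic reconstruction of the asymptotic single-risk-factor logic with illustrative inputs, not the authors' calibration.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import brentq

def default_correlation(p1, p2, asset_rho):
    """Default correlation implied by a Gaussian one-factor model
    with asset correlation `asset_rho` and marginal PDs p1, p2."""
    thresholds = [norm.ppf(p1), norm.ppf(p2)]
    joint_pd = multivariate_normal.cdf(thresholds, mean=[0.0, 0.0],
                                       cov=[[1.0, asset_rho], [asset_rho, 1.0]])
    return (joint_pd - p1 * p2) / np.sqrt(p1 * (1 - p1) * p2 * (1 - p2))

def implied_asset_correlation(p1, p2, observed_default_corr):
    """Invert the mapping numerically to recover the implicit asset correlation."""
    f = lambda rho: default_correlation(p1, p2, rho) - observed_default_corr
    return brentq(f, 1e-6, 0.999)

# Illustrative numbers only: two obligors with 2% PD and an observed
# pairwise default correlation of 1.5%.
print(round(implied_asset_correlation(0.02, 0.02, 0.015), 3))
```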

Findings

The authors find evidence that default correlations are time variant and vary across rating grades and industries. The highest correlations are observed between companies within the same rating grades (systematic risk impact) and within the same industry (industry‐specific impact). More interestingly, the smooth monotonic relationship between the probability of default (PD) and asset correlation prescribed by the Basel II IRB document (2006) is not found. Moreover, it is found that the asset correlation range for Indian corporates does not match what is prescribed for corporate exposures by the BCBS.

Originality/value

The authors address the dilemma implied by the negative relationship between PD and asset correlation suggested by the BCBS IRB formula and by other research on developed economies, using estimates of asset correlation for an emerging market like India, and demonstrate its implications for the estimation of credit risk capital.

Details

The Journal of Risk Finance, vol. 8 no. 4
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 9 March 2012

Annibal Parracho Sant'Anna

The purpose of this paper is to propose a method to derive, from the numerical evaluations on the severity, frequency and detectability criteria of Failure Modes and Effects…

Abstract

Purpose

The purpose of this paper is to propose a method to derive, from the numerical evaluations on the severity, frequency and detectability criteria of Failure Modes and Effects Analysis (FMEA), a probabilistic priority measure for potential failures; and to evaluate the use of this method when combined with subjective evaluations to decide on improvement actions.

Design/methodology/approach

The method proposed is based on treating the initial numerical measurements as estimates of location parameters of probability distributions. This allows one to objectively take into account the uncertainty inherent in such measurements and to compute the probability of each potential failure being the most important according to each criterion. These probabilities are then combined into a global quality measure, which can be interpreted as a joint probability of choice of the potential failure.
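A minimal sketch of this kind of computation, assuming normal distributions located at the reported scores (the paper's exact distributional choices may differ): the probability of each failure mode being the most important on a criterion is estimated by Monte Carlo, and the per-criterion probabilities are multiplied into a joint priority score.

```python
import numpy as np

def prob_of_being_max(scores, scale=1.0, n_samples=100_000, seed=0):
    """Treat each score as the location of a normal distribution and
    estimate the probability that each alternative attains the maximum."""
    rng = np.random.default_rng(seed)
    samples = rng.normal(loc=scores, scale=scale, size=(n_samples, len(scores)))
    winners = samples.argmax(axis=1)
    return np.bincount(winners, minlength=len(scores)) / n_samples

# Illustrative FMEA-style ratings (rows: failure modes; cols: severity, frequency, detectability).
ratings = np.array([[8, 3, 6],
                    [5, 7, 4],
                    [6, 6, 7]], dtype=float)

per_criterion = np.column_stack([prob_of_being_max(ratings[:, c]) for c in range(ratings.shape[1])])
joint_priority = per_criterion.prod(axis=1)       # joint probability of being chosen on all criteria
print(joint_priority / joint_priority.sum())      # normalised priority scores
```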

Findings

The results obtained in the cases studied show the suitability of the changes proposed. The threshold levels proposed for the discretization of the probabilistic scores are also shown to allow for an efficient combination with experts' evaluations.

Research limitations/implications

The approach developed here allows for the introduction of statistical parameters in the first stage of FMEA modeling. Employing a more complete model leads to greater reliability of the methodology.

Practical implications

The more precise modeling requires a certain degree of practical knowledge of the factors that effectively introduce variability into the measurements. This need may be overcome by the choice of distributions such as those employed here.

Originality/value

The evaluation of the potential failures by the probability of being the most important, according to each criterion, is new in FMEA.

Details

International Journal of Quality & Reliability Management, vol. 29 no. 3
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 28 May 2021

Pedro E. Cadenas, Henryk Gzyl and Hyun Woong Park

This paper aims to illustrate, within the context of a well-known linear diversification model, that risk management as exerted by banks and regulators ultimately depends on how…

Abstract

Purpose

This paper aims to illustrate, within the context of a well-known linear diversification model, that risk management as exerted by banks and regulators ultimately depends on how risk is assessed and conceptualized. The two risk metrics used are the probability of bank failure and value at risk (VaR). The paper also extends the results of the model by incorporating an explicit analysis of correlation of the banks' portfolios.
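To make the two metrics concrete (a generic numerical illustration under assumed joint-normal returns, not Wagner's model), the sketch below computes the probability that a linearly diversified portfolio falls below a failure threshold and its value at risk, for several diversification weights.

```python
import numpy as np

def failure_prob_and_var(weight, threshold=-0.15, alpha=0.99,
                         mu=(0.02, 0.02), sigma=(0.10, 0.10), corr=0.3,
                         n_scenarios=200_000, seed=0):
    """Linear diversification: hold `weight` in asset 1 and (1 - weight) in asset 2.
    Returns (probability of falling below the failure threshold, VaR at level alpha)."""
    rng = np.random.default_rng(seed)
    cov = np.array([[sigma[0]**2, corr * sigma[0] * sigma[1]],
                    [corr * sigma[0] * sigma[1], sigma[1]**2]])
    returns = rng.multivariate_normal(mu, cov, size=n_scenarios)
    portfolio = weight * returns[:, 0] + (1 - weight) * returns[:, 1]
    p_fail = np.mean(portfolio < threshold)
    var = -np.quantile(portfolio, 1 - alpha)       # loss exceeded with probability 1 - alpha
    return p_fail, var

for w in (1.0, 0.75, 0.5):
    p, v = failure_prob_and_var(w)
    print(f"weight {w:.2f}: P(failure) = {p:.4f}, 99% VaR = {v:.3f}")
```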

Design/methodology/approach

The paper is based on a well-known model of linear diversification of two banking institutions developed by Wagner (2010) in the Journal of Financial Intermediation. The authors added considerations that were unexplored by Wagner and derived the corresponding logical and practical implications.

Findings

The authors found that, depending on which of the two risk metrics is used, the way diversification is perceived and risk is managed may differ. This situation may very well end up generating different incentives for banks and regulators. The authors suggest a general rationale for thinking about the apparent dilemma and the challenges faced by regulators. The authors also offer an explicit analysis of correlation for the banks' portfolios.

Research limitations/implications

The results are dependent on the particular aspects of the model, so the research results may lack generality in other contexts.

Practical implications

Despite the limitations already mentioned, the paper illustrates some relevant points within the open debate about risk measurement and diversification.

Originality/value

This paper contributes to the open discussion of diversification, risk perception and systemic crisis.

Details

The Journal of Risk Finance, vol. 22 no. 1
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 March 1984

Dan M. Frangopol

The paper attempts to establish the connection between structural reliability and structural optimization for the particular case of plastic structures. Along this line, the paper…

Abstract

The paper attempts to establish the connection between structural reliability and structural optimization for the particular case of plastic structures. Along this line, the paper outlines a reliability‐based optimization approach to design plastic structures with uncertain interdependent strengths, acted on by random interdependent loads. The importance of such interdependencies, and of some of the other statistical parameters used as input data in probabilistic computations, is demonstrated by several examples of sensitivity studies on both the probability of collapse failure and the reliability‐based optimum solution.
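A toy version of the reliability-based sizing idea, not Frangopol's formulation: choose the smallest design factor (a proxy for weight or cost) whose Monte Carlo estimate of the collapse probability, with correlated random strength and load, stays below a target.

```python
import numpy as np

def collapse_probability(design_factor, n_samples=200_000, strength_load_corr=0.2, seed=0):
    """Toy limit state: collapse when load exceeds design_factor * strength.
    Strength and load are lognormal and (illustratively) correlated."""
    rng = np.random.default_rng(seed)
    cov = np.array([[0.15**2, strength_load_corr * 0.15 * 0.25],
                    [strength_load_corr * 0.15 * 0.25, 0.25**2]])
    z = rng.multivariate_normal([np.log(1.0), np.log(0.6)], cov, size=n_samples)
    strength, load = np.exp(z[:, 0]), np.exp(z[:, 1])
    return np.mean(load > design_factor * strength)

# Reliability-based sizing: smallest design factor whose collapse probability
# stays below the target reliability level.
target = 1e-3
for factor in np.arange(1.0, 2.01, 0.1):
    pf = collapse_probability(factor)
    if pf <= target:
        print(f"selected design factor {factor:.1f} with P(collapse) = {pf:.1e}")
        break
```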

Details

Engineering Computations, vol. 1 no. 3
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 7 January 2022

Ramon Swell Gomes Rodrigues Casado, Maisa Mendonca Silva and Lucio Camara Silva

The paper aims to propose a multi-criteria model for risk prioritisation associated with supply chain management involving multiple decision-makers.

Abstract

Purpose

The paper aims to propose a multi-criteria model for risk prioritisation associated with supply chain management involving multiple decision-makers.

Design/methodology/approach

The model integrates the composition of probabilistic preferences (CPP) with the failure mode and effects analysis (FMEA) criteria. First, the authors carried out a probabilistic transformation of the numerical evaluations given by multiple decision-makers on the FMEA criteria for the internal risks that affect the supply chain of the clothing pole in the Agreste region of Pernambuco. Then, the authors proposed the use of Kendall's concordance coefficient W to aggregate these evaluations.
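Kendall's W itself is a short computation. The sketch below, with invented ratings rather than the paper's data, applies the standard concordance formula to several decision-makers' evaluations of the same risks.

```python
import numpy as np
from scipy.stats import rankdata

def kendalls_w(ratings):
    """Kendall's coefficient of concordance W for a (raters x items) matrix
    of numerical evaluations; each rater's scores are converted to ranks."""
    ranks = np.apply_along_axis(rankdata, 1, np.asarray(ratings, dtype=float))
    m, n = ranks.shape                      # m raters, n items (risks)
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Three decision-makers rating five supply chain risks (illustrative values).
ratings = [[7, 3, 5, 8, 2],
           [6, 4, 5, 9, 1],
           [8, 2, 6, 7, 3]]
print(f"Kendall's W = {kendalls_w(ratings):.2f}")   # close to 1 => strong agreement
```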

Findings

Contrary to expectations, the two main risks the model suggests for investigation were related to supply chain suppliers rather than to raw material costs. In addition, a simulation with the traditional FMEA was carried out; comparing it with the model's result highlights seven consistent differences between the two rankings.

Research limitations/implications

The focus was restricted to internal chain risks only.

Practical implications

The proposed model can contribute to the improvement of the decisions within organisations that make up the chains, thus guaranteeing a better quality in risk management.

Originality/value

Establishing a more effective representation of the uncertain information involved in traditional FMEA treatment with multiple decision-makers makes it possible to identify potential risks in advance, providing better supply chain control.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 3
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 February 1988

PAUL THOMPSON

The psychological literature on subjective probability estimation is reviewed to determine the feasibility of designing probabilistic information retrieval systems using such…

Abstract

The psychological literature on subjective probability estimation is reviewed to determine the feasibility of designing probabilistic information retrieval systems using such estimates. Their use has been considered by some writers, but psychological issues have not been addressed. Research pertinent to probabilistic information retrieval is examined and implications for probabilistic information retrieval are drawn. It is shown that accurate human probability estimation is possible, both in the laboratory and in real world tasks, e.g., in meteorological forecasting; but that it is also a task subject to systematic bias, or inaccuracy. Proposed techniques for debiasing are considered. The highly task‐dependent nature of such estimates is also discussed; two implications are that results from laboratory studies may have limited relevance to real world tasks and that empirical studies specific to the context of information retrieval need to be made. Human probability estimation appears to be a difficult task, but one which can be done well with proper training and use of debiasing techniques. It is premature to say how useful such estimates would be in probabilistic information retrieval, but their use should not yet be ruled out.
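The accuracy question raised in the review is usually quantified with calibration measures; the sketch below, on synthetic data rather than anything from the article, computes a Brier score and a simple reliability table for a set of subjective probability estimates.

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared difference between stated probabilities and 0/1 outcomes."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, float)
    return float(np.mean((probs - outcomes) ** 2))

def reliability_table(probs, outcomes, edges=(0.0, 0.2, 0.4, 0.6, 0.8, 1.0)):
    """For each probability band, compare the mean stated probability with the
    observed relative frequency -- a basic calibration (reliability) check."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, float)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        upper = (probs <= hi) if hi == edges[-1] else (probs < hi)
        in_band = (probs >= lo) & upper
        if in_band.any():
            rows.append((lo, hi, probs[in_band].mean(), outcomes[in_band].mean(), int(in_band.sum())))
    return rows

# Synthetic example of an over-confident assessor: the true chance of the event
# varies less with the stated probability than the assessor believes.
rng = np.random.default_rng(1)
stated = rng.uniform(0.0, 1.0, 500)
actual = rng.random(500) < (0.25 + 0.5 * stated)
print("Brier score:", round(brier_score(stated, actual), 3))
for lo, hi, mean_p, freq, n in reliability_table(stated, actual):
    print(f"[{lo:.1f}, {hi:.1f}): stated {mean_p:.2f} vs observed {freq:.2f} (n={n})")
```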

Details

Journal of Documentation, vol. 44 no. 2
Type: Research Article
ISSN: 0022-0418
