Search results
1–10 of over 102,000
Abstract
Purpose
Managers of megaprojects face social risk management challenges throughout the design, construction, and operation stages, owing to conflicts of interest among stakeholders, public skepticism, and opposition. However, most existing studies have not analyzed social risks dynamically across these stages. This study develops a dynamic analysis approach to explore the dynamics of critical social risk factors and the related stakeholders of megaprojects, and builds managerial maps for various stakeholders.
Design/methodology/approach
Based on social network analysis (SNA), a dynamic network analysis approach for understanding the dynamics of social risks and related stakeholders was developed through literature review and case analysis. The approach comprises the following steps: (1) generating social risk–stakeholder networks for different stages; (2) identifying the critical stakeholders and social risk factors; (3) analyzing the dynamics of social risk factors; and (4) developing social risk management maps for various stakeholders. To verify the feasibility and effectiveness of the approach, 40 megaprojects from China were analyzed.
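Steps (1) and (2) of such an approach can be sketched in a few lines: build a bipartite stakeholder–risk network for one project stage and rank its nodes by degree centrality, a standard SNA measure. The stakeholders, risk factors, and links below are illustrative assumptions, not data from the study.

```python
# Toy stakeholder-risk network for one project stage (edges are assumed).
edges = [
    ("Local government", "Inadequate information promotion"),
    ("Local government", "Imperfect communication and coordination mechanism"),
    ("Local government", "Public skepticism"),
    ("Project implementation group", "Imperfect communication and coordination mechanism"),
    ("External stakeholders", "Inadequate information promotion"),
]

def degree_centrality(edges):
    """Degree of each node divided by (n - 1), as in standard SNA."""
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    n = len(degree)
    return {node: d / (n - 1) for node, d in degree.items()}

centrality = degree_centrality(edges)
critical = max(centrality, key=centrality.get)
print(critical)  # the most connected node in this toy stage
```

Repeating this per stage and comparing the rankings gives the kind of dynamic picture the abstract describes.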
Findings
According to the results, the local government is a critical stakeholder during all stages, and inadequate information promotion (IIP) and an imperfect communication and coordination mechanism (ICCM) are key social risk sources throughout the megaproject life cycle. Furthermore, management maps for government organizations, project implementation groups, and external stakeholders were constructed.
Originality/value
This research makes three contributions. First, a dynamic analysis approach for stakeholder-associated social risks in megaprojects is developed, which enriches the social risk management theory of megaprojects and suggests directions for future research. Second, the social risk–stakeholder networks and critical social risks in different stages are confirmed, providing a more valid and accurate picture of social risk management in megaprojects. Third, the social risk managerial maps built for different stakeholders will help governments, project implementation groups, and external stakeholders optimize their management strategies.
Zhangming Ma, Heap-Yih Chong and Pin-Chao Liao
Abstract
Purpose
Human error is among the leading causes of construction accidents. Previous studies of the factors affecting human error remain rather vague about complex and changeable working environments. The purpose of this paper is to develop a dynamic causal model of human errors to improve safety management in the construction industry. A theoretical model is developed and tested through a case study.
Design/methodology/approach
First, the authors defined the causal relationship between construction and human errors based on the cognitive reliability and error analysis method (CREAM). A dynamic Bayesian network (DBN) was then developed by connecting the time-variant causal relationships of human errors. Next, prediction, sensitivity analysis, and diagnostic analysis of the DBN were applied to demonstrate the functions of the model. Finally, a case study of elevator installation is presented to verify the feasibility and applicability of the proposed approach in a construction work environment.
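The prediction step of a DBN of this kind can be illustrated with a toy two-time-slice model: the probability of a human error at step t depends on the error state at t-1 and on the current workload. The conditional probability table (CPT) below is invented for illustration, not taken from the paper's CREAM-based model.

```python
# P(error_t = 1 | error_{t-1}, high_workload_t) -- an assumed CPT.
cpt = {
    (0, 0): 0.05,  # no prior error, normal workload
    (0, 1): 0.20,  # no prior error, high workload
    (1, 0): 0.15,  # prior error, normal workload
    (1, 1): 0.40,  # prior error, high workload
}

def predict(p_error, workload_seq):
    """Propagate P(error_t = 1) forward given a workload sequence (0/1)."""
    history = []
    for w in workload_seq:
        # marginalise over the previous error state
        p_error = (1 - p_error) * cpt[(0, w)] + p_error * cpt[(1, w)]
        history.append(round(p_error, 4))
    return history

print(predict(0.05, [0, 1, 1, 0]))
```

The time-variant aspect is visible in the output: error probability rises while workload is high and falls back once conditions normalize, which is what distinguishes this from a static model.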
Findings
The results of the proposed model were closer to practice than those of previous static models, and its systematic and dynamic features adapt more efficiently to increasingly complex and changeable environments.
Originality/value
This research integrates CREAM as the theoretical foundation for a novel time-variant causal model of human errors in construction. Practically, the model highlights the hazards that potentially trigger human errors, facilitating the implementation of proactive safety strategies and measures in advance.
Michel van der Wel, Sait R. Ozturk and Dick van Dijk
Abstract
The implied volatility surface is the collection of volatilities implied by option contracts for different strike prices and times-to-maturity. We study factor models to capture the dynamics of this three-dimensional implied volatility surface. Three model types are considered to examine desirable features for representing the surface and its dynamics: a general dynamic factor model, restricted factor models designed to capture the key features of the surface along the moneyness and maturity dimensions, and in-between spline-based methods. Key findings are that: (i) the restricted and spline-based models are both rejected against the general dynamic factor model, (ii) the factors driving the surface are highly persistent, and (iii) for the restricted models, option Δ is preferred over the more often used strike relative to spot price as the measure of moneyness.
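The basic idea of a factor model for the surface can be sketched by extracting principal components from a simulated panel of surface grid points; the data-generating process below (two persistent factors plus small noise) is an illustrative assumption, and the paper's general dynamic factor model is considerably richer.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, K = 200, 15, 2    # time points, surface grid points, latent factors

# Persistent factors (random walks) driving all grid points.
factors = rng.standard_normal((T, K)).cumsum(axis=0) * 0.01
loadings = rng.standard_normal((K, N))
surface = factors @ loadings + 0.001 * rng.standard_normal((T, N))

X = surface - surface.mean(axis=0)          # demean each grid point
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / (s**2).sum()
print(explained[:2].sum())                   # variance share of 2 PCs
```

With only two true factors, the first two principal components capture nearly all the panel's variation, which is why low-dimensional factor models can summarize a whole surface.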
Bingjun Li, Chunhua He, Liping Hu and Yanhua Li
Abstract
Purpose
The purpose of this paper is to derive a dynamic grey incidence order of the factors influencing grain production in Henan province using grey systems theory.
Design/methodology/approach
The authors first select the factors influencing grain production and divide 30 years of grain production in Henan province (1979–2009) into three periods. They then calculate the grey incidence degree between grain yield and each influencing factor using the grey incidence analysis method, obtaining the grey incidence order of influencing factors in each period. Based on the three grey incidence orders, they identify how the influence of each factor changes over time and which factors are key in each period. Finally, several policy suggestions are given to keep grain production in Henan province stable and sustainable.
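The core computation behind such an incidence order is Deng's degree of grey incidence. The yield and factor series below are invented for illustration; the paper uses actual Henan grain data.

```python
def grey_incidence_order(reference, comparisons, rho=0.5):
    """Grey incidence degree of each comparison series with the reference."""
    # initialling operator: scale every series by its first value
    ref = [x / reference[0] for x in reference]
    norm = [[x / s[0] for x in s] for s in comparisons]
    deltas = [[abs(r - c) for r, c in zip(ref, s)] for s in norm]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)   # global extrema over all series
    degrees = []
    for row in deltas:
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        degrees.append(sum(coeffs) / len(coeffs))
    return degrees

grain_yield = [100, 108, 118, 130, 135]
fertiliser = [50, 55, 62, 70, 72]    # closely tracks yield
pesticide = [20, 30, 25, 45, 40]     # noisier relationship
degrees = grey_incidence_order(grain_yield, [fertiliser, pesticide])
print(degrees)  # fertiliser ranks above pesticide here
```

Sorting the factors by degree within each period, and comparing the resulting orders across periods, yields the dynamic incidence order the abstract refers to.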
Findings
The results are convincing: analyzing the influencing factors of grain production dynamically with grey systems theory is effective and powerful, and it is urgent to strengthen investment in agricultural science and technology and to pay close attention to the influence of pesticide and fertilizer dosage on grain production.
Practical implications
The grey incidence analysis and findings presented in the paper can be used by agricultural firms to optimize grain production plans and by governments to formulate sound agricultural production policies.
Originality/value
The paper succeeds in deriving a dynamic grey incidence order of the factors influencing grain production in Henan province using grey systems theory.
Abstract
Purpose
Progressive collapse refers to a phenomenon in which local damage to a primary structural component leads to total or partial failure of the structural system, without any proportionality between the initial and final damage. Robustness is a measure of the strength of a structure to resist progressive collapse. Static pushdown and nonlinear dynamic analysis are the two main procedures for calculating the capacity of structures to resist progressive collapse. According to previous works, static analysis can lead to inaccurate results, while capacity analysis by dynamic analysis requires several reruns and often encounters numerical instability. The purpose of this paper is to present the formulation of a solution procedure to determine the robustness of steel moment resisting frames using plastic limit analysis (PLA).
Design/methodology/approach
This formulation utilizes simplex optimization to solve the problem. Static pushdown and incremental dynamic methods are used for verification.
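The flavor of plastic limit analysis can be shown from the kinematic (upper-bound) side: enumerate candidate collapse mechanisms of a simple portal frame and take the smallest load factor. The plastic moment, geometry, and loads below are invented for illustration; the paper instead solves the static formulation as a linear program with the simplex method.

```python
# Classical portal-frame collapse mechanisms (work balance per unit
# rotation): beam, sway, and combined. All values are assumptions.
Mp = 120.0          # plastic moment of every member (kN*m)
H, L = 3.0, 4.0     # column height and beam span (m)
V, Hload = 40.0, 30.0  # vertical (midspan) and horizontal loads (kN)

# mechanism: (internal plastic work, external work) per unit rotation
mechanisms = {
    "beam":     (4 * Mp, V * L / 2),             # hinges in the beam
    "sway":     (4 * Mp, Hload * H),             # hinges at column ends
    "combined": (6 * Mp, V * L / 2 + Hload * H), # beam + sway hinges
}

# upper-bound theorem: collapse load factor = min over mechanisms
load_factors = {name: wi / we for name, (wi, we) in mechanisms.items()}
collapse = min(load_factors, key=load_factors.get)
print(collapse, round(load_factors[collapse], 3))
```

The simplex-based static formulation reaches the same limit load from below; for frames with many members, the LP replaces this hand enumeration of mechanisms.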
Findings
The results obtained from PLA agree well with those of incremental dynamic analysis. Because incremental dynamic analysis is a very demanding method, PLA can be utilized as an alternative.
Originality/value
The formulation of the progressive collapse resistance of steel moment frames by means of PLA has not been proposed in previous research.
Fabio Canova and Matteo Ciccarelli
Abstract
This article provides an overview of the panel vector autoregressive (VAR) models used in macroeconomics and finance to study the dynamic relationships between heterogeneous assets, households, firms, sectors, and countries. We discuss what their distinctive features are, what they are used for, and how they can be derived from economic theory. We also describe how they are estimated and how shock identification is performed. We compare panel VAR models to other approaches used in the literature to estimate dynamic models involving heterogeneous units. Finally, we show how structural time variation can be dealt with.
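The simplest member of this family is a pooled panel VAR(1), y_it = A y_i,t-1 + e_it, estimated by OLS after stacking all units; the sketch below assumes homogeneous dynamics across units, whereas the surveyed models allow much richer heterogeneity and cross-unit interdependence. All numbers are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.7]])   # assumed common dynamics
units, T = 20, 80

# Simulate each unit and stack (lagged y, current y) pairs.
X_rows, Y_rows = [], []
for _ in range(units):
    y = np.zeros(2)
    for _ in range(T):
        y_next = A_true @ y + 0.1 * rng.standard_normal(2)
        X_rows.append(y)
        Y_rows.append(y_next)
        y = y_next
X, Y = np.array(X_rows), np.array(Y_rows)

# Pooled OLS: Y ~ X B, with B = A'.
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))
```

With 20 units of 80 observations each, the pooled estimate recovers the common coefficient matrix closely, which is the efficiency gain pooling buys when homogeneity actually holds.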
Abstract
In the context of Dynamic Factor Models, we compare point and interval estimates of the underlying unobserved factors extracted using small- and big-data procedures. Our paper differs from previous works in the related literature in several ways. First, we focus on factor extraction rather than on prediction of a given variable in the system. Second, the comparisons are carried out by applying the procedures considered to the same data. Third, we are interested not only in point estimates but also in confidence intervals for the factors. Based on a simulated system and the macroeconomic data set popularized by Stock and Watson (2012), we show that, for a given procedure, factor estimates based on different cross-sectional dimensions are highly correlated. On the other hand, for a given cross-sectional dimension, the maximum likelihood Kalman filter and smoother factor estimates are highly correlated with those obtained using hybrid procedures. The PC estimates are somewhat less correlated. Finally, the PC intervals based on asymptotic approximations are unrealistically tiny.
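Principal-components factor extraction, the simplest of the procedures compared above, can be demonstrated on a simulated one-factor DFM; the simulation design below is an illustrative assumption, not the Stock–Watson data set.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 300, 50                      # time periods, cross-section size

# Persistent AR(1) factor and idiosyncratic noise.
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.standard_normal()
lam = rng.standard_normal(N)        # factor loadings
X = np.outer(f, lam) + rng.standard_normal((T, N))

# PC estimate of the factor: first left singular vector of demeaned data.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
f_hat = U[:, 0] * s[0] / np.sqrt(N)
corr = abs(np.corrcoef(f, f_hat)[0, 1])
print(round(corr, 2))
```

The PC estimate is only identified up to sign and scale, hence the absolute correlation; as N grows it converges to the true factor, consistent with the cross-sectional-dimension comparisons in the paper.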
Laura E. Jackson, M. Ayhan Kose, Christopher Otrok and Michael T. Owyang
Abstract
We compare methods to measure comovement in business cycle data using multi-level dynamic factor models. To do so, we employ a Monte Carlo procedure to evaluate model performance for different specifications of factor models across three different estimation procedures. We consider three general factor model specifications used in applied work. The first is a single-factor model, the second a two-level factor model, and the third a three-level factor model. Our estimation procedures are the Bayesian approach of Otrok and Whiteman (1998), the Bayesian state-space approach of Kim and Nelson (1998) and a frequentist principal components approach. The latter serves as a benchmark to measure any potential gains from the more computationally intensive Bayesian procedures. We then apply the three methods to a new dataset on house prices in advanced and emerging markets from Cesa-Bianchi, Cespedes, and Rebucci (2015) and interpret the empirical results in light of the Monte Carlo results.
Giuliana Passamani, Roberto Tamborini and Matteo Tomaselli
Abstract
Purpose
The purpose of this paper is to explain why some countries in the eurozone between 2010 and 2012 experienced a dramatic vicious circle between hard austerity plans and rising default risk premia. Were such plans too small, and hence non-credible, or too large, and hence non-sustainable? These questions have prompted theoretical and empirical investigations in the line of so-called "self-fulfilling beliefs", whereby beliefs that fiscal adjustments are unsustainable, and hence that debt will be defaulted on, feed higher risk premia, which in turn make fiscal adjustments less sustainable.
Design/methodology/approach
Detecting the sustainability factor in the evolution of spreads is difficult because it is largely non-observable and may be proxied by different variables. In this paper, the authors present the results of a dynamic principal components factor analysis (PCFA) applied to a panel data set of the 11 major EZ countries from 2000 to 2013, consisting of each country's spread of long-term interest rates over Germany as the dependent variable and an array of leading fiscal and macroeconomic indicators of solvency, fiscal effort, and sustainability.
Findings
The authors identify the role of these indicators, which combine as significant latent variables in boosting spreads. Moreover, the large joint deterioration of these variables is identifiably located between 2009 and 2012, particularly for the group of countries under the most severe default risk (with Italy and France as borderline cases). The authors also find evidence that the announcement of the European Central Bank's Outright Monetary Transactions program improved the sustainability assessment of sovereign debts.
Originality/value
Dynamic PCFA is a rather unusual technique relative to standard econometric model testing. It is particularly well suited to reducing the number of variables in a data set by extracting meaningful linear combinations of the observed variables that may concur to explain a given phenomenon (the dependent variable). These combinations, called "common factors", can be interpreted as latent, non-observable variables.
Gabriele Fiorentini, Alessandro Galesi and Enrique Sentana
Abstract
We generalise the spectral EM algorithm for dynamic factor models in Fiorentini, Galesi, and Sentana (2014) to bifactor models with pervasive global factors complemented by regional ones. We exploit the sparsity of the loading matrices so that researchers can estimate those models by maximum likelihood with many series from multiple regions. We also derive convenient expressions for the spectral scores and information matrix, which allows us to switch to the scoring algorithm near the optimum. We explore the ability of a model with a global factor and three regional ones to capture inflation dynamics across 25 European countries over 1999–2014.