Search results

1 – 10 of over 5000
Article
Publication date: 3 August 2012

Anand Prakash, Sanjay Kumar Jha and Rajendra Prasad Mohanty

Abstract

Purpose

The purpose of this paper is to propose linking Monte Carlo simulation with scenario planning to assist strategy makers in formulating strategy in the face of uncertainty relating to service quality gaps in the life insurance business, where discontinuities always remain in need‐based selling.

Design/methodology/approach

The paper briefly reviews some applications of scenario planning, which emphasizes the development of a strategic plan that is robust across different scenarios. The paper provides considerable evidence to suggest a new strategic approach that uses Monte Carlo simulation for scenario planning.

Findings

The paper identifies which service quality gap attributes, treated as risks, have the greatest and least impact under the best-case, worst-case, and most likely case scenarios.
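
The abstract does not include the simulation itself; as an illustrative sketch, a service quality gap attribute can be modeled with a triangular distribution whose parameters play the roles of the best, most likely and worst cases (all figures below are hypothetical, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical service-quality gap attribute (SERVQUAL-style score gap):
# best case 0.2, most likely 1.0, worst case 2.5 -- illustrative numbers only.
best, likely, worst = 0.2, 1.0, 2.5
gaps = rng.triangular(best, likely, worst, size=100_000)

# Scenario summary: how often the gap exceeds a tolerance threshold
threshold = 1.5
print(f"mean gap            : {gaps.mean():.2f}")
print(f"P(gap > {threshold})      : {(gaps > threshold).mean():.2%}")
print(f"5th/95th percentiles: {np.percentile(gaps, [5, 95]).round(2)}")
```

Repeating this per attribute and ranking the exceedance probabilities mirrors the best/worst/most-likely reading of the findings.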

Research limitations/implications

This study suffers from methodological limitations associated with convenience sampling and anonymous survey‐based research.

Practical implications

The approach using Monte Carlo simulation increases the credibility of a scenario to an acceptable level, so that it can be used by managers and other decision makers.

Social implications

The paper thoroughly documents how scenario planning, by studying the impact of risk and uncertainty on the service quality gap, supports rational decision making in the management of services, enabling managers to justify and communicate their arguments better.

Originality/value

The paper offers empirical understanding of the application of Monte Carlo simulation to scenario planning and identifies key drivers which impact most and least on service quality gap.

Details

Journal of Strategy and Management, vol. 5 no. 3
Type: Research Article
ISSN: 1755-425X

Article
Publication date: 3 July 2017

Anand Prakash and Rajendra P. Mohanty

Abstract

Purpose

Automakers manufacture both efficient and inefficient green cars. The purpose of this paper is to categorize green cars as efficient or inefficient and then to improve the efficiencies of the identified inefficient green cars for distribution fitting.

Design/methodology/approach

The authors used the 2014 edition of secondary data published by the Automotive Research Centre of the Automobile Club of Southern California. The paper provides the methodology of applying data envelopment analysis (DEA) to 50 decision-making units (DMUs) of green cars with six input indices (emission, braking, ride quality, acceleration, turning circle, and luggage capacity) and two output indices (miles per gallon and torque), integrated with Monte Carlo simulation for drawing significant statistical inferences graphically.
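
The abstract does not reproduce the DEA formulation; the input-oriented CCR envelopment model that such studies rely on can be sketched with `scipy.optimize.linprog` (tiny synthetic data below, not the paper's 50-car dataset):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency score for each DMU (constant returns
    to scale).  X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n = X.shape[0]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        # inputs of the composite DMU must not exceed theta * inputs of DMU o
        A_in = np.hstack([-X[o][:, None], X.T])
        # outputs of the composite DMU must cover the outputs of DMU o
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.fun)
    return np.array(scores)

# toy data: 3 cars, 1 input (e.g. emissions), 1 output (e.g. mpg)
X = np.array([[2.0], [4.0], [4.0]])
Y = np.array([[2.0], [2.0], [4.0]])
scores = dea_ccr_input(X, Y)
print(scores)  # DMU 1 is dominated by DMU 2, so its score is below 1
```

A score of 1 marks an efficient DMU; scores below 1 are the "inefficient green cars" whose inputs the improvement matrix would scale down.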

Findings

The findings of this study showed 27 efficient and 23 inefficient DMUs, along with an improvement matrix. Additionally, the study highlighted the best distribution fitting of the improved efficient green cars for the respective indices.

Research limitations/implications

This study suffers from limitations associated with the 2014 edition of the secondary data used in this research.

Practical implications

This study may be useful to motorists through its listing of efficient green cars, while automakers can benefit from the distribution fitting of improved efficient green cars using Monte Carlo simulation for calibration.

Originality/value

The paper uses DEA to empirically examine the classification of green cars and applies Monte Carlo simulation for distribution fitting to the improved efficient green cars to decide an appropriate range of their attributes for calibration.

Details

Benchmarking: An International Journal, vol. 24 no. 5
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 23 February 2024

Anand Prakash and Sudhir Ambekar

Abstract

Purpose

This study aims to describe the fundamentals of teaching risk management in a classroom setting, with an emphasis on the learning interface between higher education and the workplace environment for business management students.

Design/methodology/approach

The study reviews literature that uses spreadsheets to visualize and model risk and uncertainty. Using six distinct case-based activities (CBAs), the study illustrates the practical applications of software such as Palisade @RISK in risk management education, helping to close the gap between theory and practice. The software assists in estimating the likelihood of a risk event and the impact or repercussions it will have if it occurs. This technique of risk analysis makes it possible to identify the risks that need the most active control.
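
@RISK itself is commercial spreadsheet software; the likelihood-and-impact logic that such a CBA exercises can be sketched in plain Python (the risk register below is invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50_000  # simulation trials

# Hypothetical risk register: (probability of occurrence, mean impact, impact sd)
risks = [(0.30, 20_000, 5_000),    # supplier delay
         (0.10, 80_000, 20_000),   # regulatory fine
         (0.05, 150_000, 40_000)]  # plant outage

total = np.zeros(N)
for p, mu, sd in risks:
    occurs = rng.random(N) < p                       # does the event happen?
    impact = np.maximum(rng.normal(mu, sd, N), 0.0)  # cost if it happens
    total += occurs * impact

print(f"expected annual loss : {total.mean():,.0f}")
print(f"95th percentile loss : {np.percentile(total, 95):,.0f}")
```

Sorting the risks by their contribution to the tail of `total` is what identifies "the risks that need the most active control."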

Findings

@RISK can be used to create models whose results demonstrate every potential scenario outcome. When faced with a choice or analysis involving uncertainty, @RISK can be used to sharpen the view of what the future might hold.

Originality/value

The insights from this study can be used to develop critical thinking, independent thinking, problem-solving and other important skills in learners. Further, educators can apply Bloom’s taxonomy and the problem-solving taxonomy to help students make informed decisions in risky situations.

Details

Higher Education, Skills and Work-Based Learning, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2042-3896

Book part
Publication date: 2 November 2009

Per Hjertstrand

Abstract

Weak separability is an important concept in many fields of economic theory. This chapter uses Monte Carlo experiments to investigate the performance of newly developed nonparametric revealed preference tests for weak separability. A main finding is that the bias of the sequentially implemented test for weak separability proposed by Fleissig and Whitney (2003) is low. The theoretically unbiased Swofford and Whitney test (1994) is found to perform better than all sequentially implemented test procedures but is found to suffer from an empirical bias, most likely because of the complexity in executing the test procedure. As a further source of information, we also perform sensitivity analyses on the nonparametric revealed preference tests. It is found that the Fleissig and Whitney test seems to be sensitive to measurement errors in the data.
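
The chapter's tests build on revealed preference axioms; the basic GARP consistency check that underlies them can be sketched as follows (two-observation toy data, not the chapter's Monte Carlo experiments):

```python
import numpy as np

def satisfies_garp(P, X):
    """Check the Generalized Axiom of Revealed Preference.
    P, X: (T, n_goods) arrays of observed prices and chosen bundles."""
    cost = P @ X.T            # cost[t, s] = p_t . x_s
    own = np.diag(cost)       # own[t]     = p_t . x_t (actual expenditure)
    # direct weak revealed preference: x_s was affordable when x_t was chosen
    R = own[:, None] >= cost
    # transitive closure via Warshall's algorithm
    for k in range(P.shape[0]):
        R |= R[:, k][:, None] & R[k]
    # violation: x_t revealed preferred to x_s, yet x_s is strictly directly
    # revealed preferred to x_t
    strict = own[:, None] > cost
    return not np.any(R & strict.T)

# consistent data: each bundle is too expensive at the other's prices
P = np.array([[1.0, 2.0], [2.0, 1.0]])
X = np.array([[4.0, 1.0], [1.0, 4.0]])
print(satisfies_garp(P, X))
```

Sequentially implemented weak-separability tests of the Fleissig and Whitney kind run checks like this on subgroups of goods and on constructed aggregate data.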

Details

Measurement Error: Consequences, Applications and Solutions
Type: Book
ISBN: 978-1-84855-902-8

Article
Publication date: 15 May 2017

Felix Canitz, Panagiotis Ballis-Papanastasiou, Christian Fieberg, Kerstin Lopatta, Armin Varmaz and Thomas Walker

Abstract

Purpose

The purpose of this paper is to review and evaluate the methods commonly used in accounting literature to correct for cointegrated data and data that are neither stationary nor cointegrated.

Design/methodology/approach

The authors conducted Monte Carlo simulations according to Baltagi et al. (2011), Petersen (2009) and Gow et al. (2010) to analyze how regression results are affected by the possible nonstationarity of the variables of interest.
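
Of the corrections the paper evaluates, the Fama–MacBeth procedure is the easiest to sketch: run one cross-sectional regression per period, then base the t-statistic on the time series of the estimated coefficients (synthetic panel below, not the paper's simulation design):

```python
import numpy as np

def fama_macbeth(y, x):
    """Fama-MacBeth slope estimate and t-statistic.
    y, x: (T, N) panels -- T periods, N firms."""
    T = y.shape[0]
    slopes = np.empty(T)
    for t in range(T):                 # one cross-sectional OLS per period
        Xmat = np.column_stack([np.ones_like(x[t]), x[t]])
        slopes[t] = np.linalg.lstsq(Xmat, y[t], rcond=None)[0][1]
    mean = slopes.mean()
    se = slopes.std(ddof=1) / np.sqrt(T)   # SE from the slope time series
    return mean, mean / se

rng = np.random.default_rng(0)
T, N, beta = 60, 100, 0.5
x = rng.normal(size=(T, N))
y = beta * x + rng.normal(size=(T, N))
b, t_stat = fama_macbeth(y, x)
print(f"slope {b:.3f}, t-stat {t_stat:.1f}")
```

Because the standard error comes from the cross section of period-by-period coefficients, it is robust to cross-sectional correlation within each period, which is why the paper's findings favor it alongside clustered standard errors.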

Findings

The results of this study suggest that biases in regression estimates can be reduced and valid inferences can be obtained by using robust standard errors clustered by firm, clustered by firm and time or Fama–MacBeth t-statistics based on the mean and standard errors of the cross section of coefficients from time-series regressions.

Originality/value

The findings of this study are suited to guide future researchers regarding which estimation methods are the most reliable given the possible nonstationarity of the variables of interest.

Details

The Journal of Risk Finance, vol. 18 no. 3
Type: Research Article
ISSN: 1526-5943

Book part
Publication date: 19 December 2012

Marco Gallegati and James B. Ramsey

Abstract

In this chapter we perform a Monte Carlo simulation study of the errors-in-variables model examined in Ramsey, Gallegati, Gallegati, and Semmler (2010) by using a wavelet multiresolution approximation approach. Unlike previous studies applying wavelets to the errors-in-variables problem, we use a sequence of multiresolution approximations of the variable measured with error, ranging from finer to coarser scales. Our results indicate that multiscale approximations based on the coarser scales provide an unbiased, asymptotically efficient estimator that also possesses good finite sample properties.
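
The multiresolution idea can be illustrated with the simplest (Haar) case: the scale-j approximation replaces the noisy regressor with block averages of length 2^j, damping the measurement error while keeping the coarse signal. This is only a sketch of the mechanism, not the authors' wavelet choice:

```python
import numpy as np

def haar_approx(x, level):
    """Haar multiresolution approximation at the given level: average over
    non-overlapping blocks of length 2**level (length must divide evenly)."""
    block = 2 ** level
    x = np.asarray(x, dtype=float)
    means = x.reshape(-1, block).mean(axis=1)
    return np.repeat(means, block)

# regressor observed with measurement error: smooth signal + noise
rng = np.random.default_rng(1)
n = 1024
signal = np.sin(np.linspace(0, 2 * np.pi, n))
noisy = signal + rng.normal(scale=0.5, size=n)

coarse = haar_approx(noisy, level=4)        # blocks of 16
err_raw = np.mean((noisy - signal) ** 2)
err_coarse = np.mean((coarse - signal) ** 2)
print(f"MSE raw {err_raw:.3f}  vs  coarse {err_coarse:.3f}")
```

Averaging 2^j independent errors divides the noise variance by 2^j, which is why the coarser-scale approximations behave well as instruments for the mismeasured variable.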

Details

Essays in Honor of Jerry Hausman
Type: Book
ISBN: 978-1-78190-308-7

Article
Publication date: 9 May 2016

Torsten J. Gerpott and Sebastian May

Abstract

Purpose

Providers of cloud computing storage services (CCSS) sell their offers in several unit bundles, each for a lump sum per bundle. This non-linear pricing approach is known as a bucket-pricing plan (BPP). If a customer exhausts the purchased bucket, he/she can opt for the next higher bucket or refrain from further CCSS use. CCSS suppliers therefore face an optimization problem concerning the number of buckets as well as their lower and upper storage volume boundaries. The purpose of this paper is to develop a model that supports CCSS suppliers in deriving a BPP structure that maximizes their profit in varying market constellations.

Design/methodology/approach

The authors develop a multi-period model of the tariff choice decisions of private CCSS customers. The model is applied in Monte Carlo simulations to determine profit-maximal tariff structures as a function of different market characteristics, such as median demand saturation, demand heterogeneity, average price per storage unit, bucket ceiling allocation (either identical bucket sizes within the frame set by the overall lower and upper boundaries, or varying sizes so that the interval between consecutive ceilings increases for subsequent buckets), and the type of a customer's utility function.
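
The model itself is in the paper; its core mechanic, with simulated customers buying the smallest bucket that covers their storage demand (or opting out when the price exceeds their usage value), can be sketched with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical BPP: bucket ceilings in GB and lump-sum prices per period
ceilings = np.array([10.0, 100.0, 1000.0])
prices = np.array([2.0, 8.0, 25.0])
wtp_per_gb = 0.30   # assumed willingness to pay per GB actually used

# heterogeneous customer demand in GB (lognormal => long right tail)
demand = rng.lognormal(mean=3.0, sigma=1.2, size=100_000)

# index of the smallest bucket covering each customer's demand
idx = np.searchsorted(ceilings, demand)
served = idx < len(ceilings)          # the rest exceed the top bucket
idx = idx[served]
# a customer buys only if the value of usage covers the bucket price
buys = wtp_per_gb * demand[served] >= prices[idx]
revenue = prices[idx][buys].sum()
print(f"sold {buys.sum()} buckets, revenue {revenue:,.0f}")
```

Re-running this over grids of ceilings and prices, for different demand distributions, is the shape of the optimization the simulations perform.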

Findings

The simulation analysis suggests that demand heterogeneity and average price per unit are the most influential factors for CCSS tariff structure optimization. Price plans with more than two buckets tend to generate higher profits than simple schemes with two buckets only if demand heterogeneity is low and the average price per storage unit is high and/or median saturation level of customers is low.

Originality/value

Despite the popularity of BPP among providers of CCSS for consumers, there is a lack of scholarly modeling work on the profit implications of the number of buckets entailed in a scheme and the size/ceilings of the various buckets on offer. The model suggested in this paper is a first step toward narrowing this research gap.

Details

Journal of Modelling in Management, vol. 11 no. 2
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 26 January 2022

Liangyan Liu and Ming Cheng

Abstract

Purpose

In the process of building the “Belt and Road” and “Bright Road” community of interests between China and Kazakhstan, this paper proposes the construction of an inland nuclear power plant in Kazakhstan. Considering the uncertainty of investment in nuclear power generation, the authors propose the MGT (Monte-Carlo and Gaussian Radial Basis with Tensor factorization) utility evaluation model to evaluate the risk of investment in nuclear power in Kazakhstan and provide a relevant reference for decision making on inland nuclear investment in Kazakhstan.

Design/methodology/approach

Based on a real options portfolio combined with a weighted utility function, this study takes into account the uncertainties associated with nuclear power investments through a minimum variance Monte Carlo approach, proposes a noise-enhancing process combined with geometric Brownian motion for solving complex conditions, and incorporates a measure of investment flexibility and strategic value. A deep noise-reduction encoder is used to learn initial values for the potential features of cost and investment effectiveness. A Gaussian radial basis function is used to construct a weighted utility function for each uncertainty and to generate the objective function for the tensor decomposition; the objective loss function of the tensor decomposition is then optimized, the corresponding weights are found, and noise reduction is performed to generalize the nonlinear problem to evaluating the effectiveness of nuclear power investment. Finally, the two dimensions of cost and risk (estimation of investment value and measurement of investment risk) are applied and simulated with actual data from Kazakhstan.
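
The geometric Brownian motion component of the approach can at least be sketched; the drift, volatility, and value figures below are placeholders for illustration, not Kazakh plant data:

```python
import numpy as np

rng = np.random.default_rng(11)

# placeholder parameters for the value of operating cash flows
v0, mu, sigma = 100.0, 0.04, 0.25   # initial value, drift, volatility (p.a.)
T, steps, n_paths = 10, 120, 20_000
dt = T / steps

# exact GBM update: V_{t+dt} = V_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z)
z = rng.standard_normal((n_paths, steps))
log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z,
                      axis=1)
v_T = v0 * np.exp(log_paths[:, -1])

print(f"mean terminal value : {v_T.mean():.1f}")   # theory: v0 * exp(mu * T)
print(f"5% quantile (risk)  : {np.percentile(v_T, 5):.1f}")
```

Real-options valuation layers option payoffs and discounting on top of such simulated paths; the MGT model additionally reweights the uncertainties with its radial-basis utility functions.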

Findings

The authors assess the core indicators of Kazakhstan's nuclear power plants throughout their construction and operating cycles, based on data relating to a cluster of nuclear power plants of 10 different technologies, compare the MGT model with several popular methods for evaluating the benefits of nuclear power generation, and conduct sensitivity analyses of key indicators. Experimental results on the dataset show that the MGT method outperforms the other four methods, that changes in nuclear investment returns are more sensitive to changes in costs, and that, at current levels of investment costs, operating cash flows from nuclear power are an effective way to drive investment reform in inland nuclear power generation in Kazakhstan.

Research limitations/implications

Future research could explore other methods to further improve the accuracy of the investment prediction under sparseness and noise interference. It could also collect expert advice and provide more specific suggestions to facilitate application in practice.

Practical implications

The Novel Coronavirus epidemic has plunged the global economy into a deep recession, and tension between China and the US has made the road to energy cooperation unusually tortuous. Kazakhstan, in Central Asia, has natural geographical and resource advantages, so China–Kazakhstan energy cooperation offers an opportunity for a new era and provides a strong guarantee for China's political and economic stability. The basic idea of building large-scale nuclear power plants in Balkhash and Aktau is put forward in view of the development strategy of building Kazakhstan into a regional international energy base. This work should be a good inspiration for investment in nuclear generation.

Originality/value

This study addresses the problem of increasing noise by combining Monte Carlo simulation with geometric Brownian motion under complex conditions, adds measures of investment flexibility and strategic value, constructs a noise-reduction weighted utility function based on a Gaussian radial basis function, and extends the nonlinear problem to the evaluation of nuclear power investment benefits.

Details

Industrial Management & Data Systems, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 28 January 2014

Constantinos Lefcaditis, Anastasios Tsamis and John Leventides

Abstract

Purpose

The IRB capital requirements of Basel II define the minimum level of capital that a bank has to retain to cover the current risks of its portfolio. The major risk many banks face is credit risk, and Basel II provides an approach to calculate its capital requirement. It is well known that the Pillar I Basel II approach for credit risk capital requirements does not include concentration risk. The paper aims to propose a model that modifies the Basel II (IRB) methodology to include name concentration risk.

Design/methodology/approach

The model is developed on data from a portfolio of Greek companies financed by Greek commercial banks. Based on the initial portfolio, new portfolios with a range of different credit risk parameters were simulated. Subsequently, the credit VaR of the various portfolios was regressed against credit risk indicators such as the Basel II capital requirement and a modified Herfindahl index, and a non-linear model was developed. This model modifies the Pillar I IRB capital requirements model of Basel II to include name concentration risk.
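
The Herfindahl index used as a concentration regressor is computed from exposure shares; the plain (unmodified) version is a one-liner, shown here with toy exposures:

```python
import numpy as np

def herfindahl(exposures):
    """Herfindahl-Hirschman index of a loan portfolio: the sum of squared
    exposure shares. Equals 1/n for n equal loans and tends to 1 as a
    single name dominates."""
    w = np.asarray(exposures, dtype=float)
    s = w / w.sum()
    return float(np.sum(s ** 2))

print(herfindahl([100] * 50))             # granular portfolio of 50 names
print(herfindahl([900, 25, 25, 25, 25]))  # one dominant name
```

In the paper's setup, an index of this kind enters the non-linear regression that corrects the Pillar I capital figure for name concentration.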

Findings

As the Pillar I IRB capital requirements model of Basel II does not include concentration risk, the credit VaR calculations performed in the present work showed gaps relative to the Basel II capital requirements. These gaps were more apparent when there was high concentration risk in the credit portfolios. The new model bridges this gap by providing a correction coefficient.

Practical implications

The credit VaR of a loan portfolio can be calculated by the bank easily, without the use of additional complicated algorithms and systems.

Originality/value

The model is constructed so as to provide an approximation of credit VaR satisfactory for business loan portfolios whose risk parameters lie within the range of those in a realistic bank credit portfolio, without the application of Monte Carlo simulations.

Details

The Journal of Risk Finance, vol. 15 no. 1
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 October 2008

C. Correia and P. Cramer

Abstract

This study employs a sample survey to determine and analyse the corporate finance practices of South African listed companies in relation to cost of capital, capital structure and capital budgeting decisions. The results of the survey are mostly in line with financial theory and are generally consistent with a number of other studies. This study finds that companies always or almost always employ DCF methods such as NPV and IRR to evaluate projects. Companies almost always use the CAPM to determine the cost of equity, and most companies employ either a strict or flexible target debt‐equity ratio. Furthermore, most practices of the South African corporate sector are in line with practices employed by US companies. This reflects the relatively highly developed state of the South African economy, which belies its status as an emerging market. However, the survey has also brought to the fore a number of puzzling results which may indicate some gaps in the application of finance theory. There is limited use of relatively new developments such as real options, APV, EVA and Monte Carlo simulation. Furthermore, the low target debt‐equity ratios reflect the exceptionally low use of debt by South African companies.
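
The DCF techniques the survey asks about reduce to two small functions; a sketch with an invented cash-flow stream:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs now, the rest annually."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection (assumes one sign change,
    with NPV decreasing in the rate over [lo, hi])."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid          # NPV still positive: discount harder
        else:
            hi = mid
    return (lo + hi) / 2

project = [-1000, 400, 400, 400, 400]   # invented project cash flows
print(f"NPV at 12%: {npv(0.12, project):.2f}")
print(f"IRR       : {irr(project):.2%}")
```

A project is accepted under the NPV rule when its NPV at the cost of capital is positive, or under the IRR rule when the IRR exceeds that cost of capital; the survey finds both rules in near-universal use.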
