Search results
1 – 10 of over 2000

Baabak Ashuri, Jian Lu and Hamed Kashani
Abstract
Purpose
This paper aims to present a financial valuation framework based on the real options theory to evaluate investments in toll road projects delivered under the two‐phase development plan.
Design/methodology/approach
The approach applies the real options theory to evaluate investments in toll road projects. In particular, the risk‐neutral valuation method is used for pricing the flexibility embedded in the two‐phase development plan. A risk‐neutral binomial lattice is used to model traffic uncertainty and to find the optimal time for the toll road expansion. Probabilistic life cycle cost and revenue analysis is conducted to characterize the investor's financial risk profile and determine the flexibility value of the expansion option.
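The abstract gives no numbers, but the lattice mechanics can be sketched. A minimal sketch in Python, assuming an illustrative geometric process for traffic volume and a linear expansion payoff; all names and parameter values here are ours, not the authors':

```python
import math

def expansion_option_value(v0, sigma, r, T, steps, payoff):
    """Value of the option to expand, on a risk-neutral binomial lattice
    over a traffic-volume proxy v (e.g. vehicles/day).

    payoff(v) is the net value of expanding when traffic equals v
    (extra toll revenue in present-value terms minus expansion cost).
    Returns the option value and, per step, the lowest traffic level
    at which immediate expansion beats waiting (the exercise boundary).
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)

    # option values at maturity
    values = [max(payoff(v0 * u**j * d**(steps - j)), 0.0)
              for j in range(steps + 1)]
    boundary = [None] * steps
    for i in range(steps - 1, -1, -1):
        new = []
        for j in range(i + 1):           # j counts up-moves, so v is ascending
            v = v0 * u**j * d**(i - j)
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = payoff(v)
            if exercise >= cont and exercise > 0 and boundary[i] is None:
                boundary[i] = v          # lowest traffic where expansion is optimal
            new.append(max(cont, exercise))
        values = new
    return values[0], boundary
```

The boundary list is the lattice analogue of the traffic boundary the paper characterizes: at each step, the level of traffic above which the investor should expand rather than wait.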
Findings
The flexible, two‐phase development plan can improve the investor's financial risk profile in a toll road project by limiting the downside risk of overinvestment (i.e. decreasing the probability of investment loss) and increasing the expected investment value.
Social implications
Private and public sectors can benefit from this valuation framework, using tax dollars and users' fees effectively by avoiding overinvestment in toll road projects.
Originality/value
The framework consists of several integrated features that distinguish it from existing investment valuation models. The risk‐neutral valuation method is applied to price the flexibility embedded in the two‐phase development plan. This real options framework can characterize the traffic boundary at which it is optimal for the investor to expand the toll road. Further, it provides the likelihood distribution of when the investor may expand the toll road.
Krasimir Milanov and Ognyan Kounchev
Abstract
In this chapter we concentrate on the most popular model for convertible bond (CB) valuation in a one-factor setting with a stochastic underlying stock price. Over the last decade, the Tsiveriotis–Fernandes (1998) model, which incorporates the state of default of the CB's issuer, has become one of the most widely discussed models. The routine approach to solving this model is to use finite difference schemes (FDS). However, many people trained in finance find these methods unintuitive and tend to ignore them, preferring the binomial-tree approach as a more intuitive technique. For that reason, our primary focus is to answer the so far unanswered question: does the binomial-tree approach to CBs provide accurate pricing, hedging, and risk assessment? We show on a set of representative examples that the binomial-tree methodology cannot provide a consistent analysis of pricing, hedging, and risk assessment. We start the chapter with the basics of CBs and the CB market. We then explain the implementation of the TF model within the binomial-tree approach. We conclude the chapter with a performance evaluation of the binomial-tree approach, showing unexpected behavior in practice areas such as pricing (the profile of the CB's price versus the underlying stock price), hedging (the performance of the CB's delta, gamma, and a convertible arbitrage strategy versus the underlying stock), and risk assessment (Monte Carlo VaR with respect to the underlying).
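A stripped-down sketch of the Tsiveriotis–Fernandes split on a Cox–Ross–Rubinstein tree may help make the two-component rollback concrete. Coupons, call/put features and the chapter's precise default treatment are omitted, and all parameter names are illustrative:

```python
import math

def tf_convertible_binomial(S0, sigma, r, r_credit, T, steps, face, conv_ratio):
    """Stripped-down Tsiveriotis–Fernandes valuation on a CRR binomial tree.

    The CB value is split into an equity part (discounted at the risk-free
    rate r) and a cash-only part (discounted at the risky rate r_credit);
    coupons, call and put features are omitted for brevity.
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc_eq = math.exp(-r * dt)            # equity component discount
    disc_cash = math.exp(-r_credit * dt)   # cash-only component discount

    eq, cash = [], []
    for j in range(steps + 1):
        s = S0 * u**j * d**(steps - j)
        if conv_ratio * s > face:          # holder converts at maturity
            eq.append(conv_ratio * s)
            cash.append(0.0)
        else:                              # holder takes the redemption amount
            eq.append(0.0)
            cash.append(face)

    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            s = S0 * u**j * d**(i - j)
            e = disc_eq * (p * eq[j + 1] + (1 - p) * eq[j])
            c = disc_cash * (p * cash[j + 1] + (1 - p) * cash[j])
            if conv_ratio * s > e + c:     # voluntary early conversion
                e, c = conv_ratio * s, 0.0
            eq[j], cash[j] = e, c
    return eq[0] + cash[0]
```

With conv_ratio set to zero the tree collapses to a straight risky zero-coupon bond, which is a convenient sanity check on the two discount rates.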
Dasheng Ji and B. Wade Brorsen
Abstract
Purpose
The purpose of this paper is to develop an option pricing model applicable to US options. The lognormality assumption that has typically been imposed with past binomial and trinomial option pricing models is relaxed. The relaxed lattice model is then used to determine skewness and kurtosis of distributions of futures prices implied from option prices.
Design/methodology/approach
The relaxed lattice is based on Gaussian quadrature. The markets studied include corn, soybeans, and wheat. Skewness and kurtosis are implied by minimizing the squared deviations of actual option premia from predicted premia.
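The relaxed lattice itself is not reproduced here, but the implied-moment bookkeeping the paper reports reduces to moments of a discrete distribution over lattice nodes. A minimal helper, with illustrative node prices and probabilities rather than fitted values:

```python
import math

def lattice_moments(prices, probs):
    """Mean, variance, skewness and excess kurtosis of a discrete
    (lattice) distribution of terminal futures prices."""
    m = sum(p * x for x, p in zip(prices, probs))
    var = sum(p * (x - m) ** 2 for x, p in zip(prices, probs))
    sd = math.sqrt(var)
    skew = sum(p * (x - m) ** 3 for x, p in zip(prices, probs)) / sd ** 3
    kurt = sum(p * (x - m) ** 4 for x, p in zip(prices, probs)) / var ** 2 - 3.0
    return m, var, skew, kurt
```

In a calibration like the paper's, the node probabilities would be chosen to minimize the squared deviations of model premia from observed premia, and these moments would then be read off the fitted lattice.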
Findings
Positive skewness is the major source of nonnormality, but both skewness and kurtosis are important, as the trinomial model, which considers kurtosis, has greater accuracy than the binomial model. The out‐of‐sample forecasting accuracy of the relaxed lattice models is better than that of the Black‐Scholes model in most, but not all, cases.
Research limitations/implications
The model might benefit from using option prices from more than one day. The implied skewness and kurtosis were quite variable and using more data might reduce this variability.
Practical implications
Empirical results mostly show positive implied skewness, which suggests extreme price rises were more likely than extreme price decreases.
Originality/value
The relaxed lattice is a new model and the results about implied higher moments are new for these commodities. There are competing models available that should be able to get similar accuracy, so one key advantage of the new approach is its simplicity and ease of use.
Kim Hin David Ho and Shea Jean Tay
Abstract
Purpose
The purpose of this paper is to examine the risk neutral and non-risk neutral pricing of Singapore Real Estate Investment Trusts (S-REITs) by comparing the average of the individual ratios of deviation between the expected and observed closing price to the observed closing price with the ratio of standard deviation to mean for closing prices, where expected prices are obtained via the binomial options pricing tree model.
Design/methodology/approach
If the ratio of standard deviation to mean exceeds the average ratio of deviation between the expected and observed closing price to the observed closing price, then the deviation of closing prices from the expected risk neutral prices is not significant and the S-REIT is consistent with risk neutral pricing. If the ratio of deviation between the expected and observed closing price to the observed closing price is greater, then the S-REIT is not consistent with risk neutral pricing.
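The decision rule reads directly as code. A minimal sketch with hypothetical price series; the ratio definitions follow this paragraph, not the paper's data:

```python
from statistics import mean, pstdev

def consistent_with_risk_neutral(expected, observed):
    """Decision rule: an S-REIT is treated as consistent with risk
    neutral pricing when the (standard deviation / mean) ratio of
    observed closing prices exceeds the average ratio of deviation
    between expected and observed closing price to observed price."""
    dev_ratio = mean(abs(e - o) / o for e, o in zip(expected, observed))
    cv = pstdev(observed) / mean(observed)
    return cv > dev_ratio
```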
Findings
Capitacommercial Trust (CCT), Capitamall Trust (CMT) and Keppel Real Estate Investment Trust (REIT) have large positive differences between the two ratios (39.86, 30.79 and 18.96 percent, respectively), implying that these S-REITs are not trading at risk neutral pricing. Suntec REIT has a small positive difference of 2.35 percent between both ratios, implying that it is trading at risk neutral pricing. Ascendas REIT has the largest negative difference between the two ratios at −4.24 percent, followed by Mapletree Logistics Trust at −0.44 percent. Both S-REITs are trading at risk neutral pricing. The analysis shows that CCT, CMT and Keppel REIT exhibit risk averse pricing.
Research limitations/implications
Results are consistent with prudential asset allocation for viable S-REIT portfolio investing, although not all of these S-REITs exhibit strong market efficiency in their pricing.
Practical implications
Pricing may be risk neutral over a certain period but investor sentiments, fear of risks and speculative activities could affect an S-REIT’s risk neutrality.
Social implications
With enhanced risk diversification activities, the S-REITs should attain risk neutral pricing.
Originality/value
Virtually no research of this nature has been undertaken for S-REITs.
Sandra García-Bustos, Nadia Cárdenas-Escobar, Ana Debón and César Pincay
Abstract
Purpose
The study aims to design a control chart based on an exponentially weighted moving average (EWMA) chart of Pearson's residuals of a model of negative binomial regression in order to detect possible anomalies in mortality data.
Design/methodology/approach
To evaluate the performance of the proposed chart, the authors considered official historical records of child deaths in Ecuador. A negative binomial regression model was fitted to the data, and a chart of the Pearson residuals was designed. The parameters of the chart were obtained by simulation, as was the charts' performance in detecting changes in the mean number of deaths.
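A minimal sketch of an EWMA chart on standardized Pearson residuals, assuming in-control residuals with mean 0 and standard deviation 1; the smoothing constant λ and limit width L below are generic chart parameters, not values fitted to the Ecuadorian data:

```python
import math

def ewma_chart(residuals, lam=0.05, L=3.0):
    """EWMA chart for standardized (Pearson) residuals, assumed to have
    mean 0 and standard deviation 1 when the process is in control.
    Returns the EWMA statistics, the upper control limits and the
    1-based indices of out-of-control points."""
    z = 0.0
    stats, limits, signals = [], [], []
    for i, res in enumerate(residuals, start=1):
        z = lam * res + (1 - lam) * z
        # time-varying standard deviation of the EWMA statistic
        sd = math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        stats.append(z)
        limits.append(L * sd)
        if abs(z) > L * sd:
            signals.append(i)
    return stats, limits, signals
```

A small λ such as 0.05 accumulates evidence slowly, which is why it detects small sustained shifts in the mean quickly, consistent with the paper's finding.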
Findings
When the chart was plotted, outliers were detected in the deaths of children in the years 1990–1995, 2001–2006 and 2013–2015, which could indicate underreporting or excessive growth in mortality. In the performance analysis, the value λ = 0.05 gave the fastest detection of changes in the mean number of deaths.
Originality/value
The proposed charts perform better than EWMA charts for deviance residuals, with the notable advantage that Pearson residuals are much easier to interpret and calculate. Finally, the authors would like to point out that although this paper only applies control charts to Ecuadorian infant mortality, the methodology can be used to monitor mortality in any geographical area or to detect outbreaks of infectious diseases.
Abstract
Starting from a basis laid by Burrell, this paper develops a stochastic model of library borrowing using the negative binomial distribution. This shows an improvement over previous characterizations for academic libraries and accords well with new data obtained at Huddersfield Public Library. Evidence concerning the process of issue decay is presented and employed to obtain an optimum stock turnover rate for any collection in its ‘steady state’. A method is then given by which simple relegation tests can be constructed to maintain such an optimum turnover. Although not the ‘final word’ in circulation modelling, the negative binomial distribution extends the range of model applicability into the area of high-volume, rapid-movement collections with some success.
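A minimal sketch of the negative binomial borrowing model, assuming the standard (r, p) parameterization; the parameter values used here are illustrative, not Burrell's or the Huddersfield fits:

```python
import math

def nb_pmf(k, r, p):
    """P(X = k) for a negative binomial with shape r and success
    probability p; mean r * (1 - p) / p issues per item per period."""
    return math.exp(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
                    + r * math.log(p) + k * math.log(1 - p))

def expected_idle_fraction(r, p):
    """Fraction of the collection expected never to be borrowed in a
    period: the zero class of the negative binomial."""
    return nb_pmf(0, r, p)
```

In a circulation model of this type, the zero class is what relegation tests act on: items predicted to sit in it period after period are candidates for relegation.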
Abstract
The challenge of truckload routing is increased in complexity by the introduction of stochastic demand. Typically, this demand is assumed to follow a Poisson distribution. In this chapter, we cluster the demand data using data mining techniques to establish a more suitable distribution for predicting demand. We then examine this stochastic truckload demand using an econometric discrete choice model known as a count data model. Using actual truckload demand data and data from the Bureau of Transportation Statistics, we perform count data regressions. Every regression run produces two outcomes: the predicted demand between every origin and destination, and the likelihood that that demand will occur. Together these allow us to generate an expected-value forecast of truckload demand as input to a truckload routing formulation. The negative binomial distribution produces an improved forecast over the Poisson distribution.
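The Poisson-versus-negative-binomial comparison can be illustrated with a toy overdispersed sample. The fitting here is maximum likelihood for the Poisson and method-of-moments for the negative binomial, chosen for brevity; it is not the chapter's count data regressions:

```python
import math

def poisson_loglik(data):
    """Log-likelihood of count data under a Poisson fit (MLE: the mean)."""
    lam = sum(data) / len(data)
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1) for k in data)

def negbin_loglik(data):
    """Log-likelihood under a negative binomial fitted by the method of
    moments; requires overdispersion (variance > mean)."""
    m = sum(data) / len(data)
    v = sum((k - m) ** 2 for k in data) / len(data)   # population variance
    p = m / v                                         # success probability
    r = m * m / (v - m)                               # shape parameter
    return sum(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(p) + k * math.log(1 - p) for k in data)
```

When lane-level demand counts are overdispersed, as truckload demand typically is, the negative binomial fit attains the higher log-likelihood, mirroring the chapter's forecasting result.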
Brian Sloan, Olubukola Tokede, Sam Wamuziri and Andrew Brown
Abstract
Purpose
The main purpose of the study is to promote consideration of the issues and approaches available for costing sustainable buildings with a view to minimising cost overruns, occasioned by conservative whole-life cost estimates. The paper primarily looks at the impact of adopting continuity in whole-life cost models for zero carbon houses.
Design/methodology/approach
The study adopts a mathematically based risk procedure built on the binomial theorem for analysing the cost implications of the Lighthouse zero-carbon house project. A practical application of the continuous whole-life cost model is developed, and results are compared with existing whole-life cost techniques using finite element methods and Monte Carlo analysis.
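A minimal sketch of a binomial-theorem risk treatment of whole-life cost, assuming independent annual overrun events with a common probability; the cost structure and figures are illustrative, not the Lighthouse data:

```python
import math

def overrun_probabilities(n, p):
    """P(exactly k overrun events in n periods), k = 0..n, from the
    binomial theorem with independent periods."""
    return [math.comb(n, k) * p ** k * (1 - p) ** (n - k)
            for k in range(n + 1)]

def expected_whole_life_cost(base, overrun, n, p, rate):
    """Expected discounted whole-life cost: a base annual cost plus an
    overrun event occurring each year with probability p."""
    annuity = sum(1.0 / (1 + rate) ** t for t in range(1, n + 1))
    return (base + p * overrun) * annuity
```

The point the abstract makes survives even in this toy: a deterministic discounted estimate (p = 0) understates the expected cost whenever overrun risk is present.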
Findings
With standard whole-life costing, discounted present-value analysis tends to underestimate the cost of a project. Adopting continuity in whole-life cost models presents a clearer picture and profile of the economic realities and decision-choices confronting clients and policy-makers. It also expands the informative scope on the costs of zero-carbon housing projects.
Research limitations/implications
A primary limitation in this work is its focus on just one property type as the unit of analysis. This research is also limited in its consideration of initial and running cost categories only. The capital cost figures for the Lighthouse are indicative rather than definitive.
Practical implications
The continuous whole-life cost technique is a novel and innovative approach in financial appraisal […] Benefits of an improved costing framework will be far-reaching in establishing effective policies aimed at client acceptance and optimally performing supply chain networks.
Originality/value
The continuous whole-life costing pioneers an experimental departure from the stereotypical discounting mechanism in standard whole-life costing procedures.
Dominique Lord and Srinivas Reddy Geedipally
Abstract
Purpose – This chapter provides an overview of issues related to analysing crash data characterised by excess zero responses and/or long tails and how to overcome these problems. Factors affecting excess zeros and/or long tails are discussed, as well as how they can bias the results when traditional distributions or models are used. Recently introduced multi-parameter distributions and models developed specifically for such datasets are described. The chapter is intended to guide readers on how to properly analyse crash datasets with excess zeros and long or heavy tails.
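One widely used multi-parameter alternative to the negative binomial for excess zeros is the zero-inflated negative binomial; a minimal sketch of its probability mass function follows (the chapter may cover different distributions, and the parameter values in the usage note are illustrative):

```python
import math

def zinb_pmf(k, pi, r, p):
    """Zero-inflated negative binomial: with probability pi a site is a
    structural zero; otherwise crash counts follow a negative binomial
    with shape r and success probability p."""
    nb = math.exp(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
                  + r * math.log(p) + k * math.log(1 - p))
    return pi * (k == 0) + (1 - pi) * nb
```

The extra parameter pi lets the zero class exceed what any single negative binomial can produce, which is exactly the excess-zero feature of crash datasets the chapter discusses.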
Methodology – Key references from the literature are summarised and discussed, and two examples detailing how multi-parameter distributions and models compare with the negative binomial distribution and model are presented.
Findings – In the event that the characteristics of the crash dataset cannot be changed or modified, recently introduced multi-parameter distributions and models can be used efficiently to analyse datasets characterised by excess zero responses and/or long tails. They offer a simpler way to interpret the relationship between crashes and explanatory variables, while providing better statistical performance in terms of goodness-of-fit and predictive capabilities.
Research implications – Multi-parameter models are expected to become the next generation of traditional distributions and models. Research on these models is still ongoing.
Practical implications – With the advancement of computing power and Bayesian simulation methods, multi-parameter models can now be easily coded and applied to analyse crash datasets characterised by excess zero responses and/or long tails.