Search results

1 – 10 of 100
Book part
Publication date: 15 January 2010

Denis Bolduc and Ricardo Alvarez-Daziano

Abstract

The search for flexible models has led the simple multinomial logit model to evolve into the powerful but computationally very demanding mixed multinomial logit (MMNL) model. That search for flexibility also led to hybrid choice model (HCM) formulations, which explicitly incorporate psychological factors affecting decision making in order to enhance the behavioral representation of the choice process. The HCM expands on standard choice models by including attitudes, opinions, and perceptions as psychometric latent variables.

In this paper we describe the classical estimation technique for a simulated maximum likelihood (SML) solution of the HCM. To show its feasibility, we apply it to data of stated personal vehicle choices made by Canadian consumers when faced with technological innovations.

We then go beyond classical methods, and estimate the HCM using a hierarchical Bayesian approach that exploits HCM Gibbs sampling considering both a probit and a MMNL discrete choice kernel. We then carry out a Monte Carlo experiment to test how the HCM Gibbs sampler works in practice. To our knowledge, this is the first practical application of HCM Bayesian estimation.

We show that although HCM joint estimation requires the evaluation of complex multi-dimensional integrals, SML can be successfully implemented. The HCM framework not only proves to be capable of introducing latent variables, but also makes it possible to tackle the problem of measurement errors in variables in a very natural way. We also show that working with Bayesian methods has the potential to break down the complexity of classical estimation.
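The simulated maximum likelihood step described above — averaging the logit choice probability over random draws of the taste coefficients — can be sketched in a few lines. This is a generic illustration, not the authors' implementation; the array shapes and the normal taste distribution are assumptions.

```python
import numpy as np

def mixed_logit_sll(beta_mean, beta_sd, X, y, n_draws=200, seed=0):
    """Simulated log-likelihood for a mixed multinomial logit (MMNL).

    beta_mean, beta_sd : mean and std dev of normal taste coefficients, shape (K,)
    X : alternative attributes, shape (N, J, K); y : chosen alternative, shape (N,)
    """
    rng = np.random.default_rng(seed)
    N, J, K = X.shape
    ll = 0.0
    for n in range(N):
        # Draw individual taste coefficients, one row per simulation draw
        draws = beta_mean + beta_sd * rng.standard_normal((n_draws, K))
        v = X[n] @ draws.T                     # utilities, shape (J, n_draws)
        v -= v.max(axis=0)                     # guard against overflow
        p = np.exp(v) / np.exp(v).sum(axis=0)  # logit probabilities per draw
        ll += np.log(p[y[n]].mean())           # simulated choice probability
    return ll
```

Maximizing this simulated log-likelihood over (beta_mean, beta_sd) with a numerical optimizer yields the SML estimates.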

Details

Choice Modelling: The State-of-the-art and The State-of-practice
Type: Book
ISBN: 978-1-84950-773-8

Book part
Publication date: 18 July 2016

Alan D. Olinsky, Kristin Kennedy and Michael Salzillo

Abstract

Forecasting the number of bed days (NBD) needed within a large hospital network is extremely challenging, but it is imperative that management find a predictive model that estimates it well. This estimate is used by operational managers for logistical planning purposes. Furthermore, the finance staff of a hospital require an expected NBD as input for estimating future expenses. Some hospital reimbursement contracts are on a per diem schedule, and the expected NBD is useful in forecasting future revenue.

This chapter examines two ways of estimating the NBD for a large hospital system, building on previous work that compared time series regression and an autoregressive integrated moving average (ARIMA) model. The two approaches discussed in this chapter examine whether the total (combined) NBD for all the data is a better predictor than data partitioned by type of service. The four partitions are medical, maternity, surgery, and psychology. The partitioned time series would then be used to forecast future NBD by type of service, but one could also sum the partitioned predictions into an alternative total forecast. The question is whether one of these two approaches outperforms the other in fitting and forecasting the NBD. The approaches presented in this chapter can be applied to a variety of time series data for business forecasting whenever a large database of information can be partitioned into smaller segments.
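As a toy illustration of the total-versus-partitioned question, simple exponential smoothing can stand in for the chapter's ARIMA models; the service-line series below are fabricated, and each series gets its own smoothing weight chosen by in-sample error.

```python
import numpy as np

def ses(series, alpha):
    """One-step-ahead simple exponential smoothing: returns (forecast, SSE)."""
    level, sse = series[0], 0.0
    for x in series[1:]:
        sse += (x - level) ** 2
        level = alpha * x + (1 - alpha) * level
    return level, sse

def best_forecast(series, grid=np.linspace(0.05, 0.95, 19)):
    """Pick the smoothing weight with the lowest in-sample error."""
    return min((ses(series, a) for a in grid), key=lambda t: t[1])[0]

# Hypothetical monthly bed-day counts for the four service lines
rng = np.random.default_rng(1)
medical   = 500 + rng.normal(0, 20, 36)
maternity = 120 + rng.normal(0, 10, 36)
surgery   = 300 + rng.normal(0, 15, 36)
psych     =  80 + rng.normal(0,  8, 36)
parts = [medical, maternity, surgery, psych]

combined    = best_forecast(sum(parts))              # forecast the total directly
partitioned = sum(best_forecast(p) for p in parts)   # sum the per-service forecasts
```

Because each partition is tuned separately, the two totals generally differ, which is exactly the comparison the chapter sets up.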

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78635-534-8

Keywords

Book part
Publication date: 5 October 2018

Olalekan Shamsideen Oshodi and Ka Chi Lam

Abstract

Fluctuations in the tender price index have an adverse effect on the construction sector and the economy at large, largely because of the positive relationship between the construction industry and economic growth. The consequences of these fluctuations include cost overruns and schedule delays, among others. An accurate forecast of the tender price index would help control the uncertainty associated with its variation. In the present study, the efficacy of an adaptive neuro-fuzzy inference system (ANFIS) for tender price forecasting is investigated. In addition, the Box–Jenkins model, considered a benchmark technique, was used to evaluate the performance of the ANFIS model. The results demonstrate that the ANFIS model is superior to the Box–Jenkins model in terms of the accuracy and reliability of the forecast, and that it can provide an accurate and reliable forecast of the tender price index in the medium term (i.e. over a three-year period). This chapter provides evidence of the advantages of applying nonlinear modelling techniques (such as the ANFIS) to tender price index forecasting. Although the proposed ANFIS model is applied to the tender price index in this study, it can also be applied to a wider range of problems in the field of construction engineering and management.
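Whatever the forecasting model, the comparison step described above reduces to computing accuracy metrics on held-out data. A minimal sketch — the index values and both model outputs below are fabricated, not results from the chapter:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error (lower is better)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * float(np.mean(np.abs((actual - forecast) / actual)))

def rmse(actual, forecast):
    """Root mean squared error, in index points."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))

# Hypothetical out-of-sample tender price index values and two model forecasts
actual        = [100, 104, 108, 111, 115, 118]
anfis_fcst    = [101, 103, 109, 110, 116, 117]   # assumed ANFIS output
box_jenkins_f = [ 98, 107, 104, 115, 111, 122]   # assumed ARIMA output

# The model with the lower MAPE/RMSE gives the more accurate forecast
```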

Details

Fuzzy Hybrid Computing in Construction Engineering and Management
Type: Book
ISBN: 978-1-78743-868-2

Keywords

Book part
Publication date: 24 March 2006

Ngai Hang Chan and Wilfredo Palma

Abstract

Since the seminal works of Granger and Joyeux (1980) and Hosking (1981), estimation of long-memory time series models has received considerable attention and a number of parameter estimation procedures have been proposed. This paper gives an overview of this plethora of methodologies, with special focus on likelihood-based techniques. Broadly speaking, likelihood-based techniques can be classified into the following categories: exact maximum likelihood (ML) estimation (Sowell, 1992; Dahlhaus, 1989), ML estimates based on autoregressive approximations (Granger & Joyeux, 1980; Li & McLeod, 1986), Whittle estimates (Fox & Taqqu, 1986; Giraitis & Surgailis, 1990), Whittle estimates with autoregressive truncation (Beran, 1994a), approximate estimates based on the Durbin–Levinson algorithm (Haslett & Raftery, 1989), state-space-based maximum likelihood estimates for ARFIMA models (Chan & Palma, 1998), and estimation of stochastic volatility models (Ghysels, Harvey, & Renault, 1996; Breidt, Crato, & de Lima, 1998; Chan & Petris, 2000), among others. Given the diversified applications of these techniques in different areas, this review aims to provide a succinct survey of these methodologies as well as an overview of important related problems, such as ML estimation with missing data (Palma & Chan, 1997), the influence of subsets of observations on estimates, and the estimation of seasonal long-memory models (Palma & Chan, 2005). The performances, asymptotic properties, inter-connections, and finite-sample behaviour of these procedures are compared and examined. Finally, applications of these methodologies to financial time series are discussed.
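To give a flavour of the Whittle-type techniques surveyed above, a local Whittle estimate of the memory parameter d can be sketched in a few lines. This is a grid-search toy built on the standard frequency-domain objective, not any of the cited implementations; the bandwidth rule is an assumption.

```python
import numpy as np

def local_whittle_d(x, m=None):
    """Grid-search local Whittle estimate of the memory parameter d.

    Minimizes R(d) = log(mean(lam_j**(2d) * I_j)) - 2d * mean(log(lam_j))
    over the first m Fourier frequencies lam_j, where I_j is the periodogram.
    """
    x = np.asarray(x, float)
    n = len(x)
    m = m or int(n ** 0.65)                      # assumed bandwidth rule
    lam = 2 * np.pi * np.arange(1, m + 1) / n    # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    grid = np.linspace(-0.49, 0.99, 149)
    R = [np.log(np.mean(lam ** (2 * d) * I)) - 2 * d * np.mean(np.log(lam))
         for d in grid]
    return float(grid[int(np.argmin(R))])
```

White noise should give d near 0, while a strongly persistent series (e.g. a random walk) pushes the estimate toward the upper bound.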

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-1-84950-388-4

Book part
Publication date: 1 September 2008

Rebecca Lawrence

Abstract

This chapter analyses the private financial sector's policy responses, lending practices and various forms of engagement with non-governmental organisations (NGOs), communities and institutional clients involved in controversial commodity industries. The chapter demonstrates that secrecy plays a constitutive role in this engagement. For investment banks, client-confidentiality is the ultimate limit to transparency. At the same time, NGOs campaign to make public and reveal links between investment banks and clients in commodity industries. The chapter also explores techniques within the financial sector for the assessment of social and environmental risk. The chapter argues that these techniques combine both practices of uncertainty and practices of risk. For civil society organisations, NGOs and local communities, these techniques remain problematic, and various campaigns question both the robustness of the financial sector's social risk screening methods as well as the sustainability of the investments themselves.

Details

Hidden Hands in the Market: Ethnographies of Fair Trade, Ethical Consumption, and Corporate Social Responsibility
Type: Book
ISBN: 978-1-84855-059-9

Book part
Publication date: 2 June 2008

Rika Takahashi, Jin Kenzaki and Makoto Yano

Abstract

In the real world, developed countries are permitted to impose tariffs only on a small range of imports (partial tariff). For this reason, tariff policies have been replaced in many countries by other policy devices such as a competition policy. This study compares a competition policy with a partial tariff policy. It demonstrates that if a country can impose a tariff on only a small part of the imports and at sufficiently low tariff rates, optimal partial tariff policy may not create as large a protective effect as optimal competition policy.

Details

Contemporary and Emerging Issues in Trade Theory and Policy
Type: Book
ISBN: 978-1-84950-541-3

Keywords

Book part
Publication date: 18 October 2014

Elesa Zehndorfer and Chris Mackintosh

Abstract

Purpose

This paper analyses the radical reorganisation of English school sport by the coalition government, a move that led to the emergence of a significant discourse of dissatisfaction amongst school sport advocacy coalition groups.

Design/methodology/approach

This paper utilises Sabatier’s (Sabatier & Jenkins-Smith, 1999) Advocacy Coalition Framework (ACF) to identify how the coalition government’s decision to abolish the successful Physical Education School Sport and Club Links (PESSCL) programme has specifically weakened the power of formerly influential advocacy coalitions within the school sport arena. Weber’s (1947) conceptualisation of charisma, in particular, the concept of charismatic rhetoric, is used to explain how these historically extensive policy changes were communicated by the coalition government, and particularly, by Michael Gove, the Secretary of State.

Findings

Locating the government’s rhetoric within the charismatic literature allowed the exploration of how a disempowerment of advocacy coalition groups and centralisation of power towards the state might have been partly achieved via the use of charismatic rhetoric (Weber, 1947).

Originality/value

Javidan and Waldman (2003) identified a lack of rigorous empirical study of charismatic leadership and its consequences in public sector leadership, a gap that this paper addresses.

Details

European Public Leadership in Crisis?
Type: Book
ISBN: 978-1-78350-901-0

Keywords

Details

Transformative Leadership in Action: Allyship, Advocacy & Activism
Type: Book
ISBN: 978-1-83909-520-7

Book part
Publication date: 1 January 2005

Carroll Foster and Robert R. Trout

Abstract

The basic model for estimating economic losses to a company that has some type of business interruption is well-documented in the forensic economics literature. A summary of much of this literature is contained in Gaughan (2000). The general method used to measure damages is essentially the same regardless of whether the loss occurs because of some type of natural disaster (as in insurance claims resulting from flood, fire, or hurricane) or whether it is caused by the actions of another party (as with potential tort claims). The interruption prevents the firm from selling units of product, which would otherwise have been supplied to the market. Economic damage is the loss of revenues less the incremental production costs of the units not sold, plus or minus some adjustment factors described in Gaughan (2000, 2004), and elsewhere.
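The general damage calculation described above is simple arithmetic once the inputs are established. A hypothetical sketch — the function name, parameters, and figures are illustrative, not from Gaughan's treatment:

```python
def lost_profit(units_not_sold, price, unit_cost, adjustments=0.0):
    """Business-interruption damages: lost revenue minus the incremental
    cost of the units not produced, plus or minus case-specific adjustments."""
    return units_not_sold * (price - unit_cost) + adjustments

# e.g. 1,000 unsold units at a $50 price and $30 incremental unit cost
damages = lost_profit(1000, 50.0, 30.0)
```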

Details

Developments in Litigation Economics
Type: Book
ISBN: 978-1-84950-385-3

Book part
Publication date: 21 November 2018

Siti Mariam Norrulashikin, Fadhilah Yusof, Zulkifli Yusop, Ibrahim Lawal Kane, Norizzati Salleh and Aaishah Radziah Jamaludin

Abstract

There is evidence that a stationary short memory process that encounters occasional structural breaks can show the properties of long memory processes, or persistence behaviour, which may lead to extreme weather conditions. In this chapter, we applied three techniques for testing long memory to six daily rainfall datasets in the Kelantan area. The results show that all the datasets exhibit long memory. An empirical fluctuation process was then employed to test for structural changes using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test, and structural change was detected in all datasets. Long memory testing was then applied to the datasets subdivided at their respective breaks, and the results show that the subseries follow the same pattern as the original series. Hence, a true long memory exists in the data generating process (DGP) even though structural breaks occur within the data series.
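The OLS-based CUSUM idea used above can be sketched as follows: the empirical fluctuation process is the scaled cumulative sum of residuals from the simplest OLS fit (a regression on a constant), and a large excursion signals a change in level. The series below are fabricated stand-ins, not the Kelantan rainfall data.

```python
import numpy as np

def ols_cusum_stat(x):
    """Sup of the OLS-based CUSUM empirical fluctuation process.

    Residuals are deviations from the sample mean (OLS on a constant);
    a large maximum excursion of their scaled cumulative sum points to
    a structural change in the level of the series.
    """
    x = np.asarray(x, float)
    resid = x - x.mean()
    cusum = np.cumsum(resid) / (resid.std(ddof=1) * np.sqrt(len(x)))
    return float(np.max(np.abs(cusum)))

# A stable series versus one with a clear level shift at mid-sample
t = np.linspace(0, 20, 200)
stable  = 10.0 + 0.5 * np.sin(t)               # no break
shifted = stable + np.where(t < 10, 0.0, 5.0)  # break at t = 10
```

The shifted series produces a far larger excursion than the stable one, mirroring the breaks detected in the rainfall datasets.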

Details

Improving Flood Management, Prediction and Monitoring
Type: Book
ISBN: 978-1-78756-552-4

Keywords
