Search results

1 – 10 of over 4000
Article
Publication date: 11 January 2018

Qin Zhang and P.B. Seetharaman

Abstract

Purpose

The purpose of this paper is to propose a method to help firms assess the lifetime profitability of customers whose buying behaviors are characterized by purchasing cycles, shaped both by intrinsic purchasing cycles and by the cumulative effects of firms’ marketing solicitations.

Design/methodology/approach

This paper first proposes a probability model to predict customers’ responses to firms’ marketing solicitations, in which a customer’s purchase counts are assumed to follow a Poisson distribution whose rate parameter varies across customers according to a gamma distribution. The paper then proposes a customer profitability scoring model that uses customers’ responses as an input to assess their lifetime profitability at a given point in time.
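
Since the abstract stresses the gamma-Poisson conjugacy that makes the model easy to implement, a minimal sketch of the standard conjugate update may help; the notation (shape r, rate α, observed purchase counts y_1,…,y_n) is illustrative and not taken from the paper.

```latex
% Gamma prior on a customer's purchase rate, Poisson purchase counts
\lambda \sim \operatorname{Gamma}(r, \alpha), \qquad
y_i \mid \lambda \sim \operatorname{Poisson}(\lambda), \quad i = 1, \dots, n

% Conjugate posterior after observing n periods
\lambda \mid y_{1:n} \sim \operatorname{Gamma}\!\left(r + \sum_{i=1}^{n} y_i,\; \alpha + n\right)

% Predictive distribution of the next period's count: negative binomial
\Pr(y_{n+1} = k \mid y_{1:n})
  = \binom{k + r_n - 1}{k}
    \left(\frac{\alpha_n}{\alpha_n + 1}\right)^{r_n}
    \left(\frac{1}{\alpha_n + 1}\right)^{k},
\qquad r_n = r + \sum_{i=1}^{n} y_i,\; \alpha_n = \alpha + n
```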

Findings

The paper illustrates the proposed method using individual-level purchasing data of 529 customers from a catalog firm. The paper shows that the proposed model outperforms the benchmark model in terms of both explaining and predicting customers’ purchases. The paper also demonstrates significant profit consequences to the firm if incorrect methods are used instead of the proposed method.

Practical implications

The proposed method can help firms select or eliminate customers based on their lifetime profitability so that firms can focus their marketing efforts in a more targeted manner to increase total profits.

Originality/value

The proposed Gamma-Poisson probability model and the profitability scoring method are easy to implement thanks to the gamma-Poisson conjugacy property. They are valuable for firms’ customer relationship management applications, in particular for customer selection and inventory management decisions.

Details

Marketing Intelligence & Planning, vol. 36 no. 2
Type: Research Article
ISSN: 0263-4503

Open Access
Article
Publication date: 30 September 2019

Victor Motta

Downloads: 5472

Abstract

Purpose

The purpose of this study is to describe a recent non-mainstream econometric approach using microdata and to show how it can inform research in business administration. More specifically, the paper draws on stances in the applied microeconometric literature in favor of fitting Poisson regression with robust standard errors rather than OLS linear regression of a log-transformed dependent variable. In addition, the author points to the appropriate Stata coding, takes into account the possibility of failing to check for the existence of the estimates (convergence issues), and addresses sensitivity to numerical problems.

Design/methodology/approach

The author details the main issues with the log-linear model, drawing on the applied econometric literature in favor of estimating multiplicative models for non-count data. He then provides the Stata commands and illustrates the differences in coefficients and standard errors between the OLS and Poisson models using the health expenditure data from the RAND Health Insurance Experiment (RHIE).
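
The paper works in Stata with the RHIE data; purely as an illustration, the same comparison can be sketched in Python with statsmodels on simulated data. The variable names and the data-generating process below are assumptions, not the paper's.

```python
# Sketch: Poisson pseudo-maximum likelihood (PPML) with robust standard
# errors vs. OLS on a log-transformed outcome, on simulated skewed data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
x = rng.normal(size=n)                           # hypothetical covariate
# Multiplicative error with unit conditional mean but x-dependent variance:
# this is exactly the setting where log-OLS is biased and PPML is not.
sigma = 0.6 + 0.3 * (x > 0)
eta = rng.lognormal(mean=-sigma**2 / 2, sigma=sigma, size=n)
y = np.exp(0.5 + 0.8 * x) * eta

X = sm.add_constant(x)

# OLS on log(y): E[log eta | x] varies with x, so the slope is biased.
ols = sm.OLS(np.log(y), X).fit(cov_type="HC1")

# PPML: only the conditional mean must be correct; robust (sandwich)
# standard errors guard against the misspecified variance.
ppml = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC1")

print("log-OLS slope:", ols.params[1], "SE", ols.bse[1])
print("PPML slope   :", ppml.params[1], "SE", ppml.bse[1])
```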

Findings

The results indicate that the Poisson pseudo-maximum likelihood estimator yields better results than the log-linear model and than alternative models such as the Tobit and two-part models.

Originality/value

The originality of this study lies in demonstrating an alternative microeconometric technique to deal with positive skewness of dependent variables.

Details

RAUSP Management Journal, vol. 54 no. 4
Type: Research Article
ISSN: 2531-0488

Book part
Publication date: 18 April 2018

Mohammed Quddus

Abstract

Purpose – Time-series regression models are applied to analyse transport safety data for three purposes: (1) to develop a relationship between transport accidents (or incidents) and various time-varying factors, with the aim of identifying the most important factors; (2) to develop a time-series accident model for forecasting future accidents given the values of future time-varying factors; and (3) to evaluate the impact of a system-wide policy, education or engineering intervention on accident counts. Regression models for analysing transport safety data are well established, especially for cross-sectional and panel datasets. There is, however, a dearth of research relating to time-series regression models in the transport safety literature. The purpose of this chapter is to examine the existing literature, in safety analysis and in wider applications, with the aim of identifying time-series regression models that are applicable to analysing disaggregated accident counts.

Methodology/Approach – There are two main issues in modelling time-series accident counts: (1) the need for a flexible approach to the serial autocorrelation inherent in time-series processes of accident counts and (2) the fact that the conditional distribution (conditioned on past observations and covariates) of accident counts follows a Poisson-type distribution. Various time-series regression models are explored to identify those most suitable for analysing disaggregated time-series accident datasets. A recently developed time-series regression model, the generalised linear autoregressive moving average (GLARMA) model, has been identified as the best model for analysing safety data.
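
For reference, the observation-driven Poisson GLARMA recursion as it appears in the time-series count literature can be written as follows; the notation is generic and not reproduced from the chapter.

```latex
% Observation-driven Poisson GLARMA(p, q) model
y_t \mid \mathcal{F}_{t-1} \sim \operatorname{Poisson}(\mu_t), \qquad
\log \mu_t = x_t^{\top}\beta + Z_t

% ARMA-type feedback built from scaled (Pearson) residuals
Z_t = \sum_{i=1}^{p} \phi_i \,(Z_{t-i} + e_{t-i})
      \;+\; \sum_{j=1}^{q} \theta_j \, e_{t-j},
\qquad e_t = \frac{y_t - \mu_t}{\sqrt{\mu_t}}
```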

Findings – The GLARMA model was applied to a time-series dataset of airproxes (aircraft proximity events) that indicate airspace safety in the United Kingdom. The aim was to evaluate the impact of an airspace intervention (i.e., the introduction of reduced vertical separation minima, RVSM) on airspace safety while controlling for other factors, such as air transport movements (ATMs) and seasonality. The results indicate that the GLARMA model is more appropriate than a generalised linear model (e.g., Poisson or Poisson-Gamma), and it has been found that the introduction of RVSM reduced airprox events by 15%. In addition, it was found that a 1% increase in ATMs would lead to a 1.83% increase in monthly airproxes within UK airspace.

Practical applications – The methodology developed in this chapter is applicable to many time-series processes of accident counts. The models recommended in this chapter could be used to identify different time-varying factors and to evaluate the effectiveness of various policy and engineering interventions on transport safety or similar data (e.g., crimes).

Originality/value of paper – The GLARMA model has not been properly explored in modelling time-series safety data. This new class of model has been applied to a dataset in evaluating the effectiveness of an intervention. The model recommended in this chapter would greatly benefit researchers and analysts working with time-series data.

Details

Safe Mobility: Challenges, Methodology and Solutions
Type: Book
ISBN: 978-1-78635-223-1

Article
Publication date: 1 August 1997

Richard L. Henshel

Downloads: 790

Abstract

Briefly reviews the standard Poisson distribution and then examines a set of derivative, modified Poisson distributions for testing hypotheses derived from positive deviation‐amplifying feedback models, which do not lend themselves to ordinary statistically based hypothesis testing. The “reinforcement” or “contagious” Poisson offers promise for a subset of such models, in particular those models with data in the form of rates (rather than magnitudes). The practical difficulty lies in distinguishing reinforcement effects from initial heterogeneity, since both can form negative binomial distributions, with look‐alike data. Illustrates these difficulties, and also opportunities, for various feedback models employing the self‐fulfilling prophecy, and especially for confidence loops, which incorporate particular self‐fulfilling prophecies as part of a larger dynamic process. Describes an actual methodology for testing hypotheses regarding confidence loops with the aid of a “reinforcement” Poisson distribution, as well as its place within sociocybernetics.
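
The identification problem described above (a "reinforcement" Poisson and a heterogeneous Poisson both producing negative binomial counts) can be made concrete with a small simulation; the parameter values and the linear-contagion specification below are illustrative assumptions, not Henshel's.

```python
# Sketch: two data-generating mechanisms that yield look-alike count data.
# (1) Heterogeneity: each unit has its own Poisson rate, drawn from a gamma.
# (2) Contagion/reinforcement: one shared baseline rate, but each event
#     raises the rate for later events (a linear birth process on [0, 1]).
# Both mechanisms produce negative binomial marginal counts.
import numpy as np

rng = np.random.default_rng(42)
n_units = 100_000
a, b = 2.0, 0.5          # contagion: event rate is a + b * (events so far)

# (1) Gamma-mixed Poisson, parameterized to match the contagion process
#     (shape a/b, scale e^b - 1 gives the same negative binomial law).
rates = rng.gamma(shape=a / b, scale=np.expm1(b), size=n_units)
counts_heterogeneity = rng.poisson(rates)

# (2) Reinforcement: simulate event by event over the unit time interval,
#     drawing exponential waiting times whose rate grows with the count.
counts_contagion = np.empty(n_units, dtype=int)
for i in range(n_units):
    t, k = 0.0, 0
    while True:
        t += rng.exponential(1.0 / (a + b * k))
        if t > 1.0:
            break
        k += 1
    counts_contagion[i] = k

for name, c in [("heterogeneity", counts_heterogeneity),
                ("contagion    ", counts_contagion)]:
    print(name, "mean=%.3f var=%.3f" % (c.mean(), c.var()))
# The two samples share essentially the same mean, variance and histogram,
# which is exactly why the mechanisms are hard to tell apart from the data.
```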

Book part
Publication date: 1 December 2016

Roman Liesenfeld, Jean-François Richard and Jan Vogler

Abstract

We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and non-Gaussian response variables. The class of models under consideration includes specifications for discrete choices, event counts and limited-dependent variables (truncation, censoring, and sample selection) among others. Our algorithm relies upon a novel implementation of efficient importance sampling (EIS) specifically designed to exploit typical sparsity of high-dimensional spatial precision (or covariance) matrices. It is numerically very accurate and computationally feasible even for very high-dimensional latent processes. Thus, maximum likelihood (ML) estimation of high-dimensional non-Gaussian spatial models, hitherto considered to be computationally prohibitive, becomes feasible. We illustrate our approach with ML estimation of a spatial probit for US presidential voting decisions and spatial count data models (Poisson and Negbin) for firm location choices.
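
The core idea, integrating a latent Gaussian variable out of a non-Gaussian likelihood with a well-chosen importance density, can be illustrated in a deliberately simplified setting. The sketch below uses plain importance sampling on a one-dimensional Poisson-log-normal model; it is not the authors' EIS algorithm, whose contribution is precisely to construct a much better sparse, high-dimensional importance density.

```python
# Sketch: likelihood evaluation by importance sampling in a toy
# Poisson-log-normal model: y ~ Poisson(exp(z)), z ~ N(mu, sigma^2).
# The marginal likelihood p(y) = ∫ p(y|z) p(z) dz has no closed form.
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
mu, sigma, y = 1.0, 0.8, 6           # hypothetical parameters and datum
S = 20_000                           # importance-sampling draws

# Gaussian proposal centered at the posterior mode (Laplace approximation).
def neg_log_post(z):
    return -(y * z - np.exp(z)) + 0.5 * ((z - mu) / sigma) ** 2

mode = minimize_scalar(neg_log_post).x
curvature = np.exp(mode) + 1.0 / sigma**2
prop_sd = 1.0 / np.sqrt(curvature)

z = rng.normal(mode, prop_sd, size=S)
log_w = (stats.poisson.logpmf(y, np.exp(z))
         + stats.norm.logpdf(z, mu, sigma)
         - stats.norm.logpdf(z, mode, prop_sd))
# Log-sum-exp for a numerically stable estimate of log p(y)
loglik_hat = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
print("IS estimate of log p(y):", loglik_hat)
```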

Details

Spatial Econometrics: Qualitative and Limited Dependent Variables
Type: Book
ISBN: 978-1-78560-986-2

Book part
Publication date: 12 November 2014

Matthew Lindsey and Robert Pavur

Abstract

A Bayesian approach to demand forecasting to optimize spare parts inventory that requires periodic replenishment is examined relative to a non-Bayesian approach when the demand rate is unknown. That is, optimal inventory levels are decided using these two approaches at consecutive time intervals. Simulations were conducted to compare the total inventory cost using a Bayesian approach and a non-Bayesian approach to a theoretical minimum cost over a variety of demand rate conditions including the challenging slow moving or intermittent type of spare parts. Although Bayesian approaches are often recommended, this study’s results reveal that under conditions of large variability across the demand rates of spare parts, the inventory cost using the Bayes model was not superior to that using the non-Bayesian approach. For spare parts with homogeneous demand rates, the inventory cost using the Bayes model for forecasting was generally lower than that of the non-Bayesian model. Practitioners may still opt to use the non-Bayesian model since a prior distribution for the demand does not need to be identified.
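
A minimal sketch of the two forecasting approaches being compared, assuming (as an illustration, not from the chapter) Poisson demand per period, a gamma prior for the Bayesian forecaster, and a base-stock level set at a 95% service-level quantile.

```python
# Sketch: Bayesian (gamma-Poisson) vs. non-Bayesian demand-rate estimates
# for a slow-moving spare part, updated at consecutive review periods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
true_rate = 0.4                             # slow-moving / intermittent demand
demand = rng.poisson(true_rate, size=12)    # one year of monthly demand

# Gamma(prior_shape, prior_rate) prior on the Poisson demand rate
prior_shape, prior_rate = 1.0, 1.0          # hypothetical, weakly informative

for month in range(1, len(demand) + 1):
    seen = demand[:month]
    # Non-Bayesian: plain sample mean of observed demand
    mle_rate = seen.mean()
    # Bayesian: conjugate gamma posterior mean
    post_shape = prior_shape + seen.sum()
    post_rate = prior_rate + month
    bayes_rate = post_shape / post_rate
    # Base-stock level for a 95% cycle-service level under Poisson demand
    stock_mle = stats.poisson.ppf(0.95, max(mle_rate, 1e-9))
    stock_bayes = stats.poisson.ppf(0.95, bayes_rate)
    print(f"month {month:2d}: mle={mle_rate:.2f} (S={stock_mle:.0f})  "
          f"bayes={bayes_rate:.2f} (S={stock_bayes:.0f})")
```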

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78441-209-8

Article
Publication date: 2 November 2012

Wael Hemrit and Mounira Ben Arab

Abstract

Purpose

The purpose of this paper is to examine the determinants of operational losses in insurance companies.

Design/methodology/approach

Using the most common estimates of the frequency and severity of losses that affected business lines during 2009, the paper integrates a quantitative aspect that reflects the mode of organization of the insurance company. The paper focuses on the frequency and severity of losses, as estimated by insurers, for each category of operational risk events that took place in 2009.

Findings

The paper finds that the frequency of operational losses is positively related to the Market Share (MARKSHARE) and the Rate of Geographic Location (RAGELOC), whereas the occurrence of loss is negatively related to the Variety of Insurance Activities (VARIACT). The paper also finds a decrease in the frequency of losses associated with a large number of employees; there is therefore a significant relationship between the Human Factor (HF) and the occurrence of operational losses. In terms of severity, the empirical study shows that the probability of zero operational-loss intensity is negatively influenced by the Market Share (MARKSHARE) and the Rate of Geographic Location (RAGELOC). In the same framework, the Variety of Insurance Activities (VARIACT) has a negative effect on the probability of high operational-loss severity.

Originality/value

Despite the absence of quantitative operational risk data, this article opens a new research perspective for estimating the frequency and severity of operational losses in the Tunisian insurance sector.

Details

The Journal of Risk Finance, vol. 13 no. 5
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 27 March 2020

Martin Boďa and Katarína Čunderlíková

Abstract

Purpose

This paper studies the density of bank branches in the districts of Slovakia and aims to identify determinants that explain or justify district-level differences in the density of bank branches.

Design/methodology/approach

Bank branch density is measured by the number of branches in a district, with banks further differentiated by size and profile. Potential determinants of bank branch density are sought amongst economic factors, socioeconomic factors, technological factors, urbanization factors, and branch market concentration by means of univariate and bivariate Poisson regressions.
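
As an illustration of the kind of count regression involved, a district-level Poisson regression and an observed-versus-fitted comparison might look like the sketch below in Python. The covariate names and data are hypothetical, and the specification is not the authors' exact one.

```python
# Sketch: Poisson regression of bank-branch counts on district covariates,
# then flagging districts with many more branches than the model predicts.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_districts = 79                                        # Slovakia has 79 districts
df = pd.DataFrame({
    "population_k": rng.uniform(15, 120, n_districts),  # hypothetical covariate
    "avg_wage_eur": rng.uniform(800, 1500, n_districts),# hypothetical covariate
})
true_mu = np.exp(-1.0 + 0.02 * df["population_k"] + 0.001 * df["avg_wage_eur"])
df["branches"] = rng.poisson(true_mu)

# One-covariate and two-covariate Poisson regressions
m1 = smf.glm("branches ~ population_k", data=df,
             family=sm.families.Poisson()).fit()
m2 = smf.glm("branches ~ population_k + avg_wage_eur", data=df,
             family=sm.families.Poisson()).fit()
print(m2.summary())

# Districts with far more branches than the covariates predict
df["fitted"] = m2.fittedvalues
df["pearson_resid"] = (df["branches"] - df["fitted"]) / np.sqrt(df["fitted"])
print(df.sort_values("pearson_resid", ascending=False).head())
```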

Findings

Using data from 2016, it is found that branch numbers in districts are determined chiefly by five factors that describe districts' economic development, population size and its characteristics, and existing branch concentration. The spatial distribution of bank branches across the territory of Slovakia is not random, but is affected by environmental factors measurable at the district level. Only 22 Slovak districts, representing administrative or economic centers, are expected to be over-branched.

Practical implications

The study helps to identify factors that need to be accounted for in planning and redesigning branch networks or in implementing mergers and acquisitions at the bank level. The results are also useful in regional policy and regulatory oversight.

Originality/value

The present study is unique in that the decision-making processes of Slovak commercial banks in planning the location and density of their branch networks have not yet been rationalized and researched.

Details

International Journal of Bank Marketing, vol. 38 no. 4
Type: Research Article
ISSN: 0265-2323

Article
Publication date: 11 July 2016

Hossein Karimi, Timothy R.B. Taylor, Paul M. Goodrum and Cidambi Srinivasan

Downloads: 1005

Abstract

Purpose

This paper aims to quantify the impact of craft worker shortage on construction project safety performance.

Design/methodology/approach

A database of 50 North American construction projects completed between 2001 and 2014 was compiled from a research project survey and the Construction Industry Institute Benchmarking and Metrics Database. The t-test and Mann-Whitney test were used to determine whether there was a significant difference in safety performance between projects with and without craft worker recruiting difficulty. Poisson regression analysis was then used to examine the relationship between craft worker recruiting difficulty and the Occupational Safety and Health Administration Total Number of Recordable Incident Cases per 200,000 Actual Direct Work Hours (TRIR) on construction projects.
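
A minimal sketch of the statistical workflow described above is shown below. The data are simulated and the variable names are hypothetical; the paper's own TRIR regression may be specified differently (for instance, without an exposure offset).

```python
# Sketch: compare recordable-incident rates between projects with and
# without craft-worker recruiting difficulty, then fit a Poisson regression
# of incident counts with log(work hours) as an exposure offset.
import numpy as np
import statsmodels.api as sm
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(11)
n = 50
difficulty = rng.integers(0, 2, size=n)          # 1 = recruiting difficulty
hours = rng.uniform(0.2e6, 3e6, size=n)          # actual direct work hours
base_rate = 1.0 / 200_000                        # incidents per work hour
incidents = rng.poisson(base_rate * hours * np.exp(0.7 * difficulty))
trir = incidents / hours * 200_000               # OSHA TRIR

# Nonparametric comparison of TRIR between the two groups
u, p = mannwhitneyu(trir[difficulty == 1], trir[difficulty == 0],
                    alternative="greater")
print("Mann-Whitney p-value:", p)

# Poisson regression of incident counts, exposure = work hours
X = sm.add_constant(difficulty.astype(float))
model = sm.GLM(incidents, X, family=sm.families.Poisson(),
               exposure=hours).fit()
print(model.summary())
# exp(coefficient on difficulty) is the incident-rate ratio for projects
# that reported craft-worker recruiting difficulty.
```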

Findings

The results showed that the TRIR distribution of projects that reported craft worker recruiting difficulty tended to be higher than the TRIR distribution of projects with no craft worker recruiting difficulty (p-value = 0.004). Moreover, the average TRIR of projects that reported craft worker recruiting difficulty was more than twice the average TRIR of projects that experienced no craft recruiting difficulty (p-value = 0.035). Furthermore, the Poisson regression analysis demonstrated a positive exponential relationship between craft worker recruiting difficulty and TRIR in construction projects (p-value = 0.004).

Research limitations/implications

The projects used to construct the database are heavily weighted towards industrial construction.

Practical implications

There have been significant long-term gains in construction safety within the USA. However, if recent craft shortages continue, the quantitative analyses presented herein indicate a strong possibility that more safety incidents will occur unless the shortages are reversed. Innovative construction means and methods should be developed and adopted to work in a safe manner with a less qualified workforce.

Originality/value

The Poisson regression model is the first model that quantifiably links project craft worker availability to construction project safety performance.

Details

Construction Innovation, vol. 16 no. 3
Type: Research Article
ISSN: 1471-4175

Book part
Publication date: 7 June 2013

Nhuong Tran, Norbert Wilson and Diane Hite

Abstract

The purpose of the chapter is to test the hypothesis that food safety (chemical) standards act as barriers to international seafood imports, using zero-accounting gravity models. The chemical standards on which we focus include the chloramphenicol required performance limit, the oxytetracycline maximum residue limit, the fluoroquinolones maximum residue limit, and the dichlorodiphenyltrichloroethane (DDT) pesticide residue limit. The study focuses on the three most important seafood markets: the European Union’s 15 members, Japan, and North America. Our empirical results confirm the hypothesis and are robust to OLS as well as alternative zero-accounting gravity models such as the Heckman estimation and Poisson family regressions. Formal statistical tests are inconclusive for choosing the best specification to account for zero trade and heteroskedasticity; however, the Heckman sample selection and zero-inflated negative binomial (ZINB) models provide the most reliable parameter estimates based on the statistical tests, the magnitude of the coefficients, the economic implications, and findings in the literature. Our findings suggest that the continual tightening of seafood safety standards has had a negative impact on exporting countries: increasing the stringency of regulations by reducing analytical limits or maximum residue limits on seafood in developed countries reduces their bilateral seafood imports. The chapter furthers the literature on food safety standards in international trade. We show competing gravity model specifications and provide additional evidence that no single gravity model is superior.
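
For readers unfamiliar with the Poisson-family gravity estimators mentioned above, a compact PPML sketch on simulated bilateral trade data (with genuine zeros) is shown below; the covariates and data are illustrative assumptions, not the chapter's seafood dataset.

```python
# Sketch: Poisson pseudo-maximum-likelihood (PPML) gravity regression on
# simulated bilateral trade flows that include zero-trade observations.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_pairs = 2_000
df = pd.DataFrame({
    "log_gdp_exp": rng.normal(10, 1, n_pairs),     # exporter GDP (log)
    "log_gdp_imp": rng.normal(10, 1, n_pairs),     # importer GDP (log)
    "log_dist": rng.normal(7, 0.5, n_pairs),       # bilateral distance (log)
    "mrl_stringency": rng.uniform(0, 1, n_pairs),  # hypothetical standard index
})
mu = np.exp(-8 + 0.9 * df.log_gdp_exp + 0.8 * df.log_gdp_imp
            - 1.0 * df.log_dist - 0.6 * df.mrl_stringency)
df["imports"] = rng.poisson(mu)                    # zeros arise naturally

ppml = smf.glm("imports ~ log_gdp_exp + log_gdp_imp + log_dist + mrl_stringency",
               data=df, family=sm.families.Poisson()).fit(cov_type="HC1")
print(ppml.summary())
# Unlike log-linear OLS, PPML keeps the zero-trade pairs in the sample and
# its coefficients on logged covariates read directly as elasticities.
```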

Details

Nontariff Measures with Market Imperfections: Trade and Welfare Implications
Type: Book
ISBN: 978-1-78190-754-2
