Search results

1 – 10 of over 1000
Open Access
Article
Publication date: 30 September 2019

Victor Motta


Abstract

Purpose

The purpose of this study is to account for a recent non-mainstream econometric approach using microdata and how it can inform research in business administration. More specifically, the paper draws on stances in the applied microeconometric literature in favor of fitting Poisson regression with robust standard errors rather than OLS linear regression of a log-transformed dependent variable. In addition, the author points to the appropriate Stata coding and takes into account the possibility of failing to check for the existence of the estimates (convergence issues), as well as sensitivity to numerical problems.

Design/methodology/approach

The author details the main issues with the log-linear model, drawing from the applied econometric literature in favor of estimating multiplicative models for non-count data. He then provides the Stata commands and illustrates the differences in coefficients and standard errors between the OLS and Poisson models using the health expenditure dataset from the RAND Health Insurance Experiment (RHIE).
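
As a rough illustration of the contrast the author draws, the following Python sketch (statsmodels, a stand-in for the paper's Stata commands) fits both estimators to synthetic data with a multiplicative error that is heteroskedastic in logs; the variables are illustrative and not taken from the RHIE data.

```python
# A sketch, not the paper's Stata code: compare OLS on log(y) with Poisson
# pseudo-ML (PPML) on synthetic data whose errors are heteroskedastic in logs.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
x = rng.normal(size=n)
X = sm.add_constant(x)

# Multiplicative error with E[eps | x] = 1, but log-variance that depends on x.
s = 0.5 + 0.5 * (x > 0)
eps = rng.lognormal(mean=-s**2 / 2, sigma=s, size=n)
y = np.exp(0.5 + 0.8 * x) * eps          # so E[y | x] = exp(0.5 + 0.8 x)

# OLS on log(y) targets E[log y | x], which here is NOT linear with slope 0.8.
ols = sm.OLS(np.log(y), X).fit(cov_type="HC1")

# PPML targets log E[y | x] directly and works for continuous y;
# robust (sandwich) standard errors are essential.
ppml = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC1")

print("log-OLS slope:", round(ols.params[1], 3))
print("PPML slope:   ", round(ppml.params[1], 3))
```

Under this data-generating process the PPML slope stays near 0.8, while the log-linear OLS slope drifts, because E[log y | x] no longer moves one-for-one with log E[y | x].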

Findings

The results indicate that the Poisson pseudo-maximum-likelihood estimator outperforms the log-linear model, as well as alternative models such as the Tobit and two-part models.

Originality/value

The originality of this study lies in demonstrating an alternative microeconometric technique to deal with positive skewness of dependent variables.

Details

RAUSP Management Journal, vol. 54 no. 4
Type: Research Article
ISSN: 2531-0488


Article
Publication date: 27 March 2020

Martin Boďa and Katarína Čunderlíková


Abstract

Purpose

This paper studies the density of bank branches in districts of Slovakia and aims to identify determinants that explain or justify district-level differences in the density of bank branches.

Design/methodology/approach

Bank branch density is measured by the number of branches in a district, and banks are further differentiated by size and profile. Potential determinants of bank branch density are sought through univariate and bivariate Poisson regressions amongst economic factors, socioeconomic factors, technological factors, urbanization factors, and branch market concentration.
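
A minimal sketch of such a district-level Poisson regression, assuming a hypothetical districts.csv with one row per district; the column names are placeholders, not the authors' variables.

```python
# A sketch of a district-level Poisson regression of branch counts.
# "districts.csv" and the column names are hypothetical, not the authors' data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("districts.csv")        # one row per Slovak district (assumed)

fit = smf.glm(
    "branches ~ avg_wage + unemployment + population + urban_share + herfindahl",
    data=df,
    family=sm.families.Poisson(),
).fit()
print(fit.summary())
# exp(coef) gives the multiplicative change in the expected number of branches
# per one-unit change in a covariate, holding the others fixed.
```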

Findings

Using data from 2016, it has been found that branch numbers in districts are determined chiefly by five factors that describe their economic development, population size and its characteristics, and existing branch concentration. The spatial distribution of bank branches in the territory of Slovakia is not random, but is affected by environmental factors measurable at the district level. Only 22 Slovak districts, representing administrative or economic centers, are expected to be over-branched.

Practical implications

The study helps to identify factors that need to be accounted for in planning and redesigning branch networks or in implementing mergers and acquisitions at the bank level. The results are also useful in regional policy and regulatory oversight.

Originality/value

The present study is unique in that the decision-making processes of Slovak commercial banks in planning the location and density of their branch networks have not yet been rationalized or researched.

Details

International Journal of Bank Marketing, vol. 38 no. 4
Type: Research Article
ISSN: 0265-2323


Book part
Publication date: 1 December 2016

Roman Liesenfeld, Jean-François Richard and Jan Vogler


Abstract

We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and non-Gaussian response variables. The class of models under consideration includes specifications for discrete choices, event counts and limited-dependent variables (truncation, censoring, and sample selection) among others. Our algorithm relies upon a novel implementation of efficient importance sampling (EIS) specifically designed to exploit typical sparsity of high-dimensional spatial precision (or covariance) matrices. It is numerically very accurate and computationally feasible even for very high-dimensional latent processes. Thus, maximum likelihood (ML) estimation of high-dimensional non-Gaussian spatial models, hitherto considered to be computationally prohibitive, becomes feasible. We illustrate our approach with ML estimation of a spatial probit for US presidential voting decisions and spatial count data models (Poisson and Negbin) for firm location choices.
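
The authors' sparse EIS implementation is not reproduced here, but the underlying idea, evaluating an intractable likelihood by importance sampling over the latent Gaussian process, can be sketched for a toy one-dimensional case.

```python
# A toy importance-sampling likelihood evaluation: one latent Gaussian effect z
# with a Poisson response, i.e. p(y) = integral of Poisson(y | e^z) N(z | 0, tau^2) dz.
# Generic IS with a crude Laplace-style proposal, not the authors' sparse EIS.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
y, tau = 3, 1.0

mu_q, sd_q = np.log(y + 0.5), 0.6        # proposal roughly centred on the mode
z = rng.normal(mu_q, sd_q, size=100_000)

log_w = (
    stats.poisson.logpmf(y, np.exp(z))   # p(y | z)
    + stats.norm.logpdf(z, 0.0, tau)     # prior p(z)
    - stats.norm.logpdf(z, mu_q, sd_q)   # proposal q(z)
)
print("IS estimate of p(y):", np.exp(log_w).mean())
```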

Details

Spatial Econometrics: Qualitative and Limited Dependent Variables
Type: Book
ISBN: 978-1-78560-986-2


Article
Publication date: 11 July 2016

Hossein Karimi, Timothy R.B. Taylor, Paul M. Goodrum and Cidambi Srinivasan


Abstract

Purpose

This paper aims to quantify the impact of craft worker shortage on construction project safety performance.

Design/methodology/approach

A database of 50 North American construction projects completed between 2001 and 2014 was compiled by taking information from a research project survey and the Construction Industry Institute Benchmarking and Metrics Database. The t-test and Mann-Whitney test were used to determine whether there was a significant difference in construction project safety performance on projects with craft worker recruiting difficulty. Poisson regression analysis was then used to examine the relationship between craft worker recruiting difficulty and Occupational Safety and Health Administration Total Number of Recordable Incident Cases per 200,000 Actual Direct Work Hours (TRIR) on construction projects.
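
A hedged sketch of this three-step analysis on synthetic project data (the CII benchmarking data are not reproduced here); note how the 200,000-hour TRIR normalization enters the Poisson model as an exposure term.

```python
# Synthetic stand-in for the paper's three analyses; real CII data not used.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(2)
n = 50
hours = rng.uniform(1e5, 2e6, size=n)      # actual direct work hours per project
shortage = rng.integers(0, 2, size=n)      # 1 = craft recruiting difficulty
incidents = rng.poisson(1.5e-5 * np.exp(0.7 * shortage) * hours)
trir = incidents / hours * 200_000         # OSHA TRIR normalization

# Group comparisons of TRIR, as in the paper.
print(stats.ttest_ind(trir[shortage == 1], trir[shortage == 0], equal_var=False))
print(stats.mannwhitneyu(trir[shortage == 1], trir[shortage == 0]))

# Poisson regression of incident counts with work hours as exposure:
# exp(coef) is the incident-rate ratio for shortage projects.
fit = sm.GLM(incidents, sm.add_constant(shortage),
             family=sm.families.Poisson(), exposure=hours).fit()
print(fit.params)
```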

Findings

The results showed that the TRIR distribution of the group of projects that reported craft worker recruiting difficulty tended to be higher than the TRIR distribution of the group of projects with no craft worker recruiting difficulty (p-value = 0.004). Moreover, the average TRIR of the projects that reported craft worker recruiting difficulty was more than twice the average TRIR of the projects that experienced no craft recruiting difficulty (p-value = 0.035). Furthermore, the Poisson regression analysis demonstrated a positive exponential relationship between craft worker recruiting difficulty and TRIR on construction projects (p-value = 0.004).

Research limitations/implications

The projects used to construct the database are heavily weighted towards industrial construction.

Practical implications

There have been significant long-term gains in construction safety within the USA. However, the quantitative analyses presented herein indicate a strong possibility that more safety incidents will occur if recent craft shortages continue. Innovative construction means and methods should be developed and adopted to work safely with a less qualified workforce.

Originality/value

The Poisson regression model is the first to quantitatively link project craft worker availability to construction project safety performance.

Details

Construction Innovation, vol. 16 no. 3
Type: Research Article
ISSN: 1471-4175


Article
Publication date: 2 November 2012

Wael Hemrit and Mounira Ben Arab


Abstract

Purpose

The purpose of this paper is to examine the determinants of operational losses in insurance companies.

Design/methodology/approach

Using the most common estimates of the frequency and severity of losses that affected business lines during 2009, the paper integrates a quantitative aspect that reflects the mode of organization of the insurance company. The focus is on the frequency and severity of losses, as estimated by insurers, for each category of operational risk event that took place in 2009.

Findings

The paper finds that the frequency of operational losses is positively related to the Market Share (MARKSHARE) and the Rate of Geographic Location (RAGELOC), whereas the occurrence of loss is negatively related to the Variety of Insurance Activities (VARIACT). The paper also finds a decrease in the frequency of losses associated with a large number of employees, indicating a significant relationship between the Human Factor (HF) and the occurrence of operational losses. In terms of severity, the empirical study shows that the probability of zero operational-loss intensity is negatively influenced by MARKSHARE and RAGELOC, and that VARIACT has a negative effect on the probability of high operational-loss severity.

Originality/value

Despite the absence of quantitative operational risk data, this article opens a new research perspective for estimating the frequency and severity of operational losses in the Tunisian insurance sector.

Details

The Journal of Risk Finance, vol. 13 no. 5
Type: Research Article
ISSN: 1526-5943


Book part
Publication date: 17 January 2009

Virginia M. Miori


Abstract

The challenge of truckload routing is increased in complexity by the introduction of stochastic demand. Typically, this demand is generalized to follow a Poisson distribution. In this chapter, we cluster the demand data using data mining techniques to establish a more suitable distribution for predicting demand. We then examine this stochastic truckload demand using an econometric discrete choice model known as a count data model. Using actual truckload demand data and data from the Bureau of Transportation Statistics, we perform count data regressions. Two outcomes are produced from every regression run: the predicted demand between every origin and destination, and the likelihood that this demand will occur. Together, these allow us to generate an expected-value forecast of truckload demand as input to a truckload routing formulation. The negative binomial distribution produces an improved forecast over the Poisson distribution.
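
A minimal sketch of the Poisson-versus-negative-binomial comparison the chapter performs, on synthetic lane-level data; the distance covariate and coefficients are made up, not the chapter's estimates.

```python
# Synthetic lane-level sketch: fit Poisson and negative binomial count models
# and compare them, as the chapter does; the distance covariate is made up.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2_000
log_dist = np.log(rng.uniform(50, 1500, size=n))   # origin-destination distance
X = sm.add_constant(log_dist)

mu = np.exp(2.0 - 0.3 * log_dist)
# Gamma heterogeneity (mean 1) makes the counts overdispersed.
loads = rng.poisson(mu * rng.gamma(2.0, 0.5, size=n))

pois = sm.Poisson(loads, X).fit(disp=False)
nb = sm.NegativeBinomial(loads, X).fit(disp=False)
print("Poisson AIC:", round(pois.aic, 1), "| NegBin AIC:", round(nb.aic, 1))

# Expected-value forecast for a 500-mile lane under the preferred model.
print("E[loads | 500 miles]:", np.exp(nb.params[0] + nb.params[1] * np.log(500)))
```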

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-84855-548-8

Book part
Publication date: 7 June 2013

Nhuong Tran, Norbert Wilson and Diane Hite


Abstract

The purpose of the chapter is to test the hypothesis that food safety (chemical) standards act as barriers to international seafood imports, using zero-accounting gravity models. The chemical standards on which we focus include the chloramphenicol required performance limit, the oxytetracycline maximum residue limit, the fluoroquinolones maximum residue limit, and the dichlorodiphenyltrichloroethane (DDT) pesticide residue limit. The study focuses on the three most important seafood markets: the European Union’s 15 members, Japan, and North America. Our empirical results confirm the hypothesis and are robust to OLS as well as to alternative zero-accounting gravity models such as the Heckman estimation and the Poisson family of regressions. Formal statistical tests are inconclusive for choosing the best model specification to account for zero trade and heteroskedasticity; however, the Heckman sample selection and zero-inflated negative binomial (ZINB) models provide the most reliable parameter estimates based on the statistical tests, the magnitude of the coefficients, the economic implications, and findings in the literature. Our findings suggest that the continual tightening of seafood safety standards has had a negative impact on exporting countries: increasing the stringency of regulations by reducing analytical limits or maximum residue limits for seafood in developed countries reduces those countries' bilateral seafood imports. The chapter furthers the literature on food safety standards and international trade. We compare competing gravity model specifications and provide additional evidence that no one gravity model is superior.
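
Two of the zero-accounting estimators named above can be sketched on synthetic bilateral trade data with excess zeros; the covariate and coefficients are placeholders, not the chapter's estimates.

```python
# Synthetic bilateral-trade sketch of two zero-accounting estimators: PPML,
# which keeps zero flows, and ZINB. Covariate and parameters are placeholders.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(4)
n = 1_000
log_dist = rng.normal(7.0, 1.0, size=n)
X = sm.add_constant(log_dist)

mu = np.exp(8.0 - 1.0 * log_dist)
trade = rng.poisson(mu) * rng.binomial(1, 0.7, size=n)   # extra structural zeros

# PPML: Poisson pseudo-ML with sandwich standard errors.
ppml = sm.GLM(trade, X, family=sm.families.Poisson()).fit(cov_type="HC1")

# ZINB: a logit for the structural zeros mixed with a negative binomial count.
zinb = ZeroInflatedNegativeBinomialP(trade, X, exog_infl=X).fit(
    disp=False, maxiter=500
)
print("PPML:", ppml.params, "| ZINB:", zinb.params)
```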

Details

Nontariff Measures with Market Imperfections: Trade and Welfare Implications
Type: Book
ISBN: 978-1-78190-754-2


Book part
Publication date: 16 September 2022

Vasileios Ouranos and Alexandra Livada


Abstract

Probability of Default (PD) is a crucial credit risk parameter. International accords have motivated banks and credit institutions to adopt objective systems for evaluating and monitoring the PD. This study examines retail unsecured loans of a major Greek bank during the period of the financial crisis, focusing on the stochastic behaviour of the financial states of the loans. It is tested whether a first-order Markov chain (MC) model describes the transitions from one state to another sufficiently well. Moreover, Poisson regression models are estimated in order to calculate the limiting transition matrix, the limiting state probabilities and the PD. It is shown that the MC of the financial states of loans is non-homogeneous, meaning that the transition probabilities from one financial state to another are not constant across time. From the Poisson regression models, the transition probability matrix between states is estimated for alternative time periods. From the limiting transition matrix, it is shown that if a loan is delayed, it is very likely to move to the next, worse state. The findings of this research could be useful for bank management.
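
A minimal sketch of the homogeneous benchmark the chapter tests against: estimating a first-order transition matrix from observed loan-state paths and computing its limiting distribution. The states and paths below are toy data, not the Greek bank's records.

```python
# Toy data: a homogeneous first-order chain estimated by transition counts,
# plus its limiting distribution. The chapter rejects exactly this
# constant-matrix assumption for the real loan data.
import numpy as np

states = ["current", "30dpd", "60dpd", "default"]   # illustrative buckets
paths = np.array([[0, 0, 1, 0, 0],                  # monthly states per loan
                  [0, 1, 2, 3, 3],
                  [0, 0, 0, 1, 2]])

k = len(states)
counts = np.zeros((k, k))
for path in paths:
    for a, b in zip(path[:-1], path[1:]):
        counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)      # row-stochastic estimate

# Limiting distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()
print(dict(zip(states, pi.round(3))))
```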

Details

The New Digital Era: Other Emerging Risks and Opportunities
Type: Book
ISBN: 978-1-80382-983-8


Article
Publication date: 10 May 2023

Upama Dey, Aparna Duggirala and Souren Mitra


Abstract

Purpose

Aluminium alloys can be used as lightweight, high-strength materials, in combination with the efficient joining technology of laser beam welding, in the manufacturing of automotive parts. The purposes of this paper are to conduct laser welding experiments with Al2024 in the lap joint configuration, to model the laser welding process parameters of Al2024 alloys, and to use the proposed models to optimize the process parameters.

Design/methodology/approach

Laser welding of the Al2024 alloy has been conducted in the lap joint configuration. The influences of the explanatory variables (laser peak power, scanning speed and frequency) on the outcome variables (weld width [WW], throat length [TL] and breaking load [BL]) have then been investigated with Poisson regression analysis of the data set derived from experimentation. Thereafter, a multi-objective genetic algorithm (MOGA) has been applied in MATLAB to find the optimum solutions. The effects of the various input process parameters on the responses have also been analysed using response surface plots.
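
The paper's MATLAB MOGA run is not reproduced here; the sketch below shows only the final Pareto-filtering step over candidate weld settings, with made-up stand-ins for the fitted response models.

```python
# Only the Pareto-filtering step is sketched; the response surfaces below are
# made-up stand-ins for the paper's fitted Poisson-regression models.
import numpy as np

rng = np.random.default_rng(5)
cand = rng.uniform(size=(500, 3))        # (power, speed, frequency) in [0, 1]

def objectives(p):
    """Weld width, throat length and negated breaking load, all minimized."""
    power, speed, freq = p.T
    width = 600 + 300 * power - 150 * speed
    throat = 650 + 250 * power - 100 * freq
    load = 800 + 500 * power * speed
    return np.column_stack([width, throat, -load])

F = objectives(cand)
# Point i is dominated if some j is <= in every objective and < in at least one.
dominated = np.array([
    np.any(np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1))
    for i in range(len(F))
])
print(f"{(~dominated).sum()} non-dominated settings out of {len(cand)}")
```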

Findings

The proposed statistical models, derived with Poisson regression analysis, are shown to fit well using the analysis-of-deviance approach. Pareto fronts have been used to present the optimization results; the maximized load-bearing capacity is computed to be 1,263 N, with compromised WW and TL of 714 µm and 760 µm, respectively.

Originality/value

This work, which conducts laser welding of Al2024 alloy lap joints using the Taguchi method and optimizes the input process parameters with the proposed statistical models, offers a new perspective that can be useful to the manufacturing industry.

Details

World Journal of Engineering, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1708-5284


Article
Publication date: 1 February 1986

K.L. Mak and C.H. Hung


Abstract

In recent decades there has been much interest and activity in the application of mathematical ideas for controlling inventory. However, most of this has been related to the control of stock products whose demand is smooth and continuous. When demand is lumpy, these methods are inefficient in their attempts to minimise the operating cost. A simple regression model is developed for computing optimal (s, S) policies for items with lumpy demand patterns. Continuous review of the inventory level is assumed, and the lead time demand is approximated by the stuttering Poisson distribution. A grid of 864 known optimal policies has been used to provide the data for the calibration of the regression models. Numerical examples are used to illustrate this approach. Extensive computational results show that this model provides excellent performance in estimating the optimal values of the control parameters s and S for wide ranges of demand and cost parameters.
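
A discrete-time sketch of the inventory system the regression model approximates: one (s, S) policy simulated under stuttering Poisson demand, i.e. Poisson customer arrivals with geometric order sizes. All cost and demand parameters are made up.

```python
# A discrete-time approximation of the continuous-review system: simulate one
# (s, S) policy under stuttering Poisson demand (Poisson arrivals, geometric
# order sizes). All cost and demand parameters are made up.
import numpy as np

rng = np.random.default_rng(6)

def avg_cost(s, S, rate=2.0, p=0.4, lead=3, periods=20_000,
             hold=1.0, short=10.0, order=50.0):
    level, pipeline, total = S, [], 0.0
    for _ in range(periods):
        pipeline = [(t - 1, q) for t, q in pipeline]       # age the pipeline
        level += sum(q for t, q in pipeline if t <= 0)     # receive arrivals
        pipeline = [(t, q) for t, q in pipeline if t > 0]
        # Stuttering Poisson demand for this period.
        demand = sum(rng.geometric(p) for _ in range(rng.poisson(rate)))
        level -= demand
        # Reorder up to S based on inventory position (on hand + on order).
        position = level + sum(q for _, q in pipeline)
        if position <= s:
            pipeline.append((lead, S - position))
            total += order
        total += hold * max(level, 0) + short * max(-level, 0)
    return total / periods

print(avg_cost(s=5, S=40))
```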

Details

International Journal of Operations & Production Management, vol. 6 no. 2
Type: Research Article
ISSN: 0144-3577

