Search results

1 – 10 of over 17000
Article
Publication date: 19 June 2009

Clara M. Novoa and Francis Mendez

Abstract

Purpose

The purpose of this paper is to present bootstrapping as an alternative statistical methodology to analyze time studies and input data for discrete‐event simulations. Bootstrapping is a non‐parametric technique to estimate the sampling distribution of a statistic by doing repeated sampling (i.e. resampling) with replacement from an original sample. This paper proposes a relatively simple implementation of bootstrap techniques to time study analysis.

Design/methodology/approach

Using an inductive approach, this work selects a typical situation to conduct a time study, applies two bootstrap procedures for the statistical analysis, compares bootstrap to traditional parametric approaches, and extrapolates general advantages of bootstrapping over parametric approaches.

Findings

Bootstrap produces inferences as accurate as those from parametric methods, and it is an alternative when the underlying parametric assumptions are not met.

Research limitations/implications

Research results contribute to work measurement and simulation fields since bootstrap promises an increase in accuracy in cases where the normality assumption is violated or only small samples are available. Furthermore, this paper shows that electronic spreadsheets are appropriate tools to implement the proposed bootstrap procedures.

Originality/value

In previous work, the standard procedure for analyzing time studies and input data for simulations has been the parametric approach. Bootstrap makes it possible to obtain both point estimates and estimates of time distributions. Engineers and managers involved in process improvement initiatives could use bootstrap to better exploit the information in available samples.

Details

International Journal of Productivity and Performance Management, vol. 58 no. 5
Type: Research Article
ISSN: 1741-0401

Keywords

Article
Publication date: 2 May 2017

Yugu Xiao, Ke Wang and Lysa Porth

Abstract

Purpose

While crop insurance ratemaking has been studied for many decades, it is still faced with many challenges. Crop insurance premium rates (PRs) are traditionally determined only by point estimation, and this approach may lead to uncertainty because it is sensitive to the underwriter’s assumptions regarding the trend, yield distribution, and other issues such as data scarcity and credibility. Thus, the purpose of this paper is to obtain the interval estimate for the PR, which can provide additional information about the accuracy of the point estimate.

Design/methodology/approach

A bootstrap method based on the loss cost ratio ratemaking approach is proposed. Using Monte Carlo experiments, the performance of this method is tested against several popular methods. To measure the efficiency of the confidence interval (CI) estimators, the actual coverage probabilities and the average widths of these intervals are calculated.
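
The evaluation loop described here — generate data from a known distribution, build the interval, and check whether it covers the truth — can be sketched as follows. The exponential loss model, sample sizes and the percentile-bootstrap interval are illustrative assumptions, not the paper's actual ratemaking setup.

```python
import random
import statistics

def percentile_ci(sample, n_boot, alpha, rng):
    """Percentile-bootstrap CI for the mean of one sample."""
    boots = sorted(statistics.fmean(rng.choices(sample, k=len(sample)))
                   for _ in range(n_boot))
    return boots[int(alpha / 2 * n_boot)], boots[int((1 - alpha / 2) * n_boot) - 1]

def coverage_and_width(true_mean, draw, n_rep=200, n=30, n_boot=500,
                       alpha=0.10, seed=1):
    """Monte Carlo assessment of a CI estimator: the fraction of simulated
    replications whose interval covers the true mean (actual coverage
    probability) and the average interval width."""
    rng = random.Random(seed)
    hits, widths = 0, []
    for _ in range(n_rep):
        sample = [draw(rng) for _ in range(n)]
        lo, hi = percentile_ci(sample, n_boot, alpha, rng)
        hits += lo <= true_mean <= hi
        widths.append(hi - lo)
    return hits / n_rep, statistics.fmean(widths)

# Skewed, hypothetical losses: exponential with true mean 1.0
cov, width = coverage_and_width(1.0, lambda r: r.expovariate(1.0))
print(f"actual coverage ~ {cov:.2f} (nominal 0.90), average width ~ {width:.2f}")
```

An efficient interval estimator is one whose actual coverage stays close to the nominal level while keeping the average width small.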

Findings

The proposed method is shown to be as efficient as the non-parametric kernel method; it is flexible and robust, and the width of the CI gives underwriters insight into the uncertainty of the premium rate estimate.

Originality/value

Comprehensive comparisons are conducted to show the advantage and the efficiency of the proposed method. In addition, a significant empirical example is given to show how to use the CIs to support ratemaking.

Details

China Agricultural Economic Review, vol. 9 no. 2
Type: Research Article
ISSN: 1756-137X

Keywords

Article
Publication date: 17 May 2013

Dorothea Diers, Martin Eling and Marc Linde

Abstract

Purpose

The purpose of this paper is to illustrate the importance of modeling parameter risk in premium risk, especially when data are scarce and a multi‐year projection horizon is considered. Internal risk models often integrate both process and parameter risks in modeling reserve risk, whereas parameter risk is typically omitted in premium risk, the modeling of which considers only process risk.

Design/methodology/approach

The authors present a variety of methods for modeling parameter risk (asymptotic normality, bootstrap, Bayesian) with different statistical properties. They then integrate these different modeling approaches in an internal risk model and compare their results with those from modeling approaches that measure only process risk in premium risk.
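
The contrast between process-risk-only and process-plus-parameter-risk simulation can be illustrated with a toy claim-count model. The Poisson rate, history length and horizon below are hypothetical, and the parametric bootstrap used to redraw the rate is just one of the three approaches the authors compare.

```python
import math
import random
import statistics

def poisson(rng, lam):
    """Poisson draw via Knuth's method (adequate for small rates)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_totals(lam_hat, n_obs, years=5, n_sim=4000,
                    parameter_risk=True, seed=5):
    """Total claim counts over a multi-year horizon.  With parameter_risk,
    each path first redraws the Poisson rate via a parametric bootstrap
    (re-estimate the rate from a resampled history of n_obs years);
    without it, only process risk around the point estimate remains."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sim):
        lam = lam_hat
        if parameter_risk:
            lam = statistics.fmean(poisson(rng, lam_hat) for _ in range(n_obs))
        totals.append(sum(poisson(rng, lam) for _ in range(years)))
    return totals

sd_with = statistics.stdev(simulate_totals(4.0, n_obs=10, parameter_risk=True))
sd_without = statistics.stdev(simulate_totals(4.0, n_obs=10, parameter_risk=False))
print(f"sd with parameter risk {sd_with:.2f} vs process risk only {sd_without:.2f}")
```

Even in this toy setting, the spread of multi-year totals widens noticeably once the uncertainty in the estimated rate is propagated through the projection.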

Findings

The authors show that parameter risk is substantial, especially when a multi‐year projection horizon is considered and when there is only limited historical data available for parameterization (as is often the case in practice). The authors' results also demonstrate that parameter risk substantially influences risk‐based capital and strategic management decisions, such as reinsurance.

Practical implications

The authors' findings emphasize that it is necessary to integrate parameter risk in risk modeling. Their findings are thus not only of interest to academics, but of high relevance to practitioners and regulators working toward appropriate risk modeling in an enterprise risk management and solvency context.

Originality/value

To the authors' knowledge, there are no model approaches or studies on parameter uncertainty for projection periods of not just one, but several, accident years; however, consideration of multiple years is crucial when thinking strategically about enterprise risk management.

Details

The Journal of Risk Finance, vol. 14 no. 3
Type: Research Article
ISSN: 1526-5943

Keywords

Article
Publication date: 7 March 2019

Biao Mei, Weidong Zhu, Yinglin Ke and Pengyu Zheng

Abstract

Purpose

Assembly variation analysis generally demands probability distributions of variation sources. However, due to small production volume in aircraft manufacturing, especially prototype manufacturing, the probability distributions are hard to obtain, and only the small-sample data of variation sources can be consulted. Thus, this paper aims to propose a variation analysis method driven by small-sample data for compliant aero-structure assembly.

Design/methodology/approach

First, a hybrid assembly variation model, integrating rigid effects with flexibility, is constructed based on the homogeneous transformation and elasticity mechanics. Then, the bootstrap approach is introduced to estimate a variation source based on small-sample data. The influences of bootstrap parameters on the estimation accuracy are analyzed to select suitable parameters for acceptable estimation performance. Finally, the process of assembly variation analysis driven by small-sample data is demonstrated.
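
The small-sample estimation step can be sketched as follows. The hole-position deviations are hypothetical stand-ins for a measured variation source, and bootstrapping the standard deviation is only one simple instance of the estimation the paper describes.

```python
import random
import statistics

def bootstrap_std(sample, n_boot=2000, seed=7):
    """Bootstrap a variation source's dispersion from small-sample data:
    return the point estimate of the standard deviation and the bootstrap
    standard error of that estimate."""
    rng = random.Random(seed)
    boots = [statistics.pstdev(rng.choices(sample, k=len(sample)))
             for _ in range(n_boot)]
    return statistics.pstdev(sample), statistics.stdev(boots)

# Hypothetical hole-position deviations (mm) from a small pilot batch
deviations = [0.031, -0.012, 0.008, 0.046, -0.027, 0.015, 0.002, -0.019]
est, se = bootstrap_std(deviations)
print(f"std estimate {est:.4f} mm, bootstrap SE {se:.4f} mm")
```

The number of resamples and the resample size are exactly the bootstrap parameters whose influence on estimation accuracy the paper analyzes.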

Findings

A variation analysis method driven by small-sample data, considering both rigid effects and flexibility, is proposed for aero-structure assembly. The method provides a good complement to traditional variation analysis methods based on probability distributions of variation sources.

Practical implications

With the proposed method, even if probability distribution information of variation sources cannot be obtained, accurate estimation of the assembly variation could be achieved. The method is well suited for aircraft assembly, especially in the stage of prototype manufacturing.

Originality/value

A variation analysis method driven by small-sample data is proposed for aero-structure assembly, which can be extended to deal with other similar applications.

Open Access
Article
Publication date: 21 March 2024

Warisa Thangjai and Sa-Aat Niwitpong

Abstract

Purpose

Confidence intervals play a crucial role in economics and finance, providing a credible range of values for an unknown parameter along with a corresponding level of certainty. Their applications encompass economic forecasting, market research, financial forecasting, econometric analysis, policy analysis, financial reporting, investment decision-making, credit risk assessment and consumer confidence surveys. Signal-to-noise ratio (SNR) finds applications in economics and finance across various domains such as economic forecasting, financial modeling, market analysis and risk assessment. A high SNR indicates a robust and dependable signal, simplifying the process of making well-informed decisions. On the other hand, a low SNR indicates a weak signal that could be obscured by noise, so decision-making procedures need to take this into serious consideration. This research focuses on the development of confidence intervals for functions derived from the SNR and explores their application in the fields of economics and finance.

Design/methodology/approach

The construction of the confidence intervals involved the application of various methodologies. For the SNR, confidence intervals were formed using the generalized confidence interval (GCI), large sample and Bayesian approaches. The difference between SNRs was estimated through the GCI, large sample, method of variance estimates recovery (MOVER), parametric bootstrap and Bayesian approaches. Additionally, confidence intervals for the common SNR were constructed using the GCI, adjusted MOVER, computational and Bayesian approaches. The performance of these confidence intervals was assessed using coverage probability and average length, evaluated through Monte Carlo simulation.
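
As one concrete instance, a large-sample (delta-method) interval for the SNR θ = μ/σ under approximate normality uses Var(θ̂) ≈ (1 + θ²/2)/n. The return series below is hypothetical, and this is only the large-sample approach; the GCI and Bayesian constructions the paper studies are more involved.

```python
import math
import statistics

def snr_ci_large_sample(sample, alpha=0.05):
    """Delta-method (large-sample) CI for the signal-to-noise ratio
    theta = mu / sigma under approximate normality, using
    Var(theta_hat) ~= (1 + theta**2 / 2) / n."""
    n = len(sample)
    theta = statistics.fmean(sample) / statistics.stdev(sample)
    z = statistics.NormalDist().inv_cdf(1 - alpha / 2)
    half = z * math.sqrt((1 + theta * theta / 2) / n)
    return theta - half, theta + half

# Hypothetical monthly returns (%) of an asset
returns = [1.2, 0.8, 1.5, -0.3, 0.9, 1.1, 0.4, 1.3, 0.7, 1.0,
           0.6, 1.4, 0.2, 0.9, 1.1, 0.5, 1.6, 0.8, 1.0, 0.7]
lo, hi = snr_ci_large_sample(returns)
print(f"95% large-sample CI for the SNR: ({lo:.2f}, {hi:.2f})")
```

Coverage probability and average length of such intervals are then what the Monte Carlo simulation evaluates.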

Findings

The GCI approach demonstrated superior performance over other approaches in terms of both coverage probability and average length for the SNR and the difference between SNRs. Hence, employing the GCI approach is advised for constructing confidence intervals for these parameters. As for the common SNR, the Bayesian approach exhibited the shortest average length. Consequently, the Bayesian approach is recommended for constructing confidence intervals for the common SNR.

Originality/value

This research presents confidence intervals for functions of the SNR to assess SNR estimation in the fields of economics and finance.

Details

Asian Journal of Economics and Banking, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2615-9821

Keywords

Article
Publication date: 9 January 2017

Brian Williamson and Sam Wood

Abstract

Purpose

The purpose of this paper is to integrate mobile supply and demand on an economic basis and to model the economic value of additional data capacity, spectrum demand and data growth under a range of parameter and policy assumptions.

Design/methodology/approach

The modelling requires an iterative solution to find an equilibrium between supply and demand, which allows data demand to be bootstrapped, i.e. determined endogenously within the model.
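
The iterative equilibrium search — in which demand is "bootstrapped", i.e. determined endogenously — can be sketched as a damped fixed-point iteration. The linear demand curve and rising marginal-cost supply curve are hypothetical placeholders for the paper's mobile-data model.

```python
def solve_equilibrium(demand, supply_price, q0=1.0, tol=1e-9, max_iter=1000):
    """Damped fixed-point iteration for a supply-demand equilibrium:
    read off the price needed to supply quantity q, feed it back into the
    demand curve, and average the old and new quantities until convergence."""
    q = q0
    for _ in range(max_iter):
        p = supply_price(q)          # price required to supply q
        q_new = demand(p)            # quantity demanded at that price
        if abs(q_new - q) < tol:
            return q_new, p
        q = 0.5 * (q + q_new)        # damping keeps the iteration stable
    raise RuntimeError("equilibrium search did not converge")

# Hypothetical linear demand and rising marginal-cost supply
q, p = solve_equilibrium(demand=lambda price: 10.0 - 2.0 * price,
                         supply_price=lambda qty: 1.0 + 0.5 * qty)
print(f"equilibrium quantity {q:.3f} at price {p:.3f}")
```

Because demand is solved for inside the loop rather than fixed in advance, changing a supply-side parameter can move the equilibrium in a direction an exogenous-demand model would miss.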

Findings

The model's sensitivity to input parameter changes differs from that of a modelling approach in which data demand is assumed to be exogenous; in some instances the sign of the relationship is even reversed, e.g. the response of economic value to changes in mobile site cost.

Research limitations/implications

The approach suggests a research agenda to estimate willingness to pay for data and the price elasticity of data demand, and may also suggest new explanatory variables to test econometrically in relation to spectrum value.

Practical implications

The approach provides a different route to spectrum valuation and allows estimation of the economic impacts of a range of policy questions.

Originality/value

This paper provides the integration of supply and demand and endogenous estimation of data demand and economic value, coupled with quantitative assessment of a range of policy questions.

Details

Digital Policy, Regulation and Governance, vol. 19 no. 1
Type: Research Article
ISSN: 2398-5038

Keywords

Article
Publication date: 23 November 2017

Yan-Kwang Chen, Chih-Teng Chen, Fei-Rung Chiu and Jiunn-Woei Lian

Abstract

Purpose

Group buying (GB) is a shopping strategy through which customers obtain volume discounts on the products they purchase, whereas retailers obtain quick turnover. In the scenario of GB, the optimal discount strategy is a key issue because it affects the profit of sellers. Previous research has focused on exploring the price discount and order quantity with a fixed selling price of the product assuming that customer demand is uncertain (but follows a known distribution). This study aims to look at the same problem but goes further to examine the case where not only is customer demand uncertain but the demand distribution is also unknown.

Design/methodology/approach

In this study, the optimal price discount and order quantity of a GB problem, cast as a price-setting newsvendor problem, were obtained assuming that the distribution of customer demand is unknown. The price–demand relationship is considered in both addition form and product form. The bootstrap sampling technique is used to develop a solution procedure for the problem. To validate the usefulness of the proposed method, a simulated comparison of the proposed model and an existing one was conducted. The effects of sample size, demand form and parameters of the demand form on the performance of the proposed model are presented and discussed.
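
A distribution-free flavour of the order-quantity step can be sketched as follows: instead of fitting a demand distribution, resample the observed demand and take the critical-fractile quantile. The demand figures, prices and the bootstrap-median aggregation are illustrative assumptions, not the paper's exact procedure.

```python
import random

def bootstrap_newsvendor_q(demand_sample, price, cost, salvage=0.0,
                           n_boot=2000, seed=3):
    """Distribution-free order quantity for a newsvendor: estimate the
    critical fractile cu / (cu + co) quantile of demand by bootstrapping
    the observed sample instead of fitting a distribution, then aggregate
    the resampled quantiles by their median."""
    cu = price - cost            # underage cost: margin lost per unmet request
    co = cost - salvage          # overage cost: loss per leftover unit
    fractile = cu / (cu + co)
    rng = random.Random(seed)
    n = len(demand_sample)
    qs = []
    for _ in range(n_boot):
        resample = sorted(rng.choices(demand_sample, k=n))
        qs.append(resample[min(int(fractile * n), n - 1)])
    qs.sort()
    return qs[n_boot // 2]

# Hypothetical demand observations at a given discount price
demand = [48, 55, 61, 44, 52, 58, 50, 47, 63, 53, 49, 56]
q = bootstrap_newsvendor_q(demand, price=10.0, cost=6.0)
print("suggested order quantity:", q)
```

In the price-setting variant, the same quantile computation would be repeated inside a search over candidate discount prices, with the demand sample adjusted by the assumed price–demand form.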

Findings

The numerical results reveal that the proposed model is appropriate to the problem at hand and becomes more effective as sample size increases. Because the two forms of demand impose different assumptions about the effect of price on the variance of demand, the proposed model appears more suitable for the addition form of demand.

Originality/value

This study contributes to the growing literature on GB models by developing a bootstrap-based newsvendor model to determine an optimal discount price and order quantity for a fixed-price GB website. This model can assist the sellers in making decisions on optimal discount price and order quantity without knowing the form of customer demand distribution.

Details

Kybernetes, vol. 46 no. 10
Type: Research Article
ISSN: 0368-492X

Keywords

Open Access
Article
Publication date: 26 October 2021

Michael Kaku Minlah, Xibao Zhang, Philipine Nelly Ganyoh and Ayesha Bibi

Abstract

Purpose

This study investigates the existence of the environmental Kuznets curve (EKC) for deforestation for Ghana over the 1962–2018 time period.

Design/methodology/approach

The study employs a time-varying approach, the bootstrap rolling window Granger causality test to achieve its set objectives.

Findings

The results of our study reveal an inverted "N"-shaped EKC for deforestation: deforestation initially decreases as income rises up to a first threshold, increases with further economic growth up to a higher income threshold, and then decreases again as growth continues beyond that higher threshold.

Practical implications

The results of the study show that, over time, economic growth can serve as a natural remedy for the deforestation that has plagued Ghana's forests over the years.

Social implications

The results further highlight the important role of strong institutions in fighting the deforestation menace.

Originality/value

The originality of this study lies in its methodology which allows for feedback from deforestation to the economy. This is in contrast to earlier studies on the EKC for deforestation which allowed causality only from deforestation to the economy.

Details

Forestry Economics Review, vol. 3 no. 1
Type: Research Article
ISSN: 2631-3030

Keywords

Book part
Publication date: 18 November 2014

Ted D. Englebrecht, Xiaoyan Chu and Yingxu Kuang

Abstract

Dissatisfaction with the current federal tax system is fostering serious interest in several tax reform plans such as a value-added tax (VAT), a flat tax, and a national retail sales tax. Recently, one of the former Republican presidential candidates, Herman Cain, initiated a 999 tax plan. As illustrated on Cain’s official website, the 999 plan intends to replace current federal taxes with a 9% business flat tax, a 9% individual flat tax, and a 9% national sales tax. We examine the distributional effects of the 999 tax plan, as well as the current system it intends to replace, under both annual income and lifetime income approaches. Global measures of progressivity and bootstrap-t confidence intervals suggest that the current federal tax system is progressive while Cain’s 999 tax plan is regressive under the annual income approach. Under the lifetime income approach, both the current federal tax system and Cain’s 999 tax plan show progressivity. However, the current federal tax system is more progressive. The findings in this study suggest that Cain’s 999 tax plan should be considered more seriously and further analysis of the 999 tax plan is warranted.
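
A bootstrap-t interval, here shown for a simple mean rather than a global progressivity index, studentizes each resampled statistic by its own standard error. The household tax rates below are hypothetical, and the statistic of interest is simplified for illustration.

```python
import random
import statistics

def bootstrap_t_ci(sample, n_boot=2000, alpha=0.05, seed=11):
    """Bootstrap-t confidence interval for the mean: studentize each
    resampled mean by its own standard error, then invert the empirical
    quantiles of those t statistics around the original estimate."""
    rng = random.Random(seed)
    n = len(sample)
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    ts = []
    for _ in range(n_boot):
        rs = rng.choices(sample, k=n)
        se_b = statistics.stdev(rs) / n ** 0.5
        if se_b > 0:                 # skip degenerate all-equal resamples
            ts.append((statistics.fmean(rs) - mean) / se_b)
    ts.sort()
    t_lo = ts[int(alpha / 2 * len(ts))]
    t_hi = ts[int((1 - alpha / 2) * len(ts)) - 1]
    return mean - t_hi * se, mean - t_lo * se

# Hypothetical effective tax rates (%) for a sample of households
rates = [14.2, 9.8, 21.5, 11.0, 17.3, 8.6, 25.1, 13.4, 19.0, 10.7]
lo, hi = bootstrap_t_ci(rates)
print(f"95% bootstrap-t CI for the mean rate: ({lo:.1f}, {hi:.1f})")
```

Unlike the percentile method, the bootstrap-t adapts to skewness in the sampling distribution, which matters for income-based measures.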

Details

Advances in Taxation
Type: Book
ISBN: 978-1-78441-120-6

Keywords

Article
Publication date: 19 June 2017

Jin-Li Hu, Yang Li and Hsin-Jing Tung

Abstract

Purpose

For strategic and competitive insights, the purpose of this paper is to measure and benchmark the comparative operating efficiencies of Association of Southeast Asian Nations’ (ASEAN) major airlines and present a new interpretation along with managerial implications.

Design/methodology/approach

This research statistically tests returns to scale and the equality of mean efficiencies for 15 ASEAN airlines covering the period 2010-2014. The disaggregate input efficiency of ASEAN airlines is computed by comparing the target and actual inputs.

Findings

The disaggregate input efficiency of ASEAN airlines shows that aircraft efficiency is the lowest, operating cost efficiency is better, and available seat efficiency is the best.

Originality/value

This paper applies data envelopment analysis models, disaggregated input efficiency measures, and bootstrapping approaches to compute the operational efficiency of ASEAN airlines. Strategic suggestions are made to improve the operational efficiency of ASEAN airlines.

Details

Management Decision, vol. 55 no. 5
Type: Research Article
ISSN: 0025-1747

Keywords
