Search results

1 – 10 of over 64,000
Article
Publication date: 14 October 2020

Haiyan Ge, Xintian Liu, Yu Fang, Haijie Wang, Xu Wang and Minghui Zhang

Abstract

Purpose

The purpose of this paper is to introduce error ellipse into the bootstrap method to improve the reliability of small samples and the credibility of the S-N curve.

Design/methodology/approach

Based on the bootstrap method and the reliability of the original samples, two error ellipse models are proposed. The error ellipse model reasonably predicts that the scatter of the expanded virtual samples follows a two-dimensional normal distribution.
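As a rough illustration of the idea (not the paper's actual models), the sketch below bootstraps a small hypothetical fatigue data set and jitters each resampled point with an independent-axis two-dimensional normal, so the virtual samples scatter in an ellipse around the originals; all numbers are invented for the example:

```python
import random

random.seed(0)

# Hypothetical small sample of (stress, log-life) pairs; the values and
# the ellipse radii in sigma are illustrative, not taken from the paper.
sample = [(320, 5.1), (340, 4.8), (360, 4.5), (380, 4.2), (400, 4.0)]

def bootstrap_with_virtual_points(data, n_boot=1000, sigma=(5.0, 0.05)):
    """Resample with replacement, then jitter each drawn point with a
    2-D normal, mimicking an error-ellipse spread of virtual samples."""
    means = []
    for _ in range(n_boot):
        virtual = [(random.gauss(s, sigma[0]), random.gauss(y, sigma[1]))
                   for s, y in (random.choice(data) for _ in data)]
        means.append(sum(v[1] for v in virtual) / len(virtual))
    return means

means = sorted(bootstrap_with_virtual_points(sample))
lo, hi = means[25], means[974]   # ~95% percentile interval of the mean log-life
print(round(lo, 3), round(hi, 3))
```

The jitter widens the sampling range relative to plain bootstrap resampling, which is the effect the paper exploits to stabilise small-sample estimates.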

Findings

By comparing the parameters obtained by the bootstrap method, the improved bootstrap method (normal distribution) and the error ellipse methods, it is found that the error ellipse method expands the sampling range and shortens the confidence interval, which improves the accuracy of parameter estimation with small samples. Through case analysis, it is shown that the tangent error ellipse method is feasible and that the series of S-N curves it produces is reasonable.

Originality/value

The error ellipse methods can lay a technical foundation for life prediction of products and have a progressive significance for the quality evaluation of products.

Details

Engineering Computations, vol. 38 no. 1
Type: Research Article
ISSN: 0264-4401

Keywords

Open Access
Article
Publication date: 21 August 2023

Yue Zhou, Xiaobei Shen and Yugang Yu

Abstract

Purpose

This study examines the relationship between demand forecasting error and retail inventory management in an uncertain supplier yield context. Replenishment is segmented into off-season and peak-season, with the former characterized by longer lead times and higher supply uncertainty. In contrast, the latter incurs higher acquisition costs but ensures certain supply, with the retailer's purchase volume aligning with the acquired volume. Retailers can replenish in both phases, receiving goods before the sales season. This paper focuses on the impact of the retailer's demand forecasting bias on their sales period profits for both phases.

Design/methodology/approach

This study adopts a data-driven research approach by drawing inspiration from real data provided by a cooperating enterprise to address research problems. Mathematical modeling is employed to solve the problems, and the resulting optimal strategies are tested and validated in real-world scenarios. Furthermore, the applicability of the optimal strategies is enhanced by incorporating numerical simulations under other general distributions.

Findings

The study's findings reveal that a greater disparity between predicted and actual demand distributions can significantly reduce the profits that a retailer-supplier system can earn, with the optimal purchase volume also being affected. Moreover, the paper shows that the mean of the forecasting error has a more substantial impact on system revenue than the variance of the forecasting error. Specifically, the larger the absolute difference between the predicted and actual means, the lower the system revenue. As a result, managers should focus on improving the quality of demand forecasting, especially the accuracy of mean forecasting, when making replenishment decisions.
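The finding that the mean of the forecasting error matters more than its variance can be sanity-checked with a toy newsvendor calculation (a stand-in for the paper's two-stage setting; the price, cost and N(100, 15) demand are assumptions, not the paper's data):

```python
import random
from statistics import NormalDist, fmean

random.seed(1)

price, cost = 10.0, 6.0                 # assumed unit revenue and cost
true_demand = NormalDist(100.0, 15.0)   # assumed actual demand distribution

def order_for(mu_hat, sigma_hat):
    # critical-fractile order quantity under the retailer's forecast
    return NormalDist(mu_hat, sigma_hat).inv_cdf((price - cost) / price)

def expected_profit(order, n=20000):
    demands = true_demand.samples(n, seed=42)   # common random numbers
    return fmean(price * min(order, max(0.0, d)) - cost * order
                 for d in demands)

unbiased = expected_profit(order_for(100.0, 15.0))
mean_bias = expected_profit(order_for(120.0, 15.0))  # mean off by +20 units
var_bias = expected_profit(order_for(100.0, 25.0))   # sigma badly overestimated
print(unbiased > var_bias > mean_bias)
```

Under these assumed numbers, misjudging the mean costs far more profit than misjudging the spread, echoing the abstract's conclusion.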

Practical implications

This study established a two-stage inventory optimization model that simultaneously considers random yield and demand forecast quality, and provides explicit expressions for optimal strategies under two specific demand distributions. Furthermore, the authors focused on how forecast error affects the optimal inventory strategy and obtained interesting properties of the optimal solution. In particular, the property that the optimal procurement quantity no longer changes with increasing forecast error under certain conditions is noteworthy, and has not been previously noted by scholars. Therefore, the study fills a gap in the literature.


Details

Modern Supply Chain Research and Applications, vol. 5 no. 2
Type: Research Article
ISSN: 2631-3871

Keywords

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Article
Publication date: 1 April 1993

E. OÑATE and G. BUGEDA

Abstract

The concepts of solution error and optimal mesh in adaptive finite element analysis are revisited. It is shown that the correct evaluation of the convergence rate of the error norms involved in the error measure and the optimal mesh criteria chosen are essential to avoid oscillations in the refinement process. Two mesh optimality criteria based on: (a) the equal distribution of global error, and (b) the specific error over the elements are studied and compared in detail through some examples of application.
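Criterion (a), equal distribution of the global error over the elements, can be illustrated with a one-dimensional toy mesh in which the error density is known in closed form (the density x² on [0, 1] is invented for the sketch):

```python
# Toy 1-D "optimal mesh": place nodes so every element carries an equal
# share of the total error, for an assumed error density e(x) = x**2
# whose cumulative error is E(x) = x**3 / 3.
def equidistributed_mesh(n_elems):
    total = 1.0 / 3.0                     # E(1), total error on [0, 1]
    return [(3.0 * total * k / n_elems) ** (1.0 / 3.0)
            for k in range(n_elems + 1)]  # invert E at equal increments

mesh = equidistributed_mesh(4)
# per-element error shares should come out (numerically) equal
shares = [(mesh[i + 1] ** 3 - mesh[i] ** 3) / 3.0 for i in range(4)]
print([round(x, 3) for x in mesh])
```

Elements cluster where the error density is largest (near x = 1), which is exactly the behaviour an equidistribution criterion drives in adaptive refinement.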

Details

Engineering Computations, vol. 10 no. 4
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 29 July 2014

Rafał Kluz and Tomasz Trzepieciński

Abstract

Purpose

The purpose of the following work was to derive a dependency that allows the repeatability positioning error of the robot to be determined at any given point in its workspace, without the need to conduct time-consuming measurements to map a precise surface of positioning repeatability.

Design/methodology/approach

The presented dependency makes it possible to determine, even at the planning phase, the optimal connection point in the workspace, ensuring the best parameters for the machine assembly process without needless overestimation of the precision of the utilized equipment. To solve the task, the sequential quadratic programming (SQP) method implemented in the MATLAB environment was used. To verify the hypothesis of the compatibility of the empirical distribution with the hypothetical distribution of the robot's positioning error, the Kolmogorov test was used.
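A minimal stdlib sketch of the Kolmogorov check (not the authors' MATLAB code; the measurement values are simulated for the example) might look like:

```python
import random
from statistics import NormalDist, fmean, stdev

random.seed(2)

# Simulated repeatability measurements (mm) at one workspace point.
errors = [random.gauss(0.05, 0.01) for _ in range(50)]

def ks_statistic(data, dist):
    """One-sample Kolmogorov statistic D_n against a reference distribution."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = dist.cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

fitted = NormalDist(fmean(errors), stdev(errors))
D = ks_statistic(errors, fitted)
# rough 5% critical value for a fully specified normal at n = 50 is
# 1.36 / sqrt(50) ≈ 0.192 (fitting the parameters first would strictly
# call for Lilliefors' correction)
print(round(D, 3))
```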

Findings

In this paper, it has been demonstrated theoretically and experimentally that the industrial robot accuracy can vary over a very wide range in the workspace. This provides an additional opportunity to increase reliability of the assembly process through the appropriate choice of the point of parts joining. The methodology presented here allows the designer of assembly workstations to rapidly estimate the repeatability of robot positioning and to allocate at the design stage of assembly process the optimal position in the robot workspace to ensure the required precision, without unnecessarily high accuracy of equipment used and, therefore, without inflated costs.

Originality/value

An alternative solution to the stated problem is the proposed method for determining the robot's positioning errors, which requires far fewer measurements than would be necessary to determine the parameters of the random variable errors of the robot's joint coordinates and to verify them by the repeatability of positioning at randomly selected points in the workspace. Additionally, the methodology for identifying the connection place discussed in the study was designed for the typical combinations of machine parts most frequently encountered in assembly processes, and it takes into account the typical limitations occurring in actual manufacturing conditions.

Details

Assembly Automation, vol. 34 no. 3
Type: Research Article
ISSN: 0144-5154

Keywords

Book part
Publication date: 18 October 2019

Edward George, Purushottam Laud, Brent Logan, Robert McCulloch and Rodney Sparapani

Abstract

Bayesian additive regression trees (BART) is a fully Bayesian approach to modeling with ensembles of trees. BART can uncover complex regression functions with high-dimensional regressors in a fairly automatic way and provide Bayesian quantification of the uncertainty through the posterior. However, BART assumes independent and identically distributed (i.i.d.) normal errors. This strong parametric assumption can lead to misleading inference and uncertainty quantification. In this chapter we use the classic Dirichlet process mixture (DPM) mechanism to nonparametrically model the error distribution. A key strength of BART is that default prior settings work reasonably well in a variety of problems. The challenge in extending BART is to choose the parameters of the DPM so that the strengths of the standard BART approach are not lost when the errors are close to normal, while the DPM retains the ability to adapt to non-normal errors.
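The mixture weights of a DPM are commonly built by stick-breaking; a minimal sketch of that construction (an illustration of the mechanism, not the chapter's implementation) is:

```python
import random

random.seed(3)

# Stick-breaking construction of Dirichlet process mixture weights:
# v_k ~ Beta(1, alpha);  w_k = v_k * prod_{j<k} (1 - v_j).
def stick_breaking(alpha, n_atoms):
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        v = random.betavariate(1.0, alpha)
        weights.append(v * remaining)   # break off a piece of the stick
        remaining *= 1.0 - v            # what is left for later atoms
    return weights

w = stick_breaking(alpha=1.0, n_atoms=20)
print(round(sum(w), 4))   # truncation at 20 atoms captures nearly all mass
```

A small concentration parameter alpha puts most weight on a few atoms, so when the residuals really are near-normal the mixture can collapse toward a single normal component, which is the adaptivity the chapter is after.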

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B
Type: Book
ISBN: 978-1-83867-419-9

Article
Publication date: 31 August 2022

G.T.S. Ho, S.K. Choy, P.H. Tong and V. Tang

Abstract

Purpose

Demand forecast methodologies have been studied extensively to improve operations in e-commerce. However, every forecast inevitably contains errors, and this may result in a disproportionate impact on operations, particularly given the dynamic nature of fulfilling orders in e-commerce. This paper aims to quantify the impact that forecast error in order demand has on order picking, the most costly and complex operation in e-order fulfilment, in order to enhance the application of demand forecasting in an e-fulfilment centre.

Design/methodology/approach

The paper presents a Gaussian regression-based mathematical method that translates the error of forecast accuracy in order demand into performance fluctuations in e-order fulfilment. In addition, the impact is examined under two distinct order picking methodologies, namely order batching and wave picking.

Findings

A structured model is developed to evaluate the impact of demand forecast error in order picking performance. The findings in terms of global results and local distribution have important implications for organizational decision-making in both long-term strategic planning and short-term daily workforce planning.

Originality/value

Earlier research examined demand forecasting methodologies in warehouse operations and order picking; examining the impact of demand forecasting error on order picking operations has been identified as a research gap. This paper contributes to closing this gap by presenting a mathematical model that quantifies the impact of demand forecast error as fluctuations in order picking performance.

Details

Industrial Management & Data Systems, vol. 122 no. 11
Type: Research Article
ISSN: 0263-5577

Keywords

Article
Publication date: 12 January 2010

María Cristina Sánchez and J.R. Mahan

Abstract

Purpose

The purpose of this paper is to present the results obtained from numerical models of radiant energy exchange in instruments typically used to measure various characteristics of the Earth's ocean‐atmosphere system.

Design/methodology/approach

Numerical experiments were designed and performed in a statistical environment, based on the Monte Carlo ray‐trace (MCRT) method, developed to model thermal and optical systems. Results from the derived theoretical equations were then compared to the results from the numerical experiments.

Findings

A rigorous statistical protocol is defined and demonstrated for establishing the uncertainty and related confidence interval in results obtained from MCRT models of radiant exchange.

Research limitations/implications

The methodology developed in this paper should be adapted to predict the uncertainty of more comprehensive parameters such as the total radiative heat transfer.

Practical implications

Results can be used to estimate the number of energy bundles necessary to be traced per surface element in a MCRT model to obtain a desired relative error.
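That bundle-count estimate follows from the usual 1/√N Monte Carlo error law: the standard error of a hit-fraction estimate shrinks as the square root of the number of rays. A sketch under an assumed absorption fraction (the 0.3 and the 1% target are invented for illustration):

```python
import math
import random

random.seed(4)

# Estimate the fraction p of rays absorbed by a surface element, then size
# N so the ~95% relative error of that estimate meets a target.
def estimate_p(n_rays, p_true=0.3):
    return sum(random.random() < p_true for _ in range(n_rays)) / n_rays

def rays_needed(p, rel_err, z=1.96):
    # solve z * sqrt(p * (1 - p) / N) <= rel_err * p for N
    return math.ceil(z ** 2 * (1 - p) / (p * rel_err ** 2))

p_hat = estimate_p(10_000)
n = rays_needed(p_hat, rel_err=0.01)
print(p_hat, n)
```

Halving the target relative error quadruples the required ray count, which is why per-element ray budgets dominate MCRT run time.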

Originality/value

This paper offers a new methodology to predict the uncertainty of parameters in high‐level modeling and analysis of instruments that accumulate the long‐term database required to correlate observed trends with human activity and natural phenomena. The value of this paper lies in the interest in understanding the climatological role of the Earth's radiative energy budget.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 20 no. 1
Type: Research Article
ISSN: 0961-5539

Keywords

Content available
Article
Publication date: 10 May 2021

Zachary Hornberger, Bruce Cox and Raymond R. Hill

Abstract

Purpose

Large/stochastic spatiotemporal demand data sets can prove intractable for location optimization problems, motivating the need for aggregation. However, demand aggregation induces errors. Significant theoretical research has been performed related to the modifiable areal unit problem and the zone definition problem. Minimal research has been accomplished related to the specific issues inherent to spatiotemporal demand data, such as search and rescue (SAR) data. This study provides a quantitative comparison of various aggregation methodologies and their relation to distance and volume based aggregation errors.

Design/methodology/approach

This paper introduces and applies a framework for comparing both deterministic and stochastic aggregation methods using distance- and volume-based aggregation error metrics. This paper additionally applies weighted versions of these metrics to account for the reality that demand events are nonhomogeneous. These metrics are applied to a large, highly variable, spatiotemporal demand data set of SAR events in the Pacific Ocean. Comparisons using these metrics are conducted between six quadrat aggregations of varying scales and two zonal distribution models using hierarchical clustering.
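The distance-based side of such a comparison can be sketched as follows (uniform random points stand in for the SAR demand data, and the grid sizes are arbitrary):

```python
import math
import random

random.seed(5)

# Uniform random points stand in for SAR demand locations in a unit square.
points = [(random.random(), random.random()) for _ in range(500)]

def quadrat_distance_error(pts, n):
    """Mean distance from each point to the centroid of its n x n grid cell."""
    err = 0.0
    for x, y in pts:
        cx = (min(int(x * n), n - 1) + 0.5) / n   # cell centroid coordinates
        cy = (min(int(y * n), n - 1) + 0.5) / n
        err += math.hypot(x - cx, y - cy)
    return err / len(pts)

coarse = quadrat_distance_error(points, 2)   # 4 zones
fine = quadrat_distance_error(points, 8)     # 64 zones
print(round(coarse, 3), round(fine, 3))
```

Finer quadrats shrink the distance error, matching the paper's finding, while in the full problem they inflate per-zone volume error because each zone sees fewer events.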

Findings

As quadrat fidelity increases the distance-based aggregation error decreases, while the two deliberate zonal approaches further reduce this error while using fewer zones. However, the higher fidelity aggregations detrimentally affect volume error. Additionally, by splitting the SAR data set into training and test sets this paper shows the stochastic zonal distribution aggregation method is effective at simulating actual future demands.

Originality/value

This study indicates that no single best aggregation method exists; by quantifying the trade-offs in aggregation-induced errors, practitioners can select the method that minimizes the errors most relevant to their study. The study also quantifies the ability of a stochastic zonal distribution method to effectively simulate future demand data.

Details

Journal of Defense Analytics and Logistics, vol. 5 no. 1
Type: Research Article
ISSN: 2399-6439

Keywords

Article
Publication date: 20 October 2021

Houmera Bibi Sabera Nunkoo, Preethee Nunkoo Gonpot, Noor-Ul-Hacq Sookia and T.V. Ramanathan

Abstract

Purpose

The purpose of this study is to identify appropriate autoregressive conditional duration (ACD) models that can capture the dynamics of tick-by-tick mid-cap exchange traded funds (ETFs) for the period July 2017 to December 2017 and accurately predict future trade duration values. The forecasted durations are then used to demonstrate the practical usefulness of the ACD models in quantifying an intraday time-based risk measure.

Design/methodology/approach

Through six functional forms and six error distributions, 36 ACD models are estimated for eight mid-cap ETFs. The Akaike and Bayesian information criteria and the Ljung-Box test are used to evaluate goodness of fit, while root mean square error and the superior predictive ability test are applied to assess forecast accuracy.
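The simplest member of this family, the linear ACD(1,1) with psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1} and x_i = psi_i * eps_i, can be simulated in a few lines (the parameter values are illustrative, not estimates from the paper):

```python
import random

random.seed(6)

# ACD(1,1) with unit-mean exponential errors, so the unconditional mean
# duration is omega / (1 - alpha - beta) = 0.2 / (1 - 0.1 - 0.7) = 1.
def simulate_acd(n, omega=0.2, alpha=0.1, beta=0.7):
    psi, x, durations = 1.0, 1.0, []
    for _ in range(n):
        psi = omega + alpha * x + beta * psi   # conditional expected duration
        x = psi * random.expovariate(1.0)      # observed trade duration
        durations.append(x)
    return durations

d = simulate_acd(5000)
print(round(sum(d) / len(d), 2))   # should hover near the unconditional mean
```

The Box-Cox and augmented variants named in the findings replace this linear recursion with power transformations of psi and x, adding the flexibility whose payoff the paper evaluates.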

Findings

The Box-Cox ACD (BACD), augmented Box-Cox ACD (ABACD) and additive and multiplicative ACD (AMACD) extensions are among the best fits. The results obtained prove that higher degrees of flexibility do not necessarily enhance goodness of fit and forecast accuracy does not always depend on model adequacy. BACD and AMACD models based on the generalised-F distribution generate the best forecasts, irrespective of the trading frequencies of the ETFs.

Originality/value

To the best of the authors’ knowledge, this is the first study that analyses the empirical performance of ACD models for high-frequency ETF data. Additionally, in comparison to previous works, a wider range of ACD models is considered on a reasonably longer sample period. The paper will be of interest to researchers in the area of market microstructure and to practitioners engaged in high-frequency trading.

Details

Studies in Economics and Finance, vol. 39 no. 1
Type: Research Article
ISSN: 1086-7376

Keywords
