Search results

1 – 10 of over 4000
Article
Publication date: 1 August 1994

T.A. Spedding and P.L. Rawlings

Abstract

Control charts and process capability calculations remain fundamental techniques for statistical process control. However, it has long been realized that the accuracy of these calculations can be significantly affected when sampling from a non‐Gaussian population. Many quality practitioners are conscious of these problems but are not aware of the effects such problems might have on the integrity of their results. Considers non‐normality with respect to the use of traditional control charts and process capability calculations, so that users may be aware of the errors that are involved when sampling from a non‐Gaussian population. Use is made of the Johnson system of distributions as a simulation technique to investigate the effects of non‐normality on control charts and process capability calculations. An alternative technique is suggested for process capability calculations which alleviates the problems of non‐normality while retaining computational efficiency.
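
As a rough illustration of the issue described above (a sketch, not the authors' procedure; the Johnson S_U shape parameters and specification limits below are invented), simulating a skewed population with the Johnson system shows how far the conventional, normality-based capability estimate can drift from a percentile-based one of the kind the article advocates:

```python
# Sketch only: simulate a skewed population via the Johnson S_U system and
# compare a normality-based capability index with a percentile-based one.
# The shape parameters and spec limits are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = -1.0, 1.5                                   # hypothetical skewed shape
data = stats.johnsonsu.rvs(a, b, size=10_000, random_state=rng)
lsl, usl = np.quantile(data, [0.001, 0.999])       # illustrative spec limits

# Conventional Cp assumes a Gaussian process: spread = 6 * sigma.
cp_normal = (usl - lsl) / (6 * data.std(ddof=1))

# Percentile-based analog: use the actual 0.135%..99.865% spread instead,
# which remains meaningful for non-Gaussian data.
p_lo, p_hi = np.quantile(data, [0.00135, 0.99865])
cp_percentile = (usl - lsl) / (p_hi - p_lo)

print(f"Cp (normal assumption): {cp_normal:.3f}")
print(f"Cp (percentile-based):  {cp_percentile:.3f}")
```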

Details

International Journal of Quality & Reliability Management, vol. 11 no. 6
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 11 July 2008

Vivian W.Y. Tam and Khoa N. Le

Abstract

Purpose

Various methods have been used by organisations in the construction industry to improve quality, employing mainly two major techniques: management techniques such as quality control, quality assurance and total quality management; and statistical techniques such as cost of quality, customer satisfaction and the six sigma principle. The purpose of this paper is to show that it is possible to employ the six sigma principle in the field of construction management provided that sufficient information on a particular population is obtained.

Design/methodology/approach

Statistical properties of the hyperbolic distribution are given, and quality factors such as population in range, number of defects, yield percentage and defects per million opportunities are estimated. Graphical illustrations of the hyperbolic and Gaussian distributions are also given, from which detailed numerical comparisons of the two distributions are obtained. The impacts of these quality factors are briefly discussed to give rough guidance to organisations in the construction industry on how to lower costs and improve project quality through prevention. A case study on a construction project is given in which it is shown that the hyperbolic distribution is better suited to the cost data than the Gaussian distribution. Cost and quality data of all projects in the company are collected over a period of eight years. Each project may consist of a number of phases, typically spanning about three months. Each phase can be considered as a member of the project population. Quality factors of this population are estimated using the six sigma principle.
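
The quality factors named above follow directly from whichever distribution is fitted. A back-of-the-envelope sketch under a Gaussian model with invented parameters (the paper's point being that substituting a heavier-tailed hyperbolic law changes these figures materially):

```python
# Sketch with invented parameters: the six sigma quality factors under a
# Gaussian model of cost deviation. Substituting a heavier-tailed law
# (e.g. the paper's hyperbolic distribution) changes these figures.
from scipy import stats

mu, sigma = 0.0, 1.0            # hypothetical cost-deviation distribution
lsl, usl = -3.0, 3.0            # hypothetical specification limits

dist = stats.norm(mu, sigma)
in_range = dist.cdf(usl) - dist.cdf(lsl)    # "population in range"
yield_pct = 100 * in_range                  # yield percentage
dpmo = 1e6 * (1 - in_range)                 # defects per million opportunities

print(f"population in range: {in_range:.5f}")
print(f"yield: {yield_pct:.3f}%   DPMO: {dpmo:.0f}")
```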

Findings

The paper finds that, by using a suitable distribution, it is possible to estimate quality factors such as population in range, yield percentage and number of defects per million opportunities more accurately.

Originality/value

This paper is of value in assessing the suitability of the hyperbolic and Gaussian distributions for modelling the population, and in showing that the hyperbolic distribution models the cost data more effectively than the Gaussian distribution.

Details

Journal of Engineering, Design and Technology, vol. 6 no. 2
Type: Research Article
ISSN: 1726-0531

Book part
Publication date: 18 January 2022

Dante Amengual, Enrique Sentana and Zhanyuan Tian

Abstract

We study the statistical properties of Pearson correlation coefficients of Gaussian ranks, and Gaussian rank regressions – ordinary least-squares (OLS) models applied to those ranks. We show that these procedures are fully efficient when the true copula is Gaussian and the margins are non-parametrically estimated, and remain consistent for their population analogs otherwise. We compare them to Spearman and Pearson correlations and their regression counterparts theoretically and in extensive Monte Carlo simulations. Empirical applications to migration and growth across US states, the augmented Solow growth model and momentum and reversal effects in individual stock returns confirm that Gaussian rank procedures are insensitive to outliers.
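
A minimal sketch (synthetic data, assumptions mine) of the Gaussian rank correlation just described: each margin is mapped to normal scores through its ranks, and the ordinary Pearson correlation is taken on the scores. A planted outlier illustrates the insensitivity the authors report:

```python
# Sketch of a Gaussian rank correlation on synthetic heavy-tailed data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 500
x = rng.standard_t(df=3, size=n)             # heavy-tailed margin
y = 0.6 * x + rng.standard_t(df=3, size=n)
y[0] = 100.0                                 # plant one gross outlier

def gaussian_rank_corr(x, y):
    # van der Waerden scores: Phi^{-1}(rank / (n + 1))
    zx = stats.norm.ppf(stats.rankdata(x) / (len(x) + 1))
    zy = stats.norm.ppf(stats.rankdata(y) / (len(y) + 1))
    return np.corrcoef(zx, zy)[0, 1]

print(f"Pearson:       {np.corrcoef(x, y)[0, 1]:.3f}")   # hurt by the outlier
print(f"Gaussian rank: {gaussian_rank_corr(x, y):.3f}")  # robust to it
```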

Details

Essays in Honor of M. Hashem Pesaran: Panel Modeling, Micro Applications, and Econometric Methodology
Type: Book
ISBN: 978-1-80262-065-8

Article
Publication date: 7 September 2022

Zhe Liu, Zexiong Yu, Leilei Wang, Li Chen, Haihang Cui and Bohua Sun

Abstract

Purpose

The purpose of this study is to use a weak light source with spatial distribution to realize light-driven fluid by adding high-absorbing nanoparticles to the droplets, thereby replacing a highly focused strong linear light source acting on pure droplets.

Design/methodology/approach

First, Fe3O4 nanoparticles with high light-response characteristics were added to the droplets to prepare nanofluid droplets, and through a Gaussian light-driven flow experiment the Marangoni effect inside a nanofluid droplet was studied; this effect produces a surface tension gradient at the air/liquid interface and induces vortex motion inside the droplet. Then, a multiphysics field coupling numerical simulation was used to study the effects of droplet height and Gaussian light distribution on the flow characteristics inside a droplet.
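
For orientation, a small sketch (all parameters invented; standard textbook formulas, not the authors' model) of the Gaussian beam intensity profile and the Beer-Lambert volumetric heat source it deposits in an absorbing nanofluid, the quantities that ultimately drive the Marangoni flow:

```python
# Illustration only: Gaussian beam intensity I(r) = (2P / pi w^2) exp(-2r^2/w^2)
# and the axial heat source q(z) = alpha * I * exp(-alpha z) it deposits in an
# absorbing nanofluid. All parameter values are hypothetical.
import numpy as np

P = 10e-3        # hypothetical beam power, W
w = 0.5e-3       # hypothetical beam waist, m
alpha = 2.0e3    # hypothetical absorption coefficient of the nanofluid, 1/m

r = np.linspace(0, 2e-3, 200)                              # radial coord, m
I = (2 * P / (np.pi * w**2)) * np.exp(-2 * r**2 / w**2)    # intensity, W/m^2

z = np.linspace(0, 1e-3, 100)                # depth into the droplet, m
q = alpha * I[0] * np.exp(-alpha * z)        # on-axis heat source, W/m^3

print(f"peak intensity: {I[0]:.3e} W/m^2")
print(f"axial heat source at the surface: {q[0]:.3e} W/m^3")
```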

Findings

Nanoparticles significantly enhance light absorption, so that Gaussian light alone is enough to drive the flow, and the formation of vortices can be regulated by the light distribution. The multiphysics field coupling model describes this problem accurately.

Originality/value

This study is helpful for understanding the flow behavior and heat transfer phenomena in optical microfluidic systems, and provides a feasible way to generate rapid flow inside a tiny droplet using light.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 33 no. 2
Type: Research Article
ISSN: 0961-5539

Book part
Publication date: 5 July 2012

Miguel Angel Fuentes, Austin Gerig and Javier Vicente

Abstract

It is well known that the probability distribution of stock returns is non-Gaussian. The tails of the distribution are too “fat,” meaning that extreme price movements, such as stock market crashes, occur more often than predicted given a Gaussian model. Numerous studies have attempted to characterize and explain the fat-tailed property of returns. This is because understanding the probability of extreme price movements is important for risk management and option pricing. In spite of this work, there is still no accepted theoretical explanation. In this chapter, we use a large collection of data from three different stock markets to show that slow fluctuations in the volatility (i.e., the size of return increments), coupled with a Gaussian random process, produce the non-Gaussian and stable shape of the return distribution. Furthermore, because the statistical features of volatility are similar across stocks, we show that their return distributions collapse onto one universal curve. Volatility fluctuations influence the pricing of derivative instruments, and we discuss the implications of our findings for the pricing of options.
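
The mechanism described here is easy to reproduce numerically. A minimal sketch (invented parameters, not the chapter's market data): a Gaussian process modulated by slowly fluctuating volatility produces pronounced excess kurtosis, i.e. fat tails:

```python
# Sketch of the volatility-modulation mechanism: Gaussian increments scaled
# by a slowly varying volatility process yield a fat-tailed distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 100_000

# Slowly fluctuating log-volatility (AR(1) with high persistence).
log_vol = np.empty(n)
log_vol[0] = 0.0
for t in range(1, n):
    log_vol[t] = 0.99 * log_vol[t - 1] + 0.1 * rng.standard_normal()

returns = np.exp(log_vol) * rng.standard_normal(n)

# Gaussian data has excess kurtosis 0; fat tails push it well above 0.
print(f"excess kurtosis: {stats.kurtosis(returns):.2f}")
```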

Details

Derivative Securities Pricing and Modelling
Type: Book
ISBN: 978-1-78052-616-4

Article
Publication date: 15 March 2021

Putta Hemalatha and Geetha Mary Amalanathan

Abstract

Purpose

Adequate resources for learning and for training on the data are an important prerequisite for developing an efficient classifier with outstanding performance. Real-world data usually follows a biased distribution of classes, that is, an unequal representation of classes within a dataset. This issue is known as the imbalance problem and is one of the most common issues occurring in real-time applications. Learning from imbalanced datasets is a ubiquitous challenge in the field of data mining, and imbalanced data degrades the performance of a classifier by producing inaccurate results.

Design/methodology/approach

In the proposed work, a novel fuzzy-based Gaussian synthetic minority oversampling (FG-SMOTE) algorithm is proposed to process the imbalanced data. The Gaussian SMOTE technique synthesizes new minority samples based on the nearest-neighbour concept, and the ratio of the datasets belonging to the minority and majority classes is balanced using a fuzzy-based Levenshtein distance measure.
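
For concreteness, a simplified sketch of the Gaussian SMOTE step alone; the paper's fuzzy Levenshtein distance component is not reproduced here, and all names and parameters below are illustrative:

```python
# Simplified Gaussian SMOTE sketch: synthesize minority samples by Gaussian
# perturbation along the line to a random nearest neighbour. Illustrative
# only; the fuzzy Levenshtein balancing step of FG-SMOTE is omitted.
import numpy as np

def gaussian_smote(X_min, n_new, k=5, sigma=0.1, rng=None):
    """X_min: (n, d) minority-class samples; returns (n_new, d) synthetics."""
    rng = rng or np.random.default_rng()
    # Pairwise squared distances -> k nearest neighbours (excluding self).
    d2 = ((X_min[:, None, :] - X_min[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    nn = np.argsort(d2, axis=1)[:, :k]

    out = np.empty((n_new, X_min.shape[1]))
    for i in range(n_new):
        a = rng.integers(len(X_min))
        b = nn[a, rng.integers(k)]
        # Interpolate with a Gaussian-distributed coefficient instead of
        # the uniform gap used by classical SMOTE.
        lam = rng.normal(0.5, sigma)
        out[i] = X_min[a] + lam * (X_min[b] - X_min[a])
    return out

X_min = np.random.default_rng(3).normal(size=(20, 2))
print(gaussian_smote(X_min, n_new=5).shape)   # -> (5, 2)
```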

Findings

The performance and accuracy of the proposed algorithm are evaluated using a deep belief network classifier. The results show the efficiency of the fuzzy-based Gaussian SMOTE technique, which achieved an AUC of 93.7%, an F1 score of 94.2% and a geometric mean score of 93.6%, computed from the confusion matrix.

Research limitations/implications

The proposed research still retains some challenges that need to be addressed, such as applying FG-SMOTE to multiclass imbalanced datasets and evaluating the dataset imbalance problem in a distributed environment.

Originality/value

The proposed algorithm fundamentally solves the data imbalance issues and challenges involved in handling the imbalanced data. FG-SMOTE has aided in balancing minority and majority class datasets.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 14 no. 2
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 31 July 2009

Valerio Giuliani, Ronald J. Hugo and Peihua Gu

Abstract

Purpose

The purpose of this paper is to provide a flexible tool to predict the particle temperature distribution for traditional laser applications and for the most recent diode laser processes. In the past few years, surface processing and rapid prototyping applications have frequently implemented the use of powder delivery nozzles and high power fibre‐coupled diode lasers with highly convergent laser beams. Owing to the complexity and variety of the process parameters involved in this technology, mathematical models are necessary to understand and predict the deposition behaviour. Modeling the dynamics of the melting pool and the particle temperature distribution is critical for achieving a good deposition quality.

Design/methodology/approach

This study focuses on the development of mathematical models to predict the particle temperature distribution over the melting pool. An analytical and a numerical solution are proposed for two cases of laser intensity distribution: top hat and Gaussian.
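
The two intensity distributions treated by the models can be written down directly. A small sketch (parameters invented) that normalizes both to the same total power shows why they heat in-flight particles differently:

```python
# Illustration with invented parameters: top-hat and Gaussian laser intensity
# profiles normalized to the same total power P.
import numpy as np

P = 100.0       # hypothetical laser power, W
w = 1.0e-3      # hypothetical beam radius, m
r = np.linspace(0, 3e-3, 300)

I_tophat = np.where(r <= w, P / (np.pi * w**2), 0.0)             # uniform disc
I_gauss = (2 * P / (np.pi * w**2)) * np.exp(-2 * r**2 / w**2)    # Gaussian

# At equal power, the Gaussian peak is twice the top-hat level, which is
# why particles crossing the beam see different temperature histories.
print(I_gauss[0] / I_tophat[0])   # -> 2.0
```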

Findings

The results show that a more vertical position of the powder delivery nozzle leads to a higher and more uniform particle temperature distribution, in particular for the top‐hat intensity distribution case.

Originality/value

Previous work has dealt only with Gaussian laser spatial distributions and collimated laser beams, and was therefore limited to a specific class of laser processes. This work provides a flexible tool to predict the particle temperature distribution for traditional laser applications (powder delivery nozzle and Gaussian laser profile) and for the most recent diode laser processes (powder delivery nozzle and top‐hat laser distribution with a highly convergent laser beam). In addition, the results demonstrate that the particle temperature does not increase monotonically with nozzle inclination, as it does for a collimated laser beam; instead, some particles show a minimum temperature at intermediate values of the nozzle inclination angle.

Details

Rapid Prototyping Journal, vol. 15 no. 4
Type: Research Article
ISSN: 1355-2546

Book part
Publication date: 19 November 2012

Naceur Naguez and Jean-Luc Prigent

Abstract

Purpose – The purpose of this chapter is to estimate non-Gaussian distributions by means of Johnson distributions. An empirical illustration on hedge fund returns is detailed.

Methodology/approach – To fit non-Gaussian distributions, the chapter introduces the family of Johnson distributions and its general extensions. We use both parametric and non-parametric approaches. In a first step, we analyze the serial correlation of our sample of hedge fund returns and unsmooth the series to correct the correlations. Then, we estimate the distribution by the standard Johnson system of laws. Finally, we search for a more general distribution of Johnson type, using a non-parametric approach.
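
As a sketch of the parametric step (on synthetic data, not the CSFB/Tremont series), a Johnson S_U law can be fitted by maximum likelihood and compared with a Gaussian fit:

```python
# Sketch: fit a Johnson S_U distribution to a synthetic return series and
# compare goodness of fit against a Gaussian via Kolmogorov-Smirnov statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
returns = stats.t.rvs(df=4, loc=0.005, scale=0.02, size=2000,
                      random_state=rng)   # stand-in for hedge fund returns

a, b, loc, scale = stats.johnsonsu.fit(returns)
print(f"Johnson S_U fit: a={a:.3f}, b={b:.3f}, loc={loc:.4f}, scale={scale:.4f}")

# Smaller KS statistic = better fit; the S_U law captures the fat tails.
print(stats.kstest(returns, 'johnsonsu', args=(a, b, loc, scale)).statistic)
print(stats.kstest(returns, 'norm',
                   args=(returns.mean(), returns.std())).statistic)
```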

Findings – We use data from the indexes Credit Suisse/Tremont Hedge Fund (CSFB/Tremont) provided by Credit Suisse. For the parametric approach, we find that the SU Johnson distribution is the most appropriate, except for the Managed Futures. For the non-parametric approach, we determine the best polynomial approximation of the function characterizing the transformation from the initial Gaussian law to the generalized Johnson distribution.

Originality/value of chapter – These findings are novel since we use an extension of the Johnson distributions to better fit non-Gaussian distributions, in particular in the case of hedge fund returns. We illustrate the power of this methodology that can be further developed in the multidimensional case.

Details

Recent Developments in Alternative Finance: Empirical Assessments and Economic Implications
Type: Book
ISBN: 978-1-78190-399-5

Open Access
Article
Publication date: 15 December 2020

Soha Rawas and Ali El-Zaart

Abstract

Purpose

Image segmentation is one of the most essential tasks in image processing applications. It is a valuable tool in many application areas such as health-care systems, pattern recognition, traffic control and surveillance systems. However, accurate segmentation is a critical task, since finding a correct model that fits different types of image processing applications is a persistent problem. This paper develops a novel segmentation model that aims to serve as a unified model for any kind of image processing application. The proposed precise and parallel segmentation model (PPSM) combines the three benchmark distribution thresholding techniques, Gaussian, lognormal and gamma, to estimate an optimum threshold value that leads to optimum extraction of the segmented region. Moreover, a parallel boosting algorithm is proposed to improve the performance of the developed segmentation algorithm and minimize its computational cost. To evaluate the effectiveness of the proposed PPSM, different benchmark data sets for image segmentation are used, such as Planet Hunters 2 (PH2), the International Skin Imaging Collaboration (ISIC), Microsoft Research in Cambridge (MSRC), the Berkley Segmentation Benchmark Data set (BSDS) and Common Objects in COntext (COCO). The obtained results indicate the efficacy of the proposed model in achieving high accuracy with a significant reduction in processing time compared with other segmentation models, across different types and fields of benchmark data sets.

Design/methodology/approach

The proposed PPSM combines the three benchmark distribution thresholding techniques to estimate an optimum threshold value that leads to optimum extraction of the segmented region: Gaussian, lognormal and gamma distributions.
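
For reference, a compact sketch of minimum cross-entropy thresholding (MCET), the criterion at the core of the PPSM; the three-distribution fusion and the parallel boosting are not reproduced here, and the toy histogram is invented:

```python
# Sketch of minimum cross-entropy thresholding (Li-Lee criterion) on a
# grey-level histogram. Illustrative only; PPSM's distribution fusion and
# parallel boosting are not reproduced.
import numpy as np

def mcet_threshold(hist):
    """hist[g-1] = count of grey level g; g starts at 1 so log(g) is valid."""
    g = np.arange(1, len(hist) + 1, dtype=float)
    gh = g * hist
    best_t, best_eta = 1, np.inf
    for t in range(2, len(hist)):
        w1, w2 = hist[:t].sum(), hist[t:].sum()
        if w1 == 0 or w2 == 0:
            continue
        mu1, mu2 = gh[:t].sum() / w1, gh[t:].sum() / w2   # class means
        # Cross entropy between the image and its two-level reconstruction.
        eta = (gh[:t] * np.log(g[:t] / mu1)).sum() \
            + (gh[t:] * np.log(g[t:] / mu2)).sum()
        if eta < best_eta:
            best_t, best_eta = t, eta
    return best_t

# Bimodal toy histogram: two Gaussian-ish modes around levels 60 and 180.
rng = np.random.default_rng(5)
levels = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)])
hist, _ = np.histogram(levels.clip(1, 255), bins=255, range=(1, 256))
print(mcet_threshold(hist.astype(float)))   # threshold between the two modes
```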

Findings

On the basis of the achieved results, it can be observed that the proposed PPSM–minimum cross-entropy thresholding (PPSM–MCET)-based segmentation model is a robust, accurate and highly consistent method with high-performance ability.

Originality/value

A novel hybrid segmentation model is constructed, exploiting a combination of Gaussian, gamma and lognormal distributions using MCET. Moreover, to provide accurate, high-performance thresholding at minimum computational cost, the proposed PPSM uses a parallel processing method to minimize the computational effort of the MCET computation. The proposed model might be used as a valuable tool in many application areas such as health-care systems, pattern recognition, traffic control and surveillance systems.

Details

Applied Computing and Informatics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2634-1964

Article
Publication date: 16 October 2009

Qiang Wang and Xianyi Gong

Abstract

Purpose

The purpose of this paper is to improve active sonar detection performance in shallow water. A stochastic-like model, based on multivariate elliptically contoured (MEC) distributions, is defined to model reverberation, which helps to reveal structural information of target signatures.

Design/methodology/approach

Active sonar systems have been developed with wider transmission bandwidths and larger-aperture receiving arrays, which improve the signal-to-noise and signal-to-reverberation power ratios after matched filtering and beamforming. However, this has shifted the statistical distribution of the reverberation-induced envelope away from the traditionally assumed Rayleigh distribution. The MEC is a kind of generalized non-Gaussian distribution model; the authors theoretically derive that the compound Gaussian, Rayleigh-mixture, Weibull and K distributions are all special cases of the MEC. It is known that the Weibull and K distributions have obviously heavier tails than the Rayleigh distribution, so the MEC is a suitable model to characterize the non-Rayleigh, heavy-tailed distribution of reverberation.
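
The heavy-tail point is easy to see numerically. A small sketch (shape parameter invented) generates a K-distributed envelope via its compound-Gaussian representation and compares its tail with a Rayleigh envelope of equal mean power:

```python
# Sketch: K-distributed envelope (a compound-Gaussian special case of the
# MEC family) vs a Rayleigh envelope of equal mean power. The K shape
# parameter is invented for illustration.
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
nu = 1.5                                    # hypothetical K shape (texture)

# Compound representation: Rayleigh speckle modulated by gamma texture.
texture = rng.gamma(shape=nu, scale=1.0 / nu, size=n)       # mean-1 power
speckle = np.abs(rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
k_env = np.sqrt(texture) * speckle
rayleigh_env = np.abs(rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)

# Tail probability beyond 3x the RMS envelope level: the K tail is far
# heavier, which drives false alarms if a Rayleigh model is assumed.
thr = 3.0
print(f"P(env > thr), Rayleigh: {(rayleigh_env > thr).mean():.2e}")
print(f"P(env > thr), K:        {(k_env > thr).mean():.2e}")
```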

Findings

Analysis of test data shows that reverberation envelopes obviously deviate from the Rayleigh distribution. Within a broad non-Gaussian framework, reverberation is modelled with the MEC distribution, which is well suited to characterizing non-Rayleigh reverberation. Data received in sea trials validate the effectiveness of the MEC model: the real envelope data follow the K distribution, which is a special case of the MEC. The MEC can therefore be applied to develop novel signal-processing algorithms that mitigate or account for the effects of heavy-tailed reverberation distributions on target detection.

Research limitations/implications

The limited sea-trial data are the main limitation on validating the model further.

Practical implications

A very useful model for representing reverberation in shallow water.

Originality/value

The MEC family in fact represents an attractive data model for adaptive arrays, and it provides a theoretical framework for designing optimal or sub-optimal detectors.

Details

Kybernetes, vol. 38 no. 10
Type: Research Article
ISSN: 0368-492X
