Search results

1 – 10 of 64
Article
Publication date: 9 August 2011

George J. Besseris

Abstract

Purpose

The purpose of this paper is to provide a case study on endorsing process improvement in maritime operations by implementing design of experiments on Lean Six Sigma performance responses. It is demonstrated how process efficiency and environmental muda may be dealt with simultaneously in a lean‐and‐green project driven by hardcore Six Sigma tools.

Design/methodology/approach

A 16‐run Taguchi‐type orthogonal design was employed to gather data for vessel speed (VS), exhaust gas temperature (EGT) and fuel consumption (FC), as modulated synchronously by a total of 15 controlling parameters. Active dependencies were inferred by applying the desirability analysis method to direct process data from a performance log maintained for long‐term monitoring of a 55,000 DWT double‐skin bulk carrier during sea service.

Findings

A high composite desirability value was achieved, eclipsing the 0.90 mark. Values well over the 0.9 level were also obtained for the three individual desirability values of VS, EGT and FC. The leading controlling parameters were found to be compressor pressure, fuel pump index, slip, governor index and MIP.
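The desirability analysis referred to above combines each response's individual desirability into a composite score via a geometric mean (the Derringer–Suich approach). A minimal Python sketch follows; the bounds and run values are purely illustrative and are not the paper's data:

```python
import numpy as np

def d_larger(y, lo, hi):
    """Derringer-Suich 'larger-is-better' desirability: 0 at/below lo, 1 at/above hi."""
    return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))

def d_smaller(y, lo, hi):
    """'Smaller-is-better' desirability: 1 at/below lo, 0 at/above hi."""
    return float(np.clip((hi - y) / (hi - lo), 0.0, 1.0))

def composite(ds):
    """Composite desirability: geometric mean of the individual desirabilities."""
    return float(np.prod(ds) ** (1.0 / len(ds)))

# Hypothetical single run: maximize vessel speed (VS), minimize exhaust gas
# temperature (EGT) and fuel consumption (FC). Bounds are illustrative only.
d_vs  = d_larger(14.2, lo=12.0, hi=14.5)      # knots
d_egt = d_smaller(345.0, lo=330.0, hi=400.0)  # deg C
d_fc  = d_smaller(31.0, lo=28.0, hi=40.0)     # tonnes/day

D = composite([d_vs, d_egt, d_fc])
```

A run is preferred as D approaches 1; the paper reports composite desirability eclipsing 0.90 at the optimum settings.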

Practical implications

A Lean Six Sigma project is carried out to improve performance characteristics in ordinary maritime operations. While the company in the case study no longer relies on periodic inspections to determine machinery condition, improvements to key process characteristics were nevertheless deemed worth pursuing. Information retrieval from computerized continuous-monitoring systems assisted in conducting the experimental designs to obtain optimal performance. Specifically, the tuning of the vessel's main-engine running mode was examined, aiming to increase the quality of output power delivered to the shaft while reducing NOx emissions.

Originality/value

This work adds an interesting paradigm to the critical field of maritime activities for processes in full gear while operating at sea. Maritime operations are essential to expediting international trading transactions. It is the first time that such a case study has emanated from a real pilot Lean Six Sigma project interlacing process-efficiency enhancement with concurrent environmental muda reduction.

Article
Publication date: 5 April 2021

George Besseris and Panagiotis Tsarouhas

Abstract

Purpose

The study aims to provide a quick-and-robust multifactorial screening technique for early detection of statistically significant effects that could influence a product's life-time performance.

Design/methodology/approach

The proposed method takes advantage of saturated fractional factorial designs for organizing the lifetime dataset collection process. Small censored lifetime data are fitted to the Kaplan–Meier model. Low-percentile lifetime behavior that is derived from the fitted model is used to screen for strong effects. A robust surrogate profiler is employed to furnish the predictions.
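The Kaplan–Meier product-limit estimator mentioned above can be computed directly for a small censored sample. A minimal numpy sketch with hypothetical lifetimes follows (the thermostat data and the paper's surrogate profiler are not reproduced); the first-decile lifetime used for screening is the earliest time at which the estimated survival drops to 0.90 or below:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function.
    times: observed times; events: 1 = failure, 0 = right-censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):   # distinct failure times, ascending
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk
        surv.append((t, s))
    return surv

def percentile_lifetime(surv, p):
    """Smallest event time where survival falls to (1 - p) or below;
    p = 0.10 gives the first-decile lifetime used for screening."""
    for t, s in surv:
        if s <= 1.0 - p:
            return t
    return None  # percentile not reached within the observed data

# Hypothetical censored lifetimes from one design run (hours; event 0 = censored).
t = [120, 150, 150, 200, 250, 300, 300, 300]
e = [1,   1,   0,   1,   1,   0,   0,   0]
km = kaplan_meier(t, e)
d10 = percentile_lifetime(km, 0.10)
```

Repeating this per run of the saturated fractional factorial design yields the low-percentile responses that the screening step contrasts across factor levels.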

Findings

The methodology is tested on a difficult published case study that involves the eleven-factor screening of an industrial-grade thermostat. The tested thermostat units are use-rate accelerated to expedite the information collection process. The solution provided by the new method suggests as many as two active effects at the first decile of the data, improving on the solution provided by more classical methods.

Research limitations/implications

To benchmark the predicted solution against other competing approaches, the results showcase the critical first-decile part of the dataset. Moreover, prediction capability is demonstrated for the use-rate acceleration condition.

Practical implications

The technique might be applicable to projects where the early reliability improvement is studied for complex industrial products.

Originality/value

The proposed methodology offers a range of features that aim to make the product reliability profiling process faster and more robust while managing to be less susceptible to assumptions often encountered in classical multi-parameter treatments.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 May 2009

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a manufacturing product‐screening methodology that will require minimal resource expenditures as well as succinct improvement tools based on multi‐response prioritisation.

Design/methodology/approach

A six‐step methodology is overviewed that relies on the sampling efficiency of the fractional factorial designs introduced and recommended by Dr G. Taguchi. Moreover, the multi‐response optimisation approach based on the super‐ranking concept is expanded to the more pragmatic situation where prioritising the implicated responses is imperative. Theoretical developments address the ongoing research issue of saturated and unreplicated fractional factorial designs. The methodology promotes the "user‐friendly" incorporation of assigned preference weights on the studied responses. Test efficiency is improved by concise rank ordering, accomplished by adopting the powerful Wilcoxon‐Mann‐Whitney rank‐sum inference method.
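The weighted super-ranking step described above can be sketched as follows: each response is ranked according to its optimization direction, the ranks are combined with preference weights into one master response, and each orthogonal-array factor column is contrasted with the Mann–Whitney U statistic. All data and weights below are hypothetical, and the paper's exact inference tables are not reproduced:

```python
import numpy as np

def ranks(x):
    """Ranks with 1 = smallest; ties receive the average (mid) rank."""
    x = np.asarray(x, dtype=float)
    r = np.empty(len(x))
    r[np.argsort(x, kind="stable")] = np.arange(1, len(x) + 1)
    for v in np.unique(x):          # midranks for tied values
        m = x == v
        r[m] = r[m].mean()
    return r

# Hypothetical unreplicated 8-run screening with two weighted responses,
# collapsed to a single "super-rank" master response (weights are assumed).
y1 = np.array([5.1, 4.8, 6.2, 5.9, 4.5, 5.5, 6.0, 4.9])  # smaller-is-better
y2 = np.array([88., 91., 84., 86., 93., 89., 85., 92.])   # larger-is-better
w1, w2 = 0.6, 0.4
super_rank = w1 * ranks(y1) + w2 * ranks(-y2)  # rank 1 = best for both

# Contrast one L8(2^7) factor column via the Mann-Whitney U statistic.
level = np.array([1, 1, 1, 1, 2, 2, 2, 2])
pooled = ranks(super_rank)
R1 = pooled[level == 1].sum()
n1 = n2 = 4
U1 = R1 - n1 * (n1 + 1) / 2   # pairs where a level-1 run outranks a level-2 run
U2 = n1 * n2 - U1
U = min(U1, U2)               # compare to an exact small-sample critical value
```

Repeating the contrast for each of the seven factor columns flags the active factors whose U falls below the chosen exact critical value.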

Findings

Two real‐life case studies complement the proposed technique. The first discusses a production problem in manufacturing disposable shavers. Injection moulding data for factors such as handle weight, two associated critical handle dimensions and a single mechanical property undergo preferential multi‐response improvement based on working specification standards. This case shows that, regardless of fluctuations incurred by four different sources of response prioritisation, only injection speed retains high statistical significance in all four cases among the seven production factors considered. Similarly, the technique identifies a single active factor among seven examined effects in a foil‐manufacturing optimisation of three traits.

Originality/value

This investigation suggests a technique that targets the needs of manufacturing managers and engineers for “quick‐and‐robust” decision making in preferential product improvement. This is achieved by conjoining orthogonal arrays with a well‐established non‐parametric comparison test. A version of the super‐ranking concept is adapted for the weighted multi‐response optimisation case.

Details

Journal of Manufacturing Technology Management, vol. 20 no. 4
Type: Research Article
ISSN: 1741-038X

Article
Publication date: 6 January 2012

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a methodology that may aid in assessing ecological quality multi‐trait screening through the use of simple and robust tools while exerting minimal effort in conducting trials and interpreting results.

Design/methodology/approach

Response data for two popular site‐monitoring environmental indicators, chemical oxygen demand (COD) and biochemical oxygen demand (BOD), are arranged by implementing an 8‐run saturated orthogonal array proposed by Taguchi. Unreplicated data consolidation is performed through the Super‐Ranking translation. This permits converting the two eco‐traits to a single master response which becomes much easier to manipulate statistically. Distribution‐free multi‐factor contrasting provides the data reduction engine to filter out non‐active eco‐design variables for a waste treatment unit in a large dairy‐products plant.

Findings

Environmental quality improvement is achieved by accumulating structured eco‐data sets through an unreplicated, saturated L8(2^7) Taguchi design. Concurrent minimization of the two selected eco‐responses, COD and BOD, identifies in a statistically significant fashion the quantity of incoming waste, set at its minimum load, as the sole active eco‐factor.

Practical implications

Brief but robust experimentation is exploited to gain information about the phenomenological behavior of environmental quality indicators, namely COD and BOD, in facilities that manage wastewater treatment. Design for environment is enforced through standard DOE planning schemes. Collected multi‐metric eco‐quality data are translated non‐parametrically in an easy‐to‐comprehend manner that requires no software assistance, bypassing more statistically intensive techniques that may demand the involvement of more experienced personnel. The methodology is accessible at any level of statistical competence and dovetails with optimization demands for rapid inference.

Originality/value

The method combines three distinctive "design‐and‐analysis" elements to provide an optimal solution in a design‐for‐environment project. The sampling capabilities of Taguchi's orthogonal arrays, in concert with the Super‐Ranking transformation, fuse multiple eco‐characteristics into a single, easy‐to‐handle, unitless master eco‐response. Recently published order‐statistics tables expressed in true probabilities are adopted to supply the proper cutoff points against which observed rank sums are gauged in an 8‐run orthogonal‐array screening. Quality managers and environmental engineers who contribute routinely to continuous eco‐improvement projects in their Total Environmental Quality Management (TEQM) program may find this approach attractive and viable en route to a typical industrial pollution prevention control deployment.

Article
Publication date: 26 June 2009

George J. Besseris

Abstract

Purpose

The aim of this paper is to circumvent the multi‐distribution effects and small sample constraints that may arise in unreplicated‐saturated fractional factorial designs during construction blueprint screening.

Design/methodology/approach

A simple additive ranking scheme is devised based on converting the responses of interest to rank variables regardless of the nature of each response and the optimization direction that may be issued for each of them. Collapsing all ranked responses to a single rank response, appropriately referred to as “Super‐Ranking”, allows simultaneous optimization for all factor settings considered.

Research limitations/implications

The Super‐Rank response is treated by Wilcoxon's rank sum test or Mann‐Whitney's test, aiming to establish possible factor‐setting differences by exploring their statistical significance. An optimal value for each response is predicted.

Practical implications

It is stressed, by example, that the model may handle any number of quality characteristics simultaneously. A case study based on a real geotechnical engineering project illustrates how the method may be applied to optimize simultaneously three quality characteristics, one from each of the three possible cases, i.e. "nominal‐is‐best", "larger‐is‐better" and "smaller‐is‐better". For this reason, a screening set of experiments is performed in a professional CAD/CAE software package using an L8(2^7) orthogonal array in which all seven factor columns are saturated by group excavation controls.

Originality/value

The statistical nature of this method is discussed in comparison with results produced by the desirability method for the case of exhausted degrees of freedom for the error. The case study itself is a unique paradigm from the area of construction operations management.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 6
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 17 April 2009

George J. Besseris

Abstract

Purpose

The aim of this paper is to examine product formulation screening at the industrial level in terms of multi‐trait improvement by considering several pertinent controlling factors.

Design/methodology/approach

The study adopts Taguchi's orthogonal arrays (OAs) for sufficient and economical sampling in a mixture problem. Robustness of the testing data is instilled by employing a two‐stage analysis in which the controlling components are investigated together while the slack variable is tested independently. The collapse of the multiple responses to a single master response is carried out according to the Super‐Ranking concept. Order statistics are employed to establish statistical significance. The influence of the slack variable is tested by regression and nonparametric correlation.

Findings

The synergy among Taguchi methodology, super‐ranking and nonparametric testing seamlessly offered a practical resolution of product‐component activeness. The concurrent modulation of two key product traits by five constituents in the industrial production of muffin cake is examined. The slack variable, rich cream, is strongly active, while the influence of the added amount of water is barely evident.

Research limitations/implications

The method presented is suitable only for situations where industrial mixtures are investigated. The case study demonstrates prediction capabilities up to quadratic effects for five nominated effects. However, the statistical processor selected here may be adapted to any number of factor settings dictated by the OA sampling plan.

Practical implications

By using a case study from food engineering, the industrial production of a muffin‐cake is examined focusing on a total of five controlling mixture components and two responses. This demonstration emphasizes the dramatic savings in time and effort that are gained by the proposed method due to reduction of experimental effort while gaining on analysis robustness.

Originality/value

This work interconnects Taguchi methodology with the powerful nonparametric Kruskal‐Wallis test for the difficult problem of non‐linear analysis of mixtures in saturated, unreplicated fractional factorial designs, searching for multi‐factor activeness in multi‐response cases with simple and practical tools.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 4
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 5 August 2014

George Besseris

Abstract

Purpose

The purpose of this paper is to propose a set of process capability indices (PCIs) which are based on robust and agile statistics such that they may be applicable irrespective of the process status.

Design/methodology/approach

The four popular PCIs – Cp, Cpk, Cpm and Cpmk – are reconstructed to improve location and dispersion predictions by introducing robust estimators such as the median and the interquartile range. The proposed PCIs are evaluated sequentially in partitioned regions where fluctuations are found to be non‐significant. The runs test, playing the role of a detector, marks the regions between two consecutive appearances of causes that disrupt data randomness. Wilcoxon's one-sample test is utilized to approximate each PCI's central tendency and its confidence interval across all formed partitions.
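A common way to robustify Cp and Cpk along these lines replaces the mean with the median and the standard deviation with IQR/1.349 (the interquartile range of a standard normal distribution). The paper's exact estimators and its runs-test partitioning are not reproduced; this is a minimal sketch on simulated, illustrative data:

```python
import numpy as np

def robust_cp(x, lsl, usl):
    """Cp with sigma replaced by IQR/1.349, a normal-consistent robust spread."""
    q1, q3 = np.percentile(x, [25, 75])
    sigma_r = (q3 - q1) / 1.349
    return (usl - lsl) / (6.0 * sigma_r)

def robust_cpk(x, lsl, usl):
    """Cpk with the mean replaced by the median and sigma by IQR/1.349."""
    q1, q3 = np.percentile(x, [25, 75])
    med = np.median(x)
    sigma_r = (q3 - q1) / 1.349
    return min(usl - med, med - lsl) / (3.0 * sigma_r)

# Illustrative in-control magnesium-content data; the specification limits
# and distribution parameters are made up, not the paper's case-study values.
rng = np.random.default_rng(7)
x = rng.normal(loc=4.5, scale=0.05, size=200)
cp = robust_cp(x, lsl=4.3, usl=4.7)
cpk = robust_cpk(x, lsl=4.3, usl=4.7)
```

Because the median and IQR are insensitive to a few aberrant observations, these indices degrade far more gracefully than their moment-based counterparts when the process data are contaminated.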

Findings

Cpmk depicted the most conservative view of the process status when tracking the magnesium content in a showcased aluminum-manufacturing paradigm. Cp and Cpk were benchmarked with controlled random data. It was found that the proposed set of robust PCIs is substantially less prone to false alarms in predicting non-conforming units than the regular PCIs.

Originality/value

The recommended method for estimating PCIs is purely distribution-free and thus deployable at any process maturity level. The approach defends vigorously against the influence of intruding sources of unknown and unknowable variability. The aim here is to protect the monitoring indicators from unforeseen data instability and breakdown, which are conspicuous in wreaking havoc on managerial decisions.

Article
Publication date: 6 March 2017

Daryl Powell, Sissel Lundeby, Lukas Chabada and Heidi Dreyer

Abstract

Purpose

The purpose of this paper is to investigate the application of Lean Six Sigma (LSS) in the continuous process industry, taking an insight into the food processing industry; and to evaluate the impact of LSS on environmental sustainability. The authors present observations and experiences from the application of LSS at a Norwegian dairy producer, with the aim of bringing out pertinent factors and useful insights that help us to understand how LSS can contribute toward greater environmental sustainability in this industry type, something that is so far lacking in the extant literature.

Design/methodology/approach

The authors adopt a single, longitudinal field-study approach, observing an entire cycle of the VSM-DMAIC (value stream mapping-define, measure, analyze, improve and control) LSS process, which evolved over a six-month period at the dairy.

Findings

The authors highlight some of the important elements that should be considered when using LSS as a contributor toward greater environmental sustainability in fresh-food supply chains. The authors also present some of the specific outcomes and key success criteria that became apparent to the implementation team following the deployment of the VSM-DMAIC approach.

Originality/value

The authors demonstrate how LSS can be applied in the food processing industry as a contributor to greater environmental sustainability. The authors also make useful reflections regarding the success criteria that can be used by researchers and practitioners for the effective deployment of such an approach, particularly in the continuous process industry.

Details

International Journal of Lean Six Sigma, vol. 8 no. 1
Type: Research Article
ISSN: 2040-4166

Article
Publication date: 29 July 2014

George Besseris

Abstract

Purpose

The purpose of this study is to provide a method for Lean Six Sigma (LSS) improvement projects that may aid LSS practitioners to plan and conduct robust and lean product/process optimization studies for complex and constrained products, such as those encountered in food industry operations.

Design/methodology/approach

The technique is to be used for replicated LSS product experimentation on multiple effects elicited on several product traits. The authors compress the replicated information, reducing each response to simpler, lean and robust median and range components. The desirability method is then utilized to optimize location and dispersion contributions concurrently.

Findings

The suggested method is demonstrated with a case study drawn from the area of food development where cocoa-cream filling for a large-scale croissant production operation undergoes a robust screening on two crucial characteristics – viscosity and water activity – that influence product and process performance as well as product safety.

Originality/value

The proposed method amalgamates concepts of fractional factorial designs for expedient experimentation along with robust multi-factorial inference methods easily integrated to the desirability function for determining significant process and product effects in a synchronous multi-characteristic improvement effort. The authors show that the technique is not hampered by ordinary limitations expected with mainstream solvers, such as MANOVA. The case study is unique because it brings in jointly lean, quality and safety aspects of an edible product. The showcased responses are unique because they influence both process and product behavior. Lean response optimization is demonstrated through the paradigm.

Details

International Journal of Lean Six Sigma, vol. 5 no. 3
Type: Research Article
ISSN: 2040-4166

Article
Publication date: 7 September 2010

George J. Besseris

Abstract

Purpose

Screening simultaneously for effects and their curvature may be useful in industrial environments when an economic restriction on experimentation is imposed. Saturated‐unreplicated fractional factorial designs have been a regular outlet for scheduling screening investigations under such circumstances. The purpose of this paper is to devise a practical test that may simultaneously quantify in statistical terms the possible existence of active factors in concert with an associated non‐linearity during screening.

Design/methodology/approach

The three‐level, nine‐run orthogonal design is utilized to compute a family of parameter‐free reference cumulative distributions by permuting ranked observations via a brute‐force method. The proposed technique is simple, practical and non‐graphical. It is based on the Kruskal‐Wallis test and involves a sum of effects through the squared rank‐sum inference statistic. This statistic is appropriately extended to fractional factorial composite contrasting while explicitly avoiding the effect‐sparsity assumption.
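The squared rank-sum statistic underlying the test is the Kruskal–Wallis H computed across the three levels of an orthogonal-array column. This sketch computes H for one hypothetical nine-run response; the paper replaces the usual chi-square reference with exact permutation distributions, which are not reproduced here:

```python
import numpy as np

def kruskal_wallis_H(groups):
    """Kruskal-Wallis H: a squared rank-sum statistic across factor levels."""
    pooled = np.concatenate(groups)
    n = len(pooled)
    r = np.empty(n)
    r[np.argsort(pooled, kind="stable")] = np.arange(1, n + 1)
    for v in np.unique(pooled):       # midranks for tied values
        m = pooled == v
        r[m] = r[m].mean()
    h, start = 0.0, 0
    for g in groups:                  # sum of (rank sum)^2 / group size
        k = len(g)
        h += r[start:start + k].sum() ** 2 / k
        start += k
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

# One three-level column of a nine-run design: three runs per level.
# The response values are hypothetical, chosen only to illustrate the statistic.
y = np.array([7.1, 7.4, 6.9, 8.2, 8.5, 8.0, 9.6, 9.9, 9.4])
groups = [y[0:3], y[3:6], y[6:9]]
H = kruskal_wallis_H(groups)
```

With fully separated levels, as here, H reaches its maximum of 7.2 for nine untied ranks in three equal groups; observed H values would be gauged against the brute-force reference distribution rather than a chi-square approximation.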

Findings

The method is shown to compete worthily with mainstream comparison methods and helps avert potential complications arising from the indiscriminate use of analysis of variance in very low-sampling schemes where subjective variance pooling is otherwise enforced.

Research limitations/implications

The true distributions obtained in this paper are suitable for sieving a fairly small amount of potential control factors while maintaining the non‐linearity question in the search.

Practical implications

The method is objective and is further elucidated by reworking two recent case studies which account for a total of five saturated screenings.

Originality/value

The statistical tables produced are easy to use and uphold the need to estimate mean and variance effects separately, which are otherwise rather difficult to pinpoint in the fast-track, low-volume trials for which this paper is intended.

Details

International Journal of Quality & Reliability Management, vol. 27 no. 8
Type: Research Article
ISSN: 0265-671X
