Search results
1 – 10 of 709
Abstract
Purpose
Screening simultaneously for effects and their curvature may be useful in industrial environments where an economic restriction is imposed on experimentation. Saturated, unreplicated fractional factorial designs have been a regular choice for scheduling screening investigations under such circumstances. The purpose of this paper is to devise a practical test that may simultaneously quantify, in statistical terms, the possible existence of active factors together with any associated non-linearity during screening.
Design/methodology/approach
The three-level, nine-run orthogonal design is utilized to compute a family of parameter-free reference cumulative distributions by permuting ranked observations via a brute-force method. The proposed technique is simple, practical and non-graphical. It is based on the Kruskal-Wallis test and involves summing effects through the squared rank-sum inference statistic. This statistic is extended appropriately to fractional-factorial composite contrasting while explicitly avoiding the effect-sparsity assumption.
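The reference-distribution idea described above can be sketched in a few lines. This is an illustrative reading only, not the authors' implementation: it uses a Monte Carlo subsample of the 9! rank permutations rather than full brute-force enumeration, and all variable names are assumptions.

```python
import random

def kw_h(groups):
    # Kruskal-Wallis H computed from pooled ranks 1..n split into level groups
    n = sum(len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * sum(sum(g) ** 2 / len(g) for g in groups) - 3 * (n + 1)

# In a three-level, nine-run orthogonal design each column splits the runs into
# three groups of three; permuting the ranks 1..9 over the runs yields a
# parameter-free null reference distribution for H.
random.seed(0)
ranks, null = list(range(1, 10)), []
for _ in range(20000):               # Monte Carlo stand-in for full enumeration
    random.shuffle(ranks)
    null.append(kw_h([ranks[0:3], ranks[3:6], ranks[6:9]]))

null.sort()
critical_95 = null[int(0.95 * len(null))]   # screening cut-off for an active column
```

A column whose observed H exceeds `critical_95` would be flagged as active under this sketch's conventions.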
Findings
The method is shown to be a worthy competitor to mainstream comparison methods, and it helps avert potential complications arising from the indiscriminate use of analysis of variance in very low sampling schemes where subjective variance pooling is otherwise enforced.
Research limitations/implications
The true distributions obtained in this paper are suitable for sieving a fairly small number of potential control factors while keeping the non-linearity question in the search.
Practical implications
The method is objective and is further elucidated by reworking two recent case studies which account for a total of five saturated screenings.
Originality/value
The statistical tables produced are easy to use and uphold the need for estimating mean and variance effects separately, effects which are rather difficult to pinpoint in the fast-track, low-volume trials for which this paper is intended.
George Besseris and Panagiotis Tsarouhas
Abstract
Purpose
The study aims to provide a quick-and-robust multifactorial screening technique for early detection of statistically significant effects that could influence a product's life-time performance.
Design/methodology/approach
The proposed method takes advantage of saturated fractional factorial designs for organizing the lifetime dataset collection process. Small censored lifetime data are fitted to the Kaplan–Meier model. Low-percentile lifetime behavior that is derived from the fitted model is used to screen for strong effects. A robust surrogate profiler is employed to furnish the predictions.
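The product-limit fit and the low-percentile read-out can be sketched as follows. This is a pure-Python illustration under stated assumptions, not the authors' implementation; the percentile convention and function names are hypothetical.

```python
def kaplan_meier(times, events):
    # events[i] = 1 for an observed failure, 0 for a right-censored unit
    data = sorted(zip(times, events))
    s, steps = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for tt, e in data if tt == t and e == 1)   # failures at t
        n = sum(1 for tt, _ in data if tt >= t)              # units still at risk
        if d:
            s *= 1.0 - d / n
            steps.append((t, s))
    return steps

def percentile_life(steps, p):
    # smallest time at which estimated survival falls to 1 - p;
    # p = 0.10 gives the first-decile life used for screening
    for t, s in steps:
        if s <= 1.0 - p + 1e-12:
            return t
    return None
```

Comparing `percentile_life` across the runs of a saturated design is one way to rank candidate effects by their low-percentile lifetime impact.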
Findings
The methodology is tested on a difficult published case study that involves the eleven-factor screening of an industrial-grade thermostat. The tested thermostat units are use-rate accelerated to expedite the information collection process. The solution provided by this new method suggests as many as two active effects at the first decile of the data, which improves on the solution provided by more classical methods.
Research limitations/implications
To benchmark the predicted solution with other competing approaches, the results showcase the critical first decile part of the dataset. Moreover, prediction capability is demonstrated for the use-rate acceleration condition.
Practical implications
The technique might be applicable to projects where the early reliability improvement is studied for complex industrial products.
Originality/value
The proposed methodology offers a range of features that aim to make the product reliability profiling process faster and more robust while managing to be less susceptible to assumptions often encountered in classical multi-parameter treatments.
Mohammad Reza Fathi, Seyed Mohammad Sobhani, Mohammad Hasan Maleki and Gholamreza Jandaghi
Abstract
Purpose
This study aims to formulate exploratory scenarios of the textile industry in Iran based on MICMAC and soft operational research methods.
Design/methodology/approach
In this study, to formulate plausible scenarios, literature reviews and external experts’ opinions in this field were gathered through the Delphi approach and uncertainty questionnaires. After identifying the most important uncertainties, the textile industry’s plausible scenarios were mapped with the help of experts through co-thinking workshops. Results show that two factors, the business atmosphere and membership in the World Trade Organization (WTO), play a more important role than the others. These two factors were considered for the formulation of the scenarios. To formulate plausible scenarios, soft systems methodology, a kind of soft operational research method, is applied.
Findings
Based on the results, four scenarios are presented: the Elysium, Hades, Tartarus and Sisyphus scenarios. In the Elysium scenario, the business atmosphere has improved and Iran has been granted membership of the WTO. In the Hades scenario, Iran has joined the WTO, but owing to the weakness and inactivity of the government and key decision-makers, the required preparations have not been made. In the Tartarus scenario, Iran is not a WTO member and the business atmosphere is disastrous. In the Sisyphus scenario, the government takes reasonable actions toward a better business environment.
Originality/value
Formulating plausible scenarios of the textile industry is an excellent contribution to the key beneficiaries and actors of this industry so they can present flexible preparation-based programs in the face of circumstances. Future study of the textile industry familiarizes the actors and beneficiaries of this industry with the procedures and the driving forces that influence this industry’s future and it will ascertain various scenarios for the actors of this field.
Abstract
Purpose
The aim of this paper is to examine product formulation screening at the industrial level in terms of multi‐trait improvement by considering several pertinent controlling factors.
Design/methodology/approach
The study adopts Taguchi’s orthogonal arrays (OAs) for sufficient and economical sampling in a mixture problem. Robustness of the testing data is instilled by a two-stage analysis in which the controlling components are investigated together while the slack variable is tested independently. The multiple responses are collapsed to a single master response according to the Super Ranking concept. Order statistics are employed to provide statistical significance. The influence of the slack variable is tested by regression and nonparametric correlation.
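The collapse of several responses into one master response, followed by a per-column rank test, can be sketched as below. This is an illustrative reading of the Super Ranking idea, not the paper's exact processor; tie handling is simplified and all names are assumptions.

```python
def rank(values, larger_is_better):
    # rank runs 1..n with the best observation first (ties not averaged here)
    order = sorted(range(len(values)), key=lambda i: values[i],
                   reverse=larger_is_better)
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

def super_rank(responses, directions):
    # sum each run's per-response ranks into one master (super-ranked) response
    per_resp = [rank(v, d) for v, d in zip(responses, directions)]
    return [sum(r[i] for r in per_resp) for i in range(len(responses[0]))]

def kw_for_column(master, levels):
    # re-rank the master response, group by a factor column's levels,
    # and return the Kruskal-Wallis H statistic for that column
    r = rank(master, larger_is_better=False)
    groups = {}
    for ri, lv in zip(r, levels):
        groups.setdefault(lv, []).append(ri)
    n = len(master)
    return (12.0 / (n * (n + 1))
            * sum(sum(g) ** 2 / len(g) for g in groups.values())
            - 3 * (n + 1))
```

Columns with large H relative to a small-sample reference distribution would be declared active mixture components under this sketch.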
Findings
The synergy among the Taguchi methodology, Super Ranking and nonparametric testing seamlessly offers a practical resolution of product-component activeness. The concurrent modulation of two key product traits by five constituents in the industrial production of a muffin cake is examined. The slack variable, rich cream, is strongly active, while the influence of the added amount of water is barely evident.
Research limitations/implications
The method presented is suitable only for situations where industrial mixtures are investigated. The case study demonstrates prediction capabilities up to quadratic effects for five nominated effects. However, the statistical processor selected here may be adapted to any number of factor settings dictated by the OA sampling plan.
Practical implications
By using a case study from food engineering, the industrial production of a muffin‐cake is examined focusing on a total of five controlling mixture components and two responses. This demonstration emphasizes the dramatic savings in time and effort that are gained by the proposed method due to reduction of experimental effort while gaining on analysis robustness.
Originality/value
This work interconnects Taguchi methodology with powerful nonparametric tests of Kruskal‐Wallis for the difficult problem of non‐linear analysis of mixtures for saturated, unreplicated fractional factorial designs in search of multi‐factor activeness in multi‐response cases employing simple and practical tools.
Abstract
Purpose
The purpose of this paper is to propose a methodology that may aid in assessing information technology (IT) quality characteristic optimisation through the use of simple and robust tools with minimal effort.
Design/methodology/approach
Non‐linear saturated fractional factorial designs proposed by Taguchi receive robust data processing by the efficient nonparametric test of Jonckheere and Terpstra.
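The Jonckheere–Terpstra statistic itself is simple to compute for groups listed in the hypothesized level order. The following is a minimal illustrative sketch, not the paper's implementation:

```python
def jonckheere_terpstra(groups):
    # J = sum, over every ordered pair of groups, of Mann-Whitney counts:
    # a pair (x, y) with x from an earlier group and y from a later group
    # contributes 1 when y > x and 1/2 when y == x
    j = 0.0
    for a in range(len(groups)):
        for b in range(a + 1, len(groups)):
            for x in groups[a]:
                for y in groups[b]:
                    j += 1.0 if y > x else (0.5 if y == x else 0.0)
    return j
```

A value of J well above its null mean (half the total number of cross-group pairs) supports the ordered alternative that the response increases across the factor levels.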
Findings
The paper finds that e-mail quality improvement is achieved by collecting data through an unreplicated, saturated L9(3^4) design. Active influences are attributed to the e-mail volume and the receiving hardware type.
Research limitations/implications
The overall efficiency of the method is greatly enhanced due to incorporation of a nonparametric analysis tool that is known to perform effectively when data availability is minimal. The method does not succumb to normality and multi‐distributional effects which may easily handicap the decision‐making process when derived from other mainstream methods.
Practical implications
This work has obvious professional and pedagogical aspects, aiming to help IT quality practitioners implement robust techniques while suppressing quality costs. It is noteworthy that nonparametric data processing improves the ability to make predictions over Taguchi's regular Design of Experiments (DOE) formulation under small-sampling conditions.
Originality/value
This method pairs the design efficiency of non-linear orthogonal arrays with multi-level order statistics, providing the means to deal with quality optimisation in complex environments such as the IT area. The value of this work may be appreciated best by quality managers and engineers engaged in routine quality improvement projects in the area of information systems, and it also augments the general database of quality-related testing cases.
Zongwu Cai, Jingping Gu and Qi Li
Abstract
There is a growing literature in nonparametric econometrics in the recent two decades. Given the space limitation, it is impossible to survey all the important recent developments in nonparametric econometrics. Therefore, we choose to limit our focus on the following areas. In Section 2, we review the recent developments of nonparametric estimation and testing of regression functions with mixed discrete and continuous covariates. We discuss nonparametric estimation and testing of econometric models for nonstationary data in Section 3. Section 4 is devoted to surveying the literature of nonparametric instrumental variable (IV) models. We review nonparametric estimation of quantile regression models in Section 5. In Sections 2–5, we also point out some open research problems, which might be useful for graduate students to review the important research papers in this field and to search for their own research interests, particularly dissertation topics for doctoral students. Finally, in Section 6 we highlight some important research areas that are not covered in this paper due to space limitation. We plan to write a separate survey paper to discuss some of the omitted topics.
Bassem M. Hijazi and James A. Conover
Abstract
We examine the empirical relationship between direct equity agency costs measures and corporate governance control mechanisms to control equity agency costs. We measure the three direct agency cost proxies commonly used in the literature: the operating expense; asset turnover; and selling, general, and administrative (SGA) ratios. Internal corporate governance control mechanisms examined are inside ownership (IO), outside ownership concentration (OC), the size of the board of directors (BODs), and the composition of the BODs (proportion of nonexecutive (NE) directors and separation of chief executive officer (CEO) and board chair). The external corporate governance control mechanism examined is the size of bank debt (short-term debt). Univariate and multivariate tests reveal that the only statistically significant relationship between corporate governance control mechanisms and direct equity agency cost measures is the negative relationship between the proportion of IO and direct agency costs. The asset utilization ratio (asset turnover) ratio is the best proxy for direct equity agency costs and can be useful for event studies of announcement period excess returns.
Abstract
Burn‐in is an engineering method extensively used to screen out infant mortality failure defects. Previous studies have attempted to determine the optimum burn‐in time and cost for a device or a system. However, for the mathematical model, many assumptions are inappropriate due to practical concerns, and for the cost model, the required costs are difficult to find. How to effectively determine the optimal burn‐in time and cost has perplexed manufacturers for quite some time. In the actual manufacturing process, a new electronic product is always extended from an old product, called the base product. By adopting the relationship between new product and base product, this study presents a neural network‐based approach to determine the optimal burn‐in time and cost without any assumptions. A case study of the production of a switch mode rectifier demonstrates the effectiveness of the proposed approach.
Abstract
This chapter introduces a conceptual framework which links consumers' demographic characteristics with their attitudes toward major shopping area attributes (the push/pull factors), as well as their motivations toward cross-border shopping. It is built on the extant literature of outshopping, cross-border shopping, and consumer switching behavior. It has been tested with data collected from 485 Hong Kong residents. A nonparametric approach will be used to analyze the data. Findings of this study show that “age” and “education” characteristics are good indicators for most of the macrofactors (shopping area attributes). As for microfactors (motivational factors), “age” and “gender” are the best indicators. Results of this study also confirm previous findings that demographic characteristics of consumers affect their cross-border shopping behavior. Low prices on products and good services are the most important pull-factor attracting cross-border shopping. It further reveals that a higher percentage of cross-border shoppers are from lower income families, having only secondary education level, and in the age category of 30–49. Implications for retailers, governments, and tourism-related institutions are discussed.
Abstract
Purpose
The aim of this paper is to circumvent the multi‐distribution effects and small sample constraints that may arise in unreplicated‐saturated fractional factorial designs during construction blueprint screening.
Design/methodology/approach
A simple additive ranking scheme is devised based on converting the responses of interest to rank variables regardless of the nature of each response and the optimization direction that may be issued for each of them. Collapsing all ranked responses to a single rank response, appropriately referred to as “Super‐Ranking”, allows simultaneous optimization for all factor settings considered.
Research limitations/implications
The Super‐Rank response is treated by Wilcoxon's rank sum test or Mann‐Whitney's test, aiming to establish possible factor‐setting differences by exploring their statistical significance. An optimal value for each response is predicted.
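For the very small samples of a saturated screen, the Wilcoxon/Mann-Whitney comparison between the two levels of a factor column can be carried out exactly by enumeration. The sketch below is illustrative only, under the stated small-n assumption; the function names and the two-sided convention are assumptions, not the paper's code.

```python
from itertools import combinations

def mann_whitney_u(x, y):
    # U = number of (x_i, y_j) pairs with x_i < y_j, ties counted as 1/2
    return sum(1.0 if xi < yj else (0.5 if xi == yj else 0.0)
               for xi in x for yj in y)

def exact_two_sided_p(x, y):
    # enumerate every relabelling of the pooled super-ranked observations;
    # feasible because a saturated screen pools only a handful of runs
    pooled = list(x) + list(y)
    n1, mu = len(x), len(x) * len(y) / 2.0
    dev_obs = abs(mann_whitney_u(x, y) - mu)
    hits = total = 0
    for idx in combinations(range(len(pooled)), n1):
        chosen = set(idx)
        xs = [pooled[i] for i in chosen]
        ys = [pooled[i] for i in range(len(pooled)) if i not in chosen]
        total += 1
        if abs(mann_whitney_u(xs, ys) - mu) >= dev_obs - 1e-12:
            hits += 1
    return hits / total
```

A small exact p-value for a column would flag a genuine factor-setting difference on the Super-Rank response without any distributional assumption.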
Practical implications
It is stressed, by example, that the model may handle any number of quality characteristics simultaneously. A case study based on a real geotechnical engineering project illustrates how this method may be applied to optimize simultaneously three quality characteristics, one from each of the three possible cases, i.e. “nominal-is-best”, “larger-is-better” and “smaller-is-better”. To this end, a screening set of experiments is performed on a professional CAD/CAE software package, making use of an L8(2^7) orthogonal array in which all seven factor columns are saturated by group excavation controls.
Originality/value
The statistical nature of this method is discussed in comparison with results produced by the desirability method for the case of exhausted degrees of freedom for the error. The case study itself is a unique paradigm from the area of construction operations management.