Search results

1 – 10 of over 116,000
Article
Publication date: 1 April 1993

James H. Thompson and Bart H. Ward


Abstract

Discusses alternative strategies which may be employed when differences arise between achieved audit‐sampling results and planned results, which means that risk levels used in ex post decision making may be different from planned levels. Contrasts a conventional strategy — which is to fix the risk of incorrect acceptance at a planned level and to ignore the risk of incorrect rejection or to accept the minimum available level of that risk which is consistent, after the fact, with the planned level of risk of incorrect acceptance — with a theoretically appealing strategy which balances both risk levels in proportion to their perceived disutility. Reports on the results of an experiment involving these two strategies, in which all subjects were auditors with statistical audit experience. Suggests that the most important statistically significant finding is that, in certain circumstances, these auditors are more willing to base audit decisions on statistical evidence after the alternative strategy is explained and available for their use.

Details

Managerial Auditing Journal, vol. 8 no. 4
Type: Research Article
ISSN: 0268-6902

Keywords

Article
Publication date: 1 March 1991

James H. Thompson and Bart H. Ward


Abstract

Traditional approaches to risk control used for planning, executing, and evaluating substantive audit tests focus primarily on the risk of accepting a materially misstated amount (β risk) and consider only passively the risk of rejecting a correctly stated amount (α risk). This article discusses an alternative – the trade‐off approach – that formally considers both risks. Although the traditional and trade‐off approaches frequently lead to the same statistical conclusion, there are some applications in which only the trade‐off approach can provide a statistical conclusion in support of the auditor's assertion that a client's balance is correctly stated. The article identifies these applications and suggests that the trade‐off approach merits further consideration.
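The α/β asymmetry this abstract describes can be made concrete with a small sketch. The following is an illustrative normal-approximation calculation, not the authors' model: the sample sizes, standard deviation and materiality figure are invented for the example, and a one-sided z-test on the mean audit difference stands in for whatever sampling design an auditor would actually use.

```python
from math import sqrt
from statistics import NormalDist

def beta_risk(n, sigma, material_misstatement, z_alpha=1.645):
    """β risk: probability of accepting a balance that is in fact
    misstated by the material amount, for a one-sided z-test on the
    mean audit difference with known sigma and fixed α ≈ 5%.
    """
    se = sigma / sqrt(n)
    critical_value = z_alpha * se  # accept when mean difference falls below this
    # β = P(sample mean <= critical value | true mean = material misstatement)
    return NormalDist().cdf((critical_value - material_misstatement) / se)

# Fixing α and growing the sample shrinks β — the risk the
# traditional strategy treats only passively:
for n in (25, 100, 400):
    print(n, round(beta_risk(n, sigma=50.0, material_misstatement=20.0), 3))
```

With α pinned at its planned level, β is whatever the sample size leaves over; the trade-off strategy the article contrasts would instead choose the acceptance boundary to balance both risks in proportion to their perceived disutility.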

Details

Managerial Auditing Journal, vol. 6 no. 3
Type: Research Article
ISSN: 0268-6902

Keywords

Article
Publication date: 1 October 2001

Freddy Romm


Abstract

For numerical treatment of resin‐containing systems and forecasting of their properties, certain models of branching are needed. In this review, existing theoretical models of systems containing branched structures (polymers, aggregates, etc.) are analyzed and compared. The criteria for selecting the optimal theoretical model comprise the chemical and physical problems available for solution, the simplicity of such solutions, the agreement between theoretically forecast and experimental results, and the computing time required. It is concluded that, according to these criteria, the statistical polymer method is optimal among the existing models.

Details

Pigment & Resin Technology, vol. 30 no. 5
Type: Research Article
ISSN: 0369-9420

Keywords

Article
Publication date: 4 September 2017

Sagar Sikder, Subhash Chandra Panja and Indrajit Mukherjee


Abstract

Purpose

The purpose of this paper is to develop a new easy-to-implement distribution-free integrated multivariate statistical process control (MSPC) approach with an ability to recognize out-of-control points, identify the key influential variable for the out-of-control state, and determine necessary changes to achieve the state of statistical control.

Design/methodology/approach

The proposed approach integrates the control chart technique, the Mahalanobis-Taguchi System concept, the Andrews function plot, and nonlinear optimization for multivariate process control. Mahalanobis distance, Taguchi's orthogonal array, and the main effect plot concept are used to identify the key influential variable responsible for the out-of-control situation. The Andrews function plot and nonlinear optimization help identify the direction and the correction necessary to regain the state of statistical control. Finally, two real-life case studies illustrate the suitability of the approach.
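The detection step at the heart of this design — flagging multivariate out-of-control points by their Mahalanobis distance from the in-control cloud — can be sketched in a few lines. This is a minimal two-variable illustration with invented data; the paper's full machinery (Taguchi orthogonal arrays, Andrews plots, nonlinear optimization) is not reproduced here.

```python
def mahalanobis_sq(points):
    """Squared Mahalanobis distance of each 2-D point from the sample
    mean, using the sample covariance matrix (2x2, inverted by hand)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    det = sxx * syy - sxy ** 2
    ixx, iyy, ixy = syy / det, sxx / det, -sxy / det  # inverse covariance
    out = []
    for x, y in points:
        dx, dy = x - mx, y - my
        out.append(ixx * dx * dx + 2 * ixy * dx * dy + iyy * dy * dy)
    return out

# In-control cloud of two quality variables, plus one deliberately shifted point:
data = [(10.0, 5.0), (10.2, 5.1), (9.9, 4.9), (10.1, 5.2),
        (9.8, 5.0), (10.0, 4.8), (13.0, 7.5)]
d2 = mahalanobis_sq(data)
flagged = [i for i, v in enumerate(d2) if v == max(d2)]
print(flagged)  # → [6], the shifted point
```

In practice the distances would be compared against a control limit rather than a simple maximum, and the diagnosis step would then ask which variable drives the flagged distance — the role the paper assigns to the orthogonal array and main effect plot.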

Findings

The case studies illustrate the potential of the proposed integrated multivariate process control approach for easy implementation in varied manufacturing and process industries. They also reveal that the multivariate out-of-control state is primarily driven by a single influential variable.

Research limitations/implications

The approach is limited to situations in which a single influential variable contributes to the out-of-control state. The number and type of cases used are also limited, so the generalizability of the findings remains open to debate. Further research with varied case situations is necessary to refine the approach and establish its wider applicability.

Practical implications

The proposed approach does not require the multivariate normality assumption and thus offers greater flexibility to industry practitioners. It is also easy to implement and requires minimal programming effort; a simple application such as Microsoft Excel is suitable for online implementation.

Originality/value

The key steps of the MSPC approach are identifying the out-of-control point, diagnosing it, identifying the "influential" variable responsible for the out-of-control state, and determining the direction and amount of adjustment required to regain the state of control. Most approaches reported in the open literature stop at identifying the influential variable and rely on many restrictive assumptions. This paper addresses all key steps in a single integrated distribution-free approach that is easy to implement in real time.

Details

International Journal of Quality & Reliability Management, vol. 34 no. 8
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 17 December 2021

Krzysztof Dmytrów and Wojciech Kuźmiński


Abstract

Purpose

Our research aims to design a hybrid approach to calibrating an attribute impact vector, so as to guarantee its completeness in cases where other approaches cannot ensure this.

Design/methodology/approach

Real estate mass appraisal aims to value a large number of properties by means of a specialised algorithm, and various methods can be applied for this purpose. We present the Szczecin Algorithm of Real Estate Mass Appraisal (SAREMA) and four methods of calibrating an attribute impact vector. Finally, we demonstrate its application on a sample of 318 residential properties in Szczecin, Poland.

Findings

We compare the appraisals obtained with the hybrid approach against those obtained with the three remaining approaches. If the database is complete and reliable, the econometric and statistical approaches can be recommended because they are based on quantitative measures of the relationships between attribute values and properties' unit values. When the database is incomplete, however, the expert and, subsequently, hybrid approaches serve as supplementary ones.

Originality/value

The application of the hybrid approach ensures that the calibration system of an attribute impact vector is always complete, because it incorporates the expert approach, which can be used even when the database precludes approaches based on quantitative measures of the relationship between unit real estate values and attribute values.

Article
Publication date: 27 July 2012

Anupam Das, J. Maiti and R.N. Banerjee


Abstract

Purpose

Monitoring of a process, leading to the detection of faults and determination of their root causes, is essential for producing consistently good-quality end products with improved yield. The history of process monitoring fault detection (PMFD) strategies can be traced back to the 1930s. Since then, various tools, techniques and approaches have been developed and applied in diversified fields. The purpose of this paper is to categorize, describe and compare the various PMFD strategies.

Design/methodology/approach

A taxonomy was developed to categorize PMFD strategies, based on the type of techniques employed in devising them. The PMFD strategies are then discussed in detail, with emphasis on their areas of application, and comparatively evaluated on a set of commonly identified issues. A general framework common to all PMFD strategies is presented, and, lastly, the future scope of research is discussed.

Findings

The techniques employed for PMFD are primarily of three types: data-driven techniques, such as statistical-model-based and artificial-intelligence-based techniques; a priori knowledge-based techniques; and hybrid models, with a huge dominance of the first type. The factors that should be considered in developing a PMFD strategy are ease of development, diagnostic ability, fault detection speed, robustness to noise, generalization capability, and handling of nonlinearity. The review reveals that no single strategy can address all aspects of process monitoring and fault detection efficiently, and that techniques from various PMFD strategies need to be meshed to devise a more efficient one.

Research limitations/implications

The review documents the existing strategies for PMFD with an emphasis on finding out the nature of the strategies, data requirements, model building steps, applicability and scope for amalgamation. The review helps future researchers and practitioners to choose appropriate techniques for PMFD studies for a given situation. Further, future researchers will get a comprehensive but precise report on PMFD strategies available in the literature to date.

Originality/value

The review begins by identifying key indicators of PMFD and proposing a taxonomy. An analysis of the pattern of published PMFD articles is then conducted, followed by a discussion of the evolution of PMFD strategies. Finally, a general framework for PMFD strategies is given for future researchers and practitioners.

Details

International Journal of Quality & Reliability Management, vol. 29 no. 7
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 1 March 1990

T.N. Goh


Abstract

The key to increasing productivity in the manufacturing sector does not lie solely in high technology, but also in an environment in which personnel at all levels are operating with rational decisions and actions based on organised information. Inasmuch as uncertainties and changes are the features of all real‐world situations, appropriate means of measurement, analysis, prediction and control are indispensable; thus statistical tools have become a prerequisite to any productivity drive, those of statistical quality control being the most typical and best known. Some aspects of attaining proficiency in the statistical approach in an organisation are discussed from a management perspective.

Details

International Journal of Quality & Reliability Management, vol. 7 no. 3
Type: Research Article
ISSN: 0265-671X

Keywords

Book part
Publication date: 1 January 2008

Arnold Zellner


Abstract

After briefly reviewing the past history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are presented that relate to the future of Bayesian econometrics.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Article
Publication date: 17 March 2023

Stewart Jones


Abstract

Purpose

This study updates the literature review of Jones (1987) published in this journal. The study pays particular attention to two important themes that have shaped the field over the past 35 years: (1) the development of a range of innovative new statistical learning methods, particularly advanced machine learning methods such as stochastic gradient boosting, adaptive boosting, random forests and deep learning, and (2) the emergence of a wide variety of bankruptcy predictor variables extending beyond traditional financial ratios, including market-based variables, earnings management proxies, auditor going concern opinions (GCOs) and corporate governance attributes. Several directions for future research are discussed.

Design/methodology/approach

This study provides a systematic review of the corporate failure literature over the past 35 years, with a particular focus on the emergence of new statistical learning methodologies and predictor variables. This synthesis evaluates the strengths and limitations of different modelling approaches under different circumstances and provides an overall evaluation of the relative contribution of alternative predictor variables. The study aims to provide a transparent, reproducible and interpretable review of the literature, taking a theme-centric rather than author-centric approach and focusing on the structured themes that have dominated the literature since 1987.

Findings

There are several major findings. First, advanced machine learning methods appear to hold the most promise for future firm failure research: not only do they predict significantly better than conventional models, but they also possess many appealing statistical properties. Second, a much wider range of variables is now being used to model and predict firm failure, although the literature must be interpreted with some caution given the many mixed findings. Finally, a number of unresolved methodological issues arising from the Jones (1987) study still require research attention.

Originality/value

The study explains the connections and derivations between a wide range of firm failure models, from simpler linear models to advanced machine learning methods such as gradient boosting, random forests, adaptive boosting and deep learning. The paper highlights the most promising models for future research, particularly in terms of their predictive power, underlying statistical properties and issues of practical implementation. The study also draws together an extensive literature on alternative predictor variables and provides insights into the role and behaviour of alternative predictor variables in firm failure research.

Details

Journal of Accounting Literature, vol. 45 no. 2
Type: Research Article
ISSN: 0737-4607

Keywords

Article
Publication date: 1 January 1992

Dean E. Headley and Bob Choi


Abstract

Postulates that the use of some key ideas from statistical control thinking can improve service quality. Explores the identification and analysis of gaps in perceptual differences between service customers and service providers as a way of adopting a statistical control philosophy in a service environment. Argues that such a method provides excellent information for creating a true customer‐centred approach to service delivery, being practical, simple in operation and useful for both immediate and long‐term strategic impact.

Details

Journal of Services Marketing, vol. 6 no. 1
Type: Research Article
ISSN: 0887-6045

Keywords
