Search results 1–10 of over 93,000
Discusses alternative strategies which may be employed when differences arise between achieved audit‐sampling results and planned results, which means that risk levels used in ex post decision making may be different from planned levels. Contrasts a conventional strategy — which is to fix the risk of incorrect acceptance at a planned level and to ignore the risk of incorrect rejection or to accept the minimum available level of that risk which is consistent, after the fact, with the planned level of risk of incorrect acceptance — with a theoretically appealing strategy which balances both risk levels in proportion to their perceived disutility. Reports on the results of an experiment involving these two strategies, in which all subjects were auditors with statistical audit experience. Suggests that the most important statistically significant finding is that, in certain circumstances, these auditors are more willing to base audit decisions on statistical evidence after the alternative strategy is explained and available for their use.
Traditional approaches to risk control used for planning, executing, and evaluating substantive audit tests focus primarily on the risk of accepting a materially misstated amount (β risk) and only passively consider the risk of rejecting a correctly stated amount (α risk). This article discusses an alternative – the trade‐off approach – that formally considers both risks. Although the traditional and the trade‐off approaches frequently lead to the same statistical conclusion, there are some applications in which only the trade‐off approach can provide a statistical conclusion in support of the auditor's assertion that a client's balance is correctly stated. The article identifies these applications and suggests that the trade‐off approach merits further consideration.
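As a rough illustration of how both risks enter a sampling plan, the classical normal-approximation sample-size formula for a substantive test sets n so that chosen α and β levels are met simultaneously. This is a generic textbook sketch, not the article's method; the figures (per-item standard deviation, tolerable and expected misstatement) are invented for the example:

```python
from math import ceil
from statistics import NormalDist

def sample_size(alpha, beta, sd, tolerable, expected=0.0):
    """Sample size for a mean-per-unit substantive test that controls
    both the risk of incorrect rejection (alpha) and the risk of
    incorrect acceptance (beta), via the normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # one-sided critical values
    z_beta = NormalDist().inv_cdf(1 - beta)
    return ceil(((z_alpha + z_beta) * sd / (tolerable - expected)) ** 2)

# Hypothetical population: per-item sd of 100, tolerable misstatement
# of 50 per item, no expected misstatement, both risks held at 5%.
n = sample_size(alpha=0.05, beta=0.05, sd=100, tolerable=50)
print(n)  # 44
```

Tightening either risk level increases n; the traditional strategy described above effectively lets α float rather than entering it into the formula.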
For numerical treatment of resin‐containing systems and forecasting of their properties, certain models of branching are needed. In this review, existing theoretical models of systems containing branched structures (polymers, aggregates, etc.) are analyzed and compared. The criteria for selecting the optimal theoretical model comprise the range of chemical and physical problems that can be solved, the simplicity of such solutions, the agreement between theoretically forecast and experimental results, and the computing time required. It is concluded that, by these criteria, the statistical polymer method is optimal among the existing models.
The purpose of this paper is to develop a new easy-to-implement distribution-free integrated multivariate statistical process control (MSPC) approach with an ability to recognize out-of-control points, identify the key influential variable for the out-of-control state, and determine necessary changes to achieve the state of statistical control.
The proposed approach integrates the control chart technique, the Mahalanobis-Taguchi System concept, the Andrews function plot, and nonlinear optimization for multivariate process control. Mahalanobis distance, Taguchi’s orthogonal array, and the main effect plot concept are used to identify the key influential variable responsible for the out-of-control situation. The Andrews function plot and nonlinear optimization help to identify direction and necessary correction to regain the state of statistical control. Finally, two different real life case studies illustrate the suitability of the approach.
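The first stage of such an approach, flagging out-of-control points by their Mahalanobis distance from an in-control reference sample, can be sketched as follows. This is a minimal illustration of the general Mahalanobis-distance idea only, not the paper's full integrated method; the reference data and threshold rule are invented for the example:

```python
import numpy as np

def mahalanobis_monitor(reference, threshold_quantile=0.99):
    """Build a simple multivariate monitor: distances of new points are
    compared against a high quantile of the reference distances."""
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

    def md(x):
        d = x - mu
        return float(np.sqrt(d @ cov_inv @ d))

    threshold = np.quantile([md(x) for x in reference], threshold_quantile)
    return md, threshold

# Invented in-control reference sample: two correlated quality variables.
rng = np.random.default_rng(0)
ref = rng.multivariate_normal([10.0, 5.0], [[1.0, 0.6], [0.6, 1.0]], size=200)

md, limit = mahalanobis_monitor(ref)
print(md(np.array([10.2, 5.1])) <= limit)  # near the centre: in control
print(md(np.array([16.0, -1.0])) > limit)  # far off: out of control
```

The paper's remaining steps (diagnosing which variable drove the signal and computing the corrective adjustment) build on these distances.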
The case studies illustrate the potential of the proposed integrated multivariate process control approach for easy implementation in varied manufacturing and process industries. In addition, the case studies also reveal that the multivariate out-of-control state is driven primarily by a single influential variable.
The approach is limited to situations in which a single influential variable contributes to the out-of-control state. The number and type of cases used are also limited, so the findings cannot yet be generalized. Further research with varied case situations is necessary to refine the approach and establish its wider applicability.
The proposed approach does not require the multivariate normality assumption and thus provides greater flexibility for industry practitioners. The approach is also easy to implement and requires minimal programming effort. A simple Microsoft Excel application is suitable for online implementation of this approach.
The key steps of the MSPC approach are identifying the out-of-control point, diagnosing the out-of-control point, identifying the “influential” variable responsible for the out-of-control state, and determining the necessary direction and amount of adjustment required to achieve the state of control. Most of the approaches reported in the open literature stop at identifying the influential variable, and do so under many restrictive assumptions. This paper addresses all key steps in a single integrated distribution-free approach, which is easy to implement in real time.
Monitoring of a process, leading to the detection of faults and determination of their root causes, is essential for the production of consistently good-quality end products with improved yield. The history of process monitoring and fault detection (PMFD) strategies can be traced back to the 1930s. Thereafter, various tools, techniques and approaches were developed, along with their application in diversified fields. The purpose of this paper is to review, categorize, describe and compare the various PMFD strategies.
A taxonomy was developed to categorize PMFD strategies, based on the type of techniques employed in devising them. The PMFD strategies are then discussed in detail, with emphasis on their areas of application. Comparative evaluations of the PMFD strategies, based on commonly identified issues, are also carried out, and a general framework common to all PMFD strategies is presented. Lastly, the future scope of research is discussed.
The techniques employed for PMFD are primarily of three types: data-driven techniques, such as statistical-model-based and artificial-intelligence-based techniques; prior-knowledge-based techniques; and hybrid models, with a huge dominance of the first type. The factors that should be considered in developing a PMFD strategy are ease of development, diagnostic ability, fault-detection speed, robustness to noise, generalization capability, and handling of nonlinearity. The review reveals that no single strategy can efficiently address all aspects of process monitoring and fault detection, and that techniques from the various PMFD strategies need to be combined to devise a more efficient one.
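A common instance of the data-driven, statistical-model-based family is PCA-based fault detection, where a new observation is checked against the squared prediction error (SPE) of a principal-component model fitted to in-control data. The sketch below is a generic textbook illustration with invented data, not a method from any specific paper in the review:

```python
import numpy as np

def fit_pca_monitor(X, n_components):
    """Fit a PCA model on in-control data X (rows = observations) and
    return a function giving the squared prediction error (SPE) of a
    new observation in the residual subspace."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    P = vt[:n_components].T                    # retained loadings

    def spe(x):
        r = (x - mu) - P @ (P.T @ (x - mu))    # residual after projection
        return float(r @ r)

    return spe

# Invented in-control data: two variables moving together along (1, 1).
rng = np.random.default_rng(1)
t = rng.normal(size=300)
X = np.column_stack([t, t]) + 0.05 * rng.normal(size=(300, 2))

spe = fit_pca_monitor(X, n_components=1)
limit = np.quantile([spe(x) for x in X], 0.99)
print(spe(np.array([1.0, 1.0])) <= limit)   # consistent with the model
print(spe(np.array([2.0, -2.0])) > limit)   # breaks the correlation: fault
```

A fault that respects the variable correlations would instead be caught by a Hotelling T² statistic on the retained scores; practical monitors use both.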
The review documents the existing strategies for PMFD with an emphasis on finding out the nature of the strategies, data requirements, model building steps, applicability and scope for amalgamation. The review helps future researchers and practitioners to choose appropriate techniques for PMFD studies for a given situation. Further, future researchers will get a comprehensive but precise report on PMFD strategies available in the literature to date.
The review starts by identifying the key indicators of PMFD, for which a taxonomy is proposed. An analysis is then conducted to identify the pattern of published articles on PMFD, followed by the evolution of PMFD strategies. Finally, a general framework for PMFD strategies is given for future researchers and practitioners.
The key to increasing productivity in the manufacturing sector does not lie solely in high technology, but also in an environment in which personnel at all levels are operating with rational decisions and actions based on organised information. Inasmuch as uncertainties and changes are the features of all real‐world situations, appropriate means of measurement, analysis, prediction and control are indispensable; thus statistical tools have become a prerequisite to any productivity drive, those of statistical quality control being the most typical and best known. Some aspects of attaining proficiency in the statistical approach in an organisation are discussed from a management perspective.
After briefly reviewing the past history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are presented that relate to the future of Bayesian econometrics.
Postulates that the use of some key ideas from statistical control thinking can improve service quality. Explores the identification and analysis of gaps in perceptual differences between service customers and service providers as a way of adopting a statistical control philosophy in a service environment. Argues that such a method provides excellent information for creating a true customer‐centred approach to service delivery, being practical, simple in operation and useful for both immediate and long‐term strategic impact.
The purpose of this paper is to find the proper statistical distribution function that can cover the failure times of a single machine or a group of machines. To this end, an innovative program was written in Microsoft Excel, capable of assessing at least six statistical distribution functions. This research study intends to show the advantages of applying statistical distribution functions in an integrated model format to create or increase the productive reliability of machines. Productive reliability is a simultaneous combination of efficiency and effectiveness in reliability.
The theoretical research methodology comprises data-collection tools, reference books and articles, in addition to exploiting written reports of the Iranian Center for Defence's Standards. The practical research method includes deploying and assessing the proposed model on a selected machine (in this case a computerized numerical control machine).
A comprehensive program with the capability of assessing at least six statistical distribution functions was developed in Microsoft Excel to find the most efficient option for covering the failure times of each machine in the shortest time and with the highest precision. This is regarded as the most important achievement of the present study. Furthermore, the advantages of applying the developed model are discussed, many of which directly influence the productivity of equipment reliability.
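The core step of such a program, fitting candidate distributions to observed failure times and keeping the one with the highest likelihood, can be sketched outside Excel as well. The sketch below compares only two candidate families (exponential and normal) by maximized log-likelihood; the failure-time data are synthetic and the function names are illustrative, not the paper's:

```python
import math

def exp_loglik(times):
    """Maximized log-likelihood of an exponential fit (MLE rate = 1/mean)."""
    rate = 1.0 / (sum(times) / len(times))
    return sum(math.log(rate) - rate * t for t in times)

def norm_loglik(times):
    """Maximized log-likelihood of a normal fit (MLE mean and variance)."""
    n = len(times)
    mu = sum(times) / n
    var = sum((t - mu) ** 2 for t in times) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def best_fit(times):
    fits = {"exponential": exp_loglik(times), "normal": norm_loglik(times)}
    return max(fits, key=fits.get)

# Synthetic failure times taken deterministically from an exponential
# quantile function, so the exponential family should win.
times = [-math.log(1 - (i + 0.5) / 500) for i in range(500)]
print(best_fit(times))  # exponential
```

A fuller comparison would add Weibull, lognormal and gamma candidates and penalize parameter count (e.g. via AIC) before choosing.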
The originality of the research was ascertained by managers and experts working in maintenance issues at the different levels of the Defense Industries Organization.
This chapter provides a presentation of Chapter 1 of The Balance of the National Economy, 1923–24, edited by Pavel Illich Popov. The Balance was issued in June 1926 by the Central Statistical Administration (CSU or TsSU) of the USSR, which Popov had headed from July 1918 to January 1926. In the first part of our chapter, we show how Popov’s work on the balance of the national economy was rooted in the specific scientific and political culture of zemstvo statisticians inherited from the Tsar. Statistical inquiry was considered an objective scientific process based on international standards. Furthermore, like zemstvo statisticians, CSU statisticians developed great autonomous political power. The balance of the national economy was built according to these principles, which met harsh criticism from revolutionaries and Bolsheviks. In the second part, we analyze the contents of Popov’s Chapter 1, especially the theoretical foundations of the balance and its connection with Soviet planning. In the third part, we discuss the balance’s significance in the years 1926–1929, the years which ended the NEP and launched the first Five-Year Plan, so as to understand how the CSU’s balance did not become a standard Soviet statistical instrument and was discarded as a “bourgeois” device.