Search results

1 – 10 of over 36,000

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Article
Publication date: 8 February 2019

Pedro Carlos Oprime, Fabiane Leticia Lizarelli, Marcio Lopes Pimenta and Jorge Alberto Achcar

Abstract

Purpose

The traditional Shewhart control charts – the X-bar and R/S charts – cannot support the decision of when it is not economically feasible to stop the process in order to remove special causes. The purpose of this paper is therefore to propose a new control chart design – a modified acceptance control chart – which provides a supportive method for decision making in economic terms, especially when the process has high capability indices.

Design/methodology/approach

The authors modelled the expected average run length (ARL), incorporating the probability density function of the sampling distribution of Cpk, to compare and analyze the efficiency of the proposed design.

Findings

This study suggests a new procedure to calculate the control limits (CL) of the X-bar chart that allows economic decisions about the process to be made. The new design is obtained by introducing a permissible average variation and defining three regions for the statistical CL of the traditional X-bar control chart.

Originality/value

A framework is presented to help practitioners use the proposed control chart. Two new parameters (Cp and Cpk), in addition to m and n, were introduced into the expected ARL equation. Cpk is a random variable whose probability distribution is known; therefore, using a preliminary sample from an in-control process, the authors can test whether the process is capable.
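
As a rough illustration of the quantities this abstract invokes – not the authors' modified design, which is not reproduced here – the sketch below computes the ARL of a standard X-bar chart under a given mean shift, and a Cpk point estimate from a preliminary in-control sample. The subgroup size, 3-sigma limits and specification limits are assumptions.

```python
import numpy as np
from scipy import stats

def xbar_arl(shift_sigma, n, L=3.0):
    """ARL of a standard X-bar chart when the process mean has shifted
    by `shift_sigma` process standard deviations, with subgroup size n
    and L-sigma control limits."""
    delta = shift_sigma * np.sqrt(n)  # shift of the subgroup mean, in standard errors
    # probability that a subgroup mean falls outside the control limits
    p = stats.norm.sf(L - delta) + stats.norm.cdf(-L - delta)
    return 1.0 / p

def cpk_hat(sample, lsl, usl):
    """Point estimate of Cpk from a preliminary in-control sample."""
    mu, s = sample.mean(), sample.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3 * s)

print(xbar_arl(0.0, 5))  # in-control ARL, roughly 370
print(xbar_arl(1.0, 5))  # much shorter ARL after a 1-sigma shift
```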

Details

International Journal of Quality & Reliability Management, vol. 36 no. 6
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 1 January 2008

Arnold Zellner

Abstract

After briefly reviewing the history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. A review of some recent Bayesian econometric research and needs is then presented. Finally, some thoughts are offered on the future of Bayesian econometrics.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Details

Empirical Nursing
Type: Book
ISBN: 978-1-78743-814-9

Book part
Publication date: 18 October 2019

John Geweke

Abstract

Bayesian A/B inference (BABI) is a method that combines subjective prior information with data from A/B experiments to provide inference for lift – the difference in a measure of response between treatment and control, expressed as its ratio to the measure of response in control. The procedure is embedded in stable code that can be executed in a few seconds for an experiment, regardless of sample size, and caters to the objectives and technical background of the owners of experiments. BABI provides more powerful tests of hypotheses about the impact of treatment on lift, and sharper conclusions about the value of lift, than legacy conventional methods do. In application to 21 large online experiments, the credible interval is 60% to 65% shorter than the conventional confidence interval in the median case, and close to 100% shorter in a significant proportion of cases; in rare cases, BABI credible intervals are longer than conventional confidence intervals, and then by no more than about 10%.
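
The abstract does not spell out BABI's prior family or implementation; as a minimal sketch of the general idea only – assuming Bernoulli responses, independent Beta priors and invented counts – posterior simulation yields a credible interval for lift:

```python
import numpy as np

rng = np.random.default_rng(0)

def lift_credible_interval(succ_c, n_c, succ_t, n_t,
                           alpha=1.0, beta=1.0, draws=100_000, level=0.95):
    """Posterior credible interval for lift = (p_t - p_c) / p_c under
    independent Beta(alpha, beta) priors on the response rates."""
    p_c = rng.beta(alpha + succ_c, beta + n_c - succ_c, draws)
    p_t = rng.beta(alpha + succ_t, beta + n_t - succ_t, draws)
    lift = (p_t - p_c) / p_c
    return np.quantile(lift, [(1 - level) / 2, (1 + level) / 2])

# Hypothetical experiment: 480/10,000 control vs 540/10,000 treatment responses.
print(lift_credible_interval(480, 10_000, 540, 10_000))
```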

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B
Type: Book
ISBN: 978-1-83867-419-9

Article
Publication date: 1 May 1988

François Schächter

Abstract

The association of Jaynes's principle with a new probabilistic‐informational concept – the opacity functional – conceived by Mugur‐Schächter is said to lead to an abstract framework in which it is possible to put a measure on the distance between an instantaneous evolving state and the corresponding equilibrium state. An illustration is given for the Helmholtz thermodynamic potential of a perfect gas, which reveals the significance of such a distance. The treatment of this particular case suggests the possibility of a new and general probabilistic‐informational method developed from the ideas of Onsager, Prigogine, Jaynes, Truesdell and Tykodi.
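
The opacity functional itself is not defined in this abstract; a standard information-theoretic stand-in for such a distance, offered here purely as an illustration, is the relative entropy of the instantaneous distribution $p$ with respect to the equilibrium distribution $p^{\mathrm{eq}}$:

$$ D\left(p \,\|\, p^{\mathrm{eq}}\right) \;=\; \sum_i p_i \ln \frac{p_i}{p_i^{\mathrm{eq}}} \;\ge\; 0, $$

which vanishes exactly at equilibrium and, for a canonical ensemble, equals $(F[p] - F_{\mathrm{eq}})/k_B T$ – linking such a distance to the Helmholtz potential the abstract mentions.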

Details

Kybernetes, vol. 17 no. 5
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 11 May 2015

Diego Giuliani, Maria Michela Dickson and Giuseppe Espa

Abstract

Purpose

The purpose of this paper is to present the contents and the didactic approach that characterize, respectively, the “Introductory Statistics with R” and “Statistics and Foresight” courses of the Master in Social Foresight.

Design/methodology/approach

The two courses “Introductory Statistics with R” and “Statistics and Foresight” are designed to provide an introduction to quantitative methods in the social sciences, with specific applications to social foresight. The first course introduces students to data analysis, providing the necessary tools to study and represent socio-economic phenomena through graphical summaries and numerical measures. During the course, example applications based on the open-source software R are shown. By the end, students should be able to perform data management, conduct descriptive analysis of categorical and quantitative variables and analyze bivariate distributions. The subsequent course, “Statistics and Foresight”, presents the most efficient methods for making decisions under uncertainty, visualizing the potential errors of wrong decisions and computing the probability of their occurrence.
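
The courses themselves use R; purely for consistency with the other sketches in this listing, a comparable minimal descriptive and bivariate analysis in Python with pandas might look as follows (the data file and column names are hypothetical):

```python
import pandas as pd

# Hypothetical survey data with one categorical and two quantitative variables.
df = pd.read_csv("survey.csv")  # assumed columns: region, income, age

print(df["region"].value_counts(normalize=True))  # categorical summary
print(df[["income", "age"]].describe())           # quantitative summaries
print(df["income"].corr(df["age"]))               # bivariate association
print(pd.crosstab(df["region"], pd.cut(df["age"], bins=3)))  # bivariate table
```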

Findings

This paper describes an interesting and promising way of teaching applied statistics in the social sciences.

Practical implications

With the main aim of teaching the correct use of statistics, specific attention is devoted to the use and interpretation of the aforementioned methods rather than to their theoretical aspects. In the second course as well, the treatment of real data with the R software plays an important role.

Originality/value

This paper attempts to systematize a method of teaching statistics based on the practical use of open-source software.

Details

On the Horizon, vol. 23 no. 2
Type: Research Article
ISSN: 1074-8121

Article
Publication date: 1 March 1986

L.R. Lichtenberg, M. Sleiman and M.J. Harry

Abstract

During the past few years, statistical process control and experiment design concepts have taken a prominent place within the industry. The use of such tools within the Motorola GEG manufacturing environment has grown to the point where reflow and wave solder process development and optimisation have significantly benefited. The ability to evaluate and model various known and unknown phenomena statistically has provided GEG's manufacturing technology with a series of very powerful tools to aid in process control and development. The primary purpose of this paper is to present the various approaches used by GEG to implement these statistical tools in the development of infra‐red (I‐R) reflow solder processes and in the enhancement of certain quality characteristics associated with wave-soldered printed wiring boards (PWBs). Beyond specific GEG applications, the paper discusses the role of statistically designed experiments and process control methods as a vehicle for providing answers to complex manufacturing problems, and presents the mathematical and graphical methods underlying the interpretation of quantitative data. Perhaps the most important benefit derived from the use of statistics to solve manufacturing and quality problems relates to decision making. When experiments are conducted to isolate unwanted sources of process and product variation, decisions must be made about whether certain experimental effects are important. Through the application of statistics, the researcher can ascertain the probability that various experimental effects arose by random chance, and can therefore make decisions with known degrees of risk and confidence. Without such knowledge, an organisation might expend valuable resources and derive no direct benefit. Ultimately, the principal reason for applying statistical methods and procedures is to increase quality and yield while simultaneously reducing costs.
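
As a hedged sketch of the decision logic this abstract describes – judging whether an experimental effect is real, with a known degree of risk – consider a two-level factor in a hypothetical reflow experiment (all numbers invented):

```python
import numpy as np
from scipy import stats

# Hypothetical designed experiment: solder-defect counts per board at two
# levels of one factor (e.g., peak reflow temperature), four runs per level.
low  = np.array([12.1, 11.4, 13.0, 12.6])
high = np.array([ 9.8, 10.5,  9.1, 10.0])

t, p = stats.ttest_ind(low, high, equal_var=False)
print(f"effect = {high.mean() - low.mean():.2f}, p-value = {p:.4f}")
# A small p-value means the observed effect is unlikely to be random chance,
# so the factor change can be adopted with a known risk of being wrong.
```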

Details

Circuit World, vol. 12 no. 4
Type: Research Article
ISSN: 0305-6120

Content available
Article
Publication date: 1 October 1998

Details

Industrial Robot: An International Journal, vol. 25 no. 5
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 4 September 2019

S. Khodaygan and A. Ghaderi

Abstract

Purpose

The purpose of this paper is to present a new, efficient method, based on Bayesian modeling, for the tolerance–reliability analysis and quality control of complex nonlinear assemblies where explicit assembly functions are difficult or impossible to extract.

Design/methodology/approach

In the proposed method, tolerances are first modelled as random variables. Then, based on the assembly data, the explicit assembly function is expressed by a Bayesian model in terms of manufacturing and assembly tolerances. From the obtained assembly tolerance, the reliability of the mechanical assembly in meeting the assembly requirement can be estimated by a suitable first-order reliability method.

Findings

The Bayesian modeling leads to an appropriate assembly function for the tolerance and reliability analysis of mechanical assemblies and for assessment of assembly quality, by evaluating the assembly requirement(s) at the key characteristics in the assembly process. The efficiency of the proposed method is illustrated with a case study and validated by comparison to Monte Carlo simulations.
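
The authors' Bayesian model is not reproduced in the abstract; the sketch below shows only the Monte Carlo comparator they validate against, for a hypothetical linear stack-up (dimensions, tolerances and the requirement are all invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stack-up: gap = A - B - C, with each dimension a normal random
# variable (sigma = tolerance / 3) and the assembly requirement gap >= 0.1 mm.
def assembly_gap(n):
    A = rng.normal(50.00, 0.05 / 3, n)
    B = rng.normal(30.00, 0.04 / 3, n)
    C = rng.normal(19.80, 0.03 / 3, n)
    return A - B - C

gap = assembly_gap(1_000_000)
reliability = np.mean(gap >= 0.1)  # fraction of assemblies meeting the requirement
print(f"P(assembly requirement met) ~= {reliability:.5f}")
```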

Practical implications

The method can easily be automated for use within CAD/CAM software for assembly quality control in industrial applications.

Originality/value

Bayesian modeling for the tolerance–reliability analysis of mechanical assemblies, which has not previously been considered in the literature, is a potentially interesting concept that can be extended to related fields of tolerance design and quality control.

Details

Assembly Automation, vol. 39 no. 5
Type: Research Article
ISSN: 0144-5154
