Search results

1 – 10 of over 50000
Book part
Publication date: 30 October 2020

Carlo Zappia

Abstract

This chapter documents an exchange between Leonard Savage, founder of the subjective probability approach to decision-making, and Karl Popper, advocate of the so-called propensity approach to probability, an exchange previously unknown in the literature on probability theory. Early in 1958, just after Daniel Ellsberg had informally tested him with the consistency-in-decision-making questions that gave rise to the so-called Ellsberg Paradox, Savage was made aware that a similar argument had been put forward by Popper. Popper found it paradoxical that two apparently similar events should be attributed the same subjective probability even though the evidence supporting the judgment in one case differed from that in the other. On this ground, Popper rejected the subjective probability approach. Inspection of the Savage Papers archived at Yale University Library makes it possible to document Savage’s reaction to Popper, of which there is no evidence in his published writings. Savage wrote to Popper denying that his criticism had paradoxical content, and a brief exchange followed. The chapter shows that while Savage was unconvinced by Popper’s argument, he was not hostile to an axiomatically founded generalization of his theory.

Details

Research in the History of Economic Thought and Methodology: Including a Symposium on Sir James Steuart: The Political Economy of Money and Trade
Type: Book
ISBN: 978-1-83867-707-7

Article
Publication date: 1 December 2001

George J. Klir

Abstract

Presents an overview of currently recognized theories of imprecise probabilities and their possible extensions. It is shown how the theories are ordered by their levels of generality. A summary of current results regarding measures of uncertainty and uncertainty‐based information is also presented.

Details

Kybernetes, vol. 30 no. 9/10
Type: Research Article
ISSN: 0368-492X

Book part
Publication date: 19 August 2019

Christopher Torr

Abstract

The Austrian economist Ludwig Lachmann claimed that Keynes was a lifelong subjectivist. To evaluate this, we start by distinguishing Keynes’ writings on probability theory from his writings on economics. In the General Theory (1936), Keynes’ treatment of expectations provides the basis for Lachmann’s view that Keynes was a subjectivist at heart. In his Treatise on Probability (1921), Keynes refers explicitly to the subjectivism–objectivism divide in probability theory and pins his colors to the objectivist mast. In this essay, we present the objectivist slant in Keynes’ earlier writings on probability theory. Thereafter, we evaluate the criteria Lachmann employed to cast Keynes as a subjectivist.

Details

Including a Symposium on Ludwig Lachmann
Type: Book
ISBN: 978-1-78769-862-8

Article
Publication date: 11 August 2014

Thomas Pistorius

Abstract

Purpose

The purpose of this paper is to analyse the current rhetoric of predictability in investment theory. After making the case for unpredictability, a new rhetoric for investment theory is proposed.

Design/methodology/approach

McCloskey's project of the rhetoric of economics provides the background and approach for the author's investigation. In particular the author will use the notions of metaphor, prediction, discourse analysis, and virtue ethics.

Findings

The current rhetoric equals the original rhetoric in the seminal work of Markowitz. The current rhetoric is based on predictability and rational behaviour. The proposed new rhetoric for investment theory denies predictability. The new rhetoric aims to cope with statistics by stressing that statistics is supportive but not decisive: handling investment theory is about judgements, combining virtues with historical and theoretical insights.

Practical implications

The investigation of the rhetoric of investment theory has practical relevance because the theory constitutes investment practice, and can put financial wealth at risk. The new rhetoric for investment theory invites practitioners and researchers to reflect on the epistemology of investment theory, and its consequences for the field.

Originality/value

The rhetoric of investment theory is to the author's knowledge not yet analysed in the literature. The rhetorical analysis of the current rhetoric and the proposal of a new rhetoric aim to contribute to the literature on the rhetoric of investment theory.

Details

Journal of Organizational Change Management, vol. 27 no. 5
Type: Research Article
ISSN: 0953-4814

Article
Publication date: 1 October 1996

George J. Klir and David Harmanec

Abstract

Provides an overview of major developments pertaining to generalized information theory during the lifetime of Kybernetes. Generalized information theory is viewed as a collection of concepts, theorems, principles, and methods for dealing with problems involving uncertainty‐based information that are beyond the narrow scope of classical information theory. Introduces well‐justified measures of uncertainty in fuzzy set theory, possibility theory, and Dempster‐Shafer theory. Shows how these measures are connected with the classical Hartley measure and Shannon entropy. Discusses basic issues regarding some principles of generalized uncertainty‐based information.
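The connection the abstract draws between the classical Hartley measure and Shannon entropy can be illustrated with a minimal sketch (the example distributions are hypothetical): Shannon entropy of a uniform distribution over n alternatives coincides with the Hartley measure of n alternatives.

```python
import math

def hartley(n_alternatives: int) -> float:
    """Hartley measure: uncertainty of a choice among n equally possible alternatives, in bits."""
    return math.log2(n_alternatives)

def shannon_entropy(probs) -> float:
    """Shannon entropy of a discrete probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Shannon entropy is maximized by the uniform distribution,
# where it equals the Hartley measure of the same number of alternatives.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

print(hartley(4))                # 2.0 bits
print(shannon_entropy(uniform))  # 2.0 bits, matching the Hartley measure
print(shannon_entropy(skewed))   # strictly less than 2.0 bits
```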

Details

Kybernetes, vol. 25 no. 7/8
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 30 September 2014

Shu Qing Liu, Qin Su and Ping Li

Abstract

Purpose

In order to meet the requirements of 6σ management and to overcome the deficiencies of the theory for using the pre-control chart to evaluate and monitor quality stability, the purpose of this paper is to probe into quality stability evaluation and monitoring guidelines for small-batch production processes based on the pre-control chart, under the conditions that the distribution center and the specification center do not coincide (0 < ɛ ≤ 1.5σ), the process capability index Cp ≥ 2, and the virtual alarm probability α = 0.27 percent.

Design/methodology/approach

First, the range of the quality stability evaluation sampling number in initial production process is determined by using probability and statistics methods, the sample size for the quality stability evaluation is adjusted and determined in initial production process according to the error judgment probability theory, and the guideline for quality stability evaluation has been proposed in initial production process based on the theory of small probability events. Second, the alternative guidelines for quality stability monitoring and control in formal production process are proposed by using combination theory, the alternative guidelines are initially selected based on the theory of small probability events, a comparative analysis of the guidelines is made according to the average run lengths values, and the monitoring and control guidelines for quality stability are determined in formal production process.

Findings

The results indicate that when the virtual alarm probability α = 0.27 percent, the shift ɛ lies in the range 0 < ɛ ≤ 1.5σ, and the process capability index Cp ≥ 2, the sample size for quality stability evaluation in the initial production process is 11, with the condition that at most one sample falls into the yellow zone. The sample size for quality stability evaluation in the formal production process is 5: when at most one sample falls into the yellow zone, the process is stable; when two of the five samples fall into the yellow zone, one more sample must be taken, and the process is judged stable only if this additional sample falls into the green zone.
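Sampling rules of this kind rest on binomial small-probability calculations. A minimal sketch, assuming a hypothetical per-sample probability of landing in the yellow zone (the paper derives the actual value from the shift ɛ and the capability index Cp):

```python
from math import comb

def prob_at_most_k(n: int, k: int, p: float) -> float:
    """P(at most k of n independent samples fall in the yellow zone), binomial model."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Hypothetical yellow-zone probability per sample; the paper's value
# follows from its assumptions on ɛ and Cp.
p_yellow = 0.02

# Under a stable process, two or more yellow samples out of 11 should be
# a small-probability event, which is what justifies using it as an alarm.
p_alarm = 1 - prob_at_most_k(11, 1, p_yellow)
print(p_alarm)  # small, so its occurrence is treated as evidence of instability
```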

Originality/value

The results overcome the shortcomings of previous pre-control charts, which fail to satisfy the assumptions and requirements of 6σ management, carry an oversized virtual alarm probability α, and are applicable only when the shift ɛ = 0. At the same time, they solve the difficulty of applying conventional control charts to process control when sample sizes are small.

Details

International Journal of Quality & Reliability Management, vol. 31 no. 9
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 18 October 2011

Minghu Ha, Jiqiang Chen, Witold Pedrycz and Lu Sun

Abstract

Purpose

Bounds on the rate of convergence of learning processes based on random samples and probability are one of the essential components of statistical learning theory (SLT). The constructive distribution‐independent bounds on generalization are the cornerstone of constructing support vector machines. Random sets and set‐valued probability are important extensions of random variables and probability, respectively. The paper aims to address these issues.

Design/methodology/approach

In this study, the bounds on the rate of convergence of learning processes based on random sets and set‐valued probability are discussed. First, the Hoeffding inequality is enhanced based on random sets, and then making use of the key theorem the non‐constructive distribution‐dependent bounds of learning machines based on random sets in set‐valued probability space are revisited. Second, some properties of random sets and set‐valued probability are discussed.
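The classical point-valued Hoeffding inequality that the authors extend to random sets bounds the deviation of an empirical mean from its expectation, independently of the generating distribution. A sketch of the standard bound for samples in [0, 1]:

```python
import math

def hoeffding_bound(n: int, eps: float) -> float:
    """Upper bound on P(|empirical mean - true mean| >= eps)
    for n i.i.d. samples bounded in [0, 1]: 2 * exp(-2 * n * eps^2)."""
    return 2 * math.exp(-2 * n * eps**2)

# The bound is distribution-independent and shrinks exponentially in n,
# which is what makes constructive generalization bounds possible.
for n in (100, 1000, 10000):
    print(n, hoeffding_bound(n, eps=0.05))
```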

Findings

In the sequel, the concepts of the annealed entropy, the growth function, and VC dimension of a set of random sets are presented. Finally, the paper establishes the VC dimension theory of SLT based on random sets and set‐valued probability, and then develops the constructive distribution‐independent bounds on the rate of uniform convergence of learning processes. It shows that such bounds are important to the analysis of the generalization abilities of learning machines.

Originality/value

SLT is currently regarded as one of the fundamental theories of learning from small samples.

Article
Publication date: 6 July 2015

Masudul Choudhury

Abstract

Purpose

The purpose of this paper is to theorize the existing idea of subjective probability a la Keynes’s Treatise on Probability Theory. Then to show that, under the especial kind of financial valuation model in the absence of interest rate and speculation, subjective probability is not of a major concern in Islamic financial theory.

Design/methodology/approach

The topic is of an epistemological nature premised on the Islamic unity of knowledge and the world-system with special attention given to the formulation of the financial model for evaluation under its unitary characteristic at each time period of financial evaluation. The approach, while being epistemological, is also mathematical in the financial valuation field.

Findings

Mathematical calculation of an approximate solution using the Newton-Raphson method, applied to an Islamic financial valuation model with yields, evolutionary learning, and a unitary discursive experience at every stage of valuation taken continuously, establishes an innovative method that reduces the subjective probability of events to a negligible limiting field.
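As a purely illustrative sketch of the Newton-Raphson step mentioned above (the cash flows and the valuation equation here are hypothetical, not the paper's model), the method finds the rate that equates a stream of yields to a given present value:

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=100):
    """Find a root of f via Newton-Raphson: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Hypothetical valuation: an asset priced at 95 today returns yields of 5
# per period for two periods plus 105 at maturity; solve for the implied
# per-period rate of return r.
cash_flows = [5, 5, 105]
price = 95.0

f = lambda r: sum(c / (1 + r)**(t + 1) for t, c in enumerate(cash_flows)) - price
df = lambda r: sum(-(t + 1) * c / (1 + r)**(t + 2) for t, c in enumerate(cash_flows))

r = newton_raphson(f, df, x0=0.05)
print(r)  # implied per-period rate
```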

Practical implications

The nature and importance of Islamic valuation models imply risk diversification and production diversification, which together confine subjective probability to a narrow limiting closure.

Social implications

The epistemological implication of unity of knowledge and unity of the specific events during the valuation experience causes the socio-economic system to gain increasing levels of stability and certainty while subjective probability narrows down in its small closure.

Originality/value

This paper is boldly original in the light of its methodology that addresses the much pursued topic of subjective probability in the Islamic heterodox economic and financial field.

Details

Journal of Financial Reporting and Accounting, vol. 13 no. 1
Type: Research Article
ISSN: 1985-2517

Book part
Publication date: 14 July 2010

Hugh Pforsich, Susan Gill and Debra Sanders

Abstract

This study examines contextual influences on taxpayers’ perceptions of a vague “low” probability of detection and the relationship between taxpayers’ perceptions and their likelihood to take questionable tax deductions. As such, we tie psychological theories that explain differential interpretations of qualitative probability phrases (base rate and support theories) to the taxpayer perception literature. Consistent with our hypotheses, taxpayers’ interpretations of “low” differ both between and within subjects, depending on the context in which deductions are presented. On average, our taxpayer subjects are less likely to take questionable deductions perceived to have a higher probability of detection than those perceived to have a lower detection probability. Our results contribute to existing literature by demonstrating that knowledge of subjects’ assessments of an event's probability is integral to designing experiments and drawing conclusions regarding observed behavior. This appears necessary even when researchers provide assessments of detection probabilities and/or employ scenarios for which systematic differences in probability perceptions are not inherently obvious.

Details

Advances in Taxation
Type: Book
ISBN: 978-0-85724-140-5

Article
Publication date: 1 March 1989

Ian Scott, Stuart Gronow and Brian Rosser

Abstract

Examines the ability of an expert computer system to evaluate uncertainty within a valuation context and thus emulate the professional skill of the valuer. Shows that because property valuation programs based on regression analysis require data input for each variable, they are unable to evaluate uncertainty and hence to apply the rational judgement which enables the human valuer to produce a valuation in the light of uncertain or incomplete information.

Details

Journal of Valuation, vol. 7 no. 3
Type: Research Article
ISSN: 0263-7480
