Search results

1 – 10 of over 3000
Article
Publication date: 1 July 2004

Alexandros M. Goulielmos

Abstract

This article deals first in a theoretical fashion – a kind of literature review – with the concept of randomness as it appears in various disciplines. Second, an empirical approach is applied to actual data on marine accidents, in the form of ships totally lost, on two counts: ships lost per area and ships lost per month. The former appears non-random and the latter random. This finding is crucial for the countries with the most dangerous areas, as well as for the IMO. The test used for non-randomness is the BDS statistic, which tests for nonlinear dependence. The test indicated randomness for the monthly time series at both the 95 per cent and 99 per cent confidence levels, and non-randomness for the area data at the same confidence levels.
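The BDS test mentioned in the abstract is available in standard statistics libraries; the sketch below shows how such a randomness check might be run on a monthly loss-count series. The data are synthetic and the default `statsmodels` distance parameter is assumed, not the article's actual casualty data or settings.

```python
# A minimal sketch of a BDS randomness check on a monthly loss-count series.
# The series is synthetic; the article's actual marine-accident data are not reproduced.
import numpy as np
from statsmodels.tsa.stattools import bds

rng = np.random.default_rng(0)
monthly_losses = rng.poisson(lam=12, size=240).astype(float)  # hypothetical 20 years of monthly totals

# BDS tests the null hypothesis that the series is i.i.d. (no linear or nonlinear dependence).
stat, pvalue = bds(monthly_losses, max_dim=3)
for m, (s, p) in enumerate(zip(np.atleast_1d(stat), np.atleast_1d(pvalue)), start=2):
    verdict = "consistent with randomness" if p > 0.05 else "non-random structure detected"
    print(f"embedding dim {m}: BDS stat = {s:.3f}, p = {p:.3f} -> {verdict}")
```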

Details

Disaster Prevention and Management: An International Journal, vol. 13 no. 3
Type: Research Article
ISSN: 0965-3562

Book part
Publication date: 4 March 2024

Oswald A. J. Mascarenhas, Munish Thakur and Payal Kumar

Abstract

Executive Summary

This chapter addresses one of the most crucial areas for critical thinking: the morality of turbulent markets around the world. All of us are overwhelmed by such turbulent markets. Following Nassim Nicholas Taleb (2004, 2010), we distinguish between nonscalable industries (ordinary professions where income grows linearly, piecemeal or by marginal jumps) and scalable industries (extraordinary risk-prone professions where income grows in a nonlinear fashion, by exponential jumps and fractures). Nonscalable industries generate tame and predictable markets of goods and services, while scalable industries regularly explode into behemoth virulent markets where rewards are disproportionately large compared to effort; they are the major causes of the turbulent financial markets that rock our world, causing ever-widening inequities and inequalities. Part I describes both scalable and nonscalable markets in sufficient detail, including the propensity of scalable industries to randomness and the turbulent markets they create. Part II seeks to understand the moral responsibility of turbulent markets and discusses who should assume moral responsibility for them, and under what conditions. Part III synthesizes various theories of necessary and sufficient conditions for accepting or assigning moral responsibility. We also analyze the necessary and sufficient conditions for attribution of moral responsibility, such as rationality, intentionality, autonomy or freedom, causality, accountability, and avoidability, of various actors as moral agents or as moral persons. By grouping these conditions, we then derive some useful models for assigning moral responsibility to various entities such as individual executives, corporations, or joint bodies. We discuss the challenges and limitations of such models.

Details

A Primer on Critical Thinking and Business Ethics
Type: Book
ISBN: 978-1-83753-312-1

Article
Publication date: 4 January 2008

Michael R. Powers

Abstract

Purpose

In this two‐part series, this paper seeks to consider certain intriguing aspects of randomness, the basic mathematical concept used to model financial risk and other unknown quantities in the physical world.

Design/methodology/approach

Part 1 applies concepts from quantum physics and algorithmic information theory to distinguish between knowable complexity and unknowable complexity.

Findings

In Part 1, it is found that Heisenberg's uncertainty principle can be used to provide concrete examples of random variables, and that the Kolmogorov/Chaitin notion of algorithmic complexity can be used to define the formal concept of randomness.

Originality/value

The two‐part series explores the underlying nature of randomness in terms of both its physical/mathematical properties and its role in human cognition.
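Kolmogorov/Chaitin algorithmic complexity is uncomputable in general, but a common practical proxy, used here purely as an illustration and not drawn from the article, is the ratio achieved by a general-purpose compressor: a sequence that cannot be compressed behaves, for practical purposes, like an algorithmically random one.

```python
# Compression ratio as a rough, computable stand-in for algorithmic complexity.
# This is an illustrative proxy only; true Kolmogorov complexity is uncomputable.
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size divided by original size (values near 1 suggest incompressibility)."""
    return len(zlib.compress(data, level=9)) / len(data)

structured = b"01" * 50_000          # highly regular, hence compressible
random_like = os.urandom(100_000)    # output of an OS entropy source, nearly incompressible

print(f"structured sequence : ratio = {compression_ratio(structured):.3f}")
print(f"random-like sequence: ratio = {compression_ratio(random_like):.3f}")
```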

Details

The Journal of Risk Finance, vol. 9 no. 1
Type: Research Article
ISSN: 1526-5943

Book part
Publication date: 27 July 2023

Oswald A. J. Mascarenhas, Munish Thakur and Payal Kumar

Abstract

Executive Summary

All of us seek truth via objective inquiry into various human and nonhuman phenomena that nature presents to us on a daily basis. We are empirical (or nonempirical) decision makers who hold that uncertainty is our discipline, and that understanding how to act under conditions of incomplete information is the highest and most urgent human pursuit (Karl Popper, as cited in Taleb, 2010, p. 57). We verify (prove something as right) or falsify (prove something as wrong), and this asymmetry of knowledge enables us to distinguish between science and nonscience. According to Karl Popper (1971), we should be an “open society,” one that relies on skepticism as a modus operandi, refusing and resisting definitive (dogmatic) truths. An open society, maintained Popper, is one in which no permanent truth is held to exist; this would allow counter-ideas to emerge. Hence, any idea of Utopia is necessarily closed since it chokes its own refutations. A good model for society that cannot be left open for falsification is totalitarian and epistemologically arrogant. The difference between an open and a closed society is that between an open and a closed mind (Taleb, 2004, p. 129). Popper accused Plato of closing our minds. Popper's idea was that science has problems of fallibility or falsifiability. In this chapter, we deal with fallibility and falsifiability of human thinking, reasoning, and inferencing as argued by various scholars, as well as the falsifiability of our knowledge and cherished cultures and traditions. Critical thinking helps us cope with both vulnerabilities. In general, we argue for supporting the theory of “open mind and open society” in order to pursue objective truth.

Details

A Primer on Critical Thinking and Business Ethics
Type: Book
ISBN: 978-1-83753-308-4

Article
Publication date: 5 October 2012

I. Doltsinis

Abstract

Purpose

The purpose of this paper is to expose computational methods as applied to engineering systems and evolutionary processes with randomness in external actions and inherent parameters.

Design/methodology/approach

Two approaches are distinguished, both of which rely on solvers developed for the deterministic problem. Probabilistic analysis refers to approximating the response by a Taylor series expansion about the mean input. Alternatively, stochastic simulation involves random sampling of the input and statistical evaluation of the output.

Findings

Beyond the characterization of random response, methods of reliability assessment are discussed. Concepts of design improvement are presented. Optimization for robustness diminishes the sensitivity of the system to fluctuating parameters.

Practical implications

Deterministic algorithms available for the primary problem are utilized for stochastic analysis by statistical Monte Carlo sampling. The computational effort for the repeated solution of the primary problem depends on the variability of the system and is usually high. Alternatively, the analytic Taylor series expansion requires extending the primary solver to compute derivatives of the response with respect to the random input. The method is restricted to the computation of output mean values and variances/covariances, with the effort determined by the number of random input parameters. The results of the two methods are comparable within the domain of applicability.

Originality/value

The present account addresses the main issues related to the presence of randomness in engineering systems and processes. They comprise the analysis of stochastic systems, reliability, design improvement, optimization and robustness against randomness of the data. The analytical Taylor approach is contrasted to the statistical Monte Carlo sampling throughout. In both cases, algorithms known from the primary, deterministic problem are the starting point of stochastic treatment. The reader benefits from the comprehensive presentation of the matter in a concise manner.
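A minimal sketch of the contrast drawn above, under the assumption of a single normally distributed input and a toy response function standing in for the primary engineering solver: the output mean and variance are estimated once by Monte Carlo sampling and once by a first-order Taylor (perturbation) expansion about the mean input.

```python
# Toy comparison of the two stochastic approaches described above:
# (1) Monte Carlo sampling of the random input, (2) first-order Taylor expansion
# about the mean input. The response function g is a placeholder, not an engineering solver.
import numpy as np

def g(x):
    """Hypothetical deterministic 'primary solver': response of the system to input x."""
    return x ** 2 + 3.0 * np.sin(x)

def dg_dx(x):
    """Derivative of the response, required by the Taylor (perturbation) approach."""
    return 2.0 * x + 3.0 * np.cos(x)

mu_x, sigma_x = 2.0, 0.1          # mean and standard deviation of the random input

# Approach 1: statistical Monte Carlo sampling (repeated calls to the primary solver).
rng = np.random.default_rng(42)
samples = g(rng.normal(mu_x, sigma_x, size=100_000))
print(f"Monte Carlo : mean = {samples.mean():.4f}, variance = {samples.var():.6f}")

# Approach 2: analytic first-order Taylor expansion about the mean input.
mean_taylor = g(mu_x)
var_taylor = (dg_dx(mu_x) ** 2) * sigma_x ** 2
print(f"Taylor      : mean = {mean_taylor:.4f}, variance = {var_taylor:.6f}")
```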

Article
Publication date: 29 February 2008

Michael R. Powers

Abstract

Purpose

The purpose of the paper, in this two‐part series, is to consider certain intriguing aspects of randomness, the basic mathematical concept used to model financial risk and other unknown quantities in the physical world.

Design/methodology/approach

In Part 2 of the paper, the author describes methods for simulating random variables, and explores whether it is possible in practice to distinguish between the knowable complexity of compressible random variables and the unknowable complexity of incompressible random variables.

Findings

In Part 2 of the paper, it is found that because of Chaitin's impossibility result (regarding incompressible sequences) and the possibility of constructing least‐surprising sequences, one is left with two cognitive constraints with serious implications for statistical hypothesis testing and the scientific method.

Originality/value

This two‐part paper explores the underlying nature of randomness in terms of both its physical/mathematical properties and its role in human cognition.
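The paper's own simulation methods are not reproduced here; as a generic illustration of simulating a random variable from uniform pseudo-random input, the sketch below uses inverse-transform sampling for an exponential distribution.

```python
# Inverse-transform sampling: one standard way to simulate a random variable
# with a known distribution from uniform pseudo-random numbers.
import numpy as np

def sample_exponential(rate: float, size: int, rng: np.random.Generator) -> np.ndarray:
    """Draw exponential(rate) variates by inverting the CDF F(x) = 1 - exp(-rate * x)."""
    u = rng.random(size)                 # uniform(0, 1) pseudo-random input
    return -np.log(1.0 - u) / rate       # F^{-1}(u)

rng = np.random.default_rng(7)
draws = sample_exponential(rate=0.5, size=100_000, rng=rng)
print(f"sample mean {draws.mean():.3f} vs theoretical mean {1 / 0.5:.3f}")
```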

Details

The Journal of Risk Finance, vol. 9 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 31 December 2007

Kenneth D. Mackenzie

Abstract

Purpose

The purpose of this paper is to advocate the use of processual rather than the common variance‐theoretic approaches to the study of groups and organizational processes. There is a proliferation of paradigms and a lack of cumulation in organization science. This paper argues that this is due to the adoption of an inappropriate philosophy. Early on, the field adopted the variance‐theoretic approach instead of a process‐theoretic approach. The dominant paradigm is not one of theory, but one of method.

Design/methodology/approach

The paper systematically undermines the basis for variance‐theoretic approaches to the study of groups and organizational phenomena, followed by a discussion of how and why processual approaches provide remedies.

Findings

As group and organizational phenomena are inherently processual in nature, it makes more sense to study them with processual methods.

Research limitations/implications

The argument for the solution offered in this article is based on one concept of a group and organizational process. Other types of process models are not excluded.

Practical implications

It is possible that the continued use of variance-theoretic approaches is a form of professional misconduct leading to paradigm proliferation instead of progress.

Originality/value

This paper provides an original analysis of both variance‐theoretic and processual approaches to the study of group and organizational processes.

Details

International Journal of Organizational Analysis, vol. 15 no. 1
Type: Research Article
ISSN: 1934-8835

Article
Publication date: 1 April 2000

Bel G. Raggad

Abstract

Proposes a possibilistic group support system (PGSS) for the retailer pricing and inventory problem when possibilistic fluctuations of product parameters are controlled by a set of possibilistic optimality conditions. Experts in various functional areas convey their subjective judgement to the PGSS in the form of analytical models (for product parameter estimation), fuzzy concepts (facts), and possibilistic propositions (for validation and choice procedures). Basic probability assignments are used to elicit experts' opinions and are then transformed into compatibility functions for fuzzy concepts using the falling shadow technique. Evidence is processed in the form of fuzzy concepts and then rewritten back into basic probability assignments using the principle of least ignorance on randomness. The PGSS allows the user (inventory control) to examine a trade-off between the belief value of a greater profit and the lower amount of randomness associated with it.
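The abstract relies on basic probability assignments and belief values; the sketch below is a generic evidence-theory illustration, not the PGSS itself, computing the belief and plausibility of a hypothetical "greater profit" proposition from an assumed basic probability assignment.

```python
# Illustrative only (not the PGSS): belief and plausibility of a proposition
# computed from a basic probability assignment over a small frame of discernment.
# The frame, focal elements and masses below are hypothetical.

frame = frozenset({"low_profit", "medium_profit", "high_profit"})

# Basic probability assignment: masses on focal elements (subsets of the frame), summing to 1.
bpa = {
    frozenset({"high_profit"}): 0.4,
    frozenset({"medium_profit", "high_profit"}): 0.3,
    frame: 0.3,                      # mass assigned to total ignorance
}

def belief(proposition: frozenset) -> float:
    """Sum of masses of focal elements wholly contained in the proposition."""
    return sum(m for focal, m in bpa.items() if focal <= proposition)

def plausibility(proposition: frozenset) -> float:
    """Sum of masses of focal elements that intersect the proposition."""
    return sum(m for focal, m in bpa.items() if focal & proposition)

greater_profit = frozenset({"high_profit"})
print(f"Bel(high profit) = {belief(greater_profit):.2f}, Pl(high profit) = {plausibility(greater_profit):.2f}")
```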

Details

Logistics Information Management, vol. 13 no. 2
Type: Research Article
ISSN: 0957-6053

Article
Publication date: 6 October 2023

Jie Yang, Manman Zhang, Linjian Shangguan and Jinfa Shi

Abstract

Purpose

The possibility function-based grey clustering model has evolved into a complete approach for dealing with uncertainty evaluation problems. Existing models still suffer from the choice dilemma of the maximum criterion and from cases in which the possibility function may not accurately capture the data's randomness. This study aims to propose a multi-stage skewed grey cloud clustering model that blends greyness and randomness to overcome these problems.

Design/methodology/approach

First, the skewed grey cloud possibility (SGCP) function is defined, and its digital characteristics demonstrate that a normal cloud is a particular instance of a skewed cloud. Second, the boundary of the decision paradox of the maximum criterion is established. Third, using the skewed grey cloud kernel weight (SGCKW) transformation as a tool, the multi-stage skewed grey cloud clustering coefficient (SGCCC) vector is calculated, and research items are clustered according to this multi-stage SGCCC vector with overall features. Finally, the solution steps of the multi-stage skewed grey cloud clustering model are provided.

Findings

The results of applying the model to the assessment of college students' capacity for innovation and entrepreneurship revealed that, in comparison to the traditional grey clustering model and the two-stage grey cloud clustering evaluation model, the proposed model's clustering results have higher identification and stability, which partially resolves the decision paradox of the maximum criterion.

Originality/value

Compared with current models, the proposed model in this study can dynamically depict the clustering process through multi-stage clustering, ensuring the stability and integrity of the clustering results and advancing grey system theory.
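The skewed grey cloud possibility function is the paper's own contribution and is not reproduced here; for orientation, the sketch below generates drops from the standard normal cloud model (expectation Ex, entropy En, hyper-entropy He), of which the abstract notes the skewed cloud is a generalisation. All parameters are hypothetical.

```python
# Forward generator for the standard normal cloud model C(Ex, En, He).
# The paper's skewed grey cloud possibility function extends this model and is not
# reproduced here; the parameters below are hypothetical.
import numpy as np

def normal_cloud_drops(ex: float, en: float, he: float, n: int, rng: np.random.Generator):
    """Generate n cloud drops (position, membership) from a normal cloud C(Ex, En, He)."""
    en_prime = rng.normal(en, he, size=n)            # per-drop entropy, capturing randomness of fuzziness
    x = rng.normal(ex, np.abs(en_prime))             # drop positions
    membership = np.exp(-((x - ex) ** 2) / (2.0 * en_prime ** 2))
    return x, membership

rng = np.random.default_rng(1)
x, mu = normal_cloud_drops(ex=75.0, en=5.0, he=0.5, n=5, rng=rng)
for xi, mi in zip(x, mu):
    print(f"drop at {xi:6.2f} with membership {mi:.3f}")
```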

Details

Grey Systems: Theory and Application, vol. 14 no. 1
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 12 August 2019

Kyoung Cheon Cha, Minah Suh, Gusang Kwon, Seungeun Yang and Eun Ju Lee

Abstract

Purpose

The purpose of this paper is to determine the auditory-sensory characteristics of the digital pop music that is particularly successful on the YouTube website by measuring young listeners’ brain responses to highly successful pop music noninvasively.

Design/methodology/approach

The authors conducted a functional near-infrared spectroscopy (fNIRS) experiment with 56 young adults (23 females; mean age 24 years) with normal vision and hearing and no record of neurological disease. The authors calculated total blood flow (TBF) and hemodynamic randomness and examined their relationships with online popularity.

Findings

The authors found that TBF to the right medial prefrontal cortex increased more when the young adults heard music that presented acoustic stimulation well above the previously defined optimal sensory level (OSL). Hemodynamic randomness decreased significantly when the participants listened to music that provided near- or above-OSL stimulation.

Research limitations/implications

Online popularity, recorded as the number of daily hits, was significantly positively related to TBF and negatively related to hemodynamic randomness.

Practical implications

These findings suggest that a new media marketing strategy may be required, one that provides a sufficient level of sensory stimulation to Millennials in order to increase their engagement in various use cases, including entertainment, advertising and retail environments.

Social implications

Digital technology has so drastically reduced the costs of sharing and disseminating information, including music, that consumers can now easily use digital platforms to access a wide selection of music at minimal cost. The structure of the current music market reflects the decentralized nature of the online distribution network such that artists from all over the world now have equal access to billions of members of the global music audience.

Originality/value

This study confirms that the importance of understanding target customers' sensory experiences will grow in determining the success of digital content and marketing.
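The article does not specify how "hemodynamic randomness" was quantified; the sketch below uses approximate entropy, a common signal-regularity measure, purely as an assumed proxy applied to synthetic signals.

```python
# One possible (hypothetical) proxy for "hemodynamic randomness": approximate entropy
# of a hemodynamic time series. The article's actual measure is not specified, and the
# signals below are synthetic, not fNIRS recordings.
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r): lower values indicate a more regular signal."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if r is None:
        r = 0.2 * x.std()

    def phi(mm):
        # All overlapping templates of length mm, compared pairwise by Chebyshev distance.
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).mean(axis=1)   # fraction of templates within r (self-matches included)
        return np.log(counts).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(3)
t = np.linspace(0.0, 60.0, 600)
regular = np.sin(2 * np.pi * 0.1 * t)       # smooth oscillation: low randomness
noisy = rng.normal(size=t.size)             # white noise: high randomness

print(f"regular signal: ApEn = {approximate_entropy(regular):.3f}")
print(f"noisy signal  : ApEn = {approximate_entropy(noisy):.3f}")
```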

Details

Asia Pacific Journal of Marketing and Logistics, vol. 32 no. 5
Type: Research Article
ISSN: 1355-5855
