Tim Kastrup, Michael Grant and Fredrik Nilsson
Abstract
Purpose
The purpose of this paper is to contribute to a better, empirically grounded and theoretically informed understanding of data analytics (DA) use and nonuse in accounting for decision-making. To that end, it explores the links between accounting logic, commercial logic and DA use in financial due diligence (FDD).
Design/methodology/approach
The paper reports the findings of a case study of DA use in the FDD practice of a Big Four accounting firm in Sweden (pseudonym: DealCo). The primary data comprise semi-structured interviews, observations and additional meetings. Institutional logics is mobilized as method theory.
Findings
First, accounting logic and commercial logic both drove and hindered DA use in DealCo’s FDD practice in different ways. Second, conflicting prescriptions for DA use existed mostly within commercial logic rather than between accounting logic and commercial logic. Third, accounting logic and commercial logic, as perceptual and conceptual filters, seemed to shape DealCo’s advisors’ understanding of DA and give rise to an efficiency-centric DA logic. This logic, in turn, as a high-level model of how to use DA in the context of FDD, governed DA use broadly.
Originality/value
The paper draws attention to direct and indirect links between accounting logic and commercial logic, on the one hand, and DA conceptions and use, on the other. It thereby advances prior theorization of DA use in accounting for decision-making.
Abstract
Purpose
This paper reviews the recent collapse of two cryptocurrency enterprises, FTX and Celsius. These two cases of institutional bankruptcy have generated criminal charges and other civil complaints, mainly alleging fraud against the CEOs of the companies. This paper aims to analyse the fraud leading to these bankruptcies, drawing on key concepts from the research literature on economic crime to provide explanations for what happened.
Design/methodology/approach
This paper uses a case study approach to the question of how large financial institutions can go off the rails. Two theoretical perspectives are applied to the cases of the FTX and Celsius collapses. These are the “normalisation of deviance” theory and the “cult of personality”.
Findings
In these two case studies, there is an interaction between the “normalisation of deviance” on the institutional level and the “cult of personality” at the level of individual leadership. The CEOs of the two companies promoted themselves as eccentric but successful examples of the visionary tech finance genius. This fostered the normalisation of deviance within their organisations. Employees, investors and regulators allowed criminal and highly financially risky practices to become normalised as they were caught up in the attractive story of the trailblazing entrepreneur making millions in the new cryptoeconomy.
Originality/value
This paper makes a contribution both to the case study literature on economic crime and to the development of general theory in economic criminology.
Habeeb Balogun, Hafiz Alaka and Christian Nnaemeka Egwim
Abstract
Purpose
This paper seeks to assess the performance of BA-GS-LSSVM compared to popular standalone algorithms used to build NO2 prediction models. The purpose of this paper is to pre-process a relatively large NO2 dataset from Internet of Things (IoT) sensors, together with time-corresponding weather and traffic data, and to use the data to develop NO2 prediction models using BA-GS-LSSVM and popular standalone algorithms to allow for a fair comparison.
Design/methodology/approach
This research installed 14 IoT emission sensors and used their data to develop machine learning predictive models for NO2 pollution concentration. The authors used big data analytics infrastructure to retrieve the large volume of data, collected at intervals of tens of seconds over more than five months. Weather data from the UK meteorology department and traffic data from the Department for Transport were collected and merged for the corresponding times and locations of the pollution sensors.
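The merging step described above, aligning sensor readings with weather and traffic records by time and location, might be sketched as follows. The column names, units and hourly resolution here are illustrative assumptions, not the authors' actual schema:

```python
import pandas as pd

# Hypothetical toy frames standing in for the three data sources
no2 = pd.DataFrame({
    "sensor_id": [1, 1, 2],
    "timestamp": pd.to_datetime(
        ["2021-01-01 09:00", "2021-01-01 10:00", "2021-01-01 09:00"]),
    "no2_ppb": [21.5, 24.1, 18.3],
})
weather = pd.DataFrame({
    "timestamp": pd.to_datetime(["2021-01-01 09:00", "2021-01-01 10:00"]),
    "temp_c": [4.2, 5.0],
    "wind_ms": [3.1, 2.8],
})
traffic = pd.DataFrame({
    "sensor_id": [1, 2],
    "timestamp": pd.to_datetime(["2021-01-01 09:00", "2021-01-01 09:00"]),
    "vehicle_count": [340, 120],
})

# Weather is shared across sensors (join on time only);
# traffic is location-specific (join on sensor and time).
merged = (no2
          .merge(weather, on="timestamp", how="left")
          .merge(traffic, on=["sensor_id", "timestamp"], how="left"))
```

Left joins keep every pollution reading even when a weather or traffic record is missing for that hour, which would then surface as a gap to handle in pre-processing.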
Findings
The results show that the hybrid BA-GS-LSSVM outperforms all other standalone machine learning predictive models for NO2 pollution.
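The base learner in the hybrid above is a least squares support vector machine (LSSVM). As a generic textbook sketch of that component only, an LS-SVM regressor replaces the standard SVM quadratic program with a single linear solve; the bat-algorithm (BA) and grid-search (GS) hyperparameter tuning stages of the authors' model are omitted here:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Pairwise squared Euclidean distances -> Gaussian (RBF) kernel matrix
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Standard LS-SVM dual: solve
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    # y(x) = sum_i alpha_i * K(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

In the paper's hybrid, the regularization (`gamma`) and kernel width (`sigma`) would be tuned by the BA and GS stages rather than fixed by hand as here.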
Practical implications
This paper's hybrid model provides a basis for informed decision-making in NO2 pollutant avoidance systems.
Originality/value
This research installed 14 IoT emission sensors and used their data to develop machine learning predictive models for NO2 pollution concentration.