Search results

1 – 10 of 100
Open Access
Article
Publication date: 8 December 2022

James Christopher Westland

Abstract

Purpose

This paper tests whether Bayesian A/B testing yields better decisions than traditional Neyman-Pearson hypothesis testing. It proposes a model and tests it using a large, multiyear Google Analytics (GA) dataset.

Design/methodology/approach

This paper is an empirical study. Competing A/B testing models were used to analyze a large, multiyear GA dataset for a firm that relies entirely on its website and online transactions for customer engagement and sales.
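
For intuition, a minimal sketch of the two approaches being compared is given below, run on hypothetical conversion counts rather than the paper's GA data: the Bayesian side uses Beta-Binomial posteriors and Monte Carlo draws to report the probability that variant B beats variant A together with a credible interval for the lift, while the frequentist (Neyman-Pearson) side reports a two-proportion z-test p-value.

```python
# Minimal sketch: Bayesian vs frequentist A/B comparison on hypothetical
# conversion counts (not the paper's Google Analytics data or model).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical observations: (conversions, trials) for variants A and B.
conv_a, n_a = 120, 2400
conv_b, n_b = 156, 2500

# Bayesian A/B test: Beta(1, 1) priors give Beta-Binomial posteriors;
# Monte Carlo draws estimate P(rate_B > rate_A) and the lift distribution.
post_a = rng.beta(1 + conv_a, 1 + n_a - conv_a, size=100_000)
post_b = rng.beta(1 + conv_b, 1 + n_b - conv_b, size=100_000)
lift = post_b - post_a
print("P(B > A) =", (lift > 0).mean())
print("95% credible interval for lift:", np.percentile(lift, [2.5, 97.5]))

# Frequentist (Neyman-Pearson) two-proportion z-test for comparison.
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = np.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (conv_b / n_b - conv_a / n_a) / se
print("two-sided p-value =", 2 * stats.norm.sf(abs(z)))
```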

Findings

Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the intellectual property fraud but also calculated the loss of sales dollars, traffic and time on the firm's website, with precise confidence limits. Frequentist A/B testing identified fraud in bounce rate at 5% significance and in bounces at 10% significance, but could not establish fraud at the standard significance cutoffs used in scientific studies.

Research limitations/implications

None within the scope of the research plan.

Practical implications

Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the IP fraud but also calculated the loss of sales dollars, traffic and time on the firm's website, with precise confidence limits.

Social implications

Bayesian A/B testing can derive economically meaningful statistics, whereas frequentist A/B testing only provides p-values whose meaning may be hard to grasp and whose misuse is widespread and has been a major topic in metascience. While misuse of p-values in scholarly articles may simply be grist for academic debate, the uncertainty surrounding the meaning of p-values in business analytics can actually cost firms money.

Originality/value

There is very little empirical research in e-commerce that uses Bayesian A/B testing. Almost all corporate testing is done via frequentist Neyman-Pearson methods.

Details

Journal of Electronic Business & Digital Economics, vol. 1 no. 1/2
Type: Research Article
ISSN: 2754-4214

Open Access
Article
Publication date: 21 March 2024

Warisa Thangjai and Sa-Aat Niwitpong

Abstract

Purpose

Confidence intervals play a crucial role in economics and finance, providing a credible range of values for an unknown parameter along with a corresponding level of certainty. Their applications encompass economic forecasting, market research, financial forecasting, econometric analysis, policy analysis, financial reporting, investment decision-making, credit risk assessment and consumer confidence surveys. Signal-to-noise ratio (SNR) finds applications in economics and finance across various domains such as economic forecasting, financial modeling, market analysis and risk assessment. A high SNR indicates a robust and dependable signal, simplifying the process of making well-informed decisions. On the other hand, a low SNR indicates a weak signal that could be obscured by noise, so decision-making procedures need to take this into serious consideration. This research focuses on the development of confidence intervals for functions derived from the SNR and explores their application in the fields of economics and finance.

Design/methodology/approach

The construction of the confidence intervals involved the application of various methodologies. For the SNR, confidence intervals were formed using the generalized confidence interval (GCI), large sample and Bayesian approaches. The difference between SNRs was estimated through the GCI, large sample, method of variance estimates recovery (MOVER), parametric bootstrap and Bayesian approaches. Additionally, confidence intervals for the common SNR were constructed using the GCI, adjusted MOVER, computational and Bayesian approaches. The performance of these confidence intervals was assessed using coverage probability and average length, evaluated through Monte Carlo simulation.
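
For readers who want to see how coverage probability and average length are typically evaluated, the sketch below runs a small Monte Carlo study for a large-sample (delta-method) confidence interval of the SNR of normal data. The parameter values are arbitrary, and the GCI, MOVER, parametric bootstrap and Bayesian intervals compared in the paper are not reproduced.

```python
# Monte Carlo check of a large-sample confidence interval for the SNR
# (mean/sd) of normal data; illustrative values only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma, n, reps, level = 5.0, 2.0, 50, 5000, 0.95
true_snr = mu / sigma
z = stats.norm.ppf(0.5 + level / 2)

covered, lengths = 0, []
for _ in range(reps):
    x = rng.normal(mu, sigma, size=n)
    snr_hat = x.mean() / x.std(ddof=1)
    # Delta-method variance of mean/sd for normal data: (1 + snr^2 / 2) / n.
    se = np.sqrt((1 + snr_hat**2 / 2) / n)
    lo, hi = snr_hat - z * se, snr_hat + z * se
    covered += lo <= true_snr <= hi
    lengths.append(hi - lo)

print("coverage probability:", covered / reps)   # should be close to 0.95
print("average length:", np.mean(lengths))
```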

Findings

The GCI approach demonstrated superior performance over other approaches in terms of both coverage probability and average length for the SNR and the difference between SNRs. Hence, employing the GCI approach is advised for constructing confidence intervals for these parameters. As for the common SNR, the Bayesian approach exhibited the shortest average length. Consequently, the Bayesian approach is recommended for constructing confidence intervals for the common SNR.

Originality/value

This research presents confidence intervals for functions of the SNR to assess SNR estimation in the fields of economics and finance.

Details

Asian Journal of Economics and Banking, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2615-9821

Open Access
Article
Publication date: 29 September 2023

Gabriel Caldas Montes and Raime Rolando Rodríguez Díaz

Abstract

Purpose

Business confidence is crucial to firm decisions, but it is deeply related to professional forecasters' expectations. Since Brazil is an important inflation targeting country, this paper investigates whether monetary policy credibility and disagreements in inflation and interest rate expectations relate to business confidence in Brazil. The study considers the aggregate business confidence index and the business confidence indexes for 11 industrial sectors in Brazil.

Design/methodology/approach

The authors run ordinary least squares and generalized method of moments regressions to assess the direct effects of disagreements in expectations and monetary policy credibility on business confidence. The authors also use the Wald test of parameter equality to examine whether there are “offsetting effects” of monetary credibility in mitigating the effects of both disagreements in expectations on business confidence. In addition, the authors run quantile regressions to analyze the effect of the main explanatory variables of interest on business confidence in contexts where business confidence is low (pessimistic) or high (optimistic).
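
A hedged sketch of this regression toolkit in Python's statsmodels is shown below, using synthetic placeholder variables (credibility, disag_inflation, disag_interest, business_confidence) rather than the authors' Brazilian series: an OLS baseline with robust standard errors, a Wald test of equality between the two disagreement coefficients and quantile regressions at low, median and high quantiles of business confidence.

```python
# Sketch of OLS, a Wald test of parameter equality and quantile regression
# on synthetic placeholder data (not the authors' dataset).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "credibility": rng.uniform(0, 1, n),
    "disag_inflation": rng.normal(0, 1, n),
    "disag_interest": rng.normal(0, 1, n),
})
df["business_confidence"] = (
    100 + 8 * df["credibility"]
    - 3 * df["disag_inflation"] - 1 * df["disag_interest"]
    + rng.normal(0, 2, n)
)

formula = "business_confidence ~ credibility + disag_inflation + disag_interest"

# OLS baseline with heteroskedasticity-robust standard errors.
ols_res = smf.ols(formula, data=df).fit(cov_type="HC1")
print(ols_res.params)

# Wald test of parameter equality between the two disagreement coefficients.
print(ols_res.wald_test("disag_inflation = disag_interest"))

# Quantile regressions: effects when confidence is low, median or high.
for q in (0.25, 0.5, 0.75):
    qr_res = smf.quantreg(formula, data=df).fit(q=q)
    print(q, qr_res.params.round(3))
```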

Findings

Disagreements in inflation expectations reduce business confidence, monetary policy credibility improves business confidence and credibility mitigates the adverse effects of disagreements in expectations on business confidence. The sectors most sensitive to monetary policy credibility are Rubber, Motor Vehicles, Metallurgy, Metal Products and Cellulose. The findings also suggest the effect of disagreement in inflation expectations on business confidence decreases as confidence increases, and the effect of monetary policy credibility on business confidence increases as entrepreneurs are more optimistic.

Originality/value

While there is evidence that monetary policy credibility is beneficial to the economy, there are no studies on the effects of disagreements in inflation and interest rate expectations on business confidence (at the aggregate and sectoral levels). Nor are there studies investigating whether monetary policy credibility can mitigate the effects of these disagreements on business confidence (at the aggregate and sectoral levels). There are therefore gaps to be filled in the literature addressing business confidence, monetary policy credibility and disagreements in expectations. These issues are particularly important to inflation-targeting developing countries.

Details

Journal of Money and Business, vol. 3 no. 2
Type: Research Article
ISSN: 2634-2596

Open Access
Article
Publication date: 22 August 2023

André M. Marques

Abstract

Purpose

This paper aims to test three hypotheses in the city growth literature, documenting the poverty reduction observed in Brazil and exploring a rich spatial dataset for 5,564 Brazilian cities observed between 1991 and 2010. The large sample and the author's improved econometric methods allow one to better understand and measure how important income growth is for poverty reduction, as well as the patterns of agglomeration and population growth across all Brazilian cities.

Design/methodology/approach

The author identifies literature gaps and uses a sizeable spatial dataset for 5,564 Brazilian cities observed in 1991, 2000 and 2010, applying instrumental variables methods. The bias-corrected and accelerated bootstrap percentile interval supports the author's point estimates.
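
The sketch below illustrates a bias-corrected and accelerated (BCa) bootstrap percentile interval of the kind used to support the point estimates, applied to a simple elasticity (a log-log slope) on synthetic data; the author's spatial instrumental-variables specification is not reproduced here.

```python
# BCa bootstrap percentile interval for an elasticity estimate on synthetic
# data (illustrative only, not the author's IV specification).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 500
growth = rng.normal(0.02, 0.05, n)                  # synthetic income growth
poverty = -4.0 * growth + rng.normal(0, 0.05, n)    # synthetic poverty change

data = np.column_stack([growth, poverty])

def elasticity(indices):
    # Resample city observations by row index so the growth/poverty pairs
    # stay together, then return the OLS slope (the elasticity estimate).
    sample = data[np.asarray(indices, dtype=int)]
    return np.polyfit(sample[:, 0], sample[:, 1], 1)[0]

res = stats.bootstrap(
    (np.arange(n),),          # resample row indices (paired bootstrap)
    elasticity,
    vectorized=False,
    method="BCa",             # bias-corrected and accelerated interval
    confidence_level=0.95,
    n_resamples=2000,
    random_state=rng,
)
print("point estimate:", elasticity(np.arange(n)))
print("BCa 95% interval:", res.confidence_interval)
```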

Findings

The manuscript finds, first, that Brazilian city data do not support Gibrat's law, raising the scope for urban planning and associated policies. Second, economic growth on a sustainable basis is still a vital source of poverty reduction (the author estimates the poverty elasticity at four percentage points). Lastly, agglomeration effects positively affect city productivity, while negative externalities underlie city development patterns.

Originality/value

Data for cities in Brazil possess unique characteristics such as spatial autocorrelation and endogeneity. Applying proper methods to find more reliable answers to the above three questions is a desirable procedure that must be encouraged. As the author points out in the manuscript, dealing with endogenous regressors in regional economics is still a developing matter that regional scientists could more generally apply to many regional issues.

Details

EconomiA, vol. 24 no. 2
Type: Research Article
ISSN: 1517-7580

Open Access
Article
Publication date: 17 February 2022

Kingstone Nyakurukwa and Yudhvir Seetharam

Abstract

Purpose

The authors examine the contemporaneous and causal association between tweet features (bullishness, message volume and investor agreement) and market features (stock returns, trading volume and volatility) using 140 South African companies and a dataset of firm-level Twitter messages extracted from Bloomberg for the period 1 January 2015 to 31 March 2020.

Design/methodology/approach

Panel regressions with ticker fixed-effects are used to examine the contemporaneous link between tweet features and market features. To examine the link between the magnitude of tweet features and stock market features, the study uses quantile regression.
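
A minimal sketch of a ticker fixed-effects panel regression of this kind is given below, using statsmodels with firm dummies and ticker-clustered standard errors on synthetic data; the variable names (bullishness, msg_volume, agreement, stock_return) are placeholders, not the Bloomberg extraction.

```python
# Ticker fixed-effects regression of a market feature on tweet features,
# on synthetic data with placeholder column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for ticker in [f"T{i:02d}" for i in range(20)]:
    firm_effect = rng.normal(0, 0.5)
    for _ in range(250):
        bullish = rng.normal(0, 1)
        volume = rng.poisson(30)
        agreement = rng.uniform(0, 1)
        ret = 0.3 * bullish + 0.01 * volume + firm_effect + rng.normal(0, 1)
        rows.append((ticker, bullish, volume, agreement, ret))
df = pd.DataFrame(rows, columns=["ticker", "bullishness", "msg_volume",
                                 "agreement", "stock_return"])

# C(ticker) adds one dummy per firm (ticker fixed effects); standard errors
# are clustered by ticker.
fe_res = smf.ols(
    "stock_return ~ bullishness + msg_volume + agreement + C(ticker)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["ticker"]})
print(fe_res.params[["bullishness", "msg_volume", "agreement"]])
```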

Findings

No monotonic relationship is found between the magnitude of tweet features and the magnitude of market features. The authors find no evidence that past values of tweet features can predict forthcoming stock returns using daily data, while weekly and monthly data show that past values of tweet features contain useful information for predicting future stock returns.

Originality/value

The study is among the earliest to examine the association between textual sentiment from social media and market features in a South African context. The exploration of the relationship across the distribution of the stock market features gives new insights beyond the traditional approaches, which investigate the relationship at the mean.

Details

Managerial Finance, vol. 48 no. 4
Type: Research Article
ISSN: 0307-4358

Open Access
Article
Publication date: 21 January 2022

Yizhi Wang, Brian Lucey, Samuel Alexandre Vigne and Larisa Yarovaya

Abstract

Purpose

(1) A concern often expressed in relation to cryptocurrencies is the environmental impact associated with increasing energy consumption and mining pollution. Controversy remains regarding how environmental attention and public concerns adversely affect cryptocurrency prices. Therefore, the paper introduces the index of cryptocurrency environmental attention (ICEA), which aims to capture the relative extent of media discussions surrounding the environmental impact of cryptocurrencies. (2) The impacts of cryptocurrency environmental attention on long-term macro-financial markets and economic development remain an underdeveloped research field. On this basis, the paper further examines the effects of the ICEA on financial markets and economic developments.

Design/methodology/approach

(1) The paper introduces a new index to capture cryptocurrency environmental attention in terms of the cryptocurrency response to major related events, gathering a large number of news stories around cryptocurrency environmental concerns – i.e. >778.2 million news items from the LexisNexis News & Business database, which can be considered Big Data – and analysing that rich dataset using a variety of quantitative techniques. (2) The vector error correction model (VECM) and structural VECM (SVECM) [impulse response function (IRF), forecast error variance decomposition (FEVD) and historical decomposition (HD)] are useful for characterising the dynamic relationships between ICEA and aggregate economic activities.
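
For readers unfamiliar with the VECM step, the sketch below fits one in Python's statsmodels to a synthetic three-variable system standing in for ICEA, Bitcoin and the VIX: the cointegration rank is chosen with a Johansen-type test, and the adjustment coefficients (alpha) and cointegrating vectors (beta) are printed. The SVECM identification and the IRF, FEVD and HD analyses from the paper are not reproduced.

```python
# VECM on a synthetic three-variable system (stand-ins for ICEA, Bitcoin and
# the VIX); illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(5)
T = 400
trend = np.cumsum(rng.normal(0, 1, T))          # shared stochastic trend
data = pd.DataFrame({
    "ICEA": trend + rng.normal(0, 1, T),
    "Bitcoin": 0.8 * trend + rng.normal(0, 1, T),
    "VIX": np.cumsum(rng.normal(0, 0.5, T)),    # independent random walk
})

# Johansen-type trace test to choose the cointegration rank.
rank_test = select_coint_rank(data, det_order=0, k_ar_diff=2)
print(rank_test.summary())

# Fit the VECM with the selected rank; alpha holds adjustment speeds and
# beta the cointegrating vectors.
res = VECM(data, k_ar_diff=2, coint_rank=rank_test.rank,
           deterministic="co").fit()
print("alpha:\n", res.alpha)
print("beta:\n", res.beta)
print("5-step forecast:\n", res.predict(steps=5))
```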

Findings

(1) The paper has developed a new measure of attention to sustainability concerns of cryptocurrency markets' growth, ICEA. (2) ICEA has a significantly positive relationship with the UCRY indices, volatility index (VIX), Brent crude oil (BCO) and Bitcoin. (3) ICEA has a significantly negative relationship with the global economic policy uncertainty (GlobalEPU) and global temperature uncertainty (GTU). Moreover, ICEA has a significantly positive relationship with the industrial production (IP) in the short term, whilst having a significantly negative relationship in the long term. (4) The HD of the ICEA displays higher linkages between environmental attention, Bitcoin and UCRY indices around key events that significantly change the prices of digital assets.

Research limitations/implications

The ICEA is significant in analysing whether cryptocurrency markets are sustainable in terms of their energy consumption requirements and negative contributions to climate change. A broader understanding of the impacts of cryptocurrency environmental concerns on cryptocurrency market volatility, uncertainty and environmental sustainability should be developed. Moreover, the paper points out future research and policy legislation directions. Notably, it poses the question of how cryptocurrency can be made more sustainable and environmentally friendly and how governments' cryptocurrency policies can address cryptocurrency markets.

Practical implications

(1) The paper develops a cryptocurrency environmental attention index based on news coverage that captures the extent to which environmental sustainability concerns are discussed in conjunction with cryptocurrencies. (2) The paper empirically investigates the impacts of cryptocurrency environmental attention on other financial or economic variables [cryptocurrency uncertainty (UCRY) indices, Bitcoin, VIX, GlobalEPU, BCO, GTU index and the Organisation for Economic Co-operation and Development IP index]. (3) The paper provides insights into making the most effective use of online databases in the development of new indices for financial research.

Social implications

Whilst blockchain technology has a number of useful implications and has great potential to transform several industries, issues of high-energy consumption and CO2 pollution regarding cryptocurrency have become some of the main areas of criticism, raising questions about the sustainability of cryptocurrencies. These results are essential for both policy-makers and for academics, since the results highlight an urgent need for research addressing the key issues, such as the growth of carbon produced in the creation of this new digital currency. The results also are important for investors concerned with the ethical implications and environmental impacts of their investment choices.

Originality/value

(1) The paper provides an efficient new proxy for cryptocurrency and robust empirical evidence for future research concerning the impact of environmental issues on cryptocurrency markets. (2) The study successfully links cryptocurrency environmental attention to the financial markets, economic developments and other volatility and uncertainty measures, which has certain novel implications for the cryptocurrency literature. (3) The empirical findings of the paper offer useful and up-to-date insights for investors, guiding policy-makers, regulators and media, enabling the ICEA to evolve into a barometer in the cryptocurrency era and play a role in, for example, environmental policy development and investment portfolio optimisation.

Details

China Finance Review International, vol. 12 no. 3
Type: Research Article
ISSN: 2044-1398

Open Access
Article
Publication date: 8 August 2023

Elisa Verna, Gianfranco Genta and Maurizio Galetto

Abstract

Purpose

The purpose of this paper is to investigate and quantify the impact of product complexity, including architectural complexity, on operator learning, productivity and quality performance in both assembly and disassembly operations. This topic has not been extensively investigated in previous research.

Design/methodology/approach

An extensive experimental campaign involving 84 operators was conducted to repeatedly assemble and disassemble six different products of varying complexity to construct productivity and quality learning curves. Data from the experiment were analysed using statistical methods.
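
As an illustration of how a learning factor can be estimated from such repetitions, the sketch below fits a Wright-type learning curve, T_n = T_1 * n^b, to synthetic assembly cycle times by ordinary least squares in logs; the experimental data and the paper's specific learning-curve model are not reproduced.

```python
# Fit a Wright-type learning curve T_n = T_1 * n**b to synthetic cycle times.
import numpy as np

rng = np.random.default_rng(6)
repetitions = np.arange(1, 21)
true_b = -0.25                              # learning exponent (illustrative)
cycle_time = (120 * repetitions**true_b
              * np.exp(rng.normal(0, 0.05, repetitions.size)))

# log T_n = log T_1 + b * log n, so an OLS fit in logs recovers T_1 and b.
b_hat, log_t1_hat = np.polyfit(np.log(repetitions), np.log(cycle_time), 1)
print("estimated first-cycle time:", np.exp(log_t1_hat))
print("estimated learning exponent b:", b_hat)
print("learning rate (time ratio when output doubles):", 2**b_hat)
```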

Findings

The human learning factor of productivity increases superlinearly with the increasing architectural complexity of products, i.e. from centralised to distributed architectures, both in assembly and disassembly, regardless of the level of overall product complexity. On the other hand, the human learning factor of quality performance decreases superlinearly as the architectural complexity of products increases. The intrinsic characteristics of product architecture are the reasons for this difference in learning factor.

Practical implications

The results of the study suggest that considering product complexity, particularly architectural complexity, in the design and planning of manufacturing processes can optimise operator learning, productivity and quality performance, and inform decisions about improving manufacturing operations.

Originality/value

While previous research has focussed on the effects of complexity on process time and defect generation, this study is amongst the first to investigate and quantify the effects of product complexity, including architectural complexity, on operator learning using an extensive experimental campaign.

Details

Journal of Manufacturing Technology Management, vol. 34 no. 9
Type: Research Article
ISSN: 1741-038X

Open Access
Article
Publication date: 27 July 2022

Ruilin Yu, Yuxin Zhang, Luyao Wang and Xinyi Du

Abstract

Purpose

Time headway (THW) is an essential parameter in traffic safety and is used as a typical control variable by many vehicle control algorithms, especially in safety-critical ADAS and automated driving systems. However, due to the randomness of human drivers, THW cannot be represented accurately, which hinders deeper research by scholars.

Design/methodology/approach

In this work, two datasets are used as the experimental data to calculate the goodness-of-fit of 18 commonly used distribution models of THW and to select the best distribution model. Subsequently, the characteristic parameters of traffic flow are extracted from the datasets, and the three variables with the highest importance are identified using a random forest model. Combining these with the parameters of the best distribution model, this study obtains a distribution model with adaptive parameters, and its performance and applicability are verified.
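
The sketch below illustrates these two steps on synthetic data: scoring a few candidate THW distributions by a Kolmogorov-Smirnov goodness-of-fit statistic, then ranking traffic-flow variables by random-forest importance. The candidate list, variable names and data are placeholders, not the paper's 18 models or its datasets.

```python
# Goodness-of-fit ranking of candidate THW distributions and random-forest
# variable importance, on synthetic placeholder data.
import numpy as np
from scipy import stats
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
thw = rng.lognormal(mean=0.6, sigma=0.4, size=3000)   # synthetic headways (s)

# Fit each candidate by maximum likelihood and compare Kolmogorov-Smirnov
# statistics (smaller is better).
candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma,
              "weibull_min": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(thw)
    ks = stats.kstest(thw, dist.cdf, args=params).statistic
    print(name, "KS =", round(ks, 4))

# Rank traffic-flow variables by importance for a headway summary statistic.
X = np.column_stack([
    rng.uniform(10, 90, 500),     # flow rate (veh/min), placeholder
    rng.uniform(20, 120, 500),    # mean speed (km/h), placeholder
    rng.uniform(0, 1, 500),       # occupancy, placeholder
])
y = 3.0 - 0.02 * X[:, 0] + 0.005 * X[:, 1] + rng.normal(0, 0.1, 500)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print("importances (flow, speed, occupancy):", rf.feature_importances_)
```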

Originality/value

The results show that the proposed model has a 62.7% performance improvement over the distribution model with fixed parameters. Moreover, the parameter function of the distribution model can be regarded as a quantitative analysis of the degree of influence of the traffic flow state on THW.

Details

Journal of Intelligent and Connected Vehicles, vol. 5 no. 3
Type: Research Article
ISSN: 2399-9802

Open Access
Article
Publication date: 26 April 2024

Xue Xin, Yuepeng Jiao, Yunfeng Zhang, Ming Liang and Zhanyong Yao

Abstract

Purpose

This study aims to ensure reliable analysis of dynamic responses in asphalt pavement structures. It investigates noise reduction and data mining techniques for pavement dynamic response signals.

Design/methodology/approach

The paper first conducts time-frequency analysis on pavement dynamic response signals. It then uses two common noise reduction methods, namely low-pass filtering and wavelet decomposition and reconstruction, to evaluate their effectiveness in reducing noise in these signals. Furthermore, as these signals are generated in response to vehicle loading, they contain a substantial amount of data and are prone to environmental interference, potentially resulting in outliers. Hence, it becomes crucial to extract dynamic strain response features (e.g. peaks and peak intervals) efficiently and in real time.

Findings

The study introduces an improved density-based spatial clustering of applications with noise (DBSCAN) algorithm for identifying outliers in denoised data. The results demonstrate that low-pass filtering is highly effective in reducing noise in pavement dynamic response signals within specified frequency ranges. In testing, the improved DBSCAN algorithm effectively identifies outliers in these signals. Furthermore, the peak detection process, using the enhanced findpeaks function, consistently achieves excellent performance in identifying peak values, even when complex multi-axle heavy-duty truck strain signals are present.
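
A hedged sketch of this processing chain on a synthetic strain trace is shown below: a zero-phase Butterworth low-pass filter, DBSCAN-based outlier flagging and peak extraction with scipy's find_peaks. The cut-off frequency, DBSCAN parameters and signal are illustrative, and the paper's improved DBSCAN and enhanced findpeaks variants are not reproduced.

```python
# Low-pass filtering, DBSCAN outlier flagging and peak detection on a
# synthetic strain trace (illustrative parameters only).
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks
from sklearn.cluster import DBSCAN

fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(8)
# Synthetic strain response: axle-like pulses plus high-frequency noise.
strain = sum(np.exp(-((t - c) ** 2) / 0.002) for c in (1.0, 1.2, 3.0, 3.2))
strain = 50 * strain + rng.normal(0, 2, t.size)

# Zero-phase Butterworth low-pass filter with an assumed 30 Hz cut-off.
b, a = butter(N=4, Wn=30, btype="low", fs=fs)
denoised = filtfilt(b, a, strain)

# Flag outliers: points in sparse regions of (time, amplitude) space receive
# the DBSCAN label -1.
features = np.column_stack([t / t.max(), denoised / np.abs(denoised).max()])
labels = DBSCAN(eps=0.05, min_samples=10).fit_predict(features)
print("flagged outliers:", int((labels == -1).sum()))

# Peak extraction on the denoised trace.
peaks, props = find_peaks(denoised, height=20, distance=int(0.1 * fs))
print("peak times (s):", t[peaks])
print("peak heights:", props["peak_heights"])
```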

Originality/value

The authors identified a suitable frequency-domain range for low-pass filtering of asphalt road dynamic response signals, revealing minimal amplitude loss and effective reflection of strain information between road layers. Furthermore, the authors introduced the DBSCAN-based anomaly detection method and enhancements to the MATLAB findpeaks function, enabling the detection of anomalies in road sensor data and automated peak identification.

Details

Smart and Resilient Transportation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2632-0487

Open Access
Article
Publication date: 9 June 2021

Jin Gi Kim, Hyun-Tak Lee and Bong-Gyu Jang

Abstract

Purpose

This paper examines whether the successful bid rate of the OnBid public auction, published by Korea Asset Management Corporation, can identify and forecast the Korean business-cycle expansion and contraction regimes characterized by the OECD reference turning points. We use logistic regression and a support vector machine to perform the OECD regime classification and to predict the regime three months ahead. We find that the OnBid auction rate conveys important information for detecting the coincident and future regimes, because this information might be closely related to deleveraging regarding default on debt obligations. This finding suggests that corporate managers and investors could use the auction information to gauge the regime position in their decision-making. This research has academic significance in that it reveals the relationship between the auction market and business-cycle regimes.
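
The sketch below illustrates this classification set-up on synthetic data: a binary regime label is shifted by three months to form the prediction target, and logistic regression and a support vector machine are fitted on a chronological train/test split. The auction-rate series, regime label and split are placeholders, not the OnBid or OECD data.

```python
# Regime classification and three-month-ahead prediction with logistic
# regression and an SVM, on synthetic placeholder series.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(9)
n = 240                                        # monthly observations
auction_rate = np.clip(
    rng.normal(0.6, 0.1, n) + 0.1 * np.sin(np.arange(n) / 12), 0, 1)
regime = (auction_rate > 0.62).astype(int)     # synthetic expansion indicator

df = pd.DataFrame({"auction_rate": auction_rate, "regime": regime})
df["regime_ahead"] = df["regime"].shift(-3)    # three-month-ahead target
df = df.dropna()

X = df[["auction_rate"]].to_numpy()
y = df["regime_ahead"].astype(int).to_numpy()
split = int(0.8 * len(df))                     # simple chronological split

for name, model in [("logit", LogisticRegression()),
                    ("svm", SVC(kernel="rbf"))]:
    model.fit(X[:split], y[:split])
    acc = accuracy_score(y[split:], model.predict(X[split:]))
    print(name, "out-of-sample accuracy:", round(acc, 3))
```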

Details

Journal of Derivatives and Quantitative Studies: 선물연구, vol. 29 no. 2
Type: Research Article
ISSN: 1229-988X
