Search results
1 – 10 of 58
Michael Fuchs, Guillaume Bodet and Gregor Hovemann
Abstract
Purpose
While consumer preferences for sporting goods have been widely researched within sport management, literature is lacking on aspects of social and environmental sustainability. Accordingly, this study aims to investigate the role of social and environmental sustainability for purchase decisions of sportswear and compares them to the role of price and functionality.
Design/methodology/approach
Based on a conjoint analysis among 1,012 Europeans, the authors conducted a two-step cluster analysis. First, the authors investigated the number of segments via Ward’s method. Second, the authors ran a k-means analysis based on part-worth utilities from the conjoint analysis.
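The second step of the procedure (k-means on the part-worth utilities, with the number of segments taken from the preceding Ward's-method step) can be sketched as follows. The data, the four attribute dimensions and the cluster centers are synthetic stand-ins for illustration only, not the study's inputs; k = 4 is assumed to be the segment count suggested by Ward's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative part-worth utilities on four attributes
# (price, functionality, social, environmental) -- synthetic, not the study's data.
centers = np.array([[0.0, 0.0, 0.0, 0.0],    # "undecided"
                    [-1.0, 0.0, 2.0, 2.0],   # "sustainable"
                    [3.0, 0.0, -1.0, -1.0],  # "price-focused"
                    [0.0, 3.0, -1.0, -1.0]]) # "function-oriented"
X = np.vstack([c + 0.3 * rng.standard_normal((50, 4)) for c in centers])

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's k-means; centroids seeded from random observations."""
    rng = np.random.default_rng(seed)
    cent = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None, :] - cent[None, :, :]) ** 2).sum(-1).argmin(1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j) else cent[j]
                        for j in range(k)])
        if np.allclose(new, cent):
            break
        cent = new
    return labels, cent

# k = 4 assumed to come from the preceding Ward's-method step.
labels, segment_means = kmeans(X, k=4)
```

Each row of `segment_means` is then interpreted as a segment's preference profile, which is how cluster descriptions such as "price-focused" arise.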
Findings
The authors identified four segments which differ in terms of preferred product attributes, willingness to pay, and sociodemographic, behavioral, and psychographic characteristics: undecided, sustainable, price-focused and function-oriented consumers. Based on this segmentation, the authors found that the importance of social and environmental sustainability is growing, but not among all consumers.
Research limitations/implications
The generalizability of the study is limited since it is not built on a sample representative for the included European countries, it focuses on a single product, and participants are potentially subject to a social desirability bias.
Originality/value
The consumer analysis comprises the uptake of attributes related to social and environmental sustainability. The authors thereby address a literature gap as previous research (thematizing sporting goods) in the sport management field has often neglected sustainability elements despite their rapidly growing importance within the sport sector.
Details
Keywords
Mohd Irfan and Anup Kumar Sharma
Abstract
Purpose
A progressive hybrid censoring scheme (PHCS) becomes impractical for ensuring dependable outcomes when there is a low likelihood of encountering a small number of failures prior to the predetermined terminal time T. The generalized progressive hybrid censoring scheme (GPHCS) efficiently overcomes this limitation of the PHCS.
Design/methodology/approach
In this article, estimation of the model parameter, survival function and hazard rate of the Unit-Lindley distribution (ULD) is considered when the sample comes from the GPHCS. The maximum likelihood estimator has been derived using the Newton–Raphson iterative procedure. Approximate confidence intervals for the model parameter and its arbitrary functions are established via the Fisher information matrix. Bayesian estimation procedures have been derived using the Metropolis–Hastings algorithm under the squared error loss function. Convergence of the Markov chain Monte Carlo (MCMC) samples has been examined, and various optimality criteria have been considered. An extensive Monte Carlo simulation analysis compares and validates the proposed estimation techniques.
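A minimal sketch of the Bayesian step: a random-walk Metropolis–Hastings sampler for the ULD parameter θ, with the posterior mean serving as the Bayes estimate under squared error loss. The ULD density assumed here is f(x; θ) = θ²/(1+θ) · (1−x)⁻³ · exp(−θx/(1−x)) on (0, 1); the flat prior on log θ, the tuning constants and the use of a complete (uncensored) sample are all simplifying assumptions — the article's GPHCS censored likelihood is not reproduced.

```python
import numpy as np

def uld_loglik(theta, x):
    # Unit-Lindley log-likelihood for a complete (uncensored) sample --
    # the GPHCS censored likelihood of the article is not reproduced here.
    n = len(x)
    return (n * (2 * np.log(theta) - np.log1p(theta))
            - 3 * np.log1p(-x).sum() - theta * (x / (1 - x)).sum())

def mh_posterior_mean(x, n_iter=5000, burn=1000, step=0.2, seed=1):
    """Random-walk Metropolis-Hastings on log(theta), flat prior on log(theta)."""
    rng = np.random.default_rng(seed)
    log_t, chain = 0.0, []
    ll = uld_loglik(np.exp(log_t), x)
    for _ in range(n_iter):
        prop = log_t + step * rng.standard_normal()
        ll_prop = uld_loglik(np.exp(prop), x)
        if np.log(rng.random()) < ll_prop - ll:   # Metropolis acceptance
            log_t, ll = prop, ll_prop
        chain.append(np.exp(log_t))
    return np.mean(chain[burn:])   # Bayes estimate under squared error loss

# Simulate ULD data: if Y ~ Lindley(theta), then X = Y / (1 + Y) ~ ULD(theta).
rng = np.random.default_rng(7)
theta_true, n = 2.0, 300
is_exp = rng.random(n) < theta_true / (1 + theta_true)   # Lindley mixture weights
y = np.where(is_exp, rng.exponential(1 / theta_true, n),
             rng.gamma(2.0, 1 / theta_true, n))
x = y / (1 + y)
theta_hat = mh_posterior_mean(x)
```

Convergence diagnostics on the chain (trace plots, effective sample size), as mentioned in the abstract, would be applied to `chain` before trusting `theta_hat`.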
Findings
The Bayesian MCMC approach is recommended for estimating the model parameters and reliability characteristics of generalized progressive hybrid censored data from the ULD. The authors anticipate that health data analysts and reliability professionals will benefit from the findings and approaches presented in this study.
Originality/value
The ULD has a broad range of practical utility, which makes estimating its model parameter and reliability characteristics an important problem. The significance of the GPHCS also encouraged the authors to consider the present estimation problem, as it has not previously been discussed in the literature.
James L. Sullivan, David Novak, Eric Hernandez and Nick Van Den Berg
Abstract
Purpose
This paper introduces a novel quality measure, the percent-within-distribution, or PWD, for acceptance and payment in a quality control/quality assurance (QC/QA) performance specification (PS).
Design/methodology/approach
The new quality measure takes any sample size or distribution and uses a Bayesian updating process to re-estimate parameters of a design distribution as sample observations are fed through the algorithm. This methodology can be employed in a wide range of applications, but the authors demonstrate the use of the measure for a QC/QA PS with upper and lower bounds on 28-day compressive strength of in-place concrete for bridge decks.
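The sequential re-estimation idea can be illustrated with a conjugate normal model: a design distribution for 28-day compressive strength whose mean is updated one sample observation at a time. The prior values, the known-variance assumption and the psi figures are illustrative assumptions; the article's actual updating scheme and distribution family may differ.

```python
import numpy as np

# Illustrative prior for mean 28-day compressive strength (psi) -- assumed values.
mu0, tau0_sq = 4000.0, 200.0 ** 2   # prior mean and prior variance of the mean
sigma_sq = 300.0 ** 2               # assumed known sampling variance

def update(mu, tau_sq, obs, sigma_sq=sigma_sq):
    """Conjugate normal update of the design-distribution mean after one observation."""
    post_tau_sq = 1.0 / (1.0 / tau_sq + 1.0 / sigma_sq)
    post_mu = post_tau_sq * (mu / tau_sq + obs / sigma_sq)
    return post_mu, post_tau_sq

mu, tau_sq = mu0, tau0_sq
for obs in [4350.0, 4200.0, 4100.0, 4500.0]:   # synthetic sample-lot observations
    mu, tau_sq = update(mu, tau_sq, obs)       # feed observations through, one by one
```

After each observation the posterior variance shrinks, so the re-estimated design distribution tightens as a sample lot accumulates — the property the PWD measure exploits.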
Findings
The authors demonstrate the use of this new quality measure to illustrate how it addresses the shortcomings of the percent-within-limits (PWL), which is the current industry standard quality measure. The authors then use the PWD to develop initial pay factors through simulation regimes. The PWD is shown to function better than the PWL with realistic sample lots simulated to represent a variety of industry responses to a new QC/QA PS.
Originality/value
The analytical contribution of this work is the introduction of the new quality measure. However, the practical and managerial contributions of this work are of equal significance.
Kirk Luther, Zak Keeping, Brent Snook, Hannah de Almeida, Weyam Fahmy, Alexia Smith and Tianshuang Han
Abstract
Purpose
The purpose of this study is to contribute to the literature on information elicitation. The authors investigated the impact of social influence strategies on eyewitness recall performance. Specifically, the authors examined the effect of social influence techniques (Cialdini, 2007) on recall performance (Experiment 1) and conducted a follow-up experiment to examine the incremental effect of social proof on the report everything cognitive interview mnemonic (Experiment 2).
Design/methodology/approach
Participants watched a video depicting vandalism (Experiment 1: N = 174) or a verbal altercation (Experiment 2: N = 128) and were asked to recall the witnessed event. Experiment 1: Participants were assigned randomly to one of six conditions: control (open-ended prompt), engage and explain (interview ground rules), consistency (signing an agreement to work diligently), reciprocity (given water and food), authority (told of interviewer’s training) and social proof (shown transcript from an exemplar participant). Experiment 2: The authors used a 2 (social proof: present, absent) × 2 (report everything: present, absent) between-participants design.
Findings
Across both experiments, participants exposed to the social proof tactic (i.e. shown a model exemplar) spoke longer and recalled more correct details than participants not exposed to it. In Experiment 2, participants interviewed with the report everything mnemonic also spoke longer, recalled more correct and more incorrect details, and provided slightly more confabulations than those interviewed without it.
Originality/value
The findings have practical value for police investigators and other professionals who conduct interviews (e.g. military personnel, doctors obtaining information from patients). Interviewers can incorporate social proof in their interviewing practices to help increase the amount and accuracy of information obtained.
Chuyu Tang, Hao Wang, Genliang Chen and Shaoqiu Xu
Abstract
Purpose
This paper aims to propose a robust method for non-rigid point set registration, using the Gaussian mixture model and accommodating non-rigid transformations. The posterior probabilities of the mixture model are determined through the proposed integrated feature divergence.
Design/methodology/approach
The method involves an alternating two-step framework comprising correspondence estimation and subsequent transformation updating. For correspondence estimation, integrated feature divergences, incorporating both global and local features, are coupled with deterministic annealing to address the non-convexity of the registration problem. For transformation updating, an expectation-maximization iteration scheme is introduced to iteratively refine the correspondence and transformation estimates until convergence.
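The correspondence-estimation step can be sketched as a GMM E-step: posterior probabilities that each target point corresponds to each moving point, computed from Gaussian kernels with a uniform outlier component, with deterministic annealing realized by gradually shrinking the variance. Plain Euclidean distance stands in here for the paper's integrated feature divergence, and the outlier weight `w` and annealing schedule are assumed constants.

```python
import numpy as np

def correspondence_probs(X, Y, sigma2, w=0.1):
    """Posterior probability P[n, m] that target point X[n] corresponds to
    moving point Y[m], under a GMM with a uniform outlier component."""
    N, D = X.shape
    M = Y.shape[0]
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # (N, M) squared distances
    K = np.exp(-d2 / (2.0 * sigma2))
    # Constant added to each row's normalizer by the uniform outlier component.
    outlier = (w / (1.0 - w)) * (2.0 * np.pi * sigma2) ** (D / 2.0) * M / N
    return K / (K.sum(axis=1, keepdims=True) + outlier)

# Deterministic annealing: repeat the E-step while shrinking sigma2.
rng = np.random.default_rng(3)
Y = rng.random((40, 2))
X = Y + 0.01 * rng.standard_normal((40, 2))   # slightly perturbed copy of Y
sigma2 = 1.0
for _ in range(5):
    P = correspondence_probs(X, Y, sigma2)
    # (the M-step transformation update would go here)
    sigma2 *= 0.5          # annealing schedule (assumed)
```

In the full algorithm, each E-step is followed by a transformation update (the M-step) that moves `Y` toward `X` weighted by `P`, and the two steps alternate until convergence.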
Findings
The experiments confirm that the proposed registration approach exhibits remarkable robustness to deformation, noise, outliers and occlusion for both 2D and 3D point clouds. Furthermore, the proposed method outperforms existing analogous algorithms in terms of time complexity. An application to stabilizing and securing intermodal containers loaded on ships is presented. The results demonstrate that the proposed registration framework adapts well to real-scan point clouds and achieves comparatively superior alignments in a shorter time.
Originality/value
The integrated feature divergence, involving both global and local information of points, is proven to be an effective indicator for measuring the reliability of point correspondences. This inclusion prevents premature convergence, resulting in more robust registration results for our proposed method. Simultaneously, the total operating time is reduced due to a lower number of iterations.
Bao Khac Quoc Nguyen, Nguyet Thi Bich Phan and Van Le
Abstract
Purpose
This study investigates the interactions between the US daily public debt and currency power under impacts of the Covid-19 crisis.
Design/methodology/approach
The authors employ the multivariate generalized autoregressive conditional heteroskedasticity (MGARCH) modeling to explore the interactions between daily changes in the US Debt to the Penny and the US Dollar Index. The data sets are from April 01, 1993, to May 27, 2022, in which noticeable points include the Covid-19 outbreak (January 01, 2020) and the US vaccination campaign commencement (December 14, 2020).
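A full MGARCH fit requires a specialized estimation routine, but the core idea — a conditional covariance between the two daily-change series that evolves over time — can be illustrated with an exponentially weighted (RiskMetrics-style) covariance recursion, a restricted special case. The two series below are synthetic stand-ins, not the actual Debt to the Penny or US Dollar Index data, and the decay constant is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(11)
# Synthetic stand-ins for daily Debt to the Penny and US Dollar Index levels.
debt = 1e13 * np.exp(np.cumsum(0.0002 + 0.001 * rng.standard_normal(500)))
usdx = 95.0 * np.exp(np.cumsum(0.0001 * rng.standard_normal(500)))

# Daily log changes of the two series.
r = np.column_stack([np.diff(np.log(debt)), np.diff(np.log(usdx))])

lam = 0.94                       # RiskMetrics decay constant (assumed)
H = np.cov(r.T)                  # initial conditional covariance matrix
corr = []
for t in range(len(r)):
    H = lam * H + (1 - lam) * np.outer(r[t], r[t])   # EWMA covariance update
    corr.append(H[0, 1] / np.sqrt(H[0, 0] * H[1, 1]))
corr = np.array(corr)            # time-varying conditional correlation path
```

Plotting `corr` against event dates (the Covid-19 outbreak, the vaccination campaign) is the kind of exercise the MGARCH analysis formalizes with fitted, rather than fixed, dynamics.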
Findings
The authors find that the daily change in public debt positively affects the USD index return, and that the past performance of currency power significantly mitigates the Debt to the Penny. With the Covid-19 outbreak, the impact of public debt on currency power became negative, and this effect remained unchanged after the pandemic. These findings indicate that policy-makers could feasibly achieve both budget-stability and currency-power objectives while pursuing either public debt sustainability or currency power. However, such policies should account for the possibility that public debt acts as a negative influence during crisis periods.
Originality/value
The authors propose a pioneering approach to exploring the relationship between leading and lagging indicators of an economy as characterized by their daily data sets. Accordingly, the empirical findings of this study inspire future research on public debt and its connections with several economic indicators.
Peer review
The peer review history for this article is available at: https://publons.com/publon/10.1108/IJSE-08-2022-0581
Rucha Wadapurkar, Sanket Bapat, Rupali Mahajan and Renu Vyas
Abstract
Purpose
Ovarian cancer (OC) is the most common type of gynecologic cancer in the world with a high rate of mortality. Due to manifestation of generic symptoms and absence of specific biomarkers, OC is usually diagnosed at a late stage. Machine learning models can be employed to predict driver genes implicated in causative mutations.
Design/methodology/approach
In the present study, a comprehensive next generation sequencing (NGS) analysis of whole exome sequences of 47 OC patients was carried out to identify clinically significant mutations. Nine functional features of 708 mutations identified were input into a machine learning classification model by employing the eXtreme Gradient Boosting (XGBoost) classifier method for prediction of OC driver genes.
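The classification step can be illustrated with a toy gradient-boosting classifier built from regression stumps — the principle underlying XGBoost — trained on synthetic stand-ins for the nine mutation features. The real pipeline uses the XGBoost library on the 708 annotated mutations; the feature values, labels and hyperparameters below are illustrative only.

```python
import numpy as np

def fit_stump(X, r):
    """Regression stump (one feature, one threshold) minimizing squared error
    against the residuals r."""
    best = (np.inf, 0, 0.0, 0.0, 0.0)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            left = X[:, j] <= thr
            if left.all() or not left.any():
                continue
            lv, rv = r[left].mean(), r[~left].mean()
            err = ((r[left] - lv) ** 2).sum() + ((r[~left] - rv) ** 2).sum()
            if err < best[0]:
                best = (err, j, thr, lv, rv)
    _, j, thr, lv, rv = best
    return lambda Z, j=j, thr=thr, lv=lv, rv=rv: np.where(Z[:, j] <= thr, lv, rv)

def boost(X, y, rounds=20, lr=0.5):
    """Gradient boosting with logistic loss: each stump fits the residuals y - p."""
    F = np.zeros(len(y))
    stumps = []
    for _ in range(rounds):
        p = 1.0 / (1.0 + np.exp(-F))       # current predicted probabilities
        stump = fit_stump(X, y - p)        # fit negative gradient of logistic loss
        stumps.append(stump)
        F += lr * stump(X)
    return lambda Z: (sum(lr * s(Z) for s in stumps) > 0).astype(int)

rng = np.random.default_rng(5)
X = rng.standard_normal((200, 9))            # nine synthetic mutation features
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # synthetic "driver gene" label
predict = boost(X, y)
acc = (predict(X) == y).mean()
```

XGBoost adds regularization, second-order gradients and deeper trees on top of this scheme, which is why it tends to outperform single decision trees and Naive Bayes on tabular features like these.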
Findings
The XGBoost classifier model yielded a classification accuracy of 0.946, which was superior to that obtained by other classifiers such as decision tree, Naive Bayes, random forest and support vector machine. Further, an interaction network was generated to identify and establish correlations with cancer-associated pathways and gene ontology data.
Originality/value
The final results revealed 12 putative candidate cancer driver genes, namely LAMA3, LAMC3, COL6A1, COL5A1, COL2A1, UGT1A1, BDNF, ANK1, WNT10A, FZD4, PLEKHG5 and CYP2C9, that may have implications in clinical diagnosis.
Emir Malikov, Shunan Zhao and Jingfang Zhang
Abstract
There is growing empirical evidence that firm heterogeneity is technologically non-neutral. This chapter extends the Gandhi, Navarro, and Rivers (2020) proxy variable framework for structurally identifying production functions to a more general case when latent firm productivity is multi-dimensional, with both factor-neutral and (biased) factor-augmenting components. Unlike alternative methodologies, the proposed model can be identified under weaker data requirements, notably, without relying on the typically unavailable cross-sectional variation in input prices for instrumentation. When markets are perfectly competitive, point identification is achieved by leveraging the information contained in static optimality conditions, effectively adopting a system-of-equations approach. It is also shown how one can partially identify the non-neutral production technology in the traditional proxy variable framework when firms have market power.
Abstract
Purpose
An understanding of the role of decision-making has been emphasised since the seminal works on human information processing and professional judgements by accountants. The interest in these topics has been reignited by the increasing digitisation of the financial reporting and auditing processes. Whilst the behavioural research on accounting is well-established, the application of seminal works in cognitive psychology and behavioural finance is lacking, especially in recent research endeavours. The purpose of this paper is to provide a synthesis of theories relating to accounting behavioural research by evaluating them against the theories of cognitive psychology.
Design/methodology/approach
Using theory synthesis, this research draws seemingly isolated strands of research into a coherent framework, underpinned by cognitive psychology.
Findings
Evidence from accounting and auditing behavioural research is largely consistent with the psychology and finance research on cognitive limitations and errors. There remains a lacuna in accounting behavioural research on debiasing techniques. Such research, if underpinned by a single, cohesive theoretical framework, is likely to have practical relevance.
Research limitations/implications
The current research has theoretical implications for the accounting decision-making and uncertainty research. Areas for future research, based on identified gaps in the current accounting behavioural research, are also proposed.
Warisa Thangjai and Sa-Aat Niwitpong
Abstract
Purpose
Confidence intervals play a crucial role in economics and finance, providing a credible range of values for an unknown parameter along with a corresponding level of certainty. Their applications encompass economic forecasting, market research, financial forecasting, econometric analysis, policy analysis, financial reporting, investment decision-making, credit risk assessment and consumer confidence surveys. Signal-to-noise ratio (SNR) finds applications in economics and finance across various domains such as economic forecasting, financial modeling, market analysis and risk assessment. A high SNR indicates a robust and dependable signal, simplifying the process of making well-informed decisions. On the other hand, a low SNR indicates a weak signal that could be obscured by noise, so decision-making procedures need to take this into serious consideration. This research focuses on the development of confidence intervals for functions derived from the SNR and explores their application in the fields of economics and finance.
Design/methodology/approach
The construction of the confidence intervals involved the application of various methodologies. For the SNR, confidence intervals were formed using the generalized confidence interval (GCI), large sample and Bayesian approaches. The difference between SNRs was estimated through the GCI, large sample, method of variance estimates recovery (MOVER), parametric bootstrap and Bayesian approaches. Additionally, confidence intervals for the common SNR were constructed using the GCI, adjusted MOVER, computational and Bayesian approaches. The performance of these confidence intervals was assessed using coverage probability and average length, evaluated through Monte Carlo simulation.
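The abstract's interval constructions (GCI, MOVER, parametric bootstrap, Bayesian) are specialized; as a simpler illustration of the general idea, the sketch below builds a percentile-bootstrap confidence interval for the SNR (mean divided by standard deviation) of a synthetic sample. The percentile bootstrap is a stand-in for exposition, not one of the article's recommended approaches, and the data are simulated.

```python
import numpy as np

def snr(x):
    """Signal-to-noise ratio: sample mean over sample standard deviation."""
    return x.mean() / x.std(ddof=1)

def bootstrap_ci(x, stat=snr, n_boot=2000, alpha=0.05, seed=13):
    """Percentile bootstrap confidence interval for a statistic."""
    rng = np.random.default_rng(seed)
    reps = np.array([stat(rng.choice(x, size=len(x), replace=True))
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(2)
x = rng.normal(loc=10.0, scale=2.0, size=150)   # synthetic data, true SNR = 5
lo, hi = bootstrap_ci(x)
```

Coverage probability and average length — the performance criteria used in the article — would then be estimated by repeating this construction over many simulated samples and recording how often the interval contains the true SNR.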
Findings
The GCI approach demonstrated superior performance over other approaches in terms of both coverage probability and average length for the SNR and the difference between SNRs. Hence, employing the GCI approach is advised for constructing confidence intervals for these parameters. As for the common SNR, the Bayesian approach exhibited the shortest average length. Consequently, the Bayesian approach is recommended for constructing confidence intervals for the common SNR.
Originality/value
This research presents confidence intervals for functions of the SNR to assess SNR estimation in the fields of economics and finance.