Search results

1 – 10 of over 18000
Article
Publication date: 4 August 2022

Edward Rigdon

Abstract

Purpose

This paper aims to clarify some of the representations regarding philosophy of science and statistical methods, which are contained in Cadogan and Lee (this issue).

Design/methodology/approach

This paper uses logical argument and a review of literature.

Findings

Rigdon’s (2012) approach to construct validation is entirely consistent with scientific realism, while the “realist variable framework” revives the empiricist reification of common factors found in Bagozzi’s (1984) Holistic Construal and throughout the early literature of structural equation modeling. Factor indeterminacy is a phenomenon that makes it impossible to equate common factors with conceptual variables. The future of marketing measurement is not in the historical error-centric framework but in a measurement framework centered around uncertainty.

Research limitations/implications

Researchers should avoid reification of common factors, recognize the validity gap between conceptual variables and empirical proxies, consistent with Rigdon (2012), and move toward an uncertainty-centric approach to measurement.

Practical implications

Decision-makers need to acknowledge the difference between data and the underlying reality. Success or failure will be shaped by the reality, not by the data.

Originality/value

To the best of the author’s knowledge, this is the first paper seeking to clarify representations in Cadogan and Lee (this issue). This paper aims to save journal readers from being misled.

Details

European Journal of Marketing, vol. 57 no. 6
Type: Research Article
ISSN: 0309-0566

Article
Publication date: 28 March 2022

John W. Cadogan and Nick Lee

Abstract

Purpose

This study aims to determine whether partial least squares path modeling (PLS) is fit for purpose for scholars holding scientific realist views.

Design/methodology/approach

The authors present the philosophical foundations of scientific realism and constructivism and examine the extent to which PLS aligns with them.

Findings

PLS does not align with scientific realism but aligns well with constructivism.

Research limitations/implications

Research is needed to assess PLS’s fit with instrumentalism and pragmatism.

Practical implications

PLS has no utility as a realist scientific tool but may be of interest to constructivists.

Originality/value

To the best of the authors’ knowledge, this study is the first to assess PLS’s alignments and mismatches with constructivist and scientific realist perspectives.

Article
Publication date: 16 August 2022

John W. Cadogan and Nick Lee

Abstract

Purpose

This study aims to correct errors in, and comment on the claims made in the comment papers of Rigdon (2022) and Henseler and Schuberth (2022), and to tidy up any substantive oversights made in Cadogan and Lee (2022).

Design/methodology/approach

The study discusses and clarifies the gap between Rigdon’s notion of scientific realism and the metaphysical, semantic and epistemological commitments that are broadly agreed to be key principles of scientific realism. The study also examines the ontological status of the variables that Henseler and Schuberth claim are emergent using emergence logic grounded in the notion that variables are only truly emergent if they demonstrate a failure of generative atomism.

Findings

In scientific realism, hypothetical causal contact between the unobserved and the observed is a key foundational stance, and as such, Rigdon’s concept proxy framework (CPF) is inherently anti-realist in nature. Furthermore, Henseler and Schuberth’s suggestion that composite-creating statistical packages [such as partial least squares (PLS)] can model emergent variables should be treated with skepticism by realists.

Research limitations/implications

Claims made by Rigdon regarding the realism of CPF are unfounded, and claims by Henseler and Schuberth regarding the universal suitability of partial least squares (PLS) as a tool for use by researchers of all ontological stripes (see their Table 5) do not appear to be well-grounded.

Practical implications

Those aspiring to do science according to the precepts of scientific realism need to be careful in assessing claims in the literature. For instance, despite Rigdon’s assertion that CPF is a realist framework, we show that it is not. Consequently, some of Rigdon’s core criticisms of the common factor logic make no sense for the realist. Likewise, if the variables resulting from composite-creating statistical packages (like PLS) are not really emergent (contrary to Henseler and Schuberth) and so are not real, their utility as tools for scientific realist inquiry is called into question.

Originality/value

This study assesses PLS using the Eleatic Principle and examines Henseler and Schuberth’s version of emergent variables from an ontological perspective.

Open Access
Article
Publication date: 13 April 2022

Florian Schuberth, Manuel E. Rademaker and Jörg Henseler

Abstract

Purpose

This study aims to examine the role of an overall model fit assessment in the context of partial least squares path modeling (PLS-PM). In doing so, it explains when it is important to assess the overall model fit, provides ways of assessing the fit of composite models and resolves major concerns about model fit assessment that have been raised in the literature on PLS-PM.

Design/methodology/approach

This paper explains when and how to assess the fit of PLS path models. Furthermore, it discusses the concerns raised in the PLS-PM literature about the overall model fit assessment and provides concise guidelines on assessing the overall fit of composite models.

Findings

This study explains that the model fit assessment is as important for composite models as it is for common factor models. To assess the overall fit of composite models, researchers can use a statistical test and several fit indices known through structural equation modeling (SEM) with latent variables.

Research limitations/implications

Researchers who use PLS-PM to assess composite models that aim to understand the mechanism of an underlying population and draw statistical inferences should take the concept of the overall model fit seriously.

Practical implications

To facilitate the overall fit assessment of composite models, this study presents a two-step procedure adopted from the literature on SEM with latent variables.

Originality/value

This paper clarifies that the necessity to assess model fit is not a question of which estimator will be used (PLS-PM, maximum likelihood, etc.) but of the purpose of statistical modeling. Whereas model fit assessment is paramount in explanatory modeling, it is not imperative in predictive modeling.
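One of the fit indices known from SEM that the paper points researchers toward is the standardized root mean square residual (SRMR). As a rough illustration of what such an index computes — a minimal sketch of our own, not the authors' implementation, assuming observed and model-implied correlation matrices as input — SRMR is the root mean square of the residual correlations:

```python
import numpy as np

def srmr(observed_corr, implied_corr):
    """Standardized root mean square residual between the observed and
    model-implied correlation matrices, taken over the lower triangle
    (including the diagonal, whose residuals are zero for correlations)."""
    observed_corr = np.asarray(observed_corr, dtype=float)
    implied_corr = np.asarray(implied_corr, dtype=float)
    idx = np.tril_indices_from(observed_corr)
    residuals = observed_corr[idx] - implied_corr[idx]
    return float(np.sqrt(np.mean(residuals ** 2)))

# Toy check: a model that reproduces the observed correlations exactly
# has SRMR = 0; any discrepancy yields a positive value.
obs = np.array([[1.0, 0.5, 0.4],
                [0.5, 1.0, 0.3],
                [0.4, 0.3, 1.0]])
print(srmr(obs, obs))  # 0.0
```

A lower SRMR indicates a smaller gap between the data and the model-implied structure; published cutoffs and the bootstrap-based test the paper discusses go beyond this sketch.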

Details

European Journal of Marketing, vol. 57 no. 6
Type: Research Article
ISSN: 0309-0566

Article
Publication date: 13 October 2022

Michael O'Connell

Abstract

Purpose

In order to provide an updated view on the drivers of German stock returns, the authors evaluate the relative performance of nine competing neoclassical asset pricing models in the German stock market between November 1991 and December 2021.

Design/methodology/approach

The authors conduct asymptotically valid tests of model comparison when the extent of model mispricing is gauged by the squared Sharpe ratio improvement measure of Barillas et al. (2020).

Findings

The study finds that the Fama and French six-factor model with both traditional and updated value factors emerges as the dominant model.

Originality/value

The authors shed new light on the drivers of German stock returns through an updated and extended period of analysis, a wider range of potential models and the use of valid asymptotic tests of model comparison when models are nonnested (Barillas et al., 2020).

Details

Journal of Economic Studies, vol. 50 no. 6
Type: Research Article
ISSN: 0144-3585

Article
Publication date: 15 February 2023

Ismail Badraoui, Ivo A.M.C. van der Lans, Youssef Boulaksil and Jack G.A.J. van der Vorst

Abstract

Purpose

This study aims to compare the expectations of non-collaborating professionals and the actual opinions of collaborating professionals regarding success factors of horizontal logistics collaboration (HLC) and investigates the reasons behind the observed differences.

Design/methodology/approach

This study employs a mixed-method approach. First, a survey is conducted to collect data from two samples representing collaborating and non-collaborating industry professionals. Second, confirmatory factor analysis (CFA) is used to compare the measurement models from the two samples and identify their similarities and differences. Third, a Delphi study is conducted to identify factors limiting collaborative behavior.

Findings

The results show that collaborating professionals exhibit lower levels of joint relationship efforts and trust than expected. This is primarily due to inadequate information sharing, poor collaboration formalization and the absence of a clear costs and benefits allocation mechanism.

Practical implications

The findings indicate that, in HLC, managers should give high importance to facilitating timely and complete information exchange, putting in place an acceptable costs/benefits allocation mechanism, formalizing the collaboration and prioritizing integrity over competency when selecting partners.

Originality/value

To the best of the authors’ knowledge, this is the first study that shows the existence of differences between industry professionals' pre-collaboration expectations and the actual experiences in HLC. This is also the first study that points to the exact HLC enablers that fail in practice and the barriers responsible for it.

Details

Benchmarking: An International Journal, vol. 31 no. 1
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 8 June 2023

Rupak Rauniar, Greg Rawski, Qing Ray Cao and Samhita Shah

Abstract

Purpose

Drawing upon a systematic literature review in new technology, innovation transfer and diffusion theories, and from interviews with technology leaders in digital transformation programs in the US Oil & Gas (O&G) industry, the authors explore the relationships among O&G industry dynamics, organization's absorptive capacity and resource commitment for new digital technology adoption-implementation process.

Design/methodology/approach

The authors employed the empirical survey method to gather the data (a sample size of 172) in the US O&G industry and used structural equation modeling (SEM) to test the measurement model for validity and reliability and the conceptual model for hypothesized structural relationships.

Findings

The results provide support for the study’s causal model of adoption and implementation with positive and direct relationships between the initiation and trial stages, between the trial stages and the evaluation of effective outcomes and between the effective outcomes and the effective implementation stages of digital technologies. The results also reveal partial mediating relationships of industry dynamics, absorptive capacity and resource commitment between respective stages.

Practical implications

Based on the current study's findings, managers are recommended to pay attention to the evolving industry dynamics during the initiation stage of new digital technology adoption, to utilize the organization's knowledge-based absorptive capacity during digital technology trial and selection stages and to support the digital technology implementation project when the adoption decision of a particular digital technology has been made.

Originality/value

This empirical research contributes to the literature on digital technology adoption and implementation by identifying and demonstrating the importance of industry dynamics, absorptive capacity and resource commitment as mediating variables at various stages of the adoption-implementation process, and by empirically validating a process-based causal model of digital technology adoption and successful implementation that has been missing from the literature on digital transformation.

Details

Journal of Enterprise Information Management, vol. 37 no. 3
Type: Research Article
ISSN: 1741-0398

Open Access
Article
Publication date: 17 August 2022

Jörg Henseler and Florian Schuberth

Abstract

Purpose

In their paper titled “A Miracle of Measurement or Accidental Constructivism? How PLS Subverts the Realist Search for Truth,” Cadogan and Lee (2022) cast serious doubt on PLS’s suitability for scientific studies. The purpose of this commentary is to discuss the claims of Cadogan and Lee, correct some inaccuracies, and derive recommendations for researchers using structural equation models.

Design/methodology/approach

This paper uses scenario analysis to show which estimators are appropriate for reflective measurement models and composite models, and formulates the statistical model that underlies PLS Mode A. It also contrasts two different perspectives: PLS as an estimator for structural equation models vs. PLS-SEM as an overarching framework with a sui generis logic.

Findings

There are different variants of PLS, which include PLS, consistent PLS, PLSe1, PLSe2, proposed ordinal PLS and robust PLS, each of which serves a particular purpose. All of these are appropriate for scientific inquiry if applied properly. It is not PLS that subverts the realist search for truth, but some proponents of a framework called “PLS-SEM.” These proponents redefine the term “reflective measurement,” argue against the assessment of model fit and suggest that researchers could obtain “confirmation” for their model.

Research limitations/implications

Researchers should be more conscious, open and respectful regarding different research paradigms.

Practical implications

Researchers should select a statistical model that adequately represents their theory, not necessarily a common factor model, and formulate their model explicitly. Particularly for instrumentalists, pragmatists and constructivists, the composite model appears promising. Researchers should be concerned about their estimator’s properties, not about whether it is called “PLS.” Further, researchers should critically evaluate their model, not seek confirmation or blindly believe in its value.

Originality/value

This paper critically appraises Cadogan and Lee (2022) and reminds researchers who wish to use structural equation modeling, particularly PLS, for their statistical analysis, of some important scientific principles.

Article
Publication date: 18 May 2023

Tamara Schamberger

Abstract

Purpose

Structural equation modeling (SEM) is a well-established and frequently applied method in various disciplines. New methods in the context of SEM are being introduced in an ongoing manner. Since formal proof of statistical properties is difficult or impossible, new methods are frequently justified using Monte Carlo simulations. For SEM with covariance-based estimators, several tools are available to perform Monte Carlo simulations. Moreover, several guidelines on how to conduct a Monte Carlo simulation for SEM with these tools have been introduced. In contrast, software to estimate structural equation models with variance-based estimators such as partial least squares path modeling (PLS-PM) is limited.

Design/methodology/approach

As a remedy, the R package cSEM, which allows researchers to estimate structural equation models and to perform Monte Carlo simulations for SEM with variance-based estimators, has been introduced. This manuscript provides guidelines on how to conduct a Monte Carlo simulation for SEM with variance-based estimators using the R packages cSEM and cSEM.DGP.

Findings

The author introduces and recommends a six-step procedure to be followed in conducting each Monte Carlo simulation.

Originality/value

For each of the steps, common design patterns are given. Moreover, these guidelines are illustrated by an example Monte Carlo simulation with ready-to-use R code showing that PLS-PM needs the constructs to be embedded in a nomological net to yield valuable results.
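The general shape of any such simulation — fix a population data-generating process, draw many samples from it, estimate in each sample, then summarize bias and variability — can be sketched in a few lines. The following is a generic skeleton using OLS on a single linear relation as a stand-in estimator, not the paper's cSEM-based six-step procedure:

```python
import numpy as np

# Monte Carlo skeleton: in SEM studies, the data-generating and
# estimation steps would be replaced by the structural equation model
# and estimator under study (e.g. a PLS-PM implementation).
rng = np.random.default_rng(42)
beta_true, n, n_reps = 0.5, 200, 1000

estimates = np.empty(n_reps)
for r in range(n_reps):
    x = rng.normal(size=n)                              # draw a sample
    y = beta_true * x + rng.normal(scale=0.8, size=n)   # from the DGP
    estimates[r] = (x @ y) / (x @ x)                    # OLS slope (no intercept)

# Summarize the estimator's behavior across replications.
print("mean estimate:", estimates.mean())
print("empirical bias:", estimates.mean() - beta_true)
print("empirical SD:", estimates.std(ddof=1))
```

Design choices such as the number of replications, the population parameter values and the conditions varied across cells are exactly what the paper's step-by-step guidelines address.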

Details

Industrial Management & Data Systems, vol. 123 no. 6
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 11 January 2023

Peixu He, Amitabh Anand, Mengying Wu, Cuiling Jiang and Qing Xia

Abstract

Purpose

The purpose of this paper is to investigate how voluntary citizenship behaviour towards an individual (VCB-I) is linked with vicious knowledge hiding (VKH), and why members, within a mastery climate, tend to participate in less VKH after engaging in VCB-I. The authors, drawing on moral licensing theory, propose that moral licensing mediates the relationship between VCB-I and VKH, and that a mastery climate weakens the hypothesised link via moral licensing.

Design/methodology/approach

This study surveys 455 valid matched samples of subordinates and supervisors from 77 working teams in China at two time points and explores the relationship between VCB-I and VKH, as well as the underlying mechanism. Confirmatory factor analysis, the bootstrapping method and a hierarchical linear model were used to validate the research hypotheses.
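The bootstrapping step typically used to test a mediated (indirect) effect of this kind can be sketched as follows; this is a generic illustration on simulated data with hypothetical variable names, not the authors' exact analysis:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated mediation chain x -> m -> y with true indirect effect
# a*b = 0.3 * 0.4 = 0.12 (stand-ins for VCB-I, moral credentials, VKH).
n = 300
x = rng.normal(size=n)
m = 0.3 * x + rng.normal(scale=0.5, size=n)
y = 0.4 * m + rng.normal(scale=0.5, size=n)

def indirect_effect(x, m, y):
    """a*b: slope of m on x, times slope of y on m controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

# Percentile bootstrap: resample cases with replacement, re-estimate,
# and take the 2.5th/97.5th percentiles as a 95% confidence interval.
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect: {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

An interval excluding zero is the usual evidence for mediation; the bootstrap avoids assuming normality of the product term a*b.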

Findings

The results show that VCB-I has a significant positive effect on VKH; moral credentials play a mediating role in the relationship between VCB-I and VKH; and the mastery climate moderates the positive effect of moral credentials on VKH and the mediating effect of moral credentials. In a high-mastery climate, the direct effect of moral credentials on VKH and the indirect influence of VCB-I on VKH through moral credentials are both weakened, and conversely, both effects are enhanced in a low-mastery climate. However, contrary to the expected hypothesis, moral credits do not mediate the relationship between VCB-I and VKH, which may be due to the differences in the mechanisms between the two moral licensing models.

Originality/value

Prior research has mainly focused on the “victim-centric” perspective to examine the impacts of others’ behaviour on employees’ knowledge hiding. Few works have used the “actor-centric” perspective to analyse the relationship between employees’ prior workplace behaviour and their subsequent knowledge hiding intention. In addition, this study enriches field research on the voluntary aspects of organisational citizenship behaviour, which differ from its involuntary ones.