Search results
1 – 10 of over 44,000
Manuel E. Rademaker, Florian Schuberth and Theo K. Dijkstra
Abstract
Purpose
The purpose of this paper is to enhance consistent partial least squares (PLSc) to yield consistent parameter estimates for population models whose indicator blocks contain a subset of correlated measurement errors.
Design/methodology/approach
Correction for attenuation as originally applied by PLSc is modified to include a priori assumptions on the structure of the measurement error correlations within blocks of indicators. To assess the efficacy of the modification, a Monte Carlo simulation is conducted.
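The correction for attenuation that PLSc builds on goes back to classical test theory: an observed correlation is rescaled by the reliabilities of the two measures. As a rough illustration only, here is the textbook disattenuation formula in Python; this is not the PLSc correction itself (which operates on composite correlations and model-implied reliabilities), and the function name and numbers are made up for the example.

```python
def disattenuate(r_xy, rel_x, rel_y):
    """Correct an observed correlation for attenuation due to
    measurement error, given the reliabilities of both measures."""
    return r_xy / (rel_x * rel_y) ** 0.5

# An observed correlation of 0.30 between two scales with
# reliabilities 0.70 and 0.80 implies a larger true-score correlation.
r_true = disattenuate(0.30, 0.70, 0.80)
print(round(r_true, 3))  # → 0.401
```

With perfect reliabilities the observed correlation is returned unchanged, which is the sanity check the formula should pass.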
Findings
In the presence of population measurement error correlation, estimated parameter bias is generally small for original and modified PLSc, with the latter outperforming the former for large sample sizes. In terms of the root mean squared error, the results are virtually identical for both original and modified PLSc. Only for relatively large sample sizes, high population measurement error correlation, and low population composite reliability are the increased standard errors associated with the modification outweighed by a smaller bias. These findings are regarded as initial evidence that original PLSc is comparatively robust with respect to misspecification of the structure of measurement error correlations within blocks of indicators.
Originality/value
Introducing and investigating a new approach to address measurement error correlation within blocks of indicators in PLSc, this paper contributes to the ongoing development and assessment of recent advancements in partial least squares path modeling.
Pingan Zhu, Chao Zhang and Jun Zou
Abstract
Purpose
The purpose of the work is to provide a comprehensive review of the digital image correlation (DIC) technique for those who are interested in performing the DIC technique in the area of manufacturing.
Design/methodology/approach
As a review article, the paper does not apply an empirical methodology.
Findings
Not applicable; as a review article, the paper reports no empirical findings.
Originality/value
Herein, the historical development, main strengths and measurement setup of DIC are introduced. Subsequently, the basic principles of the DIC technique are outlined in detail. The analysis of measurement accuracy associated with experimental factors and correlation algorithms is discussed and some useful recommendations for reducing measurement errors are also offered. Then, the utilization of DIC in different manufacturing fields (e.g. cutting, welding, forming and additive manufacturing) is summarized. Finally, the current challenges and prospects of DIC in intelligent manufacturing are discussed.
Abstract
Addresses the standardization of the measurements and the labels for concepts commonly used in the study of work organizations. As a reference handbook and research tool, seeks to improve measurement in the study of work organizations and to facilitate the teaching of introductory courses in this subject. Focuses solely on work organizations, that is, social systems in which members work for money. Defines measurement and distinguishes four levels: nominal, ordinal, interval and ratio. Selects specific measures on the basis of quality, diversity, simplicity and availability and evaluates each measure for its validity and reliability. Employs a set of 38 concepts ‐ ranging from “absenteeism” to “turnover” ‐ as the handbook’s frame of reference. Concludes by reviewing organizational measurement over the past 30 years and recommending future measurement research.
Abstract
This paper examines the effect of measurement error in wage data on the estimation of returns to seniority. Earnings surveys collect wage data through questions pertaining to earnings and hours over a given period of time (year, week) or through direct reports of hourly wages. Comparing results for different wage variables from the Panel Study of Income Dynamics (PSID), it is shown that estimated returns to seniority are very sensitive to the type of wage data used. Estimates based on yearly reports are typically twice as large as those based on direct reports. Two sources account for this discrepancy. First, the inclusion of earnings from secondary jobs and overtime in the PSID annual earnings data tends to inflate estimated returns to seniority. Second, hourly wages computed from yearly measures contain substantial measurement error that tends to bias coefficients upward.
Florian Schuberth, Manuel Elias Rademaker and Jörg Henseler
Abstract
Purpose
The purpose of this study is threefold: (1) to propose partial least squares path modeling (PLS-PM) as a way to estimate models containing composites of composites and to compare the performance of the PLS-PM approaches in this context, (2) to provide and evaluate two testing procedures to assess the overall fit of such models and (3) to introduce user-friendly step-by-step guidelines.
Design/methodology/approach
A simulation is conducted to examine the PLS-PM approaches and the performance of the two proposed testing procedures.
Findings
The simulation results show that the two-stage approach, its combination with the repeated indicators approach and the extended repeated indicators approach perform similarly. However, only the former is Fisher consistent. Moreover, the simulation shows that guidelines neglecting model fit assessment miss an important opportunity to detect misspecified models. Finally, the results show that both testing procedures based on the two-stage approach allow for assessment of the model fit.
Practical implications
Analysts who estimate and assess models containing composites of composites should use the authors’ guidelines, since the majority of existing guidelines neglect model fit assessment and thus omit a crucial step of structural equation modeling.
Originality/value
This study contributes to the understanding of the discussed approaches. Moreover, it highlights the importance of overall model fit assessment and provides insights about testing the fit of models containing composites of composites. Based on these findings, step-by-step guidelines are introduced to estimate and assess models containing composites of composites.
Iryna Pentina and David Strutton
Abstract
Purpose
This paper aims to analyze and quantitatively compare existing empirical findings on the relationship between organizational information processing and new product outcomes. The meta‐analytic technique is used to reconcile some of the current divergent thinking on the role of organizational learning in new product success.
Design/methodology/approach
Meta‐analytic methods and procedures are used to generalize existing empirical findings regarding the role of information processing in new product success, evaluating the homogeneity of the obtained results as well as measurement‐ and context‐related moderators of the relationship’s magnitude. The paper reports and discusses the results and proposes theoretical and managerial implications of the findings.
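The pooling and homogeneity evaluation described above can be sketched with the standard fixed-effect, inverse-variance approach; this is a generic illustration, not the paper's actual procedure, and the function name and effect sizes below are hypothetical.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted mean effect size, its standard error,
    and Cochran's Q homogeneity statistic (df = k - 1)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, effects))
    return pooled, se, q

# Hypothetical effect sizes (e.g. Fisher-z correlations) from three studies.
pooled, se, q = fixed_effect_meta([0.25, 0.40, 0.31], [0.010, 0.020, 0.015])
```

A large Q relative to a chi-square with k − 1 degrees of freedom would signal heterogeneity and motivate the moderator analysis the abstract mentions.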
Findings
The meta‐analysis of the relationship between organizational information processing and new product success supports an overall positive effect, and identifies measurement‐ and context‐related moderators influencing the magnitude of the relationship.
Research limitations/implications
The analysis, done on the limited number of available effect sizes (77) due to the newness of the area, provides guidance to future researchers by clarifying operationalization and measurement of the main constructs, and suggesting the role of context variables for sampling purposes.
Practical implications
The paper provides guidance to New Product Development (NPD) team leaders by emphasizing the need for integrating information‐related processes and idea management at various NPD stages, and by stressing the greater effectiveness of information processing at the team level.
Originality/value
This first meta‐analysis in the area of information processing and new product outcomes confirms the importance of organizational learning in new product development and outlines important implications for future research and managerial practice.
Abstract
Purpose
Structural equation modeling (SEM) is a well-established and frequently applied method in various disciplines. New methods in the context of SEM are being introduced in an ongoing manner. Since formal proof of statistical properties is difficult or impossible, new methods are frequently justified using Monte Carlo simulations. For SEM with covariance-based estimators, several tools are available to perform Monte Carlo simulations. Moreover, several guidelines on how to conduct a Monte Carlo simulation for SEM with these tools have been introduced. In contrast, software to estimate structural equation models with variance-based estimators such as partial least squares path modeling (PLS-PM) is limited.
Design/methodology/approach
As a remedy, the R package cSEM, which allows researchers to estimate structural equation models and to perform Monte Carlo simulations for SEM with variance-based estimators, has been introduced. This manuscript provides guidelines on how to conduct a Monte Carlo simulation for SEM with variance-based estimators using the R packages cSEM and cSEM.DGP.
Findings
The author introduces and recommends a six-step procedure to be followed in conducting each Monte Carlo simulation.
Originality/value
For each of the steps, common design patterns are given. Moreover, these guidelines are illustrated by an example Monte Carlo simulation with ready-to-use R code showing that PLS-PM needs the constructs to be embedded in a nomological net to yield valuable results.
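The general shape of such a simulation (design conditions, replications, bias and RMSE summaries) can be sketched generically; the skeleton below uses plain Python and a trivial estimator rather than cSEM, and does not reproduce the author's six steps, so every name in it is illustrative only.

```python
import random
import statistics

def simulate(estimator, true_value, sample_sizes, n_reps=1000, seed=1):
    """Generic Monte Carlo skeleton: for each design condition, draw
    data, estimate, and summarize bias and RMSE over replications."""
    rng = random.Random(seed)
    results = {}
    for n in sample_sizes:
        estimates = []
        for _ in range(n_reps):
            data = [rng.gauss(true_value, 1.0) for _ in range(n)]
            estimates.append(estimator(data))
        bias = statistics.mean(estimates) - true_value
        rmse = statistics.mean([(e - true_value) ** 2 for e in estimates]) ** 0.5
        results[n] = (bias, rmse)
    return results

# The sample mean is unbiased, so bias should hover near zero and
# RMSE should shrink as the sample size grows.
out = simulate(statistics.mean, true_value=0.5, sample_sizes=[50, 200])
```

In an actual cSEM study, the data-generating step would use cSEM.DGP and the estimator would be a variance-based SEM estimator such as PLS-PM, but the loop structure stays the same.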
Kai S. Cortina, Hans Anand Pant and Joanne Smith-Darden
Abstract
In response to the Three-Step-Approach (TSA) that Cortina, Pant, and Smith-Darden (this volume) have suggested, Chan (this volume) expressed his reservations regarding the usefulness of a procedure that explicitly ignores measurement considerations and does not include mean scores. In this reply, we argue that the purpose of TSA is heuristic in nature and does not involve statistical testing of assumptions. In this spirit, the software, illustrated by Grimm and McArdle (this volume), rounds out our more conceptual considerations.
Abstract
Structural Equation Modeling (SEM) is a formal model for representing dependency relations between variables of psychological events and may be used to verify the structural organization of a theoretical model. “Rules of thumb” for the use of SEM are presented for each step of its application: specification of the structural model, measurement of the psychological event, and estimation of the adequacy of the model in representing the event. The investigation of the factorial structure of Greenspan’s model of personal competence is presented as an example of SEM application with participants with disabilities.