Abstract
Purpose
The purpose of this study is to add to the currently limited research on individual-level people analytics (PA) adoption by focusing on a vital resource in the implementation of PA, namely the technology used for PA. We draw on the unified theory of acceptance and use of technology (UTAUT) to examine antecedents of the use of technology for PA.
Design/methodology/approach
This study uses latent profile analysis to examine how different antecedents of PA (technology) adoption jointly act among 279 users of a specific PA technology.
Findings
This study identifies four user profiles, which we labeled the skeptic diplomats, the optimistic strugglers, the optimists, and the enthusiasts. These profiles relate to differences in user satisfaction and in the frequency and versatility of PA technology use. This study demonstrates that performance benefits, social influence, required effort, and facilitating conditions jointly affect the use of PA technology, but that the latter two might be the most influential factors.
Practical implications
This study offers recommendations to practitioners and organizations on actions that managers and the organization can take to support the use of PA technology.
Originality/value
Compared to previous research, we take a different approach by applying latent profile analysis to examine the combined effect of antecedents on user behavior and user satisfaction. In addition to a different analytical approach, we also extend existing research on individual PA adoption by focusing on actual behaviors rather than behavioral intentions alone.
Citation
Bentvelzen, M., Boon, C. and Den Hartog, D.N. (2024), "A person-centered approach to individual people analytics adoption", Journal of Organizational Effectiveness: People and Performance, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/JOEPP-07-2023-0276
Publisher
Emerald Publishing Limited
Copyright © 2024, Margriet Bentvelzen, Corine Boon and Deanne N. Den Hartog
License
Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) license. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this license may be seen at http://creativecommons.org/licences/by/4.0/legalcode
Introduction
People analytics (PA), a data-driven approach to Human Resource Management (HRM) focused on systematically identifying and quantifying people-related drivers of important business outcomes, is an important part of the decision-making process in many organizations (e.g. Fernandez and Gallardo-Gallardo, 2020; Larsson and Edwards, 2021; Shet et al., 2021; Van den Heuvel and Bondarouk, 2017). Using PA can help the HR profession to become a strategic partner, not only by improving the quality of HR decision-making, but also by being able to demonstrate the direct impact of HR practices on business outcomes (Vargas, 2015). Yet, despite the interest in PA, studies have concluded that organizations often struggle to adopt PA for different reasons, such as insufficient data and metrics or the absence of a PA culture (Fernandez and Gallardo-Gallardo, 2020; Marler and Boudreau, 2017; Peeters et al., 2020).
Three recent reviews on the adoption of analytics in HR (Coolen et al., 2023; Fernandez and Gallardo-Gallardo, 2020; Shet et al., 2021) have identified numerous challenges that organizations have to overcome in doing so, showing that the PA adoption process is a complex one. One of those challenges is that individuals within organizations often find analytics hard to understand, accept, and adopt (Van den Heuvel and Bondarouk, 2017). Previous research highlights that HR technology, such as the information technology that stores data and makes it accessible for PA, is crucial in the implementation and growth of PA (e.g. Van den Heuvel and Bondarouk, 2017). Such systems enable data-driven decisions that can help to improve organizational performance (Ferrar and Green, 2021; McCartney and Fu, 2021), but this benefit is only realized if these technologies and systems are actually used (Aral et al., 2012; Ruël and Bondarouk, 2014; Theres and Strohmeier, 2023). Hence, the use of technology for PA by individual decision-makers is vital in order for PA to lead to improved decisions and outcomes. Consequently, in order to better understand why individuals do or do not adopt PA, it is important to focus on factors that affect the use of technology for PA.
Individual adoption and use of information technology in various contexts has been captured by the unified theory of acceptance and use of technology (UTAUT; Venkatesh et al., 2003), which has been extensively studied within the Information Systems field (Hossain and Quaddus, 2012; Venkatesh et al., 2007) and is increasingly applied to digital HRM (Theres and Strohmeier, 2023). The UTAUT identifies performance expectancy, effort expectancy, social influence, and facilitating conditions as key antecedents of the behavioral intention to use a technology and of actual technology use. Altogether, the UTAUT explains a substantial proportion of the variance in the behavioral intention to use a technology and actual technology use (Theres and Strohmeier, 2023; Venkatesh et al., 2016).
Research suggests that performance expectancy, effort expectancy, social influence, and facilitating conditions of the UTAUT model facilitate individual adoption of PA or digital HRM (e.g. Fernandez and Gallardo-Gallardo, 2020; Shet et al., 2021; Theres and Strohmeier, 2023; Vargas et al., 2018). However, results also suggest that PA adoption is more complex due to the interrelations between the different UTAUT factors (e.g. Yasmin et al., 2020; Theres and Strohmeier, 2023). Predictors of PA adoption seem to work in concert rather than separately, an aspect not captured by the variable-centered approach commonly used in this field. Capturing such interrelations between multiple variables through the traditional variable-centered approach would require a very complex model. Therefore, to assess the combined effect of UTAUT factors on the adoption of PA technology and capture how these factors operate in combination within individuals (Meyer et al., 2013), we use a person-centered approach (e.g. Howard and Hoffman, 2018). A person-centered approach allows researchers to model complex processes in a more heuristic way (Cooper et al., 2020; Morin et al., 2018; Wang and Hanges, 2011), making it a valuable complement to variable-centered methods. Using a person-centered approach, we explore (1) whether different user profiles exist based on the antecedents specified in the UTAUT and (2) if and how user profiles relate to differences in user satisfaction and the actual use of PA technology, going beyond previous studies that mostly focused on behavioral intentions.
With this study, we aim to clarify what is needed to incorporate PA into individuals’ decision-making processes. In doing so, we offer two important contributions. First, we answer the calls by Marler and Boudreau (2017) and Minbaeva (2018) to investigate factors that facilitate PA adoption. In addition, by using the UTAUT model as the foundation of this study, we build on and extend Theres and Strohmeier’s (2023) meta-analysis by testing how antecedents work in concert rather than separately. We show that interrelations exist between predictors of individual PA adoption, thereby adding knowledge on the combination of predictors explaining individual adoption. Second, by examining the relationship between user profiles and actual use of PA technology, we move from only studying behavioral intentions to also studying actual user behavior. While user intentions are a necessary requirement for successful adoption (Theres and Strohmeier, 2023), the behavioral data in this study offers important insights into how UTAUT factors relate to PA technology use. From a practical perspective, this may help organizations to facilitate the use of PA technology, thereby potentially facilitating decision-making based on PA.
Theoretical background
People analytics adoption
Enabled by the digitalization of HRM, PA uses systematic methods of analysis and visualization of HR-related data to enhance strategic decisions (Margherita, 2021; Meijerink et al., 2021; Van den Heuvel and Bondarouk, 2017). The potential of PA to improve the way organizations manage their employees and the decisions made about them has led organizations to evaluate PA as “a boat they cannot miss” (Van den Heuvel and Bondarouk, 2017). The process through which an organization invests in, operationalizes, and incorporates PA into the workforce’s decision-making process is called PA adoption (Shet et al., 2021). To understand PA adoption, scholars often build on the diffusion of innovation theory (Rogers, 2003), which divides this process into three stages, namely the familiarization, adoption, and institutionalization stage (e.g. Coolen et al., 2023). Part of the PA adoption process is assessing which resources are required to support decision-making through PA, prioritizing the delivery of such supporting resources, and stimulating the usage of PA in the entire organization (Shet et al., 2021).
Yet, despite organizations’ investment in a PA infrastructure and other necessary resources that stimulate the usage of PA in the organization, organizations struggle to make PA an organizational reality (e.g. Marler and Boudreau, 2017). One proposed explanation for the lagging adoption is that individuals find it difficult to comprehend and accept PA (Van den Heuvel and Bondarouk, 2017). When individuals adopt PA, they consider PA as “the best course of action available” and decide to “make full use of PA” in their (daily) work (Rogers, 2003, p. 177). Hence, whether or not organizations are able to incorporate PA into the workforce’s decision-making process depends on the behavior of individuals within organizations.
As mentioned above, previous studies have identified numerous factors that affect the adoption of PA in organizations, such as technological, organizational, environmental, governance, and people or individual factors (e.g. Coolen et al., 2023; Margherita, 2021; Marler and Boudreau, 2017; Shet et al., 2021; Van den Heuvel and Bondarouk, 2017; Vargas et al., 2018). The technology used for PA has been marked as a key resource for the implementation and use of PA by employees (Fernandez and Gallardo-Gallardo, 2020; Ferrar and Green, 2021; Shet et al., 2021; Van den Heuvel and Bondarouk, 2017). This technology, such as the information technology that stores (HR) data and makes it accessible for PA, forms the driving force in the implementation and growth of PA (McCartney and Fu, 2021). For example, Shet et al. (2021) concluded that technological factors such as the perceived usefulness of PA, the complexity of PA, and the accessibility and availability of relevant data are critical for the adoption of PA. The technology used for PA is a vital resource as it makes data accessible and easy to understand, which enables data-driven decision-making (Ferrar and Green, 2021). Similarly, Van den Heuvel and Bondarouk (2017) underscored the necessity to build a solid information technology infrastructure to support evidence-based decision making. Thus, the use of technology for PA by individual decision-makers likely plays a crucial role in the ability to adopt PA as a practice within the organization and for PA to lead to improved decisions and outcomes. We argue that studying factors that affect the individual use of technology for PA will help us to better understand why individuals do (or do not) adopt PA in their work.
Adoption of information technology
How and why individuals adopt new information technologies has been described as one of the most mature research areas in Information Systems literature (Venkatesh et al., 2003). Researchers in this field have used a multitude of theoretical models with roots in different disciplines to study and understand individual-level adoption, such as the Theory of Planned Behavior (Ajzen, 1985) and the Technology Acceptance Model (Davis, 1989). These consistently explained about 40 percent of the variance in intention to use technology (Venkatesh et al., 2003). In order to integrate the contributions of these different models in predicting user acceptance and use of information technology, Venkatesh et al. (2003) synthesized these models in the UTAUT. The UTAUT has been used extensively in technology adoption studies in organizational contexts in the past two decades (Coolen et al., 2023; Venkatesh et al., 2016), including the context of digital HRM (Theres and Strohmeier, 2023).
The UTAUT identifies four key factors related to predicting behavioral intention to use a technology and actual technology use. The first factor, performance expectancy, refers to the degree to which an individual believes that using a specific system will help him or her to attain gains in job performance (Venkatesh et al., 2003). Hence, it relates to the potential gain in the performance of an individual user. Effort expectancy is the second factor and is defined as the degree of ease associated with the use of a specific system. Third, social influence refers to the degree to which an individual perceives that important others believe he or she should use the system. Fourth, facilitating conditions refers to the degree to which an individual believes that an organizational and technical infrastructure exists to support the use of a specific system. Previous research showed that, taken together, the UTAUT factors explain a substantial proportion of the variance in behavioral intention to use a technology (∼77%) and actual technology use (∼52%) (Venkatesh et al., 2016), albeit with some contextual differences (e.g. Kim et al., 2007; Workman, 2014).
The four factors identified in the UTAUT model are in line with determinants of successful adoption identified in earlier reviews, such as the complexity and perceived usefulness of technology, a PA culture, and adequate quality, accessibility, and IT infrastructure (e.g. Coolen et al., 2023; Shet et al., 2021). Here we use the UTAUT as a framework and study to what extent performance expectancy, effort expectancy, social influence, and facilitating conditions affect individual adoption of the technology used for PA, which in turn may indirectly facilitate the adoption of PA as a practice. Research suggests that the different determinants of successful PA adoption are likely interdependent (Shet et al., 2021; Yasmin et al., 2020). To illustrate, research on big data analytics suggests that “an effective combination of infrastructure, human resources, and management capabilities is significant for the successful adoption of big data analytics” (Yasmin et al., 2020, p. 2) and that such capabilities are a “balanced combination” of different necessary factors (Akhtar et al., 2019, p. 252), suggesting that it is the combination of factors that determines successful adoption, rather than individual factors.
To capture these combinations of, or interrelationships between, the different factors, we take a person-centered approach to explore the combined effect of these UTAUT factors on user behavior and attitudes. Specifically, we use latent profile analysis (LPA) to study the combined effect of these factors on individuals’ PA technology adoption behavior. In summary, this study aims to answer our research question: “What profiles can be identified through the UTAUT factors and what is the relation of these profiles with system use and user satisfaction?”
Method
Sample and procedure
In this study, we collaborated with Crunchr Workforce Analytics (hereafter referred to as “Crunchr”), a Dutch analytics service provider. Crunchr provides a cloud solution that collects and secures workforce data, improves its quality, and makes it accessible for company users to generate insights from the data. At the time of data collection, the Crunchr system reflected early-stage adoption and offered primarily descriptive and basic predictive analytics. Data was collected in two ways. Data on performance expectancy, effort expectancy, social influence, facilitating conditions, satisfaction, self-reported use of the PA technology, and demographic information was collected through an online survey. Collection of this data took place in February and March 2022. In addition to the survey, individual user data from January 2022 until May 2022 was retrieved from the Crunchr database. The two datasets were then merged.
With the help of Crunchr, we distributed the survey among users from different client organizations that consented to participate in our study and that Crunchr kept anonymous. These organizations operate in different parts of the world and in different industries. In total, 279 users filled out the complete survey (mean response rate = 19%, range = 9–30%). In the invitation to users, we explained that participation was voluntary, that in case of participation their survey results would be linked to their Crunchr usage data, and that their responses would be kept confidential. The majority of the users that filled out the survey were female (60%) and 35% were male; the remaining users (5%) preferred not to share this information. In addition, age ranged from 18 to 65 and older, with the majority of the users (36%) indicating their age as between 45 and 54 years. The mean age fell between the categories 35–44 years and 45–54 years. Most of the users had completed a master’s degree (54%) or a bachelor’s degree (39%). Finally, we asked users whether they had previous experience with using other PA tools or systems in the past. Users were relatively equally distributed in terms of previous experience, with 35% having no previous experience, 36% having used other tools or systems a few times in the past, and 27% having used other tools or systems regularly in the past (2% did not disclose this information).
Measures
All scales were developed together with Crunchr and are based on existing measures. All scales were self-rated and measured on a 7-point Likert scale ranging from 1 = Strongly disagree to 7 = Strongly agree, unless indicated otherwise. The full set of items for each scale can be found in the Appendix. As we used item scores rather than scale mean scores in the LPA (as we explain below), we do not report Cronbach’s alpha (i.e. the reliability measure) for performance expectancy, effort expectancy, social influence, and facilitating conditions.
Performance expectancy. To assess performance expectancy, we used items from two different sources. Specifically, we included two items from the performance expectancy scale from Venkatesh et al. (2003) and two items from the Usefulness scale from Marler et al. (2006). Example items of this scale are “Using people analytics improves the quality of the work I do.” and “Using people analytics enables me to accomplish tasks more quickly.”.
Effort expectancy. We measured this construct with three items that originated from the effort expectancy scale by Venkatesh et al. (2003). Example items were “Crunchr and how to use it is clear.” and “It was easy for me to become skillful at using Crunchr.”.
Social influence. To measure social influence, users filled out three items based on the review by Shet et al. (2021), focusing on the perceived attitudes of different “others”, such as senior management. Example items were “In my role, it is expected that I use people analytics.” and “My direct manager finds it important to base people decisions on data.”.
Facilitating conditions. To assess facilitating conditions, we used four items based on Shet et al. (2021) and Marler et al. (2006). Each item focused on different organizational and technical factors that may facilitate adoption. Example items were “It is clear to me what I am allowed or not allowed to do with the data according to the internal guidelines.” and “I believe that the data is reliable enough for HR to build credible people analytics.”.
Satisfaction. Users were asked to indicate their level of satisfaction with three different aspects, namely with Crunchr (i.e. the PA technology), the insights that PA has provided so far, and the business value PA has delivered to the organization so far. Items were measured on a 7-point Likert scale ranging from 1 = extremely dissatisfied to 7 = extremely satisfied. This scale showed high reliability (α = 0.92).
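For readers less familiar with this reliability index, the following is a minimal sketch of how Cronbach's alpha can be computed from an item-score matrix. The function name and example data are our own illustration and are not part of the study's analysis pipeline.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents, k_items) array of item scores."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1)        # variance of each item
    total_variance = x.sum(axis=1).var(ddof=1)    # variance of the sum score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 7-point satisfaction ratings for three aspects (illustrative only)
ratings = np.array([[7, 6, 7], [5, 5, 6], [6, 6, 6], [3, 4, 3], [7, 7, 6]])
print(round(cronbach_alpha(ratings), 2))
```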
Self-reported frequency of use. We measured how frequently users used the PA system. To measure the frequency of use, we asked users to indicate how often they use Crunchr in their job on average, with options (using a slider) ranging from never (0) to daily (100). A Kolmogorov-Smirnov test indicated that the self-reported frequency of use does not follow a normal distribution, D (269) = 0.191, p < 0.001. We used a square root transformation to obtain a more normal distribution of scores.
Number of system logins. Every time a user logs into the PA technology, it is recorded. Based on this data, we calculated the total number of logins for each user by counting every login of a user after survey participation (i.e. ranging from the start of February until the start of March) up to the end of April 2022. Again, the Kolmogorov-Smirnov test indicated a non-normal distribution, D (270) = 0.203, p < 0.001. A histogram showed a positively skewed distribution. To address this non-normality, we used a square root transformation.
Proportion of system access used. The PA technology in our research setting consists of modules corresponding to different HR domains such as recruitment, absenteeism, diversity and inclusion, and retention. In addition, modules can have multiple pages, e.g. the module “recruitment” has three pages: “new hires”, “failed hires”, and “hiring channels”. Hence, each module and page presents the user with different insights. Altogether, the PA technology has 15 modules and 43 pages. However, users do not have automatic access to all modules. Instead, the modules and pages a user can access are determined by their organization. The system records each user’s access rights and which of those modules and pages they actually visit. Based on the data on each user’s access and the access used, we created a variable representing the percentage of access used. The higher this percentage, the more thoroughly a user uses the PA technology. The Kolmogorov-Smirnov test indicated a non-normal distribution, D (227) = 0.141, p < 0.001. A histogram showed a positively skewed distribution, thus we again used a square root transformation.
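To make the treatment of the three use measures concrete, the sketch below shows one way to run the normality check, the square root transformation, and the access-used ratio in Python. The simulated arrays stand in for the actual survey and Crunchr log data, and the variable names are our own illustrative choices rather than the authors' code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Illustrative stand-ins for the real data: positively skewed login counts and
# per-user counts of accessible vs. visited pages.
logins = rng.poisson(lam=3, size=270).astype(float)
pages_accessible = rng.integers(5, 44, size=270)            # up to 43 pages exist
pages_visited = np.minimum(rng.poisson(lam=6, size=270), pages_accessible)

# Kolmogorov-Smirnov test against a normal distribution with the sample mean and SD
d_stat, p_value = stats.kstest(logins, 'norm', args=(logins.mean(), logins.std(ddof=1)))
print(f"KS D = {d_stat:.3f}, p = {p_value:.4f}")

# Square root transformation to reduce positive skew before further analysis
logins_sqrt = np.sqrt(logins)

# Proportion of granted access actually used (0-1); higher = more thorough use
access_used = pages_visited / pages_accessible
access_used_sqrt = np.sqrt(access_used)
```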
Analyses
To identify relevant user profiles in our sample, we used LPA. As explained below, we used the manual BCH three-step approach (Bakk et al., 2013; Bolck et al., 2004) in Mplus 8.4, as it is the recommended approach in cases where continuous distal outcomes are predicted (e.g. Nylund-Gibson et al., 2019). Following Magidson et al. (2020) and Peeters et al. (2021), we used item scores rather than scale mean scores in the LPA [1]. By using item scores, we do not lose any variance due to aggregating the items to the scale level, which gives a more accurate and detailed view. In the first step, we identified the best-fitting model by comparing models with different numbers of profiles based on the UTAUT survey items. We allowed the means, but not the variances, to be freely estimated. In this step we also saved the posterior probabilities and modal class assignments for the best-fitting model. We determined the best-fitting model by comparing model fit indices. Better model fit is indicated by lower values for the AIC, BIC, and SABIC. In addition, significant LMR and BLRT p-values indicate that the current model fits better than one with one less profile. Moreover, higher values for entropy indicate greater classification accuracy. In the second step, classification errors for each individual are computed. In the third step, we used class membership (step 1) and the inverse logits of individual-level error rates (step 2) as fixed class-specific multinomial intercepts to assess differences across profiles for each outcome. We used model constraints to compare the mean scores of each system use indicator across profiles. Although the UTAUT factors and some of the outcomes were collected from the same source, common method bias is unlikely to be an issue for methods such as LPA that aim to explain covariances among a set of indicators (Meyer and Morin, 2016). In addition, we limited the risks of such bias by using a different source to measure most of our outcome variables.
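The profile enumeration itself was run in Mplus with the manual BCH three-step approach, for which there is no direct drop-in open-source equivalent. As a rough, simplified illustration of the first step only (fitting mixture models with an increasing number of profiles to the 14 UTAUT item scores and comparing fit indices), the sketch below uses scikit-learn's GaussianMixture on simulated data. The simulated matrix, the 'tied' covariance setting (approximating variances that are not freely estimated across profiles), and the relative-entropy formula are our own assumptions, not the authors' code; the LMR/BLRT tests and the BCH outcome steps are not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Stand-in for the (279 users x 14 items) matrix of PE, EE, SI, and FC item scores
X = np.clip(rng.normal(loc=5.5, scale=1.0, size=(279, 14)), 1, 7)

for k in range(1, 6):
    # Means vary per profile; 'tied' covariance roughly mirrors holding variances
    # constant across profiles while freely estimating the profile means.
    gm = GaussianMixture(n_components=k, covariance_type='tied',
                         n_init=10, random_state=0).fit(X)
    if k > 1:
        post = gm.predict_proba(X)
        # Relative entropy: 1 = perfect classification, 0 = classification at chance
        entropy = 1 - (-(post * np.log(post + 1e-12)).sum() / (len(X) * np.log(k)))
    else:
        entropy = float('nan')
    print(f"{k} profiles: AIC={gm.aic(X):.1f}, BIC={gm.bic(X):.1f}, entropy={entropy:.3f}")
```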
Results
Descriptive statistics
Tables 1 and 2 present the descriptive statistics of the variables in this study, including means, standard deviations, and correlations at the scale and item level, respectively. As can be seen from these tables, all focal study variables correlated positively with one another.
PA technology user profiles
First, we determined the number of profiles using an exploratory approach in which the number of profiles was increased until the most statistically and theoretically suitable model was found. The number of profiles was increased until the model fit indices no longer indicated a model improvement or until the model no longer converged. The model fit indices for LPA solutions up to five profiles are presented in Table 3 below. As can be seen from the table, AIC, BIC, and SABIC had the lowest values for the most complicated model (i.e. five profiles). Although a higher entropy value was obtained for the five-profile solution compared to the four-profile solution, the former showed a substantially worse LMR statistic. Also, the decrease in AIC, BIC, and SABIC becomes smaller from the four- to five-profile solution in comparison to the decrease between the two- and three-profile and between the three- and four-profile solutions. In addition, the five-profile solution resulted in two very small subgroups (5 and 6% of the total sample). As the entropy of the four-profile solution still showed good identification of the different groups, the smallest group still included an adequate proportion of the complete sample, and the solution showed better AIC, BIC, and SABIC values than the one-, two-, and three-profile solutions, we decided on the four-profile solution. Figure 1 depicts the four-profile solution and Table 4 provides the means and standard deviations for each of the profiles.
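As a quick reference for how the fit indices in Table 3 relate to the loglikelihood and the number of free parameters, the snippet below computes AIC, BIC, and the sample-size adjusted BIC and reproduces the one-profile row of Table 3. The function is our own illustration rather than part of the Mplus output.

```python
import math

def information_criteria(loglik, n_params, n_obs):
    """AIC, BIC, and sample-size adjusted BIC from a model's loglikelihood."""
    aic = -2 * loglik + 2 * n_params
    bic = -2 * loglik + n_params * math.log(n_obs)
    sabic = -2 * loglik + n_params * math.log((n_obs + 2) / 24)
    return aic, bic, sabic

# One-profile row of Table 3: LL = -6068.71, 28 free parameters, N = 279
print([round(v, 2) for v in information_criteria(-6068.71, 28, 279)])
# -> approximately [12193.42, 12295.09, 12206.31]
```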
The first and largest profile (44.1%; pink line) scored high on all items, thus being positive regarding the performance benefits and the social influence, considering the facilitating conditions to be sufficient, and finding the PA technology easy to use. Given this, we labeled profile 1 “enthusiasts”. The second profile (33.7%; blue line) scored slightly above average on most items and we therefore labeled this profile “optimists”. Profile 3 (11.8%; green line) was labeled “optimistic strugglers” because individuals in this profile on the one hand have very positive evaluations of the performance benefits of using PA and recognize that others support its use, but on the other hand find it difficult to use the system and are hesitant regarding the facilitating infrastructure. The last and smallest profile (10.4%; red line) has the lowest scores on all items except the effort expectancy items. Overall, profile 4 has somewhat negative evaluations of the performance benefits, facilitating conditions, and ease of use, and scores slightly higher on social influence. Because individuals in this profile appear to be skeptical of the benefits, facilitating conditions, and ease of use of PA technology, yet are probably motivated to adopt PA technology because its use is desired by important others, we labeled profile 4 “skeptic diplomats”.
Next, we used the BCH approach to test whether user profiles were related to differences in user satisfaction, self-reported frequency of use, number of logins, and how much of users’ access is used (see Figure 2 and Table 5 below). Regarding user satisfaction, our findings demonstrate that the enthusiasts exhibited the highest levels of satisfaction (M = 6.20), followed by the optimists (M = 5.72), the optimistic strugglers (M = 3.84) and the skeptic diplomats (M = 3.83). The differences in user satisfaction were significant between all profiles, except between the skeptic diplomats and the optimistic strugglers (Table 5). A similar trend can be seen for self-reported frequency of use: the enthusiasts reported the highest frequency of use (M = 7.07), followed by the optimists (M = 5.34), the optimistic strugglers (M = 3.67) and the skeptic diplomats (M = 3.63). Again, the differences in self-reported frequency of use were significant between all profiles, except between the skeptic diplomats and the optimistic strugglers.
Regarding the number of logins, the enthusiasts (M = 3.26) had logged in most often, followed by the optimists (M = 1.90), the optimistic strugglers (M = 1.49), and the skeptic diplomats (M = 1.33). However, the differences in the number of logins were not significant between the optimists, the optimistic strugglers, and the skeptic diplomats; only the enthusiasts differed significantly from the other profiles in this regard (see Table 5). Looking at how much of their access users actually used, the profiles exhibit a slightly different trend. Although the enthusiasts again exhibit the highest scores (i.e. make most use of their access; M = 0.55), they are followed by the optimistic strugglers (M = 0.42) and the skeptic diplomats (M = 0.41). The optimists seem to make the most limited use of their access in Crunchr (M = 0.37). However, differences in access used are small and again only the enthusiasts differ significantly from the other profiles in this regard.
Thus, regarding our research question, we identified four profiles of PA technology users depending on their evaluations of the performance benefits, ease of use, social influence, and facilitating conditions to use PA technology. These profiles were labeled enthusiasts, optimists, optimistic strugglers, and skeptic diplomats. In addition, we found that enthusiasts (profile 1, pink) consistently exhibit different user behavior and satisfaction compared to the other three profiles. The optimists differed significantly from the skeptic diplomats and the optimistic strugglers in terms of satisfaction and self-reported frequency of use, but did not exhibit significant differences in terms of the number of logins and access used. The skeptic diplomats (profile 4, red) and the optimistic strugglers (profile 3, green) showed similar use behavior and satisfaction.
Discussion
Given the calls by Marler and Boudreau (2017) and Minbaeva (2018) to investigate factors that hinder organizations from moving to PA, and the limited research on individual-level adoption of PA, this study focused on a vital resource in the implementation of PA in decision-making processes, namely the technology used for PA. Building on the call by Theres and Strohmeier (2023), we used the UTAUT model as the foundation of this study to contribute to our understanding of PA adoption. Furthermore, to investigate the possibility that predictors of PA technology adoption work in concert rather than separately, we applied an LPA. Altogether, the aim of the present study was to assess whether UTAUT factors jointly act among individuals (i.e. whether user profiles exist) and how this relates to the use of and satisfaction with PA technology. Moreover, we aimed to extend existing research on individual PA adoption by focusing specifically on behaviors rather than behavioral intentions.
First, LPA revealed four profiles of PA technology users depending on their evaluations of the four UTAUT factors (i.e. performance expectancy, effort expectancy, social influence, and facilitating conditions): enthusiasts, optimists, optimistic strugglers, and skeptic diplomats. The enthusiasts had the most positive evaluations of the four UTAUT factors, and the optimists scored above average on most items. The optimistic strugglers scored high on performance expectancy and social influence, but lower on effort expectancy and facilitating conditions, and the skeptic diplomats had the least positive evaluations overall. Overall, these results support the existence of PA technology user profiles. Besides demonstrating the heterogeneity of PA technology users, the findings also reveal interrelationships between the UTAUT factors, which adds a new theoretical perspective to PA adoption. Thus, using LPA, we also go beyond the recent meta-analysis showing that UTAUT is a fruitful theoretical foundation to study the adoption of digital HRM, by uncovering interrelationships between UTAUT factors that affect technology adoption.
Second, we explored whether user profiles were related to differences in user satisfaction, self-reported frequency of use, number of logins, and how much of users’ access is used. We found that enthusiasts are the most satisfied and most active users of PA technology. In general, this suggests that having highly positive evaluations of PA technology was related to higher satisfaction, more frequent use of the technology, and use of the diverse possibilities of the technology. Interestingly, user satisfaction, frequency of use, and the number of logins were significantly lower for the optimists compared to the enthusiasts even though their item scores on the UTAUT factors appear to be similar (Figure 1). Hence, even though the profile shape of optimists and enthusiasts is similar and both have positive evaluations of the UTAUT factors, their difference in behavior and attitude suggests that differentiating between these profiles is practically relevant, as enthusiasts make more use of the technology, which facilitates PA adoption.
In contrast, the skeptic diplomats and the optimistic strugglers exhibited almost identical behavior and attitudes. At first glance, this was surprising as their evaluations of performance expectancy and social influence differ (Figure 1). Looking at their evaluations of effort expectancy and facilitating conditions, however, skeptic diplomats and optimistic strugglers were very similar, and both score much lower on these items than the two other profiles. These findings suggest that effort expectancy and/or facilitating conditions could be the driving force behind PA technology use and user satisfaction, and that it is especially when individuals score low on these that they are likely to struggle. Thus, ensuring that PA technology is easy to use and that the organizational and technological infrastructure is adequate may be the most important conditions for facilitating the use of PA technology and subsequently the adoption of PA.
Together, these findings show the relevance of user perceptions and evaluations of PA technology for their interaction and satisfaction with this technology. They suggest that organizations that want to move towards PA should invest in PA technology that is considered beneficial for individual performance and easy to use, ensure support from important others, and provide adequate infrastructure. Moreover, the findings also suggest that ease of use and adequate infrastructure may be especially important for organizational members to feel able to adopt and use the technology used for PA. Altogether, those users that are very positive – the enthusiasts – interact most with PA technology. However, although the most positive profile (i.e. enthusiasts) is related to the highest use and satisfaction, the LPA may hint towards a more complex relationship between UTAUT profiles, user behavior, and satisfaction. For example, looking at the effort expectancy items in Figure 1, the difference between optimists and optimistic strugglers overall seems to be larger than the difference between optimists and skeptic diplomats. Yet, the difference in user behavior and satisfaction between the optimists, optimistic strugglers, and skeptic diplomats is almost equal. This suggests that besides being quantitatively different (i.e. generally speaking, the higher the scores, the higher the outcomes), the profiles are also qualitatively different (i.e. the different nature of the profiles is associated with different outcomes), showing the benefits of using a person-centered approach.
Theoretical implications
This study makes several theoretical contributions. Broadly, the present study contributes to the PA literature by testing to what extent the UTAUT factors can help to explain differences in PA technology adoption. While UTAUT has been applied in various contexts to understand and explain technology use and use intentions (Venkatesh et al., 2016), including the context of digital HRM, complex interrelationships between the different UTAUT factors had not been examined to date. Our findings suggest that performance expectancy, effort expectancy, social influence, and facilitating conditions are relevant factors that work in concert to facilitate the use of PA technology. This shows the relevance of the UTAUT and the value of a person-centered approach in research on PA adoption.
As mentioned above, our findings suggest that effort expectancy and/or facilitating conditions may be the most important drivers of PA technology use and user satisfaction. These UTAUT factors reflect an individual’s beliefs regarding the ease of use associated with the technology and whether sufficient infrastructure (e.g. data quality) exists to support use of the technology; in our profiles, their influence appears to unfold in combination with high levels of performance expectancy and social influence. Vargas et al. (2018) concluded that the individual level of adoption was driven predominantly by variables related to the individual’s perception of being capable of doing analytics. We build on these findings by showing how the perception of capabilities can be combined with other UTAUT factors in the different profiles.
To the best of our knowledge, this is the first study that uses a person-centered approach to study the effect of UTAUT factors on individual use behavior and satisfaction. While the LPA revealed four distinct profiles, the profiles follow two general shapes: the skeptic diplomats and optimistic strugglers have more fluctuating scores on specific items, while the optimists and enthusiasts display a stable shape with relatively equal (high) scores on all items. The more stable profiles, enthusiasts and optimists, were the most prevalent (together 78%). This also indicates that the majority of our sample was attitudinally quite positive regarding the PA technology, yet the actual technology use of a sizable part of this group (the optimists, almost 34%) still lags behind and is more similar to that of the roughly 20% who struggle or are skeptical. In addition, more than 1 in 5 users in our sample struggled with or were skeptical of using PA technology in attitude, and combined with the optimists this means around half of the sample does not show optimal usage. Hence, using a person-centered approach, we were able to describe and detail relevant subgroups and how they vary attitudinally and behaviorally within the user population (Howard and Hoffman, 2018).
Also, as mentioned above, using a person-centered approach, we found that UTAUT factors in combination can account for differences in user satisfaction and PA technology use. Our finding that more positive evaluations of UTAUT factors relate to higher satisfaction and technology use is consistent with previous research on information systems adoption (Venkatesh et al., 2016) and with the finding by Vargas et al. (2018) that positive attitudes, self-efficacy, and social influence increase PA adoption intentions. However, as explained above, we also see qualitative differences between the profiles that are related to the outcomes; we see different patterns when looking at the user behavior and satisfaction of the four user profiles identified. Although high levels on all UTAUT factors – the enthusiasts – relate to the highest levels of satisfaction and use, the other profiles with lower scores indicate that the relationship is more complex. Overall, this underscores the contribution of using a person-centered approach in research on PA technology adoption.
Another theoretical implication we would like to highlight relates to the profiles that were (not) identified in this study. By applying LPA, we demonstrated that the population of PA technology users is heterogeneous in their evaluations of PA technology aspects such as the perceived benefit. Hence, LPA and other person-centered approaches can be a valuable way to add to our current understanding of PA (technology) adoption and offer complementary theoretical knowledge on this subject. Besides the profiles that were identified in our analysis, future research could consider profiles that were absent in our sample and explore how they relate to user behaviors. For example, we did not identify profiles with contrasting evaluations of effort expectancy and facilitating conditions (i.e. high-low or low-high), profiles with contrasting evaluations of performance expectancy and social influence, or profiles where one of the variables is high while the others are low. Altogether, we encourage future research to adopt a person-centered approach and further explore existing subgroups in the PA (user) population, and how they differ in their attitudes and behaviors.
Limitations and future research
Several limitations of the current study should be acknowledged. First, although there is currently no simple way to estimate the required sample size in LPA (Ferguson et al., 2020) and sample sizes across LPA studies vary widely (ranging from 131 to over 16,000; Spurk et al., 2020), a sample size of about 500 is recommended. Although our sample is smaller, the model fit indices show a good fit. Therefore, the profile structures of PA technology users and their relationships with satisfaction and use behavior should be tested in future research in order to gain more confidence in our findings.
Furthermore, we want to highlight a potential avenue for future research regarding the social influence and facilitating conditions factors included in this study. The definition of social influence refers to “important others”, which is very broad and does not give much guidance on who may be considered important others in the adoption process. In the present study, we operationalized social influence by referring to the direct manager, senior management, and a general role expectation. However, there may be other potential “influencers” that were not yet included in this study, such as pressure from competitors and business partners, other employees, and organizational culture (Fernandez and Gallardo-Gallardo, 2020; Shet et al., 2021). To better understand social influence, future research could use a qualitative approach to investigate who exerts social influence in this context, both in situations of initial adoption as well as sustained use of PA technology. Similarly, the organizational and technological infrastructure encompasses many different aspects that were not included in this study, such as data accessibility, data integration, and a strategic focus in HRM (Fernandez and Gallardo-Gallardo, 2020; Shet et al., 2021). Future research should explore the importance of other aspects of organizational and technological infrastructure.
Finally, the present paper studied the individual adoption of a PA technology in an organizational context and to what extent UTAUT factors influence the use of such technology. We focused on system use because technology is a significant enabler of PA. However, the use of PA technology does not directly translate into improved organizational performance or evidence-based decision-making. While the access to individual user data can be considered a strength of this study, it only provides insights regarding users’ interaction with the PA technology. It does not yet tell us what users subsequently do with the information they retrieve from the system. Thus, we cannot draw any conclusions regarding how UTAUT factors and PA technology enable decision-makers to enhance their decisions and the way the workforce is managed. Therefore, how PA technology is used from day to day, to what extent it supports the decision-making process, and how this differs between the different user profiles remains an avenue for future research.
Practical implications
Our study reveals that the PA technology users in organizations are heterogeneous in their evaluations of the performance benefits and ease of use of PA technology, social influence, and the available organizational and technical infrastructure to support the use of such technology. Increasing employees’ beliefs that using PA technology will benefit their performance, that its use is desired by important others, that using PA technology falls within their capabilities, and that the organizational and technical infrastructure exists to support them can generally be facilitated by clear communication: sharing success stories, expressing the value that management attaches to using PA technology, and communicating about the available resources. In addition, to further facilitate the evaluation that PA technology is easy to use, the organization could provide resources such as training and video tutorials. Moreover, organizations could create a user forum or Q&A channel in which users can share their experiences or ask and answer questions. Such a resource could create a strong community that makes social influence more apparent and may serve as an organizational resource at the same time.
If organizations have limited time and resources to support their employees (i.e. intended users) and to find out who their more enthusiastic and who their more skeptical users are, our findings suggest prioritizing actions oriented towards improving individual capabilities and the user-friendliness of PA technology, and investing in the necessary organizational and technical infrastructure. Knowing that PA technology users are heterogeneous in their perceptions, organizations may especially capitalize on the social influence that the enthusiasts could exert over the other user profiles in their organization in order to improve use and satisfaction.
If organizations do have more time and resources available, they could study which user groups they have in their organizations. Overall, the change literature suggests that those who generally have a positive attitude towards organizational change (here PA adoption) are more likely to be affected by communication around the change aimed at increasing usage than those who are very skeptical (Oreg et al., 2011; Stouten et al., 2018). Also, the work by Coolen et al. (2023) on the different adoption phases suggests that targeted communication could focus on these phases. Thus, specific communication for the groups that need more familiarization (in our profiles the skeptic diplomats and optimistic strugglers) can help them see the benefits and take away worries about extra effort. Those that are already positive but have not yet (sufficiently) behaviorally adopted (the optimists in our study) can benefit from specific tips and guidance on usage, and the enthusiastic users (enthusiasts in our study) could benefit from further individualization, such as communication about advanced options and help to implement features they see as especially useful. Thus, it is important that managers and organizations are aware of potential differences in their user population, and ideally communication should be targeted to these groups.
Also, we demonstrate the combined effect of the UTAUT factors on user satisfaction and PA technology use and show that it is important to improve users’ evaluations of PA technology, even for those users that are already quite positive. Almost half of the group that was attitudinally positive did not yet use the technology optimally. Hence, managers should explore ways to support especially these employees to make better use of the system and to create positive perceptions of the technology that is available for PA, thereby increasing the number of enthusiasts in their organization. Our study implies that such targeted efforts would have beneficial effects for all users – both the skeptical and the optimistic users.
Conclusion
Many organizations strive to adopt PA in a quest to increase HRM’s overall strategic value to the business and improve individual and organizational outcomes. Despite their enthusiasm, organizations struggle to adopt PA. As technology has been identified as a significant enabler of PA, we used LPA to investigate whether multiple profiles of technology users exist based on their perceptions of UTAUT factors and how those profiles relate to using PA technology. We identified four user profiles. As these user profiles are related to different levels of satisfaction and use behavior, this study sheds light on the importance of positive perceptions of PA technology, especially the importance of effort expectancy and facilitating conditions, for individual adoption, which may aid organizations in further embedding PA into the decision-making process.
Figures
Means, standard deviations, and correlations at the scale level
Mean | SD | N | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | Performance expectancy | 6.15 | 0.78 | 279 | ||||||||||||
2 | Effort expectancy | 5.23 | 1.32 | 279 | 0.411** | |||||||||||
3 | Social influence | 5.97 | 0.85 | 279 | 0.468** | 0.245** | ||||||||||
4 | Facilitating conditions | 5.42 | 1.03 | 279 | 0.453** | 0.573** | 0.485** | |||||||||
5 | User satisfaction | 5.52 | 1.16 | 278 | 0.480** | 0.773** | 0.289** | 0.634** | (0.922) | |||||||
6 | Use frequency (SR) | 5.71 | 2.56 | 269 | 0.377** | 0.505** | 0.304** | 0.339** | 0.457** | |||||||
7 | Number of logins | 2.38 | 1.94 | 270 | 0.290** | 0.338** | 0.264** | 0.239** | 0.318** | 0.704** | ||||||
8 | Access used | 0.46 | 0.25 | 227 | 0.172** | 0.180** | 0.257** | 0.197** | 0.203** | 0.505** | 0.751** | |||||
9 | Gender | 0.63 | 0.48 | 266 | 0.159** | 0.104 | 0.015 | 0.140* | 0.172** | 0.146* | 0.144* | −0.022 | ||||
10 | Age | 3.50 | 0.94 | 264 | −0.095 | −0.265** | 0.033 | −0.071 | −0.189** | −0.246** | −0.298** | −0.215** | −0.077 | |||
11 | Education | 3.57 | 0.60 | 275 | 0.121* | 0.091 | 0.125* | 0.113 | 0.018 | 0.013 | 0.038 | 0.058 | −0.09 | −0.033 | ||
12 | Work experience | 20.26 | 8.20 | 268 | −0.045 | −0.213** | 0.059 | −0.047 | −0.149* | −0.189** | −0.252** | −0.136* | −0.099 | 0.835** | −0.088 | |
13 | Previous PA tech use | 0.93 | 0.79 | 275 | 0.106 | 0.124* | 0.188** | 0.138* | 0.015 | 0.229** | 0.167** | 0.052 | 0.114 | 0.025 | 0.125* | 0.025 |
Note(s): N = sample size, SD = standard deviation. Cronbach’s alpha in between brackets. SR = self-reported. Gender measured with 0 = male and 1 = female. Age measured in categories: 0 = <18, 1 = 18–24, 2 = 25–34, 3 = 35–44, 4 = 45–54, 5 = 55–64, 6 = >65. Education measured with 1 = primary education, 2 = secondary education, 3 = Bachelor’s degree, 4 = Master’s degree, 5 = PhD. Work experience measured in years. **p < 0.01, *p < 0.05 (two-tailed test)
Source(s): Author's own work
Means, standard deviations, and correlations at the item level
Mean | SD | N | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | Performance expectancy 1 | 6.39 | 0.80 | 279 | ||||||||||||
2 | Performance expectancy 2 | 5.99 | 0.98 | 279 | 0.595** | |||||||||||
3 | Performance expectancy 3 | 6.13 | 0.94 | 279 | 0.682** | 0.711** | ||||||||||
4 | Performance expectancy 4 | 6.11 | 0.95 | 279 | 0.556** | 0.616** | 0.664** | |||||||||
5 | Effort expectancy 1 | 5.35 | 1.35 | 279 | 0.296** | 0.311** | 0.309** | 0.352** | ||||||||
6 | Effort expectancy 2 | 5.11 | 1.45 | 278 | 0.262** | 0.348** | 0.282** | 0.382** | 0.801** | |||||||
7 | Effort expectancy 3 | 5.26 | 1.36 | 278 | 0.321** | 0.365** | 0.314** | 0.370** | 0.821** | 0.872** | ||||||
8 | Social influence 1 | 6.10 | 0.95 | 279 | 0.343** | 0.329** | 0.367** | 0.454** | 0.190** | 0.149* | 0.148* | |||||
9 | Social influence 2 | 5.99 | 1.06 | 279 | 0.296** | 0.319** | 0.283** | 0.385** | 0.192** | 0.212** | 0.228** | 0.552** | ||||
10 | Social influence 3 | 5.82 | 1.14 | 279 | 0.248** | 0.274** | 0.233** | 0.367** | 0.167** | 0.155** | 0.199** | 0.300** | 0.575** | |||
11 | Facilitating conditions 1 | 5.24 | 1.41 | 279 | 0.214** | 0.399** | 0.329** | 0.355** | 0.360** | 0.384** | 0.424** | 0.348** | 0.377** | 0.382** | ||
12 | Facilitating conditions 2 | 5.25 | 1.41 | 279 | 0.133* | 0.198** | 0.292** | 0.276** | 0.403** | 0.381** | 0.398** | 0.213** | 0.285** | 0.367** | 0.434** | |
13 | Facilitating conditions 3 | 5.39 | 1.27 | 279 | 0.214** | 0.346** | 0.259** | 0.300** | 0.417** | 0.410** | 0.442** | 0.184** | 0.173** | 0.330** | 0.370** | 0.599** |
14 | Facilitating conditions 4 | 5.81 | 1.26 | 279 | 0.276** | 0.346** | 0.341** | 0.441** | 0.484** | 0.408** | 0.469** | 0.241** | 0.296** | 0.353** | 0.461** | 0.441** |
15 | User satisfaction | 5.52 | 1.16 | 278 | 0.371** | 0.429** | 0.419** | 0.412** | 0.746** | 0.687** | 0.756** | 0.174** | 0.223** | 0.291** | 0.427** | 0.497** |
16 | Use frequency (SR) | 5.71 | 2.56 | 269 | 0.300** | 0.296** | 0.342** | 0.351** | 0.479** | 0.475** | 0.453** | 0.333** | 0.257** | 0.163** | 0.233** | 0.335** |
17 | Number of logins | 2.38 | 1.94 | 270 | 0.219** | 0.192** | 0.252** | 0.326** | 0.289** | 0.331** | 0.330** | 0.277** | 0.238** | 0.135* | 0.174** | 0.230** |
18 | Access used | 0.46 | 0.25 | 227 | 0.121 | 0.09 | 0.154* | 0.210** | 0.130* | 0.194** | 0.159* | 0.213** | 0.224** | 0.175** | 0.187** | 0.193** |
19 | Gender | 0.63 | 0.48 | 266 | 0.172** | 0.053 | 0.153* | 0.171** | 0.150* | 0.08 | 0.08 | 0.096 | −0.036 | −0.016 | 0.058 | 0.165** |
20 | Age | 3.50 | 0.94 | 264 | −0.1 | −0.056 | −0.066 | −0.106 | −0.220** | −0.276** | −0.266** | 0.074 | −0.027 | 0.037 | 0.015 | −0.039 |
21 | Education | 3.57 | 0.60 | 275 | 0.096 | 0.126* | 0.108 | 0.08 | 0.085 | 0.108 | 0.048 | 0.163** | 0.102 | 0.047 | 0.137* | 0.066 |
22 | Work experience | 20.26 | 8.20 | 268 | −0.056 | −0.019 | −0.046 | −0.035 | −0.193** | −0.222** | −0.212** | 0.117 | 0.01 | 0.024 | 0.022 | −0.033 |
23 | Previous PA tech use | 0.93 | 0.79 | 275 | 0.103 | 0.101 | 0.086 | 0.073 | 0.127* | 0.142* | 0.084 | 0.254** | 0.140* | 0.075 | 0.179** | 0.095 |
Mean | SD | N | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | Performance expectancy 1 | 6.39 | 0.80 | 279 | |||||||||||
2 | Performance expectancy 2 | 5.99 | 0.98 | 279 | |||||||||||
3 | Performance expectancy 3 | 6.13 | 0.94 | 279 | |||||||||||
4 | Performance expectancy 4 | 6.11 | 0.95 | 279 | |||||||||||
5 | Effort expectancy 1 | 5.35 | 1.35 | 279 | |||||||||||
6 | Effort expectancy 2 | 5.11 | 1.45 | 278 | |||||||||||
7 | Effort expectancy 3 | 5.26 | 1.36 | 278 | |||||||||||
8 | Social influence 1 | 6.10 | 0.95 | 279 | |||||||||||
9 | Social influence 2 | 5.99 | 1.06 | 279 | |||||||||||
10 | Social influence 3 | 5.82 | 1.14 | 279 | |||||||||||
11 | Facilitating conditions 1 | 5.24 | 1.41 | 279 | |||||||||||
12 | Facilitating conditions 2 | 5.25 | 1.41 | 279 | |||||||||||
13 | Facilitating conditions 3 | 5.39 | 1.27 | 279 | 0.599** | ||||||||||
14 | Facilitating conditions 4 | 5.81 | 1.26 | 279 | 0.441** | 0.399** | |||||||||
15 | User satisfaction | 5.52 | 1.16 | 278 | 0.497** | 0.525** | 0.507** | (0.922) | |||||||
16 | Use frequency (SR) | 5.71 | 2.56 | 269 | 0.335** | 0.253** | 0.219** | 0.457** | |||||||
17 | Number of logins | 2.38 | 1.94 | 270 | 0.230** | 0.141* | 0.184** | 0.318** | 0.704** | ||||||
18 | Access used | 0.46 | 0.25 | 227 | 0.193** | 0.086 | 0.118 | 0.203** | 0.505** | 0.751** | |||||
19 | Gender | 0.63 | 0.48 | 266 | 0.165** | 0.086 | 0.125* | 0.172** | 0.146* | 0.144* | −0.022 | ||||
20 | Age | 3.50 | 0.94 | 264 | −0.039 | −0.106 | −0.1 | −0.189** | −0.246** | −0.298** | −0.215** | −0.077 | |||
21 | Education | 3.57 | 0.60 | 275 | 0.066 | 0.108 | 0.034 | 0.018 | 0.013 | 0.038 | 0.058 | −0.09 | −0.033 | ||
22 | Work experience | 20.26 | 8.20 | 268 | −0.033 | −0.07 | −0.07 | −0.149* | −0.189** | −0.252** | −0.136* | −0.099 | 0.835** | −0.088 | |
23 | Previous PA tech use | 0.93 | 0.79 | 275 | 0.095 | 0.025 | 0.119* | 0.015 | 0.229** | 0.167** | 0.052 | 0.114 | 0.025 | 0.125* | 0.025 |
Note(s): N = sample size, SD = standard deviation. SR = self-reported. Cronbach’s alpha in brackets. Gender measured with 0 = male and 1 = female. Age measured in categories: 0 = <18, 1 = 18–24, 2 = 25–34, 3 = 35–44, 4 = 45–54, 5 = 55–64, 6 = >65. Education measured with 1 = primary education, 2 = secondary education, 3 = Bachelor’s degree, 4 = Master’s degree, 5 = PhD. Work experience measured in years. **p < 0.01, *p < 0.05 (two-tailed test)
Source(s): Author's own work
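As an editorial illustration only (not the authors' code), the sketch below shows how descriptives, scale reliability, and a pairwise-deletion correlation matrix like the one above could be computed, assuming item-level responses in a CSV file; the file and column names (e.g. satisfaction_1–satisfaction_3) are hypothetical.

```python
# Minimal sketch (not the authors' code): descriptives, Cronbach's alpha,
# and a Pearson correlation matrix, assuming item-level survey data in a
# pandas DataFrame with hypothetical column names.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items (rows = respondents)."""
    items = items.dropna()
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

df = pd.read_csv("survey_items.csv")  # hypothetical file, one column per item

# Per-item mean, SD, and N, as in the first columns of the table
descriptives = df.agg(["mean", "std", "count"]).T

# Pairwise Pearson correlations (pairwise deletion, consistent with varying N)
correlations = df.corr(method="pearson")

# Reliability of the multi-item satisfaction scale (hypothetical column names)
alpha_sat = cronbach_alpha(df[["satisfaction_1", "satisfaction_2", "satisfaction_3"]])

print(descriptives.round(2))
print(correlations.round(3))
print(f"Cronbach's alpha (satisfaction): {alpha_sat:.3f}")
```

Pandas computes correlations with pairwise deletion by default, which mirrors the varying N across variables reported in the table.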
Summary of LPA for different models
LL | FP | AIC | BIC | SABIC | Entropy | LMR | BLRT | Smallest class % | |
---|---|---|---|---|---|---|---|---|---|
1 profile | −6068.71 | 28 | 12193.41 | 12295.09 | 12206.30 | – | – | – | – |
2 profiles | −5575.07 | 43 | 11236.13 | 11392.27 | 11255.92 | 0.966 | 0.047 | <0.001 | 22% |
3 profiles | −5434.55 | 58 | 10985.11 | 11195.72 | 11011.81 | 0.974 | 0.04 | <0.001 | 11% |
4 profiles | −5293.65 | 73 | 10733.30 | 10998.38 | 10766.90 | 0.908 | 0.09 | <0.001 | 10% |
5 profiles | −5232.19 | 88 | 10640.39 | 10959.93 | 10680.89 | 0.918 | 0.54 | <0.001 | 5% |
Note(s): Italic indicates the selected model (the four-profile solution). Dashes indicate the criterion was not calculated for the model. LL = log-likelihood; FP = free parameters; AIC = Akaike information criterion; BIC = Bayesian information criterion; SABIC = sample-size adjusted BIC; LMR = Lo-Mendell-Rubin likelihood ratio test; BLRT = bootstrap likelihood ratio test
Source(s): Author's own work
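The fit indices above summarize the comparison of one- to five-profile LPA solutions. As a rough approximation of that comparison under different assumptions, the sketch below fits Gaussian mixture models with scikit-learn; the data file is hypothetical, variances are left free across profiles (unlike a typical Mplus-style LPA, so free-parameter counts will differ), and the LMR and bootstrap likelihood ratio tests are not available in this library.

```python
# Sketch only (not the authors' setup): comparing 1-5 profile solutions with
# scikit-learn's GaussianMixture and reporting LL, AIC, BIC, SABIC, entropy,
# and the smallest class share.
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

X = pd.read_csv("utaut_items.csv").dropna().to_numpy()  # hypothetical item-level data
n, d = X.shape
rows = []

for k in range(1, 6):
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         n_init=50, random_state=1).fit(X)
    ll = gm.score(X) * n                        # total log-likelihood
    p = 2 * k * d + (k - 1)                     # means + variances + mixing weights
    sabic = -2 * ll + p * np.log((n + 2) / 24)  # sample-size adjusted BIC
    post = gm.predict_proba(X)
    entropy = (np.nan if k == 1 else
               1 - (-np.sum(post * np.log(post + 1e-12)) / (n * np.log(k))))
    smallest = np.bincount(gm.predict(X), minlength=k).min() / n * 100
    rows.append({"profiles": k, "LL": ll, "FP": p, "AIC": gm.aic(X),
                 "BIC": gm.bic(X), "SABIC": sabic, "entropy": entropy,
                 "smallest class %": smallest})

print(pd.DataFrame(rows).round(2))
```

Lower AIC, BIC, and SABIC and higher relative entropy indicate better-fitting, better-separated solutions; the study retains the four-profile model.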
Means and standard deviations for each profile
Item | Enthusiasts | Optimists | Optimistic strugglers | Skeptic diplomats |
---|---|---|---|---|
PE 1 | 6.86 (0.04) | 6.61 (0.11) | 6.19 (0.08) | 4.93 (0.23) |
PE 2 | 6.63 (0.08) | 6.03 (0.16) | 5.69 (0.08) | 4.29 (0.16) |
PE 3 | 6.81 (0.05) | 6.37 (0.13) | 5.71 (0.08) | 4.46 (0.20) |
PE 4 | 6.73 (0.05) | 6.06 (0.18) | 5.85 (0.09) | 4.45 (0.17) |
EE 1 | 6.03 (0.07) | 3.33 (0.29) | 5.73 (0.08) | 3.56 (0.29) |
EE 2 | 5.83 (0.11) | 2.87 (0.21) | 5.46 (0.10) | 3.40 (0.31) |
EE 3 | 6.02 (0.08) | 3.02 (0.20) | 5.55 (0.09) | 3.61 (0.29) |
SI 1 | 6.50 (0.08) | 6.16 (0.25) | 5.89 (0.09) | 5.08 (0.28) |
SI 2 | 6.46 (0.09) | 5.97 (0.22) | 5.77 (0.11) | 4.85 (0.27) |
SI 3 | 6.24 (0.10) | 5.57 (0.23) | 5.69 (0.11) | 4.78 (0.25) |
FC 1 | 5.87 (0.11) | 4.31 (0.32) | 5.18 (0.15) | 3.90 (0.26) |
FC 2 | 5.84 (0.13) | 4.08 (0.30) | 5.29 (0.14) | 4.04 (0.24) |
FC 3 | 5.91 (0.11) | 4.47 (0.32) | 5.49 (0.10) | 4.01 (0.26) |
FC 4 | 6.37 (0.07) | 4.85 (0.29) | 5.90 (0.10) | 4.23 (0.33) |
Note(s): PE = Performance expectancy; EE = Effort expectancy; SI = Social influence; FC = Facilitating conditions. Standard deviations in brackets
Source(s): Author's own work
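For the retained four-profile solution, profile-specific item means such as those tabulated above correspond to the estimated component means. Continuing the hypothetical scikit-learn sketch from the model-comparison table (the data file and item labels are assumptions, and the component order will not automatically match the profile names used in the paper):

```python
# Continuing the sketch above: estimated item means per profile from a
# four-profile solution. Labels and file are hypothetical.
import pandas as pd
from sklearn.mixture import GaussianMixture

X = pd.read_csv("utaut_items.csv").dropna().to_numpy()  # hypothetical item-level data
items = ["PE1", "PE2", "PE3", "PE4", "EE1", "EE2", "EE3",
         "SI1", "SI2", "SI3", "FC1", "FC2", "FC3", "FC4"]

gm4 = GaussianMixture(n_components=4, covariance_type="diag",
                      n_init=50, random_state=1).fit(X)
print(pd.DataFrame(gm4.means_, columns=items).round(2))
```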
Profile mean score difference test
Optimists vs Enthusiasts | Strugglers vs Enthusiasts | Skeptics vs Optimists | Skeptics vs Enthusiasts | Skeptics vs Strugglers | Optimists vs Strugglers | |
---|---|---|---|---|---|---|
Satisfaction | −0.48** | −2.36** | −1.89** | −2.36** | −0.01 | 1.88** |
Frequency (SR) | −1.74** | −3.40** | −1.70** | −3.44** | −0.04 | 1.67** |
Logins | −1.36** | −1.78** | −0.57 | −1.93** | −0.16 | 0.42 |
Access | −0.18** | −0.13* | 0.04 | −0.14* | −0.01 | −0.05 |
Note(s): *p < 0.05, **p < 0.01. SR = Self-reported. Strugglers = Optimistic strugglers; Skeptics = Skeptic diplomats
Source(s): Author's own work
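The pairwise differences above relate the latent profiles to satisfaction and use; the reference list includes bias-adjusted three-step approaches (Bakk et al., 2013; Nylund-Gibson et al., 2019) that account for classification error in such comparisons. As a simplified sketch that does not apply that correction, one could assign each respondent to the most likely profile and compare an outcome between profiles with Welch's t-tests (file and column names are hypothetical):

```python
# Rough sketch: pairwise outcome comparisons across modal profile assignments.
# Unlike bias-adjusted three-step methods, this does not correct for
# classification error in the profile assignments.
from itertools import combinations
import pandas as pd
from scipy import stats

df = pd.read_csv("scored_data.csv")  # hypothetical: outcomes plus a "profile" label
profiles = ["Enthusiasts", "Optimists", "Optimistic strugglers", "Skeptic diplomats"]

for a, b in combinations(profiles, 2):
    x = df.loc[df["profile"] == a, "user_satisfaction"].dropna()
    y = df.loc[df["profile"] == b, "user_satisfaction"].dropna()
    t, p = stats.ttest_ind(y, x, equal_var=False)  # Welch's t-test
    print(f"{b} vs {a}: mean difference = {y.mean() - x.mean():.2f}, p = {p:.3f}")
```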
Notes
In addition, for the sake of completeness, we ran a construct-level LPA, which is included in the supplementary materials. These results show patterns largely similar to the item-level results, although the item-level results give a more detailed view of the classes. We also ran a CFA (see the supplementary materials), which supports the construct validity of the UTAUT constructs.
Appendix Survey items
Performance expectancy
- (1)
I find people analytics useful in my job.
- (2)
Using people analytics enables me to accomplish tasks more quickly.
- (3)
Using people analytics enhances my effectiveness doing my job.
- (4)
Using people analytics improves the quality of the work I do.
Effort expectancy
- (1)
Crunchr and how to use it is clear.
- (2)
It was easy for me to become skillful at using Crunchr.
- (3)
I find Crunchr easy to use.
Social influence
- (1)
In my role, it is expected that I use people analytics.
- (2)
My direct manager finds it important to base people decisions on data.
- (3)
Senior management has made efforts to establish a positive attitude towards the use of people analytics for human resources management.
Facilitating conditions
- (1)
I am able to find the time I need to use people analytics in my job.
- (2)
Our analytics team has the capabilities and skills necessary to solve a varied set of business problems using people analytics.
- (3)
I believe that the data is reliable enough for HR to build credible people analytics.
- (4)
It is clear to me what I am allowed or not allowed to do with the data according to the internal guidelines.
Satisfaction – How satisfied are you with …
- (1)
… using Crunchr?
- (2)
… the insights that people analytics has provided you so far?
- (3)
… the business value that people analytics has delivered so far?
Self-reported use
- (1)
How often per week do you use Crunchr in your job on average? (never – once a week – 2–3 times a week – 3–6 times a week – daily).
The supplementary material for this article can be found online.
References
Ajzen, I. (1985), “From intentions to actions: a theory of planned behavior”, in Action Control, Springer, Berlin, pp. 11-39.
Akhtar, P., Frynas, G., Mellahi, K. and Ullah, S. (2019), “Big data-savvy teams' skills, big data-driven actions, and business performance”, British Journal of Management, Vol. 30 No. 2, pp. 252-271, doi: 10.1111/1467-8551.12333.
Aral, S., Brynjolfsson, E. and Wu, L. (2012), “Three-way complementarities: performance pay, human resource analytics, and information technology”, Management Science, Vol. 58 No. 5, pp. 913-931, doi: 10.1287/mnsc.1110.1460.
Bakk, Z., Tekle, F.B. and Vermunt, J.K. (2013), “Estimating the association between latent class membership and external variables using bias-adjusted three-step approaches”, Sociological Methodology, Vol. 43 No. 1, pp. 272-311, doi: 10.1177/0081175012470644.
Bolck, A., Croon, M. and Hagenaars, J. (2004), “Estimating latent structure models with categorical variables: one-step versus three-step estimators”, Political Analysis, Vol. 12 No. 1, pp. 3-27, doi: 10.1093/pan/mph001.
Coolen, P., van den Heuvel, S., Van De Voorde, K. and Paauwe, J. (2023), “Understanding the adoption and institutionalization of workforce analytics: a systematic literature review and research agenda”, Human Resource Management Review, Vol. 33 No. 4, p. 100985, doi: 10.1016/j.hrmr.2023.100985.
Cooper, J.T., Stanley, L.J., Stevens, C.E., Shenkar, O. and Kausch, C. (2020), “Switching analytical mindsets: a person-centered approach to the analysis of cultural values”, International Journal of Cross Cultural Management, Vol. 20 No. 2, pp. 223-247, doi: 10.1177/1470595820939981.
Davis, F.D. (1989), “Perceived usefulness, perceived ease of use, and user acceptance of information technology”, MIS Quarterly, Vol. 13 No. 3, pp. 319-340.
Ferguson, S.L., Moore, G.E.W. and Hull, D.M. (2020), “Finding latent groups in observed data: a primer on latent profile analysis in Mplus for applied researchers”, International Journal of Behavioral Development, Vol. 44 No. 5, pp. 458-468, doi: 10.1177/0165025419881721.
Fernandez, V. and Gallardo-Gallardo, E. (2020), “Tackling the HR digitalization challenge: key factors and barriers to HR analytics adoption”, Competitiveness Review: An International Business Journal, Vol. 31 No. 1, pp. 162-187, doi: 10.1108/CR-12-2019-0163.
Ferrar, J. and Green, D. (2021), Excellence in People Analytics: How to Use Workforce Data to Create Business Value, Kogan Page, London.
Hossain, M.A. and Quaddus, M. (2012), “Expectation-confirmation theory in information system research: a review and analysis”, in Information Systems Theory: Explaining and Predicting Our Digital Society, Springer, New York, Vol. 1, pp. 441-469, doi: 10.1007/978-1-4419-6108-2_21.
Howard, M.C. and Hoffman, M.E. (2018), “Variable-centered, person-centered, and person-specific approaches: where theory meets the method”, Organizational Research Methods, Vol. 21 No. 4, pp. 846-876, doi: 10.1177/1094428117744021.
Kim, C., Jahng, J. and Lee, J. (2007), “An empirical investigation into the utilization-based information technology success model: integrating task–performance and social influence perspective”, Journal of Information Technology, Vol. 22 No. 2, pp. 152-160, doi: 10.1057/palgrave.jit.2000072.
Larsson, A.S. and Edwards, M.R. (2021), “Insider econometrics meets people analytics and strategic human resource management”, The International Journal of Human Resource Management, Vol. 33 No. 12, pp. 2373-2419, doi: 10.1080/09585192.2020.1847166.
Magidson, J., Vermunt, J.K. and Madura, J.P. (2020), Latent Class Analysis, Sage, London.
Margherita, A. (2021), “Human resources analytics: a systematization of research topics and directions for future research”, Human Resource Management Review, No. 2, p. 100795, doi: 10.1016/j.hrmr.2020.100795.
Marler, J.H. and Boudreau, J.W. (2017), “An evidence-based review of HR Analytics”, International Journal of Human Resource Management, Vol. 28 No. 1, pp. 3-26, doi: 10.1080/09585192.2016.1244699.
Marler, J.H., Liang, X. and Dulebohn, J.H. (2006), “Training and effective employee information technology use”, Journal of Management, Vol. 32 No. 5, pp. 721-743, doi: 10.1177/0149206306292388.
McCartney, S. and Fu, N. (2021), “Bridging the gap: why, how and when HR analytics can impact organizational performance”, Management Decision, Vol. 60 No. 13, pp. 25-47, doi: 10.1108/MD-12-2020-1581.
Meijerink, J., Boons, M., Keegan, A. and Marler, J. (2021), “Algorithmic human resource management: synthesizing developments and cross-disciplinary insights on digital HRM”, International Journal of Human Resource Management, Vol. 32 No. 12, pp. 2545-2562, doi: 10.1080/09585192.2021.1925326.
Meyer, J.P. and Morin, A.J.S. (2016), “A person‐centered approach to commitment research: theory, research, and methodology”, Journal of Organizational Behavior, Vol. 37 No. 4, pp. 584-612, doi: 10.1002/job.2085.
Meyer, J.P., Stanley, L.J. and Vandenberg, R.J. (2013), “A person-centered approach to the study of commitment”, Human Resource Management Review, Vol. 23 No. 2, pp. 190-202, doi: 10.1016/j.hrmr.2012.07.007.
Minbaeva, D.B. (2018), “Building credible human capital analytics for organizational competitive advantage: building credible human capital analytics”, Human Resource Management, Vol. 57 No. 3, pp. 701-713, doi: 10.1002/hrm.21848.
Morin, A.J.S., Bujacz, A. and Gagné, M. (2018), “Person-centered methodologies in the organizational sciences: introduction to the feature topic”, Organizational Research Methods, Vol. 21 No. 4, pp. 803-813, doi: 10.1177/1094428118773856.
Nylund-Gibson, K., Grimm, R.P. and Masyn, K.E. (2019), “Prediction from latent classes: a demonstration of different approaches to include distal outcomes in mixture models”, Structural Equation Modeling: A Multidisciplinary Journal, Vol. 26 No. 6, pp. 967-985, doi: 10.1080/10705511.2019.1590146.
Oreg, S., Vakola, M. and Armenakis, A. (2011), “Change recipients' reactions to organizational change: a 60-year review of quantitative studies”, The Journal of Applied Behavioral Science, Vol. 47 No. 4, pp. 461-524, doi: 10.1177/0021886310396550.
Peeters, T., Paauwe, J. and Van De Voorde, K. (2020), “People analytics effectiveness: developing a framework”, Journal of Organizational Effectiveness: People and Performance, Vol. 7 No. 2, pp. 203-219, doi: 10.1108/JOEPP-04-2020-0071.
Peeters, T., Van De Voorde, K. and Paauwe, J. (2021), “Exploring the nature and antecedents of employee energetic well-being at work and job performance profiles”, Sustainability, Vol. 13 No. 13, p. 7424, doi: 10.3390/su13137424.
Rogers, E.M. (2003), Diffusion of Innovations, 5th ed., Simon & Schuster, New York.
Ruël, H. and Bondarouk, T. (2014), “E-HRM research and practice: facing the challenges ahead”, in Martínez-López, F.J. (Ed.), Handbook of Strategic E-Business Management, Springer Berlin Heidelberg, pp. 633-653, doi: 10.1007/978-3-642-39747-9_26.
Shet, S.V., Poddar, T., Wamba Samuel, F. and Dwivedi, Y.K. (2021), “Examining the determinants of successful adoption of data analytics in human resource management – a framework for implications”, Journal of Business Research, Vol. 131, pp. 311-326, doi: 10.1016/j.jbusres.2021.03.054.
Spurk, D., Hirschi, A., Wang, M., Valero, D. and Kauffeld, S. (2020), “Latent profile analysis: a review and “how to” guide of its application within vocational behavior research”, Journal of Vocational Behavior, Vol. 120, p. 103445, doi: 10.1016/j.jvb.2020.103445.
Stouten, J., Rousseau, D.M. and De Cremer, D. (2018), “Successful organizational change: integrating the management practice and scholarly literature”, The Academy of Management Annals, Vol. 12 No. 2, pp. 752-788, doi: 10.5465/annals.2016.0095.
Theres, C. and Strohmeier, S. (2023), “Consolidating the theoretical foundations of digital human resource management acceptance and use research: a meta-analytic validation of UTAUT”, Management Review Quarterly. doi: 10.1007/s11301-023-00367-z.
Van den Heuvel, S. and Bondarouk, T. (2017), “The rise (and fall?) of HR analytics: a study into the future application, value, structure, and system support”, Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 157-178, doi: 10.1108/JOEPP-03-2017-0022.
Vargas, R. (2015), “Adoption factors impacting human resource analytics among human resource professionals”, Doctoral dissertation, H. Wayne Huizenga College of Business and Entrepreneurship, Nova Southeastern University.
Vargas, R., Yurova, Y.V., Ruppel, C.P., Tworoger, L.C. and Greenwood, R. (2018), “Individual adoption of HR analytics: a fine grained view of the early stages leading to adoption”, International Journal of Human Resource Management, Vol. 29 No. 22, pp. 3046-3067, doi: 10.1080/09585192.2018.1446181.
Venkatesh, V., Morris, M.G., Davis, G.B. and Davis, F.D. (2003), “User acceptance of information technology: toward a unified view”, MIS Quarterly, Vol. 27 No. 3, pp. 425-478, doi: 10.2307/30036540.
Venkatesh, V., Davis, F. and Morris, M. (2007), “Dead or alive? The development, trajectory and future of technology adoption research”, Journal of the Association for Information Systems, Vol. 8 No. 4, pp. 267-286, doi: 10.17705/1jais.00120.
Venkatesh, V., Thong, J.Y.L. and Xu, X. (2016), “Unified theory of acceptance and use of technology: a synthesis and the road ahead”, Journal of the Association for Information Systems, Vol. 17 No. 5, pp. 328-376, doi: 10.17705/1jais.00428.
Wang, M. and Hanges, P.J. (2011), “Latent class procedures: applications to organizational research”, Organizational Research Methods, Vol. 14 No. 1, pp. 24-31, doi: 10.1177/1094428110383988.
Workman, M. (2014), “New media and the changing face of information technology use: the importance of task pursuit, social influence, and experience”, Computers in Human Behavior, Vol. 31, pp. 111-117, doi: 10.1016/j.chb.2013.10.008.
Yasmin, M., Tatoglu, E., Kilic, H.S., Zaim, S. and Delen, D. (2020), “Big data analytics capabilities and firm performance: an integrated MCDM approach”, Journal of Business Research, Vol. 114, pp. 1-15, doi: 10.1016/j.jbusres.2020.03.028.
Acknowledgements
We would like to thank Dirk Jonker and Marieke Jesse for making the data collection possible.