Predicting knowledge workers’ participation in voluntary learning with employee characteristics and online learning tools

Catherine Hicks (Signal IO, San Francisco, California, USA)

Journal of Workplace Learning

ISSN: 1366-5626

Publication date: 5 March 2018

Abstract

Purpose

This paper aims to explore the prediction of employee learning activity from employee characteristics and usage of two online learning tools.

Design/methodology/approach

Statistical analysis focused on observational data collected from user logs. Data are analyzed via regression models.

Findings

Findings are presented for over 40,000 employees’ learning activity for one year in a multinational technology company. Variables including job level and tool use yielded a predictive model for overall learning behaviors. In addition, relevant differences are found for managers and nonprofessional learning.

Research limitations/implications

Importantly, how well employees learned content was not measured. This research is also limited to observational relationships: for example, the online tools were used by self-selected users, instead of randomly assigned. Future research which randomly assigns tool use to employee subgroups could explore causal relationships.

Practical implications

This paper presents implications for business analysts and educational technology: how predictive analytics can leverage data to plan programs, the significant challenges in driving adoption and usage of online learning tools, and the distinct needs of managers engaging with these tools.

Originality/value

Given a growing emphasis on using employee data, it is important to explore how learning behaviors can be made visible in people analytics. While previous research has surveyed employee cultures on learning or explored the socio-psychological factors which contribute to this learning, this paper presents novel data on employee participation in learning programs. These data illuminate both how HR metrics can productively reify learning patterns and how workplace technology designers can account for important factors such as internal hierarchies.

Citation

Hicks, C. (2018), "Predicting knowledge workers’ participation in voluntary learning with employee characteristics and online learning tools", Journal of Workplace Learning, Vol. 30 No. 2, pp. 78-88. https://doi.org/10.1108/JWL-04-2017-0030

Publisher: Emerald Publishing Limited

Copyright © 2018, Emerald Publishing Limited


Introduction

Strong learning cultures foster employee performance, engagement and well-being (Egan et al., 2004; Maurer, 2001; Govaerts et al., 2011). Creating a workplace that encourages learning is both a competitive and values-driven goal for organizations, particularly those which rely on knowledge workers (Billett, 2001; Dodgson, 1993; Garvin, 2003; Littlejohn et al., 2012; Marsick and Watkins, 2003; Veng Seng et al., 2002). Simultaneously, research proposes that analyzing behavior data yields tremendous benefits, including efficiently using resources and refining policies to support employee well-being (Convertino et al., 2007; de Laat and Schreurs, 2013; Leshed et al., 2010; Mashhadi et al., 2016; Mathur et al., 2015).

Despite its importance, there is little published data on why employees enroll in learning programs (Macpherson et al., 2005; Wang et al., 2007). This paper presents such an analysis within a large, global workplace, capturing voluntary learning activity for 46,897 users across a wide variety of both professional and nonprofessional programs.

This paper finds that:

  • online tools designed to support peer feedback and goal-setting are positively associated with workplace learning;

  • employee variables (e.g. job level, organizational function and online tool use) were strong predictors of professional learning activity; and

  • despite their benefits, adoption of online tools remained a significant challenge.

To the author's knowledge, this is the first paper to provide a comprehensive analysis of employee learning activity with a large group of real-world users.

Background: learning data and technology in the workplace

Significant bodies of research have examined factors that predict learning outcomes, such as motivation and learning beliefs, and the impact of learning cultures in organizations (Maurer et al., 2003; Joo, 2010; Warr et al., 1999). Predicting learning engagement has primarily centered on the perceived benefits of learning for individual employees (Maurer et al., 2002). However, little published work has explored demographic predictors for when employees will enroll in learning programs, particularly for knowledge workers. One study examined a sample of manufacturing employees and found that overall job satisfaction predicted greater enrollment in learning activities (Birdi et al., 1997). Research on other workplace behavior suggests some preliminary hypotheses for which employee characteristics could predict learning program engagement. For instance, a recent paper found that engagement with a corporate social media tool was significantly impacted by job level and managerial status (Guy et al., 2016). While learning activity and social media activity are different, both represent voluntary engagement with tools and resources, and at a high level this research suggests similar questions. Does learning activity differ for employees at different levels within the company? Do managers use online learning tools less, or differently, than non-managers?

Beyond HR data such as job level, learning analytics could also consider employee activity across the learning tool ecosystem. Workplaces are increasingly eager to embed technology in their learning programs, providing for increasing numbers of learners while decreasing cost (Littlejohn et al., 2012; Macpherson et al., 2005; Margaryan et al., 2009; Mashhadi et al., 2016). Despite these benefits, learning technology is difficult to apply with consistent efficacy. Learning tools face challenges in incorporating interactivity (Sun et al., 2016), and even well-tested tools can fail to yield the same results in a different learning context because of unforeseen interaction effects (Cuban, 2009). To successfully deploy learning technology, it is necessary to understand learning activity within the workplace.

The first tool examined in this paper helps employees solicit peer feedback online. Obtaining relevant feedback is a critical component of learning (Coetzee et al., 2015; Kluger and DeNisi, 1996; So and Brush, 2008). In work environments, feedback is linked to engagement, social relationships and organizational loyalty (Avolio et al., 2004; Bezuijen et al., 2010; Dellarocas, 2003; Lu et al., 2016). Nevertheless, barriers remain: reputational concerns, a lack of social incentives or psychological safety, and fear of appearing vulnerable all impede employees from asking for feedback (Bamberger, 2009). Online learning tools can help to scaffold this informal process and, as a positive consequence, can also quantify data for this otherwise invisible employee behavior (Dabbagh and Kitsantas, 2005; Hicks et al., 2016; Schreurs and de Laat, 2012; Siadaty et al., 2012a, 2012b, 2016).

The second tool examined in the paper encourages employees to monitor their skill development, reflecting and planning for future growth. Self-regulation is another key factor in workplace learning; self-regulated learners are able to identify goals, shortcomings and learning strategies (Littlejohn et al., 2012). Tools which encourage self-regulation enable employees to identify and address learning gaps as they arise, another process which is critical for knowledge workers (Dochy et al., 2003; Siadaty et al., 2016; Sitzmann and Ely, 2011). Such tools could be particularly beneficial for knowledge workers, who often do not proactively initiate learning and, when they do, may fail to identify effective learning habits such as self-reflection and goal-setting (Briggs et al., 2016; Church et al., 2001).

Research questions

At a high level, this paper investigates the following questions:

RQ1.

Are there predictive relationships between employee variables (e.g. job level and organizational function) and learning and development activity? Do these relationships change for professional versus nonprofessional course content, and for peer-led versus instructor-led learning?

RQ2.

How do employees interact with online learning tools that promote goal setting and peer feedback?

Methodology

Participants

Users represented a diverse population of employees at a multinational technology corporation across more than 40 countries. Table I shows employee data used as predictor variables. To protect employee privacy, it was neither possible to collect data on age, gender or ethnicity, nor possible to release the distribution of employees at job levels and organizational functions.

Learning and development platform

Data were gathered for 12 months via an online corporate learning and development (L&D) platform. Several key data sets were collected from platform logs:

Learning activity.

Learning activity was a record of the L&D programs attended by employees. Over 1,700 different learning experiences were available during the experimental period, ranging from online trainings (e.g. videos) to multi-day live sessions. Course instruction also varied; while the large majority of programs were facilitated by professional instructors, some were taught by employees considered experts in the domain.

Activity categories were:

  • professional development for “soft” skills (e.g. communication skills);

  • professional development for “hard” skills (e.g. statistics, computer science);

  • nonprofessional development (e.g. creative writing); and

  • mandated legal trainings.

As this paper focuses on voluntary activity, this last category was excluded.

Online feedback tool.

The platform provided an online tool called “Real-Time Feedback” (RTF) which helped users request feedback from other employees with rapid turnaround, e.g. the same day (Figure 1). Any user could solicit both developmental and performance feedback from peers by either creating a survey or using a template. The platform then distributed an optional survey via email to recipients selected by the requester and aggregated the results privately for the requester. A total of 5 per cent of employees in the data set (N = 3,398) sent an RTF request during the experimental period, and 16.8 per cent of users (N = 10,226) responded to one.

Skills profile tool.

The platform provided an online “Skills Profile” tool used to support employees in exploring current and desired skills (Figure 2). Users could list skills that they “Have” and/or skills that they “Want”, and skills could be private or publicly shared. A total of 35 per cent of employees in the experimental data set (N = 21,457) used the skills profile by listing at least one skill they either wanted or had. Both public and private profiles were aggregated for this analysis. After building a profile, employees could browse the existing “library” and search for learning programs which matched listed skills.

Satisfaction.

The platform collected a satisfaction survey after engagement with a learning program. The survey included three items on a five-point agreement scale: “Overall, this experience was worth my time”; “Overall, this experience will have a positive impact on my performance”; and “I think that others at [Company] will benefit from my experience here”. An average satisfaction score for each user was calculated from multiple responses.
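
As a minimal sketch of this aggregation step (the response table and column names here are hypothetical, not the platform's actual schema):

```python
import pandas as pd

# Hypothetical survey log: one row per item response, on the five-point scale.
responses = pd.DataFrame({
    "user_id": [101, 101, 101, 102, 102, 103],
    "item": ["worth_time", "impact", "benefit_others",
             "worth_time", "impact", "worth_time"],
    "rating": [4, 5, 3, 2, 3, 5],
})

# Average all of a user's item responses into a single per-user score,
# mirroring the satisfaction variable described above.
satisfaction = (
    responses.groupby("user_id")["rating"].mean().rename("satisfaction_score")
)
print(satisfaction)
```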

Results

A total of 46,897 employees attended at least one nonmandatory training during the 12 months examined. During the experimental period, the average user attended 2.98 learning activities.

Mapping learning activity

This first analysis uses several regression models to examine what information can meaningfully predict employees’ learning engagement from employee data (Table I). This allows us to ask how multiple predictor variables contribute toward explaining the variance observed in learning activity.

Predicting learning engagement.

Overall learning activity could be predicted from a combination of all five predictor variables (Table I). Five main effects emerged as predictors, namely, job level, organizational function, overall satisfaction score, use of the Profile tool and use of the RTF tool [F(20, 39,849) = 1,742.92, R2 adj = 0.41, p < 0.0001]. This large R2 adj value indicates that these variables account for roughly 40 per cent of the variance in employees’ learning activity, evidence that these data are indeed highly relevant to predicting learning activity. Employees attended more learning programs if they were at a lower job level, were in the sales organizational function, showed more satisfaction with programs and used either the Skills Profile tool or the RTF tool. However, using both tools did not appear to compound the impact in a more than additive way: there was no significant interaction effect for users who engaged with both tools.
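
As an illustration only, and not the paper's actual analysis code, a model of this shape can be fit with the statsmodels formula API; the data file and column names below are hypothetical stand-ins for the variables in Table I:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical employee-level table; columns mirror Table I plus the outcome.
df = pd.read_csv("employee_learning.csv")

# Main-effects model: the five predictors of overall learning activity.
main = smf.ols(
    "learning_activity ~ C(job_level) + C(org_function)"
    " + satisfaction + used_profile + used_rtf",
    data=df,
).fit()
print(main.rsquared_adj)  # adjusted R^2; ~0.41 in the analysis above

# Adding a Profile x RTF interaction tests whether using both tools
# compounds the effect beyond their additive contributions
# (no significant interaction was found in the paper's data).
inter = smf.ols(
    "learning_activity ~ C(job_level) + C(org_function)"
    " + satisfaction + used_profile * used_rtf",
    data=df,
).fit()
print(inter.pvalues["used_profile:used_rtf"])
```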

For professional learning activity, the same five variables emerged as the best predictors [F(16, 32,520) = 921, R2 adj = 0.31, p < 0.0001]. However, when this analysis was conducted for nonprofessional learning activity, these variables failed to yield a model which explained a large percentage of the variance (R2 adj = 0.007), although the same five variables did emerge as significant factors [F(16, 41,253) = 20.9, p < 0.0001]. In other words, these variables still appear as main effects in predicting nonprofessional learning activity, but the low R2 adj value suggests that important factors for predicting why employees attend nonprofessional content remain uncaptured.
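
For reference, the adjusted R2 statistic compared across these models penalizes R2 for the number of predictors p relative to the number of observations n:

$$R^2_{\text{adj}} = 1 - \left(1 - R^2\right)\frac{n - 1}{n - p - 1}$$

With n in the tens of thousands and p small, R2 and adjusted R2 are nearly identical here, so the drop to 0.007 for nonprofessional learning reflects genuinely unexplained variance rather than a model-size penalty.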

Predicting overall learning engagement for managers.

For this population, being a manager was highly confounded with other variables such as job level. Therefore, being a manager was not treated as a predictor in this analysis. Analysis was repeated on the subset of employees who were managers (N = 6,846). The same five main effects were found to predict overall learning activity, namely, job level, organizational function, satisfaction, using the Profile tool and using the RTF tool [F(16, 6,829) = 213.54, R2 adj = 0.33, p < 0.0001]. This smaller population of managers did not enroll in enough nonprofessional content to examine these two categories separately.

Mapping real-time feedback tool usage

This section describes the analysis of usage for the RTF tool (Figure 1). This tool was available along with all other learning activity options in the main menu of the learning platform. A total of 5 per cent of employees in the data set (N = 3,398) sent an RTF request during the experimental period, and 16.8 per cent of users (N = 10,226) responded to an RTF request.

Users who requested at least one RTF instance were significantly more likely to respond to an RTF request than users who had never used the tool for themselves [χ2(1, N = 46,897) = 3,370.37, p < 0.001]; of the 10,226 users who responded to an RTF request, 21 per cent had also created an RTF request. Employee variables (Table I) failed to significantly predict RTF usage. Nevertheless, as described in the previous sections, users who engaged with the RTF tool showed increased learning activity, but did not differ in their overall satisfaction score.
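
For illustration, the 2 × 2 test of independence underlying this comparison can be run with scipy; the cell counts below are reconstructed approximately from the percentages reported above, so the statistic will not match exactly:

```python
import numpy as np
from scipy.stats import chi2_contingency

# 2x2 table: rows = sent an RTF request (no/yes),
# columns = responded to an RTF request (no/yes).
# Counts reconstructed approximately from the reported figures
# (N = 46,897; 3,398 senders; 10,226 responders; 21% of responders also sent).
table = np.array([
    [35_420, 8_079],  # never sent a request
    [1_251, 2_147],   # sent at least one request
])

# correction=False gives the plain Pearson chi-square statistic.
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2({dof}, N = {table.sum()}) = {chi2:.2f}, p = {p:.3g}")
```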

Managers respond to significantly more feedback requests through the real-time feedback tool.

For the population of managers, 29 per cent (N = 2,825) used the RTF tool during the experimental period. A total of 9 per cent sent an RTF request, and 26 per cent responded to an RTF request. As with all users, managers who requested at least one RTF were also significantly more likely to respond to an RTF request themselves [χ2(1, N = 9,453) = 582.8, p < 0.0001]; of the 2,484 managers who responded to an RTF request, 22 per cent had also created an RTF request.

In a nominal logistic regression with organizational function, overall learning activity, satisfaction and manager status (job level excluded), there were main effects of organizational function, overall learning activity and manager status: managers were slightly more likely to use RTF [χ2(N = 41,301) = 954.49, p < 0.0001, R2 adj = 0.02]. Manager status emerged as the strongest effect (logworth = 40.5); the second strongest was organizational function (logworth = 13). Employees were more likely to use the RTF tool if they were managers, were in the general organizational function and attended more learning programs.
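
A note on the logworth scale used here: logworth is conventionally defined (e.g. in JMP, which reports effects this way) as the negative base-10 logarithm of a p-value, so a logworth of 40.5 corresponds to p ≈ 10^-40.5. A one-line illustration:

```python
import math

def logworth(p_value: float) -> float:
    """Logworth of a p-value: -log10(p); larger values = stronger effects."""
    return -math.log10(p_value)

print(logworth(10 ** -40.5))  # 40.5, the manager effect reported above
print(logworth(10 ** -13))    # 13.0, the organizational-function effect
```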

Mapping skills profile tool usage

This section describes the analysis of usage for the Skills Profile tool (Figure 2). A total of 35 per cent of users completed at least some portion of the skills profile by listing at least one skill that they either had or wanted.

In predicting Profile tool use, a nominal logistic regression examining job level, total learning activity, organizational function, RTF usage and satisfaction yielded significant effects for each variable except the learner’s satisfaction score [χ2(11, N = 39,870) = 1,295.57, p < 0.0001, R2 adj = 0.02]. Employees were more likely to use the Skills Profile tool if they were at lower job levels, were in either the general or the technical organizational function and had attended more learning activities. Again, the small R2 adj value indicates the difficulty of predicting Profile tool use, in contrast to the values observed for predicting learning activity. As with the RTF tool, engagement with the Profile tool appears driven by more factors than are captured in this data set. As described in the previous sections, users who engaged with the Profile tool showed increased learning activity, but did not differ in their satisfaction score.
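
In the same illustrative spirit as the sketches above (file and column names hypothetical), a binary logistic model of Profile tool adoption could be fit as follows; McFadden's pseudo-R2 is one analogue of the small "R2 adj" reported for these nominal models:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("employee_learning.csv")  # hypothetical file, as above

# Binary outcome: did the employee use the Skills Profile tool?
profile = smf.logit(
    "used_profile ~ C(job_level) + C(org_function)"
    " + learning_activity + used_rtf + satisfaction",
    data=df,
).fit()

print(profile.summary())   # per-predictor coefficients and p-values
print(profile.prsquared)   # McFadden's pseudo-R^2 (small here, ~0.02)
```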

Managers use the skills profile tool less.

In contrast to RTF, where managers were more likely to use the tool, only 26 per cent of managers had used the Profile tool (N = 2,517). A total of 20 per cent filled out at least one skill they wanted, and 17 per cent filled out at least one skill they had. In a nominal logistic regression with organizational function, overall learning activity, satisfaction and manager status (job level excluded), there were main effects of organizational function, overall learning activity and manager status; managers were less likely to use the Profile tool than the average user [χ2(N = 41,301) = 1,201.86, p < 0.0001, R2 adj = 0.02].

Discussion and future work

This paper explored learning activity in the workplace, mapping general learning patterns for employees and documenting the usage of two online learning tools. At the time of writing, this paper is among the first to explore learning tools in a large technology organization and their relationship to voluntary learning activity on this scale. Overall, the results show that a significant percentage of professional learning activity over a calendar year was predictable from employee demographics. This result suggests that learning and development organizations focusing on professional skill-building should invest resources in monitoring these variables to predict voluntary engagement with their offerings. It is interesting to note, however, that a similar model failed to yield a strong set of predictor variables for nonprofessional learning activity. Cultivating a learning culture that includes nonprofessional learning is likely still a powerful benefit for organizations (Egan et al., 2004), but organizational analysts will need to explore different models to predict engagement with such programs.

Previous research finds evidence that learning tools can scaffold self-reflection and goal-setting in the workplace (Siadaty et al., 2012a, 2012b, 2016), but little to no research has tested such tools in a real corporate environment. Across a user pool of more than 40,000 employees, this study found that online learning tool usage predicted greater learning activity from employees. First, this finding documents the value in analyzing learning tool data: such tools do provide meaningful signals for activity. Second, this finding raises the intriguing possibility that learning tools may directly increase employee learning. Because users self-selected whether they would engage with the RTF and Profile tools, one explanation for this relationship is that these learning tools were used by a population of employees who were already highly motivated to learn, and who therefore also participated in learning opportunities with greater frequency. However, significant research has documented powerful effects from interventions which promote feedback-seeking and goal-setting (Multon et al., 1991; Russell, 2014; Sitzmann and Ely, 2011). It is possible that these tools served as such an intervention for users, reinforcing adaptive learning and, as a consequence, driving up employees’ enrollment in learning programs. Further research which directly measures employees’ learning beliefs and motivation is needed to examine this hypothesis. Learning organizations should consider learning technology investment as a potential driver for longitudinal engagement with professional growth, and consider how interactive learning tools can supplement individual program offerings and provide a platform for reflection to knowledge workers.

It is also important to note that active use of the online feedback tool was relatively low (5 per cent of users). This is commensurate with the low engagement observed for learning tools in massive open online courses (Coffrin et al., 2014; DeBoer et al., 2014). However, employees were more likely to respond to requests, suggesting that responding may offer a lower-effort entry point to engaging with the tool. For example, one design implication is that a feedback tool could invite employees to trade feedback with peers instead of unidirectionally responding to requests. Instructional designers within organizations should assume that users will need encouragement to proactively engage with learning tools and should consider how to cultivate opportunities for peer-to-peer engagement on internal learning platforms.

Another research question raised is whether different types of users perceive the tools as more or less helpful (Ali et al., 2013). Distinct types of users for the tools did not emerge in this analysis: requesting feedback via the RTF tool was not predicted by job level, organizational function or any of the other employee variables except for whether a learner was a manager. While these variables had a small impact on use of the Profile tool, this model was not strongly predictive. Understanding what drives engagement for embedded tools is necessary for learning technologists seeking to supplement day-to-day activities in the workplace.

Learning technologists should also consider the needs of different populations, for instance, managers versus individual contributors. Managers were more likely to respond to feedback requests than to initiate them, and more likely to be asked for feedback at all. Despite the tool’s hierarchy-agnostic design, which attempted to emphasize lightweight feedback from peers, usage suggests that employees were still bringing pre-existing hierarchies to bear in their interactions with this tool. It is likely that managers received significantly more feedback requests because their opinion was weighted more heavily by their direct reports than that of peers. However, fear of negative social or professional consequences from disclosure is a critical barrier to asking for feedback, and changes how learners engage with feedback (Bamberger, 2009). Future research should investigate how feedback tools are used differently along the corporate hierarchy, whether the design of these tools can actively encourage feedback outside of the formal hierarchy and how critical and developmental feedback might differ through these tools.

In contrast to the RTF tool, where managers were more likely to engage with the tool (but as responders rather than initiating users), managers were less likely to use the self-reflection Profile tool. In parallel, the learning activity analysis documented decreased engagement across the board for employees at higher job levels. These results could suggest that employees at higher job levels are less invested in learning and development, or constrained in their learning through more demands on their time.

Limitations

A key limitation is that these data are observational, situated within an active corporate environment. Examining user data within the workplace allowed this paper to explore engagement with learning tools in the real world, but future research using experimental manipulation will make an important contribution to understanding the mechanisms at play. For example, while these results find that filling out a skills profile predicts greater learning activity for a given user, observation can provide associative, but not causal, evidence. Further research could randomly assign an employee subset to receive the tool and compare their learning engagement with that of a control group. Another next step is to investigate longitudinal learning activity in the workplace. For example, seasonal effects may impact how employees use developmental tools, e.g. seeking more feedback before a promotion cycle. Qualitative interview research would also inform understanding of employees’ perceptions.

Anonymity, user trust and ethical implications are vital concerns for all research which concerns employee analytics (Bolton et al., 2004; Dellarocas, 2003; Mathur et al., 2015). Both practitioners and researchers should also carefully consider the effects that gathering learning data may have on the learning culture itself. Learning analytics research can strive toward serving both employees and organizations as they attempt to understand their users; sharing individual learning data back with employees via tools such as the two described in this paper can help to advance both corporate learning culture and individual employee growth.

Conclusion

Workplaces are significant learning environments which impact hundreds of thousands of employees for years of their lives. The voluntary learning activity analyzed in this paper alone accounted for over 1,000,000 hours of human activity. This figure does not include the additional hours given by facilitators, program managers, engineers and others to sustain the many learning and development programs within this corporation. Understanding learning, feedback and reflection in the workplace is a tremendous opportunity to positively impact both organizations and their denizens.

Figures

Figure 1. Example from the real-time feedback tool user interface

Figure 2. Example from the skills profile tool interface

Table I. Predictor variables from employee data

Job level: Employees are classified into 1 of 14 sequential job levels across all roles within the company, such that “level 1” is entry-level and “level 8+” is relatively senior in scope and responsibility
Organizational function: Employees belong to one of three functions within the company: technical, sales or general
Used Profile tool: Whether or not an employee used the online Skills Profile tool
Used RTF tool: Whether or not an employee used the online real-time feedback (RTF) tool
Satisfaction: An average overall satisfaction score calculated for each employee from all satisfaction survey responses during the experimental period (M = 3.2 on a five-point scale)
Manager: Employees are classified as managers if they directly supervise at least one full-time employee. Unless otherwise specified, this variable is excluded from analyses because of its relationship with job level

References

Ali, L., Asadi, M., Gašević, D., Jovanović, J. and Hatala, M. (2013), “Factors influencing beliefs for adoption of a learning analytics tool: an empirical study”, Computers and Education, Vol. 62, pp. 130-148.

Avolio, B.J., Gardner, W.L., Walumbwa, F.O., Luthans, F. and May, D.R. (2004), “Unlocking the mask: a look at the process by which authentic leaders impact follower attitudes and behaviors”, The Leadership Quarterly, Vol. 15 No. 6, pp. 801-823.

Bamberger, P. (2009), “Employee help-seeking: antecedents, consequences and new insights for future research”, Research in Personnel and Human Resources Management, Vol. 28 No. 1, pp. 49-98.

Bezuijen, X.M., van Dam, K., van den Berg, P.T. and Thierry, H. (2010), “How leaders stimulate employee learning: a leader–member exchange approach”, Journal of Occupational and Organizational Psychology, Vol. 83 No. 3, pp. 673-693.

Billett, S. (2001), “Learning through work: workplace affordances and individual engagement”, Journal of Workplace Learning, Vol. 13 No. 5, pp. 209-214.

Birdi, K., Allan, C. and Warr, P. (1997), “Correlates and perceived outcomes of four types of employee development activity”, Journal of Applied Psychology, Vol. 82 No. 6, p. 845.

Bolton, G.E., Katok, E. and Ockenfels, A. (2004), “How effective are electronic reputation mechanisms? An experimental investigation”, Management Science, Vol. 50 No. 11, pp. 1587-1602.

Briggs, P., Churchill, E., Levine, M., Nicholson, J., Pritchard, G.W. and Olivier, P. (2016), “Everyday Surveillance”, Proceedings of the SIGCHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ‘16), pp. 3566-3573, available at: http://dx.doi.org/10.1145/2851581.2856493

Church, M.A., Elliot, A.J. and Gable, S.L. (2001), “Perceptions of classroom environment, achievement goals, and achievement outcomes”, Journal of Educational Psychology, Vol. 93 No. 1, pp. 43-54.

Coetzee, D., Lim, S., Fox, A., Hartmann, B. and Hearst, M.A. (2015), “Structuring interactions for large-scale synchronous peer learning”, Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW ‘15), pp. 1139-1152, available at: http://dx.doi.org/10.1145/2675133.2675251

Coffrin, C., Corrin, L., de Barba, P. and Kennedy, G. (2014), “Visualizing patterns of student engagement and performance in MOOCs”, Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (LAK ‘14), pp. 83-92, available at: http://dx.doi.org/10.1145/2567574.2567586

Convertino, G., Moran, T.P. and Smith, B.A. (2007), “Studying activity patterns in CSCW”, CHI’07 Extended Abstracts on Human Factors in Computing Systems (CHI EA ‘07), pp. 2339-2344, available at: http://dx.doi.org/10.1145/1240866.1241004

Cuban, L. (2009), Oversold and Underused: Computers in the Classroom, Harvard University Press, New York, NY.

Dabbagh, N. and Kitsantas, A. (2005), “Using web-based pedagogical tools as scaffolds for self-regulated learning”, Instructional Science, Vol. 33 Nos 5/6, pp. 513-540, available at: http://doi.org/10.1007/s11251-005-1278-3

DeBoer, J., Ho, A.D., Stump, G.S. and Breslow, L. (2014), “Changing ‘course’: reconceptualizing educational variables for massive open online courses”, Educational Researcher, Vol. 43 No. 2, pp. 74-84, doi: 10.3102/0013189X14523038.

Dochy, F., Segers, M., Van den Bossche, P. and Gijbels, D. (2003), “Effects of problem-based learning: a Meta-analysis”, Learning and Instruction, Vol. 13 No. 5, pp. 533-568.

de Laat, M. and Schreurs, B. (2013), “Visualizing informal professional development networks: building a case for learning analytics in the workplace”, American Behavioral Scientist, Vol. 57 No. 10, pp. 1421-1438.

Dellarocas, C. (2003), “The digitization of word of mouth: promise and challenges of online feedback mechanisms”, Management Science, Vol. 49 No. 10, pp. 1407-1424.

Dodgson, M. (1993), “Organizational learning: a review of some literatures”, Organization Studies, Vol. 14 No. 3, pp. 375-394.

Egan, T.M., Yang, B. and Bartlett, K.R. (2004), “The effects of organizational learning culture and job satisfaction on motivation to transfer learning and turnover intention”, Human Resource Development Quarterly, Vol. 15 No. 3, pp. 279-301.

Garvin, D.A. (2003), Learning in Action: A Guide to Putting the Learning Organization to Work, Harvard Business Review Press, New York, NY.

Govaerts, N., Kyndt, E., Dochy, F. and Baert, H. (2011), “Influence of learning and working climate on the retention of talented employees”, Journal of Workplace Learning, Vol. 23 No. 1, pp. 35-55.

Guy, I., Ronen, I., Zwerdling, N., Zuyev-Grabovitch, I. and Jacovi, M. (2016), “What is Your Organization ‘Like’? A Study of Liking Activity in the Enterprise”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘16), pp. 3025-3037, available at: http://dx.doi.org/10.1145/2858036.2858540

Hicks, C.M., Pandey, V., Fraser, C.A. and Klemmer, S. (2016), “Framing Feedback: Choosing Review Environment Features that Support High Quality Peer Assessment”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘16), pp. 458-469, available at: http://dx.doi.org/10.1145/2858036.2858195

Joo, B.K.B. (2010), “Organizational commitment for knowledge workers: the roles of perceived organizational learning culture, leader–member exchange quality, and turnover intention”, Human Resource Development Quarterly, Vol. 21 No. 1, pp. 69-85.

Kluger, A.N. and DeNisi, A. (1996), “The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory”, Psychological Bulletin, Vol. 119 No. 2, pp. 254-284.

Leshed, G., Cosley, D., Hancock, J.T. and Gay, G. (2010), “Visualizing language use in team conversations: designing through theory, experiments, and iterations”, CHI’10 Extended Abstracts on Human Factors in Computing Systems (CHI EA ‘10), pp. 4567-4582, available at: http://dx.doi.org/10.1145/1753846.1754195

Littlejohn, A., Milligan, C. and Margaryan, A. (2012), “Charting collective knowledge: supporting self-regulated learning in the workplace”, Journal of Workplace Learning, Vol. 24 No. 3, pp. 226-238.

Lu, D., Dugan, C., Farzan, R. and Geyer, W. (2016), “‘Let’s Stitch Me and You Together!’ Designing a Photo Co-creation Activity to Stimulate Playfulness in the Workplace”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘16), pp. 3061-3065, doi: 10.1145/2858036.2858520.

Macpherson, A., Homan, G. and Wilkinson, K. (2005), “The implementation and use of e-learning in the corporate university”, Journal of Workplace Learning, Vol. 17 Nos 1/2, pp. 33-48.

Margaryan, A., Milligan, C. and Littlejohn, A. (2009), “Self-regulated learning and knowledge sharing in the workplace: differences and similarities between experts and novices”, Proceedings of the Sixth International Conference on Researching Work and Learning (RWL6), Copenhagen, Vol. 28, pp. 1-15.

Marsick, V.J. and Watkins, K.E. (2003), “Demonstrating the value of an organization’s learning culture: the dimensions of the learning organization questionnaire”, Advances in Developing Human Resources, Vol. 5 No. 2, pp. 132-151.

Mashhadi, A., Kawsar, F., Mathur, A., Dugan, C. and Shami, N.S. (2016), “Let’s Talk About the Quantified Workplace”, Proceedings of the 19th ACM Conference on Computer Supported Cooperative Work and Social Computing Companion (CSCW ‘16), pp. 522-528, available at: http://dx.doi.org/10.1145/2818052.2855514

Mathur, A., Van den Broeck, M., Vanderhulst, G., Mashhadi, A. and Kawsar, F. (2015), “Tiny habits in the giant enterprise: understanding the dynamics of a quantified workplace”, Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp ‘15), pp. 577-588, available at: http://dx.doi.org/10.1145/2750858.2807528

Maurer, T.J. (2001), “Career-relevant learning and development, worker age, and beliefs about self-efficacy for development”, Journal of Management, Vol. 27 No. 2, pp. 123-140.

Maurer, T.J., Pierce, H.R. and Shore, L.M. (2002), “Perceived beneficiary of employee development activity: a three-dimensional social exchange model”, Academy of Management Review, Vol. 27 No. 3, pp. 432-444.

Maurer, T.J., Weiss, E.M. and Barbeite, F.G. (2003), “A model of involvement in work-related learning and development activity: the effects of individual, situational, motivational, and age variables”, Journal of Applied Psychology, Vol. 88 No. 4, p. 707.

Multon, K.D., Brown, S.D. and Lent, R.W. (1991), “Relation of self-efficacy beliefs to academic outcomes: a meta-analytic investigation”, Journal of Counseling Psychology, Vol. 38 No. 1, p. 30.

Russell, D.M. (2014), “Measuring learned skill behaviors post-MOOC”, CHI’14 Extended Abstracts on Human Factors in Computing Systems (CHI EA ‘14), pp. 2233-2238, available at: http://dx.doi.org/10.1145/2559206.2581180

Schreurs, B. and de Laat, M. (2012), “Network awareness tool - learning analytics in the workplace: detecting and analyzing informal workplace learning”, Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ‘12), pp. 59-64, available at: http://dx.doi.org/10.1145/2330601.2330620

Siadaty, M., Gašević, D. and Hatala, M. (2016), “Measuring the impact of technological scaffolding interventions on micro-level processes of self-regulated workplace learning”, Computers in Human Behavior, Vol. 59, pp. 469-482.

Siadaty, M., Gašević, D., Jovanović, J., Milikić, N., Jeremić, Z., Ali, L., Giljanovic, A. and Hatala, M. (2012a), “Learn-B: a social analytics-enabled tool for self-regulated workplace learning”, Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ‘12), pp. 115-119, available at: http://dx.doi.org/10.1145/2330601.2330632

Siadaty, M., Jovanović, J., Gašević, D., Milikić, N., Jeremić, Z., Ali, L., Giljanovic, A. and Hatala, M. (2012b), “Semantic web and linked learning to support workplace learning”, Proceedings of the 21st International Conference on World Wide Web (WWW ‘12), Lyon.

Sitzmann, T. and Ely, K. (2011), “A meta-analysis of self-regulated learning in work-related training and educational attainment: what we know and where we need to go”, Psychological Bulletin, Vol. 137 No. 3, pp. 421-442.

So, H.-J. and Brush, T.A. (2008), “Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: relationships and critical factors”, Computers & Education, Vol. 51 No. 1, pp. 318-336.

Sun, Y., Qiao, Z., Chen, D., Xin, C. and Jiao, W. (2016), “An approach to using existing online education tools to support practical education on MOOCs”, Proceedings of the IEEE Computer Software and Applications Conference (COMPSAC), doi: 10.1109/COMPSAC.2016.49.

Veng Seng, C., Zannes, E. and Wayne Pace, R. (2002), “The contributions of knowledge management to workplace learning”, Journal of Workplace Learning, Vol. 14 No. 4, pp. 138-147.

Wang, Y.S., Wang, H.Y. and Shee, D.Y. (2007), “Measuring e-learning systems success in an organizational context: scale development and validation”, Computers in Human Behavior, Vol. 23 No. 4, pp. 1792-1808.

Warr, P., Allan, C. and Birdi, K. (1999), “Predicting three levels of training outcome”, Journal of Occupational and Organizational Psychology, Vol. 72 No. 3, pp. 351-375.

Acknowledgements

This work was completed while the author was affiliated with Google, and this support is gratefully acknowledged.

Corresponding author

Catherine Hicks can be contacted at: dr.cat.hicks@gmail.com