Abstract
Purpose
Although it is commonly agreed that prescriptive analytics can benefit organizations by enabling better decision-making, the deployment of prescriptive analytics tools can be challenging. Previous studies have primarily focused on methodological issues rather than the organizational deployment of analytics. However, successful deployment is key to achieving the intended benefits of prescriptive analytics tools. Therefore, this study aims to identify the enablers of successful deployment of prescriptive analytics.
Design/methodology/approach
The authors examine the enablers for the successful deployment of prescriptive analytics through five organizational case studies. To provide a comprehensive view of the deployment process, each case includes interviews with users, managers and top management.
Findings
The findings suggest the key enablers for successful analytics deployment are strong leadership and management support, sufficient resources, user participation in development and a common dialogue between users, managers and top management. However, contrary to the existing literature, the authors found little evidence of external pressures to develop and deploy analytics. Importantly, the success of deployment in each case was related to the similarity with which different actors within the organization viewed the deployment process. Furthermore, end users tended to highlight user participation, skills and training, whereas managers and top management placed greater emphasis on the importance of organizational changes.
Originality/value
The results will help practitioners ensure that key enablers are in place to increase the likelihood of the successful deployment of prescriptive analytics.
Citation
Hirvonen, M., Kauppi, K. and Liesiö, J. (2024), "Identifying enablers for the successful deployment of prescriptive analytics – a multiple case study", European Business Review, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/EBR-08-2023-0253
Publisher
Emerald Publishing Limited
Copyright © 2024, Marjut Hirvonen, Katri Kauppi and Juuso Liesiö.
License
Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial & non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode
1. Introduction
Analytics refers to the process of turning data into information for better decision-making by using various statistical and quantitative techniques (Davenport and Harris, 2007; Wilder and Ozgur, 2015). Business analytics enables organizations to make intelligent decisions quickly to create business value (Lepenioti et al., 2020). Analytics tools and approaches are commonly classified into three categories: descriptive, predictive and prescriptive (e.g. Lepenioti et al., 2020). Descriptive analytics answers the questions “What has happened?”, “Why did it happen?”, and “What is happening now?”, whereas predictive analytics answers future-oriented questions, such as “What will happen?” and “Why will it happen?”. The most advanced form of analytics, prescriptive analytics, seeks to find the best course of action by answering the question “What should be done and why?” (Akerkar, 2013; Krumeich et al., 2016; Šikšnys and Pedersen, 2009).
The main focus in both research and practice has been on descriptive and predictive analytics, which typically use methods such as artificial intelligence, data mining, machine learning and simulation (den Hertog and Postek, 2016; Habeeb et al., 2019; Larose and Larose, 2015). Nevertheless, interest in prescriptive analytics has increased in recent years (Larson and Chang, 2016). Prescriptive analytics represents the highest level of analytics maturity, promising to optimize decision-making and improve business performance. Indeed, the pressure to increase profitability has driven many companies to use prescriptive analytics. It is commonly agreed that its use can benefit decision-making by economizing on cognitive effort, solving complex problems and integrating knowledge (e.g. Liberatore et al., 2000; Luoma, 2016). Thus, as new analytical methods and various data sources have become increasingly available, better insights can be gained from mathematical optimization, simulation and statistical analysis to maximize business value (Liberatore and Luo, 2010; Mortenson et al., 2015). Overall, despite the plethora of methodological research on all three categories of analytics, the actual deployment of analytics has received less attention in previous studies, particularly in the case of prescriptive analytics.
While the benefits of prescriptive analytics are widely acknowledged, its deployment often presents both technological and organizational challenges (Arunachalam et al., 2018; Tim et al., 2020). Nonetheless, the organizational features required for its successful deployment are often ignored (Tim et al., 2020). Indeed, successful analytics utilization may more often be prevented by managerial and cultural factors than by technological barriers (LaValle et al., 2011). For example, according to Agrawal et al. (2020), the most significant obstacles to the digital transformation required for the deployment of prescriptive analytics relate to a poor understanding of deployment urgency; a lack of industry-specific guidelines, digital skills and talent; and the high costs of implementation. Thus, “analytics is not simply a technical matter” (Vidgen et al., 2017, p. 628): in addition to analytics resources and technologies, appropriate organizational features are required for successful value creation.
To better understand, and thereby help overcome, the barriers to prescriptive analytics adoption, this study aims to identify the enablers of the successful deployment of prescriptive analytics. Although there exists abundant research on the deployment of new ERP and IT systems, for example, less research has been dedicated to studying analytics deployment (e.g. Matende and Ogao, 2013), and it remains unclear whether similar enablers apply across all domains. Prescriptive analytics tools, in particular, differ from ERP/IT systems in that they may require users to possess a more in-depth understanding of the fundamental assumptions of underlying optimization or decision-support models. For instance, a user must understand the decision alternatives (feasible solutions) the tool considers, the types of decision objectives the tool uses to identify the most preferred decision alternatives (optimal solutions) and the way the tool accounts for uncertainties and risks.
To identify the enablers of the successful deployment of prescriptive analytics, we examine five recent prescriptive analytics deployment cases, interviewing users, managers and top management in each case. The user perspective, in particular, has often been overlooked in previous research (Ain et al., 2019). To ensure the generalizability of our results, the case organizations represent a variety of industries and cover both the private and public sector. Our study contributes to the current literature by identifying the enablers for successful analytics deployment, which will help practitioners more successfully design and implement deployment projects, thereby providing their organizations with a competitive advantage. Such a contribution is important for the wider adoption of prescriptive analytics, as the academic literature in this area has mainly focused on the methodological and theoretical development of prescriptive analytics. Nonetheless, the lack of attention to deployment may result in high expenses and negligible benefits (Bose, 2009).
The rest of this paper is structured as follows. Section 2 discusses the three categories of analytics, and Section 3 reviews the enablers of successful deployment based on previous studies related to analytics and ERP/IT systems deployment. This is followed by a presentation of our methods in Section 4 and the findings from the case studies in Section 5. Finally, a discussion of our results, theoretical contributions, recommendations, as well as the study’s limitations and perspectives on future research, are provided in Section 6.
2. Descriptive, predictive and prescriptive analytics
Business analytics tools are often categorized into the following three main categories, which differ in terms of difficulty, value and intelligence (Akerkar, 2013; Krumeich et al., 2016; Šikšnys and Pedersen, 2009):
descriptive analytics,
predictive analytics, and
prescriptive analytics.
Table 1 provides an overview of these different analytics categories in terms of intelligence, methodological sophistication, data requirements, and expected value.
Descriptive analytics uses, for instance, statistical methods to answer the questions “What has happened?”, “Why did it happen?” and “What is happening now?” (Akerkar, 2013; Krumeich et al., 2016; Šikšnys and Pedersen, 2009). Some authors, however, use the term diagnostic analytics when referring to descriptive analytics tools that are specifically geared toward addressing the question “Why did it happen?” (Soltanpoor and Sellis, 2016). Answers to this question help organizations identify the root causes behind past events and understand the causal links between different types of data (Muacevic and Adler, 2020). However, despite helping to structure historical events, descriptive or diagnostic analytics does not help predict future outcomes. According to Lustig et al. (2010), the application of descriptive analytics does not always require extensive knowledge of analytics, and thus it can be readily applied to day-to-day operations. By contrast, predictive and prescriptive analytics commonly provide a more in-depth analysis of the data (Lustig et al., 2010). Descriptive analytics is mostly used in (1) summarizing past events, such as sales and operations, (2) tracking data related to social media usage and engagement, (3) reporting general trends and (4) collating survey results.
Predictive analytics, in turn, answers questions about the future, such as “What will happen?” and “Why will it happen?” (Akerkar, 2013; Krumeich et al., 2016; Šikšnys and Pedersen, 2009). It provides a picture of possible future events based on the past (Muacevic and Adler, 2020). A wide variety of predictive analytics techniques build on statistics, including probabilistic models (Lepenioti et al., 2020) as well as machine learning and data mining (Lu et al., 2017). Methods typically used in predictive analytics include decision trees, regression models, artificial neural networks, Bayesian statistics, ensemble learning, support vector machines, k-nearest neighbors and time series and principal component analysis (Kumar and Garg, 2018).
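As a loose illustration (not drawn from the studies cited above), the simplest of the predictive techniques mentioned here, a regression model, can be sketched in a few lines: fitting a linear trend to past observations and extrapolating one period ahead to answer “What will happen?”. All figures and function names are hypothetical.

```python
def fit_linear_trend(values):
    """Ordinary least-squares fit of y = slope * t + intercept,
    where t = 0, 1, 2, ... indexes the observation periods."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    ss_ty = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(values))
    ss_tt = sum((t - t_mean) ** 2 for t in range(n))
    slope = ss_ty / ss_tt
    return slope, y_mean - slope * t_mean

def forecast_next(values):
    """Extrapolate the fitted trend one period beyond the data."""
    slope, intercept = fit_linear_trend(values)
    return slope * len(values) + intercept

# Hypothetical quarterly sales figures (illustrative only)
sales = [100, 110, 120, 130]
print(forecast_next(sales))  # -> 140.0
```

Real predictive analytics would of course use richer models (the decision trees, neural networks and Bayesian methods listed above), but the input-to-forecast structure is the same.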
According to Hair (2007), data mining and predictive analytics help convert information into knowledge; they are increasingly applied to marketing, product development, advertising, distribution and retailing, and they are also used for business intelligence purposes. Predictive analytics techniques are also used to reduce risks and improve, for instance, business efficiency, customer service, HR practices, IT security and fraud detection and prediction (Balbin et al., 2020; Kumar and Garg, 2018; Lustig et al., 2010). For example, the work of Wang and Hajli (2017) aimed to help health-care practitioners exploit the business transformational capabilities of big data analytics. Their findings also form the empirical basis for a more detailed investigation of the implementation of big data analytics.
While predictive analytics focuses on examining the future, prescriptive analytics enables the generation of proactive decisions and implementable actions. Prescriptive analytics can be used in two modes, which differ in the level of human intervention. The first mode involves supporting decision-making by providing recommendations to human decision-makers, while the second concerns automating decision-making by implementing the decisions prescribed in the first mode (Hagerty, 2017). Its adoption can increase the maturity of data analytics, enabling early and optimized decision-making to improve business performance (den Hertog and Postek, 2016; Hagerty, 2017). According to Lepenioti et al. (2020), the full potential of predictive analytics can be achieved when it is combined with prescriptive analytics.
Overall, prescriptive analytics is the most sophisticated business analytics type as it is capable of producing the greatest level of intelligence and value for a business (Šikšnys and Pedersen, 2009). It can improve decision-making and process effectiveness by suggesting the best decision alternatives based on predictions. To achieve this, it incorporates the outputs of predictive analytics and uses machine learning, optimization algorithms and expert systems to provide optimal adaptive, time-dependent decisions (Basu, 2013; Engel et al., 2012; Hagerty, 2017; Lepenioti et al., 2020). According to Lustig et al. (2010), prescriptive analytics can improve production, customer experience and business growth, but, like predictive analytics, it may require large amounts of data to provide useful results. The best outcomes in prescriptive analytics are achieved by using optimization techniques that take into account uncertainties when identifying the best decision alternative (Poornima and Pushpalatha, 2020).
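To make the step from prediction to prescription concrete, the following minimal sketch (purely illustrative, not taken from the works cited above) shows a newsvendor-style decision under uncertainty: given demand scenarios produced by a predictive step, it enumerates candidate order quantities and prescribes the one maximizing expected profit. All numbers and names are hypothetical.

```python
def expected_profit(quantity, scenarios, unit_price, unit_cost):
    """Expected profit of ordering `quantity` units, where `scenarios`
    is a list of (demand, probability) pairs from a predictive model."""
    return sum(prob * (unit_price * min(quantity, demand) - unit_cost * quantity)
               for demand, prob in scenarios)

def prescribe_quantity(candidates, scenarios, unit_price, unit_cost):
    """Answer 'What should be done?': pick the candidate decision
    with the highest expected profit across the demand scenarios."""
    return max(candidates,
               key=lambda q: expected_profit(q, scenarios, unit_price, unit_cost))

# Hypothetical demand scenarios (demand, probability) and unit economics
scenarios = [(80, 0.5), (120, 0.5)]
best = prescribe_quantity(range(0, 201, 10), scenarios, unit_price=10, unit_cost=6)
print(best)  # -> 80
```

Industrial prescriptive tools replace the brute-force enumeration with mathematical programming over far larger decision spaces, but the core elements the text describes, namely, decision alternatives, an objective and explicit uncertainty, are all visible here.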
The utilization of prescriptive analytics is expected to grow in the near future as managers begin to understand its requirements and potential added value more widely (Lepenioti et al., 2020). Achieving data-related business value requires rapid reactions to real-time events; otherwise, that value disappears. With the recent advances in event-processing technology and distributed, pervasive computing infrastructures, sophisticated features, such as distributed processing and data management, can be exploited by the next generation of prescriptive analytics systems. These systems may also be embedded with scalable operational research and machine learning algorithms (Lepenioti et al., 2020). Thus, in addition to identifying risks and potential problems in operations, the next generation of analytics systems will be able to recommend mitigating actions, thereby enabling real-time decision support. However, the benefits of such system investments are heavily dependent on the quality of their deployment (Kauppi, 2013). We therefore focus on investigating the enablers of deployment to enhance the utilization of prescriptive analytics.
3. Enablers of successful analytics deployment
The deployment of analytics within organizations has not been studied extensively in the literature. Therefore, this review also included deployment studies of other related systems, such as IT and ERP systems, to gain insights into potentially relevant enablers. The review of the literature on successes and failures revealed the following seven recurring enablers that enhance deployment: management support, organizational changes, technical readiness, resourcing and scheduling, change management, skills and training, and external pressures.
3.1 Management support
Management support is one of the most widely cited enablers affecting deployment success (e.g. Ain et al., 2019; Ali and Miller, 2017; El-Adaileh and Foster, 2019; Hwang et al., 2004; Lautenbach et al., 2017; Puklavec et al., 2018; Rezaie et al., 2017; Villamarín and Diaz Pinzon, 2017; Wamba and Queiroz, 2020; Yeoh et al., 2008). Adequate management support for the use of a new system is essential, including the sufficient allocation of resources (Petter et al., 2013). Successful management support also increases commitment and initiative and helps create a conducive environment for a particular adoption to take place (Lee and Kim, 2007; Martins et al., 2016). Management support has been noted to impact many dimensions of deployment success, including system use (Xu and Hwang, 2007), system quality (Hwang and Xu, 2008), decision-making and productivity (Hasan et al., 2012) and user satisfaction (Hung et al., 2016). This means that top management support is fundamental if the organization is to manage all the stages of the adoption process effectively (Junior et al., 2019; Lin, 2014; Martins et al., 2016).
Despite the emphasis on management support as a major enabler of success, gaining managerial commitment within an organization can be seen as one of the greatest challenges faced by a deployment team (Yeoh et al., 2008). The full benefits of big data analytics will not be achieved by organizations unless they are able to address the managerial challenges resulting from the deployment of analytics (Mcafee and Brynjolfsson, 2012). Organizations must also adapt their strategic choices and resource configurations to analytics-driven operations (Xu et al., 2016). Thus, fully understanding the various impacts of managerial, economic and strategic factors on big data analytics is seen as a significant factor in successful analytics deployment (Raghupathi and Raghupathi, 2014; Ward et al., 2014).
3.2 Organizational changes
Many studies emphasize the significance of organizational changes as an enabler of analytics deployment (e.g. Fink et al., 2017; Trieu, 2017) and indicate the negative outcomes of ignoring them. Organizational changes include actions that change organizational culture, such as the implementation of an employee reward system (Hussein et al., 2019). Indeed, several authors stress the importance of organizational changes by noting that the mere existence of analytics resources and technologies does not guarantee successful value creation without the required organizational features (Tim et al., 2020; Vidgen et al., 2017). In addition, Ain et al. (2019) acknowledged the significance of organizational culture as an enabler. Organizational culture and structure should fit the demands of the new technology (Zhang et al., 2005). In addition, the findings of Oesterreich et al. (2022) highlighted the importance of social factors in enhancing firm performance, while also stressing the significance of technical factors. Despite wide acknowledgment of the benefits of analytics, many companies still struggle to create organizational value, as organizations often fail to include analytical insights in day-to-day organizational processes (Ransbothan et al., 2016). According to Ransbothan et al. (2016), to maximize the value of analytics initiatives, companies should understand what data is available and improve managers’ ability to use that data. Unfortunately, such efforts usually fail (Ransbothan et al., 2016). In addition, Soja and Paliwoda-Pekosz (2009) listed the enterprise-structure-related changes required by deployment, as the most significant failures were related to organizational changes or the lack thereof.
3.3 Technical readiness
In addition to organizational change, technical readiness is also required to enable the successful deployment of analytics. Indeed, according to a systematic review by Wamba et al. (2015), recent big-data-related research has primarily focused on addressing technical issues. According to Basu (2013), the following five technical-readiness-related pillars must be considered while deploying prescriptive analytics:
effective use of structured and unstructured data,
integrated predictions and prescriptions,
generation of prescriptions without undesired side effects,
adaptive algorithms that ensure that predictions and prescriptions remain relevant, and
a feedback mechanism for previously suggested decisions to assist in upcoming predictions and prescriptions.
In addition, Käki et al. (2019) discussed several other technical aspects that are important in prescriptive analytics tools and processes, including the balance between model granularity and complexity, the utilization of up-to-date, high-quality data from several sources and the design of fast and efficient user interfaces. Moreover, tools should take account of relevant uncertainties and constraints to produce decision recommendations that are implementable in real life (Basu, 2013). In addition, Reitsma and Hilletofth (2018) highlighted technical possibilities, software testing and the minimum customizations required for implementation as critical success factors.
Information-technology readiness and data quality are also mentioned as enablers of successful analytics implementation (Ain et al., 2019; Clark et al., 2020; Potdar and Rane, 2017). In this context, the importance of data source systems and data import, including IT infrastructure, has been underlined (Ain et al., 2019; El-Adaileh and Foster, 2019). Compatibility with existing systems is also important (Maroufkhani et al., 2020). Furthermore, the literature recommends that data security and governance-related issues as well as software sufficiency be considered (e.g. Hussein et al., 2019). In addition, in the context of social media, the empirical findings of Orlandi et al. (2020) showed a strong positive relationship between analytics deployment and the exploitation of technology and also highlight the significance of integrating marketing and information technology.
3.4 Resourcing and scheduling
The success of a deployment project is contingent on good project organization (Maltz et al., 2007). Important issues to note here include the coordination of organizational resources, the provision of sufficient labor, time and capital and the use of expert consultants to support the deployment (Hwang et al., 2004). Detailed milestones, critical paths and role boundaries must be established (Barth and Koch, 2019). Effective project management also entails setting a realistic timeframe and holding periodic status meetings (Zhang et al., 2005). In addition, the inadequacy of resources is listed as one of the most significant deployment failures (Soja and Paliwoda-Pekosz, 2009). Moreover, a study by Zhu et al. (2006) found that large firms tend to enjoy resource advantages during the initiation stage but are later subject to structural inertia.
3.5 Change management
Change management is required during analytics deployment to help organizations implement the required changes and adapt their operations to evolving circumstances (Al-Haddad and Kotnour, 2015; Burnes, 2011). For successful change management, an effective project management team is essential (Ali and Miller, 2017; El-Adaileh and Foster, 2019; Reitsma and Hilletofth, 2018), including appropriate change leadership (Errida and Lotfi, 2021). In relation to business intelligence (BI) deployment, the significance of well-defined visions and goals is acknowledged in the literature, as is the alignment between BI and business strategy (Ain et al., 2019). Reitsma and Hilletofth (2018) highlighted the importance of strategic decision-making, among other factors, in ERP system implementation. Other change-management-related enablers reported in the literature include effective and constant communication (Ali and Miller, 2017; Errida and Lotfi, 2021), the motivation of employees and change agents, the engagement of users and stakeholders (Errida and Lotfi, 2021; Fearon et al., 2013) and the evaluation of performance and impact (Clark et al., 2020; Reitsma and Hilletofth, 2018).
In contrast to the change-management-related enablers discussed above, Reitsma and Hilletofth (2018) found that users regarded organizational change management and top management involvement as unimportant when implementing an ERP system. Instead, the users in their study listed the project team and performance measurement among the most important success factors of deployment.
Another study found that the main causes of change management failures were related to a lack of a clear vision and leadership skills, uncommitted stakeholders and poor communication (Errida and Lotfi, 2021). New analytics adoption may also be hindered by deployment-related complexity and uncertainty combined with overly high ambition and neglect of existing realities (Janssen et al., 2013). Furthermore, during the transformation process, the focus on users may be lost. Often users with realistic expectations prior to deployment are more satisfied with the system and exhibit a higher use rate (Ginzberg, 1981). Despite the challenges inherent in change management, Soja and Paliwoda-Pekosz (2009) suggested that many failures can be avoided by using performance indicators and incentives to increase the use of analytics.
3.6 Skills and training
The literature often mentions employees’ skills and expertise as an enabler of successful analytics deployment (Agrawal et al., 2020; Ain et al., 2019; Clark et al., 2020; El-Adaileh and Foster, 2019; Hussein et al., 2019; Orlandi et al., 2020). Here, one requirement is also the participation of end users in the development and deployment of analytics tools to build the necessary skill set and support learning (Soja and Paliwoda-Pekosz, 2009). Training is an important enabler of successful implementation (Reitsma and Hilletofth, 2018), especially if the new system is extremely complex (Bingi et al., 1999). Training and education reduce employee anxiety about the use of the new system (Lee et al., 2010) and increase users’ confidence in their ability to use it (Gist, 1987). Training is particularly important in the case of complex models, which can be resisted by decision-makers as users often fail to fully understand their premises and computations (Katsikopoulos et al., 2018). In addition, training can influence user beliefs about the new system (Gist, 1987) and thus promote a better understanding of its benefits (Lee et al., 2010).
One important reason for failed ERP system deployment is a lack of training (Somers and Nelson, 2001). Soja and Paliwoda-Pekosz (2009) noted that typical skill-related failures concern the varying knowledge levels of employees in different positions and unsuitable training schedules. Therefore, training should be provided widely to employees, taking into account participants’ differing knowledge levels and backgrounds.
3.7 External pressures
According to Fernando (2011), the business environment can be considered a collection of external forces, factors and institutions that are beyond the control of the business and affect the functioning of the firm. In this study, we use the term external pressures to refer to the forces present in the operating environment in which the company conducts the analytics deployment project. External pressures concern such factors as the degree of competition, the regulatory environment and other industry characteristics, including customer and supplier readiness and the pressures and norms of the profession (Ke et al., 2009; Zhu et al., 2006).
In terms of analytics deployment in business, the literature identifies innovation, competitiveness and business-area-related technology as potential enablers (Hussein et al., 2019); nevertheless, most studies still examine the link between pressures and adoption rather than the impact of external pressures on the success of deployment. One exception is the analysis conducted by Vu et al. (2023) related to the managerial aspects of adapting to Industry 4.0. The authors emphasize the pressure on top management to focus more strongly on identifying trends or the latest technology and to build external and internal relationships with stakeholders to develop dynamic digital capabilities when the company’s knowledge is limited. On the other hand, competition among companies positively affects initiation and adoption (Hwang et al., 2004) while exerting a negative effect on routinization. Therefore, excessive competition may cause firms to chase the latest technologies before they have learnt to use the existing ones effectively (Zhu et al., 2006). In addition, the absence of guidelines tailored to the industry’s needs has been mentioned as a challenge to the adoption of innovations (Agrawal et al., 2020).
4. Methodology
This study aims to understand the enablers of success in the deployment of prescriptive analytics in organizations. Consequently, we adopted a case-study methodology (Yin, 2014), with the unit of analysis being the deployment process of a prescriptive analytics solution. We used theoretical sampling (Eisenhardt and Graebner, 2007), as our goal was to select cases that allowed us to identify the logic behind a particular prescriptive analytics deployment project as well as the relationships between enablers and the success of the deployment. Our multiple comparative case study approach allowed us to determine whether the findings were case-specific or replicated across cases (and given our sampling, across industries). Figure 1 presents our overall methodology from case selection to data analysis. It also demonstrates how the seven success enablers identified from the literature review impacted our empirical analysis by guiding the formation of the interview questions and by providing the coding categories for the data analysis.
The cases selected for the study were recent instances of completed prescriptive analytics deployment projects. It was necessary for the projects to have been completed so that the success or failure of the prescriptive analytics deployment could be estimated. It was also crucial that completion had occurred recently to ensure that the interviewees could recollect the deployment project in sufficient detail. Moreover, we only included cases in which access to end users and developers, managers, and top management could be guaranteed, thereby providing insight into views from different organizational levels. Because of these case selection criteria, it was not possible to include all projects offered by the organizations initially approached. For example, one potential case study was rejected because some of the individuals performing relevant roles in the deployment project were no longer available for interviews. The cases and companies selected are summarized in Table 2.
We used relatively loose semi-structured interviews based on the key enablers identified in the literature review. We used these themes to structure our interviews, based on theoretical replication (Yin, 2014), to investigate the deployment of prescriptive analytics, while also allowing new enablers to emerge. The interviews focused on prescriptive analytics deployment and the factors that contributed to the success or failure of the deployment. This structure provided consistency across the interviews and a connection to the literature we reviewed (Weston et al., 2001).
The study data was collected in Finland in spring 2022. Data collection took place through online interviews with employees who had participated in prescriptive analytics deployment projects. The goal was to interview employees working in one of the following three roles in each case: end user, manager and top management.
Several steps were taken to ensure validity (Yin, 2014). A multiple-informant approach was used to provide construct validity (Beverland and Lindgreen, 2010). Furthermore, the final results were sent to the target organizations involved in the interviews for validation, which further strengthened construct validity.
4.1 Interview coding categories
Deductive coding was used to analyze the interviews (Haug et al., 2021), which involved the development of a codebook with an initial set of codes (Saldana, 2009). This set consisted of coding categories based on the enablers for successful analytics deployment found in the literature review. The data was then read, and excerpts were assigned to codes. At the end of the analysis, the codes still closely resembled the original codebook.
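The mechanics of deductive coding can be loosely mimicked in code: a codebook maps predefined categories to indicative keywords, and each interview excerpt is assigned every code whose keywords it contains. The sketch below is only a mechanical analogy with hypothetical categories and keywords; real qualitative coding, as in this study, rests on researcher judgment rather than keyword matching.

```python
# Hypothetical codebook: coding categories mapped to indicative keywords
# (illustrative only; actual coding was performed by the researchers)
CODEBOOK = {
    "management support": ["management", "leadership", "commitment"],
    "skills and training": ["training", "skills", "learning"],
    "external pressures": ["competition", "regulation", "market"],
}

def assign_codes(excerpt, codebook=CODEBOOK):
    """Return every coding category whose keywords appear in the excerpt."""
    text = excerpt.lower()
    return [code for code, keywords in codebook.items()
            if any(word in text for word in keywords)]

print(assign_codes("Top management showed strong commitment to user training."))
# -> ['management support', 'skills and training']
```

Note that, as in the study, an excerpt can receive multiple codes, and the codebook is fixed in advance rather than emerging from the data.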
Table 3 presents all coding categories derived from the literature review. The categories presented in Table 3 were used to analyze the interviews, including the background of analytics deployment, the benefits of analytics, the successes and failures of deployment, and learning. Table 4 illustrates the coding of the data using quotes [1] from multiple cases as examples.
4.2 Within-case descriptions
The cases used in this study included four companies and one public sector organization. Short descriptions of these case studies, including an overview of their analytics deployment processes, are presented below in the within-case analysis. A more detailed analysis of the enablers we identified is presented in the cross-case analysis.
4.2.1 Case A.
UPM is a Finnish forest industry company focused on delivering renewable solutions beyond fossils. The first UPM case study concerned a development project (titled PVOV) to design and implement a prescriptive analytics tool to optimize hydropower production. This case study focused on specific river systems, consisting of the Iijoki, Isohaara and Jumisko rivers. All these river systems contain one or more hydropower plants and are used for electricity production. The hydropower plants are owned by PVO Vesivoima (PVOV), which is a subsidiary of the company Pohjolan Voima (PVO). UPM owns shares in PVO that entitle them to receive electricity from the hydropower plants at production cost. In this case study, UPM acted not only as an owner, but also as a service provider offering hydro-planning, electricity trading, and balance settlement services. As a service provider, UPM developed in-house optimization models to carry out planning and operation services. Before the deployment of the optimization models, no sophisticated analytical solutions had been used; rather, the decisions relied on manual data analysis performed with in-house spreadsheet tools. In addition, all operations were performed by Pohjolan Voima instead of UPM.
Preparations for deployment began in 2017, and the model was developed in 2018. The modeling, performed by UPM developers, was followed by actual deployment for daily use by commercial planners to plan effective hydropower-related operations. This planning became more automated after the deployment. A company manager both steered the project and participated in the modeling. In turn, the top management was responsible for negotiating the commercial agreement between the software supplier and UPM. The prescriptive analytics deployment was seen as an overall success, and the consensus was that it had improved hydropower production significantly.
4.2.2 Case B.
The second UPM case study related to a sales and operations planning solutions (SOPS) tool, which was developed to identify the optimal combination of supply, sales, production and transportation options. It replaced an old-fashioned decision-making process based on an old tool (SAP APO), the experience of specialists and simple heuristics. Compared to the old tool, the new tool, with extended and improved functionalities, was considered more agile for planning and operating UPM’s complex supply chains. The new tool identified the best choice of supply, sales, production and transportation options to maximize supply-chain profitability. As a result, supply-chain-related decision-making became more efficient and effective, thus supporting the optimization of increasingly complex supply chains.
An external software supplier was selected during 2017 and 2018 from several candidates by the project management and leadership. Deployment occurred during 2019, and the new tool was introduced to sales and operations planners half a year ahead of schedule. These planners use the tool continuously for the monthly planning of sales and operations and for evaluating risks to cope with the dynamic business environment. The UPM SOPS case was, however, not considered fully successful because of the many challenges faced during project preparation and deployment. In particular, the initial choice of software supplier proved problematic, and it was necessary to change the supplier before deployment. However, despite these challenges, the new software was taken into use and it provided benefits for sales and operations planning.
4.2.3 Case C.
Application X gathers information on people’s movements and the way they use buildings and their facilities, especially elevators. This information is necessary for improving user experience and building operations based on analytics and actionable insights. Application X also keeps users aware of current building traffic and trends. Moreover, it facilitates efficient problem-solving and reporting on building usage. This allows, for instance, the reduction of queuing and waiting times or crowding, and thus the improvement of user experience. Application X is a mobile application which can be downloaded and easily used. It provides access to valuable cloud-based data about the use and functioning of elevators and sends notifications about issues in building traffic. Improved decisions, based on accurate data and analyses of how buildings handle the flow of people, enable a better user experience and help to achieve the full potential of buildings. Before the adoption of Application X, decision-making was more challenging as such extensive, real-time information on the horizontal and vertical movement of people was not available.
Application X was deployed in spring 2022. The end-users included, for instance, construction and real estate site managers who could use the system to prevent project schedule delays and thus enable savings by anticipating building traffic jams. The project manager was responsible for project planning and scheduling, defining project goals and delivering product-related feedback from end users. The project leader led the product development, including planning business priorities, a roadmap and future goals in cooperation with other stakeholders.
4.2.4 Case D.
Neste Corporation is the world’s largest producer of renewable diesel and renewable jet fuel refined from waste and residues. Moreover, it offers renewable solutions for the polymer and chemical industries and also produces, refines, and markets oil products. In this case study, a Sales and Operations Execution (S&OE) planning tool, SAP-based data entry and related integrations were developed to support effective supply-chain steering in renewable products.
The S&OE planning tool provides part of the input data for the S&OP optimization tool through developed integrations and conversion tools. Data from SAP and the S&OE planning tool forms the basis for planning the supply, sales and production of renewable products for the following year or more. This approach is more effective than the earlier tool, in which data input required a large amount of manual work with dedicated spreadsheets. Thus, data maintenance, including planning-related changes, has become easier compared to the old spreadsheet-based system. In addition, the coordination of supply-chain-related data was automated and became more efficient.
The integrated planning project, including the project design, began in 2019. The data integration and data-entry feature in SAP was created in 2020, and the new S&OE tool was deployed in a separate project during the fall of 2021. External consultants were used as a project management resource in both deployment projects. Initially, the S&OE tool was operated by a single primary end user, who used the tool each day to change and improve the supply-chain plan. However, since the original implementation, the user base has grown significantly. The original end user was involved in consulting in the early phase of the project design, testing the user interface (UI), and supervising the final phase of deployment. The steering manager for the S&OE tool implementation project, who changed during the project, was responsible for project design and deployment. Top management was responsible for defining tool functionalities and steering the actions of the software supplier.
4.2.5 Case E.
The Finnish Transport Infrastructure Agency (FTIA), with an annual budget of 2.1 billion euros, is a Finnish Government agency responsible for the maintenance and development of Finland’s road and railway infrastructure. A tool, named PRIO, was developed to prioritize infrastructure development projects by assessing the impact value of a project based on a combination of cost-benefit analysis and multi-objective optimization. The agency wished to select projects based not only on a comparison of costs and benefits but also on a consideration of their impacts from the perspective of transportation policy goals. The PRIO tool consists of spreadsheets and uses an external mixed-integer linear programming (MILP) solver. The PRIO tool allows different development projects to be compared and prioritized based on the following five aims:
the needs of freight transport and business travel,
the needs of commute and leisure travel,
traffic safety promotion,
carbon dioxide reduction, and
environmental sustainability and health promotion.
Before the PRIO tool, no such systematic method was used to prioritize development projects, which led to a lack of transparency in the project selection process.
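The article does not disclose PRIO's actual model, criterion scores or policy weights. As a purely illustrative sketch of the kind of prioritization PRIO performs, the toy example below scores each project as a weighted sum over the five aims and selects the portfolio with the highest total impact within a budget. The real tool delegates the selection problem to an external MILP solver; because the example is tiny, it simply enumerates all portfolios. All project names, costs, scores and weights are hypothetical.

```python
from itertools import combinations

# Hypothetical projects: cost (M EUR) and scores (0-5) on the five aims:
# freight/business travel, commute/leisure travel, traffic safety,
# CO2 reduction, environmental sustainability and health.
projects = {
    "Road A":   (40, (5, 2, 3, 1, 2)),
    "Rail B":   (70, (3, 5, 2, 4, 3)),
    "Bridge C": (30, (2, 1, 4, 1, 1)),
    "Track D":  (55, (4, 3, 1, 5, 4)),
}
weights = (0.3, 0.25, 0.2, 0.15, 0.1)  # illustrative policy weights, sum to 1
budget = 120                            # illustrative funding limit (M EUR)

def impact(scores):
    """Weighted-sum impact value of a single project."""
    return sum(w * s for w, s in zip(weights, scores))

# Enumerate every feasible portfolio and keep the highest-impact one.
best_value, best_set = 0.0, ()
for r in range(len(projects) + 1):
    for subset in combinations(projects, r):
        cost = sum(projects[p][0] for p in subset)
        if cost <= budget:
            value = sum(impact(projects[p][1]) for p in subset)
            if value > best_value:
                best_value, best_set = value, subset

print(sorted(best_set), round(best_value, 2))
```

With these made-up numbers, the budget forces a trade-off: the tool recommends the two projects whose combined weighted impact is highest among all affordable portfolios, mirroring how PRIO lets projects be "compared and prioritized" against the transportation policy goals rather than on cost-benefit ratios alone.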
The development of the PRIO tool began around 2010, but major advancements were made in 2017, leading to the first deployment in 2018. In 2021, after multiple additional development phases, the final deployment was completed. Currently, in the Finnish Transport Infrastructure Agency, one person works as the end user and continuously uses the PRIO tool for prioritizing projects. The previous end user changed his role to the manager responsible for tool design, management and deployment. Project leadership was divided between two persons who supported the deployment of the PRIO tool.
5. Interview results on the enablers of successful deployment
Table 5 summarizes the interview results. It shows which of the seven enablers identified from the existing literature were present in each case study and the type of effect (if any) the enablers exerted on the success of prescriptive analytics deployment. The following six labels are used in Table 5:
Empty: the enabler was not mentioned;
YES+: the enabler was mentioned, and it positively contributed to the success of deployment;
YES+−: the enabler was mentioned, and it contributed to the success of deployment both positively and negatively, or it is unclear whether it contributed to the success of deployment;
YES0: the enabler was mentioned, but it did not contribute to the success of deployment;
NO−: the enabler was not present, and this may have negatively contributed to the success of deployment; and
NO+−: the enabler was not present, but it is unclear whether it contributed to the success of deployment.
The interview questions were mostly the same for all interviewees. However, the end users were primarily interviewed about their participation, whereas managers were asked more about project-management-related issues, such as budget and schedule. In addition, more detailed questions related to the enablers of success and the failures of deployment were directed to managers and top management. Unlike the managers and top management, end users were not asked in detail about each enabler of success, and thus their interviews followed a more open structure. Therefore, in Table 5, empty spaces are more frequent for end users, with whom the enablers of success and the reasons for possible failures were discussed at a more general level. The interview questions for each role are shown in the Appendix.
A comparison of the interview results with the outcome of the literature review showed that the significance of management support was strongly highlighted in both. By contrast, user participation and sufficient resources were highlighted more strongly in the interviews than was expected based on the results of the literature review. Moreover, although common enablers of success were identified in both the interviews and the literature review, some of the enablers from the literature review were not seen as important by the interviewees. In particular, they did not consider external pressures from competitors to be a significant motivation for deploying new analytics. In addition, the effect of organizational change was a more prominent topic in the literature than in the interviews. The next sections discuss the effect of each enabler on analytics deployment in the case studies.
5.1 Successful management support
Concerning management support, opinions differed between the different roles, except in Case A, where all interviewees expressed their satisfaction. For example, one end user in Case A described management support as follows:
In my understanding, the project had full management support and there was full trust in what we are doing and high expectations as well – I believe that we delivered on the promises.
In the same case, the top management also mentioned the significance of background discussions with the company’s top executives.
In the other cases, users mostly described the positive effects of management support on successful deployment, whereas managers and top management also discussed more negative issues. Thus, as previously mentioned, overall opinions about the effects of management support on the success of deployment varied considerably between different roles. Nevertheless, despite many end users mentioning the presence of successful management support, some end users felt that they lacked a concrete understanding of the nature of this support, as described by the end user in Case D: “For me, it is very hard to say what the management support was like behind the scenes”.
As mentioned, managers and top management expressed more critical views of management support. However, they also considered management support essential, and they linked negative outcomes to its poor quality or inadequacy. For example, the manager in Case E mentioned that delayed deployment was the result of unsatisfactory management support:
The deployment would have been faster if the management support had been stronger over the years – that first version [of the analytics tool] had it, this development of the financial situation was seen as strong and so forth. Then it was not as strong anymore.
Furthermore, the top management in Case B highlighted the management’s poor understanding of the deployment project:
So yes the management, management supported the project, but […] project management and then business line management were not […] on track with where we are going with this; rather they just believed that yes this is kind of going to be fine. It is going to be great when we have fancily made promises; there are fine-looking slides, and then all sorts of things come up and then in the end it is revealed that wait a minute this is kind of a dud.
However, the manager in the same case expressed different views: “Management support was important; it was always available, and it was high quality”.
Despite some differing views, the common sentiment among the interviewees was that successful management support was essential. Most of the negative evaluations were related to perceptions of inadequate support or the management’s lack of understanding of all the essential aspects of the deployment project.
5.2 Successful organizational change
In many cases, the analytics deployment process did not involve any significant organizational changes, or at least their effect was seen as negligible. Indeed, in many interviews, organizational changes were discussed only briefly or not at all. In addition, the presence of organizational changes was sometimes not clearly perceived, as described by the top management in Case B: “The organization as such did not change the way it operates at all”, or the manager in Case D:
Working methods did not change a lot. The same persons continued. The roles or responsibilities did not change except a new tool and new integrations were present. Communication happened through a new reporting system.
However, in Case B, the manager gave greater emphasis to unexpected organizational effects, which nonetheless led to a rather positive outcome: “The new system could make up for employees who left – the number of personnel has been reduced – If three left then only two were hired as replacements”.
However, Case E, the public sector case, differed considerably from the other cases as two out of three interviewees mentioned organizational changes and acknowledged their positive effect on analytics deployment, as described by the end user in Case E:
These more standard roles here, so who does what and what all we want to do, these [things] have perhaps stabilized. Little by little. I am not saying they are completely clear, but they have still been significantly clarified compared to a year ago.
The top management in Case E also described organizational changes due to the deployment process: “Of course, it has necessitated additional resources and such additional resources have been hired”. In addition, in Case A, the second end user described the real significance of the organizational changes: “The whole habit of the organization, or the way of thinking, has changed so that we have effectively learnt to deploy such things in a broader way”. Nonetheless, the other interviewees in Case A did not mention organizational changes or did not consider them significant.
5.3 Technical readiness
Technical readiness was seen in both a positive and a negative light (or was viewed as insignificant) from the perspective of successful deployment in cases A, B and C. In Case B, there was only a little discussion of technical readiness, which was mostly considered adequate. By contrast, in cases E and, especially, D, technical factors were perceived to affect deployment more negatively, as described by the end user in Case D: “Because always, always some new problems were discovered in those functionalities or data”. Such negative sentiments were also expressed by the top management in Case E:
We have to take into consideration more general and more detailed guidelines when we make our investment plan. Those guidelines. And by the way, many of them are such, by nature, that they cannot be fed into it [PRIO].
In addition, despite actual changes to the PRIO tool being seen as successful, the end user of Case E also highlighted data-related challenges:
Such non-comparable data does not work for us, so how do we make sure that the kind of data, data there in our system is up to date and comparable; that is perhaps kind of a challenge and risk which at least I here myself strongly recognize.
However, the noted challenges were not always considered overwhelming, and most interviewees emphasized the adequacy of the organization’s technical readiness, like the manager in Case D:
We did not actually have any technical challenges as such, [with] all integrations and interfaces and etc. Other technical things went through very easily, and we got them done really quickly and according to schedule, and actually they took a lot less money too.
The overall views of end users, managers and top management did not differ in cases A, B and C. However, in cases D and E, the users felt that technical readiness was insufficient, thereby negatively affecting the deployment, whereas managers and top management expressed both positive and negative views on technical readiness. In Case A, the manager considered the organization’s current technical readiness sufficient but still worried about technical challenges in the future, pondering how to reach the optimal level of model accuracy where “the model is detailed enough to be useful but does not anyway contain unnecessary details, which would lead to it needing to be constantly maintained”. In turn, the end users in Case A emphasized major technical challenges related to the maturity of the planning model and the fact that the simultaneous use of the new model and old tools had been overlooked. The second end user in Case A also described such technical challenges but emphasized the organization’s ability to cope with them. Instead of technical challenges, the top management in Case A highlighted the importance of a fast solution time for the optimization model and an adequate understanding of the model’s behavior.
5.4 Sufficient resourcing and scheduling
All interviewees agreed that resourcing and scheduling affected the deployment. However, negative effects were mentioned slightly more often by end users, whereas managers and top management more often discussed this enabler of successful deployment in a positive light. Overall, answers depended more on the case than on the role of the interviewee. For example, in cases A, B and D, the scarcity of resources was highlighted by all interviewees, while in cases C and E, this issue was not mentioned. The manager in Case D attributed the problem to increased human resource needs, and the top management in the same case likewise worried about the lack of resources:
And now we are learning with them [the new tools], so it now takes some time now to train. Training and extra orientation. But still it has been a clear sort of bottleneck in the [organization’s] resources.
However, interestingly, as described above in Section 5.3, the manager from Case D considered technical readiness to be one reason for the better management of resourcing and scheduling.
In addition to Case D, the role of insufficient resources in the failure of deployment was identified in many cases where the resources originally allocated to the project were inadequate or the risk of exceeding the budget was real. For example, in Case A, while all interviewees agreed that resources were generally sufficient, they claimed that no flexibility had been built in for surprises, such as sick leave. Therefore, the scarcity of resources was considered a significant risk. Moreover, the end user in Case A considered that “responsibility could have been shared more”, noting how the situation was frustrating for the manager as well. The second end user in Case A concurred but also thought that the deployment project was clearly structured as it was led by a single person: “If it is the vision and project of one man, then it stays under control”. The end user in Case B shared similar views, noting the risk of scarce resources but mentioning the importance of avoiding an overly large core team. The manager in Case B also pinpointed the importance of retaining the same core team: i.e. members should not change during the project.
In cases C and E, views on the adequacy of resources varied more between different roles. In Case C, issues of resources and scheduling were not raised by end users, whereas the manager and top management considered that resources and scheduling had been mostly appropriate. By contrast, in Case E, both the end user and top management highlighted development needs, especially related to scheduling.
5.5 Appropriate change management
The effects of change management varied considerably between the cases, but views on these effects were fairly consistent within each case. For instance, in Case A, staff commitment, cooperation and flexibility were considered successful by the top management, thus acknowledging the effect of change management on successful analytics deployment. The top management of Case A also highlighted the importance of planning and reported that trust in the team was high. In addition, the pressure to succeed created by the initial promises made was mentioned as a significant enabler of the success of analytics deployment. Furthermore, the organization had succeeded in differentiating the roles of service provider and project owner. However, knowledge sharing during the deployment project was not seen as successful by one of the users in Case A. Nevertheless, this person acknowledged that positive efforts had been made:
All efforts and focus were put on this new system, so that we did not have a back door to return to old ways, so this forced also that we needed to get this optimization model in such good shape that we can work with it and that is how it went.
The second end user in Case A emphasized the importance of strong leadership:
Even big things can be deployed by, by project management and scheduling and pushing it forward in a determined manner – We prioritized this, focused on this, and got it to work.
He also considered it important that change management was led internally instead of by external consultants.
The manager in Case B discussed the significance of cooperation and determination when working with a software supplier to achieve goals. The end user in Case B also mentioned that the personnel were prepared for the change and that successful deployment was predicated not only on technical deployment but also on appropriate organizational change. However, the top management in Case B highlighted the lack of successful leadership, which led to unsuccessful change management and thus reduced the success of the deployment.
In Case C, consideration of customer value was underlined in many interviews, especially by the top management:
A significant enabler is understanding the view of the customer. When the customer contacts you, you must be ready. All the time, you must be available and ready to serve the customer. The project team has been especially successful due to direct contact with the customer and activity in communication with the customer.
In addition, the manager in Case C mentioned the significance of customer feedback. Both the manager and top management emphasized the importance and success of user participation. The manager in Case C explained why participation was so essential:
Yes, they were prepped to some extent […] it was quite important that they are not just given a button in their hand, but we explained first that why, why we are doing this?
In cases D and E, inappropriate change management was identified. In Case D, one manager particularly highlighted the lack of trust and poor communication between a part of the project team and the software supplier:
[…] for the project, which started this year, we have selected another supplier simply because of lack of trust and bad chemistry. So apparently it is [the case] that they had not understood what we are genuinely asking for.
The end user in Case B also emphasized the importance of cooperation between the organization and software supplier. Similarly, in Case E, the interviewees did not consider change management to have been as strong as expected. The manager in Case E was disappointed with the change management outcomes and claimed that a more exact specification of the project goals would have been required rather than simply relying on the gradual development of the system over two years.
5.6 Sufficient skills and training
The interviewees, especially end users, emphasized the positive effect of user participation. For example, the end user in Case A was satisfied with user participation and described its sustained, seamless nature: “Deployment, training, and model development were a continuous process”. However, he felt that there should have been a longer piloting period in which old methods could have been used in parallel with the new analytics tools. On the other hand, the second end user in Case A felt that training with the new model had been extensive and sufficient. According to the manager in Case A, learning to use the new software and user interface took time, which was seen as a challenge. He also highlighted the importance of cooperation to improve skills: “It is useful that the modeler and substance expert discuss with each other”. In addition, the top management in Case A acknowledged the importance of sharing experiences throughout the whole organization. In Case A, the top management also stressed the importance of sufficient knowledge and skills but noted the inevitability of surprises and hence the need to adapt.
In Case B, the end user mentioned that the training sessions had been interactive and that they had also helped to further develop the system:
The workshops were educational too because there among ourselves we also had many discussions, so how we want to do these things, which lead to how the system was built and how the processes were executed within it.
The manager in Case B was impressed with the high level of skills in the team and thought it useful to use them in deployment projects instead of using external resources. In Case C, the manager and the top management placed particular emphasis on the importance of user participation. Moreover, the user in Case C confirmed that sufficient attention had been paid to user participation.
In cases D and E, the positive effects of skills and training were not discussed as much as in the other cases. In Case D, the top management considered that user participation had been beneficial, but its shortcomings were also highlighted: “For them, [the users], it would have been useful to practice already beforehand [i.e. before the deployment]”. In addition, the user in Case E acknowledged the importance of user participation and felt that it could have been initiated even earlier. Furthermore, the user in Case E noted the importance of sharing knowledge about the possibilities of the tool to avoid some benefits being ignored due to insufficient understanding. In Case E, the effects of skills and training on analytics deployment were not mentioned by the manager or top management. Overall, the interviewees regarded the lack of training and user participation in the tool deployment as the most significant failure related to skills and training.
5.7 External pressures
In general, the managers did not consider external pressures from the business environment or the deployment of analytics by competitors to be a significant enabler of successful deployment. The top management viewed it as somewhat more significant, whereas many end users did not consider that such external factors had impacted the success of their own deployment. Especially in Case A, both the manager and top management felt that external pressures had not contributed to the success of deployment. However, the manager acknowledged that more sophisticated software for hydropower planning was known to be available, but the need for such external software had not been recognized in the organization. Moreover, the manager highlighted the presence of a “do-it-yourself” attitude in his company:
I know that there is software that is specialized in hydropower optimization or energy-market modeling. Such software might be more sophisticated than what we use here in our system. We have not gone for that [software], at least not yet, because we have not seen it as necessary. So, in a certain way, we have had quite a strong culture of doing it by ourselves [in our organization].
The representative of the top management in Case A claimed that the decision about the deployment had been taken independently, although they had monitored the progress of other companies:
It did not affect the decision what, for example, [our leading competitor] is doing – we followed what they were doing, but maybe still searched for the solution from our perspective.
Likewise, the top management also emphasized that the software was not unique and that it was essential to focus on applying analytics in practice instead of constantly searching for novel technology. In addition, the top management highlighted the special nature of the case and the fact that the service was not provided for a third party but for one owner of the energy department at the company in question. Thus, there was no competition with other service providers. Nevertheless, the top management considered that the benchmarking of available tools could be useful in the future. On the other hand, in Case B, both the end user and manager stressed the absence of external pressures, emphasizing, instead, their desire to be pioneers.
In contrast to the end users, the managers more often mentioned the effect of external pressures, but their views about its significance varied. For example, the manager in Case C claimed that the effect of external pressures had been significant:
Exactly what I said about how it has been this hot topic, all this data collection, and how it is valuable. So, for sure, externally something also came from competitors.
In cases C, D and E, when mentioned at all, the interviewees primarily felt that external pressures had positively affected the success of analytics deployment. However, surprisingly, in Case B, the top management considered the presence of external pressures to have exerted a strong negative effect on the success of deployment. Specifically, the top management felt that the organization had blindly followed new digitalization trends without a deep understanding of the real needs of the organization: “Let’s do a thorough renewal and rambunctiously in the spirit of digitalization and like that. And then it did not end up quite as happily”.
6. Discussion and conclusions
The successful deployment of prescriptive analytics is a challenging task in both private and public organizations. The motivation for the study arose from the observation that while the benefits of prescriptive analytics are widely acknowledged, there is a lack of understanding about how it should be deployed. Indeed, its deployment often causes both technological and organizational challenges (Arunachalam et al., 2018; Tim et al., 2020). Our multi-case study represents a practice-oriented approach to identifying the key enablers of successful prescriptive analytics deployment. By building on the existing literature and our empirical findings, we identified multiple enablers. The most significant were successful management support, sufficient resourcing and scheduling, appropriate change management and sufficient skills and training. These findings provide key insights into how to increase the readiness of organizations to deploy new prescriptive analytics.
The results summarized in Table 5 and discussed in the previous section demonstrate that the differences between the views of users, managers, and top management are smaller the more successful the case, and vice versa. For instance, Case B was commonly seen as an example of the unsuccessful deployment of prescriptive analytics, and a comparison of the views of users, managers, and top management on the enablers of successful deployment revealed significant variation. By contrast, in cases of successful deployment, such as Case A, the views of the interviewees were rather similar. This highlights the need for collaboration, communication, and feedback between all organizational levels during the deployment – perhaps identifying these differing views and the related dissatisfaction during the deployment process could have led to changes and a more successful outcome.
Our results also show that most differences regarding enablers for successful deployment arise between roles rather than between organizations. Concerning technical readiness, for instance, similar views tended to be shared by individuals performing similar roles rather than by personnel within the same organization. In Case E, and especially in Case D, end users did not consider technical readiness to have been sufficient, which worsened the outcome of the deployment, whereas the manager and top management thought that technical readiness had been adequate despite some challenges. Such results have not been reported in previous research, which has focused more on the level of managers and top management (e.g. Thuethongchai et al., 2020).
In addition to the different views expressed by individuals in different roles, the points of focus also varied between roles. In our study, end users tended to highlight user participation, skills and training, whereas managers and top management highlighted the significance of organizational change. Such differences may be explained by the roles themselves: naturally, managers and top management tend to focus more on organizational change than users, who mostly focus on learning to use the new tool. However, it is extremely important to understand the opinions of end users, as they are responsible for the actual daily use of such tools in routine work.
Differences were also observed between the private company cases and the public sector case. In the public sector organization, issues related to technical readiness were highlighted more, whereas scarcity of resources was considered a less significant risk than in the private sector. In addition, the importance of change management was considered lower in the public sector case than among the private organizations that were studied. Failures related to skills, including user participation, were highlighted in the public sector, whereas the effect of external pressures was considered insignificant for the deployment of prescriptive analytics.
Overall, our empirical results mostly confirm the seven enablers identified in the literature, yet certain enablers were clearly more prominent. While the literature findings emphasize the effect of competitors (Zhu et al., 2006), in many of our cases, this topic was not raised at all. This may be explained by the novelty of prescriptive analytics deployment: success stories and large-scale pressure to adopt solutions to “stay in the game” are not yet present.
Many organizational-change-related themes from the literature, such as a supportive enterprise structure or a culture with a reward system, were not mentioned (Hussein et al., 2019; Soja and Paliwoda-Pekosz, 2009). This may be the result of the different nature of our interview case studies. In the literature, many studies cover more extensive business areas, especially regarding the deployment of ERP systems, whereas our case studies tend to focus on limited areas, such as one or two business units. Moreover, in such studies, organizational changes are mostly seen as a natural outcome of deployment instead of an enabler.
Despite management support being emphasized as a major enabler of successful deployment in both the literature and our interviews, securing commitment within the organization, particularly from management, can be seen as one of the greatest challenges faced by a deployment team (Yeoh et al., 2008). Our interview results also support this view. Moreover, technical readiness was seen as an enabler, although many interviewees noted that technical changes could have been managed even better.
6.1 Managerial recommendations
Our results lead us to suggest a few recommendations for management to ensure that the adoption of prescriptive analytics leads to the desired outcomes. The most important recommendation is to strengthen communication between end users, managers, and top management throughout the deployment – success is likely to hinge on all three organizational levels perceiving the presence of the enabling factors. Through successful communication, potential problems can be discovered early and resources redirected, training increased or functionalities improved. Thus, a common dialogue between all parties should be established already in the early phases of deployment.
Communication between all parties can also be supported by extensive user participation; the interviewees emphasized the significance of including user participation and training already in the early phases of deployment. In addition, successful management support and strong leadership, acknowledged as important in all the case studies, are more likely to be achieved by strengthening communication. To guarantee strong management support, it is essential that management possesses a full understanding of the necessary aspects of deployment, which can likewise be ensured by a strong dialogue between the different roles.
To achieve high user participation and a common dialogue during the preparatory and deployment stages, resourcing should also be considered. Without sufficient resources, including adequate personnel and funding, cooperation between the different roles may remain insufficient. Responsibilities should also be shared among more people to avoid over-reliance on a single person’s input. Assessing the skills of analytics deployment team members is likewise significant for the success or failure of deployment: people with a strong background in advanced analytics are better able to define which questions the new analytics should answer, how they should be answered, and what constitutes a satisfactory solution.
6.2 Limitations and future research
The cases used in this study involved four private companies and one public sector organization. As most case organizations were from the private sector, the generalizability of the study findings could be improved by increasing the number of case studies representing the public sector. Moreover, our findings could be tested in a public-sector-specific study to validate their wider applicability. In addition, further research could target more dynamic industries, as most of the industries represented in our sample were relatively static in nature. In more dynamic business environments, the deployment challenges of prescriptive analytics may differ, as data may become obsolete more quickly and systems may require increased flexibility and more frequent upgrading.
In terms of methodology, as prescriptive analytics deployment had already been completed by the time our study was conducted, our interviews asked respondents to reflect on past events. In qualitative studies, researchers often wish to learn about the past experiences of participants (Ellis et al., 2011). However, to avoid challenges such as inaccurate or inadequate memories of the past, it would be useful to conduct a long-term study that regularly follows an ongoing project, including continuous interviews during the project. In addition, some kind of diary of successes and failures could be kept by analytics deployment team members during the projects.
Our study has provided an understanding of the key enablers of the successful deployment of prescriptive analytics. Given the qualitative nature of our study, we could not analyze statistical relationships between the study constructs. Hence, we suggest that future research use large-scale surveys in organizations to examine the detailed impact of the identified enablers on the success of, and performance derived from, prescriptive analytics implementation, thereby also supporting the qualitative outcomes of this research.
Tables
Characteristics of different analytics categories
Category of analytics | Intelligence | Methodological sophistication | Data requirements | Expected value for organization |
---|---|---|---|---|
Descriptive | “What has happened?”, “why did it happen?”, and “what is happening now?” | Databases, reports, statistics, data mining, clustering | Low | Summary of historical data, data tracking, trend reporting, etc. |
Predictive | “What will happen?” and “why will it happen?” | Regression models, time-series models, machine learning | Medium | Forecasts, scenarios, trend identification, causal analysis |
Prescriptive | “What should be done?” | Mathematical optimization, decision analysis, simulation | High | Decision recommendations, optimal planning and resource allocation, automated decision-making |
Source: Table by author
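As a purely illustrative aside (not part of the study or its data), the three categories in the table above can be sketched on toy numbers: descriptive analytics summarizes the past, predictive analytics extrapolates, and prescriptive analytics recommends an action. All figures, prices and the naive trend model below are hypothetical examples chosen only to make the contrast concrete.

```python
# Hypothetical illustration of the three analytics categories in Table 1.
# All data and parameters are invented for demonstration purposes.

demand = [100, 110, 120, 130, 140, 150]  # toy monthly demand history

# Descriptive: "What has happened?" -- summarize historical data.
average_demand = sum(demand) / len(demand)

# Predictive: "What will happen?" -- naive linear-trend forecast.
trend = (demand[-1] - demand[0]) / (len(demand) - 1)
forecast = demand[-1] + trend  # estimated next-period demand

# Prescriptive: "What should be done?" -- a tiny optimization that
# picks the production quantity maximizing profit given the forecast.
price, unit_cost = 10.0, 4.0

def profit(qty, dem):
    # Revenue is capped by demand; cost is incurred for every unit made.
    return price * min(qty, dem) - unit_cost * qty

best_qty = max(range(0, 201), key=lambda q: profit(q, forecast))

print(average_demand, forecast, best_qty)
```

The point of the sketch is only the progression from summary (a mean) to forecast (a trend) to decision recommendation (an optimized quantity); real prescriptive tools would replace the brute-force search with mathematical optimization or simulation, as the table notes.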
Overview of sample organizations
Case | Case description | Case company | Key business area | Interviewees |
---|---|---|---|---|
A | Deployment of hydropower-business-area-related optimization tool in 2017 | UPM | Paper and biorefining | Two users, a manager and top management |
B | Supply chain planning tool renewal in 2020 | UPM | Paper and biorefining | User, manager and top management |
C | A mobile application to improve user experience and building operations based on people movement and equipment data as well as analytics and insights in 2022 | Company X | Manufacturing | User, manager and top management |
D | Supply-chain-planning-tool-related data system and integration renewal in 2020–2021 | Neste Corporation | Renewable products | User, manager and top management |
E | A tool for prioritization of development projects in 2018 | The Finnish Transport Infrastructure Agency | Transportation in public sector | User, manager and top management |
Source: Table by author
Description of coding categories related to success enablers of analytics deployment
Category | Description |
---|---|
Analytics description | Respondent describes the analytics that was deployed |
Reasons for analytics deployment | Respondent discusses the motivation for deploying a new analytics tool |
Role in deployment | Respondent describes his/her role in the deployment |
Goals set for new analytics | Respondent discusses specific goals set for new analytics deployment |
Performance of new analytics | Respondent discusses how well or badly the new analytics has performed |
Deployment time, duration and extent | Respondent discusses the schedule and duration of the deployment and how many persons were involved |
Management support in the deployment process | Respondent discusses management support during the deployment process |
Organizational changes in the deployment process | Respondent discusses organizational changes during the deployment process |
Technical readiness in the deployment process | Respondent discusses technical readiness during the deployment process |
Resourcing and scheduling in the deployment process | Respondent discusses resourcing and scheduling during the deployment process |
Change management in the deployment process | Respondent discusses change management practices used during the deployment process |
Skills and training in the deployment process | Respondent discusses the development of end users’ skills and training during the deployment process |
External pressures in the deployment process | Respondent discusses the effects of external pressures on the deployment process |
Learning | Respondent discusses what was learnt |
Suggestions and recommendations | Respondent suggests what would be done similarly and what should be avoided |
Source: Table by author
Illustration of coded data. The example citations for each coding category are examples of the existence or lack of the enabler in a particular case
Category | Citation |
---|---|
Management support in the deployment process | “In my understanding, the project had full management support and there was full trust in what we are doing and high expectations as well, and as it had been promised, management had promised that we can improve this and the expectations were in line with that, but I believe that we delivered on the promises”. (User 1, Case A) “They (management) talk now about everything but the content […] what we want to get done with the system and what the expectations are of launching it, so there management must be on the map, must have same ideas, when the views are the same then I see that similar projects will also succeed”. (User, Case B) “Top management support was important; it was always available, and it was high quality […] they really got into it”. (Manager, Case B) “Let’s say that if there had continuously been a kind of strong management order for this and well support and preparedness to use the results. Then it would have gone forward”. (Manager, Case E) “Project management and then business line management were not […] on track with where we are going with this; rather, they just believed that yes this is kind of going to be fine […] and then in the end it is revealed that wait a minute this is kind of a dud”. (Top management, Case B) |
Organizational changes in the deployment process | “The whole habit of the organization, or the way of thinking, has changed so that we have effectively learnt to deploy such things in a broader way”. (User 2, Case A) “The organization as such did not change the way it operates at all”. (Top management, Case B) “Of course, it has necessitated additional resources and such additional resources have been hired”. (Top management, Case E) |
Technical readiness in the deployment process | “The biggest challenges are maybe in the technical part of the system […] those challenges in integration, information does not move as expected. Those (problems) were not due to the supplier but perhaps our own system architecture”. (Manager, Case B) “Such non-comparable data does not work for us, so how do we make sure that the kind of data, data there in our system is up to date and comparable; that is perhaps kind of a challenge and risk which at least I here myself strongly recognize”. (User, Case E) “Technology-wise we have had to take some detours that; you find sort of temporary solutions, that we can get it to work. Temporary solutions”. (Top management, Case C) “We did not actually have any technical challenges as such, [with] all integrations and interfaces and etc”. (Manager, Case D) |
Resourcing and scheduling in the deployment process | “There is a will to allocate resources to this”. (Top management, Case E) “It (budget) was exceeded because […] well this designer got, he was very overworked, so we had to loosen, lighten his workload”. (Manager, Case D) “It would not have survived with any sick leave, so it was really all resting on one man pretty much”. (User 2, Case A) “Finalizing the tools had to wait a bit, so there people had to adjust for sure; the schedule just hit us”. (Top management, Case A) “That is just one example of how there was this constant discontinuity between the reality and the schedule” (Top management, Case D) |
Change management in the deployment process | “In such deployment and model building, our benefit was that it was kind of made by our own, ok our own supervisor, but then again sort of our own close colleague so we were all the time together making this thing for our use. So if you imagine that it had been made by some consultant or some completely unknown person from some other organization for example then this change would not necessarily have been, and sort of configuration and implementation, been necessarily so smooth”. (User 2, Case A) “All efforts and focus were put on this new system, so that we did not have a back door to return to old ways, so this forced also that we needed to get this optimization model in such good shape that we can work with it and that is how it went”. (User 2, Case A) “At some point, there was criticism of the communication, that we did not always quite know where we are going in the project but that did get better when we gave feedback and it was when we actually started doing it, it was clear the dates when something is happening […] there was a sort of shushshus-business in the beginning before it can be published to other users; we did not get such good information”. (User 1, Case A) “Yes, they were prepped to some extent […] it was quite important that they are not just given a button in their hand, but we explained first that why, why we are doing this?” (Manager, Case C) |
Skills in the deployment process | “You cannot (with 3 shifts) get that training optimally for everyone”. (Top management, Case A) “The workshops were educational too because there among ourselves we also had many discussions, so how we want to do these things, which lead to how the system was built and how the processes were executed within it”. (User, Case B) “For them, [the users], it would have been useful to practice already beforehand [i.e., before the deployment]”. (Top management, Case D) |
External pressures in the deployment process | “It did not impact our decision-making what, for example, [a competitor] is doing–yes we did to some extent follow from the side what they are doing, but perhaps we anyway started from our own perspective–what the appropriate solution is for us”. (Top management, Case A) “It has been quite a hot topic all this sort of data collection and how it is valuable. So, for sure, externally something also came from competitors”. (Manager, Case C) |
Source: Table by author
Responses by interviewees
Case | Role: User/developer/manager/top management | Successful management support | Successful organizational changes | Technical readiness | Sufficient resourcing and scheduling | Appropriate change management | Sufficient skills and training | External pressures |
---|---|---|---|---|---|---|---|---|
A | User 1 | YES+ | YES+− | NO− | YES+− | YES+− | ||
User 2 | YES+ | YES+ | YES+− | NO+− | YES+ | YES+ | ||
Developer/ manager | YES+ | YES+− | YES+ | YES+− | NO+− | |||
Top management | YES+ | NO+− | YES+ | YES+ | YES+ | YES+ | NO+− | |
B | User | NO− | YES+− | YES+ | YES+ | NO+− | ||
Manager | YES+ | NO+− | YES+ | YES+ | YES+ | YES+ | NO+− | |
Top management | YES0 | NO+− | YES+ | YES+− | NO− | NO− | YES0 | |
C | User | YES+− | NO− | YES+− | ||||
Manager | YES+ | NO− | YES+− | YES+ | YES+ | YES+ | ||
Top management | YES+− | NO− | YES+− | YES+ | YES+ | YES+ | YES+ | |
D | User | YES+− | NO− | NO− | NO− | YES+ | ||
Manager | YES+− | NO+− | YES+− | NO+− | NO+− | YES0 | YES+ | |
Top management | NO− | YES+− | NO− | NO− | NO− | |||
E | User | YES+ | YES+ | NO− | NO− | NO− | YES+− | |
Manager | YES+− | YES+− | NO− | NO− | ||||
Top management | YES+− | YES+ | YES+− | YES+− | YES+ |
Source: Table by author
Interview questions
END USER | MANAGER | TOP MANAGEMENT |
---|---|---|
What is the analytics software implemented in your company and for what is it used? What is your role related to the software and its deployment? What kind of process or decision-making is replaced by such prescriptive analytics? What are the expected benefits achieved by prescriptive analytics? What were the actual benefits? When did you take the new analytics into use in your work? How long did the deployment phase take? Did you get any training or support for the deployment? Was the analytics deployment successful? Why/why not? What challenges have you faced? What did you learn from the deployment? What could be done similarly/differently in the future? | What is the analytics software implemented in your company and for what is it used? What is your role related to the software and its deployment? What kind of process or decision-making is replaced by such prescriptive analytics? What are the expected benefits achieved by prescriptive analytics? What were the actual benefits? Why was exactly this analytical program chosen (either external software or an own customized analytical program)? When did the project preparation start? How long did the development work/supplier selection take before the actual deployment phase? What exactly was done during this time? How many people were involved in the preparation phase (who/which roles)? Has the project been on budget and schedule? Why/why not? How was the deployment carried out, in other words, how were the users prepared or supported? Was the analytics deployment successful? Why/why not? What challenges have you faced? What did you learn from the deployment? What could be done similarly/differently in the future? What kind of effect did the following factors have on the successes or failures? (management support, organizational changes, technical changes, resourcing and scheduling, change management, skills and training, competitors’ analytics deployment) | What is the analytics software implemented in your company and for what is it used? What is your role related to the software and its deployment? What kind of process or decision-making is replaced by such prescriptive analytics? What are the expected benefits achieved by prescriptive analytics? What were the actual benefits? Was the analytics deployment successful? Why/why not? What challenges have you faced? What did you learn from the deployment? What could be done similarly/differently in the future? What kind of effect did the following factors have on the successes or failures? (management support, organizational changes, technical changes, resourcing and scheduling, change management, skills and training, competitors’ analytics deployment) |
Source: Table by author
Note
In line with much qualitative social sciences research, we present the quotes translated from Finnish as verbatim as possible, but for readability and due to interviewee requests, some “um” and “kind of”-type natural hesitations in speech have been removed (e.g. Lingard, 2019).
Appendix
References
Ain, N., Vaia, G., DeLone, W.H. and Waheed, M. (2019), “Two decades of research on business intelligence system adoption, utilization and success – a systematic literature review”, Decision Support Systems, Vol. 125.
Al-Haddad, S. and Kotnour, T. (2015), “Integrating the organizational change literature: a model for successful change”, Journal of Organizational Change Management, Vol. 28 No. 2, pp. 234-262.
Agrawal, P., Narain, R. and Ullah, I. (2020), “Analysis of barriers in implementation of digital transformation of supply chain using interpretive structural modelling approach”, Journal of Modelling in Management, Vol. 15 No. 1, pp. 297-317.
Akerkar, R. (2013), “Advanced data analytics for business”, in Akerkar, R. (Ed.), Big Data Computing, CRC Press, Boca Raton, FL, pp. 373-397.
Ali, M. and Miller, L. (2017), “ERP system implementation in large enterprises – a systematic literature review”, Journal of Enterprise Information Management, Vol. 30 No. 4, pp. 666-692.
Arunachalam, D., Kumar, N. and Kawalek, J.P. (2018), “Understanding big data analytics capabilities in supply chain management: unravelling the issues, challenges and implications for practice”, Transportation Research Part E: Logistics and Transportation Review, Vol. 114, pp. 416-436.
Balbin, B.B.F., Barker, J.C.R., Leung, C.K., Tran, M., Wall, R.P. and Cuzzocrea, A. (2020), “Predictive analytics on open big data for supporting smart transportation services”, Procedia Computer Science, Vol. 176, pp. 3009-3018.
Barth, C. and Koch, S. (2019), “Critical success factors in ERP upgrade projects”, Industrial Management and Data Systems, Vol. 119 No. 3, pp. 656-675.
Basu, A. (2013), “Five pillars of prescriptive analytics success”, Analytics Magazine, Vol. 8, pp. 8-12.
Beverland, M. and Lindgreen, A. (2010), “What makes a good case study? A positivist review of qualitative case research published in industrial marketing management, 1971–2006”, Industrial Marketing Management, Vol. 39 No. 1, pp. 56-63.
Bingi, P., Sharma, M.K. and Godla, J.K. (1999), “Critical issues affecting an ERP implementation”, Information Systems Management, Vol. 16 No. 3, pp. 7-14.
Bose, R. (2009), “Advanced analytics: opportunities and challenges”, Industrial Management and Data Systems, Vol. 109 No. 2, pp. 155-172.
Burnes, B. (2011), “Introduction: why does change fail, and what can we do about it?”, Journal of Change Management, Vol. 11 No. 4, pp. 445-450.
Clark, J.-A., Liu, Y. and Isaias, P. (2020), “Critical success factors for implementing learning analytics in higher education: a mixed-method inquiry”, Australasian Journal of Educational Technology, Vol. 36 No. 6, pp. 89-106.
Davenport, T.H. and Harris, J.G. (2007), Competing on Analytics: The New Science of Winning, Harvard Business School Press, Boston, MA.
den Hertog, D. and Postek, K. (2016), “Bridging the gap between predictive and prescriptive analytics – new optimization methodology needed”, available at: https://optimization-online.org/?p=14328 (accessed 20 October 2022).
Eisenhardt, K.M. and Graebner, M.E. (2007), “Theory building from cases: opportunities and challenges”, Academy of Management Journal, Vol. 50 No. 1, pp. 25-32.
El-Adaileh, N.A. and Foster, S. (2019), “Successful business intelligence implementation: a systematic literature review”, Journal of Work-Applied Management, Vol. 11 No. 2, pp. 121-132.
Ellis, J., Amjad, A. and Deng, J. (2011), “Using pre-interview activities to support participants’ recall and analysis of past events”, education, Vol. 17 No. 2, pp. 61-73.
Engel, Y., Etzion, O. and Feldman, Z. (2012), “A basic model for proactive event-driven computing”, in Bry, F., Paschke, A., Eugster, P., Fetzer, C. and Behrend, A. (Eds), DEBS ’12: Proceedings of the 6th ACM International Conference on Distributed Event-Based Systems, Association for Computing Machinery, New York, NY, pp. 107-118.
Errida, A. and Lotfi, B. (2021), “The determinants of organizational change management success: literature review and case study”, International Journal of Engineering Business Management, Vol. 13, pp. 1-15.
Fearon, C., Manship, S., McLaughlin, H. and Jackson, S. (2013), “Making the case for ‘techno-change alignment’: a processual approach for understanding technology-enabled organisational change”, European Business Review, Vol. 25 No. 2, pp. 147-162.
Fernando, A.C. (2011), Business Environment, Dorling Kindersley, India.
Fink, L., Jogev, N. and Even, A. (2017), “Business intelligence and organizational learning: an empirical investigation of value creation processes”, Information and Management, Vol. 54 No. 1, pp. 38-56.
Ginzberg, M.J. (1981), “Early diagnosis of MIS implementation failure: promising results and unanswered questions”, Management Science, Vol. 27 No. 4, pp. 459-478.
Gist, M.E. (1987), “Self-efficacy: implications for organizational behavior and human resource management”, The Academy of Management Review, Vol. 12 No. 3, pp. 472-485.
Habeeb, R.A.A., Nasaruddin, F., Gani, A., Hashem, I.A.T., Ahmed, E. and Imran, M. (2019), “Real-time big data processing for anomaly detection: a survey”, International Journal of Information Management, Vol. 45, pp. 289-307.
Hagerty, J. (2017), “Planning guide for data and analytics”, available at: www.gartner.com/binaries/content/assets/events/keywords/catalyst/catus8/2017_planning_guide_for_data_analytics.pdf (accessed 20 October 2022).
Hair, J.F. (2007), “Knowledge creation in marketing: the role of predictive analytics”, European Business Review, Vol. 19 No. 4, pp. 303-315.
Hasan, H.M., Lotfollah, F. and Negar, M. (2012), “Comprehensive model of business intelligence: a case study of nano’s companies”, Indian Journal of Science and Technology, Vol. 5 No. 6, pp. 2851-2859.
Haug, S., Rietz, T. and Maedche, A. (2021), “Accelerating deductive coding of qualitative data: an experimental study on the applicability of crowdsourcing”, in Schneegass, S., Pfleging, B. and Kern, D. (Eds), MuC ’21: Proceedings of Mensch und Computer 2021, Association for Computing Machinery, New York, NY, pp. 432-443.
Hung, S.-Y., Huang, Y.-W., Lin, C.-C., Chen, K. and Tarn, J.M. (2016), “Factors influencing business intelligence systems implementation success in the enterprises”, paper presented at the Pacific Asia Conference on Information Systems (PACIS), 27 June-1 July, Chiayi, Taiwan, available at: www.pacis2016.org/ (accessed 4 April 2023).
Hussein, W.N., Kamarudin, L.M., Hussain, H.N., Ishak, N.A., Zakaria, A. and Jadaa, K.J. (2019), “Discovering the implementation success factors for IoT and big data analytics in transportation system”, paper presented at the 5th International Conference on Man Machine Systems (ICoMMS), 26-27 August, Pulau Pinang, Malaysia, available at: https://iopscience.iop.org/article/10.1088/1757-899X/705/1/012049 (accessed 4 April 2023).
Hwang, M.I. and Xu, H. (2008), “A structural model of data warehousing success”, Journal of Computer Information Systems, Vol. 49 No. 1, pp. 48-56.
Hwang, H.G., Ku, C.Y., Yen, D.C. and Cheng, C.C. (2004), “Critical factors influencing the adoption of data warehouse technology: a study of the banking industry in Taiwan”, Decision Support Systems, Vol. 37 No. 1, pp. 1-21.
Janssen, M., van Veenstra, A.F. and van der Voort, H. (2013), “Management and failure of large transformation projects: factors affecting user adoption”, in Dwivedi, Y.K., Henriksen, H.Z., Wastell, D. and De’, R. (Eds), TDIT 2013: Grand Successes and Failures in IT. Public and Private Sectors. IFIP Advances in Information and Communication Technology, Vol. 402 Springer, Berlin, Heidelberg, pp. 121-135.
Junior, C.H., Oliveira, T. and Yanaze, M. (2019), “The adoption stages (evaluation, adoption, and routinization) of ERP systems with business analytics functionality in the context of farms”, Computers and Electronics in Agriculture, Vol. 156, pp. 334-348.
Käki, A., Kemppainen, K. and Liesiö, J. (2019), “What to do when decision-makers deviate from model recommendations?”, European Journal of Operational Research, Vol. 278 No. 3, pp. 869-882.
Katsikopoulos, K., Durbach, I. and Stewart, T. (2018), “When should we use simple decision models? A synthesis of various research strands”, Omega, Vol. 81, pp. 17-25.
Kauppi, K. (2013), “Extending the use of institutional theory in operations and supply chain management research: review and research suggestions”, International Journal of Operations and Production Management, Vol. 33 No. 10, pp. 1318-1345.
Ke, W., Liu, H., Wei, K.K., Gu, J. and Chen, H. (2009), “How do mediated and non-mediated power affect electronic supply chain management system adoption? The mediating effects of trust and institutional pressures”, Decision Support Systems, Vol. 46 No. 4, pp. 839-851.
Krumeich, J., Werth, D. and Loos, P. (2016), “Prescriptive control of business processes”, Business and Information Systems Engineering, Vol. 58 No. 4, pp. 261-280.
Kumar, V. and Garg, M.L. (2018), “Predictive analytics: a review of trends and techniques”, International Journal of Computer Applications, Vol. 182 No. 1, pp. 31-37.
LaValle, S., Lesser, E., Shockley, R., Hopkins, M.S. and Kruschwitz, N. (2011), “Big data, analytics and the path from insights to value”, MIT Sloan Management Review, Vol. 52 No. 2, pp. 21-32.
Larose, D.T. and Larose, C.D. (2015), Data Mining and Predictive Analytics, Wiley Series on Methods and Applications in Data Mining, Wiley.
Larson, D. and Chang, V. (2016), “A review and future direction of agile, business intelligence, analytics and data science”, International Journal of Information Management, Vol. 36 No. 5, pp. 700-710.
Lautenbach, P., Johnston, K. and Adeniran-Ogundipe, T. (2017), “Factors influencing business intelligence and analytics usage extent in South African organisations”, South African Journal of Business Management, Vol. 48 No. 3, pp. 23-33.
Lee, S. and Kim, K. (2007), “Factors affecting the implementation success of internet-based information systems”, Computers in Human Behavior, Vol. 23 No. 4, pp. 1853-1880.
Lee, D., Lee, S.M., Olson, D.L. and Chung, S.H. (2010), “The effect of organizational support on ERP implementation”, Industrial Management and Data Systems, Vol. 110 No. 2, pp. 269-283.
Lepenioti, K., Bousdekis, A., Apostolou, D. and Mentzas, G. (2020), “Prescriptive analytics: literature review and research challenges”, International Journal of Information Management, Vol. 50, pp. 57-70.
Liberatore, M.J. and Luo, W. (2010), “The analytics movement: implications for operations research”, Interfaces, Vol. 40 No. 4, pp. 313-324.
Liberatore, M.J., Hatchuel, A., Wei, B. and Stylianou, A.C. (2000), “An organizational change perspective on the value of modeling”, European Journal of Operational Research, Vol. 125 No. 1, pp. 184-194.
Lin, H.F. (2014), “Understanding the determinants of electronic supply chain management system adoption: using the technology-organization-environment framework”, Technological Forecasting and Social Change, Vol. 86, pp. 80-92.
Lingard, L. (2019), “Beyond the default colon: effective use of quotes in qualitative research”, Perspectives on Medical Education, Vol. 8 No. 6, pp. 360-364.
Lu, J., Chen, W., Ma, Y., Ke, J., Li, Z., Zhang, F. and Maciejewski, R. (2017), “Recent progress and trends in predictive visual analytics”, Frontiers of Computer Science, Vol. 11 No. 2, pp. 192-207.
Luoma, J. (2016), “Model-based organizational decision making: a behavioral lens”, European Journal of Operational Research, Vol. 249 No. 3, pp. 816-826.
Lustig, I., Dietrich, B., Johnson, C. and Dziekan, C. (2010), “The analytics journey”, Business Analytics, available at: https://pubsonline.informs.org/do/10.1287/LYTX.2010.06.01/full/ (accessed 4 April 2023).
Maltz, E.N., Murphy, K.E. and Hand, M.L. (2007), “Decision support for university enrollment management: implementation and experience”, Decision Support Systems, Vol. 44 No. 1, pp. 106-123.
Maroufkhani, P., Tseng, M.L., Iranmanesh, M., Ismail, W.K.W. and Khalid, H. (2020), “Big data analytics adoption: determinants and performances among small to medium-sized enterprises”, International Journal of Information Management, Vol. 54.
Martins, R., Oliveira, T. and Thomas, M.A. (2016), “An empirical analysis to assess the determinants of SaaS diffusion in firms”, Computers in Human Behavior, Vol. 62, pp. 19-33.
Matende, S. and Ogao, P. (2013), “Enterprise resource planning (ERP) system implementation: a case for user participation”, Procedia Technology, Vol. 9, pp. 518-526.
McAfee, A. and Brynjolfsson, E. (2012), “Big data: the management revolution”, Harvard Business Review, Vol. 90 No. 10, pp. 60-68.
Mortenson, M.J., Doherty, N.F. and Robinson, S. (2015), “Operational research from Taylorism to terabytes: a research agenda for the analytics age”, European Journal of Operational Research, Vol. 241 No. 3, pp. 583-595.
Muacevic, A. and Adler, J.R. (2020), “Making data reports useful: from descriptive to predictive”, Cureus, Vol. 12 No. 10, p. e10920, available at: www.ncbi.nlm.nih.gov/pmc/articles/PMC7657442/ (accessed 4 April 2023).
Oesterreich, T.D., Anton, E., Teuteberg, F. and Dwivedi, Y.K. (2022), “The role of the social and technical factors in creating business value from big data analytics: a meta-analysis”, Journal of Business Research, Vol. 153, pp. 128-149.
Orlandi, L.B., Zardini, A. and Rossignoli, C. (2020), “Organizational technological opportunism and social media: the deployment of social media analytics to sense and respond to technological discontinuities”, Journal of Business Research, Vol. 112, pp. 385-395.
Petter, S., DeLone, W. and McLean, E.R. (2013), “Information systems success: the quest for the independent variables”, Journal of Management Information Systems, Vol. 29 No. 4, pp. 7-62.
Poornima, S. and Pushpalatha, M. (2020), “A survey on various applications of prescriptive analytics”, International Journal of Intelligent Networks, Vol. 1, pp. 76-84.
Potdar, P.R. and Rane, S.B. (2017), “Exploring success factors for effective implementation of business analytics”, paper presented at the Changing Technology and Rural Development (CTRD) Conference, 22 December, Ratnagiri, India, available at: www.researchgate.net/publication/322538245_Exploring_success_factors_for_effective_implementation_of_Business_Analytics (accessed 4 April 2023).
Puklavec, B., Oliveira, T. and Popovič, A. (2018), “Understanding the determinants of business intelligence system adoption stages: an empirical study of SMEs”, Industrial Management and Data Systems, Vol. 118 No. 1, pp. 236-261.
Raghupathi, W. and Raghupathi, V. (2014), “Big data analytics in healthcare: promise and potential”, Health Information Science and Systems, Vol. 2 No. 1, pp. 1-10, available at: www.researchgate.net/publication/272830136_Big_data_analytics_in_healthcare_Promise_and_potential (accessed 4 April 2023).
Ransbotham, S., Kiron, D. and Prentice, P.K. (2016), “Beyond the hype: the hard work behind analytics success”, MIT Sloan Management Review, Vol. 57 No. 3, pp. 3-16.
Reitsma, E. and Hilletofth, P. (2018), “Critical success factors for ERP system implementation: a user perspective”, European Business Review, Vol. 30 No. 3, pp. 285-310.
Rezaie, S., Mirabedini, S.J. and Abtahi, A. (2017), “Identifying key effective factors on the implementation process of business intelligence in the banking industry of Iran”, Journal of Intelligence Studies in Business, Vol. 7 No. 3, pp. 5-24.
Saldaña, J. (2009), The Coding Manual for Qualitative Researchers, Sage, Thousand Oaks, CA.
Šikšnys, L. and Pedersen, T.B. (2009), “Prescriptive analytics”, in Liu, L. and Özsu, M. (Eds), Encyclopedia of Database Systems, Springer, New York, NY.
Soja, P. and Paliwoda-Pekosz, G. (2009), “What are real problems in enterprise system adoption?”, Industrial Management and Data Systems, Vol. 109 No. 5, pp. 610-627.
Soltanpoor, R. and Sellis, T. (2016), “Prescriptive analytics for big data”, in Cheema, M.A., Zhang, W. and Chang, L. (Eds), ADC 2016: 27th Australasian Database Conference. Databases Theory and Applications, LNCS, Vol. 9877, Springer International Publishing, Sydney, pp. 245-256.
Somers, T.M. and Nelson, K. (2001), “The impact of critical success factors across the stages of enterprise resource planning implementations”, paper presented at the 34th Annual Hawaii International Conference on System Sciences (HICSS), 3-6 January, Maui, Hawaii, available at: www.researchgate.net/publication/267922937_The_impact_of_critical_success_factors_across_the_stages_of_ERP_implementations (accessed 4 April 2023).
Thuethongchai, N., Taiphapoon, T., Chandrachai, A. and Triukose, S. (2020), “Adopt big-data analytics to explore and exploit the new value for service innovation”, Social Sciences, Vol. 9 No. 3, pp. 1-17.
Tim, Y., Hallikainen, P., Pan, S.L. and Tamm, T. (2020), “Actualizing business analytics for organizational transformation: a case study of Rovio entertainment”, European Journal of Operational Research, Vol. 281 No. 3, pp. 642-655.
Trieu, V.-H. (2017), “Getting value from business intelligence systems: a review and research agenda”, Decision Support Systems, Vol. 93, pp. 111-124.
Vidgen, R., Shaw, S. and Grant, D.B. (2017), “Management challenges in creating value from business analytics”, European Journal of Operational Research, Vol. 261 No. 2, pp. 626-639.
Villamarín, J.M. and Diaz Pinzon, B. (2017), “Key success factors to business intelligence solution implementation”, Journal of Intelligence Studies in Business, Vol. 7 No. 1, pp. 48-69.
Vu, O.T.K., Alonso, A.D., Solis, M.A.B., Goyzueta, S., Nguyen, T., McClelland, R., Tran, T.D., Nguyen, N., Huynh, H.T.N. and Atay, E. (2023), “A dynamic capabilities approach of industry 4.0: the experiences of managers operating in two emerging economies”, European Business Review, Vol. 35 No. 2, pp. 137-160.
Wamba, S.F., Akter, S., Edwards, A., Chopin, G. and Gnanzou, D. (2015), “How ‘big data’ can make big impact: findings from a systematic review and a longitudinal case study”, International Journal of Production Economics, Vol. 165, pp. 234-246.
Wamba, S.F. and Queiroz, M.M. (2020), “Industry 4.0 and the supply chain digitalisation: a blockchain diffusion perspective”, Production Planning and Control, Vol. 33 Nos 2/3, pp. 193-210.
Wang, Y. and Hajli, N. (2017), “Exploring the path to big data analytics success in healthcare”, Journal of Business Research, Vol. 70, pp. 287-299.
Ward, M.J., Marsolo, K.A. and Froehle, C.M. (2014), “Applications of business analytics in healthcare”, Business Horizons, Vol. 57 No. 5, pp. 571-582.
Weston, C., Gandell, T., Beauchamp, J., McAlpine, L., Wiseman, C. and Beauchamp, C. (2001), “Analyzing interview data: the development and evolution of a coding system”, Qualitative Sociology, Vol. 24 No. 3, pp. 381-400.
Wilder, C.R. and Ozgur, C.O. (2015), “Business analytics curriculum for undergraduate majors”, INFORMS Transactions on Education, Vol. 15 No. 2, pp. 180-187.
Xu, H. and Hwang, M.I. (2007), “The effect of implementation factors on data warehousing success: an exploratory study”, Journal of Information, Information Technology, and Organizations, Vol. 2, pp. 1-14.
Xu, Z., Frankwick, G.L. and Ramirez, E. (2016), “Effects of big data analytics and traditional marketing analytics on new product success: a knowledge fusion perspective”, Journal of Business Research, Vol. 69 No. 5, pp. 1562-1566.
Yeoh, W., Koronios, A. and Gao, J. (2008), “Managing the implementation of business intelligence systems: a critical success factors framework”, International Journal of Enterprise Information Systems, Vol. 4 No. 3, pp. 79-94.
Yin, R.K. (2014), Case Study Research Design and Methods, Sage, Thousand Oaks, CA.
Zhang, Z., Lee, M.K., Huang, P., Zhang, L. and Huang, X. (2005), “A framework of ERP systems implementation success in China: an empirical study”, International Journal of Production Economics, Vol. 98 No. 1, pp. 56-80.
Zhu, K., Kraemer, K.L. and Xu, S. (2006), “The process of innovation assimilation by firms in different countries: a technology diffusion perspective on E-Business”, Management Science, Vol. 52 No. 10, pp. 1557-1576.
Acknowledgements
The authors wish to thank the case organizations and their personnel for participating in our study. Marjut Hirvonen would also like to acknowledge the financial support from the Jenny and Antti Wihuri Foundation for this research (grants 00190089 and 00200096).
About the authors
Marjut Hirvonen is a doctoral candidate in Logistics at the Aalto University School of Business. She obtained her MSc from Aalto University School of Chemical Engineering in 2014 and has worked at Neste Corporation. Her research interests are in prescriptive analytics benefits and deployment in organizations. She conducts research on these topics as well as works in related fields as a practitioner at Neste Corporation.
Katri Kauppi is a tenured Associate Professor in Logistics at the Aalto University School of Business. She obtained her PhD from Helsinki School of Economics in 2009 and worked at Manchester Business School and Nottingham University Business School before rejoining Aalto. Her main research interests lie in the area of organizational purchasing behavior, contract and systems compliance, public procurement and socially sustainable supply chains. Kauppi is also an Associate Editor of the Journal of Purchasing and Supply Management. She has previously published under her maiden name Karjalainen.
Juuso Liesiö is a tenured Associate Professor of Management Science at the Aalto University School of Business and a Visiting Fellow of the Loughborough University School of Business and Economics. Prof. Liesiö’s research interests lie in decision analysis and prescriptive analytics with a focus on optimization approaches for handling incomplete preference information and uncertainties in decision support as well as on axiomatic preference theory. This research has been widely applied in practice, for instance, to support resource allocation decisions in environmental management, infrastructure asset management and production planning.