Search results

1 – 10 of over 47000
Article
Publication date: 21 May 2020

Mohammad I. Merhi and Klajdi Bregu

This study aims to achieve three goals: present a holistic, flexible and dynamic model; define the model’s factors and explain how these factors lead to effective and efficient…

Abstract

Purpose

This study aims to achieve three goals: present a holistic, flexible and dynamic model; define the model’s factors and explain how these factors lead to effective and efficient usage of big data; and generate indexes based on experts’ input to rank them based on their importance.

Design/methodology/approach

This paper uses the analytic hierarchy process, a quantitative decision-making method, to evaluate the importance of the factors presented in the model. The fundamental principle of the overall model is that of a dynamo, a concept borrowed from electromagnetic physics. The model is also based on three information systems (IS) theories.
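
To illustrate the mechanics behind an analytic hierarchy process ranking, the sketch below derives priority weights and a consistency ratio from a single pairwise comparison matrix. The three factors and all judgement values are hypothetical placeholders, not the experts’ data used in the study.

```python
# Minimal AHP sketch: derive priority weights from a pairwise comparison
# matrix using the principal eigenvector, then check judgement consistency.
# The 3x3 matrix below is illustrative only, not the study's expert data.
import numpy as np

# Saaty-scale pairwise judgements for three hypothetical factors:
# technology, governance and user factors.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalised priority weights

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
cr = ci / ri

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))   # CR < 0.10 is conventionally acceptable
```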

Findings

Technological advancements and data security are among the most important factors that may affect the effectiveness and efficiency of big data usage. Authentication, governments’ focus on big data, and transparency and accountability are the most important techno-centric, governmental-centric and user-centric factors, respectively.

Research limitations/implications

The findings of this paper confirmed earlier findings in the literature and quantitatively assessed some of the factors that were conceptually presented. This paper also presented a framework that can be used in future studies.

Practical implications

Policy-makers and decision-makers may need to upgrade pertinent technologies such as internet security, frame policies toward information technology (IT) and train users.

Originality/value

This paper fills a gap in the literature by presenting a comprehensive study of how different factors dynamically contribute to the effective usage of big data in the public sector. It also quantitatively presents the importance of the factors based on the data collected from 12 IT experts.

Details

Transforming Government: People, Process and Policy, vol. 14 no. 4
Type: Research Article
ISSN: 1750-6166

Article
Publication date: 4 August 2022

Anup Kumar, Santosh Kumar Shrivastav and Subhajit Bhattacharyya

This study proposes a methodology based on data source triangulation to measure the “strategic fit” for the automotive supply chain.

Abstract

Purpose

This study proposes a methodology based on data source triangulation to measure the “strategic fit” for the automotive supply chain.

Design/methodology/approach

First, the authors measured the responsiveness of the Indian automobile supply chain, encompassing the top ten automobile manufacturers, using both sentiment and conjoint analysis. Second, the authors used data envelopment analysis to identify the frontiers of the supply chain and measured the supply chain’s efficiency using balance sheet data. Finally, the authors analyzed the “strategic fit” zone and discussed the results.
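
As a rough illustration of the data envelopment analysis step, the sketch below solves the input-oriented CCR envelopment model with a generic linear-programming routine. The decision-making units, inputs and outputs are made-up stand-ins, not the balance-sheet figures used in the study.

```python
# Minimal input-oriented CCR DEA sketch using linear programming.
# The inputs and outputs below are invented numbers, not the study's data.
import numpy as np
from scipy.optimize import linprog

# Rows = DMUs (e.g. manufacturers); columns = inputs / outputs.
X = np.array([[4.0, 140.0], [6.0, 120.0], [9.0, 210.0], [5.0, 100.0]])  # inputs
Y = np.array([[2.0], [3.0], [5.0], [2.5]])                              # outputs

n, m = X.shape          # number of DMUs, number of inputs
s = Y.shape[1]          # number of outputs

def ccr_efficiency(o: int) -> float:
    """Efficiency of DMU o: minimise theta such that a non-negative combination
    of all DMUs uses at most theta * inputs of o and produces at least its outputs."""
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                           # input constraints
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                           # output constraints
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```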

Findings

The results indicate that both the proposed methods yield similar outcomes in terms of strategic fitment.

Practical implications

The study outcomes facilitate measuring strategic fit and thereby help align the resources available. The proposed methodology is easy both to use and to put into practice, and it saves time and cost by removing the need to hire external agencies to appraise strategic fit. The resulting measure of strategic fit can serve as feedback for strategic actions and could also be incorporated as an operative measurement and control tool.

Originality/value

Data triangulation meaningfully enhances the accuracy and reliability of strategic fit analyses. It leads to actionable insights for top managers and supports their strategic positioning within a supply chain.

Details

International Journal of Productivity and Performance Management, vol. 72 no. 10
Type: Research Article
ISSN: 1741-0401

Article
Publication date: 6 May 2020

Lucy Wachera Kibe, Tom Kwanya and Ashah Owano

Big data analytics is a set of procedures and technologies that entails new forms of integration to uncover large unknown values from large data sets that are varied, complex and…

Abstract

Purpose

Big data analytics is a set of procedures and technologies that entails new forms of integration to uncover large unknown values from large data sets that are varied, complex and of an immense scale. The use of big data analytics is generally considered to improve organisational performance. However, this depends on the capabilities of different organisations to provide the resources required for big data analytics. This study aims to investigate the influence of big data analytics on the organisational performance of the Technical University of Kenya (TUK) and Strathmore University (SU).

Design/methodology/approach

This study used a mixed methods approach to enable a deep understanding of the concept. Primary data was collected through structured questionnaires and interviews with clientele and information communication technology staff from the TUK and SU, both in Nairobi, Kenya. Secondary data was also collected through interviews and questionnaires. Data was analysed and presented using descriptive statistics.

Findings

The findings revealed that most of the variables of organisational performance, such as innovativeness, creativeness, effectiveness, productiveness and efficiency, are affected positively by conducting big data analytics in both institutions. The results demonstrate that the TUK showed a negative relationship between big data analytics and competitiveness and profitability, while SU showed a positive relationship between these variables. In terms of regression analysis, the findings revealed that SU showed a good relationship between the independent and dependent variables, while the relationship at the TUK was weak.

Originality/value

This study is original in terms of its subject matter, scope and application.

Details

Global Knowledge, Memory and Communication, vol. 69 no. 6/7
Type: Research Article
ISSN: 2514-9342

Article
Publication date: 26 September 2022

Christian Nnaemeka Egwim, Hafiz Alaka, Oluwapelumi Oluwaseun Egunjobi, Alvaro Gomes and Iosif Mporas

This study aims to compare and evaluate the application of commonly used machine learning (ML) algorithms used to develop models for assessing energy efficiency of buildings.

Abstract

Purpose

This study aims to compare and evaluate the application of commonly used machine learning (ML) algorithms used to develop models for assessing energy efficiency of buildings.

Design/methodology/approach

This study first combined building energy efficiency ratings from several data sources and used them to create predictive models with a variety of ML methods. Second, to test the hypothesis that ensemble techniques outperform single methods, this study designed a hybrid stacking ensemble approach based on the best-performing bagging and boosting ensemble methods generated from its predictive analytics.
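
The following sketch shows one way such a stacking ensemble could be wired up in scikit-learn, with extra trees and a random forest standing in for the bagging side and gradient boosting for the boosting side. The synthetic data and the choice of meta-learner are illustrative assumptions, not the models or ratings data used in the paper.

```python
# Sketch of a hybrid stacking ensemble combining bagging- and boosting-style
# base learners under a linear meta-learner. The synthetic regression data
# stands in for the building energy-efficiency ratings, which are not shown here.
from sklearn.datasets import make_regression
from sklearn.ensemble import (ExtraTreesRegressor, GradientBoostingRegressor,
                              RandomForestRegressor, StackingRegressor)
from sklearn.linear_model import RidgeCV
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=12, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("extra_trees", ExtraTreesRegressor(n_estimators=200, random_state=0)),
        ("random_forest", RandomForestRegressor(n_estimators=200, random_state=0)),
        ("boosting", GradientBoostingRegressor(random_state=0)),
    ],
    final_estimator=RidgeCV(),     # meta-learner combines base predictions
    cv=5,                          # out-of-fold predictions avoid leakage
)
stack.fit(X_train, y_train)
pred = stack.predict(X_test)
print("R2:", round(r2_score(y_test, pred), 3))
print("MAE:", round(mean_absolute_error(y_test, pred), 2))
```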

Findings

Based on performance evaluation metric scores, the extra trees model was shown to be the best predictive model. More importantly, this study demonstrated that the combined result of ensemble ML algorithms is generally better in terms of prediction accuracy than that of a single method. Finally, stacking was found to be a superior ensemble approach to bagging and boosting for analysing building energy efficiency.

Research limitations/implications

While the proposed contemporary method of analysis is assumed to be applicable to assessing the energy efficiency of buildings within the sector, the specific data transformation used in this study may not, as is typical of any data-driven model, be transferable to data from regions other than the UK.

Practical implications

This study aids in the initial selection of appropriate and high-performing ML algorithms for future analysis. It also assists building managers, residents, government agencies and other stakeholders in better understanding the contributing factors and in making better decisions about building energy performance. Furthermore, once this novel model is integrated into an energy monitoring system, it will assist the general public in proactively identifying buildings with high energy demands, potentially lowering energy costs by promoting avoidance behaviour, and will assist government agencies in making informed decisions about energy tariffs.

Originality/value

This study fills a gap in the literature on how to select appropriate ML algorithms for assessing building energy efficiency. More importantly, this study demonstrated that the combined result of ensemble ML algorithms is generally better in terms of prediction accuracy than that of a single method.

Details

Journal of Engineering, Design and Technology, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1726-0531

Article
Publication date: 15 July 2020

Aras Okuyucu and Nilay Yavuz

Despite several big data maturity models developed for businesses, assessment of big data maturity in the public sector is an under-explored yet important area. Accordingly, the…

Abstract

Purpose

Despite several big data maturity models developed for businesses, assessment of big data maturity in the public sector is an under-explored yet important area. Accordingly, the purpose of this study is to identify the big data maturity models developed specifically for the public sector and evaluate two major big data maturity models in that respect: one at the state level and the other at the organizational level.

Design/methodology/approach

A literature search is conducted using Web of Science and Google Scholar to determine big data maturity models explicitly addressing big data adoption by governments, and then two major models are identified and compared: Klievink et al.’s Big Data maturity model and Kuraeva’s Big Data maturity model.

Findings

While Klievink et al.’s model is designed to evaluate Big Data maturity at the organizational level, Kuraeva’s model is appropriate for assessments at the state level. The first model sheds light on micro-level factors, considering the specific data collection routines and requirements of public organizations, whereas the second provides a general framework covering the conditions necessary for a government’s big data maturity, such as the legislative framework and national policy dimensions (strategic plans and actions).

Originality/value

This study contributes to the literature by identifying and evaluating the models specifically designed to assess big data maturity in the public sector. Based on the review, it provides insights about the development of integrated models to evaluate big data maturity in the public sector.

Details

Transforming Government: People, Process and Policy, vol. 14 no. 4
Type: Research Article
ISSN: 1750-6166

Article
Publication date: 9 January 2023

Ayman Wael AL-Khatib

This study investigates the impact of big data analytics capabilities on export performance. Moreover, it assesses the mediating effect of the supply chain innovation and…

Abstract

Purpose

This study investigates the impact of big data analytics capabilities on export performance. Moreover, it assesses the mediating effect of the supply chain innovation and moderating effect of supply chain agility.

Design/methodology/approach

This study is based on primary data that were collected from the manufacturing sector operating in Jordan. A total of 327 responses were used for the final data analysis. Data analysis was performed via a partial least squares structural equation modeling (PLS-SEM) approach.
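
The paper tests its moderated-mediation model with PLS-SEM; the sketch below does not replicate that estimator but illustrates the underlying path logic with plain regressions on simulated composite scores, so all variable names, data and effect sizes here are hypothetical.

```python
# Simplified regression-based illustration of the moderated-mediation logic
# (not the PLS-SEM estimation used in the paper). All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 327                                           # same sample size as the study
bda = rng.normal(size=n)                          # big data analytics capabilities
agility = rng.normal(size=n)                      # supply chain agility (moderator)
innovation = 0.5 * bda + rng.normal(size=n)       # supply chain innovation (mediator)
export = (0.3 * bda + 0.4 * innovation
          + 0.2 * innovation * agility + rng.normal(size=n))
df = pd.DataFrame({"bda": bda, "agility": agility,
                   "innovation": innovation, "export": export})

# Path a: capabilities -> innovation; paths b and c' plus the interaction term.
m_a = smf.ols("innovation ~ bda", data=df).fit()
m_b = smf.ols("export ~ bda + innovation * agility", data=df).fit()

print("a-path:", round(m_a.params["bda"], 3))
print(m_b.params[["bda", "innovation", "innovation:agility"]])
```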

Findings

The results of the data analysis supported a positive relationship between big data analytics capabilities and export performance, as well as a mediating effect of supply chain innovation. It was also confirmed that supply chain agility moderates the relationship between supply chain innovation and export performance.

Originality/value

This study developed a theoretical and empirical model to investigate the relationship between big data analytics capabilities, export performance, supply chain innovation and supply chain agility. This study offers new theoretical and managerial contributions that add value to the supply chain management literature by testing the moderated-mediated model of these constructs in the manufacturing sector in Jordan.

Details

International Journal of Emerging Markets, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1746-8809

Article
Publication date: 4 February 2019

Riccardo Rialti, Giacomo Marzi, Cristiano Ciappei and Donatella Busso

Recently, several manuscripts about the effects of big data on organizations used dynamic capabilities as their main theoretical approach. However, these manuscripts still lack…

Abstract

Purpose

Recently, several manuscripts about the effects of big data on organizations used dynamic capabilities as their main theoretical approach. However, these manuscripts still lack systematization. Consequently, the purpose of this paper is to systematize the literature on big data and dynamic capabilities.

Design/methodology/approach

A bibliometric analysis was performed on 170 manuscripts extracted from the Clarivate Analytics Web of Science Core Collection database. The bibliometric analysis was integrated with a literature review.
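
As a toy illustration of the clustering idea behind such a bibliometric analysis, the sketch below builds a keyword co-occurrence graph from a handful of invented keyword lists and detects communities in it. The actual tool chain and the 170-manuscript corpus used in the paper are not reproduced here.

```python
# Toy sketch of co-occurrence clustering of the kind used in bibliometric
# mapping: build a keyword co-occurrence graph from (hypothetical) manuscript
# keyword lists, then detect communities with a modularity-based algorithm.
import itertools
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

manuscripts = [
    ["big data", "supply chain", "dynamic capabilities"],
    ["big data", "knowledge management"],
    ["dynamic capabilities", "decision making", "analytics"],
    ["supply chain", "analytics"],
    ["knowledge management", "decision making"],
]

G = nx.Graph()
for keywords in manuscripts:
    for a, b in itertools.combinations(sorted(set(keywords)), 2):
        # Increment the edge weight each time two keywords co-occur.
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

for i, community in enumerate(greedy_modularity_communities(G, weight="weight")):
    print(f"cluster {i}: {sorted(community)}")
```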

Findings

The bibliometric analysis revealed four clusters of papers on big data and dynamic capabilities: big data and supply chain management; knowledge management; decision making; and business process management and big data analytics. The systematic literature review helped to clarify each cluster’s content.

Originality/value

To the authors’ best knowledge, minimal attention has been paid to systematizing the literature on big data and dynamic capabilities.

Details

Management Decision, vol. 57 no. 8
Type: Research Article
ISSN: 0025-1747

Book part
Publication date: 21 October 2019

Roxana Mihet and Thomas Philippon

The authors analyze the expansion of Big Data and artificial intelligence technologies from the perspective of economic theory. The authors argue that these technologies can be…

Abstract

The authors analyze the expansion of Big Data and artificial intelligence technologies from the perspective of economic theory. The authors argue that these technologies can be viewed from three perspectives: (1) as an intangible asset; (2) as a search and matching technology; and (3) as a forecasting technology. These points of view shed light on how new technologies are likely to affect matching between firms and consumers, productivity growth, price discrimination, competition, inequality among firms, and inequality among workers.

Details

Disruptive Innovation in Business and Finance in the Digital World
Type: Book
ISBN: 978-1-78973-381-5

Article
Publication date: 23 April 2020

Ajree Ducol Malawani, Achmad Nurmandi, Eko Priyo Purnomo and Taufiqur Rahman

This paper aims to examine tweet posts regarding Typhoon Washi to argue for the usefulness of social media and big data as an aid to post-disaster management. Through topic…

Abstract

Purpose

This paper aims to examine tweet posts regarding Typhoon Washi to argue for the usefulness of social media and big data as an aid to post-disaster management. Through topic modelling and content analysis, this study examines the priorities of the victims expressed on Twitter and how those priorities changed over a year.

Design/methodology/approach

Social media, particularly Twitter, was where the data were gathered. Using big data technology, the gathered data were processed and analysed according to the objectives of the study. Topic modelling was used to cluster words into different topics. The clustered words were then used for content analysis to determine the needs of the victims. A word frequency count was also used to determine which words were used repeatedly over the period studied. To validate the data gathered online, government documents were requested and the concerned government agencies were interviewed.
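
The sketch below illustrates the topic-modelling and word-frequency steps on a few placeholder tweet-like strings. The library choice, the number of topics and the example texts are assumptions for demonstration and do not reproduce the study’s Typhoon Washi data.

```python
# Minimal sketch of the topic-modelling and word-frequency steps described
# above, applied to placeholder tweet-like strings (not the original tweets).
from collections import Counter
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

tweets = [
    "families need housing after the flood",
    "relief goods needed at the evacuation center",
    "water and relief goods for evacuees",
    "rebuilding houses for displaced families",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(tweets)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for t, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]   # top words per topic
    print(f"topic {t}: {top}")

# Simple word-frequency count across the corpus.
counts = Counter(w for doc in tweets for w in doc.split())
print(counts.most_common(5))
```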

Findings

The findings of this study indicate that housing and relief goods have been the top priorities of the victims. Victims are seeking relief goods, especially when they are in evacuation centres. Also, the lack of a legal basis hinders government officials from integrating social media information into policymaking.

Research limitations

This study only reports Twitter posts containing the keywords Sendong, SendongPH, Washi or TyphoonWashi. The keywords were determined based on the words that trended after Typhoon Washi struck.

Practical implications

For social media and big data to be adopted and efficacious, supporting and facilitating conditions are necessary. Structural, technical and financial support, as well as a legal framework, should be in place, and a positive attitude towards these tools should be maintained and sustained.

Originality/value

Although many studies have been conducted on the usefulness of social media in times of disaster, many of these focused on the use of social media as a medium that can efficiently spread information, and little has been done on how governments can use both social media and big data to collect and analyse the needs of the victims. This study fills those gaps in the social big data literature.

Details

Transforming Government: People, Process and Policy, vol. 14 no. 2
Type: Research Article
ISSN: 1750-6166

Article
Publication date: 24 December 2021

Lígia Lobo Mesquita, Fabiane Letícia Lizarelli, Susana Duarte and Pedro Carlos Oprime

This paper aims to thoroughly identify the forms of integration between Lean, Industry 4.0 (I4.0) and environmental sustainability (ES) by examining the relationships between…

Abstract

Purpose

This paper aims to thoroughly identify the forms of integration between Lean, Industry 4.0 (I4.0) and environmental sustainability (ES) by examining the relationships between these three constructs, deepening understanding of the theme and developing a framework that can aid the management of industrial production processes.

Design/methodology/approach

A systematic literature review (SLR) was the method used to identify the relationships for integration in the current literature. The SLR was supported by content and cluster analysis. The analyses identified relationships at two levels: the first at the level of constructs and variables, and the second at the level of constructs and components, which detail the variables. This study also proposes an integrated conceptual framework showing these relationships at the construct, variable and component levels.

Findings

The results show how these three constructs are related, and the study concludes that there is stronger integration between I4.0 technologies and Lean practices for reaching ES. The SLR identified the main components that allow for this integration, i.e. I4.0 technologies such as Big Data and the internet of things, and Lean practices such as waste reduction and attention to customer needs.

Practical implications

From an academic standpoint, this study proposes new lines of research that have not been explored thus far and can be developed via empirical studies at the strategic and operational levels across different industrial sectors. This study can also help managers understand the integration between Lean practices and I4.0 technologies to achieve better operational and environmental organizational results.

Originality/value

To the best of the authors’ knowledge, this study is the first of its kind to use an SLR to integrate Lean approaches, ES and I4.0 and to propose a unified framework that helps managers and academics understand these relationships.

Details

International Journal of Lean Six Sigma, vol. 13 no. 4
Type: Research Article
ISSN: 2040-4166
