Search results
Hany Elbardan, Donald Nordberg and Vikash Kumar Sinha
Abstract
Purpose
This study aims to examine how the legitimacy of internal auditing is reconstructed during enterprise resource planning (ERP)-driven technological change.
Design/methodology/approach
The study is based on the comparative analysis of internal auditing and its transformation due to ERP implementations at two case firms operating in the food sector in Egypt – one a major Egyptian multinational corporation (MNC) and the other a major domestic company (DC).
Findings
Internal auditors (IAs) at MNC saw ERP implementation as an opportunity to reconstruct the legitimacy of internal auditing work by engaging and partnering with actors involved with the ERP change. In doing so, the IAs acquired system certifications and provided line functions and external auditors with data-driven business insights. The “practical coping mechanism” adopted by the IAs led to the acceptance (and legitimacy) of their work. In contrast, IAs at DC adopted a purposeful strategy of disengaging, blaming and rejecting since they were skeptical of the top management team's (TMT's) sincerity. The “disinterestedness” led to the loss of legitimacy in the eyes of the stakeholders.
Originality/value
The article offers two contributions. First, it extends the literature by highlighting a spectrum of behavior displayed by IAs (coping with impending issues vs strategic purposefulness) during ERP-driven technological change. Second, the article contributes to the literature on legitimacy by highlighting four intertwined micro-processes – participating, socializing, learning and role-forging – that contribute to reconstructing the legitimacy of internal auditing.
Abstract
Purpose
This study aims to examine the scholarly impact of funded and non-funded research published in ten core library and information science (LIS) journals published in 2016.
Design/methodology/approach
In total, ten high-impact LIS journals were selected using Google Scholar metrics. The source title of each selected journal was searched in the Scopus database to retrieve the articles published in 2016. The detailed records of the retrieved articles for each journal were exported as a CSV file, and the per-journal CSV files were then merged into a single MS Excel file for data analysis.
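The export-and-merge step described above can be sketched in a few lines of Python. The file contents and column names below are illustrative stand-ins, not the study's actual Scopus export schema.

```python
import csv
import io

# Illustrative stand-ins for per-journal Scopus CSV exports
# (hypothetical columns; a real export has many more fields).
journal_csvs = [
    "Title,Cited by\nPaper A,12\nPaper B,3\n",
    "Title,Cited by\nPaper C,40\n",
]

def merge_exports(csv_texts):
    """Merge several CSV exports into a single list of row dicts."""
    merged = []
    for text in csv_texts:
        merged.extend(csv.DictReader(io.StringIO(text)))
    return merged

rows = merge_exports(journal_csvs)
print(len(rows))  # 3 records across both exports
```

In practice the merged rows would then be loaded into a single spreadsheet or data frame for the citation analysis.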
Findings
The study analyzed 1,064 publications and found that 14% of them were funded research articles. Funded articles received higher average citation counts (24.56) compared to non-funded articles (20.49). Funded open-access articles had a higher scholarly impact than funded closed-access articles. The research area with the most funded articles was “Bibliometrics,” which also received the highest number of citations (1,676) with an average citation count of 24.64. The National Natural Science Foundation of China funded the most papers (30), while the USA funded the highest number of research publications (36) in the field of LIS.
Practical implications
This study highlights the importance of securing funding, open access publishing, discipline-specific differences, diverse funding sources and aiming for higher citations. Researchers, practitioners and policymakers can use these findings to enhance research impact in LIS.
Originality/value
This study explores the impact of funding on LIS research and provides valuable insights into the intricate relationship between funding and research impact.
Koraljka Golub, Osma Suominen, Ahmed Taiye Mohammed, Harriet Aagaard and Olof Osterman
Abstract
Purpose
In order to estimate the value of semi-automated subject indexing in operative library catalogues, the study aimed to investigate five different automated implementations of an open source software package on a large set of Swedish union catalogue metadata records, with Dewey Decimal Classification (DDC) as the target classification system. It also aimed to contribute to the body of research on aboutness and related challenges in automated subject indexing and evaluation.
Design/methodology/approach
On a sample of over 230,000 records with close to 12,000 distinct DDC classes, the open source tool Annif, developed by the National Library of Finland, was applied in the following implementations: lexical algorithm, support vector classifier, fastText, Omikuji Bonsai and an ensemble approach combining the former four. A qualitative study involving two senior catalogue librarians and three students of library and information studies was also conducted to investigate the value and inter-rater agreement of automatically assigned classes, on a sample of 60 records.
Findings
The best results were achieved using the ensemble approach, which reached 66.82% accuracy on the three-digit DDC classification task. The qualitative study confirmed earlier studies reporting low inter-rater agreement but also pointed to the potential value of automatically assigned classes as additional access points in information retrieval.
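The three-digit accuracy reported above is an exact-match rate over DDC codes truncated to their main class. A minimal sketch, with hypothetical class codes rather than the study's data:

```python
def three_digit_accuracy(predicted, actual):
    """Exact-match rate of DDC classes truncated to the three-digit level."""
    hits = sum(p[:3] == a[:3] for p, a in zip(predicted, actual))
    return hits / len(actual)

# Hypothetical predicted vs catalogued DDC classes.
pred = ["025.04", "940.53", "511.3", "781.66"]
true = ["025.17", "940.54", "510.2", "781.66"]
print(three_digit_accuracy(pred, true))  # 0.75
```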
Originality/value
The paper presents an extensive study of automated classification in an operative library catalogue, accompanied by a qualitative study of automated classes. It demonstrates the value of applying semi-automated indexing in operative information retrieval systems.
Osamu Tsukada, Ugo Ibusuki, Shigeru Kuchii and Anderson Tadeu de Santi Barbosa de Almeida
Abstract
Purpose
The purpose of this study is to explore the relationship between Lean manufacturing and Industry 4.0 for small and medium-sized enterprises (SMEs) in Japan and Brazil.
Design/methodology/approach
The authors conducted a quantitative survey (20 companies in Japan and 30 companies in Brazil) combined with a qualitative interview (2 companies in Japan and 15 companies in Brazil).
Findings
According to the quantitative study, 90% of the companies practice Lean manufacturing and 40% practice Industry 4.0. In the qualitative study in Brazil, four managers responded that Lean manufacturing is a prerequisite for Industry 4.0, since a production process with waste cannot be productive even with sophisticated digitalization technology.
Originality/value
The authors further explored the relationship between “defensive Digital Transformation (DX),” which is based mainly on Lean manufacturing, and “offensive DX,” which relates to customer value creation through Industry 4.0. This study clarifies the relationship and serves as a roadmap for developing manufacturing from its current status toward the vision of Industry 4.0.
Vinayambika S. Bhat, Thirunavukkarasu Indiran, Shanmuga Priya Selvanathan and Shreeranga Bhat
Abstract
Purpose
The purpose of this paper is to propose and validate a robust industrial control system. The aim is to design a Multivariable Proportional Integral controller that accommodates multiple responses while considering the process's control and noise parameters. In addition, this paper intends to develop a multidisciplinary approach by combining computational science, control engineering and statistical methodologies to ensure a resilient process with the best use of available resources.
Design/methodology/approach
Taguchi's robust design methodology and multi-response optimisation approaches are adopted to meet the research aims. A Two-Input-Two-Output transfer function model of the distillation column system is investigated. In designing the control system, the Steady State Gain Matrix and process factors such as the time constant (τ) and time delay (θ) are also used. The unique methodology is implemented and validated using the pilot plant's distillation column. To determine the robustness of the proposed control system, a simulation study, statistical analysis and real-time experimentation are conducted. In addition, the outcomes are compared to different control algorithms.
Findings
Research indicates that integral control parameters (Ki) affect outputs substantially more than proportional control parameters (Kp). The results of this paper show that control and noise parameters must be considered to make the control system robust. In addition, Taguchi's approach, in conjunction with multi-response optimisation, ensures robust controller design with optimal use of resources. Finally, this research shows that the best outcomes for all the performance indices are achieved when Kp11 = 1.6859, Kp12 = −2.061, Kp21 = 3.1846, Kp22 = −1.2176, Ki11 = 1.0628, Ki12 = −1.2989, Ki21 = 2.454 and Ki22 = −0.7676.
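As a rough illustration of how the reported gains act together, the discrete-time PI law u = Kp·e + Ki·∫e dt for the Two-Input-Two-Output loop can be sketched as follows. The sampling time and error values are illustrative assumptions, not operating data from the paper.

```python
# Gain matrices reported in the study (rows: outputs, columns: error channels).
KP = [[1.6859, -2.061], [3.1846, -1.2176]]   # proportional gains
KI = [[1.0628, -1.2989], [2.454, -0.7676]]   # integral gains

def pi_step(error, integral, dt=1.0):
    """One discrete update of the 2x2 PI law u = Kp*e + Ki*(integral of e)."""
    integral = [integral[i] + error[i] * dt for i in range(2)]
    u = [
        sum(KP[i][j] * error[j] for j in range(2))
        + sum(KI[i][j] * integral[j] for j in range(2))
        for i in range(2)
    ]
    return u, integral

# Illustrative error vector and zero initial integral state.
u, integ = pi_step([1.0, 0.5], [0.0, 0.0])
print([round(v, 4) for v in u])
```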
Originality/value
This paper provides a step-by-step strategy for designing and validating a multi-response control system that accommodates controllable and uncontrollable parameters (noise parameters). The methodology can be used in any industrial Multi-Input-Multi-Output system to ensure process robustness. In addition, this paper proposes a multidisciplinary approach to industrial controller design that academics and industry can refine and improve.
Ajay Jha, R.R.K. Sharma and Vimal Kumar
Abstract
Purpose
The study aims to add to the body of knowledge of open source tangible product management (also called open design). The objective is also to develop a guideline for efficient open source tangible product development and adoption.
Design/methodology/approach
The exploratory research design using secondary data (newspapers, magazines, research articles, blogs, papers, etc.) is used to analyze open source tangible product design challenges and enablers. The success stories of open source software (OSS) projects were studied to identify critical success factors, and their relevance was then tested in two popular cases of open source drug discovery (malaria and tuberculosis).
Findings
Open innovation has become part of the competitive strategy of current businesses. It requires an efficient intellectual property protection regime for its implementation. However, in a market dominated by proprietary benefits, open source technology development can serve as a remedy for the innovation needs of neglected sectors. The OSS literature revealed two classes of factors to be managed for efficiency and effectiveness, namely technology-sponsor-level factors and environmental factors. The case study analysis of the applicability of these OSS critical factors showed their limitations for open source tangible products and highlighted additional challenges and remedies.
Research limitations/implications
Open source innovation is a collaborative effort involving inputs from diverse players, hence monitoring the effort and motivation level of the contributors is a cumbersome task. Only information available online and in print media was taken as research input in this work. Also, the data came from two case studies; many more case studies in the open design domain could advance the theory. The implications of this study are far-reaching in areas where profit-motivated proprietary efforts fall short of addressing societal needs. It provides guidelines for addressing those unmet needs by developing products collaboratively, without intellectual property hurdles.
Originality/value
The essence of open design is becoming more vital, and there is a pressing need to build theory to support it, which still is elusive and dispersed. The study fills the gap using secondary data and case study approach.
Melis Baloğlu and Yüksel Demir
Abstract
Purpose
The purpose of this paper is to demonstrate how network theory and methods can provide insights into the forces shaping architectural learning agendas and knowledge construction in architectural schools.
Design/methodology/approach
The methodology involves conceptualising learning as a constructivist process and the agenda as an interconnected network of actors, concepts and relations. Network analysis techniques, including centrality and brokerage metrics, are used to identify roles and knowledge flows using the data locally collected from Turkish universities as well as from the OpenSyllabus open-source database.
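A minimal sketch of the brokerage idea used in such analyses: a node brokers a pair of its neighbours when those neighbours are not directly linked to each other. The small network below is invented for illustration and is not the paper's data.

```python
# Invented co-occurrence network of actors in syllabus data (illustration only).
edges = [
    ("Le Corbusier", "Mies van der Rohe"),
    ("Le Corbusier", "Sedad Eldem"),
    ("Le Corbusier", "Vitruvius"),
    ("Mies van der Rohe", "Vitruvius"),
]

def adjacency(edges):
    """Build undirected adjacency sets from an edge list."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

def brokerage(node, adj):
    """Count neighbour pairs that are not directly linked,
    i.e. pairs for which `node` acts as the bridge."""
    nbrs = sorted(adj[node])
    return sum(
        1
        for i in range(len(nbrs))
        for j in range(i + 1, len(nbrs))
        if nbrs[j] not in adj[nbrs[i]]
    )

adj = adjacency(edges)
print(brokerage("Le Corbusier", adj))  # 2
```

Full network studies would typically use library implementations of betweenness and brokerage metrics rather than this local count.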
Findings
The analysis reveals the enduring influence of early modernists, signalling imbalanced canon formation in the architectural learning system. However, marginal voices highlight struggles in integrating unconventional perspectives. Limited integration of local figures indicates a consolidation of Eurocentric epistemes. Identifying these hidden forces is vital for reimagining learning agendas and socio-culturally engaged forms of learning. Pioneering figures demonstrate potential for synthesis when situated as brokers, not bifurcated schools.
Research limitations/implications
The outcomes are limited by the geographical and temporal boundaries of the data and the analysis method employed. Despite limitations, the diagnostic network framework reveals architectural learning as an open, contested ecosystem demanding pluralistic pedagogies concerning not only the global but the local, both canonical and marginal. Further research covering more data could enrich the understanding of qualitative complexities.
Practical implications
The network perspective prompts critical reflexivity about power, ideology and exclusion in knowledge construction. Strategic inclusion and diversification of voices provide pathways to bridge divides and ground learning locally.
Originality/value
This research offers a methodology model to examine forces and influences shaping architectural education by elucidating hidden and remote roles and knowledge gaps in learning agendas. Extending the techniques more widely can enable strategic interventions toward inclusive, impactful learning across disciplines, time and geographies.
Sumukh Hungund, Jighyasu Gaur and Aishwarya Narayan
Abstract
Purpose
The paper aims to examine the influence of closed and open innovation practices on economic performance. This paper also examines the mediating roles of innovation performance and firm performance. The study uses innovation theory based on knowledge management for theoretical support.
Design/methodology/approach
The methodology involves two steps. First, all the variables relevant to the adoption of innovative approaches and performance parameters are identified. Subsequently, primary data are gathered from decision-makers of 200 biotechnological firms and a structural equation modeling analysis is performed.
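The mediation logic that structural equation modeling tests can be illustrated with a toy indirect-effect computation: the product of the X→M and M→Y path coefficients from two simple regressions. All data below are synthetic and the effect sizes are arbitrary assumptions, not the study's estimates.

```python
import random

random.seed(0)

def slope(xs, ys):
    """Closed-form OLS slope of ys on xs (single predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    return cov / var

# Synthetic chain: innovation practice X -> firm performance M -> economic performance Y.
x = [random.gauss(0, 1) for _ in range(500)]
m = [0.6 * xi + random.gauss(0, 0.5) for xi in x]
y = [0.8 * mi + random.gauss(0, 0.5) for mi in m]

a = slope(x, m)      # path X -> M
b = slope(m, y)      # path M -> Y
indirect = a * b     # effect of X on Y mediated through M
print(round(indirect, 2))  # close to 0.6 * 0.8 = 0.48, up to sampling noise
```

SEM estimates all paths jointly with fit statistics; this sketch only conveys the indirect-effect arithmetic behind a mediation claim.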
Findings
The study's results showed that open innovation practices, such as interaction with large research and development (R&D) firms and customers, influence the performance parameters. The findings indicate that closed and open innovation practices positively impact performance measures such as innovation, firm and economic performance. The results also indicate a mediating role of firm performance. However, innovation performance was not found to mediate the relationship.
Originality/value
This study provides empirical insights into open and closed innovation approaches in India. Researchers and practitioners in firms can use the results of the current study to understand the effect of various innovation practices on different performance measures.
Gavin Ford and Jonathan Gosling
Abstract
Purpose
The construction industry has struggled to deliver schemes on time, to budget and right-first-time (RFT). There have been many quantitative studies of nonconformance and rework over the years to understand why the industry continues to see similar failures. Some scholars have reported rework figures as high as 12.6% of total contract value, raising major concerns about the sustainability of construction projects. However, few studies have explored in detail the views of industry professionals who are caught in the middle of quality issues, to understand where they believe the industry is failing. As such, this paper interrogates qualitative data (open-ended questions) on nonconformance and rework in construction to understand what industry professionals believe the causes are and where improvements should be made.
Design/methodology/approach
A qualitative approach is adopted for this research. An industry survey consisting of seven open-ended questions is presented to two professional working groups within a Tier 1 contractor, and the outputs are analysed using qualitative data analysis software (NVivo 12) to identify prominent themes for discussion. Inductive analysis is undertaken to gain further insight into the responses and yield recurrent areas for continuous improvement.
Findings
Qualitative analysis of the survey reveals a persistent prioritisation of cost and programme over quality management in construction projects. Furthermore, feedback from construction professionals presents a number of improvement areas that must be addressed to improve quality. These include increased investment in training and competency, overhauling quality behaviours, providing greater quality leadership direction and reshaping the way clients govern schemes.
Research limitations/implications
There are limitations to this paper that require noting. Firstly, the survey was conducted within one principal contractor with varying levels of knowledge across multiple sectors. Secondly, the case study was from one major highways scheme; therefore, the generalisability of the findings is limited. It is suggested that a similar exercise is undertaken in other sectors to uncover similar improvement avenues.
Practical implications
The implications of this study call for quality to be re-evaluated at project, company, sector and government levels to overhaul how quality is delivered. Furthermore, the paper identifies critical learning outcomes for the construction sector: the need to reassess projects to ensure they are appropriately equipped with competent personnel under a vetted, progressive training programme; to foster collaborative behaviours that value quality delivery on an equal standing with safety, programme and cost; and to tackle the inappropriate resourcing dilemmas projects find themselves in through clear tendering and accurate planning. In addition, before making hasty decisions, projects must assess the risk of proceeding without approved design details and include the client in the decision-making process. Moreover, the findings call for a more collaborative environment between the construction team and the quality management department, rather than the latter being seen as obstructive (i.e. compliance-based policing). All of this must be driven by leadership to overhaul the way quality is managed on schemes. The findings demonstrate the value of open-ended survey data in enriching quantitative outcomes and strengthening proposals for improvement.
Originality/value
This paper addresses the highly sensitive area of quality failure outcomes and interrogates them via an industry survey within a major UK contractor. Unique insights are gained into how industry professionals perceive quality in construction. This perspective has been largely missing from previous research, and it offers a valuable addition to understanding the “quality status quo” from those delivering schemes.
Christian Nnaemeka Egwim, Hafiz Alaka, Youlu Pan, Habeeb Balogun, Saheed Ajayi, Abdul Hye and Oluwapelumi Oluwaseun Egunjobi
Abstract
Purpose
The study aims to develop a multilayer high-effective ensemble of ensembles predictive model (stacking ensemble) using several hyperparameter optimized ensemble machine learning (ML) methods (bagging and boosting ensembles) trained with high-volume data points retrieved from Internet of Things (IoT) emission sensors, time-corresponding meteorology and traffic data.
Design/methodology/approach
For a start, the study tested the big data hypothesis by developing sample ensemble predictive models on different data sample sizes and comparing their results. Second, it developed a standalone model and several bagging and boosting ensemble models and compared their results. Finally, it used the best-performing bagging and boosting predictive models as input estimators to develop a novel multilayer high-effective stacking ensemble predictive model.
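As a sketch of the stacking idea only (not the authors' model), the following pure-Python toy trains two weak base learners and a least-squares meta-learner on their predictions. The study's actual stacking would use hyperparameter-optimized bagging and boosting learners and out-of-fold base predictions; all data here are synthetic.

```python
import random

random.seed(1)

# Synthetic PM2.5-like data: one "traffic" feature, noisy linear target.
x = [random.uniform(0, 10) for _ in range(300)]
y = [3.0 + 2.0 * xi + random.gauss(0, 1.0) for xi in x]

# --- Layer 1: two simple base learners --------------------------------------
def fit_linear(xs, ys):
    """Closed-form OLS line; returns a prediction function."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sl = sum((a - mx) * (c - my) for a, c in zip(xs, ys)) / sum(
        (a - mx) ** 2 for a in xs
    )
    return lambda v: (my - sl * mx) + sl * v

def fit_binned_mean(xs, ys, cut=5.0):
    """Predicts the mean target of the low or high feature bin."""
    lo = [c for a, c in zip(xs, ys) if a < cut]
    hi = [c for a, c in zip(xs, ys) if a >= cut]
    lo_m, hi_m = sum(lo) / len(lo), sum(hi) / len(hi)
    return lambda v: lo_m if v < cut else hi_m

model1, model2 = fit_linear(x, y), fit_binned_mean(x, y)
p1 = [model1(v) for v in x]
p2 = [model2(v) for v in x]

# --- Layer 2: least-squares meta-learner over the base predictions ----------
# Solve the 2x2 normal equations for weights w1, w2.
s11 = sum(a * a for a in p1)
s12 = sum(a * b for a, b in zip(p1, p2))
s22 = sum(b * b for b in p2)
s1y = sum(a * t for a, t in zip(p1, y))
s2y = sum(b * t for b, t in zip(p2, y))
det = s11 * s22 - s12 * s12
w1 = (s1y * s22 - s2y * s12) / det
w2 = (s2y * s11 - s1y * s12) / det

stacked = [w1 * a + w2 * b for a, b in zip(p1, p2)]

def rmse(pred, true):
    return (sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true)) ** 0.5

# In-sample, the stacked model can do no worse than either base learner,
# since the meta-learner optimises over weightings that include each alone.
print(rmse(stacked, y) <= min(rmse(p1, y), rmse(p2, y)) + 1e-9)  # True
```

Note the simplification: the meta-learner here is fitted on in-sample base predictions, whereas real stacking fits on held-out predictions to avoid leakage.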
Findings
The results showed data size to be one of the main determinants of ensemble ML predictive power. Second, they showed that, compared with a single algorithm, the cumulative result from ensemble ML algorithms is generally better in terms of prediction accuracy. Finally, they showed the stacking ensemble to be a better model for predicting PM2.5 concentration levels than the bagging and boosting ensemble models.
Research limitations/implications
A limitation of this study is the trade-off between the performance of this novel model and the computational time required to train it. Whether this gap can be closed remains an open research question, and future research should attempt to close it. Also, future studies can integrate this novel model into a personal air quality messaging system to inform the public of pollution levels and improve public access to air quality forecasts.
Practical implications
The outcome of this study will help the public proactively identify highly polluted areas, thus potentially reducing pollution-associated/triggered COVID-19 (and other lung disease) deaths, complications and transmission by encouraging avoidance behavior, and will support informed lockdown decisions by government bodies when integrated into an air pollution monitoring system.
Originality/value
This study fills a gap in the literature by providing a justification for selecting appropriate ensemble ML algorithms for PM2.5 concentration level predictive modeling. Second, it contributes to the big data hypothesis, which suggests that data size is one of the most important factors in ML predictive capability. Third, it supports the premise that, when using ensemble ML algorithms, the cumulative output is generally better in terms of prediction accuracy than that of a single algorithm. Finally, it develops a novel multilayer high-performance hyperparameter-optimized ensemble-of-ensembles predictive model that can accurately predict PM2.5 concentration levels with improved model interpretability and enhanced generalizability, and provides a novel databank of historic pollution data from IoT emission sensors that can be purchased for research, consultancy and policymaking.