Search results

1 – 10 of 107
Article
Publication date: 19 January 2024

Prihana Vasishta, Navjyoti Dhingra and Seema Vasishta

Abstract

Purpose

This research aims to analyse the current state of research on the application of Artificial Intelligence (AI) in libraries by examining document type, publication year, keywords, country and research methods. The overarching aim is to enrich the existing knowledge of AI-powered libraries by identifying the prevailing research gaps, providing direction for future research and deepening the understanding needed for effective policy development.

Design/methodology/approach

This study used advanced tools such as bibliometric and network analysis, drawing on the existing literature in the SCOPUS database up to the year 2022. This study analysed the application of AI in libraries by identifying and selecting relevant keywords, extracting the data from the database, processing the data using advanced bibliometric visualisation tools and presenting and discussing the results. For this comprehensive research, the search strategy was approved by a panel of computer scientists and librarians.
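
As a rough illustration of this kind of pipeline, the sketch below builds a keyword co-occurrence network of the sort that bibliometric visualisation tools render as cluster maps. The export filename and the "Author Keywords" column are assumptions about a typical SCOPUS CSV export, not details from this study.

```python
# A minimal keyword co-occurrence sketch, assuming a SCOPUS CSV export named
# "scopus_export.csv" with an "Author Keywords" column (";"-separated values).
import itertools

import networkx as nx
import pandas as pd

records = pd.read_csv("scopus_export.csv")  # hypothetical export file

G = nx.Graph()
for cell in records["Author Keywords"].dropna():
    keywords = sorted({k.strip().lower() for k in cell.split(";") if k.strip()})
    for a, b in itertools.combinations(keywords, 2):
        # Increment the edge weight each time two keywords co-occur in a record.
        weight = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=weight + 1)

# The heaviest edges are the pairs a bibliometric cluster map would emphasise.
for a, b, data in sorted(G.edges(data=True), key=lambda e: -e[2]["weight"])[:10]:
    print(f"{a} -- {b}: {data['weight']}")
```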

Findings

The majority of research concerning the application of AI in libraries has been conducted in the last three years, likely driven by the fourth industrial revolution. Results show that highly cited articles were published by Emerald Group Holdings Ltd. However, the application of AI in libraries is a developing field, and the study highlights the need for more research in areas such as Digital Humanities, Machine Learning, Robotics, Data Mining and Big Data in Academic Libraries.

Research limitations/implications

This study excluded papers written in languages other than English, as well as papers addressing domains beyond libraries, such as medicine, health, education, science and technology.

Practical implications

This article offers insight for managers and policymakers looking to implement AI in libraries. By identifying clusters and themes, the article can empower managers to plan ahead, mitigate potential drawbacks and seize opportunities for sustainable growth.

Originality/value

Previous studies on the application of AI in libraries have taken a broad approach, but this study narrows its focus to research published explicitly in Library and Information Science (LIS) journals. This makes it unique compared to previous research in the field.

Details

Library Hi Tech, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 26 June 2023

Argaw Gurmu, M. Reza Hosseini, Mehrdad Arashpour and Wellia Lioeng

Abstract

Purpose

Building defects are becoming recurrent phenomena in most high-rise buildings. However, little research exists on the analysis of defects in high-rise buildings based on data from real-life projects. This study aims to develop dashboards and models for revealing the most common locations of defects, understanding associations among defects and predicting the rectification periods.

Design/methodology/approach

In total, 15,484 defect reports comprising qualitative and quantitative data were obtained from a company that provides consulting services for the construction industry in Victoria, Australia. Data mining methods were applied using a wide range of Python libraries including NumPy, Pandas, Natural Language Toolkit, SpaCy and Regular Expression, alongside association rule mining (ARM) and simulations.
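
A minimal sketch of the association rule mining step is given below, using the mlxtend library's apriori implementation; the one-hot defect table and the thresholds are invented for illustration and are not the study's 15,484 reports.

```python
# Sketch of the association rule mining step using mlxtend; each row stands for
# an inspected location and each boolean column for a defect type (invented data).
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

defects = pd.DataFrame(
    {
        "paint":    [1, 1, 1, 0, 1],
        "plumbing": [1, 1, 0, 0, 1],
        "joinery":  [1, 0, 1, 1, 1],
    }
).astype(bool)

frequent = apriori(defects, min_support=0.3, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)

# A rule like {paint} -> {plumbing, joinery} with confidence 0.88 would mirror
# the laundry result reported in the Findings section.
print(rules[["antecedents", "consequents", "support", "confidence"]])
```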

Findings

Findings reveal that defects in multi-storey buildings often occur on lower levels rather than on higher levels. Joinery defects were found to be the most recurrent problem on ground floors. The ARM outcomes show that the occurrence of one type of defect can be taken as an indication of the existence of other types of defects. For instance, in laundries where paint defects are observed, the chance of plumbing and joinery defects also occurring is 88%. The stochastic model built for door defects showed that there is a 60% chance that defects on doors can be rectified within 60 days.
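
A statement such as "a 60% chance of rectification within 60 days" can be reproduced with a simple stochastic model. The sketch below fits a lognormal distribution to hypothetical rectification times and estimates that probability by simulation; the distribution choice and the sample data are assumptions, not the paper's model.

```python
# Fit a distribution to door-defect rectification times and estimate the chance
# of rectification within 60 days by simulation. The lognormal choice and the
# sample times are assumptions, not the paper's actual model or data.
import numpy as np

rng = np.random.default_rng(42)
observed = np.array([12, 25, 33, 41, 55, 58, 70, 90, 120, 150], dtype=float)

# Match the mean and standard deviation of the log-times.
mu, sigma = np.log(observed).mean(), np.log(observed).std()
simulated = rng.lognormal(mu, sigma, size=100_000)

print(f"P(rectified within 60 days) ~ {(simulated <= 60).mean():.2f}")
```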

Originality/value

The dashboards provide original insight and novel ideas regarding the frequency of defects in various positions in multi-storey buildings. The stochastic models can provide a reliable point of reference for property managers, occupants and sub-contractors when taking measures to avoid recurring defects; the findings also provide estimates of possible rectification periods for various types of defects.

Details

Construction Innovation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1471-4175

Article
Publication date: 22 March 2024

Mohd Mustaqeem, Suhel Mustajab and Mahfooz Alam

Abstract

Purpose

Software defect prediction (SDP) is a critical aspect of software quality assurance, aiming to identify and manage potential defects in software systems. In this paper, we have proposed a novel hybrid approach that combines Gray Wolf Optimization with Feature Selection (GWOFS) and a multilayer perceptron (MLP) for SDP. The GWOFS-MLP hybrid model is designed to optimize feature selection, ultimately enhancing the accuracy and efficiency of SDP. Gray Wolf Optimization, inspired by the social hierarchy and hunting behavior of gray wolves, is employed to select a subset of relevant features from an extensive pool of potential predictors. This study investigates the key challenges that traditional SDP approaches encounter and proposes promising solutions to overcome time complexity and the curse of dimensionality.

Design/methodology/approach

The integration of GWOFS and MLP results in a robust hybrid model that can adapt to diverse software datasets. This feature selection process harnesses the cooperative hunting behavior of wolves, allowing for the exploration of critical feature combinations. The selected features are then fed into an MLP, a powerful artificial neural network (ANN) known for its capability to learn intricate patterns within software metrics. MLP serves as the predictive engine, utilizing the curated feature set to model and classify software defects accurately.
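
A simplified sketch of this hybrid is shown below: a generic binary gray wolf optimizer searches over feature masks, with the validation accuracy of scikit-learn's MLPClassifier as the fitness function. The synthetic dataset, the 0.5 binarisation threshold and all hyperparameters are assumptions, not the authors' GWOFS-MLP implementation.

```python
# Simplified GWO-based feature selection wrapped around an MLP classifier.
# Wolf positions in [0, 1]^d are thresholded into feature masks; fitness is
# validation accuracy. Everything here is an illustrative assumption.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=20, n_informative=6, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

def fitness(mask):
    """Validation accuracy of an MLP trained on the selected feature subset."""
    if not mask.any():
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    clf.fit(X_tr[:, mask], y_tr)
    return clf.score(X_va[:, mask], y_va)

n_wolves, n_iter, dim = 6, 10, X.shape[1]
wolves = rng.random((n_wolves, dim))               # continuous positions in [0, 1]
scores = np.array([fitness(w > 0.5) for w in wolves])

for t in range(n_iter):
    a = 2 - 2 * t / n_iter                         # exploration coefficient decays to 0
    alpha, beta, delta = wolves[np.argsort(scores)[::-1][:3]]
    for i in range(n_wolves):
        new = np.zeros(dim)
        for leader in (alpha, beta, delta):        # wolves move toward the three leaders
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2 * a * r1 - a, 2 * r2
            new += leader - A * np.abs(C * leader - wolves[i])
        wolves[i] = np.clip(new / 3, 0, 1)
        scores[i] = fitness(wolves[i] > 0.5)

best = wolves[np.argmax(scores)] > 0.5
print(f"selected {best.sum()} of {dim} features, validation accuracy {scores.max():.3f}")
```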

Findings

The performance evaluation of the GWOFS-MLP hybrid model on a real-world software defect dataset demonstrates its effectiveness. The model achieves a remarkable training accuracy of 97.69% and a testing accuracy of 97.99%. Additionally, the receiver operating characteristic area under the curve (ROC-AUC) score of 0.89 highlights the model’s ability to discriminate between defective and defect-free software components.

Originality/value

Experimental implementations using machine learning-based techniques with feature reduction are conducted to validate the proposed solutions. The goal is to enhance SDP’s accuracy, relevance and efficiency, ultimately improving software quality assurance processes. The confusion matrix further illustrates the model’s performance, with only a small number of false positives and false negatives.

Details

International Journal of Intelligent Computing and Cybernetics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 1 May 2023

Guijie Zhang, Fangfang Wei and Peixin Wang

Abstract

Purpose

This paper presents a comprehensive study using bibliometric and social network analysis (SNA) to depict the academic community, research hotspots and the correlation between research performance and social network measurements within Library Hi Tech.

Design/methodology/approach

Publications from Library Hi Tech between 2010 and 2022 are reviewed and analysed through coauthorship analysis, co-occurrence analysis, SNA and the Spearman rank correlation test.
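
The correlation side of this design can be illustrated in a few lines: compute network centralities with networkx and test their association with research performance using SciPy's Spearman test. The toy coauthorship edges and publication counts below are invented, not the journal's data.

```python
# Correlate a centrality measure with research performance on a toy coauthorship
# network; edges and publication counts are invented, not the journal's data.
import networkx as nx
from scipy.stats import spearmanr

G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")])
publications = {"A": 8, "B": 5, "C": 12, "D": 4, "E": 2}

centrality = nx.degree_centrality(G)  # one of the four measures tested
authors = sorted(G.nodes)
rho, p = spearmanr([publications[a] for a in authors],
                   [centrality[a] for a in authors])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```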

Findings

The annual number of publications in Library Hi Tech increased from 2016 to 2022, indicating that this research area has gradually gained global attention. The USA and China are the most significant contributors to the relevant publications. Scholars in this field mainly engage in small-scale cooperation. Academic libraries, digital libraries, libraries, information technology and COVID-19 were hot topics during the study period. In light of the COVID-19 pandemic, there was a marked increase in research on healthcare. Academic interest in the Internet of Things and social media has proliferated recently and may soon attract more attention. Spearman rank correlation analysis shows that research performance (i.e. publication count and citation count) is significantly and positively correlated with social network measurements (i.e. degree centrality, betweenness centrality, closeness centrality and eigenvector centrality) in studies of Library Hi Tech.

Originality/value

This paper reveals a systematic picture of the research landscape of Library Hi Tech and provides a potential guide for future research. The relationship between scientific research performance and social network measurements can be objectively identified based on statistical knowledge.

Details

Library Hi Tech, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 22 June 2022

Shubangini Patil and Rekha Patil

Abstract

Purpose

Until now, a great deal of research has been done and applied to provide security for data passed from one user to another, such as third-party auditing and several schemes for securing the data, such as key generation with the help of encryption algorithms like Rivest–Shamir–Adleman. Some of the related prior work is as follows. The remote damage control resuscitation (RDCR) scheme by Yan et al. (2017) is based on minimum bandwidth and enables a third party to perform public integrity verification. Although it supports repair management for corrupted data and tries to recover the original data, in practice it fails to do so, and it thus incurs more computation and communication cost than the proposed system. Chen et al. (2015) developed an idea for cloud storage data sharing using broadcast encryption. This technique aims to accomplish both broadcast data and dynamic sharing, allowing users to join and leave a group without affecting the electronic press kit (EPK). The theoretical notion was sound and new, but the system's practicality and efficiency were not acceptable, and its security was also jeopardised because it proposed adding a member without altering any keys. Another line of research investigated an identity-based encryption strategy for data sharing, together with key management and metadata techniques to improve model security (Jiang and Guo, 2017); it supplies forward and reverse ciphertext security and extends support for dynamic data modification through batch auditing. However, it is more difficult to put into practice, and one of its limitations is that it applies only to very large amounts of cloud storage. The key feature of the secure and efficient privacy-preserving provable data possession scheme for cloud storage was to support data dynamics, privacy preservation, batch auditing and blocker verification for an untrusted, outsourced storage model (Pathare and Chouragade, 2017). A homomorphic signature mechanism based on a new identity scheme was devised to avoid the use of public key certificates; this signature system was shown to be resistant to identity attacks in the random oracle model and to forged-message attacks (Nayak and Tripathy, 2018; Lin et al., 2017). When storing data in a public cloud, one issue is that the data owner must give an enormous number of keys to the users in order for them to access the files. Here the knowledge assisted software engineering (KASE) plan was publicly unveiled for the first time: while sharing a huge number of documents, the data owner simply has to supply a specific key to the user, and the user only needs to provide a single trapdoor. Although the concept is innovative, the KASE technique does not apply to the increasingly common manufactured cloud. Cui et al. (2016) claim that as the amount of data grows, the distribution management system (DMS) will be unable to handle it. As a result, various provable data possession (PDP) schemes have been developed, yet practically all of them lack security. Hence, certificate-based PDP built on bilinear pairing was introduced; being both robust and efficient, it is mostly applicable in DMS. The main purpose of this research is to design and implement a secure cloud infrastructure for sharing group data. This research provides an efficient and secure protocol for multiple-user data in the cloud, allowing many users to share data easily.

Design/methodology/approach

The methodology and contribution of this paper are as follows. The major goal of this study is to design and implement a secure cloud infrastructure for sharing group data, providing an efficient and secure protocol for multiple-user data in the cloud so that numerous users can exchange data without difficulty. Selection scheme design (SSD) comprises two algorithms: Algorithm 1 is designed for a limited number of users, and Algorithm 2 is redesigned for multiple users. Further, the authors design the SSD-security protocol, which comprises a three-phase model: Phase 1 generates the parameters and distributes the private keys, Phase 2 generates the general key for all available users, and Phase 3 prevents dishonest users from taking part in data sharing.
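
The three-phase structure can be illustrated with a deliberately simplified, insecure toy in plain modular arithmetic; it is not the authors' pairing-based SSD-security protocol (which uses the PBC library), and the parameters, the multiplicative key derivation and the re-keying rule are all illustrative assumptions.

```python
# Toy illustration of the three-phase structure only: parameter generation,
# common key derivation, and re-keying to exclude a user. NOT a secure scheme
# and NOT the paper's pairing-based construction.
import secrets

# Phase 1: generate public parameters and a private key for each user.
p = 2**64 - 59                       # toy prime modulus (assumption)
g = 5
users = ["u1", "u2", "u3"]
private = {u: secrets.randbelow(p - 2) + 1 for u in users}
public = {u: pow(g, private[u], p) for u in users}

# Phase 2: derive a common key from every participating member's contribution.
def group_key(members):
    key = 1
    for u in members:
        key = (key * public[u]) % p  # combine contributions multiplicatively
    return key

key_all = group_key(users)

# Phase 3: exclude a user flagged as dishonest by re-keying over the honest subset.
key_honest = group_key(["u1", "u3"])
assert key_all != key_honest         # the excluded user's old key no longer matches
print("group re-keyed after exclusion")
```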

Findings

Data sharing in cloud computing provides unlimited computational resources and storage to enterprises and individuals; however, cloud computing also raises several privacy and security concerns, such as fault tolerance, reliability, confidentiality and data integrity. Furthermore, the key consensus mechanism is a fundamental cryptographic primitive for secure communication; motivated by this, the authors developed the SSD mechanism, which embraces multiple users in the data-sharing model.

Originality/value

Files shared in the cloud should be encrypted for security purposes; these files are later decrypted for users to access. Furthermore, the key consensus process is a crucial cryptographic primitive for secure communication, and on this basis the authors devised the SSD mechanism, which incorporates numerous users in the data-sharing model. For the evaluation of the SSD method, the authors considered an ideal system environment: Java was used as the programming language and Eclipse as the integrated development environment for evaluating the proposed model. The hardware configuration comprises 4 GB of RAM and an i7 processor, and the PBC library was used for the pairing operations (PBC Library, 2022). Furthermore, the number of users is varied to compare against the existing RDIC methodology (Li et al., 2020). For the purposes of the SSD-security protocol, a prime number is chosen as the number of users in this work.

Details

International Journal of Pervasive Computing and Communications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 27 March 2024

Temesgen Agazhie and Shalemu Sharew Hailemariam

Abstract

Purpose

This study aims to quantify and prioritize the main causes of lean wastes and to apply reduction methods by employing better waste cause identification methodologies.

Design/methodology/approach

We employed the fuzzy technique for order preference by similarity to ideal solution (FTOPSIS), the fuzzy analytic hierarchy process (FAHP) and failure mode and effects analysis (FMEA) to determine the causes of defects. To determine the current defect cause identification procedures, time studies, checklists and process flow charts were employed. The study focuses on the sewing department of a clothing factory in Addis Ababa, Ethiopia.
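
As a simplified stand-in for this pipeline, the sketch below runs a crisp TOPSIS ranking over the three FMEA criteria; the S-O-D scores, the weights (standing in for FAHP-derived ones) and the listed causes are illustrative assumptions.

```python
# Crisp TOPSIS over FMEA criteria (severity, occurrence, detectability) as a
# simplified stand-in for the fuzzy FTOPSIS/FAHP pipeline; data are invented.
import numpy as np

causes = ["inadequate operator training", "machine setup", "material quality"]
#                   S    O    D   (1-10 FMEA scales)
scores = np.array([[9.0, 8.0, 6.0],
                   [6.0, 5.0, 4.0],
                   [5.0, 4.0, 7.0]])
weights = np.array([0.5, 0.3, 0.2])  # e.g. from an (F)AHP pairwise comparison

norm = scores / np.linalg.norm(scores, axis=0)       # vector-normalise each criterion
weighted = norm * weights
# Higher S, O and D all mean higher risk, so the positive ideal is the riskiest profile.
ideal, anti_ideal = weighted.max(axis=0), weighted.min(axis=0)
d_pos = np.linalg.norm(weighted - ideal, axis=1)
d_neg = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_neg / (d_pos + d_neg)                  # 1.0 = highest-priority cause

for cause, c in sorted(zip(causes, closeness), key=lambda t: -t[1]):
    print(f"{cause}: {c:.3f}")
```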

Findings

These techniques outperform conventional ones and offer a better solution for challenging decision-making situations. Each lean waste's FMEA criteria, such as severity, occurrence and detectability, were examined. A pairwise comparison revealed that defects have a larger effect than the other lean wastes, and defects were mostly caused by inadequate operator training. To minimize lean wastes, prioritizing their causes is crucial.

Research limitations/implications

The research focuses on a single case company, so the results cannot be generalized to the whole industry.

Practical implications

The study used quantitative approaches to quantify and prioritize the causes of lean waste in the garment industry and provides insight for industrialists to focus on the waste causes to improve their quality performance.

Originality/value

The integration of FMEA with FAHP and FTOPSIS is the new contribution: it yields a better solution for the decision variables by considering the severity, occurrence and detectability of the causes of waste. The data collection approach was based on experts' focus group discussions rating the main causes of defects, which could provide optimal values for defect cause prioritization.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Open Access
Article
Publication date: 13 October 2023

Stefano Francesco Musso and Giovanna Franco

Abstract

Purpose

This article sets out to show how the principles and questions of method that underlie a way of interpreting the discipline of conservation and restoration can bear results in research and studies, aiming at a conscious reuse process. The occasion is the very recent research performed on the former Church of Saints Gerolamo and Francesco Saverio in Genoa, Italy, the Jesuit church annexed to the 17th-century College of the order. It is a small Baroque jewel in the heart of the ancient city and a former university library, currently abandoned, forgotten for years, inaccessible and awaiting a new use.

Design/methodology/approach

The two-year work carried out on the monumental building was conducted according to a study and research methodology developed and refined over the years within the activities of the School of Specialisation in Architectural Heritage and Landscape of the University of Genoa. It is a multidisciplinary and rigorous approach that aims to train high-level professionals who are up to date and aware of the multiple problems involved in interventions on existing buildings, especially those of a monumental nature.

Findings

The biennial study was carried out within the activities of the Post-Graduate Programme in Architectural Heritage and Landscape of the University of Genoa. The working methodology faces the challenges of contemporary complexity raised by the progressive broadening of the concept of cultural “heritage” and by the problems of its conservation, active safeguarding and reuse: safety with respect to seismic risk, fire and hydrogeological instability; universal accessibility (cognitive, physical and alternative); resource efficiency; comfort and savings in energy consumption; sustainability; and communication with and involvement of local communities and stakeholders.

Originality/value

The goals of the work were the following: understanding of the architectural heritage, through the correlated study of its geometries, elements and construction materials, surfaces, structures, spaces and functions; understanding of the transformations that the building has undergone over time, relating the results of historical reconstructions from indirect sources and those of direct archaeological analysis; assessment of the state of conservation of the building recognising phenomena of deterioration, damage, faults and deficits that affect materials, construction elements, systems and structures; identification of the causes and extent of damage, faults and deficits, assessing the vulnerability and level of exposure of the asset to the aggression of environmental factors and related risks; evaluation of the compatibility between the characteristics of the available spaces, the primary needs of conservation, the instance of regeneration and possible new uses; the definition of criteria and guidelines for establishing the planning of conservation, restoration and redevelopment interventions.

Details

Journal of Cultural Heritage Management and Sustainable Development, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2044-1266

Article
Publication date: 14 March 2023

Mohammad Javad Zoleykani, Hamidreza Abbasianjahromi, Saeed Banihashemi, Seyed Amir Tabadkani and Aso Hajirasouli

Abstract

Purpose

Extended reality (XR) is an emerging technology whose popularity is rising in different industry sectors, and its application has recently been considered in construction safety. This study aims to investigate the applications of XR technologies in construction safety from a project lifecycle perspective.

Design/methodology/approach

Scientometric analysis was conducted to discover trends, keywords, contributions of countries and publication outlets in the literature. Content analysis was then applied to categorize previous studies into three groups according to the phase of the lifecycle in which they used XR.

Findings

Results of the content analysis showed that the application of XR in construction safety mostly covers two areas, namely safety training and risk management. Virtual reality was found to be the most used XR tool, with most of its applications dedicated to safety training in the design phase. The amount of research on the application of augmented reality and mixed reality to safety training and risk management across all lifecycle phases is still insignificant. Finally, this study proposes three main areas for future research on XR and safety: control of safety regulations and safety coordination in the construction phase, and safety reports in the operation phase.

Originality/value

This paper inspected the utilization of all types of XR for safety in each phase of the construction lifecycle and proposed future directions for research by addressing the safety challenges in each phase.

Details

Construction Innovation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1471-4175

Article
Publication date: 31 October 2023

Hong Zhou, Binwei Gao, Shilong Tang, Bing Li and Shuyu Wang

Abstract

Purpose

The number of construction dispute cases has maintained a high growth trend in recent years. Effective exploration and management of construction contract risk can directly promote the overall performance of the project life cycle. Missing clauses may result in a failure to match standard contracts; if a contract modified by the owner omits key clauses, potential disputes may lead to contractors paying substantial compensation. The identification of missing clauses in construction project contracts has therefore relied heavily on manual review, which is inefficient and highly restricted by personnel experience, while existing intelligent tools only support contract query and storage. It is urgent to raise the level of intelligence in contract clause management. This paper therefore aims to propose an intelligent method to detect missing clauses in construction project contracts based on Natural Language Processing (NLP) and deep learning technology.

Design/methodology/approach

A complete classification scheme for contract clauses is designed based on NLP. First, construction contract texts are pre-processed and converted from unstructured natural language into structured digital vector form. Following this initial categorization, a multi-label classifier for long construction contract clause texts is designed to identify preliminarily whether clause labels are missing. After the multi-label missing-clause detection, the authors implement a clause similarity algorithm that creatively integrates an image-detection idea, the MatchPyramid model, with BERT to identify missing substantive content in the contract clauses.
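
The multi-label missing-clause step can be illustrated with a much simpler stand-in: TF-IDF features and one-vs-rest logistic regression in place of the paper's BERT-based models. The clause texts and label sets below are invented for illustration.

```python
# Simplified multi-label clause classifier: TF-IDF + one-vs-rest logistic
# regression stands in for the paper's BERT-based pipeline; data are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

contracts = [
    "The contractor shall remedy defects within 30 days of notice.",
    "Payment shall be made within 14 days of the invoice date.",
    "Either party may terminate upon 60 days written notice.",
    "Defects notified during the warranty period shall be repaired, and payment withheld until repair.",
]
labels = [{"defects"}, {"payment"}, {"termination"}, {"defects", "payment"}]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(contracts, Y)

# Labels the classifier does not assign to a new contract are flagged as missing.
pred = model.predict(["The owner shall pay the contractor monthly."])
present = set(mlb.inverse_transform(pred)[0])
print("clause labels found:", present)
print("labels flagged as missing:", set(mlb.classes_) - present)
```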

Findings

In total, 1,322 construction project contracts were tested. Results showed that the accuracy of the multi-label classification reaches 93%, the accuracy of similarity matching reaches 83%, and the recall and mean F1 of both exceed 0.7. The experimental results verify, to some extent, the feasibility of intelligently detecting contract risk with the NLP-based method.

Originality/value

NLP is adept at recognizing textual content and has shown promising results in some contract processing applications. However, most approaches to risk detection in construction contract clauses are rule-based, and these encounter challenges when handling intricate and lengthy engineering contracts. This paper introduces a deep learning-based NLP technique that reduces manual intervention and can autonomously identify and tag types of contractual deficiency, aligning with the evolving complexity anticipated in future construction contracts. Moreover, this method achieves the recognition of extended contract clause texts. Finally, the approach is versatile: users simply need to adjust parameters such as segmentation according to the language in order to detect omissions in contract clauses of diverse languages.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 31 October 2023

Nadin Augustiniok, Claudine Houbart, Bie Plevoets and Koenraad Van Cleempoel

Abstract

Purpose

Adaptive reuse processes aim to preserve heritage values while creating new values through the architectural interventions that have become necessary. This claim provokes a discussion about the meaning of values, how we can preserve them in practice and how we can translate them into architectural qualities that users experience. Riegl's understanding of the different perspectives of heritage values in the past and present opens up the possibility of identifying present values as a reflection of current social, material and political conditions in the architectural discourse.

Design/methodology/approach

This qualitative and practical study compares two Belgian projects to trace the use of values in adaptive reuse projects from an architectural design perspective. The Predikherenklooster, a 17th-century monastery in Mechelen that now houses the public library, and the C-Mine cultural centre in Genk, a former 20th-century coal mine, are compared. The starting point is Flemish legislation, which defines significance through values, distinguishing between 13 heritage values.

Findings

The study demonstrates the opportunities that axiological questions offer during the design process of an adaptive reuse project. They provide an overarching framework for tangible and intangible aspects that need to be discussed, particularly in terms of the link between what exists, the design strategy and their effect.

Originality/value

Adaptive reuse can draw on approaches from both heritage conservation and contemporary architecture and explore values as a tool for “re-designing” built heritage.

Details

Journal of Cultural Heritage Management and Sustainable Development, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2044-1266
