Search results

1 – 10 of 207
Article
Publication date: 24 April 2024

Bahman Arasteh and Ali Ghaffari

Reducing the number of generated mutants by clustering redundant mutants, reducing the execution time by decreasing the number of generated mutants and reducing the cost of…

Abstract

Purpose

Reducing the number of generated mutants by clustering redundant mutants, reducing the execution time by decreasing the number of generated mutants and reducing the cost of mutation testing are the main goals of this study.

Design/methodology/approach

In this study, a method is suggested to identify and prune redundant mutants. First, the program source code is analyzed by the developed parser to filter out the effectless instructions; the remaining instructions are then mutated using the standard mutation operators. The single-line mutants are partially executed by the developed instruction evaluator. Next, a clustering method groups the single-line mutants that produce the same results, and only one complete run is needed per cluster.
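The clustering step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the mutants, the toy expression evaluator and the representative-selection rule are all invented for the example.

```python
from collections import defaultdict

def cluster_mutants(mutants, evaluate, test_input):
    """Group single-line mutants whose partial evaluation gives the same
    result; only one representative per cluster needs a complete run."""
    clusters = defaultdict(list)
    for mutant in mutants:
        # Partially execute only the mutated instruction on the test input.
        clusters[evaluate(mutant, test_input)].append(mutant)
    # One complete run per cluster: the first member acts as representative.
    representatives = [members[0] for members in clusters.values()]
    return representatives, clusters

# Toy mutants of `y = x + 2` produced by swapping the arithmetic operator,
# as MuJava's AOR (arithmetic operator replacement) operator would.
mutants = ["y = x + 2", "y = x - 2", "y = x * 2", "y = x / 2"]
evaluate = lambda m, x: eval(m.split("=")[1], {"x": x})
representatives, clusters = cluster_mutants(mutants, evaluate, 2)
# With x = 2, `x + 2` and `x * 2` both yield 4 and fall into one cluster,
# so only three complete runs are needed instead of four.
```

The point of the sketch is the cost model: full executions scale with the number of clusters, not the number of generated mutants.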

Findings

The results of experiments on the Java benchmarks indicate that the proposed method causes a 53.51 per cent reduction in the number of mutants and a 57.64 per cent time reduction compared to similar experiments in the MuJava and MuClipse tools.

Originality/value

The contributions are as follows: developing a classifier that takes the program's source code and, using a dependency graph, classifies its instructions into effective and effectless classes (filtering out the effectless instructions reduces the total number of generated mutants); developing and implementing an instruction parser and an instruction-level mutant generator for Java programs (the mutant generator takes an instruction of the original program as a string and generates its single-line mutants based on the standard mutation operators in MuJava); and developing a stack-based evaluator that takes an instruction (original or mutant) together with the test data and evaluates its result without executing the whole program.

Details

Data Technologies and Applications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-9288

Keywords

Article
Publication date: 24 June 2022

Maitri Patel, Rajan Patel, Nimisha Patel, Parita Shah and Kamal Gulati

In the field of cryptography, authentication, secrecy and identification can be accomplished by use of secret keys for any computer-based system. The need to acquire certificates…

Abstract

Purpose

In the field of cryptography, authentication, secrecy and identification can be accomplished through the use of secret keys in any computer-based system. The need to acquire certificates endorsed by a certificate authority (CA) to authenticate users for the exchange of encrypted communications is one of the most significant constraints on the widespread adoption of public key cryptography (PKC), as the process is time-consuming and susceptible to error. PKC's certificate and key management overheads are reduced with identity-based cryptography (IBC), of which identity-based encryption (IBE) is a crucial primitive. The IBE scheme was introduced to reduce the complexity of certificate and key management, but it also gives rise to the key escrow and key revocation problems, which can expose encrypted information to unauthorised users.

Design/methodology/approach

This paper aims to compare the results of IIBES with the existing system and to provide a security analysis of the scheme; the proposed system can also be used for security in federated learning.

Findings

Furthermore, the scheme can be implemented using other encryption/decryption algorithms, such as elliptic curve cryptography (ECC), to compare execution efficiency. The proposed system can be used for security in federated learning.

Originality/value

As a result, a novel enhanced IBE scheme, IIBES, is suggested and implemented in the Java programming language using the RSA algorithm. It eradicates the key escrow problem by eliminating the need for a KGC, and it addresses the key revocation problem by using sub-KGCs (SKGCs) and a shared secret with a nonce. IIBES also provides authentication through IBS, and it can be used for securing data in federated learning.
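The escrow-elimination idea can be illustrated with a toy key-derivation sketch: the master secret is split between two sub-KGCs so that neither alone can derive a user's key, and a nonce is mixed in so that rotating it revokes previously derived keys. This is a conceptual stand-in, not the IIBES construction, and all names and primitives here are illustrative choices.

```python
import hashlib
import secrets

def derive_user_key(identity: bytes, share_a: bytes, share_b: bytes, nonce: bytes) -> bytes:
    """Neither sub-KGC alone holds the master secret: it is the XOR of the
    two shares, so no single party can escrow a user's key. Rotating the
    nonce invalidates old derivations (a crude revocation mechanism)."""
    master = bytes(a ^ b for a, b in zip(share_a, share_b))
    return hashlib.sha256(master + identity + nonce).digest()

share_a = secrets.token_bytes(32)   # held by sub-KGC 1
share_b = secrets.token_bytes(32)   # held by sub-KGC 2
nonce = secrets.token_bytes(16)

k1 = derive_user_key(b"alice@example.com", share_a, share_b, nonce)
k2 = derive_user_key(b"alice@example.com", share_a, share_b, nonce)
# Same identity and nonce reproduce the key; a fresh nonce yields a new one.
```

The design point is the split trust: compromising one SKGC reveals nothing about the master secret, which is what removes the single-KGC escrow problem in spirit.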

Details

International Journal of Pervasive Computing and Communications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1742-7371

Keywords

Article
Publication date: 6 June 2024

Prida Ariani Ambar Astuti, Antonius Widi Hardianto, M. Sarofi Sahrul Romadhon and Roel P. Hangsing

This study aims to examine the strategy of TV9 Nusantara, one of the local television stations in Indonesia, in marketing its religious programs when soap operas are the most popular…

Abstract

Purpose

This study aims to examine the strategy of TV9 Nusantara, one of the local television stations in Indonesia, in marketing its religious programs when soap operas are the most popular television programs in Indonesia.

Design/methodology/approach

This study used a descriptive qualitative method by collecting data using in-depth interviews, observation and documentation.

Findings

TV9 Nusantara used a counter-programming strategy to win viewers from competing television stations; its prime time is also scheduled differently from other stations, and it implements a head-sterling strategy to keep audiences loyal to TV9 Nusantara programs rather than switching channels.

Research limitations/implications

In Indonesia, there are three types of television stations: nationally broadcast, public or government-owned (central and regional) and local television. This study focused only on local television stations whose main programming is religious, especially Islamic.

Practical implications

The results of this study can underline the importance of establishing segmentation, targets, differentiation and market positioning as well as efforts to create products, prices, places and promotions for journalistic products, especially TV broadcast products and production processes that follow Sharia principles.

Social implications

This study can inform the public regarding TV Broadcasting products and production processes following Sharia principles.

Originality/value

This study examined the implementation of marketing strategies and the marketing mix on local television, especially television that broadcasts programs that are not the favorites of most viewers.

Details

Journal of Islamic Marketing, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1759-0833

Keywords

Article
Publication date: 19 May 2022

Priyanka Kumari Bhansali, Dilendra Hiran and Kamal Gulati

The purpose of this paper is to secure health data collection and transmission (SHDCT). In this system, a native network consists of portable smart devices that interact with…

Abstract

Purpose

The purpose of this paper is to secure health data collection and transmission (SHDCT). In this system, a native network consists of portable smart devices that interact with multiple gateways. It entails IoMT devices and wearables connecting to exchange sensitive data with a sensor node, which performs the aggregation process and then communicates the data via a Fog server. If the aggregator sensor loses its connection to the Fog server, it cannot submit data directly; instead, the node transmits encrypted information through a neighboring sensor to the Fog server integrated with federated learning, where the encrypted data is added to the existing data. The Fog server performs operations on the measured data, and the values are stored in the local storage area and later updated to the cloud server.

Design/methodology/approach

SHDCT uses an Internet of Things (IoT)-based monitoring network, making it possible for smart devices to connect and interact with each other. The main purpose of the monitoring network is to collect biological data and additional information from patients' mobile devices. The monitoring network is composed of three different types of smart devices that are at the heart of the IoT.

Findings

This work addresses how to design an architecture for safe data aggregation in heterogeneous IoT-federated learning-enabled wireless sensor networks (WSNs), using basic encoding and data aggregation methods. The authors suggest that the small gateway node (SGN) captures all of the sensed data from the smart devices and uses a simple, lightweight encoding scheme and cryptographic techniques to convey the data to the gateway node (GWN). The GWN receives all of the medical data from the SGN and ensures that the data is accurate and up to date. If the data is trustworthy, the medical data is aggregated and sent to the Fog server for further processing. The Java programming language is used to simulate and analyze the proposed SHDCT model for deployment and message initiation. When comparing the SHDCT scheme to the SPPDA and EHDA schemes, the results show that SHDCT performs significantly better: it requires 4.72% and 13.59% lower communication cost than EHDA and SPPDA, respectively; achieves 8.47% and 24.41% higher transmission ratios than EHDA and SPPDA, respectively; and retains 5.85% and 18.86% greater residual energy than EHDA and SPPDA, respectively, consuming less energy than the other transmission techniques.
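The SGN-to-GWN verify-then-aggregate flow described above can be sketched in miniature. This is an illustrative stand-in, not the paper's scheme: the keyed SHA-256 digest, the pre-shared key and the `heart_rate` field are all assumptions made for the example.

```python
import hashlib
import json

def encode_reading(reading: dict, key: bytes) -> dict:
    """SGN side: serialize the sensed reading and attach a keyed digest so
    the GWN can check integrity before aggregating (illustrative only)."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hashlib.sha256(key + payload).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def verify_and_aggregate(messages, key: bytes):
    """GWN side: keep only readings whose digest checks out, then aggregate
    the trustworthy data before forwarding it to the Fog server."""
    valid = []
    for msg in messages:
        expected = hashlib.sha256(key + msg["payload"].encode()).hexdigest()
        if expected == msg["tag"]:
            valid.append(json.loads(msg["payload"]))
    values = [r["heart_rate"] for r in valid]
    return sum(values) / len(values) if values else None

key = b"shared-sgn-gwn-key"   # assumed pre-shared between SGN and GWN
msgs = [encode_reading({"heart_rate": hr}, key) for hr in (72, 75, 78)]
msgs.append({"payload": '{"heart_rate": 999}', "tag": "forged"})  # tampered
avg = verify_and_aggregate(msgs, key)   # the forged message is dropped
```

Only the validated readings contribute to the aggregate, mirroring the paper's rule that data is aggregated only if it is found trustworthy.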

Originality/value

In the health care sector, a series of interconnected medical devices collect data using IoT networks. Preventive, predictive, personalized and participatory care is becoming increasingly popular, and safe data collection and transfer to a centralized server remain challenging. This study presents a mechanism for SHDCT. The mechanism consists of smart healthcare IoT devices working on federated learning that link up with one another to exchange health data. Health data is sensitive and needs to be exchanged securely and efficiently. In the mechanism, the sensing devices send data to an SGN. The SGN uses a lightweight encoding scheme and cryptographic techniques to communicate the data to the GWN. The GWN receives all the health data from the SGN and confirms that the data is valid. If the received data is reliable, the medical data is aggregated and transmitted to the Fog server for further processing. The performance parameters are compared with other systems in terms of communication cost, transmission ratio and energy use.

Details

International Journal of Pervasive Computing and Communications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1742-7371

Keywords

Article
Publication date: 22 June 2022

Shubangini Patil and Rekha Patil

Until now, a lot of research has been done and applied to provide security and deliver original data from one user to another, such as third-party auditing and several schemes for…

Abstract

Purpose

Until now, a great deal of research has been done and applied to provide security and deliver original data from one user to another, such as third-party auditing and several schemes for securing the data, such as key generation with the help of encryption algorithms like Rivest–Shamir–Adleman (RSA). Some of the related prior work is as follows. The RDCR scheme by Yan et al. (2017) is based on minimum bandwidth and enables a third party to perform public integrity verification. Although it supports repair management for corrupt data and tries to recover the original data, in practice it fails to do so, and thus it incurs more computation and communication cost than the proposed system. Chen et al. (2015) developed an idea for cloud storage data sharing using broadcast encryption. This technique aims to accomplish both broadcast data and dynamic sharing, allowing users to join and leave a group without affecting the EPK. The theoretical notion was sound and new, but the system's practicality and efficiency were not acceptable, and its security was also jeopardised because it proposed adding a member without altering any keys. Jiang and Guo (2017) investigated an identity-based encryption strategy for data sharing, along with key management and metadata techniques to improve model security. Forward and reverse ciphertext security is supplied, but the scheme is more difficult to put into practice, and one of its limitations is that it can only be used for very large amounts of cloud storage; it extends support for dynamic data modification through batch auditing.
The important feature of the secure and efficient privacy-preserving provable data possession scheme for cloud storage was to support every important feature, including data dynamics, privacy preservation, batch auditing and blockless verification, for an untrusted, outsourced storage model (Pathare and Chouragade, 2017). A homomorphic signature mechanism based on a new ID was devised to avoid the use of public key certificates; this signature system was shown to be resistant to the ID attack in the random oracle model and to forged-message attacks (Nayak and Tripathy, 2018; Lin et al., 2017). When storing data in a public cloud, one issue is that the data owner must give an enormous number of keys to the users in order for them to access the files. Here, the key-aggregate searchable encryption (KASE) scheme was publicly unveiled for the first time: while sharing a huge number of documents, the data owner simply has to supply a single key to the user, and the user only needs to provide a single trapdoor. Although the concept is innovative, the KASE technique does not apply to the increasingly common federated cloud. Cui et al. (2016) claim that as the amount of data grows, the distribution management system (DMS) will be unable to handle it. As a result, various provable data possession (PDP) schemes have been developed, yet practically all lack security; hence a certificate-based PDP built on bilinear pairing was introduced, which, being both robust and efficient, is mostly applicable in DMS. The main purpose of this research is to design and implement a secure cloud infrastructure for sharing group data, providing an efficient and secure protocol for multiple-user data in the cloud that allows many users to easily share data.

Design/methodology/approach

The methodology and contributions of this paper are as follows. The major goal of this study is to design and implement a secure cloud infrastructure for sharing group data, with an efficient and secure protocol for multiple-user data in the cloud that allows several users to share data without difficulty. The selection scheme design (SSD) comprises two algorithms: Algorithm 1 is designed for a limited number of users, and Algorithm 2 is redesigned for multiple users. Further, the authors design the SSD-security protocol, which comprises a three-phase model: Phase 1 generates the parameters and distributes the private keys, Phase 2 generates the general key for all available users and Phase 3 is designed to prevent dishonest users from taking part in data sharing.
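The three-phase flow can be sketched as a toy protocol skeleton. The group parameters, the multiplicative key derivation and the exclusion rule below are illustrative choices made for the example, not the authors' actual SSD construction, and the modulus is far too small for real use.

```python
import secrets

P = 0xFFFFFFFFFFFFFFC5   # 2**64 - 59, a prime modulus (demo only)
G = 5                    # generator choice for the sketch

def phase1_setup(n_users):
    """Phase 1: generate parameters and distribute one private key per user."""
    return [secrets.randbelow(P - 2) + 1 for _ in range(n_users)]

def phase2_group_key(private_keys):
    """Phase 2: derive a general key from every available user's share."""
    key = 1
    for x in private_keys:
        key = key * pow(G, x, P) % P
    return key

def phase3_exclude(private_keys, dishonest_index):
    """Phase 3: recompute the key without the dishonest user's share, so
    that user can no longer take part in data sharing."""
    remaining = [x for i, x in enumerate(private_keys) if i != dishonest_index]
    return phase2_group_key(remaining)

keys = phase1_setup(4)
gk = phase2_group_key(keys)          # general key over all four users
gk2 = phase3_exclude(keys, 2)        # user 2 is locked out of the group key
```

The structural point is that the general key is a function of every honest user's share, so dropping one share in Phase 3 yields a key the excluded user cannot compute.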

Findings

Data sharing in cloud computing provides unlimited computational resources and storage to enterprises and individuals; however, it also raises several privacy and security concerns, such as fault tolerance, reliability, confidentiality and data integrity. Furthermore, the key consensus mechanism is a fundamental cryptographic primitive for secure communication; motivated by this, the authors developed the SSD mechanism, which embraces multiple users in the data-sharing model.

Originality/value

Files shared in the cloud should be encrypted for security purposes; these files are later decrypted for the users to access. Furthermore, the key consensus process is a crucial cryptographic primitive for secure communication, and the authors' SSD mechanism accordingly incorporates numerous users in the data-sharing model. For evaluation of the SSD method, the authors considered an ideal system environment, using Java as the programming language and Eclipse as the integrated development environment (IDE) for the proposed model evaluation. The hardware configuration comprises 4 GB RAM and an i7 processor, and the PBC library is used for the pairing operations (PBC Library, 2022). In the evaluation, the number of users is varied for comparison with the existing RDIC methodology (Li et al., 2020). For the purposes of the SSD-security protocol, a prime number is chosen as the number of users in this work.

Details

International Journal of Pervasive Computing and Communications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1742-7371

Keywords

Article
Publication date: 10 July 2023

Md. Mehrab Hossain, Shakil Ahmed, S.M. Asif Anam, Irmatova Aziza Baxramovna, Tamanna Islam Meem, Md. Habibur Rahman Sobuz and Iffat Haq

Construction safety is a crucial aspect that has far-reaching impacts on economic development. But safety monitoring is often reliant on labor-based observations, which can be…

Abstract

Purpose

Construction safety is a crucial aspect that has far-reaching impacts on economic development. However, safety monitoring often relies on labor-based observations, which are prone to error and result in numerous fatalities annually. This study aims to address this issue by proposing a cloud-building information modeling (BIM)-based framework to provide real-time safety monitoring on construction sites, to enhance safety practices and reduce fatalities.

Design/methodology/approach

This system integrates an automated safety tracking mobile app to detect hazardous locations on construction sites, a cloud-based BIM system for visualization of worker tracking on a virtual construction site and a Web interface to visualize and monitor site safety.
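A hazardous-location check of the kind such a safety-tracking app might perform can be sketched as follows. The zone shapes, names and coordinates are assumptions made for the example, not details from the paper.

```python
from dataclasses import dataclass

@dataclass
class HazardZone:
    """An axis-aligned rectangular hazard area on the site plan (a
    simplifying assumption; real zones could be arbitrary polygons)."""
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def check_worker(x, y, zones):
    """Return the names of hazard zones the worker is currently inside,
    e.g. to raise a real-time alert on the monitoring Web interface."""
    return [z.name for z in zones if z.contains(x, y)]

zones = [HazardZone("crane radius", 0, 0, 10, 10),
         HazardZone("open shaft", 8, 8, 12, 12)]
alerts = check_worker(9.0, 9.0, zones)   # position inside both zones
```

Each position update from the mobile app would run a check like this, with hits pushed to the cloud-BIM visualization for the safety officer.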

Findings

The study’s results indicate that implementing a comprehensive automated safety monitoring approach is feasible and suitable for general indoor construction site environments. Furthermore, the assessment of an advanced safety monitoring system has been successfully implemented, indicating its potential effectiveness in enhancing safety practices in construction sites.

Practical implications

By using this system, the construction industry can prevent accidents and fatalities, promote the adoption of new technologies and methods with minimal effort and cost and improve safety outcomes and productivity. This system can reduce workers’ compensation claims, insurance costs and legal penalties, benefiting all stakeholders involved.

Originality/value

To the best of the authors’ knowledge, this study represents the first attempt in Bangladesh to develop a mobile app-based technological solution aimed at reforming construction safety culture by using BIM technology. This has the potential to change the construction sector’s attitude toward accepting new technologies and cultures through its convenient choice of equipment.

Details

Construction Innovation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1471-4175

Keywords

Article
Publication date: 29 March 2024

Sihao Li, Jiali Wang and Zhao Xu

The compliance checking of Building Information Modeling (BIM) models is crucial throughout the lifecycle of construction. The increasing amount and complexity of information…

Abstract

Purpose

The compliance checking of Building Information Modeling (BIM) models is crucial throughout the lifecycle of construction. The increasing amount and complexity of information carried by BIM models have made compliance checking more challenging, and manual methods are prone to errors. Therefore, this study aims to propose an integrative conceptual framework for automated compliance checking of BIM models, allowing for the identification of errors within BIM models.

Design/methodology/approach

This study first analyzed the typical building standards in the fields of architecture and fire protection, and an ontology of these elements was developed. Based on this, a building standard corpus was built, and deep learning models were trained to automatically label the building standard texts. Neo4j is utilized for knowledge graph construction and storage, and a data extraction method based on Dynamo is designed to obtain checking data files. After that, a matching algorithm is devised to express the logical rules of the knowledge graph triples, resulting in automated compliance checking for BIM models.
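The triple-matching idea can be illustrated in miniature: each rule is expressed as a (subject, predicate, object) triple and matched against data extracted from the BIM model. The rule contents, element fields and comparison logic below are invented examples, not the paper's rule set or algorithm.

```python
# Rules as knowledge-graph-style triples: (subject type, predicate, required value).
rules = [
    ("FireDoor", "min_width_mm", 900),
    ("Corridor", "min_width_mm", 1200),
]

# Simplified stand-in for data extracted from the BIM model via Dynamo.
bim_elements = [
    {"type": "FireDoor", "width_mm": 850},
    {"type": "FireDoor", "width_mm": 950},
    {"type": "Corridor", "width_mm": 1500},
]

def check_compliance(elements, rules):
    """Match each rule triple against the extracted elements and collect
    violations as (subject, actual value, required minimum)."""
    violations = []
    for subject, predicate, required in rules:
        for el in elements:
            if el["type"] == subject and el["width_mm"] < required:
                violations.append((subject, el["width_mm"], required))
    return violations

violations = check_compliance(bim_elements, rules)  # one undersized fire door
```

The framework's contribution is automating the construction of such rules from standard texts (via the labelled corpus and knowledge graph), whereas here the rule list is hand-written for illustration.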

Findings

Case validation results showed that this theoretical framework can achieve the automatic construction of domain knowledge graphs and automatic checking of BIM model compliance. Compared with traditional methods, this method has a higher degree of automation and portability.

Originality/value

This study introduces knowledge graphs and natural language processing technology into the field of BIM model checking and completes the automated process of constructing domain knowledge graphs and checking BIM model data. Its functionality and usability are validated through two case studies on a self-developed BIM checking platform.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Keywords

Article
Publication date: 8 December 2022

Deden Sumirat Hidayat, Dana Indra Sensuse, Damayanti Elisabeth and Lintang Matahari Hasani

Study on knowledge-based systems for scientific publications is growing very broadly. However, most of these studies do not explicitly discuss the knowledge management (KM…

Abstract

Purpose

Research on knowledge-based systems for scientific publications is growing very broadly. However, most of these studies do not explicitly discuss the knowledge management (KM) components of knowledge management system (KMS) implementation. This background causes academic institutions to face challenges in developing KMS to support the scholarly publication cycle (SPC). Therefore, this study aims to develop a new KMS conceptual model, identify critical components and provide research gap opportunities for future KM studies on SPC.

Design/methodology/approach

This study used a systematic literature review (SLR) method with the procedure from Kitchenham et al. Then, the SLR results are compiled into a conceptual model design based on a framework on KM foundations and KM solutions. Finally, the model design was validated through interviews with related field experts.

Findings

The KMS for SPC focuses on the discovery, sharing and application of knowledge. The majority of KMS use recommendation system technology with content-based filtering and collaborative filtering personalization approaches. The data used in KMS for SPC are both structured and unstructured. Metadata and article abstracts are considered sufficiently representative of the entire article content to serve as a search tool and to provide recommendations. The KMS model for SPC has layers of KM infrastructure, processes, systems, strategies, outputs and outcomes.

Research limitations/implications

This study is limited in its discussion of tacit knowledge, even though tacit knowledge is essential for scientific publication performance. Such tacit knowledge includes experience in searching, writing, submitting, publishing and disseminating scientific publications, and it plays a vital role in the development of knowledge sharing systems (KSS) and KCS. Therefore, KSS and KCS for SPC remain very challenging topics for future research. KMS opportunities that might be developed further include lessons-learned databases and interactive forums that capture tacit knowledge about SPC. Future work could also identify other types of KMS in academia and focus more on SPC.

Originality/value

This study proposes a novel comprehensive KMS model to support scientific publication performance. This model has a critical path as a KMS implementation solution for SPC. This model proposes and recommends appropriate components for SPC requirements (KM processes, technology, methods/techniques and data). This study also proposes novel research gaps as KMS research opportunities for SPC in the future.

Details

VINE Journal of Information and Knowledge Management Systems, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2059-5891

Keywords

Article
Publication date: 9 August 2024

Mohammad Reza Jalilvand and Hamed Ghasemi

Augmented reality (AR) is revolutionizing the tourism and hospitality industry by offering immersive experiences as well as creating more engaging, informative and accessible…

Abstract

Purpose

Augmented reality (AR) is revolutionizing the tourism and hospitality industry by offering immersive experiences as well as creating more engaging, informative and accessible travel experiences that attract tourists from around the globe. From virtual tours and immersive historical site recreations to navigation assistance and cultural education, AR technology is transforming the way we explore and interact with the destinations. This study aims to identify benefits, risks, tools and techniques of AR in the tourism and hospitality literature.

Design/methodology/approach

The authors conducted a systematic literature review to answer six research questions. They identified 33 primary studies, dated from January 2010 to February 2024, and coded them via thematic analysis. Related studies were obtained by searching Web of Science and Scopus.

Findings

The results identified nine themes for benefits, eight themes for risks/disadvantages and four tools- and applications-related themes. Through the thematic analysis, the major benefits of AR in tourism and hospitality were found to be differentiated travel experiences, improved performance of the tourism value chain, more effective marketing efforts of tourism businesses, enhanced tourists' engagement, enhanced performance of tourism destinations, stimulated behavioral intentions, tourist empowerment and the provision of more value, interactivity and integrity. Furthermore, eight risks were identified: physical; privacy and security; social; service failure; technical; psychological; managerial; and information and knowledge gaps. The authors also recognized four tools- and applications-related themes, namely, AR-enabled tools, AR applications, AR-enabled apps and AR-based techniques.

Originality/value

To the best of the authors’ knowledge, this review provides the first systematic exploration of the existing literature on usage of AR in the context of tourism and hospitality value chain.

Details

Journal of Science and Technology Policy Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2053-4620

Keywords

Article
Publication date: 2 April 2024

Farjam Eshraghian, Najmeh Hafezieh, Farveh Farivar and Sergio de Cesare

The applications of Artificial Intelligence (AI) in various areas of professional and knowledge work are growing. Emotions play an important role in how users incorporate a…

Abstract

Purpose

The applications of Artificial Intelligence (AI) in various areas of professional and knowledge work are growing. Emotions play an important role in how users incorporate a technology into their work practices. The current study draws on work in the areas of AI-powered technologies adaptation, emotions, and the future of work, to investigate how knowledge workers feel about adopting AI in their work.

Design/methodology/approach

We gathered 107,111 tweets about the new AI programmer, GitHub Copilot, launched by GitHub and analysed the data in three stages. First, after cleaning and filtering the data, we applied the topic modelling method to analyse 16,130 tweets posted by 10,301 software programmers to identify the emotions they expressed. Then, we analysed the outcome topics qualitatively to understand the stimulus characteristics driving those emotions. Finally, we analysed a sample of tweets to explore how emotional responses changed over time.
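The filtering and labelling stage of such a pipeline can be sketched in a toy form: keep tweets that mention the tool, then tag them against a small emotion lexicon. The lexicon, keyword and emotion categories below are illustrative stand-ins; the study itself used topic modelling over the full corpus, not a hand-built lexicon.

```python
import re
from collections import Counter

# Tiny illustrative lexicon keyed by three of the paper's emotion categories.
EMOTION_LEXICON = {
    "achievement": {"love", "awesome", "productive"},
    "scepticism": {"doubt", "wrong", "unreliable"},
    "deterrence": {"scared", "replace", "threat"},
}

def label_tweets(tweets, keyword="copilot"):
    """Filter tweets mentioning the keyword, then count which emotion
    categories each surviving tweet's vocabulary overlaps with."""
    counts = Counter()
    for tweet in tweets:
        text = tweet.lower()
        if keyword not in text:
            continue                      # the filtering step
        words = set(re.findall(r"[a-z']+", text))
        for emotion, vocab in EMOTION_LEXICON.items():
            if words & vocab:
                counts[emotion] += 1
    return counts

tweets = [
    "Copilot is awesome, so productive today!",
    "I doubt Copilot output, often wrong.",
    "Will Copilot replace us? A bit scared.",
    "Nice weather today.",                 # filtered out: no keyword
]
counts = label_tweets(tweets)
```

In the actual study, the categories emerge from topic modelling rather than a fixed lexicon, but the filter-then-label shape of the computation is the same.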

Findings

We found six categories of emotions among software programmers: challenge, achievement, loss, deterrence, scepticism, and apathy. In addition, we found these emotions were driven by four stimulus characteristics: AI development, AI functionality, identity work, and AI engagement. We also examined the change in emotions over time. The results indicate that negative emotions changed to more positive emotions once software programmers redirected their attention to the AI programmer's capabilities and functionalities, and related that to their identity work.

Practical implications

Overall, as organisations start adopting AI-powered technologies in their software development practices, our research offers practical guidance to managers by identifying factors that can change negative emotions to positive emotions.

Originality/value

Our study makes a timely contribution to the discussions on AI and the future of work through the lens of emotions. In contrast to nascent discussions on the role of AI in high-skilled jobs that show knowledge workers' general ambivalence towards AI, we find knowledge workers show more positive emotions over time and as they engage more with AI. In addition, this study unveils the role of professional identity in leading to more positive emotions towards AI, as knowledge workers view such technology as a means of expanding their identity rather than as a threat to it.

Details

Information Technology & People, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0959-3845

Keywords
