Search results

1 – 10 of 11
Article
Publication date: 6 November 2023

Muneza Kagzi, Sayantan Khanra and Sanjoy Kumar Paul

From a technological determinist perspective, machine learning (ML) may significantly contribute towards sustainable development. The purpose of this study is to synthesize prior…

Abstract

Purpose

From a technological determinist perspective, machine learning (ML) may significantly contribute towards sustainable development. The purpose of this study is to synthesize prior literature on the role of ML in promoting sustainability and to encourage future inquiries.

Design/methodology/approach

This study conducts a systematic review of 110 papers that demonstrate the utilization of ML in the context of sustainable development.

Findings

ML techniques may play a vital role in enabling sustainable development by leveraging data to uncover patterns and facilitate the prediction of various variables, thereby aiding in decision-making processes. Through the synthesis of findings from prior research, it is evident that ML may help in achieving many of the United Nations’ sustainable development goals.

Originality/value

This study represents one of the initial investigations that conducted a comprehensive examination of the literature concerning ML’s contribution to sustainability. The analysis revealed that the research domain is still in its early stages, indicating a need for further exploration.

Details

Journal of Systems and Information Technology, vol. 25 no. 4
Type: Research Article
ISSN: 1328-7265

Keywords

Article
Publication date: 26 January 2024

Merly Thomas and Meshram B.B.

Denial-of-service (DoS) attacks gain unauthorized access to network services and user information by generating traffic that issues many simultaneous requests…

Abstract

Purpose

Denial-of-service (DoS) attacks gain unauthorized access to network services and user information by generating traffic that issues many simultaneous requests, making the system unavailable to its users. Protecting internet services requires effective DoS attack detection that monitors traffic passing across protected networks, freeing the protected internet servers from surveillance threats so they can focus on offering high-quality services with the shortest possible response times.

Design/methodology/approach

This paper aims to develop a hybrid optimization-based deep learning model to precisely detect DoS attacks.

Findings

The designed Aquila deer hunting optimization-enabled deep belief network technique achieved improved performance, with an accuracy of 92.8%, a true positive rate of 92.8% and a true negative rate of 93.6%.
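The reported metrics follow directly from binary confusion-matrix counts. The sketch below uses hypothetical counts chosen to reproduce the abstract's rates (not the paper's actual confusion matrix) and shows how accuracy, true positive rate and true negative rate are derived:

```python
# Accuracy, true positive rate (sensitivity) and true negative rate
# (specificity) derived from binary confusion-matrix counts.
def rates(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    tpr = tp / (tp + fn)  # share of attacks correctly flagged
    tnr = tn / (tn + fp)  # share of benign traffic correctly passed
    return accuracy, tpr, tnr

# Hypothetical counts consistent with the reported 92.8% TPR and 93.6% TNR.
acc, tpr, tnr = rates(tp=464, fp=32, tn=468, fn=36)
```

Note that accuracy alone can mask class imbalance, which is why intrusion-detection work typically reports TPR and TNR alongside it.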

Originality/value

The introduced approach effectively detects DoS attacks in internet traffic.

Details

International Journal of Web Information Systems, vol. 20 no. 1
Type: Research Article
ISSN: 1744-0084

Keywords

Article
Publication date: 21 February 2024

Nehal Elshaboury, Tarek Zayed and Eslam Mohammed Abdelkader

Water pipes degrade over time due to a variety of pipe-related, soil-related, operational and environmental factors. Hence, municipalities need to implement effective…

Abstract

Purpose

Water pipes degrade over time due to a variety of pipe-related, soil-related, operational and environmental factors. Hence, municipalities need to implement effective maintenance and rehabilitation strategies for water pipes, based on reliable deterioration models and cost-effective inspection programs. In light of the foregoing, the primary objective of this research study is to develop condition assessment and deterioration prediction models for saltwater pipes in Hong Kong.

Design/methodology/approach

As a prerequisite to the development of condition assessment models, the spherical fuzzy analytic hierarchy process (SFAHP) is harnessed to analyze the relative importance weights of deterioration factors. Afterward, the relative importance weights of deterioration factors, coupled with their effective values, are leveraged using the measurement of alternatives and ranking according to the compromise solution (MARCOS) algorithm to analyze the performance condition of water pipes. A condition rating system is then designed based on the generalized entropy-based probabilistic fuzzy C-means (GEPFCM) algorithm. A set of fourth-order multiple regression functions is constructed to capture the degradation trends in pipeline condition over time, covering their disparate characteristics.
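The MARCOS step can be illustrated with a minimal sketch of its common formulation: alternatives are scored against an ideal (AI) and anti-ideal (AAI) solution and ranked by a combined utility. The pipe data, criteria and weights below are hypothetical, not the paper's:

```python
# Minimal MARCOS sketch (common formulation): rank alternatives by
# utility relative to an ideal (AI) and anti-ideal (AAI) solution.
def marcos(matrix, weights, benefit):
    cols = list(zip(*matrix))
    ai = [max(c) if b else min(c) for c, b in zip(cols, benefit)]
    aai = [min(c) if b else max(c) for c, b in zip(cols, benefit)]

    def s(row):  # weighted sum of scores normalized against the ideal
        total = 0.0
        for x, ref, w, b in zip(row, ai, weights, benefit):
            total += w * (x / ref if b else ref / x)
        return total

    s_ai, s_aai = s(ai), s(aai)
    scores = []
    for row in matrix:
        k_minus, k_plus = s(row) / s_aai, s(row) / s_ai
        f_plus = k_minus / (k_plus + k_minus)   # utility w.r.t. AI
        f_minus = k_plus / (k_plus + k_minus)   # utility w.r.t. AAI
        scores.append((k_plus + k_minus) /
                      (1 + (1 - f_plus) / f_plus + (1 - f_minus) / f_minus))
    return scores

# Hypothetical pipes scored on [age (cost), corrosivity (cost),
# capacity (benefit)]; lower age and corrosivity are better.
pipes = [[10, 0.2, 0.9], [40, 0.8, 0.5], [25, 0.5, 0.7]]
scores = marcos(pipes, weights=[0.4, 0.35, 0.25],
                benefit=[False, False, True])
best = scores.index(max(scores))  # pipe 0 dominates on every criterion
```

Here pipe 0 is best on all three criteria, so it should rank first regardless of the weights.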

Findings

Analytical results demonstrated that the most influential deterioration factors comprise age, material, traffic and soil corrosivity. In addition, the developed deterioration models achieved a correlation coefficient, mean absolute error and root mean squared error of 0.8, 1.33 and 1.39, respectively.
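The reported error metrics can be computed as follows; the observed and predicted condition ratings below are hypothetical, for illustration only:

```python
import math

# Mean absolute error and root mean squared error, as used to evaluate
# the deterioration models, on hypothetical condition ratings.
def mae(y, yhat):
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def rmse(y, yhat):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

observed = [5.0, 4.2, 3.1, 2.4, 1.8]    # hypothetical field ratings
predicted = [4.6, 4.0, 3.5, 2.1, 2.0]   # hypothetical model output
err_mae, err_rmse = mae(observed, predicted), rmse(observed, predicted)
```

RMSE penalizes large individual errors more heavily than MAE, which is why the two are usually reported together, as in the abstract.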

Originality/value

It can be argued that generated deterioration models can assist municipalities in formulating accurate and cost-effective maintenance, repair and rehabilitation programs.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Keywords

Article
Publication date: 28 February 2023

Tulsi Pawan Fowdur, M.A.N. Shaikh Abdoolla and Lokeshwar Doobur

The purpose of this paper is to perform a comparative analysis of the delay associated in running two real-time machine learning-based applications, namely, a video quality…

Abstract

Purpose

The purpose of this paper is to perform a comparative analysis of the delay associated in running two real-time machine learning-based applications, namely, a video quality assessment (VQA) and a phishing detection application by using the edge, fog and cloud computing paradigms.

Design/methodology/approach

The VQA algorithm was developed using Android Studio and run on a mobile phone for the edge paradigm. For the fog paradigm, it was hosted on a Java server and for the cloud paradigm on the IBM and Firebase clouds. The phishing detection algorithm was embedded into a browser extension for the edge paradigm. For the fog paradigm, it was hosted on a Node.js server and for the cloud paradigm on Firebase.
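The comparison across paradigms ultimately comes down to timing the same task under different network/compute trade-offs. A minimal sketch, with stub sleeps standing in for real network hops and model inference (all figures hypothetical):

```python
import time

# Hypothetical timing harness: measure end-to-end response time of the
# same task deployed as an edge (local), fog (nearby server) or cloud
# call. Sleeps stand in for real network and compute delays.
def respond(network_delay_s, compute_s):
    time.sleep(network_delay_s + compute_s)

def measure(network_delay_s, compute_s, runs=3):
    start = time.perf_counter()
    for _ in range(runs):
        respond(network_delay_s, compute_s)
    return (time.perf_counter() - start) / runs

# Toy numbers: edge has no network hop but slower compute;
# cloud has fast compute but pays the network latency.
edge = measure(network_delay_s=0.0, compute_s=0.01)
cloud = measure(network_delay_s=0.05, compute_s=0.001)
```

With these toy numbers latency dominates and the edge wins, matching the phishing-detection result; for a compute-heavy task like VQA the compute term would dominate instead and the ordering would flip.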

Findings

For the VQA algorithm, the edge paradigm had the highest response time while the cloud paradigm had the lowest, as the algorithm was computationally intensive. For the phishing detection algorithm, the edge paradigm had the lowest response time and the cloud paradigm the highest, as the algorithm had low computational complexity. Since the determining factor for the response time was the latency, the edge paradigm provided the smallest delay, as all processing was local.

Research limitations/implications

The main limitation of this work is that the experiments were performed on a small scale due to time and budget constraints.

Originality/value

A detailed analysis with real applications has been provided to show how the complexity of an application can determine the best computing paradigm on which it can be deployed.

Details

International Journal of Pervasive Computing and Communications, vol. 20 no. 1
Type: Research Article
ISSN: 1742-7371

Keywords

Article
Publication date: 3 November 2022

Vinod Nistane

Rolling element bearings (REBs) are commonly used in rotating machinery such as pumps, motors, fans and other machines. REBs deteriorate over their life cycle. To know the…

Abstract

Purpose

Rolling element bearings (REBs) are commonly used in rotating machinery such as pumps, motors, fans and other machines. REBs deteriorate over their life cycle. To estimate the amount of deterioration at any time, this paper aims to present a prognostics approach based on integrating an optimized health indicator (OHI) with machine learning algorithms.

Design/methodology/approach

The proposed optimum prediction model is used to evaluate the remaining useful life (RUL) of REBs. Initially, the raw signal data are preprocessed through a mother wavelet transform, after which the primary fault features are extracted. These features are then processed with the random forest algorithm to improve their clarity. Based on the variable importance of the features, the best representation of the fault features is selected. The selected features are optimized by adjusting the weight vector using optimization techniques such as the genetic algorithm (GA), sequential quadratic optimization (SQO) and multiobjective optimization (MOO). New OHIs are determined and applied to train the network. Finally, optimum predictive models are developed by integrating the OHI with an artificial neural network (ANN) and K-means clustering (KMC) (i.e. OHI–GA–ANN, OHI–SQO–ANN, OHI–MOO–ANN, OHI–GA–KMC, OHI–SQO–KMC and OHI–MOO–KMC).
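The core idea of an optimized health indicator can be sketched as fusing weighted fault features into a single indicator and extrapolating its trend to a failure threshold. The toy example below uses a linear trend and hypothetical weights and features; it does not reproduce the paper's wavelet features, GA/SQO/MOO tuning or ANN/KMC models:

```python
# Hypothetical sketch: fuse weighted fault features into a health
# indicator (HI) and extrapolate its linear trend to a failure
# threshold to estimate remaining useful life (RUL).
def health_indicator(features, weights):
    return sum(f * w for f, w in zip(features, weights))

def estimate_rul(hi_history, threshold):
    # Fit HI(t) = a*t + b by least squares, solve for the crossing time.
    n = len(hi_history)
    t = list(range(n))
    t_mean = sum(t) / n
    hi_mean = sum(hi_history) / n
    a = (sum((ti - t_mean) * (h - hi_mean) for ti, h in zip(t, hi_history))
         / sum((ti - t_mean) ** 2 for ti in t))
    b = hi_mean - a * t_mean
    t_fail = (threshold - b) / a   # time when HI reaches the threshold
    return t_fail - (n - 1)        # cycles left after the last sample

weights = [0.7, 0.3]               # in the paper, tuned by GA/SQO/MOO
hi = [health_indicator(f, weights) for f in
      [[0.1, 0.2], [0.2, 0.4], [0.3, 0.6], [0.4, 0.8]]]
rul = estimate_rul(hi, threshold=1.0)
```

Real degradation trends are rarely linear, which is why the paper trains ANN and clustering models on the OHI instead of a straight-line fit.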

Findings

The performance of the optimum prediction models is recorded and compared with the actual values. Finally, based on the error values, the best optimum prediction model is proposed for evaluating the RUL of REBs.

Originality/value

The proposed OHI–GA–KMC model is compared in terms of error values with previously published work. The error of the RUL predicted by the OHI–GA–KMC model is smaller, demonstrating the advantage of this method.

Article
Publication date: 21 November 2023

Armin Mahmoodi, Leila Hashemi and Milad Jasemi

In this study, the central objective is to foresee stock market signals with the use of a proper structure to achieve the highest accuracy possible. For this purpose, three hybrid…

Abstract

Purpose

In this study, the central objective is to foresee stock market signals using a proper structure to achieve the highest possible accuracy. For this purpose, three hybrid models have been developed for the stock markets, each combining a support vector machine (SVM) with one of the meta-heuristic algorithms of particle swarm optimization (PSO), imperialist competition algorithm (ICA) and genetic algorithm (GA). All the analyses are technical and are based on the Japanese candlestick model.

Design/methodology/approach

Based on the results achieved, the most suitable algorithm is chosen to anticipate sell and buy signals. Moreover, the authors have compared the results of the model validations in this study with basic models from three articles published in past years. First, SVM is examined with PSO, which is used as a classification agent to search the problem-solving space precisely and at a faster pace. In the second model, SVM and ICA are applied to stock market timing, with ICA used as an optimization agent for the SVM parameters. Finally, in the third model, SVM and GA are studied, where GA acts as an optimizer and feature selection agent.
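The role of GA (and similarly PSO or ICA) as an optimizer for SVM parameters can be sketched with a generic genetic-algorithm loop. The fitness function here is a hypothetical stand-in for cross-validated accuracy with an assumed optimum; a real application would evaluate an actual SVM:

```python
import random

# Generic genetic-algorithm loop of the kind used to tune SVM
# hyperparameters (C, gamma). cv_score is a hypothetical stand-in for
# cross-validated accuracy; plug in a real SVM evaluation in practice.
def cv_score(c, gamma):
    # Toy fitness with a single assumed optimum at C=10, gamma=0.1.
    return 1.0 / (1.0 + (c - 10.0) ** 2 + (gamma - 0.1) ** 2)

def ga_tune(fitness, pop_size=20, generations=40, seed=0):
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 100.0), rng.uniform(0.001, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(*p), reverse=True)
        parents = pop[: pop_size // 2]        # selection: keep the fittest
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            c = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # crossover: average
            if rng.random() < 0.3:                      # mutation: jitter
                c = (c[0] * rng.uniform(0.8, 1.2),
                     c[1] * rng.uniform(0.8, 1.2))
            children.append(c)
        pop = parents + children
    return max(pop, key=lambda p: fitness(*p))

best_c, best_gamma = ga_tune(cv_score)
```

PSO and ICA would replace the selection/crossover/mutation loop with velocity updates or imperialist competition, but the outer structure, repeatedly scoring candidate (C, gamma) pairs, is the same.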

Findings

As per the results, all new models can predict accurately for only six days; however, comparing the confusion matrix results, the SVM-GA and SVM-ICA models correctly predicted more sell signals, and the SVM-PSO model correctly predicted more buy signals. Overall, SVM-ICA showed better performance than the other models in executing the implemented models.

Research limitations/implications

In this study, stock market data for the years 2013–2021 were analyzed; the long timeframe makes the input data analysis challenging, as the data must be moderated with respect to the conditions under which they changed.

Originality/value

In this study, two methods have been developed in a candlestick model; they are raw-based and signal-based approaches in which the hit rate is determined by the percentage of correct evaluations of the stock market for a 16-day period.

Details

EuroMed Journal of Business, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1450-2194

Keywords

Article
Publication date: 6 August 2024

Suhanom Mohd Zaki, Saifudin Razali, Mohd Aidil Riduan Awang Kader, Mohd Zahid Laton, Maisarah Ishak and Norhapizah Mohd Burhan

Many studies have examined pre-diploma students' backgrounds and academic performance with results showing that some did not achieve the expected level of competence. This study…

Abstract

Purpose

Many studies have examined pre-diploma students' backgrounds and academic performance with results showing that some did not achieve the expected level of competence. This study aims to examine the relationship between students’ demographic characteristics and their academic achievement at the pre-diploma level using machine learning.

Design/methodology/approach

Secondary data analysis was used in this study, which involved collecting information about 1,052 pre-diploma students enrolled at Universiti Teknologi MARA (UiTM) Pahang Branch between 2017 and 2021. The research procedure was divided into two parts: data collection and pre-processing, and building, training and testing the machine learning algorithm.

Findings

Gender, family income, region and achievement in the national secondary school examination (Sijil Pelajaran Malaysia [SPM]) predict academic performance. Female students were 1.2 times more likely to succeed academically. Students from the central region performed better, with an odds ratio of 1.26. Students from M40-income families were more likely to excel, with an odds ratio of 2.809. Students who excelled in SPM English and Mathematics had a better likelihood of succeeding in higher education.
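The reported figures are odds ratios, which relate to logistic-regression coefficients via OR = exp(β). The coefficients below are hypothetical, back-derived only to reproduce the abstract's ratios:

```python
import math

# How reported odds ratios relate to logistic-regression coefficients:
# OR = exp(beta). Betas here are hypothetical, chosen to reproduce the
# abstract's reported ratios.
def odds_ratio(beta):
    return math.exp(beta)

betas = {"female": math.log(1.2),
         "central_region": math.log(1.26),
         "m40_income": math.log(2.809)}
ratios = {k: odds_ratio(b) for k, b in betas.items()}
# An odds ratio above 1 means the factor raises the odds of success;
# e.g. M40-income students' odds are about 2.8 times the baseline.
```

Note that an odds ratio is not a probability ratio; "1.2 times more likely" in such abstracts typically refers to the odds, not the raw success rate.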

Research limitations/implications

This research was limited to pre-diploma students from UiTM Pahang Branch. For better generalizability of the results, future research should include pre-diploma students from other UiTM branches that offer this programme.

Practical implications

This study is expected to offer insights for policymakers, particularly, the Ministry of Higher Education, in developing a comprehensive policy to improve the tertiary education system by focusing on the fourth Sustainable Development Goal.

Social implications

These pre-diploma students were found to originate mainly from low- or middle-income families; hence, the programme may help them acquire better jobs and improve their standard of living. Most students enrolling on the pre-diploma performed below excellent at the secondary school level and were therefore given the opportunity to continue studying at a higher level.

Originality/value

This predictive model contributes to guidelines on the minimum requirements for pre-diploma students to gain admission into higher education institutions by ensuring the efficient distribution of resources and equal access to higher education among all communities.

Details

Kybernetes, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0368-492X

Keywords

Article
Publication date: 17 April 2024

Jahanzaib Alvi and Imtiaz Arif

The crux of this paper is to unveil efficient features and practical tools that can predict credit default.

Abstract

Purpose

The crux of this paper is to unveil efficient features and practical tools that can predict credit default.

Design/methodology/approach

Annual data of non-financial listed companies were taken from 2000 to 2020, along with 71 financial ratios. The dataset was bifurcated into three panels with three default assumptions. Logistic regression (LR) and k-nearest neighbor (KNN) binary classification algorithms were used to estimate credit default in this research.
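KNN classification as used here can be sketched in a few lines: a query firm is labeled by a majority vote of its k nearest neighbors in feature space. The financial-ratio vectors below are hypothetical:

```python
import math
from collections import Counter

# Minimal k-nearest-neighbor classifier of the kind used in the study,
# shown on tiny hypothetical financial-ratio vectors
# (label 1 = default, 0 = non-default).
def knn_predict(train, query, k=3):
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [([0.1, 0.9], 0), ([0.2, 0.8], 0), ([0.15, 0.85], 0),
         ([0.9, 0.1], 1), ([0.8, 0.2], 1), ([0.85, 0.15], 1)]
pred = knn_predict(train, [0.12, 0.88])  # near the non-default cluster
```

Unlike logistic regression, KNN fits no coefficients at all; with 71 ratios, feature scaling and the choice of k largely determine its accuracy.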

Findings

The study’s findings revealed that the features used in Model 3 (Case 3) were comparatively the most efficient. The results also showed that KNN achieved higher accuracy than LR, demonstrating KNN's superior performance over LR in this setting.

Research limitations/implications

Using only two classifiers limits this research with respect to a comprehensive comparison of results; moreover, this research was based only on financial data, which leaves sizeable room for including non-financial parameters in default estimation. Both limitations may be directions for future research in this domain.

Originality/value

This study introduces efficient features and tools for credit default prediction using financial data, demonstrating KNN’s superior accuracy over LR and suggesting future research directions.

Details

Kybernetes, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0368-492X

Keywords

Article
Publication date: 14 August 2023

Usman Tariq, Ranjit Joy, Sung-Heng Wu, Muhammad Arif Mahmood, Asad Waqar Malik and Frank Liou

This study aims to discuss the state-of-the-art digital factory (DF) development combining digital twins (DTs), sensing devices, laser additive manufacturing (LAM) and subtractive…

Abstract

Purpose

This study aims to discuss the state-of-the-art digital factory (DF) development combining digital twins (DTs), sensing devices, laser additive manufacturing (LAM) and subtractive manufacturing (SM) processes. The current shortcomings and outlook of the DF also have been highlighted. A DF is a state-of-the-art manufacturing facility that uses innovative technologies, including automation, artificial intelligence (AI), the Internet of Things, additive manufacturing (AM), SM, hybrid manufacturing (HM), sensors for real-time feedback and control, and a DT, to streamline and improve manufacturing operations.

Design/methodology/approach

This study presents a novel perspective on DF development using laser-based AM, SM, sensors and DTs. Recent developments in laser-based AM, SM, sensors and DTs have been compiled. This study has been developed following the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines, discussing literature on DTs for laser-based AM, particularly laser powder bed fusion and direct energy deposition, in-situ monitoring and control equipment, SM and HM. The principal goal of this study is to highlight the aspects of the DF and its development using existing techniques.

Findings

A comprehensive literature review finds a substantial lack of complete techniques that incorporate cyber-physical systems, advanced data analytics, AI, standardized interoperability, human–machine cooperation and scalable adaptability. The suggested DF effectively fills this void by integrating cyber-physical system components, including DT, AM, SM and sensors into the manufacturing process. Using sophisticated data analytics and AI algorithms, the DF facilitates real-time data analysis, predictive maintenance, quality control and optimal resource allocation. In addition, the suggested DF ensures interoperability between diverse devices and systems by emphasizing standardized communication protocols and interfaces. The modular and adaptable architecture of the DF enables scalability and adaptation, allowing for rapid reaction to market conditions.

Originality/value

Based on the need for a DF, this review presents a comprehensive approach to DF development using DTs, sensing devices, LAM and SM processes, and reports current progress in this domain.

Article
Publication date: 12 December 2023

Niveen Badra, Hosam Hegazy, Mohamed Mousa, Jiansong Zhang, Sharifah Akmam Syed Zakaria, Said Aboul Haggag and Ibrahim Abdul-Rashied

This research aims to create a methodology that integrates optimization techniques into preliminary cost estimates and predicts the impacts of design alternatives of steel…

Abstract

Purpose

This research aims to create a methodology that integrates optimization techniques into preliminary cost estimates and predicts the impacts of design alternatives for steel pedestrian bridges (SPBs). The cost estimation process uses two main parameters, weight and cost, and the main goal is to create a cost estimation model.

Design/methodology/approach

This study explores a flexible model design that uses computing capabilities for decision-making. Using cost optimization techniques, the model can select an optimal pedestrian bridge system based on multiple criteria that may change independently. This research focuses on four types of SPB systems prevalent in Egypt and worldwide. The study also develops a computerized cost and weight optimization model that enables decision-makers to select the optimal system for SPBs that meets the criteria established for that system.
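The selection step of such a model can be sketched by enumerating candidate systems with parametric weight curves and picking the cheapest for a given span. All system names, coefficients and unit costs below are illustrative, not taken from the paper:

```python
# Hypothetical sketch of the selection step: parametric weight curves
# per bridge system (coefficients illustrative, not from the paper),
# enumerated to find the cheapest system for a given span.
def total_cost(weight_t, unit_costs):
    # Direct cost = steel weight x (material + fabrication
    # + erection + painting) per tonne.
    return weight_t * sum(unit_costs.values())

def select_system(systems, span_m, unit_costs):
    costs = {name: total_cost(a * span_m ** 2 + b * span_m, unit_costs)
             for name, (a, b) in systems.items()}
    return min(costs, key=costs.get), costs

# weight_t = a*span^2 + b*span (illustrative weight curves per system)
systems = {"truss": (0.010, 0.8), "plate_girder": (0.014, 0.6),
           "box_girder": (0.018, 0.5), "arch": (0.012, 0.9)}
unit_costs = {"material": 900, "fabrication": 300,
              "erection": 200, "painting": 50}  # currency units per tonne
best, costs = select_system(systems, span_m=30, unit_costs=unit_costs)
```

Because the criteria (span, unit costs) enter as plain parameters, the same enumeration updates automatically when any of them changes, which is the behavior the abstract attributes to the spreadsheet model.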

Findings

In this paper, the authors developed an optimization model for cost estimates of SPBs. The model considers two main parameters: weight and cost. The main contribution of this study based on a parametric study is to propose an approach that enables structural engineers and designers to select the optimum system for SPBs.

Practical implications

From a practical perspective, the study outlines a feasible approach to develop a computerized model that uses computing capabilities for quick cost optimization, enabling decision-makers to select the optimal system among four common SPBs based on multiple criteria that may change independently and in concert with cost optimization during the preliminary design stage.

Social implications

The model can choose an optimal system for SPBs based on multiple criteria that may change independently and in concert with cost optimization. The resulting optimization model can forecast the optimum cost of SPBs for different structural spans and road spans based on local unit costs of materials, fabrication, erection and painting of steel structures.

Originality/value

The authors developed a computerized model that uses spreadsheet software's capabilities for cost optimization, enabling decision-makers to select the optimal SPB system that meets the established criteria. Based on structural characteristics and material unit costs, this study shows that, using the optimization model to estimate the total direct cost of SPB systems, the project cost can be accurately predicted at the conceptual design stage, with positive prediction outcomes.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Keywords
