Search results

1 – 10 of 155
Article
Publication date: 6 September 2022

Elena Fedorova, Pavel Drogovoz, Anna Popova and Vladimir Shiboldenkov

Abstract

Purpose

The paper examines whether, along with the financial performance, the disclosure of research and development (R&D) expenses, patent portfolios, patent citations and innovation activities affect the market capitalization of Russian companies.

Design/methodology/approach

The paper opted for a set of techniques including bag-of-words (BoW) to retrieve additional innovation-related data from companies' annual reports, self-organizing maps (SOM) to perform visual exploratory analysis and panel data regression (PDR) to conduct confirmatory analysis using data on 74 Russian publicly traded companies for the period 2013–2019.
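The bag-of-words retrieval step can be sketched as follows; the lexicon terms and report excerpt below are invented placeholders, not the authors' actual compiled lexicons.

```python
from collections import Counter
import re

def bow_innovation_counts(report_text, lexicon):
    """Count occurrences of lexicon terms in an annual report (bag-of-words)."""
    tokens = re.findall(r"[a-z&]+", report_text.lower())
    counts = Counter(tokens)
    return {term: counts[term] for term in lexicon}

# Hypothetical innovation lexicon and report excerpt.
lexicon = ["patent", "r&d", "innovation", "license"]
report = "Our R&D spending rose; two patent families and one innovation award."
print(bow_innovation_counts(report, lexicon))
# → {'patent': 1, 'r&d': 1, 'innovation': 1, 'license': 0}
```

The resulting term counts per report would then feed the panel regression as innovation disclosure indicators.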

Findings

The paper observes that the disclosure of nonfinancial data on R&D, patents and primarily product and marketing innovations positively affects the market capitalization of the largest Russian companies, which are mainly focused on energy, raw materials and utilities and are operating on international markets. The study suggests that these companies are financially well-resourced to innovate at risk and thus to provide positive signals to stakeholders and external agents.

Research limitations/implications

Our findings are important to management, investors, financial analysts, regulators and various agencies providing guidance on corporate governance and sustainability reporting. However, the authors acknowledge that the research results may lack generalizability due to the sample covering a single national context. Researchers are encouraged to test the proposed approach further on other countries' data by using the compiled lexicons.

Originality/value

The study aims to expand the domains of signaling theory and market valuation by providing new insights into the impact that companies' reporting on R&D, patents and innovation activities has on market capitalization. New nonfinancial factors that previous research does not investigate – innovation disclosure indicators (IDI) – are tested.

Details

Kybernetes, vol. 52 no. 12
Type: Research Article
ISSN: 0368-492X

Keywords

Article
Publication date: 16 August 2023

Jialiang Xie, Shanli Zhang, Honghui Wang and Mingzhi Chen

Abstract

Purpose

With the rapid development of Internet technology, cybersecurity threats such as security loopholes, data leaks, network fraud, and ransomware have become increasingly prominent, and organized and purposeful cyberattacks have increased, posing more challenges to cybersecurity protection. Therefore, reliable network risk assessment methods and effective network security protection schemes are urgently needed.

Design/methodology/approach

Based on the dynamic behavior patterns of attackers and defenders, a Bayesian network attack graph is constructed, and a multitarget dynamic risk assessment model is proposed based on network availability, network utilization impact and vulnerability attack possibility. A self-organizing multiobjective evolutionary algorithm based on grey wolf optimization is then proposed, and the authors use it to solve the multiobjective risk assessment model, obtaining a variety of different attack strategies.
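A minimal sketch of the grey wolf optimization component, reduced to a single continuous objective for illustration (the paper's model is multiobjective and self-organizing, which this sketch omits):

```python
import random

def gwo_minimize(f, dim, n_wolves=20, iters=200, lo=-5.0, hi=5.0):
    """Minimal grey wolf optimizer: the pack updates toward the three best
    wolves (alpha, beta, delta); coefficient a decays to shift the search
    from exploration to exploitation."""
    wolves = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1 - t / iters)  # decays linearly from 2 to 0
        for i in range(3, n_wolves):
            new_pos = []
            for d in range(dim):
                acc = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A = 2 * a * r1 - a          # encircling coefficient
                    C = 2 * r2                  # random emphasis on the leader
                    acc += leader[d] - A * abs(C * leader[d] - wolves[i][d])
                new_pos.append(min(hi, max(lo, acc / 3)))
            wolves[i] = new_pos
    return min(wolves, key=f)

random.seed(0)
best = gwo_minimize(lambda x: sum(v * v for v in x), dim=3)
print(best)  # a point near the minimum of the sphere function
```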

Findings

The experimental results demonstrate that the method yields 29 distinct attack strategies, from which the attacker's preferences can be inferred. Furthermore, the method efficiently addresses the security assessment problem involving multiple decision variables, thereby providing constructive guidance for secure network construction, security reinforcement and active defense.

Originality/value

A method for network risk assessment is given. The study proposes a multiobjective dynamic risk assessment model based on network availability, network utilization impact and the possibility of vulnerability attacks. The example demonstrates the effectiveness of the method in addressing network security risks.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 17 no. 1
Type: Research Article
ISSN: 1756-378X

Keywords

Article
Publication date: 17 November 2022

Navid Mohammadi, Nader Seyyedamiri and Saeed Heshmati

Abstract

Purpose

The purpose of this study is to conduct a systematic mapping review, a systematic literature review method, of the literature on new product development based on text mining, and to map the results of this review.

Design/methodology/approach

This research has been conducted with the aim of systematically reviewing the literature on the design and development of products based on textual data. It investigates how text data and text mining methods can be used for the design and development of new products.

Findings

This review identifies the most popular algorithms in this field, the most popular areas of application for these approaches, the types of data and software used, and the research gaps in this area.

Originality/value

The contribution of this review is the creation of a macro-level, comprehensive map of research in this field from various aspects and the identification of its strengths and weaknesses through systematic mapping review.

Details

Nankai Business Review International, vol. 14 no. 4
Type: Research Article
ISSN: 2040-8749

Keywords

Article
Publication date: 3 November 2023

Salam Abdallah and Ashraf Khalil

Abstract

Purpose

To understand and lay a foundation for how analytics has been used in depression management, this study conducts a systematic literature review using two techniques: text mining and manual review. The proposed methodology would aid researchers in identifying key concepts and research gaps, which, in turn, will help them establish the theoretical background supporting their empirical research objective.

Design/methodology/approach

This paper explores a hybrid methodology for literature review (HMLR), using text mining prior to systematic manual review.
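One way the text-mining pass might surface key concepts before the manual review, sketched here with plain TF-IDF weighting on an invented toy corpus (the paper does not specify this exact weighting):

```python
import math
import re
from collections import Counter

STOPWORDS = {"with", "for", "of", "the", "and"}

def key_concepts(abstracts, top_k=3):
    """Rank terms by summed TF-IDF across abstracts to surface key concepts."""
    docs = [[t for t in re.findall(r"[a-z]+", a.lower()) if t not in STOPWORDS]
            for a in abstracts]
    df = Counter(t for d in docs for t in set(d))  # document frequency
    n = len(docs)
    scores = Counter()
    for d in docs:
        for t, c in Counter(d).items():
            scores[t] += c * math.log(n / df[t])  # terms in every doc score zero
    return [t for t, _ in scores.most_common(top_k)]

corpus = [
    "depression detection with social media analytics",
    "text analytics for depression screening",
    "clinical trials of depression therapy",
]
print(key_concepts(corpus))  # 'depression' appears everywhere, so it scores zero
```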

Findings

The proposed rapid methodology is an effective tool to automate and speed up the process required to identify key and emerging concepts and research gaps in any specific research domain while conducting a systematic literature review. It assists in populating a research knowledge graph that does not reach all semantic depths of the examined domain yet provides some science-specific structure.

Originality/value

This study presents a new methodology for conducting a literature review for empirical research articles. This study has explored an “HMLR” that combines text mining and manual systematic literature review. Depending on the purpose of the research, these two techniques can be used in tandem to undertake a comprehensive literature review, by combining pieces of complex textual data together and revealing areas where research might be lacking.

Details

Information Discovery and Delivery, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2398-6247

Keywords

Article
Publication date: 20 February 2023

Zakaria Sakyoud, Abdessadek Aaroud and Khalid Akodadi

Abstract

Purpose

The main goal of this research work is the optimization of the purchasing business process in the Moroccan public sector in terms of transparency and budgetary optimization. The authors have worked on the public university as an implementation field.

Design/methodology/approach

The design of the research work followed the design science research (DSR) methodology for information systems. DSR is a research paradigm wherein a designer answers questions relevant to human problems through the creation of innovative artifacts, thereby contributing new knowledge to the body of scientific evidence. The authors have adopted a techno-functional approach. The technical part consists of the development of an intelligent recommendation system that supports the choice of optimal information technology (IT) equipment for decision-makers. This intelligent recommendation system relies on a set of functional and business concepts, namely the Moroccan normative laws and Control Objectives for Information and Related Technology's (COBIT) guidelines in information system governance.

Findings

The modeling of business processes in public universities is established using business process model and notation (BPMN) in accordance with official regulations. The set of BPMN models constitute a powerful repository not only for business process execution but also for further optimization. Governance generally aims to reduce budgetary wastes, and the authors' recommendation system demonstrates a technical and methodological approach enabling this feature. Implementation of artificial intelligence techniques can bring great value in terms of transparency and fluidity in purchasing business process execution.

Research limitations/implications

Business limitations: First, the proposed system was modeled to handle one type of product, namely computer-related equipment; the authors intend to extend the model to other types of products in future work. Second, the system proposes an optimal purchasing order and assumes that decision-makers will rely on it to choose between offers. As a perspective, the authors plan to work on complete automation of the workflow to also include vendor selection and offer validation. Technical limitations: Natural language processing (NLP) is a widely used sentiment analysis (SA) technique that enabled the authors to validate the proposed system. Even when working on samples of datasets, the authors noticed NLP's dependency on huge computing power. The authors intend to experiment with learning- and knowledge-based SA and to compare their computing power consumption and analysis accuracy with NLP. Another technical limitation relates to web scraping: the users' reviews are crucial for the authors' system, and to guarantee timely and reliable reviews, the system has to search websites automatically, which confronts the authors with the limitations of web scraping, such as constantly changing website structures and scraping restrictions.

Practical implications

The modeling of business processes in public universities is established using BPMN in accordance with official regulations. The set of BPMN models constitute a powerful repository not only for business process execution but also for further optimization. Governance generally aims to reduce budgetary wastes, and the authors' recommendation system demonstrates a technical and methodological approach enabling this feature.

Originality/value

The adopted techno-functional approach enabled the authors to bring information system governance from a highly abstract level to a practical implementation where the theoretical best practices and guidelines are transformed to a tangible application.

Details

Kybernetes, vol. 53 no. 5
Type: Research Article
ISSN: 0368-492X

Keywords

Article
Publication date: 18 March 2024

Raj Kumar Bhardwaj, Ritesh Kumar and Mohammad Nazim

Abstract

Purpose

This paper evaluates the precision of four metasearch engines (MSEs) – DuckDuckGo, Dogpile, Metacrawler and Startpage, to determine which metasearch engine exhibits the highest level of precision and to identify the metasearch engine that is most likely to return the most relevant search results.

Design/methodology/approach

The research is divided into two parts: the first phase involves four queries categorized into two segments (4-Q-2-S), while the second phase includes six queries divided into three segments (6-Q-3-S). These queries vary in complexity, falling into three types: simple, phrase and complex. The precision, average precision and the presence of duplicates across all the evaluated metasearch engines are determined.
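The precision and duplicate measurements can be sketched as follows, with invented relevance judgments and URLs standing in for the study's data:

```python
def precision_at_n(relevance):
    """Precision = relevant results retrieved / total results retrieved,
    given binary relevance judgments for the first n results."""
    return sum(relevance) / len(relevance)

def duplicate_rate(urls):
    """Share of retrieved URLs that duplicate an earlier result."""
    return 1 - len(set(urls)) / len(urls)

# Hypothetical judgments for the first ten results of one query.
judged = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1]
print(precision_at_n(judged))                 # → 0.9
print(duplicate_rate(["a", "b", "a", "c"]))   # → 0.25
```

Averaging these per-query precisions over a query set gives the average precision figures the study reports.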

Findings

The study clearly demonstrated that Startpage returned the most relevant results and achieved the highest precision (0.98) among the four MSEs. DuckDuckGo, meanwhile, exhibited consistent performance across both phases of the study.

Research limitations/implications

The study only evaluated four metasearch engines, which may not be representative of all available metasearch engines. Additionally, a limited number of queries were used, which may not be sufficient to generalize the findings to all types of queries.

Practical implications

The findings of this study can be valuable for accreditation agencies in managing duplicates, improving their search capabilities and obtaining more relevant and precise results. These findings can also assist users in selecting the best metasearch engine based on precision rather than interface.

Originality/value

The study is the first of its kind to evaluate these four metasearch engines. No similar study has been conducted in the past to measure the performance of metasearch engines.

Details

Performance Measurement and Metrics, vol. 25 no. 1
Type: Research Article
ISSN: 1467-8047

Keywords

Article
Publication date: 9 January 2024

Ning Chen, Zhenyu Zhang and An Chen

Abstract

Purpose

Consequence prediction is an emerging topic in safety management concerning the severity outcome of accidents. In practical applications, it is usually implemented through supervised learning methods; however, the evaluation of classification results remains a challenge. The previous studies mostly adopted simplex evaluation based on empirical and quantitative assessment strategies. This paper aims to shed new light on the comprehensive evaluation and comparison of diverse classification methods through visualization, clustering and ranking techniques.

Design/methodology/approach

An empirical study is conducted using 9 state-of-the-art classification methods on a real-world data set of 653 construction accidents in China for predicting the consequence with respect to 39 carefully featured factors and accident type. The proposed comprehensive evaluation enriches the interpretation of classification results from different perspectives. Furthermore, the critical factors leading to severe construction accidents are identified by analyzing the coefficients of a logistic regression model.
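One simple ranking technique of the kind described, averaging each classifier's per-metric rank, can be sketched with invented scores (the study's actual nine methods and metrics differ):

```python
def average_ranks(scores):
    """Average rank of each classifier across several metrics (1 = best,
    higher score is better on every metric)."""
    n_metrics = len(next(iter(scores.values())))
    ranks = {m: 0.0 for m in scores}
    for j in range(n_metrics):
        ordered = sorted(scores, key=lambda m: -scores[m][j])  # best first
        for r, m in enumerate(ordered, start=1):
            ranks[m] += r
    return {m: total / n_metrics for m, total in ranks.items()}

# Hypothetical accuracy / F1 / AUC scores for three classifiers.
scores = {"logit": (0.81, 0.78, 0.85),
          "rf": (0.84, 0.80, 0.83),
          "svm": (0.79, 0.74, 0.80)}
print(average_ranks(scores))  # lower average rank = better overall
```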

Findings

This paper identifies the critical factors that significantly influence the consequence of construction accidents, which include accident type (particularly collapse), improper accident reporting and handling (E21), inadequate supervision engineers (O41), no special safety department (O11), delayed or low-quality drawings (T11), unqualified contractor (C21), schedule pressure (C11), multi-level subcontracting (C22), lacking safety examination (S22), improper operation of mechanical equipment (R11) and improper construction procedure arrangement (T21). The prediction models and findings of critical factors help make safety intervention measures in a targeted way and enhance the experience of safety professionals in the construction industry.

Research limitations/implications

The empirical study using some well-known classification methods for forecasting the consequences of construction accidents provides some evidence for the comprehensive evaluation of multiple classifiers. These techniques can be used jointly with other evaluation approaches for a comprehensive understanding of the classification algorithms. Despite the limitation of specific methods used in the study, the presented methodology can be configured with other classification methods and performance metrics and even applied to other decision-making problems such as clustering.

Originality/value

This study sheds new light on the comprehensive comparison and evaluation of classification results through visualization, clustering and ranking techniques using an empirical study of consequence prediction of construction accidents. The relevance of construction accident type is discussed with the severity of accidents. The critical factors influencing the accident consequence are identified for the sake of taking prevention measures for risk reduction. The proposed method can be applied to other decision-making tasks where the evaluation is involved as an important component.

Details

Construction Innovation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1471-4175

Keywords

Article
Publication date: 20 October 2023

Duo Zhang, Yonghua Li, Gaping Wang, Qing Xia and Hang Zhang

Abstract

Purpose

This study aims to propose a more precise method for robust design optimization of mechanical structures with black-box problems, while also considering the efficiency of uncertainty analysis.

Design/methodology/approach

The method first introduces a dual adaptive chaotic flower pollination algorithm (DACFPA) to overcome the shortcomings of the original flower pollination algorithm (FPA), such as poor accuracy and slow convergence when dealing with complex optimization problems. Furthermore, a DACFPA-Kriging model is developed by optimizing the relevant parameters of the Kriging model via DACFPA. Finally, a dual Kriging model is constructed to improve the efficiency of uncertainty analysis, and a robust design optimization method based on DACFPA-Dual-Kriging is proposed.
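A minimal sketch of the underlying flower pollination algorithm, with Gaussian factors standing in for Lévy flights and none of the dual adaptive chaotic mechanisms the authors add:

```python
import random

def fpa_minimize(f, dim, n=25, iters=300, p=0.8, lo=-5.0, hi=5.0):
    """Simplified flower pollination algorithm: with probability p a flower
    takes a global step toward the current best (Gaussian factors stand in
    for Levy flights); otherwise it mixes two random flowers (local
    pollination). Moves are accepted only if they improve the objective."""
    flowers = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best = min(flowers, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            if random.random() < p:  # global pollination
                cand = [flowers[i][d] + random.gauss(0, 1) * (best[d] - flowers[i][d])
                        for d in range(dim)]
            else:                    # local pollination
                j, k = random.sample(range(n), 2)
                eps = random.random()
                cand = [flowers[i][d] + eps * (flowers[j][d] - flowers[k][d])
                        for d in range(dim)]
            cand = [min(hi, max(lo, v)) for v in cand]
            if f(cand) < f(flowers[i]):
                flowers[i] = cand
                if f(cand) < f(best):
                    best = cand[:]
    return best

random.seed(1)
best = fpa_minimize(lambda x: sum(v * v for v in x), dim=2)
print(best)  # a point near the minimum of the sphere function
```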

Findings

The DACFPA outperforms the FPA, particle swarm optimization and gray wolf optimization algorithms in terms of solution accuracy, convergence speed and capacity to avoid local optimal solutions. Additionally, the DACFPA-Kriging model exhibits superior prediction accuracy and robustness compared with the original Kriging and FPA-Kriging. The proposed method for robust design optimization based on DACFPA-Dual-Kriging is applied to the motor hanger of the electric multiple units as an engineering case study, and the results confirm a significant reduction in the fluctuation of the maximum equivalent stress.

Originality/value

This study represents the initial attempt to enhance the prediction accuracy of the Kriging model using the improved FPA and to combine the dual Kriging model for uncertainty analysis, providing an idea for the robust optimization design of mechanical structure with black-box problem.

Details

Multidiscipline Modeling in Materials and Structures, vol. 19 no. 6
Type: Research Article
ISSN: 1573-6105

Keywords

Article
Publication date: 22 March 2024

Mohd Mustaqeem, Suhel Mustajab and Mahfooz Alam

Abstract

Purpose

Software defect prediction (SDP) is a critical aspect of software quality assurance, aiming to identify and manage potential defects in software systems. In this paper, we have proposed a novel hybrid approach that combines Gray Wolf Optimization with Feature Selection (GWOFS) and multilayer perceptron (MLP) for SDP. The GWOFS-MLP hybrid model is designed to optimize feature selection, ultimately enhancing the accuracy and efficiency of SDP. Gray Wolf Optimization, inspired by the social hierarchy and hunting behavior of gray wolves, is employed to select a subset of relevant features from an extensive pool of potential predictors. This study investigates the key challenges that traditional SDP approaches encounter and proposes promising solutions to overcome time complexity and the curse of dimensionality.

Design/methodology/approach

The integration of GWOFS and MLP results in a robust hybrid model that can adapt to diverse software datasets. This feature selection process harnesses the cooperative hunting behavior of wolves, allowing for the exploration of critical feature combinations. The selected features are then fed into an MLP, a powerful artificial neural network (ANN) known for its capability to learn intricate patterns within software metrics. MLP serves as the predictive engine, utilizing the curated feature set to model and classify software defects accurately.
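A sketch of wrapper-style feature selection of the kind GWOFS performs, with exhaustive subset search over a toy dataset standing in for grey wolf optimization and a leave-one-out 1-NN classifier standing in for the MLP:

```python
from itertools import combinations

def loo_1nn_error(data, labels, feats):
    """Leave-one-out 1-nearest-neighbour error using only the selected features."""
    errors = 0
    for i, x in enumerate(data):
        nearest = min((j for j in range(len(data)) if j != i),
                      key=lambda j: sum((data[j][f] - x[f]) ** 2 for f in feats))
        errors += labels[nearest] != labels[i]
    return errors / len(data)

def best_subset(data, labels, alpha=0.9):
    """Wrapper feature selection: fitness trades classification error
    against the fraction of features kept."""
    n = len(data[0])
    candidates = [s for r in range(1, n + 1) for s in combinations(range(n), r)]
    return min(candidates,
               key=lambda s: alpha * loo_1nn_error(data, labels, s)
                             + (1 - alpha) * len(s) / n)

# Toy module metrics: only feature 0 separates defective (1) from clean (0) modules.
data = [(1, 5, 9), (2, 7, 1), (8, 6, 5), (9, 4, 8)]
labels = [0, 0, 1, 1]
print(best_subset(data, labels))  # → (0,)
```

In the paper's setting, a binary-encoded wolf position would play the role of the subset and the MLP's validation error the role of the fitness.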

Findings

The performance evaluation of the GWOFS-MLP hybrid model on a real-world software defect dataset demonstrates its effectiveness. The model achieves a remarkable training accuracy of 97.69% and a testing accuracy of 97.99%. Additionally, the receiver operating characteristic area under the curve (ROC-AUC) score of 0.89 highlights the model’s ability to discriminate between defective and defect-free software components.

Originality/value

Experimental implementations using machine learning-based techniques with feature reduction are conducted to validate the proposed solutions. The goal is to enhance SDP’s accuracy, relevance and efficiency, ultimately improving software quality assurance processes. The confusion matrix further illustrates the model’s performance, with only a small number of false positives and false negatives.

Details

International Journal of Intelligent Computing and Cybernetics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1756-378X

Keywords

Article
Publication date: 11 October 2022

Yuefeng Cen, Minglu Wang, Gang Cen, Yongping Cai, Cheng Zhao and Zhigang Cheng

Abstract

Purpose

Stock indexes are an important issue for investors. This paper aims to adopt good trading strategies based on accurate prediction of stock indexes in order to chase high returns.

Design/methodology/approach

To avoid the problem of insufficient financial data for daily stock index prediction during modeling, a data augmentation method based on time scale transformation (DATT) was introduced. A new deep learning model combining DATT with nested gated recurrent units (DATT-NGRU) was then proposed for stock index prediction. The proposed models and their competitors were used to test stock index prediction and simulated trading in five stock markets of China and the United States.
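One plausible reading of time scale transformation, sketched as augmenting a daily series with coarser averaged series; the authors' exact transformation may differ:

```python
def time_scale_augment(series, scales=(2, 3)):
    """Augment a daily price series with coarser-scale series obtained by
    averaging over non-overlapping windows of each scale."""
    augmented = [list(series)]
    for s in scales:
        coarse = [sum(series[i:i + s]) / s
                  for i in range(0, len(series) - s + 1, s)]
        augmented.append(coarse)
    return augmented

daily = [10, 12, 11, 13, 15, 14]
for view in time_scale_augment(daily):
    print(view)
# → [10, 12, 11, 13, 15, 14]
# → [11.0, 12.0, 14.5]
# → [11.0, 14.0]
```

Each coarser series is an extra training sequence, enlarging the effective dataset for the recurrent model.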

Findings

The experimental results demonstrated that both NGRU and DATT-NGRU outperformed the other recurrent neural network (RNN) models in the daily stock indexes prediction.

Originality/value

A novel RNN with NGRU and data augmentation is proposed. It uses the nested structure to increase the depth of the deep learning model.

Details

Kybernetes, vol. 53 no. 1
Type: Research Article
ISSN: 0368-492X

Keywords
