Search results

1 – 10 of 219
Article
Publication date: 10 November 2023

Yonghong Zhang, Shouwei Li, Jingwei Li and Xiaoyu Tang

Abstract

Purpose

This paper aims to develop a novel grey Bernoulli model with memory characteristics, which is designed to dynamically choose the optimal memory kernel function and the length of memory dependence period, ultimately enhancing the model's predictive accuracy.

Design/methodology/approach

This paper enhances the traditional grey Bernoulli model by introducing memory-dependent derivatives, resulting in a novel memory-dependent derivative grey model. Additionally, fractional-order accumulation is employed for preprocessing the original data. The length of the memory dependence period for memory-dependent derivatives is determined through grey correlation analysis. Furthermore, the whale optimization algorithm is utilized to optimize the cumulative order, power index and memory kernel function index of the model, enabling adaptability to diverse scenarios.
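
As an illustration of the fractional-order accumulation preprocessing mentioned above, the sketch below implements the standard r-order accumulation operator in Python. It is a generic routine, not the authors' full memory-dependent grey Bernoulli model, and the example sequence is invented.

```python
from math import gamma

def fractional_accumulation(x, r):
    """Fractional-order accumulation (r-AGO) of a sequence x.

    Each accumulated value is a weighted sum of earlier observations, with
    weights given by the generalized binomial coefficients
    Gamma(k - i + r) / (Gamma(k - i + 1) * Gamma(r)).
    """
    acc = []
    for k in range(1, len(x) + 1):
        total = 0.0
        for i in range(1, k + 1):
            total += gamma(k - i + r) / (gamma(k - i + 1) * gamma(r)) * x[i - 1]
        acc.append(total)
    return acc

# r = 1 reproduces the ordinary first-order accumulation (1-AGO).
print(fractional_accumulation([2.0, 3.0, 5.0], 1.0))   # [2.0, 5.0, 10.0]
print(fractional_accumulation([2.0, 3.0, 5.0], 0.5))   # fractional order r = 0.5
```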

Findings

Selecting appropriate memory kernel functions and memory dependence lengths improves prediction performance. The proposed model selects both adaptively, and its performance is better than that of the comparison models.

Research limitations/implications

The model presented in this article has some limitations. The grey model is inherently suited to small-sample data, and memory-dependent derivatives consider the memory effect only over a fixed period. The model is therefore mainly applicable to data with short-term memory effects and is of limited use for time series with long-term memory.

Practical implications

In practical systems, memory effects typically exhibit a decaying pattern, which is effectively characterized by the memory kernel function. The model in this study skillfully determines the appropriate kernel functions and memory dependency lengths to capture these memory effects, enhancing its alignment with real-world scenarios.

Originality/value

Based on the memory-dependent derivative method, a memory-dependent derivative grey Bernoulli model that more accurately reflects the actual memory effect is constructed and applied to power generation forecasting in China, South Korea and India.

Details

Grey Systems: Theory and Application, vol. 14 no. 1
Type: Research Article
ISSN: 2043-9377

Open Access
Article
Publication date: 29 July 2020

Abdullah Alharbi, Wajdi Alhakami, Sami Bourouis, Fatma Najar and Nizar Bouguila

Abstract

We propose in this paper a novel reliable detection method to recognize forged inpainting images. Detecting potential forgeries and authenticating the content of digital images is extremely challenging and important for many applications. The proposed approach involves developing new probabilistic support vector machine (SVM) kernels from a flexible generative statistical model named the “bounded generalized Gaussian mixture model”. The developed learning framework has the advantage of properly combining the benefits of both discriminative and generative models and of including prior knowledge about the nature of the data. It can effectively recognize whether an image has been tampered with and distinguish forged from authentic images. The obtained results confirm that the developed framework performs well on numerous inpainted images.
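
As a rough illustration of combining a generative mixture model with a discriminative SVM, the sketch below fits an ordinary Gaussian mixture (standing in for the bounded generalized Gaussian mixture model, which is not available in scikit-learn) and trains an SVM on the resulting posterior representation. The feature matrices and labels are hypothetical, and this is not the authors' exact kernel construction.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

# Hypothetical per-image feature vectors; label 1 marks a forged (inpainted)
# image and 0 an authentic one.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 16))
y_train = rng.integers(0, 2, size=200)
X_test = rng.normal(size=(50, 16))

# Generative stage: a plain Gaussian mixture stands in for the paper's
# bounded generalized Gaussian mixture model.
gmm = GaussianMixture(n_components=5, random_state=0).fit(X_train)

# Discriminative stage: an SVM trained on the posterior responsibilities,
# one simple way to blend generative and discriminative modelling.
svm = SVC(kernel="rbf").fit(gmm.predict_proba(X_train), y_train)
predictions = svm.predict(gmm.predict_proba(X_test))
```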

Details

Applied Computing and Informatics, vol. 20 no. 1/2
Type: Research Article
ISSN: 2634-1964

Article
Publication date: 11 March 2024

Vipin Gupta, Barak M.S. and Soumik Das

Abstract

Purpose

This paper addresses a significant research gap in the study of Rayleigh surface wave propagation within a medium characterized by piezoelectric properties, thermal effects and voids. Previous research has often overlooked the crucial aspects related to voids. This study aims to provide analytical solutions for Rayleigh waves propagating through a medium consisting of a nonlocal piezo-thermo-elastic material with voids under the Moore–Gibson–Thompson thermo-elasticity theory with memory dependencies.

Design/methodology/approach

The analytical solutions are derived using a wave-mode method, and roots are computed from the characteristic equation using the Durand–Kerner method. These roots are then filtered based on the decay condition of surface waves. The analysis pertains to a medium subjected to stress-free and isothermal boundary conditions.
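
The Durand–Kerner step can be illustrated generically: the sketch below finds all (complex) roots of a polynomial simultaneously, which is the same iteration applied here to the characteristic equation; the decay-condition filtering for surface waves is omitted.

```python
import numpy as np

def durand_kerner(coeffs, tol=1e-12, max_iter=200):
    """All roots of a polynomial by the Durand-Kerner (Weierstrass) method.

    coeffs run from the highest-degree term to the constant, as in
    numpy.polyval; the polynomial is first made monic.
    """
    coeffs = np.asarray(coeffs, dtype=complex)
    coeffs = coeffs / coeffs[0]
    n = len(coeffs) - 1
    roots = (0.4 + 0.9j) ** np.arange(n)      # standard initial guesses
    for _ in range(max_iter):
        new_roots = roots.copy()
        for i in range(n):
            numer = np.polyval(coeffs, new_roots[i])
            denom = np.prod(new_roots[i] - np.delete(new_roots, i))
            new_roots[i] -= numer / denom
        if np.max(np.abs(new_roots - roots)) < tol:
            return new_roots
        roots = new_roots
    return roots

# Example: z^3 - 6z^2 + 11z - 6 has roots 1, 2 and 3.
print(np.sort_complex(durand_kerner([1, -6, 11, -6])))
```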

Findings

Computational simulations are performed to determine the attenuation coefficient and phase velocity of Rayleigh waves. This investigation goes beyond mere calculations and examines particle motion to gain deeper insights into Rayleigh wave propagation. Furthermore, the study investigates how the kernel function and nonlocal parameters influence these wave phenomena.

Research limitations/implications

The results of this study reveal several unique cases that significantly contribute to the understanding of Rayleigh wave propagation within this intricate material system, particularly in the presence of voids.

Practical implications

This investigation provides valuable insights into the synergistic dynamics among piezoelectric constituents, void structures and Rayleigh wave propagation, enabling advancements in sensor technology, augmented energy harvesting methodologies and pioneering seismic monitoring approaches.

Originality/value

This study formulates a novel governing equation for a nonlocal piezo-thermo-elastic medium with voids, highlighting the significance of Rayleigh waves and investigating the impact of memory.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 34 no. 4
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 23 June 2023

Rawid Banchuin

Abstract

Purpose

The purpose of this paper is to propose a novel nonlocal fractal calculus scheme dedicated to the analysis of fractal electrical circuit, namely, the generalized nonlocal fractal calculus.

Design/methodology/approach

To achieve generality, an arbitrary kernel function has been adopted. The condition on the order has been derived so that it is not related to the γ-dimension of the fractal set. The fractal Laplace transforms of our operators have been derived.
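
To illustrate the idea of a nonlocal operator built on an arbitrary kernel, the sketch below numerically evaluates a kernel-weighted derivative D_K f(t) = ∫_0^t K(t − τ) f′(τ) dτ with an exponential memory kernel in place of the classical power-law kernel. It is a plain-calculus analogue for intuition only, not the paper's fractal operators or their fractal Laplace transforms.

```python
import numpy as np

def nonlocal_derivative(f, kernel, t, n=2000):
    """Evaluate D_K f(t) = integral_0^t K(t - tau) f'(tau) d tau.

    Uses central differences for f' and the trapezoidal rule for the integral.
    """
    tau = np.linspace(0.0, t, n)
    h = tau[1] - tau[0]
    integrand = kernel(t - tau) * np.gradient(f(tau), h)
    return h * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

# Exponential memory kernel instead of the classical power-law kernel.
exp_kernel = lambda s: np.exp(-2.0 * s)

# For f(t) = t the exact value is (1 - exp(-2 t)) / 2, i.e. ~0.4323 at t = 1.
print(nonlocal_derivative(lambda t: t, exp_kernel, 1.0))
```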

Findings

Unlike the traditional power-law kernel-based nonlocal fractal calculus operators, ours are generalized, consistent with the local fractal derivative and offer a higher degree of freedom. As intended, the proposed nonlocal fractal calculus is applicable to any kind of fractal electrical circuit. It has thus been found to be a more efficient tool for fractal electrical circuit analysis than any previous fractal-set-dedicated calculus scheme.

Originality/value

This work proposes a fractal calculus scheme that is more efficient for fractal electrical circuit analysis than any previous one.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 42 no. 6
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 6 January 2023

Hanieh Javadi Khasraghi, Isaac Vaghefi and Rudy Hirschheim

Abstract

Purpose

The research study intends to gain a better understanding of members' behaviors in the context of crowdsourcing contests. The authors examined the key factors that can motivate or discourage contribution to a team and within the community.

Design/methodology/approach

The authors conducted 21 semi-structured interviews with Kaggle.com members and analyzed the data to capture individual members' contributions and emerging determinants that play a role during this process. The authors adopted a qualitative approach and used standard thematic coding techniques to analyze the data.

Findings

The analysis revealed two processes underlying contribution to the team and community and the decision-making involved in each. Accordingly, a set of key factors affecting each process was identified. Using Holbrook's (2006) typology of value creation, these factors were classified into four types, namely extrinsic and self-oriented (economic value), extrinsic and other-oriented (social value), intrinsic and self-oriented (hedonic value), and intrinsic and other-oriented (altruistic value). Three propositions were developed, which can be tested in future research.

Research limitations/implications

The study has a few limitations, which point to areas for future research on this topic. First, the authors only assessed the behaviors of individuals who use the Kaggle platform. Second, the findings may not be generalizable to other crowdsourcing platforms, such as Amazon Mechanical Turk, where there is no competition and participants cannot meaningfully contribute to the community. Third, the authors collected data from a limited (yet knowledgeable) number of interviewees; larger sample sizes would be useful to assess other possible factors that did not emerge from the analysis. Finally, the authors presented a set of propositions for individuals' contributory behavior on crowdsourcing contest platforms but did not empirically test them. Future research is necessary to validate these propositions, for instance by using quantitative methods (e.g. surveys or experiments).

Practical implications

The authors offer recommendations for implementing appropriate mechanisms for contribution to crowdsourcing contests and platforms. Practitioners should design architectures that minimize the effect of factors reducing the likelihood of contribution and maximize the factors that increase it, in order to manage the tension between simultaneously encouraging contribution and competition.

Social implications

The study makes key theoretical contributions to research. First, the results help explain individuals' contributory behavior in crowdsourcing contests from two aspects: joining and selecting a team, and contributing content to the community. Second, the findings suggest a revised and extended model of value co-creation, one that integrates this study's findings with those of Nov et al. (2009), Lakhani and Wolf (2005), Wasko and Faraj (2000), Chen et al. (2018), Hahn et al. (2008), Dholakia et al. (2004) and Teichmann et al. (2015). Third, using direct accounts collected through first-hand interviews with crowdsourcing contest members, the study provides an in-depth understanding of individuals' contributory behavior. Methodologically, the authors' approach was distinct from common approaches in this research domain, which rely on secondary datasets (e.g. the content of forum discussions, survey data; see Lakhani and Wolf, 2005; Nov et al., 2009) and quantitative techniques for analyzing collaboration and contribution behavior.

Originality/value

The authors advance the broad field of crowdsourcing by extending the literature on value creation in the online community, particularly as it relates to the individual participants. The study advances the theoretical understanding of contribution in crowdsourcing contests by focusing on the members' point of view, which reveals both the determinants and the process for joining teams during crowdsourcing contests as well as the determinants of contribution to the content distributed in the community.

Details

Information Technology & People, vol. 37 no. 1
Type: Research Article
ISSN: 0959-3845

Article
Publication date: 7 November 2023

Yingguang Wang

Abstract

Purpose

The purpose of this paper is to exploit a new and robust method to forecast the long-term extreme dynamic responses for wave energy converters (WECs).

Design/methodology/approach

A new adaptive binned kernel density estimation (KDE) methodology is first proposed in this paper.
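
As a sketch of what a binned KDE with an adaptive bandwidth can look like, the code below reduces the data to histogram bins, places a Gaussian kernel at each non-empty bin centre and widens the bandwidth where counts are low. The bandwidth rule and the synthetic sample are assumptions for illustration, not the methodology of the paper.

```python
import numpy as np

def adaptive_binned_kde(data, grid, n_bins=50):
    """Binned kernel density estimate with a simple adaptive bandwidth.

    A Gaussian kernel is placed at each non-empty histogram bin centre,
    weighted by its count, and the bandwidth is widened in sparse bins.
    """
    counts, edges = np.histogram(data, bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = counts > 0
    counts, centers = counts[mask], centers[mask]

    h0 = 1.06 * np.std(data) * len(data) ** (-0.2)   # Silverman's rule of thumb
    local = counts / counts.sum()
    h = h0 * np.sqrt(local.mean() / local)           # wider kernels in sparse bins

    density = np.zeros_like(grid, dtype=float)
    for c, w, hc in zip(centers, counts, h):
        density += w * np.exp(-0.5 * ((grid - c) / hc) ** 2) / (hc * np.sqrt(2 * np.pi))
    return density / counts.sum()

# Synthetic, heavy-tailed sample standing in for measured sea-state data.
rng = np.random.default_rng(1)
sample = 2.0 * rng.weibull(1.5, size=5000)
xs = np.linspace(0.0, sample.max(), 400)
pdf = adaptive_binned_kde(sample, xs)
```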

Findings

Examination of the calculation results shows that, in the tail region, the proposed adaptive binned KDE distribution curve becomes very smooth and fits the histogram of the measured ocean wave dataset at National Data Buoy Center (NDBC) station 46059 quite well. Careful study of the results also reveals that the 50-year extreme power-take-off heaving force forecasted from the environmental contour derived using the new method is 3,572,600 N, which is much larger than the 2,709,100 N forecasted via the Rosenblatt-inverse second-order reliability method (ISORM) contour method.

Research limitations/implications

The proposed method overcomes the disadvantages of all the existing nonparametric and parametric methods for predicting the tail region probability density values of the sea state parameters.

Originality/value

It is concluded that the proposed new adaptive binned KDE method is robust and forecasts the 50-year extreme dynamic responses of WECs well.

Details

Engineering Computations, vol. 40 no. 9/10
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 2 January 2024

Xiumei Cai, Xi Yang and Chengmao Wu

Abstract

Purpose

Multi-view fuzzy clustering algorithms are not widely used in image segmentation, and many of them lack robustness. The purpose of this paper is to investigate a new algorithm that segments noisy images better while retaining as much detailed image information as possible.

Design/methodology/approach

The authors present a novel multi-view fuzzy c-means (FCM) clustering algorithm that includes an automatic view-weight learning mechanism. Firstly, this algorithm introduces a view-weight factor that can automatically adjust the weight of different views, thereby allowing each view to obtain the best possible weight. Secondly, the algorithm incorporates a weighted fuzzy factor, which serves to obtain local spatial information and local grayscale information to preserve image details as much as possible. Finally, in order to weaken the effects of noise and outliers in image segmentation, this algorithm employs the kernel distance measure instead of the Euclidean distance.
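
The kernel-distance idea in the last step can be sketched with a plain single-view kernel fuzzy c-means, in which the Gaussian-kernel-induced distance replaces the Euclidean one; the automatic view-weight factor and the local spatial/grayscale fuzzy factor of the proposed algorithm are deliberately omitted here.

```python
import numpy as np

def kernel_fcm(X, c=2, m=2.0, sigma=1.0, n_iter=100, seed=0):
    """Single-view kernel fuzzy c-means with a Gaussian kernel distance.

    The kernel-induced distance d^2(x, v) = 2 * (1 - K(x, v)) replaces the
    Euclidean distance, which damps the influence of noise and outliers.
    """
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), c, replace=False)]             # initial centres
    for _ in range(n_iter):
        K = np.exp(-((X[:, None, :] - V[None, :, :]) ** 2).sum(-1) / sigma ** 2)
        d2 = np.maximum(2.0 * (1.0 - K), 1e-12)             # kernel distances
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)            # fuzzy memberships
        W = (U ** m) * K
        V = (W.T @ X) / W.sum(axis=0)[:, None]              # kernel-weighted centres
    return U, V

# Toy example: two noisy clusters of 2-D points (e.g. pixel features).
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 0.3, (100, 2)), rng.normal(3.0, 0.3, (100, 2))])
U, V = kernel_fcm(X, c=2)
labels = U.argmax(axis=1)
```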

Findings

The authors added different kinds of noise to images and conducted a large number of experimental tests. The results show that the proposed algorithm performs better and is more accurate than previous multi-view fuzzy clustering algorithms in solving the problem of noisy image segmentation.

Originality/value

Most existing multi-view clustering algorithms are designed for general multi-view datasets, and existing multi-view fuzzy clustering algorithms are unable to eliminate noise points and outliers when dealing with noisy images. The algorithm proposed in this paper has stronger noise immunity and better preserves the details of the original image.

Details

Engineering Computations, vol. 41 no. 1
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 6 October 2023

Vahide Bulut

Abstract

Purpose

Feature extraction from 3D datasets is a current problem, and machine learning is an important tool for classifying complex 3D datasets. Machine learning classification techniques are widely used in various fields, such as text classification, pattern recognition and medical disease analysis. The aim of this study is to apply the most popular classification and regression methods and to determine the best-performing ones based on the geodesics.

Design/methodology/approach

The feature vector is determined by the unit normal vector and the unit principal vector at each point of the 3D surface along with the point coordinates themselves. Moreover, different examples are compared according to the classification methods in terms of accuracy and the regression algorithms in terms of R-squared value.
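
A minimal sketch of such a comparison is given below, assuming hypothetical per-point features (coordinates, unit normal and unit principal direction, nine values in total) and a handful of scikit-learn classifiers scored by accuracy; it is not the author's 31-method classification and 23-method regression pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical per-point surface features: xyz coordinates, unit normal and
# unit principal direction, with one class label per point.
rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 9))
y = rng.integers(0, 3, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
models = {
    "svm": SVC(),
    "random_forest": RandomForestClassifier(random_state=0),
    "knn": KNeighborsClassifier(),
}
for name, model in models.items():
    acc = accuracy_score(y_te, model.fit(X_tr, y_tr).predict(X_te))
    print(f"{name}: accuracy = {acc:.3f}")
```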

Findings

Several surface examples are analyzed for the feature vector using 31 classification and 23 regression machine learning algorithms. In addition, two ensemble methods, XGBoost and LightGBM, are used for both classification and regression, and the scores for each surface example are compared.

Originality/value

To the best of the author’s knowledge, this is the first study to analyze datasets based on geodesics using machine learning algorithms for classification and regression.

Details

Engineering Computations, vol. 40 no. 9/10
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 20 November 2023

Chao Zhang, Fang Wang, Yi Huang and Le Chang

Abstract

Purpose

This paper aims to reveal the interdisciplinarity of information science (IS) from the perspective of the evolution of theory application.

Design/methodology/approach

Eight representative IS journals are selected as data sources, the theories mentioned in the full texts of their research papers are extracted, and the annual interdisciplinarity of IS is then measured through theory co-occurrence network analysis, diversity measurement and evolution analysis.
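
A toy sketch of this measurement pipeline is shown below: theory mentions per paper are turned into co-occurrence counts, and the Shannon diversity of the theories' source disciplines is computed per year as one simple interdisciplinarity indicator. The records and the theory-to-discipline mapping are invented, and the paper's actual diversity measure and network analysis may differ.

```python
from collections import Counter
from itertools import combinations
import math

# Hypothetical records: (publication year, theories mentioned in the paper).
papers = [
    (2005, ["TAM", "social capital theory"]),
    (2005, ["TAM", "game theory"]),
    (2006, ["grounded theory", "TAM", "social capital theory"]),
]
# Hypothetical mapping from each theory to its source discipline.
theory_discipline = {
    "TAM": "information systems",
    "social capital theory": "sociology",
    "game theory": "economics",
    "grounded theory": "sociology",
}

# Theory co-occurrence counts across papers (edges of the co-occurrence network).
cooccurrence = Counter()
for _, theories in papers:
    for a, b in combinations(sorted(set(theories)), 2):
        cooccurrence[(a, b)] += 1

# Shannon diversity of source disciplines per year.
by_year = {}
for year, theories in papers:
    by_year.setdefault(year, []).extend(theory_discipline[t] for t in theories)
for year, disciplines in sorted(by_year.items()):
    counts = Counter(disciplines)
    total = sum(counts.values())
    diversity = -sum((n / total) * math.log(n / total) for n in counts.values())
    print(year, round(diversity, 3))
```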

Findings

As a young and vibrant discipline, IS has been continuously absorbing and internalizing external theoretical knowledge and has thus developed a high degree of interdisciplinarity. With the continuous application of some kernel theories, the interdisciplinarity of IS appears to be decreasing and gradually converging on a few neighboring disciplines. Influenced by big data and artificial intelligence, the research paradigm of IS is shifting from a theory-centered one to a technology-centered one.

Research limitations/implications

This study helps to understand the evolution of the interdisciplinarity of IS over the past 21 years. The main limitation is that the data were collected from eight journals indexed by the Social Sciences Citation Index, so a small number of theories might have been omitted.

Originality/value

This study identifies the kernel theories in IS research, measures the interdisciplinarity of IS based on the evolution of the co-occurrence network of theory source disciplines and reveals the paradigm shift taking place in IS.

Details

Journal of Documentation, vol. 80 no. 2
Type: Research Article
ISSN: 0022-0418

Open Access
Article
Publication date: 28 November 2022

Ruchi Kejriwal, Monika Garg and Gaurav Sarin

Abstract

Purpose

The stock market has always been lucrative for investors but, because of its speculative nature, it is difficult to predict price movements. Investors have been using both fundamental and technical analysis to predict prices: fundamental analysis studies the structured data of the company, while technical analysis studies price trends. The increasing and easy availability of unstructured data has made it important to study market sentiment, which has a major impact on prices in the short run. Hence, the purpose is to understand market sentiment in a timely and effective manner.

Design/methodology/approach

The research involves text mining and then building various classification models. The accuracy of these models is checked using a confusion matrix.
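
A minimal sketch of that pipeline, assuming TF-IDF features, a kernel (RBF) support vector machine and scikit-learn's confusion matrix, is shown below; the labelled snippets are invented, and the study's actual feature engineering is not specified in the abstract.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score

# Invented labelled snippets standing in for tweets/news headlines.
texts = [
    "company beats earnings estimates", "shares plunge after weak guidance",
    "board announces special dividend", "regulator opens probe into firm",
    "quarterly revenue flat year on year", "analysts stay neutral on the stock",
] * 20
labels = ["positive", "negative", "positive", "negative", "neutral", "neutral"] * 20

X_tr, X_te, y_tr, y_te = train_test_split(texts, labels, test_size=0.25, random_state=0)

vec = TfidfVectorizer()
clf = SVC(kernel="rbf")                      # kernel support vector machine
clf.fit(vec.fit_transform(X_tr), y_tr)

pred = clf.predict(vec.transform(X_te))
print(confusion_matrix(y_te, pred, labels=["positive", "neutral", "negative"]))
print("accuracy:", accuracy_score(y_te, pred))
```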

Findings

Out of the six machine learning techniques used to create the classification model, the kernel support vector machine gave the highest accuracy, 68%. This model can now be used to analyse tweets, news and other unstructured data to predict price movements.

Originality/value

This study will help investors quickly classify a news item or a tweet as “positive”, “negative” or “neutral” and determine stock price trends.

Details

Vilakshan - XIMB Journal of Management, vol. 21 no. 1
Type: Research Article
ISSN: 0973-1954
