Search results

1 – 10 of 10
Open Access
Article
Publication date: 4 November 2022

Bianca Caiazzo, Teresa Murino, Alberto Petrillo, Gianluca Piccirillo and Stefania Santini

This work aims to propose a novel Internet of Things (IoT)-based and cloud-assisted monitoring architecture for smart manufacturing systems able to evaluate their overall status…


Abstract

Purpose

This work aims to propose a novel Internet of Things (IoT)-based and cloud-assisted monitoring architecture for smart manufacturing systems, able to evaluate their overall status and detect any anomalies occurring in production. A novel artificial intelligence (AI)-based technique, able to identify the specific anomalous event and the related risk classification for possible intervention, is hence proposed.

Design/methodology/approach

The proposed solution is a five-layer scalable and modular platform designed from an Industry 5.0 perspective, whose crucial layer is the Cloud Cyber layer. This layer embeds a novel anomaly detection solution, designed by leveraging control charts, long short-term memory (LSTM) autoencoders (AE) and a fuzzy inference system (FIS). The proper combination of these methods allows not only detecting product defects but also recognizing their causalities.
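
As a rough sketch of the detection pattern this abstract describes, the snippet below trains an LSTM autoencoder on windows of normal production data and flags windows whose reconstruction error exceeds a control-chart-style limit. The window length, layer sizes, three-sigma limit and synthetic data are illustrative assumptions, not the authors' architecture, and the FIS risk-classification stage is omitted.

```python
# Minimal sketch, assuming placeholder shapes and data; not the authors' implementation.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

timesteps, n_features = 30, 8   # assumed window length and number of monitored signals
x_normal = np.random.default_rng(0).normal(
    size=(256, timesteps, n_features)).astype("float32")   # placeholder "normal" windows

inputs = keras.Input(shape=(timesteps, n_features))
encoded = layers.LSTM(32)(inputs)                               # encoder
decoded = layers.RepeatVector(timesteps)(encoded)
decoded = layers.LSTM(32, return_sequences=True)(decoded)       # decoder
outputs = layers.TimeDistributed(layers.Dense(n_features))(decoded)

autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x_normal, x_normal, epochs=2, batch_size=64, verbose=0)

# control-chart-style limit (mean + 3 sigma) estimated on normal data
recon_err = np.mean((x_normal - autoencoder.predict(x_normal, verbose=0)) ** 2, axis=(1, 2))
limit = recon_err.mean() + 3 * recon_err.std()

def is_anomalous(windows):
    """Flag windows whose reconstruction error exceeds the control limit."""
    err = np.mean((windows - autoencoder.predict(windows, verbose=0)) ** 2, axis=(1, 2))
    return err > limit
```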

Findings

The proposed architecture, experimentally validated on a manufacturing system involved in the production of a solar thermal high-vacuum flat panel, provides human operators with information about anomalous events, where they occur, and their risk levels.

Practical implications

Thanks to the abnormal risk panel, human operators and business managers are able not only to remotely visualize the real-time status of each production parameter, but also to properly deal with anomalous events, intervening only when necessary. This is especially relevant in an emergency situation, such as the COVID-19 pandemic.

Originality/value

The monitoring platform is one of the first attempts at leading modern manufacturing systems toward the Industry 5.0 concept. Indeed, it combines human strengths, IoT technology on machines, cloud-based solutions with AI and zero-defect manufacturing strategies in a unified framework, so as to detect causalities in complex dynamic systems and enable the avoidance of product waste.

Details

Journal of Manufacturing Technology Management, vol. 34 no. 4
Type: Research Article
ISSN: 1741-038X


Open Access
Article
Publication date: 29 January 2024

Miaoxian Guo, Shouheng Wei, Chentong Han, Wanliang Xia, Chao Luo and Zhijian Lin

Surface roughness has a serious impact on the fatigue strength, wear resistance and life of mechanical products. Realizing the evolution of surface quality through theoretical…

Abstract

Purpose

Surface roughness has a serious impact on the fatigue strength, wear resistance and life of mechanical products. Realizing the evolution of surface quality through theoretical modeling takes considerable effort. To predict surface roughness in milling, this paper aims to construct a neural network based on deep learning and data augmentation.

Design/methodology/approach

This study proposes a method consisting of three steps. Firstly, the machine tool multisource data acquisition platform is established, which combines sensor monitoring with machine tool communication to collect processing signals. Secondly, the feature parameters are extracted to reduce the interference and improve the model generalization ability. Thirdly, for different expectations, the parameters of the deep belief network (DBN) model are optimized by the Tent-SSA algorithm to achieve more accurate roughness classification and regression prediction.

Findings

The adaptive synthetic sampling (ADASYN) algorithm can improve the classification prediction accuracy of DBN from 80.67% to 94.23%. After the DBN parameters were optimized by Tent-SSA, the roughness prediction accuracy was significantly improved. For the classification model, the prediction accuracy is improved by 5.77% based on ADASYN optimization. For regression models, different objective functions can be set according to production requirements, such as root-mean-square error (RMSE) or MaxAE, and the error is reduced by more than 40% compared to the original model.
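
As a hedged illustration of the class-rebalancing step these findings highlight, the sketch below applies ADASYN from imbalanced-learn before training a roughness-class classifier. A gradient boosting model stands in for the paper's DBN (scikit-learn provides no DBN), and the features and labels are synthetic placeholders rather than the extracted milling-signal features.

```python
# Sketch under stated assumptions; the classifier and data are stand-ins, not the paper's setup.
import numpy as np
from imblearn.over_sampling import ADASYN
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))                                      # placeholder feature matrix
y = (X[:, 0] + rng.normal(scale=0.5, size=300) > 1.0).astype(int)   # imbalanced roughness classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_bal, y_bal = ADASYN(random_state=0).fit_resample(X_tr, y_tr)      # oversample the minority class

clf = GradientBoostingClassifier().fit(X_bal, y_bal)
print("test accuracy:", clf.score(X_te, y_te))
```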

Originality/value

A roughness prediction model based on multiple monitoring signals is proposed, which reduces the dependence on the acquisition of environmental variables and enhances the model's applicability. Furthermore, alongside the ADASYN algorithm, the Tent-SSA intelligent optimization algorithm is introduced to optimize the hyperparameters of the DBN model and improve its optimization performance.

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2633-6596


Open Access
Article
Publication date: 31 October 2022

Sunday Adewale Olaleye, Emmanuel Mogaji, Friday Joseph Agbo, Dandison Ukpabi and Akwasi Gyamerah Adusei

The data economy mainly relies on the surveillance capitalism business model, enabling companies to monetize their data. The surveillance allows for transforming private human…


Abstract

Purpose

The data economy mainly relies on the surveillance capitalism business model, enabling companies to monetize their data. This surveillance transforms private human experiences into behavioral data that can be harnessed in the marketing sphere. This study aims to investigate the domain of the data economy through the methodological lens of a quantitative bibliometric analysis of the published literature.

Design/methodology/approach

The bibliometric analysis seeks to unravel trends and timelines for the emergence of the data economy, its conceptualization, scientific progression and thematic synergy that could predict the future of the field. A total of 591 records published between 2008 and June 2021 were analyzed with the web-based Biblioshiny app and VOSviewer version 1.6.16, drawing on data from Web of Science and Scopus.

Findings

This study combined findable, accessible, interoperable and reusable (FAIR) data and data economy and contributed to the literature on big data, information discovery and delivery by shedding light on the conceptual, intellectual and social structure of data economy and demonstrating data relevance as a key strategic asset for companies and academia now and in the future.

Research limitations/implications

Findings from this study provide a steppingstone for researchers who may engage in further empirical and longitudinal studies by employing, for example, a quantitative and systematic review approach. In addition, future research could expand the scope of this study beyond FAIR data and data economy to examine aspects such as theories and show a plausible explanation of several phenomena in the emerging field.

Practical implications

The researchers can use the results of this study as a steppingstone for further empirical and longitudinal studies.

Originality/value

This study confirmed the relevance of data to society and revealed some gaps to be addressed in future research.

Details

Information Discovery and Delivery, vol. 51 no. 2
Type: Research Article
ISSN: 2398-6247


Open Access
Article
Publication date: 7 February 2023

Roberto De Luca, Antonino Ferraro, Antonio Galli, Mosè Gallo, Vincenzo Moscato and Giancarlo Sperlì

The recent innovations of Industry 4.0 have made it possible to easily collect data related to a production environment. In this context, information about industrial equipment…


Abstract

Purpose

The recent innovations of Industry 4.0 have made it possible to easily collect data related to a production environment. In this context, information about industrial equipment – gathered by proper sensors – can be profitably used for supporting predictive maintenance (PdM) through the application of data-driven analytics based on artificial intelligence (AI) techniques. Although deep learning (DL) approaches have proven to be quite effective solutions to the problem, an open research challenge remains: the design of PdM methods that are computationally efficient and, most importantly, applicable in real-world Internet of Things (IoT) scenarios, where they must run directly on the devices' limited hardware.

Design/methodology/approach

In this paper, the authors propose a DL approach to the PdM task, based on a particular and very efficient architecture. The major novelty of the proposed framework is the use of a multi-head attention (MHA) mechanism to obtain both accurate remaining useful life (RUL) estimation and low model memory and storage requirements, providing the basis for a possible implementation directly on the equipment hardware.
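
A minimal sketch of the kind of lightweight attention-based RUL regressor described here: a single multi-head self-attention block over a sensor window, followed by pooling and a dense head, built with Keras. The window shape, head count and layer widths are illustrative assumptions, not the authors' architecture.

```python
# Illustrative sketch only; shapes and sizes are assumptions, not the paper's model.
from tensorflow import keras
from tensorflow.keras import layers

timesteps, n_sensors = 30, 14                    # assumed sensor-window shape

inputs = keras.Input(shape=(timesteps, n_sensors))
attn = layers.MultiHeadAttention(num_heads=4, key_dim=16)(inputs, inputs)  # self-attention
x = layers.LayerNormalization()(layers.Add()([attn, inputs]))              # residual + norm
x = layers.GlobalAveragePooling1D()(x)
rul = layers.Dense(1, activation="relu")(x)      # RUL estimate, constrained to be >= 0

model = keras.Model(inputs, rul)
model.compile(optimizer="adam", loss="mse")
model.summary()   # parameter count stays small compared with a typical stacked-LSTM baseline
```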

Findings

The experimental results achieved on the NASA dataset show that the authors' approach outperforms the majority of the most widely adopted state-of-the-art techniques in terms of both effectiveness and efficiency.

Research limitations/implications

A comparison of the spatial and temporal complexity with a typical long short-term memory (LSTM) model and with the state-of-the-art approaches was also carried out on the NASA dataset. While the authors' approach achieves effectiveness comparable to the other approaches, it has a significantly smaller number of parameters, a smaller storage volume and a lower training time.

Practical implications

The proposed approach aims to find a compromise between effectiveness and efficiency, which is crucial in the industrial domain, where it is important to maximize the ratio between the performance attained and the resources allocated. Its overall accuracy is also on par with the best methods described in the literature.

Originality/value

The proposed approach satisfies the requirements of modern embedded AI applications (reliability, low power consumption, etc.), striking a compromise between efficiency and effectiveness.

Details

Journal of Manufacturing Technology Management, vol. 34 no. 4
Type: Research Article
ISSN: 1741-038X


Open Access
Article
Publication date: 11 February 2020

Brian T. Ratchford

The purpose of this study is to determine what the history of research in marketing implies for the reaction of the field to recent developments in technology due to the internet…


Abstract

Purpose

The purpose of this study is to determine what the history of research in marketing implies for the reaction of the field to recent developments in technology due to the internet and associated developments.

Design/methodology/approach

This paper examines the introduction of new research topics over 10-year intervals from 1960 to the present. These provide the basic body of knowledge that drives the field at the present time.

Findings

While researchers have always borrowed techniques, they have refined them to make them applicable to marketing problems. Moreover, the field has always responded to new developments in technology, such as more powerful computers, scanners and scanner data, and the internet, with a flurry of research that applies these technologies.

Research limitations/implications

Marketing will adapt to changes brought on by the internet, increased computer power and big data. While the field faces competition from other disciplines, its established body of knowledge about solving marketing problems gives it a unique advantage.

Originality/value

This paper traces the history of academic marketing from 1960 to the present to show how major changes in the field responded to changes in computer power and technology. It also derives implications for the future from this analysis.

Purpose

The objective of this study is to examine what the history of academic research in marketing implies for the field's reaction to recent technological developments resulting from the emergence of the Internet.

Methodology

This research analyzes the introduction of new research topics in ten-year intervals from 1960 to the present. These periods provide the basic body of knowledge that has guided the marketing field up to the present.

Findings

Although researchers have traditionally borrowed certain techniques, they have refined them to apply them to marketing problems. Moreover, the marketing field has always responded to new technological developments, such as greater computing power, scanner data and the development of the Internet, with a large body of research applying those technologies.

Implications

Marketing will adapt to the changes brought about by the Internet, increasing computing power and big data. Although marketing faces competition from other disciplines, its solid body of knowledge oriented toward problem solving gives it a unique advantage.

Value

The paper traces the academic history of marketing from 1960 to the present to show how major changes in the field responded to technological change. Interesting implications for the future are derived.

Keywords

History, Review, Change, Technology, Knowledge, Internet, Data, Methods

Article type

General review

Details

Spanish Journal of Marketing - ESIC, vol. 24 no. 1
Type: Research Article
ISSN: 2444-9709


Open Access
Article
Publication date: 11 August 2021

Yang Zhao and Zhonglu Chen

This study explores whether a new machine learning method can more accurately predict the movement of stock prices.


Abstract

Purpose

This study explores whether a new machine learning method can more accurately predict the movement of stock prices.

Design/methodology/approach

This study presents a novel hybrid deep learning model, Residual-CNN-Seq2Seq (RCSNet), to predict the trend of stock price movement. RCSNet integrates the autoregressive integrated moving average (ARIMA) model, a convolutional neural network (CNN) and the sequence-to-sequence (Seq2Seq) long short-term memory (LSTM) model.
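
A hedged sketch of the hybrid idea behind such a model: an ARIMA fit captures the linear component of the price series, and a small CNN plus LSTM network learns the non-linear structure left in the ARIMA residuals. The order, window size, layer widths and the plain LSTM standing in for the Seq2Seq decoder are simplifications for illustration, not the RCSNet configuration, and the series below is a random placeholder rather than S&P 500 data.

```python
# Simplified hybrid sketch under stated assumptions; not the RCSNet implementation.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from tensorflow import keras
from tensorflow.keras import layers

prices = 100 + np.cumsum(np.random.default_rng(0).normal(size=500))  # placeholder series

arima = ARIMA(prices, order=(1, 1, 1)).fit()          # 1) linear component
residuals = np.asarray(arima.resid)

win = 20                                              # 2) residual windows -> next residual
X = np.stack([residuals[i:i + win] for i in range(len(residuals) - win)])[..., None]
y = residuals[win:]

inp = keras.Input(shape=(win, 1))
h = layers.Conv1D(16, kernel_size=3, activation="relu")(inp)
h = layers.LSTM(32)(h)
out = layers.Dense(1)(h)
net = keras.Model(inp, out)
net.compile(optimizer="adam", loss="mse")
net.fit(X, y, epochs=3, verbose=0)

# 3) hybrid one-step forecast = ARIMA forecast + predicted residual correction
next_linear = float(arima.forecast(steps=1)[0])
next_residual = float(net.predict(residuals[-win:].reshape(1, win, 1), verbose=0)[0, 0])
print("hybrid forecast:", next_linear + next_residual)
```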

Findings

The hybrid model is able to forecast both the linear and the non-linear time-series components of the stock dataset. CNN and Seq2Seq LSTMs can be effectively combined for dynamic modeling of short- and long-term-dependent patterns in non-linear time-series forecasting. Experimental results show that the proposed model outperforms baseline models on the S&P 500 index dataset from January 2000 to August 2016.

Originality/value

This study develops the RCSNet hybrid model to tackle the challenge by combining linear and non-linear models. New evidence is obtained on predicting the movement of stock market prices.

Details

Journal of Asian Business and Economic Studies, vol. 29 no. 2
Type: Research Article
ISSN: 2515-964X


Open Access
Article
Publication date: 8 December 2022

James Christopher Westland

This paper tests whether Bayesian A/B testing yields better decisions than traditional Neyman-Pearson hypothesis testing. It proposes a model and tests it using a large, multiyear…


Abstract

Purpose

This paper tests whether Bayesian A/B testing yields better decisions than traditional Neyman-Pearson hypothesis testing. It proposes a model and tests it using a large, multiyear Google Analytics (GA) dataset.

Design/methodology/approach

This paper is an empirical study. Competing A/B testing models were used to analyze a large, multiyear GA dataset for a firm that relies entirely on its website and online transactions for customer engagement and sales.

Findings

Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the intellectual property fraud, but also calculated the loss of sales dollars, traffic and time on the firm's website, with precise confidence limits. Frequentist A/B testing identified fraud in the bounce rate at 5% significance, and in bounces at 10% significance, but was unable to ascertain fraud at the standard significance cutoffs used in scientific studies.
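
For reference, a minimal sketch of the Bayesian side of such a comparison: a Beta-Binomial model of two conversion rates sampled with NumPy. The counts are invented placeholders, not the paper's GA data; the point is that the posterior yields a direct probability that one variant outperforms the other and a credible interval for the lift, quantities a p-value does not provide.

```python
# Minimal Beta-Binomial A/B sketch with placeholder counts; not the paper's model or data.
import numpy as np

rng = np.random.default_rng(42)
visitors_a, conversions_a = 10_000, 410          # variant / period A (placeholder)
visitors_b, conversions_b = 10_000, 360          # variant / period B (placeholder)

# posterior samples under a flat Beta(1, 1) prior on each conversion rate
post_a = rng.beta(1 + conversions_a, 1 + visitors_a - conversions_a, size=200_000)
post_b = rng.beta(1 + conversions_b, 1 + visitors_b - conversions_b, size=200_000)

lift = post_a - post_b
print("P(A converts better than B):", (lift > 0).mean())
print("95% credible interval for the lift:", np.percentile(lift, [2.5, 97.5]))
```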

Research limitations/implications

None within the scope of the research plan.

Practical implications

Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the IP fraud, but also calculated the loss of sales dollars, traffic and time on the firm's website, with precise confidence limits.

Social implications

Bayesian A/B testing can derive economically meaningful statistics, whereas frequentist A/B testing only provides p-values, whose meaning may be hard to grasp and whose misuse is widespread and has been a major topic in metascience. While the misuse of p-values in scholarly articles may simply be grist for academic debate, the uncertainty surrounding the meaning of p-values in business analytics can actually cost firms money.

Originality/value

There is very little empirical research in e-commerce that uses Bayesian A/B testing. Almost all corporate testing is done via frequentist Neyman-Pearson methods.

Details

Journal of Electronic Business & Digital Economics, vol. 1 no. 1/2
Type: Research Article
ISSN: 2754-4214


Open Access
Article
Publication date: 2 March 2023

Kartik Venkatraman, Stéphane Moreau, Julien Christophe and Christophe Schram

The purpose of the paper is to predict the aerodynamic performance of a complete scale model H-Darrieus vertical axis wind turbine (VAWT) with end plates at different operating…


Abstract

Purpose

The purpose of the paper is to predict the aerodynamic performance of a complete scale model H-Darrieus vertical axis wind turbine (VAWT) with end plates at different operating conditions. This paper aims at understanding the flow physics around a model VAWT for three different tip speed ratios corresponding to three different flow regimes.

Design/methodology/approach

This study develops a first three-dimensional hybrid lattice Boltzmann method/very large eddy simulation (LBM-VLES) model of a complete scaled-model VAWT with end plates and mast using the solver PowerFLOW. The power curve predicted from the numerical simulations is compared with the experimental data collected at Erlangen University. This study highlights the complexity of the turbulent flow features seen in three different operational regimes of the turbine using instantaneous flow structures, mean velocity, pressure iso-contours, blade loading and skin friction plots.
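
For reference, the power curve such a comparison is built on plots the power coefficient against the tip speed ratio; the short sketch below shows how both can be computed from a simulated mean rotor torque. All numbers are illustrative placeholders, not the scale-model turbine's specifications.

```python
# Reference calculation with placeholder values, not the model turbine's actual parameters.
rho = 1.225          # air density, kg/m^3
R, H = 0.5, 0.8      # assumed rotor radius and blade height, m
U_inf = 9.0          # freestream wind speed, m/s
omega = 54.0         # rotor angular speed, rad/s
torque = 1.2         # mean aerodynamic torque from the simulation, N*m

tsr = omega * R / U_inf                                    # tip speed ratio
swept_area = 2 * R * H                                     # H-Darrieus swept area (D x H)
cp = torque * omega / (0.5 * rho * swept_area * U_inf**3)  # power coefficient
print(f"TSR = {tsr:.2f}, Cp = {cp:.3f}")
```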

Findings

The power curve predicted using the LBM-VLES approach and setup provides a good overall match with the experimental power curve, with the peak and the drop after the operational point being captured. Varied turbulent flow structures, which depend on the tip speed ratio (TSR), are seen over the azimuthal revolution. Significant dynamic stall structures are seen in the upwind phase and at the end of the downwind phase of rotation in the deep stall regime. Strong blade-wake interactions and turbulent flow structures are seen inside the rotor at higher TSRs.

Research limitations/implications

The computational cost and time for such high-fidelity simulations using the LBM-VLES remain considerable. Each simulation requires around a week on supercomputing facilities. Further studies need to be performed to improve analytical VAWT models using inputs/calibration from high-fidelity simulation databases. As future work, the impact of turbulent and nonuniform inflow conditions that are more representative of a typical urban environment also needs to be investigated.

Practical implications

The LBM methodology is shown to be a reliable approach for VAWT power prediction. Dynamic stall and blade wake interactions reduce the aerodynamic performance of a VAWT. An ideal operation close to the peak of the power curve should be favored based on the local wind resource, as this point exhibits a smoother variation of forces improving operational performance. The 3D flow features also exhibit a significant wake asymmetry that could impact the optimal layout of VAWT clusters to increase their power density. The present work also highlights the importance of 3D simulations of the complete model including the support structures such as end plates and mast.

Social implications

Accurate predictions of power performance for Darrieus VAWTs could help in better siting of wind turbines thus improving return of investment and reducing levelized cost of energy. It could promote the development of onsite electricity generation, especially for industrial sites/urban areas and renew interest for VAWT wind farms.

Originality/value

A first high-fidelity simulation of a complete VAWT with end plates and supporting structures has been performed using the LBM approach and compared with experimental data. The 3D flow physics has been analyzed at different operating regimes of the turbine. These physical insights and prediction capabilities of this approach could be useful for commercial VAWT manufacturers.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 33 no. 4
Type: Research Article
ISSN: 0961-5539


Open Access
Article
Publication date: 25 July 2022

Cara Greta Kolb, Maja Lehmann, Johannes Kriegler, Jana-Lorena Lindemann, Andreas Bachmann and Michael Friedrich Zaeh

This paper aims to present a requirements analysis for the processing of water-based electrode dispersions in inkjet printing.


Abstract

Purpose

This paper aims to present a requirements analysis for the processing of water-based electrode dispersions in inkjet printing.

Design/methodology/approach

A detailed examination of the components and the associated properties of the electrode dispersions has been carried out. The requirements of the printing process and the resulting performance characteristics of the electrode dispersions were analyzed in a top–down approach. The product and process side were compared, and the target specifications of the dispersion components were derived.

Findings

Target ranges have been identified for the main component properties, balancing the partly conflicting goals between the product and the process requirements.

Practical implications

The findings are expected to assist with the formulation of electrode dispersions as printing inks.

Originality/value

Little knowledge is available regarding the particular requirements arising from the systematic qualification of aqueous electrode dispersions for inkjet printing. This paper addresses these requirements, covering both product and process specifications.

Details

Rapid Prototyping Journal, vol. 28 no. 11
Type: Research Article
ISSN: 1355-2546


Open Access
Article
Publication date: 11 March 2022

Andrei Khrennikov

This paper aims to present the basic assumptions for the creation of a social Fröhlich condensate and to attract the attention of other researchers (both from physics and socio-political…

Abstract

Purpose

This paper aims to present the basic assumptions for the creation of a social Fröhlich condensate and to attract the attention of other researchers (from both physics and socio-political science) to the problem of modeling stability and order preservation in a highly energetic society coupled with a high-temperature social energy bath.

Design/methodology/approach

The model of social Fröhlich condensation and its analysis are based on the mathematical formalism of quantum thermodynamics and field theory (applied outside of physics).

Findings

The presented quantum-like model provides a consistent operational model of such a complex socio-political phenomenon as Fröhlich condensation.

Research limitations/implications

The model of social Fröhlich condensation is heavily based on the theory of open quantum systems. Its consistent elaboration requires additional effort.

Practical implications

Evidence of such a phenomenon as social Fröhlich condensation is provided by the stability of modern, informationally open societies.

Social implications

Approaching the state of Fröhlich condensation is a powerful source of social stability. Understanding its informational structure and origin may help to stabilize modern society.

Originality/value

The application of a quantum-like model of Fröhlich condensation in the social and political sciences is a novel and original approach to the mathematical modeling of social stability in a society exposed to powerful informational radiation from mass media and Internet-based sources.
