Search results

1 – 10 of 21
Open Access
Article
Publication date: 31 July 2023

Daniel Šandor and Marina Bagić Babac


Abstract

Purpose

Sarcasm is a linguistic expression that usually carries the opposite meaning of what is literally said, thus making it difficult for machines to discover the actual meaning. It is mainly distinguished by the inflection with which it is spoken, with an undercurrent of irony, and is largely dependent on context, which makes it a difficult task for computational analysis. Moreover, sarcasm expresses negative sentiments using positive words, allowing it to easily confuse sentiment analysis models. This paper aims to demonstrate the task of sarcasm detection using machine learning and deep learning approaches.

Design/methodology/approach

For the purpose of sarcasm detection, machine and deep learning models were used on a data set consisting of 1.3 million social media comments, including both sarcastic and non-sarcastic comments. The data set was pre-processed using natural language processing methods, and additional features were extracted and analysed. Several machine learning models, including logistic regression, ridge regression, linear support vector and support vector machines, along with two deep learning models based on bidirectional long short-term memory and one bidirectional encoder representations from transformers (BERT)-based model, were implemented, evaluated and compared.
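
The classic machine learning pipeline the abstract describes (extracted text features feeding, e.g., a logistic regression classifier) can be sketched with scikit-learn. The tiny inline corpus and TF-IDF features below are illustrative assumptions, not the authors' 1.3 million-comment data set or feature choices.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-in for the social media corpus (label 1 = sarcastic).
texts = [
    "Oh great, another Monday. I just love Mondays.",
    "The weather is lovely and sunny today.",
    "Wow, what a fantastic idea, nothing could possibly go wrong.",
    "The train arrived on time this morning.",
]
labels = [1, 0, 1, 0]

# TF-IDF unigram/bigram features feeding a logistic regression
# classifier, mirroring one of the ML baselines named in the abstract.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["Sure, because waiting in line for hours is so much fun."]))
```

The same pipeline shape accommodates the other linear baselines (ridge regression, linear support vector classification) by swapping the final estimator.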

Findings

The performance of machine and deep learning models was compared in the task of sarcasm detection, and possible ways of improvement were discussed. Deep learning models showed more promise, performance-wise, for this type of task. Specifically, a state-of-the-art model in natural language processing, namely a BERT-based model, outperformed the other machine and deep learning models.

Originality/value

This study compared the performance of the various machine and deep learning models in the task of sarcasm detection using the data set of 1.3 million comments from social media.

Details

Information Discovery and Delivery, vol. 52 no. 2
Type: Research Article
ISSN: 2398-6247

Open Access
Article
Publication date: 29 February 2024

Guanchen Liu, Dongdong Xu, Zifu Shen, Hongjie Xu and Liang Ding

Abstract

Purpose

As an advanced manufacturing method, additive manufacturing (AM) technology provides new possibilities for efficient production and design of parts. However, with the continuous expansion of the application of AM materials, subtractive processing has become one of the necessary steps to improve the accuracy and performance of parts. In this paper, the machining of AM materials is discussed in depth, along with the surface integrity problems it causes.

Design/methodology/approach

Firstly, we listed and analyzed the characterization parameters of metal surface integrity and their influence on part performance, and then introduced the application of integrated additive-subtractive machining of metals and the influence of different processing forms on the surface integrity of parts. The surface of the trial-cut material was inspected and analyzed, and surfaces produced by integrated additive-subtractive machining were compared with those produced by purely subtractive machining to draw the corresponding conclusions.

Findings

In this process, we also found some surface integrity problems, such as tool marks, residual stress and thermal effects. These problems may have a negative impact on the performance of the final parts. In processing, we can try other integrated additive-subtractive technologies, combine several such technologies, or explore more efficient AM technology to improve processing efficiency. We can also consider production process optimization measures to reduce the cost of additive-subtractive machining.

Originality/value

With the gradual increase in surface quality requirements for parts and the in-depth implementation of sustainable manufacturing, demand for integrated additive-subtractive machining of metals is likely to continue to grow. By deeply understanding and studying the subtractive machining and surface integrity problems of AM materials, we can better meet the challenges of the manufacturing process and improve the quality and performance of parts. This research is important for advancing manufacturing technology and achieving success in practical applications.

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2633-6596

Open Access
Article
Publication date: 20 March 2024

Guijian Xiao, Tangming Zhang, Yi He, Zihan Zheng and Jingzhe Wang

Abstract

Purpose

The purpose of this review is to comprehensively consider the material properties and processing of additive titanium alloy and provide a new perspective for the robotic grinding and polishing of additive titanium alloy blades to ensure the surface integrity and machining accuracy of the blades.

Design/methodology/approach

At present, robot grinding and polishing are mainstream methods in automated blade processing. This review systematically summarizes the processing characteristics and processing methods of additive manufacturing (AM) titanium alloy blades. On the one hand, the unique manufacturing process and thermal effects of AM create the distinctive processing characteristics of additive titanium alloy blades. On the other hand, the robot grinding and polishing process needs to incorporate a material removal model into the traditional processing flow according to the processing characteristics of the additive titanium alloy.

Findings

Robot belt grinding can solve the processing problems of additive titanium alloy blades. A robot grinding trajectory is generated for the complex surface of the blade through trajectory planning, and this trajectory planning profoundly affects the machining accuracy and surface quality of the blade. Subsequent research is needed to achieve high machining accuracy of blade profiles, to build material removal models for complex surfaces and to handle the uneven distribution of blade machining allowance. Among the robot process parameters, grinding parameters, trajectory planning and error compensation affect the surface quality of the blade through the material removal method, grinding force and grinding temperature. The machining accuracy of the blade surface is affected by robot vibration and stiffness.

Originality/value

This review systematically summarizes the processing characteristics and processing methods of aviation titanium alloy blades manufactured by AM. Combined with the material properties of additive titanium alloy, it provides a new idea for robot grinding and polishing of aviation titanium alloy blades manufactured by AM.

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2633-6596

Open Access
Article
Publication date: 8 April 2024

Oussama-Ali Dabaj, Ronan Corin, Jean-Philippe Lecointe, Cristian Demian and Jonathan Blaszkowski

Abstract

Purpose

This paper aims to investigate the impact of combining grain-oriented electrical steel (GOES) grades on specific iron losses and the flux density distribution within a single-phase magnetic core.

Design/methodology/approach

This paper presents the results of finite-element method (FEM) simulations investigating the impact of mixing two different GOES grades on losses of a single-phase magnetic core. The authors used different models: a 3D model with a highly detailed geometry including both saturation and anisotropy, as well as a simplified 2D model to save computation time. The behavior of the flux distribution in the mixed magnetic core is analyzed. Finally, the results from the numerical simulations are compared with experimental results.

Findings

The specific iron losses of a mixed magnetic core exhibit a nonlinear decrease with respect to the GOES grade with the lowest losses. Analyzing the magnetic core behavior using 2D and 3D FEM shows that the rolling direction of the GOES grades plays a critical role in the nonlinear variation of the specific losses.

Originality/value

The novelty of this research lies in achieving an optimum trade-off between manufacturing cost and core efficiency by combining conventional and high-performance GOES grades in a single-phase magnetic core.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0332-1649

Open Access
Article
Publication date: 2 April 2024

Koraljka Golub, Osma Suominen, Ahmed Taiye Mohammed, Harriet Aagaard and Olof Osterman

Abstract

Purpose

In order to estimate the value of semi-automated subject indexing in operative library catalogues, the study aimed to investigate five different automated implementations of an open source software package on a large set of Swedish union catalogue metadata records, with Dewey Decimal Classification (DDC) as the target classification system. It also aimed to contribute to the body of research on aboutness and related challenges in automated subject indexing and evaluation.

Design/methodology/approach

On a sample of over 230,000 records with close to 12,000 distinct DDC classes, an open source tool, Annif, developed by the National Library of Finland, was applied in the following implementations: lexical algorithm, support vector classifier, fastText, Omikuji Bonsai and an ensemble approach combining the former four. A qualitative study involving two senior catalogue librarians and three students of library and information studies was also conducted on a sample of 60 records to investigate the value and inter-rater agreement of automatically assigned classes.
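
At its core, an ensemble such as Annif's combines per-class suggestion scores from several backends. A minimal sketch of that idea, using made-up scores over four hypothetical DDC classes (illustrative numbers, not output from the study):

```python
import numpy as np

# Hypothetical per-class scores from three Annif-style backends
# for one catalogue record, over four candidate DDC classes.
lexical = np.array([0.10, 0.60, 0.20, 0.10])
svc     = np.array([0.05, 0.55, 0.30, 0.10])
omikuji = np.array([0.20, 0.40, 0.30, 0.10])

# A simple (unweighted) ensemble averages the backend scores and
# suggests the highest-scoring class.
ensemble = np.mean([lexical, svc, omikuji], axis=0)
ddc_classes = ["004", "020", "025", "820"]
suggested = ddc_classes[int(np.argmax(ensemble))]
print(suggested)
```

In practice the top-ranked classes would be offered to cataloguers as semi-automated suggestions rather than assigned outright, matching the study's framing.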

Findings

The best results were achieved using the ensemble approach that achieved 66.82% accuracy on the three-digit DDC classification task. The qualitative study confirmed earlier studies reporting low inter-rater agreement but also pointed to the potential value of automatically assigned classes as additional access points in information retrieval.

Originality/value

The paper presents an extensive study of automated classification in an operative library catalogue, accompanied by a qualitative study of automated classes. It demonstrates the value of applying semi-automated indexing in operative information retrieval systems.

Details

Journal of Documentation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0022-0418

Open Access
Article
Publication date: 12 January 2024

Patrik Jonsson, Johan Öhlin, Hafez Shurrab, Johan Bystedt, Azam Sheikh Muhammad and Vilhelm Verendel

Abstract

Purpose

This study aims to explore and empirically test variables influencing material delivery schedule inaccuracies.

Design/methodology/approach

A mixed-method case approach is applied. Explanatory variables are identified from the literature and explored in a qualitative analysis at an automotive original equipment manufacturer. Using logistic regression and random forest classification models, quantitative data (historical schedule transactions and internal data) enables the testing of the predictive difference of variables under various planning horizons and inaccuracy levels.
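
The quantitative step can be sketched as follows; the synthetic features and labels below merely stand in for the study's historical schedule transactions and internal OEM data, which are not available here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for explanatory variables (e.g. product
# complexity, order life-cycle stage, planning horizon).
n = 400
X = rng.normal(size=(n, 3))
# Binary inaccuracy flag driven mainly by the first ("complexity") variable.
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

# The two model families used in the study: logistic regression and
# random forest classification.
logit = LogisticRegression().fit(X, y)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Both models should pick up the dominant predictor.
print(round(logit.score(X, y), 2), round(forest.score(X, y), 2))
```

Repeating such fits per planning horizon and inaccuracy level, and comparing the models' predictive performance, mirrors the testing design described above.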

Findings

The effects on delivery schedule inaccuracies are contingent on a decoupling point, and a variable may have a combined amplifying (complexity generating) and stabilizing (complexity absorbing) moderating effect. Product complexity variables are significant regardless of the time horizon, and the item’s order life cycle is a significant variable with predictive differences that vary. Decoupling management is identified as a mechanism for generating complexity absorption capabilities contributing to delivery schedule accuracy.

Practical implications

The findings provide guidelines for exploring and finding patterns in specific variables to improve material delivery schedule inaccuracies and input into predictive forecasting models.

Originality/value

The findings contribute to explaining material delivery schedule variations, identifying potential root causes and moderators, empirically testing and validating effects and conceptualizing features that cause and moderate inaccuracies in relation to decoupling management and complexity theory literature.

Details

International Journal of Operations & Production Management, vol. 44 no. 13
Type: Research Article
ISSN: 0144-3577

Open Access
Article
Publication date: 19 January 2024

Fuzhao Chen, Zhilei Chen, Qian Chen, Tianyang Gao, Mingyan Dai, Xiang Zhang and Lin Sun

Abstract

Purpose

The electromechanical brake system is leading the latest development trend in railway braking technology. The tolerance stack-up generated during assembly and production introduces slight geometric deviations between the motor stator and rotor inside the electromechanical cylinder. These deviations lead to imprecise brake control, so it is necessary to diagnose faults of the motor in the fully assembled electromechanical brake system. This paper aims to present an improved variational mode decomposition (VMD) algorithm, which endeavors to elucidate and push the boundaries of mechanical synchronicity problems within the electromechanical brake system.

Design/methodology/approach

The VMD algorithm plays a pivotal role in the preliminary phase, employing mode decomposition techniques to decompose the motor speed signals. Afterward, the error energy algorithm is utilized to extract abnormal features, leveraging the practical intrinsic mode functions, eliminating extraneous noise and enhancing the signal's fidelity. This refined signal then becomes the basis for fault analysis. In the analytical step, the cepstrum is employed to calculate the formant and envelope of the reconstructed signal. By scrutinizing the formant and envelope, the fault point within the electromechanical brake system is precisely identified, contributing to a sophisticated and accurate fault diagnosis.
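
The cepstral step can be sketched with NumPy: the real cepstrum is the inverse FFT of the log-magnitude spectrum, and a periodic fault signature appears as a peak at its characteristic lag, much as an echo does. The synthetic signal below is only an illustrative stand-in for the reconstructed (denoised) motor speed signal.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
delay = 20  # fault periodicity in samples (illustrative)

# Broadband stand-in signal plus a delayed, attenuated copy of itself,
# emulating a periodic fault signature in the reconstructed signal.
s = rng.normal(size=n)
x = s.copy()
x[delay:] += 0.5 * s[:-delay]

# Real cepstrum: inverse FFT of the log-magnitude spectrum.
log_mag = np.log(np.abs(np.fft.fft(x)) + 1e-12)
cepstrum = np.fft.ifft(log_mag).real

# The periodic component shows up as a cepstral peak at its lag.
peak = int(np.argmax(cepstrum[1 : n // 2])) + 1
print(peak)
```

Scanning such cepstral peaks (and the spectral envelope they separate out) is what allows the fault lag, and hence the fault point, to be located.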

Findings

This paper innovatively uses the VMD algorithm for the modal decomposition of electromechanical brake (EMB) motor speed signals and combines it with the error energy algorithm to achieve abnormal feature extraction. The signal is reconstructed from the effective intrinsic mode function (IMF) components after removing noise, and the formant and envelope are calculated by cepstrum to locate the fault point. Experiments show that the improved VMD algorithm can effectively decompose the original speed signal. After feature extraction, signal enhancement and fault identification, the motor mechanical fault point can be accurately located. This fault diagnosis method is an effective algorithm suitable for EMB systems.

Originality/value

By using this improved VMD algorithm, the electromechanical brake system can precisely identify the rotational anomaly of the motor. This method can offer an online diagnosis analysis function during operation and contribute to an automated factory inspection strategy while parts are assembled. Compared with the conventional motor diagnosis method, this improved VMD algorithm can eliminate the need for additional acceleration sensors and save hardware costs. Moreover, the accumulation of online detection functions helps improve the reliability of train electromechanical braking systems.

Open Access
Article
Publication date: 21 March 2024

Warisa Thangjai and Sa-Aat Niwitpong

Abstract

Purpose

Confidence intervals play a crucial role in economics and finance, providing a credible range of values for an unknown parameter along with a corresponding level of certainty. Their applications encompass economic forecasting, market research, financial forecasting, econometric analysis, policy analysis, financial reporting, investment decision-making, credit risk assessment and consumer confidence surveys. Signal-to-noise ratio (SNR) finds applications in economics and finance across various domains such as economic forecasting, financial modeling, market analysis and risk assessment. A high SNR indicates a robust and dependable signal, simplifying the process of making well-informed decisions. On the other hand, a low SNR indicates a weak signal that could be obscured by noise, so decision-making procedures need to take this into serious consideration. This research focuses on the development of confidence intervals for functions derived from the SNR and explores their application in the fields of economics and finance.

Design/methodology/approach

The construction of the confidence intervals involved the application of various methodologies. For the SNR, confidence intervals were formed using the generalized confidence interval (GCI), large sample and Bayesian approaches. The difference between SNRs was estimated through the GCI, large sample, method of variance estimates recovery (MOVER), parametric bootstrap and Bayesian approaches. Additionally, confidence intervals for the common SNR were constructed using the GCI, adjusted MOVER, computational and Bayesian approaches. The performance of these confidence intervals was assessed using coverage probability and average length, evaluated through Monte Carlo simulation.
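
As a minimal sketch of one of the approaches named above, a parametric bootstrap confidence interval for the SNR (here taken as mean over standard deviation) can be formed by resampling from the fitted normal model; the data below are simulated, not from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sample (e.g. asset returns); SNR = mean / std dev.
data = rng.normal(loc=2.0, scale=1.0, size=200)
n = data.size
snr_hat = data.mean() / data.std(ddof=1)

# Parametric bootstrap: resample from the fitted normal model and
# take percentile bounds of the resampled SNR estimates.
boot = np.empty(5000)
for b in range(boot.size):
    resample = rng.normal(loc=data.mean(), scale=data.std(ddof=1), size=n)
    boot[b] = resample.mean() / resample.std(ddof=1)

lower, upper = np.percentile(boot, [2.5, 97.5])
print(round(snr_hat, 2), (round(lower, 2), round(upper, 2)))
```

Coverage probability and average length of such intervals can then be estimated by repeating this construction over many simulated samples, as in the Monte Carlo evaluation described above.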

Findings

The GCI approach demonstrated superior performance over other approaches in terms of both coverage probability and average length for the SNR and the difference between SNRs. Hence, employing the GCI approach is advised for constructing confidence intervals for these parameters. As for the common SNR, the Bayesian approach exhibited the shortest average length. Consequently, the Bayesian approach is recommended for constructing confidence intervals for the common SNR.

Originality/value

This research presents confidence intervals for functions of the SNR to assess SNR estimation in the fields of economics and finance.

Details

Asian Journal of Economics and Banking, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2615-9821

Open Access
Article
Publication date: 2 January 2024

Guillermo Guerrero-Vacas, Jaime Gómez-Castillo and Oscar Rodríguez-Alabanda

Abstract

Purpose

Polyurethane (PUR) foam parts are traditionally manufactured using metallic molds, an unsuitable approach for prototyping purposes. Thus, rapid tooling of disposable molds using fused filament fabrication (FFF) with polylactic acid (PLA) and glycol-modified polyethylene terephthalate (PETG) is proposed as an economical, simpler and faster solution compared with traditional metallic molds or three-dimensional (3D) printing with other thermoplastics that are either difficult to print, being prone to shrinkage and delamination (acrylonitrile butadiene styrene (ABS), polypropylene (PP)), or costly because of both material and printing equipment expenses (PEEK, polyamides or polycarbonate (PC)). The purpose of this study is to evaluate the ease of release of PUR foam from these materials in combination with release agents to facilitate the molding/demolding process.

Design/methodology/approach

PETG, PLA and hardenable polylactic acid (PLA 3D870) have been evaluated as mold materials in combination with aqueous and solvent-based release agents within a full design of experiments by three consecutive molding/demolding cycles.

Findings

PLA 3D870 has shown the best demoldability. A mold expressly designed to manufacture a foam cushion has been printed, and the prototyping has been successfully achieved. Demolding of the part has been easier using a solvent-based release agent, while surface quality has been better when using a water-based one.

Originality/value

The combination of PLA 3D870 and FFF, along with solvent-free water-based release agents, presents a compelling low-cost and eco-friendly alternative to traditional metallic molds and other 3D printing thermoplastics. This innovative approach serves as a viable option for rapid tooling in PUR foam molding.

Details

Rapid Prototyping Journal, vol. 30 no. 11
Type: Research Article
ISSN: 1355-2546

Open Access
Article
Publication date: 24 August 2023

Chiara Bertolin and Filippo Berto

Abstract

Purpose

This article introduces the Special Issue on Sustainable Management of Heritage Buildings in long-term perspective.

Design/methodology/approach

It starts by reviewing the gaps in knowledge and practice that led to the creation and implementation of the research project SyMBoL (Sustainable Management of Heritage Buildings in Long-Term Perspective), funded by the Norwegian Research Council over the 2018–2022 period. The SyMBoL project is the motivation behind this special issue.

Findings

The editorial paper briefly presents the main outcomes of SyMBoL. It then reviews the contributions to the Special Issue, focussing on the connection or differentiation with SyMBoL and on multidisciplinary findings that address some of the initial referred gaps.

Originality/value

The article briefly summarizes topics related to the sustainable preservation of heritage buildings in times of reduced resources, energy crisis and impacts of natural hazards and global warming. Finally, it highlights future research directions targeted at overcoming, or partially mitigating, the above-mentioned challenges, for example, by taking advantage of the interoperability of nondestructive techniques, heritage building information modelling and digital twin models, and machine learning and risk assessment algorithms.
