Search results

1 – 10 of over 2000
Open Access
Article
Publication date: 31 July 2023

Daniel Šandor and Marina Bagić Babac


Abstract

Purpose

Sarcasm is a linguistic expression that usually carries the opposite meaning of what is literally said, making it difficult for machines to discover the actual meaning. It is mainly distinguished by the inflection with which it is spoken, with an undercurrent of irony, and is largely dependent on context, which makes it a difficult task for computational analysis. Moreover, sarcasm expresses negative sentiments using positive words, allowing it to easily confuse sentiment analysis models. This paper aims to demonstrate the task of sarcasm detection using machine learning and deep learning approaches.

Design/methodology/approach

For the purpose of sarcasm detection, machine learning and deep learning models were used on a data set consisting of 1.3 million social media comments, including both sarcastic and non-sarcastic comments. The data set was pre-processed using natural language processing methods, and additional features were extracted and analysed. Several machine learning models, including logistic regression, ridge regression, linear support vector classification and support vector machines, along with two deep learning models based on bidirectional long short-term memory and one bidirectional encoder representations from transformers (BERT)-based model, were implemented, evaluated and compared.
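As a rough, hypothetical illustration of the simplest model in that line-up, the sketch below trains a bag-of-words logistic regression sarcasm classifier from scratch; the four-comment corpus and its labels are invented, not the paper's data set:

```python
import math

def bag_of_words(texts):
    # Build count vectors over the corpus vocabulary.
    vocab = sorted({w for t in texts for w in t.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    rows = []
    for t in texts:
        row = [0.0] * len(vocab)
        for w in t.lower().split():
            row[index[w]] += 1.0
        rows.append(row)
    return rows

def train_logreg(X, y, lr=0.5, epochs=300):
    # Logistic regression fitted by plain stochastic gradient descent.
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for row, label in zip(X, y):
            z = b + sum(wi * xi for wi, xi in zip(w, row))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - label  # gradient of the log-loss w.r.t. z
            b -= lr * g
            w = [wi - lr * g * xi for wi, xi in zip(w, row)]
    return w, b

def predict(w, b, row):
    return 1 if b + sum(wi * xi for wi, xi in zip(w, row)) > 0 else 0

# Toy corpus: 1 = sarcastic, 0 = non-sarcastic (invented examples).
texts = ["oh great another monday", "great job team",
         "wow such a brilliant idea", "good work"]
labels = [1, 0, 1, 0]
X = bag_of_words(texts)
w, b = train_logreg(X, labels)
```

The deep learning models in the study replace the count vectors with learned embeddings, but the training loop follows the same gradient-descent pattern.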

Findings

The performance of machine learning and deep learning models was compared in the task of sarcasm detection, and possible ways of improvement were discussed. Deep learning models showed more promise, performance-wise, for this type of task. Specifically, a state-of-the-art model in natural language processing, namely, the BERT-based model, outperformed the other machine learning and deep learning models.

Originality/value

This study compared the performance of various machine learning and deep learning models in the task of sarcasm detection using a data set of 1.3 million comments from social media.

Details

Information Discovery and Delivery, vol. 52 no. 2
Type: Research Article
ISSN: 2398-6247

Article
Publication date: 25 April 2024

Tulsi Pawan Fowdur and Ashven Sanghan


Abstract

Purpose

The purpose of this paper is to develop a blockchain-based data capture and transmission system that will collect real-time power consumption data from a household electrical appliance and transfer it securely to a local server for energy analytics such as forecasting.

Design/methodology/approach

The data capture system is composed of two current transformer (CT) sensors connected to two different electrical appliances. The CT sensors send the power readings to two Arduino microcontrollers, which in turn connect to a Raspberry Pi that aggregates the data. Blockchain is then enabled on the Raspberry Pi through a Java API so that the data are transmitted securely to a server. The server provides real-time visualization of the data as well as prediction using the multi-layer perceptron (MLP) and long short-term memory (LSTM) algorithms.
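The paper's Java blockchain layer is not shown here; as a rough, hypothetical illustration of the underlying idea, the sketch below hash-chains batches of power readings so that any tampering with an already-transmitted block is detectable:

```python
import hashlib
import json

def make_block(readings, prev_hash):
    # A block bundles a batch of power readings with the previous block's hash.
    body = {"readings": readings, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_is_valid(chain):
    # Recompute each block's hash and check the links between consecutive blocks.
    prev = "0" * 64
    for block in chain:
        body = {"readings": block["readings"], "prev_hash": block["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

# Smaller blocks mean more hash links per reading, so tampering is caught sooner.
chain, prev = [], "0" * 64
for batch in ([231.5, 230.9], [229.8, 232.1]):  # invented wattage readings
    block = make_block(batch, prev)
    chain.append(block)
    prev = block["hash"]
```

Modifying any reading after the fact breaks the recomputed digest, which is the property the paper's block-size analysis exercises.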

Findings

The results of the blockchain analysis demonstrate that when the data readings are transmitted in smaller blocks, security is much greater than with blocks of larger size. To assess the accuracy of the prediction algorithms, data were collected over a 20 min interval to train the model, and the algorithms were evaluated using the sliding window approach. The mean absolute percentage error (MAPE) was used to assess the accuracy of the algorithms, and MAPEs of 1.62% and 1.99% were obtained for the LSTM and MLP algorithms, respectively.
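The two evaluation ingredients named above, sliding windows and MAPE, are standard and can be sketched as follows (the readings below are invented, not the paper's data):

```python
def sliding_windows(series, width):
    # Pair each window of `width` past readings with the next value to predict.
    return [(series[i:i + width], series[i + width])
            for i in range(len(series) - width)]

def mape(actual, predicted):
    # Mean absolute percentage error, expressed in percent.
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

windows = sliding_windows([231.0, 230.5, 232.1, 231.8, 230.9], width=3)
error = mape([100.0, 200.0], [99.0, 202.0])  # 1.0% on this toy pair
```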

Originality/value

A detailed performance analysis of the blockchain-based transmission model using time complexity, throughput and latency as well as energy forecasting has been performed.

Details

Sensor Review, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0260-2288

Article
Publication date: 26 April 2024

Wajde Baiod and Mostaq M. Hussain


Abstract

Purpose

This study aims to focus on the five most relevant and discursive emerging technologies in accounting (cloud computing, big data and data analytics, blockchain, artificial intelligence (AI) and robotic process automation [RPA]). It investigates the adoption and use of these technologies based on data collected through a survey of accounting professionals in a technologically developed country, Canada.

Design/methodology/approach

The study investigates the adoption and use of emerging technologies based on survey data collected from accounting professionals in Canada. Considering the nature and characteristics of these emerging technologies, it proposes a model using the factors found to be significant and most commonly investigated in prior technology-organization-environment (TOE)-related technology adoption studies. The survey applies the TOE framework and examines the influence of these factors on Canadian firms' intention to adopt the said emerging technologies.

Findings

Study results indicate that Canadian accounting professionals' self-assessed knowledge about these emerging technologies is more theoretical than operational. Cloud computing is highly used by Canadian firms, while the use of other technologies, particularly blockchain and RPA, is reportedly low. However, firms' intentions regarding the future adoption of these technologies seem positive. Study results reveal that only relative advantage and top management commitment are significant considerations influencing adoption intention.

Research limitations/implications

Study findings confirm some results presented in earlier studies but provide additional insights from a new perspective, that of accounting professionals in Canada. The first limitation relates to the respondents. Although accounting professionals provided valuable insights, their responses are personal views and do not necessarily represent the views of other professionals within the same firm or the official position of their accounting departments or firms. Therefore, the exclusion of diverse viewpoints from the same firm might have negatively impacted the results of this study. Second, this study sample is limited to Canada-based firms, which means that the study reflects only the situation in that country. Third, considering the research method and the limit on the number of questions the authors could ask, respondents were only asked to rate the impact of these five technologies on the accounting field and to clarify which technologies are used.

Practical implications

This study’s findings confirm that the organizational intention to adopt new technology is not primarily based on the characteristics of the technology. In the case of emerging technology adoption, the decision also depends upon other factors related to the internal organization. Furthermore, although this study found no support for the effect of environmental factors, it fills a gap in the literature by including the factor of vendor support, which has received little attention in prior information technology (IT)/ information system (IS) adoption research. Moreover, in contrast to most prior adoption studies, this study elaborates on accounting professionals’ experience and perceptions in investigating the organizational adoption and use of emerging technologies. Thus, the findings of this study are valuable, providing insights from a new perspective, that of professional accountants.

Social implications

The study findings may serve as a guide for researchers, practitioners, firms and other stakeholders, particularly technology providers, interested in learning about emerging technologies' adoption and use in Canada and/or in a relevant context.

Originality/value

The study provides insights into the said technologies’ actual adoption and improves the awareness of firms and stakeholders to the effect of some constructs that influence the adoption of these emerging technologies in accounting.

Details

International Journal of Accounting & Information Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1834-7649

Open Access
Article
Publication date: 15 December 2023

Nicola Castellano, Roberto Del Gobbo and Lorenzo Leto


Abstract

Purpose

The concept of productivity is central to performance management and decision-making, although it is complex and multifaceted. This paper aims to describe a methodology based on the use of Big Data in a cluster analysis combined with a data envelopment analysis (DEA) that provides accurate and reliable productivity measures in a large network of retailers.

Design/methodology/approach

The methodology is described using a case study of a leading kitchen furniture producer. More specifically, Big Data is used in a two-step analysis prior to the DEA to automatically cluster a large number of retailers into groups that are homogeneous in terms of structural and environmental factors, and to assess the within-group productivity of the retailers.
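Full DEA solves a linear program per decision-making unit; as a much-simplified, single-input/single-output stand-in, the sketch below scores each retailer only against the best performer in its own cluster, which is the within-group idea described above (all figures are hypothetical):

```python
def within_cluster_efficiency(retailers, clusters):
    # Simplified efficiency: each retailer's output/input ratio, normalised
    # to the best performer in its cluster. Full DEA generalises this to
    # multiple inputs and outputs via a linear program per unit.
    scores = {}
    for label in set(clusters):
        members = [i for i, c in enumerate(clusters) if c == label]
        ratios = {i: retailers[i][1] / retailers[i][0] for i in members}
        best = max(ratios.values())
        for i, ratio in ratios.items():
            scores[i] = ratio / best
    return scores

# (input, output) pairs per retailer; cluster labels from a prior clustering step.
retailers = [(10.0, 20.0), (10.0, 10.0), (5.0, 20.0)]
clusters = [0, 0, 1]
scores = within_cluster_efficiency(retailers, clusters)
```

Comparing a retailer only to peers in its cluster is what keeps heterogeneous units from distorting the frontier.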

Findings

The proposed methodology helps reduce the heterogeneity among the units analysed, which is a major concern in DEA applications. The data-driven factorial and clustering technique allows for maximum within-group homogeneity and between-group heterogeneity while reducing subjective bias and the dimensionality that comes with the use of Big Data.

Practical implications

The use of Big Data in clustering applied to productivity analysis can provide managers with data-driven information about the structural and socio-economic characteristics of retailers' catchment areas, which is important in establishing potential productivity performance and optimizing resource allocation. The improved productivity indexes enable the setting of targets that are coherent with retailers' potential, which increases motivation and commitment.

Originality/value

This article proposes an innovative technique to enhance the accuracy of productivity measures through the use of Big Data clustering and DEA. To the best of the authors’ knowledge, no prior studies in the literature on retail store productivity have attempted to benefit from the use of Big Data.

Details

International Journal of Productivity and Performance Management, vol. 73 no. 11
Type: Research Article
ISSN: 1741-0401

Article
Publication date: 23 January 2024

Zoltán Pápai, Péter Nagy and Aliz McLean


Abstract

Purpose

This study aims to estimate the quality-adjusted changes in residential mobile consumer prices by controlling for changes in the relevant service characteristics and quality, in a case study on Hungary between 2015 and 2021; to compare the results with changes measured by the traditionally calculated official telecommunications price index of the Statistical Office; and to discuss separating the hedonic price changes from the effect of a specific government intervention that occurred in Hungary, namely, the significant reduction in the value-added tax (VAT) rate levied on internet services.

Design/methodology/approach

Since the price of commercial mobile offers does not directly reflect the continuous improvements in service characteristics and functionalities over time, the price changes need to be adjusted for changes in quality. The authors use hedonic regression analysis to address this issue.
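Hedonic adjustment typically regresses log price on service characteristics plus period dummies, so the dummy coefficient captures the quality-adjusted price change. A minimal from-scratch OLS sketch, with invented plan data rather than the study's Hungarian offers, might look like:

```python
import math

def ols(X, y):
    # Ordinary least squares via the normal equations (X'X) b = X'y,
    # solved with Gaussian elimination; fine for a handful of regressors.
    k = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):  # back substitution
        b[r] = (xty[r] - sum(xtx[r][c] * b[c] for c in range(r + 1, k))) / xtx[r][r]
    return b

# Hypothetical rows: [intercept, log(data allowance in GB), period dummy].
# The period-dummy coefficient is the quality-adjusted (hedonic) price change.
X = [[1, math.log(1), 0], [1, math.log(4), 0],
     [1, math.log(2), 1], [1, math.log(8), 1]]
y = [math.log(10), math.log(18), math.log(11), math.log(20)]  # log monthly prices
beta = ols(X, y)
```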

Findings

The results show significant hedonic price changes of over 30% across the observed seven-year period, primarily driven by the significant developments in the underlying service characteristics rather than by the VAT policy change.

Originality/value

This paper contributes to the literature on hedonic price analyses on complex telecommunications service plans and enhances this methodology by using weights and analysing the content-related features of the mobile packages.

Details

Digital Policy, Regulation and Governance, vol. 26 no. 3
Type: Research Article
ISSN: 2398-5038

Article
Publication date: 26 September 2023

Mohammed Ayoub Ledhem and Warda Moussaoui


Abstract

Purpose

This paper aims to apply several data mining techniques for predicting the daily precision improvement of Jakarta Islamic Index (JKII) prices based on big data of symmetric volatility in Indonesia’s Islamic stock market.

Design/methodology/approach

This research uses big data mining techniques to predict the daily precision improvement of JKII prices by applying AdaBoost, k-nearest neighbors, random forest and artificial neural networks. It uses big data with symmetric volatility as inputs to the predicting model, whereas the closing prices of the JKII were used as the target outputs of daily precision improvement. To choose the optimal prediction performance according to the criterion of the lowest prediction errors, this research uses four metrics: mean absolute error, mean squared error, root mean squared error and R-squared.
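The four metrics listed above are standard and can be computed from scratch; the function below is a sketch on invented numbers, not the authors' code:

```python
import math

def regression_metrics(actual, predicted):
    # MAE, MSE, RMSE and R-squared, the four error metrics named in the abstract.
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(e) for e in errors) / n
    mse = sum(e * e for e in errors) / n
    rmse = math.sqrt(mse)
    mean_a = sum(actual) / n
    ss_res = sum(e * e for e in errors)
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    r2 = 1.0 - ss_res / ss_tot
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "R2": r2}

m = regression_metrics([1.0, 2.0, 3.0], [1.0, 2.0, 4.0])
```

The model with the smallest MAE/MSE/RMSE (and R² closest to 1) wins the comparison, which is how the abstract ranks AdaBoost first.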

Findings

The experimental results determine that the optimal technique for predicting the daily precision improvement of JKII prices in Indonesia’s Islamic stock market is AdaBoost, which generates the best predicting performance with the lowest prediction errors and provides the optimum knowledge from the big data of symmetric volatility in Indonesia’s Islamic stock market. In addition, the random forest technique is also robust in this prediction task, as it delivers values close to the optimal performance of the AdaBoost technique.

Practical implications

This research fills a gap in the literature, namely the absence of big data mining techniques in the prediction processes of Islamic stock markets, by delivering new operational techniques for predicting daily stock precision improvement. It also helps investors to manage optimal portfolios and to decrease the risk of trading in global Islamic stock markets based on big data mining of symmetric volatility.

Originality/value

This research is a pioneer in using big data mining of symmetric volatility in the prediction of an Islamic stock market index.

Details

Journal of Modelling in Management, vol. 19 no. 3
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 31 May 2023

Nathanaël Betti, Steven DeSimone, Joy Gray and Ingrid Poncin


Abstract

Purpose

This research paper aims to investigate the effects of internal audit’s (IA) use of data analytics and the performance of consulting activities on perceived IA quality.

Design/methodology/approach

The authors conduct a 2 × 2 between-subjects experiment among upper and middle managers where the use of data analytics and the performance of consulting activities by internal auditors are manipulated.

Findings

Results highlight the importance of internal auditor use of data analytics and performance of consulting activities to improve perceived IA quality. First, managers perceive internal auditors as more competent when the auditors use data analytics. Second, managers perceive internal auditors’ recommendations as more relevant when the auditors perform consulting activities. Finally, managers perceive an improvement in the quality of relationships with internal auditors when auditors perform consulting activities, which is strengthened when internal auditors combine the use of data analytics and the performance of consulting activities.

Research limitations/implications

From a theoretical perspective, this research builds on the IA quality framework by considering digitalization as a contextual factor. This research focused on the perceptions of one major stakeholder of the IA function: senior management. Future research should investigate the perceptions of other stakeholders and other contextual factors.

Practical implications

This research suggests that internal auditors should prioritize the development of the consulting role in their function and develop their digital expertise, especially expertise in data analytics, to improve perceived IA quality.

Originality/value

This research tests the impacts of the use of data analytics and the performance of consulting activities on perceived IA quality holistically, by testing Trotman and Duncan’s (2018) framework using an experiment.

Details

Journal of Accounting & Organizational Change, vol. 20 no. 2
Type: Research Article
ISSN: 1832-5912

Open Access
Article
Publication date: 26 April 2024

Adela Sobotkova, Ross Deans Kristensen-McLachlan, Orla Mallon and Shawn Adrian Ross


Abstract

Purpose

This paper provides practical advice for archaeologists and heritage specialists wishing to use machine learning (ML) approaches to identify archaeological features in high-resolution satellite imagery (or other remotely sensed data sources). We seek to balance the disproportionately optimistic literature on the application of ML to archaeological prospection through a discussion of limitations, challenges and other difficulties. We further seek to raise awareness among researchers of the time, effort, expertise and resources necessary to implement ML successfully, so that they can make an informed choice between ML and manual inspection approaches.

Design/methodology/approach

Automated object detection has been the holy grail of archaeological remote sensing for the last two decades. Machine learning (ML) models have proven able to detect uniform features across a consistent background, but more variegated imagery remains a challenge. We set out to detect burial mounds in satellite imagery from a diverse landscape in Central Bulgaria using a pre-trained Convolutional Neural Network (CNN) plus additional but low-touch training to improve performance. Training was accomplished using MOUND/NOT MOUND cutouts, and the model assessed arbitrary tiles of the same size from the image. Results were assessed using field data.

Findings

Validation of results against field data showed that self-reported success rates were misleadingly high, and that the model was misidentifying most features. Setting an identification threshold at 60% probability, and noting that we used an approach where the CNN assessed tiles of a fixed size, tile-based false negative rates were 95–96%, false positive rates were 87–95% of tagged tiles, while true positives were only 5–13%. Counterintuitively, the model provided with training data selected for highly visible mounds (rather than all mounds) performed worse. Development of the model, meanwhile, required approximately 135 person-hours of work.
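The tile-based rates reported above follow directly from confusion-matrix counts; the counts below are illustrative values chosen to land in the reported ranges, not the study's actual tallies:

```python
def tile_rates(tp, fp, fn):
    # tp: mound tiles correctly flagged; fp: non-mound tiles flagged;
    # fn: mound tiles missed.
    fnr = fn / (fn + tp)              # share of actual mound tiles missed
    fp_share_of_tagged = fp / (fp + tp)  # share of flagged tiles with no mound
    tp_share_of_tagged = tp / (tp + fp)  # true positives among flagged tiles
    return fnr, fp_share_of_tagged, tp_share_of_tagged

# Hypothetical counts: 200 mound tiles in the field data, 100 tiles flagged.
fnr, fp_share, tp_share = tile_rates(tp=10, fp=90, fn=190)
```

With these counts the model misses 95% of mounds and only 10% of its flags are real, matching the orders of magnitude the abstract reports and showing why field validation was essential.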

Research limitations/implications

Our attempt to deploy a pre-trained CNN demonstrates the limitations of this approach when it is used to detect varied features of different sizes within a heterogeneous landscape that contains confounding natural and modern features, such as roads, forests and field boundaries. The model has detected incidental features rather than the mounds themselves, making external validation with field data an essential part of CNN workflows. Correcting the model would require refining the training data as well as adopting different approaches to model choice and execution, raising the computational requirements beyond the level of most cultural heritage practitioners.

Practical implications

Improving the pre-trained model’s performance would require considerable time and resources, on top of the time already invested. The degree of manual intervention required – particularly around the subsetting and annotation of training data – is so significant that it raises the question of whether it would be more efficient to identify all of the mounds manually, either through brute-force inspection by experts or by crowdsourcing the analysis to trained – or even untrained – volunteers. Researchers and heritage specialists seeking efficient methods for extracting features from remotely sensed data should weigh the costs and benefits of ML versus manual approaches carefully.

Social implications

Our literature review indicates that use of artificial intelligence (AI) and ML approaches to archaeological prospection has grown exponentially in the past decade, approaching adoption levels associated with “crossing the chasm” from innovators and early adopters to the majority of researchers. The literature itself, however, is overwhelmingly positive, reflecting some combination of publication bias and a rhetoric of unconditional success. This paper presents the failure of a good-faith attempt to utilise these approaches as a counterbalance and cautionary tale to potential adopters of the technology. Early-majority adopters may find ML difficult to implement effectively in real-life scenarios.

Originality/value

Unlike many high-profile reports from well-funded projects, our paper represents a serious but modestly resourced attempt to apply an ML approach to archaeological remote sensing, using techniques like transfer learning that are promoted as solutions to time and cost problems associated with, e.g. annotating and manipulating training data. While the majority of articles uncritically promote ML, or only discuss how challenges were overcome, our paper investigates how – despite reasonable self-reported scores – the model failed to locate the target features when compared to field data. We also present time, expertise and resourcing requirements, a rarity in ML-for-archaeology publications.

Details

Journal of Documentation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 22 December 2022

Reihaneh Alsadat Tabaeeian, Behzad Hajrahimi and Atefeh Khoshfetrat


Abstract

Purpose

The purpose of this review paper was to identify barriers to the use of telemedicine systems among professionals at the individual level in primary health care.

Design/methodology/approach

This study used the Scopus and PubMed databases to identify scientific records. A systematic review of the literature structured by PRISMA guidelines was conducted on 37 included papers published between 2009 and 2019. A qualitative approach was used to synthesize insights into the use of telemedicine by primary care professionals.

Findings

Three barriers were identified and classified: system quality, data quality and service quality barriers. System complexity in terms of usability, system unreliability, security and privacy concerns, lack of integration and inflexibility of systems-in-use are related to system quality. Data quality barriers are data inaccuracy, data timeliness issues, data conciseness concerns and lack of data uniqueness. Finally, service reliability concerns, lack of technical support and lack of user training have been categorized as service quality barriers.

Originality/value

This review identified and mapped emerging themes of barriers to the use of telemedicine systems. Through a new conceptualization of telemedicine use from the perspective of primary care professionals, the paper also contributes to the informatics literature and to system usage practices.

Details

Journal of Science and Technology Policy Management, vol. 15 no. 3
Type: Research Article
ISSN: 2053-4620

Article
Publication date: 17 February 2022

Prajakta Thakare and Ravi Sankar V.


Abstract

Purpose

Agriculture is a backbone sector, contributing a major share of the economy throughout much of the world. Precision agriculture is essential for evaluating the conditions of crops with the aim of determining the proper selection of pesticides. Conventional pest detection methods are unstable and provide limited prediction accuracy. This paper aims to propose an automatic pest detection module for the accurate detection of pests using a hybrid optimization-controlled deep learning model.

Design/methodology/approach

The paper proposes an advanced pest detection strategy based on deep learning, operating over a wireless sensor network (WSN) in agricultural fields. Initially, the WSN, consisting of a number of nodes and a sink, is partitioned into clusters. Each cluster comprises a cluster head (CH) and a number of nodes; the CH transfers data to the sink node of the WSN and is selected using the fractional ant bee colony optimization (FABC) algorithm. The routing process is executed using the protruder optimization algorithm, which helps transfer image data to the sink node through the optimal CH. The sink node acts as the data aggregator, and the collection of image data thus obtained forms the input database, which is processed to find the type of pest in the agricultural field. The image data are pre-processed to remove artifacts and then subjected to feature extraction, through which significant local directional pattern, local binary pattern, local optimal-oriented pattern (LOOP) and local ternary pattern (LTP) features are extracted. The extracted features are fed to a deep convolutional neural network (CNN) to detect the type of pests in the agricultural field. The weights of the deep CNN are tuned optimally using the proposed MFGHO optimization algorithm, which is developed with the combined characteristics of navigating search agents and swarming search agents.
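Of the four texture descriptors named above, the local binary pattern (LBP) is the simplest to sketch: each pixel receives an 8-bit code from comparing its 3x3 neighbours against it. A minimal illustration on an invented image patch (not the authors' implementation):

```python
def local_binary_pattern(img, r, c):
    # 8-bit LBP code for pixel (r, c): each neighbour contributes a 1 bit
    # if it is >= the centre pixel, walking the 3x3 ring clockwise.
    centre = img[r][c]
    ring = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(ring):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code

# Toy 3x3 grey-level patch; the code for its centre pixel is the feature value.
patch = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
code = local_binary_pattern(patch, 1, 1)
```

A histogram of such codes over an image region is the LBP feature vector that, alongside the LDP, LOOP and LTP descriptors, feeds the deep CNN.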

Findings

The analysis on an insect identification from habitus images database, based on performance metrics such as accuracy, specificity and sensitivity, reveals the effectiveness of the proposed MFGHO-based deep CNN in detecting pests in crops. The analysis proves that the proposed classifier using the FABC + protruder optimization-based data aggregation strategy obtains an accuracy of 94.3482%, a sensitivity of 93.3247% and a specificity of 94.5263%, which is high compared with existing methods.

Originality/value

The proposed MFGHO optimization-based deep CNN is used for the detection of pests in crop fields, to ensure a better selection of proper, cost-effective pesticides and thereby increase production. The proposed MFGHO algorithm integrates the characteristic features of navigating search agents and swarming search agents to facilitate optimal tuning of the hyperparameters in the deep CNN classifier for the detection of pests in crop fields.

Details

Journal of Engineering, Design and Technology, vol. 22 no. 3
Type: Research Article
ISSN: 1726-0531
