Search results

1 – 10 of over 2000
Article
Publication date: 29 January 2024

Kai Wang

The identification of network user relationship in Fancircle contributes to quantifying the violence index of user text, mining the internal correlation of network behaviors among…

Abstract

Purpose

Identifying network user relationships in Fancircle communities contributes to quantifying the violence index of user text and mining the internal correlations of network behaviors among users, which provides the necessary data support for constructing a knowledge graph.

Design/methodology/approach

A correlation identification method based on sentiment analysis (CRDM-SA) is put forward, which extracts user semantic information and introduces a violent sentiment membership degree. Specifically, the topics used for topology mapping in the community are obtained from the extracted user text on the basis of a self-built violent sentiment dictionary (VSD). The violence index of the user text is then calculated to quantify the fuzzy sentiment representation between the user and the topic. Finally, multi-granularity violence association rules are mined from the user text by constructing a violence fuzzy concept lattice.
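
To make the violence-index step concrete, the following is a minimal sketch of how a dictionary-based violence index could be computed from user text; the VSD entries, membership weights and the averaging rule are illustrative assumptions, not the authors' CRDM-SA implementation.

```python
# Hypothetical sketch of a dictionary-based violence index; the VSD terms,
# membership weights and the aggregation rule are illustrative assumptions.
from typing import Dict, List

# Toy violent sentiment dictionary (VSD): term -> membership degree in [0, 1]
VSD: Dict[str, float] = {"attack": 0.9, "stupid": 0.6, "trash": 0.7, "hate": 0.8}

def violence_index(tokens: List[str]) -> float:
    """Average VSD membership of the terms found in a tokenized post."""
    hits = [VSD[t] for t in tokens if t in VSD]
    return sum(hits) / len(tokens) if tokens else 0.0

post = "i hate this trash performance".split()
print(violence_index(post))  # 0.3 for this toy post
```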

Findings

The method helps reveal the internal relationships of online violence in complex network environments, so that the sentiment dependence of users can be characterized from a granular perspective.

Originality/value

The membership degree of violent sentiment is introduced into user relationship recognition in the Fancircle community, and a text sentiment association recognition method based on the VSD is proposed. By calculating the violent sentiment value of the user text, violent sentiment is annotated along the topic dimension of the text, and the partial-order relations between violence fuzzy concepts under an effective confidence threshold are used to obtain the association relations.

Details

Data Technologies and Applications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-9288

Article
Publication date: 16 April 2024

Chaofan Wang, Yanmin Jia and Xue Zhao

Prefabricated columns connected by grouted sleeves are increasingly used in practical projects. However, seismic fragility analyses of such structures are rarely conducted…

Abstract

Purpose

Prefabricated columns connected by grouted sleeves are increasingly used in practical projects. However, seismic fragility analyses of such structures are rarely conducted, even though seismic fragility analysis plays an important role in seismic hazard evaluation. In this paper, the seismic fragility of sleeve-connected prefabricated columns is analyzed.

Design/methodology/approach

A model for predicting the seismic demand on sleeve-connected prefabricated columns was created by incorporating engineering demand parameters (EDP) and probabilities of seismic failure. Incremental dynamic analysis (IDA) curve clusters for this type of column were obtained using finite element analysis. The seismic fragility curve was obtained by regression with exponential and logistic function models.
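
As a rough illustration of how a fragility curve can be regressed from IDA-style results, the sketch below fits a logistic model of exceedance probability against ln(PGA); the PGA values, binary outcomes and model form are demonstration assumptions, not the paper's data or exact regression.

```python
# Hedged sketch: fitting a fragility curve P(limit-state exceedance | PGA) by
# logistic regression on ln(PGA). The PGA values and binary exceedance
# outcomes are made up for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0])   # peak ground acceleration (g)
exceeded = np.array([0, 0, 0, 1, 0, 1, 1, 1])               # 1 = limit state exceeded in IDA run

model = LogisticRegression().fit(np.log(pga).reshape(-1, 1), exceeded)

grid = np.linspace(0.05, 1.2, 5).reshape(-1, 1)
fragility = model.predict_proba(np.log(grid))[:, 1]         # exceedance probability
for g, p in zip(grid.ravel(), fragility):
    print(f"PGA = {g:.2f} g -> P(exceedance) = {p:.2f}")
```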

Findings

The dispersion of the IDA curve clusters gradually increased after the peak ground acceleration (PGA) reached 0.3 g. For both columns, the relative displacement at the top of the column changed significantly after reaching 50 mm. The seismic fragility performance of the prefabricated column with the sleeve placed in the cap (SPCA) was inadequate.

Originality/value

Placing the sleeve in the column effectively overcomes the seismic fragility of prefabricated columns. In practical engineering, it is advisable to use such columns in earthquake-prone regions with high seismic intensity levels in order to mitigate the risk of structural damage resulting from ground motion.

Details

International Journal of Structural Integrity, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1757-9864

Article
Publication date: 14 July 2023

Bowen Zheng, Mudasir Hussain, Yang Yang, Albert P.C. Chan and Hung-Lin Chi

In the last decades, various building information modeling–life cycle assessment (BIM-LCA) integration approaches have been developed to assess the environmental impact of the…

Abstract

Purpose

In the last decades, various building information modeling–life cycle assessment (BIM-LCA) integration approaches have been developed to assess the environmental impact of the built asset. However, there is a lack of consensus on the optimal BIM-LCA integration approach that provides the most accurate and efficient assessment outcomes. To compare and determine their accuracy and efficiency, this study aimed to investigate four typical BIM-LCA integration solutions, namely, conventional, parametric modeling, plug-in and industry foundation classes (IFC)-based integration.

Design/methodology/approach

The four integration approaches were developed and applied to the same building project. A quantitative technique was used to evaluate the accuracy and efficiency of the BIM-LCA integration solutions. The four indicators for assessing the performance of BIM-LCA integration were (1) validity of the LCA results, (2) accuracy of bill-of-quantity (BOQ) extraction, (3) time for developing life cycle inventories (i.e. developing time) and (4) time for calculating LCA results (i.e. calculation time).
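
Under stated assumptions, the sketch below shows how the four indicators could be computed for a single integration approach: relative errors for result validity and BOQ accuracy against benchmark values, and wall-clock timings for developing and calculation time. The benchmark figures and error definitions are placeholders, not the study's measurements.

```python
# Illustrative computation of the four evaluation indicators; benchmark values
# and error definitions are assumptions for demonstration only.
import time

def relative_error(value: float, benchmark: float) -> float:
    return abs(value - benchmark) / benchmark

# (1) validity of LCA results vs. a reference calculation (e.g. kg CO2-eq)
lca_validity_error = relative_error(value=10_150.0, benchmark=10_000.0)

# (2) accuracy of bill-of-quantity extraction vs. a manual take-off (e.g. m3 of concrete)
boq_error = relative_error(value=492.0, benchmark=500.0)

# (3) time for developing the life cycle inventory
start = time.perf_counter()
# ... build the inventory here ...
developing_time = time.perf_counter() - start

# (4) time for calculating the LCA results
start = time.perf_counter()
# ... run the LCA calculation here ...
calculation_time = time.perf_counter() - start

print(lca_validity_error, boq_error, developing_time, calculation_time)
```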

Findings

The results show that the plug-in-based approach outperforms the others in developing and calculation time, while the conventional approach achieves the highest accuracy in BOQ extraction and result validity. The parametric modeling approach outperforms the IFC-based method in BOQ extraction, developing time and calculation time. Nevertheless, the IFC-based approach produces LCA outcomes with approximately 1% error, confirming its validity.

Originality/value

This paper forms one of the first studies that employ a quantitative and objective method to determine the performance of four typical BIM-LCA integration solutions and reveal the trade-offs between the accuracy and efficiency of the integration approaches. The findings provide practical references for LCA practitioners to select appropriate BIM-LCA integration approaches for evaluating the environmental impact of the built asset during the design phase.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 30 August 2023

Donghui Yang, Yan Wang, Zhaoyang Shi and Huimin Wang

Improving the diversity of recommendation information has become one of the latest research hotspots to solve information cocoons. Aiming to achieve both high accuracy and…

Abstract

Purpose

Improving the diversity of recommended information has become one of the latest research hotspots for addressing information cocoons. To achieve both high accuracy and high diversity in a recommender system, this paper proposes a hybrid method.

Design/methodology/approach

This paper integrates the latent Dirichlet allocation (LDA) topic model and the locality-sensitive hashing (LSH) algorithm to design a topic recommendation system. To measure the effectiveness of the method, three-level categories of journal paper abstracts were built from the Web of Science platform as experimental data.
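
A minimal sketch of an LDA-plus-LSH pipeline of the kind described is shown below, using scikit-learn for LDA and a simple random-hyperplane LSH in NumPy; the toy abstracts, topic count and signature width are illustrative assumptions rather than the authors' configuration.

```python
# Stage 1: LDA topic vectors; Stage 2: random-hyperplane LSH over those vectors.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "deep learning for image recognition",
    "convolutional networks for vision tasks",
    "stock market prediction with machine learning",
    "financial time series forecasting models",
]

# Represent each abstract as a topic distribution.
X = CountVectorizer().fit_transform(abstracts)
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)

# Hash topic vectors with random hyperplanes; items sharing a bucket become
# candidate recommendations, which broadens results beyond nearest neighbours.
rng = np.random.default_rng(0)
planes = rng.normal(size=(topics.shape[1], 8))      # 8-bit signatures
signatures = (topics @ planes > 0).astype(int)

for doc, sig in zip(abstracts, signatures):
    print("".join(map(str, sig)), doc)              # bucket id, then document
```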

Findings

(1) The results illustrate that the diversity of recommended items is significantly enhanced by leveraging the hashing function to overcome information cocoons. (2) By integrating the topic model with the hashing algorithm, recommendation diversity can be achieved without losing recommendation accuracy at a certain degree of refined topic levels.

Originality/value

The hybrid recommendation algorithm developed in this paper overcomes the dilemma between high accuracy and low diversity. The method can improve recommendation in business and service industries to address the problems of information overload and information cocoons.

Details

Aslib Journal of Information Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2050-3806

Article
Publication date: 15 March 2024

Nawar Boujelben, Manal Hadriche and Yosra Makni Fourati

The purpose of this study is to examine the interplay between integrated reporting quality (IRQ) and capital markets. More specifically, the authors test the impact of IRQ on…

Abstract

Purpose

The purpose of this study is to examine the interplay between integrated reporting quality (IRQ) and capital markets. More specifically, the authors test the impact of IRQ on stock liquidity, cost of capital and analyst forecast accuracy.

Design/methodology/approach

The sample consists of listed firms on the Johannesburg Stock Exchange in South Africa, covering the period from 2012 to 2020. The IRQ measure used in this study is based on data from Ernst and Young. To test the proposed hypotheses, the authors conducted a generalized least squares regression analysis.
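
For illustration only, the sketch below sets up a GLS specification of the kind described, regressing stock liquidity on an IRQ score plus controls over simulated data; the variable names, controls and data are assumptions, not the study's sample or exact model.

```python
# Hedged GLS sketch with simulated data; the study's actual sample is JSE firms
# (2012-2020) with an Ernst & Young-based IRQ score.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "irq_score": rng.uniform(0, 1, n),     # integrated reporting quality (assumed scale)
    "firm_size": rng.normal(10, 2, n),     # control: log of total assets
    "leverage": rng.uniform(0, 1, n),      # control
})
df["liquidity"] = 0.5 * df["irq_score"] + 0.1 * df["firm_size"] + rng.normal(0, 1, n)

X = sm.add_constant(df[["irq_score", "firm_size", "leverage"]])
result = sm.GLS(df["liquidity"], X).fit()  # identity covariance here; a real panel
print(result.summary())                    # study would supply an error structure
```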

Findings

The empirical results show a positive relationship between IRQ and stock liquidity. However, the authors did not find a significant effect of IRQ on the cost of capital or financial analysts' forecast accuracy. Robustness tests show that firms with a higher IRQ score exhibit higher liquidity and improved analyst forecast accuracy. Additional analysis indicates a negative association between IRQ and the cost of capital, as well as a positive association between IRQ and financial analyst forecast accuracy, for firms with higher IRQ scores (TOP ten, Excellent, Good).

Originality/value

The study stands as one of the initial endeavors to investigate the impact of IRQ on the capital market. It provides valuable insights for managers and policymakers who are interested in enhancing disclosure practices within the financial market. Furthermore, the findings are significant for investors seeking to make informed investment decisions.

Details

Journal of Financial Reporting and Accounting, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1985-2517

Open Access
Article
Publication date: 8 December 2023

Armin Mahmoodi, Leila Hashemi, Amin Mahmoodi, Benyamin Mahmoodi and Milad Jasemi

The proposed model has been aimed to predict stock market signals by designing an accurate model. In this sense, the stock market is analysed by the technical analysis of Japanese…

Abstract

Purpose

This study aims to predict stock market signals by designing an accurate model. The stock market is analysed by the technical analysis of Japanese candlesticks, combined with a support vector machine (SVM) and the following meta-heuristic algorithms: particle swarm optimization (PSO), the imperialist competition algorithm (ICA) and the genetic algorithm (GA).

Design/methodology/approach

Among the developed algorithms, the most effective one is chosen to determine probable sell and buy signals. The authors also compare the results of the designed models with the basic models of three past articles for validation. In the first model, PSO is used with SVM as a classification method to search the solution space precisely and at high running speed. In the second model, SVM and ICA are examined for stock market timing, where ICA improves the SVM parameters. Finally, in the third model, SVM and GA are studied, where GA acts as the optimizer and feature selection agent.
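
As a hedged sketch of the SVM-PSO variant only, the code below runs a small particle swarm over the SVM hyperparameters C and gamma, scoring each particle by cross-validated accuracy on toy buy/sell labels; the synthetic features, swarm settings and fitness choice are assumptions, not the authors' setup.

```python
# Toy PSO loop tuning SVC(C, gamma); features and labels are synthetic stand-ins
# for candlestick-derived inputs and buy/sell signals.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))                      # e.g. open/high/low/close-derived features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # toy buy(1)/sell(0) labels

def fitness(params: np.ndarray) -> float:
    C, gamma = np.exp(params)                      # search in log space, keep values positive
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

n_particles, n_iter = 10, 15
pos = rng.uniform(-3, 3, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best C, gamma:", np.exp(gbest), "CV accuracy:", pbest_fit.max())
```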

Findings

The results indicate that the prediction accuracy of all new models is high for only six days. However, the confusion matrix results show that the SVM-GA and SVM-ICA models correctly predicted more sell signals, while the SVM-PSO model correctly predicted more buy signals. Overall, SVM-ICA performed better than the other models in terms of executing the implemented models.

Research limitations/implications

In this study, the long timespan of the analyzed data (2013–2021) makes the input data analysis challenging, as the data must be adjusted with respect to changing conditions.

Originality/value

In this study, two methods have been developed within a candlestick model: raw-based and signal-based approaches, in which the hit rate is determined by the percentage of correct evaluations of the stock market over a 16-day period.

Details

Journal of Capital Markets Studies, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-4774

Article
Publication date: 21 November 2023

Armin Mahmoodi, Leila Hashemi and Milad Jasemi

In this study, the central objective is to foresee stock market signals with the use of a proper structure to achieve the highest accuracy possible. For this purpose, three hybrid…

Abstract

Purpose

In this study, the central objective is to foresee stock market signals with a proper structure to achieve the highest accuracy possible. For this purpose, three hybrid models have been developed for the stock markets, each a combination of a support vector machine (SVM) with one of the meta-heuristic algorithms of particle swarm optimization (PSO), imperialist competition algorithm (ICA) and genetic algorithm (GA). All the analyses are technical and are based on the Japanese candlestick model.

Design/methodology/approach

Further, as per the results achieved, the most suitable algorithm is chosen to anticipate sell and buy signals. Moreover, the authors compare the results of the model validations in this study with the basic models of three articles conducted in past years. In the first model, SVM is combined with PSO, which is used as a classification agent to search the problem-solving space precisely and at a faster pace. In the second model, SVM and ICA are tested for stock market timing, with ICA used as an optimization agent for the SVM parameters. Finally, in the third model, SVM and GA are studied, where GA acts as an optimizer and feature selection agent.

Findings

As per the results, all new models can predict accurately for only 6 days; however, comparison of the confusion matrix results shows that the SVM-GA and SVM-ICA models correctly predicted more sell signals, and the SVM-PSO model correctly predicted more buy signals. Overall, SVM-ICA showed better performance than the other models in terms of executing the implemented models.

Research limitations/implications

In this study, stock market data for the years 2013–2021 were analyzed; the long timeframe makes the input data analysis challenging, as the data must be moderated with respect to the conditions under which they changed.

Originality/value

In this study, two methods have been developed within a candlestick model: raw-based and signal-based approaches, in which the hit rate is determined by the percentage of correct evaluations of the stock market over a 16-day period.

Details

EuroMed Journal of Business, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1450-2194

Open Access
Article
Publication date: 22 August 2023

Mahesh Babu Purushothaman and Kasun Moolika Gedara

This pragmatic research paper aims to unravel the smart vision-based method (SVBM), an AI program to correlate the computer vision (recorded and live videos using mobile and…

Abstract

Purpose

This pragmatic research paper aims to unravel the smart vision-based method (SVBM), an AI program that correlates computer vision (recorded and live videos from mobile and embedded cameras) to aid manual-lifting human pose detection, analysis and training in the construction sector.

Design/methodology/approach

Using a pragmatic approach combined with a literature review, this study discusses the SVBM. The research method comprises a literature review, a pragmatic development phase and lab validation of the acquired data. Adopting this practical approach, the authors developed the SVBM, an AI program that correlates computer vision (recorded and live videos from mobile and embedded cameras).

Findings

Results show that the SVBM observes the relevant events without additional attachments to the human body and compares them with the standard axis to identify abnormal postures using mobile and other cameras. Angles of critical nodal points are obtained through human pose detection and the calculation of body-part movement angles using a novel software program and mobile application. The SVBM demonstrates its ability to capture and analyse data in real time and offline using previously recorded videos, and it is validated for program coding and repeatability of results.
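
A minimal sketch of the joint-angle computation implied here is given below: the angle at a joint from three 2D keypoints (e.g. shoulder, elbow, wrist) as a pose estimator might return them; the coordinates are made-up values and the function is not the SVBM code.

```python
# Angle at keypoint b formed by the segments b->a and b->c, in degrees.
import numpy as np

def joint_angle(a, b, c) -> float:
    a, b, c = map(np.asarray, (a, b, c))
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Hypothetical normalized image coordinates from a pose estimator.
shoulder, elbow, wrist = (0.42, 0.30), (0.45, 0.45), (0.60, 0.50)
print(f"elbow angle = {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```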

Research limitations/implications

Limitations of the literature review methodology include not keeping pace with the most up-to-date field knowledge; this is offset by restricting the review to the last two decades. The review may not have captured all published articles because database access was restricted and the search was conducted only in English, and the authors may have omitted useful articles hidden in less popular journals. These limitations are acknowledged. The critical limitation is that trust, privacy and psychological issues are not addressed in the SVBM, which is recognised; however, the benefits of the SVBM naturally offset this limitation for practical adoption.

Practical implications

The theoretical and practical implications include customised and individualistic prediction and the prevention of most posture-related hazardous behaviours before a critical injury happens. The theoretical implications include mimicking the human pose and enabling lab-based analysis without attaching sensors that naturally alter working poses. The SVBM would help researchers develop more accurate data and theoretical models closer to actual conditions.

Social implications

By using the SVBM, the possibility of early detection and prevention of musculoskeletal disorders is high; the social implications include the benefits of a healthier society and a health-conscious construction sector.

Originality/value

Human pose detection, especially joint angle calculation in a work environment, is crucial to the early detection of musculoskeletal disorders. Conventional digital technology-based methods to detect pose flaws focus on location information from wearables and laboratory-controlled motion sensors. For the first time, this paper presents novel computer vision (recorded and live videos using mobile and embedded cameras) and digital image-related deep learning methods, without attachment to the human body, for manual handling pose detection and analysis of angles, neckline and torso line in an actual construction work environment.

Details

Smart and Sustainable Built Environment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2046-6099

Article
Publication date: 20 March 2024

Qiuying Chen, Ronghui Liu, Qingquan Jiang and Shangyue Xu

Tourists with different cultural backgrounds think and behave differently. Accurately capturing and correctly understanding cultural differences will help tourist destinations in…

Abstract

Purpose

Tourists with different cultural backgrounds think and behave differently. Accurately capturing and correctly understanding cultural differences will help tourist destinations in product/service planning, marketing communication and attracting and retaining tourists. This research employs Hofstede's cultural dimensions theory to analyse the variations in destination image perceptions of Chinese-speaking and English-speaking tourists to Xiamen, a prominent tourist attraction in China.

Design/methodology/approach

The evaluation utilizes a two-stage approach, incorporating LDA and BERT-BiLSTM models. By leveraging text mining, sentiment analysis and t-tests, this research investigates the variations in tourists' perceptions of Xiamen across different cultures.
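
To illustrate the group-comparison step, the sketch below runs an independent two-sample t-test on sentiment scores for Chinese-speaking versus English-speaking tourists; the simulated scores stand in for the BERT-BiLSTM outputs and are assumptions, not the study's data.

```python
# Welch's t-test on simulated per-review sentiment scores for two language groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
sentiment_zh = rng.normal(loc=0.72, scale=0.15, size=500)   # simulated scores, Chinese-speaking
sentiment_en = rng.normal(loc=0.65, scale=0.18, size=400)   # simulated scores, English-speaking

t_stat, p_value = stats.ttest_ind(sentiment_zh, sentiment_en, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")               # small p suggests a cultural difference
```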

Findings

The results reveal that cultural disparities significantly impact tourists' perceived image of Xiamen, particularly regarding their preferences for renowned tourist destinations and the factors influencing their travel experience.

Originality/value

This research pioneers applying natural language processing methods and machine learning techniques to affirm the substantial differences in the perceptions of tourist destinations among Chinese-speaking and English-speaking tourists based on Hofstede's cultural theory. The findings furnish theoretical insights for destination marketing organizations to target diverse cultural tourists through precise marketing strategies and illuminate the practical application of Hofstede's cultural theory in tourism and hospitality.

Details

Data Technologies and Applications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-9288

Article
Publication date: 3 November 2023

Rajeev R. Bhattacharya and Mahendra R. Gupta

The authors provide a general framework of behavior under asymmetric information and develop indices of diligence, objectivity and quality by an analyst and analyst firm about a…

Abstract

Purpose

The authors provide a general framework of behavior under asymmetric information and develop indices of diligence, objectivity and quality by an analyst and analyst firm about a studied firm, relating them to the accuracy of the analyst's forecasts. The authors also test the associations of these indices with time.

Design/methodology/approach

The test of Public Information versus Non-Public Information Models provides the index of diligence, which equals one minus the p-value of the Hausman Specification Test of Ordinary Least Squares (OLS) versus Two Stage Least Squares (2SLS). The test of Objectivity versus Non-Objectivity Models provides the index of objectivity, which equals the p-value of the Wald Test of zero coefficients versus non-zero coefficients in 2SLS regression of the earnings forecast residual. The exponent of the negative of the standard deviation of the residuals of the analyst forecast regression equation provides the index of analytical quality. Each index asymptotically equals the Bayesian ex post probability, by the analyst and analyst firm about the studied firm, of the relevant behavior.
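
A worked toy example of the three indices as defined above follows; the Hausman and Wald p-values and the residual standard deviation are placeholders, not estimates from the paper.

```python
# Toy computation of the diligence, objectivity and quality indices and their product.
import math

p_hausman = 0.04        # p-value of the Hausman test of OLS vs. 2SLS (placeholder)
p_wald = 0.62           # p-value of the Wald test in the 2SLS residual regression (placeholder)
sigma_residual = 0.35   # std. dev. of analyst-forecast regression residuals (placeholder)

diligence = 1 - p_hausman
objectivity = p_wald
quality = math.exp(-sigma_residual)

# Accuracy is modelled as increasing in the product of the indices, i.e. the
# approximate joint probability of diligent, objective, high-quality behavior.
joint = diligence * objectivity * quality
print(round(diligence, 3), round(objectivity, 3), round(quality, 3), round(joint, 3))
```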

Findings

The authors find that ex post accuracy is a statistically and economically significant increasing function of the product of the indices of diligence, objectivity and quality by the analyst and analyst firm about the studied firm, which asymptotically equals the Bayesian ex post joint probability of diligence, objectivity and quality. The authors find that diligence, objectivity, quality and accuracy did not improve with time.

Originality/value

There has been no previous work done on the systematic and objective characterization and joint analysis of diligence, objectivity and quality of analyst forecasts by an analyst and analyst firm for a studied firm, and their relation with accuracy. This paper puts together the frontiers of various disciplines.

Details

Journal of Accounting Literature, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0737-4607
