Search results

1 – 7 of 7
Open Access
Article
Publication date: 15 February 2024

Di Kang, Steven W. Kirkpatrick, Zhipeng Zhang, Xiang Liu and Zheyong Bian


Abstract

Purpose

Accurately estimating the severity of derailment is a crucial step in quantifying train derailment consequences and, thereby, mitigating its impacts. The purpose of this paper is to propose a simplified approach aimed at addressing this research gap by developing a physics-informed 1-D model. The model is used to simulate train dynamics through a time-stepping algorithm, incorporating derailment data after the point of derailment.

Design/methodology/approach

In this study, a simplified approach is adopted that applies a 1-D kinematic analysis with data obtained from various derailments. These include the length and weight of the rail cars behind the point of derailment, the train braking effects, derailment blockage forces, the grade of the track and the train rolling and aerodynamic resistance. Since train braking effects and derailment blockage forces are not always available for historical or potential train derailments, it is also necessary to fit the historical data and find optimal parameters to estimate these two variables. Using these fitted parameters, a detailed comparison can be performed between the physics-informed 1-D model and previous statistical models to predict derailment severity.
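For illustration, a minimal sketch of the kind of 1-D kinematic time-stepping analysis described above is given below. The force terms, parameter values and function names are assumptions made for the sketch, not the authors' calibrated implementation; the number of cars that pass the point of derailment before the consist stops is taken as the severity estimate.

```python
# Minimal 1-D kinematic sketch of a train decelerating after the point of
# derailment (POD). The simple force model and all parameter values are
# illustrative assumptions, not the authors' calibrated implementation.

G = 9.81  # gravitational acceleration, m/s^2

def derailment_severity(v0, n_cars, car_length, car_mass,
                        brake_decel, blockage_force, grade,
                        rolling_coeff, drag_coeff, dt=0.01):
    """Integrate the ride-down of the consist behind the POD and return an
    estimate of the number of derailed cars (cars that pass the POD)."""
    mass = n_cars * car_mass              # total mass behind the POD (kg)
    v, x = v0, 0.0                        # speed (m/s), distance past POD (m)
    while v > 0.0:
        # Resisting forces: braking, blockage at the pile-up, track grade,
        # rolling resistance and aerodynamic drag (all opposing motion).
        f_brake = mass * brake_decel
        f_grade = mass * G * grade        # positive (uphill) grade resists motion
        f_roll = rolling_coeff * mass * G
        f_aero = drag_coeff * v ** 2
        decel = (f_brake + blockage_force + f_grade + f_roll + f_aero) / mass
        v = max(0.0, v - decel * dt)      # explicit time-stepping
        x += v * dt
    # Cars that have travelled past the POD are assumed to derail.
    return min(n_cars, int(x // car_length) + 1)

# Example: 100-car consist behind the POD at 40 km/h.
print(derailment_severity(v0=40 / 3.6, n_cars=100, car_length=18.0,
                          car_mass=120_000, brake_decel=0.3,
                          blockage_force=2.0e6, grade=0.0,
                          rolling_coeff=0.002, drag_coeff=8.0))
```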

Findings

The results show that the proposed model outperforms the Truncated Geometric model (the latest statistical model used in prior research) in estimating derailment severity. The proposed model contributes to the understanding and prevention of train derailments and hazmat release consequences, offering improved accuracy for certain scenarios and train types.

Originality/value

This paper presents a simplified physics-informed 1-D model, which could help understand the derailment mechanism and, thus, is expected to estimate train derailment severity more accurately for certain scenarios and train types compared with the latest statistical model. The performance of the braking response and the 1-D model is verified by comparing known ride-down profiles with estimated ones. This validation process ensures that both the braking response and the 1-D model accurately represent the expected behavior.

Details

Smart and Resilient Transportation, vol. 6 no. 1
Type: Research Article
ISSN: 2632-0487


Open Access
Article
Publication date: 27 November 2023

J.I. Ramos and Carmen María García López



Abstract

Purpose

The purpose of this paper is to analyze numerically the blowup in finite time of the solutions to a one-dimensional, bidirectional, nonlinear wave model equation for the propagation of small-amplitude waves in shallow water, as a function of the relaxation time, linear and nonlinear drift, power of the nonlinear advection flux, viscosity coefficient, viscous attenuation, and amplitude, smoothness and width of three types of initial conditions.

Design/methodology/approach

An implicit, first-order accurate in time, finite difference method valid for semipositive relaxation times has been used to solve the equation in a truncated domain for three different initial conditions, a first-order time derivative initially equal to zero and several constant wave speeds.
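The model equation itself is not given in the abstract. The sketch below only illustrates an implicit, first-order-in-time finite difference step with Picard iteration for a generic 1-D nonlinear advection-diffusion equation, not the paper's bidirectional equation with relaxation; grid sizes and parameters are assumptions.

```python
import numpy as np

# Illustrative implicit (backward-Euler in time, centred in space) step for a
# generic 1-D nonlinear advection-diffusion equation u_t + (u^p)_x = nu*u_xx.
# The paper's bidirectional model with relaxation is more involved; this
# sketch only shows the implicit time-stepping idea with Picard iteration.

def implicit_step(u, dx, dt, p=2, nu=1e-4, n_picard=20, tol=1e-10):
    n = u.size
    u_new = u.copy()
    for _ in range(n_picard):
        A = np.zeros((n, n))
        b = u.copy()
        for i in range(1, n - 1):
            # Lag the nonlinear flux derivative: d(u^p)/dx ~ p*u^(p-1)*du/dx.
            a = p * u_new[i] ** (p - 1)
            A[i, i - 1] = -dt * (a / (2 * dx) + nu / dx ** 2)
            A[i, i] = 1.0 + 2.0 * dt * nu / dx ** 2
            A[i, i + 1] = dt * (a / (2 * dx) - nu / dx ** 2)
        A[0, 0] = A[-1, -1] = 1.0            # fixed (Dirichlet) boundaries
        b[0], b[-1] = u[0], u[-1]
        u_next = np.linalg.solve(A, b)
        if np.max(np.abs(u_next - u_new)) < tol:
            return u_next
        u_new = u_next
    return u_new

# Gaussian initial condition on a truncated domain, as in the study's setup.
x = np.linspace(-50.0, 50.0, 201)
u = np.exp(-x ** 2)
for _ in range(50):
    u = implicit_step(u, dx=x[1] - x[0], dt=0.01)
```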

Findings

The numerical experiments show a very rapid transient from the initial conditions to the formation of a leading propagating wave, whose duration depends strongly on the shape, amplitude and width of the initial data as well as on the coefficients of the bidirectional equation. The blowup times for the triangular conditions have been found to be larger than those for the Gaussian ones, and the latter are larger than those for rectangular conditions, thus indicating that the blowup time decreases as the smoothness of the initial conditions decreases. The blowup time has also been found to decrease as the relaxation time, degree of nonlinearity, linear drift coefficient and amplitude of the initial conditions are increased, and as the width of the initial condition is decreased, but it increases as the viscosity coefficient is increased. No blowup has been observed for relaxation times smaller than one-hundredth, viscosity coefficients larger than ten-thousandths, quadratic and cubic nonlinearities, and initial Gaussian, triangular and rectangular conditions of unity amplitude.

Originality/value

The blowup of a one-dimensional, bidirectional equation that is a model for the propagation of waves in shallow water, longitudinal displacement in homogeneous viscoelastic bars, nerve conduction, nonlinear acoustics and heat transfer in very small devices and/or at very high transfer rates has been determined numerically as a function of the linear and nonlinear drift coefficients, power of the nonlinear drift, viscosity coefficient, viscous attenuation, and amplitude, smoothness and width of the initial conditions for nonzero relaxation times.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 34 no. 3
Type: Research Article
ISSN: 0961-5539


Open Access
Article
Publication date: 31 July 2023

Daniel Šandor and Marina Bagić Babac



Abstract

Purpose

Sarcasm is a linguistic expression that usually carries the opposite meaning of what is being said by words, thus making it difficult for machines to discover the actual meaning. It is mainly distinguished by the inflection with which it is spoken, with an undercurrent of irony, and is largely dependent on context, which makes its detection a difficult task for computational analysis. Moreover, sarcasm expresses negative sentiments using positive words, allowing it to easily confuse sentiment analysis models. This paper aims to demonstrate the task of sarcasm detection using machine learning and deep learning approaches.

Design/methodology/approach

For the purpose of sarcasm detection, machine and deep learning models were used on a data set consisting of 1.3 million social media comments, including both sarcastic and non-sarcastic comments. The data set was pre-processed using natural language processing methods, and additional features were extracted and analysed. Several machine learning models, including logistic regression, ridge regression, linear support vector and support vector machines, along with two deep learning models based on bidirectional long short-term memory and one bidirectional encoder representations from transformers (BERT)-based model, were implemented, evaluated and compared.
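As a hedged illustration of the classical end of this model spectrum, the sketch below trains a TF-IDF plus logistic regression baseline. The file and column names are assumptions, and the study's feature engineering, BiLSTM and BERT-based models are not reproduced.

```python
# Minimal TF-IDF + logistic regression baseline for sarcasm detection,
# roughly in the spirit of the classical models compared above. The file
# name and column names are assumptions about the data layout.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("sarcasm_comments.csv")          # hypothetical export
X_train, X_test, y_train, y_test = train_test_split(
    df["comment"], df["label"], test_size=0.2, random_state=42,
    stratify=df["label"])

vec = TfidfVectorizer(ngram_range=(1, 2), min_df=5, sublinear_tf=True)
clf = LogisticRegression(max_iter=1000)
clf.fit(vec.fit_transform(X_train), y_train)
print(classification_report(y_test, clf.predict(vec.transform(X_test))))
```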

Findings

The performance of machine and deep learning models was compared in the task of sarcasm detection, and possible ways of improvement were discussed. Deep learning models showed more promise, performance-wise, for this type of task. Specifically, a state-of-the-art model in natural language processing, namely a BERT-based model, outperformed the other machine and deep learning models.

Originality/value

This study compared the performance of the various machine and deep learning models in the task of sarcasm detection using the data set of 1.3 million comments from social media.

Details

Information Discovery and Delivery, vol. 52 no. 2
Type: Research Article
ISSN: 2398-6247


Open Access
Article
Publication date: 29 February 2024

Leandro Pinheiro Vieira and Rafael Mesquita Pereira


Abstract

Purpose

This study aims to investigate the effect of smoking on the income of workers in the Brazilian labor market.

Design/methodology/approach

Using data from the 2019 National Health Survey (PNS), we initially address the sample selection bias concerning labor market participation by using the Heckman (1979) method. Subsequently, the decomposition of income between smokers and nonsmokers is analyzed, both on average and across the earnings distribution, by employing the procedure of Firpo, Fortin and Lemieux (2009), the FFL decomposition. The Ñopo (2008) technique is also used to obtain more robust estimates.
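For readers unfamiliar with the FFL approach, the sketch below shows the recentered influence function (RIF) regression that underlies it at a single quantile. Variable names and the kernel density estimator are illustrative, and neither the Heckman correction nor the Ñopo matching step is reproduced here.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import gaussian_kde

# Sketch of a recentered influence function (RIF) regression at one quantile,
# the building block of the FFL (2009) decomposition. Variable names and the
# kernel density estimator are illustrative; the Heckman selection correction
# and the Nopo (2008) matching step are not reproduced here.

def rif_quantile(y, tau):
    """RIF of the tau-th unconditional quantile of y."""
    q = np.quantile(y, tau)
    f_q = gaussian_kde(y)(q)[0]              # density of y at the quantile
    return q + (tau - (y <= q).astype(float)) / f_q

def rif_regression(y, X, tau):
    """OLS of the RIF on covariates gives the unconditional quantile partial
    effects used in the aggregate FFL decomposition."""
    return sm.OLS(rif_quantile(np.asarray(y), tau), sm.add_constant(X)).fit()

# Hypothetical usage with log earnings and covariates by smoking status:
# beta_s = rif_regression(log_wage_smokers, X_smokers, tau=0.10).params
# beta_n = rif_regression(log_wage_nonsmk,  X_nonsmk,  tau=0.10).params
# The gap at the 10th percentile is then split into an "explained" part
# (differences in covariates) and an "unexplained" part (differences in betas).
```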

Findings

Overall, the findings indicate an income penalty for smokers in the Brazilian labor market across both the average and all quantiles of the income distribution. Notably, the most significant differentials and income penalties against smokers are observed in the lower quantiles of the distribution. Conversely, in the higher quantiles, there is a tendency toward a smaller magnitude of this gap, with limited evidence of an income penalty associated with this habit.

Research limitations/implications

This study has an important limitation stemming from a restriction of the PNS (2019): the survey does not provide information on subjective factors that also tend to influence labor income, such as each worker's level of effort and specific abilities, whether a smoker or not, factors that could in turn be related to some latent individual predisposition influencing the choice to smoke.

Originality/value

The relevance of the present study lies in identifying the heterogeneity of the income gap in favor of nonsmokers. In the lower quantiles, there was a greater magnitude of differentials against smokers and a greater incidence of unexplained penalties in their income, while in the higher quantiles the differentials were small and there was little evidence of an earnings penalty associated with being a smoker.

Details

EconomiA, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1517-7580


Open Access
Article
Publication date: 2 April 2024

Koraljka Golub, Osma Suominen, Ahmed Taiye Mohammed, Harriet Aagaard and Olof Osterman


Abstract

Purpose

In order to estimate the value of semi-automated subject indexing in operative library catalogues, the study aimed to investigate five different automated implementations of an open source software package on a large set of Swedish union catalogue metadata records, with Dewey Decimal Classification (DDC) as the target classification system. It also aimed to contribute to the body of research on aboutness and related challenges in automated subject indexing and evaluation.

Design/methodology/approach

On a sample of over 230,000 records with close to 12,000 distinct DDC classes, the open source tool Annif, developed by the National Library of Finland, was applied in the following implementations: lexical algorithm, support vector classifier, fastText, Omikuji Bonsai and an ensemble approach combining the former four. A qualitative study involving two senior catalogue librarians and three students of library and information studies was also conducted to investigate the value and inter-rater agreement of automatically assigned classes, on a sample of 60 records.

Findings

The best results were obtained with the ensemble approach, which achieved 66.82% accuracy on the three-digit DDC classification task. The qualitative study confirmed earlier studies reporting low inter-rater agreement but also pointed to the potential value of automatically assigned classes as additional access points in information retrieval.
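The evaluation reported above can be illustrated with a small sketch: top-1 accuracy after truncating DDC classes to three digits, plus Cohen's kappa for inter-rater agreement. The record structure and toy values are assumptions; Annif itself is operated through its own command-line and REST interfaces, which are not shown here.

```python
# Sketch of the kind of evaluation described above: top-1 accuracy on
# three-digit DDC classes and inter-rater agreement. The toy values are
# assumptions, not data from the study.
from sklearn.metrics import cohen_kappa_score

def three_digit_accuracy(gold, predicted):
    """Share of records whose suggested DDC class matches the gold class
    when both are truncated to the three-digit (main class) level."""
    pairs = [(g[:3], p[:3]) for g, p in zip(gold, predicted)]
    return sum(g == p for g, p in pairs) / len(pairs)

gold      = ["839.738", "005.133", "306.874"]   # catalogue DDC classes
suggested = ["839.72",  "005.133", "362.82"]    # top suggestion per record
print(three_digit_accuracy(gold, suggested))    # 0.666...

# Inter-rater agreement between two cataloguers judging suggested classes.
rater_a = [1, 1, 0, 1, 0]   # 1 = suggested class judged correct/useful
rater_b = [1, 0, 0, 1, 1]
print(cohen_kappa_score(rater_a, rater_b))
```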

Originality/value

The paper presents an extensive study of automated classification in an operative library catalogue, accompanied by a qualitative study of automated classes. It demonstrates the value of applying semi-automated indexing in operative information retrieval systems.

Details

Journal of Documentation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0022-0418


Open Access
Article
Publication date: 6 February 2023

Assunta Di Vaio, Badar Latif, Nuwan Gunarathne, Manjul Gupta and Idiano D'Adamo



Abstract

Purpose

In this study, the authors examine artificial knowledge as a fundamental stream of knowledge management for sustainable and resilient business models in supply chain management (SCM). The study aims to provide a comprehensive overview of artificial knowledge and digitalization as key enablers of the improvement of SCM accountability and sustainable performance towards the UN 2030 Agenda.

Design/methodology/approach

Using the SCOPUS database and Google Scholar, the authors analyzed 135 English-language publications from 1990 to 2022 to chart the pattern of knowledge production and dissemination in the literature. The data were collected, reviewed and peer-reviewed before conducting bibliometric analysis and a systematic literature review to support a future research agenda.
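A minimal sketch of the descriptive side of such a bibliometric analysis is given below, assuming a typical Scopus CSV export with "Year" and "Author Keywords" columns; the file name is hypothetical and the study's full bibliometric and systematic review protocol is not reproduced.

```python
# Illustrative sketch of descriptive bibliometrics: publications per year and
# most frequent author keywords from a Scopus CSV export. The file name and
# column names are assumptions about a typical export.
import pandas as pd
from collections import Counter

df = pd.read_csv("scopus_export.csv")            # hypothetical export file
print(df["Year"].value_counts().sort_index())    # knowledge production over time

keywords = Counter(
    kw.strip().lower()
    for cell in df["Author Keywords"].dropna()
    for kw in cell.split(";"))
print(keywords.most_common(20))                  # dominant research themes
```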

Findings

The results highlight that artificial knowledge and digitalization are linked to the UN 2030 Agenda. The analysis further identifies the main issues in achieving sustainable and resilient SCM business models. Based on the results, the authors develop a conceptual framework for artificial knowledge and digitalization in SCM to increase accountability and sustainable performance, especially in times of sudden crises when business resilience is imperative.

Research limitations/implications

The study results add to the extant literature by examining artificial knowledge and digitalization from the resilience theory perspective. The authors suggest that different strategic perspectives significantly promote resilience for SCM digitization and sustainable development. Notably, fostering diverse peer exchange relationships can help stimulate peer knowledge and act as a palliative mechanism that builds digital knowledge to strengthen and drive future possibilities.

Practical implications

This research offers valuable guidance to supply chain practitioners, managers and policymakers in re-thinking, re-formulating and re-shaping organizational processes to meet the UN 2030 Agenda, mainly by introducing artificial knowledge in digital transformation training and education programs. In doing so, firms should focus not simply on digital transformation but also on cultural transformation to enhance SCM accountability and sustainable performance in resilient business models.

Originality/value

This study is, to the authors' best knowledge, among the first to conceptualize artificial knowledge and digitalization issues in SCM. It further integrates resilience theory with institutional theory, legitimacy theory and stakeholder theory as the theoretical foundations of artificial knowledge in SCM, based on firms' responsibility to fulfill the sustainable development goals under the UN's 2030 Agenda.

Details

Journal of Enterprise Information Management, vol. 37 no. 2
Type: Research Article
ISSN: 1741-0398


Open Access
Article
Publication date: 9 February 2024

Luca Menicacci and Lorenzo Simoni



Abstract

Purpose

This study aims to investigate the role of negative media coverage of environmental, social and governance (ESG) issues in deterring tax avoidance. Inspired by media agenda-setting theory and legitimacy theory, this study hypothesises that an increase in ESG negative media coverage should cause a reputational drawback, leading companies to reduce tax avoidance to regain their legitimacy. Hence, this study examines a novel channel that links ESG and taxation.

Design/methodology/approach

This study uses panel regression analysis to examine the relationship between negative media coverage of ESG issues and tax avoidance among the largest European entities. This study considers different measures of tax avoidance and negative media coverage.
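A minimal sketch of a two-way fixed-effects panel regression of this kind is shown below. The tax avoidance proxy (effective tax rate), variable names and controls are assumptions rather than the study's actual specification or coverage measures.

```python
# Sketch of a two-way fixed-effects panel regression of a tax avoidance
# proxy on negative ESG media coverage. Variable names, the proxy (effective
# tax rate, where a higher ETR means less tax avoidance) and the controls are
# assumptions, not the study's specification.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("esg_media_panel.csv")       # hypothetical firm-year panel
model = smf.ols(
    "etr ~ neg_esg_coverage + size + leverage + roa + C(firm) + C(year)",
    data=panel).fit(
        cov_type="cluster",
        cov_kwds={"groups": panel["firm"].astype("category").cat.codes})
print(model.summary().tables[1])
```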

Findings

The results show that negative media coverage of ESG issues is negatively associated with tax avoidance, suggesting that media can act as an external monitor for corporate taxation.

Practical implications

The findings have implications for policymakers and regulators, which should consider tax transparency when dealing with ESG disclosure requirements. Tax disclosure should be integrated into ESG reporting.

Social implications

The study has social implications related to the media, which act as watchdogs for firms’ irresponsible practices. According to this study’s findings, increased media pressure has the power to induce a better alignment between declared ESG policies and tax strategies.

Originality/value

This study contributes to the literature on the mechanisms that discourage tax avoidance and the literature on the relationship between ESG and taxation by shedding light on the role of media coverage.

Details

Sustainability Accounting, Management and Policy Journal, vol. 15 no. 7
Type: Research Article
ISSN: 2040-8021

