Article
Publication date: 3 August 2021

Irvin Dongo, Yudith Cardinale, Ana Aguilera, Fabiola Martinez, Yuni Quintero, German Robayo and David Cabeza

Abstract

Purpose

This paper aims to perform an exhaustive review of relevant and recent related studies, which reveals that both extraction methods are currently used to analyze credibility on Twitter. Thus, there is clear evidence of the need for different options to extract different data for this purpose. Nevertheless, none of these studies performs a comparative evaluation of both extraction techniques. Moreover, the authors extend a previous comparison, which uses a recently developed framework that offers both alternatives for data extraction and implements a previously proposed credibility model, by adding a qualitative evaluation and a Twitter Application Programming Interface (API) performance analysis from different locations.

Design/methodology/approach

As one of the most popular social platforms, Twitter has been the focus of recent research aimed at analyzing the credibility of the shared information. To do so, several proposals use either Twitter API or Web scraping to extract the data to perform the analysis. Qualitative and quantitative evaluations are performed to discover the advantages and disadvantages of both extraction methods.

Findings

The study demonstrates the differences in accuracy and efficiency between the two extraction methods and highlights further open problems in this area toward true transparency and legitimacy of information on the Web.

Originality/value

Results report that some Twitter attributes cannot be retrieved by Web scraping. Both methods produce identical credibility values when a robust normalization process is applied to the text (i.e. the tweet). Moreover, concerning time performance, Web scraping is faster than the Twitter API and more flexible in terms of obtaining data; however, Web scraping is very sensitive to website changes. Additionally, the response time of the Twitter API is proportional to the distance from the central server in San Francisco.
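The "robust normalization" the abstract refers to can be illustrated with a minimal sketch. This is not the authors' actual pipeline; it simply assumes a typical sequence (Unicode normalization, URL and mention removal, case folding, whitespace collapsing) under which API-retrieved and scraped versions of the same tweet reduce to the same string:

```python
import re
import unicodedata

def normalize_tweet(text: str) -> str:
    """Illustrative normalization: NFKC Unicode form, drop URLs and
    @-mentions, lowercase, collapse whitespace. Hypothetical steps,
    not the paper's exact procedure."""
    text = unicodedata.normalize("NFKC", text)
    text = re.sub(r"https?://\S+", "", text)  # drop URLs
    text = re.sub(r"@\w+", "", text)          # drop mentions
    text = text.lower()
    return re.sub(r"\s+", " ", text).strip()

# The same tweet as it might arrive via the API vs. via scraping:
api_text = "Check this out! https://t.co/abc @user  Great news"
scraped_text = "check this out!   @user great news"
print(normalize_tweet(api_text) == normalize_tweet(scraped_text))  # True
```

After normalization both variants collapse to the same canonical string, which is the condition under which the paper reports identical credibility values from the two extraction methods.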

Details

International Journal of Web Information Systems, vol. 17 no. 6
Type: Research Article
ISSN: 1744-0084

Open Access
Article
Publication date: 29 May 2024

Mohanad Rezeq, Tarik Aouam and Frederik Gailly

Abstract

Purpose

Authorities have set up numerous security checkpoints during times of armed conflict to control the flow of commercial and humanitarian trucks into and out of areas of conflict. These security checkpoints have become highly utilized because of the complex security procedures and increased truck traffic, which significantly slow the delivery of relief aid. This paper aims to improve the process at security checkpoints by redesigning the current process to reduce processing time and relieve congestion at checkpoint entrance gates.

Design/methodology/approach

A decision-support tool (the clearing function distribution model [CFDM]) is used to minimize the effects of security checkpoint congestion on the entire humanitarian supply network using a hybrid simulation-optimization approach. Using business process simulation, the current and reengineered processes are both simulated, and the simulation output is used to estimate the clearing function (capacity as a function of the workload). For both the AS-IS and TO-BE models, key performance indicators such as distribution costs, backordering and process cycle time are used to compare the results of the CFDM tool. The Kerem Abu Salem security checkpoint south of Gaza is used as a case study.
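The clearing-function idea (capacity as a function of workload) can be sketched with a common saturating form. The functional form and the parameter values below are hypothetical illustrations; in the paper the clearing function is estimated from the simulation output rather than assumed:

```python
def clearing_function(workload: float, K: float = 120.0, k: float = 40.0) -> float:
    """Saturating clearing function: expected trucks cleared per period
    as a function of workload (trucks waiting plus arriving).
    K is a hypothetical maximum checkpoint capacity and k controls how
    quickly output saturates; both would be fitted to simulation data."""
    return K * workload / (workload + k)

# At low workload, output grows almost linearly with workload;
# at high workload, it flattens toward the capacity K.
for w in (10, 40, 200, 1000):
    print(w, round(clearing_function(w), 1))
```

The concave shape is what lets an optimization model capture congestion: doubling the trucks sent to a loaded checkpoint does not double its throughput, so the planner is pushed to spread flows or reengineer the process.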

Findings

The comparison results demonstrate that the CFDM tool performs better when the output of the TO-BE clearing function is used.

Originality/value

These efforts will contribute to improving the planning of any humanitarian network experiencing congestion at security checkpoints by minimizing the impact of congestion on the delivery lead time of relief aid to its final destination.

Details

Journal of Humanitarian Logistics and Supply Chain Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2042-6747
