Search results

1 – 10 of 22
Article
Publication date: 17 July 2019

Ali Ayyed Abdul-Kadhim, Fue-Sang Lien and Eugene Yee

This study aims to modify the standard probabilistic lattice Boltzmann methodology (LBM) cellular automata (CA) algorithm to enable a more realistic and accurate computation of…

Abstract

Purpose

This study aims to modify the standard probabilistic lattice Boltzmann methodology (LBM) cellular automata (CA) algorithm to enable a more realistic and accurate computation of the ensemble rather than individual particle trajectories updated from one time step to the next. The modification allows a fraction of the collection of particles in any lattice grid cell to be updated in a time step, rather than the entire collection as in the standard LBM-CA algorithm, leading to a better representation of the dynamic interaction between the particles and the background flow. The study also aims to exploit the inherent parallelism of the modified LBM-CA algorithm to provide a computationally efficient scheme for the computation of particle-laden flows on readily available commodity general-purpose graphics processing units (GPGPUs).
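
As a rough illustration of the fractional-update idea (a minimal Python sketch on a 1D lattice with hypothetical names, not the authors' 3D algorithm), only a velocity-dependent fraction of the particle population held at each lattice node is moved per time step:

```python
import numpy as np

rng = np.random.default_rng(0)

def ca_transport_step(n_particles, u_particle, dx, dt):
    """Move a fraction of the particle population at each lattice node.

    n_particles : integer particle counts per node (1D lattice, illustrative)
    u_particle  : local bulk particle velocity per node
    dx, dt      : lattice spacing and time step

    Only the fraction f = |u| * dt / dx of each node's population hops to the
    neighbouring node, instead of the whole population as in the standard
    LBM-CA scheme.
    """
    f = np.clip(np.abs(u_particle) * dt / dx, 0.0, 1.0)
    moved = rng.binomial(n_particles, f)            # stochastic rounding of n * f
    new_counts = n_particles - moved
    # shift moved particles one node downstream, direction given by sign(u)
    forward = np.where(u_particle >= 0, moved, 0)
    backward = np.where(u_particle < 0, moved, 0)
    new_counts[1:] += forward[:-1]
    new_counts[:-1] += backward[1:]
    return new_counts

# toy usage: 8-node lattice with a uniform rightward drift
counts = np.full(8, 100)
u = np.full(8, 0.3)
print(ca_transport_step(counts, u, dx=1.0, dt=1.0))
```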

Design/methodology/approach

This paper presents a framework for the implementation of an LBM for the simulation of particle transport and deposition in complex flows on a GPGPU. Towards this objective, the authors have shown how to map the data structure of the LBM with a multiple-relaxation-time (MRT) collision operator and the Smagorinsky subgrid-scale turbulence model (for turbulent fluid flow simulations) coupled with a CA probabilistic method (for particle transport and deposition simulations) to a GPGPU to give a high-performance computing tool for the calculation of particle-laden flows.
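
The abstract does not spell out the memory layout; as a rough illustration of how per-node flow and particle data can be stored as flat, contiguous arrays (a structure-of-arrays layout commonly used when mapping LBM lattices to GPUs), the following Python/NumPy sketch uses hypothetical field names and a D3Q19 discretization chosen only for concreteness:

```python
import numpy as np

# All per-node data stored as flat, contiguous arrays (structure of arrays),
# so that one GPU thread can own one lattice node.
NX, NY, NZ, Q = 64, 64, 64, 19
n_nodes = NX * NY * NZ

f_dist  = np.zeros((Q, n_nodes), dtype=np.float32)   # LBM distribution functions
rho_p   = np.zeros(n_nodes,      dtype=np.float32)   # bulk particle density per node
u_p     = np.zeros((3, n_nodes), dtype=np.float32)   # bulk particle velocity per node
deposit = np.zeros(n_nodes,      dtype=np.uint32)    # deposited-particle count per node

def node_index(x, y, z):
    """Map 3D lattice coordinates to the flat per-node index."""
    return (z * NY + y) * NX + x
```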

Findings

A fluid-particle simulation using our LBM-MRT-CA algorithm run on a single GPGPU was 160 times as computationally efficient as the same algorithm run on a single CPU.

Research limitations/implications

The method is limited by the available computational resources (e.g. GPU memory size).

Originality/value

A new 3D LBM-MRT-CA model was developed to simulate particle transport and deposition in complex laminar and turbulent flows with different hydrodynamic characteristics (e.g. vortex shedding, impingement, free shear layer, turbulent boundary layer). The solid particle information is encapsulated locally at the lattice grid nodes, allowing straightforward mapping of the data structure onto a GPGPU and enabling massively parallel execution of the LBM-MRT-CA algorithm. The new particle transport algorithm is based on the local (bulk) particle density and velocity and provides more realistic results for particle transport and deposition than the standard LBM-CA algorithm.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 29 no. 7
Type: Research Article
ISSN: 0961-5539


Article
Publication date: 17 August 2012

Janusz Będkowski, Andrzej Masłowski and Geert De Cubber

The purpose of this paper is to demonstrate a real time 3D localization and mapping approach for the USAR (Urban Search and Rescue) robotic application, focusing on the…

Abstract

Purpose

The purpose of this paper is to demonstrate a real-time 3D localization and mapping approach for the USAR (Urban Search and Rescue) robotic application, focusing on the performance and accuracy of general-purpose computing on graphics processing units (GPGPU)-based iterative closest point (ICP) 3D data registration, implemented using a modern GPGPU with the Fermi architecture.

Design/methodology/approach

The authors moved all of the ICP computation onto the GPU and performed registration experiments with up to 10^6 data points. The main goal of the research was to provide a method for real-time data registration performed by a mobile robot equipped with a commercially available 3D laser measurement system. The main contribution of the paper is a new GPGPU-based ICP implementation with regular grid decomposition. It guarantees accuracy equivalent to that of the CPU-based ICP implementation, with better performance.
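
The paper's implementation is a GPU kernel; purely as a hedged illustration of the regular-grid-decomposition idea behind the nearest-neighbour search, here is a CPU-side Python sketch with illustrative function names and cell size:

```python
import numpy as np

def build_grid(points, cell):
    """Bucket 3D points into a regular grid: cell index -> list of point ids."""
    grid = {}
    keys = np.floor(points / cell).astype(np.int64)
    for i, k in enumerate(map(tuple, keys)):
        grid.setdefault(k, []).append(i)
    return grid

def nearest_neighbor(query, points, grid, cell):
    """Search only the 27 cells around the query instead of all points."""
    cx, cy, cz = np.floor(query / cell).astype(np.int64)
    best_id, best_d2 = -1, np.inf
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for i in grid.get((cx + dx, cy + dy, cz + dz), ()):
                    d2 = np.sum((points[i] - query) ** 2)
                    if d2 < best_d2:
                        best_id, best_d2 = i, d2
    return best_id, np.sqrt(best_d2)

# toy usage: 1,000 model points, one query
rng = np.random.default_rng(1)
model = rng.uniform(0, 10, size=(1000, 3))
grid = build_grid(model, cell=1.0)
print(nearest_neighbor(np.array([5.0, 5.0, 5.0]), model, grid, cell=1.0))
```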

Findings

The authors present an empirical analysis of the tuning of the GPUICP parameters for obtaining much better performance (an acceptable level of variance in the computing time) with minimal loss of accuracy. A loop-closing method is added and demonstrates satisfactory results for 3D localization and mapping in urban environments. This work can help in building USAR mobile robotic applications that process 3D point clouds in real time.

Practical implications

This work can help in developing real time mapping for USAR robotic applications.

Originality/value

The paper proposes a new method for nearest neighbor search that guarantees better performance with minimal loss of accuracy. The variance of the computational time is much lower than that of state-of-the-art (SoA) methods.

Article
Publication date: 19 June 2017

Janusz Marian Bedkowski and Timo Röhling

This paper aims to focus on real-world mobile systems and thus proposes a relevant contribution to the special issue on "Real-world mobile robot systems". This work on 3D laser…

Abstract

Purpose

This paper aims to focus on real-world mobile systems and thus proposes a relevant contribution to the special issue on "Real-world mobile robot systems". This work on 3D laser semantic mobile mapping and particle filter localization, dedicated to robots patrolling urban sites, emphasizes the application of parallel computing for semantic mapping and particle filter localization. The real robotic application of patrolling urban sites is the goal; thus, it has been shown that the crucial robotic components have reached a high Technology Readiness Level (TRL).

Design/methodology/approach

Three different robotic platforms equipped with different 3D laser measurement systems were compared. Each system provides different data in terms of measured distance, point density and noise; thus, the influence of the data on the final semantic maps has been compared. The realistic problem is to use these semantic maps for robot localization; thus, the influence of the different maps on particle filter localization has been examined. A new approach has been proposed for particle filter localization based on 3D semantic information, and the behavior of the particle filter under different realistic conditions has been studied. The process of using the proposed robotic components for patrolling an urban site, such as the robot checking for geometrical changes in the environment, has been detailed.
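
As a hedged illustration of how a particle filter can weight pose hypotheses by the semantic agreement between a labelled scan and a semantic map, the following is a simplified 2D Python sketch with hypothetical class labels, not the authors' 3D GPU implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def pf_update(particles, weights, scan_pts, scan_labels, semantic_map, motion, noise=0.05):
    """One predict/weight/resample cycle of a 2D particle filter.

    particles    : (N, 3) poses [x, y, yaw]
    scan_pts     : (M, 2) scan points in the robot frame
    scan_labels  : (M,) semantic class of each scan point (e.g. "wall", "free")
    semantic_map : function (x, y) -> expected semantic class at that location
    """
    # predict: apply the odometry increment plus noise
    particles = particles + motion + rng.normal(0.0, noise, particles.shape)

    # weight: fraction of scan points whose label matches the map label
    for i, (x, y, yaw) in enumerate(particles):
        c, s = np.cos(yaw), np.sin(yaw)
        wx = x + c * scan_pts[:, 0] - s * scan_pts[:, 1]
        wy = y + s * scan_pts[:, 0] + c * scan_pts[:, 1]
        match = np.array([semantic_map(px, py) == lbl
                          for px, py, lbl in zip(wx, wy, scan_labels)])
        weights[i] *= match.mean() + 1e-9
    weights /= weights.sum()

    # resample when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles, weights = particles[idx], np.full(len(weights), 1.0 / len(weights))
    return particles, weights

# toy usage: map says "wall" to the left of x = 5, "free" elsewhere
semantic_map = lambda x, y: "wall" if x < 5 else "free"
particles = rng.uniform(0, 10, (200, 3))
weights = np.full(200, 1.0 / 200)
scan = np.array([[-1.0, 0.0], [1.0, 0.0]])
labels = np.array(["wall", "free"])
particles, weights = pf_update(particles, weights, scan, labels, semantic_map, motion=np.zeros(3))
```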

Findings

The focus on real-world mobile systems requires a different point of view for scientific work. This study is focused on robust and reliable solutions that could be integrated with real applications. Thus, a new parallel computing approach for semantic mapping and particle filter localization has been proposed. Based on the literature, semantic 3D particle filter localization has not yet been elaborated; thus, innovative solutions for solving this issue have been proposed. The work builds on a semantic mapping framework that the authors published recently. For this reason, this study claims that the authors' applied studies during real-world trials with such a mapping system are an added value relevant for this special issue.

Research limitations/implications

The main problem is the compromise between computing power and the energy consumed by heavy calculations; thus, the main focus is on using a modern GPGPU, the NVIDIA Pascal parallel processor architecture. Recent advances in GPGPUs show great potential for mobile robotic applications; thus, this study is focused on increasing mapping and localization capabilities by improving the algorithms. The current limitation is the number of particles that can be processed by a single processor, which caps the achieved real-time performance at 500 particles. The implication is that multi-GPU architectures can be used to increase the number of processed particles. Thus, further studies are required.

Practical implications

The research focus is on real-world mobile systems; thus, the practical aspects of the work are crucial. The main practical application is semantic mapping, which could be used for many robotic applications. The authors claim that their particle filter localization is ready to be integrated with real robotic platforms using a modern 3D laser measurement system. For this reason, the authors claim that their system can improve existing autonomous robotic platforms. The proposed components can be used for the detection of geometrical changes in the scene; thus, many practical functionalities can be implemented, such as the detection of cars or of opened/closed gates, etc. […] These functionalities are crucial elements of the safety and security domain.

Social implications

Improvement of the safety and security domain is a crucial aspect of modern society. Protecting critical infrastructure plays an important role; thus, introducing autonomous mobile platforms capable of supporting the human operators of safety and security systems could have a positive impact from many points of view.

Originality/value

This study elaborates a novel approach to particle filter localization based on 3D data and semantic mapping. This original work could have a great impact on the mobile robotics domain, and the study claims that many algorithmic and implementation issues were solved in the context of real-task experiments. The originality of this work is influenced by the use of modern advanced robotic systems, which form a relevant set of technologies for a proper evaluation of the proposed approach. Such a combination of experimental hardware, original algorithms and implementation is definitely an added value.

Details

Industrial Robot: An International Journal, vol. 44 no. 4
Type: Research Article
ISSN: 0143-991X


Article
Publication date: 19 June 2017

Michał R. Nowicki, Dominik Belter, Aleksander Kostusiak, Petr Cížek, Jan Faigl and Piotr Skrzypczyński

This paper aims to evaluate four different simultaneous localization and mapping (SLAM) systems in the context of localization of multi-legged walking robots equipped with compact…

Abstract

Purpose

This paper aims to evaluate four different simultaneous localization and mapping (SLAM) systems in the context of localization of multi-legged walking robots equipped with compact RGB-D sensors. This paper identifies problems related to in-motion data acquisition in a legged robot and evaluates the particular building blocks and concepts applied in contemporary SLAM systems against these problems. The SLAM systems are evaluated on two independent experimental set-ups, applying a well-established methodology and performance metrics.

Design/methodology/approach

Four feature-based SLAM architectures are evaluated with respect to their suitability for localization of multi-legged walking robots. The evaluation methodology is based on the computation of the absolute trajectory error (ATE) and relative pose error (RPE), which are performance metrics well-established in the robotics community. Four sequences of RGB-D frames acquired in two independent experiments using two different six-legged walking robots are used in the evaluation process.
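
ATE and RPE are well-established metrics; as a minimal illustration (translational parts only, assuming the trajectories are already time-synchronised and aligned, whereas the full metrics additionally handle rotations and an SE(3) alignment step), a Python sketch:

```python
import numpy as np

def ate_rmse(est_xyz, gt_xyz):
    """Absolute trajectory error: RMSE of translational differences between
    time-synchronised, pre-aligned estimated and ground-truth positions."""
    err = est_xyz - gt_xyz
    return np.sqrt(np.mean(np.sum(err ** 2, axis=1)))

def rpe_rmse(est_xyz, gt_xyz, delta=1):
    """Relative pose error (translational part only): RMSE of the difference
    between estimated and ground-truth relative motions over a fixed step."""
    d_est = est_xyz[delta:] - est_xyz[:-delta]
    d_gt = gt_xyz[delta:] - gt_xyz[:-delta]
    err = d_est - d_gt
    return np.sqrt(np.mean(np.sum(err ** 2, axis=1)))

# toy usage: a ground-truth path plus a slowly drifting estimate
t = np.linspace(0, 10, 101)
gt = np.stack([t, np.sin(t), np.zeros_like(t)], axis=1)
est = gt + 0.01 * np.cumsum(np.random.default_rng(3).normal(size=gt.shape), axis=0)
print(ate_rmse(est, gt), rpe_rmse(est, gt, delta=10))
```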

Findings

The experiments revealed that the predominant problems of legged robots as platforms for SLAM are the abrupt and unpredictable sensor motions, as well as the oscillations and vibrations, which corrupt images captured in motion. The tested adaptive gait allowed the evaluated SLAM systems to reconstruct proper trajectories. The bundle adjustment-based SLAM systems produced the best results, thanks to the use of a map, which makes it possible to establish a large number of constraints for the estimated trajectory.

Research limitations/implications

The evaluation was performed using indoor mockups of terrain. Experiments in more natural and challenging environments are envisioned as part of future research.

Practical implications

The lack of accurate self-localization methods is considered one of the most important limitations of walking robots. Thus, the evaluation of state-of-the-art SLAM methods on legged platforms may be useful for all researchers working on walking robots' autonomy and their use in various applications, such as search, security, agriculture and mining.

Originality/value

The main contribution lies in the integration of state-of-the-art SLAM methods on walking robots and their thorough experimental evaluation using a well-established methodology. Moreover, a SLAM system designed especially for RGB-D sensors and real-world applications is presented in detail.

Details

Industrial Robot: An International Journal, vol. 44 no. 4
Type: Research Article
ISSN: 0143-991X


Article
Publication date: 13 February 2023

Andro Rak, Luka Grbčić, Ante Sikirica and Lado Kranjčević

The purpose of this paper is the examination of fluid flow around NACA0012 airfoil, with the aim of the numerical validation between the experimental results in the wind tunnel…

Abstract

Purpose

The purpose of this paper is the examination of the fluid flow around a NACA0012 airfoil, with the aim of numerical validation between the experimental results in the wind tunnel and the lattice Boltzmann method (LBM) analysis, for a medium Reynolds number (Re = 191,000). The LBM–large eddy simulation (LES) method described in this paper opens up opportunities for faster computational fluid dynamics (CFD) analysis because of the scalability of the LBM on high-performance computing architectures, more specifically general-purpose graphics processing units (GPGPUs), while retaining the high-resolution LES approach.

Design/methodology/approach

The process starts with data collection in an open-circuit wind tunnel experiment. The pressure coefficient is then used as the comparative variable at varying angles of attack (2°, 4°, 6° and 8°) for both the experiment and the LBM analysis. To numerically reproduce the experimental results, the LBM coupled with the LES turbulence model, a generalized wall function (GWF) and the cumulant collision operator with the D3Q27 velocity set has been used. A mesh independence study is also provided to ensure result congruence.
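
The comparative variable reduces to Cp = (p - p_inf) / (0.5 * rho * U_inf^2); the Python sketch below shows how measured and simulated surface pressures could be reduced to Cp and compared through an RMSE of the kind reported in the Findings. All numbers are illustrative stand-ins, not the paper's data.

```python
import numpy as np

def pressure_coefficient(p, p_inf, rho, u_inf):
    """Cp = (p - p_inf) / (0.5 * rho * U_inf^2)."""
    return (p - p_inf) / (0.5 * rho * u_inf ** 2)

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

# toy usage with made-up tap pressures at matching chord positions
rho, u_inf, p_inf = 1.2, 28.0, 101325.0            # illustrative flow conditions
p_exp = p_inf + np.array([-410.0, -320.0, -180.0, -60.0, 30.0])   # "measured"
p_lbm = p_inf + np.array([-400.0, -330.0, -175.0, -55.0, 25.0])   # "simulated"
cp_exp = pressure_coefficient(p_exp, p_inf, rho, u_inf)
cp_lbm = pressure_coefficient(p_lbm, p_inf, rho, u_inf)
print(rmse(cp_lbm, cp_exp))
```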

Findings

The proposed LBM methodology is capable of highly accurate predictions when compared with experimental data. In addition, the special significance of this work lies in the possibility of comparing experiment and CFD for the same domain dimensions.

Originality/value

Considering the quality of the results, the root-mean-square error (RMSE) shows good agreement for both the airfoil's upper and lower surfaces. More precisely, across all angles of attack, the maximal RMSE is 0.105 for the upper surface and 0.089 for the lower surface.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 33 no. 5
Type: Research Article
ISSN: 0961-5539


Article
Publication date: 12 June 2017

Andre Luis Cavalcanti Bueno, Noemi de La Rocque Rodriguez and Elisa Dominguez Sotelino

The purpose of this work is to present a methodology that harnesses the computational power of multiple graphics processing units (GPUs) and hides the complexities of tuning GPU…

Abstract

Purpose

The purpose of this work is to present a methodology that harnesses the computational power of multiple graphics processing units (GPUs) and hides the complexities of tuning GPU parameters from the users.

Design/methodology/approach

A methodology for auto-tuning OpenCL configuration parameters has been developed.
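
The abstract gives no implementation details, so the following Python sketch only illustrates the general auto-tuning idea: time a parameterized kernel launcher over a grid of candidate OpenCL-style configurations and keep the fastest. The launcher and parameter names are placeholders, not the authors' API.

```python
import itertools
import time

def autotune(run_kernel, search_space, repeats=3):
    """Time every candidate configuration and return the fastest one.

    run_kernel   : callable(config) that launches the kernel and blocks until
                   it finishes (placeholder for a real OpenCL launch)
    search_space : dict of parameter name -> list of candidate values,
                   e.g. {"local_size": [32, 64, 128], "vector_width": [1, 4]}
    """
    names = list(search_space)
    best_cfg, best_t = None, float("inf")
    for values in itertools.product(*(search_space[n] for n in names)):
        cfg = dict(zip(names, values))
        t = min(_timed(run_kernel, cfg) for _ in range(repeats))
        if t < best_t:
            best_cfg, best_t = cfg, t
    return best_cfg, best_t

def _timed(fn, cfg):
    start = time.perf_counter()
    fn(cfg)
    return time.perf_counter() - start

# toy usage with a fake "kernel" whose cost depends on the configuration
fake = lambda cfg: time.sleep(0.001 * (256 / cfg["local_size"]))
print(autotune(fake, {"local_size": [32, 64, 128, 256]}))
```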

Findings

The described process helps simplify coding and yields a significant reduction in execution time for each method.

Originality/value

Most authors develop their GPU applications for specific hardware configurations. In this work, a solution is offered to make the developed code portable to any GPU hardware.

Details

Engineering Computations, vol. 34 no. 4
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 15 November 2011

Imre Kiss, József Pávó and Szabolcs Gyimóthy

The purpose of this paper is to accelerate the time‐consuming task of assembling the impedance matrix resulting from the discretization of integral equations by the moment method…

Abstract

Purpose

The purpose of this paper is to accelerate the time-consuming task of assembling the impedance matrix that results from the discretization of integral equations by the moment method, using a massively parallel processing scheme.

Design/methodology/approach

This paper provides several approaches for the implementation of the moment method on compute unified device architecture (CUDA)-capable general-purpose video cards, as well as general implementation design patterns and a good overview of the topic.
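
To illustrate why moment-method matrix fill maps well onto massively parallel hardware, note that each impedance-matrix entry Z[m, n] depends only on the basis/testing element pair (m, n) and can therefore be computed independently (one entry or one row per GPU thread). The Python sketch below uses a scalar free-space Green's-function interaction and a crude self-term purely as stand-ins for the actual integral-equation kernel:

```python
import numpy as np

def assemble_impedance_matrix(centers, k=2 * np.pi):
    """Fill a dense interaction matrix where every entry is independent.

    Each Z[m, n] depends only on the element pair (m, n), which is exactly
    why the fill can be distributed over GPU threads. The scalar kernel
    exp(-j*k*r) / (4*pi*r) is a stand-in for the true moment-method kernel.
    """
    diff = centers[:, None, :] - centers[None, :, :]   # (N, N, 3) pairwise offsets
    r = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(r, 1e-3)                          # crude regularised self-term
    return np.exp(-1j * k * r) / (4.0 * np.pi * r)

# toy usage: 100 element centres along a line
centers = np.zeros((100, 3))
centers[:, 0] = np.linspace(0.0, 1.0, 100)
Z = assemble_impedance_matrix(centers)
print(Z.shape, Z.dtype)
```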

Findings

The proposed method seems to be efficient in the light of the presented numerical results.

Originality/value

The subject of the paper is an evolving, relatively new area of computational techniques that could be of high interest to the scientific community.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 30 no. 6
Type: Research Article
ISSN: 0332-1649


Book part
Publication date: 10 December 2018

Keren Caspin-Wagner, Silvia Massini and Arie Y. Lewin

This chapter discusses the phenomenon of online marketplaces for science, technology, engineering, and math (STEM) talent and highlights its effect on knowledge creation and…

Abstract

This chapter discusses the phenomenon of online marketplaces for science, technology, engineering, and math (STEM) talent and highlights its effect on knowledge creation and innovation through on-demand contract employment and problem solving of scientific challenges by online communities of experts globally. In particular, the authors discuss the key dynamics and events driving the development of the online marketplaces for innovation. Relying on data from various online platforms, including novel data from one of the world's largest online platforms, the chapter characterizes the phenomenon, including the geographic dispersion of users and distribution of income, and discusses important implications and challenges for research and development (R&D) and innovation management in organizations. These include the need to develop new organizational and managerial capabilities, intellectual property (IP) protection issues, the ability to balance internal and external innovation processes, and implications for the changing identity of R&D workers.

Details

International Business in the Information and Digital Age
Type: Book
ISBN: 978-1-78756-326-1


Article
Publication date: 4 April 2016

Alain Yee Loong Chong, Boying Li, Eric W.T. Ngai, Eugene Ch'ng and Filbert Lee

The purpose of this paper is to investigate if online reviews (e.g. valence and volume), online promotional strategies (e.g. free delivery and discounts) and sentiments from user…


Abstract

Purpose

The purpose of this paper is to investigate if online reviews (e.g. valence and volume), online promotional strategies (e.g. free delivery and discounts) and sentiments from user reviews can help predict product sales.

Design/methodology/approach

The authors designed a big data architecture and deployed Node.js agents for scraping Amazon.com pages using asynchronous input/output calls. The completed web crawling and scraping data sets were then preprocessed for sentiment and neural network analysis. The neural network was employed to examine which variables in the study are important predictors of product sales.
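
As a hedged sketch of the modelling step (synthetic data, scikit-learn's MLPRegressor as a stand-in for the authors' neural network, and hypothetical feature names), the individual predictors can be augmented with interaction terms such as volume x sentiment and volume x discount before being fed to the network:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
n = 500

# synthetic stand-ins for the scraped predictors
volume    = rng.poisson(50, n).astype(float)   # number of reviews
valence   = rng.uniform(1, 5, n)               # average star rating
discount  = rng.uniform(0, 0.5, n)             # promotional discount
sentiment = rng.uniform(-1, 1, n)              # review sentiment score

# interaction terms alongside the individual predictors
X = np.column_stack([
    volume, valence, discount, sentiment,
    volume * sentiment,                        # volume x sentiment
    volume * discount,                         # volume x discount
])
# made-up sales signal driven mostly by the interaction terms
sales = 5 * volume * sentiment + 40 * discount * volume + rng.normal(0, 50, n)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X, sales)
print("R^2 on training data:", round(model.score(X, sales), 3))
```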

Findings

This study found that although online reviews, online promotional strategies and online sentiments can all predict product sales, some variables are more important predictors than others. The authors found that the interplay effects of these variables are more important predictors than the individual variables themselves. For example, the interactions of online volume with sentiments and discounts are more important than the individual predictors of discounts, sentiments or online volume.

Originality/value

This study designed a big data architecture, in combination with sentiment and neural network analysis, that can facilitate future business research on predicting product sales in an online environment. This study also employed a predictive analytic approach (e.g. a neural network) to examine the variables, and this approach is useful for future data analysis in a big data environment, where prediction can have more practical implications than significance testing. This study also examined the interplay between online reviews, sentiments and promotional strategies, which until now have mostly been examined individually in previous studies.

Details

International Journal of Operations & Production Management, vol. 36 no. 4
Type: Research Article
ISSN: 0144-3577


Article
Publication date: 19 October 2015

Eugene Ch'ng

The purpose of this paper is to present a Big Data solution as a methodological approach to the automated collection, cleaning, collation, and mapping of multimodal, longitudinal…

Abstract

Purpose

The purpose of this paper is to present a Big Data solution as a methodological approach to the automated collection, cleaning, collation, and mapping of multimodal, longitudinal data sets from social media. The paper constructs social information landscapes (SIL).

Design/methodology/approach

The research presented here adopts a Big Data methodological approach for mapping user-generated content in social media. The methodology and algorithms presented are generic and can be applied to diverse types of social media or user-generated content involving user interactions, such as blogs, comments on product pages and other forms of media, so long as the formal data structure proposed here can be constructed.
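
As a hedged Python sketch of the kind of formal data structure implied here (illustrative records, with the networkx library as a stand-in for the paper's own architecture), user interactions are collated into a directed, weighted network on which standard social network analysis measures can then be computed:

```python
import networkx as nx

# illustrative records scraped from a social platform:
# (author of the original content, user who replied/commented/shared)
interactions = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("carol", "dave"), ("alice", "dave"), ("dave", "bob"),
]

G = nx.DiGraph()
for producer, consumer in interactions:
    # edge points from the reacting user to the content producer
    if G.has_edge(consumer, producer):
        G[consumer][producer]["weight"] += 1
    else:
        G.add_edge(consumer, producer, weight=1)

# once the landscape is a graph, standard SNA measures apply directly
print(nx.in_degree_centrality(G))   # who attracts the most interaction
print(nx.pagerank(G))               # influence accounting for the whole network
```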

Findings

The limited, sequential presentation of content listings within social media and Web 2.0 pages, as viewed in web browsers or on mobile devices, does not necessarily reveal or make obvious a hidden property of the medium: that every participant, from content producers to consumers, followers and subscribers, together with the content they produce or subscribe to, is intrinsically connected in a hidden but massive network. Such networks, when mapped, can be quantitatively analysed using social network analysis (e.g. centralities), and the semantics and sentiments can equally reveal valuable information with appropriate analytics. What remains difficult, however, is collecting, cleaning, collating and mapping such data sets into a sufficiently large sample that could yield important insights into the community structure and into the direction and polarity of interaction on diverse topics. This research solves this particular strand of the problem.

Research limitations/implications

The automated mapping of extremely large networks involving hundreds of thousands to millions of nodes, encapsulating high resolution and contextual information, over a long period of time could possibly assist in the proving or even disproving of theories. The goal of this paper is to demonstrate the feasibility of using automated approaches for acquiring massive, connected data sets for academic inquiry in the social sciences.

Practical implications

The methods presented in this paper, together with the Big Data architecture, can provide individuals and institutions with a limited budget with practical approaches for constructing SIL. The integrated software-hardware architecture uses open-source software; furthermore, the SIL mapping algorithms are easy to implement.

Originality/value

The majority of research in the literature uses traditional approaches for collecting social network data. Traditional approaches can be slow and tedious, and they do not yield an adequate sample size to be of significant value for research. Whilst traditional approaches collect only a small percentage of the data, the original methods presented here are able to collect and collate entire data sets from social media, owing to the automated and scalable mapping techniques.

Details

Industrial Management & Data Systems, vol. 115 no. 9
Type: Research Article
ISSN: 0263-5577

