Search results

1 – 10 of over 1000
Article
Publication date: 12 October 2018

Beichuan Yan and Richard Regueiro

Abstract

Purpose

This paper aims to present a performance comparison between O(n²) and O(n) neighbor search algorithms, study their effects for different particle shape complexities and computational granularities (CG) and investigate the influence on superlinear speedup of 3D discrete element method (DEM) for complex-shaped particles. In particular, it aims to answer the question: which performs better in parallel 3D DEM computational practice, the O(n²) or the O(n) neighbor search algorithm?
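
To make the contrast concrete, the following is a minimal sketch of the two search strategies (it is not the paraEllip3d implementation; complex shapes are reduced to bounding spheres, and the function names, margin parameter and cell size are illustrative assumptions): the O(n²) variant tests every particle pair, while the O(n) variant bins particles into cells at least as large as the maximum interaction distance and tests only a cell and its 26 neighbors.

```cpp
#include <array>
#include <cmath>
#include <map>
#include <utility>
#include <vector>

// Illustrative particle: a complex shape reduced to a centre plus bounding radius.
struct Particle { double x, y, z, r; };
using Pair = std::pair<std::size_t, std::size_t>;

// O(n^2) neighbor search: test every particle pair.
std::vector<Pair> bruteForceNeighbors(const std::vector<Particle>& p, double margin) {
    std::vector<Pair> out;
    for (std::size_t i = 0; i < p.size(); ++i)
        for (std::size_t j = i + 1; j < p.size(); ++j) {
            const double dx = p[i].x - p[j].x, dy = p[i].y - p[j].y, dz = p[i].z - p[j].z;
            const double cutoff = p[i].r + p[j].r + margin;
            if (dx * dx + dy * dy + dz * dz < cutoff * cutoff) out.emplace_back(i, j);
        }
    return out;
}

// O(n) neighbor search: bin particles into cells whose edge is no smaller than the
// largest possible interaction distance (2*r_max + margin), so any interacting pair
// must lie in the same or an adjacent cell; the linear bound assumes a bounded
// number of particles per cell.
std::vector<Pair> cellListNeighbors(const std::vector<Particle>& p, double cellSize, double margin) {
    auto cellOf = [cellSize](const Particle& q) {
        return std::array<long, 3>{static_cast<long>(std::floor(q.x / cellSize)),
                                   static_cast<long>(std::floor(q.y / cellSize)),
                                   static_cast<long>(std::floor(q.z / cellSize))};
    };
    std::map<std::array<long, 3>, std::vector<std::size_t>> grid;
    for (std::size_t i = 0; i < p.size(); ++i) grid[cellOf(p[i])].push_back(i);

    std::vector<Pair> out;
    for (std::size_t i = 0; i < p.size(); ++i) {
        const std::array<long, 3> c = cellOf(p[i]);
        for (long dx = -1; dx <= 1; ++dx)
            for (long dy = -1; dy <= 1; ++dy)
                for (long dz = -1; dz <= 1; ++dz) {
                    auto it = grid.find({c[0] + dx, c[1] + dy, c[2] + dz});
                    if (it == grid.end()) continue;
                    for (std::size_t j : it->second) {
                        if (j <= i) continue;  // count each pair once
                        const double ddx = p[i].x - p[j].x, ddy = p[i].y - p[j].y, ddz = p[i].z - p[j].z;
                        const double cutoff = p[i].r + p[j].r + margin;
                        if (ddx * ddx + ddy * ddy + ddz * ddz < cutoff * cutoff) out.emplace_back(i, j);
                    }
                }
    }
    return out;
}
```

In serial use the cell list wins asymptotically, but, as the Findings below note, at the fine computational granularities used in parallel runs each subdomain holds few particles, so the constant overhead of building and probing the grid can exceed the cost of the simple pairwise loop.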

Design/methodology/approach

The O(n²) and O(n) neighbor search algorithms are carefully implemented in the code paraEllip3d, which is executed on Department of Defense supercomputers across five orders of magnitude of simulation scale (2,500; 12,000; 150,000; 1 million and 10 million particles) to evaluate and compare their performance, using both strong and weak scaling measurements.
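
For reference, the two measurements can be written out as follows (these are the standard textbook definitions, with T the wall-clock time, n the number of particles and p the process count; the paper itself may report efficiency differently):

```latex
% Strong scaling: fixed total problem size n, growing process count p.
% Weak scaling:   fixed per-process workload n/p, so n grows with p.
\[
  S_{\mathrm{strong}}(p) = \frac{T_{1}(n)}{T_{p}(n)}, \qquad
  E_{\mathrm{strong}}(p) = \frac{S_{\mathrm{strong}}(p)}{p}, \qquad
  E_{\mathrm{weak}}(p)   = \frac{T_{1}(n)}{T_{p}(p\,n)} .
\]
% Superlinear speedup corresponds to S_strong(p) > p, i.e. E_strong(p) > 1.
```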

Findings

The more complex the particle shape (from sphere to ellipsoid to poly-ellipsoid), the smaller the neighbor search fraction (NSF); and the lower the CG, the smaller the NSF. In both serial and parallel computing of complex-shaped 3D DEM, the O(n²) algorithm is inefficient at coarse CG; however, it executes faster than the O(n) algorithm at the fine CGs that are mostly used in computational practice to achieve the best performance. This means that the O(n²) algorithm generally outperforms O(n) in parallel 3D DEM.

Practical implications

Taking for granted that O(n) unconditionally outperforms O(n²) in complex-shaped 3D DEM is a misconception commonly encountered in the computational engineering and science literature.

Originality/value

The paper clarifies that the performance of the O(n²) and O(n) neighbor search algorithms for complex-shaped 3D DEM is affected by particle shape complexity and CG. In particular, the O(n²) algorithm generally outperforms the O(n) algorithm in large-scale parallel 3D DEM simulations, even though this outperformance is counterintuitive.

Details

Engineering Computations, vol. 35 no. 6
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 16 April 2018

Beichuan Yan and Richard Regueiro

Abstract

Purpose

The purpose of this paper is to extend complex-shaped discrete element method simulations from a few thousand particles to millions of particles by using parallel computing on Department of Defense (DoD) supercomputers and to study the mechanical response of particle assemblies composed of a large number of particles in engineering practice and laboratory tests.

Design/methodology/approach

A parallel algorithm is designed and implemented with advanced features such as link-block, border layer and migration layer, adaptive compute gridding technique and message passing interface (MPI) transmission of C++ objects and pointers, for high-performance optimization; performance analyses are conducted across five orders of magnitude of simulation scale on multiple DoD supercomputers; and three full-scale simulations of sand pluviation, constrained collapse and particle shape effect are carried out to study the mechanical response of particle assemblies.
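
As a rough illustration of the inter-process communication pattern (a minimal sketch only, not the paper's link-block/border-layer/migration-layer scheme; it assumes a simple one-dimensional domain decomposition along x, and names such as migrateRight and the 4-double packing are hypothetical):

```cpp
#include <mpi.h>
#include <vector>

// Plain-old-data particle so its fields can be shipped as a flat array of doubles.
struct Particle { double x, y, z, r; };

// Move particles that crossed the subdomain's upper x-boundary to the right-hand
// neighbour and receive the ones arriving from the left-hand neighbour.
// Periodic wrap is used for simplicity; edge ranks could use MPI_PROC_NULL instead.
std::vector<Particle> migrateRight(std::vector<Particle>& owned, double xUpper,
                                   int rank, int size, MPI_Comm comm) {
    std::vector<Particle> outgoing, kept;
    for (const Particle& p : owned)
        (p.x > xUpper ? outgoing : kept).push_back(p);
    owned.swap(kept);

    const int right = (rank + 1) % size;
    const int left  = (rank - 1 + size) % size;

    // Step 1: exchange counts so the receiver can size its buffer.
    int sendCount = static_cast<int>(outgoing.size()) * 4;   // 4 doubles per particle
    int recvCount = 0;
    MPI_Sendrecv(&sendCount, 1, MPI_INT, right, 0,
                 &recvCount, 1, MPI_INT, left, 0, comm, MPI_STATUS_IGNORE);

    // Step 2: pack, exchange and unpack the particle data.
    std::vector<double> sendBuf(sendCount), recvBuf(recvCount);
    for (std::size_t i = 0; i < outgoing.size(); ++i) {
        sendBuf[4 * i]     = outgoing[i].x;
        sendBuf[4 * i + 1] = outgoing[i].y;
        sendBuf[4 * i + 2] = outgoing[i].z;
        sendBuf[4 * i + 3] = outgoing[i].r;
    }
    MPI_Sendrecv(sendBuf.data(), sendCount, MPI_DOUBLE, right, 1,
                 recvBuf.data(), recvCount, MPI_DOUBLE, left, 1, comm, MPI_STATUS_IGNORE);

    std::vector<Particle> incoming(recvCount / 4);
    for (std::size_t i = 0; i < incoming.size(); ++i)
        incoming[i] = {recvBuf[4 * i], recvBuf[4 * i + 1], recvBuf[4 * i + 2], recvBuf[4 * i + 3]};
    return incoming;   // caller appends these to its owned particle set
}
```

Exchanging counts before data is one simple way to size the receive buffer; the paper's implementation instead transmits C++ objects and pointers over MPI and overlaps such exchanges across block borders.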

Findings

The parallel algorithm and implementation exhibit high speedup and excellent scalability; communication time is a decreasing function of the number of compute nodes; and the optimal computational granularity for each simulation scale is given. Nearly 50 per cent of wall clock time is spent on the rebound phenomenon at the top of the particle assembly in the dynamic simulation of sand gravitational pluviation. Numerous particles are necessary to capture the pattern and shape of the particle assembly in collapse tests; a preliminary comparison between a sphere assembly and an ellipsoid assembly indicates a significant influence of particle shape on the kinematic, kinetic and static behavior of particle assemblies.

Originality/value

The high-performance parallel code enables the simulation of a wide range of dynamic and static laboratory and field tests in engineering applications that involve a large number of granular and geotechnical material grains, such as sand pluviation process, buried explosion in various soils, earth penetrator interaction with soil, influence of grain size, shape and gradation on packing density and shear strength and mechanical behavior under different gravity environments such as on the Moon and Mars.

Details

Engineering Computations, vol. 35 no. 2
Type: Research Article
ISSN: 0264-4401

Open Access
Article
Publication date: 19 October 2021

Veronica Johansson and Jörgen Stenlund

Abstract

Purpose

Representations of time are commonly used to construct narratives in visualisations of data. However, since time is a value-laden concept, and no representation can provide a full, objective account of “temporal reality”, such representations are also biased and political: they reproduce and reinforce certain views and values at the expense of alternative ones. This conceptual paper aims to explore expressions of temporal bias and politics in data visualisation, along with user approaches and design strategies that may mitigate them.

Design/methodology/approach

This study presents a theoretical framework rooted in a sociotechnical view of representations as biased and political, combined with perspectives from critical literacy, radical literacy and critical design. The framework provides a basis for discussion of various types and effects of temporal bias in visualisation. Empirical examples from previous research and public resources illustrate the arguments.

Findings

Four types of political effects of temporal bias in visualisations are presented: limitation of view, disregard of variation, oppression of social groups and misrepresentation of topic. The findings suggest that appropriate critical and radical literacy approaches require users and designers to critique, contextualise, counter and cross beyond such expressions. Supporting critical design strategies involve the inclusion of multiple datasets and representations; broad access to flexible tools; and inclusive participation of marginalised groups.

Originality/value

The paper draws attention to a vital, yet little researched problem of temporal representation in visualisations of data. It offers a pioneering bridging of critical literacy, radical literacy and critical design and emphasises mutual rather than contradictory interests of the empirical sciences and humanities.

Details

Journal of Documentation, vol. 78 no. 1
Type: Research Article
ISSN: 0022-0418

Details

Integrated Land-Use and Transportation Models
Type: Book
ISBN: 978-0-080-44669-1

Article
Publication date: 3 April 2018

Marcus Foth

Abstract

Purpose

The purpose of this paper is to trace how the relationship between city governments and citizens has developed over time with the introduction of urban informatics and smart city technology.

Design/methodology/approach

The argument presented in the paper is backed up by a critical review approach based on a transdisciplinary assessment of social, spatial and technical research domains.

Findings

Smart cities using urban informatics can be categorised into four classes of maturity or development phases depending on the qualities of their relationship with their citizenry. The paper discusses the evolution of this maturity scale from people as residents, consumers, participants, to co-creators.

Originality/value

The paper’s contribution has practical implications for cities wanting to take advantage of urban informatics and smart city technology. First, recognising that technology is a means to an end requires cities to avoid technocratic solutions and employ participatory methodologies of urban informatics. Second, the most challenging part of unpacking city complexities is not about urban data but about a cultural shift in policy and governance style towards collaborative citymaking. The paper suggests reframing the design notion of usability towards “citizen-ability”.

Details

Smart and Sustainable Built Environment, vol. 7 no. 1
Type: Research Article
ISSN: 2046-6099

Book part
Publication date: 12 December 2017

Susan Halford

Abstract

This chapter explores the perfect storm brewing at the interface of an increasingly organized ethics review process, grounded in principles of anonymity and informed consent, and the formation of a new digital data landscape in which vast quantities of unregulated and often personal information are readily available as research data. This new form of data not only offers huge potential for insight into everyday activities, values, and networks but also poses some profound challenges, not least as it disrupts the established principles and structures of the ethics review process. The chapter outlines four key disruptions posed by social media data and considers the value of situational ethics as a response. Drawing on the experiences and contributions of Ph.D. students in interdisciplinary Web Science, the chapter concludes that there is a need for more sharing of the ethical challenges faced in the field by those at the ‘cutting edge’ of social media research, and for the development of shared resources. This might inform and speed up the adaptation of ethics review processes to the challenges posed by new forms of digital data, to ensure that academic research with these data can keep pace with the methods and analyses being developed elsewhere, especially in commercial and journalistic contexts.

Open Access
Article
Publication date: 25 August 2023

Nathalie Kron, Jesper Björkman, Peter Ek, Micael Pihlgren, Hanan Mazraeh, Benny Berggren and Patrik Sörqvist

Abstract

Purpose

Previous research suggests that the compensation offered to customers after a service failure has to be substantial to make customer satisfaction surpass that of an error-free service. However, with the right service recovery strategy, it might be possible to reduce compensation size while maintaining happy customers. The aim of the current study is to test whether an anchoring technique can be used to achieve this goal.

Design/methodology/approach

After experiencing a service failure, participants were told that there is a standard compensation size for service failures; the size of this standard differed between conditions. Thereafter, participants were asked how much compensation they would demand to be satisfied with their customer experience.

Findings

The compensation demand was relatively high on average (1,000–1,400 SEK, ≈ $120). However, telling the participants that customers typically receive 200 SEK as compensation reduced their demand to about 800 SEK (Experiment 1)—an anchoring effect. Moreover, a precise anchoring point (a typical compensation of 247 SEK) generated a lower demand than rounded anchoring points, even when the rounded anchoring point was lower (200 SEK) than the precise counterpart (Experiment 2)—a precision effect.

Implications/value

Setting a low compensation standard—yet allowing customers to actually receive compensations above the standard—can make customers more satisfied while also saving resources in demand-what-you-want service recovery situations, in particular when the compensation standard is a precise value.

Details

Journal of Service Theory and Practice, vol. 33 no. 7
Type: Research Article
ISSN: 2055-6225

Article
Publication date: 1 July 1992

K.R. Tout and D.J. Evans

Abstract

Applies a parallel backward‐chaining technique to a rule‐based expert system on a shared‐memory multiprocessor system. The condition for a processor to split up its search tree (task‐node) and generate new OR nodes is based on the level in the goal tree at which the task‐node is found. The results indicate satisfactory speed‐up performance for a small number of processors (< 10) and a reasonably large number of rules.
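
A minimal sketch of the idea in modern C++ (the original system predates this style; the rule base, the kSplitLevel threshold and all identifiers are illustrative assumptions, and termination detection is kept deliberately simple): each shared task is an OR node, i.e. a pending conjunction of goals; a worker explores the first matching rule depth-first and, when the current goal lies above the split level in the goal tree, exports the remaining alternative rules to the shared queue for idle processors.

```cpp
#include <atomic>
#include <iostream>
#include <mutex>
#include <string>
#include <thread>
#include <vector>

struct Rule { std::string head; std::vector<std::string> body; };  // head :- body1, body2, ...
struct Goal { std::string pred; int level; };                      // subgoal and its goal-tree depth
using GoalList = std::vector<Goal>;                                // pending conjunction, leftmost first

// Illustrative rule base: "route" can be proved via "flight" (fails) or via "train".
const std::vector<Rule> rules = {
    {"route", {"leg", "visa"}},
    {"leg",   {"flight"}},
    {"leg",   {"train"}},
    {"train", {}},   // fact
    {"visa",  {}},   // fact
};

const int kSplitLevel = 2;            // OR branches of goals shallower than this are exported

std::vector<GoalList> taskQueue;      // shared queue of OR-node tasks
std::mutex qMutex;
std::atomic<bool> proved{false};
std::atomic<int> pending{0};          // tasks enqueued or currently being processed

void pushTask(GoalList goals) {
    std::lock_guard<std::mutex> lock(qMutex);
    taskQueue.push_back(std::move(goals));
    ++pending;
}

// Replace the first goal of the conjunction by the body of rule r, one level deeper.
GoalList resolvent(const GoalList& goals, const Rule& r, int level) {
    GoalList next;
    for (const std::string& b : r.body) next.push_back({b, level + 1});
    next.insert(next.end(), goals.begin() + 1, goals.end());
    return next;
}

// Depth-first backward chaining over one OR branch; shallow alternatives are exported.
bool solve(const GoalList& goals) {
    if (proved.load()) return false;          // another processor already succeeded
    if (goals.empty()) return true;           // nothing left: this branch proves the query
    const Goal g = goals.front();
    bool firstMatch = true;
    for (const Rule& r : rules) {
        if (r.head != g.pred) continue;
        GoalList next = resolvent(goals, r, g.level);
        if (g.level < kSplitLevel && !firstMatch)
            pushTask(std::move(next));        // split: hand this OR node to an idle processor
        else if (solve(next))
            return true;
        firstMatch = false;
    }
    return false;
}

void worker() {
    while (!proved.load()) {
        GoalList task;
        {
            std::lock_guard<std::mutex> lock(qMutex);
            if (taskQueue.empty()) {
                if (pending.load() == 0) return;   // no work anywhere: the query is unprovable
                continue;                          // others may still export OR branches
            }
            task = std::move(taskQueue.back());
            taskQueue.pop_back();
        }
        if (solve(task)) proved.store(true);
        --pending;
    }
}

int main() {
    pushTask({{"route", 0}});                      // the query sits at goal-tree level 0
    std::vector<std::thread> pool;
    for (int i = 0; i < 4; ++i) pool.emplace_back(worker);
    for (std::thread& t : pool) t.join();
    std::cout << (proved ? "query proved\n" : "query not proved\n");
}
```

The level-based split condition mirrors the paper's observation that exporting only shallow OR nodes keeps task grain sizes large enough to pay for the scheduling overhead on a small number of processors.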

Details

Kybernetes, vol. 21 no. 7
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 22 March 2019

Mustapha Munir, Arto Kiviniemi and Stephen W. Jones

Abstract

Purpose

Currently, building information modelling (BIM) is largely seen as a 3D model, not as an information model or information management tool. This wrong perception of BIM, together with low interest in 3D asset management (AM), is one of the major reasons for the slow adoption by clients in the architectural, engineering and construction (AEC) industry. The purpose of this paper is to identify the techniques and strategies for streamlining AM systems for BIM-based integration, and to examine how information is captured from physical assets towards BIM-based integration so that clients can derive value from BIM investments.

Design/methodology/approach

A qualitative case study strategy was used to study the strategic implementation process of integrating BIM with AM systems and the business value of BIM in AM by a large asset owner in the UK.

Findings

The paper identifies the key strategies in the adoption of BIM-based processes by an asset owner, the implementation process, the challenges and the benefits attained. Several barriers to adopting BIM-based processes in AM were identified: the complexity and cost associated with BIM; the irrelevance of 3D geometric data in AM processes; the nature of the asset ownership structure; managing the asset handover process; and managing change within the organisation. Organisations will have to consider the following issues in streamlining asset information with BIM: developing a clear strategy prior to adoption; connecting that strategy to the business goals; and conducting a discovery exercise to identify organisational information needs.

Originality/value

The research addresses a significant gap in the development of techniques and strategies for asset owners to streamline BIM with AM systems and derive business value from such integration. The research context is a case study involving a large owner-operator in the UK that has been able to derive value from BIM systems in its AM processes. The key value of the paper is improving asset owners’ understanding of BIM in AM by demonstrating the implementation strategies, linkage to organisational objectives, challenges, value management process and business value of BIM in AM. Another contribution of the paper is to improve the understanding of BIM, which is usually viewed as a set of 3D models, by showing that 3D geometric data do not have much value for AM tasks.

Details

Engineering, Construction and Architectural Management, vol. 26 no. 6
Type: Research Article
ISSN: 0969-9988

Content available
Book part
Publication date: 12 December 2017

Details

The Ethics of Online Research
Type: Book
ISBN: 978-1-78714-486-6
