Search results

1 – 10 of over 9000
Open Access
Article
Publication date: 7 May 2019

Yanan Wang, Jianqiang Li, Sun Hongbo, Yuan Li, Faheem Akhtar and Azhar Imran

Simulation is a well-known technique for using computers to imitate or simulate the operations of various kinds of real-world facilities or processes. The facility or process of…


Abstract

Purpose

Simulation is a well-known technique that uses computers to imitate the operations of various kinds of real-world facilities or processes. The facility or process of interest is usually called a system, and to study it scientifically we often have to make a set of assumptions about how it works. These assumptions, which usually take the form of mathematical or logical relationships, constitute a model that is used to gain some understanding of how the corresponding system behaves; the quality of that understanding depends essentially on the credibility of the given assumptions and models, which is assessed through VV&A (verification, validation and accreditation). The main purpose of this paper is to present an in-depth theoretical review and analysis of the application of VV&A in large-scale simulations.
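The validation step the abstract describes (checking model credibility against reality) can be illustrated with a minimal sketch. All names here (`validate`, the tolerance, the sample data) are illustrative assumptions, not from the paper:

```python
import statistics

def validate(model_outputs, observed, tolerance=0.05):
    """Toy validation check: compare the mean of simulated outputs
    against real-world observations within a relative tolerance."""
    sim_mean = statistics.mean(model_outputs)
    obs_mean = statistics.mean(observed)
    return abs(sim_mean - obs_mean) <= tolerance * abs(obs_mean)

# Example: a model's simulated waiting times vs. measured ones.
print(validate([4.1, 3.9, 4.0], [4.0, 4.2, 3.8]))  # True if within 5%
```

Real VV&A goes far beyond a single statistic, of course; this only shows the shape of a credibility check.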

Design/methodology/approach

After summarizing the VV&A of related research studies, the standards, frameworks, techniques, methods and tools have been discussed according to the characteristics of large-scale simulations (such as crowd network simulations).

Findings

The contributions of this paper will be useful for both academics and practitioners for formulating VV&A in large-scale simulations (such as crowd network simulations).

Originality/value

This paper will help researchers to provide support of a recommendation for formulating VV&A in large-scale simulations (such as crowd network simulations).

Details

International Journal of Crowd Science, vol. 3 no. 1
Type: Research Article
ISSN: 2398-7294


Open Access
Article
Publication date: 28 April 2020

Jialin Zou, Kun Wang and Hongbo Sun

Crowd network systems have been deemed as a promising mode of modern service industry and future economic society, and taking crowd network as the research object and exploring…

Abstract

Purpose

Crowd network systems have been deemed a promising mode of the modern service industry and the future economic society, and taking the crowd network as the research object and exploring its operation mechanisms and laws is of great significance for realizing effective government governance and rapid economic development while avoiding social chaos and mutation. Because a crowd network is a large-scale, dynamic and diversified online deep interconnection, most of its results cannot be observed in the real world and its study cannot be carried out in the traditional way; simulation is therefore of great importance to advance the related research. To address these problems, this paper aims to propose a simulation architecture based on the characteristics of crowd networks and to verify the feasibility of this architecture through a simulation example.

Design/methodology/approach

This paper adopts a data-driven architecture by deeply analyzing existing large-scale simulation architectures and proposes a novel reflective memory-based architecture for crowd network simulations. The architecture is analyzed from three aspects: implementation framework, functional architecture and implementation architecture. The proposed architecture adopts a general structure to decouple related work in a harmonious way and obtains reflective-memory support by connecting different devices via reflective memory cards. Several toolkits for system implementation are designed and connected by data-driven files (DDF); these XML files constitute a persistent storage layer. To improve the credibility of simulations, VV&A (verification, validation and accreditation) is introduced into the architecture to verify the accuracy of simulation system executions.

Findings

The implementation framework introduces the scenes, methods and toolkits involved in constructing the whole simulation architecture. The functional architecture adopts a general structure to decouple related work in a harmonious way. In the implementation architecture, several toolkits for system implementation are designed and connected by DDF; these XML files constitute a persistent storage layer. Crowd network simulations obtain reflective-memory support by connecting the reflective memory cards on different devices and invoke the interfaces of the relevant simulation software to complete the corresponding function calls. Meanwhile, to improve the credibility of simulations, VV&A is introduced into the architecture to verify the accuracy of simulation system executions.

Originality/value

This paper proposes a novel reflective memory-based architecture for crowd network simulations. Reflective memory is adopted as shared memory within a given simulation execution; communication efficiency and capacity are greatly improved by this shared memory-based design. The paper also adopts a data-driven architecture: the simulation process is driven mainly by XML files, which are highly readable and require no special software to read.
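The data-driven idea, with plain XML files steering the run, can be sketched briefly. The file contents and element names below (`simulation`, `step`, `toolkit`) are hypothetical stand-ins for the paper's DDF format, which is not specified in the abstract:

```python
import xml.etree.ElementTree as ET

# Hypothetical data-driven file (DDF): plain XML that lists simulation
# steps in order, readable without special software.
DDF = """<simulation name="crowd-demo">
  <step order="2" toolkit="runner"/>
  <step order="1" toolkit="scenario-builder"/>
  <step order="3" toolkit="vva-checker"/>
</simulation>"""

root = ET.fromstring(DDF)
# Drive the run purely from the file: sort steps and dispatch each toolkit.
steps = sorted(root.findall("step"), key=lambda s: int(s.get("order")))
for s in steps:
    print(s.get("order"), s.get("toolkit"))
```

The point of the design is that changing the XML file changes the simulation, with no recompilation of the toolkits.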

Details

International Journal of Crowd Science, vol. 4 no. 2
Type: Research Article
ISSN: 2398-7294


Article
Publication date: 14 November 2016

Yuye Wang, Guofeng Zhang and Xiaoguang Hu

Infrared simulation plays an important role in small and affordable unmanned aerial vehicles. Its key and main goal is to get the infrared image of a specific target. Infrared…

Abstract

Purpose

Infrared simulation plays an important role in small and affordable unmanned aerial vehicles. Its main goal is to obtain the infrared image of a specific target. An infrared physical model is established through theoretical research, from which the temperature field becomes available. The infrared image of a specific target can then be simulated properly, taking the atmospheric state and the effect of the infrared imaging system into account. In recent years, some research has been done in this field; among the remaining problems, large-scale infrared simulation is still a key one to be solved. In this paper, a classification method based on texture blending is proposed; it effectively solves the problem of classifying large numbers of images and increases the frame rate of large infrared scene rendering. The paper aims to discuss these issues.

Design/methodology/approach

The Mosart Atmospheric Tool (MAT) is first used, offline, to calculate sun radiance, skyshine radiance, path radiance and the temperatures of different materials. A shader in OGRE then performs the final calculation to obtain the simulation result while maintaining a high frame rate. To this end, the authors convert the data in the MAT file into textures that can be easily handled by the shader. In the shader, radiance is indexed by material information, the vertex normal, and the eye and sun directions. After adding the effect of the infrared imaging system, the final radiance distribution is obtained. Finally, the authors obtain the infrared scene by converting radiance to grayscale.
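The final radiance-to-grayscale conversion is typically a linear mapping over a chosen dynamic range. The function below is a generic sketch of that step (the name, range and sample values are assumptions, not the paper's actual mapping):

```python
def radiance_to_gray(radiance, r_min, r_max):
    """Linearly map a radiance value into an 8-bit grayscale level,
    clamping values outside the chosen dynamic range."""
    r = min(max(radiance, r_min), r_max)
    return round(255 * (r - r_min) / (r_max - r_min))

# Map a small patch of radiance values (arbitrary units) to gray levels.
patch = [0.2, 0.5, 0.9, 1.4]
print([radiance_to_gray(r, 0.0, 1.0) for r in patch])
```

In the paper's pipeline this mapping would run per fragment in the shader rather than on the CPU.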

Findings

In the fragment shader, pseudo-infrared textures are used to look up the temperature, from which the self-emitted and related radiance are calculated.

Research limitations/implications

The radiance is converted into a grayscale image while taking the effect of the infrared imaging system into account.

Originality/value

Simulation results show that a high frame rate can be reached while guaranteeing fidelity.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 9 no. 4
Type: Research Article
ISSN: 1756-378X


Article
Publication date: 3 October 2019

Lisha He, Jianjing Zheng, Yao Zheng, Jianjun Chen, Xuan Zhou and Zhoufang Xiao

The purpose of this paper is to develop parallel algorithms for moving boundary simulations by local remeshing and compose them to a fully parallel simulation cycle for the…

Abstract

Purpose

The purpose of this paper is to develop parallel algorithms for moving boundary simulations by local remeshing and compose them to a fully parallel simulation cycle for the solution of problems with engineering interests.

Design/methodology/approach

The moving boundary problems are solved by unsteady flow computations coupled with six-degrees-of-freedom equations of rigid body motion. Parallel algorithms are developed for both the computational fluid dynamics (CFD) solution and grid deformation steps. Meanwhile, a novel approach is developed for the parallelization of the local remeshing step. It takes a distributed mesh after deformation as input and marks low-quality elements to be deleted on the respective processors. After that, a parallel domain decomposition approach repartitions the hole mesh and redistributes the resulting sub-meshes onto all available processors. The individual sub-holes are then remeshed in parallel. Finally, the element distribution is rebalanced.
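The mark/repartition/remesh-in-parallel pipeline can be sketched at a toy level. Everything below (the quality metric, the stand-in remesher, the even slicing) is a simplified assumption; the paper's actual algorithm operates on real unstructured meshes with a genuine domain decomposition:

```python
from concurrent.futures import ThreadPoolExecutor

def quality(elem):
    # Placeholder metric; a real code would use element shape measures.
    return elem["q"]

def remesh(sub_hole):
    # Stand-in remesher: replace each low-quality element with
    # two well-shaped ones.
    return [{"q": 0.9} for e in sub_hole for _ in range(2)]

def parallel_local_remesh(mesh, threshold=0.3, workers=2):
    bad  = [e for e in mesh if quality(e) < threshold]   # marking step
    keep = [e for e in mesh if quality(e) >= threshold]
    # Repartition the "hole" evenly across workers, then remesh
    # each sub-hole in parallel.
    subs = [bad[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pieces = list(pool.map(remesh, subs))
    return keep + [e for sub in pieces for e in sub]

mesh = [{"q": 0.1}, {"q": 0.5}, {"q": 0.2}, {"q": 0.8}]
print(len(parallel_local_remesh(mesh)))  # 2 kept + 4 new = 6
```

The even repartitioning of the hole is the load-balancing idea the abstract highlights; the sketch omits the inter-hole boundary handling entirely.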

Findings

If only the CFD solver is parallelized while the remaining steps execute sequentially, a performance bottleneck is observed when large-scale problems are simulated. The developed parallel simulation cycle, in which all time-consuming steps have been efficiently parallelized, overcomes these bottlenecks in terms of both memory consumption and computing efficiency.

Originality/value

A fully parallel approach for moving boundary simulations by local remeshing is developed to solve large-scale problems. At the algorithm level, a novel parallel local remeshing algorithm is presented. It repartitions distributed hole elements evenly onto all available processors and always ensures the generation of a well-shaped inter-hole boundary. Therefore, the subsequent remeshing step can fix the inter-hole boundary without any communication.

Details

Engineering Computations, vol. 36 no. 8
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 1 February 1989

GORDON RICHARDS

This article examines the macroeconomic impact of a consumption‐based value‐added tax (VAT) using simulations of a large‐scale model. The VAT is imposed as a structural reform of…

Abstract

This article examines the macroeconomic impact of a consumption‐based value‐added tax (VAT) using simulations of a large‐scale model. The VAT is imposed as a structural reform of the tax code rather than as a revenue‐raising device, i.e., the revenues from the VAT are offset by compensatory reductions elsewhere. Three basic scenarios are examined, in which the VAT is offset by 1) individual rate reductions, 2) abolition of the corporate profits tax in conjunction with a small individual rate cut, and 3) an investment tax credit with the balance of the revenues offset by a personal rate cut. Additionally, this paper examines the effects of the microeconomic incidence of the VAT, i.e., whether it is fully passed through to output prices or shifted back onto profits. The finding is that the VAT in general raises the long‐term level of output, but at the cost of initial output losses, which are in evidence even when the associated rise in the price level is accommodated by a corresponding shift in monetary policy. In addition to changes in the intertemporal distribution of growth, there are significant changes in the composition of GNP, which shifts away from consumption toward business fixed investment and net exports. These changes are particularly pronounced when the VAT is fully passed through. When the tax is partially shifted back, the gains in investment and trade are less marked, business profits are reduced, and the long‐term increase in output is smaller.

Details

Studies in Economics and Finance, vol. 12 no. 2
Type: Research Article
ISSN: 1086-7376

Article
Publication date: 11 October 2011

Jiang Shu, Layne T. Watson, Naren Ramakrishnan, Frederick A. Kamke and Shubhangi Deshpande

This paper describes a practical approach to implement computational steering for problem solving environments (PSEs) by using WBCSim as an example. WBCSim is a Web based…

Abstract

Purpose

This paper describes a practical approach to implement computational steering for problem solving environments (PSEs) by using WBCSim as an example. WBCSim is a Web based simulation system designed to increase the productivity of wood scientists conducting research on wood‐based composites manufacturing processes. WBCSim serves as a prototypical example for the design, construction, and evaluation of small‐scale PSEs.

Design/methodology/approach

Various changes have been made to support computational steering across the three layers – client, server, developer – comprising the WBCSim system. A detailed description of the WBCSim system architecture is presented, along with a typical scenario of computational steering usage.

Findings

The set of changes and components is: design and add a very simple steering module at the legacy simulation code level; provide a way to monitor simulation execution and alert users when it is time to steer; and add an interface to access and visualize simulation results, and perhaps to compare intermediate results across multiple steering attempts. These simple changes and components have a relatively low cost in terms of increased software complexity.
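A steering module of the simple kind described can be sketched as a file poll between simulation iterations. The file name, JSON format and parameter names below are hypothetical; WBCSim's actual mechanism is not detailed in the abstract:

```python
import json
import os
import tempfile

def check_steering(params, path):
    """Poll for a steering file between simulation iterations; if the
    user has written new parameter values, merge them in and continue."""
    if os.path.exists(path):
        with open(path) as f:
            params.update(json.load(f))
        os.remove(path)  # consume the steering request
    return params

# One simulated steering round-trip in a temp directory: the "user"
# writes a new value, the running simulation picks it up.
tmp = os.path.join(tempfile.mkdtemp(), "steer.json")
with open(tmp, "w") as f:
    json.dump({"press_temperature": 180}, f)
params = check_steering({"press_temperature": 160, "time_step": 0.1}, tmp)
print(params["press_temperature"])  # 180
```

The appeal of this pattern for legacy code is exactly what the abstract claims: it adds steering with almost no change to the simulation's structure.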

Originality/value

The novelty lies in designing and implementing a practical approach to enable computational steering capability for PSEs embedded with legacy simulation code.

Article
Publication date: 1 January 2014

Hitesh S. Vaid, Kanwar Devesh Singh, Helen H. Lou, Daniel Chen and Peyton Richmond

This paper aims to present a novel run time combustion zoning (RTCZ) technique based on the working principle of eddy dissipation concept (EDC) for combustion modeling. This…

Abstract

Purpose

This paper aims to present a novel run time combustion zoning (RTCZ) technique based on the working principle of the eddy dissipation concept (EDC) for combustion modeling. The technique selectively chooses the cells in which the full reaction mechanism needs to be solved; the selection criterion distinguishes the combustion zone from the non-combustion zone. With this approach, a considerable reduction in computational load and improved solution stability were observed, and the number of iterations required to reach a stable solution was significantly reduced.
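The cell-selection idea can be illustrated with a toy filter. The criterion below (a temperature threshold plus a minimum fuel fraction) and all names are assumptions for illustration; the paper's actual RTCZ criterion is not given in the abstract:

```python
def select_reaction_cells(cells, t_ignition=1000.0, fuel_min=1e-4):
    """Toy run-time zoning: flag only cells likely to be burning
    (hot enough and containing fuel). The expensive full reaction
    mechanism is solved only in these cells; the rest are treated
    as the non-combustion zone."""
    return [c for c in cells
            if c["T"] > t_ignition and c["fuel"] > fuel_min]

cells = [
    {"T": 300.0,  "fuel": 0.02},   # cold: skip
    {"T": 1500.0, "fuel": 0.01},   # burning: solve full mechanism
    {"T": 1200.0, "fuel": 0.0},    # hot but no fuel: skip
]
print(len(select_reaction_cells(cells)))  # 1
```

Since real flames occupy a small fraction of the domain, even a crude zone test of this shape can cut the chemistry workload substantially, which is the effect the paper reports.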

Design/methodology/approach

Computational fluid dynamics (CFD) simulations of real-life combustion problems, such as industrial-scale flares, fuel-fired furnaces and IC engines, are difficult due to the strong interaction of chemistry with turbulence as well as the wide distribution of time and length scales. In addition, comprehensive chemical mechanisms for hydrocarbon combustion may include hundreds of species and thousands of reactions, which are known in detail for only a limited number of fuels. Even with the most advanced computers, accurate simulation of these problems is not easy. Hence, the modeler needs strategies either to simplify the chemistry or to improve the computational efficiency.

Findings

The EDC model has been widely used for treating the interaction between turbulence and chemistry in combustion problems. In an EDC model, combustion is assumed to occur in a constant-pressure reactor, with initial conditions taken as the current species concentrations and temperature in the cell. Under these assumptions, EDC solves the full or simplified reaction mechanism in all grid cells at every iteration.

Originality/value

This paper presents a novel RTCZ technique for improving computational efficiency when the EDC model is used in CFD modeling. A considerable reduction in computational time and improved solution stability can be achieved. It was also observed that the number of iterations required to reach a converged solution was significantly reduced.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 24 no. 1
Type: Research Article
ISSN: 0961-5539



Details

Integrated Land-Use and Transportation Models
Type: Book
ISBN: 978-0-080-44669-1

Article
Publication date: 1 April 1991

Malcolm Getz

Video is an increasingly important medium for the communication of ideas in academia. It has high potential as a tool in instruction and is valuable for the visual portrayal of…

Abstract

Video is an increasingly important medium for the communication of ideas in academia. It has high potential as a tool in instruction and is valuable for the visual portrayal of the outcomes of large‐scale mathematical models and simulations. A major problem for conventional video is that it is a passive medium, moving linearly at a fixed pace regardless of the viewer. This problem is overcome when video documents are made interactive so that the viewer can affect the pace and order of viewing. Video controlled by computer software permits the user to select video segments for viewing in any order and see video segments in the context of text, computer graphics, and sound. In effect, one can read a book (on screen) where the photographs are video segments, and one can branch from any page to any other as interest dictates. We might call such materials compound documents and the method of delivering them a multimedia presentation. Libraries have begun to collect interactive video materials and to sustain the equipment necessary to make the documents useful to readers. The challenge to library managers is making sensible decisions about investments in this arena.

Details

The Bottom Line, vol. 4 no. 4
Type: Research Article
ISSN: 0888-045X
