Search results

1 – 10 of over 7000
Article
Publication date: 12 April 2013

Louise Manning

Abstract

Purpose

The purpose of this paper is to determine the mechanisms for effective verification of a food safety plan and reducing verification risk.

Design/methodology/approach

The research involved analysis of both qualitative and quantitative methods of verification.

Findings

Effective development of the food safety management system (FSMS) is underpinned by appropriate determination of food safety hazards, the acceptable level of risk to the consumer and the measures for their control. Product and process validation, and revalidation where required, are key to consistently producing safe food and to developing appropriate real-time monitoring activities. However, it is the development of effective verification processes for the FSMS and the prerequisite programme (PRP), and the reduction of verification risk, that ensures food safety for consumers.
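
For illustration only, the sketch below shows one simple, records-review style verification check that might be run over CCP monitoring data; the critical limit, the monitoring frequency and the record format are assumptions for this sketch and are not taken from the paper.

```python
# Hypothetical records-review check over CCP monitoring data.
# The 5 degC critical limit, the 2-hour monitoring frequency and the record
# format are illustrative assumptions, not taken from the paper.
from datetime import datetime, timedelta

CRITICAL_LIMIT_C = 5.0          # assumed chilled-storage critical limit
MAX_GAP = timedelta(hours=2)    # assumed required monitoring frequency

def verify_ccp_log(records):
    """records: list of (timestamp, temperature_C), oldest first."""
    deviations = [(ts, t) for ts, t in records if t > CRITICAL_LIMIT_C]
    gaps = [(a[0], b[0]) for a, b in zip(records, records[1:])
            if b[0] - a[0] > MAX_GAP]
    return {"deviations": deviations,
            "missed_checks": gaps,
            "verified": not deviations and not gaps}

log = [(datetime(2013, 4, 1, 8, 0), 3.8),
       (datetime(2013, 4, 1, 10, 0), 4.2),
       (datetime(2013, 4, 1, 14, 0), 5.6)]   # late check, above the limit
print(verify_ccp_log(log))
```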

Originality/value

This research is of academic value and of value to those working in the food supply chain.

Details

British Food Journal, vol. 115 no. 4
Type: Research Article
ISSN: 0007-070X

Article
Publication date: 17 June 2022

Andrew Pidgeon and Nashwan Dawood

Abstract

Purpose

The purpose of this research is to develop through a two-stage verification and validation process a novel implementation framework for collaborative BIM, utilising experts from academia and industry as well as a real-world case study project.

Design/methodology/approach

The aim of this research was to build upon the authors' previous findings by developing an implementation framework that addresses the inefficiencies of current collaborative BIM practices through a more objective, quantified approach to establishing transparency about what is required through the implementation of BIM. A mixed-methods design combining qualitative and quantitative data collection was used, structured as a two-stage approach that employed the Delphi model for verification and validation. This structure was developed to test the hypothesis that the framework is novel and beneficial, involving 15 core BIM experts from academia, construction and design with c. 22 years' average experience. Validation was undertaken on a complex, high-value, real-world building structures project in central London, involving 8 core project BIM experts. The research utilised a developed solution that guided the practitioners, as a project team, step by step through determining the underpinning elements that support enhanced information requirements and through executing the framework's prioritisation measurement tools. Data ascertained at the case study workshop prioritised the areas of importance that are core to delivering these enhanced information requirements at project delivery level; in the order of prioritisation determined by the project team, these were (1) constraints (39.17%), (2) stakeholder requirements (35.78%), (3) coordination (existing asset) (15.86%), (4) exchange requirements (5.38%) and (5) level of information need (3.81%). Furthermore, risk mitigations for the top three priorities focused on early stakeholder engagement, appropriate survey data collection, the quality of outputs, and applying toolsets and processes purposefully against the defined high-level requirements.
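
As a purely illustrative aside, the sketch below shows how raw workshop importance scores might be normalised into prioritisation percentages of the kind reported above; the raw scores and the scoring scheme are invented and do not reproduce the authors' measurement tool.

```python
# Minimal sketch (assumed scoring scheme, not the authors' actual tool):
# normalise raw workshop importance scores into prioritisation percentages.
raw_scores = {                     # hypothetical raw scores from a project team
    "constraints": 41.2,
    "stakeholder requirements": 37.6,
    "coordination (existing asset)": 16.7,
    "exchange requirements": 5.7,
    "level of information need": 4.0,
}

total = sum(raw_scores.values())
priorities = {k: round(100 * v / total, 2) for k, v in raw_scores.items()}

for area, pct in sorted(priorities.items(), key=lambda kv: -kv[1]):
    print(f"{area}: {pct}%")
```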

Findings

Findings show that the framework and the developed solution translate the process methodology of the framework schema into a useable and beneficial tool that provides both qualitative and quantitative inputs and outputs. Furthermore, a collective agreement on the objectives, risk mitigations and assignment of tasks in order to achieve outcomes is presented, with evidence on numerical weightings and goal achievement.

Research limitations/implications

Due to the impacts of COVID-19 on physical engagement, both the verification (electronic survey questionnaire) and the validation (case study project) were undertaken remotely, using available technologies and web interfaces.

Practical implications

The case study workshop was limited to one building structures project in central London, with a design-and-build cost of c. £70m, with which the project team (the participants) were actively engaged.

Social implications

The social impact of this research lies in the review of existing systems, methods and approaches from the wider perspective of theoretical and applied environments, which led to the development of a novel approach and framework guided by an interactive and usable solution.

Originality/value

As shown in the core findings, experts across academia and industry (design and construction) confirmed that the framework methodology and application were 100% novel and added a benefit to the existing collaborative BIM approach. The value added is that objectifying, weighting/prioritising and creating a discussion supported by qualitative and quantitative reasoning increases the focus on what collaborative BIM is intended to achieve, and thus the likelihood of successful implementation.

Details

Smart and Sustainable Built Environment, vol. 12 no. 4
Type: Research Article
ISSN: 2046-6099

Open Access
Article
Publication date: 7 May 2019

Yanan Wang, Jianqiang Li, Sun Hongbo, Yuan Li, Faheem Akhtar and Azhar Imran

Abstract

Purpose

Simulation is a well-known technique for using computers to imitate the operations of various kinds of real-world facilities or processes. The facility or process of interest is usually called a system, and to study it scientifically we often have to make a set of assumptions about how it works. These assumptions, which usually take the form of mathematical or logical relationships, constitute a model that is used to gain some understanding of how the corresponding system behaves. The quality of that understanding essentially depends on the credibility of the assumptions and the model; establishing this credibility is the concern of VV&A (verification, validation and accreditation). The main purpose of this paper is to present an in-depth theoretical review and analysis of the application of VV&A in large-scale simulations.
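
As a toy illustration of the credibility question raised above, the sketch below compares the output of a small queueing simulation with its known analytic result; the M/M/1 model and its parameters are assumptions chosen for brevity and are far simpler than the large-scale simulations the paper addresses.

```python
# Toy verification check (assumed M/M/1 queue, not the paper's crowd network
# simulations): compare the simulated mean waiting time in queue with the
# known analytic value lambda / (mu * (mu - lambda)).
import random

def simulated_mean_wait(lam, mu, n_customers, seed=1):
    """Mean waiting time in queue via the Lindley recursion."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        total += wait
        service = rng.expovariate(mu)
        interarrival = rng.expovariate(lam)
        wait = max(0.0, wait + service - interarrival)
    return total / n_customers

lam, mu = 0.8, 1.0
sim = simulated_mean_wait(lam, mu, 200_000)
exact = lam / (mu * (mu - lam))
print(f"simulated = {sim:.3f}, analytic = {exact:.3f}, "
      f"relative error = {abs(sim - exact) / exact:.1%}")
```

A large gap between the simulated and analytic values would signal a credibility problem of exactly the kind VV&A activities are meant to catch.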

Design/methodology/approach

After summarizing related VV&A research, the paper discusses standards, frameworks, techniques, methods and tools in light of the characteristics of large-scale simulations (such as crowd network simulations).

Findings

The contributions of this paper will be useful for both academics and practitioners for formulating VV&A in large-scale simulations (such as crowd network simulations).

Originality/value

This paper will help researchers by providing recommendations to support the formulation of VV&A in large-scale simulations (such as crowd network simulations).

Details

International Journal of Crowd Science, vol. 3 no. 1
Type: Research Article
ISSN: 2398-7294

Article
Publication date: 17 February 2015

Leslaw Kwasniewski and Cezary Bojanowski

Abstract

This paper discusses the concepts of verification and validation in computational mechanics with special attention to structural fire engineering, by referring to recently published papers and guides on V&V that define some best practices and show directions for future development. The perspective of an analyst, who develops computational models, makes runs, and analyses numerical results mostly using software based on the finite element method, is presented. The considerations emphasize practical problems encountered in the V&V process, potential sources of errors and uncertainties, the importance of sensitivity study, new ideas regarding the relationship between validation and verification, differences between calibration and validation, new aspects of the validation metrics, and guides for designing validation experiments. The discussion is illustrated by computational problem examples.
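
By way of illustration only, the sketch below computes a basic relative-error validation metric between a simulated and a measured response history; the data are invented, and the metric is a simplified stand-in for the validation metrics discussed in the V&V guides the paper reviews.

```python
# Illustrative only (invented data): a basic relative-error validation metric
# between simulated and measured temperature histories sampled at the same
# time instants. Real V&V guides define richer metrics; this shows the kind
# of quantitative model-to-experiment comparison the paper discusses.
import math

measured  = [20.0, 180.0, 420.0, 610.0, 750.0]   # degC, hypothetical test data
simulated = [20.0, 165.0, 440.0, 590.0, 735.0]   # degC, hypothetical model output

def relative_rms_error(sim, exp):
    num = sum((s - e) ** 2 for s, e in zip(sim, exp))
    den = sum(e ** 2 for e in exp)
    return math.sqrt(num / den)

print(f"relative RMS error: {relative_rms_error(simulated, measured):.1%}")
```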

Details

Journal of Structural Fire Engineering, vol. 6 no. 1
Type: Research Article
ISSN: 2040-2317

Article
Publication date: 31 August 2020

Sohei Ito, Dominik Vymětal and Roman Šperka

Abstract

Purpose

The need for assuring correctness of business processes in enterprises is widely recognised in terms of business process re-engineering and improvement. Formal methods are a promising approach to this issue. The challenge in business process verification is to create a formal model that is well-aligned to the reality. Process mining is a well-known technique to discover a model of a process based on facts. However, no studies exist that apply it to formal verification. This study aims to propose a methodology for formal business process verification by means of process mining, and attempts to clarify the challenges and necessary technologies in this approach using a case study.

Design/methodology/approach

A trading company simulation model is used as a case study. A workflow model is discovered from an event log produced by a simulation tool and manually complemented into a formal model. Correctness requirements of the model, of both domain-dependent and domain-independent types, are checked by means of model checking.
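
A minimal sketch of the idea follows, under stated assumptions: a directly-follows relation stands in for the discovered workflow model, and a simple trace check stands in for model checking a domain-dependent requirement. The event log, activity names and the checked property are invented for illustration.

```python
# Hypothetical sketch: a directly-follows "discovery" step plus a simple
# domain-dependent check, standing in for the paper's combination of process
# mining and model checking.
from collections import defaultdict

event_log = {  # case id -> ordered trace of activities (invented)
    "c1": ["receive order", "check stock", "ship goods", "send invoice"],
    "c2": ["receive order", "check stock", "reject order"],
    "c3": ["receive order", "check stock", "ship goods", "send invoice"],
}

# Discovery: directly-follows relation (much simpler than a workflow net).
dfg = defaultdict(set)
for trace in event_log.values():
    for a, b in zip(trace, trace[1:]):
        dfg[a].add(b)

# Domain-dependent property: goods are never shipped after an order is rejected.
violations = [case for case, trace in event_log.items()
              if "reject order" in trace
              and "ship goods" in trace[trace.index("reject order"):]]

print(dict(dfg))
print("property holds" if not violations else f"violated in cases {violations}")
```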

Findings

For business process verification with both domain-dependent and domain-independent correctness requirements, more advanced process mining techniques that discover data-related aspects of processes are desirable. The choice of a formal modelling language is also crucial. It depends on the correctness requirements and the characteristics of the business process.

Originality/value

Formal verification of business processes that starts from the creation of their formal model is quite new. Furthermore, domain-dependent and domain-independent correctness properties are considered within the same framework, which is also new. This study revealed the technologies necessary for this process-mining-based approach.

Details

Journal of Modelling in Management, vol. 16 no. 2
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 1 October 2018

Peter Madzík and Vera Pelantová

Abstract

Purpose

Product verification and validation are integral to quality management. Product verification means verifying the conformity between a product’s actual and planned characteristics whereas validation means determining whether and to what extent it satisfies customers’ requirements. One of the key forms of product validation is testing with a group of customers. The purpose of this paper is to introduce a graphical method of product validation based on the Kano model.

Design/methodology/approach

The approach proposes a method for categorising requirements from a Kano questionnaire and then applies this method to the validation of a product – a website. The proposed method is based on three steps: graphical determination of requirements in a Kano model; determination of the degree of requirement fulfilment; and prioritisation of corrective measures and improvements.
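
The categorisation step behind a Kano questionnaire can be sketched as follows, using the standard Kano evaluation table; the requirement and the responses are hypothetical, and the graphical determination step of the proposed method is not reproduced here.

```python
# Sketch of Kano categorisation from questionnaire answers. The evaluation
# table is the standard Kano table; the responses below are invented.
ANSWERS = ["like", "must-be", "neutral", "live with", "dislike"]
# Rows: functional (feature present) answer; columns: dysfunctional answer.
KANO_TABLE = {
    "like":      ["Q", "A", "A", "A", "O"],
    "must-be":   ["R", "I", "I", "I", "M"],
    "neutral":   ["R", "I", "I", "I", "M"],
    "live with": ["R", "I", "I", "I", "M"],
    "dislike":   ["R", "R", "R", "R", "Q"],
}
# A = attractive, O = one-dimensional, M = must-be, I = indifferent,
# R = reverse, Q = questionable.

def categorise(functional, dysfunctional):
    return KANO_TABLE[functional][ANSWERS.index(dysfunctional)]

# Hypothetical responses for one website requirement ("fast page load").
responses = [("like", "dislike"), ("like", "dislike"), ("like", "dislike"),
             ("must-be", "dislike"), ("like", "live with")]
counts = {}
for f, d in responses:
    cat = categorise(f, d)
    counts[cat] = counts.get(cat, 0) + 1
print(counts, "->", max(counts, key=counts.get))
```

Assigning each requirement the most frequent category in this way is one common convention; the paper's graphical method then builds on such categorised requirements.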

Findings

The study opens space for discussion of the potential for improving a product and of methods for identifying critical faults in products. The proposed method also permits an assessment of the potential effectiveness of an improvement, because it can quantify the effect on the consumer that a given amount of effort will produce. A case study demonstrated that the resulting priority of corrective measures and improvements was affected not only by the level of fulfilment of the requirements but also by their type, the most critical being non-fulfilment of must-be requirements.

Research limitations/implications

The requirement curves are based on a verbal assessment of satisfaction in two states – if the requirement were fulfilled and if it were not fulfilled. The values of the start and end points may not be precise and could be affected by the natural character of subjective variables.

Practical implications

The proposed method is particularly suited to the initial testing of a product that is intended to lead to measures to eliminate customer dissatisfaction or increase their satisfaction – that is, to improve the product. The method also permits an assessment of the extent to which customers feel that their expectations have been satisfied and the effect that will be felt if the organisation decides to increase fulfilment.

Originality/value

The Kano model has not yet been applied to product validation, although it contains all the information necessary for this task. Knowing how satisfied customers are is an important part of product validation. At the same time, knowing a mechanism for “creating” this satisfaction is also very valuable information.

Details

International Journal of Quality & Reliability Management, vol. 35 no. 9
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 27 June 2008

Sidhartha R. Das, Ulku Yaylacicegi and Cem Canel

Abstract

Purpose

ISO 90003 provides guidelines for applying ISO 9001 to software development processes. The purpose of this paper is to present how the software development process in large, virtual teams (LVTs) can be managed, so that they are in compliance with ISO 9001.

Design/methodology/approach

The firm's actions are described in a case example format that illustrates how fit between theory and practice is achieved and forms a precursor to the derivation of appropriate research arguments.

Findings

The steps presented show the application of ISO 90003 guidelines to software development planning activities in LVTs, to meet the requirements of ISO 9000 certification.

Research limitations/implications

The scope of this paper is limited to the application of Section 7.3 (Design and development) of ISO 90003:2004 to the software development process. The paper presents the discussion in a “generalized” fashion so that the steps described can be implemented by any software development company.

Practical implications

The implications for managers in this study lie in the presentation of a set of steps to manage software development processes in LVTs, so that they are in compliance with ISO 9001.

Originality/value

There is a dearth of studies on the application of process‐based approaches in virtual organizations. This paper addresses this gap in the literature by examining how software development processes in virtual organizations (specifically, LVTs) may be formally managed, so that they are in compliance with ISO 9001.

Details

Industrial Management & Data Systems, vol. 108 no. 6
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 7 April 2023

Ibrahim Ayaz, Ufuk Sakarya and Ibrahim Hokelek

Abstract

Purpose

The purpose of this paper is to present a verification methodology for custom micro-coded components designed for avionics projects. All electronic hardware developed for an aircraft must be designed in compliance with DO-254 processes. Requirements are the key elements in aviation, and every requirement must be covered by the design before it can be considered complete. Verification of custom micro-coded components against requirements should therefore be comprehensively addressed. Manual testing is less preferable because it is prone to human error, so automated simulation is the most widely used verification method today.

Design/methodology/approach

The industry has developed a common, standardised methodology for building automated testbenches, known as the Universal Verification Methodology (UVM). In this paper, a verification study of an ARINC-429 data bus digital design is presented to describe the DO-254 verification process using UVM.
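
UVM testbenches are written in SystemVerilog, so the sketch below is not UVM itself; it is only a Python illustration of the kind of reference check an ARINC-429 scoreboard might perform (odd parity over the 32-bit word and extraction of the label field). The bit layout follows the common ARINC-429 convention and should be treated as an assumption here.

```python
# Not UVM (which is SystemVerilog); a minimal Python sketch of an ARINC-429
# reference check. Assumed layout: bit 1 = LSB, bits 1-8 label, bit 32 parity.

def odd_parity_ok(word: int) -> bool:
    """ARINC-429 uses odd parity: the 32-bit word, including the parity bit,
    must contain an odd number of 1 bits."""
    return bin(word & 0xFFFFFFFF).count("1") % 2 == 1

def label_octal(word: int) -> str:
    """Label occupies bits 1-8 (the least significant byte here),
    conventionally written in octal."""
    return format(word & 0xFF, "03o")

rx_word = 0x000000C8            # hypothetical received word
print("parity ok:", odd_parity_ok(rx_word))
print("label (octal):", label_octal(rx_word))
```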

Findings

The results are supported by functional coverage and code coverage in addition to assertions, and the design is observed to work correctly.

Originality/value

To the best of the authors’ knowledge, this is the first study comprehensively describing the DO-254 verification process and demonstrating it by the UVM application of ARINC-429 on programmable logic devices.

Details

Aircraft Engineering and Aerospace Technology, vol. 95 no. 7
Type: Research Article
ISSN: 1748-8842

Article
Publication date: 12 July 2011

Omar AL‐Tabbaa and Rifat Rustom

Abstract

Purpose

This paper seeks to propose a general framework to be used in developing multi‐use simulation modules for estimating project durations at the planning phase.

Design/methodology/approach

The research method incorporates two main stages: first, conceptualising the general framework; and second, implementing the framework in modelling and experimenting with simulation modules, which involves data collection, statistical analysis, template building in the ARENA software, and module verification and validation.
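
The paper builds its modules in ARENA; purely as a language-neutral illustration of the underlying idea, the sketch below estimates a project duration by Monte Carlo sampling of activity durations. The activities, distributions and parameters are hypothetical.

```python
# Illustrative Monte Carlo duration estimate (hypothetical activities and
# triangular distributions); a stand-in for the kind of simulation module
# the framework builds in ARENA.
import random, statistics

ACTIVITIES = [                      # (name, min, mode, max) in days, assumed
    ("excavation", 8, 10, 15),
    ("pipe laying", 12, 14, 20),
    ("backfilling", 4, 5, 8),
]

def simulate_duration(rng):
    # Serial activities for simplicity; sum one sampled duration per activity.
    return sum(rng.triangular(lo, hi, mode) for _, lo, mode, hi in ACTIVITIES)

rng = random.Random(42)
runs = [simulate_duration(rng) for _ in range(10_000)]
print(f"mean = {statistics.mean(runs):.1f} days, "
      f"p90 = {sorted(runs)[int(0.9 * len(runs))]:.1f} days")
```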

Findings

The framework was found to be effective in providing an approach for building multi‐use simulation modules. The validation and verification processes of the developed simulation module reflect the soundness of the proposed framework.

Practical implications

Useful insights are presented in this research regarding the building of multi-use simulation modules in infrastructure construction projects. In addition, the paper demonstrates examples of how a simulation interaction interface can contribute to the efficiency of using the simulation technique.

Originality/value

Given the lack of general approaches for building multi‐use simulation modules, this research suggests a simplified approach for developing multi‐use modules. Both academics and practitioners can benefit from this new approach by understanding the mechanism behind the multi‐use model concept as explained in this paper.

Article
Publication date: 22 May 2008

Zbigniew Buliński and Andrzej J. Nowak

Abstract

Purpose

The purpose of this paper is to present a numerical and mathematical model of a moulding process of a dry electrical transformer. Moreover, the calculated results are reported and compared with experimental measurements.

Design/methodology/approach

An experimental rig for carrying out and monitoring a moulding process was designed and built. Two experiments were performed: an isothermal experiment using an analogue liquid, and a non-isothermal experiment using an epoxy resin. A numerical mesh of the rig geometry was built using the commercial code Gambit. All necessary physical properties of the fluids used in the experiments, including viscosity, surface tension and contact angle, were measured.

Findings

The Euler approach for modelling multiphase flow with a free surface is addressed in the presented work. Comparison of the computational results with measurements on the designed experimental rig revealed good agreement; the comparison was carried out using characteristic free-surface features captured with a digital camera and, for the non-isothermal case, temperature measurements. The Richardson extrapolation method was successfully applied to estimate the numerical discretisation error, confirming that a grid-independent solution was obtained.
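
For readers unfamiliar with the technique, the sketch below applies the standard Richardson extrapolation formulas for a constant grid refinement ratio; the three solution values and the ratio are invented, not taken from the paper.

```python
# Standard Richardson extrapolation for discretisation-error estimation,
# with invented solution values: f1 (fine), f2 (medium), f3 (coarse) obtained
# on grids related by a constant refinement ratio r.
import math

f1, f2, f3 = 412.3, 415.1, 423.9    # hypothetical fine/medium/coarse results
r = 2.0                              # assumed grid refinement ratio

p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order of accuracy
f_exact = f1 + (f1 - f2) / (r**p - 1)               # Richardson-extrapolated value
rel_error_fine = abs(f1 - f_exact) / abs(f_exact)

print(f"observed order p = {p:.2f}")
print(f"extrapolated value = {f_exact:.1f}")
print(f"estimated relative error on the fine grid = {rel_error_fine:.2%}")
```

A small estimated error on the fine grid, together with an observed order close to the scheme's formal order, is the usual evidence for a grid-independent solution.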

Originality/value

This paper is useful for researchers and industrialists involved in the modelling of moulding processes, giving guidance on the available mathematical models appropriate for this kind of problem. Moreover, it provides valuable information as to how to perform validation and verification procedures for such real‐life processes.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 18 no. 3/4
Type: Research Article
ISSN: 0961-5539
