Search results

1 – 10 of over 7000
Article
Publication date: 29 March 2011

Wei Zhan

Abstract

Purpose

The purpose of this paper is to demonstrate the effectiveness of Six Sigma as an innovative tool in software design optimization. The problem of reducing simulation time for characterizing a type of DC motor is studied in this paper. The case study illustrates how Six Sigma tools such as the design of experiments (DOE) method can be used to improve a simulation process.

Design/methodology/approach

A first‐principle model for the motor is used for simulation in MATLAB®. Each parameter in the model is assumed to have a known distribution. Using the random number generator in MATLAB®, Monte Carlo analysis is conducted. To reduce simulation time, several factors in the simulation process are identified. A two‐level full factorial DOE matrix is constructed. The Monte Carlo analysis is carried out for each parameter set in the DOE matrix. Based on the simulation results and the DOE analysis, the Simulink model is identified as the main contributor to the computational time of the simulation. Several steps are taken to reduce the computational time related to the Simulink model. The improved model requires only one‐fourth of the original computational time.
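As an illustrative sketch of the two-level full factorial DOE described above, the following Python example builds the design matrix and estimates main effects; the factor names (`solver_step`, `sample_size`, `log_output`) and the toy runtime response are invented stand-ins for the paper's Simulink simulation, not its actual factors.

```python
import itertools
import random
import statistics

# Hypothetical two-level factors affecting simulation run time
# (illustrative stand-ins, not the paper's actual factors).
factors = {
    "solver_step": (-1, +1),   # small vs large integration step
    "sample_size": (-1, +1),   # few vs many Monte Carlo draws
    "log_output":  (-1, +1),   # logging off vs on
}

# Two-level full factorial design: every combination of factor levels.
doe_matrix = [dict(zip(factors, levels))
              for levels in itertools.product(*factors.values())]

def simulated_runtime(run):
    """Toy response: each high level adds or removes cost, plus noise."""
    base = 10.0
    effect = {"solver_step": -3.0, "sample_size": 4.0, "log_output": 1.0}
    noise = random.gauss(0, 0.1)
    return base + sum(effect[f] * lvl for f, lvl in run.items()) + noise

random.seed(0)
# Average response per DOE row, as one would record for analysis.
responses = [statistics.mean(simulated_runtime(run) for _ in range(20))
             for run in doe_matrix]

def main_effect(name):
    """Main effect of a factor: mean(high rows) - mean(low rows)."""
    hi = [r for run, r in zip(doe_matrix, responses) if run[name] == +1]
    lo = [r for run, r in zip(doe_matrix, responses) if run[name] == -1]
    return statistics.mean(hi) - statistics.mean(lo)

for name in factors:
    print(name, round(main_effect(name), 2))
```

The largest main effect identifies the dominant contributor to run time, which mirrors how the study singles out the Simulink model as the main cost driver.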

Findings

The paper illustrates that Six Sigma tools can be applied to algorithm and software-development processes for optimization purposes. Statistical analysis can be conducted in the simulation environment to provide valuable information.

Practical implications

As an example, the improved simulation process is used to derive statistical information related to the speed vs torque curve and the response time as part of the motor characteristics. The findings suggest that, with an optimized simulation model, a large number of statistical analyses can be conducted in the simulation environment to provide practical information. This approach can be effectively used in the early stages of product design, e.g. during the feasibility study.

Originality/value

In industry, most DOE is conducted using real test data, which is usually time-consuming and cost-inefficient. This paper combines mathematical modelling and statistical analysis to optimize a simulation model using DOE. The novel approach used in this paper can be applied to many other software optimization problems. It is expected that this approach will broaden the application of Six Sigma in industry.

Details

International Journal of Lean Six Sigma, vol. 2 no. 1
Type: Research Article
ISSN: 2040-4166

Article
Publication date: 29 October 2019

Mark E. Hopkins and Oksana L. Zavalina

Abstract

Purpose

A new approach to investigate serendipitous knowledge discovery (SKD) of health information is developed and tested to evaluate the information flow-serendipitous knowledge discovery (IF-SKD) model. The purpose of this paper is to determine the degree to which IF-SKD reflects physicians’ information behaviour in a clinical setting and explore how the information system, Spark, designed to support physicians’ SKD, meets its goals.

Design/methodology/approach

The proposed pre-experimental study design employs an adapted version of McCay-Peet's (2013) and McCay-Peet et al.'s (2015) serendipitous digital environment (SDE) questionnaire research tool to address the complexity associated with defining the way in which SKD is understood and applied in system design. To test the IF-SKD model, a new data analysis approach combining confirmatory factor analysis, data imputation and Monte Carlo simulations was developed.

Findings

The piloting of the proposed novel analysis approach demonstrated that small sample information behaviour survey data can be meaningfully examined using a confirmatory factor analysis technique.

Research limitations/implications

This method improves the reliability of measuring SKD and the generalisability of findings.

Originality/value

This paper makes an original contribution to developing and refining methods and tools of research into information-system-supported serendipitous discovery of information by health providers.

Details

Aslib Journal of Information Management, vol. 71 no. 6
Type: Research Article
ISSN: 2050-3806

Article
Publication date: 14 December 2021

Matthew Moorhead, Lynne Armitage and Martin Skitmore

Abstract

Purpose

The purpose is to examine the risk management processes and methods used in determining project feasibility in the early stages of the property development process by Australia/New Zealand property developers, including Monte Carlo simulation, Bayesian models and real option theory embedded in long-term property development and investment decision-making as instruments for providing flexibility and managing risk, uncertainty and change.

Design/methodology/approach

A questionnaire survey of 225 Australian and New Zealand trader developers, development managers, investors, valuers, fund managers and government/charity/other respondents examines Australia/New Zealand property development companies' decision-making processes in the early stages of the development process, prior to site acquisition or project commencement: the methods used and respondents' confidence in their organisations' ability to both identify and manage the risks involved.

Findings

Few of the organisations sampled use sophisticated methods; those organisations that are more likely to use such methods for conducting risk analysis include development organisations that undertake large projects, use more risk analysis methods and have more layers in their project approval process. Decision-makers have a high level of confidence in their organisation's ability to both identify and manage the risks involved, although this is not mirrored in their actual risk management processes. Although the majority of property developers have a risk management plan, less than half have implemented it, and a third of plans need improvement.

Practical implications

Property development organisations should incorporate more modern and sophisticated models of risk analysis to determine the uncertainty of, and risk in, a change of input variables in their financial viability appraisals. Practical applications include incorporating multiple techniques, such as what-if scenarios and probability analysis, into feasibility processes, and using these techniques in the pre-acquisition stages of the property development process, specifically the site acquisition process, to support decision-making. A live risk register and catalogue of risks, including the identification of project risks and plans for their mitigation, can serve as a form of risk management.

Originality/value

First study to examine the extent of the decision-making methods used by property developers in the pre-acquisition stage of the development process.

Details

Journal of Property Investment & Finance, vol. 40 no. 6
Type: Research Article
ISSN: 1463-578X

Article
Publication date: 1 June 2003

Angelo Marcello Anile, Salvatore Spinella and Salvatore Rinaudo

Abstract

Tolerance analysis is a very important tool for chip design in the microelectronics industry. The usual method for tolerance analysis is Monte Carlo simulation, which, however, is extremely CPU intensive, because in order to yield statistically significant results, it needs to generate a large sample of function values. Here we report on another method, recently introduced in several fields, called stochastic response surface method, which might be a viable alternative to Monte Carlo simulation for some classes of problems. The application considered here is on the tolerance analysis of the current of a submicrometer n+nn+ diode as a function of the channel length and the channel doping. The numerical simulator for calculating the current is based on the energy transport hydrodynamical model introduced by Stratton, which is one of the most widely used in this field.
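The Monte Carlo tolerance analysis that the abstract contrasts with the stochastic response surface method can be sketched as follows. The closed-form `device_current` function and the tolerance values are assumptions for illustration; the actual study evaluates each sample through a numerical simulator based on Stratton's energy transport hydrodynamical model, which is precisely why the Monte Carlo approach is so CPU intensive.

```python
import random
import statistics

# Illustrative Monte Carlo tolerance analysis: propagate assumed
# parameter tolerances through a stand-in device model and
# summarise the spread of the output.
def device_current(length_um, doping):
    # Hypothetical response; a real analysis would call the numerical
    # device simulator for each sampled parameter set.
    return 1e-3 * doping / length_um

random.seed(1)
samples = []
for _ in range(10_000):
    length = random.gauss(0.25, 0.01)  # channel length (um), assumed tolerance
    doping = random.gauss(1.00, 0.05)  # normalised doping, assumed tolerance
    samples.append(device_current(length, doping))

mean = statistics.mean(samples)
spread = statistics.stdev(samples)
print(f"current: mean {mean:.3e} A, std {spread:.3e} A")
```

Each of the 10,000 draws would require a full simulator run in practice; the response surface method replaces the simulator with a cheap polynomial surrogate so far fewer expensive evaluations are needed.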

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 22 no. 2
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 2 November 2010

Ettore Settanni and Jan Emblemsvåg

Abstract

Purpose

The aim of this paper is to introduce uncertainty analysis within an environmentally extended input‐output technological model of life cycle costing. The application of this approach will be illustrated with reference to the ceramic floor tiles manufacturing process.

Design/methodology/approach

Input‐output analysis (IOA) provides a computational structure which is interesting for many applications within value chain analysis and business processes analysis. A technological model, which is built bottom‐upwards from the operations, warrants that production planning and corporate environmental accounting be closely related to cost accounting. Monte Carlo methods have been employed to assess how the uncertainty may affect the expected outcomes of the model.
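A minimal sketch of Monte Carlo uncertainty analysis on an input-output model is given below, with a two-process technology matrix whose coefficients, demands and unit costs are all hypothetical (the actual model is the environmentally extended IOA of the ceramic tile process). Total requirements solve x = A x + d, i.e. x = (I - A)^-1 d.

```python
import random
import statistics

# Two-process input-output model: solve x = (I - A)^-1 d for a
# 2x2 technology matrix A, using the closed-form 2x2 inverse.
def total_requirements(a11, a12, a21, a22, d1, d2):
    det = (1 - a11) * (1 - a22) - a12 * a21
    x1 = ((1 - a22) * d1 + a12 * d2) / det
    x2 = (a21 * d1 + (1 - a11) * d2) / det
    return x1, x2

random.seed(5)
unit_costs = (2.0, 3.5)  # assumed cost per unit of each process output
totals = []
for _ in range(10_000):
    # Technology coefficients drawn from assumed uncertainty ranges.
    a11 = random.uniform(0.05, 0.15)
    a12 = random.uniform(0.10, 0.20)
    a21 = random.uniform(0.00, 0.10)
    a22 = random.uniform(0.05, 0.15)
    x1, x2 = total_requirements(a11, a12, a21, a22, d1=100.0, d2=50.0)
    totals.append(unit_costs[0] * x1 + unit_costs[1] * x2)

print(f"life cycle cost: mean {statistics.mean(totals):.1f}, "
      f"std {statistics.stdev(totals):.1f}")
```

The resulting distribution of total cost, rather than a single point estimate, is what lets management see how parameter uncertainty propagates into the expected outcomes of the model.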

Findings

It has been shown, when referring to a vertically‐integrated, multiproduct manufacturing process, how production and cost planning can be effectively and transparently integrated, also taking the product usage stage into account. The uncertainty of parameters has been explicitly addressed to reflect business reality, thus reducing risk while aiding management to take informed actions.

Research limitations/implications

The model is subject to all the assumptions characterizing IOA. Advanced issues such as non-linearity and dynamics have not been addressed. These limitations can be seen as reasonable as long as the model is mostly tailored to situations where specialized information systems and competence in complex methods may be lacking, such as in many small and medium enterprises.

Practical implications

Developing a formal structure which is common to environmental, or other physically‐driven, assessments and cost accounting helps to identify and to understand those drivers that are relevant to both of them, especially the effects different design solutions may have on both material flows and the associated life cycle costs.

Originality/value

This approach integrates physical and monetary measures, making the computational mechanisms transparent. Unlike other microeconomic IOA models, it introduces environmental extensions. Uncertainty has been addressed with a focus on the ease of implementing the model.

Details

Journal of Modelling in Management, vol. 5 no. 3
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 13 March 2017

Richard Walls, Celeste Viljoen, Hennie de Clercq and Charles Clifton

Abstract

Purpose

This paper aims to present a reliability analysis of the slab panel method (SPM) for the design of composite steel floors in severe fires. Rather than seeking to accurately define failure levels, this paper highlights areas of uncertainty in design and their effect on design results, whilst providing approximate reliability levels.

Design/methodology/approach

A Monte Carlo simulation has been conducted using the SPM design procedure to produce probability density functions of floor capacity for various floor layouts. Statistical input variables were obtained from the literature. Different configurations, geometries and fire severities are included to demonstrate how predicted floor capacities are influenced.
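The simulation strategy, sampling uncertain inputs and building a distribution of floor capacity, can be sketched as below; the capacity formula and the input distributions are invented placeholders for the SPM design procedure and the statistical variables the study takes from the literature.

```python
import random
import statistics

# Sketch: sample uncertain inputs, evaluate a stand-in capacity
# model, and characterise the resulting capacity distribution.
def capacity(yield_strength, slab_depth, load_factor):
    # Hypothetical capacity model, not the SPM equations.
    return yield_strength * slab_depth ** 2 / load_factor

random.seed(4)
caps = []
for _ in range(50_000):
    fy = random.gauss(355, 25)           # steel yield strength (MPa), assumed
    d = random.gauss(0.13, 0.005)        # effective depth (m), assumed
    lf = random.lognormvariate(0, 0.1)   # model/load uncertainty, assumed
    caps.append(capacity(fy, d, lf))

caps.sort()
mean = statistics.mean(caps)
p5 = caps[int(0.05 * len(caps))]  # 5th-percentile (characteristic) value
print(f"mean {mean:.2f}, 5th percentile {p5:.2f}")
```

Comparing a lower percentile of the simulated capacity distribution against the demand is the kind of check that underlies the approximate reliability levels the paper reports.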

Findings

From the research presented, it is clear that the predicted reliability of SPM systems varies relative to a large number of criteria, but especially parameters related to fire loading. Predicted capacities are shown to be conservative compared to results of furnace and large-scale natural fire tests, which exhibit higher fire resistance. Due to distinct fire hazard categories with associated input values, there are step discontinuities in capacity graphs.

Originality/value

Limited research has been done to date on the reliability of structures in fire as discussed in this paper. It is important to verify the reliability levels of systems to ensure that partial and global factors of safety are adequate. Monte Carlo simulations are shown to be effective for calculating the average floor capacities and associated standard deviations. The presentation of probability density functions for composite floors in severe fires is novel.

Details

Journal of Structural Fire Engineering, vol. 8 no. 1
Type: Research Article
ISSN: 2040-2317

Article
Publication date: 1 May 2001

Y. Helio Yang, Kamal Haddad and Chee W. Chow

Abstract

Reviews the literature on capacity planning at strategic, tactical and operational levels but points out that, in practice, many enterprise resource planning systems make unrealistic assumptions for production planning; and the advanced production software packages which can deal with uncertainty are both complex and expensive. Uses a theoretical company to demonstrate how a normal Excel spreadsheet can be used in conjunction with a common add‐on package (@RISK) to improve analysis and run Monte Carlo simulations as a basis for decision making. Compares the results produced with standard spreadsheet analysis and discusses the additional financial and operational insights they provide into the implications of different capacity levels under conditions of uncertainty. Warns that the validity of the simulation depends on the quality of the data and model; and that human judgement is still required to actually make a decision.
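A Python analogue of the spreadsheet-plus-@RISK exercise is sketched below, assuming a hypothetical triangular demand distribution and invented prices, costs and capacity levels; the point, as in the article, is that the simulation yields a profit distribution per capacity level rather than a single deterministic figure.

```python
import random
import statistics

# Minimal capacity-planning Monte Carlo: simulate uncertain demand
# against fixed capacity levels and estimate the profit distribution.
# All figures are hypothetical.
UNIT_PRICE, UNIT_COST, CAPACITY_COST = 50.0, 30.0, 2000.0

def profit(capacity, demand):
    sold = min(capacity, demand)  # cannot sell beyond capacity
    return sold * (UNIT_PRICE - UNIT_COST) - CAPACITY_COST

random.seed(2)

def simulate(capacity, runs=20_000):
    # Demand drawn from an assumed triangular distribution, the kind
    # of input spreadsheet risk add-ons commonly provide.
    profits = [profit(capacity, random.triangular(300, 900, 600))
               for _ in range(runs)]
    return statistics.mean(profits), statistics.stdev(profits)

for cap in (400, 600, 800):
    m, s = simulate(cap)
    print(f"capacity={cap}: mean profit {m:,.0f}, std {s:,.0f}")
```

As the article warns, the output is only as good as the demand distribution assumed, and choosing between the capacity levels still requires human judgement about the risk shown by the standard deviations.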

Details

Managerial Finance, vol. 27 no. 5
Type: Research Article
ISSN: 0307-4358

Article
Publication date: 1 October 2008

C. Correia and P. Cramer

Abstract

This study employs a sample survey to determine and analyse the corporate finance practices of South African listed companies in relation to cost of capital, capital structure and capital budgeting decisions. The results of the survey are mostly in line with financial theory and are generally consistent with a number of other studies. This study finds that companies always or almost always employ DCF methods such as NPV and IRR to evaluate projects. Companies almost always use the CAPM to determine the cost of equity, and most companies employ either a strict or flexible target debt‐equity ratio. Furthermore, most practices of the South African corporate sector are in line with practices employed by US companies. This reflects the relatively highly developed state of the South African economy, which belies its status as an emerging market. However, the survey has also brought to the fore a number of puzzling results which may indicate some gaps in the application of finance theory. There is limited use of relatively new developments such as real options, APV, EVA and Monte Carlo simulation. Furthermore, the low target debt‐equity ratios reflect the exceptionally low use of debt by South African companies.

Article
Publication date: 16 January 2017

Maryam Shahtaheri, Carl Thomas Haas and Tabassom Salimi

Abstract

Purpose

Good planning is key to good project performance. However, for the sub-class of round-the-clock projects requiring mixed-mode planning, no suitable planning approach exists. The purpose of this paper is to develop and validate such an approach.

Design/methodology/approach

Development of the approach builds on a synthesis and extensions of previous work related to projects with round-the-clock schedules, containing multiple workflows (sequential/cyclical). This approach considers the interdependence among shift-schedule, productivity, calendar duration, and risk registers. It quantifies the confidence in those strategies using a Monte Carlo and a multi-dimensional joint confidence limit (JCL) simulation platform.

Findings

… of workflows and their interdependencies. The platform results also show that the deviation between the deterministic outcomes and the simulated ones is a good indicator when dealing with projects with minimal tolerance for possible imposed mitigation strategies (e.g. round-the-clock projects).

Research limitations/implications

The validation of the approach is limited to a multi-billion dollar nuclear refurbishment case study and functional demonstration. The applicable class of projects is limited, and includes those for which failure of cost, schedule, or quality implies project failure.

Originality/value

It is anticipated that the proposed approach will assist with developing a realistic planning strategy by incorporating various factors and constraints under the impact of risks and uncertainty. This may lead to a more reliable determination of outcomes for round-the-clock projects.

Details

Engineering, Construction and Architectural Management, vol. 24 no. 1
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 24 January 2023

Jeremy R. Franks, Jessica Hepburn and Rachel S.E. Peden

Abstract

Purpose

This study aims to explore the impacts of long-term trends in the closure of abattoir businesses in the UK on the robustness of the network of abattoirs which provides private kill services.

Design/methodology/approach

This proof-of-concept study uses responses from a farmer and an abattoir survey in a spatial analysis to help visualise the private kill network. Monte Carlo simulation is used to estimate the impacts of possible further closures of private kill abattoirs on the robustness of the private kill network.

Findings

In August 2020, 18% of the area of the UK was more than 45 km from a private kill abattoir, 21% was serviced by one, 14% by two and 47% by three or more abattoirs. After randomly removing 9% and 18% of private kill abattoirs, to reflect the current trend in the closure of private kill abattoirs, the area of the UK more than 45 km from a private kill service and the areas with one and two providers increased, whilst the area with three or more providers decreased, for each scenario. This approach, therefore, can be used to quantify the network's resilience to further closures.
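The closure simulation can be sketched as follows, with synthetic site coordinates on a square plane standing in for the real UK abattoir locations from the surveys; coverage here is simplified to the fraction of grid points within 45 km of at least one remaining site.

```python
import math
import random

random.seed(3)
# Synthetic abattoir locations on a 600 x 600 km plane (assumption;
# the study uses real UK abattoir and farm survey data).
abattoirs = [(random.uniform(0, 600), random.uniform(0, 600))
             for _ in range(60)]
# Coarse grid of points standing in for the area of the country.
grid = [(x, y) for x in range(0, 600, 30) for y in range(0, 600, 30)]

def coverage(sites):
    """Fraction of grid points within 45 km of at least one site."""
    near = sum(1 for p in grid
               if any(math.dist(p, s) <= 45 for s in sites))
    return near / len(grid)

def after_closures(fraction, trials=100):
    """Mean coverage after randomly closing `fraction` of the sites."""
    keep = round(len(abattoirs) * (1 - fraction))
    return sum(coverage(random.sample(abattoirs, keep))
               for _ in range(trials)) / trials

baseline = coverage(abattoirs)
for frac in (0.09, 0.18):
    print(f"{frac:.0%} closures: mean coverage {after_closures(frac):.2f} "
          f"(baseline {baseline:.2f})")
```

Repeating the random removal many times and averaging, as `after_closures` does, is what turns a single what-if scenario into a Monte Carlo estimate of the network's robustness.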

Research limitations/implications

The additional information that would be needed to allow this approach to help policymakers identify strategically valuable abattoir businesses is discussed.

Originality/value

No other national or international study has attempted to quantify the robustness of the network of private kill abattoirs.

Details

British Food Journal, vol. 125 no. 8
Type: Research Article
ISSN: 0007-070X
