Search results

1–10 of over 19,000

Details

Handbook of Transport Modelling
Type: Book
ISBN: 978-0-08-045376-7

Book part
Publication date: 15 January 2010

Michiel C.J. Bliemer and John M. Rose

Abstract

Stated choice experiments can be used to estimate the parameters in discrete choice models by showing hypothetical choice situations to respondents. The attribute levels in each choice situation are determined by an underlying experimental design. Often, an orthogonal design is used, although recent studies have shown that better experimental designs exist, such as efficient designs, which provide more reliable parameter estimates. However, they require prior information about the parameter values, which is often not readily available. This paper proposes serial efficient designs, in which the design is updated during the survey. In contrast to adaptive conjoint, serial conjoint only changes the design across respondents, not within a respondent, thereby avoiding endogeneity bias as much as possible. After each respondent, new parameter estimates are computed and used as priors for generating a new efficient design. Results using the multinomial logit model show that such a serial design, starting from zero initial prior values, provides the same reliability of the parameter estimates as the best efficient design (based on the true parameters). Any possible bias can be avoided by using an orthogonal design for the first few respondents. Serial designs do not suffer from misspecification of the priors, as the priors are continuously updated. The disadvantage is the extra implementation cost of an automated parameter estimation and design generation procedure in the survey. Also, respondents have to be surveyed largely serially rather than all in parallel.
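The serial procedure updates the priors after each respondent and regenerates an efficient design; the core computation is the D-error of a candidate design under the current priors. Below is a minimal illustrative sketch for a two-parameter multinomial logit model — an assumption-laden toy, not the authors' implementation:

```python
import math

def mnl_probs(situation, beta):
    """Multinomial logit choice probabilities for one choice situation.

    situation: list of alternatives, each a list of attribute levels."""
    utils = [sum(b * x for b, x in zip(beta, alt)) for alt in situation]
    m = max(utils)
    exps = [math.exp(u - m) for u in utils]
    total = sum(exps)
    return [e / total for e in exps]

def d_error(design, beta):
    """D-error det(I)^(-1/K) of a design under prior parameters beta (K = 2).

    The MNL Fisher information is the sum, over choice situations, of the
    probability-weighted covariance of the alternatives' attributes."""
    k = len(beta)
    info = [[0.0] * k for _ in range(k)]
    for situation in design:
        p = mnl_probs(situation, beta)
        xbar = [sum(pj * alt[a] for pj, alt in zip(p, situation))
                for a in range(k)]
        for pj, alt in zip(p, situation):
            z = [alt[a] - xbar[a] for a in range(k)]
            for r in range(k):
                for c in range(k):
                    info[r][c] += pj * z[r] * z[c]
    det = info[0][0] * info[1][1] - info[0][1] * info[1][0]  # 2x2 only
    return det ** (-1.0 / k)

# Serial updating idea: after each respondent, re-estimate beta and pick
# the candidate design with the lowest D-error under the new priors.
design = [[[1, 0], [0, 1]], [[1, 1], [0, 0]]]  # 2 situations, 2 alternatives
baseline = d_error(design, [0.5, -0.5])        # hypothetical prior values
```

A lower D-error under the (continuously updated) priors corresponds to the more reliable parameter estimates the abstract describes.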

Details

Choice Modelling: The State-of-the-art and the State-of-practice
Type: Book
ISBN: 978-1-84950-773-8

Article
Publication date: 1 March 1999

W.J. Roux, R.J. du Preez and N. Stander

Abstract

A general approach for the construction of global approximations to structural behaviour using response surface methodology is presented. The computation and use of these approximations are demonstrated using a semi‐solid tyre example. The use of these global approximations to the responses made it viable to utilise the capabilities of non‐linear analysis software in design optimisation. The insight gained from a preliminary low‐fidelity model was utilised in a two‐stage approach to achieve the maximum benefit from a more expensive high‐fidelity model. The resulting high‐accuracy approximations greatly reduced the cost of subsequent design calculations, such as multidisciplinary and discrete optimisation.
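A response surface replaces the expensive analysis with a cheap polynomial fitted by least squares; the fitted surface is then optimised instead of the solver. A minimal one-variable sketch with invented data (the paper's models are multivariate and finite-element based):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal
    equations, solved by Gaussian elimination with partial pivoting."""
    rows = [[1.0, x, x * x] for x in xs]
    k = 3
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for j in range(col, k):
                A[r][j] -= f * A[col][j]
            c[r] -= f * c[col]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):
        b[r] = (c[r] - sum(A[r][j] * b[j] for j in range(r + 1, k))) / A[r][r]
    return b

# Responses at four (notionally expensive) design points; the fitted
# surface then stands in for the solver during optimisation.
b0, b1, b2 = fit_quadratic([-1.0, 0.0, 1.0, 2.0], [-2.0, 2.0, 4.0, 4.0])
optimum = -b1 / (2.0 * b2)  # stationary point of the fitted surface
```

In the paper's two-stage approach, a coarse surface like this from a low-fidelity model would guide where the high-fidelity runs are spent.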

Details

Engineering Computations, vol. 16 no. 2
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 15 March 2021

Daniel Ayasse and Kangwon Seo

Abstract

Purpose

Planning an accelerated life test (ALT) for a product is an important task for reliability practitioners. Traditional methods to create an optimal design of an ALT are often computationally burdensome and numerically difficult. In this paper, the authors introduce a practical method to find an optimal design of experiments for ALTs by using simulation and empirical model building.

Design/methodology/approach

Instead of developing a Fisher information matrix-based objective function and an analytic optimization, the authors suggest an “experiments for experiments” approach to create an optimal plan. The authors generate simulated data to evaluate the quantity of interest, e.g. the 10th percentile of failure time, and apply the response surface methodology (RSM) to find an optimal solution with respect to the design parameters, e.g. test conditions and test unit allocations. The authors illustrate their approach on a thermal ALT with right censoring and a lognormal failure time distribution.
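The simulation step can be mimicked with a toy: for each candidate plan (here just the sample size, standing in for test conditions and allocations), generate many lognormal datasets, estimate the 10th percentile of failure time from each, and use the spread of those estimates as the response to be minimised. A hedged sketch with complete samples, no censoring, and assumed parameter values — far simpler than the authors' setup:

```python
import math
import random
import statistics

Z10 = -1.2815515655446004  # 10th-percentile quantile of the standard normal

def t10_standard_error(n_units, mu, sigma, n_sims=2000, seed=1):
    """Monte Carlo standard error of the estimated 10th percentile of a
    lognormal failure-time distribution when n_units are tested."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_sims):
        logs = [rng.gauss(mu, sigma) for _ in range(n_units)]
        m = statistics.fmean(logs)
        s = statistics.stdev(logs)
        estimates.append(math.exp(m + Z10 * s))  # plug-in t10 estimate
    return statistics.stdev(estimates)

# Simulated responses at two design points; RSM would then be fitted to
# responses like these over the whole plan space.
se_small = t10_standard_error(20, mu=5.0, sigma=0.8)
se_large = t10_standard_error(80, mu=5.0, sigma=0.8)
```

The response surface fitted to such simulated standard errors is what the authors optimise in place of a Fisher-information objective.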

Findings

The design found by the proposed approach shows substantially improved statistical performance in terms of the standard error of the estimate of the 10th percentile of failure time. In addition, the approach provides useful insights into the sensitivity of the objective function to each decision variable.

Research limitations/implications

More comprehensive experiments might be needed to test the scalability of the method.

Practical implications

This method is practically useful for finding a reasonably efficient optimal ALT design. It can be applied to any quantity of interest and objective function, as long as those quantities can be computed from a set of simulated datasets.

Originality/value

This is a novel approach to create an optimal ALT design by using RSM and simulated data.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 1
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 4 January 2021

Ben Mansour Dia

Abstract

Purpose

The author examines the sequestration of CO2 in abandoned geological formations where leakages are permitted only up to a certain threshold, to meet international CO2 emissions standards. Technically, the author addresses a Bayesian experimental design problem to optimally mitigate uncertainties and to perform risk assessment on a CO2 sequestration model, where the parameters to be inferred are random subsurface properties while the quantity of interest must be kept within safety margins.

Design/methodology/approach

The author starts with a probabilistic formulation of learning the leakage rate and later relaxes it to a Bayesian experimental design of learning the formation's geophysical properties. The injection rate is the design parameter, and the learned properties are used to estimate the leakage rate by means of a nonlinear operator. The forward model governs a two-phase, two-component flow in a porous medium with no solubility of CO2 in water. The Laplace approximation is combined with Monte Carlo sampling to estimate the expectation of the Kullback–Leibler divergence, which serves as the objective function.
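For intuition, the expected Kullback–Leibler divergence from prior to posterior (the expected information gain) has a closed form in a linear-Gaussian toy model, against which a nested Monte Carlo estimator can be checked. This is only an illustrative stand-in for the paper's Laplace-accelerated estimator on the two-phase flow model; every value below is assumed:

```python
import math
import random

def eig_closed_form(n_meas, prior_var, noise_var):
    """Mutual information I(theta; y) for theta ~ N(0, prior_var) and
    n_meas measurements y_i = theta + N(0, noise_var)."""
    return 0.5 * math.log(1.0 + n_meas * prior_var / noise_var)

def eig_nested_mc(prior_var, noise_var, n_outer=800, n_inner=800, seed=0):
    """Nested Monte Carlo estimate of the expected KL divergence for a
    single measurement y = theta + noise."""
    rng = random.Random(seed)
    p_sd, n_sd = math.sqrt(prior_var), math.sqrt(noise_var)

    def log_lik(y, theta):
        return (-0.5 * math.log(2.0 * math.pi * noise_var)
                - (y - theta) ** 2 / (2.0 * noise_var))

    inner = [rng.gauss(0.0, p_sd) for _ in range(n_inner)]  # prior samples
    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, p_sd)
        y = theta + rng.gauss(0.0, n_sd)
        lls = [log_lik(y, t) for t in inner]
        m = max(lls)  # log-sum-exp for a stable marginal likelihood
        log_marginal = m + math.log(sum(math.exp(l - m) for l in lls) / n_inner)
        total += log_lik(y, theta) - log_marginal
    return total / n_outer
```

In the paper, the design variable (the injection rate) is chosen to maximise exactly this kind of expected divergence, with the cheap inner estimate supplied by a Laplace approximation rather than brute-force sampling.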

Findings

Different scenarios of confining CO2 while measuring the risk of harmful leakages are analyzed numerically. The efficiency of the inversion for the CO2 leakage rate improves with the injection rate, as marked gains in the accuracy of the estimated formation properties are observed. However, this study shows that those results do not imply that the learned value of the CO2 leakage rate exhibits the same behavior. This study also supports the implementation of CO2 sequestration by extending the duration allowed by the reservoir capacity, controlling the injection while the emissions remain in agreement with international standards.

Originality/value

Uncertainty quantification of the reservoir properties is addressed. The nonlinear goal-oriented inverse problem for the estimation of the leakage rate is known to be very challenging. This study presents a relaxation of the probabilistic design of learning the leakage rate to a Bayesian experimental design of learning the reservoir's geophysical properties.

Details

Engineering Computations, vol. 38 no. 3
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 27 September 2019

Nuno Costa

Abstract

Purpose

The purpose of this paper is to address misconceptions about the usefulness of the design of experiments (DoE), avoid bad practices, and foster processes’ efficiency and products’ quality in a timely and cost-effective manner with this tool.

Design/methodology/approach

The hindrances to DoE usage, as well as bad practices in using this tool, are revisited and discussed, supported by selective literature from Web of Science and Scopus indexed journals.

Findings

A set of recommendations and guidelines is provided to mitigate DoE hindrances and avoid common errors or wrong decisions at the planning, running and data analysis phases of DoE.

Research limitations/implications

Errors or wrong decisions in planning, running and analyzing data from statistically designed experiments are always possible, so the expected results from DoE usage are never 100 percent guaranteed.

Practical implications

Novice and intermediate DoE users gain another perspective for developing and improving their “test and learn” capability and for succeeding with DoE. Appropriately planning and running statistically designed experiments not only saves DoE users from incorrect decisions and the depreciation of their technical competencies, but also lets them optimize processes’ efficiency and products’ quality (reliability, durability, performance, robustness, etc.) in a structured, faster and cheaper way at the design and manufacturing stages.

Social implications

The usefulness of DoE will be increasingly recognized in industry and academia and, as a consequence, better products can be made available to consumers, business performance can improve, and the link between industry and academia can be strengthened.

Originality/value

A supplemental perspective on how to succeed with DoE and foster its usage among managers, engineers and other technical staff is presented.

Article
Publication date: 7 June 2013

Lutfiye Canan Pekel, Suna Ertunc, Zehra Zeybek and Mustafa Alpbaz

Abstract

Purpose

The purpose of this paper is to investigate the electrochemical treatment of textile dye wastewater in the presence of NaCl electrolyte by using aluminium electrodes.

Design/methodology/approach

The electrochemical treatment of textile dye wastewater was optimized using response surface methodology (RSM). An RSM‐based D‐optimal design was employed to construct statistical models relating turbidity to the effective design parameters: current density, electrolyte concentration and electrolysis time. The experimental plan consists of a three‐factor matrix (all factors numerical).
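A D-optimal design such as the one described maximises det(X'X) of the model matrix over candidate run sets. A small sketch for a first-order model in three coded factors — the candidate designs are hypothetical, not the paper's:

```python
import itertools

def det(M):
    """Determinant by Gaussian elimination with partial pivoting."""
    M = [row[:] for row in M]
    n = len(M)
    sign = 1.0
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        if abs(M[piv][c]) < 1e-12:
            return 0.0
        if piv != c:
            M[c], M[piv] = M[piv], M[c]
            sign = -sign
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n):
                M[r][j] -= f * M[c][j]
    prod = sign
    for i in range(n):
        prod *= M[i][i]
    return prod

def d_criterion(design):
    """det(X'X) for a first-order model x = [1, x1, x2, x3]."""
    X = [[1.0] + [float(v) for v in run] for run in design]
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    return det(XtX)

full = list(itertools.product([-1, 1], repeat=3))  # 2^3 factorial, 8 runs
worse = full[:-1] + [(0, 0, 0)]                    # swap a corner for a centre run
```

A D-optimal search keeps the candidate run set with the largest criterion value; here the full factorial beats the perturbed design.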

Findings

The results show that current density has a significant effect on the reduction of turbidity, while electrolysis time is the most influential factor on turbidity. No coagulant addition or further physicochemical processes were employed to enhance the electrochemical treatment performance.

Originality/value

Actual industrial textile dye wastewater from Turkey is used to determine the optimal values.

Details

Management of Environmental Quality: An International Journal, vol. 24 no. 4
Type: Research Article
ISSN: 1477-7835

Keywords

Article
Publication date: 25 September 2009

Humberto Hijar‐Rivera and Victor Garcia‐Castellanos

Abstract

Purpose

The purpose of this paper is to present computer‐generated combined arrays as efficient alternatives to Taguchi's crossed arrays to solve robust parameter problems.

Design/methodology/approach

The alternative combined array designs were developed for cases involving six to twelve variables, where CMR designs are not smaller than Taguchi's designs. The efficiency in estimating the effects of interest was calculated and compared to the efficiency of the corresponding CMR designs.

Findings

For all the cases investigated, at least one computer‐generated combined array design was found with the same size as the CMR design and with higher efficiency.

Practical implications

Robust parameter design identifies appropriate levels of the controllable variables in a process for the manufacturing of a product. The designed experiments involve the controllable variables along with the uncontrollable or noise variables, so as to design a product or process that will be robust to changes in these noise variables. Response surface methodology estimates the actual relationship between the response and the input variables with an empirical model based on the designed experiment. Once the empirical model is fitted, the surface described by the model can be used to describe the behavior of the response over the experimental region. The higher efficiency of the computer‐generated combined array designs proposed in this research produces lower variances for the parameter estimates and a lower variance of prediction for the model. As a result, the response will be described more realistically.
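The link between design efficiency and parameter variance can be shown directly: with unit error variance, the variance of the least-squares estimate of coefficient i is the i-th diagonal entry of (X'X)^-1. A sketch with a hypothetical two-variable model (one control factor, one noise factor) — these designs are illustrative, not the paper's arrays:

```python
def invert(M):
    """Matrix inverse by Gauss-Jordan elimination with partial pivoting."""
    n = len(M)
    A = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        f = A[c][c]
        A[c] = [v / f for v in A[c]]
        for r in range(n):
            if r != c:
                g = A[r][c]
                A[r] = [v - g * w for v, w in zip(A[r], A[c])]
    return [row[n:] for row in A]

def coeff_variances(design):
    """Diagonal of (X'X)^-1 for the model x = [1, c, n], unit error variance."""
    X = [[1.0, float(c), float(nv)] for c, nv in design]
    k = 3
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    inv = invert(XtX)
    return [inv[i][i] for i in range(k)]

balanced = [(1, 1), (1, -1), (-1, 1), (-1, -1)]  # crossed 2x2 array
lopsided = [(1, 1), (1, -1), (-1, 1), (1, 1)]    # same size, poorer balance
```

The balanced design attains the minimum coefficient variance (0.25 with four runs); the lopsided one, of identical size, inflates it — the same comparison the paper makes between combined and CMR arrays, at full scale.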

Originality/value

The paper shows that using a computer‐generated design to solve a robust parameter problem will result in a better approximation to the true response of the process. Consequently, optimizing the fitted model will produce settings for the parameters closer to the real optimal settings.

Details

Journal of Quality in Maintenance Engineering, vol. 15 no. 4
Type: Research Article
ISSN: 1355-2511

Keywords

Article
Publication date: 4 May 2012

Ahmed Abou‐Elyazied Abdallh, Guillaume Crevecoeur and Luc Dupré

Abstract

Purpose

The purpose of this paper is to determine a priori the optimal needle placement so as to achieve as accurate a magnetic property identification of an electromagnetic device as possible. Moreover, the effect of uncertainties in the geometrical parameter values on the optimal sensor position is studied.

Design/methodology/approach

The optimal needle placement is determined using the stochastic Cramér‐Rao lower bound method. The results obtained using the stochastic method are compared with a first‐order sensitivity analysis. The inverse problem is solved starting from real local magnetic induction measurements coupled with a 3D finite element model of an electromagnetic device (an EI core inductor).
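The Cramér–Rao reasoning behind the placement criterion can be sketched for a scalar parameter: with independent Gaussian measurement noise, the Fisher information is the summed squared measurement sensitivity divided by the noise variance, and its inverse lower-bounds the estimator variance; the best sensor position minimises that bound. The exponential sensitivity field below is an assumption for illustration, not the paper's finite element model:

```python
import math

def crlb(positions, sensitivity, noise_var):
    """Cramer-Rao lower bound on the variance of a scalar parameter
    estimated from independent Gaussian measurements at the positions."""
    fisher_info = sum(sensitivity(x) ** 2 for x in positions) / noise_var
    return 1.0 / fisher_info

def field_sensitivity(x):
    # Assumed sensitivity d(measurement)/d(parameter), decaying with distance
    return math.exp(-x)

candidates = [0.2, 0.5, 1.0]  # hypothetical needle positions
best = min(candidates, key=lambda x: crlb([x], field_sensitivity, 0.01))
```

Placing the needle where sensitivity is highest (and adding measurements) shrinks the bound, which is why measurements at the optimal positions make the inverse solution more accurate.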

Findings

The optimal experimental design for the identification of the magnetic properties of an electromagnetic device is achieved. The uncertainties in the geometrical model parameters have a strong effect on the solution recovered by the inverse problem.

Originality/value

The solution of the inverse problem is more accurate because the measurements are carried out at the optimal positions, where the effects of the uncertainties in the geometrical model parameters are limited.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 31 no. 3
Type: Research Article
ISSN: 0332-1649

Keywords

Article
Publication date: 2 August 2021

Elham Mahmoudi, Marcel Stepien and Markus König

Abstract

Purpose

A principal prerequisite for designing and constructing an underground structure is to estimate the subsurface's properties and obtain a realistic picture of the stratigraphy. Obtaining direct measurements of these values at every location of the built environment is not affordable. Therefore, any evaluation is afflicted with uncertainty, and all available measurements, observations and previous knowledge must be combined to achieve an informed estimate and quantify the uncertainties involved. This study aims to enhance geotechnical surveys by translating a spatial estimation of the subsoil into customised data structures and integrating the ground models into digital design environments.

Design/methodology/approach

A ground model consisting of voxels is developed via Revit-Dynamo to represent spatial uncertainties, employing the kriging interpolation method. The local arrangement of new surveys is evaluated and optimised.
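The kriging step can be sketched in a few lines: the weights come from solving the sample covariance system, and the kriging variance — the uncertainty measure that flags candidate locations for new boreholes — is the prior variance minus the explained part. A minimal simple-kriging sketch (known zero-mean residuals, exponential covariance; all parameter and borehole values are assumed, and the paper's voxel/octree machinery is omitted):

```python
import math

def solve(A, b):
    """Solve A w = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    A = [row[:] for row in A]
    b = b[:]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for j in range(c, n):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][j] * w[j] for j in range(r + 1, n))) / A[r][r]
    return w

def simple_kriging(samples, target, sill=1.0, corr_range=10.0):
    """Simple kriging estimate and variance at target from (point, value)
    samples, with covariance C(h) = sill * exp(-h / corr_range)."""
    def cov(p, q):
        return sill * math.exp(-math.dist(p, q) / corr_range)
    pts = [p for p, _ in samples]
    vals = [v for _, v in samples]
    A = [[cov(p, q) for q in pts] for p in pts]
    b = [cov(p, target) for p in pts]
    w = solve(A, b)  # kriging weights
    estimate = sum(wi * vi for wi, vi in zip(w, vals))
    variance = sill - sum(wi * bi for wi, bi in zip(w, b))
    return estimate, variance

boreholes = [((0.0, 0.0), 1.2), ((5.0, 0.0), 2.1)]  # assumed soil-property values
```

Evaluating the variance over a voxel grid highlights poorly constrained regions; a new borehole would be proposed where the variance (locally or globally) is largest.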

Findings

The computational performance of the visualisation model is improved by using an octree structure; the results show that the octree adapts the structure so that it can be modelled more efficiently. The proposed concept can identify risky locations in geological models for further geological investigation and reveal an optimised experimental design. The modification criteria are defined through global and local considerations.

Originality/value

It provides a transparent and repeatable approach to constructing a spatial ground model for subsequent experimental or numerical analysis. In the first attempt, the ground model was discretised by a grid of voxels. In general, the required computing time primarily depends on the size of the voxels. This issue is addressed by implementing octree voxels to reduce the computational effort, which applies especially to cases where a higher resolution is required. The investigations using a synthetic soil model showed that the developed methodology fulfils the kriging method's requirements. The effects of variogram parameters, such as the range and the covariance function, were investigated in parameter studies. Moreover, a synthetic model is used to demonstrate the optimal experimental design concept. Through the implementation, alternative locations for new boreholes are generated and their uncertainties are quantified. The impact of each new borehole on the uncertainty measures is quantified using local and global approaches. For further research into identifying the risky spots of geological models, this approach should be developed with additional criteria regarding the search neighbourhood and with consideration of barriers and trends in real cases (by employing different interpolation methodologies).

Details

Smart and Sustainable Built Environment, vol. 10 no. 3
Type: Research Article
ISSN: 2046-6099

Keywords
