Search results

1–10 of over 13,000
Article

Guanying Huo, Xin Jiang, Zhiming Zheng and Deyi Xue

Abstract

Purpose

Metamodeling is an effective method to approximate the relations between input and output parameters when significant experimental and simulation effort is required to collect the data from which those relations are built. This paper aims to develop a new sequential sampling method for adaptive metamodeling of data with highly nonlinear relations between input and output parameters.

Design/methodology/approach

In this method, the Latin hypercube sampling method is used to sample the initial data, and the kriging method is used to construct the metamodel. The input parameter values at which the next output data are collected, to update the currently achieved metamodel, are determined based on the quality of the data in both the input and output parameter spaces. Uniformity is used to evaluate data in the input parameter space; leave-one-out errors and sensitivities are used to evaluate data in the output parameter space.
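The sample-and-refit loop described above can be sketched roughly as follows. This is not the authors' algorithm: it is a minimal numpy-only illustration that substitutes a simple Gaussian radial-basis interpolant for kriging and uses only the leave-one-out error (not uniformity or sensitivity) to pick the next sample; all function names and parameter values are invented for the sketch.

```python
import numpy as np

def latin_hypercube(n, dim, rng):
    """One sample per stratum in each dimension, strata randomly paired."""
    samples = np.empty((n, dim))
    for d in range(dim):
        # jitter within each of n equal strata, then shuffle the strata
        samples[:, d] = (rng.permutation(n) + rng.random(n)) / n
    return samples

def fit_rbf(X, y, length_scale=0.2, reg=1e-8):
    """Gaussian-kernel interpolant (a crude stand-in for kriging)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * length_scale ** 2))
    w = np.linalg.solve(K + reg * np.eye(len(X)), y)
    return lambda Xq: np.exp(
        -((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2 * length_scale ** 2)
    ) @ w

def loo_errors(X, y):
    """Leave-one-out error at each sample: refit without it, predict it."""
    errs = np.empty(len(X))
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        model = fit_rbf(X[mask], y[mask])
        errs[i] = abs(model(X[i:i + 1])[0] - y[i])
    return errs

rng = np.random.default_rng(0)
X = latin_hypercube(12, 1, rng)
y = np.sin(6 * np.pi * X[:, 0])     # a highly nonlinear test response
errs = loo_errors(X, y)
next_x = X[np.argmax(errs)]         # refine where the LOO error is largest
```

In a full sequential scheme the new point would be evaluated, appended to the data set, and the metamodel refit until a budget or accuracy target is reached.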

Findings

This new method has been compared with existing methods to demonstrate its effectiveness in approximation, and also in solving global optimization problems. Finally, an engineering case is used to verify the method further.

Originality/value

This paper provides an effective sequential sampling method for adaptive metamodeling to approximate highly nonlinear relations between input and output parameters.

Details

Engineering Computations, vol. 37 no. 3
Type: Research Article
ISSN: 0264-4401

Book part

Paolo Giordani and Robert Kohn

Abstract

Our paper discusses simulation-based Bayesian inference using information from previous draws to build the proposals. The aim is to produce samplers that are easy to implement, that explore the target distribution effectively, and that are computationally efficient and mix well.
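The idea of building proposals from previous draws can be illustrated with a Haario-style adaptive Metropolis sampler. The chapter's own samplers are not reproduced here; this is a generic sketch in which the proposal covariance is estimated from the chain's history, and the correlated-Gaussian target is invented for illustration.

```python
import numpy as np

def adaptive_metropolis(logpdf, x0, n_iter=5000, adapt_start=500, rng=None):
    """Random-walk Metropolis whose proposal covariance is estimated
    from all previous draws (Haario-style adaptive Metropolis)."""
    if rng is None:
        rng = np.random.default_rng(0)
    d = len(x0)
    chain = np.empty((n_iter, d))
    x = np.array(x0, float)
    lp = logpdf(x)
    cov = np.eye(d)
    sd = 2.38 ** 2 / d              # classic dimension-dependent scaling
    accepted = 0
    for i in range(n_iter):
        if i >= adapt_start:        # build the proposal from past draws
            cov = np.cov(chain[:i].T) + 1e-6 * np.eye(d)
        prop = rng.multivariate_normal(x, sd * cov)
        lp_prop = logpdf(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
            accepted += 1
        chain[i] = x
    return chain, accepted / n_iter

# target: a correlated 2-D Gaussian (illustrative only)
Sigma_inv = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
logpdf = lambda x: -0.5 * x @ Sigma_inv @ x
chain, acc_rate = adaptive_metropolis(logpdf, [3.0, -3.0])
```

Because the proposal shape adapts to the target's correlation structure, the sampler mixes better than a fixed spherical random walk while remaining easy to implement.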

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Article

Marc Guénot, Ingrid Lepot, Caroline Sainvitu, Jordan Goblet and Rajan Filomeno Coelho

Abstract

Purpose

The purpose of this paper is to propose a novel contribution to adaptive sampling strategies for non‐intrusive reduced order models based on Proper Orthogonal Decomposition (POD). These strategies aim at reducing the cost of optimization by improving the efficiency and accuracy of POD data‐fitting surrogate models to be used in an online surrogate‐assisted optimization framework for industrial design.

Design/methodology/approach

The effect of the strategies on the model accuracy is investigated considering the snapshot scaling, the design of experiment size and the truncation level of the POD basis and compared to a state‐of‐the‐art radial basis function network surrogate model on objectives and constraints. The selected test case is a Mach number and angle of attack domain exploration of the well‐known RAE2822 airfoil. Preliminary airfoil shape optimization results are also shown.
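POD itself reduces to a singular value decomposition of the snapshot matrix, truncated at some level. The following is a generic sketch, not the paper's implementation: a synthetic snapshot matrix stands in for the RAE2822 flow data, and the truncation level k is an illustrative choice.

```python
import numpy as np

def pod_basis(snapshots, k):
    """POD modes are the left singular vectors of the snapshot matrix."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s ** 2) / np.sum(s ** 2)  # cumulative energy fraction
    return U[:, :k], energy[k - 1]

def pod_project(snapshots, basis):
    """Reconstruct snapshots from their coefficients in the truncated basis."""
    return basis @ (basis.T @ snapshots)

# synthetic snapshot matrix: each column is one 'snapshot' of a 1-D field
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
snaps = np.column_stack([np.sin(2 * np.pi * x * f) + 0.01 * rng.standard_normal(200)
                         for f in (1.0, 1.0, 2.0, 2.0, 3.0)])
basis, captured = pod_basis(snaps, k=3)
err = np.linalg.norm(snaps - pod_project(snaps, basis)) / np.linalg.norm(snaps)
```

In a surrogate-assisted framework, new designs are then approximated by interpolating the POD coefficients over the parametric space, which is where the adaptive sampling strategies act.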

Findings

The numerical results demonstrate the potential of the proposed capture/recapture schemes for adequately filling the parametric space and maximizing the surrogates' relevance at minimum computational cost.

Originality/value

The proposed approaches help in building POD‐based surrogate models more efficiently.

Article

Manuel do Carmo, Paulo Infante and Jorge M Mendes

Abstract

Purpose

The purpose of this paper is to measure the performance of a sampling method through the average number of samples drawn in control.

Design/methodology/approach

By matching the adjusted average time to signal (AATS) of several sampling methods, using the AATS of one of them as a reference, the paper obtains the design parameters of the others. It is thus possible to obtain, in control, the average number of samples required so that the AATS of each method equals the AATS of the reference method.

Findings

The paper obtains a more robust performance measure for comparing sampling methods, because in many cases the period during which the process is in control is longer than the out-of-control period. With this performance measure the paper compares different sampling methods through the average total cost per cycle in systems with Weibull lifetime distributions: three systems with an increasing hazard rate (shape parameter β = 2, 4 and 7) and one system with a decreasing failure rate (β = 0.8).
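The Weibull shape parameter β quoted above controls whether the failure rate rises or falls over time. A small sketch of the standard Weibull hazard (not taken from the paper; the scale parameter η = 1 is an illustrative choice) makes the distinction concrete:

```python
import numpy as np

def weibull_hazard(t, beta, eta=1.0):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1):
    increasing for beta > 1, constant for beta = 1, decreasing for beta < 1."""
    return (beta / eta) * (t / eta) ** (beta - 1)

t = np.linspace(0.1, 3.0, 50)
increasing = weibull_hazard(t, beta=2.0)   # like the paper's beta = 2, 4, 7
decreasing = weibull_hazard(t, beta=0.8)   # like the paper's beta = 0.8
```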

Practical implications

In a usual production cycle, where the in-control period is much longer than the out-of-control period, and particularly when sampling and false-alarm costs are high relative to malfunction costs, this methodology allows a more careful choice of the appropriate sampling method.

Originality/value

The paper compares the statistical performance of different sampling methods using the average number of samples that need to be inspected while the process is in control. In particular, it compares the statistical and economic performance of different sampling methods in contexts not previously considered in the literature. The paper also presents an approximation for the average time between the instant a failure occurs and the first sample taken with the process out of control.

Details

International Journal of Quality & Reliability Management, vol. 31 no. 5
Type: Research Article
ISSN: 0265-671X

Article

Rui Lin, Haibo Huang and Maohai Li

Abstract

Purpose

This study aims to present an automated guided logistics robot designed mainly for pallet transportation. The logistics robot is compactly designed. It can pick up a pallet precisely and automatically transport pallets of up to 1,000 kg in the warehouse. It can move freely in all directions without turning the chassis, and it can work without any additional infrastructure, based on the laser navigation system proposed in this work.

Design/methodology/approach

The logistics robot should be able to move underneath and lift up the pallet accurately. It mainly consists of two sub-robots, like the two forks of a forklift. Each sub-robot has front and rear driving units. A new driving unit is compactly designed as a key component to ensure access to the narrow free entry of the pallet. Besides synchronous motion in all directions, the two sub-robots should also lift up and lay down the pallet synchronously. The robot uses a front laser to detect obstacles and locate itself using the on-board navigation system. A rear laser is used to recognize the pallet and guide the sub-robots to pick it up precisely, within ±5 mm/1° in the x/yaw directions. A path planning algorithm under different constraints is proposed so that the logistics robot obeys the traffic rules of pallet logistics.
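The abstract does not detail the paper's path planning algorithm or its traffic-rule constraints. As a generic illustration of grid-based warehouse path planning, a breadth-first search on a 4-connected occupancy grid looks like this (the warehouse layout is invented):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on a 4-connected grid; 1 marks a blocked cell.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk predecessors back to start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# a toy warehouse: the middle row is blocked except for one aisle cell
warehouse = [[0, 0, 0, 0],
             [1, 1, 0, 1],
             [0, 0, 0, 0]]
path = shortest_path(warehouse, (0, 0), (2, 0))
```

Traffic rules such as one-way aisles could be modeled by making the neighbor set direction-dependent, but that refinement is beyond this sketch.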

Findings

Compared with traditional forklift vehicles, the logistics robot has a more compact structure and higher expandability. It can move omnidirectionally without turning the chassis and make zero-radius turns by controlling the compact driving units synchronously. It can move collision-free into any pallet that has not been precisely placed, and it can plan paths for returning to the charging station and charge automatically, so it can work uninterrupted around the clock (7 × 24 h). The proposed path planning algorithm avoids traffic congestion and improves the passability of narrow roads, improving logistics efficiency. The logistics robot is well suited to standardized logistics factories with small working spaces.

Originality/value

This is a new innovation for pallet transportation vehicle to improve logistics automation.

Details

Assembly Automation, vol. 41 no. 1
Type: Research Article
ISSN: 0144-5154

Article

Luosong Jin, Weidong Liu, Cheng Chen, Wei Wang and Houyin Long

Abstract

Purpose

With the advent of the information age, this paper aims to apply risk analysis theories to study the risk prevention mechanism of information disclosure, thus supporting the green electricity supply.

Design/methodology/approach

This paper conducts a comprehensive evaluation and analysis of the impact of power market transactions, power market operations and effective government supervision, so as to identify the core risks of power market information disclosure. Moreover, the AHP-entropy method is adopted to weight the different information disclosure risk indicators for participants in the electricity market.
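The entropy half of the AHP-entropy weighting can be sketched as follows; the AHP half (subjective pairwise comparisons) is omitted, and the indicator matrix below is invented purely for illustration.

```python
import numpy as np

def entropy_weights(X):
    """Objective indicator weights via the entropy method: indicators whose
    values vary more across alternatives carry more information and thus
    receive more weight; a constant indicator gets (near-)zero weight."""
    P = X / X.sum(axis=0)                      # column-normalize to proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)         # entropy of each indicator
    return (1 - e) / (1 - e).sum()             # redundancy -> normalized weights

# rows = market participants, columns = disclosure-risk indicators (made up)
X = np.array([[0.9, 0.2, 0.5],
              [0.8, 0.9, 0.5],
              [0.7, 0.1, 0.5]])
w = entropy_weights(X)
```

In a combined AHP-entropy scheme these objective weights would typically be blended with subjective AHP weights before scoring the participants.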

Findings

The potential reasons for information disclosure risk in the electricity market include insufficient information disclosure, high cost of obtaining information, inaccurate information disclosure, untimely information disclosure and unfairness of information disclosure.

Originality/value

Some suggestions and implications on risk prevention mechanism of information disclosure in the electricity market are provided, so as to ensure the green electricity supply and promote the electricity market reform in China.

Details

Journal of Enterprise Information Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1741-0398

Article

J. Rodrigues Dias and Paulo Infante

Abstract

Purpose

The purpose of this paper is to investigate a new sampling methodology previously proposed for systems with a known lifetime distribution: the Predetermined Sampling Intervals (PSI) method.

Design/methodology/approach

The methodology is defined on the basis of the system's cumulative hazard rate, and is compared with other approaches, particularly those whose parameters may change in real time, taking current sample information into account.
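The abstract does not give the PSI construction explicitly. One plausible reading — predetermining the sampling instants so that each interval carries an equal increment of the cumulative hazard — can be sketched for a Weibull lifetime, where the cumulative hazard H(t) = (t/η)^β inverts in closed form; the parameter values are illustrative:

```python
import numpy as np

def psi_instants(n, beta, eta=1.0, delta=0.1):
    """Sampling instants t_i spaced so each interval carries the same
    increment delta of the Weibull cumulative hazard H(t) = (t/eta)**beta.
    Solving H(t_i) = i * delta gives t_i = eta * (i * delta)**(1/beta)."""
    i = np.arange(1, n + 1)
    return eta * (i * delta) ** (1.0 / beta)

t_incr = psi_instants(10, beta=3.0)   # increasing failure rate
intervals = np.diff(t_incr)           # sampling intervals shrink over time
```

Under this reading, a system with an increasing failure rate is sampled more and more frequently as it ages, which is consistent with the strong performance the paper reports for such systems.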

Findings

For different lifetime distributions, the results obtained for adjusted average time to signal (AATS) using a control chart for the sample mean are presented and analysed. They demonstrate the high degree of statistical performance of this sampling procedure, particularly when used in systems with an increasing failure rate distribution.

Practical implications

This PSI method is important from a quality and reliability management point of view.

Originality/value

This methodology obtains the sampling instants at the beginning of the process to be controlled. The new approach also allows for statistical comparison with other sampling schemes, which is a novel feature.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 4
Type: Research Article
ISSN: 0265-671X

Article

Aitor Bilbao‐Guillerna, Manuel de la Sen and Santiago Alonso‐Quesada

Abstract

Purpose

The purpose of this paper is to improve the transient response and the inter-sample behavior of a model reference adaptive control system by an appropriate selection of the fractional order hold (FROH) gain β and of the multirate gains used in the control reconstruction signal, through a freely chosen reference model, even when the continuous plant possesses unstable zeros.

Design/methodology/approach

A multiestimation adaptive control scheme for a linear time-invariant continuous-time plant with unknown parameters is presented. The set of discrete adaptive models is calculated from different combinations of the correcting gain β in a FROH and the set of gains used to reconstruct the plant input under multirate sampling with fast input sampling. The scheme then selects online the model with the best continuous-time tracking performance, which includes a measure of the inter-sample ripple. The estimated discrete unstable zeros are avoided through an appropriate design of the multirate gains, so that the reference model may be chosen freely, with no constraints imposed by potential unstable zeros.
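The fractional-order hold at the heart of the scheme has a standard definition that is easy to sketch: between samples, the hold extrapolates with a fraction β of the last inter-sample slope, so β = 0 recovers a zero-order hold and β = 1 a first-order hold. The multirate and multiestimation machinery of the paper is not reproduced here, and the sample sequence is invented:

```python
import numpy as np

def froh(samples, beta, T=1.0, pts=20):
    """Fractional-order hold: on the interval starting at sample k,
    u(kT + tau) = u[k] + beta * (u[k] - u[k-1]) * tau / T, 0 <= tau < T.
    beta = 0 reduces to a zero-order hold, beta = 1 to a first-order hold.
    (The interval before the second sample is skipped: no previous sample.)"""
    tau = np.linspace(0.0, T, pts, endpoint=False)
    out = []
    for k in range(1, len(samples)):
        out.append(samples[k] + beta * (samples[k] - samples[k - 1]) * tau / T)
    return np.concatenate(out)

u = np.array([0.0, 1.0, 3.0])
zoh = froh(u, beta=0.0)   # piecewise constant between samples
foh = froh(u, beta=1.0)   # piecewise linear extrapolation between samples
```

Intermediate (or negative) values of β trade off ripple against tracking, which is precisely the degree of freedom the paper exploits when selecting among its discretization models.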

Findings

The scheme is able to select online the discretization model with the best continuous‐time tracking performance without an appropriate initialization.

Research limitations/implications

The switching mechanism among the different models should maintain in operation the active discretization model at least for a minimum residence time in order to guarantee closed‐loop stability. The inter‐sample behavior is improved, but it is not always completely removed.

Practical implications

The transient response and the inter-sample behavior are improved by using this multiestimation-based discrete controller compared with a single-estimation-based one. Discrete controllers are also easier, cheaper and more reliable to implement than continuous-time controllers.

Originality/value

The main innovation of the paper compared with previous background work is that the reference output is supplied by a stable continuous transfer function. Then the scheme is able to partly regulate the continuous‐time tracking error while the controller is essentially discrete‐time and operated by a FROH in general.

Details

Kybernetes, vol. 37 no. 6
Type: Research Article
ISSN: 0368-492X

Article

Yan‐Kwang Chen, Hung‐Chang Liao and Fei‐Rung Chiu

Abstract

Purpose

The purpose of this paper is to re‐evaluate the performance of the adaptive control charts which allow some of their design parameters to change during production depending on the collected information from samples over time. Instead of employing a single performance measure (average time to signal process changes), a set of measures, associated with the inspection efficiency and effort, is taken into account in the evaluation process.

Design/methodology/approach

A multivariate analysis of variance (MANOVA) approach, along with post hoc analysis, is applied to investigate the performance of different adaptive control charts based on the different measures.

Findings

The findings indicate that different adaptive control charts may have different performance, depending on the measure regarded and the value of shift in process mean. In general, the VSSC, VSSI, and VSI control charts would be recommended for a process with a small, moderate, and large shift, respectively. The SS chart is still the best choice for a process with an extremely large shift.
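The VSI idea behind these findings — sample again sooner when the chart statistic falls in a warning region, later when it is near the centerline — can be sketched as a simple rule. The limits and interval lengths below are illustrative choices, not the paper's:

```python
def vsi_interval(z, warning=1.0, control=3.0, short=0.25, long=2.0):
    """Variable sampling interval rule for a standardized chart statistic z:
    signal immediately beyond the control limit, sample again soon from the
    warning region, and wait longer when z is near the centerline."""
    a = abs(z)
    if a >= control:
        return 0.0                  # out-of-control signal: react immediately
    return short if a >= warning else long

intervals = [vsi_interval(z) for z in (0.2, 1.5, 3.2)]
```

VSSI and VSSC charts extend the same trigger to the sample size and the control limits themselves, which is why their relative merits shift with the size of the process mean shift.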

Research limitations/implications

Up to now, the proposed procedure has been developed for the comparative analyses of adaptive charts, but it could be adjusted for other adaptive charts as well.

Originality/value

This paper provides a review of the performance of adaptive control charts from a novel perspective.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 6
Type: Research Article
ISSN: 0265-671X

Article

Naga Hanumaiah and B. Ravi

Abstract

Purpose

The purpose of this paper is to present the results of an investigation on the straightness, flatness and circularity achievable on two direct RT methods: direct metal laser sintering (DMLS) and stereolithography (SLA).

Design/methodology/approach

The steps included manufacturing samples in eight custom designs with widely used geometric features, intelligent sampling of measurement data, and estimation of the corresponding form tolerance by the least squares method (LSM). The region elimination adaptive search-based sampling method involved selecting additional sampling points around the maximum deviation, in both positive and negative directions, from the corresponding reference feature. The LSM solutions commonly used in metrology estimate the form tolerances from these points along with the initial measurement data points.
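The LSM evaluation step can be sketched for the straightness case: fit a reference line to the measured profile and take the band between the extreme residuals as the form tolerance. This is the standard metrology construction, not the paper's code, and the measured profiles below are invented:

```python
import numpy as np

def straightness_lsm(x, y):
    """Least-squares straightness: fit a reference line to the measured
    profile; the form tolerance is the band between extreme residuals."""
    A = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return resid.max() - resid.min()

x = np.linspace(0.0, 10.0, 9)
y_straight = 0.5 * x + 1.0                      # a perfectly straight edge
y_bowed = 0.5 * x + 1.0 + 0.02 * (x - 5) ** 2   # a slightly bowed edge
tol_straight = straightness_lsm(x, y_straight)
tol_bowed = straightness_lsm(x, y_bowed)
```

The region elimination search would then add measurement points near the locations of the extreme residuals and re-estimate, refining the tolerance from few samples.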

Findings

Application of the region elimination search-based sampling method enables form tolerance estimation from a limited number of sample measurements. The study of the DMLS and SLA processes suggests that the form accuracy of SLA samples is relatively poor, though their dimensional accuracy is much better than that of DMLS samples.

Research limitations/implications

This paper focused on estimating form tolerances from a limited number of measurement data using the region elimination search-based sampling technique. It was assumed that the build process parameters suggested by the material and RP system vendors give optimum results; the study does not presently cover the effect of geometry and other causes of error on form accuracy.

Practical implications

There are two major applications of this investigation and the corresponding knowledge base: evaluating the process capabilities of different rapid tooling processes for comparison and for selecting an appropriate process; and allocating tolerance based on manufacturability considerations, so that the designs are compatible with the process, leading to fewer iterations. A similar approach can be used for updating the capabilities of an improved process as well as include newer processes to develop a comprehensive database of RT process capabilities.

Originality/value

In most previous benchmarking studies, a given RT process is compared with conventional practice or with a limited number of other RT processes; capabilities in terms of dimensional accuracy, form tolerance and surface properties (surface finish, wear and scratch resistance) have not been studied very well. To the best of the authors' knowledge, no efforts have been made to estimate form tolerances of parts or tooling produced by rapid prototyping and tooling processes. Application of the region elimination search-based sampling technique enables estimation of form tolerances while saving costly experimentation, and it appears to be completely new in the rapid prototyping and tooling domain.

Details

Rapid Prototyping Journal, vol. 13 no. 3
Type: Research Article
ISSN: 1355-2546
