Search results

1 – 10 of over 2000
Article
Publication date: 2 March 2015

Oscar E Ruiz, Camilo Cortes, Diego A Acosta and Mauricio Aristizabal

Curve fitting from unordered noisy point samples is needed for surface reconstruction in many applications. In the literature, several approaches have been proposed to solve this…

Abstract

Purpose

Curve fitting from unordered noisy point samples is needed for surface reconstruction in many applications. In the literature, several approaches have been proposed to solve this problem. However, previous works lack a formal characterization of the curve fitting problem and an assessment of the effect of several parameters (i.e. scalars that remain constant in the optimization problem), such as the number of control points (m), curve degree (b), knot vector composition (U), norm degree (k) and point sample size (r), on the optimized curve reconstruction measured by a penalty function (f). The paper aims to discuss these issues.
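
For concreteness, one common form of such a penalty, given here only as an illustrative assumption since the abstract does not state the exact expression, is the k-norm of the distances between the r samples and the fitted degree-b B-spline with m control points over the knot vector U:

$$
f(P_1,\ldots,P_m) = \left( \sum_{i=1}^{r} \left\| q_i - C(u_i^{*}) \right\|^{k} \right)^{1/k},
\qquad
C(u) = \sum_{j=1}^{m} N_{j,b}(u)\, P_j,
\qquad
u_i^{*} = \arg\min_{u} \left\| q_i - C(u) \right\|,
$$

where the q_i are the noisy samples, the P_j are the control points and the N_{j,b} are the B-spline basis functions defined over U.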

Design/methodology/approach

A numerical sensitivity analysis of the effect of m, b, k and r on f is performed, together with a characterization of the fitting procedure from the mathematical viewpoint. Spectral (frequency) analysis of the derivative of the fitted curve's angle with respect to the parameter u is also explored as a means to detect spurious curls and peaks.
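
As a rough sketch of that spectral check (not the authors' implementation: the smoothing-spline call, the assumption that the samples are already ordered, the frequency cutoff and the function name are all assumptions made here), one could inspect the spectrum of the tangent-angle derivative of a fitted parametric spline:

```python
import numpy as np
from scipy.interpolate import splprep, splev

def curl_indicator(points, degree=3, n_eval=1024):
    """Fit a smoothing B-spline to 2D points (assumed already ordered) and return
    the fraction of spectral energy of d(theta)/du above an arbitrary cutoff;
    large values hint at spurious curls/peaks in the fitted curve."""
    tck, _ = splprep(points.T, k=degree, s=len(points))   # smoothing spline fit
    u = np.linspace(0.0, 1.0, n_eval)
    dx, dy = splev(u, tck, der=1)                         # x'(u), y'(u)
    theta = np.unwrap(np.arctan2(dy, dx))                 # tangent angle along u
    dtheta = np.gradient(theta, u)                        # d(theta)/du
    spectrum = np.abs(np.fft.rfft(dtheta - dtheta.mean()))
    cutoff = len(spectrum) // 8                           # illustrative low/high split
    return spectrum[cutoff:].sum() / (spectrum.sum() + 1e-12)
```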

Findings

It is more effective to find optimum values for m than for k or b in order to obtain good results, because the topological faithfulness of the resulting curve strongly depends on m. Furthermore, when an excessive number of control points is used, the resulting curve presents spurious curls and peaks. The authors were able to detect the presence of such spurious features with spectral analysis. The authors also found that the curve fitting method is robust to significant decimation of the point sample.

Research limitations/implications

The authors have addressed important voids in previous works in this field. They determined which of the curve fitting parameters m, b and k influences the results most, and how. They also characterized the curve fitting problem from the optimization perspective and devised a method to detect spurious features in the fitted curve.

Practical implications

This paper provides a methodology to select the important tuning parameters in a formal manner.

Originality/value

To the best of the authors' knowledge, no previous work has formally evaluated the sensitivity of the goodness of the curve fit with respect to the different possible tuning parameters (curve degree, number of control points, norm degree, etc.).

Details

Engineering Computations, vol. 32 no. 1
Type: Research Article
ISSN: 0264-4401

Keywords

Open Access
Article
Publication date: 19 August 2021

Linh Truong-Hong, Roderik Lindenbergh and Thu Anh Nguyen

Terrestrial laser scanning (TLS) point clouds have been widely used in deformation measurement for structures. However, reliability and accuracy of resulting deformation…

Abstract

Purpose

Terrestrial laser scanning (TLS) point clouds have been widely used in deformation measurement for structures. However, the reliability and accuracy of the resulting deformation estimation strongly depend on the quality of each step of the workflow, which have not been fully addressed. This study aims to give insight into the errors of these steps, and its results are intended as guidelines for practitioners who want to develop a new workflow, or refine an existing one, for deformation estimation based on TLS point clouds. The main contributions of the paper are thus: investigating how point cloud registration error affects the resulting deformation estimation, identifying an appropriate segmentation method for extracting the data points of a deformed surface, investigating a methodology to determine an un-deformed or reference surface for estimating deformation, and proposing a methodology to minimize the impact of outliers, noisy data and/or mixed pixels on deformation estimation.

Design/methodology/approach

In practice, the quality of the point clouds and of the surface extraction strongly impacts the resulting deformation estimation based on laser scanning point clouds, which can lead to an incorrect decision on the state of the structure when uncertainty is present. To gain more comprehensive insight into those impacts, this study addresses four issues: data errors due to registration of data from multiple scanning stations (Issue 1), methods used to extract point clouds of structure surfaces (Issue 2), selection of the reference surface Sref used to measure deformation (Issue 3), and the presence of outliers and/or mixed pixels (Issue 4). The investigation is demonstrated by estimating the deformation of a bridge abutment, a building and an oil storage tank.

Findings

The study shows that both random sample consensus (RANSAC) and region growing-based methods [cell-based/voxel-based region growing (CRG/VRG)] can extract the data points of surfaces, but RANSAC is only applicable to a primary primitive surface (e.g. a plane in this study) subjected to a small deformation (case studies 2 and 3) and cannot eliminate mixed pixels. On the other hand, CRG and VRG are suitable methods for deformed, free-form surfaces. In addition, in practice a reference surface of a structure is mostly not available. Using a plane fitted to the point cloud of the current surface would yield unrealistic and inaccurate deformation, because outlier data points and data points of damaged areas affect the accuracy of the fitted plane. This study therefore recommends using a reference surface determined from a design concept/specification. A smoothing method with a spatial interval can effectively minimize the negative impact of outliers, noisy data and/or mixed pixels on deformation estimation.
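
For illustration, a minimal RANSAC plane extraction of the kind referred to above might look as follows (the distance threshold and iteration count are assumptions; the CRG/VRG region-growing methods are not reproduced here):

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.005, n_iter=1000, seed=0):
    """Fit a plane (normal . x + d = 0) to a 3D point cloud with RANSAC and
    return (normal, d, inlier_mask). Thresholds are illustrative only."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = (None, None)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                      # skip degenerate (collinear) samples
            continue
        normal /= norm
        d = -normal @ p0
        inliers = np.abs(points @ normal + d) < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model[0], best_model[1], best_inliers
```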

Research limitations/implications

Owing to logistical difficulties, an independent measurement could not be established to assess the accuracy of the deformation estimated from TLS point clouds in the case studies of this research. However, common laser scanners using the time-of-flight or phase-shift principle provide point clouds with accuracy in the order of 1–6 mm, while the point clouds of triangulation scanners have sub-millimetre accuracy.

Practical implications

This study aims to give insight into the errors of these steps, and its results are intended as guidelines for practitioners who want to develop a new workflow, or refine an existing one, for deformation estimation based on TLS point clouds.

Social implications

The results of this study provide guidelines for practitioners who want to develop a new workflow, or refine an existing one, for deformation estimation based on TLS point clouds. A low-cost method can be applied for deformation analysis of structures.

Originality/value

Although a large number of studies have used laser scanning to measure structural deformation over the last two decades, the methods applied mainly measured the change between two states (or epochs) of the structure surface and focused on quantifying deformation from TLS point clouds. Those studies proved that a laser scanner can be an alternative instrument for acquiring spatial information for deformation monitoring. However, there are still challenges in establishing an appropriate procedure to collect high-quality point clouds and in developing methods to interpret the point clouds so as to obtain reliable and accurate deformation when uncertainty, in both data quality and reference information, is present. Therefore, this study demonstrates the impact on deformation estimation of data quality in terms of point cloud registration error, of the methods selected for extracting point clouds of surfaces, of how reference information is identified, and of the presence of outliers, noisy data and/or mixed pixels.

Details

International Journal of Building Pathology and Adaptation, vol. 40 no. 3
Type: Research Article
ISSN: 2398-4708

Keywords

Details

Functional Structure and Approximation in Econometrics
Type: Book
ISBN: 978-0-44450-861-4

Article
Publication date: 5 October 2015

Oduetse Matsebe, Khumbulani Mpofu, John Terhile Agee and Sesan Peter Ayodeji

The purpose of this paper is to present a method to extract corner features for map building purposes in man-made structured underwater environments using the sliding-window…

Abstract

Purpose

The purpose of this paper is to present a method to extract corner features for map building purposes in man-made structured underwater environments using the sliding-window technique.

Design/methodology/approach

The sliding-window technique is used to extract corner features, and a Mechanically Scanned Imaging Sonar (MSIS) is used to scan the environment for map building purposes. The tests were performed with real data collected in a swimming pool.

Findings

The change in application environment and the use of MSIS present some important differences, which must be taken into account when dealing with acoustic data. These include motion-induced distortions, continuous data flow, low scan frequency and high noise levels. Only part of the data stored in each scan sector is important for feature extraction; therefore, a segmentation process is necessary to extract the more significant information. To deal with the continuous flow of data, the data must be separated into 360° scan sectors. Although the vehicle is assumed to be static, there is a drift in both its rotational and translational motions because of currents in the water; these drifts induce distortions in the acoustic images. Therefore, the bearing information and the current vehicle pose corresponding to the selected scan-lines must be stored and used to compensate for motion-induced distortions in the acoustic images. As the data received are very noisy, an averaging filter should be applied to achieve an even distribution of data points, although this is partly achieved through the segmentation process. On the selected sliding window, all the point pairs must pass the distance and angle tests before a corner can be initialised. This minimises the mapping of outlier data points but can make the algorithm computationally expensive if the selected window is too wide. The results show the viability of this procedure with very noisy data. The technique has been applied to 50 data sets/scan sectors with a success rate of 83 per cent.
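
A schematic version of the window-level distance and angle tests described above (the thresholds, window handling and pairing rule are assumptions; the segmentation and motion-compensation steps are omitted):

```python
import numpy as np

def corner_candidate(window, d_min=0.2, d_max=2.0,
                     ang_lo=np.deg2rad(60), ang_hi=np.deg2rad(120)):
    """Return True if the 2D points in a sliding window support a corner at the
    middle point: every (left, right) pair must pass the distance test and the
    angle subtended at the candidate corner must lie in [ang_lo, ang_hi]."""
    mid = len(window) // 2
    corner = window[mid]
    for p in window[:mid]:
        for q in window[mid + 1:]:
            v1, v2 = p - corner, q - corner
            d1, d2 = np.linalg.norm(v1), np.linalg.norm(v2)
            if not (d_min <= d1 <= d_max and d_min <= d2 <= d_max):
                return False                                  # distance test failed
            ang = np.arccos(np.clip(v1 @ v2 / (d1 * d2), -1.0, 1.0))
            if not (ang_lo <= ang <= ang_hi):
                return False                                  # angle test failed
    return True
```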

Research limitations/implications

MSIS provides very noisy data, and only a limited range of sensing modalities is available for underwater applications.

Practical implications

The extraction of corner features in structured man-made underwater environments opens the door for SLAM systems to a wide range of applications and environments.

Originality/value

A method to extract corner features for map building purposes in man-made structured underwater environments is presented using the sliding-window technique.

Details

Journal of Engineering, Design and Technology, vol. 13 no. 4
Type: Research Article
ISSN: 1726-0531

Keywords

Article
Publication date: 10 July 2017

Hui Li, Yu-Hui Xu and Lean Yu

Available information for evaluating the possibility of hospitality firm failure in emerging countries is often deficient. Oversampling can compensate for this but can also yield…

Abstract

Purpose

Available information for evaluating the possibility of hospitality firm failure in emerging countries is often deficient. Oversampling can compensate for this but can also yield mixed samples, which limit prediction models’ effectiveness. This research aims to provide a feasible approach to handle possible mixed information caused by oversampling.

Design/methodology/approach

This paper uses mixed sample modelling (MSM) to evaluate the possibility of firm failure on an oversampling-enlarged set of hospitality firms. The mixed sample is filtered out with a mixed sample index through control of the noise parameter and the outlier parameter, and meta-models are used to build MSM models for hospitality firm failure prediction, whose performance is compared with that of traditional models.

Findings

The proposed models are helpful in predicting hospitality firm failure in the mixed-information situation caused by oversampling, and MSM significantly improves the performance of traditional models. Meanwhile, only part of the mixed hospitality samples matters in predicting firm failure in both rich- and poor-information situations.

Practical implications

This research helps managers, investors, employees and customers reduce their hospitality-related risk in the emerging Chinese market. The two-dimensional sample collection strategies, three-step prediction process and five MSM modelling principles are helpful for the practice of hospitality firm failure prediction.

Originality/value

This research provides a means of processing mixed hospitality firm samples through the early definition and proposal of MSM, which addresses the ranking information within samples in deficient information environments and improves forecasting accuracy of traditional models. Moreover, it provides empirical evidence for the validation of sample selection and sample pairing strategy in evaluating the possibility of hospitality firm failure.

Details

International Journal of Contemporary Hospitality Management, vol. 29 no. 7
Type: Research Article
ISSN: 0959-6119

Keywords

Article
Publication date: 1 February 2006

A. Rap, L. Elliott, D.B. Ingham, D. Lesnic and X. Wen

To develop a numerical technique for solving the inverse source problem associated with the constant coefficients convection‐diffusion equation.

Abstract

Purpose

To develop a numerical technique for solving the inverse source problem associated with the constant coefficients convection‐diffusion equation.

Design/methodology/approach

The proposed numerical technique is based on the boundary element method (BEM) combined with an iterative sequential quadratic programming (SQP) procedure. The governing convection‐diffusion equation is transformed into a Helmholtz equation and the ill‐conditioned system of equations that arises after the application of the BEM is solved using an iterative technique.
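
For reference, the transformation alluded to above can be written as follows, using generic notation that may not match the paper's sign conventions. For the steady convection-diffusion equation with constant velocity $\mathbf{v}$, diffusivity $D$ and source term $S$,

$$
D\,\nabla^{2}u - \mathbf{v}\cdot\nabla u = -S(\mathbf{x}),
$$

the substitution $u(\mathbf{x}) = \varphi(\mathbf{x})\, e^{\mathbf{v}\cdot\mathbf{x}/(2D)}$ removes the first-order term and yields a (modified) Helmholtz equation for $\varphi$,

$$
\nabla^{2}\varphi - \left( \frac{|\mathbf{v}|}{2D} \right)^{2} \varphi
= -\frac{S(\mathbf{x})}{D}\, e^{-\mathbf{v}\cdot\mathbf{x}/(2D)},
$$

to which standard BEM fundamental solutions can be applied.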

Findings

The iterative BEM presented in this paper is well‐suited for solving inverse source problems for convection‐diffusion equations with constant coefficients. Accurate and stable numerical solutions were obtained for cases when the number of sources is correctly estimated, overestimated, or underestimated, and with both exact and noisy input data.

Research limitations/implications

The proposed numerical method is limited to cases when the Péclet number is smaller than 100. Future approaches should include the application of the BEM directly to the convection‐diffusion equation.

Practical implications

The results presented in this paper can be of value in practical applications in both heat and fluid flow, as they show that the locations and strengths of an unknown number of point sources can be accurately found using boundary measurements only.

Originality/value

The BEM has not yet been employed for solving inverse source problems related to the convection‐diffusion equation. This study approaches the problem by combining the BEM formulation with an iterative technique based on the SQP method. In this way, the many advantages of the BEM can be applied to inverse source convection‐diffusion problems.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 16 no. 2
Type: Research Article
ISSN: 0961-5539

Keywords

Article
Publication date: 5 September 2023

Lucas Silva and Alfredo Gay Neto

When establishing a mathematical model to simulate solid mechanics, considering realistic geometries, special tools are needed to translate measured data, possibly with noise…

Abstract

Purpose

When establishing a mathematical model to simulate solid mechanics, considering realistic geometries, special tools are needed to translate measured data, possibly with noise, into idealized geometrical entities. As an engineering application, wheel-rail contact interactions are fundamental in the dynamic modeling of railway vehicles. Many approaches used to solve the contact problem require a continuous parametric description of the geometries involved. However, measured wheel and rail profiles are often available as sets of discrete points. A reconstruction method is needed to transform discrete data into a continuous geometry.

Design/methodology/approach

The authors present an approximation method based on optimization to solve the problem of fitting a set of points with an arc spline. It consists of an initial guess based on a curvature function estimated from the data, followed by a least-squares optimization to improve the approximation. The authors also present a segmentation scheme that allows the method to increment the number of segments of the spline, trying to keep it at a minimal value, to satisfy a given error tolerance.
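
A very small sketch of the incremental idea described above (this is not the authors' algorithm: tangent continuity between adjacent arcs and the curvature-based initial guess are ignored, and the Kasa circle fit and the splitting rule are assumptions made here):

```python
import numpy as np

def fit_circle(points):
    """Algebraic least-squares (Kasa) circle fit; returns (centre, radius)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(points))])
    b = x ** 2 + y ** 2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.array([cx, cy]), np.sqrt(c + cx ** 2 + cy ** 2)

def arcs_until_tolerance(points, tol):
    """Increase the number of segments, fitting one circular arc per segment,
    until the worst radial deviation drops below tol; returns the segment count."""
    for n_seg in range(1, len(points) // 3 + 1):
        worst = 0.0
        for chunk in np.array_split(points, n_seg):
            centre, r = fit_circle(chunk)
            worst = max(worst, np.abs(np.linalg.norm(chunk - centre, axis=1) - r).max())
        if worst <= tol:
            return n_seg
    return None   # tolerance not reached with the allowed number of segments
```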

Findings

The paper provides a better understanding of arc splines and how they can be deformed. Examples with parametric curves and slightly noisy data from realistic wheel and rail profiles show that the approach is successful.

Originality/value

The developed methods have theoretical value. Furthermore, they have practical value since the approximation approach is better suited to deal with the reconstruction of wheel/rail profiles than interpolation, which most methods use to some degree.

Details

Engineering Computations, vol. 40 no. 7/8
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 1 September 1996

R.L. Wood

Draws a comparison between the use of a genetic algorithm and the sequential function specification method for the solution of a one‐dimensional linear inverse thermal field…

Abstract

Draws a comparison between the use of a genetic algorithm and the sequential function specification method for the solution of a one‐dimensional linear inverse thermal field problem, based on the use of noisy measurements. In solving this problem, the aim is to estimate the value of a single constant convective heat transfer coefficient. Documents the findings that both approaches can provide estimates within 1 per cent of the target solution and that the sensitivity and robustness of each approach to measurement location, time step size and measurement errors are markedly different.
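
As a toy illustration of the genetic-algorithm side of that comparison (the forward model, bounds, operators and population settings are assumptions, not those of the paper), a single coefficient h could be estimated like this:

```python
import numpy as np

def estimate_h(measured, simulate, h_bounds=(1.0, 1000.0),
               pop_size=40, generations=100, mut_sigma=0.05, seed=1):
    """Toy real-coded GA estimating one convective heat transfer coefficient h by
    minimising the misfit between measured temperatures and a user-supplied
    forward model simulate(h) that returns temperatures at the sensor locations."""
    rng = np.random.default_rng(seed)
    misfit = lambda h: np.sum((simulate(h) - measured) ** 2)
    lo, hi = h_bounds
    pop = rng.uniform(lo, hi, pop_size)
    for _ in range(generations):
        order = np.argsort([misfit(h) for h in pop])
        parents = pop[order[: pop_size // 2]]                 # truncation selection
        children = rng.choice(parents, pop_size - len(parents))
        children *= 1.0 + mut_sigma * rng.standard_normal(len(children))  # mutation
        pop = np.clip(np.concatenate([parents, children]), lo, hi)
    return min(pop, key=misfit)
```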

Details

Engineering Computations, vol. 13 no. 6
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 9 March 2020

Zahra Nematzadeh, Roliana Ibrahim, Ali Selamat and Vahdat Nazerian

The purpose of this study is to enhance data quality and overall accuracy and improve certainty by reducing the negative impacts of the FCM algorithm while clustering real-world…

Abstract

Purpose

The purpose of this study is to enhance data quality and overall accuracy and to improve certainty by reducing the negative impacts of the fuzzy C-means (FCM) algorithm when clustering real-world data and by decreasing the inherent noise in data sets.

Design/methodology/approach

The present study proposed a new effective model based on fuzzy C-means (FCM), ensemble filtering (ENS) and machine learning algorithms, called an FCM-ENS model. This model is mainly composed of three parts: noise detection, noise filtering and noise classification.
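
One plausible reading of the FCM-based noise detection step is sketched below under explicit assumptions: the ensemble filtering (ENS) stage and the authors' exact noise criterion are not reproduced, class labels are assumed to be non-negative integers, and the disagreement rule is an assumption made here.

```python
import numpy as np

def fcm_memberships(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy C-means; returns the membership matrix U (n_samples x n_clusters)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(n_clusters), size=len(X))
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]           # weighted cluster centres
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)                # standard FCM update
    return U

def flag_class_noise(X, y, n_clusters):
    """Flag samples whose hard FCM assignment disagrees with the majority class
    label of their cluster -- one simple notion of 'class noise'."""
    hard = fcm_memberships(X, n_clusters).argmax(axis=1)
    flags = np.zeros(len(y), dtype=bool)
    for c in range(n_clusters):
        members = hard == c
        if members.any():
            majority = np.bincount(y[members]).argmax()
            flags |= members & (y != majority)
    return flags
```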

Findings

The performance of the proposed model was tested by conducting experiments on six data sets from the UCI repository. As shown by the obtained results, the proposed noise detection model detected class noise very effectively and enhanced performance when the identified class-noise instances were removed.

Originality/value

To the best of the authors’ knowledge, no effort has been made to improve the FCM algorithm in relation to class noise detection issues. Thus, the novelty of this research lies in combining the FCM algorithm, as a noise detection technique, with ENS to reduce the negative effect of inherent noise and to increase data quality and accuracy.

Details

Engineering Computations, vol. 37 no. 7
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 9 May 2008

Ferrante Neri, Xavier del Toro Garcia, Giuseppe L. Cascella and Nadia Salvatore

This paper aims to propose a reliable local search algorithm having steepest descent pivot rule for computationally expensive optimization problems. In particular, an application…

Abstract

Purpose

This paper aims to propose a reliable local search algorithm with a steepest descent pivot rule for computationally expensive optimization problems. In particular, an application to the design of Permanent Magnet Synchronous Motor (PMSM) drives is shown.

Design/methodology/approach

A surrogate assisted Hooke‐Jeeves algorithm (SAHJA) is proposed. The SAHJA is a local search algorithm with the structure of the Hooke‐Jeeves algorithm, which employs a local surrogate model dynamically constructed during the exploratory move at each step of the optimization process.
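
A schematic sketch of the surrogate-assisted idea follows. This is not the SAHJA of the paper: the linear surrogate, the ranking of exploratory moves and the simplified pattern-move logic are assumptions made purely for illustration.

```python
import numpy as np

def surrogate_hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-3, max_eval=200):
    """Hooke-Jeeves-style search in which a linear surrogate, fitted to all points
    already evaluated with the expensive objective f, ranks the exploratory moves
    so that f is called on only the most promising candidate of each step."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    archive = [(x.copy(), fx)]
    evals = 1
    while step > tol and evals < max_eval:
        cands = [x + step * s * e for e in np.eye(len(x)) for s in (+1.0, -1.0)]
        P = np.array([p for p, _ in archive])
        v = np.array([val for _, val in archive])
        A = np.column_stack([np.ones(len(P)), P])
        coef, *_ = np.linalg.lstsq(A, v, rcond=None)      # linear surrogate f ~ a + b.x
        best = cands[int(np.argmin([coef[0] + coef[1:] @ c for c in cands]))]
        fb = f(best)                                      # single expensive evaluation
        evals += 1
        archive.append((best.copy(), fb))
        if fb < fx:
            x, fx = best, fb                              # accept the move
        else:
            step *= shrink                                # shrink as in plain Hooke-Jeeves
    return x, fx
```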

Findings

Several numerical experiments have been designed. These experiments are carried out both on the simulation model (off‐line) and at the actual plant (on‐line). Moreover, the off‐line experiments have been considered in non‐noisy and noisy cases. The numerical results show that use of the SAHJA leads to a saving in terms of computational cost without requiring any extra hardware components.

Originality/value

The surrogate approach in the design of electric drives is novel. In addition, implementation of the proposed surrogate model allows the algorithm not only to reduce computational cost but also to filter noise caused by the sensors and measurement devices.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 27 no. 3
Type: Research Article
ISSN: 0332-1649

Keywords
