Search results

1 – 10 of 27
Article
Publication date: 13 November 2018

Alireza Ahangar Asr, Asaad Faramarzi and Akbar A. Javadi


Abstract

Purpose

This paper aims to develop a unified framework for modelling the triaxial deviator stress – axial strain and volumetric strain – axial strain behaviour of granular soils, with the ability to predict entire stress paths, incrementally, point by point, in deviator stress versus axial strain and volumetric strain versus axial strain spaces, using an evolutionary-based technique trained on a comprehensive set of data measured directly from triaxial tests without pre-processing. In total, 177 triaxial test results acquired from the literature were used to develop and validate the models. The models were intended not only to capture and generalise the complicated behaviour of soils but also to remain explicitly consistent with the expert knowledge available for such behaviour.

Design/methodology/approach

Evolutionary polynomial regression (EPR) was used to develop models to predict the deviator stress – axial strain and volumetric strain – axial strain behaviour of granular soils. EPR integrates numerical and symbolic regression, using polynomial structures to take advantage of their favourable mathematical properties. EPR is a two-stage technique for constructing symbolic models: first, it implements an evolutionary search for the exponents of polynomial expressions, using a genetic algorithm (GA) engine, to find the best form of the function structure; second, it performs least-squares regression to find the adjustable parameters for each combination of inputs (the terms in the polynomial structure).
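
For readers unfamiliar with the mechanics, the following sketch illustrates the two-stage idea in miniature: candidate exponent vectors define polynomial terms, and least squares fits their coefficients. It uses synthetic data and replaces the GA engine with an exhaustive search over a tiny exponent grid, so it is only an illustration of the general approach, not the authors' implementation.

```python
# Minimal EPR-style sketch (hypothetical): stage 1 searches exponent vectors
# that define polynomial terms, stage 2 fits their coefficients by least squares.
import itertools
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.1, 1.0, size=(100, 2))          # two synthetic input variables
y = 3.0 * X[:, 0] ** 2 * X[:, 1] + 0.5 * X[:, 1]  # synthetic "measured" response

def build_terms(X, exponents):
    """Each exponent vector e defines one polynomial term: prod_j x_j ** e_j."""
    return np.column_stack([np.prod(X ** np.asarray(e), axis=1) for e in exponents])

def fit_and_score(X, y, exponents):
    A = np.column_stack([np.ones(len(y)), build_terms(X, exponents)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # stage 2: least-squares fit
    rmse = np.sqrt(np.mean((A @ coef - y) ** 2))
    return rmse, coef

# Stage 1, reduced here to exhaustive search over a tiny exponent grid
# (a GA would explore a much larger space of candidate structures).
candidate_exponents = list(itertools.product([0, 1, 2], repeat=2))
best = min(
    (fit_and_score(X, y, pair) + (pair,)
     for pair in itertools.combinations(candidate_exponents, 2)),
    key=lambda t: t[0],
)
print(f"best RMSE: {best[0]:.4f}  exponents: {best[2]}  coefficients: {np.round(best[1], 3)}")
```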

Findings

The EPR-based models were capable of generalising from the training data to predict the behaviour of granular soils under conditions not seen by EPR during the training stage. The proposed EPR models were shown to outperform artificial neural network (ANN) models and to provide predictions closer to the experimental data. Entire stress paths for the shearing behaviour of granular soils were reconstructed from the developed model predictions with very good accuracy despite error accumulation. Parametric study results showed that the developed model predictions, considering the roles of the various contributing parameters, are consistent with the physical and engineering understanding of the shearing behaviour of granular soils.

Originality/value

In this paper, an evolutionary-based data-mining method was implemented to develop a novel unified framework to model the complicated stress-strain behaviour of saturated granular soils. The proposed methodology overcomes the drawbacks of black-box artificial neural network-based models by developing accurate, explicit, structured and user-friendly polynomial models, enabling the expert user to obtain a clear understanding of the system.

Details

Engineering Computations, vol. 35 no. 8
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 22 August 2008

Mohammad Rezania, Akbar A. Javadi and Orazio Giustolisi



Abstract

Purpose

Analysis of many civil engineering phenomena is complex due to the large number of factors involved. Traditional methods usually suffer from a lack of physical understanding. Furthermore, the simplifying assumptions that are usually made in the development of the traditional methods may, in some cases, lead to very large errors. The purpose of this paper is to present a new method, based on evolutionary polynomial regression (EPR), for capturing the nonlinear interaction between the various parameters of civil engineering systems.

Design/methodology/approach

EPR is a data-driven method, based on evolutionary computing, that searches for polynomial structures representing a system. In this technique, a combination of the genetic algorithm and the least-squares method is used to find feasible structures and the appropriate constants for those structures.

Findings

The capabilities of the EPR methodology are illustrated by application to two complex practical civil engineering problems: evaluation of the uplift capacity of suction caissons and of the shear strength of reinforced concrete deep beams. The results show that the proposed EPR models provide a significant improvement over the existing models and generate a transparent and structured representation of the system. For design purposes, the EPR models presented in this study are simple to use and provide results that are more accurate than the existing methods.

Originality/value

In this paper, a new evolutionary data mining approach is presented for the analysis of complex civil engineering problems. The new approach overcomes the shortcomings of the traditional and artificial neural network‐based methods presented in the literature for the analysis of civil engineering systems. EPR provides a viable tool to find a structured representation of the system, which allows the user to gain additional information on how the system performs.

Details

Engineering Computations, vol. 25 no. 6
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 12 October 2010

Alireza Ahangar‐Asr, Asaad Faramarzi and Akbar A. Javadi



Abstract

Purpose

Analysis of stability of slopes has been the subject of many research works in the past decades. Prediction of stability of slopes is of great importance in many civil engineering structures including earth dams, retaining walls and trenches. There are several parameters that contribute to the stability of slopes. This paper aims to present a new approach, based on evolutionary polynomial regression (EPR), for analysis of stability of soil and rock slopes.

Design/methodology/approach

EPR is a data-driven method, based on evolutionary computing, that searches for polynomial structures representing a system. In this technique, a combination of the genetic algorithm and the least-squares method is used to find feasible structures and the appropriate constants for those structures.

Findings

EPR models are developed and validated using results from sets of field data on the stability status of soil and rock slopes. The developed models are used to predict the factor of safety of slopes against failure for conditions not used in the model building process. The results show that the proposed approach is very effective and robust in modelling the behaviour of slopes and provides a unified approach to analysis of slope stability problems. It is also shown that the models can predict various aspects of behaviour of slopes correctly.

Originality/value

In this paper a new evolutionary data mining approach is presented for the analysis of stability of soil and rock slopes. The new approach overcomes the shortcomings of the traditional and artificial neural network‐based methods presented in the literature for the analysis of slopes. EPR provides a viable tool to find a structured representation of the system, which allows the user to gain additional information on how the system performs.

Details

Engineering Computations, vol. 27 no. 7
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 31 May 2011

Alireza Ahangar‐Asr, Asaad Faramarzi, Akbar A. Javadi and Orazio Giustolisi


Abstract

Purpose

Using discarded tyre rubber as concrete aggregate is an effective solution to the environmental problems associated with the disposal of this waste material. However, adding rubber as aggregate to the concrete mixture changes the mechanical properties of the concrete, depending mainly on the type and amount of rubber used. An appropriate model is required to describe the behaviour of rubber concrete in engineering applications. The purpose of this paper is to show how a new evolutionary data mining technique, evolutionary polynomial regression (EPR), is used to predict the mechanical properties of rubber concrete.

Design/methodology/approach

EPR is a data-driven method, based on evolutionary computing, that searches for polynomial structures representing a system. In this technique, a combination of the genetic algorithm and the least-squares method is used to find feasible structures and the appropriate constants for those structures.

Findings

Data from 70 cases of experiments on rubber concrete are used for the development and validation of the EPR models. Three models are developed relating compressive strength, splitting tensile strength, and elastic modulus to a number of physical parameters that are known to contribute to the mechanical behaviour of rubber concrete. The most outstanding characteristic of the proposed technique is that it provides a transparent, structured, and accurate representation of the behaviour of the material in the form of a polynomial function, giving the user insight into the contributions of the different parameters involved. The proposed model shows excellent agreement with experimental results and provides an efficient method for estimating the mechanical properties of rubber concrete.

Originality/value

In this paper, a new evolutionary data mining approach is presented for the analysis of the mechanical behaviour of rubber concrete. The new approach overcomes the shortcomings of the traditional and artificial neural network-based methods presented in the literature for this class of problems. EPR provides a viable tool to find a structured representation of the system, which allows the user to gain additional information on how the system performs.

Details

Engineering Computations, vol. 28 no. 4
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 15 February 2022

Ha Duy Khanh, Soo Yong Kim and Le Quoc Linh


Abstract

Purpose

This study aims to focus on exploring the construction productivity of building projects under the influence of potential factors. The three primary purposes are (1) determining critical factors affecting construction productivity; (2) identifying causal relationship and occurrence probability of these factors to develop a Bayesian network (BN) model; and (3) validating the accuracy of predictions from the proposed BN model via a case study.

Design/methodology/approach

A conceptual framework that includes three performance stages was used. Twenty-two possible factors were screened from a comprehensive literature review and evaluated through expert opinions. Data were collected using a structured questionnaire-based survey and a case-study-based survey. The sampling methods were based on non-probability sampling.

Findings

Worker characteristic-related factors significantly affect labour productivity for a construction task. Construction productivity is dominated by the working frequency of workers (overtime), the complexity of the task, the level of technology application and accidents. Using the BN model built from the causal relationships and occurrence probabilities of these factors, labour productivity was estimated at nearly 50% of the baseline productivity. The prediction error of the BN model was 6.6%, 10.0% and 9.3% for formwork (m²/h), reinforcing steel (ton/h) and concrete (m³/h), respectively.
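
As a rough illustration of how a discrete BN combines occurrence probabilities and causal (conditional) relationships into a productivity estimate, the sketch below uses two invented factor nodes and made-up probability tables; none of the numbers come from the paper's survey data.

```python
# Hypothetical mini-BN: two discrete parent factors with assumed occurrence
# probabilities feed a conditional table for the productivity ratio
# (actual / baseline). All values are invented for illustration only.
from itertools import product

p_overtime = {"yes": 0.4, "no": 0.6}          # assumed occurrence probabilities
p_complexity = {"high": 0.3, "low": 0.7}

# Assumed conditional expectation of productivity ratio given parent states.
ratio_given = {
    ("yes", "high"): 0.45, ("yes", "low"): 0.60,
    ("no",  "high"): 0.70, ("no",  "low"): 0.90,
}

expected_ratio = sum(
    p_overtime[o] * p_complexity[c] * ratio_given[(o, c)]
    for o, c in product(p_overtime, p_complexity)
)
print(f"expected productivity ratio vs. baseline: {expected_ratio:.2f}")
```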

Research limitations/implications

The evaluation or prediction of productivity performance has become a necessary topic for research and practice.

Practical implications

Managers and practitioners in the construction sector can utilise the outcome of this study to create good productivity management policies for their prospective projects.

Originality/value

Worker-related characteristics are dominant among critical factors affecting labour productivity for a construction task; the proposed BN-based predictive model is built based on these critical factors. The BN approach is highly accurate for construction productivity prediction. The findings of this study can fill gaps in the construction management body of knowledge when modelling construction productivity under the effects of multiple factors and using a simple probabilistic graphic tool.

Details

Engineering, Construction and Architectural Management, vol. 30 no. 5
Type: Research Article
ISSN: 0969-9988


Article
Publication date: 19 June 2017

Khai Tan Huynh, Tho Thanh Quan and Thang Hoai Bui


Abstract

Purpose

Service-oriented architecture is an emerging software architecture in which web services (WSs) play a crucial role. In this architecture, WS composition and verification are required when handling complex service requirements from users. When the number of WSs becomes very large in practice, the complexity of composition and verification is correspondingly high. In this paper, the authors aim to propose a logic-based clustering approach that tackles this problem by separating the original repository of WSs into clusters. Moreover, they also propose a so-called quality-controlled clustering approach to ensure the quality of the generated clusters within a reasonable execution time.

Design/methodology/approach

The approach represents WSs as logical formulas on which the clustering task is conducted. It also combines two of the most popular clustering approaches, hierarchical agglomerative clustering (HAC) and k-means, to ensure the quality of the generated clusters.
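
The combination can be pictured as HAC proposing an initial partition whose centroids seed k-means. The sketch below shows that pattern on generic numeric feature vectors; the paper instead clusters logical formulas representing WSs, so the data and encoding here are purely illustrative.

```python
# Hypothetical "HAC seeds k-means" sketch on generic feature vectors.
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans

rng = np.random.default_rng(1)
features = rng.random((60, 8))        # stand-in for vectorised WS descriptions
k = 4

# Stage 1: hierarchical agglomerative clustering proposes an initial partition.
hac_labels = AgglomerativeClustering(n_clusters=k).fit_predict(features)
seeds = np.vstack([features[hac_labels == c].mean(axis=0) for c in range(k)])

# Stage 2: k-means refines the partition starting from the HAC centroids.
kmeans = KMeans(n_clusters=k, init=seeds, n_init=1, random_state=0).fit(features)
print("cluster sizes:", np.bincount(kmeans.labels_))
```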

Findings

This logic-based clustering approach significantly improves the performance of WS composition and verification. Furthermore, the logic-based approach helps maintain the soundness and completeness of the composition solution. Finally, the quality-controlled strategy ensures the quality of the generated clusters within a reasonable execution time.

Research limitations/implications

The work discussed in this paper is currently implemented only as a research tool, known as WSCOVER. More work is needed to make it a practical and usable system for real-life applications.

Originality/value

In this paper, the authors propose a logic-based paradigm to represent and cluster WSs. Moreover, they also propose a quality-controlled clustering approach that combines, and takes advantage of, two of the most popular clustering approaches, HAC and k-means.

Article
Publication date: 25 February 2014

Yen-Ning Su, Chia-Cheng Hsu, Hsin-Chin Chen, Kuo-Kuang Huang and Yueh-Min Huang


Abstract

Purpose

This study aims to use sensing technology to observe the learning status of learners in a teaching and learning environment. In a general instruction environment, teachers often encounter some teaching problems. These are frequently related to the fact that the teacher cannot clearly know the learning status of students, such as their degree of learning concentration and capacity to absorb knowledge. In order to deal with this situation, this study uses a learning concentration detection system (LCDS), combining sensor technology and an artificial intelligence method, to better understand the learning concentration of students in a learning environment.

Design/methodology/approach

The proposed system uses sensing technology to collect information about the learning behavior of the students, analyzes their concentration levels, and applies an artificial intelligence method to combine this information for use by the teacher. This system includes a pressure detection sensor and facial detection sensor to detect facial expressions, eye activities and body movements. The system utilizes an artificial bee colony (ABC) algorithm to optimize the system performance to help teachers immediately understand the degree of concentration and learning status of their students. Based on this, instructors can give appropriate guidance to several unfocused students at the same time.
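
For orientation, the sketch below shows a generic ABC loop with its employed-bee, onlooker and scout phases applied to a toy objective; it is not the tuned optimisation used in the LCDS, and all parameters and the objective function are placeholders.

```python
# Generic artificial bee colony (ABC) sketch minimising a toy objective.
import numpy as np

rng = np.random.default_rng(2)
dim, n_food, limit, iters = 4, 10, 20, 200
lo, hi = -5.0, 5.0

def objective(x):                       # placeholder objective (sphere function)
    return float(np.sum(x ** 2))

foods = rng.uniform(lo, hi, size=(n_food, dim))     # one food source per employed bee
costs = np.array([objective(f) for f in foods])
trials = np.zeros(n_food, dtype=int)

def try_neighbour(i):
    """Perturb one dimension of source i towards a random other source."""
    k = rng.integers(n_food - 1)
    k = k if k < i else k + 1           # any partner except i
    j = rng.integers(dim)
    cand = foods[i].copy()
    cand[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
    cand = np.clip(cand, lo, hi)
    c = objective(cand)
    if c < costs[i]:
        foods[i], costs[i], trials[i] = cand, c, 0   # greedy replacement
    else:
        trials[i] += 1

for _ in range(iters):
    for i in range(n_food):                          # employed-bee phase
        try_neighbour(i)
    fitness = 1.0 / (1.0 + costs)
    probs = fitness / fitness.sum()
    for i in rng.choice(n_food, size=n_food, p=probs):   # onlooker phase
        try_neighbour(i)
    worn = np.argmax(trials)                         # scout phase: abandon exhausted sources
    if trials[worn] > limit:
        foods[worn] = rng.uniform(lo, hi, size=dim)
        costs[worn] = objective(foods[worn])
        trials[worn] = 0

print("best cost found:", costs.min())
```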

Findings

The fitness value and computation time were used to evaluate the LCDS. Comparing the results of the proposed ABC algorithm with those from the random search method, the algorithm was found to obtain better solutions. The experimental results demonstrate that the ABC algorithm can quickly obtain near optimal solutions within a reasonable time.

Originality/value

A learning concentration detection method of integrating context-aware technologies and an ABC algorithm is presented in this paper. Using this learning concentration detection method, teachers can keep abreast of their students' learning status in a teaching environment and thus provide more appropriate instruction.

Details

Engineering Computations, vol. 31 no. 2
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 2 July 2018

Francesco Tajani, Pierluigi Morano and Klimis Ntalianis



Abstract

Purpose

Regarding the assessment of the market values of properties that compose real estate portfolios, the purpose of this paper is to propose and test an automated valuation model. In particular, the method defined provides objective, reliable and “quick” valuations of the assets during periodic reviews of property values.

Design/methodology/approach

Aiming at both predictive and interpretative purposes, the method is based on multi-objective genetic algorithms that search for the model expressions which simultaneously maximize the accuracy of the data and the parsimony of the mathematical functions. The method is applied to a sample of medium- and large-sized office properties located in the city of Milan (Italy) and sold in the period between 2004 and 2015.
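
The two competing objectives can be illustrated with a simple Pareto filter: candidate model expressions are scored by prediction error and by the number of terms (parsimony), and only the non-dominated candidates are kept. The candidates and scores below are invented, and the genetic search itself is not reproduced.

```python
# Hypothetical Pareto filter over invented candidate models: lower error and
# fewer terms are both better; dominated candidates are discarded.
candidates = {
    "price ~ area":                     (0.22, 1),
    "price ~ area + age":               (0.15, 2),
    "price ~ area + age + gdp":         (0.11, 3),
    "price ~ area + age + gdp + noise": (0.11, 4),   # extra term, no accuracy gain
}

def pareto_front(scored):
    front = {}
    for name, (err, terms) in scored.items():
        dominated = any(
            (e <= err and t <= terms) and (e < err or t < terms)
            for e, t in scored.values()
        )
        if not dominated:
            front[name] = (err, terms)
    return front

print(pareto_front(candidates))   # the over-parameterised candidate drops out
```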

Findings

The model obtained could be integrated with the canonical methodologies (market approach, income approach, cost approach) implemented in the assessment of the market values of properties, so as to provide an additional tool for verifying the results. In particular, the inclusion of economic variables in the model is consistent with the need to repeat the valuations, contextualizing them to the locational characteristics and to the current phase of the property cycle in the specific area.

Practical implications

The model can be applied by all the operators involved in the periodic reviews of the values of property portfolios: from real estate fund insiders, who need to monitor the values obtained through the canonical approaches, to public institutions, such as the revenue agencies, which need to ensure the fair payment of taxes by updating property values according to actual and current market trends.

Originality/value

The method proposed can be a valid support for all public and private entities that hold significant property assets and that, for various reasons (periodic reviews of the balance sheets, sales, enhancement, investment, etc.), require cyclically updated values of the properties. The automated valuation model developed can be used to assess “comparison” values against the estimates obtained by other assessment techniques, giving the subjects involved a further tool for monitoring the results.

Details

Journal of Property Investment & Finance, vol. 36 no. 4
Type: Research Article
ISSN: 1463-578X


Article
Publication date: 27 February 2009

Mourad Elhadef


Abstract

Purpose

The purpose of this paper is to describe a novel diagnosis approach, using neural networks (NNs), which can be used to identify faulty nodes in distributed and multiprocessor systems.

Design/methodology/approach

Based on a literature‐based study focusing on research methodology and theoretical frameworks, the conduct of an ethnographic case study is described in detail. A discussion of the reporting and analysis of the data is also included.

Findings

This work shows that NNs can be used to implement a more efficient and adaptable approach for diagnosing faulty nodes in distributed systems. Simulation results indicate that the perceptron-based diagnosis is a viable addition to present diagnosis approaches.

Research limitations/implications

This paper presents a solution for the asymmetric comparison model. A more generalized approach that can be used for other comparison or invalidation models would require a multilayer neural network.

Practical implications

The extensive simulations conducted clearly showed that the perceptron-based diagnosis algorithm correctly identified all of the millions of faulty situations tested. In addition, the perceptron-based diagnosis requires an off-line learning phase, which does not have an impact on the diagnosis latency; this means that a fault set can be identified easily and rapidly. Simulation results showed that only a few milliseconds are required to diagnose a system; hence, one can begin to talk about “real-time” diagnosis.
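
As a schematic of the off-line learning phase, the sketch below trains a single-layer perceptron on simulated comparison syndromes labelled with the status of one node; the syndrome generator is invented and far simpler than the paper's asymmetric comparison model.

```python
# Hypothetical perceptron-based diagnosis sketch: off-line training on
# simulated syndromes, then classification of the node as faulty/fault-free.
import numpy as np

rng = np.random.default_rng(3)
n_comparisons, n_samples = 12, 500

faulty = rng.integers(0, 2, size=n_samples)                 # status of the node of interest
noise = rng.random((n_samples, n_comparisons)) < 0.1        # spurious disagreements
syndromes = ((faulty[:, None] == 1) | noise).astype(float)  # 1 = comparison disagreement

w = np.zeros(n_comparisons)
b = 0.0
for _ in range(20):                                          # off-line learning phase
    for x, t in zip(syndromes, faulty):
        y = 1 if x @ w + b > 0 else 0
        w += 0.1 * (t - y) * x                               # perceptron update rule
        b += 0.1 * (t - y)

pred = (syndromes @ w + b > 0).astype(int)                   # on-line diagnosis is one dot product
print("training accuracy:", (pred == faulty).mean())
```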

Originality/value

The paper presents the first work that uses NNs to solve the system-level diagnosis problem.

Details

Education, Business and Society: Contemporary Middle Eastern Issues, vol. 2 no. 1
Type: Research Article
ISSN: 1753-7983


Open Access
Article
Publication date: 23 December 2020

Adam Redmer



Abstract

Purpose

The purpose of this paper is to develop an original model and a solution procedure for solving jointly three main strategic fleet management problems (fleet composition, replacement and make-or-buy), taking into account interdependencies between them.

Design/methodology/approach

The three main strategic fleet management problems were analyzed in detail to identify the interdependencies between them, modeled mathematically in terms of integer nonlinear programming (INLP) and solved using the evolutionary-based method of a spreadsheet-compatible solver.

Findings

No existing optimization methods combine the analyzed problems, but it is possible to model them jointly and solve them together using a spreadsheet-compatible solver, obtaining a solution/fleet management strategy that answers the following questions: Should currently exploited vehicles be kept in the fleet or removed? If kept, how often should they be replaced? If removed, when? How many prospective/new vehicles, of what types, brand new or used, should be put into the fleet, and when? A relatively large-scale instance of the problem (50 vehicles) was solved based on real-life data. The results obtained were better/cheaper by 10% than the two reference solutions – the random and do-nothing strategies.
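
To make the joint decision structure concrete, the sketch below encodes, per vehicle, a keep/remove choice and a replacement interval, and improves the strategy with a simple (1+1) evolutionary search; the cost model and every parameter are invented stand-ins rather than the paper's INLP formulation.

```python
# Hypothetical joint fleet-strategy search: keep/remove and replacement
# interval per vehicle, optimised by a simple (1+1) evolutionary loop.
import numpy as np

rng = np.random.default_rng(4)
n_vehicles, horizon = 50, 10
ages = rng.integers(1, 12, size=n_vehicles)                  # current vehicle ages (years)

def cost(keep, replace_after):
    """Invented exploitation cost: older vehicles cost more to run, replacement has a fixed price."""
    running = np.where(keep, 5.0 + 0.8 * (ages + replace_after), 0.0)
    outsourcing = np.where(keep, 0.0, 14.0)                   # make-or-buy: hire capacity instead
    replacement = np.where(keep, 30.0 / np.maximum(replace_after, 1), 0.0)
    return float(np.sum(running + outsourcing + replacement))

keep = np.ones(n_vehicles, dtype=bool)
replace_after = np.full(n_vehicles, 5)
best = cost(keep, replace_after)

for _ in range(5000):                                         # (1+1) evolutionary search
    k, r = keep.copy(), replace_after.copy()
    i = rng.integers(n_vehicles)
    if rng.random() < 0.5:
        k[i] = ~k[i]                                          # flip keep/remove decision
    else:
        r[i] = int(np.clip(r[i] + rng.integers(-2, 3), 1, horizon))
    c = cost(k, r)
    if c < best:
        keep, replace_after, best = k, r, c

print(f"kept vehicles: {keep.sum()}, best total cost: {best:.1f}")
```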

Originality/value

A methodology is proposed for developing an optimal fleet management strategy by solving the three main strategic fleet management problems jointly, allowing fleet exploitation costs to be reduced by adjusting the fleet size, the types of vehicles exploited and their exploitation periods.

Details

Journal of Quality in Maintenance Engineering, vol. 28 no. 2
Type: Research Article
ISSN: 1355-2511

