Search results

1 – 10 of over 12,000
Article
Publication date: 20 April 2015

Renato de Siqueira Motta, Silvana Maria Bastos Afonso, Paulo Roberto Lyra and Ramiro Brito Willmersdorf

Optimization under a deterministic approach generally leads to a final design in which the performance may degrade significantly and/or constraints can be violated because of…


Abstract

Purpose

Optimization under a deterministic approach generally leads to a final design in which the performance may degrade significantly and/or constraints can be violated because of perturbations arising from uncertainties. The purpose of this paper is to devise a better strategy, one that obtains an optimum design which is less sensitive to changes in uncertain parameters. The process of finding such optima is referred to as robust design optimization (RDO), in which improvement of the performance and reduction of its variability are sought while maintaining the feasibility of the solution. This overall process is very time consuming, requiring a robust tool to conduct the optimum search efficiently.

Design/methodology/approach

In this paper, the authors propose an integrated tool to efficiently obtain RDO solutions. The tool combines suitable multiobjective optimization (MO) techniques (Normal-Boundary Intersection, Normalized Normal-Constraint, the weighted sum method and min-max methods), a surrogate model based on a reduced-order method for cheap function evaluations, and an adequate procedure for uncertainty quantification (the Probabilistic Collocation Method).
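
As an illustration of the kind of scalarized robust objective such a tool minimizes, the sketch below approximates the mean and standard deviation of a hypothetical performance function over an uncertain parameter using Gauss-Hermite quadrature (standing in for a collocation-type uncertainty quantification step) and applies a weighted-sum scalarization. The performance function, weights and optimizer settings are assumptions for illustration, not the paper's models.

```python
# Minimal sketch of weighted-sum robust design optimization (RDO).
# The performance function below is hypothetical; the paper's surrogate,
# Probabilistic Collocation and NBI/NNC machinery are not reproduced here.
import numpy as np
from scipy.optimize import minimize

def performance(x, theta):
    """Hypothetical performance measure with uncertain parameter theta."""
    return (x - 2.0) ** 2 + 0.5 * theta * x

# Gauss-Hermite (probabilists') nodes/weights to approximate mean and
# variance over theta ~ N(0, 1), playing the role of collocation points.
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
weights = weights / weights.sum()

def robust_objective(x, w=0.5):
    samples = np.array([performance(x[0], t) for t in nodes])
    mean = np.dot(weights, samples)
    std = np.sqrt(np.dot(weights, (samples - mean) ** 2))
    # Weighted-sum scalarization of the two RDO objectives.
    return w * mean + (1.0 - w) * std

# Sweeping the weight w traces an approximation of the Pareto front.
result = minimize(robust_objective, x0=[0.0], args=(0.7,))
print(result.x, result.fun)
```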

Findings

To illustrate the application of the proposed tool, 2D structural problems are considered. The integrated tool proves to be very effective, reducing the computational time by up to five orders of magnitude compared to the solutions obtained via classical standard approaches.

Originality/value

The proposed combination of methodologies described in the paper leads to a very powerful tool for structural optimum design under uncertain parameters, one that can be extended to deal with other classes of applications.

Details

Engineering Computations, vol. 32 no. 2
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 18 October 2022

Stefania Stellacci, Leonor Domingos and Ricardo Resende

The purpose of this research is to test the effectiveness of integrating Grasshopper 3D and measuring attractiveness by a categorical based evaluation technique (M-MACBETH) for…

Abstract

Purpose

The purpose of this research is to test the effectiveness of integrating Grasshopper 3D and Measuring Attractiveness by a Categorical Based Evaluation Technique (M-MACBETH) for building energy simulation analysis within a virtual environment. A set of energy retrofitting solutions is evaluated against performance-based criteria (energy consumption, weight and carbon footprint), while considering the preservation of the cultural value of the building and its architectural and spatial configuration.

Design/methodology/approach

This research addresses the building energy performance analysis before and after the design of retrofitting solutions in extreme climate environments (2030–2100). The proposed model integrates data obtained from an advanced parametric tool (Grasshopper) and a multi-criteria decision analysis (M-MACBETH) to score different energy retrofitting solutions against energy consumption, weight, carbon footprint and impact on architectural configuration. The proposed model is tested for predicting the performance of a traditional timber-framed dwelling in a historic parish in Lisbon. The performance of distinct solutions is compared in digitally simulated climate conditions (design scenarios) considering different criteria weights.
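
As a rough illustration of the decision-analysis step, the sketch below scores a few hypothetical retrofit options against weighted criteria with a simple additive value model. It is not the M-MACBETH software or its judgement-elicitation procedure, and all criteria scores and weights are placeholder assumptions.

```python
# Illustrative-only additive value model in the spirit of the MCDA step;
# criteria names, scores and weights below are hypothetical placeholders.
import numpy as np

criteria = ["energy_consumption", "weight", "carbon_footprint", "architectural_impact"]
weights = np.array([0.40, 0.15, 0.25, 0.20])      # assumed criteria weights
# Rows: retrofit options A/B/C; columns: 0-100 value scores per criterion
# (higher = better), e.g. converted from Grasshopper simulation outputs.
scores = np.array([
    [70, 60, 55, 80],   # option A
    [85, 40, 65, 50],   # option B
    [60, 75, 70, 90],   # option C
])
overall = scores @ weights
for name, s in zip("ABC", overall):
    print(f"Option {name}: {s:.1f}")
```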

Findings

This study shows the importance of conducting building energy simulation that links physical and digital environments and then identifies a set of evaluation criteria in the analysed context. Architects, environmental engineers and urban planners should use a computational environment in the development design phase to identify design solutions and compare their expected impact on the building configuration and performance-based behaviour.

Research limitations/implications

The unavailability of local weather data (an EnergyPlus Weather File (EPW)), the high time and resource effort, and the number and type of energy retrofit measures tested in this research limit the scope of this study. In energy simulation procedures, the baseline generally covers a period of thirty, ten or five years. In this research, because weather data is unavailable in the format required by the simulation process (.EPW file), the input data for the baseline is the average climatic data from EnergyPlus (2022). Additionally, this workflow is time-consuming due to the low interoperability of the software, and Grasshopper requires a highly skilled analyst to obtain accurate results. To calculate the values for the energy consumption, i.e. the values of energy per day of simulation, all the values given per hour are manually summed. The values of weight are obtained by calculating the amount of material required (whose dimensions are provided by Grasshopper), while the carbon footprint is calculated per kg of material. This set of data is then introduced into M-MACBETH. Another relevant limitation relates to the techniques proposed for retrofitting this case study, all of which are based on wood-fibre boards.
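
The aggregation described above can be pictured as follows. The hourly values, board dimensions, density and emission factor in this sketch are placeholder assumptions rather than the study's data.

```python
# Hedged sketch of the manual aggregation described above: hourly energy
# values summed per simulated day, material weight from panel dimensions,
# carbon footprint from an assumed emission factor per kg of material.
import numpy as np

hourly_kwh = np.random.uniform(0.3, 1.2, size=24 * 3)   # placeholder simulation output
daily_kwh = hourly_kwh.reshape(-1, 24).sum(axis=1)      # one total per simulated day

# Hypothetical wood-fibre board: dimensions from the parametric model (m), density (kg/m3)
area_m2, thickness_m, density = 42.0, 0.08, 160.0
weight_kg = area_m2 * thickness_m * density

emission_factor = 0.25          # assumed kg CO2e per kg of material
carbon_kg = weight_kg * emission_factor

print(daily_kwh, weight_kg, carbon_kg)
```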

Practical implications

The proposed method for energy simulation and climate change adaptation can be applied to other historic buildings considering different evaluation criteria and context-based priorities.

Social implications

Context-based adaptation measures of the built environment are necessary for the coming years due to the projected extreme temperature changes following the 2015 Paris Agreement and the 2030 Agenda. Built environments include historical sites that represent irreplaceable cultural legacies and factors of the community's identity to be preserved over time.

Originality/value

This study shows the importance of conducting building energy simulation using physical and digital environments. A computational environment should be used during the development design phase by architects, engineers and urban planners to rank design solutions against a set of performance criteria and compare their expected impact on the building configuration and performance-based behaviour. This study integrates Grasshopper 3D and M-MACBETH.

Details

International Journal of Building Pathology and Adaptation, vol. 42 no. 1
Type: Research Article
ISSN: 2398-4708


Open Access
Article
Publication date: 24 August 2023

Chiara Bertolin and Filippo Berto

This article introduces the Special Issue on Sustainable Management of Heritage Buildings in long-term perspective.


Abstract

Purpose

This article introduces the Special Issue on Sustainable Management of Heritage Buildings in long-term perspective.

Design/methodology/approach

It starts by reviewing the gaps in knowledge and practice which led to the creation and implementation of the research project SyMBoL—Sustainable Management of Heritage Buildings in long-term perspective, funded by the Norwegian Research Council over the 2018–2022 period. The SyMBoL project is the motivation behind this special issue.

Findings

The editorial paper briefly presents the main outcomes of SyMBoL. It then reviews the contributions to the Special Issue, focussing on their connection with, or differentiation from, SyMBoL and on multidisciplinary findings that address some of the initially identified gaps.

Originality/value

The article briefly summarizes topics related to the sustainable preservation of heritage buildings in times of reduced resources, energy crisis and the impacts of natural hazards and global warming. Finally, it highlights future research directions targeted at overcoming, or partially mitigating, the above-mentioned challenges, for example by taking advantage of non-destructive techniques' interoperability, heritage building information modelling and digital twin models, and machine learning and risk assessment algorithms.

Article
Publication date: 8 April 2014

Carmen Camelo-Ordaz, Joaquin García-Cruz and Elena Sousa-Ginel

The aim of this paper is to analyze the influence of two categories of conflict antecedents – input and behavior antecedents – on the level of relationship conflict (RC) in top…


Abstract

Purpose

The aim of this paper is to analyze the influence of two categories of conflict antecedents – input and behavior antecedents – on the level of relationship conflict (RC) in top management teams (TMTs). The authors apply a process view to conflict, and consider that the effect of the input antecedents on RC may be mediated by a behavioral antecedent: behavioral integration.

Design/methodology/approach

Using a survey instrument, multi-informant data were collected from 64 TMTs. An aggregation and measurement analysis was performed. To test the hypotheses of mediation, bootstrapping procedures were used.
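
A minimal sketch of a bootstrap test of an indirect (mediated) effect of the kind described is given below. The synthetic data, the simple slope-product estimator and the variable names are illustrative assumptions, not the authors' survey data or model.

```python
# Minimal bootstrap confidence interval for an indirect effect X -> M -> Y.
import numpy as np

rng = np.random.default_rng(0)
n = 64
trust = rng.normal(size=n)                                    # input antecedent (e.g. intragroup trust)
behav_int = 0.6 * trust + rng.normal(scale=0.8, size=n)       # mediator: behavioral integration
conflict = -0.5 * behav_int + rng.normal(scale=0.8, size=n)   # outcome: relationship conflict

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]      # path X -> M
    b = np.polyfit(m, y, 1)[0]      # path M -> Y (simplified; X not partialled out)
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)     # resample teams with replacement
    boot.append(indirect_effect(trust[idx], behav_int[idx], conflict[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap CI for indirect effect: [{lo:.3f}, {hi:.3f}]")
```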

Findings

The results show that the effects of team tenure, intragroup trust and value consensus on relationship conflict are mediated by behavioral integration. However, TMT size does not affect relationship conflict – either directly or indirectly – through behavioral integration.

Research limitations/implications

It is concluded that encouraging intragroup trust and value consensus among TMT members facilitates the integrated behavior of the team. This behavioral integration may allow conflict to be constructive. Therefore, firms should make an effort to encourage this psychological context.

Originality/value

Previous research about the antecedents of RC in the field of TMTs is inconclusive. Additionally, a new approach to conflict antecedents is considered: rather than establishing a direct and independent relationship between each category of antecedents and TMT conflict, a relationship of interdependence between the different types of antecedents and their effects on RC is considered.

Details

International Journal of Conflict Management, vol. 25 no. 2
Type: Research Article
ISSN: 1044-4068


Article
Publication date: 2 March 2010

Oliver Barima and S.M. Rowlinson

This paper aims to examine the use of the increasingly important virtual concept to deliver value in projects, focusing on the manifest, critical issues which can enhance value…

Abstract

Purpose

This paper aims to examine the use of the increasingly important virtual concept to deliver value in projects, focusing on the manifest, critical issues which can enhance value delivery in virtual construction projects.

Design/methodology/approach

The study uses triangulated quantitative and qualitative methods to examine the concepts under investigation. Triangulated data analysis is also used to provide insight.

Findings

The study identifies 16 manifest, independent variables as crucial in enhancing value delivery in virtual construction projects. The research also gives a fresh view by differentiating between influential and necessary crucial variables in virtual construction project value delivery.

Research limitations/implications

The study demonstrates the use of triangulation in critical-variables research. It also gives integrated insight into rigorous, triangulated data analysis to enhance understanding in the critical-variables research domain.

Practical implications

The research gives insight to managers on the issues which need attention in the design and implementation of virtual construction projects to deliver value.

Originality/value

The research adds value to the literature on the virtual construction project delivery concept, where little knowledge exists. Based on empirical evidence, the study also offers a fresh, insightful lens for the examination of critical variables from the perspective of influential and necessary items.

Details

Engineering, Construction and Architectural Management, vol. 17 no. 2
Type: Research Article
ISSN: 0969-9988


Article
Publication date: 4 December 2019

Michael James McCord, John McCord, Peadar Thomas Davis, Martin Haran and Paul Bidanset

Numerous geo-statistical methods have been developed to analyse the spatial dimension and composition of house prices. Despite these advances, spatial filtering remains an…

Abstract

Purpose

Numerous geo-statistical methods have been developed to analyse the spatial dimension and composition of house prices. Despite these advances, spatial filtering remains an under-researched approach within house price studies. This paper aims to examine the spatial distribution of house prices using an eigenvector spatial filtering (ESF) procedure, to analyse the local variation and spatial heterogeneity.

Design/methodology/approach

Using 2,664 sale transactions over the one-year period Q3 2017 to Q3 2018, an eigenvector spatial filtering approach is applied to evaluate spatial patterns within the Belfast housing market. This method uses geographical coordinates to specify eigenvectors across geographic distance and thereby determine a set of spatial filters, which convey spatial structures representative of different spatial scales and units. The filters are incorporated as predictors into regression analyses to alleviate spatial autocorrelation. This approach is intuitive, given that detection of autocorrelation in specific filters and within the regression residuals can serve as inclusion or exclusion criteria.
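
A compact sketch of the eigenvector spatial filtering idea, on synthetic data, is given below. The inverse-distance weights, the number of retained filters and the single structural covariate are illustrative choices, not the authors' Belfast specification.

```python
# Sketch of eigenvector spatial filtering (ESF): build a spatial weights
# matrix from coordinates, double-centre it, extract eigenvectors and add
# them as extra regressors. All data here are synthetic placeholders.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))            # property locations
price = rng.normal(size=n)                          # placeholder log prices
floor_area = rng.normal(size=n)                     # placeholder structural variable

# Inverse-distance spatial weights with zero diagonal (one common choice).
d = cdist(coords, coords)
W = np.where(d > 0, 1.0 / d, 0.0)

# Double-centre: M C M with M = I - 11'/n, then eigendecompose.
M = np.eye(n) - np.ones((n, n)) / n
evals, evecs = np.linalg.eigh(M @ W @ M)
order = np.argsort(evals)[::-1]
filters = evecs[:, order[:10]]                      # candidate spatial filters

# Hedonic regression with a structural variable plus the selected filters.
X = np.column_stack([np.ones(n), floor_area, filters])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(beta[:2])
```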

Findings

The findings show both robust and effective estimator consistency and limited spatial dependency – culminating in accurately specified hedonic pricing models. The findings show that the spatial component alone explains 14.6 per cent of the variation in property value, whereas 77.6 per cent of the variation could be attributed to an interaction between the structural characteristics and the local market geography expressed by the filters. This methodological step reduced short-scale spatial dependency and residual autocorrelation resulting in increased model stability and reduced misspecification error.

Originality/value

Eigenvector-based spatial filtering is a lesser-known but suitable statistical protocol that can be used to analyse house price patterns while taking spatial autocorrelation at varying spatial scales into account. This approach arguably provides a more insightful analysis of house prices by removing spatial autocorrelation both objectively and subjectively to produce reliable, yet understandable, regression models which do not suffer from the traditional challenges of serial dependence or spatial misspecification. It offers property researchers and policymakers an intuitive and comprehensible way of producing accurate price estimation models that can be readily interpreted.

Details

International Journal of Housing Markets and Analysis, vol. 13 no. 5
Type: Research Article
ISSN: 1753-8270


Article
Publication date: 22 April 2022

Mhd Anwar Orabi, Jin Qiu, Liming Jiang and Asif Usmani

Reinforced concrete slabs in fire have been heavily studied over the last three decades. However, most experimental and numerical work focuses on long-duration uniform exposure to…

Abstract

Purpose

Reinforced concrete slabs in fire have been heavily studied over the last three decades. However, most experimental and numerical work focuses on long-duration uniform exposure to standard fire. Considerably less effort has been put into investigating the response to localised fires that result in planarly non-uniform temperature distribution in the exposed elements.

Design/methodology/approach

In this paper, the OpenSees for Fire framework for modelling slabs under non-uniform fire exposure is presented, verified against numerical predictions by Abaqus and then validated against experimental tests. The thermal wrapper developed within OpenSees for Fire is then utilised to apply localised fire exposure to the validated slab models using the parameters of an experimentally observed localised fire. The effect of the smoke layer is also considered in this model and is shown to contribute significantly to the thermal, and thus thermo-mechanical, response of slabs. Finally, the effects of the localised fire heat release rate (HRR) and of the boundary conditions are studied.
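
To picture how a planarly non-uniform gas-temperature field differs from a uniform standard-fire exposure, the sketch below generates one using Alpert's ceiling-jet correlation as a simple stand-in. It does not reproduce the OpenSees for Fire thermal wrapper, the experimentally observed fire or the smoke-layer model, and the HRR, ceiling height and slab size are assumed values.

```python
# Illustrative non-uniform gas temperature field from a localised fire,
# using Alpert's ceiling-jet correlation (Q in kW, lengths in m, T in deg C).
import numpy as np

def ceiling_jet_temperature(r, Q_kw=1500.0, H=3.0, T_ambient=20.0):
    """Maximum ceiling-jet temperature at radial distance r from the fire axis."""
    r = np.asarray(r, dtype=float)
    near = r / H <= 0.18
    dT = np.where(near,
                  16.9 * Q_kw ** (2 / 3) / H ** (5 / 3),
                  5.38 * (Q_kw / np.maximum(r, 1e-6)) ** (2 / 3) / H)
    return T_ambient + dT

# Sample the field over an assumed 6 m x 6 m slab with the fire under its centre.
x = y = np.linspace(-3.0, 3.0, 25)
X, Y = np.meshgrid(x, y)
T = ceiling_jet_temperature(np.hypot(X, Y))
print(T.min(), T.max())
```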

Findings

The analysis showed that boundary conditions are very important for the response of slabs subject to localised fire: when lateral restraint is considered, expansive strains may be accommodated as deflections without severely damaging the slab.

Originality/value

This work demonstrates the capabilities of OpenSees for Fire in modelling structural behaviour under non-uniform fire conditions and investigates the damage patterns of flat slabs exposed to localised fires. It is an advancing step towards understanding structural responses to realistic fires.

Details

Journal of Structural Fire Engineering, vol. 14 no. 1
Type: Research Article
ISSN: 2040-2317


Open Access
Article
Publication date: 14 March 2024

Niki Chatzipanagiotou, Anita Mirijamdotter and Christina Mörtberg

This paper aims to focus on academic library managers’ learning practices in the context of cooperative work supported by computational artefacts. Academic library managers’…

Abstract

Purpose

This paper aims to focus on academic library managers’ learning practices in the context of cooperative work supported by computational artefacts. Academic library managers’ everyday work is mainly cooperative, and their cooperation is supported predominantly by computational artefacts. Learning how to use the computational artefacts efficiently and effectively involves understanding the changes in everyday work that affect managers and therefore requires a deep understanding of their cooperative work practices.

Design/methodology/approach

Focused ethnography was conducted through participant observations, interviews and document analysis. Ten managers from a university library in Sweden participated in the research. A thematic method was used to analyse the empirical material. Computer-supported cooperative work (CSCW) and work-integrated learning were used as the conceptual lens.

Findings

Five learning practices were identified: collaboration, communication, coordination, decision-making processes and computational artefacts’ use. The findings show that learning is embedded in managers’ cooperative work practices, which do not necessarily include sufficient training time. Furthermore, learning was intertwined with cooperating and was situational. Managers learned by reflecting together on their own experiences and through joint cooperation and information sharing while using the computational artefacts.

Originality/value

The main contribution lies in providing insights into how academic library managers learn and cooperate in their everyday work, emphasizing the role of computational artefacts, the importance of the work context and the collective nature of learning. It also highlights the need for continual workplace learning in contemporary knowledge work environments. Thus, the research generates contributions to the informatics field by extending the understanding of managers’ work-integrated learning in their everyday cooperative work practices supported by computational artefacts’ use. It also contributes to the intersection of CSCW and work-integrated learning.

Details

The Learning Organization, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-6474


Article
Publication date: 1 December 1999

N.P. Weatherill, E.A. Turner‐Smith, J. Jones, K. Morgan and O. Hassan

As computer simulation increasingly supports engineering design and manufacture, the requirement for a computer software environment providing an integration platform for…


Abstract

As computer simulation increasingly supports engineering design and manufacture, the requirement for a computer software environment providing an integration platform for computational engineering software increases. The potential benefits to industry are considerable. As a first step in the long‐term development of such a system, a computer software environment has been developed for pre‐ and post‐processing for unstructured grid‐based computational simulation. Arbitrary computer application software can be integrated into the environment to provide a multi‐disciplinary engineering analysis capability within one unified computational framework. Recognising the computational demands of many application areas, the environment includes a set of parallel tools to help the user maximise the potential of high performance computers and networks. The paper will present details of the environment and include an example of, and discussion about, the integration of application software.
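
One way to picture the kind of integration such an environment provides is a registry behind which arbitrary application codes sit. The sketch below is a hypothetical illustration of that idea only; the class, function names and grid representation are assumptions, not the actual system or interfaces described in the paper.

```python
# Hypothetical registry-style integration of application codes behind one
# interface; illustrative only, not the environment described in the paper.
from typing import Callable, Dict

class SimulationEnvironment:
    """Registers application solvers that all consume an unstructured grid."""
    def __init__(self) -> None:
        self._solvers: Dict[str, Callable[[dict], dict]] = {}

    def register(self, name: str, solver: Callable[[dict], dict]) -> None:
        self._solvers[name] = solver

    def run(self, name: str, grid: dict) -> dict:
        return self._solvers[name](grid)

def cfd_solver(grid: dict) -> dict:          # placeholder application code
    return {"field": "velocity", "cells": len(grid["cells"])}

env = SimulationEnvironment()
env.register("cfd", cfd_solver)
print(env.run("cfd", {"cells": [0, 1, 2]}))
```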

Details

Engineering Computations, vol. 16 no. 8
Type: Research Article
ISSN: 0264-4401

