Search results

1 – 10 of 57
Article
Publication date: 5 March 2024

Maria Ghannoum, Joseph Assaad, Michel Daaboul and Abdulkader El-Mir

Abstract

Purpose

The use of waste polyethylene terephthalate (PET) plastics derived from shredded bottles in concrete is not yet formalized, especially in reinforced members such as beams and columns. The disposal of plastic wastes in concrete is a viable alternative for managing those wastes while minimizing the environmental impacts associated with recycling, carbon dioxide emissions and energy consumption.

Design/methodology/approach

This paper evaluates the suitability of 2D deterministic and stochastic finite element (FE) modeling to predict the shear strength behavior of reinforced concrete (RC) beams without stirrups. Different concrete mixtures prepared with 1.5%–4.5% PET additions, by volume, are investigated.

Findings

Test results showed that the deterministic and stochastic FE approaches are accurate to assess the maximum load of RC beams at failure and corresponding midspan deflection. However, the crack patterns observed experimentally during the different stages of loading can only be reproduced using the stochastic FE approach. This latter method accounts for the concrete heterogeneity due to PET additions, allowing a statistical simulation of the effect of mechanical properties (i.e. compressive strength, tensile strength and Young’s modulus) on the output FE parameters.

Originality/value

Data presented in this paper can be of interest to civil and structural engineers, aiming to predict the failure mechanisms of RC beams containing plastic wastes, while minimizing the experimental time and resources needed to estimate the variability effect of concrete properties on the performance of such structures.

Details

International Journal of Building Pathology and Adaptation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2398-4708

Book part
Publication date: 5 April 2024

Hung-pin Lai

Abstract

The standard method to estimate a stochastic frontier (SF) model is the maximum likelihood (ML) approach with the distribution assumptions of a symmetric two-sided stochastic error v and a one-sided inefficiency random component u. When v or u has a nonstandard distribution, such as v following a generalized t distribution or u having a χ2 distribution, the likelihood function can be complicated or intractable. This chapter introduces indirect inference to estimate SF models, where only least squares estimation is used. There is no need to derive the density or likelihood function, so it is easier to handle a model with complicated distributions in practice. The author examines the finite sample performance of the proposed estimator and compares it with the standard ML estimator as well as the maximum simulated likelihood (MSL) estimator using Monte Carlo simulations. The author finds that the indirect inference estimator performs quite well in finite samples.
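The indirect-inference recipe described here — simulate from the structural model, fit the same least-squares auxiliary model to real and simulated data, and match the two sets of auxiliary statistics — can be sketched in a toy normal/half-normal setting. Everything below (parameter values, the choice of auxiliary statistics, the optimizer) is an illustrative assumption, not the chapter's implementation:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulate(beta0, beta1, sigma_v, sigma_u, x, rng):
    """Frontier data y = b0 + b1*x + v - u, with half-normal u."""
    v = rng.normal(0.0, sigma_v, size=x.shape)
    u = np.abs(rng.normal(0.0, sigma_u, size=x.shape))
    return beta0 + beta1 * x + v - u

def auxiliary_stats(x, y):
    """Auxiliary model uses only least squares: OLS coefficients
    plus the second and third moments of the OLS residuals."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return np.array([coef[0], coef[1], resid.var(), (resid**3).mean()])

# "Observed" data generated from known parameters (1.0, 0.5, 0.3, 0.6)
x = rng.uniform(0.0, 5.0, size=2000)
target = auxiliary_stats(x, simulate(1.0, 0.5, 0.3, 0.6, x, rng))

def distance(theta, n_sim=20):
    """Distance between simulated and observed auxiliary statistics,
    with common random numbers across evaluations."""
    b0, b1, sv, su = theta
    if sv <= 0 or su <= 0:
        return np.inf
    sim_rng = np.random.default_rng(1)
    sims = [auxiliary_stats(x, simulate(b0, b1, sv, su, x, sim_rng))
            for _ in range(n_sim)]
    return float(np.sum((np.mean(sims, axis=0) - target) ** 2))

start = np.array([0.5, 0.5, 0.5, 0.5])
fit = minimize(distance, start, method="Nelder-Mead")
print(fit.x)  # should move toward (1.0, 0.5, 0.3, 0.6)
```

Note that no density or likelihood ever appears: identification comes from the residual moments, which respond to the one-sided component's scale and skewness.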

Book part
Publication date: 5 April 2024

Taining Wang and Daniel J. Henderson

Abstract

A semiparametric stochastic frontier model is proposed for panel data, incorporating several flexible features. First, a constant elasticity of substitution (CES) production frontier is considered without log-transformation, which can otherwise induce non-negligible estimation bias. Second, the model flexibility is improved via semiparameterization, where the technology is an unknown function of a set of environment variables. The technology function accounts for latent heterogeneity across individual units, which can be freely correlated with inputs, environment variables, and/or inefficiency determinants. Furthermore, the technology function incorporates a single-index structure to circumvent the curse of dimensionality. Third, distributional assumptions are eschewed on both stochastic noise and inefficiency for model identification. Instead, only the conditional mean of the inefficiency is assumed, which depends on related determinants with a wide range of choice, via a positive parametric function. As a result, technical efficiency is constructed without relying on an assumed distribution of the composite error. The model provides flexible structures on both the production frontier and inefficiency, thereby alleviating the risk of model misspecification in production and efficiency analysis. The estimator involves a series-based nonlinear least squares estimation for the unknown parameters and a kernel-based local estimation for the technology function. Promising finite-sample performance is demonstrated through simulations, and the model is applied to investigate productive efficiency among OECD countries from 1970 to 2019.
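As a minimal illustration of fitting a CES frontier in levels (no log-transformation) — a far simpler exercise than the chapter's semiparametric estimator — one can use nonlinear least squares; the functional form, parameter values and noise level below are assumptions of the sketch:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def ces(X, A, delta, rho):
    """CES production function in levels: no log-transformation."""
    K, L = X
    return A * (delta * K**(-rho) + (1 - delta) * L**(-rho)) ** (-1.0 / rho)

# Simulated inputs and output, with additive noise in levels
K = rng.uniform(1.0, 10.0, 500)
L = rng.uniform(1.0, 10.0, 500)
y = ces((K, L), 2.0, 0.4, 0.8) + rng.normal(0.0, 0.05, 500)

params, _ = curve_fit(ces, (K, L), y, p0=[1.0, 0.5, 0.5],
                      bounds=([0.01, 0.01, 0.01], [10.0, 0.99, 5.0]))
print(np.round(params, 2))  # close to A=2.0, delta=0.4, rho=0.8
```

Estimating in levels avoids the bias that taking logs would introduce when the error enters additively, which is the point made in the abstract.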

Book part
Publication date: 5 April 2024

Alecos Papadopoulos

Abstract

The author develops a bilateral Nash bargaining model under value uncertainty and private/asymmetric information, combining ideas from axiomatic and strategic bargaining theory. The solution to the model leads organically to a two-tier stochastic frontier (2TSF) setup with intra-error dependence. The author presents two different statistical specifications to estimate the model: one accounts for regressor endogeneity using copulas, the other can separately identify bargaining power from private-information effects at the individual level. An empirical application using a matched employer–employee data set (MEEDS) from Zambia and a second using one from Ghana showcase the applied potential of the approach.

Open Access
Article
Publication date: 4 December 2023

Yonghua Li, Zhe Chen, Maorui Hou and Tao Guo

Abstract

Purpose

This study aims to reduce the redundant weight of the anti-roll torsion bar brought by the traditional empirical design and to improve its strength and stiffness.

Design/methodology/approach

Based on the finite element approach coupled with the improved beluga whale optimization (IBWO) algorithm, a collaborative optimization method is suggested to optimize the design of the anti-roll torsion bar structure and weight. The dimensions and material properties of the torsion bar were defined as random variables, and the torsion bar's mass and strength were investigated using finite elements. Then, chaotic mapping and differential evolution (DE) operators are introduced to improve the beluga whale optimization (BWO) algorithm and run case studies.

Findings

The findings demonstrate that the IBWO achieves better solution-set distribution uniformity, convergence speed, solution accuracy and stability than the BWO. The IBWO algorithm is used to optimize the anti-roll torsion bar design. The error between the optimization and finite element simulation results was less than 1%. The weight of the optimized anti-roll torsion bar was lessened by 4%, the maximum stress was reduced by 35% and the stiffness was increased by 1.9%.

Originality/value

The study provides a methodological reference for the simulation optimization process of the lateral anti-roll torsion bar.
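A minimal sketch of the two ingredients named in the methodology — logistic-map chaotic initialization and a differential evolution (DE) mutation operator — grafted onto a generic population-based search. This is a generic stand-in on a toy objective, not the paper's IBWO:

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_chaos_init(n, dim, lb, ub):
    """Chaotic (logistic-map) initialization: tends to cover the
    search space more evenly than plain uniform sampling."""
    pop = np.empty((n, dim))
    z = np.linspace(0.11, 0.87, dim)     # distinct seeds per dimension
    for i in range(n):
        z = 4.0 * z * (1.0 - z)          # logistic map, r = 4 (chaotic)
        pop[i] = lb + z * (ub - lb)
    return pop

def de_best_1(pop, best, F=0.5):
    """DE/best/1 mutation: perturb the current best with a scaled
    difference of two randomly chosen population members."""
    r = rng.integers(0, len(pop), size=(len(pop), 2))
    return best + F * (pop[r[:, 0]] - pop[r[:, 1]])

def sphere(x):                            # toy objective to minimize
    return np.sum(x**2, axis=-1)

lb, ub = -5.0, 5.0
pop = logistic_chaos_init(30, 2, lb, ub)
for _ in range(200):
    best = pop[np.argmin(sphere(pop))]
    trial = np.clip(de_best_1(pop, best), lb, ub)
    better = sphere(trial) < sphere(pop)  # greedy selection
    pop[better] = trial[better]

print(sphere(pop).min())  # near zero on this toy problem
```

In the paper these operators are embedded in the beluga whale optimization update rules; here they simply drive a bare greedy search to show their mechanics.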

Details

Railway Sciences, vol. 3 no. 1
Type: Research Article
ISSN: 2755-0907

Book part
Publication date: 5 April 2024

Feng Yao, Qinling Lu, Yiguo Sun and Junsen Zhang

Abstract

The authors propose to estimate a varying coefficient panel data model with different smoothing variables and fixed effects using a two-step approach. The pilot step estimates the varying coefficients by a series method. The authors then use the pilot estimates to perform a one-step backfitting through local linear kernel smoothing, which is shown to be oracle efficient in the sense of being asymptotically equivalent to the estimate knowing the other components of the varying coefficients. In both steps, the authors remove the fixed effects through properly constructed weights. The authors obtain the asymptotic properties of both the pilot and efficient estimators. Monte Carlo simulations show that the proposed estimator performs well. The authors illustrate its applicability by estimating a varying coefficient production frontier using panel data, without assuming distributions of the efficiency and error terms.
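The pilot series step can be illustrated in a much-simplified form: approximate the varying coefficient by a low-order polynomial in the smoothing variable and remove the fixed effects by a within (demeaning) transformation, one simple choice of the "properly constructed weights". All specifics below are assumptions of the sketch, not the authors' estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 100, 10

# Varying-coefficient panel: y_it = beta(z_it) * x_it + alpha_i + e_it
z = rng.uniform(0.0, 1.0, (n, T))
x = rng.normal(0.0, 1.0, (n, T))
# Fixed effects deliberately correlated with the regressor
alpha = 0.5 * x.mean(axis=1, keepdims=True) + rng.normal(0.0, 1.0, (n, 1))
beta = lambda z: 1.0 + 2.0 * z            # true coefficient function
y = beta(z) * x + alpha + rng.normal(0.0, 0.1, (n, T))

# Pilot series step: beta(z) ~ g0 + g1*z + g2*z^2, so the regressors
# are x, z*x, z^2*x; the within transform removes alpha_i.
def within(a):
    return a - a.mean(axis=1, keepdims=True)

X = np.stack([within(z**k * x) for k in range(3)], axis=-1).reshape(-1, 3)
Y = within(y).reshape(-1)
gamma, *_ = np.linalg.lstsq(X, Y, rcond=None)

grid = np.linspace(0.0, 1.0, 5)
beta_hat = gamma[0] + gamma[1] * grid + gamma[2] * grid**2
print(np.round(beta_hat, 2))  # close to 1 + 2*grid
```

The chapter's second step would then refine such a pilot estimate by local linear kernel backfitting; the sketch stops at the pilot stage.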

Details

Essays in Honor of Subal Kumbhakar
Type: Book
ISBN: 978-1-83797-874-8

Details

Essays in Honor of Subal Kumbhakar
Type: Book
ISBN: 978-1-83797-874-8

Book part
Publication date: 5 April 2024

Christine Amsler, Robert James, Artem Prokhorov and Peter Schmidt

Abstract

The traditional predictor of technical inefficiency proposed by Jondrow, Lovell, Materov, and Schmidt (1982) is a conditional expectation. This chapter explores whether, and by how much, the predictor can be improved by using auxiliary information in the conditioning set. It considers two types of stochastic frontier models. The first type is a panel data model where composed errors from past and future time periods contain information about contemporaneous technical inefficiency. The second type is when the stochastic frontier model is augmented by input ratio equations in which allocative inefficiency is correlated with technical inefficiency. Compared to the standard kernel-smoothing estimator, a newer estimator based on a local linear random forest helps mitigate the curse of dimensionality when the conditioning set is large. Besides numerous simulations, there is an illustrative empirical example.
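The Jondrow–Lovell–Materov–Schmidt predictor itself, in the baseline normal/half-normal cross-sectional case (before any of the auxiliary conditioning information this chapter studies is added), is a closed-form conditional expectation:

```python
import numpy as np
from scipy.stats import norm

def jlms(eps, sigma_v, sigma_u):
    """E[u | eps] for eps = v - u, with v ~ N(0, sigma_v^2) and
    u half-normal(sigma_u): the Jondrow et al. (1982) predictor."""
    sigma = np.hypot(sigma_v, sigma_u)
    lam = sigma_u / sigma_v
    sig_star = sigma_u * sigma_v / sigma
    z = -eps * lam / sigma                 # = mu_star / sigma_star
    return sig_star * (norm.pdf(z) / norm.cdf(z) + z)

# Sanity check: by iterated expectations, E[jlms(eps)] = E[u]
rng = np.random.default_rng(0)
sigma_v, sigma_u = 0.3, 0.6
v = rng.normal(0.0, sigma_v, 100_000)
u = np.abs(rng.normal(0.0, sigma_u, 100_000))
u_hat = jlms(v - u, sigma_v, sigma_u)
print(u_hat.mean(), u.mean())  # both near sigma_u * sqrt(2/pi) ≈ 0.479
```

The chapter's question is how much sharper this prediction becomes when the conditioning set is enlarged (past and future composed errors, or input ratio equations), which the closed form above cannot exploit.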

Article
Publication date: 26 February 2024

Dyhia Doufene, Samira Benharat, Abdelmoumen Essmine, Oussama Bouzegaou and Slimane Bouazabia

Abstract

Purpose

This paper aims to introduce a new numerical model that predicts the flashover voltage (FOV) value in the presence of polluted air surrounding a high-voltage insulator. The model focuses on simulating the propagation of arcs and aims to improve the accuracy and reliability of FOV predictions under these specific conditions.

Design/methodology/approach

This arc propagation method connecting the high voltage fitting and the grounded insulator cap involves a two-step process. First, the electric field distribution in the vicinity of the insulator is obtained using finite element method analysis software. Subsequently, critical areas with intense electric field strength are identified. Random points within these critical areas are then selected as initial points for simulating the growth of electric arcs.

Findings

By increasing the electric voltage applied to the insulator fittings, the arc path is generated step by step until breakdown occurs in the polluted air surrounding the insulator surface, thus yielding a prediction of the FOV value.

Practical implications

The proposed model for FOV prediction can be an attractive alternative to dangerous and costly experimental tests that require a substantial investment in time and materials.

Originality/value

Previous works have attempted to reproduce discharge propagation, but always with simplified models, such as propagation in a single direction from a point to a plane. The difficulty and originality of the present work lie in the geometric complexity of the insulator, with arc propagation in three distinct directions that requires several proliferation conditions.
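The two-step seeding idea in the methodology — compute the field distribution, then draw random arc-starting points inside the high-field regions — can be sketched on a toy grid. The analytical field below stands in for the finite element solution and is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the FEM solution: field magnitude |E| on a 2-D grid,
# peaking near a high-voltage fitting placed at the origin.
xs, ys = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))
E = 1.0 / (xs**2 + ys**2 + 1e-3)          # arbitrary units

# Step 1: critical areas = cells in the top 1% of field strength
threshold = np.quantile(E, 0.99)
ci, cj = np.nonzero(E >= threshold)

# Step 2: random points inside the critical areas become arc seeds
k = rng.choice(len(ci), size=5, replace=False)
seeds = np.column_stack([xs[ci[k], cj[k]], ys[ci[k], cj[k]]])
print(seeds)  # five starting points for simulated arc growth
```

In the paper each seed then grows into an arc along the field, in three distinct directions over the real insulator geometry; the sketch only shows the seeding stage.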

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0332-1649

Book part
Publication date: 5 April 2024

Ziwen Gao, Steven F. Lehrer, Tian Xie and Xinyu Zhang

Abstract

Motivated by empirical features that characterize cryptocurrency volatility data, the authors develop a forecasting strategy that can account for both model uncertainty and heteroskedasticity of unknown form. The theoretical investigation establishes the asymptotic optimality of the proposed heteroskedastic model averaging heterogeneous autoregressive (H-MAHAR) estimator under mild conditions. The authors additionally examine the convergence rate of the estimated weights of the proposed H-MAHAR estimator. This analysis sheds new light on the asymptotic properties of the least squares model averaging estimator under alternative complicated data generating processes (DGPs). To examine the performance of the H-MAHAR estimator, the authors conduct an out-of-sample forecasting application involving 22 different cryptocurrency assets. The results emphasize the importance of accounting for both model uncertainty and heteroskedasticity in practice.
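A stripped-down stand-in for least-squares averaging of HAR-type volatility models (not the H-MAHAR estimator itself, which additionally handles heteroskedasticity of unknown form): three nested HAR candidates are estimated by OLS and combined with weights on the simplex chosen by least squares. All parameter values below are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated realized-volatility series with HAR dynamics
T = 600
rv = np.zeros(T)
for t in range(22, T):
    rv[t] = (0.1 + 0.4 * rv[t-1] + 0.3 * rv[t-5:t].mean()
             + 0.2 * rv[t-22:t].mean() + 0.05 * rng.standard_normal())
rv = rv[22:]

# HAR regressors: lagged daily value, weekly and monthly averages
d = rv[21:-1]
w = np.array([rv[t-5:t].mean() for t in range(22, len(rv))])
m = np.array([rv[t-22:t].mean() for t in range(22, len(rv))])
y = rv[22:]

# Nested candidate models (daily; daily+weekly; daily+weekly+monthly)
designs = [np.column_stack(cols) for cols in
           [(np.ones_like(d), d),
            (np.ones_like(d), d, w),
            (np.ones_like(d), d, w, m)]]
preds = []
for X in designs:
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    preds.append(X @ b)
P = np.column_stack(preds)

# Averaging weights on the simplex, minimizing squared error
def sse(wts):
    return np.sum((y - P @ wts) ** 2)

res = minimize(sse, np.full(3, 1/3), bounds=[(0, 1)] * 3,
               constraints={"type": "eq", "fun": lambda wt: wt.sum() - 1.0})
print(np.round(res.x, 3))  # averaging weights, summing to one
```

In-sample least squares will favor the largest nested model; estimators in this literature therefore pick weights by Mallows- or jackknife-type criteria, and H-MAHAR further adapts them to heteroskedastic errors.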
