Search results
1 – 10 of 231
Baixi Chen, Weining Mao, Yangsheng Lin, Wenqian Ma and Nan Hu
Abstract
Purpose
Fused deposition modeling (FDM) is an extensively used additive manufacturing method with the capacity to build complex functional components. Due to machinery and environmental factors during manufacturing, FDM parts inevitably exhibit uncertainty in their properties and performance. This study aims to identify the stochastic constitutive behaviors of FDM-fabricated polylactic acid (PLA) tensile specimens induced by the manufacturing process.
Design/methodology/approach
By conducting tensile tests, the effects of printing machine selection and three major manufacturing parameters (i.e. printing speed S, nozzle temperature T and layer thickness t) on the stochastic constitutive behaviors were investigated. The influence of the loading rate was also examined. In addition, data-driven models were established to quantify and optimize the uncertain mechanical behaviors of FDM-based tensile specimens under various printing parameters.
Findings
As indicated by the results, the uncertain behaviors of the stiffness and strength of the PLA tensile specimens were dominated by the printing speed and nozzle temperature, respectively. The manufacturing-induced stochastic constitutive behaviors could be accurately captured by the developed data-driven model, with R² over 0.98 on the testing dataset. The optimal parameters obtained from the data-driven framework were T = 231.3595 °C, S = 40.3179 mm/min and t = 0.2343 mm, in good agreement with the experiments.
Practical implications
The developed data-driven models can also be integrated into the design and characterization of parts fabricated by extrusion and other additive manufacturing technologies.
Originality/value
Stochastic behaviors of additively manufactured products were revealed by considering extensive manufacturing factors. The data-driven models were proposed to facilitate the description and optimization of the FDM products and control their quality.
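The data-driven workflow this abstract describes can be illustrated with a small sketch: fit a surrogate model of a tensile property over the printing parameters (T, S, t) and search it for the best settings. Everything below — the quadratic surrogate form, the synthetic data and the grid search — is an illustrative assumption, not the authors' actual model or data.

```python
# Hypothetical sketch of a data-driven printing-parameter optimization:
# fit a quadratic surrogate of tensile strength over (T, S, t), then search it.
import numpy as np

rng = np.random.default_rng(0)

def features(T, S, t):
    # quadratic polynomial basis in the three printing parameters
    return np.stack([np.ones_like(T), T, S, t, T * S, T * t, S * t,
                     T**2, S**2, t**2], axis=-1)

# synthetic "experiments": strength peaks near T = 230 C, S = 40 mm/min, t = 0.23 mm
T = rng.uniform(200, 250, 200)
S = rng.uniform(20, 80, 200)
t = rng.uniform(0.1, 0.4, 200)
strength = (50 - 0.01 * (T - 230)**2 - 0.005 * (S - 40)**2
            - 200 * (t - 0.23)**2 + rng.normal(0, 0.2, 200))

# least-squares fit of the surrogate coefficients
beta, *_ = np.linalg.lstsq(features(T, S, t), strength, rcond=None)

# search the fitted surrogate on a grid for the "optimal" parameters
Tg, Sg, tg = np.meshgrid(np.linspace(200, 250, 51),
                         np.linspace(20, 80, 61),
                         np.linspace(0.1, 0.4, 31), indexing="ij")
pred = features(Tg, Sg, tg) @ beta
i = np.unravel_index(np.argmax(pred), pred.shape)
print(Tg[i], Sg[i], tg[i])  # close to the true optimum (230, 40, 0.23)
```

In the study's setting, the surrogate would be trained on measured stress–strain data, and a richer model could also predict the variance of the response to capture the stochastic behavior.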
Christine Amsler, Robert James, Artem Prokhorov and Peter Schmidt
Abstract
The traditional predictor of technical inefficiency proposed by Jondrow, Lovell, Materov, and Schmidt (1982) is a conditional expectation. This chapter explores whether, and by how much, the predictor can be improved by using auxiliary information in the conditioning set. It considers two types of stochastic frontier models. The first type is a panel data model where composed errors from past and future time periods contain information about contemporaneous technical inefficiency. The second type is when the stochastic frontier model is augmented by input ratio equations in which allocative inefficiency is correlated with technical inefficiency. Compared to the standard kernel-smoothing estimator, a newer estimator based on a local linear random forest helps mitigate the curse of dimensionality when the conditioning set is large. Besides numerous simulations, there is an illustrative empirical example.
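For the normal/half-normal stochastic frontier model, the Jondrow–Lovell–Materov–Schmidt (1982) predictor discussed above has a closed form: given the composed error e = v − u, the conditional distribution of u is a normal truncated at zero, and E[u | e] is its mean. A minimal sketch, with illustrative parameter values:

```python
# Sketch of the JLMS (1982) predictor E[u | e] for the normal/half-normal
# stochastic frontier model with composed error e = v - u.
import math

def jlms(e, sigma_v, sigma_u):
    """Conditional mean of inefficiency u given the composed error e."""
    s2 = sigma_v**2 + sigma_u**2
    mu_star = -e * sigma_u**2 / s2                 # location of u | e
    sig_star = sigma_u * sigma_v / math.sqrt(s2)   # scale of u | e
    z = mu_star / sig_star
    phi = math.exp(-0.5 * z**2) / math.sqrt(2 * math.pi)   # std normal pdf
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))           # std normal cdf
    # mean of N(mu_star, sig_star^2) truncated below at 0
    return mu_star + sig_star * phi / Phi

# a more negative residual signals more inefficiency
print(jlms(-1.0, 1.0, 1.0), jlms(0.5, 1.0, 1.0))
```

The chapter's contribution is to enlarge the conditioning set beyond the contemporaneous e (with past and future composed errors, or input ratio equations), which has no such simple closed form and motivates the kernel and local linear random forest estimators.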
Taining Wang and Daniel J. Henderson
Abstract
A semiparametric stochastic frontier model is proposed for panel data, incorporating several flexible features. First, a constant elasticity of substitution (CES) production frontier is considered without log-transformation, to avoid the non-negligible estimation bias such a transformation can induce. Second, model flexibility is improved via semiparameterization, where the technology is an unknown function of a set of environment variables. The technology function accounts for latent heterogeneity across individual units, which can be freely correlated with inputs, environment variables and/or inefficiency determinants. Furthermore, the technology function incorporates a single-index structure to circumvent the curse of dimensionality. Third, distributional assumptions on both the stochastic noise and the inefficiency are eschewed for model identification. Instead, only the conditional mean of the inefficiency is assumed, which depends on related determinants, with a wide range of choices, via a positive parametric function. As a result, technical efficiency is constructed without relying on an assumed distribution for the composite error. The model provides flexible structures for both the production frontier and the inefficiency, thereby alleviating the risk of model misspecification in production and efficiency analysis. The estimator involves a series-based nonlinear least squares estimation for the unknown parameters and a kernel-based local estimation for the technology function. Promising finite-sample performance is demonstrated through simulations, and the model is applied to investigate productive efficiency among OECD countries from 1970 to 2019.
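As a toy illustration of estimating a CES production function in levels rather than logs, the sketch below fits the CES form by a crude grid-based nonlinear least squares on simulated data. The data-generating values and the grid search are assumptions for illustration only; the chapter's estimator is semiparametric and far richer.

```python
# Toy nonlinear least squares fit of a CES production function in levels,
# y = A * (delta*K^(-rho) + (1-delta)*L^(-rho))^(-1/rho), on simulated data.
import numpy as np

rng = np.random.default_rng(1)
K = rng.uniform(1, 10, 300)
L = rng.uniform(1, 10, 300)

def ces(K, L, A, delta, rho):
    return A * (delta * K**(-rho) + (1 - delta) * L**(-rho)) ** (-1 / rho)

# simulated output with additive noise (no log transform anywhere)
y = ces(K, L, A=2.0, delta=0.4, rho=0.5) + rng.normal(0, 0.05, 300)

# crude grid search over (A, delta, rho) in place of a full NLS routine
grid_A = np.linspace(1.5, 2.5, 21)
grid_d = np.linspace(0.2, 0.6, 21)
grid_r = np.linspace(0.1, 1.0, 19)
best = min((np.mean((y - ces(K, L, A, d, r))**2), A, d, r)
           for A in grid_A for d in grid_d for r in grid_r)
print(best[1:])  # estimates near the true (2.0, 0.4, 0.5)
```

Estimating in levels like this avoids the bias the abstract notes: minimizing errors in logs and then exponentiating does not minimize errors in the level of output.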
Zhichao Wang and Valentin Zelenyuk
Abstract
Estimation of (in)efficiency has become a popular practice, with applications in virtually every sector of the economy over the last few decades. Many different models have been deployed for such endeavors, with Stochastic Frontier Analysis (SFA) models dominating the econometric literature. Among the most popular variants of SFA are Aigner, Lovell, and Schmidt (1977), which launched the literature, and Kumbhakar, Ghosh, and McGuckin (1991), which pioneered the branch accounting for the (in)efficiency term via so-called environmental variables, or determinants of inefficiency. Focusing on these two prominent approaches in SFA, the goal of this chapter is to try to understand the production inefficiency of public hospitals in Queensland. In doing so, a recognized yet often overlooked phenomenon emerges: possibly dramatic differences (and consequently very different policy implications) can arise from different models, even within one paradigm of SFA models. This emphasizes the importance of exploring many alternative models, and of scrutinizing their assumptions, before drawing policy implications, especially when such implications may substantially affect people's lives, as is the case in the hospital sector.
Abstract
The standard method to estimate a stochastic frontier (SF) model is the maximum likelihood (ML) approach with the distribution assumptions of a symmetric two-sided stochastic error v and a one-sided inefficiency random component u. When v or u has a nonstandard distribution, such as v follows a generalized t distribution or u has a
Nisha, Neha Puri, Namita Rajput and Harjit Singh
Abstract
Purpose
The purpose of this study is to analyse and compile the literature on various option pricing models (OPMs) and methodologies. The report highlights gaps in the existing literature and offers recommendations for potential scholars interested in the subject area.
Design/methodology/approach
In this study, the researchers used a systematic literature review procedure to collect data from Scopus. Bibliometric and structured network analyses were used to examine the bibliometric properties of 864 research documents.
Findings
As per the findings of the study, publications in the field have been increasing at an average rate of 6%. This study also includes a list of the most influential and productive researchers, frequently used keywords and primary publications in this subject area. In particular, a thematic map and a Sankey diagram were used for the conceptual structure, and co-citation analysis and bibliographic coupling for the intellectual structure.
Research limitations/implications
Based on the conclusion presented in this paper, there are several potential implications for research, practice and society.
Practical implications
This study provides useful insights for future research in the area of OPM in financial derivatives. Researchers can focus on impactful authors, significant work and productive countries and identify potential collaborators. The study also highlights the commonly used OPMs and emerging themes like machine learning and deep neural network models, which can inform practitioners about new developments in the field and guide the development of new models to address existing limitations.
Social implications
The accurate pricing of financial derivatives has significant implications for society, as it can impact the stability of financial markets and the wider economy. The findings of this study, which identify the most commonly used OPMs and emerging themes, can help improve the accuracy of pricing and risk management in the financial derivatives sector, which can ultimately benefit society as a whole.
Originality/value
This is possibly the first effort to consolidate the literature on option price calibration by evaluating and analysing the alternative OPMs applied by researchers, in order to guide future research in the right direction.
Ziwen Gao, Steven F. Lehrer, Tian Xie and Xinyu Zhang
Abstract
Motivated by empirical features that characterize cryptocurrency volatility data, the authors develop a forecasting strategy that can account for both model uncertainty and heteroskedasticity of unknown form. The theoretical investigation establishes the asymptotic optimality of the proposed heteroskedastic model averaging heterogeneous autoregressive (H-MAHAR) estimator under mild conditions. The authors additionally examine the convergence rate of the estimated weights of the proposed H-MAHAR estimator. This analysis sheds new light on the asymptotic properties of the least squares model averaging estimator under alternative complicated data generating processes (DGPs). To examine the performance of the H-MAHAR estimator, the authors conduct an out-of-sample forecasting application involving 22 different cryptocurrency assets. The results emphasize the importance of accounting for both model uncertainty and heteroskedasticity in practice.
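The heterogeneous autoregressive (HAR) component underlying the proposed H-MAHAR estimator regresses realized volatility on its daily, weekly (5-day) and monthly (22-day) averages. The sketch below shows only that building block on simulated data; the model-averaging weights and the heteroskedasticity treatment that define H-MAHAR are not reproduced here.

```python
# Minimal HAR regression sketch: RV_t on lagged daily, weekly and monthly
# averages of realized volatility (simulated series, illustrative only).
import numpy as np

rng = np.random.default_rng(2)
n = 600
rv = np.abs(rng.normal(1.0, 0.3, n))  # toy realized-volatility series

def lagged_mean(x, k, t):
    # average of the k observations immediately preceding day t
    return x[t - k:t].mean()

rows, y = [], []
for t in range(22, n):
    rows.append([1.0, rv[t - 1], lagged_mean(rv, 5, t), lagged_mean(rv, 22, t)])
    y.append(rv[t])
X, y = np.array(rows), np.array(y)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit of the HAR equation
forecast = X[-1] @ beta                        # fitted value for the last day
print(beta, forecast)
```

H-MAHAR, as described in the abstract, would fit several candidate HAR-type models and combine their forecasts with data-driven weights chosen to be robust to heteroskedasticity of unknown form.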
Yot Amornkitvikai, Martin O'Brien and Ruttiya Bhula-or
Abstract
Purpose
The development of green manufacturing has become essential to achieve sustainable development and modernize the nation’s manufacturing and production capacity without increasing nonrenewable resource consumption and pollution. This study investigates the effect of green industrial practices on technical efficiency for Thai manufacturers.
Design/methodology/approach
The study uses stochastic frontier analysis (SFA) to estimate the stochastic frontier production function (SFPF) and an inefficiency effects model, as pioneered by Battese and Coelli (1995).
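For reference, the Battese and Coelli (1995) inefficiency effects specification can be written in its standard textbook form (symbols as commonly defined, not copied from this study):

```latex
y_{it} = x_{it}\beta + v_{it} - u_{it}, \qquad
v_{it} \sim N(0,\sigma_v^2), \qquad
u_{it} \sim N^{+}(z_{it}\delta,\sigma_u^2), \qquad
\mathrm{TE}_{it} = \exp(-u_{it}),
```

where y_it is (log) output, x_it the inputs, z_it the inefficiency determinants (here, the green-practice, internationalization and skill variables), and N⁺ denotes a normal distribution truncated below at zero, so inefficiency is nonnegative and its mean shifts with z_it.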
Findings
This study shows that, on average, Thai manufacturing firms have experienced decreasing returns to scale in production and relatively low technical efficiency. However, it is estimated that Thai manufacturing firms with a green commitment obtained the highest technical efficiency, followed by those at the green activity, green systems and green culture levels, compared to those without any commitment to green manufacturing practices. Finally, internationalization and skill development can significantly improve technical efficiency.
Practical implications
Green industry policy mixes will be vital for driving structural reforms toward a more environmentally friendly and sustainable economic system. Furthermore, circular economy processes can promote firms' production efficiency and resource use.
Originality/value
To the best of the authors' knowledge, this study is the first to investigate the effect of green industry practices on the technical efficiency of Thai manufacturing enterprises. This study also encompasses analyses of the roles of internationalization, innovation and skill development.
Radha Subramanyam, Y. Adline Jancy and P. Nagabushanam
Abstract
Purpose
A cross-layer approach at the media access control (MAC) layer will address interference and jamming problems. Hybrid distributed MAC can be used for simultaneous voice and data transmissions in wireless sensor network (WSN) and Internet of Things (IoT) applications. Choosing the correct objective function for the Nash equilibrium in game theory will address the fairness index and resource allocation to the nodes. Game theory optimization for distributed MAC may increase network performance. The purpose of this study is to survey the various operations that can be carried out using distributive and adaptive MAC protocols. Hill-climbing distributed MAC does not need a central coordination system, and location-based transmission with neighbor awareness reduces transmission power.
Design/methodology/approach
Distributed MAC in wireless networks is used to address challenges such as network lifetime and energy consumption and to improve delay performance. In this paper, a survey is made of various cooperative communications in MAC protocols, optimization techniques used to improve MAC performance in various applications and mathematical approaches involved in game theory optimization for MAC protocols.
Findings
Spatial reuse of the channel improves by 3%–29%, and multichannel operation improves throughput by 8%, using distributed MAC protocols. The Nash equilibrium is found to perform well, as it focuses on the energy utility of the individual players in the network. Fuzzy logic improves channel selection by 17% and secondary users' involvement by 8%. A cross-layer approach at the MAC layer will address interference and jamming problems. Hybrid distributed MAC can be used for simultaneous voice and data transmissions in WSN and IoT applications. Cross-layer and cooperative communication give energy savings of 27% and reduce hop distance by 4.7%. Choosing the correct objective function for the Nash equilibrium in game theory will address the fairness index and resource allocation to the nodes.
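As a generic illustration of the game-theoretic idea this survey reviews (not an algorithm from any particular surveyed paper), the sketch below lets each node choose its transmit power to maximize a rate-minus-energy utility; simple round-robin best responses settle into a Nash equilibrium where no node can gain by deviating alone. The gain matrix, noise level and utility form are all assumptions.

```python
# Toy power-control game: node i maximizes log(1 + SINR_i) - cost * p_i.
# Round-robin best-response iteration converges to a Nash equilibrium.
import numpy as np

G = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.2],
              [0.1, 0.2, 1.0]])   # assumed link/interference gains
noise, cost = 0.1, 0.5            # receiver noise and per-unit energy price

def best_response(p, i):
    # maximizing log(1 + G[i,i]*p_i / I_i) - cost*p_i gives a water-filling form:
    # p_i = max(0, 1/cost - I_i / G[i,i]), where I_i is noise plus interference.
    interference = noise + sum(G[i, j] * p[j] for j in range(len(p)) if j != i)
    return max(0.0, 1.0 / cost - interference / G[i, i])

p = np.zeros(3)
for _ in range(50):               # round-robin best responses
    for i in range(3):
        p[i] = best_response(p, i)
print(p)  # converged power profile: each p[i] is a best response to the others
```

The survey's point about objective functions shows up directly here: changing the utility (e.g. adding a fairness term) changes the equilibrium power allocation and hence the fairness index of the network.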
Research limitations/implications
Other optimization techniques can be applied for WSN to analyze the performance.
Practical implications
Game theory optimization for distributed MAC may increase network performance. Optimal cuckoo search improves throughput by 90% and reduces delay by 91%. Stochastic approaches detect 80% of attacks even with 90% malicious nodes.
Social implications
Channel allocation in a centralized or static manner must be based on traffic demands, whether the traffic is dynamic or fluctuating. The use of multimedia devices has also increased, which in turn has increased the demand for high throughput. Co-channel interference keeps changing, or mitigations occur, which can be handled by proper resource allocation. Network survival depends on the efficient use of valid paths in the network, avoiding transmission failures and using time slots effectively.
Originality/value
A literature survey is carried out to find the methods that give better performance.