Search results

1 – 10 of over 265,000
Article
Publication date: 17 March 2016

Arnaud Baraston, Laurent Gerbaud, Vincent Reinbold, Thomas Boussey and Frédéric Wurtz

Multiphysical models are often useful for the design of electrical devices such as electrical machines. In this context, the modeling of thermal, magnetic and electrical phenomena by…

Abstract

Purpose

Multiphysical models are often useful for the design of electrical devices such as electrical machines. In this context, the modeling of thermal, magnetic and electrical phenomena using an equivalent-circuit approach is common in sizing problems. The coupling of such models with other models is difficult to take into account, partly because it adds complexity to the process. The paper proposes an automatic modelling of thermal and magnetic aspects from an equivalent-circuit approach, with the computation of gradients, using selectivity on the variables. It then discusses the coupling of various physical models for sizing by optimization algorithms. Sensitivity analyses are discussed, and the multiphysical approach is applied to a permanent magnet synchronous machine.

Design/methodology/approach

The paper describes thermal and magnetic models by equivalent circuits: magnetic aspects are represented by reluctance networks and thermal aspects by thermal equivalent circuits. From the circuit models and analytical equations, models are generated, coupled and translated into computational code (Java, C), including the computation of their Jacobians. To do so, model generators are used: CADES, Reluctool, Thermotool. The paper illustrates the modelling and automatic programming aspects with Thermotool. The generated code is directly usable by optimization algorithms. The formulation of the coupling with other models is then studied in the case of a multiphysical sizing by optimization of the Toyota PRIUS electrical motor.
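
The gradient computation that such generators automate can be illustrated with a toy example. The sketch below (Python, not the actual CADES/Thermotool output, whose generated code is Java or C) solves a two-node thermal equivalent circuit G·T = q and obtains the sensitivity of the nodal temperatures to one resistance by implicit differentiation, dT/dp = -G⁻¹(∂G/∂p)T; all names and values are illustrative.

```python
import numpy as np

def solve_thermal(R1, R2, q):
    """Two-node thermal circuit: node 1 -> node 2 via R1, node 2 -> ambient via R2."""
    g1, g2 = 1.0 / R1, 1.0 / R2
    G = np.array([[g1, -g1],
                  [-g1, g1 + g2]])                 # nodal conductance matrix
    T = np.linalg.solve(G, q)                      # nodal temperature rises
    dG = (-1.0 / R1**2) * np.array([[1.0, -1.0],   # dG/dR1: only R1 terms depend on R1
                                    [-1.0, 1.0]])
    dT_dR1 = np.linalg.solve(G, -dG @ T)           # implicit-function theorem
    return T, dT_dR1

T, grad = solve_thermal(R1=2.0, R2=5.0, q=np.array([10.0, 0.0]))
print(T, grad)   # T = [70, 50]; dT/dR1 = [10, 0]
```

Here all heat injected at node 1 flows through R2, so T2 is independent of R1, which the computed Jacobian column confirms (dT2/dR1 = 0); generators like Thermotool apply the same principle systematically to much larger networks.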

Findings

A key feature of the approach is its ability to easily handle the selectivity of the inputs and outputs of the generated model according to the problem specifications, thus drastically reducing the size of the Jacobian matrix and the computational complexity. Another is the coupling of the models through analytical equations, possibly implicit ones.

Research limitations/implications

At present, the multiphysical modeling is considered only for static phenomena. However, this limitation is not significant for numerous sizing applications.

Originality/value

The analytical approach with selectivity gives fast models that are well adapted for optimization. The use of model generators allows robust programming of the models and their Jacobians. The automatic calculation of gradients enables the use of deterministic algorithms, such as SQP, which are well suited to handling numerous constraints.
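
As a hedged illustration of the workflow this enables, the sketch below passes an analytical objective and constraint, together with their gradients, to a gradient-based SQP-type solver (SciPy's SLSQP stands in for the authors' tooling; the toy sizing problem is invented):

```python
from scipy.optimize import minimize

# Minimize a toy "material cost" x0*x1 subject to a performance constraint
# x0^2 + x1 >= 4, supplying analytical gradients as generated models would.
res = minimize(
    fun=lambda x: x[0] * x[1],
    x0=[2.0, 2.0],
    jac=lambda x: [x[1], x[0]],                       # analytical gradient
    constraints=[{"type": "ineq",
                  "fun": lambda x: x[0] ** 2 + x[1] - 4.0,
                  "jac": lambda x: [2.0 * x[0], 1.0]}],
    bounds=[(0.5, 5.0), (0.5, 5.0)],
    method="SLSQP",
)
print(res.x, res.fun)   # expect x1 at its lower bound, x0 ~ sqrt(3.5)
```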

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 35 no. 3
Type: Research Article
ISSN: 0332-1649

Open Access
Article
Publication date: 3 April 2024

Juan D. Borrero and Shumaila Yousafzai

The shift toward a circular economy (CE) represents a collaborative endeavor necessitating the presence of efficient frameworks, conducive contexts and a common comprehension…

Abstract

Purpose

The shift toward a circular economy (CE) represents a collaborative endeavor necessitating efficient frameworks, conducive contexts and a common comprehension. This research is a step toward this goal, presenting an opportunity to investigate these frameworks and integrate them, with particular emphasis on the Quintuple Helix Model (5HM), into a unified theoretical framework that underscores the core principles of the CE. This study is centered on three pivotal questions aimed at decoding the CE transition in specific regional settings.

Design/methodology/approach

Adopting an abductive approach firmly anchored in a two-stage qualitative process, this study specifically merges the foundational principles from institutional theory, entrepreneurship literature and CE frameworks to provide insights into the dynamics of circular ecosystems, with a specific focus on the Huelva region in Spain.

Findings

The findings demonstrate significant potential in the CE, ranging from the integration of product and service systems to innovations in eco-industrial practices. Yet, a notable deficiency exists: the absence of institutional entrepreneurs, highlighting the essential role that universities can play. As recognized centers of innovation, universities are suggested to be key contributors to the transformation toward a CE, aligning with their societal and economic responsibilities.

Practical implications

This study highlights the importance of managing relationships with entities such as SMEs, policymakers and academia for effective CE adoption. Policymakers can refine strategies based on the research’s insights, while the impact of university-driven circular ecosystems on sustainable societies is another crucial area for future research.

Originality/value

The sustainability models cited in the CE literature may not be comprehensive enough to prevent problem shifting, and it can be argued that they lack a sound theoretical and conceptual basis. Furthermore, the connections between sustainability objectives and the three levels of the CE operating system remain vague. Additionally, there is insufficient information on how regions foster the involvement of the environment in Quintuple Helix cooperation and how this impacts the CE.

Open Access
Article
Publication date: 30 November 2002

Jae Ha Lee and Han Deog Hui

This study explores hedging strategies that use the KTB futures to hedge the price risk of the KTB spot portfolio. The study establishes the price sensitivity, risk-minimization…


Abstract

This study explores hedging strategies that use the KTB futures to hedge the price risk of the KTB spot portfolio. The study establishes the price sensitivity, risk-minimization and bivariate GARCH(1,1) models as hedging models, and analyzes their hedging performance. The sample period covers September 29, 1999 to September 18, 2001. Time-matched prices at 11:00 (11:30) of the KTB futures and spot were used in the analysis. The most important findings may be summarized as follows. First, while the average hedge ratio of the price sensitivity model is close to one, both the risk-minimization and GARCH models exhibit hedge ratios that are substantially lower than one. Hedge ratios tend to be greater for daily data than for weekly data. Second, for the daily in-sample data, hedging effectiveness is highest for the GARCH model with time-varying hedge ratios, but the risk-minimization model with constant hedge ratios is not far behind the GARCH model in its hedging performance. For out-of-sample hedging effectiveness, the GARCH model is best for the KTB spot portfolio, and the risk-minimization model is best for the corporate bond portfolio. Third, for daily data, the in-sample hedge shows better performance than the out-of-sample hedge, except for the risk-minimization hedge against the corporate bond portfolio. Fourth, for the weekly in-sample hedges, the price sensitivity model is the worst and the risk-minimization model is the best at hedging the KTB spot portfolio. While the GARCH model is the best against the KTB + corporate bond portfolio, the risk-minimization model is generally as good as the GARCH model. The risk-minimization model performs best for the weekly out-of-sample data, and the out-of-sample hedges are better than the in-sample hedges. Fifth, while the hedging performance of the risk-minimization model with a daily moving window seems somewhat superior to the traditional risk-minimization model when trading volume increased one year after the inception of the KTB futures, on average the traditional model is better than the moving-window model. For weekly data, the traditional model exhibits better performance. Overall, in the Korean bond markets, investors are encouraged to use the simple risk-minimization model to hedge the price risk of the KTB spot and corporate bond portfolios.
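
For reference, the constant risk-minimization hedge ratio that the abstract recommends is simply the OLS slope of spot changes on futures changes, h* = Cov(ΔS, ΔF)/Var(ΔF). A minimal sketch with synthetic data (not the KTB series used in the study):

```python
import numpy as np

rng = np.random.default_rng(0)
dF = rng.normal(0.0, 0.4, size=500)               # futures price changes (synthetic)
dS = 0.85 * dF + rng.normal(0.0, 0.1, size=500)   # spot changes; true ratio ~ 0.85

h = np.cov(dS, dF)[0, 1] / np.var(dF, ddof=1)     # minimum-variance hedge ratio
hedged = dS - h * dF                              # hedged portfolio changes
effectiveness = 1.0 - np.var(hedged, ddof=1) / np.var(dS, ddof=1)
print(f"hedge ratio ~ {h:.3f}, variance reduction ~ {effectiveness:.1%}")
```

The GARCH alternative studied in the paper replaces the constant Cov/Var ratio with conditional (time-varying) second moments estimated each period.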

Details

Journal of Derivatives and Quantitative Studies, vol. 10 no. 2
Type: Research Article
ISSN: 2713-6647

Keywords

Open Access
Article
Publication date: 31 May 2006

Mi Ae Kim

Recently, domestic market participants have a growing interest in synthetic Collateralized Debt Obligation (CDO) as a security to reduce credit risk and create new profit…


Abstract

Recently, domestic market participants have a growing interest in the synthetic Collateralized Debt Obligation (CDO) as a security to reduce credit risk and create new profit. Therefore, the valuation method and hedging strategy for synthetic CDOs have become an important issue. However, there are no won-denominated credit default swap transactions, which are essential for activating synthetic CDO transactions. In addition, there is no transparent market information for the default probability, asset correlation and recovery rate, which are the critical variables determining the price of a synthetic CDO.

This study first investigates methods of estimating the default probability, asset correlation coefficient and recovery rate. Next, using five synthetic CDO pricing models, namely the widely used OFGC (One-Factor Gaussian Copula) model; OFNGC (One-Factor Non-Gaussian Copula) models such as the OFDTC (One-Factor Double T-distribution Copula) model of Hull and White (2004) and the NIGC (Normal Inverse Gaussian Copula) model of Kalemanova et al. (2005); the SC (Stochastic Correlation) model of Burtschell et al. (2005); and the FL (Forward Loss) model of Bennani (2005), I investigate and compare three points: 1) appropriateness for the portfolio loss distribution, 2) explanation of standardized tranche spreads, and 3) sensitivity for a delta-neutral hedging strategy. To compare the pricing models, parameter estimation for each model is first carried out using the term structure of the iTraxx Europe index spread and the tranche spreads with different maturities and exercise prices. Remarkable results of this study are as follows. First, the probability of the loss interval determining the mezzanine tranche spread is lower in all models except the SC model than in the OFGC model. This result shows that all models except the SC model solve, to some degree, the implied correlation smile phenomenon, whereby the correlation coefficient of the mezzanine tranche must be lower than that of the other tranches when the OFGC model is used. Second, in explaining standardized tranche spreads, the NIGC model is the best among the various models with respect to relative error. When the OFGC model is compared with the OFDTC model, the OFDTC model is better at explaining 5-year tranche spreads. For 7-year or 10-year tranches, however, the OFDTC model is better with respect to absolute error while the OFGC model is better with respect to relative error. Third, the sign of the sensitivity of the senior tranche spread with respect to asset correlation is sometimes negative in the NIGC model while it is positive in the other models. This result implies that a long position may be taken by the issuers of synthetic CDOs as a correlation delta-neutral hedging strategy when the OFGC model is used, while a short position may be taken when the NIGC model is used.
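
The OFGC baseline referenced throughout is compact enough to sketch. Below is a minimal homogeneous-pool version: conditional on the common factor M, defaults are independent with probability p(M) = Φ((Φ⁻¹(p) − √ρ·M)/√(1−ρ)), and the unconditional loss distribution mixes conditional binomials over M. Parameters are illustrative, not estimated from iTraxx data.

```python
import numpy as np
from scipy.stats import binom, norm

n, p, rho = 100, 0.02, 0.3            # names, default probability, asset correlation
m_grid = np.linspace(-6.0, 6.0, 201)  # integration grid for the common factor M
w = norm.pdf(m_grid)
w /= w.sum()                          # simple quadrature weights

c = norm.ppf(p)                                                   # default threshold
p_m = norm.cdf((c - np.sqrt(rho) * m_grid) / np.sqrt(1.0 - rho))  # conditional default prob

k = np.arange(n + 1)                                              # number of defaults
loss_pmf = (binom.pmf(k[:, None], n, p_m[None, :]) * w).sum(axis=1)
print(f"P(no defaults) = {loss_pmf[0]:.3f}, E[defaults] = {(k * loss_pmf).sum():.2f}")
```

The non-Gaussian variants compared in the study (OFDTC, NIGC) change the distribution of the factor and idiosyncratic terms, which fattens the tails of this loss distribution.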

Details

Journal of Derivatives and Quantitative Studies, vol. 14 no. 1
Type: Research Article
ISSN: 2713-6647

Keywords

Article
Publication date: 9 April 2024

Charles A. Donnelly, Sushobhan Sen, John W. DeSantis and Julie M. Vandenbossche

The time-varying equivalent linear temperature gradient (ELTG) significantly affects the development of faulting and must therefore be accounted for in pavement design. The same…

Abstract

Purpose

The time-varying equivalent linear temperature gradient (ELTG) significantly affects the development of faulting and must therefore be accounted for in pavement design. The same is true for faulting of bonded concrete overlays of asphalt (BCOA) with slabs larger than 3 × 3 m. However, the evaluation of the ELTG in Mechanistic-Empirical (ME) BCOA design is highly time-consuming. The use of an effective ELTG (EELTG) is an efficient alternative to calculating the ELTG. In this study, a model to quickly evaluate the EELTG was developed for faulting in BCOAs with panels 3 m or longer, whose faulting is sensitive to the ELTG.

Design/methodology/approach

A database of EELTG responses was generated for 144 BCOAs at 169 locations throughout the continental United States, which was used to develop a series of prediction models. Three methods were evaluated: multiple linear regression (MLR), artificial neural networks (ANNs), and multi-gene genetic programming (MGGP). The performance of each method was compared, considering both accuracy and model complexity.
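
To make the accuracy-versus-complexity comparison concrete, a hedged sketch along these lines (synthetic stand-in data, not the authors' EELTG database) fits an MLR model and a small ANN to the same regression task and reports validation R² alongside free-parameter counts:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 6))                     # stand-in climate/geometry inputs
y = (X @ np.array([1.5, -0.8, 0.3, 0.0, 0.5, -0.2])
     + 0.4 * np.sin(3 * X[:, 0]) + rng.normal(0.0, 0.2, 1000))
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)

n_params = {"MLR": mlr.coef_.size + 1,
            "ANN": sum(w.size for w in ann.coefs_) + sum(b.size for b in ann.intercepts_)}
for name, model in [("MLR", mlr), ("ANN", ann)]:
    print(f"{name}: R2 = {r2_score(y_va, model.predict(X_va)):.3f}, "
          f"free parameters = {n_params[name]}")
```

The trade-off the paper highlights shows up directly: the ANN captures the nonlinearity but carries hundreds of weights, while the linear model has only seven parameters.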

Findings

It was shown that ANNs display the highest accuracy, with an R² of 0.90 on the validation dataset. The MLR and MGGP models achieved R² values of 0.73 and 0.71, respectively, but consist of far fewer free parameters than the ANNs. The model comparison performed in this study highlights the need for researchers to consider model complexity so that direct implementation is feasible.

Originality/value

This research produced a rapid EELTG prediction model for BCOAs that can be incorporated into the existing faulting model framework.

Article
Publication date: 1 November 2007

Agnieszka Cichocka, Pascal Bruniaux and Vladan Koncar

This paper presents an introduction to the modelling of virtual garment design process in 3D… Our global project of virtual clothing design, along with the conception of a virtual…

Abstract

This paper presents an introduction to the modelling of the virtual garment design process in 3D. Our global project of virtual clothing design, along with the conception of a virtual adaptive mannequin, is devoted to creating and modelling garments in 3D. Starting from ideas of mass customization, e-commerce and the need for numerical innovations in the garment industry, this article presents a model of a virtual garment and a methodology enabling virtual clothing to be conceived directly on an adaptive mannequin morphotype in 3D. A short description of the overall garment model under constraints is presented. To explain the overall methodology, the basic pattern of trousers is given. The global model of garment creation in 3D is composed of three parts - a human body model, an ease model and a garment model. The most essential part is the ease model, which is necessary for the proposed process of garment modelling. After describing each garment modelling element influencing this process, a detailed presentation of the ease model in relation to the garment model is proposed. The combination of the previously mentioned models may be considered as two interconnected sub-models. The first sub-model is linked with the front pattern position on the body and the second with the back pattern position on the trousers with appropriate ease values. In order to identify the correct ease values, and consequently their right positions on the human body, an identification algorithm is proposed. The two sub-models are strongly connected through a feedback effect caused by the interactions of the trouser front and back patterns. This connection phenomenon appears during modelling, and it depends on the structure of the proposed ease model. The relatively large number of parameters requires the use of an identification technique. Finally, virtual and real patterns were superposed in order to visualise the results.
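
As a hedged illustration of such an identification step (the ease model and data below are invented placeholders, not the authors' garment model), parameters can be fitted by least squares so that predicted pattern offsets match measured ones:

```python
import numpy as np
from scipy.optimize import least_squares

# Normalized positions along a body contour and "measured" pattern offsets (ease).
body_pts = np.linspace(0.0, 1.0, 9)
measured = 1.0 + 0.15 * np.exp(-3.0 * body_pts)      # synthetic target data

def residuals(p):
    base_ease, decay = p                             # hypothetical ease parameters
    predicted = 1.0 + base_ease * np.exp(-decay * body_pts)
    return predicted - measured

fit = least_squares(residuals, x0=[0.1, 1.0])
print("identified ease parameters:", fit.x)          # ~ [0.15, 3.0]
```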

Details

Research Journal of Textile and Apparel, vol. 11 no. 4
Type: Research Article
ISSN: 1560-6074

Keywords

Open Access
Article
Publication date: 30 November 2004

Joon Haeng Lee

This paper estimates and forecasts the yield curve of the Korean bond market using a three-factor term structure model based on the Nelson-Siegel model. The Nelson-Siegel model is…


Abstract

This paper estimates and forecasts the yield curve of the Korean bond market using a three-factor term structure model based on the Nelson-Siegel model. The Nelson-Siegel model is interpreted as a model of level, slope and curvature and has the flexibility required to match the changing shape of the yield curve. To estimate this model, we use the two-step estimation procedure of Diebold and Li. Estimation results show our model is quite flexible and gives a very good fit to the data.

To assess the forecasting ability of our model, we compare its RMSEs (root mean square errors) with those of a random walk (RW) model and a principal component model for the out-of-sample period as well as the in-sample period. We find that our model has better forecasting performance than the principal component model but only a slight edge over the RW model, especially over long forecasting horizons. Considering that it is difficult for any model to beat the RW model out of sample, the results suggest that our model is useful for practitioners forecasting yield curve dynamics.
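
The two-step procedure named above is straightforward to sketch: with the decay parameter λ fixed, the three Nelson-Siegel factors (level, slope, curvature) are just OLS coefficients of yields on the factor loadings, and step two fits a time-series model per factor. A minimal version with synthetic yields (λ = 0.0609 as in Diebold and Li, maturities in months):

```python
import numpy as np

def ns_loadings(tau, lam=0.0609):
    """Nelson-Siegel loadings for maturities tau (months) with Diebold-Li lambda."""
    x = lam * tau
    slope = (1.0 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(tau), slope, slope - np.exp(-x)])

tau = np.array([3.0, 6.0, 12.0, 24.0, 36.0, 60.0, 120.0])    # maturities in months
true_beta = np.array([5.0, -1.5, 1.0])                       # level, slope, curvature
y = (ns_loadings(tau) @ true_beta
     + np.random.default_rng(2).normal(0.0, 0.02, tau.size)) # synthetic yields

beta_hat, *_ = np.linalg.lstsq(ns_loadings(tau), y, rcond=None)
print("estimated factors:", beta_hat)   # step 2 would fit, e.g., an AR(1) per factor
```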

Details

Journal of Derivatives and Quantitative Studies, vol. 12 no. 2
Type: Research Article
ISSN: 2713-6647

Article
Publication date: 8 March 2024

Wenqian Feng, Xinrong Li, Jiankun Wang, Jiaqi Wen and Hansen Li

This paper reviews the pros and cons of different parametric modeling methods, which can provide a theoretical reference for parametric reconstruction of 3D human body models for…

Abstract

Purpose

This paper reviews the pros and cons of different parametric modeling methods, which can provide a theoretical reference for parametric reconstruction of 3D human body models for virtual fitting.

Design/methodology/approach

In this study, we briefly analyze the mainstream datasets of models of the human body used in the area to provide a foundation for parametric methods of such reconstruction. We then analyze and compare parametric methods of reconstruction based on their use of the following forms of input data: point cloud data, image contours, sizes of features and points representing the joints. Finally, we summarize the advantages and problems of each method as well as the current challenges to the use of parametric modeling in virtual fitting and the opportunities provided by it.

Findings

Considering the integrity and accuracy of the representation of the body’s shape and posture, together with the efficiency of computing the requisite parameters, a reconstruction method that integrates orthogonal image contour morphological features, multi-feature size constraints and joint point positioning can better represent human body shape, posture and personalized feature sizes, and has higher research value.

Originality/value

This article outlines an approach to reconstructing a 3D model for virtual fitting based on three kinds of data, which is helpful for establishing personalized, high-precision human body models.

Details

International Journal of Clothing Science and Technology, vol. 36 no. 2
Type: Research Article
ISSN: 0955-6222

Keywords

Article
Publication date: 7 March 2024

Fei Xu, Zheng Wang, Wei Hu, Caihao Yang, Xiaolong Li, Yaning Zhang, Bingxi Li and Gongnan Xie

The purpose of this paper is to develop a coupled lattice Boltzmann model for the simulation of the freezing process in unsaturated porous media.

Abstract

Purpose

The purpose of this paper is to develop a coupled lattice Boltzmann model for the simulation of the freezing process in unsaturated porous media.

Design/methodology/approach

In the developed model, the porous structure, with its complexity and disorder, was generated using a stochastic growth method, and the Shan-Chen multiphase model and an enthalpy-based phase change model were then coupled by introducing a freezing interface force to describe the variation of the phase interface. The pore size of the porous media during freezing was considered as a factor influencing the phase transition temperature, and the variation of the interfacial force formed with phase change on the interface was described.
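
The enthalpy side of this coupling admits a compact sketch: total enthalpy maps to temperature and liquid fraction through the latent heat, which is the bookkeeping an enthalpy-based phase change model performs at each lattice node. The version below is a minimal stand-alone illustration with made-up water-like constants, not the full coupled Shan-Chen lattice Boltzmann solver:

```python
import numpy as np

cp, L, T_m = 4.2, 334.0, 0.0      # heat capacity, latent heat, melting point (illustrative)

def enthalpy_to_state(H):
    """Map total enthalpy to (temperature, liquid fraction) across the phase change."""
    H_s, H_l = cp * T_m, cp * T_m + L                   # solidus / liquidus enthalpy
    f_l = np.clip((H - H_s) / (H_l - H_s), 0.0, 1.0)    # liquid fraction in [0, 1]
    T = np.where(H < H_s, H / cp,                       # fully frozen branch
        np.where(H > H_l, T_m + (H - H_l) / cp, T_m))   # fully liquid / mushy branches
    return T, f_l

T, f_l = enthalpy_to_state(np.linspace(-50.0, 450.0, 6))
print(np.round(T, 1), np.round(f_l, 2))
```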

Findings

A larger porosity (0.8 versus 0.2) enlarges the unfrozen area from 42 mm to 70 mm, the rest of the porous medium being occupied by solid particles. A larger specific surface area (0.315 versus 0.168) produces a more strongly fluctuating volume fraction distribution.

Originality/value

The concept of interfacial force was first introduced in the solid–liquid phase transition to describe the freezing process of frozen soil, enabling the formulation of a distribution equation based on enthalpy to depict the changes in the water film. The increased interfacial force serves to diminish ice formation and effectively absorb air during the freezing process. A greater surface area enhances the ability to counteract liquid migration.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 34 no. 4
Type: Research Article
ISSN: 0961-5539

Keywords

Article
Publication date: 4 March 2024

Yongjiang Xue, Wei Wang and Qingzeng Song

The primary objective of this study is to tackle the enduring challenge of preserving feature integrity during the manipulation of geometric data in computer graphics. Our work…

Abstract

Purpose

The primary objective of this study is to tackle the enduring challenge of preserving feature integrity during the manipulation of geometric data in computer graphics. Our work aims to introduce and validate a variational sparse diffusion model that enhances the capability to maintain the definition of sharp features within meshes throughout complex processing tasks such as segmentation and repair.

Design/methodology/approach

We developed a variational sparse diffusion model that integrates a high-order L1 regularization framework with Dirichlet boundary constraints, specifically designed to preserve edge definition. This model employs an innovative vertex updating strategy that optimizes the quality of mesh repairs. We leverage the augmented Lagrangian method to address the computational challenges inherent in this approach, enabling effective management of the trade-off between diffusion strength and feature preservation. Our methodology involves a detailed analysis of segmentation and repair processes, focusing on maintaining the acuity of features on triangulated surfaces.
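
The augmented Lagrangian strategy described above can be sketched on a toy problem. Below, the paper's high-order mesh operator is replaced by a simple 1D difference matrix D, reducing the model to the classic split for min_x ½‖x − b‖² + μ‖Dx‖₁ (ADMM form); the alternation between a quadratic x-update and an L1 soft-threshold z-update exhibits the same trade-off between diffusion strength and feature (edge) preservation:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
b = np.r_[np.zeros(25), np.ones(25)] + 0.05 * rng.normal(size=n)  # noisy step (a "sharp feature")
D = np.diff(np.eye(n), axis=0)                                    # 1D difference operator
mu, r = 0.5, 1.0                                                  # L1 weight, penalty parameter

x = b.copy()
z = D @ x
u = np.zeros_like(z)
M = np.eye(n) + r * D.T @ D
for _ in range(200):
    x = np.linalg.solve(M, b + r * D.T @ (z - u))          # quadratic x-update
    v = D @ x + u
    z = np.sign(v) * np.maximum(np.abs(v) - mu / r, 0.0)   # soft threshold (L1 prox)
    u += D @ x - z                                         # dual (multiplier) update
print(f"plateaus smoothed, edge kept: x[20] ~ {x[20]:.2f}, x[30] ~ {x[30]:.2f}")
```

Unlike quadratic (smooth diffusion) penalties, the L1 term lets most differences shrink to zero while leaving the large jump nearly intact, which is the feature-preservation behavior the abstract reports.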

Findings

Our findings indicate that the proposed variational sparse diffusion model significantly outperforms traditional smooth diffusion methods in preserving sharp features during mesh processing. The model ensures the delineation of clear boundaries in mesh segmentation and achieves high-fidelity restoration of deteriorated meshes in repair tasks. The innovative vertex updating strategy within the model contributes to enhanced mesh quality post-repair. Empirical evaluations demonstrate that our approach maintains the integrity of original, sharp features more effectively, especially in complex geometries with intricate detail.

Originality/value

The originality of this research lies in the novel application of a high-order L1 regularization framework to the field of mesh processing, a method not conventionally applied in this context. The value of our work is in providing a robust solution to the problem of feature degradation during the mesh manipulation process. Our model’s unique vertex updating strategy and the use of the augmented Lagrangian method for optimization are distinctive contributions that enhance the state-of-the-art in geometry processing. The empirical success of our model in preserving features during mesh segmentation and repair presents an advancement in computer graphics, offering practical benefits to both academic research and industry applications.

Details

Engineering Computations, vol. 41 no. 2
Type: Research Article
ISSN: 0264-4401

Keywords

1 – 10 of over 265,000