Search results

1 – 10 of over 3000
Article
Publication date: 28 September 2021

Àlex Ferrer and Sebastián Miguel Giusti

Abstract

Purpose

The purpose of this study is to solve the inverse homogenization problem, or so-called material design problem, using the topological derivative concept.

Design/methodology/approach

The optimal topology is obtained through a relaxed formulation of the problem by replacing the characteristic function with a continuous design variable, the so-called density variable. The constitutive tensor is then parametrized with the density variable through an analytical interpolation scheme based on the topological derivative concept. The intermediate values that may appear in the optimal topologies are removed by penalizing the perimeter functional.
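
As an illustrative sketch of the relaxed formulation described above (the paper's exact cost functional and interpolation are not reproduced here, so the symbols below are generic):

\[
\min_{\rho \in [0,1]} \; J\big(\mathbb{C}(\rho)\big) + \alpha\,\mathrm{Per}(\rho),
\qquad
\mathbb{C}(\rho) = \big(1 - f(\rho)\big)\,\mathbb{C}^{-} + f(\rho)\,\mathbb{C}^{+},
\]

where rho is the density variable replacing the characteristic function, C^- and C^+ are the constitutive tensors of the two material phases, f is the analytical interpolation derived from the topological derivative, J measures the mismatch with the target homogenized tensor of the inverse homogenization problem, and Per(rho) is the perimeter functional whose penalization removes intermediate values.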

Findings

The optimization process benefits from the intermediate values provided by the proposed method, reaching solutions that the topological derivative had not been able to find before. In addition, the presented theory opens the path to a new research framework in which the topological derivative is used with classical optimization algorithms.

Originality/value

The proposed methodology allows us to use the topological derivative concept for solving the inverse homogenization problem and to fulfil the optimality conditions of the problem with the use of classical optimization algorithms. The authors solved several material design examples through a projected gradient algorithm to show the advantages of the proposed method.
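
A minimal sketch of a box-projected gradient update of the kind mentioned above, assuming the density field is constrained to [0, 1]; the cost function, its gradient and any volume-constraint handling are placeholders, not the authors' implementation:

    import numpy as np

    def projected_gradient(rho0, grad_cost, step=0.1, tol=1e-4, max_iter=200):
        """Projected gradient descent for a density field rho constrained to [0, 1]."""
        rho = np.clip(rho0, 0.0, 1.0)
        for _ in range(max_iter):
            g = grad_cost(rho)                            # gradient of the placeholder cost
            rho_new = np.clip(rho - step * g, 0.0, 1.0)   # projection onto the box [0, 1]
            if np.linalg.norm(rho_new - rho) < tol:
                break
            rho = rho_new
        return rho

    # Toy usage with a quadratic cost (not a homogenization problem):
    rho_opt = projected_gradient(np.full(100, 0.5), lambda r: r - 0.3)
    print(rho_opt[:5])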

Article
Publication date: 24 August 2021

Mohamed Abdelhamid and Aleksander Czekanski

Abstract

Purpose

This is an attempt to better bridge the gap between the mathematical and the engineering/physical aspects of the topic. The authors trace the different sources of non-convexification in the context of topology optimization problems, starting from domain discretization, passing through penalization for discreteness and the effects of filtering methods, and ending with a note on continuation methods.

Design/methodology/approach

Starting from the global optimum of the compliance minimization problem, the authors employ analytical tools to investigate how intermediate density penalization affects the convexity of the problem, the potential penalization-like effects of various filtering techniques, how continuation methods can be used to approach the global optimum and how the initial guess influences the final optimum.
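
For concreteness, the standard SIMP-style penalization of intermediate densities (used here only as the usual illustration of the effect analyzed above, not necessarily the authors' exact interpolation) reads

\[
E(\rho_e) = E_{\min} + \rho_e^{\,p}\,\big(E_0 - E_{\min}\big), \qquad \rho_e \in [0,1],
\]

where p = 1 keeps the interpolation linear, while p > 1 penalizes intermediate densities and introduces the non-convexity discussed above; continuation methods solve a sequence of problems in which p is increased gradually, so that each solve starts from the optimum of a less penalized, better-behaved problem.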

Findings

The non-convexification effects of penalizing intermediate density elements simply overshadow any other type of non-convexification introduced into the problem, mainly due to their severity and locality. Continuation methods are strongly recommended to overcome the problem of local minima, although their step and convergence criteria are left to the user, depending on the type of application.

Originality/value

In this article, the authors present a comprehensive treatment of the sources of non-convexity in density-based topology optimization problems, with a focus on linear elastic compliance minimization. The authors put special emphasis on the potential penalization-like effects of various filtering techniques through a detailed mathematical treatment.

Details

Engineering Computations, vol. 39 no. 3
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 1 January 1993

Reza Eftekharzadeh

Abstract

Presents a comprehensive review of the literature on the problem of lot‐size scheduling (serial and assembly), considering the uncapacitated problem and the complicated capacitated assembly manufacturing structure. Analyses the different solution techniques and findings for each product set.

Details

International Journal of Physical Distribution & Logistics Management, vol. 23 no. 1
Type: Research Article
ISSN: 0960-0035

Article
Publication date: 29 August 2008

Wilma Penzo

Abstract

Purpose

The semantic and structural heterogeneity of large Extensible Markup Language (XML) digital libraries emphasizes the need to support approximate queries, i.e. queries where the matching conditions are relaxed so as to retrieve results that may only partially satisfy the user's requests. The paper aims to propose a flexible query answering framework which efficiently supports complex approximate queries on XML data.

Design/methodology/approach

To reduce the number of relaxations applicable to a query, the paper relies on the specification of user preferences about the types of approximations allowed. A specifically devised index structure which efficiently supports both semantic and structural approximations, according to the specified user preferences, is proposed. Also, a ranking model to quantify approximations in the results is presented.
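
The paper's own index structure and ranking model are not reproduced here; the toy scoring function below only illustrates the general idea of discounting a match by its semantic and structural relaxations, weighted by user preferences (all names, fields and weights are hypothetical):

    from dataclasses import dataclass

    @dataclass
    class Relaxation:
        kind: str    # "semantic" (e.g. term generalization) or "structural" (e.g. edge relaxation)
        cost: float  # penalty in [0, 1] assigned to this approximation step

    def score(relaxations, prefs):
        """Exact matches score 1.0; each allowed relaxation discounts the score.

        prefs maps a relaxation kind to a user-preference weight in [0, 1]
        (0 = this type of approximation is not allowed, 1 = it is free)."""
        s = 1.0
        for r in relaxations:
            w = prefs.get(r.kind, 0.0)
            if w == 0.0:
                return 0.0   # disallowed approximation: drop the candidate result
            s *= max(0.0, 1.0 - (1.0 - w) * r.cost)
        return s

    # Example: semantic relaxations are tolerated more than structural ones.
    prefs = {"semantic": 0.8, "structural": 0.4}
    print(score([Relaxation("semantic", 0.5), Relaxation("structural", 0.5)], prefs))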

Findings

Personalized queries, on the one hand, effectively narrow the space of query reformulations and, on the other hand, enhance the user's query capabilities with a great deal of flexibility and control over requests. As to the quality of results, the retrieval process benefits considerably from the presence of user preferences in the queries. Experiments demonstrate the effectiveness and efficiency of the proposal, as well as its scalability.

Research limitations/implications

Future developments concern evaluating the effectiveness of query personalization by further examining how results vary with the parameters that express user preferences.

Originality/value

The paper is intended for the research community and proposes a novel query model which incorporates user preferences about query relaxations on large heterogeneous XML data collections.

Details

International Journal of Web Information Systems, vol. 4 no. 3
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 4 January 2021

Ben Mansour Dia

Abstract

Purpose

The author examines the sequestration of CO2 in abandoned geological formations, where leakages are permitted only up to a certain threshold to meet the international CO2 emissions standards. Technically, the author addresses a Bayesian experimental design problem to optimally mitigate uncertainties and to perform risk assessment on a CO2 sequestration model, where the parameters to be inferred are random subsurface properties while the quantity of interest is to be kept within safety margins.

Design/methodology/approach

The author starts with a probabilistic formulation of learning the leakage rate and later relaxes it to a Bayesian experimental design of learning the formation's geophysical properties. The injection rate is the design parameter, and the learned properties are used to estimate the leakage rate by means of a nonlinear operator. The forward model governs a two-phase, two-component flow in a porous medium with no solubility of CO2 in water. The Laplace approximation is combined with Monte Carlo sampling to estimate the expectation of the Kullback–Leibler divergence, which serves as the objective function.
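
In generic notation, the design objective described above is the expected information gain (the paper's specific forward model, noise model and priors are omitted here):

\[
U(\xi) \;=\; \mathbb{E}_{\mathbf{y}\mid\xi}\!\left[\, D_{\mathrm{KL}}\big(p(\boldsymbol{\theta}\mid\mathbf{y},\xi)\,\big\|\,p(\boldsymbol{\theta})\big)\right],
\]

where xi is the design (here the injection rate), theta collects the uncertain subsurface properties and y the measurements; the inner posterior is replaced by its Laplace (Gaussian) approximation and the outer expectation over the data is estimated by Monte Carlo sampling, which is how the combination mentioned above enters.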

Findings

Different scenarios of confining CO2 while measuring the risk of harmful leakages are analyzed numerically. The efficiency of the inversion for the CO2 leakage rate improves with the injection rate, as great improvements in the accuracy of the estimated formation properties are noticed. However, this study shows that those results do not imply in any way that the learned value of the CO2 leakage rate should exhibit the same behavior. This study also supports the implementation of CO2 sequestration by extending the duration allowed by the reservoir capacity, controlling the injection so that emissions remain in agreement with the international standards.

Originality/value

Uncertainty quantification of the reservoir properties is addressed. The nonlinear goal-oriented inverse problem for the estimation of the leakage rate is known to be very challenging. This study presents a relaxation of the probabilistic design of learning the leakage rate to the Bayesian experimental design of learning the reservoir's geophysical properties.

Details

Engineering Computations, vol. 38 no. 3
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 15 August 2006

Virginia M. Miori

Abstract

Truckload routing has always been a challenge. This paper explores the development of continuous flow truckload routes, which resemble less-than-truckload routes, and a new way to formulate the truckload routing problem (TRP). Rather than view the problem as a succession of origin/destination pairs, we look at it as a series of routing triplets. This enables us to use alternative solution methods, which may result in greater efficiency and improved solutions.
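
The chapter's precise triplet definition is not reproduced here; purely as an illustration of moving from origin/destination pairs to triplets, consecutive stops of a continuous flow route can be regrouped into overlapping three-stop decision units (the types and fields below are hypothetical):

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass(frozen=True)
    class Lane:
        origin: str
        destination: str

    def pairs_to_triplets(lanes: List[Lane]) -> List[Tuple[str, str, str]]:
        """Chain consecutive origin/destination pairs into overlapping triplets."""
        stops = [lanes[0].origin] + [lane.destination for lane in lanes]
        return [(stops[i], stops[i + 1], stops[i + 2]) for i in range(len(stops) - 2)]

    # A continuous flow route A -> B -> C -> A expressed as triplets:
    print(pairs_to_triplets([Lane("A", "B"), Lane("B", "C"), Lane("C", "A")]))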

Details

Applications of Management Science: In Productivity, Finance, and Operations
Type: Book
ISBN: 978-0-85724-999-9

Article
Publication date: 1 October 2018

Nataliya Chukhrova and Arne Johannssen

Abstract

Purpose

The purpose of this paper is to construct innovative exact and approximative sampling plans for acceptance sampling in statistical quality control. These sampling plans are determined for crisp and fuzzy formulation of quality limits, various lot sizes and common α- and β-levels.

Design/methodology/approach

The authors use generalized fuzzy hypothesis testing to determine sampling plans with fuzzified quality limits. This test method allows consideration of the indifference zone related to expert opinion or user priorities. In addition to the exact sampling plans calculated with the hypergeometric operating characteristic function, the authors consider approximative sampling plans using a little-known but excellent operating characteristic function. Further, a comprehensive sensitivity analysis of the calculated sampling plans is performed in order to examine how the inspection effort depends on the crisp or fuzzy formulation of quality limits, the lot size, and the specifications of the producer's and consumer's risks.
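
For reference, the exact operating characteristic function mentioned above is the standard hypergeometric expression for a single sampling plan with sample size n and acceptance number c applied to a lot of size N containing D = Np nonconforming items (the approximative OC function used by the authors is not reproduced here):

\[
L(p) \;=\; \Pr(X \le c) \;=\; \sum_{x=0}^{c} \frac{\binom{D}{x}\binom{N-D}{\,n-x\,}}{\binom{N}{n}},
\]

and the plan (n, c) is chosen so that L(p) stays above 1 - alpha at the acceptable quality level and below beta at the limiting quality level, which is how the alpha- and beta-levels above constrain the design.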

Findings

The results related to the parametric sensitivity analysis of the calculated sampling plans, together with the conclusions regarding the approximation quality, provide the user with a comprehensive basis for a direct implementation of the sampling plans in practice.

Originality/value

The constructed sampling plans ensure the simultaneous control of producer’s and consumer’s risks with the smallest possible inspection effort on the one hand and a consideration of expert opinion or user priorities on the other hand.

Details

International Journal of Quality & Reliability Management, vol. 35 no. 9
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 13 March 2013

Mark T. Leung

Abstract

This study examines the scheduling problem for a two-stage flowshop. All jobs are immediately available for processing and job characteristics including the processing times and due dates are known and certain. The goals of the scheduling problem are (1) to minimize the total flowtime for all jobs, (2) to minimize the total number of tardy jobs, and (3) to minimize both the total flowtime and the total number of tardy jobs simultaneously. Lower bound performances with respect to the total flowtime and the total number of tardy jobs are presented. Subsequently, this study identifies the special structure of schedules with minimum flowtime and minimum number of tardy jobs and develops three sets of heuristics which generate a Pareto set of bicriteria schedules. For each heuristic procedure, there are four options available for schedule generation. In addition, we provide enhancements to a variety of lower bounds with respect to flowtime and number of tardy jobs in a flowshop environment. Proofs and discussions to lower bound results are also included.
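
As a minimal illustration of the two criteria above (not of the chapter's heuristics or bounds), the total flowtime and the number of tardy jobs of a permutation schedule in a two-stage flowshop can be evaluated as follows, with all processing times and due dates known and certain:

    def evaluate_schedule(sequence, p1, p2, due):
        """Return (total flowtime, number of tardy jobs) for a permutation schedule.

        sequence: job indices in processing order
        p1, p2:   processing times on stages 1 and 2
        due:      due dates; jobs are available at time zero, so flowtime = completion time."""
        c1 = c2 = 0.0
        total_flowtime, tardy = 0.0, 0
        for j in sequence:
            c1 += p1[j]                # completion on stage 1
            c2 = max(c2, c1) + p2[j]   # stage 2 waits for both the machine and the job
            total_flowtime += c2
            tardy += c2 > due[j]
        return total_flowtime, tardy

    # Toy three-job instance:
    print(evaluate_schedule([0, 1, 2], p1=[2, 1, 3], p2=[1, 4, 2], due=[4, 7, 9]))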

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78190-331-5

Book part
Publication date: 17 January 2009

Virginia M. Miori

Abstract

The challenge of truckload routing is increased in complexity by the introduction of stochastic demand. Typically, this demand is generalized to follow a Poisson distribution. In this chapter, we cluster the demand data using data mining techniques to establish a more suitable distribution for predicting demand. We then examine this stochastic truckload demand using an econometric discrete choice model known as a count data model. Using actual truckload demand data and data from the Bureau of Transportation Statistics, we perform count data regressions. Every regression run produces two outcomes: the predicted demand between every origin and destination, and the likelihood that this demand will occur. Together, these allow us to generate an expected-value forecast of truckload demand as input to a truckload routing formulation. The negative binomial distribution produces an improved forecast over the Poisson distribution.
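
A minimal sketch of the kind of count data regression described above, comparing Poisson and negative binomial fits with statsmodels; the data below are synthetic placeholders, not the chapter's truckload demand or Bureau of Transportation Statistics data:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_obs = 500
    X = sm.add_constant(rng.normal(size=(n_obs, 2)))   # e.g. lane distance, seasonality proxy
    mu = np.exp(X @ np.array([0.5, 0.8, -0.3]))        # log-linear mean, as in count data models
    y = rng.negative_binomial(2, 2.0 / (2.0 + mu))     # overdispersed demand counts

    poisson_fit = sm.Poisson(y, X).fit(disp=False)
    negbin_fit = sm.NegativeBinomial(y, X).fit(disp=False)

    expected_demand = negbin_fit.predict(X)            # predicted demand per origin/destination row
    print(poisson_fit.aic, negbin_fit.aic)             # overdispersion favours the negative binomial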

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-84855-548-8

Article
Publication date: 1 November 2002

Jiaqin Yang and Richard H. Deane

Abstract

The importance of reducing product lotsizes in converting traditional job shops into just‐in‐time (JIT) type manufacturing systems has been addressed in the literature. This paper presents a lotsize reduction model for closed stochastic production systems. The model is formulated based on an M/G/c queuing lotsize model. Product lotsize choice is related to all major components of job flow time: waiting time in queue, batch processing time, batch moving time, and finished goods warehousing time. The research is motivated by the fact that an optimal lotsize solution that minimizes only average job waiting time in the shop may not be optimal when the effects of job batch processing time, batch moving time, and batch warehousing time are also considered. There is no general closed form solution to the model due to the complexity of its nonlinear formulation. Based on the unique properties of the model, heuristic solution procedures are developed. The research demonstrates opportunities for shop managers to significantly reduce product lotsizes while minimizing total operating cost.
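
The flow time decomposition described above can be written generically as follows; the waiting-time term from the M/G/c model and the cost weights are left abstract because the paper's exact expressions are not reproduced here:

\[
T(Q) \;=\; W_q(Q) \;+\; Q\,t_p \;+\; t_m \;+\; t_w(Q),
\]

where Q is the product lotsize, W_q(Q) the expected waiting time in queue from the M/G/c model, Q t_p the batch processing time, t_m the batch moving time and t_w(Q) the finished goods warehousing time; the heuristic procedures search for the lotsize that minimizes total operating cost rather than the average waiting time alone.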

Details

Integrated Manufacturing Systems, vol. 13 no. 7
Type: Research Article
ISSN: 0957-6061
